# Package the dependencies installed under ./package, then add the function code
cd package
zip -r9 ../function.zip .
cd ..
zip -g function.zip function.py
# Push the new package to Lambda
aws lambda update-function-code --function-name MyFunctionName --zip-file fileb://function.zip
Apparently it’s not enough to gather information on your email sending, bounces, and complaints. You also need to act when there’s a problem; otherwise Amazon will put your account under probation and might even shut it down completely if you don’t fix the problem.
Create an SNS topic. Let’s call it SES-BOUNCES. Or create 2 SNS topics (1 for bounces, another for complaints). I will go with just one.
Create a service on your site to handle the bounces. I used some simple PHP code (pasted below) for that.
Create a subscription in your SNS topic (the one created in the first step)
The service should handle 2 things:
the subscription (that will happen once!): call the SubscribeURL and you’re done.
the bounces/complaints: remove the email addresses from your mailing list, or blacklist them, or … The main idea is to stop sending emails to those who complain, and to stop trying to send emails to addresses that bounce.
Here’s the code for the PHP service; it’s pretty straightforward.
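For anyone who’d rather see the flow in Python, here’s a rough sketch of the same logic (not the PHP snippet itself; remove_from_mailing_list is just a placeholder for however you manage your list):

# Rough Python sketch of the bounce/complaint endpoint described above.
# remove_from_mailing_list() is a placeholder -- plug in your own list handling.
import json
import urllib.request
from wsgiref.simple_server import make_server

def remove_from_mailing_list(address):
    print("removing", address)  # placeholder: blacklist/remove in your own system

def app(environ, start_response):
    raw = environ["wsgi.input"].read(int(environ.get("CONTENT_LENGTH") or 0))
    message = json.loads(raw)

    if message.get("Type") == "SubscriptionConfirmation":
        # the one-time step: confirm the subscription by calling SubscribeURL
        urllib.request.urlopen(message["SubscribeURL"])
    elif message.get("Type") == "Notification":
        notification = json.loads(message["Message"])
        if notification.get("notificationType") == "Bounce":
            for rec in notification["bounce"]["bouncedRecipients"]:
                remove_from_mailing_list(rec["emailAddress"])
        elif notification.get("notificationType") == "Complaint":
            for rec in notification["complaint"]["complainedRecipients"]:
                remove_from_mailing_list(rec["emailAddress"])

    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"OK"]

if __name__ == "__main__":
    make_server("", 8080, app).serve_forever()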
At work, we had wanted to switch from Mandrill/Mailchimp to Amazon SES for a long time, but that was not happening, mainly because the tools SES offered to monitor sent mail were, how should I say, DIY.
So, after some deliberation and when I found some time to tackle it, I did it 🙂
Is the setup too complex? Well, it is at first. But once you understand it, it’s pretty basic.
Let’s start at the source: Amazon
You will see this notice under Notifications for each Email Address you create/verify in SES:
Amazon SES can send you detailed notifications about your bounces, complaints, and deliveries.
Bounce and complaint notifications are available by email or through Amazon Simple Notification Service (Amazon SNS).
The next step is to create the SNS topic; it’s really just a label.
You will also need an Amazon SQS queue. A standard queue should be good. Once it’s there, copy the ARN as you will need that for the SNS subscription.
Let’s go back to the SNS topic we created and click the Create subscription button. Choose Amazon SQS for the Protocol and paste the ARN of the SQS queue you created earlier. You may need to confirm that too; just click the button if it’s there.
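If you’d rather script that part than click through the console, a boto3 sketch of the same setup looks roughly like this (the topic and queue names are just examples, and the SQS access policy that lets SNS deliver to the queue is not shown here):

# Sketch: create the topic and queue, then subscribe the queue to the topic.
import boto3

sns = boto3.client("sns")
sqs = boto3.client("sqs")

topic_arn = sns.create_topic(Name="SES-BOUNCES")["TopicArn"]
queue_url = sqs.create_queue(QueueName="ses-bounces-queue")["QueueUrl"]
queue_arn = sqs.get_queue_attributes(
    QueueUrl=queue_url, AttributeNames=["QueueArn"]
)["Attributes"]["QueueArn"]

sns.subscribe(TopicArn=topic_arn, Protocol="sqs", Endpoint=queue_arn)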
That’s all on the Amazon side! See how easy that was?!
Next you need a Graylog setup.
Where do I start? Well, first choose where you want to put that Graylog “machine”. For Amazon EC2 I would just go with their ready-made AMIs. Here’s the link/docs to follow: http://docs.graylog.org/en/latest/pages/installation/aws.html (though, and I quote: “The Graylog appliance is not created to provide a production ready solution”)
But since I like doing things the “easy” way, I went with the Ubuntu 16.04 package, following http://docs.graylog.org/en/latest/pages/installation/operating_system_packages.html
Seriously, it’s much easier to use and maintain since I know where everything is. Maybe it’s just me …
Anyway, here’s my bash session:
I followed the instructions there and installed Apache on top of that, with the following configuration for the VirtualHost:
<VirtualHost *:443>
    ServerName example.com

    # Letsencrypt it
    SSLCertificateFile /etc/letsencrypt/live/example.com/fullchain.pem
    SSLCertificateKeyFile /etc/letsencrypt/live/example.com/privkey.pem
    Include /etc/letsencrypt/options-ssl-apache.conf

    # The needed parts start here
    ProxyRequests Off
    <Proxy *>
        Order deny,allow
        Allow from all
    </Proxy>
    <Location />
        RequestHeader set X-Graylog-Server-URL "https://example.com/api/"
        ProxyPass http://127.0.0.1:9000/
        ProxyPassReverse http://127.0.0.1:9000/
    </Location>
</VirtualHost>
This will leave you with a Graylog server ready to receive the logs. Now, how do we get the logs over to Graylog? Easy! Pull them from SQS.
Start by adding a GELF HTTP Input in Graylog (System > Inputs > Select Input: GELF HTTP > Launch new input)
Make sure to get the port right there; you will need it to configure the script below.
Then download the script and make sure it’s executable. Do run it manually first; that way it will tell you what’s missing (boto3, for instance).
Make sure to configure AWS credentials. The quickest way is:
* to install awscli: apt-get install awscli
* and run its configuration: aws configure
Edit the script with the right configuration vars, then add it to cron to run as often as you feel necessary (I run it @hourly).
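The script itself is basically an SQS-to-GELF bridge. Roughly, it does something like this (a sketch, not the exact script; the queue name and Graylog URL are placeholders):

#!/usr/bin/env python
# Sketch: pull SES bounce/complaint notifications from the SQS queue and
# push them to the Graylog GELF HTTP input. QUEUE_NAME and GELF_URL are
# placeholders -- set them to match your setup.
import json
import urllib.request

import boto3

QUEUE_NAME = "ses-bounces-queue"          # the SQS queue subscribed to the SNS topic
GELF_URL = "http://127.0.0.1:12201/gelf"  # the GELF HTTP input port you picked

sqs = boto3.resource("sqs")
queue = sqs.get_queue_by_name(QueueName=QUEUE_NAME)

while True:
    messages = queue.receive_messages(MaxNumberOfMessages=10, WaitTimeSeconds=5)
    if not messages:
        break  # queue drained; let the next cron run pick up the rest
    for msg in messages:
        envelope = json.loads(msg.body)                 # the SNS envelope
        notification = json.loads(envelope["Message"])  # the SES notification itself
        gelf = {
            "version": "1.1",
            "host": "ses",
            "short_message": notification.get("notificationType", "Unknown"),
            "full_message": envelope["Message"],
        }
        req = urllib.request.Request(
            GELF_URL,
            data=json.dumps(gelf).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req, timeout=10)
        msg.delete()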
Codio is awesome. Coolio would have been a better name, but I guess that’s taken!
It’s free, but don’t let that fool you. What you get is a beautifully simple yet powerful IDE in your browser. It works great even on my tablet, so I can edit/update/deploy even on the road (or on the couch half asleep, and here’s where git comes in handy!). It supports github out of the box. And has a cool terminal with SSH and everything. Yay!
Now if that was not awesome enough, these guys went out of their way and added an even cooler feature:
Every project gets its own Box: an instantly available server-side development environment with full terminal access.
So you get to do it all without leaving their site. I wanted to try it with a pet project I’m working on using django. It was pretty easy to get it all set up (especially since the code was on GitHub, but you could just as easily upload the code).
Here’s a step by step tutorial to get django up and running on codio.
Open a terminal: Tools > Terminal
I’m using MySQL, so:
parts install mysql
parts mysql start
mysqladmin -u root password StrongPassword
mysql -u root -pStrongPassword
mysql> create database dbname;
mysql> grant all privileges on dbname.* to 'dbuser'@'localhost' identified by 'dbpasswd';
mysql> flush privileges;
mysql> \q
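With the database in place, the Django settings point at it with something like this (using the placeholder names from the session above):

# settings.py -- DATABASES entry matching the dbname/dbuser/dbpasswd placeholders above
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": "dbname",
        "USER": "dbuser",
        "PASSWORD": "dbpasswd",
        "HOST": "localhost",
        "PORT": "3306",
    }
}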
Also make sure to install Python itself using the ‘parts’ command (this is what fixes the mod_wsgi error you’ll see further down).
Create the requirements.txt file if you don’t have one already (hint: run ‘pip freeze > requirements.txt’ to save those). So, in requirements.txt:
Django
MySQL-python
#distribute
#wsgiref
Run the following in the codio.com terminal: parts install pip
pip install -r requirements.txt
Upload the django project, or import it from GitHub, then run: python manage.py syncdb
python manage.py runserver 0.0.0.0:8009
I still need to get it running through apache. I’ll get on that later tonight.
parts install apache2_mod_wsgi # also installs apache2 and apr_util
============ apache2 ============
To start the Apache server:
$ parts start apache2
To stop the Apache server:
$ parts stop apache2
Apache config is located at:
$ /home/codio/.parts/etc/apache2/httpd.conf
Default document root is located at:
$ /home/codio/workspace
============ apache2_mod_wsgi ============
If Apache2 httpd is already running, you will need to restart it:
$ parts restart apache2
Default configuration for wsgi is:
WSGIScriptAlias / /home/codio/workspace
You can change default in /home/codio/.parts/etc/apache2/config/wsgi.conf file
So I changed the file in /home/codio/.parts/etc/apache2/config/wsgi.conf to:
LoadModule wsgi_module /home/codio/.parts/packages/apache2_mod_wsgi/3.4/mod_wsgi.so
WSGIScriptAlias / /home/codio/workspace/conf/wsgi.py # <-- this is where I put my wsgi.py file
WSGIPythonPath /home/codio/workspace
<Directory /home/codio/workspace/conf>
    <Files wsgi.py>
        Order deny,allow
        Require all granted
    </Files>
</Directory>
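For reference, the wsgi.py that WSGIScriptAlias points to is just the standard Django one, roughly like this (the settings module name is a guess based on the conf/ layout above):

# conf/wsgi.py -- standard Django WSGI entry point; the settings module
# name is an assumption based on the conf/ layout above
import os
from django.core.wsgi import get_wsgi_application

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "conf.settings")
application = get_wsgi_application()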
Then start Apache with parts start apache2 and visit your site at http://UNIQUE-NAME.codio.io:3000/
You will probably see a ‘500 Internal Server Error’ message, so tail the Apache error log in the terminal to see what went wrong: tail -f /home/codio/.parts/var/apache2/log/error_log
Here’s what I see (and I haven’t figured the fix yet):
[] mod_wsgi (pid=912): Target WSGI script '/home/codio/workspace/conf/wsgi.py' cannot be loaded as Python module.
[] mod_wsgi (pid=912): Exception occurred processing WSGI script '/home/codio/workspace/conf/wsgi.py'.
[] Traceback (most recent call last):
[] File "/home/codio/workspace/conf/wsgi.py", line 5, in
[] from django.core.wsgi import get_wsgi_application
[] File "/home/codio/.parts/packages/python2/2.7.6/lib/python2.7/site-packages/django/core/wsgi.py", line 1, in
[] from django.core.handlers.wsgi import WSGIHandler
[] File "/home/codio/.parts/packages/python2/2.7.6/lib/python2.7/site-packages/django/core/handlers/wsgi.py", line 6, in
[] from io import BytesIO
[] File "/home/codio/.parts/packages/python2/2.7.6/lib/python2.7/io.py", line 51, in
[] import _io
[] ImportError: /home/codio/.parts/packages/python2/2.7.6/lib/python2.7/lib-dynload/_io.so: undefined symbol: PyUnicodeUCS2_Replace
to be continued…
… issue fixed by installing python using the ‘parts’ command above
Most email clients can request a return receipt. One way to set this up in Thunderbird, for example, is to go to “Preferences > Advanced > General > Return Receipts …” and check the “When sending messages, always request a return receipt” option.
The problem with this approach is that when the person you’re emailing has set their email client to reject that request (or does so manually every time), you will not get a receipt back, and honestly there is no way to force it. Now, there are some services out there that promise to track your mail and tell you when it’s read; those rely on embedding a transparent gif that calls home when the message is opened. I tried a few of these services: the free ones did not deliver (or seemed too shady for my taste) and the paid ones did not look too good either. So I cooked up a quick solution that I can use when needed (like when I’m tracking my brother on his honeymoon trip *evil grin*)
The code is below and is pretty self-explanatory. You will need a transparent gif/png; that’s easy too 😉
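The idea, as a minimal sketch (the pixel.gif file, the opens.log file and the id query parameter are placeholders): serve the transparent image, log every request for it, and embed the image URL with a unique id in the email you want to track.

# Sketch: serve a 1x1 transparent image and log who fetched it.
# pixel.gif, opens.log and the "id" query parameter are placeholders.
import logging
from urllib.parse import parse_qs
from wsgiref.simple_server import make_server

with open("pixel.gif", "rb") as f:  # your transparent gif/png
    PIXEL = f.read()

logging.basicConfig(filename="opens.log", level=logging.INFO,
                    format="%(asctime)s %(message)s")

def app(environ, start_response):
    # the email embeds <img src="http://yourserver:8080/?id=something">
    params = parse_qs(environ.get("QUERY_STRING", ""))
    logging.info("opened id=%s ip=%s ua=%s",
                 params.get("id", ["unknown"])[0],
                 environ.get("REMOTE_ADDR", "-"),
                 environ.get("HTTP_USER_AGENT", "-"))
    start_response("200 OK", [("Content-Type", "image/gif"),
                              ("Content-Length", str(len(PIXEL)))])
    return [PIXEL]

if __name__ == "__main__":
    make_server("", 8080, app).serve_forever()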
I wrote previously about building dns zone files mostly by querying DNS and saving the results to a file.
The method is not perfect, and it will not get you all the records. You could, however, get all the records into the result file if you already have a list of names and feed it to your code.
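The gist of that method, sketched with dnspython (the list of names, record types and output file here are just examples):

# Sketch: query DNS for a known list of names and save whatever comes back.
# Requires dnspython (2.x).
import dns.exception
import dns.resolver

names = ["example.com", "www.example.com", "mail.example.com"]
record_types = ["A", "AAAA", "CNAME", "MX", "TXT"]

with open("zone.txt", "w") as zone:
    for name in names:
        for rtype in record_types:
            try:
                answers = dns.resolver.resolve(name, rtype)
            except dns.exception.DNSException:
                continue  # no such record, or the lookup failed; skip it
            for rdata in answers:
                zone.write("%s\tIN\t%s\t%s\n" % (name, rtype, rdata.to_text()))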
I wrote a python module to manage DNS records using the RimuHosting DNS API. The code is hosted at GitHub.
I like VeriteCo’s TimelineJS. It’s elegant, easy to use and very useful. But I don’t like editing a Google spreadsheet every time I want to update the timeline, although it might be easier for some people. So, I wrote a small pluggable app for django to embed a timeline in django templates.
The code is loosely based on the code for the WordPress plugin. And I’ll try to copy over more code and fixes from that soon.
For now, it’s a working app with an admin backend to enter the timeline information and events, and to edit the options. It also comes with a custom template tag that you can use to embed the timeline in your own templates.
Should be easy enough to use it.
pip install django-timelinejs
then include it in your INSTALLED_APPS and ./manage.py syncdb
Drop me a note if you find a bug or want some help. You can use the issues in GitHub for that.