Get a Unix command-line environment that has Python and ssh; I use Cygwin under Windows, and other times I dual boot into Ubuntu.
Get an Amazon EC2 account, create a ~/username.pem file, and make environment variables for the keys (follow the boto instructions).
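As a quick sanity check, boto picks the keys up from the environment; something like this should connect and list whatever is running (assuming the variables are AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY, which is what boto looks for):

import boto
# boto reads AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY from the environment
conn = boto.connect_ec2()
print(conn.get_all_instances())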
Make sure the pem file's permissions are set to 700.
Edit ssh_config so that StrictHostKeyChecking is set to no; otherwise ssh sessions started by the scripts will ask whether it's okay to connect to every newly created instance. I could probably automate that response, though.
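Alternatively the option can be passed per-connection instead of editing ssh_config; the -o flag is standard OpenSSH, so the scripts could build their commands like:

cmd = "ssh -o StrictHostKeyChecking=no -i ~/lucasw.pem root@" + dns_name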
Make sure there are no carriage returns (\r) in the pem file on Linux.
Get Elasticfox and put your credentials in.
Get boto
Get trajectorset
Create a security group called http that at least allows your IP to access a web server on any EC2 instance that uses it.
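If you'd rather script that than click through Elasticfox, boto can create the group; a minimal sketch (the rules here are my guess at the minimum, and the CIDR is a placeholder for your own IP):

import boto
conn = boto.connect_ec2()
sg = conn.create_security_group('http', 'web access to render results')
# let your own machine reach the web server and ssh on the instances
sg.authorize(ip_protocol='tcp', from_port=80, to_port=80, cidr_ip='1.2.3.4/32')
sg.authorize(ip_protocol='tcp', from_port=22, to_port=22, cidr_ip='1.2.3.4/32')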
At this point it should be possible to run ec2start.py, visit the IP address of the head node, and watch the results come in. The ec2start script launches a few instances: one head node that creates noise seeds to send to the worker nodes via SQS, and then waits for the workers to process the seeds and send SQS messages back. The head node then copies the result files and renders the graphics, copying the latest results to a folder that index.html can see for web display.
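The SQS exchange itself is only a few lines with boto; a stripped-down sketch of the pattern, with a made-up queue name ('seeds') rather than the exact code from ec2start.py:

import boto
from boto.sqs.message import Message

sqs = boto.connect_sqs()
q = sqs.create_queue('seeds')  # returns the existing queue if it's already there

# head node: send a noise seed to the workers
m = Message()
m.set_body('42')
q.write(m)

# worker node: pull a seed, and delete the message once it's processed
m = q.read(visibility_timeout=60)
if m is not None:
    seed = int(m.get_body())
    q.delete_message(m)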
My code is mainly for demonstration, so here are the key things I did that will help with alternate applications:
Custom AMI
You can use the AMI I created with the id 'ami-2bfd1d42'. I used one of the Alestic Ubuntu AMIs and added Java, Xvfb, Boto, and a web server (lighttpd); I forget whether Xvfb was already installed or not.
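Launching instances from that AMI with boto looks roughly like this (the key name and counts are whatever your setup uses):

import boto
conn = boto.connect_ec2()
reservation = conn.run_instances('ami-2bfd1d42', min_count=4, max_count=4,
                                 key_name='lucasw', security_groups=['http'])
for instance in reservation.instances:
    print(instance.id)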
Headless rendering
The EC2 instances lack graphics contexts at first, and trying to run a graphical application like an exported Processing project will not work (TBD: did I ever try that?). Xvfb creates a virtual framebuffer that Processing can render to after running these commands:
Xvfb :2
export DISPLAY=:2
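After that, the exported Processing application can be launched in the same shell and it will render into the virtual framebuffer, e.g. (the sketch name here is made up):

./my_sketch/my_sketch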
Launching processes and detaching from them
I frequently use Python's subprocess.Popen to execute commands on the instances, like this:
cmd = "Xvfb :2"
whole_cmd = "ssh -i ~/lucasw.pem root@" + dns_name + " \"" + cmd + "\""
proc = subprocess.Popen(whole_cmd, shell=True, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
(stdout,stderr) = proc.communicate()
The problem is when one wants to run something, close the connection, and leave it running; Xvfb above needs to start and stay running. One method is to leave the ssh connection open, but there is a limit of about 20 ssh sessions.
The trick is to use nohup:
cmd = "nohup Xvfb :2"
Don't put extra quotes around the command to execute, which brings me to the next topic.
Quote escaping
There are a few bash commands that require parts to be in quotes, but in Python the bash command is already in quotes, and Python will not understand the inner set of quotes unless they are escaped with a backslash:
cmd = "echo \"blah\" > temp.txt";
Then at other times an additional level of quote escaping is required:
cmd = "echo \\\"blah\\\" > temp.txt";
(I do this when I pass the whole cmd variable to be executed by ssh, and ssh wants it in quotes.)
One backslash escapes one level of quoting; three escape two levels, because the escaping backslash itself needs to be escaped. This gets confusing fast, and some experimentation with Python in interactive mode is required to get it right.
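A quick way to see what's going on is to print the strings and trace what each level receives; for instance:

cmd = "echo \\\"blah\\\" > temp.txt"
whole_cmd = "ssh -i ~/lucasw.pem root@" + dns_name + " \"" + cmd + "\""
# whole_cmd is now: ssh -i ~/lucasw.pem root@<host> "echo \"blah\" > temp.txt"
# the local shell strips the outer quotes and turns \" into ",
# so the remote shell runs: echo "blah" > temp.txt
print(whole_cmd)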
Config file driven
It's not currently, or not as much as it needs to be, which makes it very brittle: changing the plots requires making about three different edits, when a single source config file should specify it all.
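One way to get there would be a single file that the scripts and the plotting all read from; a rough sketch with Python's ConfigParser (the file name, sections, and keys here are invented):

import ConfigParser  # renamed configparser in Python 3

config = ConfigParser.ConfigParser()
config.read('ec2start.cfg')
num_workers = config.getint('cluster', 'num_workers')
plot_names = config.get('plots', 'names').split(',')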