Apr 03, 2013
 

For the past few years websites have really grown in number. If expired domains are any indication, there are more websites born than babies on this planet every day. Or perhaps more websites dying than people. OK, I made those stats up, but hey, 79% of all stats are made up, right?

Well, what I have seen is that it is not websites that have become popular over the past few years. The trend has been towards services. I am not just thinking SaaS and the like; those are covered by “whitepapers” already. I am thinking nexusmods, a simple service to download game mods. I run a fairly new games news portal and forum (like 3 months old…. remember babies) over at http://gamingio.com My interest in games goes far back… really far, like 1985 far. I work on web services every day at work, but I want to see some websites, so I make them. At the end of the day what really matters is the service, the value add. Websites have done that and it’s getting old. Stuff like news, blogs etc. has been done to death. There is still ample room for new websites with new ideas, but the real winner is the web service.

Steam DB is amazing; see the schema someone has drawn up (ref: http://gamingio.com/2013/03/mortal-kombat-9-and-many-other-games-steam-achievements-spotted/ ). It has so much information to play around with. Steam has a usable web service.

Web services are becoming the norm for many hosted apps as well. For example, when I use Zendesk or AWS I barely access the site; most of it is done via web services. There is no standard to these things, web services still have a lot of ground to cover, and everyone has their own idea of how to do things.

 

A few months back I started working on my own web service. Nothing big, just something I wanted to use personally, inspired by nexusmods and Steam DB. The idea is to put together a database of games and mods and allow it to be publicly accessible, with data submitted, merged etc. Anyone can query it. I could enable file downloads as well, and in my tests so far it all works OK. I just don’t have any background in Windows or desktop software development to make a GUI anyway. Perhaps others can find it useful.

I don’t have any game data yet but I will be making the API publicly available. I am hoping to get volunteers to help fill in the gaps.

In the next post I will put up the URL and some sample queries. I don’t yet have a way of letting everyone send game data for the DB, but I can work on that.

 

Another thing I realized is that the GamingIO forums do have an API from IPB (the forum software we use), but I haven’t looked into whether it has a downloads API. Either one should work for my purposes, though the forum one would be simpler.

Dec 20, 2012
 

I have had this site hosted on Redhat Openshift for almost a year. Considering I got this hosting free (and you can too), I was a bit apprehensive about whether I should move the site elsewhere. I let it be here anyway. The surprise is that not only do I get awesome performance, but the uptime has been incredible. I was going to set this blog up on an AWS EC2 micro instance. However, a micro instance being what it is, it would have cost as much as a Linode VPS and would be on a time-shared CPU. That can get very annoying, as you find that on and off the site gets slower.

Redhat Openshift offers a free micro instance equivalent, the difference being that you are probably on a much bigger instance, since the PaaS runs atop the AWS cloud, making this setup akin to a VPS. This makes more sense than spending on a Small instance or settling for a Micro. In fact I don’t recommend the Micro instance at all for any purpose other than testing, compiling or other such processes where on-demand CPU is not necessary.

Comparatively, my Linode VPS, which costs me $40 a month, is not doing as well in terms of performance when I use it to host WordPress sites. I am still not clear why the memory usage and swap are higher on the Linode VPS (perhaps the CPU is oversubscribed), but this Openshift instance has 512 MB of RAM and is doing just great.

If you are a developer who does not want to get into the hassle of setting up servers and services and just wants to get down to coding, I recommend you give Redhat Openshift a try. You will not be disappointed, especially if you build sites for your clients.

Considering the way things are changing with PaaS and cloud, and the price being what it is, I wonder why I still put up with my Dreamhost account, which is barely usable and hosts thousands of users and sites on a single server. I could not even do basic PHP development and testing on it. The same goes for pretty much any webhost or reseller like Godaddy, Media Temple and so on.

 

Since I also use Google App Engine for development and learning, it is worth adding why you would choose Redhat over App Engine. Familiarity is possibly number 1. Granted, App Engine supports MySQL now, but it remains that you have shell access to your instance on Openshift, much as you would on your own instance. You can also access some basic metrics, and new services are being built on Openshift all the time. Check out the recently launched Websockets beta here: https://openshift.redhat.com/community/blogs/newest-release-websockets-port-forwarding-more

 

My favorite Python web framework Flask is effing supported as well: https://openshift.redhat.com/community/get-started/flask . I cannot describe how much pain is involved in hosting these Python apps on just about any distro. I think I am going to set up my Flask sites over at Openshift. Of course Django is supported too, though I have tried neither of them there yet.
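For the uninitiated, this is roughly all the code a deployable Flask app needs (a minimal sketch only; the quickstart linked above covers the actual Openshift wiring):

# A minimal Flask app, just to show what you would be deploying;
# the route and message are placeholders.
from flask import Flask

app = Flask(__name__)

@app.route('/')
def index():
    return 'Hello from Flask on Openshift!'

if __name__ == '__main__':
    app.run()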

 

Now that I am confident about Openshift, here are the things I would like to learn to get to a production deployment of my Python projects.

  1. How do I add SSL certificates?
  2. How do I enable autoscaling? (I suppose this is just to do with your AWS account, and of course it seems you need Openshift Enterprise for it.)
  3. How do I use my existing RDS with Openshift (and securely)?
  4. I am sure there is a whole bunch of things I haven’t thought of yet.
All of the above is in the docs somewhere.
Concern: Given that Redhat’s own enterprise support is not highly regarded among devs and ops, I wonder what Openshift “Enterprise” will do for us.
Truth is, I do not have experience with Redhat Enterprise Support or with the Openshift Enterprise service. The question is, should I be the one to get that first-hand experience? Or bet my job on it? That would take some daring. xD

 

Dec 18, 2012
 

Despite most of my work involving PHP setups, I have found Python to be the most useful tool for a whole bunch of supporting tasks. One of them is running commands or deploying packages to Amazon EC2 instances.
For a really large setup this is a very cool way to get the job done. Imagine that you have 10-12 servers and autoscaling tends to change the number of servers every now and then. Let’s say you wanted to git-update all the servers with the latest copy of the code, or restart a process. You can do this with one SSH command. Yes, but how on all the servers? So you search for “parallel SSH”. Seems all fine and dandy until you realize you still need to list all the hostnames. “Parallel SSH, why you no just read my mind?” We are going to make something like parallel SSH really quickly that works on AWS and is easy to bend to whatever you want it to do.

This is also, well, cross platform I suppose; all you need to be able to do is run Python (2.5 or higher). I am not going into in-depth details. I want to show you how you can do this yourself and make you a believer. Then of course I recommend you do further reading. There is a lot of literature out there but no working examples that do what I am showing you here. Once you get the idea, you will be an unstoppable crazy lunatic and be quite pleased with your megalomaniac self. Back to reality….

 

Prepare

Fabric: prepare your Python install by installing this package.

Boto: Next you need the Boto packages for Python. Install that too.
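Both are on PyPI, so assuming you have pip handy this is typically all it takes:

$ pip install fabric boto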

Get your AWS security keys with at least permission to read information about ALL EC2 instances. You don’t need any more than that if you just want to SSH into the systems.

Also prepare your SSH key, of course. Place it anywhere and now you can begin writing some code.

There are two parts to this.

1st part:

Use Boto to choose your EC2 Instances.

All instances have some attributes. Plus, good DevOps folks always tag their instances… you tagged your instance, didn’t you? Well, no matter.

Code below (ignore the Fabric references, we’ll get to those in a bit).

fabfile.py (the name fabfile.py is important or it won’t work)

import boto
from fabric.api import env, run, parallel

AWS_ACCESS_KEY_ID = 'GET_YOUR_OWN'
AWS_SECRET_ACCESS_KEY = 'GET_TO_DAT_CHOPPA'

def set_hosts():
    from boto.ec2.connection import EC2Connection

    ec2conn = EC2Connection(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)
    i = []
    # get_all_instances() returns reservations; each reservation
    # holds one or more instances
    for reservation in ec2conn.get_all_instances(filters={'key-name': 'privatekey',
                                                          'instance-state-name': 'running',
                                                          'root-device-type': 'instance-store'}):
        # print reservation.instances  # (for debug)
        for host in reservation.instances:
            # build the user@hostname string for SSH to be used later
            i.append('USERNAME@' + str(host.public_dns_name))

    return i

env.key_filename = ['/path/to/key/privatekey.pem']
env.hosts = set_hosts()

Quick explanation: I have used a couple of filters above, clearly outlined via the keyword “filters” in the code. Here I chose to filter by private key name for a bunch of servers that are in the state “running” (always good to have) and whose root-device-type is instance-store. Now, if you had tagged your servers, the key/value filter would look like this:

{'tag:Role': 'WobblyWebFrontend'}
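Dropped into the same call, that would look something like this (the tag name and value are made up; instances launched by autoscaling automatically carry an aws:autoscaling:groupName tag you can filter on the same way):

# filter by a tag instead; 'Role' / 'WobblyWebFrontend' are placeholders
reservations = ec2conn.get_all_instances(
    filters={'tag:Role': 'WobblyWebFrontend',
             'instance-state-name': 'running'})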

 

You can use the reference here to find more filters. We are basically filtering by instance metadata, i.e. what you would get from the ec2-describe-instances command. You can even use the autoscaling group name. The idea is for you to select the servers you want to run a particular command on, so that is left to you. “I can only show you the code, you must debug it on your own”. lulz
2nd part:

Now, assuming I am right and you have chosen all the servers that you need to run your command on, we are going to write those commands in the same file.

Notice that the function set_hosts returns a list of hostnames. That’s all we needed.

continued…fabfile.py

@parallel
def uptime():
    # env.hosts is already set above, so run() targets every host
    run('uptime')

 

…and we are done with coding. No really.

cd into the directory where you saved fabfile.py and run the program like so:

$ fab uptime

Satisfying splurge of output on the command line follows….

No wait! What happened here?
When you invoke the command “fab”, Fabric looks for fabfile.py and runs the function that matches the first argument. So you can keep writing multiple functions for, say, “svn checkout”, “wget” or “shutdown now”, whatever.
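For instance, a second task along the same lines might look like this (the path below is a made-up placeholder; adapt to taste):

continued…fabfile.py

@parallel
def update_code():
    # hypothetical task: pull the latest code on every host;
    # /var/www/myapp is a placeholder path
    run('cd /var/www/myapp && git pull')

You would invoke it as “fab update_code”.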

The @parallel decorator before the function tells Fabric to execute the command “SIMULFKINGTANEOUSLY” on all servers. That is your parallel SSH.

May the force be with you. Although I know the look on your face right now.

Aug 31, 2012
 

The scenario is this:

  • The Django app is running on an instance with a web server and has no SSL installed.
  • The SSL cert is installed on the ELB, and the ELB is accepting requests for the Django app (which is still non-SSL).

 

 

The problem here is that the URLs Django generates are not secure (i.e. HTTP), and Django is not enforcing secure mode.

For this we can use a piece of Django middleware. Example code:

 

class ELBMiddleware(object):
    def process_request(self, request):
        # the ELB terminates SSL and passes the original scheme
        # along in the X-Forwarded-Proto header
        if 'HTTP_X_FORWARDED_PROTO' in request.META:
            if request.META['HTTP_X_FORWARDED_PROTO'] == 'https':
                # make request.is_secure() report True behind the ELB
                request.is_secure = lambda: True
        return None

 

Remember to save this middleware in your Django directory and enable it in settings.py. You know how, right? Hint: filename.ClassName 🙂
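In case you don’t: assuming you saved the class above in a file called elb_middleware.py (my made-up name, pick your own), the settings.py entry would look something like this:

# settings.py -- add the middleware class to the stack
MIDDLEWARE_CLASSES = (
    'django.middleware.common.CommonMiddleware',
    # ... your other middleware ...
    'elb_middleware.ELBMiddleware',
)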

Suggestions, ideas and improvements are welcome.