Wednesday, April 7, 2010

Running an HTTP Server in a Thread

I'm building a shop using Django and Satchmo, and I ran into an interesting problem while building tests for the Paypal payment module.

The user is sent off to Paypal to process the payment and is then returned to the success page. Paypal notifies the site of the payment through an 'ipn' URL on the shop. Faking the call to the ipn URL is easy enough in the test:

>>> postdata = dict(payment_status='Completed',
...                 memo=memo, txn_id='1234')
>>> response = self.client.post('/checkout/paypal/ipn/', postdata)

But a problem then arose, because the IPN Django view then requests verification from Paypal:

def confirm_ipn_data(data, PP_URL):
    # data is the form data that was submitted to the IPN URL.

    newparams = {}
    for key in data.keys():
        newparams[key] = data[key]

    newparams['cmd'] = "_notify-validate"
    params = urlencode(newparams)

    req = urllib2.Request(PP_URL)
    req.add_header("Content-type", "application/x-www-form-urlencoded")
    fo = urllib2.urlopen(req, params)

    ret = fo.read()
    if ret == "VERIFIED":
        log.info("PayPal IPN data verification was successful.")
    else:
        log.info("PayPal IPN data verification failed.")
        log.debug("HTTP code %s, response text: '%s'" % (fo.code, ret))
        return False

    return True

The problem here is opening the URL using urllib2: I couldn't use the http://testserver/ that is 'running' in the Django test code. So I tried to start up a simple HTTPServer instance, but this would lock up the test runner. Finally I came up with the following code, which does the job nicely:

# simple server to use as fake verification server
from BaseHTTPServer import BaseHTTPRequestHandler,HTTPServer
from threading import Thread

class PaypalHandler(BaseHTTPRequestHandler):
    """Simple handler to return a verified value for our paypal tests"""
    def do_POST(self):
        self.send_response(200, 'OK')
        self.send_header('Content-type', 'text/html')
        self.end_headers()
        self.wfile.write("VERIFIED")

class RunThread(Thread):
    def __init__(self, port):
        Thread.__init__(self)
        self.setDaemon(True)  # don't keep the test runner alive
        self.server = HTTPServer(('', port), PaypalHandler)
    def run(self):
        self.server.serve_forever()

server_thread = None

def start_server(port):
    global server_thread
    server_thread = RunThread(port)
    server_thread.start()

def stop_server():
    global server_thread
    if server_thread is not None:
        server_thread.server.shutdown()  # needs Python 2.6+
        server_thread = None

Now in my test setup I can call these helpers (the port just needs to match the PP_URL used in the test settings):

def setUp(suite):
    start_server(12345)

def tearDown(suite):
    stop_server()

And it works as expected.
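For reference, the same idea can be sketched on modern Python 3 (this is an illustration of the technique, not the original Python 2 code above): an `http.server` instance serving a canned "VERIFIED" reply from a daemon thread, so the test process is never blocked.

```python
# Python 3 sketch of a fake verification server running in a thread.
from http.server import BaseHTTPRequestHandler, HTTPServer
from threading import Thread
import urllib.request

class FakePaypalHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get('Content-Length', 0))
        self.rfile.read(length)                 # consume the posted form data
        body = b"VERIFIED"
        self.send_response(200, 'OK')
        self.send_header('Content-type', 'text/html')
        self.send_header('Content-length', str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, format, *args):
        pass                                    # keep test output quiet

server = HTTPServer(('127.0.0.1', 0), FakePaypalHandler)  # port 0 = any free port
port = server.server_address[1]
thread = Thread(target=server.serve_forever, daemon=True)
thread.start()

# the code under test would POST its IPN data here instead of to Paypal
reply = urllib.request.urlopen(
    'http://127.0.0.1:%d/ipn' % port,
    data=b'cmd=_notify-validate').read()

server.shutdown()
thread.join()
print(reply)
```

Binding to port 0 lets the OS pick a free port, which avoids hard-coding one into the tests.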

Sunday, February 28, 2010

Snow Leopard and Zope3

I've been required to revisit an old Zope application built using buildout. The application versions are quite old (3.4.* mostly) and include old dependencies such as lxml 1.3.1 and ZODB3 3.8, amongst others.

As the buildout was running I noticed compile errors, and finally it would grind to a halt when importing, say, `_zope_proxy_proxy`, which is a compiled .so file.

To cut a long story short, the solution for me was to edit `lib/python2.6/config/Makefile` (naturally this is within a `virtualenv`) and change the lines:

CC= /usr/bin/gcc-4.2
CXX= /usr/bin/g++-4.2

to read:

CC= /usr/bin/gcc-4.0
CXX= /usr/bin/g++-4.0
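The compiler used to build extensions is read from exactly this config Makefile. You can check what your interpreter is configured with via the stdlib `sysconfig` module (shown here for modern Python; on Python 2.6 the equivalent lives in `distutils.sysconfig`):

```python
# Inspect the build-time compiler settings that distutils/buildout will use;
# these come from the same config/Makefile edited above.
import sysconfig

cc = sysconfig.get_config_var('CC')            # e.g. 'gcc'
makefile = sysconfig.get_makefile_filename()   # path to the config Makefile
print(cc)
print(makefile)
```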

Saturday, January 9, 2010

Repoze BFG on Google App Engine


Update 29/02/2010: I've gone in a different direction from what is detailed here; instead I am using the approach taken by the bridal-demo. It wasn't too much trouble to plug a repoze.bfg application into the demo code. Hopefully I will write up a post about how I did that in the near future.
I've come back to a Google App Engine application that I'd put aside for a while. I've been using repoze.bfg in other work, so I was determined to do the same here. There is a good tutorial for getting a repoze.bfg application running on GAE using appengine-monkey, which sets up a virtualenv within which the application can be developed. Great, I'm a fan of virtualenv. A point to remember is that dev_appserver cannot be run with the `virtualenv`/bin/python. I would forget that; so now I use a wrapper script to run the application.


I found it surprisingly difficult to set up a test environment for GAE. My requirements were firstly to be able to use doctests, and secondly to have coverage. Coverage comes with nose, so I generally use nose for testing. Initially it seemed that `virtualenv`/bin/easy_install nose coverage would do the job, until I came to testing models: no google.appengine, no datastore. NoseGAE seemed to be a good answer but wouldn't work well within the virtualenv (again, import problems). I also looked at gaeunit, but I can't see any benefit to running tests through the browser during development (I rarely even look at a web application that I'm developing - that's what tests are for, to save me the trouble).

However, code from gaeunit, along with this post, helped me a lot in developing my own solution.

The first part of the problem was getting sys.path set up correctly; I took what I needed from a couple of existing scripts to get sys.path in working order.

Secondly, I needed to have a test datastore available for testing models; I took a stanza from Dom's post (also used by gaeunit).

Thirdly, when I put() a model an error would be raised about a lack of `app_id`, so I solved that with a stanza from $GAE_PATH/tools/ to read app.yaml and set os.environ['APPLICATION_ID'].
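The third step boils down to pulling the `application:` value out of app.yaml and putting it in the environment. A minimal self-contained sketch of that idea (not the SDK's ReadAppConfig, and the app id below is a made-up example):

```python
# Minimal illustration: extract the application id from an app.yaml-style
# document and expose it via os.environ['APPLICATION_ID'].
import os

APP_YAML = """\
application: my-test-app
version: 1
runtime: python
"""

def read_application_id(text):
    # naive line scan; the real SDK uses a proper YAML parser
    for line in text.splitlines():
        if line.startswith('application:'):
            return line.split(':', 1)[1].strip()
    return None

app_id = read_application_id(APP_YAML)
os.environ['APPLICATION_ID'] = app_id
print(app_id)
```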

The resulting files then are
#!/usr/bin/env python

# a special test runner
import sys
import os

# {{ the first stanza copied from $GAE_PATH/
GAE_PATH = "/usr/local/google_appengine"
DIR_PATH = GAE_PATH
SCRIPT_DIR = os.path.join(DIR_PATH, 'google', 'appengine', 'tools')
EXTRA_PATHS = [
    DIR_PATH,
    os.path.join(DIR_PATH, 'lib', 'antlr3'),
    os.path.join(DIR_PATH, 'lib', 'django'),
    os.path.join(DIR_PATH, 'lib', 'webob'),
    os.path.join(DIR_PATH, 'lib', 'yaml', 'lib'),
]
sys.path = EXTRA_PATHS + sys.path
# }}

# {{ and the second from
here = os.path.dirname(__file__)
fixpaths = os.path.join(here, '')
# }}

from google.appengine.api import apiproxy_stub_map
from google.appengine.api import datastore_file_stub
from google.appengine.tools.dev_appserver import ReadAppConfig

# read application config file 
appinfo_path = os.path.join(here, 'app.yaml')
config = ReadAppConfig(appinfo_path)

# set application id so that objects can be stored in test database
os.environ['APPLICATION_ID'] = config.application

def main(module=None):
    original_apiproxy = apiproxy_stub_map.apiproxy
    try:
        apiproxy_stub_map.apiproxy = apiproxy_stub_map.APIProxyStubMap()
        temp_stub = datastore_file_stub.DatastoreFileStub('TestDataStore', None, None, trusted=True)
        apiproxy_stub_map.apiproxy.RegisterStub('datastore', temp_stub)
        # Allow the other services to be used as-is for tests.
        for name in ['user', 'urlfetch', 'mail', 'memcache', 'images']:
            apiproxy_stub_map.apiproxy.RegisterStub(name, original_apiproxy.GetStub(name))
        from nose.core import TestProgram
        testprogram = TestProgram(module=module)
    finally:
        apiproxy_stub_map.apiproxy = original_apiproxy

# accept module name to pass to test runner
if len(sys.argv) > 1:
    module = sys.argv[1]
else:
    module = None

if __name__ == "__main__":
    main(module)
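The core pattern in main() - swap a global stub registry for a temporary one, run the tests, and restore the original in a finally block - can be sketched independently of the GAE SDK. All the names below are illustrative stand-ins, not SDK APIs:

```python
# Illustrative sketch of the swap-stubs/run/restore pattern (not GAE SDK code).
class StubRegistry:
    """Stands in for apiproxy_stub_map.APIProxyStubMap."""
    def __init__(self):
        self._stubs = {}
    def RegisterStub(self, name, stub):
        self._stubs[name] = stub
    def GetStub(self, name):
        return self._stubs.get(name)

apiproxy = StubRegistry()                  # stands in for the global apiproxy
apiproxy.RegisterStub('datastore', 'real-datastore')

def run_with_stubs(fn):
    """Run fn with a stubbed registry, always restoring the original."""
    global apiproxy
    original = apiproxy
    try:
        apiproxy = StubRegistry()
        apiproxy.RegisterStub('datastore', 'file-stub')
        # reuse untouched services from the original registry
        for name in ['user', 'mail']:
            apiproxy.RegisterStub(name, original.GetStub(name))
        return fn()
    finally:
        apiproxy = original

seen = run_with_stubs(lambda: apiproxy.GetStub('datastore'))
print(seen)                           # the stub was active inside fn
print(apiproxy.GetStub('datastore'))  # the original registry is restored
```

The finally block is what makes this safe: even if the test run raises, later code sees the original registry.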

And for completeness here is setup.cfg

So far, so good. I do rather suspect that this remains incomplete.