Streaming Django Responses on Heroku

Django comes with a nice utility class for streaming a response back to the user. This is useful for relaying subprocess stdout, or for reporting progress while processing a file.

Not enough people use this helper!

Streaming Django Responses

Here's a small example "Valley Girl" stream:

import time
from django.http import StreamingHttpResponse

def _valley_girl_stream():
    # Get ready to be streamed 50 seconds of this nonsense
    for _ in range(50):
        yield "like, whatever\n"
        time.sleep(1)

def some_endpoint(request):
    return StreamingHttpResponse(_valley_girl_stream())

You pass StreamingHttpResponse a generator and it does all of the hard work for you. So, so handy!
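Under the hood, StreamingHttpResponse simply iterates whatever you hand it, lazily, writing each chunk out as it's produced. A quick plain-Python illustration (no Django needed) of why a generator is a good fit:

```python
import itertools

def _valley_girl_stream():
    for _ in range(50):
        yield "like, whatever\n"

stream = _valley_girl_stream()

# Nothing has executed yet; values are produced only as they're pulled,
# which is exactly how StreamingHttpResponse consumes the generator.
first_three = list(itertools.islice(stream, 3))
print(first_three)  # ['like, whatever\n', 'like, whatever\n', 'like, whatever\n']
```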


Watch out, though, for problems with your WSGI server buffering the data.

Buffering problems

For example, with Waitress and this code:

def _watch_process_import(xml_data):
    yield "Starting..."
    for message in process_import(xml_data):  # stand-in for your actual import logic
        yield message
    yield "Done!"

def do_import(request):
    if request.method == 'POST':
        xml_data = request.FILES['file'].read()
        return StreamingHttpResponse(_watch_process_import(xml_data))

If you run this with a stock Waitress config, nothing reaches the client until the response is finished and the connection is flushed and closed. To make the messages actually stream in real time, I first had to modify the generator:

def _watch_process_import(xml_data):
    # this fills the buffer so the messages start streaming
    yield "@" * 50 * 1024

    # and now we get back to business...
    yield "Starting..."

Obviously, that's not very pretty and makes us feel dirty for having to do it.
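If you're ever stuck on a server you can't reconfigure, the padding trick can at least be isolated in one place instead of being copy-pasted into every generator. A minimal sketch (the `with_buffer_padding` name and the 50 KB default are mine, not part of Django or Waitress):

```python
def with_buffer_padding(stream, pad_bytes=50 * 1024):
    # Prepend filler so a buffering server fills its output buffer
    # immediately and starts flushing; then pass chunks through untouched.
    yield "@" * pad_bytes
    yield from stream

# The real messages come through unchanged after the padding chunk:
padded = list(with_buffer_padding(iter(["Starting...", "Done!"]), pad_bytes=8))
print(padded)  # ['@@@@@@@@', 'Starting...', 'Done!']
```

In a view you'd wrap the generator at the call site, e.g. StreamingHttpResponse(with_buffer_padding(_watch_process_import(xml_data))).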

So, to fix that properly, add the --send-bytes=1 argument for Waitress in your Procfile, like this:

web: waitress-serve --port=$PORT --send-bytes=1 wsgi:application

That makes Waitress flush the buffer as soon as it contains at least one byte, i.e. on every chunk, with no delay!

— 17 May 2016