python - Soft Memory Limit Exceeded while serving large files with Django


I continually hit "soft private memory limit exceeded" errors while attempting to serve large streaming video files from an App Engine instance (running Django 1.5).

Sample code:

def stream_file(request, blob_key):
    blob_reader = blobstore.BlobReader(blob_key, buffer_size=1048576)
    content_type = 'video/mp4'
    return http.StreamingHttpResponse(blob_reader, content_type=content_type)

My example serves via the Blobstore API, but I've experienced the same problem using the GCS client library and when building my own generator function.
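A generator of that kind would look something like the sketch below (a minimal illustration, not the poster's actual code; `iter_chunks` and the chunk size are assumed names/values). Yielding one chunk at a time means only `chunk_size` bytes are resident at once, so on its own this pattern should not exhaust memory:

```python
import io

def iter_chunks(fileobj, chunk_size=1 << 20):
    """Yield successive chunks from a file-like object.

    Only one chunk is held in memory at a time, so iterating the
    result never materialises the whole file.
    """
    while True:
        chunk = fileobj.read(chunk_size)
        if not chunk:
            break
        yield chunk
```

Passing such a generator (rather than a fully-read string) to `StreamingHttpResponse` is what keeps the instance's memory bounded by the chunk size.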

How can I efficiently serve large files without exceeding the soft memory limit?

I noticed the instance appears to read the whole blob into memory before sending it out. Should I consider using a handler that has App Engine send the file directly from Blobstore?

https://cloud.google.com/appengine/docs/python/blobstore/#python_serving_a_blob
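For reference, the approach in the linked docs amounts to setting the `X-AppEngine-BlobKey` header on the response, after which App Engine's serving infrastructure streams the blob itself and the instance never buffers the bytes. A minimal sketch, with a plain dict standing in for Django response headers since the GAE SDK is only importable inside App Engine (`blob_serving_headers` is an illustrative helper name):

```python
def blob_serving_headers(blob_key, content_type='video/mp4'):
    """Headers that tell App Engine to serve a blob directly.

    App Engine intercepts X-AppEngine-BlobKey on outgoing responses and
    streams the named blob itself, bypassing instance memory entirely.
    In a Django view you would set these on an (empty-bodied)
    HttpResponse instead of returning the blob's contents.
    """
    return {
        'X-AppEngine-BlobKey': str(blob_key),
        'Content-Type': content_type,
    }
```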
