
Compress large string into JSON-serializable object for AWS Lambda function output

I have a large string that I would like to send as the response of a Python Lambda function on AWS. The documentation states that the response must be JSON serializable (it must pass through json.dumps). My string is roughly 12 MB, which is larger than the maximum payload AWS allows. If I gzip the string it compresses to roughly 2 MB, but the resulting bytes object is no longer JSON serializable.

Here is a minimal example:

import gzip
import json

largeString = b"sdlfkjs dlfkjs dflkj "
compressed = gzip.compress(largeString)  # gzip.compress returns bytes
out = {}
out['data'] = compressed
json.dumps(out)  # fails: bytes are not JSON serializable

which returns the expected error:

TypeError: Object of type bytes is not JSON serializable

Use base64.b85encode(). Base85 maps every 4 bytes to 5 ASCII characters, so it expands the compressed data by 25%; your 2 MB becomes roughly 2.5 MB, which hopefully still fits within the payload limit.
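A minimal sketch of the full round trip, building on the example from the question (the receiver-side decoding shown here is an assumption about how you would consume the payload, not part of the original question):

```python
import base64
import gzip
import json

largeString = b"sdlfkjs dlfkjs dflkj " * 1000

# Compress, then Base85-encode; decode("ascii") yields a str that
# json.dumps accepts without complaint.
compressed = gzip.compress(largeString)
encoded = base64.b85encode(compressed).decode("ascii")

payload = json.dumps({"data": encoded})

# On the receiving side, reverse each step to recover the original bytes.
recovered = gzip.decompress(base64.b85decode(json.loads(payload)["data"]))
assert recovered == largeString
```

Note that base64.b64encode would also work but expands the data by about 33% instead of 25%, so Base85 is the tighter choice here.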

Note that compression may not be a robust solution to your problem: if your data gets larger or less compressible, you may still exceed the payload limit.
