
Compress large string into json serializable object for AWS lambda function output


I have a large string that I would like to send as the response from a Python Lambda function on AWS. The documentation states that the response must be JSON serializable (it must run through json.dumps). My string is roughly 12MB, which is larger than the maximum allowed payload on AWS. If I gzip the string it compresses to roughly 2MB, but then the object is no longer JSON serializable.

Here is a minimal example:

import gzip
import json

largeString = b"sdlfkjs dlfkjs dflkj "
compressed = gzip.compress(largeString)

out = {}
out['data'] = compressed  # bytes, which json.dumps cannot serialize
json.dumps(out)

which returns the expected error:

TypeError: Object of type bytes is not JSON serializable

Use base64.b85encode(). This will expand the compressed data by about 25%, so hopefully the resulting ~2.5MB still fits.
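
A minimal sketch of the round trip, reusing the example from the question (the 'data' key and the receiving-side code are illustrative, not part of any AWS API):

import base64
import gzip
import json

largeString = b"sdlfkjs dlfkjs dflkj "
compressed = gzip.compress(largeString)
# b85encode returns bytes; .decode() gives an ASCII str that json.dumps accepts
encoded = base64.b85encode(compressed).decode("ascii")

out = {}
out['data'] = encoded
payload = json.dumps(out)  # no TypeError now

# On the receiving side, reverse the steps:
decoded = gzip.decompress(base64.b85decode(out['data']))
assert decoded == largeString

b85encode maps every 4 bytes of binary data to 5 ASCII characters, which is where the roughly 25% size increase comes from.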

Note that compressing may not be a robust solution to your problem. If your data gets larger or less compressible, you may still bust the payload limit.
