I want to pickle my objects and compress them with python-snappy. Because these objects are large, I want something streaming -- i.e., pickle.dump, not pickle.dumps.
Unfortunately, the snappy and pickle APIs don't seem particularly compatible:
snappy.stream_compress(src, dst, blocksize=65536)
pickle.dump(obj, file, protocol=None)
Any thoughts on what magic I would need for something like snappy.stream_compress(pickle.dump_magic(obj), dst) to work similarly to what pickle.dump(obj, dst) does today?
The simplest approach is to pickle.dump to a temporary file, and then snappy.stream_compress that file:
import os
import pickle
import tempfile

import snappy

def snappy_pickle_dump(obj, path):
    # Pickle to a temporary file first, then stream-compress it.
    fd, tmp_path = tempfile.mkstemp()
    try:
        with os.fdopen(fd, 'wb') as tmp:
            pickle.dump(obj, tmp)
        with open(tmp_path, 'rb') as src, open(path, 'wb') as dst:
            snappy.stream_compress(src, dst)
    finally:
        os.remove(tmp_path)
If you must avoid storing the entire uncompressed dump, you can have one thread pickle.dump-ing the object into a file-like buffer while another thread snappy.stream_compress-es it. Note that pickle writes bytes, so in Python 3 the buffer would be an os.pipe or io.BytesIO rather than a StringIO.
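A minimal sketch of that threaded approach, using an os.pipe so the reader and writer can run concurrently without buffering the whole pickle in memory. The helper names pickle_to_stream and snappy_pickle_dump are my own, not part of either library:

```python
import os
import pickle
import threading

def pickle_to_stream(obj, consume):
    # Feed pickle.dump's output through an OS pipe to consume(readable_file),
    # so the consumer sees a stream instead of one giant bytes object.
    rfd, wfd = os.pipe()

    def writer():
        # Closing the write end signals EOF to the reader.
        with os.fdopen(wfd, 'wb') as wf:
            pickle.dump(obj, wf)

    t = threading.Thread(target=writer)
    t.start()
    with os.fdopen(rfd, 'rb') as rf:
        consume(rf)
    t.join()

def snappy_pickle_dump(obj, path):
    # Assumes python-snappy is installed.
    import snappy
    with open(path, 'wb') as out:
        pickle_to_stream(obj, lambda rf: snappy.stream_compress(rf, out))
```

The pipe's kernel buffer is small (typically 64 KiB), so the writer thread blocks until stream_compress consumes data, which keeps peak memory bounded regardless of the object's pickled size.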