
Is there any way to stream the data to the server using Python requests module?

Let's say I'm sending a very large (but finite) amount of data that keeps growing in real time. I'd like to stream it to the server to avoid running out of memory.

When there's no more data to send, I append an 'EndOfBatch' line to the body, so my server knows when to stop listening for more data. [ that part isn't the problem ]

Here's a quick example:

import time
from io import BytesIO

import requests

stream = BytesIO(b"Foo")

requests.post("http://127.0.0.1:8085", data=stream)
# -> Server received the initial request body - "Foo"

time.sleep(1)
# -> Processing more data to be sent

stream.write(b"Bar")
stream.flush()
# -> Server receives another 'Bar' part

But of course, it doesn't work. How should I tackle this problem? This case has been covered in many other languages, but I don't know how to do it in Python.

And no, I can't modify my server's source code - it works fine - I just need to stream the data from the Python side, but I have no idea how to do that.

I tried using the requests module by putting the BytesIO object in the data argument of requests.post, as described at https://requests.readthedocs.io/en/latest/user/advanced/#streaming-uploads

I expected that writing more data to the stream would send it right away, but it didn't.

Using a chunk-encoded request helps. When you pass a generator as data, requests sends the body with Transfer-Encoding: chunked, pulling and transmitting each yielded chunk as it becomes available:

import time
import requests


def data_generator():
    yield b"Foo"
    time.sleep(5)
    yield b"Bar"


requests.post("http://127.0.0.1:8085", data=data_generator())
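For data that keeps growing in real time (rather than a fixed script of yields), one common pattern is to back the generator with a queue: a producer pushes chunks as they arrive, and the generator blocks until the next chunk is available, ending on a sentinel. A minimal sketch - the names chunk_feed, producer, and SENTINEL are illustrative, not part of any API; for demonstration the generator is drained locally instead of being posted:

```python
import queue
import threading

SENTINEL = None  # pushed by the producer to signal "no more data"

def chunk_feed(q):
    """Generator that yields chunks from the queue until the sentinel arrives."""
    while True:
        chunk = q.get()  # blocks until the producer supplies more data
        if chunk is SENTINEL:
            return  # ending the generator closes the chunked upload
        yield chunk

q = queue.Queue()

def producer():
    # Stands in for whatever process generates data in real time.
    q.put(b"Foo")
    q.put(b"Bar")
    q.put(b"EndOfBatch")  # the marker the server watches for
    q.put(SENTINEL)

threading.Thread(target=producer).start()

# In real use, hand the generator straight to requests:
# requests.post("http://127.0.0.1:8085", data=chunk_feed(q))
# For illustration, drain it locally instead:
received = b"".join(chunk_feed(q))
```

This keeps memory usage bounded by the queue size: chunks are handed to the socket as they are produced, never accumulated in a single buffer.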
