
Streaming download of a large file with python-requests keeps getting interrupted

I have a problem with streaming the download of a large file (about 1.5 GB) using python-requests v2.0.1:

with open("saved.rar",'wb') as file:
    r = session.get(url,stream=True,timeout=3600)
    for chunk in r.iter_content(chunk_size=1024):
        if chunk:
            file.write(chunk)
            file.flush()

I tested it a few times on my VPS and sometimes it downloaded 200 MB, 500 MB or 800 MB and saved the file without any error. It never reached the timeout; it just stopped as if the download had finished.

The host I'm downloading this file from is stable, because I have no problems downloading the same file in a browser.

Is there any way to download a large file with python-requests and be 100% sure it's the whole file?

@Edit

I've solved it using urllib; the problem only occurs with requests. Anyway, thanks for the help.
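
For reference, a minimal sketch of what a urllib-based download might look like (the original post doesn't show the exact code; the url variable and output filename are carried over from the question):

import urllib

# Python 2: urllib.urlretrieve streams the response to a file in one call.
# In Python 3 the equivalent is urllib.request.urlretrieve.
urllib.urlretrieve(url, "saved.rar")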

There might be several issues that cause a download to be interrupted: network problems, etc. But the server tells us the file size before the download starts, so to check whether you have downloaded the whole file you can compare the number of bytes you saved against the Content-Length header. Using urllib:

import urllib

# Python 2 syntax; in Python 3 use urllib.request.urlopen() and .getheader()
site = urllib.urlopen("http://python.org")
meta = site.info()
print meta.getheaders("Content-Length")

Using requests:

import requests

r = requests.get("http://python.org")
r.headers["Content-Length"]
