I am reading a 1 GB TXT file whose content is a long list of URL records. I want to call a function that returns 20 records at a time. How can the code be modified so that each subsequent call returns the next 20 records? That is what confuses me.
from itertools import islice

def iter_list(start, stop):
    url = []
    with open("domain.txt") as file:
        for line in islice(file, start, stop):
            url.append(line)
    return url
import requests

def get_html(url):
    req = requests.get(url)
    print(req.status_code)

"""
I want to take 20 records at a time and process the next 20 after the previous
batch finishes. I don't know if this wording can be understood.
"""

url = iter_list(0, 20)   # islice is 0-based, so start at 0 for the first 20 lines
for i in url:
    get_html(i.strip())
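One way to keep this `islice`-based approach is to track a running offset and advance it by 20 on every call. The sketch below is self-contained: it first writes a small sample `domain.txt` (45 fake URLs, an assumption for illustration) so the paging logic can be seen end to end.

```python
from itertools import islice

# Create a small sample file so the sketch runs on its own
# (stands in for the real 1 GB domain.txt).
with open("domain.txt", "w") as f:
    for n in range(45):
        f.write(f"http://example{n}.com\n")

def iter_list(start, stop, path="domain.txt"):
    """Return the stripped lines in [start, stop) of the file (0-based)."""
    with open(path) as file:
        return [line.strip() for line in islice(file, start, stop)]

# Advance the offset by 20 on each call to get the next batch.
offset = 0
batches = []
while True:
    batch = iter_list(offset, offset + 20)
    if not batch:          # empty batch means the file is exhausted
        break
    batches.append(batch)  # process the batch here instead of storing it
    offset += 20

print(len(batches))        # 45 lines -> 3 batches (20 + 20 + 5)
print(len(batches[-1]))    # 5
```

Note that each call reopens the file and skips `offset` lines from the top, so for a 1 GB file this rescans earlier data on every call; a generator that keeps the file open avoids that cost.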
from itertools import islice

def iter_list(start, stop):
    url = []
    tmp = []
    with open("domain.txt") as file:
        for line in islice(file, start, stop):
            url.append(line)
            if len(url) == 20:
                tmp.append(url)   # store the completed batch of 20
                url = []
    if url:
        tmp.append(url)           # keep any final partial batch
    return tmp

for i in iter_list(0, 100):
    print(i)