How to write more than one line to a file sequentially in Python
Please see the code below:
from sys import argv
from urllib2 import urlopen
from os.path import exists
script, to_file = argv
url = "http://numbersapi.com/random"
fact = 0
number = 0
print "Top 5 Facts of The World"
while fact < 5:
    response = urlopen(url)
    data = response.read()
    fact += 1
    number += 1
    print
    print "%s). %s " % (str(number), data)
print "Now, let us save the facts to a file for future use."
print "Does the output file exist? %r" % exists(to_file)
print "When you are ready, simply hit ENTER"
raw_input()
out_file = open(to_file, 'w')
out_file.write(data)
print "Alright, facts are saved in the repo."
out_file.close()
The problem with the code above is that when I open file1.txt, I only see one fact. As a variation, I moved everything into the while loop, which led to the same problem. I believe it writes one fact but then overwrites it with the next, so that only the last fact is saved.

What am I doing wrong?
`data` only keeps the last value assigned to it.
from sys import argv
script, to_file = argv
fact = 0
number = 0
out_file = open(to_file, 'w')
while fact < 5:
    data = str(fact)
    out_file.write(str(data) + '\n')
    fact += 1
    number += 1
    print
    print "%s). %s " % (str(number), data)
out_file.close()
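The same demonstration in Python 3 syntax (a minimal sketch; the file name `facts.txt` is just an example, not from the question):

```python
# Python 3 rendering of the demo above: writing inside the loop
# stores every value in the file, not just the last one.
out_path = "facts.txt"  # example file name

fact = 0
number = 0
with open(out_path, "w") as out_file:
    while fact < 5:
        data = str(fact)
        out_file.write(data + "\n")  # one value per line
        fact += 1
        number += 1
        print("%s). %s" % (number, data))

# The file now holds all five lines.
with open(out_path) as f:
    print(f.read().splitlines())  # ['0', '1', '2', '3', '4']
```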
You are overwriting `data` on every loop iteration. Try this:
out_file = open(to_file, 'w')
while fact < 5:
    response = urlopen(url)
    data = response.read()
    fact += 1
    number += 1
    print
    print "%s). %s " % (str(number), data)
    out_file.write(data)
    out_file.write('\n') # one fact per line
out_file.close()
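Since numbersapi.com may not be reachable while testing, here is a self-contained Python 3 sketch of the same write-inside-the-loop fix; the `fake_facts` list and the file name are invented for the demo, standing in for the API responses:

```python
# Same fix as above, with urlopen() replaced by a stub list so the
# sketch runs offline. fake_facts and facts_demo.txt are invented.
fake_facts = ["42 is the answer.", "7 is lucky.", "0 is even.",
              "13 is unlucky.", "3 is prime."]

fact = 0
number = 0
out_file = open("facts_demo.txt", "w")
while fact < 5:
    data = fake_facts[fact]       # stands in for urlopen(url).read()
    fact += 1
    number += 1
    print("%s). %s" % (number, data))
    out_file.write(data)
    out_file.write("\n")          # one fact per line
out_file.close()
```

Because the write happens inside the loop, all five facts end up in the file, one per line.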
It looks like you are overwriting `data` in the loop, so at the end only the last value remains. Try changing it to the following:
[...]
final_data = ''
while fact < 5:
    response = urlopen(url)
    data = response.read()
    fact += 1
    number += 1
    print
    print "%s). %s " % (str(number), data)
    final_data += data + '\n'  # newline keeps facts on separate lines
[...]
out_file.write(final_data)
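In isolation, the accumulate-then-write idea looks like this (a Python 3 sketch; the values and file name are invented for the demo):

```python
# Accumulate every fact into one string during the loop, then write
# the file once at the end.
final_data = ""
for data in ["first fact", "second fact", "third fact"]:
    final_data += data + "\n"    # newline keeps facts on separate lines

with open("facts_concat.txt", "w") as out_file:
    out_file.write(final_data)
```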
The problem is that you write to the file after the loop, so `data` holds only the data fetched by the last request. To fix this, store each `data` in a list, then write everything in the list like this:
for fact in data:
    out_file.write(fact + '\n')
You need to append each fetched fact like this:
data.append(response.read())
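Combining those two pieces, a runnable Python 3 sketch of the list approach (the network call is stubbed out; `fake_responses` and the file name are illustrative, not from the thread):

```python
# Collect each fact in a list during the loop, then write the whole
# list after the loop. fake_responses stands in for the API calls.
fake_responses = ["fact one", "fact two", "fact three"]

data = []                        # a list instead of a single string
for response in fake_responses:
    data.append(response)        # would be data.append(response.read())

with open("facts_list.txt", "w") as out_file:
    for fact in data:
        out_file.write(fact + "\n")
```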
Or, decide whether the facts should be written to the file before fetching them, and move the file handling like this:
with open(to_file, 'wb') as out_file:
    while fact < 5:
        response = urlopen(url)
        data = response.read()
        if should_write:
            out_file.write(data + '\n')
        fact += 1
        number += 1
        print
        print "%s). %s " % (str(number), data)
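One caveat if this last version is ported to Python 3: there, `urlopen().read()` returns bytes, and a file opened in 'wb' mode only accepts bytes, so the newline must be a bytes literal as well. A minimal sketch (the literal stands in for `response.read()`):

```python
# In Python 3, data from urlopen() is bytes, so a binary-mode file
# needs a bytes newline. b"some fact" stands in for response.read().
data = b"some fact"
with open("facts_bin.txt", "wb") as out_file:
    out_file.write(data + b"\n")
```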