How to write more than one line to a file sequentially in Python
See my code below:
from sys import argv
from urllib2 import urlopen
from os.path import exists

script, to_file = argv
url = "http://numbersapi.com/random"
fact = 0
number = 0

print "Top 5 Facts of The World"

while fact < 5:
    response = urlopen(url)
    data = response.read()
    fact += 1
    number += 1
    print
    print "%s). %s " % (str(number), data)

print "Now, let us save the facts to a file for future use."
print "Does the output file exist? %r" % exists(to_file)
print "When you are ready, simply hit ENTER"
raw_input()

out_file = open(to_file, 'w')
out_file.write(data)
print "Alright, facts are saved in the repo."
out_file.close()
The problem with the code above is that when I open file1.txt, I only see one fact. As a variation, I moved everything into the while loop, which led to the same problem. I believe it writes one fact but then overwrites it with the next, until only the last fact is saved.

What am I doing wrong?
`data` only keeps the last value assigned to it.
from sys import argv

script, to_file = argv
fact = 0
number = 0

out_file = open(to_file, 'w')
while fact < 5:
    data = str(fact)
    out_file.write(data + '\n')
    fact += 1
    number += 1
    print
    print "%s). %s " % (str(number), data)
out_file.close()
You overwrite `data` on every loop iteration. Try this:
out_file = open(to_file, 'w')
while fact < 5:
    response = urlopen(url)
    data = response.read()
    fact += 1
    number += 1
    print
    print "%s). %s " % (str(number), data)
    out_file.write(data)
    out_file.write('\n')  # one fact per line
out_file.close()
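The same fix (open the file once, write inside the loop) can be sketched in Python 3 without hitting the network — the fetch below is a stand-in for `urlopen(url).read()`, and the temp-file path is just for the demo:

```python
import os
import tempfile


def fetch_fact(n):
    # Stand-in for urlopen(url).read(); returns a canned fact string.
    return "fact number %d" % n


to_file = os.path.join(tempfile.mkdtemp(), "facts.txt")

out_file = open(to_file, "w")  # open once, before the loop
fact = 0
while fact < 5:
    data = fetch_fact(fact)
    fact += 1
    out_file.write(data)
    out_file.write("\n")       # one fact per line
out_file.close()

with open(to_file) as f:
    lines = f.read().splitlines()
```

Because the write happens inside the loop, `lines` holds all five facts instead of only the last one.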
It looks like you are overwriting `data` inside the loop, so at the end only the last fact remains. Try changing it to something like this:
[...]
final_data = ''
while fact < 5:
    response = urlopen(url)
    data = response.read()
    fact += 1
    number += 1
    print
    print "%s). %s " % (str(number), data)
    final_data += data + '\n'  # accumulate, with one fact per line
[...]
out_file.write(final_data)
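A self-contained Python 3 sketch of this accumulate-then-write approach (placeholder strings stand in for the API responses, since the surrounding snippet is elided):

```python
import os
import tempfile

# Stand-ins for the five response.read() results.
facts = ["fact A", "fact B", "fact C"]

final_data = ""
for data in facts:
    final_data += data + "\n"  # accumulate instead of overwriting

path = os.path.join(tempfile.mkdtemp(), "facts.txt")
with open(path, "w") as out_file:
    out_file.write(final_data)  # single write, after the loop

with open(path) as f:
    saved = f.read()
```

The single `write` call after the loop now contains every fact, because `final_data` grows on each iteration rather than being rebound.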
問題是您要在循環后寫入文件,以便data
指向最后獲取的URL數據。 要解決此問題,請將data
存儲在列表中,然后像這樣寫列表中的所有內容:
for fact in data:
out_file.write(fact + '\n')
You need to append each fetched fact like this:

data.append(response.read())
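Putting those two pieces together — appending in the loop, then writing the list out afterwards — a minimal runnable Python 3 sketch (with a stubbed fetch in place of `urlopen`, a demo-only assumption) looks like:

```python
import os
import tempfile


def fetch_fact(n):
    # Stand-in for response.read().
    return "fact %d" % n


data = []  # collect every fact, not just the last one
for n in range(5):
    data.append(fetch_fact(n))

path = os.path.join(tempfile.mkdtemp(), "facts.txt")
with open(path, "w") as out_file:
    for fact in data:
        out_file.write(fact + "\n")

with open(path) as f:
    lines = f.read().splitlines()
```

Writing after the loop is now safe, because the list preserves all five facts.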
Or, ask up front whether the facts should be written to a file, and move the file handling so it wraps the loop, like this:
with open(to_file, 'wb') as out_file:
    while fact < 5:
        response = urlopen(url)
        data = response.read()
        if should_write:
            out_file.write(data + '\n')
        fact += 1
        number += 1
        print
        print "%s). %s " % (str(number), data)