Python compare two csv files and append data to csv file
I have two csv files in the following format.
The first one is outputTweetsDate.csv:
Here is some text;13.09.13 16:45
Here is more text;13.09.13 16:45
And yet another text;13.09.13 16:46
The second file is apiSheet.csv:
13.09.13 16:46;89.56
13.09.13 16:45;90.40
I want to compare the two files and, whenever the two date/time values match, write the text and the value to a new file (finalOutput.csv):
|89.56|,|Here is some text|
|89.56|,|Here is more text|
|90.40|,|And yet another text|
This is the code I have so far:

    import csv

    with open("apiSheet.csv", "U") as in_file1, \
         open("outputTweetsDate.csv", "rb") as in_file2, \
         open("finalOutput.csv", "wb") as out_file:
        reader1 = csv.reader(in_file1, delimiter=';')
        reader2 = csv.reader(in_file2, delimiter='|')
        writer = csv.writer(out_file, delimiter='|')
        for row1 in reader1:
            for row2 in reader2:
                if row1[0] == row2[1]:
                    data = [row1[1], row2[0]]
                    print data
                    writer.writerow(data)
I have edited the code, and so far it works, but it does not iterate over everything correctly. At the moment my output looks like this:
|89.56|,|Here is some text|
|89.56|,|Here is more text|
So it does not show me the third line, even though the timestamps match. It seems not to iterate through the whole file.
Thanks!
The inner loop reaches the end of the second file (outputTweetsDate.csv) before the second row of file 1 is even read: a csv.reader is a one-shot iterator, so once it has been consumed it yields nothing on the next pass.
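The exhaustion can be reproduced with a minimal, self-contained sketch (hypothetical data, Python 3 with in-memory files standing in for the two CSVs):

```python
import csv
import io

# Hypothetical in-memory files standing in for the two CSV files.
f1 = io.StringIO("a;1\nb;2\n")
f2 = io.StringIO("x;1\ny;2\n")

reader1 = csv.reader(f1, delimiter=';')
reader2 = csv.reader(f2, delimiter=';')

matches = []
for row1 in reader1:
    for row2 in reader2:  # reader2 is exhausted after the first outer pass
        matches.append((row1[0], row2[0]))

# Only the first row of file 1 is ever paired; for row 'b',
# reader2 yields nothing, so no further pairs are produced.
print(matches)
```

Running this prints `[('a', 'x'), ('a', 'y')]` rather than all four pairs, which is exactly the behaviour seen in the question.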
Try this snippet:
    import csv

    with open("apiSheet.csv", "U") as in_file1, \
         open("outputTweetsDate.csv", "rb") as in_file2, \
         open("finalOutput.csv", "wb") as out_file:
        reader1 = csv.reader(in_file1, delimiter=';')
        reader2 = csv.reader(in_file2, delimiter='|')
        writer = csv.writer(out_file, delimiter='|')
        row2 = reader2.next()
        for row1 in reader1:
            while row2 and row1[0] <= row2[1]:
                if row1[0] == row2[1]:
                    data = [row1[1], row2[0]]
                    print data
                    writer.writerow(data)
                row2 = reader2.next()  # note: raises StopIteration once file 2 is exhausted
EDIT: comparing in reverse order is tricky. Let's stop trying to be clever and just brute-force it. Since the file is far smaller than your RAM, this will work perfectly well.
    import csv

    with open("apiSheet.csv", "U") as in_file1, \
         open("outputTweetsDate.csv", "rb") as in_file2, \
         open("finalOutput.csv", "wb") as out_file:
        reader1 = csv.reader(in_file1, delimiter=';')
        reader2 = csv.reader(in_file2, delimiter='|')
        writer = csv.writer(out_file, delimiter='|')
        rows2 = [row for row in reader2]  # all the content of file 2 goes in RAM
        for row1 in reader1:
            for row2 in rows2:
                if row1[0] == row2[1]:
                    data = [row1[1], row2[0]]
                    print data
                    writer.writerow(data)
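On Python 3, the same brute-force join can be sketched with a dict lookup, so each timestamp is matched in O(1) instead of re-scanning the list. The in-memory `StringIO` objects below stand in for the real files from the question (swap them for `open("apiSheet.csv", newline="")` etc.); the `;` delimiter matches the sample data shown above:

```python
import csv
import io

# In-memory stand-ins for apiSheet.csv and outputTweetsDate.csv.
api_file = io.StringIO("13.09.13 16:46;89.56\n13.09.13 16:45;90.40\n")
tweets_file = io.StringIO(
    "Here is some text;13.09.13 16:45\n"
    "Here is more text;13.09.13 16:45\n"
    "And yet another text;13.09.13 16:46\n"
)
out_file = io.StringIO()  # stand-in for finalOutput.csv

# Build {timestamp: value} from the API sheet in one pass.
values = {row[0]: row[1] for row in csv.reader(api_file, delimiter=';')}

writer = csv.writer(out_file, delimiter='|')
for text, stamp in csv.reader(tweets_file, delimiter=';'):
    if stamp in values:  # O(1) lookup instead of an inner loop
        writer.writerow([values[stamp], text])

print(out_file.getvalue())
```

This keeps only the smaller lookup table in memory and touches each file exactly once, which also sidesteps the exhausted-iterator problem entirely.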