Python compare two csv files and append data to csv file
I have two csv files in the following format.
The first is outputTweetsDate.csv:
Here is some text;13.09.13 16:45
Here is more text;13.09.13 16:45
And yet another text;13.09.13 16:46
The second file is apiSheet.csv:
13.09.13 16:46;89.56
13.09.13 16:45;90.40
I want to compare the two files and, whenever the two date-time values match, write the text and the value to a new file (finalOutput.csv):
|89.56|,|Here is some text|
|89.56|,|Here is more text|
|90.49|,|And yet another text|
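(Editorial aside, not part of the original question.) If the `|` characters in the desired output are meant literally, note that they come from *quoting*, not from the delimiter: `csv.writer` with a comma delimiter, `quotechar='|'`, and `QUOTE_ALL` reproduces that line format exactly. A minimal sketch:

```python
import csv
import io

# The target line "|89.56|,|Here is some text|" is comma-delimited with
# every field wrapped in '|' -- that is quotechar/quoting, not delimiter.
buf = io.StringIO()
writer = csv.writer(buf, delimiter=',', quotechar='|', quoting=csv.QUOTE_ALL)
writer.writerow(['89.56', 'Here is some text'])
print(buf.getvalue())  # -> |89.56|,|Here is some text|
```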
Here is the code I have so far:
    import csv

    with open("apiSheet.csv", newline='') as in_file1, \
         open("outputTweetsDate.csv", newline='') as in_file2, \
         open("finalOutput.csv", "w", newline='') as out_file:
        reader1 = csv.reader(in_file1, delimiter=';')
        reader2 = csv.reader(in_file2, delimiter='|')
        writer = csv.writer(out_file, delimiter='|')
        for row1 in reader1:
            for row2 in reader2:
                if row1[0] == row2[1]:
                    data = [row1[1], row2[0]]
                    print(data)
                    writer.writerow(data)
I have edited the code and it runs, but it does not iterate over everything correctly. At the moment my output looks like this:
|89.56|,|Here is some text|
|89.56|,|Here is more text|
So it does not show me the third line even though the dates match. It seems the loop does not get through the whole file.
Thanks!
The inner loop reaches the end of file 2 (outputTweetsDate.csv) before the second row of file 1 is even read: a csv.reader is an iterator over its file object, so once the file has been consumed on the first pass, every later pass of the outer loop sees an exhausted reader2 and yields nothing.
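(Editorial illustration.) The exhaustion behaviour is easy to demonstrate in isolation, using an in-memory file in place of the CSV on disk:

```python
import csv
import io

# A csv.reader is a one-shot iterator over its file object: once the
# underlying file reaches EOF, iterating again yields nothing.
data = io.StringIO("a;1\nb;2\n")
reader = csv.reader(data, delimiter=';')

first_pass = [row for row in reader]   # consumes the whole file
second_pass = [row for row in reader]  # file pointer is already at EOF

print(first_pass)   # [['a', '1'], ['b', '2']]
print(second_pass)  # []
```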
Try this snippet:
    import csv

    with open("apiSheet.csv", newline='') as in_file1, \
         open("outputTweetsDate.csv", newline='') as in_file2, \
         open("finalOutput.csv", "w", newline='') as out_file:
        reader1 = csv.reader(in_file1, delimiter=';')
        reader2 = csv.reader(in_file2, delimiter=';')  # ';' so row2[1] is the timestamp
        writer = csv.writer(out_file, delimiter='|')
        row2 = next(reader2, None)  # None once file 2 is exhausted
        for row1 in reader1:
            while row2 and row1[0] <= row2[1]:
                if row1[0] == row2[1]:
                    data = [row1[1], row2[0]]
                    print(data)
                    writer.writerow(data)
                row2 = next(reader2, None)
EDIT: comparing in reverse order is tricky. Let's stop trying to be clever and brute-force it. Since the files are far smaller than your RAM, it will work perfectly.
    import csv

    with open("apiSheet.csv", newline='') as in_file1, \
         open("outputTweetsDate.csv", newline='') as in_file2, \
         open("finalOutput.csv", "w", newline='') as out_file:
        reader1 = csv.reader(in_file1, delimiter=';')
        reader2 = csv.reader(in_file2, delimiter=';')  # ';' so row2[1] is the timestamp
        writer = csv.writer(out_file, delimiter='|')
        rows2 = list(reader2)  # all the content of file2 goes in RAM
        for row1 in reader1:
            for row2 in rows2:
                if row1[0] == row2[1]:
                    data = [row1[1], row2[0]]
                    print(data)
                    writer.writerow(data)
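(Editorial variation, not part of the original answer.) Since the join key is the timestamp, building a dictionary from apiSheet.csv replaces the quadratic double loop with constant-time lookups. A self-contained sketch using the sample data from the question, with in-memory files standing in for the ones on disk:

```python
import csv
import io

# Sample data from the question: apiSheet rows are "datetime;value",
# tweet rows are "text;datetime".
api_csv = "13.09.13 16:46;89.56\n13.09.13 16:45;90.40\n"
tweets_csv = ("Here is some text;13.09.13 16:45\n"
              "Here is more text;13.09.13 16:45\n"
              "And yet another text;13.09.13 16:46\n")

# One pass over apiSheet builds a timestamp -> value lookup table.
value_by_time = {row[0]: row[1]
                 for row in csv.reader(io.StringIO(api_csv), delimiter=';')}

out = io.StringIO()
writer = csv.writer(out, delimiter='|')
for text, when in csv.reader(io.StringIO(tweets_csv), delimiter=';'):
    if when in value_by_time:  # skip tweets with no matching value
        writer.writerow([value_by_time[when], text])

print(out.getvalue())
```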