How do I go to the first record of a CSV file within the same "for" loop?
I had asked a similar question here, and the answer I got was to use the seek() method. Now I am doing the following:
import csv
import time

with open("total.csv", 'rb') as input1:
    time.sleep(3)
    input1.seek(0)
    reader = csv.reader(input1, delimiter="\t")
    for row in reader:
        # Read the CSV row by row.
However, I want to navigate to the first record of the CSV within the same for loop. I know that my loop won't terminate that way, but that's precisely what I want. I don't want the for loop to end: when it reaches the last record, I want to navigate back to the first record and read the whole file all over again (and keep reading it). How do I do that?
Thanks!
Does it have to be in the for loop? You could achieve this behaviour like this (untested):
import csv
import time

with open("total.csv", 'rb') as input1:
    time.sleep(3)
    reader = csv.reader(input1, delimiter="\t")
    while True:
        input1.seek(0)
        for row in reader:
            # Read the CSV row by row.
For simplicity, create a generator:
import csv

def repeated_reader(input, reader):
    while True:
        input.seek(0)
        for row in reader:
            yield row

with open("total.csv", 'rb') as input1:
    reader = csv.reader(input1, delimiter="\t")
    for row in repeated_reader(input1, reader):
        # Read the CSV row by row.
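To see the generator idea in action without an endless loop, here is a small, self-contained sketch (Python 3 syntax; an in-memory io.StringIO stands in for the asker's total.csv, and itertools.islice bounds the otherwise infinite stream so the demonstration terminates):

```python
import csv
import io
from itertools import islice

def repeated_reader(f, reader):
    # Endlessly yield rows, rewinding the file whenever the reader runs dry.
    while True:
        f.seek(0)
        for row in reader:
            yield row

# Hypothetical sample data standing in for total.csv (tab-delimited).
f = io.StringIO("a\t1\nb\t2\n")
reader = csv.reader(f, delimiter="\t")

# islice caps the endless stream at 5 rows: the 2-row file wraps around.
rows = list(islice(repeated_reader(f, reader), 5))
print(rows)  # [['a', '1'], ['b', '2'], ['a', '1'], ['b', '2'], ['a', '1']]
```

Note that the same csv.reader object can be reused after seek(0): it simply pulls lines from the underlying file object, which resumes from wherever the file position points.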
I actually calculated the total number of rows in the CSV, and when I was on the last row I did input1.seek(0):
import csv
import time

row_count = sum(1 for row in csv.reader(open('total.csv')))
print row_count

row_count2 = 0
with open("total.csv", 'rb') as input1:
    time.sleep(3)
    input1.seek(0)
    reader = csv.reader(input1, delimiter="\t")
    for row in reader:
        row_count2 += 1
        # Read the CSV row by row.
        if row_count2 == row_count:
            row_count2 = 0
            time.sleep(3)
            input1.seek(0)
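For completeness, the same counter technique can be sketched in a self-terminating form (Python 3 syntax; hypothetical sample data in an io.StringIO stands in for total.csv, and the two-pass guard exists only so the demo ends):

```python
import csv
import io

# Hypothetical sample data standing in for total.csv (tab-delimited).
data = "x\t1\ny\t2\nz\t3\n"

# Count the rows first, as in the snippet above.
row_count = sum(1 for _ in csv.reader(io.StringIO(data), delimiter="\t"))

f = io.StringIO(data)
reader = csv.reader(f, delimiter="\t")
seen = []
row_count2 = 0
for row in reader:
    row_count2 += 1
    seen.append(row[0])
    if row_count2 == row_count:          # reached the last record
        if len(seen) >= 2 * row_count:   # demo-only guard: stop after two passes
            break
        row_count2 = 0
        f.seek(0)                        # rewind; the same reader keeps going

print(seen)  # ['x', 'y', 'z', 'x', 'y', 'z']
```

Because seek(0) happens inside the for loop, the loop never exhausts the reader on its own, which is exactly the behaviour the question asks for.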