Python generator to lazily read large CSV files and shuffle the rows
I want to write a function that yields the rows of a CSV file in shuffled order. The file is too large to fit in memory (about 25 million rows).
How can I build a generator that yields the rows one at a time, but in a different order than they appear in the CSV file?
Is it possible to randomize/shuffle the rows inside a lazy generator function?
def readCSV(csvname, shuffle=True):
    for row in open(csvname, "r"):
        if shuffle:
            # Do something to shuffle the order of the rows,
            # but I don't know how to do this.
            pass
        yield row
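One common approximation of shuffling inside a lazy generator (a technique not from this post; the name `read_csv_shuffled` and the `buffer_size` parameter are illustrative) is a bounded "shuffle buffer": keep at most N rows in memory, yield a random one, and replace it with the next row from the file. Rows end up only locally shuffled, but memory stays O(N):

```python
import random

def read_csv_shuffled(csvname, buffer_size=10000):
    """Lazily yield rows in approximately shuffled order using a bounded buffer."""
    with open(csvname, "r") as f:
        buffer = []
        for row in f:
            buffer.append(row)
            if len(buffer) >= buffer_size:
                # Swap a random buffered row to the end and yield it
                i = random.randrange(len(buffer))
                buffer[i], buffer[-1] = buffer[-1], buffer[i]
                yield buffer.pop()
        random.shuffle(buffer)   # drain whatever is left at end of file
        yield from buffer
```

This trades shuffle quality for memory: a larger `buffer_size` gives a more uniform shuffle. The answer below takes a different approach that produces a true full shuffle.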
This can be done by first creating an index for the large CSV file. This only needs to be done once, unless the data changes. The index holds the file offset of every newline in the file.
A random row can then easily be read in by first seeking to the required offset and reading one line.
For example:
import random
import csv
import os
import io

def create_index(index_filename, csv_filename):
    with open(csv_filename, 'rb') as f_csv:
        index = 1
        line_indexes = []   # Use [0] if no header
        linesep = ord(os.linesep[-1])
        while True:
            block = f_csv.read(io.DEFAULT_BUFFER_SIZE * 1000)
            if block:
                line_indexes.extend(offset + index for offset, c in enumerate(block) if c == linesep)
                index += len(block)
            else:
                break
        # A trailing newline would leave a final offset pointing at EOF, not a row
        if line_indexes and line_indexes[-1] == index - 1:
            line_indexes.pop()
    with open(index_filename, 'w') as f_index:
        f_index.write('\n'.join(map(str, line_indexes)))
def get_rows(count, index_filename, csv_filename):
    sys_random = random.SystemRandom()
    with open(index_filename) as f_index:
        line_indexes = list(map(int, f_index.read().splitlines()))
    row_count = len(line_indexes)
    # newline='' stops newline translation, so byte offsets stay valid
    with open(csv_filename, newline='') as f_csv:
        for _ in range(count):
            line_number = sys_random.randint(0, row_count - 1)
            f_csv.seek(line_indexes[line_number])
            if line_number == row_count - 1:
                line = f_csv.read()
            else:
                line = f_csv.read(line_indexes[line_number + 1] - line_indexes[line_number])
            yield line_number, next(csv.reader(io.StringIO(line)))
index_filename = 'index.txt'
csv_filename = 'input.csv'
create_index(index_filename, csv_filename) # only needed ONCE
for row_number, row in get_rows(10, index_filename, csv_filename):
    print(f"Row {row_number} {row}")
The same idea can be used to read from a random starting row, or to read all rows in shuffled order.
Obviously, seeking back and forth will not be as fast as reading the file sequentially, but it should be much faster than re-reading from the beginning each time.
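The shuffled-order variant mentioned above can be sketched by permuting the index once and then seeking through it. This is a minimal sketch, not from the original answer: the name `shuffled_rows` is illustrative, and it assumes an index file in the format produced by `create_index` (one line-start offset per line, header excluded):

```python
import csv
import io
import random

def shuffled_rows(index_filename, csv_filename):
    """Yield (row_number, row) for every data row exactly once, in random order."""
    with open(index_filename) as f_index:
        line_indexes = list(map(int, f_index.read().splitlines()))
    order = list(range(len(line_indexes)))
    random.shuffle(order)            # one in-memory shuffle of the (small) index
    with open(csv_filename, newline='') as f_csv:
        for line_number in order:
            f_csv.seek(line_indexes[line_number])
            line = f_csv.readline()  # readline avoids the last-row length special case
            yield line_number, next(csv.reader(io.StringIO(line)))
```

Only the list of integer offsets is shuffled in memory, which for 25 million rows is a few hundred MB at most, far smaller than the CSV data itself.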