Reading a CSV file with Python
Good morning everyone.
I have a CSV file with 3 columns: product, quantity and price. I want to create 3 lists, one for each column.
I tried the following code, but it creates a list for each row instead of a list for each column. Any ideas, please?
Thanks in advance :)
import csv

with open("./02_CSV_data.csv", "r") as myCSV:
    contenido = csv.reader(myCSV)
    for i in contenido:
        print(i)
Append each element of each row to the appropriate list.
import csv

products = []
quantities = []
prices = []

with open("./02_CSV_data.csv", "r") as myCSV:
    contenido = csv.reader(myCSV)
    for i in contenido:
        products.append(i[0])
        quantities.append(i[1])
        prices.append(i[2])
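If the file starts with a header row (the question does not say, so this is an assumption), skip it with next() before the loop; a blank-row guard also avoids an IndexError on empty lines. A minimal sketch using an in-memory CSV in place of the real file:

```python
import csv
import io

# Hypothetical CSV content with a header row, standing in for
# ./02_CSV_data.csv (an assumption; the question's file may differ)
csv_text = "product,quantity,price\napple,3,1.50\npear,5,0.80\n"

products, quantities, prices = [], [], []

reader = csv.reader(io.StringIO(csv_text))
next(reader, None)  # skip the header row, if any
for row in reader:
    if row:  # guard against blank lines
        products.append(row[0])
        quantities.append(row[1])
        prices.append(row[2])

print(products)  # ['apple', 'pear']
```

Note that csv.reader always yields strings, so quantities and prices will need int()/float() conversion if you want numbers.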
I know you are trying to do this with csv, but Pandas can do this easily:
Install Pandas using pip:
pip install pandas
Code to convert CSV columns to lists:
import pandas as pd
df = pd.read_csv(r'path/of/csv/file.csv')
list1 = df['col_name_1'].to_list()
list2 = df['col_name_2'].to_list()
list3 = df['col_name_3'].to_list()
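As a minimal sketch of the same idea on an in-memory CSV (the column names product, quantity and price are taken from the question; the file path and real column names are placeholders above):

```python
import io
import pandas as pd

# Hypothetical in-memory CSV standing in for path/of/csv/file.csv
csv_text = "product,quantity,price\napple,3,1.50\npear,5,0.80\n"
df = pd.read_csv(io.StringIO(csv_text))

products = df['product'].to_list()
quantities = df['quantity'].to_list()
prices = df['price'].to_list()

print(products)    # ['apple', 'pear']
print(quantities)  # [3, 5]
print(prices)      # [1.5, 0.8]
```

Unlike csv.reader, read_csv infers dtypes, so quantities come back as ints and prices as floats rather than strings.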
You can transpose the rows you read in. This is less efficient than Barmar's solution, though: it builds one big list of all the rows and then creates 3 smaller lists from it, instead of only creating the smaller lists.
Create data:
with open("./02_CSV_data.csv", "w") as f:
    f.write("""
1,1a,1b
2,2a,2b
3,3a,3b
""")
Read data as rows:
import csv

data = []
with open("./02_CSV_data.csv", "r") as myCSV:
    contenido = csv.reader(myCSV)
    for i in contenido:
        if i:  # avoid empty rows
            data.append(i)
Extract columns using zip():
products, quantities, prices = map(list, zip(*data))
print(products)
print(quantities)
print(prices)
Output:
['1', '2', '3']
['1a', '2a', '3a']
['1b', '2b', '3b']
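The named intermediate data list can also be skipped by feeding the filtered rows straight into zip(). A sketch of the same idiom, using an in-memory CSV in place of the file for self-containment:

```python
import csv
import io

# Hypothetical in-memory CSV with the same content written above
# (a leading blank line, then three data rows)
csv_text = "\n1,1a,1b\n2,2a,2b\n3,3a,3b\n"

reader = csv.reader(io.StringIO(csv_text))
# Filter empty rows and transpose in one pass
products, quantities, prices = map(list, zip(*(row for row in reader if row)))

print(products)    # ['1', '2', '3']
print(quantities)  # ['1a', '2a', '3a']
print(prices)      # ['1b', '2b', '3b']
```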