Python 2.7, how to store multiple fields into separate lists
I am opening a large text file in read mode. I was able to split the data by commas and strip all the whitespace. Now I am trying to store each field in a separate list.
Example of the data:
Montgomery,2327375.079,1273061.532,3 ,3
Stanly,2224024.174,1267300.181,5 ,5
Henderson,1559580.515,1277200.634,5 ,5
Current code:
county = []
x_coor = []
y_coor = []
observed = []
expected = []
for line in lines:
    for value in line.split(','):
        value = value.strip()
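(The question does not show where lines comes from; presumably it was read from the file beforehand. A minimal sketch, assuming the data is in a file — the name data.txt here is only an example:)

```python
# Write the sample rows to a file so the example is self-contained;
# in the question, this file already exists on disk.
sample = """Montgomery,2327375.079,1273061.532,3 ,3
Stanly,2224024.174,1267300.181,5 ,5
Henderson,1559580.515,1277200.634,5 ,5
"""
with open('data.txt', 'w') as f:
    f.write(sample)

# Read the file back as a list of lines, one string per row
with open('data.txt') as f:
    lines = f.readlines()
```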
I believe my next step would be county.append(value), but when I tried that and printed county, it printed:
[['3']] [['3'], ['5']] [['3'], ['5'], ['5']]
Does anyone have any guidance on how to write each column into its own individual list?
You're pretty close! I would iterate over all of the lines, splitting each on the comma character to create the underlying row of data. Then, append each individual component of the row to its appropriate list:
county = []
x_coor = []
y_coor = []
observed = []
expected = []
for line in lines:
    row = line.split(',')
    county.append(row[0])
    x_coor.append(row[1])
    y_coor.append(row[2])
    observed.append(row[3])
    expected.append(row[4])
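Since the sample rows contain stray spaces around some fields (e.g. 3 ,3), you may also want to strip each value as you split, as the question already does. A small variation of the loop above, shown with inline sample data:

```python
# Inline sample data standing in for the lines read from the file
lines = [
    "Montgomery,2327375.079,1273061.532,3 ,3",
    "Stanly,2224024.174,1267300.181,5 ,5",
]

county, x_coor, y_coor, observed, expected = [], [], [], [], []
for line in lines:
    # strip() removes the stray whitespace around each field
    row = [value.strip() for value in line.split(',')]
    county.append(row[0])
    x_coor.append(row[1])
    y_coor.append(row[2])
    observed.append(row[3])
    expected.append(row[4])
```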
An arguably more Pythonic way of doing this would be through list comprehensions. Say you have a .csv file named "file.csv" containing the data. The following would give you what you want:
import csv
with open('file.csv', 'rb') as data:
    reader = csv.reader(data)
    awesome_list = list(reader)
county = [x[0] for x in awesome_list]
x_coor = [x[1] for x in awesome_list]
y_coor = [x[2] for x in awesome_list]
observed = [x[3] for x in awesome_list]
expected = [x[4] for x in awesome_list]
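As an alternative to the five separate comprehensions, you could transpose the parsed rows in one step with zip(*...). This is just a sketch, assuming the rows have already been read into a list of lists as above:

```python
# Parsed rows, as csv.reader would produce them
rows = [
    ['Montgomery', '2327375.079', '1273061.532', '3 ', '3'],
    ['Stanly', '2224024.174', '1267300.181', '5 ', '5'],
]

# zip(*rows) transposes rows into columns; each column comes back as a tuple
county, x_coor, y_coor, observed, expected = zip(*rows)
```

Note that zip returns tuples; wrap each in list() if you specifically need lists.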