Read csv file in R
I am trying to read a .csv file in R. My file looks like this:
A,B,C,D,E
1,2,3,4,5
6,7,8,9,10
.
.
.
number of rows.
All are strings. The first line is the header.
I am trying to read the file using:
mydata=read.csv("devices.csv",sep=",",header = TRUE)
But mydata is assigned X observations of 1 variable, where X is the number of rows. The whole row becomes a single column, but I want every header field in a different column. I am not able to understand the problem.
If the fields are wrapped in quotes ("), then using the code in the OP's post:
str(read.csv("devices.csv",sep=",",header = TRUE))
#'data.frame': 2 obs. of 1 variable:
#$ A.B.C.D.E: Factor w/ 2 levels "1,2,3,4,5","6,7,8,9,10": 1 2
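For reference, a file that triggers this behaviour looks like the following, with each whole line wrapped in quotes (a sketch; the real devices.csv may differ):

```shell
# A devices.csv of this shape makes read.csv see one field per line,
# because the default quote handling treats each "..." as a single value.
printf '"A,B,C,D,E"\n"1,2,3,4,5"\n"6,7,8,9,10"\n' > devices.csv
cat devices.csv
```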
We could remove the " with gsub after reading the data with readLines, and then use read.csv:
read.csv(text=gsub('"', '', readLines('devices.csv')), sep=",", header=TRUE)
# A B C D E
#1 1 2 3 4 5
#2 6 7 8 9 10
Another option, if we are using Linux, would be to remove the quotes with awk and pipe the output to read.csv:
read.csv(pipe("awk 'gsub(/\"/,\"\",$1)' devices.csv"))
# A B C D E
#1 1 2 3 4 5
#2 6 7 8 9 10
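To see what the awk step alone does, here is a minimal shell sketch; devices.csv is recreated with printf to match the quoted file assumed in this answer:

```shell
# Recreate the quoted file assumed in this answer
printf '"A,B,C,D,E"\n"1,2,3,4,5"\n"6,7,8,9,10"\n' > devices.csv

# gsub() returns the number of substitutions made; used as a bare
# pattern, awk prints each line where that count is non-zero, with
# the quotes already stripped from $1.
awk 'gsub(/"/,"",$1)' devices.csv
# A,B,C,D,E
# 1,2,3,4,5
# 6,7,8,9,10
```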
Or
library(data.table)
fread("awk 'gsub(/\"/,\"\",$1)' devices.csv")
# A B C D E
#1: 1 2 3 4 5
#2: 6 7 8 9 10
The data used above was created with the following; note that write.table quotes character strings by default (quote = TRUE), which is how the quotes get into the file:
v1 <- c("A,B,C,D,E", "1,2,3,4,5", "6,7,8,9,10")
write.table(v1, file='devices.csv', row.names=FALSE, col.names=FALSE)
The code which you've written should work unless your csv file is corrupted.
Check by giving the absolute path of devices.csv.
To test: data[1] will give you the column 1 results.
Or, you can try it this way too:
data = read.table(text=gsub('"', '', readLines('//fullpath to devices.csv//')), sep=",", header=TRUE)
Good Luck!