How to read in multiple txt files in R with differing number of columns
I am trying to read in multiple txt files with differing numbers of columns in R. I already saw How can you read a CSV file in R with different number of columns and tried it with colClasses and also col.names (both with fill=T), but it does not work. col.names gives me invalid value for quotatines (whatever that is).
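For context, the fill = TRUE and col.names combination does work when col.names supplies one name per possible column. A self-contained sketch using a throwaway temp file instead of the question's data:

```r
# Write a small tab-separated file whose rows have different widths,
# then read it back with fill = TRUE and explicit column names.
tmp <- tempfile(fileext = ".txt")
writeLines(c("1\t2\t3", "4\t5"), tmp)

# col.names must cover the widest row; shorter rows are padded with NA
ragged <- read.table(tmp, sep = "\t", fill = TRUE,
                     col.names = paste0("V", seq_len(3)))
ragged
#   V1 V2 V3
# 1  1  2  3
# 2  4  5 NA
```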
library(dplyr)  # for bind_rows()

ep_dir <- "C:/Users/J/Desktop/e_prot_unicode"
# empty list that gets filled with one data frame per file
ep_ldf <- list()
# vector of all the files in the directory ending in .txt
listtxt_ep <- list.files(path = ep_dir, pattern = "*.txt", full.names = TRUE)
# loop for reading all the files in the list
for (m in seq_along(listtxt_ep)) {
  ep_ldf[[m]] <- read.table(listtxt_ep[m], fill = TRUE, header = TRUE, sep = "\t",
                            stringsAsFactors = FALSE, fileEncoding = "UTF-16LE",
                            dec = ",",
                            colClasses = c("numeric", rep("character", 41)))
}
# combine all files once, after the loop; calling bind_rows() inside
# the loop duplicated rows on every iteration
ep <- bind_rows(ep_ldf)
# another try because it is not working properly
library(plyr)   # for rbind.fill()
library(dplyr)  # for the %>% pipe

# reading and merging the files; the combined result is called d_ep
d_ep <- data.frame()
for (f_ep in listtxt_ep) {
  tmp_ep <- read.delim(f_ep, row.names = NULL, sep = "\t",
                       fileEncoding = "UTF-16LE", fill = TRUE, header = TRUE,
                       dec = ",",
                       col.names = paste0("V", seq_len(41))) %>%
    as.data.frame(stringsAsFactors = FALSE)
  d_ep <- rbind.fill(d_ep, tmp_ep)
}
How do I read multiple files with differing column numbers into R?
Not sure I totally follow what you are trying to do, but dplyr's bind_rows() function allows you to combine data frames with different columns. You can also pass a list of data frames to bind_rows() and it will combine all of them at once, which will simplify some of your code.
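A minimal sketch of this, using two toy data frames in place of the per-file results of read.table():

```r
library(dplyr)

# Two data frames sharing column "a" but otherwise differing
df1 <- data.frame(a = 1:2, b = c("x", "y"), stringsAsFactors = FALSE)
df2 <- data.frame(a = 3:4, c = c(TRUE, FALSE))

# bind_rows() accepts a list of data frames and fills any
# columns missing from a given frame with NA
combined <- bind_rows(list(df1, df2))
```

In the question's code, this means the whole list built in the loop can be collapsed with a single `ep <- bind_rows(ep_ldf)` after the loop finishes.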