
Compare columns from multiple text files to a CSV column file in Python

I have 662 text files in one folder. To open them I used some code I found somewhere around here:

import os
import glob

path = './'
for infile in glob.glob(os.path.join(path, '*.*')):
    print('current file is: ' + infile)

To extract the first column I used:

with open(infile) as infile:
    for line in infile:
        print(line.split()[0])

But here is the first problem: it extracts only the first column of the last file, not of all the files.

Second problem: I need to compare these columns to the first column in a CSV file, similar to the "in.index" option in pandas (but here it's CSV vs. text).
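
To illustrate what I mean, roughly this kind of check, sketched here with pandas and a hypothetical reference.csv (my real files and column names differ):

import pandas as pd

# Hypothetical reference.csv; its first column holds the keys to match against.
ref = pd.read_csv('reference.csv')
ref_keys = set(ref.iloc[:, 0])

# Placeholder values standing in for the first column extracted from one text file.
first_column_values = ['id1', 'id2', 'id3']
matches = [v for v in first_column_values if v in ref_keys]
print(matches)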

It prints only the last file instead of all the files because you print the columns after the file loop has ended, and it prints only the first column because you wrote line.split()[0].

You can make it like this:

for infile in glob.glob(os.path.join(path, '*.*')):
    print('current file is: ' + infile)
    with open(infile) as f:
        for line in f:
            columns = line.split()
            for i in range(len(columns)):
                print(columns[i])

I just put all the code inside one file loop and added a new inner loop to print all the columns.
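
For the second part of the question (checking values against the first column of a CSV), here is a minimal sketch under the assumption that the CSV is named reference.csv, is comma-separated, and its first column holds the reference values; adjust the names, delimiter, and glob pattern to your data:

import os
import glob
import csv

path = './'

# Collect the first column of the CSV into a set for fast membership tests.
with open('reference.csv', newline='') as f:
    csv_first_column = {row[0] for row in csv.reader(f) if row}

# Using '*.txt' here so the CSV itself is not picked up by the glob.
for infile in glob.glob(os.path.join(path, '*.txt')):
    print('current file is: ' + infile)
    with open(infile) as fh:
        for line in fh:
            parts = line.split()
            if parts and parts[0] in csv_first_column:
                print(parts[0], 'is also in the first column of the csv')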

