
How do I import a load of csvs into different python dataframes via a loop?

I have a load of CSV files. I want to write a loop that does this:

    df_20180731 = pd.read_csv('path/cust_20180731.csv')

for each of about 36 files.

My files are df_20160131, df_20160231, ...... df_20181231, etc. Basically the dates are month-end dates.

Thanks

    import pandas

    # include here all the file ids
    files = ['20160131', '20160231']

    _g = globals()

    for f in files:
        _g['df_{}'.format(f)] = pandas.read_csv('path/cust_{}.csv'.format(f))

    print(df_20160131)
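Rather than typing all 36 ids by hand, the month-end date strings can be generated with the standard library. Note that '20160231' from the question is not a real calendar date (February has at most 29 days), so computing the ids avoids that kind of typo. A minimal sketch, assuming the files run from January 2016 through December 2018:

```python
import calendar
from datetime import date

def month_end_ids(start_year, start_month, n):
    """Return n 'YYYYMMDD' strings for consecutive month-end dates."""
    ids = []
    y, m = start_year, start_month
    for _ in range(n):
        # monthrange gives (weekday_of_first_day, number_of_days_in_month)
        last_day = calendar.monthrange(y, m)[1]
        ids.append(date(y, m, last_day).strftime('%Y%m%d'))
        m += 1
        if m > 12:
            m, y = 1, y + 1
    return ids

files = month_end_ids(2016, 1, 36)
```

The resulting list can be dropped straight into the loop above in place of the hand-written `files` list.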

You could do something like:

    import glob
    import pandas as pd

    # keys are the file paths, values are the loaded dataframes
    datasets = {}
    for file in glob.glob('path/cust_*.csv'):
        datasets[file] = pd.read_csv(file)
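If you would rather look dataframes up by date than by full path, the date part can be split out of the file name and used as the dictionary key. A sketch, assuming the files are named exactly like `cust_20180731.csv`:

```python
import glob
import os
import pandas as pd

# Build {date_string: DataFrame} from files named like 'cust_20180731.csv'.
datasets = {}
for path in glob.glob('path/cust_*.csv'):
    # 'path/cust_20180731.csv' -> 'cust_20180731' -> '20180731'
    key = os.path.splitext(os.path.basename(path))[0].split('_')[-1]
    datasets[key] = pd.read_csv(path)
```

A single dataframe is then `datasets['20180731']`, which is usually easier to work with than 36 separate global variables.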
    import os
    import pandas as pd

    # get a list of all the files in the directory
    path = <path of the directory containing all the files>
    files = os.listdir(path)

    # iterate over all the files and store them in a dictionary
    # (os.listdir returns bare file names, so join them with the directory path)
    dataframe = {file: pd.read_csv(os.path.join(path, file)) for file in files}

    # if the directory also contains other files, you can filter the file
    # names with whatever logic you need (extension etc.), for example:

    def logic(fname):
        return fname.endswith('.csv')

    dataframe = {file: pd.read_csv(os.path.join(path, file)) for file in files if logic(file)}
    # this will create a dictionary of file : dataframe_object

I hope it helps.
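If the end goal is one big table rather than 36 separate dataframes, the per-file dataframes can be stacked with `pd.concat`, keeping a column that records which file each row came from. A minimal sketch, with a toy dictionary standing in for the `{file: dataframe}` dictionary built above:

```python
import pandas as pd

# Toy stand-in for the {file: DataFrame} dictionary built above.
dataframe = {
    'cust_20160131.csv': pd.DataFrame({'amount': [10, 20]}),
    'cust_20160229.csv': pd.DataFrame({'amount': [30]}),
}

frames = []
for name, df in dataframe.items():
    # tag each row with its source file before stacking
    frames.append(df.assign(source_file=name))

combined = pd.concat(frames, ignore_index=True)
```

`ignore_index=True` renumbers the rows 0..n-1 instead of keeping each file's original row labels.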
