
How to merge all CSV files in a folder into a single CSV based on columns?

Given a folder with multiple CSV files that have different column lengths.

How can I merge them into a single CSV file using Python pandas, adding the source file name as one of the columns?

Input: https://www.dropbox.com/sh/1mbgjtrr6t069w1/AADC3ZrRZf33QBil63m1mxz_a?dl=0

Output:

Id  Snack      Price    SheetName
5   Orange      55     Sheet1
7   Apple       53     Sheet1
8   Muskmelon   33     Sheet1
11  Orange             Sheet2
12  Green Apple        Sheet2
13  Muskmelon          Sheet2

You can use:

import glob
import os
import pandas as pd

files = glob.glob('files/*.csv')
dfs = [pd.read_csv(fp).assign(SheetName=os.path.basename(fp).split('.')[0]) for fp in files]
df = pd.concat(dfs, ignore_index=True)
print(df)
   Id  Price SheetName        Snack
0  11    NaN   Sheet 2       Orange
1  12    NaN   Sheet 2  Green Apple
2  13    NaN   Sheet 2    Muskmelon
3   5   55.0    Sheet1       Orange
4   7   53.0    Sheet1        Apple
5   8   33.0    Sheet1    Muskmelon
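
Since the goal is a single merged CSV file, the combined frame can then be written back to disk. A minimal sketch, assuming 'merged.csv' as the output name:

df.to_csv('merged.csv', index=False)  # 'merged.csv' is an assumed output file name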

EDIT:

dfs = []
for fp in files:
    df = pd.read_csv(fp).assign(SheetName=os.path.basename(fp).split('.')[0])
    # any additional per-file processing can go here
    dfs.append(df)
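
After the loop, the collected frames can be combined the same way as in the list-comprehension version above (a sketch):

df = pd.concat(dfs, ignore_index=True)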
