
Automate reading and saving several JSON files (with different information) into different pandas DataFrames

I have several JSON files in a folder, each with a different shape (number of rows and columns) and different information.

I have the following code to open one JSON file and save it to a pandas DataFrame:

import json
import pandas as pd

with open('f_fruit.json', 'r') as f:
    data = json.load(f)

df_fruit = pd.DataFrame(data['fruit'])

In the end, I would like to have a separate pandas DataFrame for each JSON file:

df_fruit

df_clothes

df_games

What is the best way to automate this code, considering that the file names and information do not follow a pattern? Is it possible?

Assuming that your files are named following the same logic, I would do the following:

import json
import pandas as pd

# You can use os.walk to build this list from a specific folder.
files = ['f_fruit.json', 'f_clothes.json', 'f_games.json']

for file_name in files:
    col_name = file_name.split('.')[0][2:]   # 'f_fruit.json' -> 'fruit'
    with open(file_name, 'r') as f:
        data = json.load(f)
    var_name = 'df_{}'.format(col_name)
    globals()[var_name] = pd.DataFrame(data[col_name])
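Since the comment mentions os.walk, here is a minimal, self-contained sketch of building the file list from a folder that way. The temporary folder and the dummy f_fruit.json file are created only so the example runs on its own; in practice you would point os.walk at your actual data folder.

```python
import json
import os
import tempfile

# Create a throwaway folder with one sample file (illustration only).
folder = tempfile.mkdtemp()
with open(os.path.join(folder, 'f_fruit.json'), 'w') as f:
    json.dump({'fruit': [{'name': 'apple'}]}, f)

# Walk the folder and collect the paths of all .json files.
files = []
for root, _dirs, names in os.walk(folder):
    for name in names:
        if name.endswith('.json'):
            files.append(os.path.join(root, name))
```

The resulting `files` list can then feed the loop above instead of the hard-coded list.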

However, if

file names and information do not follow a pattern

then there is no easy way to automate this. You need a pattern.

Here is the part you are probably interested in, i.e. how to create a variable from a value already in memory using globals().

>>> col_name = 'fruit'
>>> var_name = 'df_{}'.format(col_name)
>>> globals()[var_name] = 'some value'
>>> df_fruit
'some value'
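As a design note: injecting names into globals() works, but collecting the DataFrames in a dict keyed by name is usually safer and easier to iterate over. A self-contained sketch of that alternative follows; the sample files are written to a temporary folder here only so the example is runnable, and the f_&lt;name&gt;.json naming pattern is the same assumption as above.

```python
import json
import os
import tempfile

import pandas as pd

# Write two sample files to a throwaway folder (illustration only).
folder = tempfile.mkdtemp()
for name, payload in [('f_fruit.json', {'fruit': [{'name': 'apple'}]}),
                      ('f_clothes.json', {'clothes': [{'name': 'shirt'}]})]:
    with open(os.path.join(folder, name), 'w') as f:
        json.dump(payload, f)

# Load each file into a dict of DataFrames instead of separate variables.
dfs = {}
for file_name in sorted(os.listdir(folder)):
    col_name = file_name.split('.')[0][2:]   # 'f_fruit' -> 'fruit'
    with open(os.path.join(folder, file_name), 'r') as f:
        data = json.load(f)
    dfs[col_name] = pd.DataFrame(data[col_name])

# Access as dfs['fruit'] rather than df_fruit.
```

With the dict you can loop over all DataFrames (`for name, df in dfs.items(): ...`) without knowing their names in advance.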
