
Batch convert JSON to CSV with Python

Similar to this question: batch process text to csv using python

I've got a batch of JSON files that need to be converted to CSV so that they can be imported into Tableau.

The first step was to get json2csv (https://github.com/evidens/json2csv) working, which I did. I can successfully convert a single file via the command line.

Now I need a way to go through all the files in a directory and convert each one in a single batch run using that json2csv script.

TIA

I actually created a JSON-to-CSV Python script to run myself. It reads the JSON file in chunks and then works out the rows and columns of the CSV file.

Check out Opening A large JSON file in Python with no newlines for csv conversion Python 2.6.6 for the details of what was done and how the .csv was built from the JSON. The actual conversion will depend on your JSON format.

A JSON parse function with a chunk size of 0x800000 was used to read in the JSON data.
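The chunked read can be sketched like this. This is a minimal illustration, assuming the file holds a single JSON document; the function name and structure are mine, not the original script's:

```python
import json

def parse_json_in_chunks(path, chunk_size=0x800000):
    """Read a JSON file in fixed-size chunks, then parse the whole document.

    The chunk size mirrors the 0x800000 value mentioned above; reading in
    chunks avoids pulling the file in with a single huge read() call.
    """
    buf = []
    with open(path, "r") as fh:
        while True:
            chunk = fh.read(chunk_size)
            if not chunk:
                break
            buf.append(chunk)
    return json.loads("".join(buf))
```

From the parsed structure you can then determine the rows and columns to emit with the csv module.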

If the data becomes available at specific times, you can schedule the conversion with crontab.

I used

 from optparse import OptionParser

to get the input and output files as arguments, as well as to set the various options required for the analysis and mapping.
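A minimal sketch of that option handling (the option names here are illustrative, not the original script's; note that optparse has been superseded by argparse in modern Python):

```python
from optparse import OptionParser  # deprecated since 2.7; argparse is the modern replacement

def parse_args(argv=None):
    """Parse input/output file options; option names are illustrative."""
    parser = OptionParser(usage="usage: %prog [options]")
    parser.add_option("-i", "--input", dest="infile", help="input JSON file")
    parser.add_option("-o", "--output", dest="outfile", help="output CSV file")
    opts, args = parser.parse_args(argv)
    return opts, args
```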

You can also use a shell loop over the given directory:

for f in *.json; do
  mybase=$(basename "$f" .json)
  json2csv "$f" -o "${mybase}.csv"
done

Alternatively, use find with its -exec option.
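If you'd rather drive the batch from Python itself, the shell loop above can be mirrored with glob and subprocess. This is a sketch that assumes the json2csv command-line tool is on your PATH:

```python
import glob
import os
import subprocess

def build_commands(directory="."):
    """Build one json2csv command per .json file, writing a .csv alongside it."""
    cmds = []
    for path in sorted(glob.glob(os.path.join(directory, "*.json"))):
        base, _ = os.path.splitext(path)
        cmds.append(["json2csv", path, "-o", base + ".csv"])
    return cmds

def batch_convert(directory="."):
    """Run the converter on every .json file in the directory."""
    for cmd in build_commands(directory):
        subprocess.check_call(cmd)
```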

If you want all the JSON files to go into a single .csv file, you can use:

json2csv *.json -o myfile.csv


 