Python - How to parse and save JSON to a MySQL database

As the title indicates, how does one use Python to elegantly access an API, parse the JSON contents, and save them into a relational database (MySQL) for later access?

Here, I have saved the data into a pandas object. But how do I create a MySQL database, save the JSON contents into it, and access the contents for later use?

# Libraries
import json, requests
import pandas as pd
from pandas.io.json import json_normalize

# Set URL
url = 'https://api-v2.themuse.com/jobs'

# Loop over result pages (data is overwritten on each iteration,
# so only the last page is left after the loop finishes)
for i in range(100):
    data = json.loads(requests.get(
        url=url,
        params={'page': i}
    ).text)['results']

# Load the (last) page of results into a pandas DataFrame
data_norm = pd.read_json(json.dumps(data))

If this is merely storage for later processing, kind of like a cache, a VARCHAR (or TEXT) field is enough. If, however, you need to retrieve structured data with SQL, a JSON field is what you need.
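
For illustration, here is a minimal sketch of the two options (the table and column names jobs_raw, jobs_json and payload are made up, and the native JSON type requires MySQL 5.7+):

import MySQLdb

db = MySQLdb.connect(host='localhost', user='root', passwd='password', db='nameofdb')
cursor = db.cursor()

# Option 1: opaque text storage, parsed back in Python later
cursor.execute("""
    CREATE TABLE IF NOT EXISTS jobs_raw (
        id INT AUTO_INCREMENT PRIMARY KEY,
        payload TEXT
    )""")

# Option 2: native JSON column (MySQL 5.7+), queryable with SQL JSON functions
cursor.execute("""
    CREATE TABLE IF NOT EXISTS jobs_json (
        id INT AUTO_INCREMENT PRIMARY KEY,
        payload JSON
    )""")

# With a JSON column, individual fields can be pulled out in SQL, e.g.:
cursor.execute("SELECT JSON_EXTRACT(payload, '$.name') FROM jobs_json")

db.commit()
cursor.close()
db.close()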

You create your MySQL table on your server, using something like MySQL Workbench CE or CREATE TABLE statements like those sketched above; then in Python you do this. I wasn't sure whether you want to use data from the for loop or data_norm, so for ease of use, here are some functions. insertDb(data) can be called inside your for loop, since data is overwritten on every iteration (otherwise only the last page would be saved).

import sys
import json
import MySQLdb

def dbconnect():
    try:
        db = MySQLdb.connect(
            host='localhost',
            user='root',
            passwd='password',
            db='nameofdb'
        )
    except Exception as e:
        sys.exit("Can't connect to database")
    return db

def insertDb(data):
    try:
        db = dbconnect()
        cursor = db.cursor()
        # serialize the Python object to a JSON string; the parameters
        # passed to execute() must be a tuple
        cursor.execute("""
            INSERT INTO nameoftable (nameofcolumn)
            VALUES (%s)""", (json.dumps(data),))
        db.commit()      # persist the insert
        cursor.close()
        db.close()
    except Exception as e:
        print(e)

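For example (a sketch only, reusing dbconnect() and insertDb() above with their placeholder table and column names), you could call insertDb() once per page inside the loop, storing each page of results as one JSON string per row:

import requests

url = 'https://api-v2.themuse.com/jobs'

for i in range(100):
    # one page of results from the API
    page = requests.get(url=url, params={'page': i}).json()['results']
    insertDb(page)   # serialized and committed inside insertDb()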