I have a Python program that scrapes some values from my localhost. My goal is to store those values and insert them into my live website's database using the Python program. The parser looks like this:
from bs4 import BeautifulSoup
import urllib.request

# fetch the page and parse it
x = urllib.request.urlopen("http://localhost/askisi2.html")
s = x.read()
soup = BeautifulSoup(s, "html.parser")
m = soup.find("div", {"id": "s_number"})
id = m.text
print(id)
From this point I would like to connect to my live database and insert the "id". How can I access my remote database from Python, and what techniques should I follow?
You'll probably want to use the DB-API, and specifically for MySQL, the MySQLdb module. This is not part of the standard library, so you'll have to install it. First, you connect:
import MySQLdb
# other parameters are available; for details, see
# http://mysql-python.sourceforge.net/MySQLdb.html#functions-and-attributes
conn = MySQLdb.connect(user='someone', passwd='hunter2', db='some_database')
After you've connected, inserting a row is straightforward. Note that you execute statements through a cursor, not the connection itself, and that MySQLdb uses %s as its parameter placeholder:
cur = conn.cursor()
cur.execute('INSERT INTO some_table (id) VALUES (%s)', (id,))
conn.commit()
When you're done with the connection, don't forget to close it.
conn.close()
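Since MySQLdb follows the same DB-API interface as the standard-library sqlite3 module, the whole flow can be sketched in a self-contained way with sqlite3 (one caveat: sqlite3 uses ? placeholders where MySQLdb uses %s). The table name and scraped value below are hypothetical stand-ins:

```python
import sqlite3

# In-memory database as a stand-in for the remote MySQL server;
# with MySQLdb the connect() call and %s placeholders differ,
# but the cursor/execute/commit pattern is identical (DB-API).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE some_table (id TEXT)")

scraped_id = "12345"  # hypothetical value scraped from the page

# Parameterized insert: never interpolate scraped text into SQL yourself
cur.execute("INSERT INTO some_table (id) VALUES (?)", (scraped_id,))
conn.commit()

cur.execute("SELECT id FROM some_table")
print(cur.fetchone()[0])  # → 12345

conn.close()
```

Using a parameterized query (the placeholder plus a tuple of values) lets the driver handle quoting and escaping, which matters here because the inserted value comes from scraped HTML.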