
Chinese character encoding issue when inserting Scrapy items into MySQL from Python

I'm trying to get scraped Chinese text into a MySQL database from Python Scrapy, but it seems that either Scrapy or MySQL can't handle Chinese characters with my current method.

def insert_table(datas):
    sql = "INSERT INTO %s (name, uses, time_capt) \
        values('%s', '%s', NOW())" % (SQL_TABLE,
            escape_string(datas['name']),
            escape_string(datas['uses']),
            )
    if cursor.execute(sql):
        print "Inserted"
    else:
        print "Something wrong"

I keep getting this error when adding into the MYSQL database:

exceptions.UnicodeEncodeError: 'ascii' codec can't encode characters in position 1-7: ordinal not in range(128)

datas['name'] contains correctly formatted Chinese characters; if I print the variable within the code, it displays properly. I've tried calling .encode('utf8') on it before inserting into MySQL, which makes the error go away, but the text then comes out as garbled gibberish in my database. Am I doing something wrong?
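For what it's worth, "garbled gibberish" after a successful insert is the telltale sign of mojibake: the UTF-8 bytes were sent over a connection that assumed a different charset (typically latin-1). A minimal stand-alone demonstration of the effect, not tied to MySQL at all:

```python
# -*- coding: utf-8 -*-
# Illustration only (not the asker's code): why UTF-8 bytes stored
# over a latin-1 connection come back as gibberish.
name = u"大学"                       # correctly formatted Chinese text
utf8_bytes = name.encode("utf-8")    # what .encode('utf8') produces

# If the connection charset is latin-1, each UTF-8 byte is treated as
# a separate latin-1 character -- classic mojibake:
garbled = utf8_bytes.decode("latin-1")
print(garbled)  # gibberish like "å¤§..." instead of 大学
```

The data in the database is not random noise; it is the correct UTF-8 bytes misinterpreted, which is why fixing the connection charset (rather than pre-encoding by hand) is the usual cure.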

Edit: this is my current code:

def insert_table(datas):
    sql = "INSERT INTO %s (name, uses, time_capt) \
        values(%s, %s, NOW())", (SQL_TABLE,
            escape_string(datas['name']),
            escape_string(datas['uses']),
            )
    if cursor.execute(sql):
        print "Inserted"
    else:
        print "Something wrong"
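A sketch of the usual fix, using the stdlib sqlite3 module as a stand-in so it runs without a MySQL server: let the driver bind the values instead of formatting them into the SQL string yourself. With MySQLdb the placeholder is %s rather than ?, and you would also open the connection with charset="utf8mb4" and use_unicode=True so the driver handles the encoding for you. The table and column names here are assumptions for illustration.

```python
# -*- coding: utf-8 -*-
# Sketch of the parameterized-insert pattern. sqlite3 stands in for
# MySQLdb here; the binding idea is the same, only the placeholder
# style differs (? for sqlite3, %s for MySQLdb).
import sqlite3

SQL_TABLE = "items"  # assumed constant, as in the original code

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE %s (name TEXT, uses TEXT)" % SQL_TABLE)

def insert_table(datas):
    # Identifiers (the table name) cannot be bound as parameters, so
    # the name is interpolated -- safe only because SQL_TABLE is a
    # constant we control. The *values* are bound by the driver, which
    # also takes care of quoting and escaping.
    sql = "INSERT INTO %s (name, uses) VALUES (?, ?)" % SQL_TABLE
    cur.execute(sql, (datas["name"], datas["uses"]))

insert_table({"name": u"大学", "uses": u"示例"})
row = cur.execute("SELECT name FROM %s" % SQL_TABLE).fetchone()
print(row[0])  # 大学 -- comes back intact, no garbling
```

Note that with this pattern there is no manual escape_string or .encode call at all; mixing hand-encoding with driver-side encoding is what tends to produce the double-encoded gibberish described above.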

I ran into a similar problem when I used name = u"大学" as a variable in a Scrapy project. After I added the comment -*- coding: utf-8 -*- at the top of the file, the error was gone.
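For reference, the declaration goes on the first (or second) line of the source file. It tells the Python 2 interpreter how to decode the file itself, so non-ASCII literals parse correctly:

```python
# -*- coding: utf-8 -*-
# In Python 2, this magic comment declares the source file's encoding,
# so a non-ASCII literal like the one below is read correctly instead
# of raising a SyntaxError or producing mangled bytes.
name = u"大学"
print(name)  # 大学
```

This only fixes how literals in your .py files are decoded; it does not change the charset of the MySQL connection, so both issues may need attention.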

The technical posts on this site follow the CC BY-SA 4.0 license. If you need to reprint, please indicate the site URL or the original address. For any questions, please contact: yoyou2525@163.com.
