
How to ensure all my content is UTF-8 and fix encoding issues?

I'm getting content from different websites. Some of them send this content-type header:

Content-Type: text/html; charset=utf-8

and others

Content-Type: text/html

I used a Python script with the requests library to check the encodings in bulk:

    import requests

    for site in sites:
        r = requests.get(site)
        print(r.encoding)

It printed UTF-8 for some websites and ISO-8859-1 for the others. I'm storing these results in a MySQL database whose collation is latin1_swedish_ci, the default (I'm using XAMPP).
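When the Content-Type header omits the charset, requests falls back to ISO-8859-1 for text responses, which is often wrong. A minimal sketch of normalizing both cases before storage (`decode_response` is a hypothetical helper, not part of requests):

```python
def decode_response(body: bytes, content_type: str) -> str:
    """Decode raw response bytes to text.

    Uses the charset from the Content-Type header when present;
    otherwise tries UTF-8 first and falls back to ISO-8859-1 only
    if the bytes are not valid UTF-8.
    """
    header = content_type.lower()
    if "charset=" in header:
        charset = header.split("charset=")[1].split(";")[0].strip()
        return body.decode(charset)
    try:
        return body.decode("utf-8")
    except UnicodeDecodeError:
        return body.decode("iso-8859-1")
```

With requests you would call it as `decode_response(r.content, r.headers.get("Content-Type", ""))`, then encode the result as UTF-8 before inserting into the database, so every site ends up stored the same way.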

The issue is that these articles contain special characters like é ë ü ï. For some websites these characters come out like Ã«, which should be ë; the others work fine.
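That symptom is classic mojibake: the UTF-8 bytes of a character get reinterpreted as ISO-8859-1 somewhere between the site and your page. A quick stdlib-only check reproduces it:

```python
# The UTF-8 encoding of "ë" is the two bytes 0xC3 0xAB; reading those
# bytes as ISO-8859-1 turns one character into two.
mojibake = "ë".encode("utf-8").decode("iso-8859-1")
print(mojibake)  # Ã«
```

So the strings from the "broken" sites were decoded with the wrong charset at some point, while the others were decoded correctly.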

What I'm looking for is a solution that gives the same result in both cases. I searched and found some solutions, but they don't work in both cases: if the string is already correct, it gets messed up:

$str = "ë";

echo utf8_decode($str); // converts valid UTF-8 to ISO-8859-1, garbling it on a UTF-8 page
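One way around the "already correct strings get messed up" problem is to repair a string only when it actually looks double-encoded. A hypothetical heuristic in Python (`fix_double_encoding` is my own name; for production use, the ftfy library's `fix_text` does this far more robustly):

```python
def fix_double_encoding(s: str) -> str:
    """Undo one round of UTF-8-read-as-Latin-1 mojibake, but leave
    already-correct strings alone.

    Heuristic: if the string survives a Latin-1 round trip and the
    resulting bytes are valid UTF-8, it was probably double-encoded.
    """
    try:
        repaired = s.encode("iso-8859-1").decode("utf-8")
    except (UnicodeEncodeError, UnicodeDecodeError):
        return s  # not mojibake (or not repairable this way)
    return repaired

print(fix_double_encoding("Ã«"))  # ë
print(fix_double_encoding("ë"))   # ë (unchanged)
```

A correct string like "ë" encodes to the single Latin-1 byte 0xEB, which is not valid UTF-8, so it is returned untouched; mojibake like "Ã«" round-trips cleanly back to "ë".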

First, I'm sorry about this question, but I had to post it because I don't know much about encodings. What can I do to get the same result in both cases?

If it matters, I'm using QueryPath to parse the HTML of these sites, and I'm passing array('convert_to_encoding' => 'utf-8') in the options.

Set your database collation to utf8_unicode_ci (phpMyAdmin > select the DB > Operations > Collation). This character set can handle a far wider range of "exotic" characters than latin1. (On modern MySQL, utf8mb4_unicode_ci is the better choice, since MySQL's utf8 is limited to three bytes per character.)

You will probably need to re-insert the content that has the garbled characters.
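The phpMyAdmin steps above can also be done directly in SQL. The database and table names below are placeholders; note that `ALTER DATABASE` only changes the default for new tables, so existing tables must be converted individually:

```sql
ALTER DATABASE mydb CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;
ALTER TABLE articles CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;
```

Also make sure the connection itself uses UTF-8 (e.g. `SET NAMES utf8mb4`, or `charset=utf8mb4` in a PDO DSN), or correctly encoded data can still be mangled on the way in.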

I've never had character-display problems since using this collation for my databases, combined with the correct UTF-8 charset meta tag in my HTML documents:

<meta charset="utf-8">

These two actions combined should handle the problem.
