I have a fairly large Excel file containing several tables.
I was able to serialize them to JSON using an ADODB.Stream in VBA.
This is the VBA code:
Dim st As ADODB.Stream
' create a stream object
Set st = New ADODB.Stream
' set properties
st.Charset = "utf-8"
st.Type = adTypeText
' open the stream object and write some text
st.Open
st.WriteText myString
st.SaveToFile filepath, adSaveCreateOverWrite
st.Close
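One thing worth knowing about the stream above: when Charset is set to "utf-8", ADODB.Stream saves the file with a UTF-8 byte order mark (the bytes EF BB BF) at the start. A minimal sketch to confirm this from Python (the file name data.json is just a stand-in for the real exported file; utf-8-sig is used here to simulate what ADODB.Stream produces):

```python
import os
import tempfile

# Simulate a file saved by ADODB.Stream with Charset = "utf-8":
# Python's utf-8-sig codec also writes the UTF-8 BOM on output.
path = os.path.join(tempfile.mkdtemp(), "data.json")
with open(path, "w", encoding="utf-8-sig") as f:
    f.write('{"key": "value"}')

# Inspect the first three raw bytes of the file.
with open(path, "rb") as f:
    head = f.read(3)

print(head)  # b'\xef\xbb\xbf' -- the UTF-8 byte order mark
```

If those three bytes are present, the file is BOM-prefixed UTF-8 rather than plain UTF-8, which matters when choosing the encoding to read it back with.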
Now I want to read it in Python to pass it to a data frame.
This is the Python code:
import json

with open(myfilecomplete) as f:
    data = json.load(f)
I get this error:
UnicodeDecodeError: 'charmap' codec can't decode byte 0x9d in position 407330: character maps to &lt;undefined&gt;
The file is quite big (about 1 MB of text), and I don't even know how to find the character at position 407330.
Besides, I am supposed to be writing to disk in UTF-8, the most common encoding form. Right?
Why is json not able to decode UTF-8?
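As for finding the offending character: the position in the traceback is an offset into the raw data the codec was decoding, so the surrounding bytes can be dumped by opening the file in binary mode and seeking. A small sketch (show_context is a hypothetical helper; the demo file stands in for the real one, where you would call show_context(myfilecomplete, 407330)):

```python
import os
import tempfile

def show_context(path, offset, width=20):
    """Return the raw bytes surrounding `offset` in the file."""
    with open(path, "rb") as f:
        f.seek(max(0, offset - width))
        return f.read(2 * width)

# Demo: a small file with a 0x9d byte planted at offset 50.
demo = os.path.join(tempfile.mkdtemp(), "demo.bin")
with open(demo, "wb") as f:
    f.write(b"a" * 50 + b"\x9d" + b"b" * 50)

print(show_context(demo, 50, width=5))  # b'aaaaa\x9dbbbb'
```

Seeing the bytes in context usually makes it obvious whether the data is genuinely corrupt or simply being decoded with the wrong codec.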
EDIT AFTER ANSWER / WARNING: Don't try to serialise JSON from Excel manually; this will almost always end up producing errors. Use a proper library for that, like here:
Got it: open the file with the utf-8-sig encoding, which decodes UTF-8 and strips the byte order mark that ADODB.Stream writes:

with open(myfilecomplete, encoding='utf-8-sig') as f:
    data = json.load(f)
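To see why utf-8-sig is the right choice here, a minimal sketch (the file content simulates what ADODB.Stream saves): decoding a BOM-prefixed file with plain utf-8 leaves a U+FEFF character at the start of the text, which json.load rejects, while utf-8-sig strips it.

```python
import json
import os
import tempfile

# A BOM-prefixed JSON file, as ADODB.Stream produces with Charset = "utf-8".
path = os.path.join(tempfile.mkdtemp(), "data.json")
with open(path, "wb") as f:
    f.write(b'\xef\xbb\xbf{"key": "value"}')

# Plain utf-8 decodes the BOM into the text as U+FEFF.
with open(path, encoding="utf-8") as f:
    text = f.read()
print(text.startswith("\ufeff"))  # True

# utf-8-sig strips the BOM, so json.load parses cleanly.
with open(path, encoding="utf-8-sig") as f:
    data = json.load(f)
print(data)  # {'key': 'value'}
```

Note that the original charmap error came from omitting the encoding argument entirely, which makes open() fall back to the platform default (e.g. cp1252 on Windows, where byte 0x9d is undefined); passing an explicit encoding fixes that, and utf-8-sig additionally handles the BOM.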