I have many CSV files of different sizes containing tick data for some symbols. Here is an image of one sample file.
Everything is in one column, separated by ';'. I want to read the data with the second row as the header, skipping the first row. So far I have tried everything I could find about loading a CSV file while skipping the first row and using the second row as the header. Here are some of the code snippets I tried:
df = pd.read_csv(cwd + folder + name +'.csv',delimiter=';', skip_blank_lines=True, encoding='utf-8', skiprows=[0])
Another attempt:
df = pd.read_csv(cwd + folder + name +'.csv',delimiter=';', encoding='utf-8', skiprows=[0], header=1)
The output of all of these is a single column named 'Unnamed: 0' with every value in the dataframe as NaN. I have tried different solutions such as
Python Pandas read_csv skip rows but keep header, but none of them worked for me. If I do not skip the first row and read the file without any delimiter, Python raises a UnicodeError. How can I solve this problem?
After trying the solutions in the first two answers, this is my output for both code snippets.
In skiprows you need to give the number of rows you want to skip from the top of your CSV.
Also, use the utf-16 encoding:
df = pd.read_csv(cwd + folder + name +'.csv',delimiter=';', encoding='utf-16', skiprows=1)
For more info: to check the encoding, I opened the file in LibreOffice. In its import dialog you can choose the delimiter, and it also shows the file's encoding (UTF-16 in this case).
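If you would rather not open LibreOffice, a common way to guess the encoding in Python is to inspect the file's byte-order mark (BOM). This is a minimal sketch (the function name is mine, not from the answer above); a UTF-16 file written by most exporters starts with the bytes FF FE or FE FF:

```python
def sniff_bom(path):
    """Guess a text encoding from the file's byte-order mark, if any."""
    with open(path, 'rb') as f:
        head = f.read(4)
    # Check the longer UTF-32 marks first, since UTF-32 LE starts with
    # the same two bytes as UTF-16 LE.
    if head.startswith((b'\xff\xfe\x00\x00', b'\x00\x00\xfe\xff')):
        return 'utf-32'
    if head.startswith((b'\xff\xfe', b'\xfe\xff')):
        return 'utf-16'
    if head.startswith(b'\xef\xbb\xbf'):
        return 'utf-8-sig'
    return 'utf-8'  # no BOM: assume plain UTF-8
```

You can then pass the result straight to pd.read_csv(path, encoding=sniff_bom(path), ...). Note that a BOM is optional, so a file without one is not necessarily UTF-8; this is a heuristic, not a guarantee.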
@Urvish - I have used the same code from your post and the output is correct. Please check your file.
import pandas as pd
df = pd.read_csv("C:\\Users\\user\\Downloads\\sof.csv" ,delimiter=';', skip_blank_lines=True, encoding='utf-8', skiprows=[0])
print(df)
Output:
            Date    bid    ask  last  volume
  2017 06 05 799  149.6  149.7   0.0     0.0
  2017 06 05 799  149.6  149.7   0.0     0.0
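Putting the two answers together, here is a self-contained sketch of the whole flow. It recreates a small file like the one described (a junk first row, then a ';'-separated header and tick rows) saved as UTF-16; the column names and values are illustrative, not the asker's real data:

```python
import io

import pandas as pd

# A banner row, then the real header, then ';'-separated tick data,
# encoded as UTF-16 the way the asker's export tool apparently does.
raw = ('SomeExportBanner\n'
       'Date;bid;ask;last;volume\n'
       '2017.06.05;149.6;149.7;0.0;0.0\n')
buf = io.BytesIO(raw.encode('utf-16'))

# skiprows=1 drops the banner; pandas then treats the next row as the
# header, so no explicit header= argument is needed.
df = pd.read_csv(buf, delimiter=';', encoding='utf-16', skiprows=1)
print(df.columns.tolist())  # ['Date', 'bid', 'ask', 'last', 'volume']
```

With the wrong encoding (utf-8) the same call fails to split the columns, which is exactly the single 'Unnamed: 0' column full of NaN that the question describes.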