backtrader time column : ValueError: time data '0' does not match format '%Y-%m-%d %H:%M:%S'
I have a price table that has date and time in a csv format:
Date Time o h l c v
0 2020-07-09 15:10:00 8 8 7.5 7.94 41
1 2020-07-09 15:00:00 7.61 8.24 7.61 8.24 10
2 2020-07-09 14:50:00 8.3 8.3 7.7 7.7 7
3 2020-07-09 14:40:00 8.72 8.72 8.3 8.3 7
4 2020-07-09 14:30:00 8.72 8.72 8.39 8.39 8
5 2020-07-09 14:20:00 8.35 8.6 8.3 8.6 6
6 2020-07-09 14:10:00 8.18 8.46 8.18 8.45 22
7 2020-07-09 14:00:00 8.5 8.5 8.5 8.5 1
ValueError: time data '0' does not match format '%Y-%m-%d %H:%M:%S'
This is the error I get from running these code snippets.
data = bt.feeds.GenericCSVData(
    dataname='ticks2.csv',
    nullvalue=float('NaN'),
    dtformat='%Y/%m/%d',  # %H:%M:%S
    tmformat='%H:%M:%S',
    datetime=0,
    time=1,
    open=2,
    high=3,
    low=4,
    close=5,
    volume=6,
)
I tried to merge the Date and Time columns to fix this problem, but to no avail; the error stays the same.
df = pd.read_csv('ticks.csv', parse_dates=[['Date', 'Time']])
print(df)
del df["Unnamed: 0"]
First thing is that you have an index as the first column in your CSV (i.e. 0, 1, 2, 3, 4...), but you don't have a column name for it in the first line of the CSV. You need to add a name for it to the CSV header (first line), e.g. just name it Index, so that the modified first CSV line looks like Index Date Time o h l c v.
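As a side note (my sketch, not part of the answer above): pandas can often infer an unnamed index column by itself. When the header row has one fewer field than the data rows, read_csv treats the first data column as the row index automatically, so renaming the header is one option but not strictly required:

```python
import io
import pandas as pd

# Comma-separated sample mimicking the question's layout: the header names
# 7 columns, but every data row has 8 fields (the leading unnamed index).
csv_text = (
    "Date,Time,o,h,l,c,v\n"
    "0,2020-07-09,15:10:00,8,8,7.5,7.94,41\n"
    "1,2020-07-09,15:00:00,7.61,8.24,7.61,8.24,10\n"
)

# With one more data field than header fields, pandas uses the first
# column as the row index instead of raising an error.
df = pd.read_csv(io.StringIO(csv_text))
print(df.columns.tolist())  # the 7 named columns
print(df.index.tolist())    # [0, 1] taken from the unnamed first column
```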
Second thing is that it looks like you have tabs instead of commas as the cell separator in your CSV, so you need to tell read_csv about it with sep = '\t', i.e. pd.read_csv('test.csv', sep = '\t', parse_dates = [['Date', 'Time']]).
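To illustrate why the separator matters, here is a minimal sketch (with made-up two-row data, not the question's file): reading tab-separated text with the default comma separator leaves each whole row in a single unsplit column.

```python
import io
import pandas as pd

# Two tab-separated lines standing in for the question's file.
raw = "Date\tTime\tc\n2020-07-09\t15:10:00\t7.94\n"

bad = pd.read_csv(io.StringIO(raw))             # default sep=','
good = pd.read_csv(io.StringIO(raw), sep='\t')  # tabs handled correctly

print(bad.shape)   # (1, 1): the whole row stays in one cell
print(good.shape)  # (1, 3): Date, Time, c parsed separately
```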
Below is a corrected working example. I wrote it for the case of sep = ',' because StackOverflow strips tabs from text, so I can't show them; for your case, just change sep = ',' to sep = '\t' inside read_csv(...). You can see in my example that my csv has Index added at the beginning of the first csv line. Also, at the start of my example there is a test-csv-file writing block; you don't need this block, as you already have your file.
To conclude, you have to do two things:
1. Add Index plus a tab to the first (header) line of your CSV.
2. Add sep = '\t' to your read_csv(...) if you have a tab-separated CSV, which it looks like you have.
# This file-writing block is not needed, it is to create example file
with open('test.csv', 'w', encoding = 'utf-8') as f:
    f.write("""
Index,Date,Time,o,h,l,c,v
0,2020-07-09,15:10:00,8,8,7.5,7.94,41
1,2020-07-09,15:00:00,7.61,8.24,7.61,8.24,10
2,2020-07-09,14:50:00,8.3,8.3,7.7,7.7,7
3,2020-07-09,14:40:00,8.72,8.72,8.3,8.3,7
4,2020-07-09,14:30:00,8.72,8.72,8.39,8.39,8
5,2020-07-09,14:20:00,8.35,8.6,8.3,8.6,6
6,2020-07-09,14:10:00,8.18,8.46,8.18,8.45,22
7,2020-07-09,14:00:00,8.5,8.5,8.5,8.5,1
""")
# This code is needed to solve task
# Change to "sep = '\t'" for your case of tab-separated CSV
import pandas as pd
df = pd.read_csv('test.csv', sep = ',', parse_dates = [['Date', 'Time']])
print(df)
Output:
Date_Time Index o h l c v
0 2020-07-09 15:10:00 0 8.00 8.00 7.50 7.94 41
1 2020-07-09 15:00:00 1 7.61 8.24 7.61 8.24 10
2 2020-07-09 14:50:00 2 8.30 8.30 7.70 7.70 7
3 2020-07-09 14:40:00 3 8.72 8.72 8.30 8.30 7
4 2020-07-09 14:30:00 4 8.72 8.72 8.39 8.39 8
5 2020-07-09 14:20:00 5 8.35 8.60 8.30 8.60 6
6 2020-07-09 14:10:00 6 8.18 8.46 8.18 8.45 22
7 2020-07-09 14:00:00 7 8.50 8.50 8.50 8.50 1
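One caveat worth adding (my note, not part of the original answer): the nested-list form parse_dates = [['Date', 'Time']] is deprecated in recent pandas versions. A forward-compatible sketch combines the two columns explicitly with pd.to_datetime instead:

```python
import io
import pandas as pd

# Same data as the answer's example, but combining Date and Time manually
# instead of using the nested-list parse_dates form.
csv_text = """Index,Date,Time,o,h,l,c,v
0,2020-07-09,15:10:00,8,8,7.5,7.94,41
1,2020-07-09,15:00:00,7.61,8.24,7.61,8.24,10
"""

df = pd.read_csv(io.StringIO(csv_text), sep=',')
df['Date_Time'] = pd.to_datetime(df['Date'] + ' ' + df['Time'],
                                 format='%Y-%m-%d %H:%M:%S')
df = df.drop(columns=['Date', 'Time']).set_index('Date_Time')
print(df)
```

A datetime index like this can then be handed to backtrader through bt.feeds.PandasData(dataname=df), provided the o/h/l/c/v columns are renamed to open/high/low/close/volume that PandasData expects by default; that step is outside the scope of the answer above.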