Pandas to_records() timestamp to date
I have the following data frame:
              total
scanned_date
2021-11-01        0
2021-11-02        0
2021-11-03        0
2021-11-04        0
2021-11-05        0
Where scanned_date is a Timestamp object. I want to convert the data to a list of tuples like
[
(2021-11-01, 0),
(2021-11-02, 0),
(2021-11-03, 0),
...
]
But when using
list(df.to_records())
it keeps the time component, while I only want the date string:
[('2021-11-01T00:00:00.000000000', 0), ('2021-11-02T00:00:00.000000000', 0), ('2021-11-03T00:00:00.000000000', 0)]
How can I remove the time string T00:00:00.000000000 from the to_records() output?
Try converting the index with strftime:
df.index = df.index.strftime('%Y-%m-%d')
list(df.to_records())
Out[212]:
[('2021-11-01', 0),
('2021-11-02', 0),
('2021-11-03', 0),
('2021-11-04', 0),
('2021-11-05', 0)]
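An alternative sketch (assuming the same single-column frame, rebuilt here from a small hypothetical CSV) converts the index to plain datetime.date objects instead of strings before calling to_records():

```python
import io

import pandas as pd

txt = """scanned_date,total
2021-11-01,0
2021-11-02,0
"""
df = pd.read_csv(io.StringIO(txt), sep=",",
                 parse_dates=["scanned_date"],
                 index_col="scanned_date")

# Replace the DatetimeIndex with plain datetime.date objects,
# so to_records() emits dates instead of nanosecond timestamps.
df.index = df.index.date
records = list(df.to_records())
print(records)
```

This keeps the values as date objects (useful if they are processed further) rather than formatted strings; use the strftime approach above when strings are the end goal.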
I tried to do the date conversion in numpy but chose to switch to pandas. In numpy you're working with a 64-bit integer. I used a map function and a lambda to convert each dataframe record into a (date, value) tuple:
import io

import pandas as pd

txt = """scanned_date,total
2021-11-01,0
2021-11-02,0
2021-11-03,0
2021-11-04,0
2021-11-05,0
"""
# https://www.py4u.net/discuss/17020
df = pd.read_csv(io.StringIO(txt), sep=',', parse_dates=['scanned_date'])
print(list(map(lambda tuple_obj:
               (pd.to_datetime(tuple_obj[1]),  # tuple_obj[1] is the scanned_date field
                tuple_obj[2]),                 # tuple_obj[2] is the total field
               df.to_records())))
Output:
[(Timestamp('2021-11-01 00:00:00'), 0), (Timestamp('2021-11-02 00:00:00'), 0), (Timestamp('2021-11-03 00:00:00'), 0), (Timestamp('2021-11-04 00:00:00'), 0), (Timestamp('2021-11-05 00:00:00'), 0)]
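Note that this map still yields Timestamp objects, not the date strings the question asks for. A small variation (a sketch on the same hypothetical two-row CSV) formats each Timestamp to a plain date string inside the lambda:

```python
import io

import pandas as pd

txt = """scanned_date,total
2021-11-01,0
2021-11-02,0
"""
df = pd.read_csv(io.StringIO(txt), sep=",", parse_dates=["scanned_date"])

# rec[1] is the scanned_date field, rec[2] is the total field;
# strftime turns the Timestamp into a 'YYYY-MM-DD' string.
result = list(map(lambda rec: (pd.to_datetime(rec[1]).strftime("%Y-%m-%d"), rec[2]),
                  df.to_records()))
print(result)
```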