How to convert a list of dictionary values containing Timestamp objects into datetime objects in Python?
I have a dictionary containing the results of a user's social media activity. Each value of the dictionary is a list of Timestamp objects, and I want to convert them to time objects (datetime.time()).
Here is the dictionary:
{'Instagram': [Timestamp('2020-08-23 04:16:05.12456'), Timestamp('2020-08-23 04:17:02.88754'), Timestamp('2020-08-23 05:20:21.43215'), Timestamp('2020-08-23 06:21:19.63441'), Timestamp('2020-08-23 08:23:15.76421')],
 'Twitter': [Timestamp('2020-08-23 05:19:12.21245'), Timestamp('2020-08-23 06:21:10.09875'), Timestamp('2020-08-23 07:22:08.65784'), Timestamp('2020-08-23 08:23:25.09123')],
 'Facebook': [Timestamp('2020-08-23 04:01:46.436778'), Timestamp('2020-08-23 05:19:19.34213'), Timestamp('2020-08-23 05:20:25.56784'), Timestamp('2020-08-23 08:23:12.22567')]}
And here is the code I wrote to convert the Timestamp objects to datetime objects:
for i in range(2, len(dataset.columns)):
    d[dataset.columns[i]] = pd.to_datetime(d[dataset.columns[i]], unit="ms")
But I get an error:
ValueError: unit='ms' not valid with non-numerical val='2020-08-23 04:16:05.12456'
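The error occurs because `unit="ms"` only applies when the input is numeric (epoch milliseconds); datetime strings are parsed directly, so the `unit` argument must be dropped. A minimal sketch:

```python
import pandas as pd

# unit="ms" is for numeric epoch values; strings like these
# are parsed without it.
ts = pd.to_datetime('2020-08-23 04:16:05.12456')
print(ts)  # 2020-08-23 04:16:05.124560
```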
Use a nested dict comprehension with a list comprehension:
d = {k: [x.time() for x in v] for k, v in d.items()}
But if you are building this from a DataFrame, you can instead create a new column of times and aggregate after groupby():
df['d'] = pd.to_datetime(df['DateTime'], format='(%Y,%m,%d,%H,%M,%S)')
# added time column
df['time'] = df['d'].dt.time
day1 = df['d'].dt.date[0]
df = df[df['d'].dt.date.eq(day1)]
df = df.melt(['DateTime', 'd', 'time'])
df = df[df['value'].eq('Y')]
d = df.groupby('variable')['time'].agg(list).to_dict()