Summing two values from different dataframes if certain criteria are matched (Python)
I would like to sum two columns, each in a different dataframe, if certain criteria are met.
Dataframe 1:
desk Type total_position
desk1 ES 786.0
desk1 ES1 100
desk2 ES1 0
desk2 ES2 10
desk3 ES 0
desk4 ES1 0
desk5 ES -757
Dataframe 2:
desk Type total_position
desk1 ES -758.0
desk2 ES 0
desk3 ES -29
desk4 ES 0.0
desk5 ES 786.0
I would like to sum the two positions only where the Type in the first dataframe is "ES" and the desk is the same.
How do I do that?
Expected answer:
desk Type total_position
desk1 ES 28
desk2 ES1 0
desk3 ES -29
desk4 ES1 0
desk5 ES 29
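For reference, the answers below assume the two frames are named df1 and df2. A minimal construction matching the tables above (note the answers' printed outputs show five rows, so they were presumably run against a df1 without the desk1/ES1 and desk2/ES2 rows):

import pandas as pd

# the two frames exactly as shown in the question's tables
df1 = pd.DataFrame({
    'desk': ['desk1', 'desk1', 'desk2', 'desk2', 'desk3', 'desk4', 'desk5'],
    'Type': ['ES', 'ES1', 'ES1', 'ES2', 'ES', 'ES1', 'ES'],
    'total_position': [786.0, 100, 0, 10, 0, 0, -757],
})
df2 = pd.DataFrame({
    'desk': ['desk1', 'desk2', 'desk3', 'desk4', 'desk5'],
    'Type': ['ES'] * 5,
    'total_position': [-758.0, 0, -29, 0.0, 786.0],
})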
Try this: add via the index, and update the missing values from df1 using combine_first:
df1.set_index('desk').add(df2.set_index('desk')).combine_first(df1.set_index('desk'))
NB: this works on the naive assumption that each desk in df1 is the same desk in df2.
Type total_position
desk
desk1 ES 28.0
desk2 ES1 0.0
desk3 ES -29.0
desk4 ES1 0.0
desk5 ES 29.0
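If you want to skip the combine_first pass, Series.add accepts a fill_value that treats a missing side as 0; restricting the add to the numeric column also keeps the string Type column out of the alignment entirely (a sketch under the same df1/df2 naming, still ignoring the Type condition like the answer above):

# numeric-only variant: align on desk, treat a desk missing on either side as 0
left = df1.set_index('desk')['total_position']
right = df2.set_index('desk')['total_position']
summed = left.add(right, fill_value=0)
print(summed)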
I would map and then add:
# look up each desk's position in df2 and add it onto df1's position
df1['total_position'] = df1['total_position'].add(
    df1['desk'].map(df2.set_index('desk')['total_position']))
print(df1)
desk Type total_position
0 desk1 ES 28.0
1 desk2 ES1 0.0
2 desk3 ES -29.0
3 desk4 ES1 0.0
4 desk5 ES 29.0
EDIT for type:
# map df2's per-desk position onto df1, keep it only where Type is 'ES',
# and fall back to 0 everywhere else (including unmatched desks)
m = (df1['desk'].map(df2.set_index('desk')['total_position'])
       .where(df1['Type'].eq('ES')).fillna(0))
df1['total_position'] = df1['total_position'].add(m)
print(df1)
desk Type total_position
0 desk1 ES 28.0
1 desk2 ES1 0.0
2 desk3 ES -29.0
3 desk4 ES1 0.0
4 desk5 ES 29.0
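The same conditional add can also be spelled with numpy.where if you prefer the mask inline (my own variant on the answer above, not from the original post; same df1/df2 naming):

import numpy as np

# per-desk lookup of df2's position, 0 where the desk has no match
pos2 = df1['desk'].map(df2.set_index('desk')['total_position']).fillna(0)

# add the looked-up value only on rows whose Type is 'ES'
df1['total_position'] = np.where(df1['Type'].eq('ES'),
                                 df1['total_position'] + pos2,
                                 df1['total_position'])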
Use where (here df is the first dataframe and df1 the second; this relies on the two frames being row-aligned):
>>> df['total_position'] = (df.total_position+df1.total_position).where(df.desk.eq(df1.desk) & df.Type.eq('ES'), 0)
>>> df
desk Type total_position
0 desk1 ES 28.0
1 desk2 ES1 0.0
2 desk3 ES -29.0
3 desk4 ES1 0.0
4 desk5 ES 29.0