
Filter Pandas DataFrame using value_counts and multiple columns?

I have a dataset of orders and the people who placed those orders. Orders have a unique identifier, and buyers have a unique identifier across multiple orders. Here's an example of that dataset:

| Order_ID | Order_Date | Buyer_ID |
|----------|------------|----------|
| 123421   | 01/01/19   | a213422  |
| 123421   | 01/01/19   | a213422  |
| 123421   | 01/01/19   | a213422  |
| 346345   | 01/03/19   | a213422  |
| 567868   | 01/05/19   | a346556  |
| 567868   | 01/05/19   | a346556  |
| 234534   | 01/10/19   | a678909  |

I want to be able to filter the dataset to individuals who have only placed one order, even if that order has multiple items:

| Order_ID | Order_Date | Buyer_ID |
|----------|------------|----------|
| 567868   | 01/05/19   | a346556  |
| 567868   | 01/05/19   | a346556  |
| 234534   | 01/10/19   | a678909  |

If I try `df[df['Buyer_ID'].map(df['Buyer_ID'].value_counts()) == 1]` I get a really weird situation where the resulting dataframe contains only rows where there is a 1-to-1 relationship between `Order_ID` and `Buyer_ID`. Like this:

| Order_ID | Order_Date | Buyer_ID |
|----------|------------|----------|
| 346345   | 01/03/19   | a213422  |
| 234534   | 01/10/19   | a678909  |

In the result I want, `Buyer_ID` a213422 should not appear at all, because that person has more than one `Order_ID`.

This leads me to believe that `value_counts()` is either not the appropriate way to perform this filter, or I'm doing it wrong. What would be the appropriate way to perform this filter?
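The root of the surprise is that `value_counts()` counts rows, not distinct orders: a buyer whose single order has several line items already gets a count above 1, so that buyer is dropped by `== 1`. A minimal sketch of the diagnosis, rebuilding the sample data from the question:

```python
import pandas as pd

# Sample data from the question.
df = pd.DataFrame({
    "Order_ID": [123421, 123421, 123421, 346345, 567868, 567868, 234534],
    "Order_Date": ["01/01/19", "01/01/19", "01/01/19", "01/03/19",
                   "01/05/19", "01/05/19", "01/10/19"],
    "Buyer_ID": ["a213422", "a213422", "a213422", "a213422",
                 "a346556", "a346556", "a678909"],
})

# value_counts() counts rows per buyer, not unique Order_IDs:
row_counts = df["Buyer_ID"].value_counts()
print(row_counts)
# a213422 -> 4 rows, a346556 -> 2 rows, a678909 -> 1 row

# So "== 1" keeps only buyers whose single order also has a single line
# item, wrongly dropping a346556 (one order, two items):
print(df[df["Buyer_ID"].map(row_counts) == 1])
```

What is needed is a count of distinct `Order_ID`s per buyer, which is what the answers below compute with `nunique`.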

Method 1: boolean indexing with `groupby.transform`

df[df.groupby('Buyer_ID')['Order_ID'].transform('nunique').eq(1)]

Method 2: `Groupby.filter`

df.groupby('Buyer_ID').filter(lambda x: x['Order_ID'].nunique()==1)

Method 3: boolean indexing with `Series.map`

df[df['Buyer_ID'].map(df.groupby('Buyer_ID')['Order_ID'].nunique().eq(1))]

Output

   Order_ID Order_Date Buyer_ID
4    567868   01/05/19  a346556
5    567868   01/05/19  a346556
6    234534   01/10/19  a678909

If you want to remove duplicates, use `DataFrame.drop_duplicates` at the end:

df[df.groupby('Buyer_ID')['Order_ID'].transform('nunique').eq(1)].drop_duplicates()


   Order_ID Order_Date Buyer_ID
4    567868   01/05/19  a346556
6    234534   01/10/19  a678909
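The three methods should agree on the sample data; a quick self-check (a sketch, rebuilding the DataFrame from the question):

```python
import pandas as pd

# Sample data from the question.
df = pd.DataFrame({
    "Order_ID": [123421, 123421, 123421, 346345, 567868, 567868, 234534],
    "Order_Date": ["01/01/19", "01/01/19", "01/01/19", "01/03/19",
                   "01/05/19", "01/05/19", "01/10/19"],
    "Buyer_ID": ["a213422", "a213422", "a213422", "a213422",
                 "a346556", "a346556", "a678909"],
})

# Method 1: boolean mask from a per-row count of distinct orders per buyer.
m1 = df[df.groupby("Buyer_ID")["Order_ID"].transform("nunique").eq(1)]

# Method 2: keep whole groups whose distinct-order count is 1.
m2 = df.groupby("Buyer_ID").filter(lambda x: x["Order_ID"].nunique() == 1)

# Method 3: map each row's Buyer_ID to a per-buyer boolean.
m3 = df[df["Buyer_ID"].map(df.groupby("Buyer_ID")["Order_ID"].nunique().eq(1))]

assert m1.equals(m2) and m1.equals(m3)
print(m1)
```

All three keep the two a346556 rows (one order, two items) and the single a678909 row, and drop a213422 entirely.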

Here's another way you could do it:

import pandas as pd

# | Order_ID | Order_Date | Buyer_ID |
# |----------|------------|----------|
# | 123421   | 01/01/19   | a213422  |
# | 123421   | 01/01/19   | a213422  |
# | 123421   | 01/01/19   | a213422  |
# | 346345   | 01/03/19   | a213422  |
# | 567868   | 01/05/19   | a346556  |
# | 567868   | 01/05/19   | a346556  |
# | 234534   | 01/10/19   | a678909  |

df = pd.DataFrame.from_dict({
    "Order_ID": [123421, 123421, 123421, 346345, 567868, 567868, 234534],
    "Order_Date": ["01/01/19", "01/01/19", "01/01/19", "01/03/19", "01/05/19", "01/05/19", "01/10/19"],
    "Buyer_ID": ["a213422", "a213422", "a213422", "a213422", "a346556", "a346556", "a678909"],
})

buyers_with_one_order = df.groupby(["Buyer_ID"]) \
                          .agg(num_orders=("Order_ID", pd.Series.nunique)) \
                          .query("num_orders == 1") \
                          .reset_index() \
                          .Buyer_ID

filtered_df = df.merge(buyers_with_one_order).drop_duplicates()

print(filtered_df.to_string(index=False))

# | Order_ID | Order_Date | Buyer_ID |
# |----------|------------|----------|
# | 567868   | 01/05/19   | a346556  |
# | 234534   | 01/10/19   | a678909  |
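A variant of the same idea that avoids the merge, using `Series.isin` on the set of buyers with exactly one distinct order (a sketch over the same sample DataFrame):

```python
import pandas as pd

df = pd.DataFrame({
    "Order_ID": [123421, 123421, 123421, 346345, 567868, 567868, 234534],
    "Order_Date": ["01/01/19", "01/01/19", "01/01/19", "01/03/19",
                   "01/05/19", "01/05/19", "01/10/19"],
    "Buyer_ID": ["a213422", "a213422", "a213422", "a213422",
                 "a346556", "a346556", "a678909"],
})

# Distinct orders per buyer, then the index of buyers with exactly one.
order_counts = df.groupby("Buyer_ID")["Order_ID"].nunique()
one_order_buyers = order_counts[order_counts == 1].index

# Keep rows for those buyers; drop duplicate line items as before.
filtered_df = df[df["Buyer_ID"].isin(one_order_buyers)].drop_duplicates()
print(filtered_df.to_string(index=False))
```

This skips building an intermediate DataFrame for the join and keeps the original row order without a `reset_index`.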
