How to Remove Rows from a Pandas Data Frame That Contain Any String in a Particular Column
I have CSV data in the following format:
+-------------+-------------+-------+
| Location    | Num of Reps | Sales |
+-------------+-------------+-------+
| 75894       | 3           | 12    |
| Burbank     | 2           | 19    |
| 75286       | 7           | 24    |
| Carson City | 4           | 13    |
| 27659       | 3           | 17    |
+-------------+-------------+-------+
The Location column is of the object datatype. What I would like to do is remove all rows that have non-numeric Location labels. So my desired output, given the above table, would be:
+----------+-------------+-------+
| Location | Num of Reps | Sales |
+----------+-------------+-------+
| 75894    | 3           | 12    |
| 75286    | 7           | 24    |
| 27659    | 3           | 17    |
+----------+-------------+-------+
Now, I could hard code the solution in the following manner:
list1 = ['Carson City', 'Burbank']
df = df[~df['Location'].isin(list1)]
Which was inspired by the following post:
How to drop rows from pandas data frame that contains a particular string in a particular column?
However, what I am looking for is a general solution that will work for any table of the type outlined above.
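For anyone who wants to try the answers below, here is a minimal sketch that rebuilds the sample data. The names csv_data and df and the use of an in-memory CSV are illustrative choices, not part of the original question:

import io

import pandas as pd

# Rebuild the sample table; dtype=str keeps Location as strings (object dtype),
# matching the situation described in the question.
csv_data = """Location,Num of Reps,Sales
75894,3,12
Burbank,2,19
75286,7,24
Carson City,4,13
27659,3,17"""

df = pd.read_csv(io.StringIO(csv_data), dtype={'Location': str})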
Or you could do
df[df['Location'].str.isnumeric()]
  Location  Num of Reps  Sales
0    75894            3     12
2    75286            7     24
4    27659            3     17
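A small follow-up, purely as an illustration: the filter keeps the original index labels (0, 2, 4 above), so if you want a fresh 0..n-1 index you can chain reset_index. The name numeric_only is just a placeholder:

# Keep only rows whose Location is made of numeric characters, then renumber from 0.
numeric_only = df[df['Location'].str.isnumeric()].reset_index(drop=True)
print(numeric_only)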
You can use pd.to_numeric to coerce non-numeric values to NaN and then filter on whether Location is NaN:
df[pd.to_numeric(df.Location, errors='coerce').notnull()]
#  Location  Num of Reps  Sales
#0    75894            3     12
#2    75286            7     24
#4    27659            3     17
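As a hedged variation on this approach (the name cleaned and the use of assign/dropna are my own choices, not from the answer): if you also want the surviving Location values stored as numbers instead of strings, you can keep the coerced column rather than only using it as a mask:

# Coerce Location to numeric (non-numeric labels become NaN), then drop those rows.
cleaned = (
    df.assign(Location=pd.to_numeric(df['Location'], errors='coerce'))
      .dropna(subset=['Location'])
)
# Note: Location ends up as float64 here; use .astype({'Location': 'int64'}) if integers are needed.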
In [139]: df[~df.Location.str.contains(r'\D')]
Out[139]:
  Location  Num of Reps  Sales
0    75894            3     12
2    75286            7     24
4    27659            3     17
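One caveat worth adding (my note, not part of the answer): if the Location column can contain missing values, str.contains returns NaN for them and indexing with that mask can raise an error. Passing na=True treats a missing Location like a non-digit, so the row is simply dropped:

# NaN Locations are treated as containing a non-digit (na=True), so they get dropped too.
mask = ~df['Location'].str.contains(r'\D', na=True)
numeric_rows = df[mask]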
df[df['Location'].str.isdigit()]
  Location  Num of Reps  Sales
0    75894            3     12
2    75286            7     24
4    27659            3     17
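Finally, a hedged note on the difference between the last two answers: str.isdigit() and str.isnumeric() only diverge on Unicode characters such as fractions or superscripts; for plain ASCII labels like the ones in this table they give the same result. A tiny illustration (the sample Series is made up):

import pandas as pd

s = pd.Series(['75894', '½', '²', 'Burbank'])
print(s.str.isdigit().tolist())    # [True, False, True, False]
print(s.str.isnumeric().tolist())  # [True, True, True, False]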