Select rows from a DataFrame based on presence of null value in specific column or columns
I have an imported xls file as a pandas DataFrame. It has two columns containing coordinates, which I will use to merge the DataFrame with others that have geolocation data. df.info() shows 8859 records, and the coordinate columns show '8835 non-null float64' records.
I want to eyeball the 24 rows (which I assume are null) with all their columns, to see whether one of the other columns (street address, town) can be used to manually add back the coordinates for those 24 records. I.e.:

return dataframe for column in df['Easting'] where isnull or NaN
I have adapted the method given here, as below:
df.loc[df['Easting'] == NaN]
But I get back an empty dataframe (0 rows × 24 columns), which makes no sense (to me). Attempting to use Null or Non null doesn't work, as those values aren't defined. What am I missing?
I think you need isnull for checking NaN values with boolean indexing:
df[df['Easting'].isnull()]
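A minimal runnable sketch of that selection; the column names mirror the question, but the data here is made up for illustration:

```python
import numpy as np
import pandas as pd

# Toy frame standing in for the imported xls data (values are invented)
df = pd.DataFrame({
    "Easting": [530000.0, np.nan, 531500.0, np.nan],
    "Northing": [180000.0, 181000.0, np.nan, np.nan],
    "Street": ["High St", "Low Rd", "Mid Ave", "End Ln"],
})

# Boolean mask selects the rows where Easting is missing,
# keeping every column so they can be inspected by eye
missing = df[df["Easting"].isnull()]
print(missing)
```

The mask returns the full rows (all columns intact), which is exactly what is needed to look up the street address or town for each record with a missing coordinate.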
Warning

One has to be mindful that in Python (and NumPy), nan's don't compare equal, but None's do. Note that pandas/NumPy uses the fact that np.nan != np.nan, and treats None like np.nan.
In [11]: None == None
Out[11]: True
In [12]: np.nan == np.nan
Out[12]: False
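Because of that asymmetry, pandas' isnull (also available as pd.isna) recognises both missing-value sentinels uniformly, which is why it works where == does not; a quick check:

```python
import numpy as np
import pandas as pd

# pd.isna treats both None and np.nan as missing,
# even though they behave differently under ==
print(pd.isna(None))    # True
print(pd.isna(np.nan))  # True
print(np.nan == np.nan) # False
```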
So, as compared to the above, a scalar equality comparison against None/np.nan doesn't provide useful information.
In [13]: df2['one'] == np.nan
Out[13]:
a False
b False
c False
d False
e False
f False
g False
h False
Name: one, dtype: bool
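Putting it together for the question's use case: since both coordinate columns have 8835 non-null values, the rows to eyeball are those where either coordinate is missing. A sketch with invented data (column names assumed from the question):

```python
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "Easting": [530000.0, np.nan, 531500.0],
    "Northing": [180000.0, 181000.0, np.nan],
    "Address": ["1 High St", "2 Low Rd", "3 Mid Ave"],
})

# Rows where Easting OR Northing is missing, all columns kept
to_fix = df[df["Easting"].isnull() | df["Northing"].isnull()]
print(to_fix)
```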