Applying circular filter to image in Python / Applying function to each element of numpy array

I have Python code that works, but it's quite slow, and I believe there has to be a way of doing this more efficiently. The idea is to apply a filter to an image: each point is replaced by the average of all points that fall within a specified radius of it. The input is an m×2 array of the x,y coordinates of m observation points and an m×1 array z of the values at those points.

The program that works is the following

import numpy as np

def haversine(point, xy_list):
    # Great-circle distance in meters from one (lon, lat) point to every point in xy_list.
    earth_radius = 6378137.0
    dlon = np.radians(xy_list[:,0]) - np.radians(point[0])
    dlat = np.radians(xy_list[:,1]) - np.radians(point[1])
    # Haversine formula; note the cosine terms take the latitudes (column 1), not the longitudes
    a = np.square(np.sin(dlat/2.0)) + np.cos(np.radians(point[1])) * np.cos(np.radians(xy_list[:,1])) * np.square(np.sin(dlon/2.0))
    return 2 * earth_radius * np.arcsin(np.sqrt(a))

def circular_filter(xy, z, radius):
    filtered = np.zeros(xy.shape[0])
    for q in range(xy.shape[0]):
        dist = haversine(xy[q,:], xy)
        masked_z = np.ma.masked_where(dist > radius, z)  # mask out points beyond the radius
        filtered[q] = masked_z.mean()                    # mean of the remaining (nearby) points
    return filtered

x = np.random.uniform(low=-90, high=0, size=(1000,1)) # x represents longitude
y = np.random.uniform(low=0, high=90, size=(1000,1)) # y represents latitude
xy = np.hstack((x,y))
z = np.random.rand(1000,)
filtered_z = circular_filter(xy, z, radius=100.)

The problem is that I have 6 million points per data set, and the code is horribly slow. There must be a way to do this more efficiently. I thought of using scipy.spatial.distance.cdist(), which is fast, but then I'd have to reproject the data to UTM, and I'd like to avoid reprojection. Any suggestions?
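
One direction that might work is scikit-learn's BallTree, which supports a haversine metric, so the radius query could run on lat/long directly without reprojecting. A minimal sketch of what I have in mind, assuming scikit-learn is available (untested on the full 6-million-point data set):

import numpy as np
from sklearn.neighbors import BallTree

def circular_filter_balltree(xy, z, radius):
    earth_radius = 6378137.0
    latlon = np.radians(xy[:, [1, 0]])  # BallTree's haversine metric expects [lat, lon] in radians
    tree = BallTree(latlon, metric='haversine')
    # Haversine distances come back on the unit sphere, so scale the search radius accordingly
    neighbors = tree.query_radius(latlon, r=radius / earth_radius)
    return np.array([z[idx].mean() for idx in neighbors])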

Thanks, Reniel

After a lot of reading and searching, I finally found the reason my code took forever to run: I needed to understand and apply the concept of a filter kernel. Basically, I realized there was a connection between my problem and this post: Local Maxima with circular window

The downside: the user needs to provide the proper EPSG code, but I think I can find workarounds for this later.

The upside: It is very fast and efficient.

What worked for me was converting the lat/long to UTM so that I could create a circular kernel and apply scipy's generic_filter:

import time
import numpy as np
from pyproj import Proj, transform
from scipy.ndimage import generic_filter

def circular_filter(tile, radius):
    # Flatten the lon/lat grid so it can be reprojected in a single call
    x, y = np.meshgrid(tile['lon'], tile['lat'])
    x    = x.reshape(x.size)
    y    = np.flipud(y.reshape(y.size))
    z    = tile['values'].reshape(tile['values'].size)
    wgs84  = Proj(init='epsg:4326')
    utm18N = Proj(init='epsg:26918')  # user must supply the proper EPSG code for their zone
    x, y = transform(wgs84, utm18N, x, y)

    dem_res = np.abs(x[1]-x[0])  # raster resolution in meters (original data is a geoTiff read using gdal)

    radius = int(np.ceil(radius/dem_res))  # user gives meters, but we figure out the number of cells
    print(radius)
    # Circular kernel (footprint): 1 inside the radius, 0 outside
    kernel = np.zeros((2*radius+1, 2*radius+1))
    y, x = np.ogrid[-radius:radius+1, -radius:radius+1]
    mask = x**2 + y**2 <= radius**2
    kernel[mask] = 1
    print('Commence circular filter.'); start = time.time()
    tile['values'] = generic_filter(tile['values'], np.mean, footprint=kernel)
    print('Took {:.3f} seconds'.format(time.time()-start))
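
For context, circular_filter expects a dict-like tile with 1-D 'lon' and 'lat' vectors and a 2-D 'values' grid. A made-up call, just to show the expected shapes (the grid below is synthetic and sits in UTM zone 18N territory; it stands in for the real geoTiff data):

import numpy as np

tile = {
    'lon': np.arange(-75.0, -74.9, 0.001),  # hypothetical 0.001-degree grid
    'lat': np.arange(40.0, 40.1, 0.001),
}
tile['values'] = np.random.rand(tile['lat'].size, tile['lon'].size)
circular_filter(tile, radius=100.)  # 100 m averaging radius; result replaces tile['values']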

I also took a look at clustering techniques from here: http://geoffboeing.com/2014/08/clustering-to-reduce-spatial-data-set-size/

But I realized these clustering techniques serve a completely different purpose.
