
How can I convert my Python code to run on the GPU instead of the CPU?

I was given a Python script that takes two images as input, uses a Gabor filter to find the correlation of the RGB values of the two images, and saves the result in a CSV file. I need to run the program on the GPU because it currently takes a lot of time and CPU. I have a GeForce GTX 1050 Ti and am a complete beginner in programming.

I did some research and learned about CUDA and TensorFlow, but I am really unsure how to go about implementing it, and what the best way is to do it without changing much of the code.

import cv2
import numpy as np

#Gabor Filter: build a bank of Gabor kernels over a few sizes, orientations and sigmas
def build_filters():
    filters = []
    #testing phase filter - reduced parameter ranges
    for ksize in range(9, 19, 5):
        for theta in np.arange(45, 225, 45):
            for sigma in range(2, 6, 2):
                kern = cv2.getGaborKernel((ksize, ksize), sigma, theta, 5.0, 0.5, 0, ktype=cv2.CV_32F)
                kern /= 1.5 * kern.sum()
                filters.append(kern)
    return filters

#Apply the filters to the image, keeping the maximum response per pixel
def process(images, f):
    accum = np.zeros_like(images)
    for kern in f:
        fimg = cv2.filter2D(images, cv2.CV_8UC3, kern)
        np.maximum(accum, fimg, accum)
    return accum

The full code: https://gitlab.com/t.tansuwan/image_diff_kce/blob/master/allPixelNoCrop.py

Thank you!

Numba can compile a small subset of Python (mostly NumPy-based numeric code) to fast machine code, and with the CUDA toolkit it can also target the GPU.

You'll want to install numba and cudatoolkit with the conda package manager: conda install numba cudatoolkit. Then you can add the @jit(nopython=True, parallel=True) decorator to the functions you want to speed up.
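For example, here is a minimal sketch of that decorator on a plain-NumPy filtering loop similar to process(). The function name max_response, the grayscale float image, and the assumption that all kernels share one size are mine for illustration; this is not the original script, and note that parallel=True spreads work across CPU cores rather than the GPU.

import numpy as np
from numba import jit, prange

@jit(nopython=True, parallel=True)
def max_response(image, kernels):
    # Naive 2D correlation of a grayscale image with a stack of equal-sized
    # kernels, keeping the element-wise maximum response per pixel.
    h, w = image.shape
    accum = np.zeros((h, w), dtype=np.float32)
    for k in range(kernels.shape[0]):
        kern = kernels[k]
        kh, kw = kern.shape
        for i in prange(h):          # rows are processed in parallel
            for j in range(w):
                s = 0.0
                for di in range(kh):
                    for dj in range(kw):
                        ii = i + di - kh // 2
                        jj = j + dj - kw // 2
                        if 0 <= ii < h and 0 <= jj < w:
                            s += image[ii, jj] * kern[di, dj]
                accum[i, j] = max(accum[i, j], s)
    return accum

The first call triggers compilation, so time the second call when checking the speed-up.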

I'm not sure Numba can be used with OpenCV, but you could certainly try. Python is not really suited to high-performance computation; you're better off learning FORTRAN, a shader language, or C and implementing your computation in one of those.
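If you do want to try the GPU route that the first answer hints at, one option is to skip OpenCV's filter2D for the inner loop and write the convolution as a numba.cuda kernel. The sketch below is my illustration only (kernel name convolve_gpu, image size and block size are made up); it assumes the conda-installed CUDA toolkit can see the GTX 1050 Ti.

import numpy as np
from numba import cuda

@cuda.jit
def convolve_gpu(image, kern, out):
    # One GPU thread computes one output pixel.
    i, j = cuda.grid(2)
    h, w = image.shape
    kh, kw = kern.shape
    if i < h and j < w:
        s = 0.0
        for di in range(kh):
            for dj in range(kw):
                ii = i + di - kh // 2
                jj = j + dj - kw // 2
                if 0 <= ii < h and 0 <= jj < w:
                    s += image[ii, jj] * kern[di, dj]
        out[i, j] = s

# Example launch: copy arrays to the device, choose a grid, run the kernel.
image = np.random.rand(512, 512).astype(np.float32)
kern = np.ones((9, 9), dtype=np.float32) / 81.0
out = np.zeros_like(image)
d_img = cuda.to_device(image)
d_kern = cuda.to_device(kern)
d_out = cuda.to_device(out)
threads = (16, 16)
blocks = ((image.shape[0] + 15) // 16, (image.shape[1] + 15) // 16)
convolve_gpu[blocks, threads](d_img, d_kern, d_out)
out = d_out.copy_to_host()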
