
Matrix max a la Matlab on GPU

I'm porting some code from Matlab to run on an Nvidia GPU. I can't figure out a way to do the following:

B = max(A, 0)

where A and B are matrices. In words, I need to replace negative values in a matrix with zeros. I know how to write a kernel function to do this, but I'd like to stick with cuBLAS or MAGMA calls if possible (to avoid adding nvcc to my build process).

I've come up with something using thrust:

thrust::transform(A, A + m*n, A, [] __device__ (double x) { return thrust::max(x, 0.0); });

If this is incorrect, or if there is a better solution, I'm open to other suggestions.
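
For reference, here is a minimal, self-contained sketch of the Thrust approach, assuming A is a raw device pointer to m*n contiguous doubles and the operation is done in place (the clamp_negatives wrapper and clamp_to_zero functor names are my own, not part of any library). Note that Thrust device code still has to be compiled with nvcc when targeting the CUDA backend:

#include <thrust/device_ptr.h>
#include <thrust/transform.h>

// Functor that replaces negative values with zero (element-wise max with 0).
struct clamp_to_zero
{
    __host__ __device__
    double operator()(double x) const { return x < 0.0 ? 0.0 : x; }
};

// In-place B = max(A, 0) over an m-by-n matrix stored contiguously on the device.
void clamp_negatives(double* d_A, int m, int n)
{
    thrust::device_ptr<double> p(d_A);
    thrust::transform(p, p + m * n, p, clamp_to_zero());
}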
