
CUDA Shared Memory Possibility

I'm having a bit of an issue understanding how I could make use of CUDA shared memory, since I'm not using the thread index for anything other than deciding which part of the work a given thread should do.

__global__ void gpu_histogram_equalization(unsigned char * img_out, unsigned char * img_in,
                            int * hist_in, int img_size, int nbr_bin, int numOfThreads, int * lut){


    int i = 0;
    int x = threadIdx.x + blockDim.x*blockIdx.x;

    int start;
    int end;

    /* Get the result image */
    if(x >= img_size) {
       return;
    }
    start = ((img_size/numOfThreads) * x);
    if(numOfThreads == 1) {
       end = (img_size/numOfThreads);
    }
    else {
       end = ((img_size/numOfThreads) * (x+1));
    }
    for(i = start; i < end; i ++){
        if(lut[img_in[i]] > 255){
            img_out[i] = 255;
        }
        else{
            img_out[i] = (unsigned char)lut[img_in[i]];
        }

    }
}

Can anyone confirm whether my suspicion is correct that shared memory cannot be put to use here?

Using shared memory gives a performance increase only if you reuse the same data multiple times. Here every input and output element is touched exactly once, so the code can instead be rewritten to process one element per thread, which makes better use of memory bandwidth (coalesced accesses) and drops shared memory entirely.

Something like this:

__global__ void gpu_histogram_equalization(unsigned char * img_out, unsigned char * img_in,
                        int * hist_in, int img_size, int nbr_bin, int numOfThreads, int * lut){
  int lutval;
  int x = threadIdx.x + blockDim.x*blockIdx.x;

  /* Get the result image */
  if(x >= img_size) {
     return;
  }

  lutval = lut[img_in[x]];

  if(lutval > 255){
      img_out[x] = 255;
  }
  else{
      img_out[x] = (unsigned char)lutval;
  }
}
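
For completeness, the rewritten kernel processes one pixel per thread, so the launch must cover the whole image rather than using the original numOfThreads-sized chunking. A minimal host-side sketch, assuming device pointers d_img_out, d_img_in, d_hist_in and d_lut have already been allocated and copied to the GPU (the block size of 256 is an arbitrary choice, not part of the original answer):

int blockSize = 256;                                     // threads per block (assumed value)
int gridSize  = (img_size + blockSize - 1) / blockSize;  // round up so every pixel gets a thread

gpu_histogram_equalization<<<gridSize, blockSize>>>(d_img_out, d_img_in, d_hist_in,
                                                    img_size, nbr_bin,
                                                    gridSize * blockSize,  // numOfThreads is no longer used by the kernel
                                                    d_lut);
cudaDeviceSynchronize();                                 // wait for the kernel before copying results back

Because each thread now handles exactly one element, consecutive threads read consecutive bytes of img_in and write consecutive bytes of img_out, which is what allows the hardware to coalesce the memory transactions.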
