
How to extract the relative colour intensity in a black and white image in Python?

Suppose I have a black and white image. How do I convert the colour intensity at each point into a numerical value that represents its relative intensity?

I checked somewhere on the web and found the following: Intensity = np.asarray(PIL.Image.open('test.jpg'))

What's the difference between asarray and array? Also, the shape of the array Intensity is (181L, 187L, 3L). The size of the image test.jpg is 181x187, so what does the extra 3 represent?

And are there any better ways of extracting the colour intensity of an image? Thank you.

The image is being opened as a colour image, not as a black and white one. That is why the shape is 181x187x3: the extra 3 is there because each pixel is an RGB value. Quite often images in black and white are actually stored in an RGB format. For an image image, if np.all(image[:,:,0]==image[:,:,1]) and likewise for the third channel, then the channels are identical and you can just use any one of them (e.g. image[:,:,0]). Alternatively, you could take the mean across the channels with np.mean(image,axis=2).
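A minimal sketch of that check and conversion, assuming a file named test.jpg in the working directory and that Pillow and NumPy are installed:

```python
import numpy as np
from PIL import Image

# Load the image as a NumPy array; shape is (height, width, 3) for an RGB image.
image = np.asarray(Image.open('test.jpg'))

# If all three channels are identical, the picture is greyscale stored as RGB,
# and any single channel already holds the per-pixel intensity.
if np.all(image[:, :, 0] == image[:, :, 1]) and np.all(image[:, :, 0] == image[:, :, 2]):
    intensity = image[:, :, 0]
else:
    # Otherwise, average the channels to collapse RGB into one value per pixel.
    intensity = image.mean(axis=2)

print(intensity.shape)  # (height, width) -- one intensity value per pixel
```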

Note too that the range of values will depend on the format, so depending on what you mean by colour intensity, you may need to normalize them. In the case of a JPEG, the values are probably uint8s, so you may want image[:,:,0].astype('float')/255 or something similar.
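A short sketch of that normalization, assuming the array really does hold uint8 values as a JPEG typically would:

```python
import numpy as np
from PIL import Image

image = np.asarray(Image.open('test.jpg'))

# JPEG pixel values are uint8 integers in [0, 255]; casting to float and
# dividing by 255 rescales the intensities to the range [0, 1].
relative_intensity = image[:, :, 0].astype('float') / 255

print(relative_intensity.min(), relative_intensity.max())
```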
