
Relationship between brightness scale and luminance range of the screen

I have the following problem: using a manual photometer, I measured the luminance of the screen in cd/m2. The measurement was performed for RGB(0, 0, 0) and RGB(255, 255, 255) patterns, which gave luminance values of 0.639 and 101.2 cd/m2, respectively. Now, taking the range spanned by those two values, I want to create 10 gray-scale colors (or even RGB colors) equally spread over that luminance range, so that I get the following luminance values for those colors:

[  10.0561   20.1122   30.1683   40.2244   50.2805   60.3366   70.3927
   80.4488   90.5049  100.561 ]
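
For reference, a minimal numpy sketch that reproduces the target values above, assuming each step is one tenth of the measured range, i.e. k * (L_max - L_min) / 10 for k = 1..10:

```python
import numpy as np

# Measured screen luminance (cd/m2) for RGB(0,0,0) and RGB(255,255,255)
L_min, L_max = 0.639, 101.2

# Ten targets spaced by one tenth of the measured range
step = (L_max - L_min) / 10
targets = step * np.arange(1, 11)
print(targets)
# [ 10.0561  20.1122  30.1683 ...  90.5049  100.561 ]
```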

I was thinking about using the brightness scale, but to be honest, I don't know how I could achieve values like those shown above by manipulating that scale. Thank you in advance.

I found a solution based on six luminance measurements of configuration patterns with HSB brightness equal to 0, 20, 40, 60, 80 and 100 respectively, after which a 2nd-order polynomial curve fit is performed on the measured luminance values. As a result I get a curve that looks roughly exponential:

[plot of measured luminance (cd/m2) vs. HSB brightness with the fitted 2nd-order polynomial curve]
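
A minimal sketch of that fitting step with numpy; note that the four middle luminance readings below are hypothetical placeholders (only the 0.639 and 101.2 cd/m2 endpoints come from the measurements in the question), so substitute your own photometer values:

```python
import numpy as np

# HSB brightness levels at which the luminance was measured
brightness = np.array([0, 20, 40, 60, 80, 100])

# Photometer readings in cd/m2 -- the four middle values are made-up
# placeholders; only the endpoints are actual measurements
luminance = np.array([0.639, 4.2, 14.8, 33.5, 62.0, 101.2])

# 2nd-order polynomial fit: L(b) = c2*b^2 + c1*b + c0
coeffs = np.polyfit(brightness, luminance, 2)
fit = np.poly1d(coeffs)

print(coeffs)    # fitted coefficients [c2, c1, c0]
```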

Now, I can just pick a desired brightness value and read off the appropriate luminance value for it from the fitted curve.
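
Continuing the sketch above, `fit(b)` gives the predicted luminance for a chosen brightness `b`. If you instead need the brightness that produces a given target luminance (as in the original question), you can solve the fitted quadratic for `b`; this inversion step is an extra sketch on top of the answer, not something measured:

```python
# Forward direction: predicted luminance at a chosen HSB brightness
print(fit(50))

# Inverse direction: brightness that yields a desired luminance L_target,
# found by solving c2*b^2 + c1*b + (c0 - L_target) = 0 and keeping the
# real root inside the 0..100 brightness range
def brightness_for_luminance(fit, L_target, lo=0.0, hi=100.0):
    c2, c1, c0 = fit.coefficients
    roots = np.roots([c2, c1, c0 - L_target])
    real = [r.real for r in roots if abs(r.imag) < 1e-9 and lo <= r.real <= hi]
    if not real:
        raise ValueError("target luminance is outside the fitted range")
    return real[0]

print(brightness_for_luminance(fit, 50.2805))   # brightness for ~50.28 cd/m2
```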

