

What is the Android Camera2 API equivalent of Camera.Parameters.getHorizontalViewAngle() and Camera.Parameters.getVerticalViewAngle()?

It's all in the title, but in the now-deprecated Android Camera API, there were two methods: Camera.Parameters.getHorizontalViewAngle() and Camera.Parameters.getVerticalViewAngle().

Now, with the current Camera2 API, it seems there is no equivalent to these in the docs. I'm assuming that this is because FOV angles are more complicated and nuanced than a simple horizontal and vertical value, but I can't find any information online about how to calculate the total field of view for an Android device using the newer Camera2 API.

The basic formula is:

FOV.x = 2 * atan(SENSOR_INFO_PHYSICAL_SIZE.x / (2 * LENS_FOCAL_LENGTH))
FOV.y = 2 * atan(SENSOR_INFO_PHYSICAL_SIZE.y / (2 * LENS_FOCAL_LENGTH))

This is an approximation assuming an ideal lens, etc., but generally good enough.

This calculates the FOV for the entire sensor pixel array.
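For example, plugging in the 6.4 x 4.8 mm sensor and the 4.77 mm focal length reported for a Mi 9T Pro in the last answer below:

FOV.x = 2 * atan(6.4 / (2 * 4.77)) ≈ 67.7°
FOV.y = 2 * atan(4.8 / (2 * 4.77)) ≈ 53.4°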

However, the actual field of view of a given output will be smaller; first, the readout area of the sensor is often smaller than the full pixel array, so instead of using PHYSICAL_SIZE directly, you need to first scale it by the ratio of the active array pixel count to the full pixel array pixel count (SENSOR_INFO_ACTIVE_ARRAY_SIZE / SENSOR_INFO_PIXEL_ARRAY_SIZE).
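As a small sketch of that scaling step (my own code, not from the original answer; it assumes SENSOR_INFO_PHYSICAL_SIZE describes the full pixel array, as above):

private SizeF getActiveArrayPhysicalSize(CameraCharacteristics info) {
    // Assumed imports: android.graphics.Rect, android.util.Size, android.util.SizeF
    SizeF physicalSize = info.get(CameraCharacteristics.SENSOR_INFO_PHYSICAL_SIZE);
    Size pixelArray = info.get(CameraCharacteristics.SENSOR_INFO_PIXEL_ARRAY_SIZE);
    Rect activeArray = info.get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);

    // Physical size (mm) of the active array = full physical size scaled by the
    // active-array / pixel-array pixel count ratio in each dimension.
    return new SizeF(
            physicalSize.getWidth() * activeArray.width() / pixelArray.getWidth(),
            physicalSize.getHeight() * activeArray.height() / pixelArray.getHeight());
}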

Then, the field of view depends on the aspect ratio of the output(s) you've configured (a 16:9 FOV will be different than a 4:3 FOV), relative to the aspect ratio of the active array, and on the aspect ratio of the crop region (digital zoom) if it's smaller than the full active array.

Each output buffer will be the result of minimally cropping the cropRegion of the corresponding capture request further, to reach the correct output aspect ratio (http://source.android.com/devices/camera/camera3_crop_reprocess.html has diagrams).

So let's say we have a sensor that has a pixel array of (120,120), and we have an active array rectangle of (10,10)-(110,110), so width/height of 100,100.

We configure two outputs: output A is (40,30), output B is (50,50). Let's leave the crop region at the maximum, (0,0)-(100,100).

The horizontal FOV for output A and B will be the same, because the maximum-area crop will result in both outputs using the full active array width:

output_physical_width = SENSOR_INFO_PHYSICAL_SIZE.x * ACTIVE_ARRAY.w / PIXEL_ARRAY.w
FOV_x = 2 * atan(output_physical_width / (2 * LENS_FOCAL_LENGTH))

However, the vertical FOVs will differ - output A will only use 3/4 of the vertical space due to the aspect ratio mismatch:

active_array_aspect = ACTIVE_ARRAY.w / ACTIVE_ARRAY.h
output_a_aspect = output_a.w / output_a.h
output_b_aspect = output_b.w / output_b.h
output_a_physical_height = SENSOR_INFO_PHYSICAL_SIZE.y * ACTIVE_ARRAY.h / PIXEL_ARRAY.h * active_array_aspect / output_a_aspect
output_b_physical_height = SENSOR_INFO_PHYSICAL_SIZE.y * ACTIVE_ARRAY.h / PIXEL_ARRAY.h * active_array_aspect / output_b_aspect
FOV_a_y = 2 * atan(output_a_physical_height / (2 * LENS_FOCAL_LENGTH))
FOV_b_y = 2 * atan(output_b_physical_height / (2 * LENS_FOCAL_LENGTH))

The above works when the output aspect ratio is >= the active array aspect ratio (letterboxing: the full active array width is used and the vertical extent is cropped); if that's not true, then the output horizontal dimension is reduced and the vertical dimension covers the whole active array (pillarboxing). The scale factor for the horizontal direction is then output_aspect / active_array_aspect.

If you want to calculate the FOV for a zoomed-in view, then substitute the crop region dimensions/aspect ratio for the active array dimensions/aspect ratio.
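Putting these steps together, here is a rough sketch (my own consolidation, not code from this answer). It takes the configured output size and the SCALER_CROP_REGION rectangle of the capture request (pass a rectangle covering the full active array for no zoom) and returns the FOV in degrees:

private float[] getOutputFov(CameraCharacteristics info, Size outputSize, Rect cropRegion) {
    // Ideal-lens approximation; assumed imports: android.graphics.Rect, android.util.Size, android.util.SizeF
    SizeF physicalSize = info.get(CameraCharacteristics.SENSOR_INFO_PHYSICAL_SIZE);
    Size pixelArray = info.get(CameraCharacteristics.SENSOR_INFO_PIXEL_ARRAY_SIZE);
    float focalLength = info.get(CameraCharacteristics.LENS_INFO_AVAILABLE_FOCAL_LENGTHS)[0];

    // Physical dimensions (mm) of the crop region; crop-region pixels have the
    // same pitch as pixel-array pixels.
    float cropW = physicalSize.getWidth() * cropRegion.width() / pixelArray.getWidth();
    float cropH = physicalSize.getHeight() * cropRegion.height() / pixelArray.getHeight();

    float cropAspect = (float) cropRegion.width() / cropRegion.height();
    float outputAspect = (float) outputSize.getWidth() / outputSize.getHeight();

    if (outputAspect >= cropAspect) {
        // Output is relatively wider: the full crop width is used, the height is trimmed.
        cropH *= cropAspect / outputAspect;
    } else {
        // Output is relatively narrower: the full crop height is used, the width is trimmed.
        cropW *= outputAspect / cropAspect;
    }

    float fovX = (float) Math.toDegrees(2 * Math.atan(cropW / (2 * focalLength)));
    float fovY = (float) Math.toDegrees(2 * Math.atan(cropH / (2 * focalLength)));
    return new float[] { fovX, fovY };
}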

private float getHFOV(CameraCharacteristics info) {
    // Full-sensor horizontal FOV, in radians (ideal-lens approximation).
    SizeF sensorSize = info.get(CameraCharacteristics.SENSOR_INFO_PHYSICAL_SIZE);
    float[] focalLengths = info.get(CameraCharacteristics.LENS_INFO_AVAILABLE_FOCAL_LENGTHS);

    if (sensorSize != null && focalLengths != null && focalLengths.length > 0) {
        return (float) (2.0f * Math.atan(sensorSize.getWidth() / (2.0f * focalLengths[0])));
    }

    // Fallback default (~63°) if the characteristics are unavailable.
    return 1.1f;
}

While Eddy Talvala's answer gives you a solution, it is only valid for objects at a distance. However, as https://en.wikipedia.org/wiki/Angle_of_view#Macro_photography notes, for close subjects you cannot assume the distance between the lens and the sensor to be equal to the focal length. This distance cannot be queried, but it can be calculated: S_2 = S_1 * focal length / (S_1 - focal length)

In this case, S_1 is the minimum focus distance, which can be computed from LENS_INFO_MINIMUM_FOCUS_DISTANCE when LENS_INFO_FOCUS_DISTANCE_CALIBRATION is not UNCALIBRATED; in that case LENS_INFO_MINIMUM_FOCUS_DISTANCE has a unit of diopters (1/m).

So the final formula becomes:

// focal length/physical size is in mm -> convert from diopter (1/m) to mm
S_1 = 1000 / LENS_INFO_MINIMUM_FOCUS_DISTANCE
S_2 = LENS_FOCAL_LENGTH * S_1 / (S_1 - LENS_FOCAL_LENGTH)
FOV.x = 2 * atan(SENSOR_INFO_PHYSICAL_SIZE.x / (2 * S_2))
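A sketch of this correction in code (my own, not from this answer), falling back to the simple far-field formula when the minimum focus distance is unknown, zero (fixed-focus), or uncalibrated:

private float getMacroVFov(CameraCharacteristics info) {
    SizeF sensorSize = info.get(CameraCharacteristics.SENSOR_INFO_PHYSICAL_SIZE);          // mm
    float f = info.get(CameraCharacteristics.LENS_INFO_AVAILABLE_FOCAL_LENGTHS)[0];        // mm
    Float minFocus = info.get(CameraCharacteristics.LENS_INFO_MINIMUM_FOCUS_DISTANCE);     // diopters when calibrated
    Integer calibration = info.get(CameraCharacteristics.LENS_INFO_FOCUS_DISTANCE_CALIBRATION);

    // Far subjects: lens-to-sensor distance ~ focal length.
    float lensToSensor = f;
    if (minFocus != null && minFocus > 0 && calibration != null
            && calibration != CameraCharacteristics.LENS_INFO_FOCUS_DISTANCE_CALIBRATION_UNCALIBRATED) {
        float s1 = 1000f / minFocus;          // diopters (1/m) -> mm
        lensToSensor = f * s1 / (s1 - f);     // thin-lens equation: S_2 = f * S_1 / (S_1 - f)
    }
    // Vertical FOV in degrees at the minimum focus distance.
    return (float) Math.toDegrees(2 * Math.atan(sensorSize.getHeight() / (2 * lensToSensor)));
}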

I measured this and compared it with the computation for a Mi 9T Pro, with the following results:

name                            value
Focal Length                    4.77 mm
Physical Sensor Size            6.4 x 4.8 mm
Minimum focus distance          10 diopters
Hyperfocal distance             0.123 diopters
Calculated FOV.y (hyperfocal)   53.4°
Calculated FOV.y (macro)        51.2°
Frame size (y)                  9.6 cm
Distance                        10.2 cm
Measured FOV                    50.4°
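The measured value is presumably derived from the frame size and the subject distance in the same way:

measured FOV = 2 * atan((9.6 / 2) / 10.2) ≈ 50.4°

which agrees much better with the macro-corrected 51.2° than with the far-field 53.4°.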
