Apple ARKit — Create an ARFrame from a CGImage
I would like to use ARKit to obtain a light estimate from an image. I was able to retrieve a light estimate from the frames of a video:
var sceneView = new ARSCNView();
var arConfig = new ARKit.ARWorldTrackingConfiguration { PlaneDetection = ARPlaneDetection.Horizontal };
sceneView.Session.Run(arConfig, ARSessionRunOptions.ResetTracking);
// CurrentFrame (and its LightEstimate) can be null until the session has produced a frame.
var frame = sceneView.Session.CurrentFrame;
float light = (float)frame.LightEstimate.AmbientIntensity;
However, is it possible to instantiate an ARFrame directly from a CGImage? Something like:
CGImage img = new CGImage("my file.jpg");
ARFrame frame = new ARFrame(img);
float light = frame.LightEstimate.AmbientIntensity;
Solutions using Swift or Xamarin are welcome.
Sorry, but ARFrame wraps a CVPixelBuffer, which represents a video frame and, depending on the device, is likely in a different format than a CGImage. Also, ARFrame has no public initializer, and the
var capturedImage: CVPixelBuffer
property is read-only. However, if you are getting the CGImage from the camera, then why not get the light estimate at the time of capture and save it along with the image?
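A minimal Swift sketch of that approach, assuming a running ARSession feeding an ARSessionDelegate (the `saveImage` helper is hypothetical and stands in for whatever persistence you use):

```swift
import ARKit
import CoreImage

class CaptureDelegate: NSObject, ARSessionDelegate {
    let context = CIContext()

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Read the light estimate at capture time; it can be nil.
        guard let estimate = frame.lightEstimate else { return }
        let intensity = estimate.ambientIntensity          // in lumens
        let temperature = estimate.ambientColorTemperature // in Kelvin

        // Convert the captured pixel buffer to a CGImage for saving.
        let ciImage = CIImage(cvPixelBuffer: frame.capturedImage)
        guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return }

        // Persist the image together with its lighting metadata
        // (hypothetical helper, e.g. writing a JPEG plus a sidecar file):
        // saveImage(cgImage, metadata: ["ambientIntensity": intensity,
        //                               "ambientColorTemperature": temperature])
        _ = (cgImage, intensity, temperature)
    }
}
```

Later, when you load the saved image, you read the stored intensity back instead of trying to reconstruct an ARFrame from the CGImage.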