Problem Description

I am doing this Augmented Reality project starting from Xcode's default AR project.
I need to know the focal length of the camera used by ARKit.
This page defines focal length well. That said, Apple offers a camera matrix called intrinsics, defined as:

var intrinsics: simd_float3x3 { get }
According to Apple:

The values fx and fy are the pixel focal length and are identical for square pixels. The values ox and oy are the offsets of the principal point from the top-left corner of the image frame. All values are expressed in pixels.
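For reference, the standard pinhole layout of that matrix puts the four parameters in these slots (column-major, as simd_float3x3 stores it):

// | fx  0   ox |
// | 0   fy  oy |
// | 0   0   1  |
//
// fx = intrinsics.columns.0.x    fy = intrinsics.columns.1.y
// ox = intrinsics.columns.2.x    oy = intrinsics.columns.2.y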
I am getting the same number for fx and fy, that is, 1515.481. To obtain the real focal length in millimeters:
- This page says I need to use this formula: F(mm) = F(pixels) * SensorWidth(mm) / ImageWidth(pixels), but I don't have the sensor dimensions.
- This other page says FC = fx/sx = fy/sy, where sx and sy are the image width and height, which I suppose will give me two different numbers (because fx = fy while sx ≠ sy)... and this is back to square one (see the sketch after this list).
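Just to show what the first formula would yield, here is a minimal sketch; the sensor width below is an assumed, typical value for a 1/2.55-inch smartphone sensor, since that dimension is exactly what I'm missing:

// F(mm) = F(pixels) * SensorWidth(mm) / ImageWidth(pixels)
let fPixels: Float = 1515.481           // fx = fy reported by the intrinsics
let imageWidthPixels: Float = 1920      // from camera.imageResolution
let assumedSensorWidthMM: Float = 4.93  // ASSUMPTION: typical 1/2.55" sensor width

let fMM = fPixels * assumedSensorWidthMM / imageWidthPixels
print(fMM)  // ≈ 3.89 mm under the assumed sensor width

Note that this would be the physical focal length; figures like "28 mm" quoted for phone lenses are 35mm-equivalent values, so the two numbers are not directly comparable.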
On iPhone 11, ARCamera captures a frame with the dimensions 1920x1440; at least, that is the number reported by the property camera.imageResolution.
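For completeness, this is how I read that value (standard ARKit API; sceneView is the project's ARSCNView outlet):

import ARKit

if let camera = sceneView.session.currentFrame?.camera {
    print(camera.imageResolution)  // (1920.0, 1440.0) on iPhone 11
}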
In the name of mental sanity, is there a way to get the focal length of the ARCamera used by RealityKit?
Solution

ARKit and RealityKit definitely have identical values of the focal length parameter; that's because these two frameworks are supposed to work together. And although there's no focal length instance property for ARView at the moment, you can easily print a focal length in the Console for an ARSCNView or SCNView:

@IBOutlet var sceneView: ARSCNView!

sceneView.pointOfView?.camera?.focalLength
However, take into account that the ARKit, RealityKit and SceneKit frameworks don't use the screen resolution; rather, they use a viewport size. The magnification factor for iPhones' viewports is usually 1/2 or 1/3.
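To see that factor on a device, you can compare the viewport size (in points) with the screen scale; this is plain UIKit, sketched here as an illustration:

let viewportSize = sceneView.bounds.size       // e.g. 375 x 812 points on iPhone X
let scale = UIScreen.main.scale                // 3.0 on iPhone X, 2.0 on many other models
print("Magnification factor: \(1.0 / scale)")  // 1/3 or 1/2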
Intrinsic Camera Matrix
As you said, in ARKit there's a 3x3 camera matrix allowing you to convert between the 2D camera plane and the 3D world coordinate space.
var intrinsics: simd_float3x3 { get }
Using this matrix you can print 4 important parameters: fx, fy, ox and oy. Let's print them all:

DispatchQueue.main.asyncAfter(deadline: .now() + 2.0) {
    print("Focal Length: \(self.sceneView.pointOfView?.camera?.focalLength)")
    print("Sensor Height: \(self.sceneView.pointOfView?.camera?.sensorHeight)")  // sensor height in mm

    let frame = self.sceneView.session.currentFrame

    // Intrinsics matrix
    print("Intrinsics fx: \(frame?.camera.intrinsics.columns.0.x)")
    print("Intrinsics fy: \(frame?.camera.intrinsics.columns.1.y)")
    print("Intrinsics ox: \(frame?.camera.intrinsics.columns.2.x)")
    print("Intrinsics oy: \(frame?.camera.intrinsics.columns.2.y)")
}
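By the way, if a field of view is what you're really after, the standard pinhole relation recovers it from fx and the image width. A minimal sketch using the question's numbers:

import Foundation

// Horizontal FoV from the pinhole model: hFoV = 2 * atan(W / (2 * fx))
let fx: Float = 1515.481
let imageWidth: Float = 1920

let hFoV = 2 * atan(imageWidth / (2 * fx)) * 180 / Float.pi
print("Horizontal FoV: \(hFoV)°")  // ≈ 64.7°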
For iPhone X, a focal length of 20.78 mm is printed. When you apply your formulas you'll get an implausible result (read on to find out why).
About Wide-Angle Lens and OIS
iPhone X has two image sensors, and both camera modules are equipped with optical image stabilization (OIS). The wide-angle lens has a focal length of 28 mm and an aperture of f/1.8, while the telephoto lens is 56 mm, f/2.4. ARKit and RealityKit use the wide-angle rear module; in the iPhone X's case it's a 28 mm lens. But what about the printed value of focal length = 20.78 mm, huh? I believe that the discrepancy between the 28 mm and 20.78 mm values is due to the fact that video stabilization eats up about 25% of the total image area. This is done in order to eventually arrive at a focal length value of 28 mm for the final image. (Image caption: the red frame is the cropping margin at the stabilization stage.)
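Here's a minimal sketch of the crop-to-focal-length relation this reasoning leans on (standard optics, not something documented by Apple): cropping the frame by a linear factor scales the effective focal length by the reciprocal of that factor.

// effective = captured * (fullWidth / croppedWidth)
func effectiveFocalLength(captured: Double, fullWidth: Double, croppedWidth: Double) -> Double {
    captured * (fullWidth / croppedWidth)
}

// Going from 20.78 mm (captured) to 28 mm (final) implies keeping
// roughly 20.78 / 28 ≈ 74% of the frame width after stabilization.
print(effectiveFocalLength(captured: 20.78, fullWidth: 1.0, croppedWidth: 20.78 / 28.0))  // ≈ 28.0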
Conclusion
This is my own conclusion. I didn't find any reference materials on that subject, so do not judge me strictly if my opinion is wrong (I admit it may be).
We all know that camera shake is magnified as focal length increases, so the lower the focal length, the less the camera shake. That is very important for non-jittering, high-quality world tracking in an AR app. Also, I firmly believe that optical image stabilizers work much better at lower focal lengths. Hence, it's no surprise that ARKit engineers chose a lower focal length for the AR experience (capturing a wider image area); then, after stabilization, we get a modified version of the image, as if it had focal length = 28 mm. So, in my humble opinion, it makes no sense to calculate a REAL focal length for RealityKit and ARKit, because there is a "FAKE" focal length already implemented by Apple engineers for a robust AR experience.