Problem Description
My goal:
I am trying to get the TrueDepth camera parameters (such as the intrinsics, extrinsics, lens distortion, etc.) while doing face tracking. I have read that there are some examples and that this is possible with OpenCV. I am just wondering how one should achieve a similar goal in Swift.
What I have read and tried:
I have read Apple's documentation on ARCamera: intrinsics, and on AVCameraCalibrationData: extrinsicMatrix and intrinsicMatrix.
However, all I found was just the declarations for both AVCameraCalibrationData and ARCamera:
For AVCameraCalibrationData
For intrinsicMatrix
var intrinsicMatrix: matrix_float3x3 { get }
For extrinsicMatrix
var extrinsicMatrix: matrix_float4x3 { get }
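For reference, both matrices use simd's column-major layout, so the focal lengths and principal point can be read straight out of the intrinsic matrix. A minimal sketch, assuming you already have an AVCameraCalibrationData instance (the function name is mine):

import AVFoundation

// Sketch only: assumes `calibration` was obtained elsewhere,
// e.g. from photo.depthData?.cameraCalibrationData.
func printCalibration(_ calibration: AVCameraCalibrationData) {
    let m = calibration.intrinsicMatrix                    // matrix_float3x3, column-major
    print("fx: \(m.columns.0.x), fy: \(m.columns.1.y)")    // focal lengths in pixels
    print("ox: \(m.columns.2.x), oy: \(m.columns.2.y)")    // principal point in pixels
    print("extrinsic [R|t]: \(calibration.extrinsicMatrix)")  // matrix_float4x3
}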
I also read this post: get Camera Calibration Data on iOS and tried Bourne's suggestion:
func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    let ex = photo.depthData?.cameraCalibrationData?.extrinsicMatrix
    //let ex = photo.cameraCalibrationData?.extrinsicMatrix
    let int = photo.cameraCalibrationData?.intrinsicMatrix
    photo.depthData?.cameraCalibrationData?.lensDistortionCenter
    print("ExtrinsicM: \(String(describing: ex))")
    print("isCameraCalibrationDataDeliverySupported: \(output.isCameraCalibrationDataDeliverySupported)")
}
But it does not print the matrix at all.
For ARCamera, I read Andy Fedoroff's Focal Length of the camera used in RealityKit:
var intrinsics: simd_float3x3 { get }
func inst() {
    DispatchQueue.main.asyncAfter(deadline: .now() + 2.0) {
        print("Focal Length: \(String(describing: self.sceneView.pointOfView?.camera?.focalLength))")
        print("Sensor Height: \(String(describing: self.sceneView.pointOfView?.camera?.sensorHeight))")
        // SENSOR HEIGHT IN mm
        let frame = self.sceneView.session.currentFrame
        // INTRINSICS MATRIX
        print("Intrinsics fx: \(String(describing: frame?.camera.intrinsics.columns.0.x))")
        print("Intrinsics fy: \(String(describing: frame?.camera.intrinsics.columns.1.y))")
        print("Intrinsics ox: \(String(describing: frame?.camera.intrinsics.columns.2.x))")
        print("Intrinsics oy: \(String(describing: frame?.camera.intrinsics.columns.2.y))")
    }
}
It shows the render camera parameters:
Focal Length: Optional(20.784610748291016)
Sensor Height: Optional(24.0)
Intrinsics fx: Optional(1277.3052)
Intrinsics fy: Optional(1277.3052)
Intrinsics ox: Optional(720.29443)
Intrinsics oy: Optional(539.8974)
However, this only shows the render camera, not the TrueDepth camera that I am using for face tracking.
So can anyone help me get started with getting the TrueDepth camera parameters, since the documentation does not really show any example beyond the declarations?
Thank you very much!
Recommended Answer
The reason why you cannot print the intrinsics is probably that you got nil somewhere in the optional chaining. You should have a look at Apple's remarks here and here.
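To see exactly where the chain breaks, you can unwrap it step by step instead of chaining. A quick diagnostic sketch of the delegate callback (the print messages are mine):

func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photo: AVCapturePhoto,
                 error: Error?) {
    guard let depthData = photo.depthData else {
        // nil unless depth delivery was enabled on both the output and the settings
        print("depthData is nil")
        return
    }
    guard let calibration = depthData.cameraCalibrationData else {
        print("cameraCalibrationData is nil")
        return
    }
    print("ExtrinsicM: \(calibration.extrinsicMatrix)")
}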
So if you want to get the intrinsicMatrix and extrinsicMatrix of the TrueDepth camera, you should use builtInTrueDepthCamera as the input device, set isDepthDataDeliveryEnabled on the pipeline's photo output to true, and also set isDepthDataDeliveryEnabled to true when you capture the photo. Then you can access the calibration matrices in the photoOutput(_:didFinishProcessingPhoto:error:) callback by reading the depthData.cameraCalibrationData property of the photo argument.
Here is a code sample for setting up such a pipeline.
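The sample below is a minimal sketch of such a pipeline, assuming a simple controller object that owns the capture session; the class and method names are illustrative and error handling is elided:

import AVFoundation

// Sketch only: the class and method names are illustrative.
class TrueDepthCaptureController: NSObject, AVCapturePhotoCaptureDelegate {
    let session = AVCaptureSession()
    let photoOutput = AVCapturePhotoOutput()

    func configure() {
        session.beginConfiguration()
        session.sessionPreset = .photo

        // Use the front TrueDepth camera as the input device.
        guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                                   for: .video,
                                                   position: .front),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { return }
        session.addInput(input)

        guard session.canAddOutput(photoOutput) else { return }
        session.addOutput(photoOutput)

        // First opt-in: enable depth delivery on the photo output
        // (only valid after the output has been added to the session).
        photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported

        session.commitConfiguration()
        session.startRunning()
    }

    func capturePhoto() {
        let settings = AVCapturePhotoSettings()
        // Second opt-in: request depth for this particular capture.
        settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliveryEnabled
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard let calibration = photo.depthData?.cameraCalibrationData else {
            print("No calibration data delivered")
            return
        }
        print("Intrinsic matrix: \(calibration.intrinsicMatrix)")
        print("Extrinsic matrix: \(calibration.extrinsicMatrix)")
        print("Lens distortion center: \(calibration.lensDistortionCenter)")
    }
}

Note that depth delivery has to be requested twice: once on the photo output when configuring the session, and once in the AVCapturePhotoSettings for each capture. If either opt-in is missing, depthData (and with it cameraCalibrationData) will be nil in the callback.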