Problem Description
I am building an ARKit app where we want to be able to take a photo of the scene. I am finding the image quality of the ARCamera view is not good enough to take photos with on an iPad Pro.
Standard camera image:
ARCamera image:
I have seen an Apple forum post that mentions this could be specific to the iPad Pro 10.5 and related to a fixed lens position (https://forums.developer.apple.com/message/262950#262950).
Is there a public way to change this setting?
Alternatively, I have tried to use AVCaptureSession to take a normal photo and apply it to sceneView.scene.background.contents, to switch out the blurred image for a higher-resolution image at the point the photo is taken, but I can't get AVCapturePhotoOutput to work with ARKit.
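For reference, a minimal sketch of the swap being attempted, assuming `sceneView` is the app's ARSCNView and `photo` is a UIImage obtained from some separate capture path (the part that couldn't be made to work alongside ARKit). ARSCNView normally drives the scene background with the live camera feed, so overriding it like this is not an officially supported pattern and may be reset by the view:

```swift
import ARKit
import SceneKit
import UIKit

// Sketch only: replace the live camera background with a captured still.
// Assumes `photo` comes from a separate capture path outside ARKit.
func showCapturedPhoto(_ photo: UIImage, in sceneView: ARSCNView) {
    sceneView.scene.background.contents = photo
}
```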
Recommended Answer
Update: Congrats to whoever filed feature requests! In iOS 11.3 (aka "ARKit 1.5"), you can control at least some of the capture settings. And you now get 1080p with autofocus enabled by default.
Check ARWorldTrackingConfiguration.supportedVideoFormats for a list of ARConfiguration.VideoFormat objects, each of which defines a resolution and frame rate. The first in the list is the default (and best) option supported on your current device, so if you just want the best resolution/framerate available you don't have to do anything. (And if you want to step down for performance reasons by setting videoFormat, it's probably better to do that based on array order rather than hardcoding sizes.)
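A minimal sketch of that selection (iOS 11.3+), picking by array position rather than hardcoded sizes as suggested above; the ARSCNView named `sceneView` in the commented line is an assumption:

```swift
import ARKit

let configuration = ARWorldTrackingConfiguration()

// Inspect what the current device supports. The first entry is the
// default (and best) format, so no action is needed to get it.
for format in ARWorldTrackingConfiguration.supportedVideoFormats {
    print("resolution:", format.imageResolution, "fps:", format.framesPerSecond)
}

// Example: step down for performance by choosing the last entry
// (typically the lowest-resolution format) by array position.
if let lowestFormat = ARWorldTrackingConfiguration.supportedVideoFormats.last {
    configuration.videoFormat = lowestFormat
}

// sceneView.session.run(configuration)  // assumes an ARSCNView named `sceneView`
```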
Autofocus is on by default in iOS 11.3, so your example picture (with a subject relatively close to the camera) should come out much better. If for some reason you need to turn it off, there's a switch for that.
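That switch is isAutoFocusEnabled (iOS 11.3+); a short sketch, again assuming an ARSCNView named `sceneView`:

```swift
import ARKit

let configuration = ARWorldTrackingConfiguration()
// Autofocus is on by default; disable it only if you need a fixed focus.
configuration.isAutoFocusEnabled = false
// sceneView.session.run(configuration)  // assumes an ARSCNView named `sceneView`
```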
There's still no API for changing the camera settings of the underlying capture session used by ARKit.
According to engineers back at WWDC, ARKit uses a limited subset of camera capture capabilities to ensure a high frame rate with minimal impact on CPU and GPU usage. There's some processing overhead to producing higher quality live video, but there's also some processing overhead to the computer vision and motion sensor integration systems that make ARKit work — increase the overhead too much, and you start adding latency. And for a technology that's supposed to show users a "live" augmented view of their world, you don't want the "augmented" part to lag camera motion by multiple frames. (Plus, on top of all that, you probably want some CPU/GPU time left over for your app to render spiffy 3D content on top of the camera view.)
The situation is the same between iPhone and iPad devices, but you notice it more on the iPad just because the screen is so much larger — 720p video doesn't look so bad on a 4-5" screen, but it looks awful stretched to fill a 10-13" screen. (Luckily, you get 1080p by default in iOS 11.3, which should look better.)
The AVCapture system does provide for taking higher resolution / higher quality still photos during video capture, but ARKit doesn't expose its internal capture session in any way, so you can't use AVCapturePhotoOutput with it. (Capturing high resolution stills during a session probably remains a good feature request.)