Question
I am building an ARKit app where we want to be able to take a photo of the scene. I am finding the image quality of the ARCamera view is not good enough to take photos with on an iPad Pro.
Standard camera image:
ARCamera image:
I have seen an Apple forum post that mentions this could be iPad Pro 10.5 specific and is related to fixed lens position (https://forums.developer.apple.com/message/262950#262950).
Is there a public way to change the setting?
Alternatively, I have tried to use AVCaptureSession to take a normal photo and apply it to sceneView.scene.background.contents, to switch out the blurred image for a higher-resolution image at the point the photo is taken, but I can't get AVCapturePhotoOutput to work with ARKit.
Answer
Update: Congrats to whoever filed feature requests! In iOS 11.3 (aka "ARKit 1.5"), you can control at least some of the capture settings. And you now get 1080p with autofocus enabled by default.
Check ARWorldTrackingConfiguration.supportedVideoFormats for a list of ARConfiguration.VideoFormat objects, each of which defines a resolution and frame rate. The first in the list is the default (and best) option supported on your current device, so if you just want the best resolution/frame rate available you don't have to do anything. (And if you want to step down for performance reasons by setting videoFormat, it's probably better to do that based on array order rather than hardcoding sizes.)
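As a rough Swift sketch (assuming iOS 11.3+; the sceneView created here stands in for whatever ARSCNView your app already has), enumerating the supported formats and optionally stepping down by array order might look like this:

```swift
import ARKit

// Stand-in for your existing ARSCNView.
let sceneView = ARSCNView(frame: .zero)
let configuration = ARWorldTrackingConfiguration()

// List every resolution/frame-rate combination this device supports.
// The first entry is the default (and highest quality) format.
let formats = ARWorldTrackingConfiguration.supportedVideoFormats
for format in formats {
    print("\(format.imageResolution) at \(format.framesPerSecond) fps")
}

// To trade quality for performance, pick a later entry by array order
// rather than hardcoding a resolution.
if formats.count > 1 {
    configuration.videoFormat = formats[1]
}

sceneView.session.run(configuration)
```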
Autofocus is on by default in iOS 11.3, so your example picture (with a subject relatively close to the camera) should come out much better. If for some reason you need to turn it off, there's a switch for that.
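That switch is the isAutoFocusEnabled property on ARWorldTrackingConfiguration (iOS 11.3+); a minimal sketch, if you do need to turn focus off:

```swift
import ARKit

let configuration = ARWorldTrackingConfiguration()
// Autofocus is on by default in iOS 11.3; disable it only if the focus
// changes cause problems for your particular scene.
configuration.isAutoFocusEnabled = false
```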
There's still no API for changing the camera settings for the underlying capture session used by ARKit.
According to engineers back at WWDC, ARKit uses a limited subset of camera capture capabilities to ensure a high frame rate with minimal impact on CPU and GPU usage. There's some processing overhead to producing higher quality live video, but there's also some processing overhead to the computer vision and motion sensor integration systems that make ARKit work — increase the overhead too much, and you start adding latency. And for a technology that's supposed to show users a "live" augmented view of their world, you don't want the "augmented" part to lag camera motion by multiple frames. (Plus, on top of all that, you probably want some CPU/GPU time left over for your app to render spiffy 3D content on top of the camera view.)
The situation is the same between iPhone and iPad devices, but you notice it more on the iPad just because the screen is so much larger — 720p video doesn't look so bad on a 4-5" screen, but it looks awful stretched to fill a 10-13" screen. (Luckily you get 1080p by default in iOS 11.3, which should look better.)
The AVCapture system does provide for taking higher resolution / higher quality still photos during video capture, but ARKit doesn't expose its internal capture session in any way, so you can't use AVCapturePhotoOutput with it. (Capturing high resolution stills during a session probably remains a good feature request.)