Question
I can't find any API to capture live photos. Did I miss something?
Live Photos is a new feature of iOS 9 that allows users to capture and relive their favorite moments with richer context than traditional photos. When the user presses the shutter button, the Camera app captures much more content along with the regular photo, including audio and additional frames before and after the photo. When browsing through these photos, users can interact with them and play back all the captured content, making the photos come to life.
iOS 9.1 introduces APIs that allow apps to incorporate playback of Live Photos, as well as export the data for sharing. There is new support in the Photos framework to fetch a PHLivePhoto object, which represents all the data that comprises a Live Photo, from a PHImageManager object. You can use a PHLivePhotoView object (defined in the PhotosUI framework) to display the contents of a Live Photo. The PHLivePhotoView view takes care of displaying the image, handling all user interaction, and applying the visual treatments to play back the content.
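As a rough sketch of that flow (view controller and method names are my own; the Photos calls are the ones described above), you can fetch a Live Photo asset with PHImageManager and hand the result to a PHLivePhotoView:

```swift
import Photos
import PhotosUI
import UIKit

// Hypothetical view controller that shows the most recent Live Photo
// from the user's library. Assumes photo-library authorization has
// already been granted.
final class LivePhotoViewController: UIViewController {
    private let livePhotoView = PHLivePhotoView()

    override func viewDidLoad() {
        super.viewDidLoad()
        livePhotoView.frame = view.bounds
        view.addSubview(livePhotoView)

        // Fetch image assets, newest first, and pick the first one that
        // was captured as a Live Photo.
        let options = PHFetchOptions()
        options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
        let result = PHAsset.fetchAssets(with: .image, options: options)

        var liveAsset: PHAsset?
        result.enumerateObjects { asset, _, stop in
            if asset.mediaSubtypes.contains(.photoLive) {
                liveAsset = asset
                stop.pointee = true
            }
        }
        guard let asset = liveAsset else { return }

        // Request the PHLivePhoto. The handler may be called more than
        // once: first with a degraded result, then with the final one.
        PHImageManager.default().requestLivePhoto(
            for: asset,
            targetSize: view.bounds.size,
            contentMode: .aspectFit,
            options: nil
        ) { [weak self] livePhoto, _ in
            // PHLivePhotoView handles the press-and-hold interaction and
            // visual treatments itself once it has a livePhoto.
            self?.livePhotoView.livePhoto = livePhoto
        }
    }
}
```

Playback can also be triggered programmatically with `livePhotoView.startPlayback(with: .full)` if you do not want to rely on the built-in gesture.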
You can also use PHAssetResource to access the data of a PHLivePhoto object for sharing purposes. You can request a PHLivePhoto object for an asset in the user’s photo library by using PHImageManager or UIImagePickerController. If you have a sharing extension, you can also get PHLivePhoto objects by using NSItemProvider. On the receiving side of a share, you can recreate a PHLivePhoto object from the set of files originally exported by the sender.
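A minimal sketch of the export side, assuming you already have a PHAsset (the function name and destination directory are illustrative): PHAssetResource lists the files backing the asset, and PHAssetResourceManager writes them out.

```swift
import Photos

// Hypothetical helper: write every resource backing a Live Photo asset
// (typically a .photo still and a .pairedVideo movie) into `directory`,
// so the files can be shared and later recreated on the receiving side
// with PHLivePhoto.request(withResourceFileURLs:...).
func exportLivePhotoResources(of asset: PHAsset, to directory: URL) {
    for resource in PHAssetResource.assetResources(for: asset) {
        let destination = directory.appendingPathComponent(resource.originalFilename)
        PHAssetResourceManager.default().writeData(
            for: resource,
            toFile: destination,
            options: nil
        ) { error in
            if let error = error {
                print("Failed to export \(resource.originalFilename): \(error)")
            }
        }
    }
}
```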
During the keynote, they mentioned that Facebook will support Live Photos, so I would suspect there has to be a way to capture Live Photos.
Answer
UIImagePickerController looks like it will allow the capture of live photos.
Live Photos is a Camera app feature on supported devices, enabling a picture to be not just a single moment in time but to include motion and sound from the moments just before and after its capture. A PHLivePhoto object represents a Live Photo, and the PHLivePhotoView class provides a system-standard, interactive user interface for displaying a Live Photo and playing back its content.

Live Photos are still photos. When you use an image picker controller to capture or choose still images (by including only the kUTTypeImage type in the mediaTypes array), assets that were captured as Live Photos still appear in the picker. However, when the user chooses an asset, your delegate object receives only a UIImage object containing a still-image representation of the Live Photo.

To obtain the full motion and sound content when the user captures or chooses a Live Photo with the image picker, you must include both the kUTTypeImage and kUTTypeLivePhoto types in the mediaTypes array. For more information, see UIImagePickerControllerLivePhoto in UIImagePickerControllerDelegate Protocol Reference.
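Putting the documented steps together, a picker configured with both media types might look like this (the delegate class and presentation method are my own; the mediaTypes values and info-dictionary keys are the ones the docs name):

```swift
import UIKit
import MobileCoreServices
import PhotosUI

// Hypothetical delegate object demonstrating Live Photo selection.
final class LivePhotoPickerDelegate: NSObject,
    UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    func presentPicker(from presenter: UIViewController) {
        let picker = UIImagePickerController()
        picker.sourceType = .photoLibrary
        // Both types are required: kUTTypeImage alone yields only a
        // still-image UIImage, even for assets captured as Live Photos.
        picker.mediaTypes = [kUTTypeImage as String, kUTTypeLivePhoto as String]
        picker.delegate = self
        presenter.present(picker, animated: true)
    }

    func imagePickerController(
        _ picker: UIImagePickerController,
        didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]
    ) {
        if let livePhoto = info[.livePhoto] as? PHLivePhoto {
            // Full motion-and-sound content; hand it to a PHLivePhotoView.
            print("Picked Live Photo: \(livePhoto)")
        } else if let image = info[.originalImage] as? UIImage {
            // Still-image fallback (asset was not a Live Photo, or only
            // kUTTypeImage was requested).
            print("Picked still image: \(image)")
        }
        picker.dismiss(animated: true)
    }
}
```

Note that `.livePhoto` is the modern Swift spelling of the UIImagePickerControllerLivePhoto info key mentioned in the quoted documentation.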