This article covers ARCore: how to play a video in a photo frame when an image is detected.

Question

I want to play a video in a photo frame when an image is detected. Has anybody done this using ARCore? Any help would be great.

Thanks

Answer

I think you mean you want to add a video as a renderable in ARCore, in your case when an image is detected.

There is actually (at the time of writing) an example included with Sceneform showing how to add a video as a renderable - it is available here: https://github.com/google-ar/sceneform-android-sdk/tree/master/samples/chromakeyvideo

This particular example also applies a chroma key filter, but you can simply ignore that part.

The approach is roughly:

  • Create an ExternalTexture to play the video on
  • Create a MediaPlayer and set its surface to the ExternalTexture's surface
  • Build a new renderable using the ExternalTexture
  • Create a node and add it to your scene
  • Set the node's renderable to the new ModelRenderable you built
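The steps above can be sketched as follows. This is a minimal sketch, not the sample's exact code: `R.raw.my_video` and `R.raw.video_screen` are assumed placeholder resources for your video file and a flat "screen" model, and `"videoTexture"` is the material parameter name used by the chromakeyvideo sample's material.

```java
import android.content.Context;
import android.media.MediaPlayer;
import com.google.ar.sceneform.AnchorNode;
import com.google.ar.sceneform.Node;
import com.google.ar.sceneform.rendering.ExternalTexture;
import com.google.ar.sceneform.rendering.ModelRenderable;

void attachVideoTo(Context context, AnchorNode anchorNode) {
    // 1. Create an ExternalTexture for the video to be rendered on.
    ExternalTexture texture = new ExternalTexture();

    // 2. Create a MediaPlayer and route its output to the texture's surface.
    //    R.raw.my_video is a placeholder for your own video resource.
    MediaPlayer mediaPlayer = MediaPlayer.create(context, R.raw.my_video);
    mediaPlayer.setSurface(texture.getSurface());
    mediaPlayer.setLooping(true);

    // 3. Build a renderable that uses the ExternalTexture. R.raw.video_screen
    //    is an assumed flat "screen" model; "videoTexture" is the material
    //    parameter name from the chromakeyvideo sample's material definition.
    ModelRenderable.builder()
        .setSource(context, R.raw.video_screen)
        .build()
        .thenAccept(renderable -> {
            renderable.getMaterial().setExternalTexture("videoTexture", texture);

            // 4. Create a node and add it to the scene under the anchor.
            Node videoNode = new Node();
            videoNode.setParent(anchorNode);

            // 5. Set the node's renderable, but only once the first video
            //    frame has arrived, so the screen does not flash black.
            mediaPlayer.start();
            texture.getSurfaceTexture().setOnFrameAvailableListener(st -> {
                st.setOnFrameAvailableListener(null);
                videoNode.setRenderable(renderable);
            });
        });
}
```

Deferring `setRenderable` until the first frame is available is the same trick the chromakeyvideo sample uses to avoid showing an empty black plane while the decoder spins up.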

For Augmented Images, ARCore will automatically calculate the size of the image that it detects, so long as the state of the image is 'TRACKING'. From the documentation:

ARCore will attempt to estimate the physical image's width based on its understanding of the world. If the optional physical size is specified in the database, this estimation process will happen more quickly. However, the estimated size may differ from the specified size.
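In practice you would check the tracking state and read the estimated size in your per-frame update callback. A sketch, using the standard ARCore `AugmentedImage` API (the method name `onUpdateFrame` is illustrative):

```java
import com.google.ar.core.Anchor;
import com.google.ar.core.AugmentedImage;
import com.google.ar.core.Frame;
import com.google.ar.core.TrackingState;

// Called once per frame, e.g. from a Scene.OnUpdateListener.
void onUpdateFrame(Frame frame) {
    for (AugmentedImage image : frame.getUpdatedTrackables(AugmentedImage.class)) {
        if (image.getTrackingState() == TrackingState.TRACKING) {
            // ARCore's current estimate of the physical image size, in meters.
            float widthMeters = image.getExtentX();
            float heightMeters = image.getExtentZ();

            // Anchor your video node at the center of the detected image.
            Anchor anchor = image.createAnchor(image.getCenterPose());
        }
    }
}
```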

Your renderable will be sized to fit inside this by default, but you can also scale the renderable up or down as you want.
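For example, assuming `videoNode` is the node holding your renderable, Sceneform's `Node.setLocalScale` resizes it:

```java
import com.google.ar.sceneform.math.Vector3;

// Scale the node (and hence its renderable) uniformly to 150% of its default size.
videoNode.setLocalScale(new Vector3(1.5f, 1.5f, 1.5f));
```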

There is a series of articles available which may cover your exact case, depending on exactly what you need, along with some example code here: https://proandroiddev.com/arcore-sceneform-simple-video-playback-3fe2f909bfbc
