How can you track motion using the iPhone's camera?

Question

I saw that someone has made an app that tracks your feet using the camera, so that you can kick a virtual football on your iPhone screen.

How could you do something like this? Does anyone know of any code examples or other information about using the iPhone camera for detecting objects and tracking them?

Answer

I just gave a talk at SecondConf where I demonstrated the use of the iPhone's camera to track a colored object using OpenGL ES 2.0 shaders. The post accompanying that talk, including my slides and sample code for all of the demos, can be found here.

The sample application I wrote, whose code can be downloaded from here, is based on an example produced by Apple for demonstrating Core Image at WWDC 2007. That example is described in Chapter 27 of the GPU Gems 3 book.

The basic idea is that you can use custom GLSL shaders to process images from the iPhone camera in real time, determining which pixels match a target color within a given threshold. Those pixels then have their normalized X, Y coordinates embedded in their red and green color components, while all other pixels are marked as black. The color of the whole frame is then averaged to obtain the centroid of the colored object, which you can track as it moves across the camera's view.
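To make the two-pass idea concrete, here is a minimal CPU sketch of that pipeline in plain Python. The real implementation runs as a GLSL fragment shader plus a GPU averaging step; the function and variable names here are illustrative, not taken from the original sample code.

```python
def color_distance(a, b):
    """Euclidean distance between two RGB triples (components in 0..1)."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def centroid_of_color(frame, target, threshold):
    """frame: 2D list (rows) of RGB triples, at least 2x2.
    Returns the normalized (x, y) centroid of the pixels whose color lies
    within `threshold` of `target`, or None if no pixel matches."""
    height, width = len(frame), len(frame[0])
    # Pass 1 (what the fragment shader does per pixel): a matching pixel
    # encodes its normalized coordinates in red/green, with a "hit" flag;
    # every other pixel becomes black.
    encoded = []
    for y, row in enumerate(frame):
        for x, pixel in enumerate(row):
            if color_distance(pixel, target) < threshold:
                encoded.append((x / (width - 1), y / (height - 1), 1.0))
            else:
                encoded.append((0.0, 0.0, 0.0))
    # Pass 2 (the averaging step): average the whole encoded frame, then
    # divide by the matching-pixel fraction to recover the centroid.
    n = len(encoded)
    avg_x = sum(p[0] for p in encoded) / n
    avg_y = sum(p[1] for p in encoded) / n
    coverage = sum(p[2] for p in encoded) / n
    if coverage == 0:
        return None
    return (avg_x / coverage, avg_y / coverage)
```

On the GPU the averaging is done by repeatedly downsampling the encoded frame until it is a single pixel, which is why encoding the coordinates into the color channels is the key trick.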

While this doesn't address the case of tracking a more complex object like a foot, it should be possible to write shaders like this that could pick out such a moving object.

As an update to the above, in the two years since I wrote this I've now developed an open source framework that encapsulates OpenGL ES 2.0 shader processing of images and video. One of the recent additions to that is a GPUImageMotionDetector class that processes a scene and detects any kind of motion within it. It will give you back the centroid and intensity of the overall motion it detects as part of a simple callback block. Using this framework to do this should be a lot easier than rolling your own solution.
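For a feel of what that motion detector computes, here is a rough pure-Python sketch: diff two consecutive frames, treat strongly changed pixels as motion, and report the centroid and overall intensity. This is only an illustration of the computation, not the framework's API (the real class does this with shaders and hands you the result in a callback block), and the names are mine.

```python
def detect_motion(prev_frame, curr_frame, threshold=0.2):
    """Frames: 2D lists (rows) of grayscale values in 0..1, at least 2x2.
    Returns (centroid, intensity): centroid is the normalized (x, y) of the
    moving region, or None if nothing moved; intensity is the fraction of
    pixels that changed by more than `threshold`."""
    height, width = len(curr_frame), len(curr_frame[0])
    # Pixels whose brightness changed noticeably between frames count as motion.
    moving = [
        (x / (width - 1), y / (height - 1))
        for y in range(height)
        for x in range(width)
        if abs(curr_frame[y][x] - prev_frame[y][x]) > threshold
    ]
    if not moving:
        return None, 0.0
    cx = sum(p[0] for p in moving) / len(moving)
    cy = sum(p[1] for p in moving) / len(moving)
    return (cx, cy), len(moving) / (width * height)
```

The framework's GPU version additionally low-pass filters previous frames rather than comparing only the immediately preceding one, which makes the detection less noisy.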
