VideoCaptureViewController

In my project I'm using the OpenCV framework taken from a working demo, but I seem to be missing some small detail.

Steps to reproduce:

  • Download the sample app from http://aptogo.co.uk/2011/09/opencv-framework-for-ios/
  • Create a new Titanium iOS module with: titanium create --platform=iphone --type=module --dir=. --name=opencv --id=opencv
  • Open the Xcode project and drag in the OpenCV framework from the FaceTracker app, along with the other required frameworks.
  • Add OTHER_LDFLAGS = $(inherited) -framework OpenCV to module.xcconfig (a sketch of this file appears after the linker output below).
  • Create new TiUIView and TiUIViewProxy classes named OpencvView and OpencvViewProxy.
  • In the new OpencvView class, instantiate the UIViewController that uses OpenCV (a rough sketch of this view class also appears after the linker output below).

  • The build produces the Titanium module, but when I try to run the module test harness I get the following errors about the OpenCV objects:

    Undefined symbols for architecture i386:
      "_CMSampleBufferGetImageBuffer", referenced from:
          -[VideoCaptureViewController captureOutput:didOutputSampleBuffer:fromConnection:] in libopencv.a(VideoCaptureViewController.o)
      "_CMSampleBufferGetOutputPresentationTimeStamp", referenced from:
          -[VideoCaptureViewController captureOutput:didOutputSampleBuffer:fromConnection:] in libopencv.a(VideoCaptureViewController.o)
      "_CMTimeMake", referenced from:
          -[VideoCaptureViewController createCaptureSessionForCamera:qualityPreset:grayscale:] in libopencv.a(VideoCaptureViewController.o)
      "_CVPixelBufferGetBaseAddress", referenced from:
          -[VideoCaptureViewController captureOutput:didOutputSampleBuffer:fromConnection:] in libopencv.a(VideoCaptureViewController.o)
      "_CVPixelBufferGetBaseAddressOfPlane", referenced from:
          -[VideoCaptureViewController captureOutput:didOutputSampleBuffer:fromConnection:] in libopencv.a(VideoCaptureViewController.o)
      "_CVPixelBufferGetHeight", referenced from:
          -[VideoCaptureViewController captureOutput:didOutputSampleBuffer:fromConnection:] in libopencv.a(VideoCaptureViewController.o)
      "_CVPixelBufferGetPixelFormatType", referenced from:
          -[VideoCaptureViewController captureOutput:didOutputSampleBuffer:fromConnection:] in libopencv.a(VideoCaptureViewController.o)
      "_CVPixelBufferGetWidth", referenced from:
          -[VideoCaptureViewController captureOutput:didOutputSampleBuffer:fromConnection:] in libopencv.a(VideoCaptureViewController.o)
      "_CVPixelBufferLockBaseAddress", referenced from:
          -[VideoCaptureViewController captureOutput:didOutputSampleBuffer:fromConnection:] in libopencv.a(VideoCaptureViewController.o)
      "_CVPixelBufferUnlockBaseAddress", referenced from:
          -[VideoCaptureViewController captureOutput:didOutputSampleBuffer:fromConnection:] in libopencv.a(VideoCaptureViewController.o)
      "cv::_InputArray::_InputArray(cv::Mat const&)", referenced from:
          -[DemoVideoCaptureViewController processFrame:videoRect:videoOrientation:] in libopencv.a(DemoVideoCaptureViewController.o)
      "cv::_OutputArray::_OutputArray(cv::Mat&)", referenced from:
          -[DemoVideoCaptureViewController processFrame:videoRect:videoOrientation:] in libopencv.a(DemoVideoCaptureViewController.o)
      "cv::CascadeClassifier::load(std::string const&)", referenced from:
          -[DemoVideoCaptureViewController viewDidLoad] in libopencv.a(DemoVideoCaptureViewController.o)
      "cv::CascadeClassifier::CascadeClassifier()", referenced from:
          -[DemoVideoCaptureViewController .cxx_construct] in libopencv.a(DemoVideoCaptureViewController.o)
      "cv::CascadeClassifier::~CascadeClassifier()", referenced from:
          -[DemoVideoCaptureViewController .cxx_destruct] in libopencv.a(DemoVideoCaptureViewController.o)
      "cv::Mat::deallocate()", referenced from:
          -[VideoCaptureViewController captureOutput:didOutputSampleBuffer:fromConnection:] in libopencv.a(VideoCaptureViewController.o)
          -[UIImage(UIImage_OpenCV) CVMat] in libopencv.a(UIImage+OpenCV.o)
          -[UIImage(UIImage_OpenCV) CVGrayscaleMat] in libopencv.a(UIImage+OpenCV.o)
      "cv::Mat::create(int, int const*, int)", referenced from:
          -[UIImage(UIImage_OpenCV) CVMat] in libopencv.a(UIImage+OpenCV.o)
          -[UIImage(UIImage_OpenCV) CVGrayscaleMat] in libopencv.a(UIImage+OpenCV.o)
      "cv::flip(cv::_InputArray const&, cv::_OutputArray const&, int)", referenced from:
          -[DemoVideoCaptureViewController processFrame:videoRect:videoOrientation:] in libopencv.a(DemoVideoCaptureViewController.o)
      "cv::resize(cv::_InputArray const&, cv::_OutputArray const&, cv::Size, double, double, int)", referenced from:
          -[DemoVideoCaptureViewController processFrame:videoRect:videoOrientation:] in libopencv.a(DemoVideoCaptureViewController.o)
      "cv::fastFree(void*)", referenced from:
          -[VideoCaptureViewController captureOutput:didOutputSampleBuffer:fromConnection:] in libopencv.a(VideoCaptureViewController.o)
          -[UIImage(UIImage_OpenCV) CVMat] in libopencv.a(UIImage+OpenCV.o)
          -[UIImage(UIImage_OpenCV) CVGrayscaleMat] in libopencv.a(UIImage+OpenCV.o)
      "cv::transpose(cv::_InputArray const&, cv::_OutputArray const&)", referenced from:
          -[DemoVideoCaptureViewController processFrame:videoRect:videoOrientation:] in libopencv.a(DemoVideoCaptureViewController.o)
      "_kCVPixelBufferPixelFormatTypeKey", referenced from:
          -[VideoCaptureViewController createCaptureSessionForCamera:qualityPreset:grayscale:] in libopencv.a(VideoCaptureViewController.o)
    ld: symbol(s) not found for architecture i386
    clang: error: linker command failed with exit code 1 (use -v to see invocation)
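
The undefined _CM* and _CV* symbols belong to the CoreMedia and CoreVideo frameworks, and the cv::* symbols come from OpenCV's C++ code, so the OTHER_LDFLAGS line from the step above may need more than the OpenCV framework alone. The snippet below is only a sketch of what module.xcconfig could look like; the extra -framework entries and -lstdc++ are assumptions inferred from the symbol names in the log, not something taken from the accepted answer.

    // module.xcconfig -- sketch only; everything beyond "-framework OpenCV" is an assumption.
    // CoreMedia provides the _CMSampleBuffer*/_CMTimeMake symbols and CoreVideo the
    // _CVPixelBuffer* ones seen in the log; -lstdc++ links the C++ standard library that
    // OpenCV's code depends on. The cv::* symbols themselves live in the OpenCV framework,
    // so the linker also has to be able to find that framework.
    OTHER_LDFLAGS = $(inherited) -framework OpenCV -framework CoreMedia -framework CoreVideo -lstdc++

For the last two steps (the OpencvView class that hosts the OpenCV view controller), a rough Objective-C sketch is shown below. It assumes the usual Titanium custom-view pattern of subclassing TiUIView and hosting a UIKit view inside it, plus the VideoCaptureViewController class from the aptogo demo; apart from the names already given in the steps above, everything here is hypothetical and untested.

    // OpencvView.h -- hypothetical sketch, not the original code
    #import "TiUIView.h"

    @class VideoCaptureViewController;   // capture controller from the aptogo OpenCV demo

    @interface OpencvView : TiUIView {
        VideoCaptureViewController *captureController;
    }
    @end

    // OpencvView.m
    #import "OpencvView.h"
    #import "VideoCaptureViewController.h"

    @implementation OpencvView

    // TiUIView calls this whenever Titanium lays out the view; create the
    // OpenCV-backed controller lazily and keep its view filling our bounds.
    - (void)frameSizeChanged:(CGRect)frame bounds:(CGRect)bounds
    {
        [super frameSizeChanged:frame bounds:bounds];
        if (captureController == nil) {
            captureController = [[VideoCaptureViewController alloc] init];
            [self addSubview:captureController.view];
        }
        captureController.view.frame = bounds;
    }

    // Pre-ARC (MRC) cleanup, matching the Titanium SDK of that era.
    - (void)dealloc
    {
        [captureController release];
        [super dealloc];
    }

    @end

Creating the controller lazily inside frameSizeChanged:bounds: keeps the capture session from being set up before Titanium has given the view a real size.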

Best answer

I ran into a similar problem on Xcode 4.5.1 with the default Apple LLVM compiler. Try changing it to GCC (in the build settings) and see if that works.
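
If the compiler change needs to apply to the module target itself, it can also be pinned in module.xcconfig rather than clicked through the build settings UI. The line below is an assumption about how Xcode 4.5-era projects select "LLVM GCC 4.2"; check the exact identifier against your Xcode version.

    // module.xcconfig -- hedged sketch: select the LLVM GCC 4.2 compiler for the module build
    GCC_VERSION = com.apple.compilers.llvmgcc42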

Regarding "opencv - Can't link OpenCV framework symbols in Xcode", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/13882191/
