Problem description
I know there are a lot of questions here about FFmpeg on iOS, but no answer fits my case :( Something strange happens every time I try to link FFmpeg into my project, so please help me!
My task is to write a video-chat application for iOS that uses the RTMP protocol to publish video streams to, and read them from, a custom Flash Media Server.
I decided to use rtmplib, a free open-source library for streaming FLV video over RTMP, as it is the only appropriate library.
Many problems appeared when I began researching it, but later I understood how it should work.
Now, with the help of my application, I can read a live FLV video stream (from a URL) and send it back to the channel.
My trouble now is in sending video FROM the camera. The basic sequence of operations, as I understand it, should be the following:
1. Using AVFoundation, via the sequence (Device -> AVCaptureSession -> AVVideoDataOutput -> AVAssetWriter), I write the video to a file (I can describe this flow in more detail if needed, but in the context of the question it is not important). This flow is necessary for hardware-accelerated conversion of the live camera video into the H.264 codec, but the result is in the MOV container format. (This step is completed.)
2. I read this temporary file as each sample is written, and obtain the stream of video-data bytes (H.264-encoded, in a QuickTime container). (This step is already completed.)
3. I need to convert the video data from the QuickTime container format to FLV, all in real time (packet by packet).
4. Once I have the video-data packets in the FLV container format, I will be able to send them over RTMP using rtmplib.
Now, the most complicated part for me is step 3.
I think I need to use the ffmpeg libraries (libavformat) for this conversion. I even found source code showing how to extract H.264 data packets from a MOV file (looking in libavformat, I found that it is possible to extract these packets even from a byte stream, which is more appropriate for me). Once that is done, I will need to package the packets into FLV (either using ffmpeg, or manually by adding FLV headers to the H.264 packets; if I am correct, that is not a problem and is easy).
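If the FLV packaging ends up being done by hand rather than through libavformat, the byte layout comes straight from the FLV specification. Below is a minimal sketch in C of the 13-byte FLV file header and the 11-byte tag header that precedes each packet. The function names are my own, and note that for H.264 each tag body must additionally begin with an AVCVIDEOPACKET sub-header, and the stream must start with an AVCDecoderConfigurationRecord carrying the SPS/PPS, which this sketch does not cover:

```c
#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* Write the 9-byte FLV file header plus the 4-byte PreviousTagSize0
 * (always zero). Returns the number of bytes written (13). */
size_t write_flv_file_header(uint8_t *buf, int has_video, int has_audio) {
    buf[0] = 'F'; buf[1] = 'L'; buf[2] = 'V';
    buf[3] = 1;                                        /* FLV version 1 */
    buf[4] = (uint8_t)((has_audio ? 0x04 : 0) | (has_video ? 0x01 : 0));
    buf[5] = 0; buf[6] = 0; buf[7] = 0; buf[8] = 9;    /* header size, big-endian */
    memset(buf + 9, 0, 4);                             /* PreviousTagSize0 */
    return 13;
}

/* Write the 11-byte FLV tag header. tag_type is 9 for video, 8 for audio,
 * 18 for script data. data_size is the size of the tag body that follows.
 * Returns the number of bytes written (11). */
size_t write_flv_tag_header(uint8_t *buf, uint8_t tag_type,
                            uint32_t data_size, uint32_t timestamp_ms) {
    buf[0] = tag_type;
    buf[1] = (uint8_t)(data_size >> 16);               /* DataSize, 24-bit BE */
    buf[2] = (uint8_t)(data_size >> 8);
    buf[3] = (uint8_t)(data_size);
    buf[4] = (uint8_t)(timestamp_ms >> 16);            /* Timestamp, 24-bit BE */
    buf[5] = (uint8_t)(timestamp_ms >> 8);
    buf[6] = (uint8_t)(timestamp_ms);
    buf[7] = (uint8_t)(timestamp_ms >> 24);            /* TimestampExtended */
    buf[8] = 0; buf[9] = 0; buf[10] = 0;               /* StreamID, always 0 */
    return 11;
}
```

Each tag is followed by a 4-byte big-endian PreviousTagSize equal to 11 plus the tag body size; the RTMP path in rtmplib uses the same tag bodies, just without the file header.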
FFmpeg has great documentation and is a very powerful library, and I don't think using it would be a problem. BUT the problem here is that I cannot get it working in an iOS project.
I have spent 3 days reading documentation, Stack Overflow, and Google results for the question "How to build FFMPEG for iOS", and I think my PM is going to fire me if I spend one more week trying to compile this library :))
I tried many different build scripts and configure files, but when I build FFmpeg I get libavformat, libavcodec, etc. for the x86 architecture (even when I specify the armv6 arch in the build script). (I use "lipo -info libavcodec.a" to show the architectures.)
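For reference, cross-compiling FFmpeg for ARM requires passing the target explicitly to configure; when those flags are missing, the build falls back to the host (x86) toolchain, which matches the symptom above. A sketch follows, in which the compiler and SDK paths are assumptions that depend on the Xcode installation and must be adjusted:

```shell
# All paths below are assumptions -- adjust to your Xcode/SDK layout.
DEV=/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer
SDK=$DEV/SDKs/iPhoneOS5.1.sdk

./configure \
  --enable-cross-compile \
  --target-os=darwin \
  --arch=arm --cpu=cortex-a8 \
  --cc="$DEV/usr/bin/gcc" \
  --sysroot="$SDK" \
  --extra-cflags="-arch armv7" \
  --extra-ldflags="-arch armv7 -isysroot $SDK"
make

# Verify the result really targets ARM before linking it into Xcode:
lipo -info libavcodec/libavcodec.a
```

Repeating the build per architecture and merging the static libraries with `lipo -create` is how the fat armv6/armv7/i386 binaries mentioned below are typically produced.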
So I cannot build from source, and decided to find a prebuilt FFmpeg, built for the armv7, armv6, and i386 architectures.
I downloaded the iOS Comm Lib from MidnightCoders on GitHub; it contains an example of using FFmpeg, with prebuilt .a files for avcodec, avformat, and the other FFmpeg libraries.
I checked their architectures:
iMac-2:MediaLibiOS root# lipo -info libavformat.a
Architectures in the fat file: libavformat.a are: armv6 armv7 i386
And I found that it is appropriate for me! When I tried to add these libraries and headers to the Xcode project, it compiled fine (I didn't even get warnings like "Library is compiled for another architecture"), and I can use the structures from the headers. But when I try to call a C function from libavformat (av_register_all()), the compiler shows me the error message "Symbol(s) not found for architecture armv7: av_register_all".
I thought that maybe there were no symbols in the lib, and tried to list them:
root# nm -arch armv6 libavformat.a | grep av_register_all
00000000 T _av_register_all
Now I am stuck here; I don't understand why Xcode cannot see these symbols, and I cannot move forward.
Please correct me if I am wrong in my understanding of the flow for publishing an RTMP stream from iOS, and help me build and link FFmpeg for iOS.
I have the iOS 5.1 SDK and Xcode 4.2.
Recommended answer
I thought to write an answer for this question, even though it is old, because there is no accepted answer.
According to the explanation given in the question, I assume he was able to compile the FFmpeg source accurately. If anyone is afraid of doing this, there is an already-compiled version that you can download from here.
Anyway, I believe the error is caused by not setting Header Search Paths in your PROJECT's Build Settings.
What you can do is add your Xcode project path to the Header Search Paths key and set it to recursive.
And I believe you have to link all three libraries mentioned below, not just the first two:
- libz.dylib
- libbz2.dylib
- libiconv.dylib
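In Xcode these are added under "Link Binary With Libraries" (or as `-lz -lbz2 -liconv` in Other Linker Flags). As a command-line sketch, the equivalent link step would look roughly like this, where the object file, SDK path, and library directory are placeholders:

```shell
# Paths are placeholders -- FFMPEG_LIBS is wherever the .a files live.
FFMPEG_LIBS=./ffmpeg/lib
SDK=/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS5.1.sdk

clang -arch armv7 -isysroot "$SDK" main.o \
  -L"$FFMPEG_LIBS" -lavformat -lavcodec -lavutil \
  -lz -lbz2 -liconv \
  -o VideoChat
```

FFmpeg's own libraries depend on zlib, bzip2, and iconv, which is why the "Symbol(s) not found" class of linker errors can also appear when any of the three system libraries is missing.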