This article looks at how to write a Live555 FramedSource that lets you stream live H.264; it should be a useful reference for anyone facing the same problem.

Problem Description

I've been trying to write a class that derives from FramedSource in Live555 that will allow me to stream live data from my D3D9 application to an MP4 or similar.

What I do each frame is grab the backbuffer into system memory as a texture, then convert it from RGB -> YUV420P, then encode it using x264, then ideally pass the NAL packets on to Live555. I made a class called H264FramedSource, derived from FramedSource, basically by copying the DeviceSource file. Instead of the input being an input file, I've made it a NAL packet which I update each frame.
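
For reference, the x264 side of that pipeline can be sketched roughly as follows. This is a minimal sketch, assuming width and height are known and that the RGB -> YUV420P conversion has already filled the input picture's planes (e.g. via libswscale):

#include <x264.h>

// One-time encoder setup (tune the preset/profile to taste).
x264_param_t param;
x264_param_default_preset(&param, "veryfast", "zerolatency");
param.i_width  = width;   // assumed known from the backbuffer
param.i_height = height;
param.i_csp    = X264_CSP_I420;
x264_param_apply_profile(&param, "baseline");
x264_t* encoder = x264_encoder_open(&param);

x264_picture_t pic_in, pic_out;
x264_picture_alloc(&pic_in, X264_CSP_I420, width, height);

// Per frame: fill pic_in.img.plane[0..2] with the converted YUV420P data,
// bump pic_in.i_pts, then encode and collect the resulting NAL units.
x264_nal_t* nals;
int num_nals;
int frame_size = x264_encoder_encode(encoder, &nals, &num_nals, &pic_in, &pic_out);
if (frame_size > 0) {
    for (int i = 0; i < num_nals; ++i) {
        // nals[i].p_payload / nals[i].i_payload is one NAL unit
        // to hand off to the H264FramedSource
    }
}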

I'm quite new to codecs and streaming, so I could be doing everything completely wrong. In each doGetNextFrame(), should I be grabbing the NAL packet and doing something like

memcpy(fTo, nal->p_payload, nal->i_payload)

I assume that the payload is my frame data in bytes? If anybody has an example of a class they derived from FramedSource that might at least be close to what I'm trying to do, I would love to see it; this is all new to me, and it's a little tricky to figure out what's happening. Live555's documentation is pretty much the code itself, which doesn't exactly make it easy to figure out.

Recommended Answer

Ok, I finally got some time to spend on this and got it working! I'm sure there are others who will be begging to know how to do it, so here it is.

You will need your own FramedSource to take each frame, encode it, and prepare it for streaming; I will provide some of the source code for this soon.
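
In the meantime, here is a rough sketch of the delivery logic, following the DeviceSource pattern the question mentions. The queue (fNalQueue) and the NalUnit struct are hypothetical placeholders for however you buffer encoded NALs between the render thread and Live555:

void H264FramedSource::doGetNextFrame() {
    deliverFrame();
}

void H264FramedSource::deliverFrame() {
    if (!isCurrentlyAwaitingData()) return; // downstream sink isn't ready yet

    NalUnit nal;                            // hypothetical queue element
    if (!fNalQueue.try_pop(nal)) return;    // no encoded frame available yet

    // H264VideoStreamDiscreteFramer expects NAL units *without* the
    // 4-byte Annex B start code (0x00000001), so skip past it.
    unsigned offset = 4;
    unsigned size = nal.size - offset;

    if (size > fMaxSize) {                  // truncate if the sink's buffer is too small
        fFrameSize = fMaxSize;
        fNumTruncatedBytes = size - fMaxSize;
    } else {
        fFrameSize = size;
        fNumTruncatedBytes = 0;
    }
    gettimeofday(&fPresentationTime, NULL);
    memcpy(fTo, nal.data + offset, fFrameSize);

    // Tell Live555 the frame is ready.
    FramedSource::afterGetting(this);
}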

Essentially throw your FramedSource into the H264VideoStreamDiscreteFramer, then throw this into the H264RTPSink. Something like this:

TaskScheduler* scheduler = BasicTaskScheduler::createNew();
UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

H264FramedSource* framedSource = H264FramedSource::createNew(*env, 0, 0);

H264VideoStreamDiscreteFramer* h264VideoStreamDiscreteFramer
    = H264VideoStreamDiscreteFramer::createNew(*env, framedSource);

// initialise the RTP Sink stuff here, look at
// testH264VideoStreamer.cpp to find out how

videoSink->startPlaying(*h264VideoStreamDiscreteFramer, NULL, videoSink);

env->taskScheduler().doEventLoop();
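
For completeness, the sink initialisation that comment alludes to looks roughly like this in testH264VideoStreamer.cpp. This is a multicast sketch; the port, TTL, and payload type 96 are that test program's values, and the RTCP/RTSP setup is omitted here:

struct in_addr destinationAddress;
destinationAddress.s_addr = chooseRandomIPv4SSMAddress(*env);

const Port rtpPort(18888);
const unsigned char ttl = 255;

Groupsock rtpGroupsock(*env, destinationAddress, rtpPort, ttl);
rtpGroupsock.multicastSendOnly(); // needed for source-specific multicast

OutPacketBuffer::maxSize = 100000; // enlarge if your encoded frames are bigger

RTPSink* videoSink = H264VideoRTPSink::createNew(*env, &rtpGroupsock, 96);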

Now, in your main render loop, hand the backbuffer you saved to system memory over to your FramedSource so that it can be encoded and so on. For more info on how to set up the encoding side of things, check out this answer.
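
Something along these lines, where every name is a hypothetical placeholder; the render thread produces encoded NALs, and Live555's event loop (running on its own thread) consumes them via deliverFrame():

while (running) {
    renderScene();                            // normal D3D9 drawing
    grabBackbufferToSystemMemory(rgbBuffer);  // e.g. via GetRenderTargetData
    convertRGBToYUV420P(rgbBuffer, pic_in);   // fill the x264 input picture
    encodeFrameAndQueueNALs(pic_in);          // push NALs onto fNalQueue
}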

Oh, and for those who want to know what my concurrent queue is, here it is, and it works brilliantly: http://www.justsoftwaresolutions.co.uk/threading/implementing-a-thread-safe-queue-using-condition-variables.html
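
If you just want the gist, a simplified condition-variable queue in the same spirit (not the article's exact code) looks like:

#include <queue>
#include <mutex>
#include <condition_variable>

template <typename T>
class ConcurrentQueue {
public:
    void push(T value) {
        std::lock_guard<std::mutex> lock(fMutex);
        fQueue.push(std::move(value));
        fCond.notify_one();
    }
    // Non-blocking pop, handy inside deliverFrame() above.
    bool try_pop(T& value) {
        std::lock_guard<std::mutex> lock(fMutex);
        if (fQueue.empty()) return false;
        value = std::move(fQueue.front());
        fQueue.pop();
        return true;
    }
private:
    std::queue<T> fQueue;
    std::mutex fMutex;
    std::condition_variable fCond;
};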

Enjoy, and good luck!

That wraps up this article on how to write a Live555 FramedSource for streaming H.264; hopefully the recommended answer above is helpful.
