This article covers the problem "gstreamer appsrc works with xvimagesink but not with theoraenc ! oggmux" and its solution, which should be a useful reference for anyone hitting the same issue.

Question


I am trying to stream a computer-generated video using gstreamer and icecast, but I cannot get gstreamer appsrc to work. My app works as expected if I use xvimagesink as the sink (see the commented code below), but once I pipe it into theoraenc it does not run.


I exchanged shout2send for filesink to check whether the problem was icecast; the result is that no data gets written to the file. Substituting appsrc with videotestsrc works as expected. Any suggestions?

#!/usr/bin/env python
import sys, os, pygtk, gtk, gobject
import pygst
pygst.require("0.10")
import gst
import numpy as np

class GTK_Main:
    def __init__(self):
        window = gtk.Window(gtk.WINDOW_TOPLEVEL)
        window.connect("destroy", gtk.main_quit, "WM destroy")
        vbox = gtk.VBox()
        window.add(vbox)
        self.button = gtk.Button("Start")
        self.button.connect("clicked", self.start_stop)
        vbox.add(self.button)
        window.show_all()

        self.player = gst.Pipeline("player")
        source = gst.element_factory_make("appsrc", "source")
        caps = gst.Caps("video/x-raw-gray,bpp=16,endianness=1234,width=320,height=240,framerate=(fraction)10/1")
        source.set_property('caps',caps)
        source.set_property('blocksize',320*240*2)
        source.connect('need-data', self.needdata)
        colorspace = gst.element_factory_make('ffmpegcolorspace')
        enc = gst.element_factory_make('theoraenc')
        mux = gst.element_factory_make('oggmux')
        shout = gst.element_factory_make('shout2send')
        shout.set_property("ip","localhost")
        shout.set_property("password","hackme")
        shout.set_property("mount","/stream")
        caps = gst.Caps("video/x-raw-yuv,width=320,height=240,framerate=(fraction)10/1,format=(fourcc)I420")
        enc.caps = caps
        videosink = gst.element_factory_make('xvimagesink')
        videosink.caps = caps

        self.player.add(source, colorspace, enc, mux, shout)
        gst.element_link_many(source, colorspace, enc, mux, shout)
        #self.player.add(source, colorspace, videosink)
        #gst.element_link_many(source, colorspace, videosink)

    def start_stop(self, w):
        if self.button.get_label() == "Start":
            self.button.set_label("Stop")
            self.player.set_state(gst.STATE_PLAYING)
        else:
            self.player.set_state(gst.STATE_NULL)
            self.button.set_label("Start")

    def needdata(self, src, length):
        bytes = np.int16(np.random.rand(length/2)*30000).data
        src.emit('push-buffer', gst.Buffer(bytes))

GTK_Main()
gtk.gdk.threads_init()
gtk.main()
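
The filesink check mentioned in the question amounts to swapping the last element of the pipeline; a minimal sketch under that assumption (the output path is an illustrative choice, not from the original):

# Debug variant: write the Ogg stream to a file instead of shout2send,
# to rule out icecast as the cause.
filesink = gst.element_factory_make('filesink')
filesink.set_property('location', '/tmp/appsrc-test.ogg')
self.player.add(source, colorspace, enc, mux, filesink)
gst.element_link_many(source, colorspace, enc, mux, filesink)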

Answer


I think that your problem is most likely to do with timestamping of the buffers. I've done some quick testing, using that code and replacing the shout element with oggdemux, theoradec, ffmpegcolorspace and ximagesink. At first, I got no output, but after I dispensed with the muxing/demuxing altogether, I got a static image, along with some debug messages about timestamps. I got the correct output after setting the is-live and do-timestamp properties to true on appsrc.
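
Applied to the question's code, a minimal sketch of that change to the appsrc setup (property names as in GStreamer 0.10):

# Make appsrc behave as a live source and stamp outgoing buffers itself.
source.set_property('is-live', True)
source.set_property('do-timestamp', True)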


I assume that it should be possible to directly set the timestamps on the buffers that you are pushing out of appsrc, but alas I've not discovered how to do that.
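
For reference, a hedged sketch of stamping the buffers by hand in the need-data callback, assuming a self.frame counter initialised to 0 in __init__ (not in the original code); in the 0.10 Python bindings gst.Buffer exposes writable timestamp and duration fields:

    def needdata(self, src, length):
        bytes = np.int16(np.random.rand(length / 2) * 30000).data
        buf = gst.Buffer(bytes)
        # 10 fps as declared in the caps: each frame lasts 1/10 of a second.
        buf.duration = gst.SECOND / 10
        buf.timestamp = self.frame * buf.duration
        self.frame += 1
        src.emit('push-buffer', buf)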

This concludes the article on "gstreamer appsrc works with xvimagesink but not with theoraenc ! oggmux"; hopefully the answer above is helpful.
