I have written code that goes from appsrc to appsink, and it works: I can see the actual buffers. The stream is encoded as H264 (vpuenc codec=avc). Now I want to save it to a file (filesink). How should I go about this?

The application:

#include <stdint.h>
#include <gst/gst.h>
#include <gst/app/gstappsink.h>

int main (int argc, char *argv[])
{
    GstElement *pipeline, *sink;
    GstAppSink *appsink;
    gchar *descr;
    GError *error = NULL;

    gst_init (&argc, &argv);

    descr = g_strdup_printf (
        "mfw_v4lsrc device=/dev/video1 capture_mode=0 ! "  // grab from MIPI camera
        "ffmpegcolorspace ! vpuenc codec=avc ! "
        "appsink name=sink"
    );
    pipeline = gst_parse_launch (descr, &error);
    g_free (descr);
    if (error != NULL) {
        g_print ("could not construct pipeline: %s\n", error->message);
        g_error_free (error);
        return -1;
    }

    gst_element_set_state (pipeline, GST_STATE_PAUSED);
    sink = gst_bin_get_by_name (GST_BIN (pipeline), "sink");
    appsink = GST_APP_SINK (sink);
    gst_app_sink_set_max_buffers (appsink, 2);  // limit number of buffers queued
    gst_app_sink_set_drop (appsink, TRUE);      // drop old buffers in queue when full

    gst_element_set_state (pipeline, GST_STATE_PLAYING);

    while (!gst_app_sink_is_eos (appsink))
    {
        GstBuffer *buffer = gst_app_sink_pull_buffer (appsink);  // blocks until a buffer or EOS
        uint8_t *data = (uint8_t *) GST_BUFFER_DATA (buffer);
        uint32_t size = GST_BUFFER_SIZE (buffer);
        /* ... process the encoded H264 data here ... */
        gst_buffer_unref (buffer);
    }

    gst_element_set_state (pipeline, GST_STATE_NULL);
    gst_object_unref (pipeline);
    return 0;
}
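To answer the question as asked: because the appsink already delivers the encoded H264 byte stream, saving it to a file is just a matter of appending each pulled buffer with `fwrite`. A minimal sketch of the pull loop, reworked to dump to disk (the filename `out.h264` is illustrative; it uses the same GStreamer 0.10 appsink API as the code above, and `#include <stdio.h>` is needed):

```c
/* Sketch (assumption): replaces the while loop in main() above.
 * Writes the raw H264 byte stream to "out.h264" (illustrative name).
 * The result is an elementary stream; remux it into a container
 * (e.g. MP4 or Matroska) for playback in ordinary players. */
FILE *out = fopen ("out.h264", "wb");
if (out == NULL)
    return -1;

while (!gst_app_sink_is_eos (appsink))
{
    GstBuffer *buffer = gst_app_sink_pull_buffer (appsink);
    if (buffer == NULL)   /* pull returns NULL at EOS */
        break;
    fwrite (GST_BUFFER_DATA (buffer), 1, GST_BUFFER_SIZE (buffer), out);
    gst_buffer_unref (buffer);
}

fclose (out);
```

This only makes sense if you need the buffers in application code for some other reason; if recording is the sole goal, a plain `filesink` in the pipeline (no appsink at all) is simpler.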

Best answer

If, as stated in the comments, what you really want to know is how to stream video over the network with GStreamer, you should probably close this question, because you are on the wrong track. You do not need appsink or filesink for that. What you want to look into are the GStreamer elements related to RTP, RTSP, RTMP, MPEG-TS, or even MJPEG (if the image size is small enough).

Here are two basic send/receive video-stream pipelines:


  gst-launch-0.10 v4l2src ! ffmpegcolorspace ! videoscale ! video/x-raw-yuv,width=640,height=480 ! vpuenc ! h264parse ! rtph264pay ! udpsink host=localhost port=5555
  
  gst-launch-0.10 udpsrc port=5555 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! h264parse ! ffdec_h264 ! videoconvert ! ximagesink
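And if the goal really is just to record to a file, as the original question asked, that can also be done entirely from gst-launch with a filesink, no appsink needed. A sketch, assuming the same i.MX GStreamer 0.10 elements as the questioner's pipeline (`matroskamux` and the output filename `out.mkv` are illustrative choices):

```shell
# Sketch: record the camera as H264 into a Matroska file.
# mfw_v4lsrc/vpuenc are the i.MX 0.10 elements used in the question;
# matroskamux and out.mkv are illustrative, not prescribed.
gst-launch-0.10 mfw_v4lsrc device=/dev/video1 capture_mode=0 ! \
    ffmpegcolorspace ! vpuenc codec=avc ! h264parse ! \
    matroskamux ! filesink location=out.mkv
```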

Regarding "c++ - GStreamer: writing appsink to filesink", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/28040857/
