I'm developing on a Jetson Nano, and my goal is to output two overlapping videos. Here is the command I wrote first:
gst-launch-1.0 v4l2src device=/dev/video0 io-mode=2 ! image/jpeg, width=1920, height=1080 ! nvjpegdec ! video/x-raw ! queue ! nvvidconv ! 'video/x-raw(memory:NVMM),format=RGBA' ! queue ! comp.sink_0 \
v4l2src device=/dev/video1 io-mode=2 ! image/jpeg, width=1920, height=1080 ! nvjpegdec ! video/x-raw ! queue ! nvvidconv ! 'video/x-raw(memory:NVMM),format=RGBA' ! queue ! comp.sink_1 \
alsasrc device="hw:1" ! audioconvert ! audioresample ! audiorate ! "audio/x-raw, rate=48000, channels=2" ! queue ! faac bitrate=128000 rate-control=2 ! queue ! muxer. \
nvcompositor name=comp sink_0::xpos=0 sink_0::ypos=0 sink_0::width=768 sink_0::height=432 sink_0::zorder=2 sink_0::alpha=1.0 sink_1::xpos=0 sink_1::ypos=0 sink_1::width=1920 sink_1::height=1080 sink_1::zorder=1 sink_1::alpha=1.0 ! \
'video/x-raw(memory:NVMM),format=RGBA' ! nvvidconv ! 'video/x-raw(memory:NVMM),format=(string)I420' ! nvv4l2h265enc ! h265parse ! queue ! mpegtsmux name=muxer ! filesink location=test.mp4
This is my command, and this is my C code:
gst_bin_add_many (GST_BIN (gstContext->pipeline),
gstContext->videosrc, gstContext->video_filter1, gstContext->jpegdec, gstContext->x_raw, gstContext->queue_video1, gstContext->video_convert2, gstContext->video_filter4, gstContext->queue_video2,
gstContext->videosrc2, gstContext->video_filter2_1, gstContext->jpegdec2, gstContext->x_raw2, gstContext->queue_video2_1, gstContext->video_convert3, gstContext->video_filter2_3, gstContext->queue_video2_2,
gstContext->video_mixer, gstContext->video_filter3, gstContext->video_convert, gstContext->video_filter2, gstContext->video_encoder, gstContext->video_pasre, gstContext->queue_video3, gstContext->muxer, gstContext->sink2, NULL);
if(
!gst_element_link_many(gstContext->video_mixer, gstContext->video_filter3, gstContext->video_convert, gstContext->video_filter2, gstContext->video_encoder, gstContext->video_pasre, gstContext->queue_video3, gstContext->muxer, NULL)
|| !gst_element_link_many(gstContext->videosrc, gstContext->video_filter1, gstContext->jpegdec, gstContext->x_raw, gstContext->queue_video1, gstContext->video_convert2, gstContext->video_filter4, gstContext->queue_video2 ,NULL)
|| !gst_element_link_many(gstContext->videosrc2, gstContext->video_filter2_1, gstContext->jpegdec2, gstContext->x_raw2, gstContext->queue_video2_1, gstContext->video_convert3, gstContext->video_filter2_3, gstContext->queue_video2_2 , NULL)
)
{
g_error("Failed to link elements!");
pthread_mutex_unlock(&context->lock);
return -2;
}
queue_video2 = gst_element_get_static_pad (gstContext->queue_video2, "src");
queue_video2_2 = gst_element_get_static_pad (gstContext->queue_video2_2, "src");
mixer2_sinkpad = gst_element_get_request_pad (gstContext->video_mixer, "sink_%u");
mixer1_sinkpad = gst_element_get_request_pad (gstContext->video_mixer, "sink_%u");
if (gst_pad_link (queue_video2, mixer2_sinkpad) != GST_PAD_LINK_OK ||
gst_pad_link (queue_video2_2, mixer1_sinkpad) != GST_PAD_LINK_OK) {
g_printerr ("Source and mixer pads could not be linked.\n");
gst_object_unref (gstContext->pipeline);
return -1;
}
gst_object_unref(queue_video2);
gst_object_unref(queue_video2_2);
And the log:
0:00:00.926466171 968 0x7f54324050 FIXME videodecoder gstvideodecoder.c:933:gst_video_decoder_drain_out:<nvjpegdec0> Sub-class should implement drain()
0:00:00.979159196 968 0x7f54324050 WARN basesrc gstbasesrc.c:3055:gst_base_src_loop:<videosrc1> error: Internal data stream error.
0:00:00.979208884 968 0x7f54324050 WARN basesrc gstbasesrc.c:3055:gst_base_src_loop:<videosrc1> error: streaming stopped, reason not-linked (-1)
0:00:00.979354772 968 0x7f54324050 WARN queue gstqueue.c:988:gst_queue_handle_sink_event:<queue_video1> error: Internal data stream error.
0:00:00.979391699 968 0x7f54324050 WARN queue gstqueue.c:988:gst_queue_handle_sink_event:<queue_video1> error: streaming stopped, reason not-linked (-1)
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:videosrc1: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:videosrc1:
I connected the elements using request pads, so why am I getting this error? Please let me know if I've done anything wrong. The gst-launch command works as soon as you run it.
Thank you.
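For what it's worth, one way to pin down which pad is actually left unlinked is to dump the pipeline graph once everything is linked. This is a minimal sketch, assuming the GST_DEBUG_DUMP_DOT_DIR environment variable points at a writable directory:

/* Sketch: write a Graphviz .dot snapshot of the pipeline so unlinked pads
   become visible. Only produces output if GST_DEBUG_DUMP_DOT_DIR is set. */
GST_DEBUG_BIN_TO_DOT_FILE (GST_BIN (gstContext->pipeline),
    GST_DEBUG_GRAPH_SHOW_ALL, "mixer-pipeline");

Rendering the resulting file with dot -Tpng shows every element and pad, which usually makes a not-linked flow return easy to trace.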
Related
I have the following two pipelines to transmit Opus-encoded audio from a server to a client:
The server:
gst-launch-1.0 -v alsasrc ! audioconvert ! audioresample ! audio/x-raw, rate=16000, channels=1, format=S16LE ! opusenc ! rtpopuspay ! udpsink host=0.0.0.0 port=4000
The client:
gst-launch-1.0 udpsrc port=4000 ! application/x-rtp,payload=96,encoding-name=OPUS ! rtpopusdepay ! opusdec ! autoaudiosink
I am trying to create a custom GstElement-based plugin to replace rtpopusdepay on the client side with a hand-crafted one (to be backward compatible with an existing server implementation that doesn't use rtpopuspay but uses a hand-crafted byte format to wrap the Opus-encoded data).
To test the concept I would like to use the pipelines above, but replace the client side with:
GST_PLUGIN_PATH=. gst-launch-1.0 udpsrc port=4000 ! simpacketdepay ! opusdec ! autoaudiosink
Where simpacketdepay is the plugin I created. The plugin is quite simple: it has fixed caps (ANY for its sink and "audio/x-opus" for its src). In its chain function I simply remove the header rtpopuspay prepends to the encoded Opus stream (the first 96 bits, i.e. 12 bytes) and push the data forward.
The full code:
#include "gstsimpacketdepay.h"
#include <stdio.h>
#include <string.h>
#include <gst/gst.h>
#include <gst/gstcaps.h>
GST_DEBUG_CATEGORY_STATIC (gst_simpacketdepay_debug);
#define GST_CAT_DEFAULT gst_simpacketdepay_debug
/* Enum to identify properties */
enum
{
PROP_0
};
static GstStaticPadTemplate sink_factory = GST_STATIC_PAD_TEMPLATE(
"sink",
GST_PAD_SINK,
GST_PAD_ALWAYS,
GST_STATIC_CAPS("ANY")
);
static GstStaticPadTemplate src_factory = GST_STATIC_PAD_TEMPLATE (
"src",
GST_PAD_SRC,
GST_PAD_ALWAYS,
GST_STATIC_CAPS("audio/x-opus, rate=16000, channels=1, channel-mapping-family=0, stream-count=1, coupled-count=0")
);
/* Define our element type. Standard GObject/GStreamer boilerplate stuff */
#define gst_simpacketdepay_parent_class parent_class
G_DEFINE_TYPE(GstSimpacketdepay, gst_simpacketdepay, GST_TYPE_ELEMENT);
static GstFlowReturn gst_simpacketdepay_chain (GstPad *pad, GstObject *parent, GstBuffer *buf);
static void gst_simpacketdepay_class_init (GstSimpacketdepayClass * klass)
{
GObjectClass *gobject_class;
GstElementClass *gstelement_class;
gstelement_class = (GstElementClass *) klass;
/* Set sink and src pad capabilities */
gst_element_class_add_pad_template (gstelement_class, gst_static_pad_template_get(&src_factory));
gst_element_class_add_pad_template (gstelement_class, gst_static_pad_template_get(&sink_factory));
/* Set metadata describing the element */
gst_element_class_set_details_simple (
gstelement_class,
"simpacketdepay plugin",
"simpacketdepay plugin",
"Sim Packet depay",
"Test"
);
}
static void gst_simpacketdepay_init (GstSimpacketdepay * simpacketdepay)
{
simpacketdepay->sinkpad = gst_pad_new_from_static_template (&sink_factory, "sink");
simpacketdepay->srcpad = gst_pad_new_from_static_template (&src_factory, "src");
gst_pad_use_fixed_caps(simpacketdepay->sinkpad);
gst_pad_use_fixed_caps(simpacketdepay->srcpad);
gst_element_add_pad (GST_ELEMENT (simpacketdepay), simpacketdepay->sinkpad);
gst_element_add_pad (GST_ELEMENT (simpacketdepay), simpacketdepay->srcpad);
gst_pad_set_chain_function (simpacketdepay->sinkpad, gst_simpacketdepay_chain);
}
static GstFlowReturn gst_simpacketdepay_chain (GstPad *pad, GstObject *parent, GstBuffer *inBuf)
{
GstSimpacketdepay *filter = GST_SIMPACKETDEPAY(parent);
GstMapInfo info;
gst_buffer_map(inBuf, &info, GST_MAP_READ);
const size_t inSize = info.size;
printf("Incoming size %lu\n", info.size);
gst_buffer_unmap(inBuf, &info);
GstBuffer* outBuf = gst_buffer_copy(inBuf);
GstMemory* const inMemory = gst_buffer_get_memory(inBuf, 0);
GstMemory* const outMemory = gst_memory_share(inMemory, 12, inSize - 12);
gst_memory_unref(inMemory); /* outMemory holds its own ref to the parent memory */
gst_buffer_remove_all_memory(outBuf);
gst_buffer_prepend_memory(outBuf, outMemory);
gst_buffer_map(outBuf, &info, GST_MAP_READ);
printf("Outgoing size: %lu\n", info.size);
fflush(stdout);
gst_buffer_unmap(outBuf, &info);
gst_buffer_unref (inBuf);
GstFlowReturn result = gst_pad_push (filter->srcpad, outBuf);
return result;
}
static gboolean simpacketdepay_plugin_init (GstPlugin * plugin)
{
GST_DEBUG_CATEGORY_INIT (gst_simpacketdepay_debug, "simpacketdepay", 0, "simpacketdepay");
return gst_element_register (plugin, "simpacketdepay", GST_RANK_NONE, GST_TYPE_SIMPACKETDEPAY);
}
#ifndef VERSION
#define VERSION "1.0.0"
#endif
#ifndef PACKAGE
#define PACKAGE "FIXME_package"
#endif
#ifndef PACKAGE_NAME
#define PACKAGE_NAME "FIXME_package_name"
#endif
#ifndef GST_PACKAGE_ORIGIN
#define GST_PACKAGE_ORIGIN "http://FIXME.org/"
#endif
GST_PLUGIN_DEFINE (
GST_VERSION_MAJOR,
GST_VERSION_MINOR,
simpacketdepay,
"FIXME plugin description",
simpacketdepay_plugin_init,
VERSION,
"LGPL",
PACKAGE_NAME,
GST_PACKAGE_ORIGIN
)
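As a side note, a plugin like this can be compiled into a loadable library along these lines; this is a sketch, with the source file name assumed from the include at the top:

gcc -shared -fPIC -o libgstsimpacketdepay.so gstsimpacketdepay.c $(pkg-config --cflags --libs gstreamer-1.0)

With the resulting .so in the current directory, the GST_PLUGIN_PATH=. prefix used in the client pipeline above lets gst-launch-1.0 discover it.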
The negotiation and everything else goes well until I push the first buffer to the source pad from gst_simpacketdepay_chain with GstFlowReturn result = gst_pad_push (filter->srcpad, outBuf);
Then I get the following error (detailed debug log pasted below):
0:00:00.510871708 42302 0x55fbd0c44000 LOG audiodecoder gstaudiodecoder.c:2034:gst_audio_decoder_chain:<opusdec0> received buffer of size 160 with ts 0:00:00.006492658, duration 99:99:99.999999999
0:00:00.510877845 42302 0x55fbd0c44000 WARN audiodecoder gstaudiodecoder.c:2084:gst_audio_decoder_chain:<opusdec0> error: decoder not initialized
0:00:00.510882963 42302 0x55fbd0c44000 DEBUG GST_MESSAGE gstelement.c:2110:gst_element_message_full_with_details:<opusdec0> start
0:00:00.510896592 42302 0x55fbd0c44000 INFO GST_ERROR_SYSTEM gstelement.c:2140:gst_element_message_full_with_details:<opusdec0> posting message: GStreamer error: negotiation problem.
0:00:00.510910301 42302 0x55fbd0c44000 LOG GST_MESSAGE gstmessage.c:303:gst_message_new_custom: source opusdec0: creating new message 0x7f519c002910 error
0:00:00.510919198 42302 0x55fbd0c44000 WARN structure gststructure.c:1861:priv_gst_structure_append_to_gstring: No value transform to serialize field 'gerror' of type 'GError'
0:00:00.510929043 42302 0x55fbd0c44000 DEBUG GST_BUS gstbus.c:315:gst_bus_post:<bus1> [msg 0x7f519c002910] posting on bus error message: 0x7f519c002910, time 99:99:99.999999999, seq-num 43, element 'opusdec0', GstMessageError, gerror=(GError)NULL, debug=(string)"gstaudiodecoder.c\(2084\):\ gst_audio_decoder_chain\ \(\):\ /GstPipeline:pipeline0/GstOpusDec:opusdec0:\012decoder\ not\ initialized";
0:00:00.510937098 42302 0x55fbd0c44000 DEBUG bin gstbin.c:3718:gst_bin_handle_message_func:<pipeline0> [msg 0x7f519c002910] handling child opusdec0 message of type error
0:00:00.510942210 42302 0x55fbd0c44000 DEBUG bin gstbin.c:3727:gst_bin_handle_message_func:<pipeline0> got ERROR message, unlocking state change
0:00:00.510947151 42302 0x55fbd0c44000 DEBUG bin gstbin.c:4065:gst_bin_handle_message_func:<pipeline0> posting message upward
0:00:00.510955219 42302 0x55fbd0c44000 WARN structure gststructure.c:1861:priv_gst_structure_append_to_gstring: No value transform to serialize field 'gerror' of type 'GError'
0:00:00.510962328 42302 0x55fbd0c44000 DEBUG GST_BUS gstbus.c:315:gst_bus_post:<bus2> [msg 0x7f519c002910] posting on bus error message: 0x7f519c002910, time 99:99:99.999999999, seq-num 43, element 'opusdec0', GstMessageError, gerror=(GError)NULL, debug=(string)"gstaudiodecoder.c\(2084\):\ gst_audio_decoder_chain\ \(\):\ /GstPipeline:pipeline0/GstOpusDec:opusdec0:\012decoder\ not\ initialized";
<opusdec0> error: decoder not initialized? Do I need to do something special to initialize the Opus decoder? What step am I missing?
I was able to solve the issue. When the plugin element enters the PLAYING state, we should push a gst_event_new_caps event to the source pad, even with fixed caps... I haven't found anything in the documentation that explains this requirement.
So I added the following state change handler and the pipeline started to work:
static GstStateChangeReturn gst_simpacketdepay_change_state (GstElement *element, GstStateChange transition)
{
const GstStateChangeReturn result = GST_ELEMENT_CLASS(parent_class)->change_state (element, transition);
if (result == GST_STATE_CHANGE_FAILURE) {
return result;
}
switch (transition) {
case GST_STATE_CHANGE_PAUSED_TO_PLAYING: {
GstSimpacketdepay *filter = GST_SIMPACKETDEPAY(element);
gst_pad_push_event(filter->srcpad, gst_event_new_caps(gst_pad_template_get_caps(gst_static_pad_template_get(&src_factory))));
} break;
default:
break;
}
return result;
}
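One detail worth spelling out: this handler only runs if it is installed as the element's change_state virtual method. A minimal sketch of the class_init wiring, to be merged with the class_init shown earlier:

static void gst_simpacketdepay_class_init (GstSimpacketdepayClass * klass)
{
    GstElementClass *gstelement_class = (GstElementClass *) klass;
    /* Install the custom state change handler so GStreamer calls it */
    gstelement_class->change_state = gst_simpacketdepay_change_state;
    /* ... pad template and metadata registration as before ... */
}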
I'm sad to see how underdocumented this part of GStreamer is.
I want to do echo cancellation using GStreamer, but I am getting this error:
Error: No echo probe with name mp_echo_cancellation1 found.
I have used the functions below.
tx side:
mp_echo_cancellation = gst_element_factory_make ("webrtcdsp", "echo cancellation");
gst_element_link_many (mp_source, mp_queue, mp_echo_cancellation, mp_level, mp_filter, mp_conv, mp_muxer, mp_sink, NULL);
rx side:
mp_echo_cancellation1 = gst_element_factory_make ("webrtcechoprobe", "echo cancellation1");
gst_element_link_many (mp_source1, mp_queue1, /*rtpdemuxer,*/ mp_decoder1, mp_conv1, mp_echo_cancellation1, /*mp_resample,*/ mp_sink1, NULL);
Please help me out. Thanks in advance.
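For reference, webrtcdsp locates its echo probe through its "probe" property, which (as far as I understand) must match the webrtcechoprobe element's name. A minimal sketch under that assumption:

/* Sketch: give the probe a fixed name and point webrtcdsp at that name.
   Assumes the lookup is by element name via the "probe" property. */
mp_echo_cancellation1 = gst_element_factory_make ("webrtcechoprobe", "echoprobe0");
mp_echo_cancellation = gst_element_factory_make ("webrtcdsp", "dsp0");
g_object_set (G_OBJECT (mp_echo_cancellation), "probe", "echoprobe0", NULL);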
I'm using qtmux to merge audio and video into an MP4 container file with GStreamer. My pipeline looks like:
gst-launch-1.0 autovideosrc ! x264enc ! queue ! qtmux0. autoaudiosrc ! wavenc ! queue ! qtmux ! filesink location=file.mp4
videotestsrc --> x264enc -----\
>---> qtmux ---> filesink
audiotestsrc --> wavenc ------/
It works well from the command line. But I want to write it in C code, and I was stuck on this part:
x264enc -----\
>---> qtmux
wavenc ------/
This is the code for this part:
gst_element_link_many(audiosource, wavenc, audioqueue, NULL);
gst_element_link_many(videosource, x264enc, videoqueue, NULL);
gst_element_link_many(qtmux, filesink, NULL);
audio_pad = gst_element_get_request_pad (audioqueue, "src");
mux_audio_pad = gst_element_get_static_pad (qtmux, "sink_1");
gst_pad_link (audio_pad, mux_audio_pad); /* ERROR HERE */
video_pad = gst_element_get_request_pad (videoqueue, "src");
mux_video_pad = gst_element_get_static_pad(qtmux, "sink_2");
gst_pad_link (video_pad, mux_video_pad); /* ERROR HERE */
But it goes wrong at the pad-linking step, with the error GST_PAD_LINK_NOFORMAT (-4): pads do not have a common format.
How can I fix it?
I think you have switched the request/static pad calls here. The queue has static pads, while the muxer has request pads.
You can also make your life easier by using the gst_parse_launch() function to create the pipeline just as you do on the command line, saving a lot of error-prone code.
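A minimal sketch of the corrected pad handling; note that qtmux's request pad templates are named audio_%u and video_%u rather than sink_%u:

/* Static src pads on the queues, request pads on the muxer */
audio_pad = gst_element_get_static_pad (audioqueue, "src");
mux_audio_pad = gst_element_get_request_pad (qtmux, "audio_%u");
gst_pad_link (audio_pad, mux_audio_pad);

video_pad = gst_element_get_static_pad (videoqueue, "src");
mux_video_pad = gst_element_get_request_pad (qtmux, "video_%u");
gst_pad_link (video_pad, mux_video_pad);

Remember to drop the pad references with gst_object_unref() once the links succeed.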
I am trying the code below to change the brightness of a video pipeline. I can see the video, but the brightness never changes, although I am trying to change it every 60 seconds. Any idea what I am missing?
static gboolean broadcasting_timeout_cb (gpointer user_data)
{
GstElement *vaapipostproc = NULL;
vaapipostproc = gst_bin_get_by_name(GST_BIN(broadcasting_pipeline),
"postproc");
if (vaapipostproc == NULL) {
fprintf(stderr, "unable to get vaapipostproc from broadcasting pipeline\n");
return TRUE;
}
g_object_set (G_OBJECT (vaapipostproc), "brightness", -1.0, NULL);
/* gst_bin_get_by_name() returns a new reference, so drop it here */
gst_object_unref (vaapipostproc);
fprintf(stderr, "brightness changed by -1.0\n");
return TRUE;
}
main() {
// pipeline construction code goes here; then the code below follows
broadcasting_pipeline = gst_parse_launch (compl_streaming_pipe, &error);
if (!broadcasting_pipeline) {
fprintf (stderr, "Parse error: %s\n", error->message);
exit (1);
}
loop_broadcasting = g_main_loop_new (NULL, FALSE);
g_timeout_add_seconds (60, broadcasting_timeout_cb, loop_broadcasting);
gst_element_set_state (broadcasting_pipeline, GST_STATE_PLAYING);
g_main_loop_run(loop_broadcasting);
// rest of the code for main function comes here
}
It seems that vaapipostproc properties like brightness can't be changed dynamically at runtime!
However, I found that the videobalance element works, as suggested by Millind Deore. But videobalance causes CPU usage to be too high, becoming a bottleneck for the streaming pipeline.
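For reference, a minimal sketch of the timeout callback adapted for videobalance; it assumes the launch string contains a videobalance element named balance and reuses the broadcasting_pipeline global from the code above:

static gboolean brightness_timeout_cb (gpointer user_data)
{
    GstElement *balance = gst_bin_get_by_name (GST_BIN (broadcasting_pipeline), "balance");
    if (balance == NULL)
        return TRUE;
    /* videobalance brightness ranges from -1.0 to 1.0, with 0.0 neutral */
    g_object_set (G_OBJECT (balance), "brightness", -0.2, NULL);
    gst_object_unref (balance); /* gst_bin_get_by_name() returned a new ref */
    return TRUE;
}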
So I tried glcolorbalance, which is like videobalance but uses the GPU for the brightness conversion.
Here is how my experiment went.
If I use the pipeline below, then I can broadcast to YouTube successfully:
gst-launch-1.0 filesrc location=Recorded_live_streaming_on__2018_01_20___13_56_33.219076__-0800.flv ! decodebin name=demux ! queue ! videorate ! video/x-raw,framerate=30/1 ! glupload ! glcolorconvert ! gldownload ! video/x-raw ! vaapih264enc dct8x8=true cabac=true rate-control=cbr bitrate=8192 keyframe-period=60 max-bframes=0 ! flvmux name=mux ! rtmpsink sync=true async=true location="rtmp://x.rtmp.youtube.com/XXXXX live=1" demux. ! queue ! progressreport ! audioconvert ! audiorate ! audioresample ! faac bitrate=128000 ! audio/mpeg,mpegversion=4,stream-format=raw ! mux.
error: XDG_RUNTIME_DIR not set in the environment.
libva info: VA-API version 0.39.0
libva info: va_getDriverName() returns 0
libva info: Trying to open /usr/lib/x86_64-linux-gnu/dri/i965_drv_video.so
libva info: Found init function __vaDriverInit_0_39
libva info: va_openDriver() returns 0
Setting pipeline to PAUSED ...
error: XDG_RUNTIME_DIR not set in the environment.
libva info: VA-API version 0.39.0
libva info: va_getDriverName() returns 0
libva info: Trying to open /usr/lib/x86_64-linux-gnu/dri/i965_drv_video.so
libva info: Found init function __vaDriverInit_0_39
libva info: va_openDriver() returns 0
Pipeline is PREROLLING ...
Got context from element 'vaapiencodeh264-0': gst.vaapi.Display=context, gst.vaapi.Display=(GstVaapiDisplay)"(GstVaapiDisplayGLX)\ vaapidisplayglx1";
Got context from element 'gldownloadelement0': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"(GstGLDisplayX11)\ gldisplayx11-0";
Redistribute latency...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
progressreport0 (00:00:05): 4 / 1984 seconds ( 0.2 %)
progressreport0 (00:00:10): 9 / 1984 seconds ( 0.5 %)
progressreport0 (00:00:15): 14 / 1984 seconds ( 0.7 %)
progressreport0 (00:00:20): 19 / 1984 seconds ( 1.0 %)
However, if I use glcolorbalance in this pipeline, then it gives me the following error and I cannot stream to YouTube any more:
gst-launch-1.0 filesrc location=Recorded_live_streaming_on__2018_01_20___13_56_33.219076__-0800.flv ! decodebin name=demux ! queue ! videorate ! video/x-raw,framerate=30/1 ! glupload ! glcolorbalance ! glcolorconvert ! gldownload ! video/x-raw ! vaapih264enc dct8x8=true cabac=true rate-control=cbr bitrate=8192 keyframe-period=60 max-bframes=0 ! flvmux name=mux ! rtmpsink sync=true async=true location="rtmp://x.rtmp.youtube.com/XXXXX live=1" demux. ! queue ! progressreport ! audioconvert ! audiorate ! audioresample ! faac bitrate=128000 ! audio/mpeg,mpegversion=4,stream-format=raw ! mux.
error: XDG_RUNTIME_DIR not set in the environment.
libva info: VA-API version 0.39.0
libva info: va_getDriverName() returns 0
libva info: Trying to open /usr/lib/x86_64-linux-gnu/dri/i965_drv_video.so
libva info: Found init function __vaDriverInit_0_39
libva info: va_openDriver() returns 0
Setting pipeline to PAUSED ...
error: XDG_RUNTIME_DIR not set in the environment.
libva info: VA-API version 0.39.0
libva info: va_getDriverName() returns 0
libva info: Trying to open /usr/lib/x86_64-linux-gnu/dri/i965_drv_video.so
libva info: Found init function __vaDriverInit_0_39
libva info: va_openDriver() returns 0
Pipeline is PREROLLING ...
Got context from element 'vaapiencodeh264-0': gst.vaapi.Display=context, gst.vaapi.Display=(GstVaapiDisplay)"(GstVaapiDisplayGLX)\ vaapidisplayglx1";
Got context from element 'gldownloadelement0': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"(GstGLDisplayX11)\ gldisplayx11-0";
Redistribute latency...
WARNING: from element /GstPipeline:pipeline0/GstDecodeBin:demux: Delayed linking failed.
Additional debug info:
./grammar.y(510): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstDecodeBin:demux:
failed delayed linking some pad of GstDecodeBin named demux to some pad of GstQueue named queue0
^Chandling interrupt.
Interrupt: Stopping pipeline ...
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
It seems like glcolorbalance is causing decodebin not to link, since it is the only difference between the two pipelines above.
I am new to GStreamer; can anyone tell me what is wrong with the second pipeline and why the linking is failing?
I'm trying to build the following working pipeline in C code:
gst-launch-1.0 -v udpsrc port=5000 \
! application/x-rtp,payload=96,media=video,clock-rate=90000,encoding-name=H264,sprop-parameter-sets=\"J2QAH6wrQCIC3y8A8SJq\\,KO4CXLA\\=\" \
! rtph264depay ! avdec_h264 \
! videoconvert ! autovideosink sync=false
Following the tutorial, I instantiated the needed elements and checked whether they had been created:
GstElement *pipeline, *source, *depay, *decode, *videoconvert, *sink;
// Initialize GStreamer
gst_init (NULL, NULL);
// Create the elements
source = gst_element_factory_make("udpsrc", "source");
depay = gst_element_factory_make("rtph264depay", "depay");
decode = gst_element_factory_make("avdec_h264", "decode");
videoconvert = gst_element_factory_make("videoconvert", "videoconvert");
sink = gst_element_factory_make("autovideosink", "sink");
// Create the empty pipeline
pipeline = gst_pipeline_new ("pipeline");
//Check if all elements have been created
if (!pipeline || !source || !depay || !decode || !videoconvert || !sink) {
g_printerr ("Not all elements could be created.\n");
return -1;
}
The code compiles successfully; however, the output when executing is "Not all elements could be created." Further tries made me find that it is the decoder element that is not created.
Where is my mistake? Why is the decoder element not created?
I'm on OS X using gcc in the Eclipse environment. The include path and linker flag are set.
Update:
Running the code with GST_DEBUG=3, as suggested by max taldykin, outputs:
0:00:00.011208000 5624 0x7f863a411600 INFO GST_ELEMENT_FACTORY gstelementfactory.c:467:gst_element_factory_make: no such element factory "avdec_h264"!
0:00:00.011220000 5624 0x7f863a411600 INFO GST_ELEMENT_FACTORY gstelementfactory.c:467:gst_element_factory_make: no such element factory "videoconvert"!
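For reference, avdec_h264 ships with the gst-libav plugin set and videoconvert with gst-plugins-base, so this log points to missing plugins rather than a code bug. A quick programmatic check (a sketch; running gst-inspect-1.0 avdec_h264 from a shell gives the same answer):

/* Sketch: check whether a factory is visible to this GStreamer install */
GstElementFactory *factory = gst_element_factory_find ("avdec_h264");
if (factory == NULL)
    g_printerr ("avdec_h264 not found; is the gst-libav plugin set installed?\n");
else
    gst_object_unref (factory);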