How to set the capture duration of a video using GStreamer (C)

gst-launch-1.0 -e v4l2src ! x264enc ! h264parse ! mp4mux ! splitmuxsink max-size-time=30000000000 location=test1.mp4
This is my pipeline. When I execute it, it gives an error like this:
WARNING: erroneous pipeline: no element "splitmuxsink"
Is there any solution for this warning? I have installed all the plugins, and I am a beginner with GStreamer.
Please give me some idea of how to set the duration of the video capture.

The splitmuxsink element was added in GStreamer 1.6, so make sure you have version 1.6 or later (gst-launch-1.0 --version should tell you which version is installed). It is also part of gst-plugins-good, so make sure you have that package installed. Adding a link for your reference on installing GStreamer on Linux machines.
gst-launch-1.0 -e v4l2src ! x264enc ! h264parse ! mp4mux ! splitmuxsink max-size-time=30000000000 location=test1.mp4
Also, your pipeline above is wrong; it should be something like the one below:
gst-launch-1.0 -e v4l2src num-buffers=500 ! video/x-raw,width=320,height=240 ! videoconvert ! queue ! timeoverlay ! x264enc key-int-max=10 ! h264parse ! splitmuxsink location=video%02d.mov max-size-time=10000000000 max-size-bytes=1000000
Refer to splitmuxsink; it adds the muxer for you, so you should not put mp4mux in front of it.
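As a quick sanity check on the units: max-size-time is given in nanoseconds, so the values in these pipelines correspond to 30-second and 10-second fragments. A minimal shell sketch of the conversion:

```shell
# splitmuxsink's max-size-time is expressed in nanoseconds (1 s = 1000000000 ns)
echo $((30 * 1000000000))   # 30 s, as in the original pipeline
echo $((10 * 1000000000))   # 10 s, as in the corrected pipeline
```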

Related

Gstreamer use compositor with shmsrc

I have a few pipelines that send raw video to shmsink, like below.
gst-launch-1.0 videotestsrc ! video/x-raw,format=I420,width=640,height=360,framerate=15/1 ! timeoverlay ! queue ! shmsink socket-path=/tmp/test1 shm-size=20000000 sync=true
gst-launch-1.0 videotestsrc ! video/x-raw,format=I420,width=640,height=360,framerate=15/1 ! timeoverlay ! queue ! shmsink socket-path=/tmp/test2 shm-size=20000000 sync=true
I am trying to composite the videos using the compositor element in another process, like this:
gst-launch-1.0 shmsrc socket-path=/tmp/test1 is-live=true ! queue ! video/x-raw,format=I420,width=640,height=360 ! compositor name=comp sink_1::xpos=860 ! videoconvert ! autovideosink \
shmsrc socket-path=/tmp/test2 is-live=true ! queue ! video/x-raw,format=I420,width=640,height=360 ! comp.
However, I get an assertion error that GST_FORMAT_TIME is not available, like below:
ERROR:../subprojects/gst-plugins-base/gst-libs/gst/video/gstvideoaggregator.c:2322:gst_video_aggregator_sink_event: assertion failed: (seg.format == GST_FORMAT_TIME)
Bail out! ERROR:../subprojects/gst-plugins-base/gst-libs/gst/video/gstvideoaggregator.c:2322:gst_video_aggregator_sink_event: assertion failed: (seg.format == GST_FORMAT_TIME)
[1] 268025 abort (core dumped) GST_DEBUG=4 gst-launch-1.0 shmsrc socket-path=/tmp/test1 is-live=true !
This is all implemented programmatically using gstreamer-rs (the GStreamer Rust bindings).
But I am able to reproduce the same issue running the pipelines above.
Is there a way to manually add GST_FORMAT_TIME?
I tried the videomixer element and hit the same issue there. I also tried inserting identity sync=true, but that doesn't seem to do the trick.
I appreciate any help in this!
Thanks a lot!
It seems to work with do-timestamp=true and an increased shm-size:
gst-launch-1.0 shmsrc socket-path=/tmp/test1 is-live=true do-timestamp=true ! queue ! video/x-raw,format=I420,width=640,height=360,framerate=15/1 ! compositor name=comp sink_1::xpos=860 ! videoconvert ! autovideosink shmsrc socket-path=/tmp/test2 is-live=true do-timestamp=true ! queue ! video/x-raw,format=I420,width=640,height=360,framerate=15/1 ! comp.
gst-launch-1.0 videotestsrc ! video/x-raw,format=I420,width=640,height=360,framerate=15/1 ! timeoverlay ! queue ! shmsink socket-path=/tmp/test1 shm-size=200000000
gst-launch-1.0 videotestsrc do-timestamp=true ! video/x-raw,format=I420,width=640,height=360,framerate=15/1 ! timeoverlay ! queue ! shmsink socket-path=/tmp/test2 shm-size=200000000
However, the two streams don't seem to be in sync, even when adding sync=true.
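For picking shm-size, a rough estimate helps: an I420 frame takes width × height × 1.5 bytes, so you can work out how many frames fit in the shared-memory segment. A back-of-the-envelope sketch (the frame counts are estimates, not exact shmsink accounting):

```shell
# I420 stores 12 bits (1.5 bytes) per pixel
frame_bytes=$((640 * 360 * 3 / 2))
echo "$frame_bytes"                  # bytes per 640x360 I420 frame
echo $((20000000 / frame_bytes))     # frames fitting in the original shm-size
echo $((200000000 / frame_bytes))    # frames fitting in the increased shm-size
```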

gst-launch-1.0 shows a blank screen for the YUYV frame format at image size 864x480

I am trying to acquire image data from a Logitech USB camera (C270 HD webcam) connected to an NVIDIA Jetson Nano at image size 864x480, using the GStreamer command below, but I get a blank screen (attached below), meaning it is not working even though no errors are reported.
gst-launch-1.0 -v v4l2src device="/dev/video1" ! 'video/x-raw,width=(int)864,height=(int)480' ! videoconvert ! ximagesink
Blank window created by ximagesink
When I try to capture the same image (864x480) with JPEG compression, it works:
gst-launch-1.0 -v v4l2src device="/dev/video0" ! 'image/jpeg,width=(int)864,height=(int)480' ! jpegparse ! jpegdec ! videoconvert ! fpsdisplaysink video-sink=ximagesink
I checked both pipelines from C code too, with the same result.
Please let me know if there are any issues with the first pipeline. Thanks in advance.
-RK
Maybe your camera does not support YUV at that resolution. You can check it using:
v4l2-ctl --list-formats-ext
Alternatively, you can explicitly specify a YUV format for GStreamer to use. Something like this may work:
... 'video/x-raw, width=1280, height=720, format=YUY2' ! ...
or
... videoconvert ! 'video/x-raw, width=1280, height=720, format=YUY2' ! ...
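For what it's worth, a bandwidth estimate may explain why MJPEG works while raw capture doesn't: YUY2 packs 2 bytes per pixel, so raw 864x480 frames are large, and the camera may not offer that raw mode at a usable rate over USB 2.0. A quick sketch (the 30 fps figure is an assumption):

```shell
# YUY2 (YUYV) uses 2 bytes per pixel
frame=$((864 * 480 * 2))
echo "$frame"             # bytes per raw frame
echo $((frame * 30))      # bytes per second at an assumed 30 fps (~25 MB/s)
```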

How to embed subtitles into an mp4 file using gstreamer

My Goal
I'm trying to embed subtitles into an mp4 file using the mp4mux gstreamer element.
What I've tried
The pipeline I would like to use is:
GST_DEBUG=3 gst-launch-1.0 filesrc location=sample-nosub-avc.mp4 ! qtdemux ! queue ! video/x-h264 ! mp4mux name=mux reserved-moov-update-period=1000 ! filesink location=output.mp4 filesrc location=english.srt ! subparse ! queue ! text/x-raw,format=utf8 ! mux.subtitle_0
It just demuxes a sample mp4 file to get the h.264 stream and then muxes it together with an SRT subtitle file.
The error I get is:
Setting pipeline to PAUSED ...
0:00:00.009958915 1324869 0x5624a8c7a0a0 WARN basesrc gstbasesrc.c:3600:gst_base_src_start_complete:<filesrc0> pad not activated yet
Pipeline is PREROLLING ...
0:00:00.010128080 1324869 0x5624a8c53de0 WARN basesrc gstbasesrc.c:3072:gst_base_src_loop:<filesrc1> error: Internal data stream error.
0:00:00.010129102 1324869 0x5624a8c53e40 WARN qtdemux qtdemux_types.c:239:qtdemux_type_get: unknown QuickTime node type pasp
0:00:00.010140810 1324869 0x5624a8c53de0 WARN basesrc gstbasesrc.c:3072:gst_base_src_loop:<filesrc1> error: streaming stopped, reason not-negotiated (-4)
0:00:00.010172990 1324869 0x5624a8c53e40 WARN qtdemux qtdemux.c:3237:qtdemux_parse_trex:<qtdemux0> failed to find fragment defaults for stream 1
ERROR: from element /GstPipeline:pipeline0/GstFileSrc:filesrc1: Internal data stream error.
Additional debug info:
gstbasesrc.c(3072): gst_base_src_loop (): /GstPipeline:pipeline0/GstFileSrc:filesrc1:
streaming stopped, reason not-negotiated (-4)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
My Thoughts
I believe the issue is not related to the above warning, but rather to mp4mux's incompatibility with SRT subtitles.
The reason I believe this is that other debug logs hint at it, and also that stealing the subtitles from another mp4 file and muxing them back together does work:
gst-launch-1.0 filesrc location=sample-nosub-avc.mp4 ! qtdemux ! mp4mux name=mux ! filesink location=output.mp4 filesrc location=sample-with-subs.mp4 ! qtdemux name=demux demux.subtitle_1 ! text/x-raw,format=utf8 ! queue ! mux.subtitle_0
A major catch-22 I am running into is that mp4 files don't typically support SRT subtitles, but GStreamer's subparse element doesn't support parsing mp4 subtitle formats (tx3g, ttxt, etc.), so I'm not sure how I'm meant to put it all together.
I'm very sorry for the lengthy question but I've tried many things so it was difficult to condense it. Any hints or help is appreciated. Thank you.

GStreamer split mp4 video by seconds with decodebin and splitmuxsink

With the example from the GStreamer documentation:
gst-launch-1.0 -e filesrc location=audio/mario.mp4 ! queue ! qtdemux name=demux demux.video_0 ! queue ! h264parse ! queue ! mux.video demux.audio_0 ! queue ! aacparse ! queue ! mux.audio_0 splitmuxsink location=audio/test//out_%d.mp4 max-size-time=10000000000 muxer=mp4mux name=mux
I am able to split the mp4 into several pieces. But when I change qtdemux to decodebin, things change:
gst-launch-1.0 filesrc location=audio/mario_00.mp4 ! decodebin name=demuxer ! queue ! video/x-raw ! videoconvert ! x264enc ! mux.video audio/x-raw ! queue ! audioconvert ! wavenc ! queue ! mux.audio_0 splitmuxsink location=audio/test/video%02d.mp4 max-size-time=10000000000 muxer=mp4mux name=mux
This gives me error messages:
GStreamer-CRITICAL **: 07:58:00.441: gst_element_make_from_uri: assertion 'gst_uri_is_valid (uri)' failed
WARNING: erroneous pipeline: no element "audio"
I changed audio/x-raw to demuxer.audio/x-raw, demuxer.audio, and demuxer.audio_0; they all give me error messages:
(gst-launch-1.0:11252): GStreamer-CRITICAL **: 07:59:35.828: gst_element_make_from_uri: assertion 'gst_uri_is_valid (uri)' failed
WARNING: erroneous pipeline: unexpected reference "demuxer" - ignoring
How can I split the mp4 after changing qtdemux to decodebin?

Gstreamer - too much delay while listening to RTP stream

I am new to GStreamer and I want to use it to listen to an RTP stream.
To do that, I use this pipeline:
gst-launch-1.0 udpsrc caps=application/x-rtp port=5000 ! rtpjitterbuffer ! rtpopusdepay ! opusdec ! alsasink
I don't know why, but I get some delay (~1 s) that I want to minimize.
I'm sure it is not coming from the source or the transport.
If anyone has any ideas :)
If anyone has the same problem, these are the properties that helped me:
latency on rtpjitterbuffer
buffer-time and latency-time on alsasink
And also updating GStreamer :)
Try playing with the latency setting on the jitter buffer, e.g.:
gst-launch-1.0 udpsrc caps=application/x-rtp port=5000 ! rtpjitterbuffer latency=250 ! rtpopusdepay ! opusdec ! alsasink
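One detail worth noting: the latency property of rtpjitterbuffer is in milliseconds, while alsasink's buffer-time and latency-time are in microseconds, so the same target needs different values on each element. A small conversion sketch for a 250 ms target:

```shell
target_ms=250
echo "$target_ms"             # rtpjitterbuffer latency, in milliseconds
echo $((target_ms * 1000))    # equivalent alsasink buffer-time, in microseconds
```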
