GStreamer: sending multiple files via RTP over UDP

I built a pipeline that reads one file and sends it via RTP:
gst-launch-1.0 filesrc location="00001.mp4" ! qtdemux ! h264parse ! avdec_h264 ! x264enc ! rtph264pay ! udpsink host=127.0.0.1 port=5000
What would a pipeline look like that reads several files at the same time, sets a distinct SSRC on each, and sends them all via RTP?
UPD
I tried:
gst-launch-1.0 filesrc location="/home/ml/00002.mp4" ! qtdemux ! h264parse ! avdec_h264 ! x264enc ! rtph264pay ! "application/x-rtp, ssrc=(uint)1111111" ! queue name=qsink ! udpsink host=127.0.0.1 port=5000 filesrc location="/home/ml/00001.mp4" ! qtdemux ! h264parse ! avdec_h264 ! x264enc ! rtph264pay ! "application/x-rtp, ssrc=(uint)1111112" ! qsink
but I get the error: WARNING: erroneous pipeline: no element "qsink"
How can I merge two streams?

In the end I built a working pipeline. A queue cannot merge two branches (it has a single sink pad, and gst-launch references a named element as "qsink." with a trailing dot anyway), but funnel has request sink pads, one per input:
funnel name=f ! udpsink host=0.0.0.0 port=5000 \
filesrc location="/var/tmp/video_folder/00003.mp4" ! qtdemux ! h264parse ! avdec_h264 ! x264enc ! rtph264pay ! "application/x-rtp, ssrc=(uint)100000, payload=(int)96" ! f.sink_0 \
filesrc location="/var/tmp/video_folder/00002.mp4" ! qtdemux ! h264parse ! avdec_h264 ! x264enc ! rtph264pay ! "application/x-rtp, ssrc=(uint)100001, payload=(int)97" ! f.sink_1 \
filesrc location="/var/tmp/video_folder/00001.mp4" ! qtdemux ! h264parse ! avdec_h264 ! x264enc ! rtph264pay ! "application/x-rtp, ssrc=(uint)100002, payload=(int)98" ! f.sink_2
(The caps strings are quoted so the shell does not split them at the spaces.)
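On the receiving side, the multiplexed streams can in principle be separated again by payload type with rtpptdemux, which exposes one src_%u pad per payload type it sees. This is an untested sketch matched to the sender above; the pad name d.src_97 (for the second file's payload=97) and the caps are assumptions:

```shell
# Hypothetical receiver: split the multiplexed RTP by payload type,
# then depayload and decode the branch carrying payload type 97.
gst-launch-1.0 udpsrc port=5000 \
    caps="application/x-rtp, media=(string)video, clock-rate=(int)90000" ! \
    rtpptdemux name=d \
    d.src_97 ! queue ! rtph264depay ! h264parse ! avdec_h264 ! \
    videoconvert ! autovideosink
```

Each additional stream would get its own d.src_%u branch; packets with payload types that have no linked branch are dropped.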

Related

GStreamer can't play stream from other PC: h264 -> rtp -> udp

I want to stream an h264 video over UDP to another PC.
I am using this pipeline to produce the stream:
videotestsrc ! video/x-raw,width=400,height=400,framerate=7/1 ! videoconvert ! x264enc ! h264parse config-interval=1 ! video/x-h264,stream-format=byte-stream,alignment=nal ! rtph264pay ! udpsink host=192.168.1.100 port=2705
I can play this on the same machine (with ip address 192.168.1.100) with this pipeline:
udpsrc port=2705 ! application/x-rtp,width=400,height=400,encoding-name=H264,payload=96,framerate=7/1 ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink
But when I try to stream it from another PC to the same machine, I get only this output and it waits forever:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Redistribute latency...
Redistribute latency...
What can be the problem here?
I found the solution: a videoconvert element is needed in the playback pipeline, so that the decoder's output format can be converted to one the video sink accepts.
The working playback pipeline is:
udpsrc port=2705 ! application/x-rtp,width=400,height=400,encoding-name=H264,payload=96,framerate=7/1 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink

Implement GStreamer pipeline using UDP socket (command line)

I want to transfer an .mp4 file from one terminal to another using a GStreamer pipeline.
First I tested with videotestsrc:
# server side
gst-launch-1.0 -v videotestsrc pattern=ball ! video/x-raw,width=1280,height=720 ! jpegenc ! rtpjpegpay ! udpsink name=sink host=localhost port=34400 sync=false async=false
# client
gst-launch-1.0 udpsrc port=34400 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, payload=(int)26" ! rtpjpegdepay ! jpegdec ! filesink location=a.mp4
but I want to read an .mp4 file from a local directory, so I used the following commands:
# server
gst-launch-1.0 -v filesrc location=<file_path>/video_test.mp4 ! qtdemux ! video/x-h264 ! rtph264pay ! udpsink host=127.0.0.1 port=9001
# client
gst-launch-1.0 -v udpsrc port=9001 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! decodebin ! videoconvert ! filesink location=a.mp4
but I am unable to capture anything on the client side; the a.mp4 file contains 0 bytes.
# server
gst-launch-1.0 -v filesrc location=//video_test.mp4 ! qtdemux name=demux demux. ! h264parse config-interval=10 ! queue ! mux. mpegtsmux name=mux ! rtpmp2tpay ! udpsink host=127.0.0.1 port=5002 demux. ! aacparse ! queue ! mux.
# client
gst-launch-1.0 -v udpsrc port=5002 caps="application/x-rtp" ! rtpmp2tdepay ! tsparse ! filesink location=x.mp4
It's working fine for me. (Note that the captured file is an MPEG transport stream, despite the .mp4 extension.)
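Since the client writes a raw MPEG-TS stream, a playback check should demux it as such. A sketch, untested; the element names are standard GStreamer 1.x:

```shell
# Hypothetical playback of the captured transport stream.
gst-launch-1.0 filesrc location=x.mp4 ! tsdemux ! h264parse ! \
    avdec_h264 ! videoconvert ! autovideosink
```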

GStreamer v1.0 UDP Multicast streaming not properly decoded if client starts after server

I am trying to make UDP Multicast screen streaming using GStreamer. My screen casting server should run on Windows and my client should run on Linux.
If I start the client before the server, everything is fine.
The problem is when I start the client and the server had already been launched. The video is received, but it is terribly distorted.
Server:
gst-launch-1.0 -e gdiscreencapsrc ! queue ! video/x-raw, framerate=25/1 ! videoconvert ! \
x264enc noise-reduction=10000 tune=zerolatency bitrate=2500 speed-preset="fast" byte-stream=true threads=4 key-int-max=15 intra-refresh=true ! \
h264parse ! rtph264pay config-interval=1 \
! udpsink host=224.1.1.1 port=5000 auto-multicast=true
Client:
gst-launch-1.0 -v udpsrc multicast-group=224.1.1.1 auto-multicast=true port=5000 ! application/x-rtp ! rtph264depay ! h264parse ! queue ! decodebin ! videoconvert ! autovideosink caps='video/x-raw, format=RGB'
I have already tried using dx9screencapsrc, but the behaviour is the same.
The issue is fixed only if I replace gdiscreencapsrc with videotestsrc.
If I launch the server on Linux, using ximagesrc, I still have some issues, but the video is improving over time.
Any help would be appreciated!
Adding cabac=false to my x264enc element fixed the issue.
gst-launch-1.0 -v gdiscreencapsrc ! queue ! video/x-raw,framerate=60/1 ! decodebin ! videoscale ! videoconvert ! \
x264enc cabac=false tune=zerolatency bitrate=4000 speed-preset="fast" ! \
h264parse ! rtph264pay config-interval=-1 \
! udpsink host=224.1.1.1 port=5000 auto-multicast=true sync=false

Why can I stream h264 encoded video from webcam to BOTH display and file, but NOT raw video?

I want to stream raw video from a Logitech C920 webcam and while both displaying and saving the video to file using GStreamer 1.0.
This works if I stream h264 encoded video from the camera (the camera provides hardware encoded h264), but it fails if I stream raw video from the camera. However, if I only display, or only save to file, streaming raw video works.
Why does it work with a h264 video stream but not with a raw video stream?
h264 encoded video stream from camera to BOTH display and file (WORKS):
gst-launch-1.0 -v v4l2src device=/dev/video0 \
! video/x-h264,width=640,height=480,framerate=15/1 ! tee name=t \
t. ! queue ! h264parse ! avdec_h264 ! xvimagesink sync=false \
t. ! queue ! h264parse ! matroskamux \
! filesink location='h264_dual.mkv' sync=false
raw video stream from camera to ONLY display (WORKS):
gst-launch-1.0 -v v4l2src device=/dev/video0 \
! video/x-raw,format=YUY2,width=640,height=480,framerate=15/1 \
! xvimagesink sync=false
raw video stream from camera to ONLY file (WORKS):
gst-launch-1.0 -v v4l2src device=/dev/video0 \
! video/x-raw,format=YUY2,width=640,height=480,framerate=15/1 \
! videoconvert ! x264enc ! matroskamux \
! filesink location='raw_single.mkv' sync=false
raw video stream from camera to BOTH display and file (FAILS):
gst-launch-1.0 -v v4l2src device=/dev/video0 \
! video/x-raw,format=YUY2,width=640,height=480,framerate=15/1 \
! tee name=t \
t. ! queue ! xvimagesink sync=false \
t. ! queue ! videoconvert ! x264enc ! h264parse ! matroskamux \
! filesink location='raw_dual.mkv' sync=false
The last command (raw video to both display and file) fails without any warnings or errors. The gst-launch terminal output is exactly the same as when only writing to file. The xvimage window appears and displays an image from the camera, but the image does not change (i.e. it is frozen). A zero bytes file appears too.
I have tried multiple versions of the above commands, but I think those are the minimal commands that can reproduce the problem.
Does anyone understand what I am doing wrong?
Streaming raw video from a webcam (not specific to the C920) to both a display and an h.264-encoded file can be done. The x264enc property tune needs to be set to zerolatency: by default x264enc buffers a large number of frames for lookahead, which stalls its branch of the tee, and once the queues fill the whole pipeline deadlocks.
h.264 example:
gst-launch-1.0 -v v4l2src device=/dev/video0 \
! video/x-raw,format=YUY2,width=640,height=480,framerate=15/1 \
! tee name=t t. ! queue ! xvimagesink sync=false t. ! queue ! \
videoconvert ! x264enc tune=zerolatency ! h264parse ! \
matroskamux ! filesink location='raw_dual.mkv' sync=false
Alternatively, one can skip h.264 altogether and encode to Theora or VP8 instead.
theora example:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! \
video/x-raw,format=YUY2,width=640,height=480,framerate=15/1 ! \
tee name=t t. ! queue ! xvimagesink sync=false t. ! queue ! \
videoconvert ! theoraenc ! theoraparse ! \
matroskamux ! filesink location='raw_dual.mkv' sync=false
vp8 example:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! \
video/x-raw,format=YUY2,width=640,height=480,framerate=15/1 ! \
tee name=t t. ! queue ! xvimagesink sync=false t. ! queue ! \
videoconvert ! vp8enc ! \
matroskamux ! filesink location='raw_dual.mkv' sync=false
Thanks a lot to Jan Spurny and Tim.

GStreamer udpsink output issue

I have the following command, but somehow it's not letting me send over the (local) network. If I view it locally it works, but not when I try with an IP address.
gst-launch-0.10 v4l2src device=/dev/video0 ! video/x-raw-yuv,width=320,height=240 ! videobox left=-320 border-alpha=0 ! queue ! videomixer name=mix ! ffmpegcolorspace ! xvimagesink v4l2src device=/dev/video1 ! video/x-raw-yuv,width=320,height=240 ! videobox left=1 ! queue ! send-config=true ! udpsink host=127.0.0.1 port=5000
This gives me the error:
WARNING: erroneous pipeline: link without source element
but without the UDP part it works fine:
gst-launch-0.10 v4l2src device=/dev/video0 ! video/x-raw-yuv,width=320,height=240 ! videobox left=-320 border-alpha=0 ! queue ! videomixer name=mix ! ffmpegcolorspace ! xvimagesink v4l2src device=/dev/video1 ! video/x-raw-yuv,width=320,height=240 ! videobox left=1 ! queue ! mix.
my client side is this:
gst-launch udpsrc uri=udp://127.0.0.1:5000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP4V-ES, profile-level-id=(string)1, config=(string)000001b001000001b58913000001000000012000c48d88007d0a041e1463000001b24c61766335322e3132332e30, payload=(int)96, ssrc=(uint)298758266, clock-base=(uint)3097828288, seqnum-base=(uint)63478" ! rtpmp4vdepay ! ffdec_mpeg4 ! autovideosink
What am I doing wrong? Any help would be great.
The warning tells you why you are not able to send:
"queue ! send-config=true ! udpsink" is the 'link without source element'.
What is send-config=true? That is a property, not an element — presumably it belongs to some element you didn't type there.
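For what it's worth, send-config is a property of the rtpmp4vpay payloader, and your client caps expect MP4V-ES, so the sending branch presumably needs an MPEG-4 encoder and the payloader spelled out. A sketch against GStreamer 0.10, untested; ffenc_mpeg4 is the 0.10 MPEG-4 encoder element:

```shell
# Hypothetical fix: encode the second camera and payload it as MP4V-ES,
# with send-config=true attached to rtpmp4vpay rather than floating alone.
gst-launch-0.10 v4l2src device=/dev/video1 ! video/x-raw-yuv,width=320,height=240 ! \
    queue ! ffenc_mpeg4 ! rtpmp4vpay send-config=true ! \
    udpsink host=127.0.0.1 port=5000
```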