I have the following command, but somehow it's not letting me send the video over the (local) network. If I view it locally it works, but not when I try with an IP address.
gst-launch-0.10 v4l2src device=/dev/video0 ! video/x-raw-yuv,width=320,height=240 ! videobox left=-320 border-alpha=0 ! queue ! videomixer name=mix ! ffmpegcolorspace ! xvimagesink v4l2src device=/dev/video1 ! video/x-raw-yuv,width=320,height=240 ! videobox left=1 ! queue ! send-config=true ! udpsink host=127.0.0.1 port=5000
This gives me the error:
WARNING: erroneous pipeline: link without source element
But without the UDP part it works fine:
gst-launch-0.10 v4l2src device=/dev/video0 ! video/x-raw-yuv,width=320,height=240 ! videobox left=-320 border-alpha=0 ! queue ! videomixer name=mix ! ffmpegcolorspace ! xvimagesink v4l2src device=/dev/video1 ! video/x-raw-yuv,width=320,height=240 ! videobox left=1 ! queue ! mix.
My client side is this:
gst-launch udpsrc uri=udp://127.0.0.1:5000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP4V-ES, profile-level-id=(string)1, config=(string)000001b001000001b58913000001000000012000c48d88007d0a041e1463000001b24c61766335322e3132332e30, payload=(int)96, ssrc=(uint)298758266, clock-base=(uint)3097828288, seqnum-base=(uint)63478" ! rtpmp4vdepay ! ffdec_mpeg4 ! autovideosink
What am I doing wrong? Any help would be great.
The warning is the reason you are not able to send:
"queue ! send-config=true ! udpsink" is the 'link without source element'
What is send-config=true? Isn't this a property of some element that you didn't actually include in the pipeline?
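For what it's worth, a minimal sketch of what that UDP branch may have been meant to look like, assuming the goal was MPEG-4 over RTP to match the client's MP4V-ES caps (send-config is a property of rtpmp4vpay, and ffenc_mpeg4 is the gst-ffmpeg 0.10 MPEG-4 encoder; both element choices are my assumption here):
gst-launch-0.10 v4l2src device=/dev/video1 ! video/x-raw-yuv,width=320,height=240 ! \
    ffenc_mpeg4 ! rtpmp4vpay send-config=true ! udpsink host=127.0.0.1 port=5000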
Related
I built a pipeline that reads one file and sends it via RTP:
gst-launch-1.0 filesrc location="00001.mp4" ! qtdemux ! h264parse ! avdec_h264 ! x264enc ! rtph264pay ! udpsink host=127.0.0.1 port=5000
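For context, a matching receiver for that single-file sender might look like this (my sketch; payload 96 is the rtph264pay default):
gst-launch-1.0 udpsrc port=5000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink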
What does a pipeline look like that reads several files at the same time, sets an SSRC on each of them, and also sends them via RTP?
Update:
I tried to use:
gst-launch-1.0 filesrc location="/home/ml/00002.mp4" ! qtdemux ! h264parse ! avdec_h264 ! x264enc ! rtph264pay ! "application/x-rtp, ssrc=(uint)1111111" ! queue name=qsink ! udpsink host=127.0.0.1 port=5000 filesrc location="/home/ml/00001.mp4" ! qtdemux ! h264parse ! avdec_h264 ! x264enc ! rtph264pay ! "application/x-rtp, ssrc=(uint)1111112" ! qsink
but I get the error: WARNING: erroneous pipeline: no element "qsink". (Without a trailing dot, qsink is parsed as an element type rather than as a reference to the named queue, and a queue has only a single sink pad anyway.)
How can I merge the two streams?
In the end I built a working pipeline:
funnel name=f ! udpsink host=0.0.0.0 port=5000 \
    filesrc location="/var/tmp/video_folder/00003.mp4" ! qtdemux ! h264parse ! avdec_h264 ! x264enc ! rtph264pay ! "application/x-rtp, ssrc=(uint)100000, payload=(int)96" ! f.sink_0 \
    filesrc location="/var/tmp/video_folder/00002.mp4" ! qtdemux ! h264parse ! avdec_h264 ! x264enc ! rtph264pay ! "application/x-rtp, ssrc=(uint)100001, payload=(int)97" ! f.sink_1 \
    filesrc location="/var/tmp/video_folder/00001.mp4" ! qtdemux ! h264parse ! avdec_h264 ! x264enc ! rtph264pay ! "application/x-rtp, ssrc=(uint)100002, payload=(int)98" ! f.sink_2
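If the merged stream ever needs to be split apart again on the receiving side, one option (an untested sketch; the src_<pt> pad names follow rtpptdemux's convention) is to demux by the payload types assigned above:
gst-launch-1.0 udpsrc port=5000 caps="application/x-rtp" ! rtpptdemux name=d \
    d.src_96 ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink \
    d.src_97 ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink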
I want to stream an H.264 video over UDP to another PC.
I am using this pipeline to produce the stream:
videotestsrc ! video/x-raw,width=400,height=400,framerate=7/1 ! videoconvert ! x264enc ! h264parse config-interval=1 ! video/x-h264,stream-format=byte-stream,alignment=nal ! rtph264pay ! udpsink host=192.168.1.100 port=2705
I can play this on the same machine (with IP address 192.168.1.100) with this pipeline:
udpsrc port=2705 ! application/x-rtp,width=400,height=400,encoding-name=H264,payload=96,framerate=7/1 ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink
But when I try to stream it from another PC to the same machine, I get only this output and it waits forever:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Redistribute latency...
Redistribute latency...
What can be the problem here?
I found the solution. A videoconvert element is needed in the playing pipeline.
The working playing pipeline is:
udpsrc port=2705 ! application/x-rtp,width=400,height=400,encoding-name=H264,payload=96,framerate=7/1 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink
I want to transfer an .mp4 file from one terminal to another using a GStreamer pipeline.
First I tested with videotestsrc:
#server side gst-launch-1.0 -v videotestsrc pattern=ball ! video/x-raw,width=1280,height=720 ! jpegenc ! rtpjpegpay ! udpsink name=sink host=localhost port=34400 sync=false async=false
#client gst-launch-1.0 udpsrc port=34400 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, payload=(int)26" ! rtpjpegdepay ! jpegdec ! filesink location=a.mp4
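Before writing to a file it can help to confirm reception visually; a display variant of that client (my sketch) would be:
gst-launch-1.0 udpsrc port=34400 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, payload=(int)26" ! rtpjpegdepay ! jpegdec ! videoconvert ! autovideosink
Also note that jpegdec ! filesink writes raw decoded frames, so the a.mp4 produced above is not actually an MP4 container.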
But I want to read an .mp4 file from a local directory, so I used the following commands:
#server: gst-launch-1.0 -v filesrc location=<file_path>/video_test.mp4 ! qtdemux ! video/x-h264 ! rtph264pay ! udpsink host=127.0.0.1 port=9001
#client: gst-launch-1.0 -v udpsrc port=9001 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! decodebin ! videoconvert ! filesink location=a.mp4
But I am unable to capture anything on the client side; the a.mp4 file contains 0 bytes.
#server: gst-launch-1.0 -v filesrc location=//video_test.mp4 ! qtdemux name=demux \
    demux. ! h264parse config-interval=10 ! queue ! mux. \
    mpegtsmux name=mux ! rtpmp2tpay ! udpsink host=127.0.0.1 port=5002 \
    demux. ! aacparse ! queue ! mux.
#client: gst-launch-1.0 -v udpsrc port=5002 caps="application/x-rtp" ! rtpmp2tdepay ! tsparse ! filesink location=x.mp4
It's working fine for me.
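One caveat: rtpmp2tdepay ! tsparse ! filesink writes an MPEG-TS stream, so x.mp4 is really a transport stream despite its extension. To play the stream live instead of saving it, a client along these lines (my sketch) should work:
gst-launch-1.0 -v udpsrc port=5002 caps="application/x-rtp" ! rtpmp2tdepay ! tsdemux ! h264parse ! avdec_h264 ! videoconvert ! autovideosink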
I am trying to set up UDP multicast screen streaming using GStreamer. My screen-casting server should run on Windows and my client should run on Linux.
If I start the client before the server, everything is fine.
The problem is when I start the client after the server has already been launched. The video is received, but it is terribly distorted.
Server:
gst-launch-1.0 -e gdiscreencapsrc ! queue ! video/x-raw, framerate=25/1 ! videoconvert ! \
x264enc noise-reduction=10000 tune=zerolatency bitrate=2500 speed-preset="fast" byte-stream=true threads=4 key-int-max=15 intra-refresh=true ! \
h264parse ! rtph264pay config-interval=1 \
! udpsink host=224.1.1.1 port=5000 auto-multicast=true
Client:
gst-launch-1.0 -v udpsrc multicast-group=224.1.1.1 auto-multicast=true port=5000 ! application/x-rtp ! rtph264depay ! h264parse ! queue ! decodebin ! videoconvert ! autovideosink caps='video/x-raw, format=RGB'
I have already tried using dx9screencapsrc, but the behaviour is the same.
The issue is fixed only if I replace gdiscreencapsrc with videotestsrc.
If I launch the server on Linux using ximagesrc, I still have some issues, but the video improves over time.
Any help would be appreciated!
Adding cabac=false to my x264enc element fixed the issue.
gst-launch-1.0 -v gdiscreencapsrc ! queue ! video/x-raw,framerate=60/1 ! decodebin ! videoscale ! videoconvert ! \
x264enc cabac=false tune=zerolatency bitrate=4000 speed-preset="fast" ! \
h264parse ! rtph264pay config-interval=-1 \
! udpsink host=224.1.1.1 port=5000 auto-multicast=true sync=false
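For reference, the H.264 baseline profile does not use CABAC, so an alternative that should have the same effect (a sketch I have not verified on this setup) is to constrain the encoder output to baseline profile, which falls back to CAVLC:
x264enc tune=zerolatency bitrate=4000 speed-preset="fast" ! video/x-h264,profile=baseline ! \
    h264parse ! rtph264pay config-interval=-1 ! udpsink host=224.1.1.1 port=5000 auto-multicast=true sync=false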
For bandwidth reasons I've modified the slice-header spacing to use more slices per I-frame; this causes tearing on the receiving end.
The problem looks as if individual slices are getting decoded without an entire I-frame being buffered up for omxh264dec. This is a bit strange, as the Tegra decoder is supposed to work only on a frame level.
Perhaps this problem can be alleviated by correct synchronization of GstBuffers on the receiving end?
Repro case: (Jetson TX2)
# Sender:
gst-launch-1.0 nvcamerasrc ! 'video/x-raw(memory:NVMM), width=(int)1280, height=(int)720, format=(string)I420, framerate=(fraction)60/1' ! nvvidconv flip-method=0 ! 'video/x-raw(memory:NVMM), format=(string)I420' ! omxh264enc iframeinterval=1 bit-packetization=TRUE slice-header-spacing=450000 control-rate=2 preset-level=0 profile=1 qp-range=-1,-1:10,10:-1,-1 ! 'video/x-h264, stream-format=(string)byte-stream' ! h264parse ! rtph264pay mtu=60000 ! udpsink host=127.0.0.1 port=5000
# Receiver:
gst-launch-1.0 udpsrc port=5000 ! "application/x-rtp,encoding-name=H264,payload=96" ! rtph264depay ! h264parse ! omxh264dec ! videoconvert ! xvimagesink async=TRUE sync=TRUE
As Florian Zwoch suggested, an rtpjitterbuffer solves this issue.
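For reference, a receiver with the jitter buffer added might look like this (the latency value is just an example):
gst-launch-1.0 udpsrc port=5000 ! "application/x-rtp,encoding-name=H264,payload=96" ! rtpjitterbuffer latency=200 ! \
    rtph264depay ! h264parse ! omxh264dec ! videoconvert ! xvimagesink async=TRUE sync=TRUE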