GStreamer - rtph264pay element - UDP

I need to figure out the mechanism behind the rtph264pay element. I have retrieved its output, but I don't understand how to read it. Where is the beginning of a frame?
If I replace filesink with udpsink, frames are sent over the network. How does udpsink determine where a frame begins and ends?
c:\gstreamer\1.0\x86_64\bin\gst-launch-1.0.exe filesrc location=%string:\=/% ^
! decodebin ! x264enc ^
! "video/x-h264, stream-format=(string)avc, alignment=(string)au, profile=(string)baseline" ^
! h264parse ^
! rtph264pay ^
! filesink ^
location=video.txt

The rtph264pay element takes in H264 data as input and turns it into RTP packets. RTP is a standard format used to send many types of data over a network, including video. RTP is formally outlined in RFC 3550, and specific information on how it is used with H264 can be found in RFC 6184.
How does udpsink determine where a frame begins and ends?
udpsink doesn't need to know where frames begin and end; it simply sends the RTP packets created by rtph264pay over UDP. It doesn't have any special understanding of the RTP format. It is the receiver of that data that needs to understand RTP in order to depayload it and decode it.
Since RTP is a format designed to be sent over a network, it's somewhat unusual to save the packets to a file with filesink.
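For completeness, a receiving pipeline for the udpsink case would depayload and decode the RTP stream before display. A minimal sketch, assuming the sender replaces filesink with udpsink host=127.0.0.1 port=5000 and keeps rtph264pay's default payload type of 96:
gst-launch-1.0 udpsrc port=5000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink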

Related

How to create a video stream with GStreamer without RTP?

I am trying to create a simple UDP video stream with GStreamer 1.0.
The problem is that for the purposes of my project I need a vanilla UDP stream, but almost all the tutorials I was able to find have RTP enabled.
So I would like to translate this simple stream:
Player:
gst-launch-1.0 udpsrc port=5000 ! application/x-rtp, encoding-name=H264,payload=96 ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink
Server:
gst-launch-1.0 v4l2src ! video/x-raw,width=640,height=480 ! x264enc ! h264parse ! rtph264pay ! udpsink host=127.0.0.1 port=5000
Can someone point me in the right direction on how to translate this simple example in UDP only?
In the pipeline you stated above, RTP is already in use at the sender's side: rtph264pay payloads the encoded video, which is then depayloaded at the receiver using rtph264depay.
Have you tried the same pipelines without RTP? udpsink would complain that the packet size is larger than its buffer. You need rtpXpay to fragment the video stream into RTP packets.
In case you do not need RTP, try sending the stream directly but with a limit on the buffer size at the udpsink. This can also result in increased delay in rendering the video, some packets getting lost, etc. Try experimenting with different values for the buffer/packet sizes on udpsink. Unfortunately udpsink does not provide direct control over configuring these sizes, so you may have to find other ways.
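If you really want to avoid RTP, one commonly used alternative (a hedged sketch, not part of the answer above; the element choices and addresses are assumptions) is to mux the H264 stream into MPEG-TS, whose small fixed-size packets fit comfortably into UDP datagrams:
Server:
gst-launch-1.0 v4l2src ! video/x-raw,width=640,height=480 ! videoconvert ! x264enc tune=zerolatency ! h264parse ! mpegtsmux ! udpsink host=127.0.0.1 port=5000
Player:
gst-launch-1.0 udpsrc port=5000 ! tsdemux ! h264parse ! avdec_h264 ! autovideosink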

Streaming MP4 Video File on GStreamer

I am working with GStreamer for the first time and trying to stream an MP4 video file from a server to a client using GStreamer (RTP and UDP).
The command lines I am trying to use:
On Server Side:
gst-launch-1.0 -v filesrc location = file_name.mp4 ! decodebin ! x264enc ! rtph264pay ! udpsink host=192.1XX.XX.XX port=9001
On Client Side:
gst-launch-1.0 -v udpsrc port=9001 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtpstreamdepay ! decodebin ! videoconvert ! autovideosink
I am able to stream the video successfully. But I don't want the decodebin and x264enc operations on the server side.
So I removed these operations and used this command line on the server side:
gst-launch-1.0 -v filesrc location=file_name.MP4 ! rtpstreampay ! udpsink host=192.1XX.XX.XX port=9001
With this I was not able to stream the video.
Could anybody tell me why the decode and encode operations are needed in this scenario while sending the data?
Is there any way to send the data without these operations?
Thanks.
Decoding and re-encoding is not necessary. The element you are after is a demultiplexer, in this case qtdemux.
Here is a clip from its documentation:
Demultiplex a QuickTime file into audio and video streams ISO base
media file format support (mp4, 3gpp, qt, mj2)
It is enough to demultiplex the container and read the encoded video stream directly from it. mp4 containers usually contain H.264 encoded video, so your server-side pipeline would simplify into:
gst-launch-1.0 -v filesrc location = file_name.mp4 ! qtdemux ! video/x-h264 ! rtph264pay ! udpsink host=192.1XX.XX.XX port=9001
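Note that with rtph264pay on the server, the client should use rtph264depay rather than rtpstreamdepay (rtpstreamdepay pairs with rtpstreampay, which adds RFC 4571 framing for stream transports). A matching client sketch, assuming rtph264pay's default payload type of 96:
gst-launch-1.0 -v udpsrc port=9001 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink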
Make sure you know the encoding of the video you are trying to stream. With VLC you can get the codec information:
For H264:
The following pipeline works:
filesrc location=<video location>.mp4 ! qtdemux ! h264parse config-interval=-1 ! rtph264pay pt=96 name=pay0
And for mp4v:
The following pipeline works:
filesrc location=<video location>.mp4 ! qtdemux ! mpeg4videoparse ! rtpmp4vpay pt=96 name=pay0
Note that the examples above only work if you want to stream the video as-is. A different pipeline is needed if you want to change the encoding or any other video property.
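If you do want to re-encode, a transcoding variant might look like the following sketch (not from the answer above; it decodes and encodes again before payloading):
gst-launch-1.0 -v filesrc location=file_name.mp4 ! qtdemux ! h264parse ! avdec_h264 ! videoconvert ! x264enc ! rtph264pay ! udpsink host=192.1XX.XX.XX port=9001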

How to mux two x-rtp streams using gstreamer?

I want to stream audio and video through the same port using udpsink. Is this possible?
I tried this pipeline :
gst-launch \
rtpmux name=mux ! udpsink clients=[client IP]:[client port] sync=false async=false -v \
videotestsrc ! ffenc_h264 ! rtph264pay ! mux.sink_0 \
audiotestsrc is-live=true ! mad ! rtpmpapay ! mux.sink_1
But it's not working. Any suggestions?
rtpmux only works if the two streams have the exact same clock rate. rtph264pay uses a clock rate of 90000, which I can almost guarantee your audio stream does not match.
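You can check the clock rate each payloader negotiates by running its branch with -v and a fakesink, then reading the clock-rate field from the printed caps. A quick sketch (1.0 syntax shown; 0.10 is similar):
gst-launch-1.0 -v videotestsrc num-buffers=100 ! x264enc ! rtph264pay ! fakesink
Doing the same for the audio branch shows whether the two rates match.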

GStreamer pipeline for decklinksrc video capture card with udpsrc and udpsink using RTP

Hi and thank you for reading,
I'm having trouble figuring out what my gstreamer pipeline should look like to send my Blackmagic decklinksrc video from one Ubuntu machine to another on the same network using RTP and UDP.
To view the video locally I use this pipeline:
gst-launch-0.10 decklinksrc mode=11 connection=0 ! ffmpegcolorspace ! xvimagesink sync=false
This works perfectly fine on both machines with my local setups. Note that mode 11 is 1080i at 59.94 FPS.
Here is my rough pipeline for both the host machine and the client:
Client (started first):
gst-launch-0.10 udpsrc port=6100 caps="application/x-rtp, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:2,width=(string)1920, height=(string)1080,colorimetry=(string)BT709-2, depth=(string)8" ! rtpvrawdepay ! xvimagesink
Host:
gst-launch-0.10 decklinksrc mode=11 connection=0 ! tee ! queue ! ffmpegcolorspace ! rtpvrawpay ! udpsink host=xx.xx.xxx.xx port=6100 tee0. ! queue ! xvimagesink sync=false
I've tried various parameters and I've tried multiple ports with no luck. Each time both machines output:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Any information to help explain the different elements (e.g. rtpvrawpay) is appreciated. If you know what I'm doing wrong, even better!
Thanks,
Randy
Try this:
Host :
gst-launch-0.10 decklinksrc mode=11 connection=0 ! videorate ! videoscale ! ffmpegcolorspace ! "video/x-raw-yuv, format=(fourcc)I420, width=(int)1920, height=(int)1080, framerate=(fraction)25/1" ! tee ! queue ! ffmpegcolorspace ! rtpvrawpay ! udpsink host=192.168.40.103 port=6100 tee0. ! queue ! xvimagesink sync=false -v
client:
gst-launch-0.10 udpsrc port=6100 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:0, depth=(string)8, width=(string)1920, height=(string)1080" ! rtpvrawdepay ! xvimagesink
If it doesn't work, you can add the -v option to each pipeline to see what format is being used between each element.
I tried with videotestsrc on my computer and UDP works, so I am not sure whether this will work with decklinksrc.
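For reference, the local videotestsrc test mentioned above could look like this (a sketch only; the host, port, and caps values are assumptions mirroring the pipelines above):
Host:
gst-launch-0.10 videotestsrc ! "video/x-raw-yuv, format=(fourcc)I420, width=(int)1920, height=(int)1080, framerate=(fraction)25/1" ! rtpvrawpay ! udpsink host=127.0.0.1 port=6100
Client:
gst-launch-0.10 udpsrc port=6100 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:0, depth=(string)8, width=(string)1920, height=(string)1080" ! rtpvrawdepay ! xvimagesink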

Network streaming using GStreamer

I tried the following basic pipelines to play audio over a network:
Server:
gst-launch-0.10 -v audiotestsrc ! udpsink host=127.0.0.1 port=1234
Client:
gst-launch-0.10 -v udpsrc port=1234 ! fakesink dump=1
But I get no output, although the pipeline gets set to the PLAYING state.
I looked at other questions such as this one: Webcam streaming using gstreamer over UDP
Although it's the same pipeline there too, it doesn't work for me.
What am I doing wrong?