Stream webcam video with GStreamer 1.0 over UDP to VLC player

I'm trying to stream webcam video with GStreamer 1.0 over UDP. This is the command I used:
gst-launch-1.0 -v v4l2src device=/dev/video1 ! "image/jpeg,width=1280,height=720,framerate=30/1" ! rtpjpegpay ! udpsink host=10.234.2.65 port=5000
I got receiving via VLC player working with GStreamer 0.10 using an SDP file, as explained here.
How do I achieve the same with gstreamer 1.0?
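For later readers: with GStreamer 1.0 the sending pipeline stays essentially the same, and VLC still needs an SDP file describing the RTP stream. The following is a minimal sketch, assuming the host and port from the question and the static RTP/JPEG payload type 26:

```shell
# Sender (GStreamer 1.0): MJPEG frames payloaded as RTP/JPEG over UDP
gst-launch-1.0 -v v4l2src device=/dev/video1 \
  ! image/jpeg,width=1280,height=720,framerate=30/1 \
  ! rtpjpegpay ! udpsink host=10.234.2.65 port=5000

# On the receiving machine (10.234.2.65), describe the stream in an
# SDP file and open it with VLC:
cat > stream.sdp <<'EOF'
v=0
m=video 5000 RTP/AVP 26
c=IN IP4 10.234.2.65
a=rtpmap:26 JPEG/90000
EOF
vlc stream.sdp
```

Payload type 26 is the static RTP assignment for JPEG, with a fixed 90000 Hz clock, so no dynamic payload mapping should be needed.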

Related

How to create a video stream with Gstreamer without RTP?

I am trying to create a simple UDP video stream with GStreamer 1.0.
The problem is that, for my project, I need a plain (vanilla) UDP stream, but almost all the tutorials I could find use RTP.
So I would like to translate this simple stream:
Player:
gst-launch-1.0 udpsrc port=5000 ! application/x-rtp, encoding-name=H264,payload=96 ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink
Server:
gst-launch-1.0 v4l2src ! video/x-raw,width=640,height=480 ! x264enc ! h264parse ! rtph264pay ! udpsink host=127.0.0.1 port=5000
Can someone point me in the right direction on how to translate this simple example to UDP only?
In the pipeline you stated above, I do not see any use of RTP at the sender's side. Ideally rtpjpegpay should be used at the sender's side, which is then depayloaded at the receiver using rtpjpegdepay.
Have you tried the same pipelines without RTP? udpsink would complain that the packet size is larger than its buffer. You need an rtp*pay element to fragment the video stream into RTP packets.
If you do not need RTP, try sending the stream directly, but with a limit on the buffer size at the udpsink. This can also result in increased delay in rendering the video, some packets getting lost, etc. Try experimenting with different values for the buffer/packet sizes on udpsink. Unfortunately, udpsink does not provide direct control for configuring these sizes, so you may have to find other ways.
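One common way to get a plain UDP stream without RTP is to packetize into an MPEG-TS container instead: mpegtsmux emits small, fixed-size transport packets that fit comfortably into UDP datagrams. A sketch translating the pipelines from the question (element availability depends on your GStreamer plugin set):

```shell
# Server: raw video -> H.264 -> MPEG-TS -> plain UDP (no RTP)
gst-launch-1.0 v4l2src ! video/x-raw,width=640,height=480 \
  ! x264enc tune=zerolatency ! h264parse ! mpegtsmux \
  ! udpsink host=127.0.0.1 port=5000

# Player: plain UDP -> demux MPEG-TS -> decode H.264
gst-launch-1.0 udpsrc port=5000 ! tsdemux \
  ! h264parse ! avdec_h264 ! videoconvert ! autovideosink
```

This is also the format VLC expects when you open a `udp://@:5000` URL, which can be handy for testing.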

Streaming MP4 Video File on Gstreamer

I am working with GStreamer for the first time and trying to stream an MP4 video file from a server to a client using GStreamer (RTP and UDP).
The command lines I am trying to use:
On Server Side:
gst-launch-1.0 -v filesrc location = file_name.mp4 ! decodebin ! x264enc ! rtph264pay ! udpsink host=192.1XX.XX.XX port=9001
On Client Side:
gst-launch-1.0 -v udpsrc port=9001 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtpstreamdepay ! decodebin ! videoconvert ! autovideosink
I am able to stream the video successfully. But I don't want the decodebin and x264enc operations on the server side.
So I removed these operations and used this command line on the server side:
gst-launch-1.0 -v filesrc location=file_name.MP4 ! rtpstreampay ! udpsink host=192.1XX.XX.XX port=9001
With this I was not able to stream the video.
Could anybody explain why we need the decode and encode operations in this scenario while sending the data? Is there any way to send the data without these operations?
Thanks.
Decoding and re-encoding is not necessary. The element you are after is a demultiplexer; in this case, qtdemux.
Here is a snippet from its documentation:
Demultiplex a QuickTime file into audio and video streams. ISO base media file format support (mp4, 3gpp, qt, mj2).
It is enough to open the container with the demultiplexer and read the encoded video stream directly from it. MP4 containers usually contain H.264-encoded video, so your server-side pipeline simplifies to:
gst-launch-1.0 -v filesrc location=file_name.mp4 ! qtdemux ! video/x-h264 ! rtph264pay ! udpsink host=192.1XX.XX.XX port=9001
Make sure you know the encoding of the video you are trying to stream. With VLC you can get the codec information:
For H264:
The following pipeline works:
filesrc location=<video location>.mp4 ! qtdemux ! h264parse config-interval=-1 ! rtph264pay pt=96 name=pay0
And for mp4v:
The following pipeline works:
filesrc location=<video location>.mp4 ! qtdemux ! mpeg4videoparse ! rtpmp4vpay pt=96 name=pay0
Also, the examples above only work if you want to stream the video as-is. A different pipeline is needed if you want to change the encoding or any other video property.
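For completeness: with rtph264pay on the server, the matching depayloader on the client is rtph264depay (rtpstreamdepay pairs with rtpstreampay, which adds an extra framing layer on top of RTP). A receiving pipeline matching the qtdemux sender above might look like this sketch, assuming the default dynamic payload type 96:

```shell
gst-launch-1.0 -v udpsrc port=9001 \
  caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" \
  ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink
```

The caps on udpsrc are required because plain UDP carries no stream description; the receiver has to be told what the RTP packets contain.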

IP camera with RTSP on the web, RED5 and ffmpeg

I have an IP camera with the RTSP protocol and I want to stream it on the web using Flash video. I know I can use VLC, but I do not want to use that.
I installed Red5 and ffmpeg to convert RTSP to RTMP:
ffmpeg -i "rtsp://46.13.85.43:8020/ch0.h264" -f flv -r 25 -s 640x480 -an "rtmp://localhost/live"
and the result is: UDP timeout, retrying with TCP
Any idea where the problem is?
You have to specify that the TCP transport is used; at least that helped me. Please add the parameter -rtsp_transport tcp to your ffmpeg command.
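With that flag added, the command from the question becomes the following. Note that -rtsp_transport is an input option, so it must come before the -i it applies to:

```shell
ffmpeg -rtsp_transport tcp -i "rtsp://46.13.85.43:8020/ch0.h264" \
  -f flv -r 25 -s 640x480 -an "rtmp://localhost/live"
```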

How to mux two x-rtp streams using gstreamer?

I want to stream audio and video through the same port using udpsink. Is this possible?
I tried this pipeline :
gst-launch \
rtpmux name=mux ! udpsink clients=[client IP]:[client port] sync=false async=false -v \
videotestsrc ! ffenc_h264 ! rtph264pay ! mux.sink_0 \
audiotestsrc is-live=true ! mad ! rtpmpapay ! mux.sink_1
But it's not working. Any suggestions?
rtpmux only works if the two streams have the exact same clock rate. rtph264pay uses a clock rate of 90000, which I can almost guarantee your audio stream does not match.
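If the goal is simply audio and video on one UDP port, rather than RTP specifically, one workaround is to mux both into an MPEG-TS container instead of using rtpmux. A sketch using GStreamer 1.0 element names (the question uses 0.10 elements such as ffenc_h264; encoder availability, e.g. avenc_aac, depends on your plugin set):

```shell
gst-launch-1.0 mpegtsmux name=mux ! udpsink host=127.0.0.1 port=5000 \
  videotestsrc is-live=true ! x264enc tune=zerolatency ! h264parse ! mux. \
  audiotestsrc is-live=true ! audioconvert ! avenc_aac ! aacparse ! mux.
```

MPEG-TS carries its own timing per elementary stream, so the clock-rate restriction of rtpmux does not apply.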

Network streaming using Gstreamer

I tried the following basic pipelines to play audio over a network:
Server:
gst-launch-0.10 -v audiotestsrc ! udpsink host=127.0.0.1 port=1234
Client:
gst-launch-0.10 -v udpsrc port=1234 ! fakesink dump=1
But I get no output, although the pipeline gets set to PLAYING state.
I looked at other questions such as this one : Webcam streaming using gstreamer over UDP
Although it's the same pipeline there too, it doesn't work for me.
What am I doing wrong?
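One thing worth trying is payloading the audio as RTP, so the receiver knows how to interpret the incoming datagrams instead of seeing opaque raw buffers. A sketch with 0.10 element names to match the question (the caps values are assumptions for the audiotestsrc defaults of mono 16-bit audio at 44.1 kHz):

```shell
# Server: raw audio -> RTP L16 payload -> UDP
gst-launch-0.10 -v audiotestsrc ! audioconvert ! rtpL16pay \
  ! udpsink host=127.0.0.1 port=1234

# Client: UDP -> depayload -> play
gst-launch-0.10 -v udpsrc port=1234 \
  caps="application/x-rtp, media=(string)audio, clock-rate=(int)44100, encoding-name=(string)L16, channels=(int)1, payload=(int)96" \
  ! rtpL16depay ! audioconvert ! autoaudiosink
```

As with the video examples above, plain UDP carries no stream description, so the receiver-side caps are what tell the pipeline how to parse the packets.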