I am streaming a camera from an i.MX6Q platform to a remote display that has an NVIDIA GPU. On the remote display I use the GStreamer pipeline below to receive the stream; it uses the NVIDIA decoder plugin.
gst-launch-1.0 -e -v udpsrc port=5000 ! application/x-rtp,encoding-name=H264,payload=96 ! rtpjitterbuffer latency=0 drop-on-latency=TRUE ! rtph264depay ! video/x-h264 ! h264parse ! nvdec ! glimagesink sync=false
With this pipeline we get ~116 ms to ~175 ms of latency on the remote display. How can we reduce the streaming latency? Is there a GStreamer plugin, or any other approach, that can help?
Thanks
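A sketch of sender-side tweaks that often reduce end-to-end latency, assuming for illustration a software x264enc sender (on the i.MX6Q you would substitute the hardware encoder you actually use); <display-ip>, the resolution, and the framerate are placeholders:
# x264enc and <display-ip> are illustrative placeholders, not the board's actual encoder/address
gst-launch-1.0 -e v4l2src device=/dev/video0 \
  ! video/x-raw,width=1280,height=720,framerate=30/1 ! videoconvert \
  ! x264enc tune=zerolatency speed-preset=ultrafast key-int-max=30 bitrate=2000 \
  ! h264parse config-interval=1 ! rtph264pay pt=96 \
  ! udpsink host=<display-ip> port=5000 sync=false
On the receiver the jitterbuffer is already at latency=0 and the sink at sync=false, so most of the remaining latency usually sits in the encoder's lookahead and frame reordering, which tune=zerolatency disables.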
Related
I want to stream an H.264 video over UDP to another PC.
I am using this pipeline to produce the stream:
videotestsrc ! video/x-raw,width=400,height=400,framerate=7/1 ! videoconvert ! x264enc ! h264parse config-interval=1 ! video/x-h264,stream-format=byte-stream,alignment=nal ! rtph264pay ! udpsink host=192.168.1.100 port=2705
I can play this on the same machine (with IP address 192.168.1.100) with this pipeline:
udpsrc port=2705 ! application/x-rtp,width=400,height=400,encoding-name=H264,payload=96,framerate=7/1 ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink
But when I try to stream it from another PC to the same machine, I get only this output and it waits forever:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Redistribute latency...
Redistribute latency...
What can be the problem here?
I found the solution. A videoconvert element is needed in the playing pipeline.
The working playing pipeline is:
udpsrc port=2705 ! application/x-rtp,width=400,height=400,encoding-name=H264,payload=96,framerate=7/1 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink
I have a Janus (WebRTC) server and I am using VP8/OPUS. Janus forwards the RTP packets to GStreamer. I have two questions.
Do I have to run one GStreamer process (with multiple threads) or multiple GStreamer processes? Janus sends GStreamer multiple RTP streams. For example, with two peers in a WebRTC room, Janus sends 4 RTP streams to GStreamer: video/audio for peer1 and video/audio for peer2. If I run just one GStreamer process, it is not possible to tell which stream comes from which peer, so to separate them I use distinct ports with multiple GStreamer processes.
Like this:
Process1:
gst-launch-1.0 \
  rtpbin name=rtpbin \
  udpsrc name=videoRTP port=5000 \
    caps="application/x-rtp, media=(string)video, payload=98, encoding-name=(string)VP8-DRAFT-IETF-01, clock-rate=90000" \
    ! rtpvp8depay ! webmmux ! queue \
    ! filesink location=track1.webm \
  udpsrc port=5002 \
    caps="application/x-rtp, media=audio, payload=111, encoding-name=(string)OPUS, clock-rate=48000" \
    ! rtpopusdepay ! opusparse ! oggmux \
    ! filesink location=audio.ogg
process2:
gst-launch-1.0 \
  rtpbin name=rtpbin \
  udpsrc name=videoRTP port=5003 \
    caps="application/x-rtp, media=(string)video, payload=98, encoding-name=(string)VP8-DRAFT-IETF-01, clock-rate=90000" \
    ! rtpvp8depay ! webmmux ! queue \
    ! filesink location=track1.webm \
  udpsrc port=5005 \
    caps="application/x-rtp, media=audio, payload=111, encoding-name=(string)OPUS, clock-rate=48000" \
    ! rtpopusdepay ! opusparse ! oggmux \
    ! filesink location=audio.ogg
So I am confused: multiple threads or multiple processes? Please give me the details!
How do I mux VP8/OPUS into an MP4 container in real time? I have searched for a long time but have not managed it yet; GStreamer has so many options in each version.
I am waiting for your advice! Thank you.
I've tried as much as I can.
What I expect is a way to do this and the resulting MP4 files.
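For the MP4 part: MP4 has no widely supported mapping for VP8, so a common workaround is to transcode to H.264/AAC and use mp4mux. A minimal sketch, assuming the ports and payload types from the pipelines above (run with -e so mp4mux can finalize the file on Ctrl-C):
# transcoding to H.264/AAC is an assumed workaround, not the only option
gst-launch-1.0 -e \
  udpsrc port=5000 caps="application/x-rtp, media=video, payload=98, encoding-name=VP8-DRAFT-IETF-01, clock-rate=90000" \
    ! rtpvp8depay ! vp8dec ! videoconvert ! x264enc tune=zerolatency ! h264parse ! queue ! mux. \
  udpsrc port=5002 caps="application/x-rtp, media=audio, payload=111, encoding-name=OPUS, clock-rate=48000" \
    ! rtpopusdepay ! opusdec ! audioconvert ! voaacenc ! aacparse ! queue ! mux. \
  mp4mux name=mux ! filesink location=peer1.mp4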
Hi, one solution may be the tee plugin, found on the help pages.
Description
Split data to multiple pads. Branching the data flow is useful when e.g. capturing a video where the video is shown on the screen and also encoded and written to a file. Another example is playing music and hooking up a visualisation module.
One needs to use separate queue elements (or a multiqueue) in each branch to provide separate threads for each branch. Otherwise a blocked dataflow in one branch would stall the other branches.
Example launch line
gst-launch-1.0 filesrc location=song.ogg ! decodebin ! tee name=t ! queue ! audioconvert ! audioresample ! autoaudiosink t. ! queue ! audioconvert ! goom ! videoconvert ! autovideosink
Play the song.ogg audio file, which must be in the current working directory, and render visualisations using the goom element (this can be done more easily with the playbin element; this is just an example pipeline).
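Applied to the Janus question above: one gst-launch process can host several independent branches; each udpsrc runs on its own streaming thread, so separate processes are not required as long as Janus forwards each peer's streams to distinct ports. A sketch merging the two video branches from the question (audio branches would be added the same way):
# one process, one branch per peer; ports/caps taken from the question
gst-launch-1.0 -e \
  udpsrc port=5000 caps="application/x-rtp, media=(string)video, payload=98, encoding-name=(string)VP8-DRAFT-IETF-01, clock-rate=90000" \
    ! rtpvp8depay ! queue ! webmmux ! filesink location=peer1_video.webm \
  udpsrc port=5003 caps="application/x-rtp, media=(string)video, payload=98, encoding-name=(string)VP8-DRAFT-IETF-01, clock-rate=90000" \
    ! rtpvp8depay ! queue ! webmmux ! filesink location=peer2_video.webm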
Based on GStreamer Tutorial 3 for Android, I'm trying to implement a UDP connection.
On the server side I have a Raspberry Pi with the following line to start the server:
raspivid -t 0 -hf -n -h 480 -w 640 -fps 15 -b 2000000 -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=10 pt=96 ! gdppay ! udpsink host=192.168.1.1 port=5000
On the receiver side I have Tutorial 3, in which I changed the pipeline to the following line:
data->pipeline = gst_parse_launch("udpsrc port=5000 caps=\"application/x-rtp, media=video, clock-rate=90000, payload=96\" ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink", &error);
I also included the following plugins:
$(GSTREAMER_PLUGINS_CORE) $(GSTREAMER_PLUGINS_PLAYBACK) $(GSTREAMER_PLUGINS_CODECS) $(GSTREAMER_PLUGINS_CODECS_RESTRICTED) $(GSTREAMER_PLUGINS_NET) $(GSTREAMER_PLUGINS_SYS)
When I start the app there is a black screen with no video or audio.
In Logcat I get an error that H264 and AAC mapping is not possible.
Maybe I have to include sprop-parameter-sets?
And if so, how can I easily do that with the right syntax?
On the server side you use both rtph264pay and gdppay. You should remove gdppay: it wraps the RTP packets in GDP framing, and your receiver has no gdpdepay to unwrap them, so rtph264depay cannot parse what arrives.
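A sketch of the corrected server line, keeping the original raspivid settings:
raspivid -t 0 -hf -n -h 480 -w 640 -fps 15 -b 2000000 -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=10 pt=96 ! udpsink host=192.168.1.1 port=5000
With config-interval=10 the payloader re-sends SPS/PPS in-band every 10 seconds, so sprop-parameter-sets should not be needed in the receiver caps.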
I'm trying to stream video with H.264. The source is an Axis camera. I managed to stream JPEG with multicast but not H.264.
With JPEG I used the following command:
gst-launch-1.0 udpsrc uri=udp://239.194.0.177:1026 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink
I tried to stream H.264 but it fails; I used the following command:
gst-launch-1.0 -v udpsrc host=239.194.0.177 port=1026 ! rtph264depay ! ffdec_h264 ! xvimagesink
I get the following error:
ERROR: pipeline could not be constructed: no element "udpsrc".
With this line:
gst-launch-1.0 udpsrc uri=udp://239.194.0.177:1026 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=H264 ! rtph264depay ! h264parse
I did not get any errors, but no video streamed, and this was printed in the terminal:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
I tried the commands from the following pages:
Stream H.264 video over rtp using gstreamer
https://developer.ridgerun.com/wiki/index.php/Using_UDP_Multicast_with_GStreamer
http://labs.isee.biz/index.php/Example_GStreamer_Pipelines#H.264_RTP_Streaming
But could not get it to work.
When running in verbose mode I get a little more info.
Command:
gst-launch-1.0 -v udpsrc uri=udp://239.194.0.177:1026 ! application/x-rtp, media=video, payload=96, encoding-name=H264 ! rtph264depay ! avdec_h264 ! videoconvert ! fakesink
Output:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = "application/x-rtp\,\ media\=\(string\)video\,\ payload\=\(int\)96\,\ encoding-name\=\(string\)H264\,\ clock-rate\=\(int\)90000"
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:sink: caps = "application/x-rtp\,\ media\=\(string\)video\,\ payload\=\(int\)96\,\ encoding-name\=\(string\)H264\,\ clock-rate\=\(int\)90000"
How do I stream H264 via multicast with gstreamer?
Too long for a comment, and since nobody is answering I am posting this draft of thoughts as an answer.
The first error about no element udpsrc is really weird, but I think it is complaining about the missing uri parameter. What version are you using? My udpsrc does not have a host parameter.
In the third pipeline it ends with h264parse; is this a typo? You need to decode the H.264, not just parse it:
gst-launch-1.0 udpsrc uri=udp://239.194.0.177:1026 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=H264 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink
Also add some logs (maybe via pastebin if too long) from running GST_DEBUG=3 gst-launch-1.0 ... or similar.
What does this mean:
"But could not get it to work"
This does not say much ;)
Usually when working with RTP you need to provide all the capabilities, otherwise it may not link or play at all.
Maybe try uridecodebin? Not sure if it's the best idea:
gst-launch-1.0 uridecodebin uri=udp://etcetc:port ! videoconvert ! autovideosink
If you get any new info/questions, add them as updates to make the picture whole (for others as well).
HTH
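Putting those points together, a receive pipeline sketch with the multicast address set explicitly and the full RTP caps spelled out (payload=96 is an assumption; check the camera's SDP):
# payload type assumed; verify against the camera's stream description
gst-launch-1.0 -v udpsrc address=239.194.0.177 port=1026 \
  ! "application/x-rtp, media=video, clock-rate=90000, encoding-name=H264, payload=96" \
  ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink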
I would like to capture a video stream (+audio) in MJPEG from my webcam into a .mts container using this pipeline:
gst-launch-1.0 v4l2src do-timestamp=true device=/dev/video0 \
  ! 'image/jpeg,framerate=30/1,width=1280,height=720' ! videorate \
  ! queue ! mux2. \
  pulsesrc do-timestamp=true device="alsa_input.pci-0000_00_1b.0.analog-stereo" \
  ! 'audio/x-raw,rate=88200,channels=1,depth=24' ! audioconvert \
  ! avenc_aac compliance=experimental ! queue ! mux2. \
  mpegtsmux name="mux2" ! filesink location=/home/sina/Webcam.mts
It seems that my pipeline doesn't accept mpegtsmux (?).
When I use avimux or even matroskamux it works, but as far as I know, for MPEG-TS I need to use the correct muxer, which is "mpegtsmux".
This is the warning:
WARNING: erroneous pipeline: could not link queue0 to mux2
Can you please tell me what part of my pipeline is wrong, or what I should change to get a timestamped video stream at the end (the duration of the video must be shown when I play it in Kdenlive or VLC)?
Best,
Sina
I think you are missing an encoder before the mux.
Just try this without audio (x264enc added, plus videoconvert so the camera's raw format can be negotiated):
gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! videorate ! queue ! x264enc ! mpegtsmux name=mux2 ! filesink location=bla.mts
The warning you are getting says it clearly: it cannot link to the mux because the mux does not support the image/jpeg capabilities. Just check the Capabilities section of the sink pad with this command:
gst-inspect-1.0 mpegtsmux
But it does support, for example, video/x-h264; hence the need for x264enc.
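A sketch of the full pipeline with both branches encoded to formats mpegtsmux does accept (H.264 video, AAC audio); the device names and camera caps are taken from the question, and older avenc_aac versions may still need compliance=experimental:
# MJPEG is decoded and re-encoded to H.264; audio is encoded to AAC for the TS mux
gst-launch-1.0 -e v4l2src do-timestamp=true device=/dev/video0 \
  ! 'image/jpeg,framerate=30/1,width=1280,height=720' \
  ! jpegdec ! videoconvert ! x264enc tune=zerolatency ! h264parse ! queue ! mux2. \
  pulsesrc do-timestamp=true device="alsa_input.pci-0000_00_1b.0.analog-stereo" \
  ! audioconvert ! avenc_aac ! aacparse ! queue ! mux2. \
  mpegtsmux name=mux2 ! filesink location=/home/sina/Webcam.mts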