Muxing multiple RTP streams into separate files - webrtc

I have a Janus (WebRTC) server and I am using VP8/OPUS. Janus forwards the RTP packets to GStreamer. I have two questions.
Do I have to run one GStreamer process (with multiple threads), or multiple GStreamer processes? Janus sends multiple RTP streams to GStreamer. For example, if two peers are in a WebRTC room, Janus sends 4 RTP streams to GStreamer: peer1 video/audio and peer2 video/audio. If I run just one GStreamer process, it is not possible to tell which peer each stream comes from, so to classify them I separate the ports and run multiple GStreamer processes.
Like this:
Process1:
gst-launch-1.0 \
  rtpbin name=rtpbin \
  udpsrc name=videoRTP port=5000 \
    caps="application/x-rtp, media=(string)video, payload=98, encoding-name=(string)VP8-DRAFT-IETF-01, clock-rate=90000" \
  ! rtpvp8depay ! webmmux ! queue \
  ! filesink location=track1.webm \
  udpsrc port=5002 \
    caps="application/x-rtp, media=audio, payload=111, encoding-name=(string)OPUS, clock-rate=48000" \
  ! rtpopusdepay ! opusparse ! oggmux \
  ! filesink location=audio.ogg
process2:
gst-launch-1.0 \
  rtpbin name=rtpbin \
  udpsrc name=videoRTP port=5003 \
    caps="application/x-rtp, media=(string)video, payload=98, encoding-name=(string)VP8-DRAFT-IETF-01, clock-rate=90000" \
  ! rtpvp8depay ! webmmux ! queue \
  ! filesink location=track2.webm \
  udpsrc port=5005 \
    caps="application/x-rtp, media=audio, payload=111, encoding-name=(string)OPUS, clock-rate=48000" \
  ! rtpopusdepay ! opusparse ! oggmux \
  ! filesink location=audio2.ogg
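Or could I keep everything in a single GStreamer process, with one udpsrc per port so each peer's streams stay separated? Something like this rough sketch (same ports and caps as above; the peer1_/peer2_ file names are only placeholders, and I have not tested it):
gst-launch-1.0 -e \
  udpsrc port=5000 caps="application/x-rtp, media=(string)video, payload=98, encoding-name=(string)VP8-DRAFT-IETF-01, clock-rate=90000" \
  ! rtpvp8depay ! webmmux ! filesink location=peer1_video.webm \
  udpsrc port=5002 caps="application/x-rtp, media=(string)audio, payload=111, encoding-name=(string)OPUS, clock-rate=48000" \
  ! rtpopusdepay ! opusparse ! oggmux ! filesink location=peer1_audio.ogg \
  udpsrc port=5003 caps="application/x-rtp, media=(string)video, payload=98, encoding-name=(string)VP8-DRAFT-IETF-01, clock-rate=90000" \
  ! rtpvp8depay ! webmmux ! filesink location=peer2_video.webm \
  udpsrc port=5005 caps="application/x-rtp, media=(string)audio, payload=111, encoding-name=(string)OPUS, clock-rate=48000" \
  ! rtpopusdepay ! opusparse ! oggmux ! filesink location=peer2_audio.ogg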
So I am confused: multiple threads or multiple processes? Please tell me the details!
How do I mux VP8/OPUS into an MP4 container in real time? I have searched for this for a long time but have not managed it yet; GStreamer has so many options, and they differ between versions.
I am waiting for your advice, thank you. I have tried as much as I can; what I expect is a working approach and MP4 files.

Hi, one solution may be the tee plugin. From the help pages:
Description
Split data to multiple pads. Branching the data flow is useful when e.g. capturing a video where the video is shown on the screen and also encoded and written to a file. Another example is playing music and hooking up a visualisation module.
One needs to use separate queue elements (or a multiqueue) in each branch to provide separate threads for each branch. Otherwise a blocked dataflow in one branch would stall the other branches.
Example launch line
gst-launch-1.0 filesrc location=song.ogg ! decodebin ! tee name=t ! queue ! audioconvert ! audioresample ! autoaudiosink t. ! queue ! audioconvert ! goom ! videoconvert ! autovideosink
Play song.ogg audio file which must be in the current working directory and render visualisations using the goom element (this can be easier done using the playbin element, this is just an example pipeline).
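For the second question (getting an MP4 in real time): as far as I know mp4mux will not take VP8 directly, so the usual route is to decode what comes out of the depayloaders and re-encode to H.264/AAC, which mp4mux definitely accepts. A rough, untested sketch, reusing the ports and payload types from your pipelines (peer1.mp4 is just a placeholder name):
gst-launch-1.0 -e mp4mux name=mux ! filesink location=peer1.mp4 \
  udpsrc port=5000 caps="application/x-rtp, media=(string)video, payload=98, encoding-name=(string)VP8-DRAFT-IETF-01, clock-rate=90000" \
  ! rtpvp8depay ! vp8dec ! videoconvert ! x264enc tune=zerolatency ! h264parse ! queue ! mux. \
  udpsrc port=5002 caps="application/x-rtp, media=(string)audio, payload=111, encoding-name=(string)OPUS, clock-rate=48000" \
  ! rtpopusdepay ! opusdec ! audioconvert ! audioresample ! avenc_aac ! aacparse ! queue ! mux.
The -e flag matters because mp4mux only writes its index (the moov atom) on EOS; if the file must be playable while it is still being recorded, look at mp4mux's fragment-duration property to get fragmented MP4 instead. Re-encoding costs CPU; keeping VP8/OPUS as-is is only straightforward with webmmux/matroskamux, not with mp4mux.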

Related

Gstreamer: RTP stream to VP8/webm recording does not produce a seekable file

I'm using Gstreamer to capture a WebRTC stream to a webm file. I've noticed that when using VP8 encoding in an RTP stream, the file produced is not seekable in any players (Chrome or VLC for example). By not seekable, I mean that you cannot play the file from an arbitrary point in the stream; it only plays from the very start.
I've instead used h264 encoding with rtp and the resulting mp4 file is seekable. Also, removing rtp encoding/decoding produces a seekable output file for VP8.
Here are test pipelines that produce these results.
# produces a valid webm file that is not seekable (I need this to be seekable)
gst-launch-1.0 -e \
autovideosrc ! autovideoconvert ! vp8enc ! rtpvp8pay ! rtpvp8depay ! webmmux ! \
filesink location=test.webm
# produces a valid webm file that is seekable
gst-launch-1.0 -e \
autovideosrc ! autovideoconvert ! vp8enc ! webmmux ! \
filesink location=test.webm
# produces a valid mp4 file that is seekable
gst-launch-1.0 -e \
autovideosrc ! autovideoconvert ! x264enc ! rtph264pay ! rtph264depay ! h264parse ! mp4mux ! \
filesink location=test.mp4
Is there a method of creating a seekable rtp/VP8/matroska recording? Is this a bug in Gstreamer?
I've used both Gstreamer 1.14 and 1.18 on Ubuntu with the same results.
This is an oddity in the webm format which also affects the MediaRecorder API.
See here.
In a nutshell you need to update the duration metadata at the beginning of the file when you stop the recording.
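If you can touch the file once recording has stopped, a simple way to get that metadata (and cues) written is to remux the recording without re-encoding, for example with ffmpeg or a Matroska tool such as mkclean; the file names below are placeholders:
ffmpeg -i recording.webm -c copy recording-seekable.webm
The stream copy rewrites only the container, so it is fast and lossless, and the resulting file should then be seekable in Chrome and VLC.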

How to stream h264 with udp gstreamer

I'm trying to stream a video with h264. The source is an Axis camera. I managed to stream JPEG with multicast but not h264.
With jpeg I used following command:
gst-launch-1.0 udpsrc uri=udp://239.194.0.177:1026 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink
I tried to stream h264 but it fails; I used the following command:
gst-launch-1.0 -v udpsrc host=239.194.0.177 port=1026 ! rtph264depay ! ffdec_h264 ! xvimagesink
I get the following error:
ERROR: pipeline could not be constructed: no element "udpsrc".
With this line:
gst-launch-1.0 udpsrc uri=udp://239.194.0.177:1026 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=H264 ! rtph264depay ! h264parse
I did not get any errors but no video streamed and this was printed in terminal:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
I tried the commands from the following pages:
Stream H.264 video over rtp using gstreamer
https://developer.ridgerun.com/wiki/index.php/Using_UDP_Multicast_with_GStreamer
http://labs.isee.biz/index.php/Example_GStreamer_Pipelines#H.264_RTP_Streaming
But could not get it to work.
When running in verbose mode I get a little more info.
Command:
gst-launch-1.0 -v udpsrc uri=udp://239.194.0.177:1026 ! application/x-rtp, media=video, payload=96, encoding-name=H264 ! rtph264depay ! avdec_h264 ! videoconvert ! fakesink
Output:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = "application/x-rtp\,\ media\=\(string\)video\,\ payload\=\(int\)96\,\ encoding-name\=\(string\)H264\,\ clock-rate\=\(int\)90000"
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:sink: caps = "application/x-rtp\,\ media\=\(string\)video\,\ payload\=\(int\)96\,\ encoding-name\=\(string\)H264\,\ clock-rate\=\(int\)90000"
How do I stream H264 via multicast with gstreamer?
Too long for a comment - and since nobody is answering, I am posting this draft of thoughts as an answer.
The first error about no element udpsrc is really weird, but I think it is complaining about the missing uri parameter. What version are you using? My udpsrc does not have a host parameter.
The third pipeline ends with h264parse - is this a typo? You need to decode the h264, not just parse it:
gst-launch-1.0 udpsrc uri=udp://239.194.0.177:1026 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=H264 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink
Also add some logs (use pastebin if they are too long) by running GST_DEBUG=3 gst-launch-1.0 ... or similar.
What does it mean:
But could not get it to work
This does not say too much ;)
Usually when working with RTP you need to provide all the capabilities, otherwise it may not link or play at all.
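For example, for this stream something like the following fully specified pipeline might be needed (payload=96 is only a guess at the dynamic payload type, and the multicast property on udpsrc is called address in newer versions and multicast-group in older ones):
gst-launch-1.0 -v udpsrc address=239.194.0.177 port=1026 auto-multicast=true \
  ! "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" \
  ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink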
Maybe try with uridecodebin? Not sure if it's the best idea:
gst-launch-1.0 uridecodebin uri=udp://etcetc:port ! videoconvert ! autovideosink
If you get any new info/questions, add them as updates to make the picture whole (for others as well).
HTH

using mpegtsmux in gstreamer's pipeline for recording video

I would like to capture a video stream (+audio) in MJPEG from my webcam into an .mts container using this pipeline:
gst-launch-1.0 v4l2src do-timestamp=true device=/dev/video0 \
  ! 'image/jpeg,framerate=30/1,width=1280,height=720' ! videorate \
  ! queue ! mux2. \
  pulsesrc do-timestamp=true device="alsa_input.pci-0000_00_1b.0.analog-stereo" \
  ! 'audio/x-raw,rate=88200,channels=1,depth=24' ! audioconvert \
  ! avenc_aac compliance=experimental ! queue ! mux2. \
  mpegtsmux name="mux2" ! filesink location=/home/sina/Webcam.mts
It seems that my pipeline doesn't recognize mpegtsmux (?).
When I use avimux or even matroskamux it works, but as far as I know, for MPEG-TS I need to use the correct muxer, which is "mpegtsmux".
This is the warning:
WARNING: erroneous pipeline: could not link queue0 to mux2
Can you please tell me what part of my pipeline is wrong, or what I should change in order to get a timestamped video stream at the end (the duration of the video must be shown when I play it via kdenlive or VLC)?
Best,
Sina
I think you are missing an encoder before the mux.
Just try this without audio (I added x264enc):
gst-launch-1.0 v4l2src device=/dev/video0 ! videorate ! queue ! x264enc ! mpegtsmux name="mux2" mux2. ! filesink location=bla.mts
The warning you are getting says it clearly: it cannot link to the mux because the mux does not support the image/jpeg capabilities. Just check the Capabilities section of its sink pad with the command:
gst-inspect-1.0 mpegtsmux
But it does support, for example, video/x-h264 - hence the need for x264enc.
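Putting the audio branch back in, a full pipeline could look roughly like this (untested sketch: jpegdec + x264enc re-encode the MJPEG to H.264, and aacparse packages the AAC in a form the muxer accepts):
gst-launch-1.0 -e v4l2src do-timestamp=true device=/dev/video0 \
  ! 'image/jpeg,framerate=30/1,width=1280,height=720' ! jpegdec ! videoconvert \
  ! x264enc tune=zerolatency ! h264parse ! queue ! mux2. \
  pulsesrc do-timestamp=true device="alsa_input.pci-0000_00_1b.0.analog-stereo" \
  ! audioconvert ! avenc_aac compliance=experimental ! aacparse ! queue ! mux2. \
  mpegtsmux name=mux2 ! filesink location=Webcam.mts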

Gstreamer: How to pipe rtpvp8depay into webmmux without reencoding?

From a WebRTC-capable browser I receive an RTP stream which gets decrypted using the Janus gateway. Upon receiving, only the video RTP packets get relayed to a local multicast group for testing purposes.
So, let's assume I receive VP8-encoded RTP packets on a UDP port. I am also able to request a new keyframe at any time.
The problem pipeline:
gst-launch-1.0 -v -v -v -v udpsrc multicast-group=224.1.1.1 auto-multicast=true port=1235 ! "application/x-rtp, payload=100, clock-rate=90000" ! rtpvp8depay ! webmmux streamable=true ! filesink location=/tmp/test.webm
produces the error
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = "application/x-rtp\,\ payload\=\(int\)100\,\ clock-rate\=\(int\)90000\,\ media\=\(string\)video\,\ encoding-name\=\(string\)VP8-DRAFT-IETF-01"
/GstPipeline:pipeline0/GstRtpVP8Depay:rtpvp8depay0.GstPad:src: caps = "video/x-vp8\,\ framerate\=\(fraction\)0/1"
/GstPipeline:pipeline0/GstRtpVP8Depay:rtpvp8depay0.GstPad:sink: caps = "application/x-rtp\,\ payload\=\(int\)100\,\ clock-rate\=\(int\)90000\,\ media\=\(string\)video\,\ encoding-name\=\(string\)VP8-DRAFT-IETF-01"
ERROR: from element /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2933): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:
streaming task paused, reason not-negotiated (-4)
Execution ended after 0:00:00.039571113
Setting pipeline to PAUSED ...
/GstPipeline:pipeline0/GstWebMMux:webmmux0.GstPad:src: caps = video/webm
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
The Kurento project provides a gstreamer plugin called "vp8parse" which solves the issue:
gst-launch-1.0 -v -v -v -v udpsrc multicast-group=224.1.1.1 auto-multicast=true port=1235 ! "application/x-rtp, payload=100, clock-rate=90000" ! rtpvp8depay ! vp8parse ! webmmux streamable=true ! filesink location=/tmp/test.webm
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = "application/x-rtp\,\ payload\=\(int\)100\,\ clock-rate\=\(int\)90000\,\ media\=\(string\)video\,\ encoding-name\=\(string\)VP8-DRAFT-IETF-01"
/GstPipeline:pipeline0/GstRtpVP8Depay:rtpvp8depay0.GstPad:src: caps = "video/x-vp8\,\ framerate\=\(fraction\)0/1"
/GstPipeline:pipeline0/KmsVp8Parse:kmsvp8parse0.GstPad:src: caps = "video/x-vp8\,\ framerate\=\(fraction\)0/1"
/GstPipeline:pipeline0/KmsVp8Parse:kmsvp8parse0.GstPad:sink: caps = "video/x-vp8\,\ framerate\=\(fraction\)0/1"
/GstPipeline:pipeline0/GstRtpVP8Depay:rtpvp8depay0.GstPad:sink: caps = "application/x-rtp\,\ payload\=\(int\)100\,\ clock-rate\=\(int\)90000\,\ media\=\(string\)video\,\ encoding-name\=\(string\)VP8-DRAFT-IETF-01"
HERE THE PIPELINE BLOCKS UNTIL A KEYFRAME IS RECEIVED
/GstPipeline:pipeline0/KmsVp8Parse:kmsvp8parse0.GstPad:src: caps = "video/x-vp8\,\ width\=\(int\)640\,\ height\=\(int\)480\,\ framerate\=\(fraction\)10/1"
/GstPipeline:pipeline0/GstWebMMux:webmmux0.GstMatroskamuxPad:video_0: caps = "video/x-vp8\,\ width\=\(int\)640\,\ height\=\(int\)480\,\ framerate\=\(fraction\)10/1"
/GstPipeline:pipeline0/GstWebMMux:webmmux0.GstPad:src: caps = video/webm
/GstPipeline:pipeline0/GstFileSink:filesink0.GstPad:sink: caps = video/webm
/GstPipeline:pipeline0/GstWebMMux:webmmux0.GstPad:src: caps = "video/webm\,\ streamheader\=\(buffer\)\<\ 1a45dfa301000000000000104282857765626d0042878102428581021853806701ffffffffffffff1549a96601000000000000502ad7b1830f42404d809f4753747265616d657220706c7567696e2076657273696f6e20312e342e31005741994753747265616d6572204d6174726f736b61206d7578657200446188062408b80e88c4001654ae6b010000000000003cae0100000000000033d7810183810173c588786b225315e5f279536e86566964656f00e00100000000000008b0820280ba8201e08686565f56503800\ \>"
/GstPipeline:pipeline0/GstFileSink:filesink0.GstPad:sink: caps = "video/webm\,\ streamheader\=\(buffer\)\<\ 1a45dfa301000000000000104282857765626d0042878102428581021853806701ffffffffffffff1549a96601000000000000502ad7b1830f42404d809f4753747265616d657220706c7567696e2076657273696f6e20312e342e31005741994753747265616d6572204d6174726f736b61206d7578657200446188062408b80e88c4001654ae6b010000000000003cae0100000000000033d7810183810173c588786b225315e5f279536e86566964656f00e00100000000000008b0820280ba8201e08686565f56503800\ \>"
Having a look at the vp8parse source, it seems that this plugin does nothing other than pipe the frames that rtpvp8depay provides, untouched, to its sink, BUT it also sets the src caps to the video width, height and framerate.
An alternative pipeline which works is:
gst-launch-1.0 -v -v -v -v udpsrc multicast-group=224.1.1.1 auto-multicast=true port=1235 ! "application/x-rtp, payload=100, clock-rate=90000" ! rtpvp8depay ! vp8dec ! vp8enc ! webmmux streamable=true ! filesink location=/tmp/test.webm
but using vp8dec ! vp8enc obviously doesn't make much sense as I already receive a VP8-encoded stream.
Now my question is: how can I solve this without re-encoding the stream and without depending on vp8parse? If there is no alternative it seems I have to use it, but as this is currently a plugin that is not available via the standard GStreamer plugin packages, I would like to avoid this. Is it possible to force the caps to a specific width, height and framerate so webmmux wouldn't complain? Because I think that's the reason why the very first pipeline is not-negotiated.
I tried using a capsfilter like rtpvp8depay ! capsfilter caps="video/x-vp8,width=640,height=480,framerate=10/1" ! webmmux but it doesn't negotiate either.
It is bug 747208, already fixed upstream, but the version of GStreamer on your system might be old (1.2.4 in Ubuntu 14.04) and still affected.
As a workaround in such old versions, if you know the frame size of the video, you can use a capssetter element after the depayloader to manually set the caps that the depayloader misses:
gst-launch-1.0 -v \
udpsrc multicast-group=224.1.1.1 auto-multicast=true port=1235 \
! "application/x-rtp, payload=100, clock-rate=90000" \
! rtpvp8depay ! capssetter caps="video/x-vp8,width=640,height=480" \
! webmmux streamable=true ! filesink location=/tmp/test.webm

GStreamer Videomixer Raspivid

I have searched all over the place and have not found anyone using the videomixer function from gstreamer with the raspberry pi's raspivid.
I am trying to duplicate the raspivid output and merge the two copies side by side, and then eventually send the stream over TCP. But for right now I am just looking for some help with getting the video mixing to work.
The resulting video should be 1280x568 for my specific application, and I do not care about having an angle between the videos to create a "3d effect", because it is not required for my application.
I am using GStreamer 1.2, so the command is gst-launch-1.0, and I cannot use ffmpeg because I believe it has been deprecated, so I assume I would just use videoconvert to achieve the same result.
I'm not sure if I should be using h264parse instead of decodebin. So here is what I've got so far:
gst-launch-1.0 fdsrc | raspivid -t 0 -h 568 -w 640 -fps 25 -hf -b 2000000 -o - ! decodebin ! queue ! videoconvert ! videobox border-alpha=0 right=-640 ! videomixer name=mix ! videoconvert ! autovideosink fdsrc | raspivid -t 0 -h 568 -w 640 -fps 25 -hf -b 2000000 -o - ! decodebin ! queue ! videoconvert ! videobox border-alpha=0 left=-640 ! mix.
I'm trying to model this on these two sources (the raspivid command in the first link works for me):
http://www.raspberry-projects.com/pi/pi-hardware/raspberry-pi-camera/streaming-video-using-gstreamer
http://www.technomancy.org/gstreamer/playing-two-videos-side-by-side/
I know I am probably far off, but I am having a lot of difficulty finding examples of how to do this, especially with raspivid. I would greatly appreciate any help. Thank you.
You can find an example, with some explanation of how to use videomixer, in the videomixer documentation:
Example of using videomixer combining 3 videos
Note: UNIX paths are used in this example
gst-launch-1.0 -e \
videomixer name=mix background=0 \
sink_1::xpos=0 sink_1::ypos=0 \
sink_2::xpos=200 sink_2::ypos=0 \
sink_3::xpos=100 sink_3::ypos=100 \
! autovideosink \
uridecodebin uri='file:///data/big_buck_bunny_trailer-360p.mp4' \
! videoscale \
! video/x-raw,width=200,height=100 \
! mix.sink_1 \
uridecodebin uri='file:///data/sintel_trailer-480p.webm' \
! videoscale \
! video/x-raw,width=200,height=100 \
! mix.sink_2 \
uridecodebin uri='file:///data/the_daily_dweebs-720p.mp4' \
! videoscale \
! video/x-raw,width=200,height=100 \
! mix.sink_3
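Untested, but combining the same videomixer idea with tee, a single raspivid feed could be decoded once, duplicated and placed side by side roughly like this (raspivid flags copied from the question; on the Pi, omxh264dec may be needed instead of avdec_h264, and some names may differ on GStreamer 1.2):
raspivid -t 0 -h 568 -w 640 -fps 25 -hf -b 2000000 -o - | \
gst-launch-1.0 -e fdsrc ! h264parse ! avdec_h264 ! videoconvert ! tee name=t \
  videomixer name=mix sink_0::xpos=0 sink_1::xpos=640 ! videoconvert ! autovideosink \
  t. ! queue ! mix.sink_0 \
  t. ! queue ! mix.sink_1
The videomixer canvas grows to cover both pads, so the output ends up 1280x568; swapping autovideosink for an encoder plus tcpserversink would be the next step for sending it over TCP.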