GStreamer - tracer subsystem - how to interpret these measurements? - camera

I'm learning GStreamer. I'm trying to use the tracer subsystem, but I don't know how to interpret the measurements.
The data:
Raspberry Pi 3
Raspbian Buster
Raspberry Camera Module V2
GStreamer 1.17.1
Capture settings: Motion-JPEG, 426x240, 15 fps, at a bitrate of 429000 bps
Pipeline 1:
/opt/vc/bin/raspivid -t 60000 -cd MJPEG -w 426 -h 240 -fps 15 -b 429000 -vf -hf \
-o - | GST_DEBUG="GST_TRACER:7" GST_DEBUG_FILE=240.log GST_TRACERS="latency(flags=pipeline+element+reported)" \
gst-launch-1.0 -v fdsrc do-timestamp=true ! \
"image/jpeg,width=426,height=240,framerate=15/1,payload=(int)26" ! \
jpegparse ! rtpjpegpay ! \
udpsink port=13000 host=192.168.1.111 && gst-stats-1.0 240.log
Measure (pipeline 1):
Latency Statistics:
0x20340f0.fdsrc0.src|0x20ff150.udpsink0.sink: mean=0:00:00.001291400 min=0:00:00.000751041 max=0:00:00.007009887
Element Latency Statistics:
0x21040f0.capsfilter0.src: mean=0:00:00.000236961 min=0:00:00.000096875 max=0:00:00.000554530
0x20ec2d0.jpegparse0.src: mean=0:00:00.000558282 min=0:00:00.000346250 max=0:00:00.006139732
0x20f80b8.rtpjpegpay0.src: mean=0:00:00.000458047 min=0:00:00.000248542 max=0:00:00.004859044
Element Reported Latency:
0x20340f0.fdsrc0: min=0:00:00.000000000 max=0:00:00.000000000 ts=0:00:00.644349393
0x21040f0.capsfilter0: min=0:00:00.000000000 max=0:00:00.000000000 ts=0:00:00.644468508
0x20ec2d0.jpegparse0: min=0:00:00.000000000 max=0:00:00.000000000 ts=0:00:00.644528403
0x20f80b8.rtpjpegpay0: min=0:00:00.000000000 max=99:99:99.999999999 ts=0:00:00.644595383
Pipeline 2 (rpicamsrc):
GST_DEBUG="GST_TRACER:7" GST_DEBUG_FILE=240p.log GST_TRACERS="latency(flags=pipeline+element+reported)" \
gst-launch-1.0 rpicamsrc preview=0 rotation=180 bitrate=4429000 ! \
"image/jpeg,width=1270,height=720,framerate=30/1" ! \
jpegparse ! rtpjpegpay ! \
udpsink port=13000 host=192.168.1.111 && gst-stats-1.0 240p.log
Measure (pipeline 2):
Latency Statistics:
0x485738.rpicamsrc0.src|0x550e98.udpsink0.sink: mean=0:00:00.002029413 min=0:00:00.000996300 max=0:00:00.007922902
Element Latency Statistics:
0x556118.capsfilter0.src: mean=0:00:00.000220204 min=0:00:00.000116354 max=0:00:00.000882290
0x53c2e8.jpegparse0.src: mean=0:00:00.000840342 min=0:00:00.000421510 max=0:00:00.006688112
0x54a0c8.rtpjpegpay0.src: mean=0:00:00.000968794 min=0:00:00.000458436 max=0:00:00.006141604
Element Reported Latency:
0x485738.rpicamsrc0: min=0:00:00.000000000 max=0:00:00.000000000 ts=0:00:00.473207970
0x556118.capsfilter0: min=0:00:00.000000000 max=0:00:00.000000000 ts=0:00:00.473302137
0x53c2e8.jpegparse0: min=0:00:00.000000000 max=0:00:00.000000000 ts=0:00:00.473359064
0x54a0c8.rtpjpegpay0: min=0:00:00.000000000 max=0:00:00.000000000 ts=0:00:00.473419220
I understand that in the first pipeline it takes 644.3 ms to receive the data stream from the camera, and in the second pipeline 473.2 ms.
Is this interpretation of the data correct? If so, why does it take so long to acquire the stream?
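For reference, a minimal sketch that runs the same tracer against a synthetic source (videotestsrc plus jpegenc standing in for the camera; both are assumptions for illustration, not part of my setup) would give a baseline latency without the capture path:
GST_DEBUG="GST_TRACER:7" GST_DEBUG_FILE=baseline.log GST_TRACERS="latency(flags=pipeline+element+reported)" \
gst-launch-1.0 videotestsrc num-buffers=900 ! \
"video/x-raw,width=426,height=240,framerate=15/1" ! \
jpegenc ! jpegparse ! rtpjpegpay ! \
udpsink port=13000 host=192.168.1.111 && gst-stats-1.0 baseline.log
Comparing the per-element means from such a run against the camera runs should show whether the latency comes from the source or from the payloading elements.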
Cheers.

Related

Mediasoup inject stream freezes

I am using the following ffmpeg command to inject an RTMP stream into mediasoup.
ffmpeg \
-re \
-v info \
-stream_loop -1 \
-i rtmp://3.126.121.45:1935/live/stream \
-map 0:a:0 \
-acodec libopus -ab 128k -ac 2 -ar 48000 \
-map 0:v:0 \
-c:v libvpx -minrate 2500k -maxrate 2500k -b:v 2500k -r 30 -g 60 -max_delay 0 -bf 0 -deadline realtime -cpu-used 1 \
-f tee \
"[select=a:f=rtp:ssrc=11111111:payload_type=101]rtp://52.29.30.225:41299?rtcpport=40612&pkt_size=1300|[select=v:f=rtp:ssrc=22222222:payload_type=102]rtp://52.29.30.225:44083?rtcpport=48791&pkt_size=1300"
But the video freezes at random and then resumes. Any idea how I can fix this? I tried the solutions given here and here with no luck.
Update: It seems to be a problem of RTP retransmission when some packets are lost. Unfortunately, ffmpeg doesn't fare well with RTP streaming, as mentioned here; it doesn't support retransmission mechanisms like NACK, PLI, etc. So I am considering GStreamer instead, as suggested in the mediasoup discourse discussion.
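For example, a rough GStreamer counterpart of the ffmpeg command might look like the sketch below. It is untested; the host, ports, SSRCs and payload types are copied from the ffmpeg command above, and it omits the RTCP/NACK wiring, which would need an rtpbin and matching mediasoup transport parameters:
gst-launch-1.0 -v \
uridecodebin uri=rtmp://3.126.121.45:1935/live/stream name=d \
d. ! queue ! videoconvert ! \
vp8enc deadline=1 target-bitrate=2500000 keyframe-max-dist=60 ! \
rtpvp8pay pt=102 ssrc=22222222 ! \
udpsink host=52.29.30.225 port=44083 \
d. ! queue ! audioconvert ! audioresample ! \
opusenc bitrate=128000 ! \
rtpopuspay pt=101 ssrc=11111111 ! \
udpsink host=52.29.30.225 port=41299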

Fail to restore the checkpoint in gem5 se mode

I want to use checkpoints to accelerate my simulation. The problem is that when I restore from the checkpoint, the gem5 simulation aborts.
I am using SE mode and create checkpoints with the m5 pseudo-instruction m5_checkpoint(0,0) in my application program.
I change the CPU model when restoring checkpoints, and have found that when the system has no cache, the restoration succeeds.
The error outputs are as below:
0: system.remote_gdb: listening for remote gdb on port 7005
build/ARM/sim/process.cc:389: warn: Checkpoints for pipes, device drivers and sockets do not work.
Switch at curTick count:10000
gem5.opt: build/ARM/sim/eventq.hh:766: void gem5::EventQueue::schedule(gem5::Event*, gem5::Tick, bool): Assertion `when >= getCurTick()' failed.
Program aborted at tick 16277372800
The command line to create the checkpoint is:
$GEM5_BIN --outdir=$OUTPUT_PATH $GEM5_PATH/configs/example/se.py \
--num-cpu 1 --cpu-clock 2.5GHz --cpu-type AtomicSimpleCPU \
--mem-type DDR3_2133_8x8 --mem-size 1GB \
-c "$TARGET_PATH" --options "$DATA_PATH" --output "$OUTPUT_PATH/output.txt"
The command line to restore the checkpoint is:
$GEM5_BIN --outdir=$OUTPUT_PATH $GEM5_PATH/configs/example/se.py \
--num-cpu 1 --cpu-clock 2.5GHz --cpu-type O3_ARM_v7a_3 \
--caches --l2cache --l1i_size 64kB --l1d_size 32kB --l2_size 256kB \
--l1i_assoc 8 --l1d_assoc 8 --l2_assoc 16 --cacheline_size 128 \
--l2-hwp-type StridePrefetcher --mem-type DDR3_2133_8x8 --mem-size 1GB \
-r 1 --checkpoint-dir "$CHECK_PATH" \
-c "$TARGET_PATH" --options "$DATA_PATH" --output "$OUTPUT_PATH/output.txt"
The version of gem5 I am using is 21.1.0.2.
Best Regards, Gelin

How to set Udpsrc for GStreamer to a remote UDP - streaming from a GoPro

Hi!
I'm trying to set up the following:
I have a GoPro Session 5 which streams from udp://10.5.5.9:8554
I have a Raspberry Pi 3 that is connected to the GoPro's wifi hotspot
I'd like to stream from the GoPro via the RPi over 4G to a GStreamer viewer (in this case QGroundControl) on another computer with a static IP.
I can get this working by means of this tool:
https://github.com/KonradIT/gopro-py-api
and this command:
ffmpeg -f mpegts -i udp://10.5.5.9:8554 -map 0:0 -c copy -f rtp udp://94.234.203.109:5000
Now I'd like to see if I can get better performance using Gstreamer.
I'm trying the following:
#!/bin/bash
gst-launch-1.0 -v \
rtpbin name=rtpbin \
udpsrc uri=udp://10.5.5.9:8554 \
! queue \
! avenc_h264_omx bitrate=500000 \
! "video/x-h264,profile=high" \
! h264parse \
! queue max-size-bytes=10000000 \
! rtph264pay pt=96 config-interval=1 \
! rtpbin.send_rtp_sink_0 rtpbin.send_rtp_src_0 \
! udpsink port=5000 host=94.234.203.109 ts-offset=0 name=vrtpsink \
rtpbin.send_rtcp_src_0 \
! rtpbin.recv_rtcp_sink_0
And get this error:
Setting pipeline to PAUSED ...
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: Could not get/set settings from/on resource.
Additional debug info:
gstudpsrc.c(1548): gst_udpsrc_open (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: bind failed: Error binding to address: Cannot assign requested address
Setting pipeline to NULL ...
Freeing pipeline ...
Can someone spare me some knowledge? Thanks!
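One thing to note: udpsrc binds a local address, so pointing it at the GoPro's IP (uri=udp://10.5.5.9:8554) is what triggers the "Cannot assign requested address" bind error. Below is a minimal sketch that instead listens on the local port (assuming the gopro-py-api keep-alive is running, and that the stream is H.264 in MPEG-TS, as the ffmpeg command suggests) and repackages it without re-encoding, mirroring ffmpeg's -c copy:
gst-launch-1.0 -v \
udpsrc port=8554 caps="video/mpegts, systemstream=(boolean)true" \
! tsdemux \
! h264parse \
! rtph264pay pt=96 config-interval=1 \
! udpsink port=5000 host=94.234.203.109
Skipping the avenc_h264_omx re-encode should also help performance on the Pi.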

GStreamer pipeline wont preroll

Hi guys, I'm trying to set up GStreamer between my Pi and a Windows computer. My commands are:
Pi:
~ raspivid -n -w 1280 -h 720 -b 1000000 -fps 15 -t 0 -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=10 pt=96 ! udpsink host=[IP] port=9000
PC:
gst-launch-1.0 -v udpsrc port=9000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264" ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink sync=f
I get the error:
sudo: /home/pi: command not found
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
ERROR: from element /GstPipeline:pipeline0/GstH264Parse:h264parse0: No valid frames found before end of stream
Additional debug info:
gst_base_parse_sink_event_default (): /GstPipeline:pipeline0/GstH264Parse:h264parse0
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
Any help would be great thanks!!
(He just omitted the sudo from the line, but he actually typed it.)
His home directory is /home/pi, and ~ expands to /home/pi.
~ raspivid -n -w 1280 -h 720 -b 1000000 -fps 15 -t 0 -o -
expands to
/home/pi raspivid -n -w 1280 -h 720 -b 1000000 -fps 15 -t 0 -o -
Thus the "command not found", since /home/pi is not an executable. The result of this erroneous command is piped to gst-launch-1.0 and, of course, there are no valid frames!

Stream webcam video with gstreamer 1.0 over UDP to PC

I'm trying to stream video from a Raspberry Pi (on Raspbian) to a Windows 7 PC like in this video: https://www.youtube.com/watch?v=lNvYanDLHZA
I have a Logitech C270 connected to the Raspberry Pi, and have managed to stream webcam video over TCP using:
gst-launch v4l2src device=/dev/video0 ! \
'video/x-raw-yuv,width=640,height=480' ! \
x264enc pass=qual quantizer=20 tune=zerolatency ! \
rtph264pay ! tcpsink host=$pi_ip port=5000
from my Pi. Receiving this with VLC works, but with a 3-second delay.
I want to do this over UDP to get a shorter delay (correct me if I'm wrong), but I cannot for the life of me figure it out. I have tried the following:
gst-launch-1.0 v4l2src device=/dev/video0 ! \
'video/x-raw-yuv,width=640,height=480' ! \
x264enc pass=qual quantizer=20 tune=zerolatency ! \
rtph264pay ! udpsink host=$pc_ip port=1234
and
gst-launch-1.0 udpsrc port=1234 ! \
"application/x-rtp, payload=127" ! \
rtph264depay ! ffdec_h264 ! fpsdisplaysink sync=false text-overlay=false
For the Pi and PC side, respectively (taken from
Webcam streaming using gstreamer over UDP)
but with no luck. (I tried changing the video/x-raw-yuv caps to fit the 1.0 version, but still without luck.)
Any hints would be highly appreciated!
Edit
With the raspi camera (not the webcam) the following works:
Windows batch script:
@echo off
cd C:\gstreamer\1.0\x86_64\bin
gst-launch-1.0 -e -v udpsrc port=5000 ! application/x-rtp, payload=96 ! ^
rtpjitterbuffer ! rtph264depay ! avdec_h264 ! ^
fpsdisplaysink sync=false text-overlay=false
Raspberry Pi Bash Script:
#!/bin/bash
clear
raspivid -n -t 0 -rot 270 -w 960 -h 720 -fps 30 -b 6000000 -o - | \
gst-launch-1.0 -e -vvvv fdsrc ! h264parse ! rtph264pay pt=96 config-interval=5 ! \
udpsink host=***YOUR_PC_IP*** port=5000
But I cannot figure out how to use the webcam instead of the Raspberry Pi camera (i.e. v4l2src instead of raspivid) in the same manner.
Edit 2
The following works, but is very slow and has a huge delay:
RPi
gst-launch-1.0 -vv -e v4l2src device=/dev/video0 \
! videoscale \
! "video/x-raw,width=400,height=200,framerate=10/1" \
! x264enc pass=qual quantizer=20 tune=zerolatency \
! h264parse \
! rtph264pay config-interval=5 pt=96 \
! udpsink host=$myip port=$myport
PC:
gst-launch-1.0 -e -v udpsrc port=5001 ! ^
application/x-rtp, payload=96 ! ^
rtpjitterbuffer ! ^
rtph264depay ! ^
avdec_h264 ! ^
autovideosink sync=false
I now suspect (thanks to a hint from Mustafa Chelik) that the huge lag is due to the fact that the Raspberry Pi has to encode the webcam video, while the Pi camera's video is already encoded; not sure if this makes sense though?
Found hints to the solution from http://www.z25.org/static/rd/videostreaming_intro_plab/
The following worked very well for streaming video from Logitech c270 on raspberry pi to a windows 7 pc:
PC side:
gst-launch-1.0 -e -v udpsrc port=5001 ! ^
application/x-rtp, encoding-name=JPEG,payload=26 ! ^
rtpjpegdepay ! jpegdec ! ^
autovideosink
RPi side:
gst-launch-1.0 -v v4l2src device=/dev/video0 \
! "image/jpeg,width=1280, height=720,framerate=30/1" \
! rtpjpegpay \
! udpsink host=$myip port=$myport
I suspect that it was the encoding of the webcam video to H.264 that was too slow on the Raspberry Pi; the webcam already delivers JPEG frames, so no encoding was necessary when requesting "image/jpeg".
I have used MJPG-Streamer for my webcam stream and get a 0.2-second delay.
http://wiki.ubuntuusers.de/MJPG-Streamer
An added advantage is that you can watch it in a web browser.
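A minimal sketch of an MJPG-Streamer invocation for the C270 (plugin paths, the port, and the www directory vary by install):
mjpg_streamer -i "input_uvc.so -d /dev/video0 -r 1280x720 -f 30" \
-o "output_http.so -p 8080 -w /usr/share/mjpg-streamer/www"
The stream should then be viewable in a browser at http://<pi-ip>:8080/?action=stream.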