GStreamer ffdec_h264 missing - camera

I am running this pipeline to view cameras on the network:
gst-launch udpsrc port=1234 ! "application/x-rtp, payload=127" ! rtph264depay ! ffdec_h264 ! xvimagesink sync=false
I am getting this error:
WARNING: erroneous pipeline: no element "ffdec_h264"
The error is about ffdec_h264. I have all the GStreamer packages I know of installed, but I don't know which one I am missing.
When I run gst-inspect | grep 264
I get this output:
h264parse: legacyh264parse: H264Parse
x264: x264enc: x264enc
videoparsersbad: h264parse: H.264 parser
typefindfunctions: video/x-h264: h264, x264, 264
rtp: rtph264pay: RTP H264 payloader
rtp: rtph264depay: RTP H264 depayloader
This shows that I don't have ffdec_h264.
Which package am I missing?

This might depend on your OS/distribution and GStreamer version.
Over here (Debian jessie, GStreamer 0.10.36) gst-inspect ffdec_h264 gives the following output:
Factory Details:
Long name: FFmpeg H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10 decoder
Class: Codec/Decoder/Video
Description: FFmpeg h264 decoder
Author(s): Wim Taymans <wim.taymans#gmail.com>, Ronald Bultje <rbultje#ronald.bitfreak.net>, Edwar$
Rank: primary (256)
Plugin Details:
Name: ffmpeg
Description: All FFmpeg codecs and formats (system install)
Filename: /usr/lib/x86_64-linux-gnu/gstreamer-0.10/libgstffmpeg.so
Version: 0.10.13
License: GPL
Source module: gst-ffmpeg
Binary package: FFmpeg
Origin URL: http://ffmpeg.org/
So on my system, ffdec_h264 is in the gst-ffmpeg module (which was installed using apt-get install gstreamer0.10-ffmpeg).
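As a rough sketch for a Debian/Ubuntu system running the 0.10 series (the package name is taken from my setup above and may differ on other distributions), installing the plugin and re-checking the element would look like:
sudo apt-get install gstreamer0.10-ffmpeg
gst-inspect ffdec_h264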

Run this command to get an idea of which H.264 decoder you actually have with GStreamer, in other words what GStreamer is calling it:
gst-inspect | grep "h264"
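For example, if you are actually on GStreamer 1.0 with the gst-libav plugin installed, the decoder shows up as avdec_h264 rather than ffdec_h264. A hedged sketch of the equivalent pipeline for that setup (element names assumed from such an installation, not taken from your output) would be:
gst-launch-1.0 udpsrc port=1234 ! "application/x-rtp, media=video, clock-rate=90000, encoding-name=H264, payload=127" ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink sync=false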

Related

ffmpeg error SSL routines:ssl3_write_pending:bad write retry

Recently I have been seeing this error very often. Does anyone know what it means and how to fix it? Please check the full ffmpeg output below (I already updated OpenSSL to the latest version from yum).
ffmpeg version N-93715-gd0e4d04 Copyright (c) 2000-2019 the FFmpeg developers
built with gcc 4.8.5 (GCC) 20150623 (Red Hat 4.8.5-36)
configuration: --prefix= ....
Input #0, mpegts, from '/tmp/4028813_video_0.ts':
Duration: 00:04:08.19, start: 1.410111, bitrate: 1504 kb/s
Program 1
Metadata:
service_name : Service01
service_provider: FFmpeg
Stream #0:0[0x100]: Video: h264 (High) ([27][0][0][0] / 0x001B), yuv420p(tv, bt709, progressive), 608x1080 [SAR 1:1 DAR 76:135], 30 fps, 30 tbr, 90k tbn, 60 tbc
Stream #0:1[0x101](eng): Audio: aac (LC) ([15][0][0][0] / 0x000F), 44100 Hz, stereo, fltp, 121 kb/s
[ sh: 2019-05-06 9:39:46 ]
Stream mapping:
Stream #0:0 -> #0:0 (copy)
Stream #0:1 -> #0:1 (aac (native) -> aac (native))
Press [q] to stop, [?] for help
Output #0, flv, to 'rtmps://live-api-s.facebook.com:443/rtmp/289693778580218?s_bl=1&s_sw=0&s_vt=api-s&a=Abxj1aU9OTqh0RtS':
Metadata:
comment : gs4028813
encoder : Lavf58.27.103
Stream #0:0: Video: h264 (High) ([7][0][0][0] / 0x0007), yuv420p(tv, bt709, progressive), 608x1080 [SAR 1:1 DAR 76:135], q=2-31, 3000 kb/s, 30 fps, 30 tbr, 1k tbn, 30 tbc
Stream #0:1(eng): Audio: aac (LC) ([10][0][0][0] / 0x000A), 44100 Hz, stereo, fltp, 128 kb/s
Metadata:
encoder : Lavc58.52.100 aac
[ sh: 2019-05-06 9:39:47 ]size= 169kB time=00:00:00.52 bitrate=2643.9kbits/s speed=1.03x
......
[ sh: 2019-05-06 9:40:10 ]size= 3386kB time=00:00:22.72 bitrate=1220.7kbits/s speed=0.999x
[ sh: 2019-05-06 9:40:11 ]size= 3842kB time=00:00:25.32 bitrate=1242.9kbits/s speed= 1x
[tls # 0x30ab3c0] error:00000000:lib(0):func(0):reason(0)82 bitrate=1247.3kbits/s speed= 1x
av_interleaved_write_frame(): Input/output error
[flv # 0x3030f80] Failed to update header with correct duration.
[flv # 0x3030f80] Failed to update header with correct filesize.
Error writing trailer of rtmps://live-api-s.facebook.com:443/rtmp/289693778580218?s_bl=1&s_sw=0&s_vt=api-s&a=Abxj1aU9OTqh0RtS: Input/output error
frame= 776 fps= 30 q=-1.0 Lsize= 3932kB time=00:00:25.82 bitrate=1247.3kbits/s speed= 1x
video:3530kB audio:406kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
[tls # 0x30ab3c0] error:1409F07F:SSL routines:ssl3_write_pending:bad write retry
This error normally indicates a bug in the program calling OpenSSL. When an application using OpenSSL attempts to write application data to an SSL/TLS connection using the "SSL_write" function, it may be unable to complete the write in one go (for example because the underlying network buffers are full). In this case a recoverable "retry" error occurs, and the application is supposed to retry the write with exactly the same data as last time. If the calling application retries the write with different data, you get the "bad write retry" message, which is a fatal error. As to how to fix it, that unfortunately depends on identifying the bug in the calling application and ensuring that retries are only ever performed with exactly the same data.

Can TensorFlow Lite be built for a custom CPU?

I'm looking at the TF Lite Android app,
which can be found on GitHub: https://github.com/tensorflow/tensorflow/tree/master/tensorflow/contrib/lite/java/demo
How can I compile the TensorFlow Lite framework to use the optimized "atom" CPU type?
Is it possible to compile it on macOS with the CPU optimizations for the "atom" CPU?
I want to run the app on an Android device (SDK 22) with an Intel Atom processor.
When I run the application without any changes through Android Studio, the rate is about 1200 ms per frame.
By comparison, the same APK installed on my Galaxy S9 (ARM Snapdragon processor) runs at about 30 ms per frame.
In the "build.gradle" there is this section:
dependencies {
...
compile 'org.tensorflow:tensorflow-lite:0.0.0-nightly'
...
}
So it seems that it downloads the framework.
How can I compile it locally with the CPU optimizations and make the app use that build instead of downloading the non-optimized nightly version?
I tried to follow this tutorial:
Installing TensorFlow from Sources with the CPU flags, but I'm not sure exactly how that helps me with the Android scenario.
Assuming that your Atom device is x86, use the --fat_apk_cpu flag to specify the x86 ABI:
$ bazel build -c opt --cxxopt='--std=c++11' \
--fat_apk_cpu=x86 \
//tensorflow/contrib/lite/java/demo/app/src/main:TfLiteCameraDemo
Replace x86 with x86_64 if you're building for a 64-bit device.
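For instance, the 64-bit variant of the same build command only changes that one flag:
$ bazel build -c opt --cxxopt='--std=c++11' \
--fat_apk_cpu=x86_64 \
//tensorflow/contrib/lite/java/demo/app/src/main:TfLiteCameraDemo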
The built APK, available at bazel-bin/tensorflow/contrib/lite/java/demo/app/src/main/TfLiteCameraDemo.apk, will contain the x86 .so file:
$ zipinfo bazel-bin/tensorflow/contrib/lite/java/demo/app/src/main/TfLiteCameraDemo.apk | grep lib
-rw---- 2.0 fat 1434712 b- defN 80-Jan-01 00:00 lib/x86/libtensorflowlite_jni.so
If your device is connected, you can use bazel mobile-install instead of bazel build to directly install the app:
$ bazel mobile-install -c opt --cxxopt='--std=c++11' \
--fat_apk_cpu=x86 \
--start_app \
//tensorflow/contrib/lite/java/demo/app/src/main:TfLiteCameraDemo

How to maximize the performance of GStreamer

I'm trying to stream video from a camera and audio from a mic using GStreamer.
I use an Odroid C1+ as the server and a PC as the client.
Over a LAN cable it's perfect, but the performance decreases significantly when I use WiFi.
I used
$ gst-launch-1.0 v4l2src ! gdppay ! tcpserversink host=localhost port=5000 alsasrc device="hw:1,0" ! gdppay ! tcpserversink host=localhost port=6000
$ gst-launch-1.0 tcpclientsrc host=localhost port=5000 ! gdpdepay ! autovideosink sync=false tcpclientsrc host=localhost port=6000 ! gdpdepay ! autoaudiosink sync=false
and
$ gst-launch-1.0 v4l2src ! videoconvert ! x264enc tune=zerolatency ! rtph264pay ! gdppay ! tcpserversink host=localhost port=5000 sync=false alsasrc device='hw:0,0' ! gdppay ! tcpserversink host=localhost port=6000 sync=false
$ gst-launch-1.0 tcpclientsrc host=localhost port=5000 ! gdpdepay ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink sync=false tcpclientsrc host=localhost port=6000 ! gdpdepay ! autoaudiosink sync=false
How can I maximize the performance of streaming with GStreamer,
and also synchronize the video and audio?
Update
$ gst-launch-1.0 v4l2src ! vaapih264enc ! rtph264pay ! gdppay ! tcpserversink host=localhost port=5000
Setting pipeline to PAUSED ...
libva info: VA-API version 0.39.0
libva info: va_getDriverName() returns 0
libva info: Trying to open /usr/lib/x86_64-linux-gnu/dri/vmwgfx_drv_video.so
libva info: va_openDriver() returns -1
libva info: VA-API version 0.39.0
libva info: va_getDriverName() returns 0
libva info: Trying to open /usr/lib/x86_64-linux-gnu/dri/vmwgfx_drv_video.so
libva info: va_openDriver() returns -1
libva info: VA-API version 0.39.0
libva info: va_getDriverName() returns -1
libva error: va_getDriverName() failed with unknown libva error,driver_name=(null)
libva info: VA-API version 0.39.0
libva info: va_getDriverName() returns -1
libva error: va_getDriverName() failed with unknown libva error,driver_name=(null)
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/GstVaapiEncodeH264:vaapiencodeh264-0: Could not initialize supporting library.
Additional debug info:
gstvideoencoder.c(1559): gst_video_encoder_change_state (): /GstPipeline:pipeline0/GstVaapiEncodeH264:vaapiencodeh264-0:
Failed to open encoder
Setting pipeline to NULL ...
Freeing pipeline ...
I ran this test on Ubuntu 16.04 in VMware, on a PC with an i5-6400 CPU and no GPU.
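The libva messages mean there is no working VA-API driver inside the VMware guest, so vaapih264enc cannot initialize there; on that machine a software encoder such as the x264enc element from the earlier pipelines is the realistic choice. For the synchronization part, one hedged sketch (not from this thread; it assumes matroskamux, voaacenc and the avdec_* decoders are installed) is to mux audio and video into a single stream so the receiver plays both against one set of timestamps:
Server:
gst-launch-1.0 matroskamux name=mux streamable=true ! tcpserversink host=0.0.0.0 port=5000 v4l2src ! videoconvert ! x264enc tune=zerolatency ! h264parse ! queue ! mux. alsasrc device="hw:1,0" ! audioconvert ! voaacenc ! queue ! mux.
Client:
gst-launch-1.0 tcpclientsrc host=ODROID_IP port=5000 ! matroskademux name=d d. ! h264parse ! avdec_h264 ! queue ! videoconvert ! autovideosink d. ! aacparse ! avdec_aac ! queue ! audioconvert ! autoaudiosink
Here ODROID_IP is a placeholder for the server's address, and leaving the sinks at their default sync=true is what lets the client keep audio and video aligned.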

Linux: how to show a video as a webcam

I am trying the following command so that my webcam shows a video when I access it from the network or an Android emulator.
I am using Arch Linux. I have installed v4l2loopback-dkms and added the module to the kernel. Then I launched
gst-launch-0.10 filesrc location=/home/simha/1.3gp ! decodebin2 ! ffmpegcolorspace ! videoscale ! ffmpegcolorspace ! v4l2sink device=/dev/video0
but with no success:
Setting pipeline to PAUSED ...
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/GstV4l2Sink:v4l2sink0: Device '/dev/video0' is not a output device.
Additional debug info:
v4l2_calls.c(528): gst_v4l2_open (): /GstPipeline:pipeline0/GstV4l2Sink:v4l2sink0:
Capabilities: 0x85200001
Setting pipeline to NULL ...
Freeing pipeline ...
I also tried:
ffmpeg -f x11grab -r 15 -s 1280x720 -i :0.0+0,0 -vcodec rawvideo -pix_fmt yuv420p -threads 0 -f v4l2 /dev/video0
but it is not working:
ffmpeg version 2.8 Copyright (c) 2000-2015 the FFmpeg developers
built with gcc 5.2.0 (GCC)
configuration: --prefix=/usr --disable-debug --disable-static --disable-stripping --enable-avisynth --enable-avresample --enable-fontconfig --enable-gnutls --enable-gpl --enable-ladspa --enable-libass --enable-libbluray --enable-libfreetype --enable-libfribidi --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libopencore_amrnb --enable-libopencore_amrwb --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-libschroedinger --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libv4l2 --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxvid --enable-shared --enable-version3 --enable-x11grab
libavutil 54. 31.100 / 54. 31.100
libavcodec 56. 60.100 / 56. 60.100
libavformat 56. 40.101 / 56. 40.101
libavdevice 56. 4.100 / 56. 4.100
libavfilter 5. 40.101 / 5. 40.101
libavresample 2. 1. 0 / 2. 1. 0
libswscale 3. 1.101 / 3. 1.101
libswresample 1. 2.101 / 1. 2.101
libpostproc 53. 3.100 / 53. 3.100
Input #0, x11grab, from ':0.0+0,0':
Duration: N/A, start: 1456374857.672770, bitrate: N/A
Stream #0:0: Video: rawvideo (BGR[0] / 0x524742), bgr0, 1280x720, 15 fps, 15 tbr, 1000k tbn, 15 tbc
[v4l2 # 0x55f5129180a0] ioctl(VIDIOC_G_FMT): Invalid argument
Output #0, v4l2, to '/dev/video0':
Metadata:
encoder : Lavf56.40.101
Stream #0:0: Video: rawvideo (I420 / 0x30323449), yuv420p, 1280x720, q=2-31, 200 kb/s, 15 fps, 15 tbn, 15 tbc
Metadata:
encoder : Lavc56.60.100 rawvideo
Stream mapping:
Stream #0:0 -> #0:0 (rawvideo (native) -> rawvideo (native))
Could not write header for output file #0 (incorrect codec parameters ?): Invalid argument
You might be using the wrong driver for your webcam.
Use the lsusb command to identify your webcam, then find out what driver you need to get it working. You can find more information about this by reading the Webcam page on the Arch Wiki.
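As a hedged sketch of that first step on Arch (assuming the usbutils and v4l-utils packages are installed), this also shows which /dev/videoN node actually belongs to v4l2loopback, which matters because the error above says /dev/video0 is not an output device:
lsusb
v4l2-ctl --list-devices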

GStreamer Android client streaming UDP multicast: smooth on Android 4.1 but choppy on 4.0.4

I'm working with tutorial 3 from slomo's gst-sdk-tutorials for GStreamer 1.0.
I can play a streamed video smoothly on Android 4.1, but on 4.0 playback is not smooth enough. The following is the information retrieved by ffmpeg -i:
Input #0, mpegts, from 'udp://224.2.2.2:1234':
Duration: N/A, start: 2024.535278, bitrate: 128 kb/s
Program 259
Stream #0:0[0x109]: Video: mpeg2video (Main) ([2][0][0][0] / 0x0002), yuv420p(tv), 720x576 [SAR 16:15 DAR 4:3], max. 15000 kb/s, 25 fps, 25 tbr, 90k tbn, 50 tbc
Stream #0:1[0x10a]: Audio: mp2 ([4][0][0][0] / 0x0004), 48000 Hz, stereo, s16p, 128 kb/s
My pipeline is:
gst-launch-1.0 udpsrc port=1234 multicast-group=224.2.2.2 ! tsdemux ! mpegvideoparse ! mpeg2dec ! videorate ! video/x-raw,framerate=25/1 ! autovideosink sync=false
Can anyone help me find the right pipeline to play the video smoothly, plus audio?
Thanks.
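Not an answer from the original thread, just a hedged sketch of how the same pipeline could be extended with an audio branch. The MP2 decoder element name depends on what is installed (mpg123audiodec and avdec_mp2float are common candidates, check gst-inspect-1.0 | grep mp2), and sync=false is dropped because the sinks need to honour the clock for audio and video to stay aligned:
gst-launch-1.0 udpsrc port=1234 multicast-group=224.2.2.2 ! tsdemux name=d d. ! mpegvideoparse ! mpeg2dec ! queue ! videoconvert ! autovideosink d. ! mpegaudioparse ! mpg123audiodec ! queue ! audioconvert ! autoaudiosink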