I have built and successfully imported the FFmpeg library in Android Studio. How can I develop a program to stabilize a video using it?
You should use an FFmpeg build that includes the vid.stab library, for example:
https://github.com/tanersener/mobile-ffmpeg
You need to grant read/write storage permission, then execute two FFmpeg commands:
-y -i $VIDEO -vf vidstabdetect=shakiness=10:accuracy=15:result=${VIDEO}trfFile -f null -
-y -i $VIDEO -vf vidstabtransform=smoothing=30:input=${VIDEO}trfFile -c:v mpeg4 /storage/emulated/0/Android/output.mp4
where VIDEO is a constant holding your video's path.
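Put together, the two-pass flow looks like this as a plain shell sketch; the same command strings are what you would hand to mobile-ffmpeg's execute API on Android. The paths here are example values, not taken from the original answer:

```shell
# Build the two vid.stab pass command strings. Pass 1 analyses the shake
# and writes a transforms file; pass 2 reads that file and encodes the
# stabilized output. Paths are example values.
VIDEO=/storage/emulated/0/DCIM/input.mp4
TRF="${VIDEO}trfFile"
OUT=/storage/emulated/0/Android/output.mp4

PASS1="-y -i $VIDEO -vf vidstabdetect=shakiness=10:accuracy=15:result=$TRF -f null -"
PASS2="-y -i $VIDEO -vf vidstabtransform=smoothing=30:input=$TRF -c:v mpeg4 $OUT"

echo "$PASS1"
echo "$PASS2"
```

Note that pass 1 discards its video output (`-f null -`); its only product is the transforms file that pass 2 consumes via `input=`.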
It takes a long time because FFmpeg cannot use the phone's GPU; rendering 6 seconds of video takes about 2 minutes.
I want to extract the FPS value of my video using react-native. Please suggest the best available solution.
Thanks in advance
You can use FFmpeg with its low-level APIs.
Here is the documentation: FFmpeg Doc
And here is the FFmpeg package for react-native: npm FFmpeg
And here is the command to extract the FPS value from a video:
ffprobe -v 0 -of compact=p=0 -select_streams 0 -show_entries stream=r_frame_rate 'The Master (2022).mp4'
The result will be:
r_frame_rate=24000/1001
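ffprobe reports r_frame_rate as a fraction, so if your app needs a decimal FPS value you have to divide numerator by denominator yourself. A small shell sketch (in react-native the JavaScript equivalent is a one-line split on "/"):

```shell
# Convert ffprobe's fractional r_frame_rate (e.g. "24000/1001") into a
# decimal frames-per-second value.
rate="r_frame_rate=24000/1001"
fps=$(echo "$rate" | sed 's/.*=//' | awk -F/ '{ printf "%.3f", $1 / $2 }')
echo "$fps"   # 23.976
```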
I have a USB audio dongle connected to the USB port of a QNAP NAS. On the NAS there is a script called "radio" that streams internet radio through the USB audio dongle to a soundbar. The whole thing is controlled by a Raspberry Pi (running the Domoticz home automation system): the RPi sends SSH commands to the NAS to run the "radio" script.

Everything works fine as long as it's an HTTP MP3 stream; for those I use mpg123, which converts MP3 to WAV. For AAC streams I have to use FFmpeg to convert AAC to the WAV that aplay needs. Unfortunately, the set of commands available on the NAS is very limited, and I can only use FFmpeg and aplay.

If I run the "radio" script directly from the NAS console, everything works. However, when I run it remotely from the RPi, MP3 streams play correctly but AAC streams do not. Below is the command I am currently using in the "radio" script (after many attempts). When I run it from the NAS console it works; when I run it remotely over SSH from the RPi, both FFmpeg and aplay are launched but nothing is played on the NAS.
....
[ ! -e /dev/shm/pipe ] && $path_bin/mkfifo /dev/shm/pipe
....
ffmpeg -y -i "$url" -vn -acodec pcm_s16le -ar 44100 -f wav /dev/shm/pipe & $path_bin/aplay -D sysdefault:Device --file-type raw --format=cd /dev/shm/pipe
....
If I run the "radio" script from the NAS console, FFmpeg starts displaying its counter of received/transcoded kbits. When I call the script remotely, the counter does not appear in the RPi console, so FFmpeg probably starts but does not transcode the stream. Any idea what to do to get the radio stream playing properly?
EDIT-1
stderr output:
ffmpeg version 3.3.6 Copyright (c) 2000-2017 the FFmpeg developers
built with gcc 4.9.2 (Debian 4.9.2-10)
configuration: --enable-cross-compile --arch=i686 --target-os=linux --disable-yasm --disable-static --enable-shared --enable-gpl --enable-libmp3lame --disable-libx264 --enable-libsoxr --enable-version3 --enable-nonfree --enable-openssl --disable-decoder=ac3 --disable-decoder=ac3_fixed --disable-decoder=eac3 --disable-decoder=dca --disable-decoder=truehd --disable-encoder=ac3 --disable-encoder=ac3_fixed --disable-encoder=eac3 --disable-encoder=dca --disable-decoder=hevc --disable-decoder=hevc_cuvid --disable-encoder=hevc_nvenc --disable-encoder=nvenc_hevc --disable-decoder=h264 --disable-decoder=h264_cuvid --disable-encoder=libx264 --disable-encoder=libx264rgb --disable-encoder=h264_nvenc --disable-encoder=nvenc --disable-encoder=nvenc_h264 --disable-decoder=mpeg2video --disable-decoder=mpegvideo --disable-decoder=mpeg2_cuvid --disable-encoder=mpeg2video --disable-decoder=mpeg4 --disable-decoder=mpeg4_cuvid --disable-decoder=msmpeg4v1 --disable-decoder=msmpeg4v2 --disable-decoder=msmpeg4v3 --disable-encoder=mpeg4 --disable-encoder=msmpeg4v2 --disable-encoder=msmpeg4v3 --disable-decoder=mvc1 --disable-decoder=vc1 --disable-decoder=vc1_cuvid --disable-decoder=vc1image --disable-decoder=aac --disable-decoder=aac_fixed --disable-decoder=aac_latm --disable-encoder=aac --extra-ldflags='-L/root/daily_build/64_41/4.5.1/LinkFS/usr/lib -L/root/daily_build/64_41/4.5.1/Model/TS-X72/build/RootFS/usr/local/medialibrary/lib -Wl,--rpath -Wl,/usr/local/medialibrary/lib' --extra-cflags='-I/root/daily_build/64_41/4.5.1/LinkFS/usr/include -I/root/daily_build/64_41/4.5.1/Model/TS-X72/build/RootFS/usr/local/medialibrary/include -D_GNU_SOURCE -DQNAP' --prefix=/root/daily_build/64_41/4.5.1/Model/TS-X72/build/RootFS/usr/local/medialibrary
libavutil 55. 58.100 / 55. 58.100
libavcodec 57. 89.100 / 57. 89.100
libavformat 57. 71.100 / 57. 71.100
libavdevice 57. 6.100 / 57. 6.100
libavfilter 6. 82.100 / 6. 82.100
libswscale 4. 6.100 / 4. 6.100
libswresample 2. 7.100 / 2. 7.100
libpostproc 54. 5.100 / 54. 5.100
Finally I found a workaround. It's not an "elegant" solution, but it works.
I used the "expect" tool and created a script on the RPi that triggers the SSH streaming command on the NAS. Script content below.
#!/usr/bin/expect -f
set channel [lindex $argv 0]
set timeout 5
spawn /usr/bin/ssh -p669 -x -q -i /home/debian/.ssh/id_debian admin@192.168.0.7
expect "*# "
send -- "/share/homes/media/Pobrane/RADIO/radio $channel\r"
expect "*# "
send -- "exit\r"
expect eof
That's all.
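For reference, one other thing sometimes worth trying for this class of problem (an assumption here, not something verified on this NAS): when launched from a non-interactive SSH session, ffmpeg can stall while trying to read its interactive key commands from stdin. Its -nostdin flag rules that out without the expect wrapper:

```
# Variant of the pipeline from the question with ffmpeg detached from
# stdin; $url, $path_bin and the aplay device are as defined in the
# original "radio" script.
ffmpeg -nostdin -y -i "$url" -vn -acodec pcm_s16le -ar 44100 -f wav /dev/shm/pipe &
$path_bin/aplay -D sysdefault:Device --file-type raw --format=cd /dev/shm/pipe
```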
I have been looking for a solution for pocketsphinx for a long time. I have tried everything:
apt-get remove pulseaudio -y
aptitude purge pulseaudio -y
apt-get install bison -y
cd /usr/install
tar -xvf sphinxbase-0.8.tar.gz
cd sphinxbase-0.8
./configure
make
make install
tar -xvf pocketsphinx-0.8.tar.gz
cd pocketsphinx-0.8
./configure
make
make install
There is no /etc/modprobe.d/alsa-base.conf file, so I updated /usr/share/alsa/alsa.conf, uncommenting "load card-specific configuration files (on request)", and also /lib/modprobe.d/aliases.conf, commenting out "options snd-usb-audio index=-2".
I have tried everything from various resources. I can record with arecord -f cd -D plughw:1,0 -d 20 test.wav and play the resulting file back with aplay test.wav.
My sound cards are:
0 [ALSA ]: bcm2835 - bcm2835 ALSA
bcm2835 ALSA
1 [CAMERA ]: USB-Audio - USB2.0 PC CAMERA
ARKMICRO USB2.0 PC CAMERA at usb-3f980000.usb-1.2,high speed
Every supported library is downloaded (I don't know how many), but it is still not working.
I am using a Raspbian Jessie image.
It is not recommended to use pocketsphinx-0.8; pocketsphinx-5prealpha is much more accurate.
The audio device for recording is specified with the -adcdev option:
pocketsphinx_continuous -inmic yes -adcdev plughw:1,0
You can also configure ALSA to use plughw:1,0 as the default recording device; in that case you would not need -adcdev.
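If you want that default, an ~/.asoundrc along these lines should do it (a sketch, not a tested config; the card/device numbers match the listing in the question, where card 1 is the USB microphone and card 0 is the bcm2835):

```
# ~/.asoundrc -- use the USB mic for capture, the bcm2835 output for playback
pcm.!default {
    type asym
    capture.pcm "plughw:1,0"
    playback.pcm "plughw:0,0"
}
```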
I'm trying to save the live feed from an IP camera to a file but the resulting file always plays much faster than the original speed.
I have tried with the following commands:
ffmpeg -i http://171.22.3.47/image -vcodec copy -an -t 900 c:\output.mp4
ffmpeg -i http://171.22.3.47/image -c:v libx264 -an c:\output.mp4
Does anybody know what I'm missing? Both commands create the file and I can use Windows Media Player to play them, but they run much faster.
Try forcing the output framerate by adding the -r option:
ffmpeg -i http://171.22.3.47/image -c:v libx264 -an -r 30 c:\output.mp4
Alternatively, you can slow the resulting video down afterwards. This will make output.mp4 two times slower:
ffmpeg -i output.mp4 -filter:v "setpts=2.0*PTS" -c:v libx264 -an output-slow.mp4
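More generally, the setpts factor is the real-world capture duration divided by the file's current duration. A small sketch with hypothetical durations (the real duration of the file can be read with ffprobe -show_entries format=duration):

```shell
# Hypothetical numbers: the saved file runs 300 s but represents 900 s of
# real time, so it plays back 3x too fast.
recorded=300   # current duration of output.mp4 (seconds)
actual=900     # real-world capture time (seconds)
factor=$(awk -v r="$recorded" -v a="$actual" 'BEGIN { printf "%.1f", a / r }')
echo "$factor"   # 3.0
# Then: ffmpeg -i output.mp4 -filter:v "setpts=${factor}*PTS" -c:v libx264 -an output-slow.mp4
```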
I'm trying to get a simple local preview of my webcam from an FFmpeg UDP stream using an embedded MPlayer window. I can view the live stream with MPlayer, but the image is unstable. I'm using the following FFmpeg command:
ffmpeg -f dshow -video_size 640x480 -i video="Lenovo EasyCamera" -an -f rawvideo -pix_fmt yuyv422 -r 15 udp://127.0.0.1:1234
And this is the MPlayer command:
mplayer -demuxer rawvideo -rawvideo fps=15:w=640:h=480:format=yuy2 -nofs -noquiet -identify -idle -slave -nomouseinput -framedrop -wid 1051072
Sometimes the stream image is OK, but intermittently the image tears randomly and this is how it looks (sorry, not enough rep for images in posts)
http://imgur.com/sLC3FW0
I have tried with FFPlay to see if it's a problem with MPlayer, but I get the same result:
ffplay -s 640x480 -pix_fmt yuyv422 -f rawvideo -i udp://127.0.0.1:1234
http://imgur.com/06L42Cj
This effect happens at random. If I stop and restart, the video might be OK, or it may look like the above. Using anything other than UDP and rawvideo adds a delay to the video stream, which I want to avoid.
The FFmpeg streaming guide suggests methods for dealing with packet loss, but as far as I can tell I'm not experiencing any.
I'm new to FFmpeg/MPlayer/video streaming, and any help or thoughts are greatly appreciated.