Is there a way to discover all the available encodings of a certain webcam (e.g. video/x-raw-rgb, video/x-raw-yuv)?
Moreover, I would also like to discover the available resolutions.
Thanks!
Yes: set the v4l2src element to READY and check the caps on its src pad. The element narrows the list of caps down to the ones actually supported once it has opened and queried an actual device, which happens in the READY state.
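In code, a minimal sketch of that probing step (GStreamer 0.10-era API, matching the rest of this thread; run after gst_init(), and the device path is an example):

GstElement *src = gst_element_factory_make ("v4l2src", NULL);
GstPad *pad;
GstCaps *caps;
gchar *desc;

g_object_set (src, "device", "/dev/video0", NULL);
gst_element_set_state (src, GST_STATE_READY);  /* device is opened and probed here */

pad = gst_element_get_static_pad (src, "src");
caps = gst_pad_get_caps (pad);                 /* lists supported formats and resolutions */
desc = gst_caps_to_string (caps);
g_print ("%s\n", desc);

g_free (desc);
gst_caps_unref (caps);
gst_object_unref (pad);
gst_element_set_state (src, GST_STATE_NULL);
gst_object_unref (src);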
What I do is the following (command line):
GST_DEBUG=v4l2src:3 gst-launch v4l2src ! decodebin2 ! xvimagesink
If your video source is the onboard one, leave this as is; otherwise change "v4l2src" accordingly. This will show a LOT of info; after "probed caps:" there will be a long line of possible formats the video source supports.
Here is a sample copy/paste from my machine:
probed caps: video/x-raw-yuv, format=(fourcc)YUY2, width=(int)1280,
height=(int)720, interlaced=(boolean)false,
pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 10/1 };
video/x-raw-yuv, format=(fourcc)YUY2, width=(int)640, height=(int)480,
interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1,
framerate=(fraction){ 30/1 };
So the info you're looking for is:
! video/x-raw-yuv, framerate=30/1, width=640, height=480, interlaced=false !
Requesting anything NOT from the probed list will result in an error:
could not negotiate format
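For example, a full test pipeline using those caps might look like this (a sketch built from the same 0.10-style elements as above; adjust the caps to match your own probed list):

gst-launch v4l2src ! video/x-raw-yuv,framerate=30/1,width=640,height=480,interlaced=false ! ffmpegcolorspace ! xvimagesink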
I am trying to operate an NI USRP-2920 from GNU Radio.
First, I ran "uhd_find_devices".
The result is below.
----------------------------------------------
-- UHD Device 0
----------------------------------------------
Device Address:
serial: 3077BE9
addr: 192.168.10.4
name:
type: usrp2
Next, I ran "uhd_usrp_probe" in a terminal.
The result is below.
Error: RuntimeError: Please update the firmware and FPGA images for
your device. See the application notes for USRP2/N-Series for
instructions. Expected FPGA compatibility number 11, but got 10: The
FPGA build is not compatible with the host code build. Please run:
"/usr/local/lib/uhd/utils/uhd_images_downloader.py"
"/usr/local/bin/uhd_image_loader" \
--args="type=usrp2,addr=192.168.10.4"
Therefore, I ran uhd_images_downloader. But when I ran "uhd_usrp_probe" again, I got the same result (RuntimeError: update the firmware).
Could you give me any advice?
Thank you.
Thank you.
I ran "uhd_images_downloader" again after reinstalling Ubuntu.
As a result, I could connect to the USRP.
Thank you for the advice.
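For reference, the full recovery sequence the error message asks for is the following (note that the downloader only fetches the images; they must then be flashed to the device with the loader, using the exact commands printed in the error):

/usr/local/lib/uhd/utils/uhd_images_downloader.py
/usr/local/bin/uhd_image_loader --args="type=usrp2,addr=192.168.10.4"
uhd_usrp_probe

Power-cycle the USRP after flashing, before running the probe again.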
According to Kurento documentation: http://doc-kurento.readthedocs.io/en/stable/mastering/kurento_API.html
GstreamerFilter is a generic filter interface that allows using GStreamer filters in Kurento Media Pipelines.
I was trying to find GStreamer filters on Google, but all I found was GStreamer plugins (https://gstreamer.freedesktop.org/documentation/plugin-development/advanced/).
Does this mean I can use the Kurento Gstreamer filter, to add plugins such as rtph264depay and rtmpsink with it?
e.g.
WebRTC endpoint > RTP Endpoint > (rtph264depay) Gstreamer filter (rtmpsink) > RTMP server.
All without installing Gstreamer separately?
GStreamerFilter allows you to configure a filter using a native GStreamer filter (the same way as when you are using gst-launch-1.0). For example, the following Kurento filter allows you to flip your media horizontally within KMS:
GStreamerFilter filter = new GStreamerFilter.Builder(pipeline, "videoflip method=horizontal-flip").build();
That said, and regarding your question: to the best of my knowledge, yes, you can use GStreamerFilter to use rtph264depay and rtmpsink.
Boni Garcia's code is right.
But if you replace "videoflip method=horizontal-flip" with "rtmpsink location=rtmp://deque.me/live/test01", you will get an error message: "Given command is not valid, pad templates does not match".
You can dig deeper into the kms-filters source code at https://github.com/Kurento/kms-filters; in kms-filters/src/server/implementation/objects/GStreamerFilterImpl.cpp there are these lines:
throw KurentoException (MARSHALL_ERROR,
    "Given command is not valid, pad templates does not match");
I'm afraid you can't use GstreamerFilter to send data to an RTMP server; maybe you should modify the source code a bit.
Kurento
Just looking at the source: the GStreamerFilter is limited to simple GStreamer plugins. It rejects bins, and I don't see how you would specify/isolate multiple pads, so it probably won't do it.
(EDIT: Maybe I'm wrong here - I'm still learning. I see the mixer example isolating media types, and that makes me think it may be possible.)
gstreamer
On the other hand, installing GStreamer shouldn't really be that much overhead - then link the output RTP connection to a gst-launch pipeline that can output RTMP. It just sucks that you can't manage the full pipeline using Kurento.
(I don't know what that pipeline would look like - I'm investigating it myself. It's something like this:
gst-launch-1.5 -v \
udpsrc port=9999 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! mux. \
multifilesrc location=sample.aac loop=1 ! aacparse ! mux. \
mpegtsmux name=mux mux. ! rtpmp2tpay ! queue ! udpsink host=10.20.20.20 port=5000
But I'm faking audio in this and haven't gotten the full stream working)
back to kurento
Further exploration suggested maybe the Composite MediaElement would work (tl;dr: no):
Composite composite = new Composite.Builder(pipeline).build();
HubPort in_audio = new HubPort.Builder(composite).build();
HubPort in_video = new HubPort.Builder(composite).build();
HubPort out_composite = new HubPort.Builder(composite).build();
GStreamerFilter filter = new GStreamerFilter.Builder(pipeline, "rtmpsink location=rtmp://127.0.0.1/live/live_stream_720p").build();
webRtcEndpoint.connect(in_audio, MediaType.AUDIO);
webRtcEndpoint.connect(in_video, MediaType.VIDEO);
out_composite.connect(filter);
results in (kurento logs):
...15,011560 21495 [0x4f01700] debug KurentoWebSocketTransport WebSocketTransport.cpp:422 processMessage() Message: >{"id":28,"method":"create","params":{"type":"GStreamerFilter","constructorParams":{"mediaPipeline":"5751ec53_kurento.MediaPipeline","command":"rtmpsink location=rtmp://127.0.0.1/live/live_stream_720p"},"properties":{},"sessionId":"d8abb1d8"},"jsonrpc":"2.0"}<
...15,011862 21495 [0x4f01700] debug KurentoGStreamerFilterImpl GStreamerFilterImpl.cpp:47 GStreamerFilterImpl() Command rtmpsink location=rtmp://127.0.0.1/live/live_stream_720p
...15,015698 21495 [0x4f01700] error filterelement kmsfilterelement.c:148 kms_filter_element_set_filter() <kmsfilterelement0> Invalid factory "rtmpsink", unexpected pad templates
...15,016841 21495 [0x4f01700] debug KurentoWebSocketTransport WebSocketTransport.cpp:424 processMessage() Response: >{"error":{"code":40001,"data":{"type":"MARSHALL_ERROR"},"message":"Given command is not valid, pad templates does not match"},"id":28,"jsonrpc":"2.0"}
I.e. failure.
I run my experiments on a MacBook Pro with OS X 10.9.5 (graphics card: Intel HD Graphics 4000, 1024 MB) with VLC version 2.0.10 Twoflower (Intel 32-bit). I used to present videos (AVI and MP4 files, 60 frames per second) successfully with MovieStim up to version 1.80. After upgrading to version 1.81 by installing the standalone version, I tried to use MovieStim2, adapting the code in MovieStim2.py. When I run the code below:
from psychopy import visual, core
import time, os, pylab

os.chdir('/Users/till/work/edv/psychopy/test/')
win = visual.Window([1440, 900])
win.setRecordFrameIntervals(True)
mov = visual.MovieStim2(win, 'jwpIntro.mov',
                        size=[800, 800],
                        pos=[0, 100],
                        flipVert=False,
                        flipHoriz=False,
                        loop=False)
shouldflip = mov.play()
while mov.status != visual.FINISHED:
    if shouldflip:
        win.flip()
    else:
        time.sleep(0.001)
    shouldflip = mov.draw()
intervalsMS = pylab.array(win.frameIntervals[1:]) * 1000
m = pylab.mean(intervalsMS)
nTotal = len(intervalsMS)
nDropped = sum(intervalsMS > (1.5 * m))
print "nTotal", nTotal
print "nDropped", nDropped
core.quit()
the video is shown in full length, the output is
nTotal 142
nDropped 2
(Warnings deleted.) When I run the code with one of my own videos (MOV format, size adjusted to 800x800, generated with ffmpeg in H.264 format from 852 PNG files at 60 frames per second to show moving objects for a tracking task; no audio data), the window closes immediately after probably showing the first frame. The output is
nTotal 0
nDropped 0
/Applications/PsychoPy2.app/Contents/Resources/lib/python2.7/numpy/core/_methods.py:55: RuntimeWarning: Mean of empty slice.
warnings.warn("Mean of empty slice.", RuntimeWarning)
/Applications/PsychoPy2.app/Contents/Resources/lib/python2.7/numpy/core/_methods.py:67: RuntimeWarning: invalid value encountered in double_scalars
ret = ret.dtype.type(ret / rcount)
(Other warnings deleted.) Tests with AVI and MP4 files produced nTotal values of 1 to 2 (and accordingly no RuntimeWarnings), with the same overall result.
Any help would be appreciated, because so far I have not been able to return to PsychoPy 1.80 and use MovieStim with avbin 10 as before as a workaround (the window freezes, but PsychoPy does not crash).
Best,
Till
The issue likely has to do with your videos not having an audio track. Try setting the 'noAudio' kwarg to True when you create the MovieStim2:
visual.MovieStim2(win, 'jwpIntro.mov',
                  size=[800, 800],
                  pos=[0, 100],
                  noAudio=True,
                  flipVert=False,
                  flipHoriz=False,
                  loop=False)
MovieStim2 should really be able to auto-detect when there is no audio stream at all, so that should be changed when there is time. ;)
If the above does not work, can you post a link to one of your sample videos so I can download and debug?
Update: I tested my suggested workaround, only to discover that it uncovered some other issues. (Arrrg...) These issues are now fixed; however, this means that for this suggestion to work, you will need to update your psychopy package source from the psychopy GitHub master stream as of October 23rd, 2014, or use an official package update if one is available that was released after this date.
I have installed ARToolKit on Ubuntu 12.10 on a 64-bit Asus. The install gave no errors, so I think I'm OK. But when I try one of the examples, it can't find the camera. If I don't fill anything in at char *vconf = ""; I get
No video config string supplied, using defaults.
ioctl failed
The most commonly found solution suggests
char *vconf = "v4l2src device=/dev/video0 use-fixed-fps=false ! ffmpegcolorspace ! capsfilter caps=video/x-raw-rgb,width=640,height=480 ! identity name=artoolkit ! fakesink";
But this doesn't work for me. I get
r#r-K55VD:~/Downloads/Artoolkit-on-Ubuntu-12.04-master/bin$ ./simpleTest
Using supplied video config string [v4l2src device=/dev/video0 use-fixed-fps=false ! ffmpegcolorspace ! capsfilter caps=video/x-raw-rgb,width=640,height=480 ! identity name=artoolkit ! fakesink].
ARVideo may be configured using one or more of the following options,
separated by a space:
DEVICE CONTROLS:
-dev=filepath
specifies device file.
-channel=N
specifies source channel.
-noadjust
prevent adjusting the width/height/channel if not suitable.
-width=N
specifies expected width of image.
-height=N
specifies expected height of image.
-palette=[RGB|YUV420P]
specifies the camera palette (WARNING:all are not supported on each camera !!).
IMAGE CONTROLS (WARNING: every options are not supported by all camera !!):
-brightness=N
specifies brightness. (0.0 <-> 1.0)
-contrast=N
specifies contrast. (0.0 <-> 1.0)
-saturation=N
specifies saturation (color). (0.0 <-> 1.0) (for color camera only)
-hue=N
specifies hue. (0.0 <-> 1.0) (for color camera only)
-whiteness=N
specifies whiteness. (0.0 <-> 1.0) (REMARK: gamma for some drivers, otherwise for greyscale camera only)
-color=N
specifies saturation (color). (0.0 <-> 1.0) (REMARK: obsolete !! use saturation control)
OPTION CONTROLS:
-mode=[PAL|NTSC|SECAM]
specifies TV signal mode (for tv/capture card).
What is a methodical way of finding out exactly what to put in char *vconf = " "? I feel I have tried a lot of variations at random, but nothing works. I know it needs a path like /dev/video0, but everything else seems up in the air to me.
char *vconf = "v4l2src device=/dev/video0 use-fixed-fps=false ! ffmpegcolorspace ! capsfilter caps=video/x-raw-rgb,width=640,height=480 ! identity name=artoolkit ! fakesink";
The configuration you tried above is for the GStreamer video driver.
Since you are using the VideoLinuxV4L driver, use this instead:
char *vconf = "-dev=/dev/video0 ";
For more, refer to "{ARToolKit folder}/doc/video/index.html".
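For context, a minimal sketch of where that string ends up in the classic ARToolKit video API (the extra options are illustrative examples taken from the help text above):

#include <AR/video.h>

int main(void)
{
    /* VideoLinuxV4L driver: plain "-key=value" options, no GStreamer syntax */
    char *vconf = "-dev=/dev/video0 -width=640 -height=480";

    if (arVideoOpen(vconf) < 0) {
        return 1;                /* device could not be opened; adjust vconf */
    }
    arVideoCapStart();           /* start capturing */
    /* ... grab frames with arVideoGetImage() in the main loop ... */
    arVideoCapStop();
    arVideoClose();
    return 0;
}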
I'm trying to stream H.264 video over the network using GStreamer (on Windows) over UDP.
First, if I use a pipeline like this, everything appears to be OK, and I see the test pattern:
videotestsrc, ffmpegcolorspace, x264enc, rtph264pay, rtph264depay, ffdec_h264, ffmpegcolorspace, autovideosink
Now I decided to divide this pipeline into client and server parts, transmitting the stream over UDP using udpsink and udpsrc.
Server: videotestsrc, ffmpegcolorspace, x264enc, rtph264pay, udpsink
Client: udpsrc, rtph264depay, ffdec_h264, ffmpegcolorspace, autovideosink
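(For reference, rough gst-launch equivalents of those two pipelines, using the same 0.10-style element names, would be something like the following; the udpsrc caps are the detail discussed in the answer at the end:

Server: gst-launch videotestsrc ! ffmpegcolorspace ! x264enc ! rtph264pay ! udpsink host=127.0.0.1 port=5555
Client: gst-launch udpsrc port=5555 caps="application/x-rtp" ! rtph264depay ! ffdec_h264 ! ffmpegcolorspace ! autovideosink)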
On the server I use something like this:
source = gst_element_factory_make ("videotestsrc", "source");
ffmpegcolortoYUV = gst_element_factory_make ("ffmpegcolorspace", "ffmpegcolortoYUV");
encoder = gst_element_factory_make ("x264enc", "encoder");
rtppay = gst_element_factory_make ("rtph264pay", "rtppay");
udpsink = gst_element_factory_make ("udpsink", "sink");
g_object_set (source, "pattern", 0, NULL);
g_object_set( udpsink, "host", "127.0.0.1", NULL );
g_object_set( udpsink, "port", 5555, NULL );
Then I add the elements to the pipeline and run it; there are no errors anywhere (roughly the step sketched below).
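A sketch of what that add/link/run step typically looks like (assuming a pipeline created with gst_pipeline_new, and the variable names from the snippet above):

gst_bin_add_many (GST_BIN (pipeline), source, ffmpegcolortoYUV, encoder, rtppay, udpsink, NULL);
gst_element_link_many (source, ffmpegcolortoYUV, encoder, rtppay, udpsink, NULL);
gst_element_set_state (pipeline, GST_STATE_PLAYING);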
Now if I look at UDP port 5555, nothing is listening!
The client part also runs, but if there is no UDP port listening on the server side, it won't work.
EDIT: In fact, I was very close to the solution... If I start the client, it works, but with some problems in the visualization... I think the problem is the x264enc configuration. Does anybody know how to change x264enc parameters like speed-preset or tune?
I tried to instantiate GstX264EncPreset or GstX264EncTune, but I don't have the declarations of these structures.
Does anybody know another way to set up x264enc, like by parsing a string or something like that?
I know this is an older post, but you can set the GstX264EncPreset value using a simple integer that corresponds to the preset value.
g_object_set(encoder, "speed-preset", 2, NULL); works for me. The values can be found using gst-inspect-1.0 x264enc and are as follows:
speed-preset : Preset name for speed/quality tradeoff options (can affect decode compatibility - impose restrictions separately for your target decoder)
flags: readable, writable
Enum "GstX264EncPreset" Default: 6, "medium"
(0): None - No preset
(1): ultrafast - ultrafast
(2): superfast - superfast
(3): veryfast - veryfast
(4): faster - faster
(5): fast - fast
(6): medium - medium
(7): slow - slow
(8): slower - slower
(9): veryslow - veryslow
(10): placebo - placebo
Try setting the caps on the udpsrc element to "application/x-rtp".
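In code, a minimal sketch of that suggestion (the variable name udpsrc is illustrative; the extra fields beyond "application/x-rtp" are the ones rtph264depay typically needs to negotiate):

GstCaps *caps = gst_caps_new_simple ("application/x-rtp",
    "media", G_TYPE_STRING, "video",
    "clock-rate", G_TYPE_INT, 90000,
    "encoding-name", G_TYPE_STRING, "H264",
    NULL);
g_object_set (udpsrc, "caps", caps, NULL);
gst_caps_unref (caps);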