I'm trying to link Kinect with the PCL library, and following this tutorial, but the grabber solution that I have built just crashes, stating that abort() was called. What exactly caused the problem and is there a workaround for this?
This is my grabber: https://github.com/giacomodabisias/libfreenect2pclgrabber
I am integrating it into PCL here: https://github.com/PointCloudLibrary/pcl/pull/1410
You can find examples for a single Kinect v2 and for multiple Kinect v2s in the repo.
I believe Kinect v2.0 is supported by PCL's grabber class. You can get it from the web.
http://download.csdn.net/detail/xx__hu/8571209
I have been unable to find out how I can set the Kinect depth mode to near (40-300mm) using OpenNI. I am programming in C++ and would like to keep using the OpenNI libraries for retrieving the depth data, but I cannot find any property or setting that I should use for changing modes.
I wanted to do the same thing. Some searching turned up the following:
libfreenect, which comes from OpenKinect.
A fork of it with the branch "nearMode" provides the capability to work in NEAR_MODE. This is a compile-time option only. I plan to compile and use it.
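Whichever driver you end up with, you will usually still want to band-pass the raw depth frame to your working range (e.g. 40-300mm once near mode is active), since the sensor reports plenty of values outside it. A minimal sketch of that filtering step, in Python for illustration (the frame is just a flat list of millimetre values here; with real OpenNI buffers you would apply the same test per pixel):

```python
def mask_depth_band(depth_mm, near=40, far=300):
    """Zero out depth pixels outside [near, far] millimetres.

    0 is kept as "no reading", matching the usual Kinect convention.
    """
    return [d if near <= d <= far else 0 for d in depth_mm]

frame = [0, 35, 40, 150, 300, 301, 1200]
print(mask_depth_band(frame))  # → [0, 0, 40, 150, 300, 0, 0]
```

The same one-line test ports directly to a C++ loop over the OpenNI depth buffer.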
I'm exploring how to connect to GPS on Windows Mobile using VB.NET (Compact Framework).
Most of the results I found use the GPS Intermediate Driver, but this does not seem to solve my problem (or perhaps I just don't understand it?).
Besides, I found that there is something like gpsapi.dll which I think I can add as a reference. But where can I get this?
I've also read the article on using the managed GPS sample, but it seems to be only for C?
Please guide me / correct me if I'm wrong.
It looks like it is for C#. And the download for the Windows Mobile 6.5.3 Developer Tool Kit
is here: http://www.microsoft.com/en-au/download/details.aspx?id=5389
Read the steps in the linked article carefully:
Find the samples folder, compile the sample, grab the generated DLL, import it into your solution as a reference, add the Imports directive, and you should be there.
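If the managed wrapper gives you trouble, note that the GPS Intermediate Driver can also expose the receiver as a virtual COM port emitting raw NMEA sentences, which you can parse yourself. A minimal sketch of decoding a $GPGGA fix (Python here just to show the arithmetic; the same string handling ports directly to VB.NET):

```python
def parse_gpgga(sentence):
    """Parse a $GPGGA NMEA sentence into (lat, lon) decimal degrees.

    Returns None when the fix-quality field reports no fix yet.
    """
    fields = sentence.split(",")
    if not fields[0].endswith("GPGGA"):
        raise ValueError("not a GGA sentence")
    if fields[6] == "0":  # fix quality: 0 = no fix
        return None

    def dm_to_deg(value, hemisphere):
        # NMEA packs degrees and minutes together: ddmm.mmmm
        dot = value.index(".")
        degrees = float(value[:dot - 2])
        minutes = float(value[dot - 2:])
        deg = degrees + minutes / 60.0
        return -deg if hemisphere in ("S", "W") else deg

    lat = dm_to_deg(fields[2], fields[3])
    lon = dm_to_deg(fields[4], fields[5])
    return lat, lon

# classic example sentence from the NMEA 0183 documentation
print(parse_gpgga("$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"))
```

This prints roughly (48.1173, 11.5167). It skips checksum validation for brevity; a real reader should verify the *XX checksum before trusting the fields.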
Since I haven't got any response on the Unity3d or Evernote forums, I'll try it here.
Over the last year I have worked a lot with Unity3D, mostly because of its good integration with the Vuforia Augmented Reality library and the fact that publishing for multiple platforms is a piece of cake.
Now I want to show notes in an AR setting and am looking at the Evernote API for this. I couldn't find anything about using it with Unity, though I can see why this is not the most common combination.
My question is: do you think I can access the Evernote API through Unity? If so, how should I do this? Or would it be wiser for this purpose to build (parts of) the application with Eclipse/Xcode?
Hope to hear from you!
Link to Evernote API: http://dev.evernote.com/doc/
The Evernote API has a C# SDK which you should be able to call through Unity. In terms of how to do it, you will probably need to download the SDK and follow the instructions yourself. Their github seems like a good starting point.
One thing to note is that Unity's .NET libraries for mobile clients are quite limited, and with the web player you will need to deal with sandbox security issues. But start with a standalone build first and see how you go.
I know this has been all over the net. I browsed a lot and found plenty of information about gesture recognition. However, I got more confused after reading it: most approaches seem to use Microsoft's official Kinect SDK, and I don't want to use that.
I need to develop for Linux and was wondering if OpenNI and NITE give me the flexibility to recognize a gesture.
I am trying to develop a sign-language program which recognizes gestures and draws stuff (spheres, cubes, etc.) on screen.
Could anyone give me clear guidelines to begin this project? I have no clue where to begin.
Any help is welcome.
Thanks
For getting started with understanding gestures, I suggest checking out a blog post I wrote a while back:
http://www.exceptontuesdays.com/gestures-with-microsoft-kinect-for-windows-sdk-v1-5/
This, plus the post it links to, goes into what makes up a "gesture" and how to implement gestures using the Kinect for Windows SDK.
You can also check out the Kinect Toolbox, which likewise does gestures for the official Kinect SDK:
http://kinecttoolbox.codeplex.com
These will give you a good understanding of how to deal with gestures. You can then move those concepts and ideas into an OpenNI environment.
If your sign language system uses hand gestures, have a look at Kinect 3D handtracking. If it includes whole-body movement, have a look at KineticSpace. Both tools work on Linux, but the first also requires a CUDA-enabled GPU.
I think gesture recognition does not depend on whether you use the official Kinect SDK or OpenNI: if you can get the skeleton data or the depth image from the Kinect, you can extract the gesture or posture from the relations between skeleton joints over time. As far as I know, all of those SDKs provide that information.
I developed for Kinect on Windows, but the principles are the same. I still suggest learning the principles of gesture recognition first; then you can find a way to do the recognition with other SDKs.
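To make the "skeleton joints over time" idea concrete, here is a toy swipe classifier in Python. The input is hypothetical: just the tracked hand's x coordinate per frame, in metres; with OpenNI or the official SDK you would feed in the hand joint from each skeleton frame instead:

```python
def detect_swipe(hand_x, min_travel=0.3):
    """Classify a horizontal swipe from a sequence of hand x positions.

    Requires roughly monotonic motion and a minimum total travel
    (metres). Returns "right", "left", or None.
    """
    if len(hand_x) < 2:
        return None
    deltas = [b - a for a, b in zip(hand_x, hand_x[1:])]
    travel = hand_x[-1] - hand_x[0]
    if all(d >= 0 for d in deltas) and travel >= min_travel:
        return "right"
    if all(d <= 0 for d in deltas) and -travel >= min_travel:
        return "left"
    return None

print(detect_swipe([0.10, 0.18, 0.27, 0.35, 0.45]))  # → right
print(detect_swipe([0.45, 0.30, 0.12]))              # → left
print(detect_swipe([0.10, 0.30, 0.20]))              # → None (not monotonic)
```

Real systems replace these hand-written rules with matching against recorded templates (as the Kinect Toolbox does) or with a trained classifier, but the input — joint positions sampled over time — stays the same whichever SDK supplies it.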
Does anyone know how to integrate the PlayStation Eye with LabVIEW? Can a driver somehow be used to allow LabVIEW to recognize it as a webcam?
You should be able to do this with Vision (install IMAQdx and the Vision Development Module); the camera seems to be DirectShow-based, which IMAQdx can handle. Or try out the code found on this page: http://www.labviewforum.de/thread-21279.html - it uses the original DLLs.
As there are no official DLLs for the PS3 Eye on Windows, the only options are to use the third-party drivers from Code Laboratories or to interface the hardware directly via raw USB commands. Code Laboratories' PS3 Eye implementation, however, does not seem to be 100% conformant with the DirectShow standard. You can get a PS3 Eye to work with LabVIEW (via DirectShow and IMAQ), but you will be limited in the usable frame rates.
I tried to interface the DLL from Code Laboratories directly, but got stuck on a strange error with the second function I tried (see the already-referenced thread http://www.labviewforum.de/thread-21279.html). However, there now seems to be a VI Package available for the PS3 Eye that supports LabVIEW under OS X at the full available frame rate. More information can be found here:
http://labview.epfl.ch/
Hope this helps.
Best regards,
Jan