How to match gestures with Kinect v2?

I am planning to develop a web app that recognizes user hand gestures (thumbs up and thumbs down) and performs basic functions (like updating a value). How can I accomplish this with Kinect v2?

Do you know about Visual Gesture Builder?
Visual Gesture Builder (VGB) generates data that applications use to perform gesture detection at run time.
Basically, you can use the data it generates to recognize gestures such as swipe up, wave, hands up, etc.
Steps:
1. Record a clip in Kinect Studio v2 of the pose you want to detect.
2. Open the clip in Visual Gesture Builder and train it (tag the frames where the pose is correct).
3. Build the .vgbsln in Visual Gesture Builder to produce a .gbd file (this is the file you import into your project and that GestureDetector.cs reads).
4. Code your own logic in GestureResultView.cs for what happens when the user's pose matches; a minimal sketch follows below.
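To give a rough idea of steps 3 and 4, here is a minimal C# sketch in the style of the SDK sample's GestureDetector.cs. The gesture names ("ThumbUp", "ThumbDown"), the database path and the 0.7 confidence threshold are placeholders for whatever you actually trained, so treat it as an outline rather than drop-in code:

    using System;
    using Microsoft.Kinect;
    using Microsoft.Kinect.VisualGestureBuilder;

    public class ThumbGestureDetector : IDisposable
    {
        private readonly VisualGestureBuilderFrameSource gestureSource;
        private readonly VisualGestureBuilderFrameReader gestureReader;

        public ThumbGestureDetector(KinectSensor sensor, string databasePath)
        {
            // Load the .gbd file produced by Visual Gesture Builder and register
            // every gesture it contains (e.g. "ThumbUp" and "ThumbDown").
            gestureSource = new VisualGestureBuilderFrameSource(sensor, 0);
            using (var database = new VisualGestureBuilderDatabase(databasePath))
            {
                foreach (var gesture in database.AvailableGestures)
                {
                    gestureSource.AddGesture(gesture);
                }
            }

            gestureReader = gestureSource.OpenReader();
            gestureReader.IsPaused = true;
            gestureReader.FrameArrived += OnGestureFrameArrived;
        }

        // Set this from your BodyFrameReader so the detector knows which tracked
        // body to evaluate; a tracking id of 0 keeps the reader paused.
        public ulong TrackingId
        {
            get { return gestureSource.TrackingId; }
            set
            {
                gestureSource.TrackingId = value;
                gestureReader.IsPaused = value == 0;
            }
        }

        private void OnGestureFrameArrived(object sender, VisualGestureBuilderFrameArrivedEventArgs e)
        {
            using (var frame = e.FrameReference.AcquireFrame())
            {
                if (frame == null || frame.DiscreteGestureResults == null)
                {
                    return;
                }

                foreach (var gesture in gestureSource.Gestures)
                {
                    DiscreteGestureResult result;
                    if (frame.DiscreteGestureResults.TryGetValue(gesture, out result)
                        && result.Detected && result.Confidence > 0.7f)
                    {
                        // Your own logic goes here, e.g. increment or decrement a
                        // value when "ThumbUp" or "ThumbDown" is detected.
                        Console.WriteLine("{0} detected ({1:P0})", gesture.Name, result.Confidence);
                    }
                }
            }
        }

        public void Dispose()
        {
            gestureReader.Dispose();
            gestureSource.Dispose();
        }
    }

For the web app part of your question, a common pattern is to run a small desktop or server process like this next to the sensor and push detected gestures to the browser (for example over WebSockets), since the Kinect SDK itself does not run in the browser.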
Here is a video that will help get you started building applications with Kinect: This video
It walks you through the process I mentioned above; really do replay it again and again, as that is what I did.
I have a WPF application that can detect several gestures such as swipe up, hands up, swipe left, etc. You can do your own testing by pulling my project from GitHub here.
However, I do not have experience implementing these gestures in a web application yet, but you can start by learning how to work with the Kinect first :D
Do let me know if you find my help useful and select this as the answer to your problems XP

Related

How to add focus, brightness and zoom controls to React Native camera?

I used react-native-camera in my iOS app and am now trying to add focus, brightness and zoom controls to it. So far I have been unable to come up with a solution. Any idea how to do this?
I tried to find an option in different React Native camera packages and also posted in their Git repos for help. Finally I tried this post: https://medium.com/react-native-development/react-native-camera-app-with-live-preview-saturation-and-brightness-filters-d34535cc6d14 where they take a photo from the camera every 5 milliseconds and adjust its brightness, which seems very unstable and makes the app crash.
It is not possible to use the focus and zoom functionality with react-native-camera.
Unfortunately, the focus API has many bugs, and the zoom functionality will not render fast enough with JavaScript.
One solution is to not use react-native-camera and instead write an intent to open the default camera application (a rough sketch follows below).
The following app uses this solution, and all the camera functionality works perfectly.
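For what it's worth, the "intent" approach is an Android concept. As a rough illustration only (written in C# to match the Xamarin/Mono for Android material further down this page, rather than React Native), handing capture off to the default camera app looks roughly like this; the activity name and request code are made up:

    using Android.App;
    using Android.Content;
    using Android.Provider;

    [Activity(Label = "CameraLauncher")]
    public class CameraLauncherActivity : Activity
    {
        // Arbitrary request code used to match the result callback below.
        const int TakePhotoRequestCode = 1;

        // Hand the capture off to the device's default camera app instead of
        // rendering a camera preview inside this app.
        public void LaunchDefaultCamera()
        {
            var intent = new Intent(MediaStore.ActionImageCapture);
            StartActivityForResult(intent, TakePhotoRequestCode);
        }

        protected override void OnActivityResult(int requestCode, Result resultCode, Intent data)
        {
            base.OnActivityResult(requestCode, resultCode, data);
            if (requestCode == TakePhotoRequestCode && resultCode == Result.Ok)
            {
                // A thumbnail of the captured photo (if provided) arrives in
                // data.Extras under the "data" key; a full-size image requires
                // passing an output URI with the intent.
            }
        }
    }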
Could they re-open the issue, as it does not seem to be solved?
Developers may need to review all the open issues to estimate their project deadlines.

Embedded camera in Xamarin Forms

I have a client who adamantly insists on a solution with an embedded camera, i.e. a ContentPage with a camera stream and custom buttons and icons, similar to https://github.com/pierceboggan/Moments, or at least that is my understanding, since it is a Snapchat clone. My client also wants swipe capabilities similar to how navigation works in Snapchat.
However, as far as I can tell most of what is utilized in that solution has been deprecated.
I have suggested using the Media Plugin https://github.com/jamesmontemagno/MediaPlugin but they're not satisfied with the camera page being pushed onto the navigation stack.
I've looked into implementing it natively and using dependency injection, but it appears to be an overwhelming amount of work just to implement the most basic functions, particularly for Android's Camera2.
I'm hoping someone can provide good news of an easier alternative, or an alteration to either Moments or Media Plugin (or anything similar) that will meet the requirements, or tell me whether my only option really is this time-consuming and complex.
From the code of Moments, you can do what you want to achieve; I did this for iOS.
You will have to create a custom renderer to display the camera page. You will be able to add buttons on top of it.
You could try this example, which uses a custom renderer to add a take-photo button and a switch-camera button on the camera view, and which works on both iOS and Android.
The example is split into a main page and a camera view page with custom buttons; a rough sketch of the renderer side follows below.
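To make the custom renderer idea concrete, here is a heavily trimmed C# sketch of roughly what the iOS side might look like, using AVFoundation to put a live preview behind a custom button. CameraPage is assumed to be an empty ContentPage in the shared project, and the capture button is left as a stub:

    using AVFoundation;
    using CoreGraphics;
    using UIKit;
    using Xamarin.Forms;
    using Xamarin.Forms.Platform.iOS;

    [assembly: ExportRenderer(typeof(CameraPage), typeof(CameraPageRenderer))]

    // Empty page in the shared project; the renderer below supplies the UI on iOS.
    public class CameraPage : ContentPage
    {
    }

    public class CameraPageRenderer : PageRenderer
    {
        AVCaptureSession captureSession;

        public override void ViewDidLoad()
        {
            base.ViewDidLoad();

            // Live camera preview filling the whole page.
            captureSession = new AVCaptureSession();
            var device = AVCaptureDevice.GetDefaultDevice(AVMediaTypes.Video);
            captureSession.AddInput(AVCaptureDeviceInput.FromDevice(device));

            var previewLayer = new AVCaptureVideoPreviewLayer(captureSession)
            {
                Frame = View.Bounds
            };
            View.Layer.AddSublayer(previewLayer);

            // Custom button drawn on top of the preview, Snapchat-style.
            var captureButton = UIButton.FromType(UIButtonType.System);
            captureButton.SetTitle("Capture", UIControlState.Normal);
            captureButton.Frame = new CGRect(View.Bounds.Width / 2 - 40, View.Bounds.Height - 80, 80, 44);
            captureButton.TouchUpInside += (s, e) =>
            {
                // Hook a photo output (e.g. AVCapturePhotoOutput) up here.
            };
            View.AddSubview(captureButton);

            captureSession.StartRunning();
        }
    }

In a real app you would also need the camera usage description in Info.plist, an equivalent renderer for Android, and proper teardown of the capture session when the page disappears.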

Is it possible to programmatically generate touchpad gesture events?

I'm working on an OS X project and I want to programmatically generate touchpad gesture events like NSEventTypeSwipe or NSEventTypeRotate,
so I can rotate/zoom etc. in other applications.
I found out how to generate mouse/keyboard events but not touchpad.
Any idea?
There is no public API for generating those events.
You can find some work on synthesizing those events in this project: calftrail/Touch.
Reference: Cocoa Event Handling Guide
The above Mac Developer Library guide does not state any known methods to programmatically generate Touchpad Gestures.
It goes so far as to say the touchpad gestures themselves occur outside of the OS:
"The trackpad hardware includes built-in support for interpreting common gestures and for mapping movements..."
That guide also explicitly mentions that apps should not rely on gestures as the sole input mechanism, and that it's best to include keyboard and mouse support for that reason.
Now that Mac and Windows (i.e., Windows 8) support touchscreen monitors at the OS level, it is only a matter of time before programmatic touchpad and touchscreen gestures can be incorporated into services like your project or remote desktop control, once the appropriate API becomes available.
I think touch gesture events cannot be easily generated, as there is no public, official Apple API. NSEventTypeMagnify, NSEventTypeRotate and NSEventTypeSwipe are, I think, only for read-only purposes while handling existing system events. Maybe Apple for some reason doesn't want third-party developers to turn the Magic Mouse into a magic touchpad. The project rob mayoff mentioned is currently not working, as Apple has probably changed something in the structure of the event data, so relying on such hacking isn't future-proof.
But if you think about it a little more, you can achieve what the touch events would do by other means:
magnification (pinch gesture) -> it's just zoom in / zoom out -> most programs use a keyboard shortcut for this, such as Cmd and + or Cmd and -.
rotation -> mostly useful with photos, and there are shortcuts such as Cmd and L, Cmd and R in the Preview app.
swiping -> changing Spaces (desktops) -> use Ctrl and the left or right arrow key.
Looking at the Xcode simulator, I understand that touch events can be generated programmatically, and that the simulator uses such routines to translate cursor movements into touch events; however, based on what rob mayoff said, it seems that Apple did not make that library public. Emulating the same behaviour and creating those functions from scratch would be a bit challenging.
This is the NSTouch class reference:
http://developer.apple.com/library/mac/#documentation/AppKit/Reference/NSTouch_Class/Reference/Reference.html#//apple_ref/occ/cl/NSTouch

How do I position controls in Mono for Android?

Hello, I am new to Mono for Android. I am trying to make a calculator. In a normal Windows Forms application I can drag a button or textbox to any position I want, but how does that work in Mono for Android? I want the buttons next to each other, not only stacked downwards, and I don't want buttons placed under each other that run off the screen either.
I am not a native English speaker; I hope you will understand.
Please help.
Android "supports", but has deprecated and doesn't endorse, pixel-perfect layout. Unfortunately the Windows Forms-style of dragging and dropping controls onto a design surface at specific pixel locations requires pixel perfect layout, so you can see the mismatch here.
For a Calculator, what you would instead want to do use a Table Layout or some other "resizable" container, so that your Activity can support the variety of device sizes that Android covers.
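For example, something along these lines builds a calculator-style keypad entirely in code, with a TableLayout that stretches its columns to fit any screen. The activity name and button labels are made up, and in practice you would more likely declare the layout in an .axml file:

    using Android.App;
    using Android.OS;
    using Android.Widget;

    [Activity(Label = "Calculator", MainLauncher = true)]
    public class CalculatorActivity : Activity
    {
        protected override void OnCreate(Bundle savedInstanceState)
        {
            base.OnCreate(savedInstanceState);

            // A TableLayout resizes with the screen, unlike pixel-perfect positioning.
            var table = new TableLayout(this) { StretchAllColumns = true };

            string[][] rows =
            {
                new[] { "7", "8", "9", "/" },
                new[] { "4", "5", "6", "*" },
                new[] { "1", "2", "3", "-" },
                new[] { "0", ".", "=", "+" }
            };

            foreach (var labels in rows)
            {
                var row = new TableRow(this);
                foreach (var label in labels)
                {
                    // Buttons sit next to each other within the row instead of stacking vertically.
                    row.AddView(new Button(this) { Text = label });
                }
                table.AddView(row);
            }

            SetContentView(table);
        }
    }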

Porting an iPhone OpenGL ES app to OS X?

Developing for the iPhone has been my first experience with Objective-C and my first in-depth experience with Xcode. How difficult would it be to port an OpenGL ES iPhone app to the OS X desktop using OpenGL? I am not asking about the user interface - obviously there is no equivalent to the Cocoa Touch UI on the desktop. I am asking specifically about the app delegate and the OpenGL ES layers. Are there any major hurdles? Is it as straightforward as simply creating a new app delegate in a Cocoa project?
I've started looking into just the same thing, and it appears that OpenGL-heavy applications would be among the easiest to backport to the Mac. Pretty much everything in OpenGL ES is present in OpenGL on the desktop (with the exception of some of the fixed-point stuff), so that code can stay the same.
The way that OpenGL is handled on the iPhone is via a Core Animation layer (CAEAGLLayer), rather than a specific view. Therefore, you should be able to transfer that across to a Leopard-based desktop application, although you'll need to convert all references to EAGL classes to their OpenGL equivalent (EAGLContext to NSOpenGLContext, for example). You could render into a CAOpenGLLayer that's displayed by itself, or use that layer to back a custom NSView.
The fundamental structure of a desktop Cocoa application will be different than a Cocoa Touch one, but you should be able to start from one of the Xcode templates and add back in your components from the Cocoa Touch application.
Again, I haven't yet done this for my application, but it looks reasonably straightforward.