Use Webcam to Track User's Finger Location - webcam

I have a project just starting up that requires the kind of expertise I have none of (yet!). Basically, I want to be able to use the user's webcam to track the position of their index finger, and make a particular graphic follow their finger around, including scaling and rotating (side to side of course, not up and down).
As I said, this requires the kind of expertise I have very little of: my experience consists mostly of PHP and some JavaScript. Obviously I'm not expecting anyone to solve this for me, but if anyone could point me to a library or piece of software that could get me started, I'd appreciate it.
Cross-platform compatibility is of course always preferred, but not always possible.
Thanks a lot!

I suggest you start by reading about OpenCV.
OpenCV (Open Source Computer Vision) is a library of programming functions mainly aimed at real-time computer vision, originally developed by Intel's research center in Nizhny Novgorod, Russia, and now supported by Willow Garage and Itseez. It is free for use under the open-source BSD license. The library is cross-platform and focuses mainly on real-time image processing.
But first, since PHP and JavaScript are mainly used for web development, you should pick up a programming language that OpenCV supports, such as C, C++, Java, Python, or even C# (using EmguCV).
Also, here are some nice tutorials to get you started with hand gesture recognition using OpenCV: Link
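If you end up using OpenCV's Python bindings, here is a rough sketch of one common approach (not taken from the tutorials linked above): segment the hand by skin colour, take the largest contour, and treat the topmost point of its convex hull as the fingertip. The HSV range and area threshold are guesses you would need to tune for your lighting.

```python
# Minimal sketch (Python, OpenCV 4): crude fingertip tracking via skin-colour
# segmentation. The HSV bounds and area threshold are placeholder values.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)  # default webcam

while True:
    ok, frame = cap.read()
    if not ok:
        break

    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Rough skin-colour range; adjust for your lighting and skin tone.
    mask = cv2.inRange(hsv, np.array([0, 40, 60]), np.array([25, 180, 255]))
    mask = cv2.medianBlur(mask, 7)

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        hand = max(contours, key=cv2.contourArea)
        if cv2.contourArea(hand) > 3000:  # ignore small noise blobs
            hull = cv2.convexHull(hand)
            # The topmost hull point is a crude stand-in for the index fingertip;
            # this is where you would position (and scale/rotate) your graphic.
            top = hull[hull[:, :, 1].argmin()][0]
            cv2.circle(frame, (int(top[0]), int(top[1])), 8, (0, 255, 0), -1)

    cv2.imshow("finger tracking sketch", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```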
Good luck!

Related

Controlling Nikon camera with MTP

I was wondering how I would be able to get started with controlling my Nikon DSLR camera. I have been reading about the Nikon SDK and MTP/PTP and am really confused about how to start writing a script to control it. Thanks for helping me.
If you just want to script things, libgphoto2 and gphoto2 under Linux are a good start.
You can use them under Windows too; I'm not sure whether pre-compiled builds are available, but that route also requires installing the USB wrapper libraries, which is a touch fiddly.
The next step above that is to compile libgphoto2 under Cygwin (there are some good guides on how to do this on the web), but that is probably overkill.
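For simple scripting, gphoto2 is normally driven from its command line, so one option (a sketch, assuming gphoto2 is installed and the camera is recognised over USB) is to call it from a small Python wrapper and build your logic around that:

```python
# Sketch: trigger a capture and download the image via the gphoto2 CLI.
# Assumes gphoto2 is installed and the Nikon is connected over USB.
import subprocess

def capture(filename="capture.jpg"):
    # --capture-image-and-download takes a shot and saves it to `filename`.
    subprocess.run(
        ["gphoto2", "--capture-image-and-download", "--filename", filename],
        check=True,
    )

if __name__ == "__main__":
    capture()
```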
I am currently using digiCamControl on Windows; for Nikon and C# code it's really nice to use and very fast, plus it has no hassle on the USB front. It wouldn't be too hard to write a small C# program that does what you want and then run it from scripts.
This is what you are looking for:
http://sourceforge.net/projects/nikoncswrapper/
Good luck
In case anybody is still looking at this: the answer is a bit more complex if what you want is to write your own code to access a Nikon DSLR. Thomas Dideriksen's SDK wrapper referenced above is great for making Nikon's SDK easy to access and for controlling almost all camera functions, but it is restricted to USB-cable access, since that SDK does not support wireless access. If the latter is what you want, your best option may be Duka Istvan's digiCamControl, which Simeon suggests above. This open-source C# project can be used as a standalone library (see the development documentation page). It is not all that well documented, though, so figuring out how to control all camera parameters can be tricky.

Learning openGL on the mac: GLUT or native windowing system?

I'm starting to learn OpenGL on the Mac, and being a Cocoa developer already, I find the native windowing system very appealing.
The books I'm reading all mention the use of GLUT. I'm wondering what the majority of people use for developing OpenGL programs, or whether it's just a matter of taste.
Generally (Free)GLUT is not used for developing actual applications. It's used for demoing effects or simple things, which is why so many online materials use it. It takes all the cross-platform stuff and shoves it into a corner, thus focusing the user's attention on OpenGL.
GLUT owns the message processing loop. For simple applications, that's fine. But for most real programs, you will need to control message processing on your own. So GLUT fails. Also, GLUT doesn't really mesh well with the rest of the UI; it has no facilities for creating GUI controls (except for context menus).
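To illustrate the point, here is a bare-bones GLUT program (sketched with PyOpenGL's GLUT bindings for brevity; the C API has the same shape). Once glutMainLoop() is called, GLUT owns the event loop and the call never returns, which is exactly the limitation described above:

```python
# Minimal GLUT sketch using PyOpenGL: glutMainLoop() takes over and never returns.
from OpenGL.GL import glClear, glClearColor, GL_COLOR_BUFFER_BIT
from OpenGL.GLUT import (
    glutInit, glutInitDisplayMode, glutCreateWindow,
    glutDisplayFunc, glutSwapBuffers, glutMainLoop,
    GLUT_DOUBLE, GLUT_RGB,
)

def display():
    glClearColor(0.2, 0.2, 0.2, 1.0)
    glClear(GL_COLOR_BUFFER_BIT)
    glutSwapBuffers()

glutInit()
glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB)
glutCreateWindow(b"GLUT demo")  # PyOpenGL expects a bytes window title
glutDisplayFunc(display)
glutMainLoop()  # blocks here; GLUT drives all event processing from now on
```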
If you're learning OpenGL, then you should be focused on learning OpenGL, not GUI programming. So use what makes sense for the task in question.

What do I need to have to be ready to write a Compact Framework application communicating with GPS?

Simply put, I have been asked to write an application for a smart device (a smartphone) that will get the GPS coordinates from the device itself.
I have no smart device at all, and I am kind of lost among questions like: how can I check in code whether the device has a GPS? If it does, how can I obtain the coordinates in a "standard" way? Do I need to use a framework like GeoFrameWork?
So, could somebody list the essential things I need to have ready?
GeoFrameworks' GPS.NET is free these days and pretty comprehensive, so there's no point reinventing the wheel. It's also beginner-friendly, which helps. I strongly recommend downloading it and playing with some of the sample apps. It's a bit tricky if you don't have a physical device to play around with, but it does have GPS emulation classes that you can use.
All you need is a copy of VS2008 Pro with the Smart Device SDK installed.

Qt4.5 vs Cocoa for native Mac UI

I've been developing for Windows and *nix platforms for quite some time, and am looking to move into Mac development. I am tossing up between using ObjC/Cocoa and C++/Qt4.5.
The C++/moc semantics make more sense to me, and improving knowledge in Qt seems like a sensible thing to do given that you end up with a skill set that covers more platforms.
Am I likely to handicap my applications by skipping Cocoa?
The sample Qt applications look pretty Mac-native to me, but they are quite simple so potentially don't tell the whole story. Are there other pros to the Xcode way that Qt doesn't have, such as packaging, deployment, etc.?
Here's an easy way to answer it:
If you were developing a Windows app with .NET or MFC, would you handicap your applications by using Qt? If the answer to that is yes, then the situation is likely to be the same on the Mac.
A few negatives I can think of off the top of my head:
Licensing
Qt apps, while good, are not completely a native UI experience, and there are things a native UI designer can do in Cocoa that boggle the mind. While I can't be sure that all the same functionality isn't available in Qt, I doubt it.
Qt is always a little behind. If Microsoft or Apple come out with a great new technology, you have to wait for the Qt developers to update Qt.
However, with all that said, only you can determine the business value of using Qt. If you think cross-platform development is going to be a major part of your development, then Qt might be worth it, despite the issues mentioned.
Ask yourself: how many of the best Mac applications that you know of use Qt instead of native Cocoa?
For our robotic systems, we originally wrote our control software in C++ using the cross-platform wxWidgets library (we avoided Qt due to some licensing concerns), because we felt that we had to target Windows, Linux, and Mac platforms for our end users. This is what we shipped for over a year until I started tinkering with Cocoa.
Right away, the thing that most impressed me was how quickly you could develop using Cocoa. Eventually, we decided to drop support for Linux and Windows and rewrite our entire control applications in Cocoa. What had taken us years to put together in C++ required only three months to completely reimplement in Cocoa.
Aside from the "lowest common denominator" interface issues that others have pointed out, the rapid development allowed by Cocoa has become a competitive advantage for our company. Our software has advanced far more quickly since our conversion to Cocoa, and it has allowed us as a new company with one developer to pull even with 10-year-old competitors that have 20-man development teams. This appears to be a common story in the Mac development space, where you see a lot of small teams who are able to create products that compete with those of much larger companies.
As a final note, using Cocoa gives you the ability to stay on top of the new APIs Apple is continually rolling out. We're now working on a new control interface that will make heavy use of Core Animation, something that would be painful to deal with using Qt.
I'm currently developing both with Qt (actually PyQt, but it makes no difference to your question) and a native Cocoa app. For me it's a no-brainer: I'd choose Cocoa. It's really worth the time to explore Cocoa in general; there are many great concepts within the Cocoa framework, and in Objective-C 2.0 as well.
I'd use Qt if you want this to be a crossplatform application.
You can have a look at the QMacCocoaViewContainer class. It acts as some kind of wrapper for generic Cocoa views, so you can also have Cocoa elements which are not officially supported by Qt.
Of course this means learning a little about Cocoa and Objective-C and what a Cocoa UI would need to look like. But if you already know Qt well, and your application isn't all and only about the GUI, this could be a good way to go.
And don’t forget about the QMacStyle::WidgetSizePolicy or you won’t understand why your tables come out so huge.
Obviously, the best option is to use a cross-platform suite that supports native widgets.
With Qt4 you can build your base user interface, then add native support for your specific target platform.
Sure, Cocoa has a lot of fancy stuff (and you can still use it through Qt4), but let me be clear: I see a lot of fancy apps on the App Store, pretty ones, but most of them are just crap, or expensive, or whatever. I really miss my Kate text editor, my Okular viewer, my Krita drawing software... those are simply better than the commercial, expensive alternatives, and they are free. So I just tweak the source code a little bit to have a real native and great experience.
What if I have to use a Linux app on my main computer, which runs Mac OS X? Or Windows? Or whatever?
For example, why on earth do I have to buy an expensive, fancy, but far less featured image editor for my Mac like Pixelmator when I can use a full-featured image manipulation program like GIMP? Yes, GIMP is GTK2-based, which is a pain on any platform, especially on the Mac, where it is really ugly. GIMP should be ported to Qt4. Inkscape should be ported to Qt4 too, and it would feel great.
It's so simple to do:
http://doc.qt.nokia.com/4.7-snapshot/demos-macmainwindow.html
You can even add support for the new native Lion fullscreen feature, unified title and toolbar, menus, etc.
As a user, I really care about efficient, full-featured, good cross-platform apps; I don't really care about a developer's convenience or laziness.
I do a lot of cross-platform development (Mac, Windows, Linux), and for some projects use Qt. It is a fine framework, and provides a rich class library. If you need to deploy on multiple platforms, cannot afford to spend the time/effort on platform-specific front-ends, or the "generic" support for each platform is sufficiently good, then use Qt.
However, Qt inevitably suffers in some ways from the lowest common denominator syndrome, and sometimes does not feel quite native enough. There are also certain features that are either difficult to support, or are simply not provided in the Qt libraries. So if you can afford the time and effort, or your app really demands the attention to detail and fit & finish, then developing separate front-ends may be worth it.
In either case, you ought to be writing your back-end (aka domain) code in a platform-neutral and front-end neutral manner. This way, the front-end is easily replaced, or modified between platforms.
You could always start with a Qt front-end and go for a quick time to market, then develop a native front-end down the line.
In practice, I've noticed that a Qt app on Windows looks most "native", while on Mac there are certain subtle telltale signs that make it look/feel not quite right. And Mac users tend to have much higher expectations when it comes to UI/UX!
Since posting this, I've been learning the Cocoa / Objective-C way, and have been quite impressed. Despite what I initially thought was quite a quirky syntax, Objective-C appears to be a very effective language for implementing UI code, and the Xcode sugar (things like Core Data and bindings) makes short work of all of the boring bits.
I spent a while with the Qt examples and documentation before digging into Cocoa, and tend to agree with what has been said above w.r.t. being slightly behind the curve and less 'aqua-ish', albeit from a fairly trivial inspection. If I absolutely had to build a cross-platform app I'd probably use Qt rather than trying to separate out the UI code, as it seems like it would provide close-enough visuals, but for Mac-only purposes, Cocoa seems like a definite win.
Thanks all for your responses, they've all been very helpful!
DO NOT use Qt for a Mac app. You will get no hardware acceleration for 2D rendering, and you will not be able to deliver ADA compliance.
Depending on what kind of apps you want to write, another contender is REALbasic, now called Xojo.
The move from C++ is pretty easy (I have 15 years C++ experience) and the framework and IDE extremely productive. You have the added bonus of being able to deploy to Linux and Windows with trivial effort. Their framework compiles to native code and uses native widgets so you don't have an emulated look and feel.
The big reason for learning Cocoa and coding in Objective-C is if you want to hone your iPhone skills or are chasing a really fancy user experience. If you wanted to rival the cutting edge of WPF development then I'd recommend Cocoa.

How do I get input from an XBox 360 controller?

I'm writing a program that needs to take input from an XBox 360 controller. The input will then be sent wirelessly to an RC Helicopter that I am building.
So far, I've learned that this can be done using either the XInput library from DirectX, or the Input framework in XNA.
I'm wondering if there are any other options available. The scope of my program is rather small, and having to install a large gaming library like DirectX or XNA seems excessive. Further, I'd like the program to be cross-platform and not Microsoft-specific.
Is there a simple lightweight way I can grab the controller input with something like Python?
Edit to answer some comments:
The copter will have 6 total propellers, arranged in 3 co-axial pairs. Basically, it will be very similar to this, only it will cost about $1,000 rather than $15,000. It will use an Arduino for onboard processing, and Zigbee for wireless control.
The 360 controller was selected because it is well designed. It is very ergonomic and has all of the control inputs needed. For those familiar with helicopter controls, the left joystick will control the collective, the right joystick will control the pitch and roll, and the analog triggers will control the yaw. The analog triggers are a big feature of the 360 controller; the PS controllers and most others do not have them.
I have a webpage for the project, but it is still pretty sparse. I do plan on documenting the whole design though, so eventually it will be interesting.
http://tricopter.googlecode.com
On a side note, would it kill Google to have a blog feature for googlecode projects?
I would like the 360 controller input program to run on both Linux and Windows if possible. Eventually, though, I'd like to hook the controller directly to an embedded microcontroller board (such as an Arduino) so that I don't have to go through a computer, but it's not a high priority at the moment.
It is not all that difficult. As the earlier answer mentioned, you can use the SDL libraries to read the status of the Xbox controller and then do whatever you'd like with it.
There is an SDL tutorial at http://sdl.beuc.net/sdl.wiki/Handling_Joysticks which is fairly useful.
Note that an Xbox controller has the following:
six axes (two joysticks and two analog triggers):
left joystick is axes 0 & 1;
left trigger is axis 2;
right joystick is axes 3 & 4;
right trigger is axis 5
one hat (the D-pad)
11 SDL buttons (two of them are the joystick center presses)
The two triggers act as axes, as listed above.
The upcoming SDL 1.3 will also support force feedback (a.k.a. haptics).
I assume, since this thread is several years old, you have already done something, so this post is primarily to inform future visitors.
PyGame can read joysticks, which is what the X360 controller shows up as on a PC.
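By way of illustration, a minimal PyGame sketch might look like the following. The axis indices follow the SDL mapping listed above, but they can differ between drivers and platforms, so treat them as an assumption to verify on your own setup:

```python
# Sketch: poll the first joystick (the 360 pad) and print its axes and buttons.
# Axis numbering follows the SDL mapping above and may vary by driver/platform.
import pygame

pygame.init()
pygame.joystick.init()

if pygame.joystick.get_count() == 0:
    raise SystemExit("No joystick/gamepad detected")

pad = pygame.joystick.Joystick(0)
pad.init()

clock = pygame.time.Clock()
while True:
    pygame.event.pump()  # refresh joystick state
    left_stick = (pad.get_axis(0), pad.get_axis(1))    # e.g. collective
    right_stick = (pad.get_axis(3), pad.get_axis(4))   # e.g. pitch and roll
    triggers = (pad.get_axis(2), pad.get_axis(5))      # e.g. yaw
    buttons = [pad.get_button(i) for i in range(pad.get_numbuttons())]
    print(left_stick, right_stick, triggers, buttons)
    clock.tick(30)  # poll at roughly 30 Hz
```

From there you would translate the axis values into whatever packets your Arduino/Zigbee link expects.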
Well, if you really don't want to add a dependency on DirectX, you can use the old Windows Joystick API -- Windows Multimedia -> Joystick Reference in the platform SDK.
The standard free cross-platform game library is Simple DirectMedia Layer, originally written to port Windows games to Unix (Linux) systems. It's a very basic, lightweight API that tends to support the minimal subset of features on each system, and it has bindings for most major languages. It has very basic joystick and gamepad support (no force feedback, for example), but it might be sufficient for your needs.
Perhaps the Mono.Xna library has added GamePad support, which would provide the cross-platform functionality you were looking for:
http://code.google.com/p/monoxna/
As for the concerns about the library being too heavyweight: sure, for this option that may be true, but it could open up opportunities to do some nice visualization in the future.
Disclaimer: I'm not familiar with the status of the Mono.Xna project, so it may not have added this feature yet. But still, 'tis an option :-)