Augmented reality in MonoTouch - mono

I'm developing a typical "Windows GUI"-style app for iPhone using Mono technologies. I need to add a little AR-based functionality to it: just opening the camera and showing the user information about nearby businesses.
How can I do this using mono?

Of course it is possible. I have created such a project and it works very nicely. It is quite involved, though, and explaining it fully would take three pages and more time than I have.
In general, you need to look into:
CLLocationManager, for location and compass heading.
MapKit, if you want to provide reverse-geocoding information.
An overlay view over the UIImagePickerController, which will act as your canvas.
And, of course, drawing.
I hope these guidelines, together with the sketch below, will get you started.
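To make those steps a bit more concrete, here is a minimal MonoTouch (Xamarin.iOS) sketch of the camera-overlay and location parts. ArOverlayView and ArLauncher are illustrative names I'm assuming; the business lookup and the actual marker drawing are left as comments.

    using CoreGraphics;
    using CoreLocation;
    using UIKit;

    // Transparent overlay drawn on top of the live camera feed; your Quartz
    // drawing for nearby-business markers goes in Draw().
    public class ArOverlayView : UIView
    {
        public ArOverlayView(CGRect frame) : base(frame)
        {
            BackgroundColor = UIColor.Clear;   // let the camera show through
        }

        public override void Draw(CGRect rect)
        {
            // Draw labels/markers for nearby businesses here.
        }
    }

    public class ArLauncher
    {
        readonly CLLocationManager locationManager = new CLLocationManager();
        UIImagePickerController picker;

        public void Present(UIViewController parent)
        {
            locationManager.RequestWhenInUseAuthorization();
            locationManager.LocationsUpdated += (s, e) =>
            {
                // e.Locations holds the fixes; query your business data source here.
            };
            locationManager.UpdatedHeading += (s, e) =>
            {
                // e.NewHeading.TrueHeading tells you where to place/rotate markers.
            };
            locationManager.StartUpdatingLocation();
            locationManager.StartUpdatingHeading();

            picker = new UIImagePickerController
            {
                SourceType = UIImagePickerControllerSourceType.Camera,
                ShowsCameraControls = false,
                CameraOverlayView = new ArOverlayView(UIScreen.MainScreen.Bounds)
            };
            parent.PresentViewController(picker, true, null);
        }
    }

Keeping the overlay's background clear is what turns the camera preview into your canvas; as new location and heading values arrive, invalidate the overlay (SetNeedsDisplay) so the markers get redrawn.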

Related

Xamarin Forms - How to get camera stream and play with it?

For a while now I've been trying, without success, to build a cross-platform solution that lets me use a custom camera with custom functionality. Nobody on the internet seems to have done it for every platform (often only Android and iOS are implemented, never UWP), and I still don't understand why...
For the past few months I've been searching for a way to make something like a dependency service from which you can get the camera's stream/frames, and then put them into a Xamarin.Forms.Image.
This design would allow developers to implement functions inside the dependency service, such as recording video or taking pictures from the native camera stream.
You could say, "But you can already use the Xam.Plugin.Media NuGet package from James Montemagno." Yes, but that package calls the native built-in camera, so you can't implement your own design or your own functionality.
So my question is: does anyone have any tips, or any project, that could help realize this idea? If I can make it work, I will create a project on my public GitHub to help others who want to do the same.
Thanks for any help.
PS: Here are some results from the research I've done: https://forums.xamarin.com/discussion/comment/284359/#Comment_284359
This article looks to be similar to what you are after:
Full Page Camera in Xamarin
It derives a camera page from ContentPage, then creates platform-specific custom renderers based on PageRenderer.
Bonus - there is source code on GitHub
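In case the article moves, the structure it describes boils down to roughly this skeleton. CameraPage and CameraPageRenderer are names I'm assuming; the actual capture-session setup from the article goes inside the renderer.

    // ---- Shared project: CameraPage.cs ----
    using Xamarin.Forms;

    public class CameraPage : ContentPage
    {
        // Intentionally empty: each platform renderer supplies the camera UI.
    }

    // ---- iOS project: CameraPageRenderer.cs (Android/UWP follow the same pattern) ----
    using Xamarin.Forms;
    using Xamarin.Forms.Platform.iOS;

    [assembly: ExportRenderer(typeof(CameraPage), typeof(CameraPageRenderer))]

    public class CameraPageRenderer : PageRenderer
    {
        protected override void OnElementChanged(VisualElementChangedEventArgs e)
        {
            base.OnElementChanged(e);
            if (e.OldElement != null || Element == null)
                return;

            // Build the native preview here: e.g. an AVCaptureSession whose
            // AVCaptureVideoPreviewLayer is added to NativeView.Layer, plus
            // whatever custom shutter buttons or overlays you want on top.
        }
    }

Because the renderer replaces the whole page, you keep full control of the layout and can put your own capture/record buttons over the preview, which is exactly what the built-in camera from Xam.Plugin.Media doesn't let you do.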

Kinect Hand Gestures

I have been working with Kinect gestures for a while now, and so far the tools available for creating gestures are limited to tracking whole-body movements, for instance swiping your arm to the left or right. The joint types available in the original Kinect SDK cover elbows, wrists, hands, shoulders, etc., but don't include finer details like the index finger, thumb, and middle finger. I am mentioning all this because I am trying to create gestures involving only hand movements (like a victory sign or a thumb up/down). Can anyone guide me through this? Is there a blog or website with code for hand movements?
I was developing an application with the Kinect a year ago, and back then this was very hard or nearly impossible to do. Now Google shows me projects like this, so be sure to check it out. If you generally want to focus on hand gestures, I really advise you to use the LEAP Motion.
My friends at SigmaRD have developed something called the SigmaNIL Framework. You can get it from the OpenNI website.
It offers "HandSegmentation", "HandSkeleton", "HandShape" and "HandGesture" modules which may cover your needs.
Also check out the rest of the OpenNI Middleware and Libraries that you can download from their website. Some of them also work with the Microsoft SDK.
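One more option, if newer hardware is acceptable: the Kinect v2 sensor and SDK report a coarse per-hand state (open, closed, lasso) for each tracked body, which is enough for simple hand gestures even without individual finger joints. A rough sketch, assuming the Kinect for Windows SDK 2.0 (Microsoft.Kinect):

    using System;
    using Microsoft.Kinect;

    class HandStateDemo
    {
        static Body[] bodies;

        static void Main()
        {
            KinectSensor sensor = KinectSensor.GetDefault();
            BodyFrameReader reader = sensor.BodyFrameSource.OpenReader();
            bodies = new Body[sensor.BodyFrameSource.BodyCount];

            reader.FrameArrived += (s, e) =>
            {
                using (BodyFrame frame = e.FrameReference.AcquireFrame())
                {
                    if (frame == null) return;
                    frame.GetAndRefreshBodyData(bodies);

                    foreach (Body body in bodies)
                    {
                        if (!body.IsTracked) continue;

                        // HandState is one of: Open, Closed, Lasso, NotTracked, Unknown.
                        if (body.HandRightState == HandState.Closed &&
                            body.HandRightConfidence == TrackingConfidence.High)
                        {
                            Console.WriteLine("Right hand closed (fist)");
                        }
                    }
                }
            };

            sensor.Open();
            Console.ReadLine();   // keep the console app alive while frames arrive
            sensor.Close();
        }
    }

This still won't distinguish a thumb-up from a plain fist, though; for individual fingers you are back to something like the LEAP Motion or the OpenNI/SigmaNIL modules mentioned above.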

Creating a simple, single-view Quartz app

I'm an Objective-C newbie. Most of my experience is in Java. Also, I've never really used Xcode before and so I'm pretty new at that as well.
I'm trying to create a simple, single-view Quartz OS X app (not iOS) to display agent-modeling simulations. The graphics are pretty simple; just colored squares and grids. I have been looking at Quartz tutorials and I can see how I could accomplish this (as far as drawing things are concerned). What I can't find is an example that tells me how to tie it all together. What do I put in AppDelegate? Do I need a WindowController? How do I link that up with AppDelegate? I got as far as creating a Quartz Composer View in Interface Builder for my app, but I have no idea where to go from there.
As I mentioned before, I've looked at numerous tutorials, but I can't find anything that explains how to link everything together.
You should visit this web page before you do anything else. It will show you how a Cocoa application is structured and where the appropriate entry points are to place your code.
While the entire article merits reading, visit the section "Entry and Exit Points," which best addresses your particular questions.
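To give a flavour of how little glue is actually needed: a minimal sketch, written here against the Xamarin.Mac bindings (the Objective-C version maps one-to-one), with a custom NSView doing the Quartz drawing and the app delegate's entry point installing it as the window's content view. GridView is just an illustrative name.

    using System;
    using AppKit;
    using CoreGraphics;
    using Foundation;

    [Register("GridView")]
    public class GridView : NSView
    {
        public GridView(CGRect frame) : base(frame) { }
        public GridView(IntPtr handle) : base(handle) { }   // used when loaded from a nib

        public override void DrawRect(CGRect dirtyRect)
        {
            NSColor.White.SetFill();
            NSBezierPath.FillRect(Bounds);

            // One "agent" square; replace with your simulation's grid drawing.
            NSColor.Red.SetFill();
            NSBezierPath.FillRect(new CGRect(20, 20, 40, 40));
        }
    }

    [Register("AppDelegate")]
    public class AppDelegate : NSApplicationDelegate
    {
        NSWindow window;

        public override void DidFinishLaunching(NSNotification notification)
        {
            // Entry point: build the window in code (or load it from a xib)
            // and hand it our custom view.
            window = new NSWindow(
                new CGRect(0, 0, 480, 360),
                NSWindowStyle.Titled | NSWindowStyle.Closable | NSWindowStyle.Resizable,
                NSBackingStore.Buffered, false);
            window.ContentView = new GridView(window.ContentView.Bounds);
            window.Center();
            window.MakeKeyAndOrderFront(this);
        }
    }

If you prefer to lay the window out in Interface Builder instead, you only need the view subclass: drop a Custom View into the window and set its class to GridView, and the window-building code in the delegate goes away.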

iTunes / About This Mac Storage Capacity Bars

I'm trying to find out if it's possible to recreate the capacity bars in Lion, like so...
http://thenextweb.com/files/2011/07/about-this-mac-storage.png
I've seen third party apps (such as Windows Phone 7 Connector) using an identical control. Is there an official way to do this?
It's a custom view. So the only "official" way to do it is to subclass NSView and go crazy in drawRect:.
You might as well start with the official documentation on the subject.
After that, there are numerous tutorials and such on the web. This one isn't half bad.
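For what it's worth, most of that drawRect: work is just filling proportional rectangles. A rough sketch (again using the Xamarin.Mac bindings rather than Objective-C; the segment data and colors here are made up):

    using System;
    using System.Linq;
    using AppKit;
    using CoreGraphics;
    using Foundation;

    [Register("CapacityBarView")]
    public class CapacityBarView : NSView
    {
        // (label, bytes, color) -- supply real values from your model.
        readonly (string Label, double Bytes, NSColor Color)[] segments =
        {
            ("Audio",  120e9, NSColor.Yellow),
            ("Movies", 260e9, NSColor.Purple),
            ("Other",   80e9, NSColor.Green),
            ("Free",    40e9, NSColor.LightGray),
        };

        public CapacityBarView(CGRect frame) : base(frame) { }
        public CapacityBarView(IntPtr handle) : base(handle) { }

        public override void DrawRect(CGRect dirtyRect)
        {
            double total = segments.Sum(s => s.Bytes);
            nfloat x = 0;

            foreach (var segment in segments)
            {
                // Each segment's width is its share of the total capacity.
                var width = (nfloat)(Bounds.Width * (segment.Bytes / total));
                segment.Color.SetFill();
                NSBezierPath.FillRect(new CGRect(x, 0, width, Bounds.Height));
                x += width;
            }

            // A rounded border on top finishes the look.
            NSColor.DarkGray.SetStroke();
            NSBezierPath.FromRoundedRect(Bounds, 6f, 6f).Stroke();
        }
    }

The real control also rounds the end caps, adds a subtle gradient per segment, and draws the labels underneath, but all of that is still just more drawing in the same method.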

Window list ordered by recently used

I'm trying to create a window switching application. Is there any way of getting a list of the windows of other applications, ordered by recently used?
Start with the Accessibility framework. Many of the hooks for screen readers are also useful here. In particular, look at the UIElementInspector sample and the NSAccessibility protocol.
There's also Quartz Window Services, which can easily give you a list of all the windows on screen. Unfortunately, it doesn't tie into concepts like window focus (just level), and I don't know of a way to get notifications back from it when levels change. You might do something like tap into the Quartz Event framework to capture Cmd-Tab and the like, but that's complex and fragile. There is unfortunately no good way to convert a CGWindowID into an AXUIElementRef (the post is about 10.5, but I don't know of anything added in 10.6 that improves this). But hopefully you can do everything you need through the Accessibility framework.
You would want to use [NSWorkspace runningApplications] to get a list of all the running applications, and then watch which application is active (for example, by observing NSWorkspaceDidActivateApplicationNotification and checking NSWorkspace's frontmostApplication) to know when the user switches to a new application, so you can keep track of which one was used most recently.
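A sketch of how those pieces might fit together, written with the Xamarin.Mac bindings; the most-recently-used bookkeeping here is just one possible approach, not an official API.

    using System.Collections.Generic;
    using AppKit;
    using Foundation;

    public class RecentAppTracker
    {
        readonly List<NSRunningApplication> recent = new List<NSRunningApplication>();
        NSObject observerToken;

        public void Start()
        {
            // Seed the list with everything currently running.
            recent.AddRange(NSWorkspace.SharedWorkspace.RunningApplications);

            // NSWorkspace posts its notifications on its own notification center
            // (NSWorkspaceDidActivateApplicationNotification in Objective-C).
            observerToken = NSWorkspace.SharedWorkspace.NotificationCenter.AddObserver(
                NSWorkspace.DidActivateApplicationNotification,
                notification =>
                {
                    var active = NSWorkspace.SharedWorkspace.FrontmostApplication;
                    if (active == null) return;

                    // Move the newly activated app to the front of the MRU list.
                    recent.RemoveAll(a => a.ProcessIdentifier == active.ProcessIdentifier);
                    recent.Insert(0, active);
                });
        }

        public IEnumerable<NSRunningApplication> MostRecentFirst() => recent;

        public void Stop()
        {
            if (observerToken != null)
                NSWorkspace.SharedWorkspace.NotificationCenter.RemoveObserver(observerToken);
        }
    }

Note that this orders applications, not individual windows; for per-window ordering you are back to the Accessibility route described in the other answer.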