Import Unity along with UIKit project - objective-c

We are working on a game based on Apple's MapKit. Choosing to attack certain venues from the map triggers built-in minigames, which are currently made in cocos2d. We want to replace one of these minigames with a Unity-based game, so I wonder whether it's possible to run code generated by Unity alongside the rest of the project.
I don't want any deep integration between Unity and the rest of the project; I just want to start the Unity-based game and stop it when the player finishes the minigame. Do you know if this is possible, and what the steps are to achieve it?
Thank you

Related

Xamarin Forms - How to get camera stream and play with it?

For a while now, without success, I've been trying to build a cross-platform solution that lets me use a custom camera with custom functionality. However, no one on the internet seems to have done it for every platform (often only Android and iOS are implemented, but not UWP), and I still don't understand why...
For the past few months I've been searching for how to make something like a dependency service from which you can get the camera's stream/frames and then put them into a Xamarin.Forms.Image.
This design would allow developers to implement functions inside the dependency service, such as recording video or taking pictures from the native camera stream.
You could say, "But you can already use a NuGet package such as Xam.Plugin.Media from James Montemagno." Yes, but with his package you call the native built-in camera, so you can't implement your own design or your own functionality.
So my question is: does anyone have any tips or any project that can help realize this idea? If I can make it work, I will create a project on my public GitHub to help future people who would like to do the same.
Thanks for any help.
PS: Here are some results from the research I've done: https://forums.xamarin.com/discussion/comment/284359/#Comment_284359
This article looks to be similar to what you are after:
Full Page Camera in Xamarin
It derives a camera page from ContentPage, then creates platform-specific custom renderers based on PageRenderer.
Bonus: there is source code on GitHub.
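To give a feel for the shape of that approach, here is a rough sketch, assuming Xamarin.Forms with an iOS custom renderer. CameraPage and CameraPageRenderer are placeholder names, and the AVFoundation calls are indicative only, with no error handling or camera-permission checks:

```csharp
// File 1 - shared Xamarin.Forms project: an empty page type for the renderer to take over.
using Xamarin.Forms;

namespace CameraDemo
{
    public class CameraPage : ContentPage
    {
    }
}

// File 2 - iOS project: a PageRenderer that swaps the page's content for a live camera preview.
using AVFoundation;
using CameraDemo;
using CameraDemo.iOS;
using Xamarin.Forms;
using Xamarin.Forms.Platform.iOS;

[assembly: ExportRenderer(typeof(CameraPage), typeof(CameraPageRenderer))]
namespace CameraDemo.iOS
{
    public class CameraPageRenderer : PageRenderer
    {
        AVCaptureSession session;

        public override void ViewDidLoad()
        {
            base.ViewDidLoad();

            session = new AVCaptureSession();
            var camera = AVCaptureDevice.GetDefaultDevice(AVMediaTypes.Video);
            if (camera == null)
                return; // e.g. running on the simulator

            session.AddInput(AVCaptureDeviceInput.FromDevice(camera));

            // The preview layer is the raw camera feed; add your own buttons and overlays on top of it.
            var preview = new AVCaptureVideoPreviewLayer(session) { Frame = View.Bounds };
            View.Layer.AddSublayer(preview);

            session.StartRunning();
        }
    }
}
```

Android and UWP would each get their own renderer along the same lines, built on their native camera APIs.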

Kinect Hand Gestures

I have been working with Kinect gestures for a while now, and so far the available gesture-creation tools are limited to tracking whole-body movements, for instance swiping your arm to the left or right. The joint types available in the original Kinect SDK include elbows, wrists, hands, shoulders, etc., but they don't include finer details like the index finger, thumb, and middle finger. I mention all this because I am trying to create gestures involving only hand movements (like a victory sign or thumbs up/down). Can anyone guide me through this? Is there a blog or website with code for hand movements?
I was developing an application with Kinect a year ago, and back then it was very hard or nearly impossible to do that. Now Google shows me projects like this, so be sure to check it out. If you generally want to focus on hand gestures, I really advise you to use LEAP Motion.
My friends at SigmaRD have developed something called the SigmaNIL Framework. You can get it from the OpenNI website.
It offers "HandSegmentation", "HandSkeleton", "HandShape" and "HandGesture" modules which may cover your needs.
Also check out the rest of the OpenNI Middleware and Libraries that you can download from their website. Some of them also work with the Microsoft SDK.
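As a side note, if the newer Kinect for Windows SDK 2.0 is an option, the runtime itself reports a coarse per-hand state (open, closed, or the two-finger "lasso" pose), which already covers simple cases like a fist or a victory-like sign. A minimal sketch, assuming a v2 sensor and the v2 SDK:

```csharp
using System;
using System.Linq;
using Microsoft.Kinect; // Kinect for Windows SDK 2.0

class HandStateSample
{
    static void Main()
    {
        KinectSensor sensor = KinectSensor.GetDefault();
        BodyFrameReader reader = sensor.BodyFrameSource.OpenReader();
        Body[] bodies = new Body[sensor.BodyFrameSource.BodyCount];

        reader.FrameArrived += (s, e) =>
        {
            using (BodyFrame frame = e.FrameReference.AcquireFrame())
            {
                if (frame == null)
                    return;

                frame.GetAndRefreshBodyData(bodies);
                foreach (Body body in bodies.Where(b => b.IsTracked))
                {
                    // HandState is Open, Closed, Lasso, Unknown or NotTracked.
                    Console.WriteLine("Left: {0}  Right: {1}",
                        body.HandLeftState, body.HandRightState);
                }
            }
        };

        sensor.Open();
        Console.ReadLine(); // keep the console app alive while frames arrive

        reader.Dispose();
        sensor.Close();
    }
}
```

Anything finer than that (individual finger joints) still needs one of the libraries above or your own processing of the depth image.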

Record sound of one application

I want to develop an application for Mac OS X to record audio from one application.
I played around with Soundflower, but it only grabs the full system audio.
I know that I have to use a HAL plug-in. The plug-in is loaded by an application that uses Core Audio, and I can then communicate with the plug-in to grab the audio.
My question is: what does such a plug-in look like? Are there examples on the internet? I have not found anything about this topic.
Now that you've decided that using Cocoa injection is a feasible solution to your problem, let's start there.
What you need to do is find out how the ObjC classes in the app are setting up to play audio, and hook in to set a different AU in place of the default system out.
There are two options (besides writing your own custom AU from scratch, which you don't need to do). You can use AUHAL as the AU, and capture the data from AUHAL. This is a bit easier from the point of view of hooking things up, but it means you have to write the code that renders and saves the audio. Or you can hook in a save-to-file AU, which is a bit harder to hook up, but once you do, it takes care of rendering automatically.
So, how do you hook things in? Well, most of the higher-level CA calls are written to just write to the current output. If the app is doing things that way, you just need to hook in at startup to find your replacement AU and set it as the current output, in place of the default. On the other hand, if the app is writing directly to an AU that it stores in a variable, you have to hook it to store your AU as a variable. And if it's building a graph of AUs, you either replace the default output, or stick yours in front of it, in the graph.
See TN2091 for some sample code fragments for most of the hard parts for most of the possibilities. It doesn't show you how to put them together, and it's got a lot more about setting inputs than outputs (because that's harder), and the terminology can get confusing, but if you read it carefully, you should be able to find the parts you need.
If you haven't built a simple AU host and AU plugin before, you really should take the time to work through the whole Audio Unit Development Fundamentals guide. (And if you don't think you really need to know all that to do something simple, you're wrong. Why CoreAudio is Hard explains half of the reason; the changes between OS X versions are the other half.)
You probably also want to look at CocoaDev's CoreAudioAndAudioUnitsTutorial page, a placeholder for a complete tutorial that nobody has ever written, but one with links to a lot of useful stuff.
Meanwhile, if injecting the whole MTCoreAudio framework into the app is feasible, it comes with a ton of nice, complete samples. In fact, even if you aren't going to use the framework, it's worth reading the Overview documentation, and possibly the source code.

What is a good way to implement a message board or other common UI plugins

I am thinking there must be libraries out there that people have developed which can be used as "plugins", or whatever people call them, to handle simple, common UI tasks.
I am using the message board idea just as an example; I am looking for a general solution. For example, is there a place where I can browse "gems" for RoR that just take care of some UI component?
How do people usually integrate pieces such as a message board at the bottom of every page, or some other UI tool, without writing their own or using a CMS?
Thanks,
Alex
Two good places to browse gems are http://ruby-toolbox.com/ and of course http://rubygems.org/

Augmented reality in MonoTouch

I'm developing a typical "Windows GUI"-style app for iPhone using Mono technologies. I need to add a little AR-based functionality to it: just opening up the camera and showing the user information about nearby businesses.
How can I do this using mono?
Of course it is possible. I have created such a project and it works very nicely. It is quite complicated, though, and I would need three pages to explain it and more time than I have.
In general, you need to look into:
- CLLocationManager, for location and compass.
- MapKit, if you want to provide reverse geocoding information.
- An overlay view over the UIImagePickerController, which will act as your canvas.
- And of course, drawing.
I hope these guidelines will get you started.
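To make those guidelines a bit more concrete, here is a minimal MonoTouch (Xamarin.iOS) sketch of how the pieces fit together. ArViewController is a placeholder name; permission prompts, MapKit reverse geocoding, and the math that projects nearby businesses onto the screen are all left out:

```csharp
using CoreLocation;
using UIKit;

public class ArViewController : UIViewController
{
    UIImagePickerController picker;
    CLLocationManager locationManager;
    UIView overlay; // your "canvas" - draw markers for nearby businesses here

    public override void ViewDidAppear(bool animated)
    {
        base.ViewDidAppear(animated);

        // 1. Camera feed with the default controls hidden.
        picker = new UIImagePickerController
        {
            SourceType = UIImagePickerControllerSourceType.Camera,
            ShowsCameraControls = false
        };

        // 2. A transparent overlay view that acts as the drawing canvas.
        overlay = new UIView(UIScreen.MainScreen.Bounds) { BackgroundColor = UIColor.Clear };
        picker.CameraOverlayView = overlay;

        // 3. Location and compass, used to decide what to draw and where.
        locationManager = new CLLocationManager();
        locationManager.UpdatedHeading += (s, e) =>
        {
            // e.NewHeading.MagneticHeading is the direction the device is facing;
            // combine it with the current location to position businesses on the overlay.
        };
        locationManager.StartUpdatingLocation();
        locationManager.StartUpdatingHeading();

        PresentViewController(picker, false, null);
    }
}
```

From there, the drawing itself is just standard UIKit/CoreGraphics work on the overlay view.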