Window list ordered by recently used - objective-c

I'm trying to create a window switching application. Is there any way of getting a list of the windows of other applications, ordered by recently used?

Start with the Accessibility framework. Many of the hooks for screen readers are also useful here. In particular, look at the UIElementInspector sample and the NSAccessibility protocol.
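As a minimal sketch of the Accessibility route (the PID here is a placeholder, and your app must be trusted for accessibility access):

    #import <Cocoa/Cocoa.h>
    #import <ApplicationServices/ApplicationServices.h>

    // Placeholder PID for illustration; get real ones from
    // [[NSWorkspace sharedWorkspace] runningApplications].
    pid_t pid = 12345;
    AXUIElementRef appElement = AXUIElementCreateApplication(pid);

    CFArrayRef windows = NULL;
    AXError err = AXUIElementCopyAttributeValue(appElement,
                                                kAXWindowsAttribute,
                                                (CFTypeRef *)&windows);
    if (err == kAXErrorSuccess && windows != NULL) {
        // One AXUIElementRef per window of the target application.
        NSLog(@"App %d has %ld windows", pid, (long)CFArrayGetCount(windows));
        CFRelease(windows);
    }
    CFRelease(appElement);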
There's also Quartz Window Services, which can easily give you a list of all the windows on screen. Unfortunately, it doesn't tie into concepts like window focus (just window level), and I don't know of a way to get notifications from it when levels change. You could tap into Quartz Event Services to capture Cmd-Tab and the like, but that approach is complex and fragile. There is, unfortunately, no good way to convert a CGWindowID into an AXUIElementRef (the post is for 10.5, but I don't know of anything added in 10.6 that improves this). Hopefully, though, you can do everything you need through the Accessibility framework.
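For completeness, the Quartz side is a one-call snapshot; a sketch assuming ARC (each dictionary carries the owner PID, layer, and bounds, and sometimes a window name):

    #import <Cocoa/Cocoa.h>

    // All on-screen windows, ordered front to back.
    CFArrayRef info = CGWindowListCopyWindowInfo(
        kCGWindowListOptionOnScreenOnly | kCGWindowListExcludeDesktopElements,
        kCGNullWindowID);

    for (NSDictionary *entry in (__bridge NSArray *)info) {
        NSLog(@"pid=%@ layer=%@ name=%@",
              [entry objectForKey:(id)kCGWindowOwnerPID],
              [entry objectForKey:(id)kCGWindowLayer],
              [entry objectForKey:(id)kCGWindowName]);
    }
    if (info) CFRelease(info);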

You would want to use
[[NSWorkspace sharedWorkspace] runningApplications]
to get a list of all the running applications, and observe
NSWorkspaceDidActivateApplicationNotification
on the workspace's notification center to know when the user switches to a new application, so you can keep track of which one was used most recently. (Note that [NSRunningApplication currentApplication] only returns your own process, so it can't tell you about switches.)
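A minimal sketch of that, assuming the most-recently-used bookkeeping lives elsewhere in your app:

    #import <Cocoa/Cocoa.h>

    // Watch app activation to keep a most-recently-used list current.
    NSNotificationCenter *center =
        [[NSWorkspace sharedWorkspace] notificationCenter];
    [center addObserverForName:NSWorkspaceDidActivateApplicationNotification
                        object:nil
                         queue:[NSOperationQueue mainQueue]
                    usingBlock:^(NSNotification *note) {
        NSRunningApplication *app =
            [note.userInfo objectForKey:NSWorkspaceApplicationKey];
        // Move 'app' to the front of your MRU list here.
        NSLog(@"Activated: %@", app.localizedName);
    }];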

Related

What OS X events can I access programmatically from Swift?

I'd like to find the currently running programs (or at least the program in the foreground) programmatically, and also to observe key events, on OS X.
I found the inter-application communication guidelines, but they don't seem to say how to find out which applications are running.
I've found the key events documentation, but it seems to imply that the frontmost task is the one that receives key events, and only if it doesn't handle them do they go up the event chain. I'd like to intercept them programmatically.
Seems dubious, I know. I'm trying to use key events along with screen captures to learn what text is on the screen; it's for research.
I'm using Swift, but I understand that an Objective-C example is nearly as helpful, since both use the same libraries.
To get a list of running applications use:
NSWorkspace.sharedWorkspace().runningApplications
You can build a key logger (so to speak) by creating an event monitor (same document you linked, just a different section).
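A sketch of both monitor flavors in Objective-C (the global monitor is observe-only and requires the user to grant your app accessibility access):

    #import <Cocoa/Cocoa.h>

    // Global monitors see key events delivered to *other* applications,
    // but cannot modify or swallow them.
    [NSEvent addGlobalMonitorForEventsMatchingMask:NSEventMaskKeyDown // NSKeyDownMask in older SDKs
                                           handler:^(NSEvent *event) {
        NSLog(@"Key down elsewhere: %@", event.charactersIgnoringModifiers);
    }];

    // A local monitor covers events headed for your own app.
    [NSEvent addLocalMonitorForEventsMatchingMask:NSEventMaskKeyDown
                                          handler:^NSEvent *(NSEvent *event) {
        NSLog(@"Key down here: %@", event.charactersIgnoringModifiers);
        return event; // pass the event along unmodified
    }];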

What is the nature of the gestures needed in Windows 8?

Most laptop touchpads don't support multitouch, and hence can't send swipe gestures to the OS.
Would it be possible to send gestures to Windows from an external device, like a Teensy or a recent Arduino, which can already emulate a keyboard and a mouse? I could send buttons 4 and 5 (mouse wheel up and down), but I would like to send a real swipe gesture (driven, for example, by a flex sensor).
One way to work with the Arduino and similar boards is to use the .NET Micro Framework, which is open source and available at no cost: Micro Framework.
There are other frameworks available for the Arduino that you might want to use instead. Whichever you build on, if you have a great idea for using the sensor hardware, the output must still meet certain specifications.
To connect to hardware that reads gestures, you will need to understand how drivers are created, so take a look at this: Info on drivers.
The link above covers sensors, which is not quite what you are looking for (you want gestures), but before anything else you have to be able to make the connection to your device, and that guide MIGHT help; I have reviewed it for other reasons.
There is a lot of material to dig through, but the first step, in my opinion, is understanding how to get your software to communicate with Windows 8. Let me know if you have any other questions. I am not the best person for this, though; you might want to ask the community at the Micro Framework link above.
Good luck.
That's perfectly possible. What you're effectively suggesting is creating your own input peripheral, like a trackpad, and using it to send input. As long as Windows recognizes the device as an input source, it will work.

Augmented reality in MonoTouch

I'm developing a typical "Windows GUI"-style app for iPhone using Mono technologies. I need to add a little AR-based functionality to it: just opening the camera and showing the user information about nearby businesses.
How can I do this using Mono?
Of course it is possible. I have created such a project and it works very nicely. It is quite complicated, and I would need three pages to explain it, plus the time to do so, which I do not have.
In general, you need to look into:

- CLLocationManager, for location and compass (heading) updates.
- MapKit, if you want to provide reverse-geocoding information.
- An overlay view over the UIImagePickerController, which will act as your canvas (see the sketch below).
- And, of course, drawing.
I hope these guidelines will get you started.
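Since the MonoTouch bindings mirror these native classes one-to-one, an Objective-C sketch of those pieces translates almost mechanically (error handling and delegate methods omitted):

    #import <UIKit/UIKit.h>
    #import <CoreLocation/CoreLocation.h>

    // Camera as the AR backdrop, with a transparent overlay as the canvas.
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    picker.showsCameraControls = NO;

    UIView *overlay = [[UIView alloc] initWithFrame:picker.view.bounds];
    overlay.backgroundColor = [UIColor clearColor];
    picker.cameraOverlayView = overlay; // draw your markers into this view

    // Location and heading (compass) to place nearby businesses.
    CLLocationManager *locationManager = [[CLLocationManager alloc] init];
    locationManager.delegate = self; // adopt CLLocationManagerDelegate in the enclosing class
    [locationManager startUpdatingLocation];
    [locationManager startUpdatingHeading];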

Is it possible to record screen with Titanium / Appcelerator?

We're in the process of developing a desktop application which needs to record the user's screen once he clicks a button. I read a tutorial about Adobe AIR which says this is easy to do with AIR: http://www.adobe.com/devnet/air/flex/articles/air_screenrecording.html
But our preference is Titanium, as we've already explored it a little. So I want to know: is this even possible? If yes, how do we get started?
There's also an interesting solution which uses a Java applet for recording, as demonstrated here: http://www.screencast-o-matic.com/create?step=info&sid=default&itype=choose
But again, we're not sure about Java and would like to know how that could be done, or if it's even possible to run a Java applet in Titanium.
When you say "record screen", I'm assuming you mean video. Correct?
The only way to do this in Titanium Desktop right now is to take a bunch of screenshots and string them together (encoding would probably need to happen server-side).
Depending on how long your videos need to be, this probably won't work for you. I'm also not confident you could capture screenshots quickly enough to achieve a usable frame rate.
Past that, a module could be developed for Desktop to support some native APIs to record video. That's not something I see on the horizon, though.
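If you did go the module route, the Mac-native half could be little more than a timed screen grab. A hedged one-frame sketch (10.6+ CGDisplayCreateImage):

    #import <Cocoa/Cocoa.h>

    // One frame of the main display; a recorder would call this on a timer
    // and hand the frames to an encoder (likely server-side, as noted above).
    CGImageRef frame = CGDisplayCreateImage(CGMainDisplayID());
    if (frame != NULL) {
        NSBitmapImageRep *rep =
            [[NSBitmapImageRep alloc] initWithCGImage:frame];
        NSData *png = [rep representationUsingType:NSPNGFileType
                                        properties:@{}];
        [png writeToFile:@"/tmp/frame.png" atomically:YES];
        CGImageRelease(frame);
    }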
I hope this helps, albeit a rather dismal answer. -Dawson

In OSX, how do I determine the position of windows, and which window is active?

I'm working on an idea for a type of window manager for OSX, similar to Cinch or SizeUp. In order to do this I need to be able to determine the positions of various windows, and which window is active. Some kind of callback when the active window changes would also be useful, as would how to handle multiple screens and multiple spaces.
I'm resigned to the fact that I'll probably need to learn Objective-C for this, but if there is a way to do this type of thing from Java, that would be particularly awesome.
Can anyone point me to the appropriate place in the OSX API for these things?
You'll want to take a look at the Accessibility API, as discussed here.
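As a hedged sketch of what that looks like in practice (error checks omitted, and your process must be trusted for accessibility access):

    #import <Cocoa/Cocoa.h>
    #import <ApplicationServices/ApplicationServices.h>

    // Walk from the system-wide element to the focused app, then to its
    // focused window, then read that window's position.
    AXUIElementRef systemWide = AXUIElementCreateSystemWide();

    AXUIElementRef focusedApp = NULL;
    AXUIElementCopyAttributeValue(systemWide,
                                  kAXFocusedApplicationAttribute,
                                  (CFTypeRef *)&focusedApp);

    AXUIElementRef focusedWindow = NULL;
    AXUIElementCopyAttributeValue(focusedApp,
                                  kAXFocusedWindowAttribute,
                                  (CFTypeRef *)&focusedWindow);

    AXValueRef positionValue = NULL;
    AXUIElementCopyAttributeValue(focusedWindow,
                                  kAXPositionAttribute,
                                  (CFTypeRef *)&positionValue);

    CGPoint position;
    AXValueGetValue(positionValue, kAXValueCGPointType, &position);
    NSLog(@"Focused window is at (%f, %f)", position.x, position.y);

For the callback you mentioned, you can register an AXObserver for kAXFocusedWindowChangedNotification on the application element to hear about focus changes.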