Can I "move" the screen, but have objects on the screen still moving?

As the title says, I want to move the whole screen up so that my app's window can come in from the bottom. What I currently have is a floating window whose background has been set to a screenshot of the screen.
It looks fine to the user, except that anything animating in the background no longer appears above the window; the screen is effectively frozen.
Can I do this? The effect is similar to what Notification Center does in 10.8.

You can do this, for a certain definition of "can" - there's no API for it, but you can likely use the same SPI that the Finder/Dock/etc. use. The only complication may lie in needing special privileges or needing your code to be specially signed - I'm not sure what checks are in place.
It's not too tricky to figure this out; you can use tools like nm, otool, and even class-dump.
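As a point of reference, the screenshot-window setup the asker describes can be done with public API alone; a minimal sketch (the window level, duration, and slide distance are illustrative, and the snapshot is exactly why background content appears frozen):

```objc
#import <Cocoa/Cocoa.h>

// Sketch of the screenshot-backed window described above (public API only).
static NSWindow *MakeFrozenBackdrop(void) {
    NSRect screenRect = [[NSScreen mainScreen] frame];

    // Grab a snapshot of everything currently on screen.
    CGImageRef snapshot = CGWindowListCreateImage(CGRectInfinite,
                                                  kCGWindowListOptionOnScreenOnly,
                                                  kCGNullWindowID,
                                                  kCGWindowImageDefault);
    NSImage *image = [[NSImage alloc] initWithCGImage:snapshot
                                                 size:screenRect.size];
    CGImageRelease(snapshot);

    // Borderless window that covers the screen and shows the snapshot.
    NSWindow *backdrop = [[NSWindow alloc] initWithContentRect:screenRect
                                                     styleMask:NSBorderlessWindowMask
                                                       backing:NSBackingStoreBuffered
                                                         defer:NO];
    NSImageView *view = [[NSImageView alloc] initWithFrame:screenRect];
    [view setImage:image];
    [backdrop setContentView:view];
    [backdrop setLevel:NSScreenSaverWindowLevel];
    return backdrop;
}

// Slide the backdrop up to make room for a panel entering from the bottom.
static void SlideUp(NSWindow *backdrop, CGFloat distance) {
    NSRect target = [backdrop frame];
    target.origin.y += distance;
    [NSAnimationContext beginGrouping];
    [[NSAnimationContext currentContext] setDuration:0.3];
    [[backdrop animator] setFrame:target display:YES];
    [NSAnimationContext endGrouping];
}
```

Making live background content show through the "moved" screen is the part that needs the SPI mentioned above.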

Related

Desktop Window Manager (DWM) Overlay

I'd like to make an overlay with the following properties:
should work at least on Windows 8.1
should be on top of everything, like a mouse cursor
should incorporate the pixels that are already behind it, like a blur filter
no flickering
Details on each of these points:
1) I assume that the DWM is active and DirectX 11.2 is used. Sure, it would be nice to have it working on other Windows versions, but this has no priority.
2) The problem is that with simply using WS_EX_TOPMOST, application menus appear over my overlay. In my case this really hurts, as I'd like to display something with the same properties as a cursor. Imagine a cursor that is suddenly hidden whenever you open a menu -> unacceptable.
3) I'd like to read the pixels from the Windows desktop, including any effect Windows applies (like blur), and use this information for my filter. If I add my overlay, as described in 2), I should still be able to get a fresh, unobstructed copy of the background in the next frame, not read back my own overlay.
4) If I just write something onto the Windows desktop directly, it gets overwritten on the very next frame by Windows itself. This is not acceptable.
One example of such an application is a magnifying glass, which has exactly the properties I need - but for that case Windows 8.1 provides an API. In contrast, I'd like to write a program that displays a hand on the desktop (controlled by a Leap Motion) and that influences the Windows desktop, so you almost "feel" how you move your hand over it.
If I write a tiny DirectX and/or OpenGL application just for myself, this is all very easy:
render all the regular stuff to a texture
use this texture for a post processing filter and add all my stuff on top of it
render just a quad to the back buffer
But I'd like to do that for the whole Windows desktop.
I have found many different applications, but they are of no use to me:
applications that claim to be on top are still behind menus; this normally doesn't hurt much, but it is unacceptable for a cursor-like thing
screen-capture programs that hook themselves into all running programs are nice, but I want to hook into the DWM itself
screen-capture programs normally don't draw anything into the back buffer, so they get a fresh, unobstructed back buffer every frame
My question boils down to: how can I write my own magnifying glass for Windows 8.1?
I fear that my only serious option is to hook into the DWM, which is what I'm trying to avoid.
I'm happy to hear any ideas on how to achieve this, or pointers to applications that do what I describe.

iPad: not all of the screen recognising touches

So my current app project is a camera-based app, and all is going well so far, but I have run into a weird little issue and don't know if there's something basic I'm missing or if it's something more complex.
When I run my app on the iPad in landscape mode (home button on the right), the right end of the screen doesn't recognise touch-down events. However, if an item is spread across the boundary (half recognising touches, half not) and you press on the good half and drag, it still recognises the touch, and also the touch-up event when you let go. Through testing, I worked out that touches work fine up to pixel 768, which makes me think one of the views believes the application is still running in portrait. But when I run it in portrait, the bottom section (the same portion) doesn't work either.
I have looked at another couple of posts on SO:
Article 1
Article 2
I have tried the fixes they suggest, but have had no luck as of yet. It may be something to do with the fact that I have various views created both programmatically and in Interface Builder, and somewhere along the way something isn't being initialised correctly. I have tried changing them all, though I may have missed some.
If anybody can shed any light on my situation, that would be greatly appreciated.
Thanks,
Matt
I think the problem has something to do with the autoresizing mask. Have you set it? Try setting the background color of all views to see where they actually are.
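As a quick diagnostic along those lines, a sketch (the colors and mask values are just for debugging) that tints every subview and gives it a flexible autoresizing mask:

```objc
// In your view controller: tint each subview so its real frame is visible,
// and give it a flexible mask so it tracks the rotated bounds.
- (void)viewDidLoad {
    [super viewDidLoad];
    self.view.autoresizesSubviews = YES;
    NSArray *colors = [NSArray arrayWithObjects:
                       [UIColor redColor], [UIColor greenColor],
                       [UIColor blueColor], [UIColor orangeColor], nil];
    NSUInteger i = 0;
    for (UIView *subview in self.view.subviews) {
        subview.backgroundColor = [colors objectAtIndex:i++ % [colors count]];
        subview.autoresizingMask = (UIViewAutoresizingFlexibleWidth |
                                    UIViewAutoresizingFlexibleHeight);
    }
}
```

If one of the tinted views stops at pixel 768 in landscape, that's the view whose frame is stuck in portrait.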

I want to animate the movement of a foreign OS X app's window

Background: I recently got two monitors and want a way to move the focused window to the other screen and vice versa. I've achieved this by using the Accessibility API. (Specifically, I get an AXUIElementRef that holds the AXUIElement associated with the focused window, then I set the NSAccessibilityPositionAttribute value to move the window.)
I have this working almost exactly the way I want it to, except I want to animate the movement of windows. I thought that if I could get the NSWindow somehow, I could get its layer and use CoreAnimation to animate the window movement.
Unfortunately, I found out that this isn't possible. (Correct me if I'm wrong though -- if there's a way to do it this way it'd be great!) So I'm asking you all for help. How should I go about animating the movement of the focused window, if I have access to the AXUIElementRef?
-R
--EDIT
I was able to get a crude animation going with a while loop that moves the window by a small amount on each pass. However, the results are pretty sub-par. As you can guess, it takes a lot of unnecessary processing power and is still very choppy. There must be a better way.
The best way I can imagine would be to perform some hacky property comparison between the AXUIElement info values for the window and the info returned from the CGWindow API. Once you can ascertain which windows in the CGWindow API match which AXUIElementRefs, you could grab bitmaps of the current window contents, overlay the screen with your own custom-drawn faux windows to run the animation, then, as you drop the overlay, set the real AXUIElementRefs to the desired end-of-animation positions.
Hacky, tho.
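For the position-setting mechanics themselves, here is a minimal sketch using only the public accessibility API, pacing the steps with a timer rather than a tight while loop (the step count and rate are arbitrary, and this assumes assistive access is enabled and you already hold the window's AXUIElementRef):

```objc
#import <Foundation/Foundation.h>
#import <ApplicationServices/ApplicationServices.h>

// Helper: set a window's top-left position through the accessibility API.
static void SetWindowPosition(AXUIElementRef window, CGPoint point) {
    AXValueRef value = AXValueCreate(kAXValueCGPointType, &point);
    AXUIElementSetAttributeValue(window, kAXPositionAttribute, value);
    CFRelease(value);
}

// Step the window from one point to another on a GCD timer, so the run loop
// keeps breathing between frames instead of spinning in a while loop.
static void AnimateWindowMove(AXUIElementRef window, CGPoint from, CGPoint to) {
    const int steps = 30;   // ~0.5 s at 60 steps per second
    __block int step = 0;
    dispatch_source_t timer = dispatch_source_create(DISPATCH_SOURCE_TYPE_TIMER,
                                                     0, 0,
                                                     dispatch_get_main_queue());
    dispatch_source_set_timer(timer, DISPATCH_TIME_NOW,
                              NSEC_PER_SEC / 60, NSEC_PER_SEC / 120);
    dispatch_source_set_event_handler(timer, ^{
        CGFloat t = (CGFloat)(++step) / steps;
        SetWindowPosition(window, CGPointMake(from.x + (to.x - from.x) * t,
                                              from.y + (to.y - from.y) * t));
        if (step >= steps)
            dispatch_source_cancel(timer);
    });
    dispatch_resume(timer);
}
```

This is still setting the real window position each frame, so it won't be as smooth as a CoreAnimation overlay, but it avoids blocking a thread between steps.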

Mac OS X Overlay

How would I go about programming a HUD-type overlay in OS X?
I want an application that displays text at a certain point over a different application's window,
so that if the other application's window moves, the HUD part stays at the same coordinates relative to that window.
For the window itself, use a borderless, transparent window (plenty of examples) with your own custom view into which to draw your overlaid elements.
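A minimal sketch of that window setup (stock AppKit; the view you pass in would be your own custom drawing view):

```objc
#import <Cocoa/Cocoa.h>

// Borderless, transparent, click-through overlay window.
static NSWindow *MakeOverlayWindow(NSView *hudView) {
    NSRect frame = [[NSScreen mainScreen] frame];
    NSWindow *overlay = [[NSWindow alloc] initWithContentRect:frame
                                                    styleMask:NSBorderlessWindowMask
                                                      backing:NSBackingStoreBuffered
                                                        defer:NO];
    [overlay setOpaque:NO];
    [overlay setBackgroundColor:[NSColor clearColor]];
    [overlay setIgnoresMouseEvents:YES];      // clicks pass through to windows below
    [overlay setLevel:NSFloatingWindowLevel]; // raise this if you need to beat palettes
    [overlay setContentView:hudView];
    [overlay orderFrontRegardless];
    return overlay;
}
```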
For the "other applications' windows" part, there's no public API that's going to let you do this smoothly. You use Universal Access and its window location/navigation API, but it requires your users to turn on "Enable access for assistive devices" (I think it still can't be done programmatically). I don't believe it "lets you know" when a window moves, but I could be wrong. If it does, it'd likely be a one-shot "here's where I am now", so your overlay would likely not keep up. I also don't think it gives you the "window level" to allow you to make sure you're "above" any given window/sheet/palette.
The only other option (to move with other apps' windows) is a system-wide, invasive hack a la Application Enhancer (which is quite controversial). It's easy to get this wrong and destabilize a user's system (hence the controversy).
You could use undocumented CoreGraphics functions to track a window; see http://code.google.com/p/undocumented-goodness/source/browse/trunk/CoreGraphics/CGSPrivate.h

Display something on the screen every time an action is made

I have a problem and am not sure how to solve it. I am developing a multi-touch game and already have everything working fine, except for one small issue: I want to show messages on the playing screen each time the player makes an action. For example, when a finger moves right, a message nicely says "this finger is moving right" at the bottom of the screen; when the finger moves left, it says the finger is moving left, and so on. Can anyone show me how? I am using Cocos2D; it may be much easier in plain Cocoa.
Thanks a lot for any help.
You'll probably need to be more specific with your question, but for now, here's a general answer:
Handling touch events on the iPhone and Handling touch ("trackpad") events on the Mac.
You'll receive and process the events per the above, then you'll display the results somehow. For testing, you'll probably just want to log the results to the console. For the final version, you might have a label or even a custom view that draws the "instruction" in some fancier way. If the latter is the case, you'll want to read up on custom views and drawing for whichever platform you're using (or both).
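Since the question mentions Cocos2D, a rough sketch of the label approach in a Cocos2D 1.x-style layer might look like this (the label text, font, and position are placeholders):

```objc
#import "cocos2d.h"

// A layer that reports horizontal finger movement in a label at the
// bottom of the screen.
@interface GestureLayer : CCLayer {
    CCLabelTTF *statusLabel_;
}
@end

@implementation GestureLayer

- (id)init {
    if ((self = [super init])) {
        self.isTouchEnabled = YES;   // opt in to ccTouches* callbacks
        CGSize winSize = [[CCDirector sharedDirector] winSize];
        statusLabel_ = [CCLabelTTF labelWithString:@""
                                          fontName:@"Helvetica"
                                          fontSize:16];
        statusLabel_.position = ccp(winSize.width / 2, 20);  // bottom of screen
        [self addChild:statusLabel_];
    }
    return self;
}

- (void)ccTouchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint now = [touch locationInView:[touch view]];
    CGPoint before = [touch previousLocationInView:[touch view]];
    if (now.x > before.x) {
        [statusLabel_ setString:@"this finger is moving right"];
    } else if (now.x < before.x) {
        [statusLabel_ setString:@"this finger is moving left"];
    }
}

@end
```

For true multi-touch (per-finger messages) you'd iterate over all the touches in the set rather than taking anyObject.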