Detecting what part of screen/application window is using OpenGL? - objective-c

I am interested in identifying which part of an application is making use of OpenGL. Take Chrome as an example, where a YouTube video played in Flash gets rendered via OpenGL. I want to detect only the area of the application window where that OpenGL activity is taking place.
If I need to be inside the application to do this, say by injecting into Chrome, I can do that too.
Let me know if I can clarify the question further.

You tagged your question as MacOS X. In that case you can simply assume that everything on screen is drawn using OpenGL, because OpenGL is the graphics backend for the whole system.

There is a private API which allows you to find the surface on which OpenGL is rendering:
CG_EXTERN CGError CGSGetSurfaceBounds(CGSConnectionID, CGWindowID, CGSSurfaceID, CGRect *bounds);
Using this we can detect the specific area of the application that makes use of OpenGL.
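For illustration, here is roughly how that could be wired up. These are private, reverse-engineered declarations (the CGSGetSurfaceList signature in particular is an assumption taken from community CGSInternal headers), so treat this as a sketch that can break on any OS update:

#import <Foundation/Foundation.h>
#import <ApplicationServices/ApplicationServices.h>

// Private, undocumented declarations; signatures are assumptions based on
// reverse-engineered CGSInternal headers.
typedef int CGSConnectionID;
typedef int CGSSurfaceID;
CG_EXTERN CGSConnectionID CGSMainConnectionID(void);
CG_EXTERN CGError CGSGetSurfaceList(CGSConnectionID cid, CGWindowID wid,
                                    int countIds, CGSSurfaceID *ids, int *outCount);
CG_EXTERN CGError CGSGetSurfaceBounds(CGSConnectionID cid, CGWindowID wid,
                                      CGSSurfaceID sid, CGRect *bounds);

// Log the bounds of every surface (e.g. an OpenGL/Flash surface) attached
// to the given window.
static void LogSurfaceBounds(CGWindowID windowID) {
    CGSConnectionID cid = CGSMainConnectionID();
    CGSSurfaceID surfaces[16];
    int count = 0;
    if (CGSGetSurfaceList(cid, windowID, 16, surfaces, &count) != kCGErrorSuccess)
        return;
    for (int i = 0; i < count; i++) {
        CGRect bounds;
        if (CGSGetSurfaceBounds(cid, windowID, surfaces[i], &bounds) == kCGErrorSuccess)
            NSLog(@"surface %d: %@", surfaces[i],
                  NSStringFromRect(NSRectFromCGRect(bounds)));
    }
}

The window IDs themselves can be enumerated with the public CGWindowListCopyWindowInfo() API.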

Related

Rendering with D3D device in external DLL used in a WinPRT Xaml/DirectX sample has constant black flickering

I have a C++ DirectX-based third-party game engine compiled into a Windows Phone Runtime Component DLL. I'm working on integrating it into a project based on the Windows Phone Direct3D with XAML App template. The game engine DLL uses the D3D device, context and render target view provided by the application's Direct3DBackground::Draw() method.
The built-in renderer from the sample is gone, replaced by the game engine's.
I can render, but there is constant black flickering: every other frame is black. To prove to myself that it wasn't the renderer (which has been proven to work elsewhere), I reduced all the rendering code in the game engine DLL to simply setting a clear color. The result was still the same.
At first I thought it was because the Direct3DXamlAppComponent generated by the sample was maybe running in a different thread from the game engine DLL, but that's not the case. They're on the same thread.
What rendering problem could this configuration be causing?
Does the game engine's renderer need a separate d3d device?
Does the game engine's renderer need a separate d3d device context?
Things I haven't tried yet:
creating a second d3d device on the DLL
converting the game engine to provide its own IDrawingSurfaceManipulationHandler. But I'm not sure if it'll just have the same problem as above.
The problem came from the render target view: I didn't realize that the pointer to it gets updated every frame. I had set it on the game engine renderer only once at startup. Now that I update the render target view pointer every frame, the black flickering is gone.
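For illustration, the fix looks roughly like this in the template's draw hook (SetRenderTarget and RenderFrame are hypothetical engine-side methods; the Draw signature is the one from the Windows Phone Direct3D with XAML template):

// Direct3DBackground::Draw is called once per frame. The render target view
// passed in can be a different buffer each frame (the surface is
// double/triple buffered), so it must be forwarded on every call.
// m_engine->SetRenderTarget / RenderFrame are hypothetical engine methods.
void Direct3DBackground::Draw(ID3D11Device1 *device,
                              ID3D11DeviceContext1 *context,
                              ID3D11RenderTargetView *renderTargetView)
{
    // Wrong: caching renderTargetView once at startup makes every other
    // frame render into a stale buffer, producing the black flicker.
    // Right: hand the current view to the engine on every call.
    m_engine->SetRenderTarget(device, context, renderTargetView);
    m_engine->RenderFrame();
}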

Objective-C, Methods for animating GUI

I've created many types of interfaces using the Cocoa API, some of them using documented basic animation techniques and others by experimenting (such as placing an animated .gif inside an NSImage), which had somewhat catastrophic consequences. My question is: what is the correct, or most effective, way to create an animated and dynamic GUI so that it runs optimally and properly?
The closest example I can think of that uses a similar type of animation is something one might see done in Flash on any number of interactive websites or interfaces. I'm sure Flash can be used in a Cocoa app, but if there is a way to achieve a similar result without re-inventing the wheel or having to use third-party SDKs, I would love some input. Keep in mind I'm not just thinking of animation for games, iOS, etc.; I'm most interested in an animated GUI for Mac OS X, and making it 'flow' as one interacts with it.
If you wish to add many graphics animations, consider an OpenGL ES based Xcode project for iOS. That helps reduce performance problems. You can render each frame of the GIF as a 2D texture.
I would recommend that you take a look at Core Animation. It is Apple's framework for hardware-accelerated animations on both OS X and iOS, and it's built for making animated GUIs.
You can animate property changes for things like position, opacity, color and transforms, and also animate gradients with CAGradientLayer and non-rectangular shapes with CAShapeLayer, among many other things.
A good resource to get you started is the Core Animation Programming Guide.
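As a minimal sketch of the kind of property animation described above (the layer, points and durations are arbitrary example values, and the view owning the layer must be layer-backed):

#import <Foundation/Foundation.h>
#import <QuartzCore/QuartzCore.h>

// Slide a layer to the right while fading it in. Core Animation runs both
// animations hardware-accelerated on the render server.
static void SlideAndFadeIn(CALayer *layer) {
    CABasicAnimation *move = [CABasicAnimation animationWithKeyPath:@"position"];
    move.fromValue = [NSValue valueWithPoint:NSMakePoint(20.0, 20.0)];
    move.toValue   = [NSValue valueWithPoint:NSMakePoint(200.0, 20.0)];
    move.duration  = 0.5;

    CABasicAnimation *fade = [CABasicAnimation animationWithKeyPath:@"opacity"];
    fade.fromValue = @0.0;
    fade.toValue   = @1.0;
    fade.duration  = 0.5;

    [layer addAnimation:move forKey:@"move"];
    [layer addAnimation:fade forKey:@"fade"];

    // Also update the model values so the layer keeps its final state
    // after the animations are removed.
    layer.position = CGPointMake(200.0, 20.0);
    layer.opacity  = 1.0;
}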

360 degree video in MPMoviePlayerController

I am trying to develop an iPhone application which needs to show a 360 degree video like the one, and rotate the video as per the phone's movement. How can I do this? Is it possible with a normal MPMoviePlayerController?
I don't think you can do this with a normal MPMoviePlayerController, but there are several libraries out there to achieve this. Have a look here:
PanoramaGL
Panorama 360
They work with OpenGL and you can embed them in your Objective-C code.
EDIT:
As @Mangesh Vyas kindly pointed out, those are intended for use with fixed images only. However, they might be a suitable starting point for embedding video as well if you modify the code accordingly. They already handle direction, accelerometer input etc., so you don't have to implement all that yourself.
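To give an idea of the motion-handling half of the problem, here is a sketch using Core Motion (PanoController and videoLayer are hypothetical names; a real panorama player would feed the angle into its OpenGL camera rather than a layer transform):

#import <CoreMotion/CoreMotion.h>
#import <QuartzCore/QuartzCore.h>

// Rotate a layer to follow the device's yaw, updated at 60 Hz.
@interface PanoController : NSObject
@property (strong) CMMotionManager *motionManager;
@property (strong) CALayer *videoLayer;
@end

@implementation PanoController
- (void)startTracking {
    self.motionManager = [[CMMotionManager alloc] init];
    self.motionManager.deviceMotionUpdateInterval = 1.0 / 60.0;
    [self.motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
        withHandler:^(CMDeviceMotion *motion, NSError *error) {
            if (!motion) return;
            // Yaw around the vertical axis selects which part of the
            // 360 degree panorama is visible.
            self.videoLayer.transform =
                CATransform3DMakeRotation(-motion.attitude.yaw, 0.0, 0.0, 1.0);
        }];
}
@end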

Modifying video frames with QTKit and OpenGL

I am working on a project where I would like to open a video (on a Mac) with QTKit. That part I can do without a problem, but as I am playing it, I would like to edit or modify the video on the fly using OpenGL.
From what I understand, I should be able to intercept the frames and change them before they hit the display, but no matter what I do, I cannot seem to manage it.
It sounds like you should have a look at Core Video and the display link mechanism.
You basically get a callback on a high-priority thread with the decoded frame in a CVImageBuffer and can do whatever you like with it (including packing it up as a texture for OpenGL processing and display).
Apple provides documentation and demo code on its developer site.
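A sketch of that callback is below. The visual context setup (QTOpenGLTextureContextCreate plus SetMovieVisualContext on the QTMovie's underlying movie) is assumed to have happened elsewhere; only the per-frame hook is shown:

#import <QuartzCore/QuartzCore.h>
#import <QuickTime/QuickTime.h>
#import <OpenGL/gl.h>

// CVDisplayLink output callback: runs on a high-priority thread once per
// refresh. The void *context is assumed to be the movie's QTVisualContextRef.
static CVReturn FrameCallback(CVDisplayLinkRef link,
                              const CVTimeStamp *now,
                              const CVTimeStamp *outputTime,
                              CVOptionFlags flagsIn,
                              CVOptionFlags *flagsOut,
                              void *context)
{
    QTVisualContextRef visualContext = (QTVisualContextRef)context;
    QTVisualContextTask(visualContext);  // per-frame housekeeping

    if (QTVisualContextIsNewImageAvailable(visualContext, outputTime)) {
        CVImageBufferRef frame = NULL;
        if (QTVisualContextCopyImageForTime(visualContext, NULL,
                                            outputTime, &frame) == noErr) {
            // With an OpenGL texture visual context this is a
            // CVOpenGLTextureRef: bind it and draw your modified frame.
            GLenum target = CVOpenGLTextureGetTarget((CVOpenGLTextureRef)frame);
            GLuint name   = CVOpenGLTextureGetName((CVOpenGLTextureRef)frame);
            // ... glBindTexture(target, name); apply shaders; render ...
            CVBufferRelease(frame);
        }
    }
    return kCVReturnSuccess;
}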

Apple Magic Mouse Api

I just bought a Magic Mouse and I like it pretty much, and as a Mac developer it's even cooler. But there's one problem: is there already an API available for it? I want to use it in one of my applications to, for example, detect the user's finger positions, swipe or stretch gestures, etc.
Does anyone know if there's an API for it (and how to use it)?
The Magic Mouse does not use the NSTouch API. I have been experimenting with it and attempting to capture touch information. I've had no luck so far. The only touch method that is common to both the mouse and the trackpad is the swipeWithEvent: method. It is called for a two finger swipe on the device only.
It seems the touch input from the mouse is being interpreted somewhere else, then forwarded on to the public API. I have yet to find the private API that is actually doing the work.
Have a look here: http://www.iphonesmartapps.org/aladino/?a=multitouch
There's a fully working proof-of-concept using the CGEventPost method.
I have not tested, but I would be shocked if it didn't use NSTouch. NSTouch is the API you use to interact with the multi-touch trackpads on current MacBook Pros (and the new MacBooks that came out this week). You can check out the LightTable sample project to see how it is used.
It is part of AppKit, but it is a Snow Leopard only API.
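For reference, a minimal NSTouch sketch as used with the trackpad (TouchView is a hypothetical NSView subclass; per the answer above, the Magic Mouse may not actually deliver these events):

#import <AppKit/AppKit.h>

// Log normalized finger positions on a multi-touch device (10.6+).
@interface TouchView : NSView
@end

@implementation TouchView
- (instancetype)initWithFrame:(NSRect)frame {
    if ((self = [super initWithFrame:frame])) {
        [self setAcceptsTouchEvents:YES];  // opt in to touch delivery
    }
    return self;
}

- (void)touchesBeganWithEvent:(NSEvent *)event {
    NSSet *touches = [event touchesMatchingPhase:NSTouchPhaseBegan inView:self];
    for (NSTouch *touch in touches) {
        // normalizedPosition is in [0,1] x [0,1] across the device surface.
        NSPoint p = touch.normalizedPosition;
        NSLog(@"touch began at %.2f, %.2f", p.x, p.y);
    }
}
@end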
I messed around with the app linked below before getting my Magic Mouse. I was surprised to find that it also tracked the multi-touch points on the mouse.
There is a link in the comments to some source code that reads the raw data in a similar way, but there is no source for the app itself.
http://lericson.blogg.se/code/2009/november/multitouch-on-unibody-macbooks.html