Share Textures Between 2 OpenGL Contexts - objective-c

I have an existing OpenGL context, using an OpenGL 2.1 core profile. I am able to draw objects/textures/etc. no problem. However, now I want my application to launch a separate NSWindow, with an NSOpenGLView, that displays part of a texture I drew in the original renderer's view. After some reading, I eventually bumped into the topic of context sharing, which I think may be the route I have to take if I want to pull this off.
My shared OpenGL context is of type CGLContextObj, but I don't know what to do with it, as my window resides in a different process. I've read the Apple documentation on rendering contexts, but I am unable to apply the concepts they laid out when there are barely any examples for me to go through. Any advice will be really appreciated, thank you in advance.
EDIT:
Perhaps I did not give enough description, my apologies. I subclass NSOpenGLView, and in its init I do the following:
// *** irrelevant initialization stuff above inside init *** //
// Get pixel format from first context to be used for NSOpenGLView when it's finally initialized later
_pixFormat = [[NSOpenGLPixelFormat alloc] initWithAttributes:(void*)_attribs];
// We will create a CGLPixelFormatObj from our C array of pixel format attributes
GLint nPix;
CGLPixelFormatObj myCgPixObj;
CGLChoosePixelFormat(_attribs, &myCgPixObj, &nPix);
// Now that we have the pixel format in CGLPixelFormatObj form, create a CGLContextObj to be passed in later when we init NSOpenGLView
CGLContextObj newContext;
CGLCreateContext(myCgPixObj, mainRenderingContext, &newContext);
// Create an NSOpenGLContext object here to feed into NSOpenGLView
NSOpenGLContext* _contextForGLView = [[NSOpenGLContext alloc] initWithCGLContextObj:newContext];
[_contextForGLView setView:self];
[self setOpenGLContext:_contextForGLView];
// We don't need this anymore
CGLDestroyPixelFormat(myCgPixObj);
return self;
I am able to draw objects in this view just fine. But I get a blank white rectangle whenever I try to use the textures created in the main rendering context. I'm a little lost on how to proceed from here, I have never dealt with shared contexts before.

Seems like I got it working, partially at least, since I had to force the view to redraw by moving my window around to actually render the texture from the main context (another problem for another time!). Anyway, here's how I did it:
My main rendering context is supplied by a host application (yes, I'm working on a plugin), and is of type CGLContextObj. I wrap that context in an NSOpenGLContext object by calling initWithCGLContextObj:.
Next step was to create an NSOpenGLPixelFormat object, initializing it with the pixel format attributes used by the host application's renderer. This step is important as it ensures that the rendering context that will be used in my view will have the same OpenGL core profile, along with other attributes used by the host application.
Then in my subclassed NSOpenGLView, I create a new NSOpenGLContext object, preferably in the prepareOpenGL method, by using initWithFormat:shareContext: for allocation. I used the NSOpenGLPixelFormat and NSOpenGLContext objects created previously to pass as parameters.
Upon assigning the newly created context to my view, I was able to render the textures from the main rendering context.
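For reference, here is roughly what that boils down to in code - a minimal sketch, where hostCGLContext and hostAttribs stand in for whatever the host application actually supplies:
// Wrap the host application's CGLContextObj so it can act as the share context
NSOpenGLContext *hostContext = [[NSOpenGLContext alloc] initWithCGLContextObj:hostCGLContext];
// Build a pixel format from the same attributes the host renderer uses,
// so both contexts have the same core profile and capabilities
NSOpenGLPixelFormat *pixFormat = [[NSOpenGLPixelFormat alloc] initWithAttributes:hostAttribs];
// Inside the NSOpenGLView subclass, e.g. in prepareOpenGL:
NSOpenGLContext *sharedContext = [[NSOpenGLContext alloc] initWithFormat:pixFormat shareContext:hostContext];
[sharedContext setView:self];
[self setOpenGLContext:sharedContext];
[sharedContext makeCurrentContext];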

Related

Transferring Touch points between classes [Objective-C]

I'm working on an alternate version of a program I already wrote, it's mostly for the sake of understanding a little more.
In Xcode (Objective-C), I have a ViewController that displays a UIView subclass (GraphicsView) that draws a line from the center to the touch point. This sub-view is smaller than the view controller's main view.
The view controller has a label that outputs the coordinates of the touched point.
So far I was able to get everything working, so that if you touch inside the sub-view you get the line AND the coordinates updated, and if you touch outside the sub-view you only get the coordinates updated. I did this using delegates, which was a little complicated.
I've been reading some books and I learned about using the extern feature and global variables (which are supposed to be bad practice) and I wanted to try the same app but using global variables.
I declared my extern CGPoint in ViewController.h and imported it in the GraphicsView.m file, and in the touchesBegan:withEvent: method I put the assignment myGlobalPoint = touchedPoint; followed by an NSLog that displays the coordinates. So far it works (however, it does not update the coordinates).
However, whenever I touch outside the sub-view, into the main view, the app crashes with an EXC_BAD_ACCESS message. From what I understand, the main view cannot access the global variable if it's declared in another class?
I've read many other Stack Overflow posts about this and I've tried the methods suggested, but I keep getting this error.
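For what it's worth, here is a minimal sketch of how an extern global is normally split between files (assuming a global named myGlobalPoint; whether this is actually what is behind the EXC_BAD_ACCESS I can't say):
// ViewController.h (or any shared header): declare the global
extern CGPoint myGlobalPoint;
// In exactly one .m file (e.g. ViewController.m): define it
CGPoint myGlobalPoint;
// GraphicsView.m: assign it when a touch arrives
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    myGlobalPoint = [touch locationInView:self];
    NSLog(@"Touched at %@", NSStringFromCGPoint(myGlobalPoint));
}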

Apple's ZoomingPDFViewer Example - Object creation

I'm currently working on an App which should display and allow users to zoom a PDF page.
Therefore I was looking on the Apple example ZoomingPDFViewer.
Basically I understand the sample code.
But a few lines are not obvious to me.
Link to the sample code:
http://developer.apple.com/library/ios/#samplecode/ZoomingPDFViewer/Introduction/Intro.html
in PDFView.m:
//Set the layer's class to be CATiledLayer.
+ (Class)layerClass {
return [CATiledLayer class];
}
What does the code above do?
And the second code snippet I don't understand in PDFView.m again:
self = [super initWithFrame:frame];
if (self) {
CATiledLayer *tiledLayer = (CATiledLayer *)[self layer];
...
I know it creates a CATiledLayer object. But how it is created is not clear to me.
I hope someone could give me a short answer to my question because I don't want to use code which I don't understand.
Thank you!
The TiledPDFView class (in TiledPDFView.h) is a subclass of UIView, so you can see what documentation UIView has on that method. According to the docs, it looks like:
layerClass - Implement this method only if you want your view to use a different Core Animation layer for its backing store. For example, if you are using OpenGL ES to do your drawing, you would want to override this method and return the CAEAGLLayer class.
So it seems that it is asking the Core Animation system to use a tiled-layer. Further docs from CATiledLayer:
CATiledLayer is a subclass of CALayer providing a way to asynchronously provide tiles of the layer's content, potentially cached at multiple levels of detail.
As more data is required by the renderer, the layer's drawLayer:inContext: method is called on one or more background threads to supply the drawing operations to fill in one tile of data. The clip bounds and CTM of the drawing context can be used to determine the bounds and resolution of the tile being requested.
Regions of the layer may be invalidated using the setNeedsDisplayInRect: method; however, the update will be asynchronous. While the next display update will most likely not contain the updated content, a future update will.
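In other words, overriding layerClass simply tells UIView which CALayer subclass to instantiate as its backing layer; by the time initWithFrame: runs, [self layer] already is a CATiledLayer and just needs configuring. A rough sketch of how the two pieces fit together (the property values are illustrative, not taken from the sample):
+ (Class)layerClass {
    // UIView consults this when it creates its backing layer,
    // so every instance of this view is backed by a CATiledLayer.
    return [CATiledLayer class];
}

- (id)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
        // The backing layer is already a CATiledLayer; cast it and configure it.
        CATiledLayer *tiledLayer = (CATiledLayer *)[self layer];
        tiledLayer.levelsOfDetail = 4;
        tiledLayer.levelsOfDetailBias = 4;
        tiledLayer.tileSize = CGSizeMake(512.0, 512.0);
    }
    return self;
}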

Cocoa app gives EXC_BAD_ACCESS on any GL function

It seems that no matter what GL function I call, I get EXC_BAD_ACCESS. However, I'm calling these functions in readFromURL:ofType:error: of an NSDocument subclass, for some offscreen drawing. If I remove that code, and try to use GL later, once everything's loaded, everything works fine. Is this a GL context issue?
I read Apple's GL guide, but in the section about offscreen drawing, it just told me how to use framebuffers. Which I do, but since glGenFramebuffersEXT crashes just like everything else, it's not very helpful.
Is there some sort of context creation I need to perform, and if so, what's the best way to do it?
Yes, OpenGL calls need a context. If you have an NSOpenGLView, you need to get its context and make it current:
[[openGLView openGLContext] makeCurrentContext];
// glCalls()
If you're not using NSOpenGLView, you can create an NSOpenGLContext yourself.
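For example, a minimal sketch of creating a context by hand (the attribute list here is just an illustration):
// Choose a pixel format, create a context from it, and make it current
NSOpenGLPixelFormatAttribute attrs[] = {
    NSOpenGLPFAAccelerated,
    NSOpenGLPFAColorSize, 24,
    NSOpenGLPFADepthSize, 16,
    0
};
NSOpenGLPixelFormat *pixFmt = [[NSOpenGLPixelFormat alloc] initWithAttributes:attrs];
NSOpenGLContext *glContext = [[NSOpenGLContext alloc] initWithFormat:pixFmt shareContext:nil];
[glContext makeCurrentContext];
// GL calls (glGenFramebuffersEXT and friends) are now legal on this thread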

CABasicAnimation and custom types

I'm not very familiar with Core Animation, so I hope I've just missed something pretty simple. I want to animate a custom property (NSGradient) of an NSView in a simple manner, with [[view animator] setGradient:gradient];. I defined + (id)defaultAnimationForKey:(NSString *)key and returned a simple CABasicAnimation; however, no animation is executed. Since this works for simpler types and NSColor, I guess CABasicAnimation doesn't work with gradients. Fine, but in this particular case gradients are trivial (two stops, always), so I can easily write an interpolation function. The question: how can I define a custom interpolation? I googled around regarding delegates on the view, layer and animations, subclassing the animation class, etc., but I wasn't able to figure things out. Thanks!
I thought I remembered passing by some Apple documentation, when I was learning how to use Core Animation, that showed how to set up animations that can't be handled by the property types that come with predefined animations. Along the way I stumbled across some sample code from Apple that is described as:
A single gradient layer is displayed and continuously animated using new random colors.
That may be the answer to the specific task you already handled another way. I found it in the Documentation and API Reference within Xcode, and the name of the sample code is simply Gradients. (Note that there is an original version 1.0 and an updated version 1.1 that was redone this year in April, and so should be easier to use with current tools.)
But the answer to the larger question - creating a custom animation that can't be automated by Core Animation itself - is to follow the example from Apple's Animation Programming Guide for Cocoa in the section Using an NSAnimation Object. It's described under the topic Subclassing NSAnimation, and the recommended method is shown under the heading Smooth Animations. You override the setCurrentProgress: method so that each time it is called, you first invoke super (so that NSAnimation updates the progress value, i.e., your custom animated property) and then do any updating or drawing needed for the next frame of your animation. Here are the notes and example code provided by Apple in the referenced documentation:
As mentioned in “Setting and Handling Progress Marks,” you can attach a series of progress marks to an NSAnimation object and have the delegate implement the animation:didReachProgressMark: method to redraw an object at each progress mark. However, this is not the best way to animate an object. Unless you set a large number of progress marks (30 per second or more), the animation is probably going to appear jerky.
A better approach is to subclass NSAnimation and override the setCurrentProgress: method, as illustrated in Listing 4. The NSAnimation object invokes this method after each frame to change the progress value. By intercepting this message, you can perform any redrawing or updating you need for that frame. If you do override this method, be sure to invoke the implementation of super so that it can update the current progress.
Listing 4 Overriding the setCurrentProgress: method
- (void)setCurrentProgress:(NSAnimationProgress)progress
{
    // Call super to update the progress value.
    [super setCurrentProgress:progress];

    // Update the window position.
    NSRect theWinFrame = [[NSApp mainWindow] frame];
    NSRect theScreenFrame = [[NSScreen mainScreen] visibleFrame];
    theWinFrame.origin.x = progress *
        (theScreenFrame.size.width - theWinFrame.size.width);
    [[NSApp mainWindow] setFrame:theWinFrame display:YES animate:YES];
}
So basically you define a "progress value" (possibly composed of several values) that describes the state of your custom animation, and you write code that, given the current progress value, draws or changes what is drawn when the animation is at that particular state. Then you let NSAnimation run the animation using the normal methods of setting up an animation, and it will execute your code to draw each frame at the appropriate time.
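Applied to the gradient case, a rough sketch of such a subclass might look like this (assuming the target view exposes setGradient: and the two stops are simple start/end colors; the names here are made up for illustration):
@interface GradientAnimation : NSAnimation {
    NSColor *fromStart, *toStart;   // first gradient stop: from -> to
    NSColor *fromEnd, *toEnd;       // second gradient stop: from -> to
    NSView  *targetView;            // the view exposing -setGradient:
}
@end

@implementation GradientAnimation
- (void)setCurrentProgress:(NSAnimationProgress)progress
{
    // Let NSAnimation update the progress value first.
    [super setCurrentProgress:progress];
    // Interpolate each of the two stops and hand the view a freshly built gradient.
    NSColor *start = [fromStart blendedColorWithFraction:progress ofColor:toStart];
    NSColor *end   = [fromEnd blendedColorWithFraction:progress ofColor:toEnd];
    NSGradient *gradient = [[[NSGradient alloc] initWithStartingColor:start endingColor:end] autorelease];
    [(id)targetView setGradient:gradient];
    [targetView setNeedsDisplay:YES];
}
@end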
I hope that answers what you wanted to know. I doubt I could have found this easily by searching without having seen it before since I finally had to go to where I thought it might be and skim page by page through the entire topic to find it again!

Problems with an unusual NSOpenGLView setup

I'm trying to set up a subclassed NSOpenGLView in an unusual way and I am running into some problems. Basically, I am writing a program to perform a bioengineering simulation for my PhD and I need to be able to compile it under both Mac OS X and Unix (my machine is a Mac, but the sim will eventually run on a more powerful Unix machine). Since the code will get longer and longer over the next year and a half, I'd rather not have to keep track of two completely different versions of the program. So, I'm hoping to be able to compile the Objective-C code under Unix by avoiding Objective-C 2.0 and keeping the interface optional (it will mostly be there to perform setup before the long simulations and to monitor things for the short ones during development).
The current version works well without the interface - the simulation is performed correctly and the program is capable of rendering OpenGL frames and exporting them into image and video files without any problems. Since I am now adding the interface (right now just a simple window with an NSOpenGLView subclass and a "start" button) on top of that (so that I can still run the code without it, using an alternate version of main()), I have to "wire" OpenGL together in a weird way, since the drawing code is not in the drawRect: method, or even anywhere in the subclassed view, but instead in the "basic" program.
What I've done so far is this:
The main program (using an object called "Lattice") performs all the simulations and rendering, correctly outputting images and video to files. This also contains the NSOpenGLContext and calls [renderContext flushBuffer];
A subclass of NSOpenGLView called PottsView contains an instance of a lattice, which is initialized together with the view like this:
- (id)initWithFrame:(NSRect)frame {
    self = [super initWithFrame:frame];
    if (!self)
        return nil;
    // code
    frameSize.width = WIN_WIDTH;
    frameSize.height = WIN_HEIGHT;
    [self setFrameSize:frameSize];
    init_genrand64(time(0));
    latt = [Lattice alloc];
    if (SEED_TYPE) {
        latt = [latt initWithRandomSites];
    } else {
        latt = [latt initWithEllipse];
    }
    [[latt context] makeCurrentContext];
    return self;
}
drawRect: is empty.
PottsController is the object instanced in the InterfaceBuilder which connects the start button to the view. The start button simply tells the lattice to run for a number of steps.
Now, pressing start results in the simulation running correctly (i.e. output to files and terminal), but the PottsView is not working correctly. It remains white, but if I cmd+tab, parts of it change to sections of a rendered frame. Same if I press Exposé (F3).
I've tried several combinations of flushing, setNeedsDisplay, etc, but frankly speaking I'm lost. I haven't done any programming before this April and with this being (as far as I can tell) a completely backwards way of using NSOpenGLView I'm out of ideas. I'm hoping someone can suggest how I can make the current setup work or how to completely rewire the program (while still keeping the interface optional).
It's not clear how you think you have 'wired' the context and the view together. You can have as many OpenGL contexts as you like - just drawing into one won't make its contents show up in a random NSOpenGLView. Apologies if I have missed something.
NSOpenGLView is a fairly simple subclass of NSView that creates the context and pixel format. As you already have those you can do away with NSOpenGLView and use a custom NSView subclass.
You should have a look at this documentation: http://developer.apple.com/library/mac/#documentation/GraphicsImaging/Conceptual/OpenGL-MacProgGuide/opengl_drawing/opengl_drawing.html
To draw to the screen you must flush the graphics context from -drawRect:
This will block the main thread while the GPU processes your instructions, which could be a problem if you have many of them. It also cannot happen more than 50 fps.
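A minimal sketch of what that drawRect: might look like in your setup (drawCurrentState is a placeholder for whatever already issues the GL calls in Lattice):
- (void)drawRect:(NSRect)dirtyRect
{
    NSOpenGLContext *ctx = [latt context];  // the context the simulation renders into
    [ctx setView:self];                     // attach it to this view's drawable
    [ctx makeCurrentContext];
    [latt drawCurrentState];                // placeholder for the existing GL drawing code
    [ctx flushBuffer];                      // present the finished frame on screen
}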
If you are already rendering your frames to files, wouldn't you be better off observing the output directory and drawing each image as it is added - no OpenGL required?