OpenGL on Cocoa: CGLChoosePixelFormat() is terribly slow? - objective-c

I've built a small Cocoa app that uses OpenGL to draw some light content. I've used CAOpenGLLayer. While the app itself is very small and runs blazingly fast, the launch time of the app is not satisfactory at all: at random times the app stalls for about a second or two on launch before showing the content.
I've narrowed down the problem and found that the bottleneck is the CGLChoosePixelFormat() function that is called during OpenGL initialization. It literally takes ~1 second to execute.
For a clean experiment, I created a blank Cocoa app and added an NSOpenGLView to the window. Immediately, the app's launch time grew by 1-2 seconds, for the same reason.
Is there a way to fight this problem? There seems to be no way to avoid CGLChoosePixelFormat(); it's essential for getting OpenGL working on a Mac.
Also, Core Animation is said to be built on OpenGL under the hood, yet my Core Animation apps do not exhibit this slow startup problem at all. I also tried a symbolic breakpoint on CGLChoosePixelFormat in a Core Animation app, and it never triggers. So either Core Animation is not using OpenGL, or there is a different way to initialize it. Does anybody have a solution?
P.S. I know Metal is now the way to go for 3D graphics on Macs, but I need to do this particular project in OpenGL for backward-compatibility reasons.
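For reference, the stall can be reproduced and timed with nothing more than a bare CGL call. This is a minimal sketch; the single attribute shown is arbitrary, not the app's actual configuration:

    #import <Foundation/Foundation.h>
    #import <OpenGL/OpenGL.h>

    // Time a bare CGLChoosePixelFormat() call to confirm it is the bottleneck.
    CGLPixelFormatAttribute attrs[] = { kCGLPFADoubleBuffer, (CGLPixelFormatAttribute)0 };
    CGLPixelFormatObj pix = NULL;
    GLint npix = 0;
    CFAbsoluteTime start = CFAbsoluteTimeGetCurrent();
    CGLChoosePixelFormat(attrs, &pix, &npix);
    NSLog(@"CGLChoosePixelFormat took %.3f s", CFAbsoluteTimeGetCurrent() - start);
    if (pix) CGLDestroyPixelFormat(pix);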

I've run into exactly the same problem. So far I've found that it only occurs on the MacBook Pro 16" (2019).
Since this problem is hard to reproduce, I can do little more than guess. For now I have replaced the NSOpenGLPixelFormat initWithAttributes: call with [NSOpenGLView defaultPixelFormat], and I hope that will work.
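For clarity, here is a sketch of that swap; the explicit attribute list is a typical example, not the answerer's actual code:

    // Before: an explicit attribute list makes CGLChoosePixelFormat()
    // search for a matching format, which is where the stall occurs.
    // NSOpenGLPixelFormatAttribute attrs[] = {
    //     NSOpenGLPFADoubleBuffer,
    //     NSOpenGLPFAColorSize, 24,
    //     0
    // };
    // NSOpenGLPixelFormat *format =
    //     [[NSOpenGLPixelFormat alloc] initWithAttributes:attrs];

    // After: use AppKit's default format instead.
    NSOpenGLPixelFormat *format = [NSOpenGLView defaultPixelFormat];
    NSOpenGLView *glView = [[NSOpenGLView alloc] initWithFrame:frame
                                                   pixelFormat:format];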

Related

Why do I get a delay when executing animations first time?

I am using imageWithContentsOfFile: to load my .png images into an array. When the IBAction triggers the animation, there is a noticeable delay before the animation starts. Then, each time after that, it executes smoothly. I originally tried to load them with the imageNamed: method but experienced the same delay plus big performance problems on the device.
Load the images up front instead, for example in viewDidLoad.
If that doesn't work, use Instruments (the developer tool built into Xcode) to find what is causing the performance hit. Choose the OpenGL ES Driver template, under the iOS -> Graphics menu, then tick Hide System Libraries. It will show you which methods are taking a long time; click into those methods to see exactly which lines, then try to figure out alternative ways to do the same work.
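A minimal sketch of the preloading idea; the frame names, the count, and the animationFrames property are placeholders:

    // Preload and decode all frames once, before the IBAction ever fires,
    // so the first animation pass doesn't pay the image-loading cost.
    - (void)viewDidLoad {
        [super viewDidLoad];
        NSMutableArray *frames = [NSMutableArray array];
        for (NSUInteger i = 1; i <= 10; i++) {  // 10 frames is a placeholder count
            NSString *name = [NSString stringWithFormat:@"frame%lu", (unsigned long)i];
            NSString *path = [[NSBundle mainBundle] pathForResource:name ofType:@"png"];
            UIImage *image = [UIImage imageWithContentsOfFile:path];
            if (image) [frames addObject:image];
        }
        self.animationFrames = frames;  // hypothetical property holding the frames
    }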

AVAssetWriter fails when Application enters Background during Rendering

In my app I am rendering a video generated from images I retrieve from the user's photos. I have set up an AVAssetWriter with an AVAssetWriterInput that has an AVAssetWriterInputPixelBufferAdaptor. I'm able to transform the ALAsset objects I retrieve from the user's library into CVPixelBuffers and add them to the video, which is then saved as an MP4. Adding all the images to the video is done on a background thread which sends a notification to the main thread every frame, so the interface can be updated. All this works well, and I get a usable movie file out of the app.
My problem now is that when the user switches to another application, after my app becomes active again the status of the AVAssetWriter changes to "failed", and I am not able to add any more images to the movie file. First I thought I might have to end the current session on the writer and open a new one once the app has become active again, but that doesn't seem to help.
I was just wondering what the general approach would be when I'd like the user to be able to switch to other applications. The best solution would be if the rendering could continue in the background; I suppose I'd need to request background execution time from UIApplication for that. But for now I'd be happy if rendering could just continue after resuming my app.
I won't post any code for now, because it's really a lot, and my question possibly is conceptual. If you need to see code, I'll post it.
Edit 1:
Tested on iOS 4.3 and iOS 5. I've seen background rendering on other apps such as iTimelapse, but I'm not sure which frameworks they use.
Edit 2:
I now have the information from an Apple dev forum member that AVAssetWriter does not work in the background. So is there any other framework out there capable of rendering QuickTime videos?
Turns out that AVAssetWriter just won't survive the app being suspended. You can buy an extra 10 minutes of render time by requesting background execution time, but after that the asset writer fails. The same happens when certain services on the phone kick in; for example, making or answering a call will make the AVAssetWriter fail as well.
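Requesting that extra background time looks roughly like this sketch; the writer still fails once the grace period expires:

    // Ask iOS for extra execution time when moving to the background so
    // in-flight AVAssetWriter work can finish or be wrapped up cleanly.
    __block UIBackgroundTaskIdentifier taskID;
    taskID = [[UIApplication sharedApplication]
                 beginBackgroundTaskWithExpirationHandler:^{
        // Grace period over: stop appending frames here, because the
        // writer will fail after this point.
        [[UIApplication sharedApplication] endBackgroundTask:taskID];
        taskID = UIBackgroundTaskInvalid;
    }];
    // ... keep rendering on the background thread, then call
    // [[UIApplication sharedApplication] endBackgroundTask:taskID]
    // once the movie file is finished.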
If there are any OpenGL calls made while your app is in the background, then that would explain this behaviour; it looks fairly likely. From the OpenGL ES Programming Guide:
Background Applications May Not Execute Commands on the Graphics Hardware
An OpenGL ES application is terminated if it attempts to execute OpenGL ES commands on the graphics hardware. This not only refers to calls made to OpenGL ES while your application is in the background, but also refers to previously submitted commands that have not yet completed. The main reason for preventing background applications from processing OpenGL ES commands is to make the graphics processor completely available to the frontmost application. The frontmost application should always present a great experience to the user. Allowing background applications to hog the graphics processor might prevent that. Your application must ensure that all previously submitted commands have been finished prior to moving into the background.
The docs go on to enumerate a set of guidelines for the enter-background/foreground app delegate callbacks. I think finding a way to do the rendering without the graphics hardware would be tricky. Also, the frameworks that allow MP4 encoding (like FFmpeg) are mainly GPL/LGPL, so you need to be careful if you're dealing with a commercial product (LGPL only permits linking to the library dynamically, not statically, which doesn't help on iOS), as the license would otherwise propagate to your code.
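Following the guide's requirement, draining the GPU before entering the background could look like this sketch; stopRenderLoop and the glContext property are hypothetical names:

    #import <OpenGLES/ES2/gl.h>

    // Per the guide: ensure all previously submitted GL commands have
    // completed before the app is fully in the background.
    - (void)applicationDidEnterBackground:(UIApplication *)application {
        [self stopRenderLoop];                           // hypothetical: stop the display link/timer
        [EAGLContext setCurrentContext:self.glContext];  // hypothetical context property
        glFinish();                                      // block until the GPU pipeline is drained
    }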

OpenGL updating uniforms on Mac OSX

I'm programming a nice little game that uses shader generated simplex noise for displaying on the fly computed random terrain.
I'm using Objective-C and Xcode 4, and I have gotten everything to run nicely using a subclass of NSOpenGLView. The subclass first compiles the shader and then renders a quad with the noise texture. The program has no problems running this at an acceptable speed (60 Hz).
The subclass of NSOpenGLView uses an NSRunLoop to fire a selector which in turn calls drawRect:(NSRect)dirtyRect. This is done every frame.
Now, I want the shader to use a uniform that is updated each frame.
The shader should be able to react to a variable change that might occur every frame thus I'm trying to update the uniform at this frequency. The update of the uniform is done in the drawRect:(NSRect)dirtyRect function.
I am partially successful. The screen updates exactly as I'd like for the first 30 frames, then it stops updating the uniform, even though the glUniform1f() call and an NSLog sit right next to each other and the NSLog always fires!
The strange part is that if I hold space (or any other key, for that matter) pressed, the uniform is updated as it should be.
Clearly I am missing something here about how OS X or OpenGL or something else handles uniforms.
An explanation of what might be ailing me would be appreciated but a pointer to where I can find information about this will suffice.
Update: After fiddling with glGetError() and glGetUniform*() I've noticed that the program works as intended when left alone. However, when I use the trackpad for input the uniform is reset to 0.000 while the rest of the program shows no errors.
First, have you tried calling glGetError() right after glUniform1f() to see what comes out?
I have a very limited knowledge of Mac OS programming (I did some iOS programming two years ago and forgot most of it since then) so the following is a wild guess.
Are you sure drawRect:(NSRect)dirtyRect is called by the same thread that owns the OpenGL context? As far as I know, OpenGL is not thread-safe, so it could be that your glUniform1f() attempts are made from a different thread and are thus unable to do anything.
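Combining both suggestions, a sketch of what the per-frame update could look like inside drawRect:; the uniform location and time variables are placeholders:

    // Make sure the view's context is current on this thread, set the
    // uniform, and check for errors immediately afterwards.
    // Assumes the shader program is already bound with glUseProgram().
    [[self openGLContext] makeCurrentContext];
    glUniform1f(timeUniformLocation, currentTime);  // placeholder names
    GLenum err = glGetError();
    if (err != GL_NO_ERROR) {
        NSLog(@"glUniform1f failed: 0x%x", err);
    }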

Floating pictures screensaver on Mac OS X

I'm just learning to use Xcode and program in Objective-C (my plan is to write an app we need for my business, since I can't find one that does what I need). It's going well, and as an exercise I've been playing with database apps and screensavers.
I'm trying to write a screensaver that does nothing more than show 3 or 4 pictures floating randomly on the screen, similar to the out-of-the-box pictures screensaver that comes with Mac OS X.
I've played with the code in http://cocoadevcentral.com/articles/000088.php and while very informative I still don't know how to add those 3 or 4 pictures and make them move.
Anyone out there can point me to sample code? Or a project that I can use as reference?
Again, this is just self learning.
Thank you!
This will probably be your first port of call.
My approach to this was to create a Quartz Composition using Quartz Composer. I found this easier than figuring out how to draw and animate everything by hand.
You can display a Quartz Composition inside a ScreenSaverView by using the QCView class. This setup worked well for me.
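A minimal sketch of that setup; the class name and the "Floaters.qtz" composition file are placeholders:

    #import <ScreenSaver/ScreenSaver.h>
    #import <Quartz/Quartz.h>

    @interface FloatersSaverView : ScreenSaverView
    @end

    @implementation FloatersSaverView

    - (id)initWithFrame:(NSRect)frame isPreview:(BOOL)isPreview {
        self = [super initWithFrame:frame isPreview:isPreview];
        if (self) {
            // Host a Quartz Composition inside the screensaver view.
            QCView *qcView = [[QCView alloc] initWithFrame:self.bounds];
            NSString *path = [[NSBundle bundleForClass:[self class]]
                                 pathForResource:@"Floaters" ofType:@"qtz"];
            [qcView loadCompositionFromFile:path];
            [qcView setAutoresizingMask:NSViewWidthSizable | NSViewHeightSizable];
            [self addSubview:qcView];
            [qcView startRendering];
        }
        return self;
    }

    @end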

Jerky/juttery (core-)animation in a screensaver?

I've built a screensaver for Leopard which utilises core-animation. It doesn't do anything overly complicated; uses a tree of CALayers and CATextLayers to produce a "table" of data in the following structure:
- root
  - maincontainer
    - subcontainer
      - row [multiple]
        - cell [multiple]
          - text layer
At most there are 50 CALayers rendered on the screen at any one time.
Once I've built the "table", I'm animating the "subcontainer" into view using CABasicAnimation. Again, I'm not doing anything fancy, just a simple fade-in.
The problem is that while the animation does happen, it's painful to watch. It's jerky on my development machine, a 3.06 GHz iMac with 4 GB of RAM, and seems to chop the animation into about 10 steps rather than showing a gradual change.
It gets worse on the PowerPC Mac mini the screensaver is targeted for; it refuses to even play the animation, generally "tweening" from the beginning of the animation (0% opacity) to half-way (50%) and then completing.
I'm relatively new to Objective-C and my experience is based on garbage-collected environments, but I can't believe I'm leaking enough memory at the point the screensaver starts to cause such problems.
Also, I'm quite sure it's not a problem with the hardware. I've tested the built-in screensavers that use Core Animation, and downloaded a few free CA-based ones for comparison, and they run without issue on both machines.
Information is pretty thin on Google with regard to using CA in screensavers, or using CA in general for that matter, and advice/tutorials on profiling/troubleshooting screensavers seem to be non-existent. So any help the community can provide would be most welcome!
--- UPDATE ---
Seems as though implicit animations help smooth things out a little. Still somewhat jerky, but not as bad as trying to animate everything with explicit animations as in my original approach.
There isn't much special about a screen saver. I assume you've started with the Core Animation Programming Guide? Running it through Instruments will give you a lot of information about where you're taking too much time.
The code you're using to do the fade-in would be useful. For what you're describing, you don't even need CABasicAnimation; you can just set the animatable properties of the layers, and they animate by default. Make sure you've read up on Implicit Animations; the rest of that page is probably of use as well.
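As a sketch, the fade-in can be reduced to a property assignment (subcontainer is the layer from the question; the duration is an arbitrary example):

    // Implicit animation: assigning an animatable property animates it
    // with the default duration; no CABasicAnimation object is needed.
    subcontainer.opacity = 1.0f;

    // To control the duration, wrap the change in a transaction:
    [CATransaction begin];
    [CATransaction setAnimationDuration:0.5];
    subcontainer.opacity = 1.0f;
    [CATransaction commit];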
Most of your job with Core Animation is getting out of the way. It generally knows what it's doing, and most problems come from second-guessing it or trying to tell it too much.