I am experiencing HUGE performance issues with iOS9 and I just can't figure out what to do. I've read many posts - here, and here for example, but their suggested solutions don't help or make little difference.
My game has gone from 60fps on an old iPad 2 (iOS 8.4) to < 15fps on a new iPad mini (iOS 9).
I'm trying to work out the main culprit. I'm pretty certain one of them is SKCropNodes. I usually render several SKCropNodes in my scene (6 - 18). This was never an issue in iOS8, but it appears iOS9, while it does a better job of cropping, also eats up performance doing so.
If I render the crop nodes as normal SKSpriteNodes, I gain maybe 5fps on old devices, and up to 30 on a newer iPhone 6. I have no alternative to using crop nodes, but it can't be the whole problem.
I thought perhaps the wrong texture atlas was being used - i.e. one of a much larger resolution. However, forcing my device to use a very small atlas made no difference.
I'm using Texture Packer to generate my atlases, with scaling variants for the different devices. I've noticed XCAssets now features an option to add a Sprite Atlas (I can't seem to find any documentation about this). This isn't suitable for my game, since I use hundreds of sprites. I've tried adding my atlases to XCAssets, but then for some reason it won't use the scaling variants. Nevertheless, even with low-res textures, it still runs terribly.
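For reference, the atlases are loaded through the standard SKTextureAtlas API, roughly like this (the atlas and texture names here are placeholders, not my real asset names):

SKTextureAtlas *atlas = [SKTextureAtlas atlasNamed:@"Characters"];   // placeholder atlas name
SKTexture *texture = [atlas textureNamed:@"hero_walk_01.png"];       // placeholder frame name (with the .png extension, as noted below)
SKSpriteNode *sprite = [SKSpriteNode spriteNodeWithTexture:texture];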
I have tried setting
skView.ignoresSiblingOrder = YES;
and given all of my nodes explicit zPosition values, but it has had no effect. I have also added the .png extension to every image name (originally a problem that meant they wouldn't render).
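In other words, something like this (the node names and values are just illustrative, not my real scene graph):

// With ignoresSiblingOrder on, every node gets an explicit zPosition
// so Sprite Kit is free to batch draws.
backgroundNode.zPosition = 0;
characterLayer.zPosition = 10;
hudLayer.zPosition = 100;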
I have some SKEffectNodes in my scenes, but removing and adding these doesn't seem to have an effect.
I don't understand how the same hardware and the same code can produce such drastically different results. Obviously Apple has changed something to do with rendering that has had an adverse effect, and they seem to have no intention of fixing these issues. I am aware of bug reports on the issue that have been open for months - since long before iOS 9 was released.
I've been working on this game for 2 years now, and only just released it before iOS9. It's now suffering from terrible performance, and regular crashes.
Has anyone figured out what exactly Apple did to kill performance? If I knew that, I could at least try to work around it... Thanks.
UPDATE
Below are some figures for the same scene, with the absolute maximum number of nodes the game would generate at one time.
iOS 8, iPad 2, ~200 nodes, ~100 draws, 58.7 - 60 fps
iOS 9, iPhone6, ~280 nodes, ~216 draws, around 20 fps
I assume the difference in number of nodes is due to the different screen sizes. If I change the scene on iPhone 6 to achieve equivalent values, the FPS is still around 24.
UPDATE 2
Using Xcode's template Sprite Kit project, and changing the spaceship to an SKCropNode containing the spaceship, on iOS 8 I am able to add hundreds of ships with no frame rate issue. On iOS 9, with the same project, I can add about 25 before the frame rate drops below 30.
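The modification to the template's touchesBegan: was essentially this (the mask size is just what I used for the test):

SKSpriteNode *ship = [SKSpriteNode spriteNodeWithImageNamed:@"Spaceship"];

SKCropNode *cropNode = [[SKCropNode alloc] init];
cropNode.maskNode = [SKSpriteNode spriteNodeWithColor:[SKColor blackColor]
                                                 size:CGSizeMake(200, 200)];
[cropNode addChild:ship];
cropNode.position = location; // touch location from the template
[self addChild:cropNode];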
iOS 8 on iPad 2: [screenshot]
iOS 9 on iPhone 5: [screenshot]
In terms of texture atlas use, as in my comment, I can't guarantee that anything will be drawn from the same atlas. My game contains customised characters, with assets from a range of atlases (each of which contains ~100 textures). There can be up to 9 characters on screen at a time. I understand this isn't the most efficient in terms of draws, but I've never had a problem until iOS9...
Update 3
I've submitted a bug to Apple, including my sample program. I also used up one of my tech support requests. So far, nothing from Apple.
There are two main issues.
One is a radical drop in the performance of Sprite Kit from iOS 8 to iOS 9 for all manner of reasons, some of which you've linked to, but there are others. It seems many aspects of rendering, sorting and storing/handling nodes are either broken or are doubling or tripling their previous load on the CPU/GPU.
However, there's another issue that further compounds efforts to resolve any of the possible performance problems. It's a pervasive and seemingly arbitrary frame rate capping mechanism that's most frequently noticeable when it operates at 40fps, but it also operates at other frequencies.
For many years this capping has been (rarely) noticed when people manually use CADisplayLink to create game loops or other timing-based mechanisms on a frame-by-frame basis.
With iOS 9 this seemingly automated capping has become a terribly unwanted "feature" of Sprite Kit, SceneKit, Metal and OpenGL ES based applications.
In the case of SceneKit this is most telling because the capping occurs regardless of rendering choice - Metal or OpenGL - and seemingly on all devices, including things like the new 6S phones and the iPad Air 2, even with very simple, default template projects.
"Renderer" is a line item in the detailed statistics of SceneKit, onscreen, on the device. It's the most telling indication of a capping process. It's not there when a game runs stably at 60fps.
When capped at 40fps, no matter how much time other onscreen activity and game logic require, this component will absorb all remaining time in the game loop needed to maintain a solid 40fps cap. It varies according to the time required for the other activities, always suggesting that the underlying OS's goal is to hold the frame rate at 40fps.
This problem, in conjunction with the issues regarding iOS 9 Sprite Kit performance, means that it may not currently be possible to resolve all your issues. It will be extremely difficult to ascertain when you're hitting one of these (seemingly) arbitrary fps caps versus having caused an actual problem.
Just as an aside, these caps are not limited to 40fps. I've noticed them at 30fps, 24fps, 20fps, 15fps, 12fps and 8fps.
Of course, Apple has never conceded or admitted to this capping mechanism within the OS, nor commented on when/how/why it so heavily impacts game and rendering processes.
My theory, as expressed in this post ( Inconsistent SceneKit framerate ), is that it's part of iOS designed to facilitate the use of variable frame rate technology soon to come in the iPad Pro, and possibly in other devices.
It would make sense for 120Hz to become a base rate for future devices, particularly given the focus on the performance advantages of iOS, the new Apple TV, the 240Hz sampling of screen touches/pen input in the iPad Pro... and the considerable number of 120Hz televisions on the market.
Even without variable frame rate technology (say, on your TV), a 120Hz display rate means 24fps movies can be played back with a stable 5:5:5 frame display pattern -- this massively increases the joy/immersion when watching films, just about all of which are shot at, and truly exploit the advantages of, a true 24fps for blur and motion effects.
120Hz with either variable frame rate technology or 5:5:5 frame display will also save Apple enormous effort in terms of compression and decompression of films when compared to the pulldown methods currently used on all devices with a maximum frame rate of 60fps.
All speculation, but I'd guess the use of these frame rate caps in game engine technologies is there to help games use less power, too, and (in the future) to give developers an option to frame-rate-lock their games in a variable-frame-rate device world. It's very unfortunate that (if this is the case) they have done such a poor job of sorting out the capping issues in the OS and the nature of Sprite Kit, leading to a scenario where you're fighting blindly to get good, high, consistent frame rates.
Apple's silence and seemingly uncaring attitude towards the problems these two sets of issues are causing are (quite possibly) a very strong indication of how they feel about "their game development community".
And that is the single greatest problem when dealing with the kinds of cutting-edge, performance-critical development problems inherent to game making within a closed-source framework from a needlessly secretive and uncommunicative (almost belligerent) organisation.
It seems the majority of iOS 9 issues are addressed in the latest iOS 9.2 beta and Xcode 7.2 beta.
Thank you, Apple, for finally addressing the problem. Too bad I spent two months working on workarounds.
I've been posting on Apple's forums, submitting bugs and communicating with a support technician. It's a shame that at no point did Apple offer any clarity over what issues they were aware of, and what the Sprite Kit team were addressing.
Still, at least it seems the majority of Sprite Kit issues are now resolved, and a career change is no longer necessary :]
In the example you provided, I can see one way to improve the draw count and to validate that the crop node isn't the problem. I don't know how practical this is in your actual game, but it drastically limits your draw calls: render the crop node into a texture once with textureFromNode:, then reuse that texture for every sprite.
#import "GameScene.h"
#interface GameScene ()
#property(nonatomic, strong)SKTexture *spaceshipTexture;
#end
#implementation GameScene
-(void)didMoveToView:(SKView *)view {
/* Setup your scene here */
SKLabelNode *myLabel = [SKLabelNode labelNodeWithFontNamed:#"Chalkduster"];
myLabel.text = #"Hello, World!";
myLabel.fontSize = 45;
myLabel.position = CGPointMake(CGRectGetMidX(self.frame),
CGRectGetMidY(self.frame));
[self addChild:myLabel];
//Create one texture to be used over an over
SKSpriteNode *sprite = [SKSpriteNode spriteNodeWithImageNamed:#"Spaceship"];
sprite.xScale = 0.5;
sprite.yScale = 0.5;
SKCropNode *cropNode = [[SKCropNode alloc] init];
SKSpriteNode *mask = [SKSpriteNode spriteNodeWithColor:[UIColor blackColor] size:CGSizeMake(100, 100)];
mask.position = sprite.position;
[cropNode setMaskNode:mask];
[cropNode addChild:sprite];
[self addChild:cropNode];
//temp add to scene to create the texture than remove
//keep pointer to texture so it can be reused
self.spaceshipTexture = [self.view textureFromNode:cropNode];
[cropNode removeFromParent];
}
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
/* Called when a touch begins */
for (UITouch *touch in touches) {
CGPoint location = [touch locationInNode:self];
//re use texture
SKSpriteNode *sprite = [SKSpriteNode spriteNodeWithTexture:self.spaceshipTexture];
sprite.position = location;
[self addChild:sprite];
}
}
#end
The downside is that the issue remains once you get up to roughly 120+ nodes on screen.
Also note I created a fresh SpriteKit app using the default template and removed the SKAction. Unfortunately, even the default Apple template suffers huge fps drops once you get up to roughly 120+ nodes. So I don't know if there will be an answer, and your best course of action is to file a bug report, because I believe it should (and used to be able to) handle that many nodes on screen without a drop in fps.
Also worth noting: this error comes up in the default project, which could be related...
2015-10-20 10:46:50.640 Test[2218:658384] <CAMetalLayer: 0x15fe987c0>: calling -display has no effect.
Update
Doing a little more research I ran into this...
https://forums.developer.apple.com/thread/14487
Looks like others are filing bugs for this...
Lots of draws... Have you tried this?
skView.ignoresSiblingOrder = YES;
Are you using SKLightNode? If so, try removing all light nodes from your code. I did this and my game is running back at 60fps, like it did in iOS 8.
Source: SKLightNode performance issues
Related
I'm trying to build a weather application for the iPad, but it seems that I need some help with animation. Say I'm animating a radar: the radar source files are 10 GIF/JPEG pictures at 900x700 pixels. I've tried the UIImage animation technique using the tutorial here:
http://www.icodeblog.com/2009/07/24/iphone-programming-tutorial-animating-a-game-sprite/
but it seems that loading 10 images that big is too much for the iPad to handle, and it's crashing due to memory warnings. I'm researching other techniques to animate, but I can't seem to find something that will do this efficiently.
I've looked at others like Core Animation using sprites, and Cocos2D with sprites. Can someone point me in the right direction for the best way to animate these big images? (Keep in mind that these images are dynamic and change often, so the sprites will have to be recreated on a server and fetched by the iPad to do the animation.) Thanks
OpenGL only creates textures with power-of-two dimensions. In the case of your images, that means 1024x1024, which is about 4 MB of memory per image at 32 bits per pixel. Still, that shouldn't be a problem for the iPad.
First, investigate using Xcode's profiling tools to ensure the images aren't being repeatedly loaded into memory at each loop of the animation (likely by way of new objects that aren't sharing cached textures). That could solve your problem from the start.
Second, I recommend using Cocos2D, if only for its easy handling of textures and caching. Toss the images into a CCAnimation, wrap that in a CCAnimate action inside a CCRepeatForever, and run it on your sprite. When you're done, hit CCTextureCache to release unused textures.
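A rough sketch of that approach (selectors vary a little between cocos2d versions, and the file names are placeholders):

// Build the frame list from the pre-rendered radar images (placeholder names).
NSMutableArray *frames = [NSMutableArray array];
for (int i = 0; i < 10; i++) {
    NSString *name = [NSString stringWithFormat:@"radar_%d.png", i];
    CCTexture2D *texture = [[CCTextureCache sharedTextureCache] addImage:name];
    CGRect rect = CGRectMake(0, 0, texture.contentSize.width, texture.contentSize.height);
    [frames addObject:[CCSpriteFrame frameWithTexture:texture rect:rect]];
}

// Animate a single sprite through the frames, forever.
CCSprite *radar = [CCSprite spriteWithSpriteFrame:[frames objectAtIndex:0]];
CCAnimation *animation = [CCAnimation animationWithFrames:frames delay:0.5f];
[radar runAction:[CCRepeatForever actionWithAction:[CCAnimate actionWithAnimation:animation]]];

// When a new set of radar frames arrives, drop the textures that are no longer used.
[[CCTextureCache sharedTextureCache] removeUnusedTextures];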
Third, lower your animation frame rate to 30 or less (if only for this animation). It may be the iPad, but you're making a weather app, not a video game.
Finally, reduce the size of your images. Justify it all you want, but a large radar animation will not sell your app. And even if a website already plays that animation beautifully, remember that a desktop has vastly more memory and power than any smartphone.
Try breaking the animation image into smaller parts and animating those instead, treating each component as a sprite. It would be best to rely primarily on code (Core Graphics) and draw your radar "by hand" instead of just using images as if they were animated GIFs.
I have a game with a number of animated "monsters". The animation is made with ~20 PNG images for each monster, so I use a UIImageView with animationImages: set.
The problem is that sometimes there can be a lot of monsters on the screen (up to 110 in total and up to 10 different kinds). So when all of them are on screen at the same time, I see animation problems (very low fps).
Please, can you give me some advice - how can I solve this problem?
You can use Core Animation as described in this tutorial. It explains pretty well all the techniques you can use to improve performance from where you are now: first, it doesn't use UIViews and the standard animationImages; second, it makes use of sprite sheets (also called texture atlases), which will not only increase performance but also make your life a lot easier when it comes to managing the image resources.
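A minimal sketch of the layer-plus-sprite-sheet idea, assuming each monster's frames are packed into one horizontal strip of 20 equally sized frames (the image name and frame size are placeholders):

// One CALayer per monster, backed by a sprite-sheet strip.
UIImage *sheet = [UIImage imageNamed:@"monster_sheet.png"]; // placeholder name
CALayer *monsterLayer = [CALayer layer];
monsterLayer.frame = CGRectMake(0, 0, 64, 64);
monsterLayer.contents = (id)sheet.CGImage;
monsterLayer.contentsRect = CGRectMake(0, 0, 1.0 / 20.0, 1.0); // show frame 0
[self.view.layer addSublayer:monsterLayer];

// Step through the frames by animating contentsRect in discrete jumps.
NSMutableArray *rects = [NSMutableArray array];
for (int i = 0; i < 20; i++) {
    [rects addObject:[NSValue valueWithCGRect:CGRectMake(i / 20.0, 0, 1.0 / 20.0, 1.0)]];
}
CAKeyframeAnimation *anim = [CAKeyframeAnimation animationWithKeyPath:@"contentsRect"];
anim.values = rects;
anim.calculationMode = kCAAnimationDiscrete;
anim.duration = 1.0;
anim.repeatCount = HUGE_VALF;
[monsterLayer addAnimation:anim forKey:@"frames"];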
Also, you can use CADisplayLink to create a game loop in which you make all the updates. There are several questions/answers here on SO that describe just that.
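The skeleton of such a loop is small; the update: method here is just an assumed place for your own movement logic:

// In viewDidLoad (or wherever the game starts):
CADisplayLink *displayLink = [CADisplayLink displayLinkWithTarget:self
                                                         selector:@selector(update:)];
[displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];

// Called once per screen refresh (normally 60 times per second):
- (void)update:(CADisplayLink *)link {
    // advance monster positions / frame indices here
}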
I'm working on a simple program that has 500 "particles" that have an x and a y coordinate. They move around the screen and respond to touches. As I go past 500 particles the app starts running much slower. Using CPU sampler I discovered that drawing the particles is taking up the most CPU time.
This is the drawing code:
CGContextSetFillColorWithColor(context, [UIColor colorWithRed:red/255 green:green/255 blue:blue/255 alpha:1].CGColor);
CGRect rectangle = CGRectMake(xpos,ypos,9,9);
CGContextAddEllipseInRect(context, rectangle);
CGContextFillPath(context);
red, green, and blue are floats used to change the color of the particles based on their speed, but this isn't the problem.
This is how I was taught to use Quartz and it works just fine for most drawing, but this code is executed 500+ times and the game starts slowing down. I've run the program with CPU sampler with the drawing code commented out and there is hardly any CPU usage despite all the math going on in the background.
Is there a more efficient way to draw circles in iOS?
You can try two different approaches to help speed up performance...
Use a prerendered UIImage/CGImage instead of path drawing (this won't give you the ability to change colors/sizes dynamically, but maybe you only need a limited range for your app); see the sketch after this list.
Use OpenGL ES with GL_POINTS.
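A rough sketch of the first idea: render one small circle image up front, then stamp the cached image for each particle (the size and color are just illustrative):

// Render the 9x9 circle into an image once (e.g. in init or viewDidLoad).
UIGraphicsBeginImageContextWithOptions(CGSizeMake(9, 9), NO, 0);
CGContextRef ctx = UIGraphicsGetCurrentContext();
CGContextSetFillColorWithColor(ctx, [UIColor whiteColor].CGColor);
CGContextFillEllipseInRect(ctx, CGRectMake(0, 0, 9, 9));
UIImage *particleImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// In drawRect:, stamp the cached image instead of filling a new path per particle.
[particleImage drawAtPoint:CGPointMake(xpos, ypos)];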
Quartz is generally slower than OpenGL, especially for path-based drawing, from all the research I've done on the iPhone. Refer to the iPhone dev forums and you'll see a general consensus about this.
Making a layer (CALayer) for each particle might actually make sense. In general, doing drawing "yourself" in -drawRect: is the path to slowness on iOS. Avoid it if at all possible.
I have a game that's moving fewer than 10 small animated UIImageViews at once, maximum. I'm driving their animation with a CADisplayLink timer running at 60fps. Here is an example of how I move the views in my update method:
// for each insect in insectArray
insectView.center = insect.hitCenter // I pull a position from my model object
The graphics are 32 x 32 pixels with up to 5 animation frames each, if that helps. They have an alpha channel for transparency. I've profiled and eliminated any in-game calculations as a bottleneck. I've also set the views' opaque property to YES, for a very small speedup. Having the animation frames playing or not makes no difference.
The frame rates are mostly great, except on older devices like the iPhone 1G and 3G. There I get intermittent stuttering.
Before switching to OpenGL, is there any way to get a bit more performance?
I experienced the same kind of bottleneck with Core Animation; it is very limited in terms of the scene complexity it can display with decent performance. From what I have read and discussed with others, there is no silver bullet for you (or me) there, sorry!
My usage was actually quite close to yours (no animated frames, though), and using OpenGL ES made it go from painfully sluggish to perfectly snappy, so there's hope for you!
Core Animation isn't designed for frame-by-frame animation: you give it a few keyframes and times, and it does the rest for you. Why not switch to OpenGL? You can't support the old devices forever...
I agree with FX that there is no silver bullet, but if you provided a little more code, we could make some specific suggestions. Here are a few general ones, with a small sketch after the list:
Don't round corners using -setCornerRadius: on the UIImageView's layer. You'd never believe how much this can degrade performance.
If you're using drop shadows behind your views, make sure you also specify a shadow path on the layer.
Try turning shouldRasterize on for the UIImageView's layer: [[insectView layer] setShouldRasterize:YES];
Hate to say it, but after this, as others have said, OpenGL is the only other choice.
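A small sketch of the shadow-path and rasterization suggestions combined (the shadow values are only illustrative):

CALayer *layer = insectView.layer;

// If the view draws a drop shadow, give the layer an explicit shadow path so
// Core Animation doesn't have to derive it from the layer's contents every frame.
layer.shadowOpacity = 0.5f;
layer.shadowOffset = CGSizeMake(0, 2);
layer.shadowPath = [UIBezierPath bezierPathWithRect:insectView.bounds].CGPath;

// Rasterize the composed layer once and reuse the bitmap while it moves.
layer.shouldRasterize = YES;
layer.rasterizationScale = [UIScreen mainScreen].scale;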
I need some information about using cocos2d on the iPad.
Can we use 2048x2048 sprite sheets? I read in this forum that we can, but with a limitation of no more than 3 or 4 sprite sheets.
But I have 10 animations in my game, and a maximum of 4 animations run at a time.
Can we use CCDirector in the AppDelegate in the same way as we do on iPhone?
if( ! [CCDirector setDirectorType:CCDirectorTypeDisplayLink] )
    [CCDirector setDirectorType:CCDirectorTypeDefault];
[[CCDirector sharedDirector] setPixelFormat:kPixelFormatRGBA8888];
[CCTexture2D setDefaultAlphaPixelFormat:kTexture2DPixelFormat_RGBA8888];
What is the maximum image size that we can use?
If there are any limitations regarding cocos2d and the iPad, please post them.
Thank you.
The iPad resolution is 1024x768. It doesn't make sense to use higher resolution images in your game unless you intend to zoom in to see a LOT more detail. Even so, you need to evaluate whether you really want to do this. If you do, be sure to turn on mipmapping for your textures in Cocos2D.
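Enabling mipmaps for a texture in cocos2d looks roughly like this (method names may differ slightly between cocos2d versions; the file name is a placeholder):

CCSprite *big = [CCSprite spriteWithFile:@"large_background.png"]; // placeholder file name
CCTexture2D *texture = big.texture;

// Generate the mipmap chain (the texture needs power-of-two dimensions)
// and pick a minification filter that actually uses it.
[texture generateMipmap];
ccTexParams params = { GL_LINEAR_MIPMAP_LINEAR, GL_LINEAR, GL_CLAMP_TO_EDGE, GL_CLAMP_TO_EDGE };
[texture setTexParameters:&params];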
If you use higher resolution images, the iOS device's PowerVR chip (the video processor) is going to pay for it and your game will perform much slower. If you can help it, I would tile much smaller images (say 256x256 or lower). In the end it depends on how fast your game runs on your target device.