Objective-C, Core Plot Real Time Graph vs CPU

I've been working on implementing a real-time Core Plot graph in my application on OS X. To my dismay I noticed a fairly significant issue: once the line reaches the end of the X-axis and the graph starts scrolling to keep up with it, the CPU load hits 30-35% non-stop.
Before proceeding any further, I figured I had better go back and see if I had made some kind of mistake in my code that would make the CPU spike like that. I didn't notice anything out of the ordinary, and adjusting the frame rate and update frequency didn't help. I then went back to the real-time example project included with Core Plot, and it has the same effect on the CPU.
Is there anything I can do about this, or is that just the nature of real-time graphing on OS X?
Everything is fine for the first 50 frames (indicated by the line with arrows in the screenshot), but once the plot reaches the end of the axis, things take a turn for the worse.
Side Note:
I noticed Swift can graph values in a playground, and even though it's apparently not real time (and I'm using Objective-C), it looks really sharp. Is that graphing feature only available within playgrounds, or is there a way to use it in a project? I only mention this because I'm looking for something efficient.

That's the expected behavior with Core Plot. Once the graph starts to scroll, it has to redraw the plot, both axes, and all of the grid lines for each animation frame. You could reduce the drawing load by decreasing the number of grid lines and/or axis tick marks.
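A minimal sketch of that idea, assuming the NSDecimal-based Core Plot 1.x API (the property names come from that API, and `graph` is assumed to be your CPTXYGraph):

    CPTXYAxisSet *axisSet = (CPTXYAxisSet *)graph.axisSet;

    CPTXYAxis *x = axisSet.xAxis;
    x.majorIntervalLength   = CPTDecimalFromDouble(10.0); // fewer major ticks and labels to draw
    x.minorTicksPerInterval = 0;                          // drop minor ticks
    x.minorGridLineStyle    = nil;                        // drop minor grid lines

    CPTXYAxis *y = axisSet.yAxis;
    y.minorTicksPerInterval = 0;
    y.majorGridLineStyle    = nil;                        // drop the horizontal grid lines entirely

Every line style you set to nil is one less set of paths Core Plot has to stroke on each animation frame.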
As for the side note: the playground graphs are a private part of the playground environment.

Related

Unreal Engine 5.1 Retargeted Animation From Manny to Mixamo Model Causes Single Animation Sequence To Rotate 90°

I apologize if this has been asked before; I tried searching for this first and nothing came up. I'm pretty new to Unreal Engine 5.1, so this might be something I'm doing wrong as well.
I've been exploring animation retargeting in Unreal and have tried following the steps we learned in class, using one of the models from mixamo.com. Everything appears to work fine at the start, and I can get the IK and IKR objects working just fine. However, when I try to export the animations, either from the IKR object or by right-clicking the ABP object for the source mesh, exactly one of the animation sequences rotates 90°. It is always the same animation (the Land animation sequence), and I'm not sure how to go about fixing it.
I also tried looking on Google and turned up nothing.
I'm hoping this is some stupid newb mistake that is easy to fix, or maybe there's something I'm overlooking. Any help is greatly appreciated; I will continue trying to fix the problem myself as well and will post if I fix it.
I tried retargeting using the following steps:
1. Create an IK_Object for the model you wish to project your animations onto, with chains for each. Mine looked like the following: [image: IK_Remy]
2. Repeat step 1 for the model you wish to source animations from. Mine looked like the following: [image: IK_Manny]
3. Create an IKR_Object linking the two together. Here's what mine looked like: [image: IKR_Remy]
4. Find the ABP for your source model, right-click, and select "Retarget Animation Assets->Duplicate and Retarget Animation Assets". Here's what I'm selecting for that: [image: Retarget Dialog]
When I do the above, most of the animation sequences for "Manny" export just fine. However, the "Land" animation flips for some reason (see image below).
[image: Exported Animation Images]
Even stranger, when I preview the MM_Land animation in my IKR object, it looks fine, i.e. not rotated. However, if I export the animation from the IKR object, the same thing happens: it rotates 90°. I would expect this to be a case of WYSIWYG, where if it works in the preview it would export correctly, but that apparently is not the case.
I also tried modifying the animation sequence manually, but it won't let me: if I rotate the model in the animation sequence and save it, the rotation is reapplied once I close the sequence, and the changes do not persist.
I can export the sequence as a new sequence, modify it, save it, and then rename it as my exported "Land" animation to force the issue, and it at least looks normal. However, when I actually play the game and jump, the land animation still flips sideways when it plays, and it also causes the character to scale and warp for a second, which makes me think something is going on here that I don't know enough to fix. I'm really hoping someone with more experience in Unreal Engine can help.
EDIT: Fixed Image Descriptions
I can confirm that this is an issue - I'm seeing the same behaviour. I haven't managed to fix it yet, but my suspicion is that it's due to scaling - in my instance, I have had to scale up my custom character by around 2.5 times to replicate the scale of the default mannequin. Did you scale your custom character at all?

CGPointMake origin for (0,0) in cocos2d

I'm looking over some sample code in a cocos2d project. I had previously built a project using Core Graphics (Quartz), where coordinate (0,0) is the upper-left corner of the screen. In this project, if I use CGPointMake(0,0), it is in the lower-left corner. I understand that the coordinate systems are different, but where exactly would a program specify which coordinate system to use? What is the setting or method that actually makes this switch?
There is no switch. If you want to work with Cocos2D, get used to its coordinate system origin being at the lower left corner of the screen.
I've seen users make all kinds of attempts to "fix" this, either by hacking around in the Cocos2D source code or by overriding the setPosition property of all nodes, only to find out that this isn't enough. I bet all of them have run into lots of issues, including the fact that whenever you need to reuse someone else's code, you're faced with making the necessary coordinate system fixes to that code as well. It's a never-ending struggle that is really not worth spending any amount of time on.
Instead of changing the code, change your perception. Get used to the different coordinate system and to thinking in it. It's way easier and much less trouble in the long run. After all, the only thing you really need to flip in your head is the direction of the Y axis.
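If you ever do need to translate a point from Quartz-style (top-left origin) coordinates into cocos2d's space, a one-line flip of Y is enough. A hypothetical helper (not part of either API; it assumes cocos2d's CCDirector for the screen size):

    CGPoint ccPointFromQuartzPoint(CGPoint quartzPoint) {
        CGSize winSize = [[CCDirector sharedDirector] winSize];
        // X is identical in both systems; only the Y axis is flipped.
        return CGPointMake(quartzPoint.x, winSize.height - quartzPoint.y);
    }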

OpenGL updating uniforms on Mac OS X

I'm programming a nice little game that uses shader-generated simplex noise to display random terrain computed on the fly.
I'm using Objective-C and Xcode 4, and I have gotten everything to run nicely using a subclass of NSOpenGLView. The subclass first compiles the shader and then renders a quad with the noise texture. The program has no problems running this at an acceptable speed (60 Hz).
The subclass of NSOpenGLView uses an NSRunLoop to fire a selector which in turn calls drawRect:(NSRect)dirtyRect. This is done every frame.
Now, I want the shader to use a uniform that is updated each frame.
The shader should be able to react to a variable that may change every frame, so I'm trying to update the uniform at that frequency. The uniform update is done in the drawRect:(NSRect)dirtyRect method.
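For reference, the run-loop setup looks roughly like this (a sketch; the interval and selector names are placeholders, not my exact code):

    - (void)startRenderLoop {
        NSTimer *timer = [NSTimer timerWithTimeInterval:1.0 / 60.0
                                                 target:self
                                               selector:@selector(frameTimerFired:)
                                               userInfo:nil
                                                repeats:YES];
        // NSRunLoopCommonModes keeps the timer firing even while the run
        // loop is in an event-tracking mode (mouse/trackpad input).
        [[NSRunLoop currentRunLoop] addTimer:timer forMode:NSRunLoopCommonModes];
    }

    - (void)frameTimerFired:(NSTimer *)timer {
        [self setNeedsDisplay:YES];   // causes drawRect:(NSRect)dirtyRect to run
    }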
I am partially successful. The screen updates exactly as I'd like for the first 30 frames, then it stops updating the uniform, even though I have a call to glUniform1f() and an NSLog right next to each other and the NSLog always fires!
The strange part is that if I hold space pressed (or any other key, for that matter), the uniform is updated as it should be.
Clearly I am missing something here with regard to how OS X, OpenGL, or something else handles uniforms.
An explanation of what might be ailing me would be appreciated, but a pointer to where I can find information about this will suffice.
Update: After fiddling with glGetError() and glGetUniform*(), I've noticed that the program works as intended when left alone. However, when I use the trackpad for input, the uniform is reset to 0.000, while the rest of the program shows no errors.
First, have you tried calling glGetError() right after glUniform1f() to see what comes out?
I have very limited knowledge of Mac OS programming (I did some iOS programming two years ago and have forgotten most of it since), so the following is a wild guess.
Are you sure drawRect:(NSRect)dirtyRect is called by the same thread as the thread that owns the OpenGL context? As far as I know, OpenGL is not thread safe, so it could be that your glUniform1f() attempts are called from a different thread and are thus unable to do anything.
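A minimal sketch of both checks combined, with assumed ivar names (shaderProgram, timeUniform, and elapsedTime are placeholders, not from the question): make the view's context current on the calling thread, bind the program before setting the uniform, and query for errors immediately afterwards.

    - (void)drawRect:(NSRect)dirtyRect {
        [[self openGLContext] makeCurrentContext]; // guard against wrong-thread/context use

        glUseProgram(shaderProgram);               // glUniform* affects the currently bound program
        glUniform1f(timeUniform, elapsedTime);

        GLenum err = glGetError();
        if (err != GL_NO_ERROR) {
            NSLog(@"glUniform1f failed: 0x%04X", err);
        }

        // ... draw the quad ...
        [[self openGLContext] flushBuffer];        // or glFlush() for a single-buffered context
    }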

OpenGL - animation stuttering when in full screen

I'm currently running into a problem regarding animation in OpenGL. I have between 200 and 10000 gears on the screen at a time, all rotating. When the window is not maximized, my CPU runs at about 10-20% consistently. No spikes, no stuttering in the animation; it runs perfectly smooth regardless of the number of gears on screen. When I maximize the window, though, everything falls apart. My CPU maxes out, I begin getting weird spikes in CPU usage, the animation begins stuttering as a result, and it just looks really ugly, even when I have only 200 gears on screen.
My animation technique looks like this:
    While Animating
        Calculate the current rotation angle based on a running timer
        Draw the image
        Call glFlush()
    End While
If it helps, I'm using the Tao framework in VB.NET. I'm not performing any calculations other than the rotation-angle one mentioned above, plus a few glRotated and glScaled calls in the method that draws the image.
In addition, I was under the impression that in an orthographic two-dimensional drawing that scales when the window is resized, drawing would take the same amount of time regardless of window size. Is this a correct assumption?
Any help is greatly appreciated =)
Edit
Note that I've seen the animation run perfectly smooth at full screen before. Every once in a while, OpenGL will decide it's happy and run perfectly at full screen, using between 10-20% of the CPU (the same as when not maximized). I haven't pinpointed what causes this, though, because it will run perfectly one time, and then, without my changing anything, I will run it again and encounter the choppiness. I simply want to pinpoint what causes the animation to slow down and eliminate it.
I've run dotTrace on my program, and it says that the SwapBuffers method is using 55% of my processing time, even though I never explicitly call it. Is the method called by something else that I can eliminate, or is this simply OpenGL's "dead time" way of limiting the animation to 60 fps?
I was under the impression that in an orthographic two-dimensional drawing that scales when the window is resized, drawing would take the same amount of time regardless of window size. Is this a correct assumption?
If only :)
More pixels require more memory bandwidth/shader units/etc. Look into fillrate.

Jerky/juddery (core-)animation in a screensaver?

I've built a screensaver for Leopard which utilises Core Animation. It doesn't do anything overly complicated; it uses a tree of CALayers and CATextLayers to produce a "table" of data in the following structure:
- root
  - maincontainer
    - subcontainer
      - row [multiple]
        - cell [multiple]
          - text layer
At most there are 50 CALayers rendered on the screen at any one time.
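In code, the tree is built roughly like this (a sketch with placeholder names and loop bounds, not my exact source):

    CALayer *root          = [CALayer layer];
    CALayer *mainContainer = [CALayer layer];
    CALayer *subContainer  = [CALayer layer];
    [root addSublayer:mainContainer];
    [mainContainer addSublayer:subContainer];

    for (NSUInteger r = 0; r < rowCount; r++) {
        CALayer *row = [CALayer layer];
        [subContainer addSublayer:row];
        for (NSUInteger c = 0; c < cellCount; c++) {
            CALayer *cell = [CALayer layer];
            CATextLayer *text = [CATextLayer layer];
            text.string = @"…";   // the cell's datum goes here
            [cell addSublayer:text];
            [row addSublayer:cell];
        }
    }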
Once I've built the "table", I animate the "subcontainer" into view using CABasicAnimation. Again, I'm not doing anything fancy - just a simple fade-in.
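The fade-in is along these lines (a sketch; `subcontainer` is the layer from the tree above, and the duration is a placeholder):

    CABasicAnimation *fade = [CABasicAnimation animationWithKeyPath:@"opacity"];
    fade.fromValue = [NSNumber numberWithFloat:0.0f];
    fade.toValue   = [NSNumber numberWithFloat:1.0f];
    fade.duration  = 1.0;
    [subcontainer addAnimation:fade forKey:@"fadeIn"];
    subcontainer.opacity = 1.0f;   // set the model value so the layer stays visible afterwards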
The problem is that while the animation does happen, it's painful to watch. It's jerky on my development machine, a 3.06 GHz iMac with 4 GB of RAM, and seems to chop the animation into about 10 steps rather than showing a gradual change.
It gets worse on the PPC Mac mini the screensaver is targeted at; it refuses to even play the animation, generally "tweening" from the beginning of the animation (0% opacity) to halfway (50%) and then jumping to completion.
I'm relatively new to Objective-C, and my experience is with garbage-collected environments, but I can't believe I'm leaking enough memory by the time the screensaver starts to cause such problems.
Also, I'm quite sure it's not a hardware problem. I've tested the built-in screensavers that use Core Animation, and downloaded a few free CA-based ones for comparison, and they run without issue on both machines.
Information on Google is pretty thin with regard to using CA in screensavers, or CA in general for that matter, and advice/tutorials on profiling/troubleshooting screensavers seem to be non-existent. So any help the community can provide would be very welcome!
--- UPDATE ---
It seems as though implicit animations help smooth things out a little. Still kind of jerky, but not as bad as animating everything with explicit animations as in my original approach.
There isn't much special about a screensaver. I assume you've started with the Core Animation Programming Guide? Running it through Instruments will give you a lot of information about where you're taking too much time.
The code you're using to do the fade-in would be useful. For what you're describing, you don't even need CABasicAnimation; you can just set the animatable properties of the layers, and they animate by default. Make sure you've read up on Implicit Animations. The rest of that page is probably of use as well.
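A minimal sketch of that approach, assuming `subcontainer` is the layer being faded in (the kCATransaction keys are used here because they work on Leopard):

    // Start fully transparent, without triggering an implicit animation.
    [CATransaction begin];
    [CATransaction setValue:[NSNumber numberWithBool:YES]
                     forKey:kCATransactionDisableActions];
    subcontainer.opacity = 0.0f;
    [CATransaction commit];

    // Later: simply set the property; Core Animation supplies the fade implicitly.
    [CATransaction begin];
    [CATransaction setValue:[NSNumber numberWithFloat:1.0f]
                     forKey:kCATransactionAnimationDuration];
    subcontainer.opacity = 1.0f;
    [CATransaction commit];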
Most of your job with Core Animation is getting out of its way. It generally knows what it's doing, and most problems come from second-guessing it or trying to tell it too much.