I am trying to program an iPhone app where the user can handwrite. I am using touchesMoved: to track the user's finger, but I only get between 30 and 50 updates a second. For quick finger movements this is often not enough to draw smooth letters: small letters like "i" often end up with just the touchesBegan and touchesEnded points, and nothing in between.
I was wondering if I could get better time (and space) resolution by using an NSTimer to query the touch object for its locationInView: instead of waiting for touchesMoved: to be called. Does anyone know whether this might work?
Thanks.
I'd advise against NSTimer; it's better to use the CADisplayLink approach. For example, try my tiny lib https://github.com/nt9/NTLoop
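For illustration, here is a minimal sketch of the display-link idea, independent of NTLoop: keep a reference to the active UITouch and sample its location once per screen refresh. The class and property names are hypothetical, and note that locationInView: only changes when the system actually delivers a new touch sample; the win over NSTimer is that sampling and drawing stay locked to the display refresh.

```swift
import UIKit

// A minimal sketch of the CADisplayLink approach (names are hypothetical, not from NTLoop):
// keep a reference to the active touch and sample it once per screen refresh.
class HandwritingView: UIView {
    private var displayLink: CADisplayLink?
    private var activeTouch: UITouch?
    private var points: [CGPoint] = []

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        activeTouch = touches.first
        displayLink = CADisplayLink(target: self, selector: #selector(sample))
        displayLink?.add(to: .main, forMode: .common)
    }

    @objc private func sample() {
        // The location only changes when the system delivers a new touch sample,
        // but reading it here keeps the sampling and drawing tied to the display
        // refresh instead of NSTimer's scheduling jitter.
        guard let touch = activeTouch else { return }
        points.append(touch.location(in: self))
        setNeedsDisplay()
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        displayLink?.invalidate()
        displayLink = nil
        activeTouch = nil
    }
}
```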
I'm working on an iOS movie editor project. In this editor, I use MPMoviePlayerController to show the video file selected by the user.
I use custom controls, and I have a UISlider that lets the user change the player's currentTime position. While the user is touching the slider, the movie is paused and its currentTime changes along with the UISlider's value.
Everything works perfectly, but now I need to let the user hear the sound at this currentTime position.
For those who know iMovie: when you move your mouse over a movie event, you see the image and hear the sound at that position, and that's what I'd like in my editor.
I've tried calling the player's play method and stopping it with an NSTimer after 0.2 seconds, but the result is rather messy.
Has anyone managed to do something like this?
Thanks!
Best regards.
Seeking takes time; that's why you've ended up using a timer. The real problem here is that MPMoviePlayerController, while convenient because it gives you controls, is a blunt instrument; it's just a massively simplified convenience layer built on top of AVFoundation. But you don't need the built-in controls, so I would suggest throwing away your entire implementation and getting down to the real stuff, using AVFoundation instead (AVPlayer etc.). Now you have a coherent way to seek and get a notification when the seek has completed (seekToTime:completionHandler:), so you'll be able to start playing as soon as possible. Plus, AVFoundation is the level where you'll be doing all your "editing" anyway.
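As a rough sketch of that approach (the function name is illustrative, and the 0.2-second preview window just mirrors what the question tried with MPMoviePlayer): seek with AVPlayer and only start playing once the completion handler reports the seek finished.

```swift
import AVFoundation

// Seek precisely, then play for a short preview window once the seek has completed.
func previewAudio(at seconds: Double, in player: AVPlayer) {
    let time = CMTime(seconds: seconds, preferredTimescale: 600)
    player.seek(to: time, toleranceBefore: .zero, toleranceAfter: .zero) { finished in
        guard finished else { return }
        player.play()
        DispatchQueue.main.asyncAfter(deadline: .now() + 0.2) {
            player.pause()
        }
    }
}
```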
I have a question about moving an object in an app (game) I am creating. I tried moving a UIImageView with a timer, so that every 0.01 seconds the object moves by 1 px, but the movement is not smooth. I also tried animations, but there is a problem: if I close the app and launch it again (while it was still open in the background), the view's image is stuck at the end position of the animation. I also want to check every 0.01 seconds whether the moving object and my character have collided, so an animation is not the best option. Is there a way to move my object smoothly using the phone's local time, or is there some other way to move the object?
It sounds like you're not using SpriteKit or any game engine for that matter. Any game engine will come with an update loop that does all of this.
You can learn about SpriteKit here... http://www.raywenderlich.com/42699/spritekit-tutorial-for-beginners
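As a rough illustration of the update loop such an engine provides (node names, colors, sizes and the per-frame step below are made up), a SpriteKit scene can move the object a little every frame and test for a collision in the same place, with no NSTimer at all:

```swift
import SpriteKit

// The built-in update loop runs once per frame: move the object, then check for overlap.
class GameScene: SKScene {
    let movingNode = SKSpriteNode(color: .red, size: CGSize(width: 32, height: 32))
    let character = SKSpriteNode(color: .blue, size: CGSize(width: 32, height: 32))

    override func didMove(to view: SKView) {
        movingNode.position = CGPoint(x: 0, y: size.height / 2)
        character.position = CGPoint(x: size.width / 2, y: size.height / 2)
        addChild(movingNode)
        addChild(character)
    }

    override func update(_ currentTime: TimeInterval) {
        movingNode.position.x += 2                         // move every frame
        if movingNode.frame.intersects(character.frame) {
            movingNode.removeFromParent()                  // handle the collision here
        }
    }
}
```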
I am using CADisplayLink to draw frames using the EAGLView method in a game at 60 times per second.
When I call UIView animateWithDuration: the framerate drops down to exactly half, from 60 to 30 fps for the duration of the animation. Once the animation is over, the fps rises instantly back up to 60.
I also tried using NSTimer animation method instead of CADisplayLink and still get the same result.
The same behavior happens when I press the volume buttons while the speaker icon is fading out, so that fade is probably also implemented with animateWithDuration:. Since I'd like to handle the speaker icon smoothly in my app, I can't just rewrite my own animation code to use something other than animateWithDuration:; I need a solution that works with it.
I am aware that there is an option to slow down animations for debugging in the simulator; however, I am seeing this on the device and no such option is enabled. I also tried various options for animateWithDuration:, such as the linear curve and allowing user interaction, but none of them made a difference.
I am also aware I can design an engine that can still work with a frame rate that varies widely. However, this is not an ideal solution to this problem, as high fps is desirable for games.
Has someone seen this problem or solved it before?
The solution to this is to do your own animation and blit during the CADisplayLink callback.
1) For the volume issue, put a small volume icon in the corner, or show it when the user takes some predefined touch action, and give them touch controls. With that input you can use AVAudioPlayer to vary the volume and avoid the system control altogether. You might even be able to detect that the user has pressed the volume buttons and pop up a note telling them to use your control instead. This gets you away from any animations run by the system.
2) When you have an animation you want to run, create a series of images in code (either on the fly or beforehand), and every so many display-link callbacks blit the next image to the screen.
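Here is one way point 2 could look in practice; assume your CADisplayLink already drives the GL drawing, and call something like the hypothetical tick() below from that same callback. All names here are illustrative.

```swift
import UIKit

// Pre-render the animation as a series of images, then swap one in every few
// display-link ticks instead of letting UIKit run a Core Animation transaction
// alongside the OpenGL loop.
class SpeakerOverlay {
    private let frames: [UIImage]          // pre-rendered fade-out frames
    private let imageLayer: CALayer
    private var frameIndex = 0
    private var tickCount = 0

    init(frames: [UIImage], layer: CALayer) {
        self.frames = frames
        self.imageLayer = layer
    }

    // Call this from the existing CADisplayLink callback, right after drawing the GL frame.
    func tick() {
        tickCount += 1
        guard tickCount % 4 == 0, frameIndex < frames.count else { return }
        CATransaction.begin()
        CATransaction.setDisableActions(true)  // plain blit, no implicit animation
        imageLayer.contents = frames[frameIndex].cgImage
        CATransaction.commit()
        frameIndex += 1
    }
}
```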
Here's an old thread that describes similar drops in frame rate. In that case, the cause of the problem was adding two or more semi-transparent sprites, but I'd guess that any time you try to composite several layers together you may be doing enough work to cut the frame rate, and animateWithDuration very likely does exactly that kind of thing.
Use either OpenGL or Core Animation; they are not compatible.
To test this, remove any UIView animation and the frame rate will be what you expect; add the UIView animation back and it drops to 30 fps.
You said:
When I call UIView animateWithDuration: the framerate drops down to exactly half, from 60 to 30 fps for the duration of the animation. Once the animation is over, the fps rises instantly back up to 60
I don't know why you're not accepting my answer; this is exactly what happens when you combine UIView animation with CA animation that isn't using a UIView.
I have a UIView that a user drags around via setting its center in the touchesMoved: method. When the user lets go, I want the UIView to fall off the screen according to how fast and what direction they were moving it in.
Do I need to somehow create a vector by comparing the UIView's last center point to its new center point, and then subtract a fixed amount from the vector's y value every so often with an NSTimer?
Yes, you will need to do a decent amount of physics, calculating the:
Speed of swipe
Direction of swipe
You will need to use the touchesMoved: method, along with a timer or the touch timestamps, to track how long the swipe took and the coordinates of the object's new location. This should be fairly straightforward. Once you have those, you can simply run a UIView animation to move the object to its new place, as sketched below.
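A rough sketch of that idea (the class name and the 0.5-second fling duration are illustrative choices): remember the previous touch point and timestamp in touchesMoved:, derive a velocity from them, then let a UIView animation carry the view along that vector when the touch ends.

```swift
import UIKit

// Drag the view in touchesMoved: (as in the question) while measuring velocity,
// then fling it along that velocity in touchesEnded:.
class DraggableView: UIView {
    private var lastPoint: CGPoint = .zero
    private var lastTimestamp: TimeInterval = 0
    private var velocity: CGPoint = .zero

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first, let superview = superview else { return }
        let point = touch.location(in: superview)
        let dt = touch.timestamp - lastTimestamp
        if dt > 0 {
            velocity = CGPoint(x: (point.x - lastPoint.x) / CGFloat(dt),
                               y: (point.y - lastPoint.y) / CGFloat(dt))
        }
        lastPoint = point
        lastTimestamp = touch.timestamp
        center = point                       // drag, as in the question
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Continue along the measured velocity so the view flies off in the
        // direction it was moving when the finger lifted.
        let duration: TimeInterval = 0.5
        let target = CGPoint(x: center.x + velocity.x * CGFloat(duration),
                             y: center.y + velocity.y * CGFloat(duration))
        UIView.animate(withDuration: duration, delay: 0, options: .curveEaseOut,
                       animations: { self.center = target }, completion: nil)
    }
}
```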
~ A suggestion:
I would suggest that you have a look at Cocos2D and integrate the library with your app. You will not need to implement the touch delegate methods and compute things yourself ~ there are libraries for that :) It has a lot of helpers, especially for moving objects (or sprites, if you wish to call them that), and there are animation actions like easeInEaseOut, etc., that can be applied to the moving object. If you are developing a game of some sort, have a look at the Chipmunk physics engine in Cocos2D as well.
Hey all, I'm developing a rhythm game for the iPhone at the moment, and I wondered if anyone had any thoughts on the best UI components to use for fast reaction times.
I have all the coding worked out, and I've narrowed it down to two approaches:
1: Using instances of UIButton that bypass the UIControlEvent handling in order to use touchesBegan and touchesEnded directly. I've found this to be a bit faster in the past.
2: Using UIViews with custom methods to change the state of the buttons. These would also use touchesBegan.
The rhythm pads (eight of them) need to play a sound with minimal lag and provide some sort of feedback, e.g. changing the image of the button.
My question: is it better to use UIViews and make my own buttons, or actual UIButtons that have been subclassed to use touchesBegan etc.?
For anyone who finds this while searching: using touchesBegan, touchesEnded, and touchesMoved directly is WAY faster. For a rhythm game or any application requiring fast input, this is the way to go.
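For reference, a minimal sketch of such a touch-driven pad (the image and sound properties are placeholders, and the AVAudioPlayer is assumed to be created and prepared ahead of time):

```swift
import UIKit
import AVFoundation

// A custom view that reacts in touchesBegan/touchesEnded rather than waiting
// for UIButton's control events.
class RhythmPad: UIView {
    var pressedImage: UIImage?
    var normalImage: UIImage?
    var player: AVAudioPlayer?               // pre-loaded with this pad's sound
    private let imageView = UIImageView()

    override init(frame: CGRect) {
        super.init(frame: frame)
        imageView.frame = bounds
        addSubview(imageView)
        isMultipleTouchEnabled = false
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        imageView.image = pressedImage       // immediate visual feedback
        player?.currentTime = 0
        player?.play()                       // no UIControlEvent round trip
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        imageView.image = normalImage
    }
}
```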