I am using cocos2d. I would like to be able to detect whether the screen is touched at a particular instant - that is, rather than intercepting an event when it occurs, I want to detect the presence of touch at a particular moment.
The reason is that I am animating sprites and want to determine whether a sprite should keep moving - that is, whether the screen is still touched. I cannot use ccTouchesEnded because each time an animation starts I set isTouchEnabled to false. I do this because I also want the user to be able to tap rapidly on the screen to move the sprite, but if they tapped too rapidly it would interfere with the sprite's position during the animation, which I have found corrupts the positions of my objects in weird ways.
Is this possible?
There does not appear to be any public API to detect touches other than enabling and receiving these events in the main UI run loop.
You can keep handling events, and set the state left by the last touch event in a model object or global variables. Then you can poll your app's own internal state at any time.
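For example, something like this (a minimal sketch; isTouching is a hypothetical BOOL ivar, and the exact handler signatures vary between cocos2d versions):

    // In a CCLayer subclass with isTouchEnabled = YES:
    - (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
        isTouching = YES;   // hypothetical ivar: a touch is currently down
    }

    - (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
        isTouching = NO;
    }

    - (void)ccTouchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
        isTouching = NO;
    }

    // Poll the flag whenever you need it, e.g. in a scheduled update method:
    - (void)update:(ccTime)dt {
        if (isTouching) {
            // keep the sprite moving
        }
    }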
Instead of disabling touches, you can have your touch handler simply ignore events whose timestamp is too close to the animation's start time.
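For instance (again a sketch; animationStartTime and kMinTapInterval are placeholders you would maintain yourself):

    - (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
        UITouch *touch = [touches anyObject];
        // animationStartTime is a hypothetical NSTimeInterval ivar recorded
        // (e.g. from the touch timestamp) when the animation was kicked off.
        if (touch.timestamp - animationStartTime < kMinTapInterval) {
            return;   // tap came in too soon; ignore it
        }
        // ...normal tap handling that starts the next move...
    }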
I'm trying to add a scene/node on top of a paused scene with a transparent background so the active scene view will still be visible.
I pause the scene with scene.view.paused = YES;, but that pauses everything except the update method.
With Cocos2d one could easily push a new scene on top of an existing one, but unfortunately SpriteKit does not have this capability.
Is there a way to pause a scene and add an active scene/node on top of it?
A solution I've tried: creating an additional view controller with the pause-scene content and presenting it via the active view controller when needed. But the background is always black, so transparency is not achieved, and some other problems occur with the original scene (I'll elaborate on these if needed).
It's all there. You do not have to present a new scene or view.
In a nutshell:
Pause the node that your game content lives on. If you do all the processing in the scene class you'll have to refactor it, I'm afraid. The key is having an SKNode that acts as your "game layer" so that you can pause just that particular node (see the sketch after these steps). If the game layer receives update: and other regular method calls, it should check its self.paused state before doing any processing.
When you have that, pause the game layer node, which will pause all the nodes and actions in it.
Now add another "layer" node with whatever UI you need to the scene. That could be your game over or pause layer, and it will be perfectly happy to receive input and run actions while the game layer is paused.
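A minimal sketch of that structure (worldNode, showPauseUI, and pauseLayer are placeholder names, not SpriteKit API):

    #import <SpriteKit/SpriteKit.h>

    @interface MyScene : SKScene
    @property (nonatomic, strong) SKNode *worldNode;   // the "game layer"
    @end

    @implementation MyScene

    - (void)didMoveToView:(SKView *)view {
        self.worldNode = [SKNode node];
        [self addChild:self.worldNode];
        // Add all game sprites to worldNode instead of directly to the scene.
    }

    - (void)showPauseUI {
        self.worldNode.paused = YES;   // freezes only the game layer's nodes and actions

        SKNode *pauseLayer = [SKNode node];
        pauseLayer.zPosition = 100;    // draw on top of the game layer
        // Add labels/buttons to pauseLayer; it can still receive input
        // and run actions while the game layer is paused.
        [self addChild:pauseLayer];
    }

    - (void)update:(NSTimeInterval)currentTime {
        if (self.worldNode.paused) return;   // skip game logic while paused
        // ...game logic...
    }

    @end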
I have an application that has a canvas (NSView) where a user can drag an element around. When the mouse leaves the edge of that view it becomes a drag operation.
What I would like to do is this: when the mouse enters the originating view again, the drag operation is cancelled and the move automatically continues within the canvas.
I can figure out the second part; I just need to figure out how to force a drag and drop to cancel. I need to do this somehow from draggingEntered:, so before the mouse is even released.
Make your canvas view respond to drags as well. Initially all its <NSDraggingDestination> protocol methods would just return “no, do nothing”, but if you start a drag from within the canvas you’d keep track of that, and once the drag leaves and comes back your NSDraggingDestination methods would return, “OK, we accept, the drag is over, don’t bother animating.”
Then you could continue tracking locally. Like, assuming your canvas had called:
- (void)dragImage:(NSImage *)anImage at:(NSPoint)viewLocation offset:(NSSize)initialOffset event:(NSEvent *)event pasteboard:(NSPasteboard *)pboard source:(id)sourceObj slideBack:(BOOL)slideFlag;
That method would then return control to your canvas.
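A rough sketch of the destination side (dragStartedHere and moveDraggedElementTo: are hypothetical; the canvas must also have called registerForDraggedTypes: for these methods to fire):

    // In the canvas NSView subclass, acting as its own NSDraggingDestination:
    - (NSDragOperation)draggingEntered:(id <NSDraggingInfo>)sender {
        if (dragStartedHere) {            // hypothetical BOOL set when the canvas starts a drag
            return NSDragOperationMove;   // accept, so the drag session ends here
        }
        return NSDragOperationNone;       // "no, do nothing" for foreign drags
    }

    - (BOOL)prepareForDragOperation:(id <NSDraggingInfo>)sender {
        return dragStartedHere;
    }

    - (BOOL)performDragOperation:(id <NSDraggingInfo>)sender {
        // Resume local tracking: place the element at the drop point,
        // with no slide-back animation since the drop was accepted.
        NSPoint p = [self convertPoint:[sender draggingLocation] fromView:nil];
        [self moveDraggedElementTo:p];    // hypothetical helper
        dragStartedHere = NO;
        return YES;
    }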
I was able to get this working in iOS but I am having trouble with Win 8. What I need to do is this:
I have a drawing view that is the forward most child of a ScrollViewer. When I enable drawing mode from a button in the app bar, this view becomes active. With one touch, I want the user to be able to perform the current drawing action, but with two touches I want the touches to go to the scroll view instead for scrolling and zooming. I am able to distinguish between 1 and 2 (or more) fingers, but I don't know what to do after that point.
I tried removing the manipulation mode from the drawing view so that it would not block the scroll view, but the touches continued to be swallowed by the drawing view. I also tried calling ReleasePointerCapture but that had no effect either. How can I forward touch events using the WinRT API?
For some more information: I am making use of the PointerPressed, PointerMoved, etc. events in the drawing view.
I'd like to be notified when the mouse moves and get the location on the screen. I tried this:
[NSEvent addGlobalMonitorForEventsMatchingMask:NSMouseMovedMask handler:^(NSEvent *event) {
    CGPoint location = [NSEvent mouseLocation];
    NSLog(@"Position: %@", NSStringFromPoint(location));
}];
However, this seems to only work as long as the mouse is in my app's window. As soon as I leave it, I'm not notified until I enter the window again. Shouldn't this event be global?
UPDATE:
I extracted the code in question into a separate sample project (Dropbox-Link), just to be sure there is nothing else interfering with it. It's a clean project with just the code above in applicationDidFinishLaunching:. I get the same results, and it's really strange. I uploaded a video to YouTube: http://www.youtube.com/watch?v=I3AKgmURaMk.
These are my observations:
Immediately after launching the app, no events at all are delivered, no matter where I move the mouse.
Clicking the app's window will somehow activate event delivery. Now I receive NSMouseMovedMask events, no matter where I move the mouse (this is what I want to achieve).
Clicking back to Xcode doesn't change anything. I keep getting the events.
However, moving the focus back to my app results in strange behavior. When I move my mouse over Xcode I only get events over some parts of the window, while other parts seem to absorb the event. More specifically, I don't receive events over the editor or the log view; I only receive them while moving the mouse over the gray split view separator areas (with the tab bars embedded in them).
It should work; I've seen weird behavior when the window is up but not in the foreground. I've noticed that if the application is in the background but not in the Dock, you will only get events when the mouse transitions between windows in the foreground. Minimizing the application seems to fix this problem. I'm very new at Cocoa dev/ObjC, so I might be wrong, but I've noticed that when I minimize my app, I get all events.
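One more thing worth checking (not part of the answer above): addGlobalMonitorForEventsMatchingMask: only delivers events posted to other applications, so moves destined for your own windows never reach it. A local monitor covers those; a sketch combining both:

    // Global monitor: fires for mouse moves over other applications' windows.
    [NSEvent addGlobalMonitorForEventsMatchingMask:NSMouseMovedMask handler:^(NSEvent *event) {
        NSLog(@"Global: %@", NSStringFromPoint([NSEvent mouseLocation]));
    }];

    // Local monitor: fires for mouse moves sent to this application
    // (the window must accept mouse-moved events for these to be generated).
    [NSEvent addLocalMonitorForEventsMatchingMask:NSMouseMovedMask handler:^NSEvent *(NSEvent *event) {
        NSLog(@"Local: %@", NSStringFromPoint([NSEvent mouseLocation]));
        return event;   // return the event so normal dispatch continues
    }];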
I have a button and when the user touches down and holds a popup appears. However, when the user releases his thumb before the pop animation finishes I'd like the animation to stop where it is and autoreverse to the initial position. How can I accomplish this?
Currently I'm simply using UIView's animateWithDuration:animations:completion:. Do I have to set the animations explicitly in this case?
I've already tried reading the current state from the presentationLayer properties, but that somehow didn't work.
You can start the second animation using the UIViewAnimationOptionBeginFromCurrentState option. This will stop the first animation if it's still running.
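For example (a sketch; popupView, the duration, and the handler wiring are placeholders for your own setup):

    // Wired to the button's "touch up inside" / "touch up outside" events.
    - (void)buttonReleased:(UIButton *)sender {
        [UIView animateWithDuration:0.25
                              delay:0
                            options:UIViewAnimationOptionBeginFromCurrentState
                         animations:^{
                             // BeginFromCurrentState makes this start from the
                             // view's current in-flight state rather than the
                             // first animation's end value.
                             self.popupView.transform = CGAffineTransformIdentity;
                         }
                         completion:nil];
    }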