Detect a touch down event immediately in the bottom area of the screen in iOS 7 - cocoa-touch

In an iOS 7 application, a touch-down event in the bottom area of the screen (the region associated with Control Center) is not triggered immediately.
The touch-down event is not delivered when the finger touches down.
It is delivered only once the touch moves.
I know that this is not a bug but a feature of iOS 7.
But I believe there is some solution for it.
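For illustration only (not a fix; the class name is hypothetical), a minimal view subclass that logs when the touch-down actually arrives makes the described behavior visible: near the bottom edge, touchesBegan: may not be called until the finger moves.

#import <UIKit/UIKit.h>

// Hypothetical view subclass used only to observe the deferred delivery described above.
@interface BottomAreaView : UIView
@end

@implementation BottomAreaView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Near the bottom edge of the screen this may not fire until the finger moves.
    UITouch *touch = [touches anyObject];
    NSLog(@"touch down delivered at %@", NSStringFromCGPoint([touch locationInView:self]));
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"touch moved");
}

@end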

Related

What is "tap" on mobile platform?

What does the word "tap" mean on mobile platforms? I can see there is an IsTapEnabled property on UI.Element(s); however, I don't understand what kind of action a "tap" is. What is the difference between "tap" and "pressed"?
The Pressed event is raised when you start touching a control (your finger makes contact with the control on the screen).
The Tap event is raised after you release the touch from the control (you lift your finger from the screen). It is the equivalent of Clicked in desktop applications.
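For comparison with the Cocoa Touch questions in this collection, the same pressed/tap distinction maps roughly onto UIControlEventTouchDown and UIControlEventTouchUpInside. A minimal sketch, with a hypothetical view controller and illustrative selector names:

#import <UIKit/UIKit.h>

// Hypothetical controller; the selector names are illustrative.
@interface TapDemoViewController : UIViewController
@end

@implementation TapDemoViewController

- (void)configureButton:(UIButton *)button
{
    // "Pressed": fires as soon as the finger makes contact with the control.
    [button addTarget:self
               action:@selector(buttonPressed:)
     forControlEvents:UIControlEventTouchDown];

    // "Tap" (Clicked): fires when the finger is lifted while still inside the control.
    [button addTarget:self
               action:@selector(buttonTapped:)
     forControlEvents:UIControlEventTouchUpInside];
}

- (void)buttonPressed:(UIButton *)sender { NSLog(@"pressed"); }
- (void)buttonTapped:(UIButton *)sender  { NSLog(@"tapped"); }

@end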

NSView registering events on first click of mouse (making the app active)

I'm kind of pulling my hair out here. I have a single-window application with an NSScrollView and custom NSViews inside the scroll view. The custom NSViews are registering mouseUp and mouseDown events, but my problem is that when the app/window is inactive and you click anywhere on it to make it active, the mouseUp and mouseDown events are triggered in the NSView that you click on.
I overrode '(BOOL)acceptsFirstMouse:(NSEvent *)theEvent' to return NO just to be sure (I know this is the default).
I can't figure out what I'm doing wrong. I'm principally an iOS developer so my OS X experience is not super extensive. Any input helps. Thanks!
Found the issue. I had an NSTextField on the subview that was capturing the first mouseDown event. I just overlooked it.
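For reference, a minimal sketch of the setup described in the question (the class name ClickView is hypothetical): a custom NSView handling mouseDown:/mouseUp: and returning NO from acceptsFirstMouse:, so the click that activates an inactive window is not also treated as a click on the view.

#import <Cocoa/Cocoa.h>

// Hypothetical custom view class; mirrors the setup described in the question.
@interface ClickView : NSView
@end

@implementation ClickView

- (BOOL)acceptsFirstMouse:(NSEvent *)theEvent
{
    // NO (the default): the click that activates an inactive window should not
    // also be delivered to this view as a mouseDown:.
    return NO;
}

- (void)mouseDown:(NSEvent *)theEvent
{
    NSLog(@"mouseDown in %@", self);
}

- (void)mouseUp:(NSEvent *)theEvent
{
    NSLog(@"mouseUp in %@", self);
}

@end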

Observe global mouse location on screen in Cocoa?

I'd like to be notified when the mouse moves and get the location on the screen. I tried this:
[NSEvent addGlobalMonitorForEventsMatchingMask:NSMouseMovedMask handler:^(NSEvent *event) {
    CGPoint location = [NSEvent mouseLocation];
    NSLog(@"Position: %@", NSStringFromPoint(location));
}];
However, this seems to work only as long as the mouse is inside my app's window. As soon as I leave it, I'm not notified until I enter the window again. Shouldn't this event be global?
UPDATE:
I extracted the code in question into a separate sample project (Dropbox-Link), just to be sure there is nothing else interfering with it. It's a clean project with just the code above in applicationDidFinishLaunching:. I get the same results, and it's really strange. I uploaded a video to YouTube: http://www.youtube.com/watch?v=I3AKgmURaMk.
These are my observations:
Immediately after launching the app, no events at all are delivered, no matter where I move the mouse.
Clicking the app's window will somehow activate event delivery. Now I receive NSMouseMovedMask events, no matter where I move the mouse (this is what I want to achieve).
Clicking back to Xcode doesn't change anything. I keep getting the events.
However, moving the focus back to my app results in a strange behavior. When I move my mouse over Xcode I only get events over some parts of the window, while some parts seem to absorb the event. More specifically I don't receive events over the editor or the log view. I only receive them while moving the mouse over the gray split view separator area (with the tab bars embedded in them).
It should work; I've seen weird behavior when the window is up but not in the foreground. I've noticed that if the application is in the background but not in the Dock, you will only get events when the mouse transitions between windows in the foreground. Minimizing the application seems to fix this problem. I'm very new to Cocoa development and Objective-C, so I might be wrong, but I've noticed that when I minimize my app, I get all events.
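One detail worth noting (per Apple's documentation for these NSEvent monitor APIs): a global monitor only observes events dispatched to other applications, so mouse moves inside your own app have to be caught with a local monitor. A minimal sketch, assuming a hypothetical AppDelegate class rather than the original sample project, that installs both:

#import <Cocoa/Cocoa.h>

// Hypothetical app delegate; not the original sample project.
@interface AppDelegate : NSObject <NSApplicationDelegate>
@end

@implementation AppDelegate

- (void)applicationDidFinishLaunching:(NSNotification *)notification
{
    // Global monitor: observes mouse-moved events dispatched to *other* applications.
    [NSEvent addGlobalMonitorForEventsMatchingMask:NSMouseMovedMask
                                           handler:^(NSEvent *event) {
        NSLog(@"Global: %@", NSStringFromPoint([NSEvent mouseLocation]));
    }];

    // Local monitor: observes mouse-moved events dispatched to this application,
    // which the global monitor does not receive.
    [NSEvent addLocalMonitorForEventsMatchingMask:NSMouseMovedMask
                                          handler:^NSEvent *(NSEvent *event) {
        NSLog(@"Local: %@", NSStringFromPoint([NSEvent mouseLocation]));
        return event;   // return the event so normal dispatch continues
    }];
}

@end

Depending on the window setup, mouse-moved events inside the app's own windows may also require acceptsMouseMovedEvents to be enabled on the window.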

UIButton events. What's the difference?

I've encountered a problem where my button should remain "pressed down" while it shows a popover presented from it. The popover is a selector for some filter, and the filter is shown on the button itself. When I tap the button and it shows the popover, the button becomes deselected no matter what.
I think I have to redefine its behavior on the touch event and make it respond to something other than the standard touch up inside. That made me wonder what the other events are responsible for, but I couldn't find a list of the events in the iOS library documentation, and on Stack Overflow there are only questions about incorrect behavior of touch up inside or touch down.
So what's the difference between the touch events?
touch cancel - when you touch the button but move your finger away and it remains deselected?
touch down - right on tap.
touch down repeat - ??
touch drag enter - ??
touch drag exit - ??
touch drag inside - ??
touch drag outside - ??
touch up inside - when you tap and release the button while staying within its bounds. It changes the UIButton's state to Normal.
touch up outside - when you tap and release the button outside its bounds?
Other IBActions are not sent by UIButton, right?
Also, how do those events change the UIButton's appearance, e.g. highlighted or selected?
I'd appreciate a link to a good article about IBActions, because I couldn't find one.
From Apple's doc for UIControlEvents:
UIControlEventTouchCancel
A system event canceling the current touches for the control.
UIControlEventTouchDown
A touch-down event in the control.
UIControlEventTouchDownRepeat
A repeated touch-down event in the control; for this event the value of the UITouch tapCount method is greater than one.
UIControlEventTouchDragEnter
An event where a finger is dragged into the bounds of the control.
UIControlEventTouchDragExit
An event where a finger is dragged from within a control to outside its bounds.
UIControlEventTouchDragInside
An event where a finger is dragged inside the bounds of the control.
UIControlEventTouchDragOutside
An event where a finger is dragged just outside the bounds of the control.
UIControlEventTouchUpInside
A touch-up event in the control where the finger is inside the bounds of the control.
UIControlEventTouchUpOutside
A touch-up event in the control where the finger is outside the bounds of the control.
Listed in what I would consider the order of common use/likelihood of occurrence for a normal button:
UIControlEventTouchDown: The user tapped the button. This fires as soon as the finger/stylus makes contact.
UIControlEventTouchUpInside: The user tapped the button. This fires when the finger/stylus is pulled back away from the screen while still inside the control's bounds.
The following are useful for sliders and drag events, like moving a component around. They are listed in order of occurrence:
UIControlEventTouchDragInside: Triggered continuously as the finger drags within the bounds of the button.
UIControlEventTouchDragExit: Triggered during a drag motion. It is called only once, as the user's finger/stylus leaves the bounds of the button.
UIControlEventTouchDragOutside: Triggered during a drag motion, after UIControlEventTouchDragExit, and called continuously for as long as the original touch continues outside the bounds.
UIControlEventTouchUpOutside: This is simply the finger/stylus being lifted, BUT only if the finger/stylus is no longer within the bounds of the button. The important thing (and probably obvious) to call out is that the touch had to have been within the button at some point for this event to be associated with the button.
Note: My understanding is that the above can be helpful for:
Sliders: as you might expect, the touch may have been intentional, but because of the quick swipe action the finger movement may be sloppy and lift up outside of the slider area.
Moving components around: when you push things around a screen, you want the movement to happen while the finger/stylus touches the border of the component/object.
Other events:
UIControlEventTouchCancel: Something out of the user's control is cancelling their touch action. Think of this as something "going wrong" on the phone side of things.
UIControlEventTouchDownRepeat: Want to detect when your user is mad and tapping a button furiously? Want to detect if they're still in Windows mode and are trying to "double click"? Or maybe you designed a button to do something different if they tap twice. This event helps with all of those!
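A minimal sketch that wires several of the events listed above to separate actions, so their firing order can be observed by pressing the button and dragging on and off it. The controller class and selector names are hypothetical, not taken from the answer:

#import <UIKit/UIKit.h>

// Hypothetical controller and selector names, for observing the firing order.
@interface EventDemoViewController : UIViewController
@end

@implementation EventDemoViewController

- (void)wireUpButton:(UIButton *)button
{
    [button addTarget:self action:@selector(touchDown:)   forControlEvents:UIControlEventTouchDown];
    [button addTarget:self action:@selector(dragExit:)    forControlEvents:UIControlEventTouchDragExit];
    [button addTarget:self action:@selector(dragOutside:) forControlEvents:UIControlEventTouchDragOutside];
    [button addTarget:self action:@selector(upInside:)    forControlEvents:UIControlEventTouchUpInside];
    [button addTarget:self action:@selector(upOutside:)   forControlEvents:UIControlEventTouchUpOutside];
    [button addTarget:self action:@selector(touchCancel:) forControlEvents:UIControlEventTouchCancel];
}

- (void)touchDown:(UIButton *)sender   { NSLog(@"touch down"); }
- (void)dragExit:(UIButton *)sender    { NSLog(@"touch drag exit"); }
- (void)dragOutside:(UIButton *)sender { NSLog(@"touch drag outside"); }
- (void)upInside:(UIButton *)sender    { NSLog(@"touch up inside"); }
- (void)upOutside:(UIButton *)sender   { NSLog(@"touch up outside"); }
- (void)touchCancel:(UIButton *)sender { NSLog(@"touch cancel"); }

@end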
References:
SO 1: Difference between UIControlEventTouchDragOutside and UIControlEventTouchDragExit
SO 2: What is UIControlEventTouchCancel?

Touch detection without event processing

I am using cocos2d. I would like to be able to detect whether the screen is touched at a particular instant - that is, rather than intercepting an event when it occurs, I want to detect the presence of a touch at a particular moment.
The reason is that I am animating sprites and want to determine whether a sprite should keep moving, i.e. whether the screen is still touched. I cannot use ccTouchesEnded because each time an animation starts I set isTouchEnabled to false; I also want the user to be able to tap rapidly on the screen to move the sprite, but if they tapped too rapidly it would interfere with the sprite's position during the animation, which I have found screws up the positions of my objects in weird ways.
Is this possible?
There does not appear to be any public API to detect touches other than enabling and receiving these events in the main UI run loop.
You can keep handling events and record the state left by the last touch event in a model object or a global variable. Then you can poll your app's own internal state at any time.
Instead of disabling touches, you can just have your touch handler not do anything inappropriate if the event timestamp is too close to some animation start time.
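A minimal sketch of that approach, assuming the older cocos2d-iphone CCLayer touch callbacks (the method names differ between cocos2d versions, and the layer class is hypothetical): touch handling stays enabled, the layer records whether the screen is currently touched, and the animation code polls that flag instead of reacting to individual events.

#import "cocos2d.h"

// Hypothetical layer class; assumes the older cocos2d-iphone CCLayer callbacks.
@interface GameLayer : CCLayer
@property (nonatomic, assign) BOOL screenTouched;   // polled by the animation code
@end

@implementation GameLayer

- (id)init
{
    if ((self = [super init])) {
        self.isTouchEnabled = YES;   // keep receiving events; never toggle this off
        [self scheduleUpdate];
    }
    return self;
}

- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    self.screenTouched = YES;
}

- (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    self.screenTouched = NO;
}

- (void)ccTouchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    self.screenTouched = NO;
}

- (void)update:(ccTime)delta
{
    // The animation code polls the recorded state instead of reacting to events.
    if (self.screenTouched) {
        // keep the sprite moving
    }
}

@end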