I have a CCButton, which is initialised with two images, normal and pressed.
The problem is that when I touch the button, it reacts and the images change, but nothing else happens. I have set a selector, and it never fires. When I log the location of each touch, it shows up in the console wherever I touch, except when I touch the button - it's as if the button swallows the touch but does nothing with it.
What could be the reason for this behaviour? The button reacts and switches between its unpressed/pressed states, but it does not execute the given selector, and when it is pressed the touch location does not get logged to the console.
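For reference, a minimal sketch of the kind of setup being described, assuming a cocos2d 2.x-style CCMenuItemImage wrapped in a CCMenu (the layer class, image names, position and selector below are placeholders, not taken from the question):

@implementation GameLayer

- (id)init {
    if ((self = [super init])) {
        // Item with a normal and a pressed image, plus the selector that should fire.
        CCMenuItemImage *button =
            [CCMenuItemImage itemWithNormalImage:@"button_normal.png"
                                   selectedImage:@"button_pressed.png"
                                          target:self
                                        selector:@selector(buttonTapped:)];
        // The item only receives touches through the CCMenu that owns it.
        CCMenu *menu = [CCMenu menuWithItems:button, nil];
        menu.position = ccp(240, 50);
        [self addChild:menu];
    }
    return self;
}

- (void)buttonTapped:(id)sender {
    CCLOG(@"button tapped"); // expected when the item is released inside its bounds
}

@end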
We built a React Native tablet kiosk app which displays multiple pages of input fields to a user at a front desk. The user has to fill out all the forms and can send them at the end.
Users can just walk away from the kiosk at any time, which would result in their last screen being the "welcome screen" for the next user. To avoid that, the app resets after some time if there was no user interaction (any touch event on the screen).
Right now, we use a countdown and reset it on each input field, button and background touch there is. This results in passing the reset callback to a lot of components. It works, but it is just a lot of redundancy and can lead to errors easily.
Is there any way we can add an overlay on top of the whole view that catches all touch events and calls the reset callback, but also passes the touch event through to the views below? So when a user taps a button, the overlay calls its callback and the button is still pressed (same for input fields, etc.).
We also tried the Gesture Responder System, but could not get the touch event to pass through - it was always consumed by the component with the Gesture Responder System.
You can try adding a pointerEvents attribute to your View.
I have an application that has a canvas (NSView) where a user can drag an element around. When the mouse leaves the edge of that view it becomes a drag operation.
What I would like to do is, when the mouse enters the originating view again, cancel the drag operation and automatically resume the move within the canvas.
I can figure out the second part; I just need to figure out how to force a drag and drop operation to cancel. I need to do this somehow from draggingEntered:, so before the mouse is even released.
Make your canvas view respond to drags as well. Initially all its <NSDraggingDestination> protocol methods would just return “no, do nothing”, but if you start a drag from within the canvas you’d keep track of that, and once the drag leaves and comes back your NSDraggingDestination methods would return, “OK, we accept, the drag is over, don’t bother animating.”
Then you could continue tracking locally. Like, assuming your canvas had called:
- (void)dragImage:(NSImage *)anImage at:(NSPoint)viewLocation offset:(NSSize)initialOffset event:(NSEvent *)event pasteboard:(NSPasteboard *)pboard source:(id)sourceObj slideBack:(BOOL)slideFlag;
That method would then return control to your canvas once the drag ends, and you could pick up tracking the move locally from there.
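A rough sketch of the bookkeeping described above (the class name, property name and pasteboard type are placeholders, not from the original answer): the canvas registers as a drag destination, refuses foreign drags, and accepts its own drag once it comes back so that no slide-back animation plays and local tracking can resume:

@interface CanvasView : NSView <NSDraggingDestination>
@property (nonatomic) BOOL localDragInProgress; // set to YES when a drag starts from this canvas
@end

@implementation CanvasView

- (void)awakeFromNib {
    // Register for whatever type the canvas itself puts on the pasteboard.
    [self registerForDraggedTypes:@[NSPasteboardTypeString]];
}

- (NSDragOperation)draggingEntered:(id<NSDraggingInfo>)sender {
    // "No, do nothing" for drags that didn't originate here.
    if (!self.localDragInProgress) return NSDragOperationNone;
    // Our own drag came back home: signal that we'll take it.
    return NSDragOperationMove;
}

- (BOOL)prepareForDragOperation:(id<NSDraggingInfo>)sender {
    return self.localDragInProgress;
}

- (BOOL)performDragOperation:(id<NSDraggingInfo>)sender {
    if (!self.localDragInProgress) return NO;
    self.localDragInProgress = NO;
    // Accepting the drop ends it without the slide-back animation;
    // from here the canvas can go back to its own mouseDragged: tracking.
    return YES;
}

@end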
I have an iPad app that presents a UITextField inside a UIPopover when a button is tapped. This button is near the bottom of the screen. So, the user taps the button, the popover appears and becomes the first responder, which causes the keyboard to appear. This, in turn, causes the popover to move up as the keyboard slides in. This works fine, except for VoiceOver.
It appears that VoiceOver gets confused by the moving view. It starts to describe the new text field, but then stops mid-word as soon as it starts to move.
Does anyone know of a good work-around? The best I've come up with so far is to listen for UIKeyboardDidShowNotification and then find some way to prompt VoiceOver to talk again, though I'm not sure how to kick VoiceOver into action.
You can inform VoiceOver of changes to your screen layout by posting one of the accessibility notifications - UIAccessibilityLayoutChangedNotification or UIAccessibilityScreenChangedNotification would be good candidates.
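For example, a minimal sketch of that workaround (self.popoverTextField is a placeholder for whatever text field sits in the popover): listen for UIKeyboardDidShowNotification and, once the keyboard is up, post a layout-changed notification whose argument tells VoiceOver which element to announce next:

- (void)viewDidLoad {
    [super viewDidLoad];

    [[NSNotificationCenter defaultCenter] addObserverForName:UIKeyboardDidShowNotification
                                                      object:nil
                                                       queue:[NSOperationQueue mainQueue]
                                                  usingBlock:^(NSNotification *note) {
        // Tell VoiceOver the layout changed and hand it the element to read,
        // so it re-announces the field after the popover has finished moving.
        UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification,
                                        self.popoverTextField);
    }];
}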
I'd like to be notified when the mouse moves and get the location on the screen. I tried this:
[NSEvent addGlobalMonitorForEventsMatchingMask:NSMouseMovedMask handler:^(NSEvent *event) {
    CGPoint location = [NSEvent mouseLocation];
    NSLog(@"Position: %@", NSStringFromPoint(location));
}];
However, this seems to only work as long as the mouse is over my app's window. As soon as I leave it, I'm no longer notified until I enter the window again. Shouldn't this monitor be global?
UPDATE:
I extracted the code in question and made a separate sample project (Dropbox-Link). Just to be sure, there is nothing else interfering with it. It's a clean project with just the code above in applicationDidFinishLaunching:. I get the same results and it's really strange. I uploaded a video to youtube: http://www.youtube.com/watch?v=I3AKgmURaMk.
These are my observations:
Immediately after launching the app, no events at all are delivered, no matter where I move the mouse.
Clicking the app's window will somehow activate event delivery. Now I receive NSMouseMovedMask events, no matter where I move the mouse (this is what I want to achieve).
Clicking back to Xcode doesn't change anything. I keep getting the events.
However, moving the focus back to my app results in strange behavior. When I move my mouse over Xcode I only get events over some parts of the window, while other parts seem to absorb the event. More specifically, I don't receive events over the editor or the log view; I only receive them while moving the mouse over the gray split view separator area (with the tab bars embedded in it).
It should work; I've seen weird behavior when the window is up but not in the foreground. I've noticed that if the application is in the background but not in the Dock, you will only get events when the mouse transitions between windows in the foreground. Minimizing the application seems to fix this problem. I'm very new at Cocoa dev/ObjC, so I might be wrong, but I've noticed that when I minimize my app, I get all events.
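One detail that isn't spelled out in the thread, but is worth keeping in mind when reproducing this: a global monitor only receives copies of events dispatched to other applications, so it is commonly paired with a local monitor to also cover the app's own windows. A sketch, placed in applicationDidFinishLaunching: like the original code:

- (void)applicationDidFinishLaunching:(NSNotification *)aNotification {
    // Global monitor: mouse-moved events delivered to *other* applications.
    [NSEvent addGlobalMonitorForEventsMatchingMask:NSMouseMovedMask
                                           handler:^(NSEvent *event) {
        NSLog(@"global: %@", NSStringFromPoint([NSEvent mouseLocation]));
    }];

    // Local monitor: mouse-moved events delivered to this app's own windows.
    [NSEvent addLocalMonitorForEventsMatchingMask:NSMouseMovedMask
                                          handler:^NSEvent *(NSEvent *event) {
        NSLog(@"local: %@", NSStringFromPoint([NSEvent mouseLocation]));
        return event; // pass the event on unchanged
    }];
}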
I've encountered a problem where my button should remain "pressed down" while it shows the popover presented from it. The popover is a selector for some filter, and the filter is shown on the button itself. When I tap the button and it shows the popover, the button becomes deselected no matter what.
I think I have to redefine its behavior on touch events and make it respond to something other than the standard touch up inside. That made me wonder what the other events are responsible for, but I couldn't find an event list in the iOS library, and on StackOverflow there are only questions about the incorrect behavior of touch up inside or touch down.
So what's the difference between the touch events?
touch cancel - when you touch the button but move your finger away, and it remains deselected?
touch down - right on tap.
touch down repeat - ??
touch drag enter - ??
touch drag exit - ??
touch drag inside - ??
touch drag outside - ??
touch up inside - when you tap and release the button while staying within its bounds. It changes the UIButton's state to Normal.
touch up outside - when you tap and release the button after leaving its bounds?
Other IBActions are not sent by UIButton, right?
Also, how do those events change a UIButton's appearance, like highlighted or selected?
I'd appreciate a link to a good article about IBActions, because I couldn't find one.
From Apple's doc for UIControlEvents:
UIControlEventTouchCancel
A system event canceling the current touches for the control.
UIControlEventTouchDown
A touch-down event in the control.
UIControlEventTouchDownRepeat
A repeated touch-down event in the control; for this event the value of the UITouch tapCount method is greater than one.
UIControlEventTouchDragEnter
An event where a finger is dragged into the bounds of the control.
UIControlEventTouchDragExit
An event where a finger is dragged from within a control to outside its bounds.
UIControlEventTouchDragInside
An event where a finger is dragged inside the bounds of the control.
UIControlEventTouchDragOutside
An event where a finger is dragged just outside the bounds of the control.
UIControlEventTouchUpInside
A touch-up event in the control where the finger is inside the bounds of the control.
UIControlEventTouchUpOutside
A touch-up event in the control where the finger is outside the bounds of the control.
Listed in what I would consider the order of common use/likelihood of occurrence for a normal button:
UIControlEventTouchDown: The user tapped the button. This fires as soon as the finger/stylus makes contact.
UIControlEventTouchUpInside: The user tapped the button. This fires when the finger/stylus is lifted off the screen while still inside the button's bounds.
The following are useful for sliders and drag interactions such as moving a component around; they are listed in order of occurrence:
UIControlEventTouchDragInside: Triggered continuously as the finger is dragged around within the button's bounds.
UIControlEventTouchDragExit: Triggered during a drag motion. It is called only once, as the user's finger/stylus leaves the bounds of the button.
UIControlEventTouchDragOutside: Triggered during a drag motion, after UIControlEventTouchDragExit, and called continuously for as long as the original touch continues outside the button's bounds.
UIControlEventTouchUpOutside: This is simply the finger/stylus being lifted, BUT only if the finger/stylus is no longer within the bounds of the button. The important thing (and probably obvious) to call out is that the touch had to have started within the button for this event to be associated with the button.
Note: My understanding is that the above can be helpful for:
Sliders: the touch may well be intentional, but because of the quick swipe action the finger movement can be sloppy and end up lifting outside the slider's area.
Moving components around: when you push things around the screen, you want the movement to keep tracking even as the finger/stylus crosses the border of the component/object.
Other events:
UIControlEventTouchCancel: Something out of the user's control is cancelling their touch action. Think of this as something "going wrong" on the phone side of things.
UIControlEventTouchDownRepeat: Want to detect when your user is mad and tapping a button furiously? Want to detect if they're still in Windows mode and are trying to "double click"? Or maybe you designed a button to do something different if they tap twice. This event helps with all of those!
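Putting these together, here is a small sketch (not from the answer above; the selector names, frame, and logging are placeholders) of how several of these control events can be observed on a single button with addTarget:action:forControlEvents::

- (void)viewDidLoad {
    [super viewDidLoad];

    UIButton *button = [UIButton buttonWithType:UIButtonTypeSystem];
    [button setTitle:@"Tap me" forState:UIControlStateNormal];
    button.frame = CGRectMake(100, 100, 120, 44);
    [self.view addSubview:button];

    [button addTarget:self action:@selector(tapped:)
            forControlEvents:UIControlEventTouchUpInside];
    [button addTarget:self action:@selector(touchedDown:)
            forControlEvents:UIControlEventTouchDown];
    [button addTarget:self action:@selector(doubleTapped:)
            forControlEvents:UIControlEventTouchDownRepeat];
    // Several events can share one selector, e.g. to track any drag activity:
    [button addTarget:self action:@selector(dragged:)
            forControlEvents:(UIControlEventTouchDragInside | UIControlEventTouchDragOutside)];
}

- (void)tapped:(UIButton *)sender       { NSLog(@"touch up inside"); }
- (void)touchedDown:(UIButton *)sender  { NSLog(@"touch down"); }
- (void)doubleTapped:(UIButton *)sender { NSLog(@"touch down repeat"); }
- (void)dragged:(UIButton *)sender      { NSLog(@"dragging"); }

As for the appearance question: UIControl sets highlighted automatically while it is tracking a touch inside its bounds, whereas selected is never changed automatically - if a button should stay "pressed down" (as in the original question), you have to set sender.selected yourself in one of these action methods.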
References:
SO 1: Difference between UIControlEventTouchDragOutside and UIControlEventTouchDragExit
SO 2: What is UIControlEventTouchCancel?