I have this view hierarchy:
The three buttons don't respond. They are enabled, but when I touch them they don't do anything (not even change their gradient/tint), which tells me they aren't even receiving the touches. Is there something I am doing wrong?
Thanks
User interaction is enabled on all the buttons' superviews, and the buttons themselves. The buttons are also enabled.
Are you using any image views on top of, or containing, the UIButtons? If so, those can keep the touches from reaching the buttons, because UIImageView has its userInteractionEnabled boolean set to NO by default, so buttons placed inside one never receive touches. You could even get rid of your gradient image view (for the time being) to determine whether it is causing the issue.
If not, I would set a breakpoint in viewDidLoad of the containing view controller and run po [self.view recursiveDescription] in the debugger. It should give you an at-a-glance overview of what's around those buttons and what might be stealing the touches.
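For example (gradientImageView is a hypothetical name for whatever image view overlays or contains the buttons):

gradientImageView.userInteractionEnabled = YES; // UIImageView defaults to NO, so buttons inside it never see touches

And to dump the hierarchy, pause at the breakpoint and run this in the debugger (recursiveDescription is a private debugging helper, so keep it out of shipping code):

po [self.view recursiveDescription]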
Good luck!
This may sound like a weird question, but what exactly happens when a UIView becomes hidden? It would be great to see the UIView source code, but that isn't going to happen.
Here's why I'm wondering:
I'm trying to add a UIWindow (a transparent one with userInteractionEnabled set to NO) above my application to tint the screen. It works perfectly fine until the user tries to share by SMS using Apple's MessageUI.framework. When this happens and the MFMessageComposeViewController or MFMailComposeViewController appears, these view controllers won't receive user input. I've tried tons of things, and the only ones that worked, allowing the user to interact with those views, were setting the UIWindow (the one I added) to either an alpha of 0 or hidden = YES. I want to replicate this without hiding the view, which is why I want to know exactly what happens when a UIWindow (which is a subclass of UIView) is hidden.
There is usually only one window in iOS apps. You're better off using just a UIView for this task instead of a UIWindow. UIWindow adds some view hierarchy and event management capabilities to the UIView class. This functionality is interfering with the expected behavior in your app. I think it will just work if you change the class of this view to UIView instead of UIWindow.
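Roughly, a minimal sketch of that idea (tintOverlay is a hypothetical name, and this assumes you add it from somewhere that can reach the key window):

UIView *tintOverlay = [[UIView alloc] initWithFrame:[UIScreen mainScreen].bounds];
tintOverlay.backgroundColor = [[UIColor blackColor] colorWithAlphaComponent:0.3]; // the tint
tintOverlay.userInteractionEnabled = NO; // touches fall through to whatever is underneath
tintOverlay.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
[[[UIApplication sharedApplication] keyWindow] addSubview:tintOverlay];

Because it lives inside the app's existing window, it doesn't compete with UIKit's own window-level event handling the way a second UIWindow does.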
I am working on an app that works a bit like MS Paint (something to draw lines with, etc.).
I have a white UIView, which is basically where the user draws. On top of this UIView I set up a UIImage, which is gray, with a 0.4 alpha. I want this UIImage to be used as blotting paper. The idea is to disable touch where the user rests the palm of their hand on this area, so it's more comfortable to draw (multitouch is disabled, and with this "blotting paper" you won't accidentally draw something with your palm).
Even if I bring the UIImage to the front, on top of the view, and even if I disable user interaction on this UIImage, it is still possible to draw on the UIView behind the UIImage (which is kind of strange!).
I do not understand what's happening: it seems that the image is transparent and that the UIView "behind" it is still active, even though it's overlaid by the UIImage?!
Any help/indication/idea would be much appreciated! Thank you :-)
Have you set the "userInteractionEnabled" property of the UIImage to "NO"?
You may actually want to do the opposite. When you disable user interaction or touches, the view basically becomes invisible to touches and they are passed on to the next view.
In your case you do want userInteractionEnabled because you want the view to catch those touches.
You have to disable the user interaction on the UIImageView, not the UIImage, and it should work.
Edit:
Or you could be sneaky and just add an empty view over it. Use the same frame size so it overlaps perfectly, and that's it. You'll be able to see everything you need, and since it's not a subview of it there will be no interaction: touches will get registered but won't have any effect. :P
No better ideas unless you post some of your code...
OK, so I managed to do what I wanted to! YAY!
I have 3 different classes:
StrokesViewController (UIViewController) - the view controller
StrokesView (UIView) - the view where the user draws the strokes.
BlottingPaper (UIView) - the blotting paper.
I have a XIB file "linked" to all three.
I created a new class called "BlottingPaper", typed as a UIView. The .h and .m files are actually empty (aside from #import <Foundation/Foundation.h>).
User interaction is enabled on BlottingPaper.
I do not use exclusiveTouch on this class.
In the XIB file, I just put a view on top of StrokesView and set its class to BlottingPaper (then modify the alpha as I want, and so on).
And that's it! When I put the palm of my hand on it, it doesn't draw anything in the area where my hand is, but I can still draw with my finger on the rest of the StrokesView!
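For anyone who wants the same thing in code rather than a XIB, here's a rough sketch of the setup described above (blottingFrame and strokesView are hypothetical names; the key point is that the overlay keeps userInteractionEnabled at YES so it swallows the palm's touches instead of letting them reach StrokesView):

// BlottingPaper.h -- intentionally empty UIView subclass
#import <UIKit/UIKit.h>
@interface BlottingPaper : UIView
@end

// BlottingPaper.m
#import "BlottingPaper.h"
@implementation BlottingPaper
@end

// In StrokesViewController, the XIB wiring roughly amounts to:
BlottingPaper *paper = [[BlottingPaper alloc] initWithFrame:blottingFrame];
paper.backgroundColor = [UIColor grayColor];
paper.alpha = 0.4;
paper.userInteractionEnabled = YES; // the default, but this is what makes it absorb touches
[self.strokesView addSubview:paper];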
In addition to Dancreek's response, you should be setting buvard.userInteractionEnabled = YES; so that it captures interaction.
You should also set buvard.exclusiveTouch = YES; so that buvard is the only view which will receive touch events.
When you remove buvard you should set buvard.exclusiveTouch = NO; so that other views will regain their ability to receive touches.
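Something like this, assuming buvard is added and removed by the same view controller:

// Showing the blotting view
buvard.userInteractionEnabled = YES; // buvard catches the touches itself
buvard.exclusiveTouch = YES;         // no other view receives touches while it is on screen
[self.view addSubview:buvard];

// Removing it later
buvard.exclusiveTouch = NO;          // let other views receive touches again
[buvard removeFromSuperview];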
I have a few Popovers that I present from UIBarButtonItems.
The popovers are Property settings for an Object, Color, Size, Positioning, name, line thickness, etc.
As the settings in the popover change, the Object they are properties for also changes in realtime.
Most of the time a tap away from the popover dismisses it. But sometimes something happens in the environment where the responder chain seems to be broken, and a tap away from the popover no longer dismisses it. The property controls in the popover still work, but there is no way to dismiss the popover.
On popovers with text fields, I can alter the text, dismiss the keyboard, and then dismiss the popover normally.
Any tips on where to start logging to see who's on top of the responder chain and who's able to receive gestures/touches?
Thanks!
When I was debugging recognizers and such (similar), I put a UITapGestureRecognizer on the main background view, calling a method viewTapped:. In viewTapped: you can dismiss any outstanding popovers (handy if they are properties of the VC). You can also send resignFirstResponder if you worry about any textFields that might not be cooperating.
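Something along these lines, assuming the popover and text field are properties of the view controller (propertyPopover and activeTextField are hypothetical names):

// In viewDidLoad (or wherever the background view is configured):
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(viewTapped:)];
[self.view addGestureRecognizer:tap];

// Elsewhere in the view controller:
- (void)viewTapped:(UITapGestureRecognizer *)recognizer {
    if (self.propertyPopover.popoverVisible) {
        [self.propertyPopover dismissPopoverAnimated:YES];
    }
    [self.activeTextField resignFirstResponder]; // in case a text field is still holding first responder
}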
Are you setting any passthrough views? You might examine those. Normally you don't have any ability to tap outside the popover without dismissing it unless you specifically set passthroughViews.
Good luck,
Damien
I have a simple question. In my app, I have some buttons for navigation inside a UIScrollView, which scroll with the content. This way, when the user enters a text field and the keyboard pops up, the buttons scroll away to make extra space. However, the buttons don't highlight immediately when I tap them. I've learned that I can eliminate this problem by setting delaysContentTouches to NO, but that makes scrolling nearly impossible, because all the UITextFields and buttons in the view then highlight immediately and steal the scroll.
I have found a way to leave only the buttons undelayed via a UIScrollView subclass, so that is an option, but I was wondering if there is another way. I generally hate subclassing when it is to fix just one little thing.
The touchesShouldBegin:withEvent:inContentView: method of UIScrollView is intended to be overridden by subclasses when delaysContentTouches comes into play. So this is a case where subclassing is completely OK.
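If you do go the subclass route, a common variant pairs delaysContentTouches = NO with the companion hook touchesShouldCancelInContentView:, which is what this sketch does (ButtonFriendlyScrollView is a hypothetical name):

@interface ButtonFriendlyScrollView : UIScrollView
@end

@implementation ButtonFriendlyScrollView

- (id)initWithCoder:(NSCoder *)aDecoder {
    if ((self = [super initWithCoder:aDecoder])) {
        self.delaysContentTouches = NO; // buttons highlight immediately
    }
    return self;
}

// Returning YES lets the scroll view cancel a button's touch once the finger
// actually drags, so scrolling still works.
- (BOOL)touchesShouldCancelInContentView:(UIView *)view {
    if ([view isKindOfClass:[UIButton class]]) {
        return YES;
    }
    return [super touchesShouldCancelInContentView:view];
}

@end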
I have a UIButton linked up correctly in IB (I believe). The button fires inconsistently: every time I reload the view to show updated info, the button sometimes works and sometimes does not. It gives no errors. I can't find a pattern to when it works and when it doesn't; the same code runs every time I open the view, and it still only works when it wants to. Besides linking it in IB, I have also tried adding the target in viewDidLoad and removing the IB connection, but it still has the same inconsistency:
[_buttonScreen addTarget:self action:@selector(buttonScreenClicked) forControlEvents:UIControlEventTouchUpInside];
If I add NSLog(@"Clicked"); to buttonScreenClicked, I can see that the method doesn't always get called. What would cause it to do this? I have made sure that I set:
[_buttonScreen setAlpha:0.1];
[_buttonScreen setHidden:NO];
[_buttonScreen setUserInteractionEnabled:YES];
I have no Image, text, or color in the button, but it still works sometimes.
I'm using AFKPageFlipper on the same view, but it had the same problem before I added AFKPageFlipper, so I don't think it's that.
If anyone could point me in any direction to start troubleshooting this problem, I would appreciate it.
Thanks
I just had the same problem and worked it out. The 5 seconds is the clue.
Somewhere you have a gesture recognizer covering the same space as your button. More specifically, you have a gesture recognizer that is eating your taps but not your long presses. If you just tap the button, the gesture recognizer runs off with your event; hold your finger down long enough and the gesture recognizer no longer considers it a tap, so the event is passed through to your button.
Instrument your Tap gesture recognizer handlers and the problem should pop out at you.
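If the recognizer turns out to be one of yours, two possible fixes (tapRecognizer is a hypothetical name): stop it from cancelling the touches it watches, or have its delegate ignore touches that start on a control.

// Option 1: let the button still receive the touch
tapRecognizer.cancelsTouchesInView = NO;

// Option 2: as the recognizer's delegate, skip touches that land on controls
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    if ([touch.view isKindOfClass:[UIControl class]]) {
        return NO; // let the button handle this touch itself
    }
    return YES;
}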
Make sure you don't have any other UIView descendants overlaying the button (like a transparent UIScrollView) as these will intercept the touch events first.
Also make sure that the containing view (the view the button sits in) is correctly sized. By default you can place a view outside the bounds of another view, and since clipsToBounds is NO by default, you will see it but not be able to interact with it.
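A quick way to check that second point is to log the button's frame against its superview's bounds, something like:

// If the button's frame falls outside the superview's bounds it will still be
// drawn (clipsToBounds defaults to NO) but it won't receive touches.
BOOL inside = CGRectContainsRect(_buttonScreen.superview.bounds, _buttonScreen.frame);
NSLog(@"button frame %@ inside superview bounds %@? %d",
      NSStringFromCGRect(_buttonScreen.frame),
      NSStringFromCGRect(_buttonScreen.superview.bounds),
      inside);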
Things to try:
Do you have any other actions on the button?
Do you have any other UIViews which could possibly be intercepting the touches (above or below, or un-shown)?
Also, please check that you have only one UIViewController instance for this screen. Other issues may arise because of that.
What happens if you don't set the alpha level?
Do you release the object properly, and only in dealloc?
Hope this helps