I am creating an iOS app without using Interface Builder and I seem to be missing something vital whereby the controls I am creating (UITextField, UIButton, etc.) are not responding to touch events.
Here's my view hierarchy:
UIWindow->UIView->(UITextField, UIButton)
I am able to create the above hierarchy and everything shows up fine on the screen, but tapping on the UITextField or the UIButton does nothing. I also tried adding some UIGestureRecognizer subclass instances to the UIView, to no avail.
I am able to call becomeFirstResponder on the UITextField, and the application starts with the keyboard up and able to receive input. However, when the keyboard is dismissed, the interface goes back to its "dead" state.
What am I missing?
I was calling init on my UIWindow instead of initWithFrame:. For some reason everything was drawing correctly but the UIWindow was not responding to events because every user tap was outside the bounds of the zero-size window.
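A minimal sketch of the fix, assuming a standard app-delegate setup (the `viewController` name is a placeholder):

```objc
// Create the window with an explicit frame. Calling plain -init
// yields a zero-size window that still draws its subviews but
// never receives touches, since every tap lands outside its bounds.
self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
self.window.rootViewController = viewController;
[self.window makeKeyAndVisible];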
Related
I have a class that subclasses UIView. The class only has touch methods (to rotate the view according to the user's touches).
From the storyboard I added a view to my ViewController's view and set its class to that custom subclass. The idea is to receive its touch events, that's it.
In that view, a UIButton is placed covering the whole view.
Now the requirement is: when the user taps the button, its action should fire; but if he tries to rotate the view instead, it should spin.
Everything works fine if I set the button's user interaction to false, since the UIView then obviously becomes the responder. But this is not what I want. I want the button to trigger its action as usual, and if the user moves his finger, the view should rotate.
I achieved the same thing with UIRotationGestureRecognizer, but that is meant for a two-finger touch, and I need to achieve the same thing with a single finger.
Any help and suggestions will be appreciated.
This may sound like a weird question, but what exactly happens when a UIView becomes hidden? It would be great to see the UIView source code, but that isn't going to happen.
Here's why I'm wondering:
I'm trying to add a UIWindow (a transparent one with userInteractionEnabled set to NO) above my application to tint the screen. It works perfectly fine until the user tries to share by SMS using Apple's MessageUI.framework. When this happens and the MFMessageComposeViewController or MFMailComposeViewController appears, these view controllers won't receive user input. I've tried tons of things, and the only ones that worked, allowing the user to interact with the views, were setting the UIWindow (the one I added) to either an alpha of 0 or hidden to YES. I want to replicate this without hiding the view, which is why I want to know exactly what happens when the UIWindow (which is a subclass of UIView) is hidden.
There is usually only one window in iOS apps. You're better off using just a UIView for this task instead of a UIWindow. UIWindow adds some view hierarchy and event management capabilities to the UIView class. This functionality is interfering with the expected behavior in your app. I think it will just work if you change the class of this view to UIView instead of UIWindow.
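A hypothetical sketch of the tint as a plain UIView added on top of the existing key window, instead of a second UIWindow (`window` here is assumed to be your app's main window):

```objc
// Transparent tint overlay as a plain UIView. With user interaction
// disabled, touches pass straight through to the views beneath it.
UIView *tintView = [[UIView alloc] initWithFrame:window.bounds];
tintView.backgroundColor = [[UIColor blackColor] colorWithAlphaComponent:0.3];
tintView.userInteractionEnabled = NO;
tintView.autoresizingMask = UIViewAutoresizingFlexibleWidth |
                            UIViewAutoresizingFlexibleHeight;
[window addSubview:tintView];
```

Because the overlay is just a subview, it does not participate in window-level event routing, so modal view controllers like MFMessageComposeViewController should keep receiving input.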
I have a UITableView in a controller that is nested under a UITabBar.
The interaction is all wired up in Interface Builder so far, nothing done programmatically in terms of view switching.
I've added a UISearchDisplayController as the header of my UITableView. It displays fine, and when I tap on the text entry area, the cancel button appears and the black overlay flies in.
However, the keyboard never appears and when tapping the cancel button, the overlay flies out and the cancel button disappears, but the text entry area keeps focus and the caret stays flashing there, so I cannot tap there again to re-display the search results.
So essentially I have two problems:
Keyboard not appearing when starting to edit text on UISearchBar from UISearchDisplayController
UISearchBar not losing focus when the cancel button is tapped.
What am I doing wrong?
The .xib file that had my tab bar in it contained a UIWindow.
This led to all sorts of craziness, and in the end I gave up on trying to do this with Interface Builder and resorted to constructing the UITabBar in code, thereby not creating a second UIWindow.
This resolved the problems and the UISearchDisplayController behaved correctly.
Check this method in UISearchBarDelegate:
- (void)searchBarCancelButtonClicked:(UISearchBar *)searchBar;
See if this is getting called, and do the keyboard dismissal in there. If not, try making another UISearchDisplayController (I actually never use the default view controller's one). Also, make sure the delegate is correctly set.
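A minimal sketch of the delegate method, assuming the keyboard and caret issues both come from the search bar keeping first-responder status:

```objc
// Resign first responder when Cancel is tapped, so the keyboard is
// dismissed and the flashing caret leaves the text field.
- (void)searchBarCancelButtonClicked:(UISearchBar *)searchBar {
    [searchBar resignFirstResponder];
}
```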
I have the problem that in Notification Center widgets, touch events are not being registered. Let's say I have a simple widget with a view (_view) and a UIButton with target:self forEvent:touchDown. If I then press the button on my device, nothing happens. I need to hold it for a short period of time; then the "touch" (more like a hold) gets recognized and the action for the button starts. I've seen widgets where touch events work fine (UISettings, SBSettings 5), so what do I need to modify for mine to behave like a "normal" UIView?
I ended up figuring it out myself. I just added a UITapGestureRecognizer to the UIButton. The selector for the gesture gets called immediately when the screen is touched, without the annoying "delay" effect of the UI objects. I have used it with three controls so far: UIButton, UIBarButtonItem, and UISegmentedControl. For the segmented control, simply detect the x-coordinate of the touch and then select the relevant segment. It should also work with UISlider, UISwitch, etc. The only object that isn't working for me is UITextField: its clear button isn't responding to the clear selector, so I wanted to add a gesture for that too, without success.
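A hypothetical sketch of the workaround described above; `myButton` and `buttonTapped:` are placeholder names:

```objc
// Attach a tap recognizer directly to the button so the handler
// fires as soon as the screen is touched, bypassing the delayed
// touch delivery seen inside the widget.
UITapGestureRecognizer *tap =
    [[UITapGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(buttonTapped:)];
[myButton addGestureRecognizer:tap];
```

The `buttonTapped:` selector then receives the recognizer, from which the touch location can be read (useful for the segmented-control case, via `locationInView:`).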
I am implementing a preferences window, with a subclass of NSToolbarItem that has an IBOutlet to an NSView (the idea being that when an item is clicked, it will display its view). However, when I connect a toolbar item to an instance of the subclass, that item's image disappears and it is not clickable (although the text remains dark and does not fade).
If I disconnect the IBOutlet, everything works again (well, nothing does, since it isn't bound to the view, but you get the idea).
Connecting the view to the NSToolbarItem actually sets the view where the toolbar item's image normally is. This is useful in cases where you need a view in the toolbar (for example, the iTunes volume slider), but not in your case.
What you need to do is create an NSViewController for your view, and create an IBAction that shows the view. You should be able to connect the IBAction to the toolbar item (in Interface Builder), and everything should work as expected.
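A hypothetical sketch of that action, assuming the preferences window controller holds an NSViewController per pane (`generalPaneController` and the outlet names are placeholders):

```objc
// Wire the toolbar item's *action* (not an outlet) to a method that
// swaps the preferences window's content view for the selected pane.
- (IBAction)showGeneralPane:(id)sender {
    [self.window setContentView:self.generalPaneController.view];
}
```

Because the NSToolbarItem's view property is left untouched, the item keeps its image and stays clickable.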