I am trying to mimic the GLPaint app, and I have a TDPaintingView that implements -(void)touchesBegan... as well as touchesMoved, touchesEnded, and touchesCancelled. However, I am not seeing any of them fire.
I have a TDAppDelegate that is hooked up to a Window in IB and a TDPaintingView in IB. The initWithCoder method for my TDPaintingView definitely fires, so I know it is being initialized. Furthermore, I can see from logging the TDPaintingView object from the context of applicationDidFinishLaunching that it is definitely there. When I change the view in trivial ways in IB (making the background red, for example), the change is reflected in the simulator.
However, my view doesn't receive touches, and it's driving me crazy!
I set a breakpoint on -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event and it never stops, regardless of where I touch on the screen.
My TDPaintingView has no subviews, and it explicitly defines methods for those touch events.
Where am I going wrong?
Related
Why would anyone subclass UIView instead of creating their own custom view like this:
CGRect labelRect = CGRectMake(20, 20, 20, 20); // frame to contain the label
UILabel *label = [[UILabel alloc] initWithFrame:labelRect];
label.text = @"This is a custom view without subclassing UIView";
[self.view addSubview:label];
I don't see any tradeoffs or advantages. Am I missing something?
In a lot of cases where you are just arranging text and using the built-in buttons, labels, and so on, it is not necessary to subclass UIView; the layout can be done programmatically through a UIViewController or added to a storyboard/nib file. However, a great example of when and why you would want to subclass a UIView is when you want to add custom touch behavior or custom drawing to the view you are using.
So the way I see it, the main considerations in subclassing a UIView come down to the following:
1) Touch interaction,
2) Custom Drawing
*Note that there may be other reasons to create a view subclass purely from a program-organization perspective, but these are the main two where the behavior cannot be delegated to another object or added directly to the UIView. For example, support for general animations and background image manipulation can be handled without subclassing UIView.
Touch Interaction - Touch interaction with a UIView can be handled in a couple of different ways: you can interact directly with the touch events, or you can add a gesture recognizer to the UIView. In many cases a gesture recognizer (or a custom gesture recognizer) can accomplish everything you need without looking at the individual touches occurring on your view, but if you want or need access to the raw touch events, you would override the following methods to implement the desired behavior (a minimal sketch follows the list):
-(void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
-(void) touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
-(void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
-(void) touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
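For example, a bare-bones UIView subclass that handles raw touches might look like the sketch below. This is only illustrative; TouchTrackingView and the logging are made up for the example.
#import <UIKit/UIKit.h>
@interface TouchTrackingView : UIView
@end
@implementation TouchTrackingView
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Record where the finger went down, in this view's coordinate space
    CGPoint point = [[touches anyObject] locationInView:self];
    NSLog(@"Touch began at %@", NSStringFromCGPoint(point));
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self];
    NSLog(@"Touch moved to %@", NSStringFromCGPoint(point));
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"Touch ended");
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    // The system cancelled the touch (e.g. an incoming call); clean up here
    NSLog(@"Touch cancelled");
}
@end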
Custom Drawing - In a UIView you can do basic 2D drawing through Quartz/Core Graphics. For example, if you were writing a graphing application, you might want your UIView to draw the actual lines within its own graphics context on the screen. To accomplish this, you would override the drawRect:(CGRect)rect method and place your custom drawing there.
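As a rough sketch (the coordinates and colors here are arbitrary), such an override might look like this:
- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetStrokeColorWithColor(context, [UIColor blueColor].CGColor);
    CGContextSetLineWidth(context, 2.0);
    // Draw a single line segment; a graphing view would loop over its data points
    CGContextMoveToPoint(context, 10.0, 10.0);
    CGContextAddLineToPoint(context, 100.0, 80.0);
    CGContextStrokePath(context);
}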
That being said, this is by no means the only way you could draw a line; you could complete the task without any custom drawing. For example, the same line, or set of lines, could be drawn into an image buffer by your UIViewController and then displayed as the background of your view or of a subview. When you do that, however, the UIView loses any knowledge of the line on screen, and touch interaction directly with that line becomes more challenging. In the end, keep in mind that there are creative ways to avoid subclassing just to display custom drawing, but sometimes, from a program-design perspective, it makes more sense to create a UIView subclass.
These are by no means the only reasons you might subclass a UIView, but they are the two biggest reasons I have seen for creating a UIView subclass. I would suggest looking at the UIView class reference or the View Programming Guide for iOS for more details.
For the same reason UIView is subclassed for the controls and text-layout views in UIKit: because the existing implementations don't do everything, and a custom implementation or custom members are needed (at times).
However, you might subclass UILabel in this case, if you needed to do something specific, or if reusing an implementation through subclassing were a good solution.
I would say custom behaviors and custom layouts. Handling resizing beyond what the autoresizing mask provides, or smart-loading subviews, is more difficult to do from an outside controller.
I've created a custom image view that has a scroll view to see a long list of images in a three column display. It internally manages the images, so it only loads views for images that are seen on the screen.
Could you imagine having to build a UITableView by hand, compositing the scroll view with cells, managing the queue of cell views, smart-loading the cells only when they are about to appear, and handling all the tap, swipe, and pan gestures, every time you needed a screen with a tabular look?
I have a custom UIView to which I have added a UITapGestureRecognizer in order to detect taps (i.e. finger press and release) on the view. This works fine.
However, I now want to change the appearance of the UIView (making it a bit darker) while the finger is pressed on the view (just like the behavior of a UIButton).
How do I do this?
Change the appearance in -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event, and change it back in touchesEnded: and touchesCancelled:.
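A minimal sketch, assuming you are inside your custom UIView subclass and that dimming the alpha is an acceptable way to darken the view (you could equally swap the background color):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    self.alpha = 0.6;   // darken while the finger is down
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesEnded:touches withEvent:event];
    self.alpha = 1.0;   // restore on release
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesCancelled:touches withEvent:event];
    self.alpha = 1.0;   // restore if the touches are cancelled
}
The touchesCancelled: override matters here, because the tap recognizer can cancel the view's touches when it recognizes.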
In an iPad app, wherever a user is touching the screen I want to display an image, highlighting the points they are touching. The app contains a number of nested views, all of which should receive touches and behave normally.
Seems simple, but I've failed to find a good way to do it. Using touchesBegan:withEvent: and the related methods on the root view controller does not work, because if a subview is touched the events are not fired. I've also created a 'dummy' gesture recognizer which just passes touch events to another class that draws the images. That works reasonably well and buttons still work, but it breaks UIScrollViews and, I'm guessing, other subviews with gesture recognizers.
Is there nowhere you can just access all touch events without affecting where those touches are headed?
Thanks.
Your dummy gesture recognizer should be fine. Just watch out for how you set its states: possible -> began -> ...
Basically your gesture recognizer forwards all touches, so it can sit in the Began or Possible state the whole time any touch exists.
To avoid problems with other gesture recognizers, return YES from this delegate method:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    return YES;
}
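A rough sketch of such a pass-through recognizer follows; TouchForwardingGestureRecognizer is a made-up name, and the actual forwarding target is left as a comment:
#import <UIKit/UIGestureRecognizerSubclass.h>
@interface TouchForwardingGestureRecognizer : UIGestureRecognizer
@end
@implementation TouchForwardingGestureRecognizer
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    if (self.state == UIGestureRecognizerStatePossible) {
        self.state = UIGestureRecognizerStateBegan;
    }
    // hand the touches to whatever object draws the highlight images
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesMoved:touches withEvent:event];
    self.state = UIGestureRecognizerStateChanged;
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesEnded:touches withEvent:event];
    self.state = UIGestureRecognizerStateEnded;
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesCancelled:touches withEvent:event];
    self.state = UIGestureRecognizerStateCancelled;
}
@end
Also set cancelsTouchesInView = NO and delaysTouchesBegan = NO on the recognizer so it never withholds or cancels touches destined for the views underneath it.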
Another option is to subclass the main UIWindow in your app and override this method:
- (void)sendEvent:(UIEvent *)event
Here you should have access to all events. It's quite easy to filter them.
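A minimal sketch of that approach, where EventSpyWindow is a hypothetical name you would use in place of the app's normal UIWindow:
@interface EventSpyWindow : UIWindow
@end
@implementation EventSpyWindow
- (void)sendEvent:(UIEvent *)event {
    if (event.type == UIEventTypeTouches) {
        for (UITouch *touch in [event allTouches]) {
            // inspect touch.phase and [touch locationInView:self] here
            // to drive the highlight images, without changing delivery
        }
    }
    [super sendEvent:event];   // always forward, so normal dispatch is unaffected
}
@end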
You can apply a UITapGestureRecognizer to the entire view and set the cancelsTouchesInView property to NO. This will allow you to be notified of all taps on the view and its subviews without intercepting all of the touch events.
Additionally, you can implement the -gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer: delegate method to keep this gesture recognizer from stomping on the ones used by views like UIScrollView.
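A short sketch of that setup, assuming it lives in your root view controller and the controller conforms to UIGestureRecognizerDelegate; handleTap: is just an illustrative name:
- (void)viewDidLoad {
    [super viewDidLoad];
    UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)];
    tap.cancelsTouchesInView = NO;   // let the tap continue on to whatever subview it hit
    tap.delegate = self;
    [self.view addGestureRecognizer:tap];
}
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    return YES;   // don't fight UIScrollView's own recognizers
}
- (void)handleTap:(UITapGestureRecognizer *)recognizer {
    CGPoint point = [recognizer locationInView:self.view];
    // show the highlight image at `point`
}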
You can try overriding hitTest:withEvent:
-(UIView*)hitTest:(CGPoint)point withEvent:(UIEvent *)event
This may be what you are looking for.
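For example, a container view could observe (but not alter) touch delivery roughly like this; calling through to super keeps hit-testing unchanged. Keep in mind that hitTest:withEvent: can be called more than once per touch:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *hitView = [super hitTest:point withEvent:event];
    if (hitView != nil) {
        NSLog(@"Touch at %@ will go to %@", NSStringFromCGPoint(point), hitView);
    }
    return hitView;   // returning super's result leaves delivery untouched
}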
I'm getting exactly the behavior I want in a UIScrollView: when I return NO for touchesShouldBegin, the scrolling behavior happens. Otherwise, the content views get the event.
However, I'd like to show something on touch down and touch up when the scrolling behavior is occurring. Unfortunately, returning NO for touchesShouldBegin blocks the touchesBegan and touchesEnded methods.
The delegate method:
- (void)scrollViewWillBeginDragging:(UIScrollView *)scrollView
doesn't work because sometimes the user touches but doesn't drag. How can I register touchUp and touchDown events yet preserve scrolling behavior?
You cannot. While a UIScrollView is scrolling, the subview(s) (and the view itself) are not updated or redrawn; it only scrolls. That is just how the UIResponder chain works in this scenario.
Is it possible to control when the UITableView scrolls in my own code?
I am trying to get behaviour where a vertical swipe scrolls and a horizontal swipe gets passed through to my code (of which there are many examples).
BUT
I want a DIAGONAL swipe to do nothing, i.e. the UITableView should not even begin scrolling.
I tried catching it in here
- (void)scrollViewWillBeginDragging:(UIScrollView *)scrollView
but the scrollView.contentOffset.x is always 0 so I cannot detect a horizontal movement.
I also tried subclassing UITableView and implementing
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
etc..
but the UITableView (and I guess its parent UIScrollView) starts to scroll before the touches are delivered?
To reiterate: I want the UITableView's scrolling to be locked if a diagonal swipe is made, but vertical swipes should scroll normally.
(This behaviour can be seen in Tweetie(Twitter) for the iPhone)
Thanks for any help!
If you can work with 3.2 or later, the UIGestureRecognizer suite should allow this. There is a series of calls that let some gestures cancel or interoperate with other gestures, and you should be able to create a custom diagonal-swipe gesture that cancels other gestures but does not actually do anything itself.
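One possible shape for such a recognizer is sketched below. DiagonalSwipeBlocker and the 10-point threshold are assumptions, and whether it reliably beats the scroll view's own pan handling to recognition is something you would have to test:
#import <UIKit/UIGestureRecognizerSubclass.h>
#include <math.h>
@interface DiagonalSwipeBlocker : UIGestureRecognizer
@property (nonatomic) CGPoint startPoint;
@end
@implementation DiagonalSwipeBlocker
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    self.startPoint = [[touches anyObject] locationInView:self.view];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesMoved:touches withEvent:event];
    CGPoint p = [[touches anyObject] locationInView:self.view];
    CGFloat dx = fabs(p.x - self.startPoint.x);
    CGFloat dy = fabs(p.y - self.startPoint.y);
    if (dx > 10.0 && dy > 10.0) {
        // clearly diagonal: recognize, which by default cancels competing
        // recognizers such as the table view's scrolling gesture
        self.state = UIGestureRecognizerStateBegan;
    } else if (dx <= 10.0 && dy > 10.0) {
        // clearly vertical: fail, so the table view scrolls normally
        self.state = UIGestureRecognizerStateFailed;
    }
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesEnded:touches withEvent:event];
    if (self.state == UIGestureRecognizerStateBegan || self.state == UIGestureRecognizerStateChanged) {
        self.state = UIGestureRecognizerStateEnded;
    } else {
        self.state = UIGestureRecognizerStateFailed;
    }
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesCancelled:touches withEvent:event];
    self.state = (self.state == UIGestureRecognizerStatePossible) ? UIGestureRecognizerStateFailed : UIGestureRecognizerStateCancelled;
}
@end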
Prior to 3.2 the UIScrollView gesture handling is essentially undocumented. You can detect taps but not movements through the standard touchesBegan UIResponder calls. I think that once it detects movement it commandeers the run loop and captures all events, bypassing the normal event chain.
Note that setContentOffset: is always called. You could create a subclass of UIScrollView and, when you detect a diagonal shift during event tracking, not pass the offset change on to super. I don't know if, or how well, this would work, but it is somewhere to start.
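A speculative sketch of that idea; LockableScrollView and the blockDiagonalScrolling flag are hypothetical names, and your own touch-tracking code would be responsible for setting the flag when it detects a diagonal movement:
@interface LockableScrollView : UIScrollView
@property (nonatomic) BOOL blockDiagonalScrolling;
@end
@implementation LockableScrollView
- (void)setContentOffset:(CGPoint)contentOffset {
    if (self.blockDiagonalScrolling) {
        return;   // swallow the scroll instead of passing it up to super
    }
    [super setContentOffset:contentOffset];
}
@end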