UIWebView won't scroll with stacked views, subviews - cocoa-touch

I'm working on an app that builds a lot of stacked views dynamically, and I initially had the event handling problems described here. That post solved most of my problems, but one issue remains. When a user taps the summary of an RSS feed item, I create an almost-full-screen web view with just a single "close" button. The close button works, but the web view refuses to scroll. I use a variation on the code described in the link above. It's meant to work with "screens" and "gadgets" on those screens, where the screens are always 1024x768 and never need to deal with events, but other views and buttons on those screens do. A "screen" has a base template that never changes and another "screen" on top that does change. The code that handles the "gadget" event dispatching is:
@implementation ContainerView

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    for (NSInteger i = self.subviews.count - 1; i >= 0; i--) {
        UIView *view = [self.subviews objectAtIndex:i];
        BOOL isContainerView = [view isMemberOfClass:[ContainerView class]];
        if (isContainerView) {
            return [super hitTest:point withEvent:event];
        }
        else {
            CGPoint viewPoint = [view convertPoint:point fromView:self];
            if ([view pointInside:viewPoint withEvent:event]) {
                return view;
            }
        }
    }
    return nil;
}

@end
This correctly dispatches events to "Btn 1" and "Btn 2" (see diagram). I also see that this same code returns my UIWebView when I try to scroll it, but it still doesn't scroll.
The following is a diagram of how the views stack up:
diagram
Maybe UIWebView scrolling is handled in some special way that circumvents the hitTest hacking?

I sent a support email to Apple about this, and all they could really say was "UIWebView does a lot of custom touch handling that makes it elusive to catch with hitTest". So, if you have this situation and want to use this hitTest override solution, don't expect it to work for UIWebView. In my case, I'm probably going to have to make some design changes so I don't stack my views this way.
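If you want to keep experimenting before restructuring, one variation that might be worth trying (a sketch only, I haven't verified it against UIWebView's internals) is to return the result of the subview's own hitTest: instead of the subview itself, so that deeply nested views such as the web view's internal scroll view get a chance to claim the touch:

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    for (NSInteger i = self.subviews.count - 1; i >= 0; i--) {
        UIView *view = [self.subviews objectAtIndex:i];
        if ([view isMemberOfClass:[ContainerView class]]) {
            return [super hitTest:point withEvent:event];
        }
        CGPoint viewPoint = [view convertPoint:point fromView:self];
        // Recurse into the subview instead of returning it directly, so the
        // deepest descendant under the touch (e.g. the web view's scroller) wins.
        UIView *hit = [view hitTest:viewPoint withEvent:event];
        if (hit) {
            return hit;
        }
    }
    return nil;
}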

Related

How to use touchesBegan from one UIView in another UIViewController

I have made a graph with data in a UIView called HeartrateGraph. In a UIViewController named HRGraphInfo, I have a connected label that should output values when the graph is touched. The problem is, I don't know how to send a touched event using delegates from the UIView to the UIViewController.
Here is my touch assignment code in the UIView:
UITouch *touch = [touches anyObject];
CGPoint point = [touch locationInView:self];
for (int i = 0; i < kNumberOfPoints; i++)
{
    if (CGRectContainsPoint(touchAreas[i], point))
    {
        graphInfoRF.heartRateGraphString = [NSString stringWithFormat:@"Heart Rate reading #%d at %@ bpm", i + 1, dataArray[i]];
        graphInfoRF.touched = YES;
        break;
    }
}
This segment of code is in a touchesBegan and properly stores the data value and number in the object graphInfoRF (I just did not show the declarations of dataArray, kNumberOfPoints, etc).
I am able to access graphInfoRF in the UIViewController using:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (graphInfoRF.touched == YES) {
        self.heartRateLabel.text = graphInfoRF.heartRateGraphString;
    }
    else {
        self.heartRateLabel.text = @"No data got over to this file";
    }
}
The label will show the correct string, but only after the data point on the graph is touched AND the label is touched right after. How do I change the touchesBegan so that once I touch the data point on the graph it will fill the label with the data automatically without the need for a second and separate touch on the label?
Every view controller comes with a single view that it manages once it's initialized. You should be familiar with this view: you see it whenever you use a view controller in Interface Builder, and you can access it via self.view if you're working in a subclass.
Since a view controller owns that view, it also receives touch events for it. Implementing touchesBegan in the view controller will receive events for that view and, normally, for any subviews it manages. But since you've written your own implementation of touchesBegan in HeartRateGraph, and HeartRateGraph is a subview of the view controller's main view, HeartRateGraph receives and handles the touch event before the view controller ever gets a chance to (think of events bubbling up the responder chain).
So what's happening is that the code changing the label in the view controller only runs when the label is touched, because the label is a subview of the view controller's main view and has no touch handling of its own; the view controller only gets to handle the event the way you want when you tap somewhere outside the graph. There are two ways to solve this.
Either pass the event up to your superview:
[self.superview touchesBegan:touches withEvent:event];
or use the proper, recommended approach:
protocols and delegates, where your view makes a delegate call to its view controller letting it know the graph has been touched and the view controller needs to update its contents.
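For example, a minimal sketch of that delegate approach (the protocol name, method name, and property are assumptions, not code from the question):

// Hypothetical delegate protocol declared alongside the graph view.
@protocol HeartrateGraphDelegate <NSObject>
- (void)heartrateGraph:(UIView *)graph didSelectReading:(NSString *)readingDescription;
@end

@interface HeartrateGraph : UIView
@property (nonatomic, weak) id<HeartrateGraphDelegate> delegate;
@end

// In the graph's touchesBegan:withEvent:, instead of setting flags on graphInfoRF:
//   [self.delegate heartrateGraph:self didSelectReading:
//       [NSString stringWithFormat:@"Heart Rate reading #%d at %@ bpm", i + 1, dataArray[i]]];

// In HRGraphInfo (which adopts HeartrateGraphDelegate and sets graph.delegate = self):
- (void)heartrateGraph:(UIView *)graph didSelectReading:(NSString *)readingDescription
{
    self.heartRateLabel.text = readingDescription; // label updates on the first touch
}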

App crashes after the table bounces to the top [duplicate]

Here's how the scroll views work: One scroll view is paging enabled in the horizontal direction. Each 'page' of this scroll view contains a vertically scrolling UITableView. Without modification, this works OK, but not perfectly.
The behaviour that's not right: When the user scrolls up and down on the table view, but then wants to flick over to the next page quickly, the horizontal flick/swipe will not work initially - it will not work until the table view is stationary (even if the swipe is very clearly horizontal).
How it should work: If the swipe is clearly horizontal, I'd like the page to change even if the table view is still scrolling/bouncing, as this is what the user will expect too.
How can I change this behaviour - what's the easiest or best way?
NOTE For various reasons, a UIPageViewController as stated in some answers will not work. How can I do this with cross-directional UIScrollViews (one is a table view, but you get the idea)? I've been banging my head against a wall for hours - if you think you can do this then I'll more than happily award a bounty.
According to my understanding of the question, it is only while the tableView is scrolling that we want to change the default behaviour; everything else stays the same.
Subclass UITableView (UITableView is itself a subclass of UIScrollView). A scroll view acts as the delegate of its own pan gesture recognizer, so in the UITableView subclass you can implement this UIGestureRecognizerDelegate method:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    // Edit 1
    // return self.isDecelerating;
    // return self.isDecelerating || self.bounces; // if we want simultaneous gestures while bouncing as well as scrolling

    // Edit 2
    // @Jordan edited - we don't need to always enable simultaneous gestures for bounce-enabled tableViews
    return self.isDecelerating
        || self.contentOffset.y < 0
        || self.contentOffset.y > MAX(0, self.contentSize.height - self.bounds.size.height);
}
We only want to change the default gesture behaviour while the tableView is decelerating (or bouncing).
Now change the class of all your UITableViews to your new subclass and run the project; the swipe should work while the tableView is scrolling. :]
But the swipe feels a little too sensitive while the tableView is scrolling, so let's make it a bit more restrictive.
Subclass UIScrollView. In the UIScrollView subclass, implement another UIGestureRecognizerDelegate method, gestureRecognizerShouldBegin:
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer
{
    if ([gestureRecognizer isKindOfClass:[UIPanGestureRecognizer class]]) {
        CGPoint velocity = [(UIPanGestureRecognizer *)gestureRecognizer velocityInView:self];
        // fabs() rather than abs(): the velocity components are CGFloats, and abs() truncates to int.
        if (fabs(velocity.y) * 2 < fabs(velocity.x)) {
            return YES;
        }
    }
    return NO;
}
We want the swipe to be "clearly horizontal". The code above only lets the gesture begin if the velocity on the x axis is at least double that on the y axis. [Feel free to increase the hard-coded value "2" if you like; the higher the value, the more horizontal the swipe has to be.]
Now change the class of the UIScrollView (the one containing the multiple table views) to your UIScrollView subclass and run the project. :]
I've made a project on gitHub https://github.com/rishi420/SwipeWhileScroll
Although Apple doesn't like this method too much:
Important: You should not embed UIWebView or UITableView objects in UIScrollView objects. If you do so, unexpected behavior can result because touch events for the two objects can be mixed up and wrongly handled.
I've found a great way to accomplish this.
This is a complete solution to the problem. In order to scroll the UIScrollView while your UITableView is still scrolling, you need to disable the table view's user interaction while it decelerates.
- (void)viewDidLoad
{
    [super viewDidLoad];
    _myScrollView.contentSize = CGSizeMake(2000, 0);
    data = [[NSMutableArray alloc] init];
    for (int i = 0; i < 30; i++)
    {
        [data addObject:[NSString stringWithFormat:@"%d", i]];
    }
    UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)];
    [self.view addGestureRecognizer:tap];
}

- (void)handleTap:(UITapGestureRecognizer *)recognizer
{
    // Re-setting the current offset stops the table view's deceleration immediately.
    [_myTableView setContentOffset:_myTableView.contentOffset animated:NO];
}

// UIScrollViewDelegate callbacks for the table view:
- (void)scrollViewWillBeginDecelerating:(UIScrollView *)scrollView
{
    scrollView.userInteractionEnabled = NO;
}

- (void)scrollViewDidEndDecelerating:(UIScrollView *)scrollView
{
    scrollView.userInteractionEnabled = YES;
}
To sum up the code above: while the UITableView is decelerating, its userInteractionEnabled is set to NO so the UIScrollView can detect the horizontal swipe. If the user taps the screen while the table is still scrolling, the tap handler stops the deceleration, which re-enables user interaction.
Instead of using UIScrollView as a container for these multiple table views, try using a UIPageViewController.
You can even integrate this into your existing view controller setup as a child view controller (directly replacing the UIScrollView).
In addition, you'll likely want to implement the required methods from UIPageViewControllerDataSource and possibly one or more of the methods from UIPageViewControllerDelegate.
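A rough sketch of that setup, assuming a firstTableViewController to start on (the names are placeholders, not from the original answer):

// In the containing view controller, replacing the UIScrollView:
UIPageViewController *pager = [[UIPageViewController alloc]
        initWithTransitionStyle:UIPageViewControllerTransitionStyleScroll
          navigationOrientation:UIPageViewControllerNavigationOrientationHorizontal
                        options:nil];
pager.dataSource = self; // supplies the view controllers before/after the current page
[pager setViewControllers:@[firstTableViewController]
                direction:UIPageViewControllerNavigationDirectionForward
                 animated:NO
               completion:nil];

// Embed it as a child view controller.
[self addChildViewController:pager];
pager.view.frame = self.view.bounds;
[self.view addSubview:pager.view];
[pager didMoveToParentViewController:self];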
Did you try the directionalLockEnabled property on both your table view and your scroll view?
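For reference, a minimal sketch of that suggestion (note that directionalLockEnabled is a BOOL that keeps each drag locked to a single axis, not a per-axis setting; the view names are placeholders):

self.myTableView.directionalLockEnabled = YES;        // vertical drags stay vertical
self.myPagingScrollView.directionalLockEnabled = YES; // horizontal paging drags stay horizontal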
Edit:
1) What you want to do is complicated, because the system waits a short time (around 0.1 s) after a touch begins to decide what kind of movement it is, and if your table is already moving it grabs the touch immediately, whatever it is (because the touch is expected to stop the scrolling). I don't see any solution other than overriding the touch handling from scratch so you can detect the kind of movement you want right away (for example, whether it is going to be horizontal), but doing that well will be very hard.
2) Another option is to give your table left and right margins where the parent (paging) scroll view is exposed. Then, even while the table is scrolling, a touch in those margins reaches only the paging scroll view. It's simpler, but it might not fit your design.
Use UIPageViewController and, in -viewDidLoad (or any other method that suits your needs or design), grab UIPageViewController's internal UIScrollView subview and assign a delegate to it. Keep in mind that its delegate property won't be nil, so you may want to store the original delegate in another reference before assigning your own object, which conforms to UIScrollViewDelegate. For example:
id<UIScrollViewDelegate> originalPageScrollViewDelegate =
    ((UIScrollView *)[pageViewController.view.subviews objectAtIndex:0]).delegate;
[((UIScrollView *)[pageViewController.view.subviews objectAtIndex:0]) setDelegate:self];
That way you can implement UIScrollViewDelegate methods with ease, and the UIPageViewController's scroll view will call your delegate's -scrollViewDidScroll: method.
Note that you may need to keep the original delegate around and forward the delegate methods to it. You can see an example implementation in the ViewPagerController class of my UI control project here.
I faced the same thing recently. My UIScrollView was in paging mode and every page contained a UITableView, and, as you describe, it worked but not the way you'd expect. This is how I solved it.
First I disabled scrolling on the UIScrollView.
Then I added UISwipeGestureRecognizers to the actual UITableView for left and right swipes.
The action for those swipes was:
[scroll setContentOffset:CGPointMake(currentPointX + 320, PointY) animated:YES];
//Or
[scroll setContentOffset:CGPointMake(currentPointX - 320 , PointY) animated:YES];
This works flawlessly; the only downside is that a horizontal drag on the UITableView is treated as a swipe, so the user can't see half of page A and half of page B on screen at the same time.
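A sketch of how that might be wired up (scroll and the 320-point page width come from the snippet above; the recognizer and handler names are assumptions):

// In viewDidLoad, after disabling the scroll view's own scrolling:
UISwipeGestureRecognizer *swipeLeft = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(didSwipeLeft:)];
swipeLeft.direction = UISwipeGestureRecognizerDirectionLeft;
[myTable addGestureRecognizer:swipeLeft];

UISwipeGestureRecognizer *swipeRight = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(didSwipeRight:)];
swipeRight.direction = UISwipeGestureRecognizerDirectionRight;
[myTable addGestureRecognizer:swipeRight];

// Swipe left -> next page; swipe right -> previous page.
- (void)didSwipeLeft:(UISwipeGestureRecognizer *)recognizer
{
    [scroll setContentOffset:CGPointMake(scroll.contentOffset.x + 320, scroll.contentOffset.y) animated:YES];
}

- (void)didSwipeRight:(UISwipeGestureRecognizer *)recognizer
{
    [scroll setContentOffset:CGPointMake(scroll.contentOffset.x - 320, scroll.contentOffset.y) animated:YES];
}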
You could subclass your scroll view and your table views, and add this gesture recognizer delegate method to each of them...
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    return YES;
}
I can't be sure this is exactly what you are after, but it may come close.

UIPageViewController Traps All UITapGestureRecognizer Events

It's been a long day at the keyboard so I'm reaching out :-)
I have a UIPageViewController in a typical implementation that basically follows Apple's standard template. I am trying to add an overlay that will allow the user to do things like touch a button to jump to certain pages or dismiss the view controller to go to another part of the app.
My problem is that the UIPageViewController is trapping all events from my overlay subview and I am struggling to find a workable solution.
Here's some code to help the example...
In viewDidLoad
// Page creation, pageViewController creation etc....

self.pageViewController.delegate = self;
[self.pageViewController setViewControllers:pagesArray
                                  direction:UIPageViewControllerNavigationDirectionForward
                                   animated:NO
                                 completion:NULL];
self.pageViewController.dataSource = self;

[self addChildViewController:self.pageViewController];
[self.view addSubview:self.pageViewController.view];

// self.overlay being the overlay view
if (!self.overlay)
{
    self.overlay = [[MyOverlayClass alloc] init]; // Gets frame etc. from class init
    [self.view addSubview:self.overlay];
}
This all works great. The overlay gets created, it gets shown over the top of the pages of the UIPageViewController as you would expect. When pages flip, they flip underneath the overlay - again just as you would expect.
However, the UIButtons within the self.overlay view never get the tap events. The UIPageViewController responds to all events.
I have tried overriding -(BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch per the suggestions here without success.
UIPageViewController Gesture recognizers
I have tried manually trapping all events and handling them myself - doesn't work (and to be honest even if it did it would seem like a bit of a hack).
Does anyone have a suggestion on how to trap the events or maybe a better approach to using an overlay over the top of the UIPageViewController.
Any and all help very much appreciated!!
Try iterating through the UIPageViewController's gestureRecognizers, assigning self as the delegate for each of them, and implementing:
-(BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch;
Your code may be like this:
In viewDidLoad
for (UIGestureRecognizer *gesRecog in self.pageViewController.gestureRecognizers)
{
    gesRecog.delegate = self;
}
And add the following method:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch
{
    if (touch.view != self.pageViewController.view)
    {
        return NO;
    }
    return YES;
}
The documented way to prevent the UIPageViewController from scrolling is to not assign the dataSource property. If you assign the data source it will move into 'gesture-based' navigation mode which is what you're trying to prevent.
Without a data source, you provide view controllers manually whenever you want with the setViewControllers:direction:animated:completion: method, and the page view controller moves between them on demand.
The above can be deduced from Apple's documentation of UIPageViewController (Overview, second paragraph):
To support gesture-based navigation, you must provide your view controllers using a data source object.
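A sketch of that manual approach (pageForIndex: is a hypothetical helper that returns the content view controller for a page; it is not part of UIPageViewController's API):

// No dataSource is assigned, so the built-in swipe navigation never activates.
[self.pageViewController setViewControllers:@[[self pageForIndex:0]]
                                  direction:UIPageViewControllerNavigationDirectionForward
                                   animated:NO
                                 completion:nil];

// Called from your own buttons (e.g. the overlay) to move between pages on demand.
- (void)jumpToPageAtIndex:(NSInteger)index direction:(UIPageViewControllerNavigationDirection)direction
{
    [self.pageViewController setViewControllers:@[[self pageForIndex:index]]
                                      direction:direction
                                       animated:YES
                                     completion:nil];
}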

Detect horizontal panning in UITableView

I'm using a UIPanGestureRecognizer to recognize horizontal sliding in a UITableView (on a cell, to be precise, though the recognizer is added to the table itself). However, this gesture recognizer obviously steals the touches from the table. I already have the pan gesture recognizer detecting horizontal sliding and snapping to it, but if the user starts by sliding vertically, it should pass all events from that touch to the tableview.
One thing I tried was disabling the recognizer, but then the table wouldn't scroll until the next touch event, so I'd need it to pass the event along right away.
Another thing I tried was driving the scrolling myself, but then you lose the momentum after the finger lifts.
Here's some code:
// In the viewDidLoad method
UIPanGestureRecognizer *slideRecognizer = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(sliding:)];
[myTable addGestureRecognizer:slideRecognizer];
- (void)sliding:(UIPanGestureRecognizer *)recognizer
{
    if (recognizer.state == UIGestureRecognizerStateBegan)
    {
        CGPoint translation = [recognizer translationInView:myTable];
        if (sqrt(translation.x * translation.x) / sqrt(translation.y * translation.y) > 1) {
            horizontalScrolling = YES; // BOOL declared in the header file
            NSLog(@"horizontal");
            // And some code to determine what cell is being scrolled:
            CGPoint slideLocation = [recognizer locationInView:myTable];
            slidingCell = [myTable indexPathForRowAtPoint:slideLocation];
            if (slidingCell.row == 0) {
                slidingCell = nil;
            }
        }
        else
        {
            NSLog(@"cancel");
        }
    }
    if (recognizer.state == UIGestureRecognizerStateEnded || recognizer.state == UIGestureRecognizerStateCancelled)
    {
        horizontalScrolling = NO;
    }
    if (horizontalScrolling)
    {
        // Perform some code
    }
    else
    {
        // Maybe pass the touch on from here; it's panning vertically
    }
}
So, any advice on how to pass the touches?
Addition: I also thought about subclassing the table view's own gesture handling to check first whether the movement is horizontal; however, I suppose I'd then need the original implementation, and I have no idea whether Apple would have a problem with that.
Also: I didn't subclass the UITableView(Controller), just the cells. This code lives in the view controller that holds the table ;)
I had the same issue and came up with a solution that works with the UIPanGestureRecognizer.
In contrast to Erik, I've added the UIPanGestureRecognizer to the cell directly, as I only need one particular cell at a time to support the pan. But I guess this should work for Erik's case as well.
Here's the code.
- (BOOL)gestureRecognizerShouldBegin:(UIPanGestureRecognizer *)gestureRecognizer
{
    UIView *cell = [gestureRecognizer view];
    CGPoint translation = [gestureRecognizer translationInView:[cell superview]];

    // Check for horizontal gesture
    if (fabsf(translation.x) > fabsf(translation.y))
    {
        return YES;
    }
    return NO;
}
The calculation for the horizontal gesture is copied from Erik's code. I've tested this with iOS 4.3.
Edit:
I've found out that this implementation prevents the "swipe-to-delete" gesture. To regain that behaviour, I added a check for the velocity of the gesture to the if-statement above.
if ([gestureRecognizer velocityInView:cell].x < 600 && sqrt(translate...
After playing a bit on my device I came up with a velocity of 500 to 600 which offers in my opinion the best user experience for the transition between the pan and the swipe-to-delete gesture.
My answer is the same as Florian Mielke's, but I've simplified and corrected it some.
How to use:
Simply give your UIPanGestureRecognizer a delegate (UIGestureRecognizerDelegate). For example:
UIPanGestureRecognizer *panner = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(panDetected:)];
panner.delegate = self;
[self addGestureRecognizer:panner];
Then have that delegate implement the following method:
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer {
    CGPoint translation = [(UIPanGestureRecognizer *)gestureRecognizer translationInView:gestureRecognizer.view.superview];
    return fabsf(translation.x) > fabsf(translation.y);
}
Maybe you can use the UISwipeGestureRecognizer instead? You can tell it to ignore up/down swipes via the direction property.
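Something like this, as a sketch (the handler name is a placeholder):

UISwipeGestureRecognizer *swipe = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(cellSwiped:)];
// Only left and right swipes are recognized; vertical drags fall through to the table view.
swipe.direction = UISwipeGestureRecognizerDirectionLeft | UISwipeGestureRecognizerDirectionRight;
[myTable addGestureRecognizer:swipe];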
You may try using the touch events manually instead of the gesture recognizers. Always passing the event back to the tableview except when you finally recognize the swipe gesture.
Every class that inherits from UIResponder has the four touch methods (touchesBegan, touchesMoved, touchesEnded, touchesCancelled). So the simplest way to "forward" a call is to handle it in your class and then call the method explicitly on the next object you want to handle it (but make sure to check that the object responds to the message first with respondsToSelector:). This way you can detect whatever events you want and still allow normal touch interaction with whatever other elements need it.
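A rough sketch of that idea in a cell subclass (the horizontal test and the choice to forward to the next responder are assumptions, not a tested recipe):

// Hypothetical override in the cell: handle horizontal drags ourselves,
// forward everything else up the responder chain (toward the table view).
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint current = [touch locationInView:self];
    CGPoint previous = [touch previousLocationInView:self];

    if (fabs(current.x - previous.x) > fabs(current.y - previous.y)) {
        // Handle the horizontal pan here.
    } else if ([self.nextResponder respondsToSelector:@selector(touchesMoved:withEvent:)]) {
        [self.nextResponder touchesMoved:touches withEvent:event];
    }
}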
Thanks for the tips! I eventually went for a UITableView subclass where I check whether the movement is horizontal (in which case I use my custom behaviour) and otherwise call [super touchesMoved:withEvent:].
However, I still don't really get why this works. I checked, and super is a UITableView. It seems I still don't fully understand how this hierarchy works. Can someone try to explain?
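For illustration, a sketch of what that subclass might look like (the class name and the horizontal test are assumptions based on the description above):

@interface HorizontalAwareTableView : UITableView
@end

@implementation HorizontalAwareTableView

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint current = [touch locationInView:self];
    CGPoint previous = [touch previousLocationInView:self];

    if (fabs(current.x - previous.x) > fabs(current.y - previous.y)) {
        // Custom horizontal behaviour goes here.
    } else {
        // Otherwise let UITableView's own machinery handle the vertical scrolling.
        [super touchesMoved:touches withEvent:event];
    }
}

@end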

Sending an event to an NSControl underneath another NSControl if it does not handle it

I have multiple superposed controls which can handle a mouse click under certain conditions. What I want to be able to do is:
1. The top control receives the mouseDown: event.
2. The top control decides if it handles the mouseDown: event.
3. If it does, do something and prevent other controls from receiving the mouseDown: event.
4. If it does not, send the event to the control that's underneath.
5. That control decides if it handles the event.
6. And so on.
In essence I'm trying to send the event to the control whose "Z-Order" is just below the top control, without the top control needing to know about the other controls or needing some special setup at instantiation.
The first thing that came to my mind was to send the event to [topControl nextResponder] but it seems the nextResponder for all controls on the window is the window itself and not a chain of controls based on their Z-Order as I previously thought.
Is there a way to do this without resorting to setting the next responder manually? The goal is to get a control which is independent from the other controls and can just be dropped on a window and work as expected.
Thanks in advance!
All you have to do is call [super mouseDown:event]. Since Mac OS X 10.5 (this did not work the same way before) NSView knows how to handle overlapping views and will take care of event handling for you.
If you need to target releases before 10.5: This is a really bad idea. Not only does the event handling mechanism not know how to deal with overlapping subviews, neither does the drawing system and you can see some very strange artefacts. That said, if you're determined:
Override -[NSView hitTest:] in your custom control/view. AppKit uses this method to determine which view in the hierarchy to deliver mouse events to. If you return nil that point in your custom view is ignored and the event should get delivered to the next view covering that point.
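For illustration, a minimal sketch of that override (the -handlesPoint: predicate is hypothetical; it stands in for whatever condition your control uses to decide whether it wants the click):

// In the custom control/view: opt out of hit-testing when we don't want the click,
// so AppKit keeps looking for another view under the same point.
- (NSView *)hitTest:(NSPoint)point
{
    NSPoint localPoint = [self convertPoint:point fromView:[self superview]];
    if (![self handlesPoint:localPoint]) {   // hypothetical predicate
        return nil;                          // pretend we're not here
    }
    return [super hitTest:point];
}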
Mind though, this is still a bad idea because of the reasons I outlined above. It just wasn't something formally supported by AppKit at the time. The more generally accepted workaround on 10.4 and earlier is to use a child window.
It's hard to know exactly the best approach because I don't know what your application does, but here's a thought. It sounds like you want to pass the messages up through the view hierarchy... somehow.
Regardless, a view would do one of two things:
handle the message
pass it to the "next view" (how you define "next view" depends on your application)
So. How would you do this? The default behavior for a view should be to pass the message to the next view. A good way of implementing this kind of thing is through an informal protocol.
@interface NSView (MessagePassing)
- (void)handleMouseDown:(NSEvent *)event;
- (NSView *)nextViewForEvent:(NSEvent *)event;
@end

@implementation NSView (MessagePassing)

- (void)handleMouseDown:(NSEvent *)event {
    [[self nextViewForEvent:event] handleMouseDown:event];
}

- (NSView *)nextViewForEvent:(NSEvent *)event {
    // Implementation dependent, but here's a simple one:
    return [self superview];
}

@end
Now, in the views that should have that behavior, you'd do this:
- (void)mouseDown:(NSEvent *)event {
    [self handleMouseDown:event];
}

- (void)handleMouseDown:(NSEvent *)event {
    if (/* Do I not want to handle this event? */) {
        // Let the superclass decide what to do.
        // If no superclass handles the event, it will be punted to the next view.
        [super handleMouseDown:event];
        return;
    }
    // Handle the event
}
You would probably want to create an NSView subclass to override mouseDown: that you would then base your other custom view classes on.
If you wanted to determine the "next view" based on actual z-order, keep in mind that z-order is determined by the order within the subviews collection, with later views drawn on top. So, you could do something like this:
- (NSView *)nextViewForEvent:(NSEvent *)event {
    NSPoint pointInSuperview = [[self superview] convertPoint:[event locationInWindow] fromView:nil];
    NSInteger locationInSubviews = [[[self superview] subviews] indexOfObject:self];
    for (NSInteger index = locationInSubviews - 1; index >= 0; index--) {
        NSView *subview = [[[self superview] subviews] objectAtIndex:index];
        if (NSPointInRect(pointInSuperview, [subview frame]))
            return subview;
    }
    return [self superview];
}
This might be way more than you wanted, but I hope it helps.