I have a custom UIView class. Inside that view, there is an image view. (I'm making a UISlider from scratch).
I am trying to get the image view -- the thumb of the slider -- to move across the view. Below is the code from the UIView class
- (BOOL)continueTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event {
    CGPoint touchPoint = [touch locationInView:self];
    if (CGRectContainsPoint(self.thumb.frame, touchPoint)) {
        self.thumb.center = CGPointMake(touchPoint.x, self.thumb.center.y);
    }
    return YES;
}
When I place my finger on the thumb of the slider and try to drag it, nothing happens. However, when I touch outside the thumb of the slider, drag my finger to the thumb without letting go, and then try dragging the thumb, it works fine.
How can I modify my code so that the method will be called when someone holds the thumb and tries to drag it?
My guess is that you have user interaction enabled on your image view, which is in turn not relaying the touch events to the backing logic view. Do something like this:
self.thumb.userInteractionEnabled = NO;
Put that in whatever init method you are using and the thumb view will no longer intercept the touch events.
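For context, a minimal sketch of what that init might look like (the thumb property and image name come from the question; everything else is an assumption, not a definitive implementation):
- (instancetype)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
        // "thumb" image name is a placeholder.
        self.thumb = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"thumb"]];
        // Let touches fall through to the slider itself, which does the tracking.
        self.thumb.userInteractionEnabled = NO;
        [self addSubview:self.thumb];
    }
    return self;
}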
I have made a graph with data in a UIView called HeartrateGraph. In a UIViewController named HRGraphInfo, I have a connected label that should output values when the graph is touched. The problem is, I don't know how to send a touched event using delegates from the UIView to the UIViewController.
Here is my touch assignment code in the UIView:
UITouch *touch = [touches anyObject];
CGPoint point = [touch locationInView:self];
for (int i = 0; i < kNumberOfPoints; i++)
{
    if (CGRectContainsPoint(touchAreas[i], point))
    {
        graphInfoRF.heartRateGraphString = [NSString stringWithFormat:@"Heart Rate reading #%d at %@ bpm", i + 1, dataArray[i]];
        graphInfoRF.touched = YES;
        break;
    }
}
This segment of code is in a touchesBegan and properly stores the data value and number in the object graphInfoRF (I just did not show the declarations of dataArray, kNumberOfPoints, etc).
I am able to access graphInfoRF in the UIViewController using:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (graphInfoRF.touched == YES) {
        self.heartRateLabel.text = graphInfoRF.heartRateGraphString;
    }
    else {
        self.heartRateLabel.text = @"No data got over to this file";
    }
}
The label will show the correct string, but only after the data point on the graph is touched AND the label is touched right after. How do I change the touchesBegan so that once I touch the data point on the graph it will fill the label with the data automatically without the need for a second and separate touch on the label?
Every view controller comes with a single view that it manages once it's initialized. You should be familiar with this view: you see it whenever you use a view controller in Interface Builder, and you can access it as self.view if you're modifying a subclass.
Since a view controller comes with a view, it also receives touch events for that view. Implementing touchesBegan in the view controller will receive events for that view and, normally, any subviews that view is managing. But since you've written your own implementation of touchesBegan in HeartRateGraph, and since HeartRateGraph is a subview of the view controller's main view, HeartRateGraph receives and handles the touch event first, before the view controller ever has a chance to (think of events bubbling up the responder chain).
So the code that changes the label in the view controller only runs when the label is touched: the label is a subview of the view controller's main view, and the label has no touches implementation of its own, so the view controller only gets to handle the event the way you want when you touch somewhere outside the graph. There are two ways to solve this.
Either pass the event up to your superview
[self.superview touchesBegan:touches withEvent:event];
or use the proper, recommended way of doing it:
Protocols and delegates: your view makes a delegate call to its view controller, letting it know the graph has been touched, and the view controller updates its contents, as sketched below.
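A rough sketch of that delegate approach; the protocol and method names (HeartrateGraphDelegate, heartrateGraph:didSelectReading:) are my own invention, not from the original code:
// HeartrateGraph.h
@class HeartrateGraph;

@protocol HeartrateGraphDelegate <NSObject>
- (void)heartrateGraph:(HeartrateGraph *)graph didSelectReading:(NSString *)reading;
@end

@interface HeartrateGraph : UIView
@property (nonatomic, weak) id<HeartrateGraphDelegate> delegate;
@end

// HeartrateGraph.m -- inside touchesBegan:withEvent:, where the hit is found:
[self.delegate heartrateGraph:self
             didSelectReading:[NSString stringWithFormat:@"Heart Rate reading #%d at %@ bpm", i + 1, dataArray[i]]];

// HRGraphInfo.m -- conform to the protocol, set graph.delegate = self, then:
- (void)heartrateGraph:(HeartrateGraph *)graph didSelectReading:(NSString *)reading {
    self.heartRateLabel.text = reading;  // updates immediately, no second touch needed
}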
I am writing a board game app (like chess). The main view recognizes swipe gestures (UISwipeGestureRecognizer) started anywhere in its fullscreen view, which make the board rotate.
Now I added a square-shaped transparent subview exactly over the board. This is an UIControl subclass that detects touches - as moves of pawns around the board:
[self.view addSubview:self.boardControl];
I expected that swipe gestures would be blocked in the area of the screen covered by the UIControl subclass, but they are not. So when I make a quick touch and drag (a swipe) over my square boardControl, it is first detected as a pawn move, but then detected again as a swipe that rotates the board.
How can I make my UIControl subclass block the flow of touch events started in its frame to its superviews?
I can prevent my app from reacting to swipe gestures by filtering touches at the superview level, either by:
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer
{
    CGPoint location = [gestureRecognizer locationInView:self.view];
    CGRect frame = self.boardControl.frame;
    if (CGRectContainsPoint(frame, location))
        return NO;
    return YES;
}
or by:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch
{
    CGPoint location = [touch locationInView:self.view];
    CGRect frame = self.boardControl.frame;
    if (CGRectContainsPoint(frame, location))
        return NO;
    return YES;
}
but I want to solve the issue one level earlier: I want boardControl to not pass touch events that start within its frame any higher up the view hierarchy.
Can a UIControl subclass "cover" its superview and "eat" all the touches it gets, so the superview does not need to check its frame to decide whether a touch has to be filtered out?
All you need is to implement UIGestureRecognizerDelegate: it provides a way to do exactly what you're asking.
I think you should start from gestureRecognizer:shouldReceiveTouch: examples.
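For illustration, a sketch of that wiring, assuming the swipe recognizer and boardControl both belong to the same view controller:
// e.g. in viewDidLoad, where self conforms to <UIGestureRecognizerDelegate>:
self.swipeRecognizer.delegate = self;

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
    // Don't let the swipe recognizer see touches that land on the board control
    // (or any of its subviews).
    return ![touch.view isDescendantOfView:self.boardControl];
}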
Here's how the scroll views work: one scroll view has paging enabled in the horizontal direction, and each 'page' of this scroll view contains a vertically scrolling UITableView. Without modification, this works OK, but not perfectly.
The behaviour that's not right: When the user scrolls up and down on the table view, but then wants to flick over to the next page quickly, the horizontal flick/swipe will not work initially - it will not work until the table view is stationary (even if the swipe is very clearly horizontal).
How it should work: If the swipe is clearly horizontal, I'd like the page to change even if the table view is still scrolling/bouncing, as this is what the user will expect too.
How can I change this behaviour - what's the easiest or best way?
NOTE: For various reasons, a UIPageViewController as suggested in some answers will not work. How can I do this with cross-directional UIScrollViews (one is a table view, but you get the idea)? I've been banging my head against a wall for hours - if you think you can do this then I'll more than happily award a bounty.
According to my understanding of the question, we only want to change the default behaviour while the tableView is scrolling; all other behaviour stays the same.
Subclass UITableView (UITableView is itself a subclass of UIScrollView). In the UITableView subclass, implement this UIGestureRecognizerDelegate method:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    //Edit 1
    //return self.isDecelerating;
    //return self.isDecelerating || self.bounces; // if we also want simultaneous gestures while bouncing and scrolling
    //Edit 2
    // @Jordan edited - we don't need to always enable simultaneous gestures for bounce-enabled tableViews
    return self.isDecelerating
        || self.contentOffset.y < 0
        || self.contentOffset.y > MAX(0, self.contentSize.height - self.bounds.size.height);
}
We only want to change the default gesture behaviour while the tableView is decelerating.
Now change every UITableView's class to your new tableView subclass and run the project; the swipe should work while the tableView is scrolling. :]
But the swipe feels a little too sensitive while the tableView is scrolling. Let's make it a little more restrictive.
Subclass UIScrollView. In the UIScrollView subclass, implement another of UIGestureRecognizer's delegate methods, gestureRecognizerShouldBegin::
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer
{
    if ([gestureRecognizer isKindOfClass:[UIPanGestureRecognizer class]]) {
        CGPoint velocity = [(UIPanGestureRecognizer *)gestureRecognizer velocityInView:self];
        // Use fabs() here: abs() truncates CGFloat values to integers.
        if (fabs(velocity.y) * 2 < fabs(velocity.x)) {
            return YES;
        }
    }
    return NO;
}
We want the swipe to be "clearly horizontal". The code above only lets the gesture begin if its velocity along the x axis is more than double its velocity along the y axis. [Feel free to increase the hard-coded value 2 if you like; the higher the value, the more horizontal the swipe needs to be.]
Now change the UIScrollView's class (the scroll view which holds the multiple tableViews) to your scrollView subclass. Run the project. :]
I've made a demo project on GitHub: https://github.com/rishi420/SwipeWhileScroll
Although Apple doesn't like this method too much:
Important: You should not embed UIWebView or UITableView objects in UIScrollView objects. If you do so, unexpected behavior can result because touch events for the two objects can be mixed up and wrongly handled.
I've found a great way to accomplish this.
This is a complete solution to the problem. In order to scroll the UIScrollView while your UITableView is scrolling, you'll need to disable the table view's user interaction while it decelerates.
- (void)viewDidLoad
{
    [super viewDidLoad];
    _myScrollView.contentSize = CGSizeMake(2000, 0);
    data = [[NSMutableArray alloc] init];
    for (int i = 0; i < 30; i++)
    {
        [data addObject:[NSString stringWithFormat:@"%d", i]];
    }
    UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)];
    [self.view addGestureRecognizer:tap];
}
- (void)handleTap:(UITapGestureRecognizer *)recognizer
{
    // Re-setting the current offset without animation stops any in-flight deceleration.
    [_myTableView setContentOffset:_myTableView.contentOffset animated:NO];
}

- (void)scrollViewWillBeginDecelerating:(UIScrollView *)scrollView
{
    scrollView.userInteractionEnabled = NO;
}

- (void)scrollViewDidEndDecelerating:(UIScrollView *)scrollView
{
    scrollView.userInteractionEnabled = YES;
}
To sum up the code above: when the UITableView begins decelerating, userInteractionEnabled is set to NO so the UIScrollView will detect the swipe; if the user taps the screen while the UITableView is scrolling, the tap stops the deceleration and userInteractionEnabled is set back to YES.
Instead of using UIScrollView as a container for these multiple table views, try using a UIPageViewController.
You can even integrate this into your existing view controller setup as a child view controller (directly replacing the UIScrollView).
In addition, you'll likely want to implement the required methods from UIPageViewControllerDataSource and possibly one or more of the methods from UIPageViewControllerDelegate.
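A minimal sketch of that embedding; tableViewControllerAtIndex: is a hypothetical helper that returns the view controller for a page:
// In the parent view controller, replacing the horizontal UIScrollView:
UIPageViewController *pager = [[UIPageViewController alloc]
    initWithTransitionStyle:UIPageViewControllerTransitionStyleScroll
      navigationOrientation:UIPageViewControllerNavigationOrientationHorizontal
                    options:nil];
pager.dataSource = self;  // implement the before/after view controller methods
[pager setViewControllers:@[[self tableViewControllerAtIndex:0]]
                direction:UIPageViewControllerNavigationDirectionForward
                 animated:NO
               completion:nil];

// Standard child view controller containment:
[self addChildViewController:pager];
pager.view.frame = self.view.bounds;
[self.view addSubview:pager.view];
[pager didMoveToParentViewController:self];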
Did you try the directionalLockEnabled property on both your table view and your scroll view, locking one to horizontal and the other to vertical?
Edit:
1)
What you want to do is very complicated, since a touch waits some time (something like 0.1 s) to determine what the movement will be, and if your table is already moving, it will claim the touch immediately, whatever it is (because the movement is supposed to react to it instantly).
I don't see any solution other than overriding the touch handling from scratch so you can detect immediately the kind of movement you want (e.g. whether the movement will be horizontal), but it will be very hard to do well.
2)
Another solution I can suggest is to give your table left and right margins where you can touch the parent (paging) scroll view; then even if your table is scrolling, a touch there reaches only the paging scroll view. It's simpler, but it may not fit your design...
Use UIPageViewController, and in -viewDidLoad (or whatever method best suits your needs or design) get UIPageViewController's UIScrollView subview and assign a delegate to it. Keep in mind that its delegate property won't be nil, so you can optionally stash the original delegate in another reference and then assign your own object, which conforms to UIScrollViewDelegate. For example:
id<UIScrollViewDelegate> originalPageScrollViewDelegate = ((UIScrollView *)[pageViewController.view.subviews objectAtIndex:0]).delegate;
[((UIScrollView *)[pageViewController.view.subviews objectAtIndex:0]) setDelegate:self];
That way you can implement the UIScrollViewDelegate methods with ease, and your UIPageViewController will call your delegate's -scrollViewDidScroll: method.
By the way, you may be obliged to keep the original delegate around and forward the delegate methods to it. You can see an example implementation in the ViewPagerController class in my UI control project here.
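A sketch of that forwarding, assuming originalPageScrollViewDelegate was stored as an ivar as above:
- (void)scrollViewDidScroll:(UIScrollView *)scrollView {
    // Your own handling here (e.g. tracking page transition progress)...

    // ...then forward to UIPageViewController's internal delegate so paging keeps working.
    if ([originalPageScrollViewDelegate respondsToSelector:@selector(scrollViewDidScroll:)]) {
        [originalPageScrollViewDelegate scrollViewDidScroll:scrollView];
    }
}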
I faced the same thing recently. My UIScrollView was in paging mode and every page contained a UITableView, and like you described, it worked, but not as you'd expect it to. This is how I solved it.
First I disabled scrolling on the UIScrollView.
Then I added UISwipeGestureRecognizers to the actual UITableView for left and right swipes.
The actions for those swipes were:
[scroll setContentOffset:CGPointMake(currentPointX + 320, PointY) animated:YES];
// or
[scroll setContentOffset:CGPointMake(currentPointX - 320, PointY) animated:YES];
This works flawlessly; the only downside is that if the user drags a finger across the UITableView, that is considered a swipe. He won't be able to see half of screen A and half of screen B on the same screen.
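A sketch of that setup under the same assumptions (hard-coded 320-point page width, outlets named scroll and tableView, hypothetical action names):
// In viewDidLoad:
self.scroll.scrollEnabled = NO;  // the swipes drive the paging instead

UISwipeGestureRecognizer *left = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(nextPage:)];
left.direction = UISwipeGestureRecognizerDirectionLeft;
[self.tableView addGestureRecognizer:left];

UISwipeGestureRecognizer *right = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(previousPage:)];
right.direction = UISwipeGestureRecognizerDirectionRight;
[self.tableView addGestureRecognizer:right];

// The actions page the scroll view by one screen width:
- (void)nextPage:(UISwipeGestureRecognizer *)recognizer {
    CGPoint offset = self.scroll.contentOffset;
    [self.scroll setContentOffset:CGPointMake(offset.x + 320, offset.y) animated:YES];
}

- (void)previousPage:(UISwipeGestureRecognizer *)recognizer {
    CGPoint offset = self.scroll.contentOffset;
    [self.scroll setContentOffset:CGPointMake(offset.x - 320, offset.y) animated:YES];
}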
You could subclass your scroll view and your table views, and add this gesture recognizer delegate method to each of them...
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    return YES;
}
I can't be sure this is exactly what you are after, but it may come close.
I have a question that I've searched but can't find a definitive answer to. Here is my layout:
UIView - ViewController
|_ UIScrollView - added programmatically
|  |_ UIView to hold a background/perimeter - added programmatically
|_ UIView 1 - added programmatically
|_ UIView 2 - added programmatically
and so on
My question is: why does the ViewController's touchesMoved get called only once when I move, say, UIView 2 with a touch?
UIView has its own touchesMoved method, but I need the controller's touchesMoved to be called too, as it needs to talk to the ScrollView to update its position - such as when UIView 2 is near a corner, so that the ScrollView moves a little to fully show UIView 2.
If there is no way around this, is there a way to update the ScrollView from UIView 2 so it scrolls when UIView 2 is near a corner?
Edit:
I think I may have found a workaround. Not sure if this will be accepted by Apple, but:
I simply keep an instance variable set to self.superview, which lets me talk back to my ScrollView from within UIView's touchesMoved.
With that I can call [scrollView setContentOffset:contentOffset animated:animated], so my ScrollView gets updated as the subview (UIView 2) moves close to the edge of the UIWindow.
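A sketch of that workaround inside the subview's touchesMoved (the 20-point edge margin is an arbitrary example, and it assumes the view's superview really is the scroll view):
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // ...move self to follow the touch as before...

    UIScrollView *scrollView = (UIScrollView *)self.superview;
    CGFloat visibleRight = scrollView.contentOffset.x + scrollView.bounds.size.width;
    // If we've been dragged near the right edge of the visible area, nudge the scroll view.
    if (CGRectGetMaxX(self.frame) > visibleRight - 20) {
        CGPoint offset = scrollView.contentOffset;
        offset.x = MIN(offset.x + 20,
                       scrollView.contentSize.width - scrollView.bounds.size.width);
        [scrollView setContentOffset:offset animated:YES];
    }
}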
Thank you for the suggestions.
The behavior you describe is the result of the UIScrollView hijacking the touch-moved events. In other words, as soon as the UIScrollView detects that a touch-moved event falls within its frame, it takes control of it. I experienced the same behavior while trying to create a special swipe handler, and it failed every time a UIScrollView was also interested in the swipe.
In my case, I solved the issue by intercepting the event in sendEvent: overridden in my custom UIWindow, but I don't know if you want to do the same. In any case, this is what worked for me:
- (void)sendEvent:(UIEvent *)event {
    NSSet *allTouches = [event allTouches];
    UITouch *touch = [allTouches anyObject];
    UIView *touchView = [touch view];
    //-- UIScrollViews will make touchView be nil after a few UITouchPhaseMoved events;
    //-- by storing the initial view getting the touch (here in the _initialView ivar
    //-- of the UIWindow subclass), we can overcome this problem
    if (touch.phase == UITouchPhaseBegan)
        _initialView = touchView;
    else if (!touchView && _initialView)
        touchView = _initialView;
    //-- do your own management of the event here, using touchView
    //-- then let the event propagate if you also want the default event management
    [super sendEvent:event];
}
An alternative approach that you might investigate is attaching a gesture recognizer to your views -- they have a pretty high priority, so maybe the UIScrollView will not mess with them and it might work better for you.
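For reference, a minimal sketch of that alternative: a pan recognizer attached to the draggable view (draggableView is a placeholder name), which the scroll view is less likely to interfere with:
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)];
[draggableView addGestureRecognizer:pan];

- (void)handlePan:(UIPanGestureRecognizer *)recognizer {
    UIView *view = recognizer.view;
    CGPoint translation = [recognizer translationInView:view.superview];
    view.center = CGPointMake(view.center.x + translation.x, view.center.y + translation.y);
    [recognizer setTranslation:CGPointZero inView:view.superview];  // reset for incremental moves
}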
If there is no way around this, is there a way to update the ScrollView from UIView 2 so it scrolls when it's near a corner?
Have you tried to make the UIScrollView scroll by calling:
- (void)setContentOffset:(CGPoint)contentOffset animated:(BOOL)animated
How can I make the touchable button area to be of the same shape of the image provided?
Say I have a custom button with a triangle image: how can I make sure that only touches within that triangle are processed?
Or do I have to do some kind of math in the touch action function?
OBShapedButton is a great project for that.
It works by subclassing UIButton and overriding -pointInside:withEvent:
@implementation OBShapedButton

// UIView uses this method in hitTest:withEvent: to determine which subview
// should receive a touch event.
// If pointInside:withEvent: returns YES, then the subview's hierarchy is
// traversed; otherwise, its branch of the view hierarchy is ignored.
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    // Return NO if even super returns NO (i.e. if point lies outside our bounds)
    BOOL superResult = [super pointInside:point withEvent:event];
    if (!superResult) {
        return superResult;
    }

    // We can't test the image's alpha channel if the button has no image.
    // Fall back to super.
    UIImage *buttonImage = [self imageForState:UIControlStateNormal];
    if (buttonImage == nil) {
        return YES;
    }

    // kAlphaVisibleThreshold is a constant defined by the project (e.g. 0.1f).
    CGColorRef pixelColor = [[buttonImage colorAtPixel:point] CGColor];
    CGFloat alpha = CGColorGetAlpha(pixelColor);
    return alpha >= kAlphaVisibleThreshold;
}

@end
colorAtPixel: (as in [aUIImage colorAtPixel:point]) is a category method on UIImage that comes bundled with the project.
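In essence (this is a condensed sketch, not the verbatim project code), the category draws the relevant pixel into a 1x1 RGBA bitmap context and reads back the bytes:
- (UIColor *)colorAtPixel:(CGPoint)point {
    // Render just the pixel we care about into a 1x1 RGBA bitmap.
    unsigned char pixel[4] = {0, 0, 0, 0};
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pixel, 1, 1, 8, 4, colorSpace,
        kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);

    // Shift the image so the requested point lands on the single pixel, then draw.
    CGContextTranslateCTM(context, -point.x, -point.y);
    UIGraphicsPushContext(context);
    [self drawAtPoint:CGPointZero];
    UIGraphicsPopContext();
    CGContextRelease(context);

    return [UIColor colorWithRed:pixel[0] / 255.0f
                           green:pixel[1] / 255.0f
                            blue:pixel[2] / 255.0f
                           alpha:pixel[3] / 255.0f];
}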
This feature is definitely not provided by the core Cocoa Touch classes related to UIButton. I would guess you have to look into subclassing UIButton and intercepting the taps, as you mentioned.
Ole's OBShapedButton project should be referenced more on SO. Thanks to vikingosegundo, whose my-programming-examples github has helped in the past, you can see some of what Ole put together to take the actions away from the transparent background part of a custom UIButton. I added this to my project and was able to get it up and running with these steps:
download the project
add the UIImage+ColorAtPixel.h/m and OBShapedButton.h/m files to your project folder
make sure the .m files are added to your target's Build Phases > Compile Sources list
change the UIButton class type of all your buttons to the OBShapedButton class in IB
make sure your .png file has a transparent background and make it the "Image" of the button
This project will help anyone who has multiple overlapping UIButtons with transparent backgrounds, even if they are in different UIViews or completely covered by another UIButton (e.g. you can't select it in IB without Editor > Arrange > Send to Back/Front). However, if you have an OBShapedButton in a UIView, and part of the transparent UIView overlaps the OBShapedButton, you cannot receive click or drag events on the covered part of the OBShapedButton, so layer your UIViews with this in mind.
Download Ole's and vikingosegundo's projects for your tutorial collection... DO IT!