Touches in touch event handling in UIView vs UIViewController - Objective-C

So, I'm capturing multiple touches and determining the number of touches and where each of them occurred, and I'm seeing different behavior between a UIView and a UIViewController.
Basically, the event handler is coded as below (in this case touchesEnded, but it really doesn't matter which). In the view, I get the entire collection of touches; in the controller, I only ever get one touch at a time. That is, there is never a collection of touches, but touchesEnded gets executed once for each touch that occurred in the same moment.
I add the subviews as identically as possible...
Can someone explain to me what's going on? Why doesn't the controller version give me the collection?
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    int total = 0;
    for (UITouch *touch in touches)
    {
        for (UIView *myView in [contentView subviews])
        {
            if (CGRectContainsPoint([myView frame], [touch locationInView:self.view]))
                total++; // this is always 1 for the UIViewController,
                         // but I get the full number within the UIView version
        }
    }
}

I suspect you are getting the touches but that the views are different. Check that contentView is the same and has the same subviews.
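A minimal debugging sketch along those lines; contentView is the property from the question's code, and the multipleTouchEnabled check is an extra assumption of mine, since a view only receives more than one simultaneous touch when that flag is YES:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // How many touches actually arrived in this event?
    NSLog(@"touches in this event: %lu", (unsigned long)[touches count]);
    // Is contentView the instance you expect, with the subviews you expect?
    NSLog(@"contentView: %@, subviews: %@", contentView, [contentView subviews]);
    // A view only receives multiple simultaneous touches when this is YES.
    NSLog(@"multipleTouchEnabled: %d", self.view.multipleTouchEnabled);
}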

I am not sure that I am right, but this always works for me.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    if (touch.view == yourView) // compare against the view you care about
    {
        NSLog(@"do your task here");
    }
}

Related

touchesEnded called when any part of the screen is tapped with 2 fingers at the same time

I've found this very strange behavior with a touch event.
The targeted view is not even allocated, yet touchesEnded gets called by tapping any part of the screen with 2 fingers, not 1 (it must be 2)... it then skips touchesBegan and calls touchesEnded.
I even checked whether the targeted view's userInteractionEnabled is set to YES, but it's set to NO, which is the default, and the view isn't allocated anyway.
None of this happens when the targeted view is already allocated and positioned, etc.
Has anyone experienced this?
Why does this happen, and must I allocate the property in order to prevent its touch events from being called seemingly at random?
Also, why would this strange behavior require 2 fingers at the same time instead of just one tap? It's not important, but I'm very curious.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    if (touch.view == self.buttonStartButton) {
        NSInteger levelUp = [self.levelModel checkForLevelUp];
        if (levelUp == 0) {
            [self byeGameContainer];
        }
        else {
            [self.delegate levelingUp];
        }
    }
}
So: allocating the target and setting its frame outside the screen's bounds didn't work; touchesBegan and touchesEnded would STILL get called when any part of the screen was tapped. I even set its userInteractionEnabled = NO, but the touch events would STILL get called.
I ended up doing the following instead, and it does work.
By checking whether the targeted property is allocated, the touch events finally stop firing when they shouldn't. Presumably this is because an unallocated property is nil, so the comparison touch.view == self.buttonStartButton can succeed by accident whenever touch.view is also nil; the explicit nil check rules that out.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    // The extra nil check prevents an accidental match while the
    // button has not been allocated yet (i.e. the property is nil).
    if (touch.view == self.buttonStartButton && self.buttonStartButton != nil) {
        NSInteger levelUp = [self.levelModel checkForLevelUp];
        if (levelUp == 0) {
            [self byeGameContainer];
        }
        else {
            [self.delegate levelingUp];
        }
    }
}

Dragging multiple images in iOS

I'm brand new to touch and drag, and I'm trying to make eight draggable images. Here's what it looks like in my ViewController.m file:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self.view];
    // check which peg is touched
    if ([touch view] == darkGreyPeg) { darkGreyPeg.center = location; }
    else if ([touch view] == brownPeg) { brownPeg.center = location; }
    // there are six more else ifs
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self touchesBegan:touches withEvent:event];
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self ifCollided];
}
If I take out the if statements and change locationInView:self.view to locationInView:touch.view, I'm able to drag the darkGreyPeg around with no problems, and it triggers my ifCollided method whenever appropriate.
I'm following a YouTube tutorial (Milmers Xcode tutorials, "Dragging Multiple Images"), and he has exactly this same type of code, but mine doesn't work. Can anyone point me toward why?
Thanks.
Check the User Interaction Enabled and Multiple Touch boxes in your image views' attributes. Without User Interaction Enabled checked, the image views won't receive touches at all. I have tried this and succeeded in dragging multiple images, labels, and buttons.
You should not implement these methods in your ViewController, but in your View class that you want to be able to drag around instead.
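A minimal sketch of that approach: a view subclass that drags itself. The class name DraggablePeg is mine, not from the question:
@interface DraggablePeg : UIImageView
@end

@implementation DraggablePeg
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    // locationInView: with the superview yields coordinates in the same
    // space as self.center, so assigning it moves the piece under the finger.
    self.center = [touch locationInView:self.superview];
}
@end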
I think the problem is using == to compare objects:
[touch view] == darkGreyPeg
You should use isEqual: to compare objects:
[[touch view] isEqual:darkGrayPeg]
After edit: I think the problem is that you forgot to set userInteractionEnabled to YES for the image views. Without doing that, touch.view will be the superview, not the image view. Also, you probably don't need all those if statements to determine which image view to move; you can just use touch.view:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self.view];
    if (![touch.view isEqual:self.view]) touch.view.center = location;
}
If you have other views besides self.view that you don't want to move, then you either have to exclude them too, or use a different exclusion condition (for example, only move the object if it's an image view, or only if it has a certain tag value).
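For completeness, a short sketch of the setup this answer assumes, using the outlet names from the question, e.g. in viewDidLoad:
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Without this, touch.view reports the superview instead of the peg.
    darkGreyPeg.userInteractionEnabled = YES;
    brownPeg.userInteractionEnabled = YES;
    // ...and likewise for the remaining six pegs.
}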

Can I detect if the user has touched anywhere on the map except an annotation?

I am building an app and I would like to know when the user has touched an annotation or anywhere else on the map. I have a button that I would like to display only if an annotation has been selected. So, if the user, after selecting an annotation, touches anywhere on the map that is not another annotation, I want to make the button invisible.
At the moment, I have tried the touchesEnded method, but the problem is that it does not distinguish annotations from land.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    if ([touches isMemberOfClass:[BuildingViewController class]])
        printf("Building");
    else
        printf("Land!");
}
Thanks in advance.
touches is an NSSet, so you can get an object out of the set and check its class membership. For example:
UITouch *touch = [touches anyObject];
You can get the UIView from the touch:
UIView *touchedView = touch.view;
Then check that UIView's class and compare it with yours:
[touchedView isMemberOfClass:[BuildingView class]]
Also, I advise you to check all the touches in that NSSet:
BOOL isBuilding = NO;
for (UITouch *touch in touches) {
    if ([touch.view isMemberOfClass:[BuildingView class]]) {
        isBuilding = YES;
        break;
    }
}
if (isBuilding) {
    printf("Building");
} else {
    printf("Land!");
}
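If the end goal is the show/hide behaviour from the question, the loop's result can drive the button directly; detailButton here is a hypothetical outlet name, not from the question:
// Hide the button unless an annotation (building) was touched.
self.detailButton.hidden = !isBuilding;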

Objective C: Stop Scrolling on Touch

I am trying to temporarily stop the scrolling of my view when the user starts to touch the screen and move.
I have coded this
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    if (touch) {
        [scrollView setScrollEnabled:NO];
    }
}
it is not working.
If you have a scroll view and the user touches it, the default behaviour is that it stops scrolling.
If this is not what you meant, please be more specific about what you want ;)
try this:
- (void)scrollViewWillBeginDragging:(UIScrollView *)scrollView
{
    [scrollView setScrollEnabled:NO];
}
Now I just wonder where you enable it again...
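One natural place would be the standard UIScrollViewDelegate end-of-scroll callbacks; a sketch, assuming the same object is the scroll view's delegate:
- (void)scrollViewDidEndDragging:(UIScrollView *)scrollView willDecelerate:(BOOL)decelerate
{
    // If the view will not decelerate, scrolling has finished here.
    if (!decelerate) [scrollView setScrollEnabled:YES];
}
- (void)scrollViewDidEndDecelerating:(UIScrollView *)scrollView
{
    [scrollView setScrollEnabled:YES];
}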

How can I convert the coordinates of a touch event on a subview to the coordinates in its parent view?

I'm diving into iOS development and I'm playing with touch events. I have a class named UIPuzzlePiece that's a subclass of UIImageView and it represents the puzzle piece objects that you can move around on the screen. My goal is to be able to move the puzzle pieces around on the screen with my finger.
Currently, I have the touchesBegan and touchesMoved events implemented inside the UIPuzzlePiece class...
// Handles the start of a touch
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event touchesForView:self] anyObject];
    CGPoint cgLocation = [touch locationInView:self];
    [self setCenter:cgLocation];
}
// Handles the continuation of a touch.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event touchesForView:self] anyObject];
    CGPoint cgLocation = [touch locationInView:self];
    [self setCenter:cgLocation];
}
The problem I'm facing is that the returned CGPoint represents the location inside the UIPuzzlePiece view, which makes sense, but I need to know the location of the touch inside the parent view. That way the statement [self setCenter:cgLocation] will actually move the UIPuzzlePiece to the location of the touch. I could always write some hacky algorithm to convert it, but I was hoping there's a better or simpler way to accomplish my goal. If I had only a single puzzle piece, I would just implement the touch event code inside the view controller of the parent view, but I have many puzzle pieces, which is why I'm handling it inside the UIPuzzlePiece view.
Your thoughts?
Thanks so much in advance for your wisdom!
UIView implements -convertRect:fromView:, -convertRect:toView:, -convertPoint:fromView: and -convertPoint:toView:. Sounds like you just need to use one of the latter methods to convert between view coordinate spaces. For example, if you want to convert a CGPoint from the coordinate space of self to the parent view, you can use either [self convertPoint:thePoint toView:self.superview] or you can use [self.superview convertPoint:thePoint fromView:self] (they will do the same thing).
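Applied to the question's touchesMoved:, that would look something like this sketch:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event touchesForView:self] anyObject];
    CGPoint localPoint = [touch locationInView:self];
    // Convert from this piece's coordinate space to the parent's,
    // so that setCenter: moves the piece under the finger.
    CGPoint parentPoint = [self convertPoint:localPoint toView:self.superview];
    [self setCenter:parentPoint];
}
Equivalently, [touch locationInView:self.superview] returns the point already expressed in the parent's coordinate space.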