UIView not passing touches to subviews - objective-c

I'm sure this is a basic misunderstanding of the responder chain - has it changed since iOS4? I have a view with some subviews. The superview has no purpose other than positioning the subviews, and should have nothing to do with handling touches. I want the subviews to listen for touches, but the superview seems to block touch events. I thought superviews passed touches on to subviews? I've read several posts on this, but I can't get my head around it. Several posts seem to suggest that setting userInteractionEnabled to NO for the superview will work, but doing this still doesn't seem to let the touch through.
In case it's not clear what I mean, here's some simplified code. Tapping inside the red view does NOT trigger the NSLog...
@implementation ViewController

@synthesize redView;

- (void)viewDidLoad
{
    UIView *blueView = [[UIView alloc] initWithFrame:CGRectMake(100, 100, 200, 400)];
    blueView.backgroundColor = [UIColor blueColor];
    [self.view addSubview:blueView];

    redView = [[UIView alloc] initWithFrame:CGRectMake(20, 20, 40, 40)];
    redView.backgroundColor = [UIColor redColor];
    [blueView addSubview:redView];

    [super viewDidLoad];
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:self.view];
    if (CGRectContainsPoint(redView.frame, touchPoint)) {
        NSLog(@"red got touched");
    }
}

@end

The coordinates of the touch you receive are expressed in the coordinate system of self.view, but redView.frame is expressed in the coordinate system of its superview, blueView. So you need to correct the point by blueView's offset. One way is to add a blueView property to your object and then correct in the touchesBegan: method:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:self.view];
    // Shift the point from self.view's coordinates into blueView's.
    touchPoint.x -= blueView.frame.origin.x;
    touchPoint.y -= blueView.frame.origin.y;
    if (CGRectContainsPoint(redView.frame, touchPoint)) {
        NSLog(@"red got touched");
    }
}
Or you could subclass UIView for the blueView and override touchesBegan: there.
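Alternatively, you can let UIKit do the coordinate conversion for you by asking the touch for its location directly in blueView's coordinate space. A minimal sketch, assuming blueView is stored in a property as suggested above:

```objc
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    // locationInView: returns the point in blueView's coordinate system,
    // which is the system redView.frame is expressed in.
    CGPoint touchPoint = [touch locationInView:self.blueView];
    if (CGRectContainsPoint(self.redView.frame, touchPoint)) {
        NSLog(@"red got touched");
    }
}
```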

I can see an error here: it is really important to call [super viewDidLoad] right after the opening brace of that method, not at the end. Second, I'd suggest using a gesture recognizer; recognizers are based on the target-action pattern and are a lot simpler to use. Third, you are mixing up coordinate systems.
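A sketch of the gesture-recognizer suggestion (the redViewTapped: handler name is just an example):

```objc
// In viewDidLoad, attach a tap recognizer directly to redView,
// so no coordinate conversion is needed at all:
UITapGestureRecognizer *tap =
    [[UITapGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(redViewTapped:)];
[redView addGestureRecognizer:tap];

// Elsewhere in the view controller:
- (void)redViewTapped:(UITapGestureRecognizer *)recognizer {
    NSLog(@"red got touched");
}
```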

Related

UISlider change values by touching its track

I'm a beginner in iOS and I have the basic UISlider functionality working: changing its value by dragging the thumb. Now, in my app, I want the thumb to move gradually toward the point where I touch the slider's track, without dragging the thumb itself. The thumb must reach the exact touched position on the track in two or more steps.
I know this requires UITouch, but I can't understand HOW.
You need to subclass UISlider to add this functionality. After subclassing, implement touchesBegan: and touchesMoved: in your subclass. Here's how I did it:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    // Use bounds, not frame: locationInView:self is in the slider's own coordinates.
    CGFloat minX = CGRectGetMinX(self.bounds);
    CGFloat maxX = CGRectGetMaxX(self.bounds);
    // Points on screen per unit of slider value; CGFloat avoids integer truncation.
    CGFloat pointsPerValue = (maxX - minX) / (self.maximumValue - self.minimumValue);
    CGFloat newValue = self.minimumValue + ([touch locationInView:self].x - minX) / pointsPerValue;
    [self setValue:newValue animated:YES];
    [self beginTrackingWithTouch:touch withEvent:event];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [self continueTrackingWithTouch:[touches anyObject] withEvent:event];
}
You'll need a bit more work to stabilise the behaviour on the user's first tap, but I hope you get the idea.

Dragging multiple images in iOS

I'm brand new to touch and drag, and I'm trying to make eight draggable images. Here's what it looks like in my ViewController.m file:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self.view];
    // check which peg is touched
    if ([touch view] == darkGreyPeg) { darkGreyPeg.center = location; }
    else if ([touch view] == brownPeg) { brownPeg.center = location; }
    // there are six more else ifs
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [self touchesBegan:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [self ifCollided];
}
If I take out the if statements and change locationInView:self.view to locationInView:touch.view, I'm able to drag the darkGreyPeg around with no problems, and it triggers my [ifCollided] whenever appropriate.
I'm watching a YouTube tutorial (Milmers Xcode tutorials, "Dragging Multiple Images"), and he has exactly this same type of code, but mine doesn't work. Can anyone point me in the direction of why?
Thanks.
Try checking User Interaction Enabled and Multiple Touch in your image views' attributes. Without user interaction enabled you can't drag them. I have tried this and successfully dragged multiple images, labels, and buttons.
You should not implement these methods in your ViewController, but in your View class that you want to be able to drag around instead.
I think the problem is using == to compare objects:
[touch view] == darkGreyPeg
You should use isEqual: to compare objects:
[[touch view] isEqual:darkGrayPeg]
After edit: I think the problem is that you forgot to set userInteractionEnabled to YES for the image views. Without that, touch.view will be the superview, not the image view. Also, you probably don't need all those if statements to determine which image view to move; you can just use touch.view:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self.view];
    if (![touch.view isEqual:self.view]) touch.view.center = location;
}
If you have other views, besides self.view that you don't want to move, then you either have to exclude them too, or use a different exclusion condition (like only move the object if it's an image view, or only an object with a certain tag value).
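A sketch of the tag-based exclusion mentioned above (it assumes the draggable pegs were given tag 100, an arbitrary example value, in Interface Builder or in code):

```objc
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    // Only views tagged as draggable pegs are moved; everything else is ignored.
    if (touch.view.tag == 100) {
        touch.view.center = [touch locationInView:self.view];
    }
}
```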

Sprite with userInteractionEnabled set to YES does not receive touches when covered by normal sprites

I put down a sprite A, subclassed to receive touches (userInteractionEnabled set to YES), and then on top of it a normal sprite B that does not take touches (userInteractionEnabled defaults to NO), completely covering sprite A.
Tapping on sprite B, I assumed sprite A would get the touch, but nothing happens. The relevant part of the docs is below.
Something is unclear here: it seems that sprite B still receives the touch but throws it away, OR sprite A is removed from the set of possible touch receivers because it's not visible.
From the docs:
https://developer.apple.com/library/ios/documentation/GraphicsAnimation/Conceptual/SpriteKit_PG/Nodes/Nodes.html#//apple_ref/doc/uid/TP40013043-CH3-SW7
For a node to be considered during hit-testing, its userInteractionEnabled property must be set to YES. The default value is NO for any node except a scene node. A node that wants to receive events needs to implement the appropriate responder methods from its parent class (UIResponder on iOS and NSResponder on OS X). This is one of the few places where you must implement platform-specific code in Sprite Kit.
Any way to fix this?
As long as something's userInteractionEnabled is NO, it shouldn't interfere with other touch receivers.
Update: Even setting sprite B's alpha to 0.2, making sprite A very visible, will not make sprite A touchable. Sprite B just totally "swallows" the touch, despite being not enabled for interaction.
Here is my solution until Apple updates Sprite Kit with the correct behaviour, or someone actually figures out how to use it the way we want:
https://gist.github.com/bobmoff/7110052
Add the file to your project and import it in your prefix header. Touches should now work as intended.
My solution doesn't require isKindOfClass sort of things...
In your SKScene interface:
@property (nonatomic, strong) SKNode *touchTargetNode;
// this will be the target node for touchesCancelled, touchesMoved, touchesEnded
In your SKScene implementation
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    NSArray *nodes = [self nodesAtPoint:[touch locationInNode:self]];
    self.touchTargetNode = nil;
    for (SKNode *node in [nodes reverseObjectEnumerator])
    {
        if ([node conformsToProtocol:@protocol(CSTouchableNode)])
        {
            SKNode<CSTouchableNode> *touchable = (SKNode<CSTouchableNode> *)node;
            if (touchable.wantsTouchEvents)
            {
                self.touchTargetNode = node;
                [node touchesBegan:touches withEvent:event];
                break;
            }
        }
    }
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self.touchTargetNode touchesCancelled:touches withEvent:event];
    self.touchTargetNode = nil;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self.touchTargetNode touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self.touchTargetNode touchesEnded:touches withEvent:event];
    self.touchTargetNode = nil;
}
You will need to define the CSTouchableNode protocol... call it what you wish :)
@protocol CSTouchableNode <NSObject>

- (BOOL)wantsTouchEvents;
- (void)setWantsTouchEvents:(BOOL)wantsTouchEvents;

@end
For your SKNodes that you want to be touchable, they need to conform to the CSTouchableNode protocol. You will need to add something like below to your classes as required.
@property (nonatomic, assign) BOOL wantsTouchEvents;
Doing this, tapping on nodes works as I would expect it to work. And it shouldn't break when Apple fixes the "userInteractionEnabled" bug. And yes, it is a bug. A dumb bug. Silly Apple.
Updated
There is another bug in Sprite Kit: the z-order of nodes is weird. I have seen strange orderings of nodes, so I tend to force a zPosition for nodes/scenes that need it.
I've updated my SKScene implementation to sort the nodes based on zPosition:
nodes = [nodes sortedArrayUsingDescriptors:@[[NSSortDescriptor sortDescriptorWithKey:@"zPosition" ascending:NO]]];
for (SKNode *node in nodes)
{
    ...
}
Here is an example of getting touches, etc.
The Scene
- (id)initWithSize:(CGSize)size {
    if (self = [super initWithSize:size]) {
        /* Setup your scene here */
        HeroSprite *newHero = [HeroSprite new];
        newHero.position = CGPointMake(CGRectGetMidX(self.frame), CGRectGetMidY(self.frame));
        [self addChild:newHero];

        SKSpriteNode *overLapSprite = [SKSpriteNode spriteNodeWithColor:[UIColor orangeColor] size:CGSizeMake(70, 70)];
        overLapSprite.position = CGPointMake(CGRectGetMidX(self.frame) + 30, CGRectGetMidY(self.frame));
        overLapSprite.name = @"overlap";
        [self addChild:overLapSprite];
    }
    return self;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInNode:self];
    SKNode *node = [self nodeAtPoint:location];
    if ([node.name isEqualToString:@"heroNode"]) {
        NSLog(@"Scene detected hit on hero");
    }
    if ([node.name isEqualToString:@"overlap"]) {
        NSLog(@"overlap detected hit");
    }
    // go through all the nodes at the point
    NSArray *allNodes = [self nodesAtPoint:location];
    for (SKNode *aNode in allNodes) {
        NSLog(@"Loop; node name = %@", aNode.name);
    }
}
The Subclassed sprite / Hero
- (id)init {
    if (self = [super init]) {
        [self setUpHeroDetails];
        self.userInteractionEnabled = YES;
    }
    return self;
}

- (void)setUpHeroDetails
{
    self.name = @"heroNode";
    SKSpriteNode *heroImage = [SKSpriteNode spriteNodeWithImageNamed:@"Spaceship"];
    [self addChild:heroImage];
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint locationA = [touch locationInNode:self];
    CGPoint locationB = [touch locationInNode:self.parent];
    NSLog(@"Hero got hit at %@ %@", NSStringFromCGPoint(locationA), NSStringFromCGPoint(locationB));
}

@end
I tried many things, but the touch doesn't go through the overlapping sprite. I guess you can use the scene class to detect the touch by looping through the nodes and then calling that node directly.
I had this problem when a particle emitter covered a button...
I think this could be a design choice, but it's certainly not very flexible. I prefer to handle objects that can overlap (like game objects) in Sprite Kit at the scene level, with nodeAtPoint: combined with isKindOfClass: checks. For objects like buttons that usually do not overlap and are placed in a different layer (like overlays and buttons), I handle user interaction inside their classes.
I would like my touch events to "bubble up" the node tree as they do in the Sparrow Framework, but I fear that is more a feature request than a bug report.
P.S.: By handling touches at scene level you can easily touch and move nodes even if they are completely covered by non-movable and non-touchable nodes!
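A sketch of the scene-level approach described above (GamePiece and its handleTouchAtPoint: method are hypothetical names for your own movable-node subclass and handler):

```objc
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint location = [[touches anyObject] locationInNode:self];
    // nodesAtPoint: returns every node under the touch, so covered
    // game pieces are still reachable.
    for (SKNode *node in [self nodesAtPoint:location]) {
        if ([node isKindOfClass:[GamePiece class]]) {
            [(GamePiece *)node handleTouchAtPoint:location];
            break;  // stop at the first game piece found
        }
    }
}
```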
A solution with fewer lines of code. Remember to set userInteractionEnabled on every node you want to receive touches.
- (void)handleTap:(UITapGestureRecognizer *)gestureRecognizer {
    CGPoint touchLocation = self.positionInScene;
    SKSpriteNode *touchedNode = (SKSpriteNode *)[self nodeAtPoint:touchLocation];
    NSArray *nodes = [self nodesAtPoint:touchLocation];
    for (SKSpriteNode *node in [nodes reverseObjectEnumerator]) {
        NSLog(@"Node in touch array: %@. Touch enabled: %hhd", node.name, node.isUserInteractionEnabled);
        if (node.isUserInteractionEnabled)
            touchedNode = node;
    }
    NSLog(@"Newly selected node: %@", touchedNode);
}

Dragging UIImages to the selected area

Newbie Obj-C question.
My task is to make a visually represented test for iPad. In the view I have three UIImages (with images of the answers to the test question) that will be dragged to an answer area. If the right image is selected, it should stay in the area; if not, it goes back to its start position. If the user stops dragging outside the answer area, it should also go back to the start position.
I tried to implement this: http://www.cocoacontrols.com/platforms/ios/controls/tkdragview but it's too hard for me because I am a super newbie.
I implemented simple dragging of the images with a pan recognizer, so each of the three images drags:
- (IBAction)controlPan:(UIPanGestureRecognizer *)recognizer {
    CGPoint translation = [recognizer translationInView:self.view];
    recognizer.view.center = CGPointMake(recognizer.view.center.x + translation.x,
                                         recognizer.view.center.y + translation.y);
    [recognizer setTranslation:CGPointMake(0, 0) inView:self.view];
}
If I'm thinking along the right lines, do I need to set CGPoints with the start and destination coordinates and then customize the pan gesture method? If that's not right, how should I do this?
You can consider these three methods; maybe they are easier for you:
– touchesBegan:withEvent:
– touchesMoved:withEvent:
– touchesEnded:withEvent:
A detailed demo for – touchesBegan:withEvent: is:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesBegan:touches withEvent:event];
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:touch.view];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesMoved:touches withEvent:event];
    UITouch *touch = [[event allTouches] anyObject];
    // Here I suppose self.imageView contains the image the user is dragging.
    CGPoint location = [touch locationInView:touch.view];
    // The check condition is that the image being dragged is entirely inside the answer area.
    if (CGRectContainsRect(self.answerArea.frame, self.imageView.frame)) {
        // set transition to another UIViewController
    } else {
        self.imageView.center = location; // Here I suppose you just move the image with the user's finger.
    }
}
Here is Apple's doc for this: UIResponder Class Reference.
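To get the snap-back behaviour the question asks for, the pan recognizer from the question can be extended with state checks. A sketch, assuming an answerArea view property and a startCenter instance variable (both hypothetical names):

```objc
- (IBAction)controlPan:(UIPanGestureRecognizer *)recognizer {
    if (recognizer.state == UIGestureRecognizerStateBegan) {
        startCenter = recognizer.view.center;  // remember the start position
    }

    CGPoint translation = [recognizer translationInView:self.view];
    recognizer.view.center = CGPointMake(recognizer.view.center.x + translation.x,
                                         recognizer.view.center.y + translation.y);
    [recognizer setTranslation:CGPointZero inView:self.view];

    if (recognizer.state == UIGestureRecognizerStateEnded &&
        !CGRectContainsRect(self.answerArea.frame, recognizer.view.frame)) {
        // Dropped outside the answer area: animate back to the start.
        [UIView animateWithDuration:0.3 animations:^{
            recognizer.view.center = startCenter;
        }];
    }
}
```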

touches in touch event handling in UIView vs UIViewControllers

So, I'm capturing multiple touches and determining the number and location of the touches, and I'm seeing different behavior between a UIView and a UIViewController.
Basically, the event handler is coded below (in this case touchesEnded, but it really doesn't matter). In the view, I get the entire collection of touches; in the controller, I only ever get one touch at a time. There is never a collection of touches, but touchesEnded gets executed for each touch that occurred at the same moment.
I add the subviews as identically as possible...
Can someone explain to me what's going on? Why doesn't the controller version give me the collection?
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    int total = 0;
    for (UITouch *touch in touches)
    {
        for (UIView *myView in [contentView subviews])
        {
            if (CGRectContainsPoint([myView frame], [touch locationInView:self.view]))
                total++; // this is always 1 for the UIViewController
                         // but I get the full number within the UIView version
        }
    }
}
I suspect you are getting the touches but that the views are different. Check that contentView is the same and has the same subviews.
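One more thing worth checking (an assumption, since the view setup isn't shown): multipleTouchEnabled defaults to NO on UIView, and without it a view is delivered at most one touch at a time, which would match the single-touch behaviour described:

```objc
// With multipleTouchEnabled left at NO, the touches set passed to
// touchesEnded:withEvent: contains at most one touch.
self.view.multipleTouchEnabled = YES;
contentView.multipleTouchEnabled = YES;  // and on the view being touched
```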
I am not sure that I am right, but this always works for me:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    if (touch.view == yourView)
    {
        NSLog(@"do your task here");
    }
}