Touching objects from mutable array - Objective-C

I have created some buttons and I keep them in an NSMutableArray.
The view where they appear implements a few of the simple SDK methods for handling touches.
The problem is detecting touches on exactly these objects from the mutable array and nothing else.
Here is the touchesBegan: method:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if (isEarthquake == NO) {
        for (UITouch *touch in touches) {
            // iterate over all menu buttons rather than a hard-coded count
            for (NSUInteger i = 0; i < [menuButtons count]; i++) {
                UIButton *menuButton = [menuButtons objectAtIndex:i];
                if (CGRectContainsPoint([menuButton frame], [touch locationInView:self.view])) {
                    [self startTouchTimer:3.00];
                }
            }
        }
    }
}
isEarthquake is a simple BOOL that determines whether the action can be performed.
After that I want to check all the objects to see whether they are being touched.
What's wrong?
Thanks in advance.

touchesBegan:withEvent: is a view method, not a view controller method. You are using
locationInView:self.view
but inside a UIView subclass this should be
locationInView:self
Check that this code is in a view subclass and not a view controller subclass.
Also, you might need to use
[self convertRect:[menuButton bounds] fromView:menuButton];
to convert the CGRect to the proper coordinate space.
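Putting both fixes together, a minimal sketch of what the handler could look like in a UIView subclass (menuButtons, isEarthquake and startTouchTimer: are assumed from the question):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if (isEarthquake) return;
    for (UITouch *touch in touches) {
        CGPoint point = [touch locationInView:self]; // point in this view's coordinate space
        for (UIButton *menuButton in menuButtons) {
            // convert the button's bounds into this view's coordinate space before hit-testing
            CGRect buttonRect = [self convertRect:[menuButton bounds] fromView:menuButton];
            if (CGRectContainsPoint(buttonRect, point)) {
                [self startTouchTimer:3.00];
            }
        }
    }
}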

Related

Touch not detected - subview

I have been trying to figure this out for some time now. I am using Auto Layout with a scroll view. Inside the scroll view is a UIView, another UIView, and then the subview within that. I can't get touch to work with the subview. I have tried touchesBegan: along with a UITapGestureRecognizer with no success. Any suggestions? Here is the layout hierarchy.
ScrollView
---- UIView
-------- UIView
------------ UIView <- need to detect touch here
EDIT:
Sorry guys, was at work. Here is some code. I have subclassed my scroll view to receive touches, as that seems to work best so far for getting down to the subview level of the other views' subviews.
The following code detects that the view is the right class without the pointInside: check. I am currently having issues getting the right point for the view.
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    UIView *relatedVidsView = [[[touches allObjects] firstObject] view];
    for (NSUInteger i = 0; i < relatedVidsView.subviews.count; i++) {
        UIView *view = [relatedVidsView.subviews objectAtIndex:i];
        // Note: this point is in the scroll view's coordinate space, while
        // pointInside:withEvent: expects a point in the subview's own coordinates.
        CGPoint touchPoint = [[[touches allObjects] firstObject] locationInView:self];
        if ([view isKindOfClass:[RecipeVideoView class]] && [view pointInside:touchPoint withEvent:nil]) {
            NSLog(@"TEST");
        }
    }
    [super touchesBegan:touches withEvent:event];
}
EDIT 2:
So I was able to figure this out. I subclassed the view containing the subviews I needed touch events for, and passed the events from the parent down to the subviews using touchesBegan. I will post my code later; thanks for the help anyway.
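The code was never posted, but a minimal sketch of the forwarding approach described above might look like this (assuming the container view forwards to RecipeVideoView subviews):

- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[touches allObjects] firstObject];
    for (UIView *subview in self.subviews) {
        // convert the touch into the subview's own coordinate space
        CGPoint point = [touch locationInView:subview];
        if ([subview isKindOfClass:[RecipeVideoView class]] &&
            [subview pointInside:point withEvent:event]) {
            // hand the event to the subview that was actually hit
            [subview touchesBegan:touches withEvent:event];
        }
    }
    [super touchesBegan:touches withEvent:event];
}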

Sprite with userInteractionEnabled set to YES does not receive touches when covered by normal sprites

I put down a sprite A (subclassed to receive touches, with userInteractionEnabled set to YES), and then a normal sprite B that does not take touches (userInteractionEnabled left at its default NO) on top of it, completely covering sprite A.
Tapping on sprite B, I assumed that sprite A would get the touch, but nothing happens. The relevant part of the docs is below.
Something seems unclear here, because it looks as if sprite B receives the touch but throws it away, OR sprite A is excluded as a possible touch receiver because it's not visible.
From the docs:
https://developer.apple.com/library/ios/documentation/GraphicsAnimation/Conceptual/SpriteKit_PG/Nodes/Nodes.html#//apple_ref/doc/uid/TP40013043-CH3-SW7
For a node to be considered during hit-testing, its userInteractionEnabled property must be set to YES. The default value is NO for any node except a scene node. A node that wants to receive events needs to implement the appropriate responder methods from its parent class (UIResponder on iOS and NSResponder on OS X). This is one of the few places where you must implement platform-specific code in Sprite Kit.
Any way to fix this?
As long as something's userInteractionEnabled is NO, it shouldn't interfere with other touch receivers.
Update: Even setting sprite B's alpha to 0.2, making sprite A clearly visible, will not make sprite A touchable. Sprite B simply "swallows" the touch, despite not being enabled for interaction.
Here is my solution until Apple updates SpriteKit with the correct behaviour, or someone actually figures out how to use it the way we want:
https://gist.github.com/bobmoff/7110052
Add the file to your project and import it in your Prefix header. Touches should now work as intended.
My solution doesn't require isKindOfClass: checks and the like...
In your SKScene interface:
@property (nonatomic, strong) SKNode *touchTargetNode;
// this will be the target node for touchesCancelled, touchesMoved, touchesEnded
In your SKScene implementation
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    NSArray *nodes = [self nodesAtPoint:[touch locationInNode:self]];
    self.touchTargetNode = nil;
    for (SKNode *node in [nodes reverseObjectEnumerator])
    {
        if ([node conformsToProtocol:@protocol(CSTouchableNode)])
        {
            SKNode<CSTouchableNode> *touchable = (SKNode<CSTouchableNode> *)node;
            if (touchable.wantsTouchEvents)
            {
                self.touchTargetNode = node;
                [node touchesBegan:touches withEvent:event];
                break;
            }
        }
    }
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self.touchTargetNode touchesCancelled:touches withEvent:event];
    self.touchTargetNode = nil;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self.touchTargetNode touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self.touchTargetNode touchesEnded:touches withEvent:event];
    self.touchTargetNode = nil;
}
You will need to define the CSTouchableNode protocol... call it what you wish :)
@protocol CSTouchableNode <NSObject>
- (BOOL)wantsTouchEvents;
- (void)setWantsTouchEvents:(BOOL)wantsTouchEvents;
@end
For your SKNodes that you want to be touchable, they need to conform to the CSTouchableNode protocol. You will need to add something like below to your classes as required.
@property (nonatomic, assign) BOOL wantsTouchEvents;
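For instance, a minimal conforming node might look like this (CSButtonNode is a hypothetical name used purely for illustration):

#import <SpriteKit/SpriteKit.h>

@interface CSButtonNode : SKSpriteNode <CSTouchableNode>
@property (nonatomic, assign) BOOL wantsTouchEvents;
@end

@implementation CSButtonNode
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // the scene forwards the touch here when this node is the topmost
    // node whose wantsTouchEvents is YES
    NSLog(@"button node touched");
}
@end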
Doing this, tapping on nodes works as I would expect it to work. And it shouldn't break when Apple fixes the "userInteractionEnabled" bug. And yes, it is a bug. A dumb bug. Silly Apple.
Updated
There is another bug in SpriteKit... the z-order of nodes is weird. I have seen strange orderings of nodes, so I tend to force a zPosition for nodes/scenes that need it.
I've updated my SKScene implementation to sort the nodes based on zPosition:
nodes = [nodes sortedArrayUsingDescriptors:@[[NSSortDescriptor sortDescriptorWithKey:@"zPosition" ascending:NO]]];
for (SKNode *node in nodes)
{
    ...
}
Here is an example of getting touches:
The Scene
-(id)initWithSize:(CGSize)size {
    if (self = [super initWithSize:size]) {
        /* Setup your scene here */
        HeroSprite *newHero = [HeroSprite new];
        newHero.position = CGPointMake(CGRectGetMidX(self.frame), CGRectGetMidY(self.frame));
        [self addChild:newHero];

        SKSpriteNode *overLapSprite = [SKSpriteNode spriteNodeWithColor:[UIColor orangeColor] size:CGSizeMake(70, 70)];
        overLapSprite.position = CGPointMake(CGRectGetMidX(self.frame) + 30, CGRectGetMidY(self.frame));
        overLapSprite.name = @"overlap";
        [self addChild:overLapSprite];
    }
    return self;
}
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInNode:self];
    SKNode *node = [self nodeAtPoint:location];
    if ([node.name isEqualToString:@"heroNode"]) {
        NSLog(@"Scene detect hit on hero");
    }
    if ([node.name isEqualToString:@"overlap"]) {
        NSLog(@"overlap detect hit");
    }
    // go through all the nodes at the point
    NSArray *allNodes = [self nodesAtPoint:location];
    for (SKNode *aNode in allNodes) {
        NSLog(@"Loop; node name = %@", aNode.name);
    }
}
The Subclassed sprite / Hero
- (id)init {
    if (self = [super init]) {
        [self setUpHeroDetails];
        self.userInteractionEnabled = YES;
    }
    return self;
}

-(void)setUpHeroDetails
{
    self.name = @"heroNode";
    SKSpriteNode *heroImage = [SKSpriteNode spriteNodeWithImageNamed:@"Spaceship"];
    [self addChild:heroImage];
}

-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint locationA = [touch locationInNode:self];
    CGPoint locationB = [touch locationInNode:self.parent];
    NSLog(@"Hero got hit at %@ %@", NSStringFromCGPoint(locationA), NSStringFromCGPoint(locationB));
}
@end
I tried many things, but the touch doesn't go through the overlapping sprite. I guess you can use the scene class to detect the touch by looping through the nodes at that point and then calling that node directly, as sketched below.
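A minimal sketch of that idea, for illustration (using the heroNode name from the example above):

-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInNode:self];
    for (SKNode *aNode in [self nodesAtPoint:location]) {
        if ([aNode.name isEqualToString:@"heroNode"]) {
            // call the covered node directly, bypassing the overlapping sprite
            [aNode touchesBegan:touches withEvent:event];
            break;
        }
    }
}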
I had this problem when a particle emitter covered a button...
I think this could be a design choice, but it is certainly not very flexible. I prefer to handle objects that can overlap (like game objects) in Sprite Kit at scene level, with nodeAtPoint: combined with isKindOfClass: checks. For objects like buttons that usually do not overlap and are placed in a different layer (like overlays and buttons), I handle user interaction inside their classes.
I would like my touch events to "bubble up" the node tree as they do in the Sparrow Framework, but I fear that is more a feature request than a bug report.
P.S.: By handling touches at scene level you can easily touch and move nodes even if they are completely covered by non-movable and non-touchable nodes!
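As a sketch of that scene-level approach, dragging a covered node (GamePieceNode is a hypothetical stand-in for your game-object class):

-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInNode:self];
    // find the topmost game object under the finger, even if a
    // non-touchable node is drawn above it
    for (SKNode *node in [self nodesAtPoint:location]) {
        if ([node isKindOfClass:[GamePieceNode class]]) {
            node.position = location; // drag the covered node
            break;
        }
    }
}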
A solution with fewer lines of code. Remember to set userInteractionEnabled on every node you want to receive touches.
- (void)handleTap:(UITapGestureRecognizer *)gestureRecognizer {
    // convert the tap from view coordinates into scene coordinates
    CGPoint touchLocation = [self convertPointFromView:[gestureRecognizer locationInView:self.view]];
    SKSpriteNode *touchedNode = (SKSpriteNode *)[self nodeAtPoint:touchLocation];
    NSArray *nodes = [self nodesAtPoint:touchLocation];
    for (SKSpriteNode *node in [nodes reverseObjectEnumerator]) {
        NSLog(@"Node in touch array: %@. Touch enabled: %hhd", node.name, node.isUserInteractionEnabled);
        if (node.isUserInteractionEnabled)
            touchedNode = node;
    }
    NSLog(@"Newly selected node: %@", touchedNode);
}

Can I detect if the user has touched anywhere except an annotation on the map?

I am building an app and I would like to know when the user has touched an annotation or anywhere else on the map. I have a button that I would like to display only if an annotation has been selected. So if, after selecting an annotation, the user touches anywhere else on the map (that is not another annotation), the button should become invisible.
At the moment I have tried the touchesEnded method, but the problem is that it does not distinguish between annotations and land.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if ([touches isMemberOfClass:[BuildingViewController class]])
        printf("Building");
    else
        printf("Land!");
}
Thanks in advance.
touches is an NSSet, so you can get a UITouch object out of it and check that object's class membership. For example:
UITouch *touch = [touches anyObject];
You can get the UIView from the touch:
UIView *touchedView = touch.view;
Then check this UIView's class and compare it with yours:
[touchedView isMemberOfClass:[BuildingView class]]
I would also advise you to check all the touches in the NSSet:
BOOL isBuilding = NO;
for (UITouch *touch in touches) {
    if ([touch.view isMemberOfClass:[BuildingView class]]) {
        isBuilding = YES;
        break;
    }
}
if (isBuilding) {
    printf("Building");
} else {
    printf("Land!");
}

Retrieve and change the point at which touchUpInside changes to touchUpOutside

I have made a UISlider work just like the "slide to unlock" slider.
What I need to do is determine the point at which lifting your finger off is classed as touchUpOUTSIDE and not touchUpINSIDE. This is the point where you slide your finger past the end of the slider too far.
I guess it's the same as with a UIButton: you can press the button, then slide your finger off it, and depending on how far you go it can still be classed as a touch up inside.
If possible, I'd like to mark the target area with a circle.
Once I've managed to find where this point is, is it possible to change it, so I could have a bigger target area?
I really don't know where to start with this. Thanks.
According to the docs, the UIControlEventTouchUpOutside event is triggered when the finger is outside the bounds of the control. If you're trying to change that area, the slider will scale with it. Why not just tie the action for UIControlEventTouchUpOutside to the same as UIControlEventTouchUpInside?
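For example, a minimal sketch of wiring both events to the same handler (mySlider and unlockSlider: are assumed names):

[mySlider addTarget:self
             action:@selector(unlockSlider:)
   forControlEvents:UIControlEventTouchUpInside | UIControlEventTouchUpOutside];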
It's taken me a few hours, but I've managed to sort this.
I've done a lot of testing overriding touchesMoved, touchesEnded and sendAction:to:forEvent:, and it seems that any touch within 70px of the frame counts as a touch INSIDE. So for a UISlider that's 292x52, any touch from x:-70 to x:362 or y:-70 to y:122 will count as an inside touch, even though it's outside the frame.
I have come up with this code, which overrides a custom subclass to allow a bigger area of 100px around the frame to count as an inside touch:
#import "UICustomSlider.h"
#implementation UICustomSlider {
BOOL callTouchInside;
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
callTouchInside = NO;
[super touchesMoved:touches withEvent:event];
}
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
CGPoint touchLocation = [[touches anyObject] locationInView:self];
if (touchLocation.x > -100 && touchLocation.x < self.bounds.size.width +100 && touchLocation.y > -100 && touchLocation.y < self.bounds.size.height +100) callTouchInside = YES;
[super touchesEnded:touches withEvent:event];
}
-(void)sendAction:(SEL)action to:(id)target forEvent:(UIEvent *)event
{
if (action == #selector(sliderTouchOutside)) { // This is the selector used for UIControlEventTouchUpOutside
if (callTouchInside == YES) {
NSLog(#"Overriding an outside touch to be an inside touch");
[self sendAction:#selector(UnLockIt) to:target forEvent:event]; // This is the selector used for UIControlEventTouchUpInside
} else {
[super sendAction:action to:target forEvent:event];
}
} else {
[super sendAction:action to:target forEvent:event];
}
}
With a little bit more tweaking I should be able to use it for the opposite as well (treating a closer touch as an outside touch).

touches in touch event handling in UIView vs UIViewControllers

So, I'm capturing multiple touches and determining the number and location of each touch, and I'm seeing different behavior between a UIView and a UIViewController.
Basically, the event handler is coded below (in this case touchesEnded, but it really doesn't matter). In the view, I get the entire collection of touches, but in the controller I only ever get one touch at a time; there is never a collection of touches, but touchesEnded gets executed once for each touch that occurred in the same moment.
I add the subviews as identically as possible...
Can someone explain to me what's going on? Why doesn't the controller version give me the collection?
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    int total = 0;
    for (UITouch *touch in touches)
    {
        for (UIView *myView in [contentView subviews])
        {
            if (CGRectContainsPoint([myView frame], [touch locationInView:self.view]))
                total++; // this is always 1 for the UIViewController,
                         // but I get the full number within the UIView version
        }
    }
}
I suspect you are getting the touches but that the views are different. Check that contentView is the same and has the same subviews.
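As a quick check of that suspicion, a small debugging sketch (contentView as in the question) that logs what each version actually sees:

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // compare these counts between the UIView and UIViewController versions
    NSLog(@"touches in set: %lu, subviews: %lu",
          (unsigned long)touches.count,
          (unsigned long)contentView.subviews.count);
}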
I am not sure that I am right, but this always works for me:
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    if (touch.view == yourView)
    {
        NSLog(@"do your task here");
    }
}