I have a CCLayer subclass MyLayer in which I handle touch events:
- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
I set the content size of MyLayer instances like this:
`myLayer.contentSize = CGSizeMake(30.0, 30.0);`
I then add MyLayer instances as children of ParentLayer. For some reason I can tap anywhere on the screen and a MyLayer instance will detect the tap. I want to only detect taps on the visible portion/content size. How can I do this?
Are the MyLayer instances somehow inheriting a "tappable area" from somewhere else? I've verified that the contentSize of the instance just tapped is (30, 30) as expected. Perhaps the contentSize isn't the way to specify the tappable area of CCLayer subclass.
When touch handling is enabled on a particular CCLayer, it receives every touch event in the window. This means that if there are multiple layers, all of them receive the same touches.
To alleviate this, obtain the location from the UITouch, convert it to cocos2d coordinates, then check whether it falls within the bounds of the layer you are concerned with.
Here's some code to work with:
// ccl should be the layer you are hit-testing (e.g., your MyLayer instance),
// not a freshly allocated CCLayer.
CCLayer *ccl = myLayer;
// Convert the touch from UIKit view coordinates to cocos2d GL coordinates.
CGPoint location = [touch locationInView:[touch view]];
location = [[CCDirector sharedDirector] convertToGL:location];
// Bounding box; this assumes position marks the layer's center.
CGRect bounds = CGRectMake(ccl.position.x - ccl.contentSize.width / 2,
                           ccl.position.y - ccl.contentSize.height / 2,
                           ccl.contentSize.width, ccl.contentSize.height);
if (CGRectContainsPoint(bounds, location)) {
    // The touch is inside the layer's visible area; continue from there...
}
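Alternatively, CCNode can do the coordinate conversion for you. Here is a minimal sketch inside MyLayer's own ccTouchBegan:withEvent: (assuming cocos2d's convertTouchToNodeSpace: is available, as it is in 1.x and 2.x):
CGPoint p = [self convertTouchToNodeSpace:touch];
CGRect box = CGRectMake(0, 0, self.contentSize.width, self.contentSize.height);
// Returning YES claims the touch for this layer; NO lets it pass through.
return CGRectContainsPoint(box, p);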
I have a scene (SKScene) in which, whenever a click is performed, a ball (SKSpriteNode) is dropped from that point.
Now what I want to do is, whenever a click on the ball is performed, the ball should bounce or something.
What I have in GameScene.m is
- (void)mouseDown:(NSEvent *)theEvent {
    CGPoint location = [theEvent locationInNode:self];
    [self addBallAtLocation:location];
}

- (void)addBallAtLocation:(CGPoint)location {
    Ball *ball = [Ball new];
    ball.position = location;
    [self addChild:ball];
}
And in Ball.m I add the bounce action in the mouseDown method:
- (void)mouseDown:(NSEvent *)theEvent {
    CGPoint point = [theEvent locationInNode:self];
    CGVector impulse = CGVectorMake(point.x * 5.0, point.y * 5.0);
    [self.physicsBody applyImpulse:impulse];
}
Right now a new ball is created even when I click on an existing ball. I thought that the ball's mouseDown method would be called since I clicked on it, and that the scene's mouseDown would only be called if the ball's did not exist.
P.S. I have a feeling this could be solved with a delegate. I could very easily be wrong, but since I am not totally clear on how to use them, I didn't try. If you think that might be a good way to resolve this issue, please do use one, as it may help me understand how delegates work.
By default, SKSpriteNode has its userInteractionEnabled property set to NO (presumably for performance reasons). You simply have to set it to YES and your event-handling methods will be called :)
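For instance, a minimal sketch of Ball's initializer (assuming Ball subclasses SKSpriteNode; the texture name is illustrative):
- (instancetype)init {
    self = [super initWithImageNamed:@"ball"]; // hypothetical texture name
    if (self) {
        // Without this, mouseDown: on the ball is never called and the
        // click falls through to the scene, which adds a new ball instead.
        self.userInteractionEnabled = YES;
    }
    return self;
}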
I'm a newbie to this and am remaking an app. I am trying to use UITapGestureRecognizer. It works in the initial project file but not in the new one. The only difference is that the old one uses a navigation controller and mine doesn't.
In the new one, the result of [self distance:location to:centre] is stuck at 640 no matter where you press on the screen.
Can anyone help? I have no idea why it isn't working.
- (void)handleSingleTap:(UITapGestureRecognizer *)recognizer {
    CGPoint location = [recognizer locationInView:[recognizer.view superview]];
    CGPoint centre = CGPointMake(512, 768 / 2);
    NSInteger msg = [self distance:location to:centre];
    NSLog(@"location to centre: %ld", (long)msg);
    if ([self distance:location to:centre] < 330)
The part that looks suspicious to me is [recognizer.view superview].
When the gesture recognizer is added to self.view in a UIViewController that is not inside a container controller (e.g. a navigation controller), that view does not have a superview. If you send the superview message to self.view without a container view controller, it returns nil.
So your message was effectively this:
CGPoint location = [recognizer locationInView:nil];
This returns the location in the window, which is also a valid CGPoint that tells you where you tapped the screen.
Since this didn't work, I guess [self distance:location to:centre] does something that only works with coordinates relative to the view. Maybe it's related to rotation, because window coordinates don't rotate when you rotate the device.
Without knowing your code I'm not sure what the problem is, but it probably doesn't matter.
Just replace [recognizer.view superview] with recognizer.view.
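The corrected line would then read:
CGPoint location = [recognizer locationInView:recognizer.view];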
Refer to the link below; you may find your answer there. It's an example of gesture recognition.
http://www.techotopia.com/index.php/An_iPhone_iOS_6_Gesture_Recognition_Tutorial
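For reference, wiring up a tap recognizer takes only a couple of lines. A minimal sketch, reusing the handleSingleTap: handler from the question above:
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc]
    initWithTarget:self action:@selector(handleSingleTap:)];
[self.view addGestureRecognizer:tap];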
I'm trying to make a cocos2d scene on top of some UIKit views. I understand that in cocos2d 2.0 CCDirector is a subclass of UIViewController, so I can just add its view as a subview.
I've written this function in my appDelegate (currently the delegate of my MainWindow.xib, following http://www.raywenderlich.com/4817/how-to-integrate-cocos2d-and-uikit) to handle switching to a scene:
- (void)switchToScene
{
    self.viewController = [CCDirector sharedDirector];
    // Center the director's view in the window and add it above the UIKit views.
    [[CCDirector sharedDirector] view].center = self.window.center;
    [self.window addSubview:[self.viewController view]];
    [[CCDirector sharedDirector] runWithScene:
        [CCTransitionSlideInL transitionWithDuration:1 scene:[BattleLayer scene]]];
}
I know I'm missing quite a bit from this code, but with it I get something resembling a scene to load: http://i.stack.imgur.com/pubWE.png
My questions are:
1: Am I going in the right direction?
2: If so, is the next step to manually adjust the orientation of the scene?
2.1: How do I do that? :)
I'm not sure how you want it to look, but it seems that the CCDirector's view is rotated 90 degrees CW. This may be for a few reasons:
you added the view wrong (i.e. rotated 90 degrees CW)
you rotated the device and the CCDirector didn't rotate (in which case you should look at the orientation parameters and at who and what rotates; it may be that the app rotates but the CCDirector doesn't receive the rotation flag)
you can rotate the CCDirector's view with CGAffineTransformRotate (I'm not sure, but because CCDirector is a subclass of UIViewController, it should work); see the sketch below
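A hedged sketch of that last idea, assuming the view simply needs to be turned back a quarter turn (untested):
// Rotate the director's view 90 degrees counterclockwise.
UIView *directorView = [[CCDirector sharedDirector] view];
directorView.transform = CGAffineTransformMakeRotation(-M_PI_2);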
I solved the issue by creating a UINavigationController:
- (void)switchToScene:(CCScene *)sceneObj
{
    // Wrapping the director in a navigation controller and making it the
    // window's root view controller lets UIKit manage its orientation.
    navController_ = [[UINavigationController alloc] initWithRootViewController:director_];
    [window_ setRootViewController:navController_];
    [[CCDirector sharedDirector] pushScene:sceneObj];
}
I am using a UIPanGestureRecognizer on a UIView that is in a UICollectionViewCell. I am using the translationInView method to get the translation along the X axis and slide the UIView left and right. It works fine, but now I cannot scroll up and down in the CollectionView if my finger is on a cell. Is there a way to make the PanGestureRecognizer pass the vertical scrolling to the UICollectionView?
I am trying to replicate the sliding seen in Reeder.app for iOS.
Once the UIPanGestureRecognizer activates in your UIView, touches are no longer forwarded through the responder chain; because of that, your UICollectionView stops receiving them.
You can try setting your UIView as the delegate of your UIPanGestureRecognizer and putting the logic that decides whether the recognizer should activate there. You have to implement this delegate method in your UIView:
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer;
For example, with the code below you tell it not to activate when the velocity on the view's y axis is greater than or equal to the velocity on its x axis (thereby sending the touches to the UICollectionView underneath):
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer {
    UIPanGestureRecognizer *recognizer = (UIPanGestureRecognizer *)gestureRecognizer;
    CGPoint velocity = [recognizer velocityInView:self];
    // Use fabs() for CGFloat values; abs() truncates them to integers.
    if (fabs(velocity.y) >= fabs(velocity.x)) {
        return NO;  // mostly vertical: let the UICollectionView scroll
    }
    return YES;     // mostly horizontal: activate the pan recognizer
}
Hope this is helpful.
As you all know, in Cocoa Touch the GUI is made up of several UIViews. In my view I have a UIImageView (containing an image) aligned at x:10 y:10, then a UILabel aligned at x:30 y:10, and finally another UILabel aligned at x:50 y:10.
Now I would like all of these views to respond to the same touch event. What would be the easiest way to accomplish this? Would it be to create a UIView that starts at x:10 y:10 and covers all the views, and then place this view on top of them (-bringSubviewToFront)?
Cheers,
Peter
I think your variant is fine! Just create a UIView on top and catch all the touch events with it.
Or you can put all your subviews (image, label, label) onto one subview of the main view, say mergeView, disable user interaction on those subviews (image, label, label), and add a gesture recognizer, or whatever you want to catch touches with, to mergeView. With this approach it is easier to reposition your views: just reposition mergeView. Don't forget to enable user interaction on mergeView. A sketch follows below.
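A minimal sketch of that container approach (the frame, view names, and handleTap: selector are illustrative):
UIView *mergeView = [[UIView alloc] initWithFrame:CGRectMake(10, 10, 200, 40)]; // hypothetical frame
mergeView.userInteractionEnabled = YES;

// Touches pass straight through these to mergeView.
imageView.userInteractionEnabled = NO;
label1.userInteractionEnabled = NO; // UILabel already defaults to NO
label2.userInteractionEnabled = NO;

[mergeView addSubview:imageView];
[mergeView addSubview:label1];
[mergeView addSubview:label2];
[self.view addSubview:mergeView];

UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc]
    initWithTarget:self action:@selector(handleTap:)];
[mergeView addGestureRecognizer:tap];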
Catch the touch events on the top view and explicitly pass them on to the other views you are interested in. But make sure the touch point intersects the views to which you are forwarding the touch events:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[touches allObjects] objectAtIndex:0];
    // view1 and view2 are assumed to be subviews of topView, so their
    // frames are in the same coordinate space as this point.
    CGPoint point = [touch locationInView:topView];
    if (CGRectContainsPoint(view1.frame, point)) {
        [view1 touchesBegan:[NSSet setWithObject:touch] withEvent:event];
    }
    if (CGRectContainsPoint(view2.frame, point)) {
        [view2 touchesBegan:[NSSet setWithObject:touch] withEvent:event];
    }
}