Why won't my Cocos2d test app fire "touchesBegan" events? - objective-c

In my app delegate, I made sure I have the line:
[glView setMultipleTouchEnabled: YES];
And I have a simple layer meant only to figure out how multi-touch works. The .mm file looks like:
#import "TestLayer.h"

@implementation TestLayer

-(id) init
{
    if( (self=[super init])) {
        [[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:self priority:0 swallowsTouches:YES];
    }
    return self;
}

-(void) draw
{
    [super draw];
    glColor4f(1.0, 0.0, 0.0, 0.35);
    glLineWidth(6.0f);
    ccDrawCircle(ccp(500,500), 250, CC_DEGREES_TO_RADIANS(360), 60, YES);
}

-(void) ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"got some touches");
}

-(void) ccTouchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"some touches moved.");
}

-(BOOL) ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
    NSLog(@"a touch began");
    return FALSE;
}

@end
When I touch the screen, I always see "a touch began", but no matter how I touch it (simulator or actual device), I never see "some touches moved" or "got some touches".
Is there something further I need to do to make multi-touch work?
Specifically, I'm just trying to get basic pinch-to-zoom functionality working... I heard there is some sort of gesture recognizer for iPhone; does it work with Cocos2D? Would it work even if I can't get simple multi-touch events to fire?

UIGestureRecognizers absolutely work with Cocos2D; I have used them myself. You just need to add them to the correct view:
[[[CCDirector sharedDirector] openGLView] addGestureRecognizer:myGestureRecognizer];
Regarding your touches, did you enable them for the layer you are working in?
self.isTouchEnabled = YES;
In any case, you shouldn't use the addTargetedDelegate method; take a look here.
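For the pinch-to-zoom part, here is a minimal sketch, assuming the cocos2d 1.x style openGLView shown above; handlePinch: and myLayer are illustrative names, not part of your code:
// Setup, e.g. in the app delegate after the director/glView are configured:
UIPinchGestureRecognizer *pinch =
    [[UIPinchGestureRecognizer alloc] initWithTarget:self
                                              action:@selector(handlePinch:)];
[[[CCDirector sharedDirector] openGLView] addGestureRecognizer:pinch];

// Illustrative handler: zooms a node by the pinch factor.
- (void)handlePinch:(UIPinchGestureRecognizer *)recognizer
{
    if (recognizer.state == UIGestureRecognizerStateChanged) {
        myLayer.scale *= recognizer.scale;  // myLayer is whatever node you want to zoom
        recognizer.scale = 1.0f;            // reset so each callback delivers a delta
    }
}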

Add self.isTouchEnabled = YES; to your init, and for the gesture recognizers look at the other answer.
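For example, a minimal sketch of the layer's init, assuming you drop the addTargetedDelegate call and rely only on the standard (ccTouches...) handlers:
-(id) init
{
    if( (self = [super init]) ) {
        // Registers this layer for the standard ccTouchesBegan/Moved/Ended handlers;
        // no CCTouchDispatcher call is needed for these.
        self.isTouchEnabled = YES;
    }
    return self;
}

-(void) ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"got some touches");
}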

Related

How to remove a CCnode with a touch?

I need help: I have a CCNode that constantly respawns at different locations, and I add the nodes to an array to keep track of which sprites are on screen, but I can't get them removed. The touches are detected, but the node isn't removed. Any ideas? Here is the code I'm using to get rid of the node.
- (void)touchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
    CGPoint location = [touch locationInView:[touch view]];
    location = [[CCDirector sharedDirector] convertToUI:location];
    for (CCNode *sprite in _spritesOnScreen) {
        if (CGPointEqualToPoint(sprite.position, location)) {
            [_spritesOnScreen removeObject:sprite];
            [self removeChild:sprite cleanup:YES];
        }
    }
}
Allow me to offer a slightly different approach. Subclass CCNode as CCAppleNode, and within the CCAppleNode.m file detect touches and call removeFromParentAndCleanup: in touchBegan. This way the CCAppleNode class takes on the responsibility of removing itself from its parent when it is touched, taking that responsibility away from your main game scene.
-(void) touchBegan:(UITouch *)touch withEvent:(UIEvent *)event{
    [self removeFromParentAndCleanup:YES];
    [super touchBegan:touch withEvent:event];
}
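If you prefer to keep the original array-based approach, note (as an assumption about why the removal never fires) that a touch location will almost never be exactly equal to a sprite's position, so CGPointEqualToPoint rarely matches; testing containment in the node's boundingBox is more forgiving. A rough sketch, also iterating over a copy so the array can be mutated safely:
- (void)touchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
    CGPoint location = [touch locationInView:[touch view]];
    // Convert from UIKit view coordinates to cocos2d GL coordinates.
    location = [[CCDirector sharedDirector] convertToGL:location];

    // Iterate over a copy so _spritesOnScreen can be mutated inside the loop.
    for (CCNode *sprite in [_spritesOnScreen copy]) {
        if (CGRectContainsPoint(sprite.boundingBox, location)) {
            [_spritesOnScreen removeObject:sprite];
            [self removeChild:sprite cleanup:YES];
        }
    }
}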

Custom gesture not sending action message

I have created a UIGestureRecognizer subclass called LongPressGestureRecognizer to simulate a long-press gesture. (Yes, I know about the concrete subclass that already exists; I'm just learning Objective-C and experimenting a bit.)
I have overridden only the following methods:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    self.state = UIGestureRecognizerStatePossible;
    [self performSelector:@selector(setState:) withObject:[NSNumber numberWithInt:UIGestureRecognizerStateRecognized] afterDelay:2];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    [NSObject cancelPreviousPerformRequestsWithTarget:self];
    self.state = UIGestureRecognizerStateCancelled;
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [NSObject cancelPreviousPerformRequestsWithTarget:self];
    self.state = UIGestureRecognizerStateFailed;
}
And in my view controller, with a property recognizer to store the gesture recognizer, I have the following code:
- (LongPressGestureRecognizer *)recognizer
{
    if (!_recognizer) {
        _recognizer = [[LongPressGestureRecognizer alloc] init];
    }
    return _recognizer;
}

- (void)viewDidLoad
{
    [super viewDidLoad];
    [self.recognizer addTarget:self action:@selector(log:)];
    [self.view addGestureRecognizer:self.recognizer];
}

- (IBAction)log:(LongPressGestureRecognizer *)recognizer
{
    //blah blah blah
}
My problem is that log: is not getting called at all... By logging the UIGestureRecognizerState values in the console, I know that the gesture recognizer is working as expected as far as states are concerned...
What am I doing wrong here?
There may be other issues as well, but I can tell you that your -performSelector:withObject:afterDelay: call isn't going to work the way you're hoping; you'll end up passing a pointer to an NSNumber object instead of an int, so state will be set to some kind of junk value. Create a method that will do self.state = UIGestureRecognizerStateRecognized and call that instead.
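A sketch of that suggestion (the recognize helper name is made up here):
// Subclasses need #import <UIKit/UIGestureRecognizerSubclass.h> to assign self.state.
- (void)recognize
{
    self.state = UIGestureRecognizerStateRecognized;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    self.state = UIGestureRecognizerStatePossible;
    // Pass no argument; the helper sets the enum value directly.
    [self performSelector:@selector(recognize) withObject:nil afterDelay:2];
}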

Hiding Keyboard in xcode storyboard

I'm new to Xcode. I'm creating a simple login form in Xcode 4.2 and I would like to hide the keyboard. I think I have the correct code; the tutorial says I need to change the class of the view to UIControl, but there is no option for this. Is there another way when working with storyboards?
- (IBAction)backGroundTouched:(id)sender
{
    [emailTextField resignFirstResponder];
    [passTextField resignFirstResponder];
}
Assuming you are doing this inside the view controller, invoke
[self.view endEditing:YES];
If your two text fields are subviews of some higher-level view, you can also use [higherLevelView endEditing:YES]; and not care which subview is currently active.
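For instance, the backGroundTouched: action from the question could simply become (a sketch):
- (IBAction)backGroundTouched:(id)sender
{
    // Dismisses the keyboard for whichever text field is currently first responder.
    [self.view endEditing:YES];
}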
Make sure both of your text fields are connected to their IBOutlets.
No need to change UIView to UIControl.
// Connect every text field's "Did End On Exit" event to this method.
-(IBAction)textFieldReturn:(id)sender
{
    [sender resignFirstResponder];
}

// Also use this method if you want to hide the keyboard when the user touches the background.
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [emailTextField resignFirstResponder];
    [passTextField resignFirstResponder];
}
I followed this tutorial: http://www.techotopia.com/index.php/Writing_iOS_7_Code_to_Hide_the_Keyboard and it's working for me:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    if ([_textField isFirstResponder] && [touch view] != _textField) {
        [_textField resignFirstResponder];
    }
    [super touchesBegan:touches withEvent:event];
}

Dragging a tableview cell that contains uibuttons

I'm trying to implement a UITableView whose rows can be dragged right and left (revealing something behind them).
The code works fine; I've implemented it using the following methods:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
My problem is that the rows also contain UIButtons: a tap on a button should register as a click, but dragging on it should drag the entire cell.
I've found this solution, which basically bubbles the events up when clicking on the UIButtons:
[super touchesBegan:touches withEvent:event];
[self.nextResponder touchesBegan:touches withEvent:event];
But it seems that the touchesMoved event only bubbles once.
I've seen all sorts of questions in this area. Example. But I don't see any solution or responses.
Any help, suggestion or creative workaround would be appreciated!
Instead of implementing touchesBegan, etc. why not use a UIPanGestureRecognizer? I tested this with just a simple rectangular view which was mostly covered by a UIButton. The view was dragged, no matter where I touched, and the button method fired if I clicked over the button.
@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    UIPanGestureRecognizer *panGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePanGesture:)];
    [self.theView addGestureRecognizer:panGesture]; // theView is an IBOutlet for the small view containing a button
}

- (void)viewDidAppear:(BOOL)animated {
    self.currentViewFrame = self.theView.frame;
}

- (IBAction)handlePanGesture:(UIPanGestureRecognizer *)sender {
    CGPoint translate = [sender translationInView:self.view];
    CGRect newFrame = self.currentViewFrame;
    newFrame.origin.x += translate.x;
    newFrame.origin.y += translate.y;
    sender.view.frame = newFrame;
    if (sender.state == UIGestureRecognizerStateEnded)
        self.currentViewFrame = newFrame;
}

- (IBAction)doClick:(id)sender {
    NSLog(@"click");
}

@end
Just check to see which one was touched using the tag system.
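If several views share the same pan handler and you need to tell them apart, one option is to check the view's tag inside the handler (the tag value below is a made-up example):
// At setup time, give each draggable view a distinct tag:
self.theView.tag = 1;

// Inside handlePanGesture:
if (sender.view.tag == 1) {
    // this pan came from theView
}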

ccTouchesBegan vs ccTouchBegan - Touch Detection & SIGABRT crash

I'm currently working on an application where I'm trying to detect the user's touch positions.
In the process of implementing the "detect touch-position" feature, I changed from ccTouchBegan to ccTouchesBegan.
But I can't get it to work. I'm now using:
-(void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
instead of using:
-(BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event{
When I did this, the whole thing crashes when I tap the screen, generating a SIGABRT error highlighting:
#ifdef __IPHONE_OS_VERSION_MAX_ALLOWED
-(BOOL) ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
    NSAssert(NO, @"Layer#ccTouchBegan override me");
    return YES;
}
#endif
@end
So my questions are:
Why do you think it crashes?
What's the difference between ccTouchBegan & ccTouchesBegan? Multi-touch abilities?
For further help, this is my code:
-(id) init
{
    if( (self=[super init])) {
        [CCTexture2D setDefaultAlphaPixelFormat:kCCTexture2DPixelFormat_RGBA8888];
        self.isTouchEnabled = YES;

        // Set up background
        background = [CCSprite spriteWithFile:@"Image.png"];
        background.scaleX = 1;
        background.scaleY = 1;
        background.position = ccp(0,0);
        [self addChild:background];

        [[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:self
                                                         priority:0
                                                  swallowsTouches:YES];

        // Preload sound effect
        soundFile = [SimpleAudioEngine sharedEngine];
        if (soundFile != nil) {
            [soundFile preloadBackgroundMusic:@"sound.wav"];
        }
    }
    return self;
}
-(void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event{
    UITouch *touch = [touches anyObject];
    NSLog(@"ccTouchesBegan");

    // Sets the sound variable to YES
    ifOne = YES;

    prevPos = [touch locationInView:[touch view]];
    prevPos = [[CCDirector sharedDirector] convertToGL:[touch locationInView:[touch view]]];

    [self schedule:@selector(timerUpdate:) interval:0.1];
    //return YES;
}
The targeted touch dispatcher is a nice feature in cocos2d that lets you swallow touches when you want to handle only a single touch event, but a targeted delegate must implement ccTouchBegan; since you removed that method, the default CCLayer implementation (the NSAssert you see) fires.
Try adding this function to your class:
- (void) registerWithTouchDispatcher {
    [[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:self priority:INT_MIN+1 swallowsTouches:YES];
}
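Alternatively, if you don't need touch swallowing at all, a sketch of the simpler route: drop the addTargetedDelegate call from init, keep self.isTouchEnabled = YES, and read the position from the NSSet-based handler:
-(void) ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    // Convert from UIKit view coordinates to cocos2d (GL) coordinates.
    CGPoint pos = [[CCDirector sharedDirector] convertToGL:[touch locationInView:[touch view]]];
    NSLog(@"ccTouchesBegan at %@", NSStringFromCGPoint(pos));
}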
ccTouchesBegan happens the second you tap the screen
ccTouchesEnded happens the second you let go after tapping on the screen
You say you are now using
-(void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
instead of
-(BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
Try using
-(void)ccTouchesBegan:(UITouch *)...
or
-(void)ccTouchesBegan:(NSSet *)...
or
-(BOOL)ccTouchesBegan:(NSSet *)...
Your problem may just be a mismatch in the method signature; my advice is to try switching the touch types around.
I would give more info, but you didn't provide a lot of information to work with, so this is the best I can do.