UIImageview and TouchesEnded Event - objective-c

I'm trying to write a simple program that takes 5 images and lets you drag them from the bottom of the screen and snap them onto 5 other images at the top, in any order you like. I have subclassed UIImageView with a new class and added touchesBegan, touchesMoved, and touchesEnded. Then, in my main view controller, I place the images and set their class to the new subclass I created. The movement works great; in fact, I am NSLogging the coordinates in the custom class.
The problem is figuring out how to get that CGPoint from touchesEnded out of the custom class and have it call a method in my main view controller (which owns the image objects), so I can test whether the dragged image is over another image and, if so, center it on the image it's over (snap onto it).
Here is the code in my custom class's .m file. My view controller is basically empty, with a xib containing 10 image views, 5 of which are set to this class and 5 of which are normal image views.
#import "MoveImage.h"
CGPoint EndPoint;
#implementation MoveImage
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
startPoint = [[touches anyObject] locationInView:self];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
CGPoint newPoint = [[touches anyObject] locationInView:self.superview];
newPoint.x -= startPoint.x;
newPoint.y -= startPoint.y;
CGRect frm = [self frame];
frm.origin = newPoint;
[self setFrame:frm];
}
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event{
UITouch *end = [[event allTouches] anyObject];
EndPoint = [end locationInView:self];
NSLog(#"end ponts x : %f y : %f", EndPoint.x, EndPoint.y);
}
#end

You could write a MoveImageDelegate protocol which your app controller would then implement. This protocol would declare methods that your MoveImage class can call to pass its information back to the controller.
In your header file (in case you don't know how to write a protocol):
@protocol MoveImageDelegate <TypeOfObjectsThatCanImplementIt> // usually NSObject
// methods
@end
Then you can declare an instance variable of type id and still have the compiler recognize that the variable responds to your delegate selectors (so you can code without those annoying warnings).
Something like:
@interface MoveImage : UIImageView
{
    id <MoveImageDelegate> _delegate;
}

@property (assign) id <MoveImageDelegate> delegate;
Lastly:
@implementation MoveImage

@synthesize delegate = _delegate;
And your delegate is complete.
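To make the snap-to-target flow concrete, here is a minimal sketch of the complete pattern. The method name moveImage:didEndDraggingAtPoint: and the targetImageViews collection are illustrative names, not part of the original code:

// MoveImage.h (sketch)
#import <UIKit/UIKit.h>

@class MoveImage;

@protocol MoveImageDelegate <NSObject>
// Hypothetical callback: reports where the drag ended, in superview coordinates.
- (void)moveImage:(MoveImage *)image didEndDraggingAtPoint:(CGPoint)point;
@end

@interface MoveImage : UIImageView
@property (assign) id <MoveImageDelegate> delegate;
@end

// MoveImage.m (sketch) - notify the delegate when the drag ends
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    // Report the end point in the superview's coordinates, since the
    // view controller will hit-test against the other views' frames.
    CGPoint endPoint = [[touches anyObject] locationInView:self.superview];
    [self.delegate moveImage:self didEndDraggingAtPoint:endPoint];
}

// ViewController.m (sketch) - adopt MoveImageDelegate and snap to a target
- (void)moveImage:(MoveImage *)image didEndDraggingAtPoint:(CGPoint)point {
    for (UIImageView *target in self.targetImageViews) { // hypothetical array of the 5 plain image views
        if (CGRectContainsPoint(target.frame, point)) {
            image.center = target.center; // snap onto the image it's over
            break;
        }
    }
}

Remember to set each MoveImage's delegate (for example, in viewDidLoad) so the callback actually fires.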

Related

View jumps back after setting center in iOS 8

I'm a beginner with Xcode, and I have a problem with the score counter in my app.
When I use a timer to count points, the player (a paddle) jumps back to where it
started (in the middle) every time the timer counts a point.
What should I do to keep the paddle where it is?
Is there some other way to count points?
It doesn't matter whether the player is controlled with a swipe or the accelerometer;
in this example I control it with a swipe.
The problem appears only on iOS 8.0 and later, not on iOS 7.
Here is some code:
.h file:
int ScoreNumber;

@interface ViewController : UIViewController {
    IBOutlet UILabel *ScoreLabel;
    NSTimer *Timer;
}

@property (nonatomic, strong) IBOutlet UIImageView *paddle;
@property (nonatomic) CGPoint paddleCenterPoint;

@end
.m file:
@implementation ViewController

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint touchLocation = [touch locationInView:self.view];
    CGFloat yPoint = self.paddleCenterPoint.y;
    CGPoint paddleCenter = CGPointMake(touchLocation.x, yPoint);
    self.paddle.center = paddleCenter;
}

- (void)Score {
    ScoreNumber = ScoreNumber + 1;
    ScoreLabel.text = [NSString stringWithFormat:@"%i", ScoreNumber];
}

- (void)viewDidLoad {
    Timer = [NSTimer scheduledTimerWithTimeInterval:1 target:self selector:@selector(Score) userInfo:nil repeats:YES];
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
}
In iOS 8, autolayout is king, and setting center doesn't change the layout constraints. This has nothing to do with the timer. It has to do with changing the label's text. When that happens, it triggers a layout of the whole view, and that reasserts the constraints (via layoutIfNeeded). Even though you're probably not setting any constraints in IB, Xcode will automatically insert them during the build.
There are several solutions. First, you could insert the paddle programmatically in viewDidLoad, which would bypass the constraints system. For this specific problem, that's probably ok (and it might even be how I'd do it personally). Something like this:
@interface ViewController ()
@property (nonatomic, strong) UIView *paddle;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    self.paddle = [[UIView alloc] initWithFrame:CGRectMake(100, 100, 200, 50)];
    self.paddle.backgroundColor = [UIColor blueColor];
    [self.view addSubview:self.paddle];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint touchLocation = [touch locationInView:self.view];
    self.paddle.center = CGPointMake(touchLocation.x, self.paddle.center.y);
}

@end
But how can we solve this without throwing away constraints? Well, we could just modify the constraint. Add your paddle view in IB, and add constraints for the left, bottom, width, and height. Then create an IB outlet for the horizontal and width constraints.
Edit the Horizontal Space Constraint to remove "Relative to margin" from both the first and second items:
Now, you can modify the constraint rather than the center:
@interface ViewController ()
@property (nonatomic, weak) IBOutlet UIView *paddle;
@property (weak, nonatomic) IBOutlet NSLayoutConstraint *paddleHorizontalConstraint;
@property (weak, nonatomic) IBOutlet NSLayoutConstraint *paddleWidthConstraint;
@end

@implementation ViewController

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint touchLocation = [touch locationInView:self.view];
    CGFloat width = self.paddleWidthConstraint.constant;
    self.paddleHorizontalConstraint.constant = touchLocation.x - width / 2;
}

@end
You need to read the width constraint because the frame or bounds may not be correct until viewDidLayoutSubviews is called. But you don't want to update the constraint there, since that would force another layout.

With autolayout, label position is reset when I change label text

I am developing a very simple application in Objective-C.
In the app, the user can change the position of a label by dragging on the screen.
Tap and drag, and the label moves up and down to match the position of the finger.
When you release your finger, the label's coordinate information is set as its text.
But the label's position is reset the moment its text is changed.
// IBOutlet UILabel *label

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint touch = [[touches anyObject] locationInView:self.view];
    label.center = CGPointMake(label.center.x, touch.y);
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint touch = [[touches anyObject] locationInView:self.view];
    label.center = CGPointMake(label.center.x, touch.y);
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint touch = [[touches anyObject] locationInView:self.view];
    label.text = [NSString stringWithFormat:@"y = %f", touch.y];
}
In the touchesEnded event I only change the text of the label,
but its position is reset anyway.
I tried changing the touchesEnded event as follows, but it didn't solve the problem:
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint touch = [[touches anyObject] locationInView:self.view];
    label.text = [NSString stringWithFormat:@"y = %f", touch.y];
    label.center = CGPointMake(label.center.x, touch.y); // add this line
}
I want to resolve this weird behavior without unchecking "Use Auto Layout";
I'd like to keep using Auto Layout.
I have four screenshots of my app.
The first image is the Storyboard; I have a label with Auto Layout constraints.
The second image is a screenshot taken right after launching the app.
The third image shows the user dragging the screen,
with the label moving down to follow the finger.
The fourth image is right after the finger is released,
when the label text has changed.
You cannot use auto layout and change the center / frame of things; those are opposites. Do one or the other - not both.
So, you don't have to turn off auto layout, but if you don't, then you must use auto layout, and auto layout only, to position things.
When you move the label, do not change its center - change its constraints. Or at least, having changed its center, change its constraints to match.
Otherwise, when layout occurs, the constraints will put it back where you have told them to put it. And layout does occur when you change the text, and at many other times.
Solution by OP.
A constraint can be connected as an IBOutlet in the Storyboard,
just like a label or any other IBOutlet:
// ViewController.h
#import <UIKit/UIKit.h>

@interface ViewController : UIViewController {
    IBOutlet UILabel *label;
    IBOutlet NSLayoutConstraint *lcLabelTop;
}
@end

// ViewController.m
- (void)viewDidLoad {
    [super viewDidLoad];
    UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(panAction:)];
    [self.view addGestureRecognizer:pan];
}

- (void)panAction:(UIPanGestureRecognizer *)sender
{
    CGPoint pan = [sender translationInView:self.view];
    lcLabelTop.constant += pan.y;
    if (sender.state == UIGestureRecognizerStateEnded) {
        label.text = [NSString stringWithFormat:@"y = %f", lcLabelTop.constant];
    }
    [sender setTranslation:CGPointZero inView:self.view];
}

Trouble detecting a touch on a nested custom sprite layer

I have a layer, HelloWorldLayer below, where touch works anywhere, but I'd like it to work only when a sprite in the layer (turtle below) is touched.
If I try to add self.isTouchEnabled = YES; to the CCTurtle class, it says
property isTouchEnabled not found on object type CCTurtle
My output reads as follows:
2013-01-08 20:30:14.767 FlashToCocosARC[6746:d503] cocos2d: deallocing
2013-01-08 20:30:15.245 FlashToCocosARC[6746:d503] playing walk animation2
Here's my HelloWorldLayer code:
#import "HelloWorldLayer.h"
#import "CCTurtle.h"
#implementation HelloWorldLayer
+(CCScene *) scene
{
CCScene *scene = [CCScene node];
HelloWorldLayer *layer = [HelloWorldLayer node];
[scene addChild: layer];
return scene;
}
-(id) init
{
if( (self=[super init])) {
turtle= [[CCTurtle alloc] init];
[turtle setPosition:ccp(300, 100)];
[self addChild:turtle];
///addChild:child z:z tag:aTag;
self.isTouchEnabled = YES;
turtle. tag=4;
//
}
return self;
}
//- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
//{
// // Processing all touches for multi-touch support
// UITouch *touch = [touches anyObject];
// if ([[touch view] isKindOfClass:[turtle class]]) {
// NSLog(#"[touch view].tag = %d", [touch view].tag);
// [self toggleTurtle];
// }
//}
-(BOOL)containsTouch:(UITouch *)touch {
CGRect r=[turtle textureRect];
CGPoint p=[turtle convertTouchToNodeSpace:touch];
return CGRectContainsPoint(r, p );
}
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
//////GENERAL TOUCH SCREEN
for (UITouch *touch in touches) {
CGPoint touchLocation = [touch locationInView:[touch view]];
touchLocation = [[CCDirector sharedDirector] convertToGL:touchLocation];
[self toggleTurtle];
/////
}
}
-(void) toggleTurtle
{
NSLog(#"playing walk animation2");
[turtle playAnimation:#"walk_in" loop:NO wait:YES];
}
#end
//hello world.h
#import "cocos2d.h"
#import "CCTurtle.h"

@interface HelloWorldLayer : CCLayer
{
    CCTurtle *turtle;
}
+(CCScene *) scene;
@end
//CCTurtle.h
#import <Foundation/Foundation.h>
#import "FTCCharacter.h"

@interface CCTurtle : FTCCharacter <FTCCharacterDelegate, CCTargetedTouchDelegate>
{
}
@end
I'm using cocos2d v1.0.1 (ARC enabled), and am testing on the iPad 4.3 simulator.
With thanks, Natalie
I've tried to put the touches directly into CCTurtle.m so it can handle its own touches using CCTargetedTouchDelegate as above. I changed the files to this, trying a different way of finding the touched area...
- (CGRect)rect
{
    CGSize s = [self.texture contentSize];
    return CGRectMake(-s.width / 2, -s.height / 2, s.width, s.height);
}

-(BOOL) didTouch:(UITouch *)touch {
    return CGRectContainsPoint(self.rect, [self convertTouchToNodeSpaceAR:touch]);
    //return CGRectContainsPoint( [self rect], [self convertTouchToNodeSpaceAR: touch] );
}

-(BOOL) ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
    NSLog(@"attempting touch.");
    if ([self didTouch:touch]) {
        return [self tsTouchBegan:touch withEvent:event];
    }
    return NO;
}
But it still won't compile, as it still returns the error "Property 'isTouchEnabled' not found on object type 'CCTurtle *'".
I'm really not sure what I can do to get it to run now, and I really need to get this working. (I suppose I could make invisible buttons, but it would be nicer to hit-test the CCTurtle properly and to understand what I'm doing wrong.) I hope someone can help.
First of all, I cannot see the containsTouch: method being called anywhere. And here are several pieces of advice:
Use boundingBox instead of textureRect to get the local rect of your node (your turtle, in this case). Or just move the containsTouch: method into your turtle class to encapsulate it (a sketch follows below). This can be helpful, for example, if you want to make the touch area of your turtle bigger or smaller: you will just need to change one little method in your turtle class.
In your ccTouchesBegan:withEvent: method, check for every turtle whether it is hit by the touch. Then, for example, you can create a dictionary with the touch as the key and an array of the corresponding turtles as the value. You then just need to update all the turtles' positions for a moved touch in your ccTouchesMoved:withEvent: method, and remove the array of turtles from the dictionary in the ccTouchesEnded:withEvent: and ccTouchCancelled:withEvent: methods.
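Here is a minimal sketch of that encapsulated hit test, assuming cocos2d 1.x. Note that boundingBox is expressed in the parent node's coordinate space, so the touch is converted into that space before testing:

// In CCTurtle.m (sketch)
- (BOOL)containsTouch:(UITouch *)touch
{
    // boundingBox is in the parent's coordinate space, so convert
    // the touch into the parent's space, not the turtle's own space.
    CGPoint p = [self.parent convertTouchToNodeSpace:touch];
    return CGRectContainsPoint([self boundingBox], p);
}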
If you want your CCTurtle object to accept targeted touches, you can do so by specifying that it conforms to the CCTargetedTouchDelegate protocol. In your CCTurtle @interface, you declare it like so:
@interface CCTurtle : CCNode <CCTargetedTouchDelegate>
Then in the implementation, you tell it to accept touches and implement the ccTouch methods:
@implementation CCTurtle

-(id) init {
    if ((self = [super init])) {
        // ...
        [[CCDirector sharedDirector].touchDispatcher addTargetedDelegate:self priority:1 swallowsTouches:YES];
    }
    return self;
}

-(void) onExit {
    [[CCDirector sharedDirector].touchDispatcher removeDelegate:self];
    [super onExit];
}

-(BOOL) ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
    CGPoint location = [[CCDirector sharedDirector] convertToGL:[touch locationInView:[touch view]]];
    if (CGRectContainsPoint([self boundingBox], location)) {
        // some code here
        return YES; // claim this touch
    }
    return NO;
}

-(void) ccTouchMoved:(UITouch *)touch withEvent:(UIEvent *)event { /* code here */ }
-(void) ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event { /* code here */ }
-(void) ccTouchCancelled:(UITouch *)touch withEvent:(UIEvent *)event { /* code here */ }
Thanks to Prototypical, the issue was solved.
Although the CCTurtle was a layer with a sprite on it, the sprites were nested,
which meant that cocos2d had problems creating the correct bounding box for my
containsTouch method.
So with a bit of magic he combined his "get full bounding box" method with the containsTouch method to account for the children of the sprite, and the code now relies on collision detection to process the touch.
Currently I'm happily working away on some nice icons for him in return,
but I wanted to say thank you to everyone who helped, and I hope this method comes in handy for anyone with the same problem!

How to detect when no fingers are on the screen?

I need some help with the touchesEnded function. I want to start an NSTimer when there are no fingers on the screen, using the touchesEnded function. Is this possible? Currently I have a touchesBegan function working :-).
Here's my code:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSUInteger numTaps = [[touches anyObject] tapCount];
    if (numTaps >= 2)
    {
        self.label.text = [NSString stringWithFormat:@"%d", numTaps];
        self.label2.text = @"+1";
        [self.button setHidden:YES];
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
}

- (void)showButton
{
    [self.button setHidden:NO];
}
It's tricky. You don't get an event that tells you that all fingers are up; each touchesEnded only tells you that this finger is up. So you have to keep track of touches yourself. The usual technique is to store touches in a mutable set as they arrive. But you can't store the touches themselves, so you have to store an identifier instead.
Start by putting a category on UITouch, to derive this unique identifier:
@interface UITouch (additions)
- (NSString *)uid;
@end

@implementation UITouch (additions)
- (NSString *)uid {
    // The touch's memory address is stable for its lifetime,
    // so it works as a unique identifier.
    return [NSString stringWithFormat:@"%p", self];
}
@end
Now we must maintain our mutable set throughout the period while we have touches, creating it on our first touch and destroying it when our last touch is gone. I'm presuming you have an NSMutableSet instance variable / property (called touchIDs below). Here is the pseudo-code, filled in:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // create the mutable set if it doesn't exist
    if (!self.touchIDs)
        self.touchIDs = [NSMutableSet set];
    // then add each touch's uid to the set with addObject:
    for (UITouch *t in touches)
        [self.touchIDs addObject:[t uid]];
    // and then go on and do whatever else you want to do
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    // remove the uid of *every* touch that has ended from our mutable set
    for (UITouch *t in touches)
        [self.touchIDs removeObject:[t uid]];
    // if *all* touches are gone, nilify the set -
    // now you know that all fingers are up
    if ([self.touchIDs count] == 0)
        self.touchIDs = nil;
}
To create an idle timer, the simplest way is to create a custom subclass of UIApplication, and override sendEvent: - here, call super, and reset your timer. This will cover all touches for you. Every touch your app receives goes through this method.
To use the custom application subclass you need to modify the call in main.m to UIApplicationMain(), inserting your custom class name.
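Here is a minimal sketch of that idle-timer plumbing; the class name MyApplication and the notification name are illustrative, not part of the original answer:

// MyApplication.h
#import <UIKit/UIKit.h>

@interface MyApplication : UIApplication
@end

// MyApplication.m
#import "MyApplication.h"

@implementation MyApplication

- (void)sendEvent:(UIEvent *)event {
    [super sendEvent:event]; // deliver the event normally first
    if (event.type == UIEventTypeTouches) {
        // Every touch in the app passes through here; broadcast so any
        // interested controller can reset its idle NSTimer.
        [[NSNotificationCenter defaultCenter] postNotificationName:@"AppTouchActivityNotification"
                                                            object:nil];
    }
}

@end

// main.m - pass the custom class name to UIApplicationMain
#import <UIKit/UIKit.h>
#import "AppDelegate.h"

int main(int argc, char *argv[]) {
    @autoreleasepool {
        return UIApplicationMain(argc, argv, @"MyApplication", NSStringFromClass([AppDelegate class]));
    }
}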

-(void)touchesBegan (TO A SPECIFIED UIIMAGEVIEW)

Hi, I want to make it so that when you touch an image that I put on the view, -(void)checkcollision runs. Right now -(void)checkcollision runs, but when I touch anything at all. How can I make it work only when a specified image is touched, e.g. a pimple:
IBOutlet UIImageView *pimple;
@property (nonatomic, retain) UIImageView *pimple;
Here is the touchesBegan code:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *myTouch = [[event allTouches] anyObject];
    [self checkcollision];
}

-(void)checkcollision {
    if ([label.text isEqualToString:@"0"]) {
        label.text = @"1";
    }
    pimple.hidden = YES;
}
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *myTouch = [[event allTouches] anyObject];
    CGPoint point = [myTouch locationInView:pimple];
    if (CGRectContainsPoint(pimple.bounds, point)) {
        // ... Touch detected.
        [self checkcollision];
    }
}
Alternatively you can consider gesture recognizers. You can use a tap recognizer for this case.
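A minimal sketch of the tap-recognizer approach (the handler name handlePimpleTap: is illustrative); note that UIImageView has userInteractionEnabled set to NO by default, so it must be enabled:

// e.g. in viewDidLoad
pimple.userInteractionEnabled = YES; // UIImageView defaults to NO
UITapGestureRecognizer *tap =
    [[UITapGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(handlePimpleTap:)];
[pimple addGestureRecognizer:tap];

// Fires only for taps that land on the pimple image view.
- (void)handlePimpleTap:(UITapGestureRecognizer *)recognizer {
    [self checkcollision];
}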
There are two options:
Subclass UIImageView and define your own touch handler.
In - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event, check which view was touched and compare it to your UIImageView instance.