I am developing a very simple application in Objective-C.
In the app, the user can change the position of a label by dragging on the screen.
When you tap and drag, the label moves up and down to follow the finger.
When you release your finger, the label's coordinate is written into its text.
But the label's position is reset the moment its text changes.
// IBOutlet UILabel *label
- (void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
CGPoint touch = [[touches anyObject] locationInView:self.view];
label.center = CGPointMake(label.center.x, touch.y);
}
- (void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
CGPoint touch = [[touches anyObject] locationInView:self.view];
label.center = CGPointMake(label.center.x, touch.y);
}
- (void) touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
CGPoint touch = [[touches anyObject] locationInView:self.view];
label.text = [NSString stringWithFormat:@"y = %f", touch.y];
}
In touchesEnded I only change the text of the label,
yet its position gets reset.
I tried changing touchesEnded as follows, but it didn't solve the problem.
- (void) touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
CGPoint touch = [[touches anyObject] locationInView:self.view];
label.text = [NSString stringWithFormat:@"y = %f", touch.y];
label.center = CGPointMake(label.center.x, touch.y); // add this line
}
I want to resolve this weird behavior without unchecking "Use Auto Layout".
I'd like to keep using Auto Layout.
I have four screenshots of my app.
The first image is the storyboard, where the label has Auto Layout constraints.
The second is a screenshot taken right after launching the app.
The third shows the label moving down to follow the finger while the user drags the screen.
The fourth is taken right after the finger is released, when the label's text has changed.
You cannot use auto layout and change the center / frame of things; those are opposites. Do one or the other - not both.
So, you don't have to turn off auto layout, but if you don't, then you must use auto layout, and auto layout only, to position things.
When you move the label, do not change its center - change its constraints. Or at least, having changed its center, change its constraints to match.
Otherwise, when layout occurs, the constraints will put it back where you have told them to put it. And layout does occur when you change the text, and at many other times.
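For example, a minimal sketch of driving the drag through a constraint (it assumes an IBOutlet named labelTopConstraint connected to the label's vertical constraint; that outlet is not in the original code):
// A minimal sketch, assuming an IBOutlet NSLayoutConstraint *labelTopConstraint
// connected to the label's top (vertical spacing) constraint in the storyboard.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint touch = [[touches anyObject] locationInView:self.view];
    // Drive the position through the constraint rather than the center,
    // so the next layout pass keeps the label under the finger.
    labelTopConstraint.constant = touch.y;
}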
Solution by OP.
A constraint can be connected as an IBOutlet from the storyboard,
just like the label or any other outlet:
// ViewController.h
#import <UIKit/UIKit.h>
@interface ViewController : UIViewController {
IBOutlet UILabel *label;
IBOutlet NSLayoutConstraint *lcLabelTop;
}
@end
// ViewController.m
- (void)viewDidLoad {
[super viewDidLoad];
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(panAction:)];
[self.view addGestureRecognizer:pan];
}
- (void)panAction:(UIPanGestureRecognizer *)sender
{
CGPoint pan = [sender translationInView:self.view];
lcLabelTop.constant += pan.y;
if (sender.state == UIGestureRecognizerStateEnded) {
label.text = [NSString stringWithFormat:@"y = %f", lcLabelTop.constant];
}
[sender setTranslation:CGPointZero inView:self.view];
}
I recently came across an issue where I had a superview that needed to be swipeable and a subview that also needed to be swipeable. The interaction was that the subview should be the only one swiped if the swipe occurred within its bounds. If the swipe happened outside of the subview, the superview should handle the swipe.
I couldn't find any answers that solved this exact problem, and eventually came up with a hacky solution that I thought I'd post in case it helps others.
Edit:
A better solution is now marked as the right answer.
Changed title from "Ignore touch events..." to "Ignore gestures..."
If you are looking for a better solution, you can use the gestureRecognizer:shouldReceiveTouch: delegate method to make the parent view's recognizer ignore the touch.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
shouldReceiveTouch:(UITouch *)touch{
UIView* swipeableSubview = ...; //set to the subview that can be swiped
CGPoint locationInSubview = [touch locationInView:swipeableSubview];
BOOL touchIsInSubview = [swipeableSubview pointInside:locationInSubview withEvent:nil];
return !touchIsInSubview;
}
This will make sure the parent only receives the swipe if the swipe does not start on the swipeable subview.
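For this delegate method to be called at all, the parent view's recognizer needs its delegate set. A rough setup, where parentView and handleParentSwipe: are assumed names, not from the original post:
// Assumed names: parentView, handleParentSwipe:. The class that implements
// gestureRecognizer:shouldReceiveTouch: must adopt <UIGestureRecognizerDelegate>.
UISwipeGestureRecognizer *parentSwipe = [[UISwipeGestureRecognizer alloc]
    initWithTarget:self action:@selector(handleParentSwipe:)];
parentSwipe.delegate = self;
[parentView addGestureRecognizer:parentSwipe];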
The original, hacky solution below catches each touch and removes the gestures if the touch happened within a given set of views, then re-adds them after the gesture recognizer has handled the touch.
@interface TouchIgnorer : UIView
@property (nonatomic) NSMutableSet * ignoreOnViews;
@property (nonatomic) NSMutableSet * gesturesToIgnore;
@end
@implementation TouchIgnorer
- (id) init
{
self = [super init];
if (self)
{
_ignoreOnViews = [[NSMutableSet alloc] init];
_gesturesToIgnore = [[NSMutableSet alloc] init];
}
return self;
}
- (BOOL) pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
CGPoint relativePt;
for (UIView * view in _ignoreOnViews)
{
relativePt = [self convertPoint:point toView:view];
if (!view.isHidden && CGRectContainsPoint(view.bounds, relativePt))
{
for (UIGestureRecognizer * gesture in _gesturesToIgnore)
{
[self removeGestureRecognizer:gesture];
}
[self performSelector:@selector(rebindGestures) withObject:self afterDelay:0];
break;
}
}
return [super pointInside:point withEvent:event];
}
- (void) rebindGestures
{
for (UIGestureRecognizer * gesture in _gesturesToIgnore)
{
[self addGestureRecognizer:gesture];
}
}
@end
I'm a beginner at iOS development and I'm working on a side project for my school. What I would like to do is have a small image that the user can drag more images out of. Essentially I have an app where I'm generating squares, and I need a fixed source square that the user can touch and drag new squares from onto a board. I have achieved this to a certain point, but it's not the desired behavior yet. Currently the user has to touch the button and then release before they can interact with the new square to drag it off of the button. What I would like is to have the touch event somehow passed over to the newly created UIImageView object, so dragging out a new square is seamless. Below is my current code.
//UIViewController
#import <UIKit/UIKit.h>
#import "Tile.h"
@interface MenuViewController : UIViewController
{
Tile *tileObject;
}
@end
#import "MenuViewController.h"
@implementation MenuViewController
- (IBAction)createTile:(UIButton *)sender {
CGFloat tileX = sender.center.x;
CGFloat tileY = sender.center.y;
tileObject = [[Tile alloc]initWithX:tileX andY:tileY];
[self.view addSubview:tileObject];
}
- (void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
// Release any cached data, images, etc that aren't in use.
}
#pragma mark - View lifecycle
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
}
- (void)viewDidUnload
{
[super viewDidUnload];
// Release any retained subviews of the main view.
// e.g. self.myOutlet = nil;
}
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
// Return YES for supported orientations
return YES;
}
@end
//Custom Tile Class
#import <UIKit/UIKit.h>
@interface Tile : UIImageView
{
CGPoint startLocation;
}
- (id)init; //Default Initialization
- (id)initWithX:(CGFloat)x andY:(CGFloat)y; //Initialization with coordinate points from single screen tap
@end
#import "Tile.h"
@implementation Tile
//Default initialization
-(id) init
{
self = [self initWithX:0 andY:0];
return self;
}
//Initialization with coordinate points from single screen tap
- (id)initWithX:(CGFloat)centerX andY:(CGFloat)centerY
{
//Creates a UIImageView object from an image file
UIImage *image = [UIImage imageNamed:@"redblock.png"];
self = [super initWithImage:image];
//Used to center the image under the single screen tap
centerX -= (image.size.width/2);
centerY -= (image.size.height/2);
//Sets the position of the image
self.frame = CGRectMake(centerX, centerY, image.size.width, image.size.height);
self.userInteractionEnabled = YES; //required for interacting with UIImageViews
return self;
}
/* Methods from Stack Overflow
http://stackoverflow.com/questions/1991899/uiimage-detecting-touch-and-dragging
Referenced from http://www.iphoneexamples.com/
*/
- (void) touchesBegan:(NSSet*)touches withEvent:(UIEvent*)event {
// Retrieve the touch point
CGPoint point = [[touches anyObject] locationInView:self];
startLocation = point;
[[self superview] bringSubviewToFront:self];
}
- (void) touchesMoved:(NSSet*)touches withEvent:(UIEvent*)event {
// Move relative to the original touch point
CGPoint point = [[touches anyObject] locationInView:self];
CGRect frame = [self frame];
frame.origin.x += point.x - startLocation.x;
frame.origin.y += point.y - startLocation.y;
[self setFrame:frame];
}
/*
end of methods from Stack Overflow
*/
@end
If the initial touch down point can be an image view with the image you want to drag as its image, then you can do it like this:
In your MenuViewController class, put this in viewDidLoad (and delete the button and its action method):
- (void)viewDidLoad
{
[super viewDidLoad];
tileObject = [[Tile alloc]initWithX:160 andY:200];
[self.view addSubview:tileObject];
}
Then in your Tile class, in the touchesBegan:withEvent: method you create a new tile, and carry out the drag as you did before:
- (void) touchesBegan:(NSSet*)touches withEvent:(UIEvent*)event {
Tile *newTile = [[Tile alloc]initWithX:160 andY:200];
[self.superview addSubview:newTile];
// Retrieve the touch point
CGPoint point = [[touches anyObject] locationInView:self];
startLocation = point;
[[self superview] bringSubviewToFront:self];
}
Instead of using a button with a Touch Down event, you need to:
1) Detect touchesBegan and create your tile
2) Detect touchesMoved and move your tile
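A rough sketch of those two steps in the view controller follows; sourceTileView is an assumed name for the image view the user drags from, and it is not in the original code:
// Sketch only. Assumes sourceTileView has userInteractionEnabled == NO (the
// UIImageView default) so the controller's view receives the touches.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self.view];
    // Only start a drag if the touch landed on the source square.
    if (CGRectContainsPoint(sourceTileView.frame, point)) {
        tileObject = [[Tile alloc] initWithX:point.x andY:point.y];
        [self.view addSubview:tileObject];
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self.view];
    // Keep the freshly created tile under the finger.
    tileObject.center = point;
}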
I'm trying to write a simple program that takes 5 images and lets you drag them from the bottom of the screen and snap them onto 5 other images at the top, in any order you like. I have subclassed UIImageView with a new class and added touchesBegan, touchesMoved, and touchesEnded. Then on my main view controller I place the images and set their class to the new subclass I created. The movement works great; in fact, I am NSLogging the coordinates in the custom class.
The problem is figuring out how to get that CGPoint from touchesEnded out of the custom class and have it call a method in my main view controller (which owns the image objects), so I can test whether the image is over another image and move it to be centered on the image it's over (snap onto it).
Here is the code in my custom class's .m file. My view controller is basically empty, with a xib file containing 10 image views, 5 of which are set to this class and 5 of which are normal image views.
#import "MoveImage.h"
CGPoint EndPoint;
@implementation MoveImage
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
startPoint = [[touches anyObject] locationInView:self];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
CGPoint newPoint = [[touches anyObject] locationInView:self.superview];
newPoint.x -= startPoint.x;
newPoint.y -= startPoint.y;
CGRect frm = [self frame];
frm.origin = newPoint;
[self setFrame:frm];
}
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event{
UITouch *end = [[event allTouches] anyObject];
EndPoint = [end locationInView:self];
NSLog(#"end ponts x : %f y : %f", EndPoint.x, EndPoint.y);
}
@end
You could write a MoveImageDelegate protocol which your app controller would then implement. This protocol would declare methods that your MoveImage class can call to hand its information back to the controller.
In your header file (if you don't know how to write a protocol):
@protocol MoveImageDelegate <TypeOfObjectsThatCanImplementIt> // usually NSObject
// methods
@end
Then you can declare an instance variable of type id but still have the compiler recognize that the variable responds to your delegate selectors (so you can code without those annoying warnings).
Something like:
@interface MoveImage : UIImageView
{
id <MoveImageDelegate> _delegate;
}
@property (assign) id <MoveImageDelegate> delegate;
Lastly:
@implementation MoveImage
@synthesize delegate = _delegate;
And your delegate is complete.
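A possible way to tie it together; the protocol method name below is only an illustration, not something from the original post:
// Example only: the delegate method name is an illustration.
@class MoveImage;
@protocol MoveImageDelegate <NSObject>
- (void)moveImage:(MoveImage *)image didEndDraggingAtPoint:(CGPoint)point;
@end

// In MoveImage, report the end point to the delegate instead of keeping a global.
// Using the superview's coordinate space makes it easy to compare against sibling frames.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint endPoint = [[touches anyObject] locationInView:self.superview];
    [self.delegate moveImage:self didEndDraggingAtPoint:endPoint];
}
The view controller adopts MoveImageDelegate, sets itself as each image's delegate, and performs the snap/collision test in that method, since it owns the target image views.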
I'm making an iPad project in which a class named "Car" (a separate file from the view controller) is supposed to be dragged around the main view.
I set up the class as I saw in an Apple example, and I can see my image when I run the application, but my class doesn't seem to respond to touch events and I can't solve the problem.
Here is my class code:
Car.h
#import <UIKit/UIKit.h>
@interface Car : UIView {
UIImageView *firstPieceView;
CGPoint startTouchPosition;
}
-(void)animateFirstTouchAtPoint:(CGPoint)touchPoint forView:(UIImageView *)theView;
-(void)animateView:(UIView *)theView toPosition:(CGPoint) thePosition;
-(void)dispatchFirstTouchAtPoint:(CGPoint)touchPoint forEvent:(UIEvent *)event;
-(void)dispatchTouchEvent:(UIView *)theView toPosition:(CGPoint)position;
-(void)dispatchTouchEndEvent:(UIView *)theView toPosition:(CGPoint)position;
@property (nonatomic, retain) IBOutlet UIImageView *firstPieceView;
@end
and this is my other class code: Car.m
#import "Car.h"
@implementation Car
@synthesize firstPieceView;
#define GROW_ANIMATION_DURATION_SECONDS 0.15 // Determines how fast a piece size grows when it is moved.
#define SHRINK_ANIMATION_DURATION_SECONDS 0.15 // Determines how fast a piece size shrinks when a piece stops moving.
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
// Enumerate through all the touch objects.
NSUInteger touchCount = 0;
for (UITouch *touch in touches) {
// Send to the dispatch method, which will make sure the appropriate subview is acted upon
[self dispatchFirstTouchAtPoint:[touch locationInView:self] forEvent:nil];
touchCount++;
}
}
// Checks to see which view, or views, the point is in and then calls a method to perform the opening animation,
// which makes the piece slightly larger, as if it is being picked up by the user.
-(void)dispatchFirstTouchAtPoint:(CGPoint)touchPoint forEvent:(UIEvent *)event
{
if (CGRectContainsPoint([firstPieceView frame], touchPoint)) {
[self animateFirstTouchAtPoint:touchPoint forView:firstPieceView];
}
}
// Handles the continuation of a touch.
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
NSUInteger touchCount = 0;
// Enumerates through all touch objects
for (UITouch *touch in touches) {
// Send to the dispatch method, which will make sure the appropriate subview is acted upon
[self dispatchTouchEvent:[touch view] toPosition:[touch locationInView:self]];
touchCount++;
}
}
// Checks to see which view, or views, the point is in and then sets the center of each moved view to the new position.
// If views are directly on top of each other, they move together.
-(void)dispatchTouchEvent:(UIView *)theView toPosition:(CGPoint)position
{
// Check to see which view, or views, the point is in and then move to that position.
if (CGRectContainsPoint([firstPieceView frame], position)) {
firstPieceView.center = position;
}
}
// Handles the end of a touch event.
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
// Enumerates through all touch object
for (UITouch *touch in touches) {
// Sends to the dispatch method, which will make sure the appropriate subview is acted upon
[self dispatchTouchEndEvent:[touch view] toPosition:[touch locationInView:self]];
}
}
// Checks to see which view, or views, the point is in and then calls a method to perform the closing animation,
// which is to return the piece to its original size, as if it is being put down by the user.
-(void)dispatchTouchEndEvent:(UIView *)theView toPosition:(CGPoint)position
{
// Check to see which view, or views, the point is in and then animate to that position.
if (CGRectContainsPoint([firstPieceView frame], position)) {
[self animateView:firstPieceView toPosition: position];
}
}
-(void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
// Enumerates through all touch object
for (UITouch *touch in touches) {
// Sends to the dispatch method, which will make sure the appropriate subview is acted upon
[self dispatchTouchEndEvent:[touch view] toPosition:[touch locationInView:self]];
}
}
#pragma mark -
#pragma mark === Animating subviews ===
#pragma mark
// Scales up a view slightly which makes the piece slightly larger, as if it is being picked up by the user.
-(void)animateFirstTouchAtPoint:(CGPoint)touchPoint forView:(UIImageView *)theView
{
// Pulse the view by scaling up, then move the view to under the finger.
[UIView beginAnimations:nil context:nil];
[UIView setAnimationDuration:GROW_ANIMATION_DURATION_SECONDS];
theView.transform = CGAffineTransformMakeScale(1.2, 1.2);
[UIView commitAnimations];
}
// Scales down the view and moves it to the new position.
-(void)animateView:(UIView *)theView toPosition:(CGPoint)thePosition
{
[UIView beginAnimations:nil context:NULL];
[UIView setAnimationDuration:SHRINK_ANIMATION_DURATION_SECONDS];
// Set the center to the final position
theView.center = thePosition;
// Set the transform back to the identity, thus undoing the previous scaling effect.
theView.transform = CGAffineTransformIdentity;
[UIView commitAnimations];
}
- (id)initWithFrame:(CGRect)frame
{
self = [super initWithFrame:frame];
if (self) {
UIImage *img = [UIImage imageNamed:@"CyanSquare.png"];
firstPieceView = [[UIImageView alloc] initWithImage: img];
//[img release];
[super addSubview:firstPieceView];
[firstPieceView release];
}
return self;
}
/*
// Only override drawRect: if you perform custom drawing.
// An empty implementation adversely affects performance during animation.
- (void)drawRect:(CGRect)rect
{
// Drawing code
}
*/
- (void)dealloc
{
[firstPieceView release];
[super dealloc];
}
@end
And here is my code for the view controller (ParkingViewController.h):
#import <UIKit/UIKit.h>
#import "Car.h"
@interface ParkingViewController : UIViewController {
}
@end
and last but not least the ParkingViewController.m
#import "ParkingViewController.h"
@implementation ParkingViewController
- (void)dealloc
{
[super dealloc];
}
- (void)didReceiveMemoryWarning
{
// Releases the view if it doesn't have a superview.
[super didReceiveMemoryWarning];
// Release any cached data, images, etc that aren't in use.
}
#pragma mark - View lifecycle
// Implement viewDidLoad to do additional setup after loading the view, typically from a nib.
- (void)viewDidLoad
{
Car *car1 = [[Car alloc] init];
[self.view addSubview:car1];
[car1 release];
[super viewDidLoad];
}
- (void)viewDidUnload
{
[super viewDidUnload];
// Release any retained subviews of the main view.
// e.g. self.myOutlet = nil;
}
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
// Return YES for supported orientations
return YES;
}
@end
Please forgive me for posting all the code, but I want to be clear about every aspect of my project so that anyone can see the whole situation.
You need to set a frame for the Car object you are creating in order for touches to be processed. You can still see the image because the view's clipsToBounds property is NO by default, so the image view is drawn outside the zero-sized frame.
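For example (the position and size values here are arbitrary):
// Give the Car view a non-zero frame so its touch handlers get called;
// the origin and size are arbitrary for this sketch.
Car *car1 = [[Car alloc] initWithFrame:CGRectMake(20.0, 20.0, 100.0, 100.0)];
[self.view addSubview:car1];
[car1 release];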
Hi, I want to make it so that -(void)checkcollision runs when you touch an image that I put on the view. Right now checkcollision runs whenever I touch anything. How can I make it fire only when I touch a specific image, e.g. a pimple:
IBOutlet UIImageView *pimple;
@property (nonatomic, retain) UIImageView *pimple;
Here is the touchesBegan code:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch *myTouch = [[event allTouches] anyObject];
[self checkcollision];
}
-(void)checkcollision {
if ([label.text isEqualToString:@"0"]) {
label.text = @"1";
}
pimple.hidden = YES;
}
In touchesBegan, check whether the touch landed inside the image view:
CGPoint point = [myTouch locationInView:pimple];
if ( CGRectContainsPoint(pimple.bounds, point) ) {
... Touch detected.
}
Alternatively you can consider gesture recognizers. You can use a tap recognizer for this case.
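For example, a minimal sketch with a tap recognizer; handlePimpleTap: is an assumed selector name, not from the original post:
// In viewDidLoad, or wherever the views are set up.
pimple.userInteractionEnabled = YES; // UIImageView ignores touches by default
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc]
    initWithTarget:self action:@selector(handlePimpleTap:)];
[pimple addGestureRecognizer:tap];
[tap release];

// The action fires only for taps that land on the pimple image view.
- (void)handlePimpleTap:(UITapGestureRecognizer *)recognizer {
    [self checkcollision];
}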
There are two options:
Subclass UIImageView and define your own handler
Check the view that was touched in - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event and compare it to your UIImageView instance, as in the sketch below.
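A minimal sketch of the second option, using the pimple outlet from the question:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *myTouch = [[event allTouches] anyObject];
    // touch.view is the pimple only if pimple.userInteractionEnabled is YES;
    // UIImageView has it disabled by default.
    if (myTouch.view == pimple) {
        [self checkcollision];
    }
}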