How to move a UIImageView with Touch - cocoa-touch

I'm trying to add moving functionality to my image view (maskPreview in the code below), so that users can drag the picture contained in maskPreview around the screen. Here's my code for touchesBegan and touchesMoved:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    if ([touches count] == 1) {
        UITouch *touch = [touches anyObject];
        originalOrigin = [touch locationInView:maskPreview];
    }
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if ([touches count] == 1) {
        UITouch *touch = [touches anyObject];
        CGPoint lastTouch = [touch previousLocationInView:self.view];
        CGFloat movedDistanceX = originalOrigin.x - lastTouch.x;
        CGFloat movedDistanceY = originalOrigin.y - lastTouch.y;
        [maskPreview setFrame:CGRectMake(maskPreview.frame.origin.x + movedDistanceX,
                                         maskPreview.frame.origin.y + movedDistanceY,
                                         maskPreview.frame.size.width,
                                         maskPreview.frame.size.height)];
    }
}
but I'm getting some weird behaviour from the app. I haven't put restrictions on how far the image view can move, i.e. to prevent it from going off screen, but even on a small move my image view goes wild and disappears.
Thanks a lot in advance for all the help.

Implementing touchesBegan and so on is way overkill in this modern world. You're just confusing the heck out of yourself, and your code will quickly become impossible to understand or maintain. Use a UIPanGestureRecognizer; that's what it's for. Making a view draggable with one is trivial. Here's an action handler that makes its view draggable:
- (void)dragging:(UIPanGestureRecognizer *)p {
    UIView *vv = p.view;
    if (p.state == UIGestureRecognizerStateBegan ||
        p.state == UIGestureRecognizerStateChanged) {
        CGPoint delta = [p translationInView:vv.superview];
        CGPoint c = vv.center;
        c.x += delta.x; c.y += delta.y;
        vv.center = c;
        [p setTranslation:CGPointZero inView:vv.superview];
    }
}
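For reference, wiring up the recognizer could look something like this (a sketch, assuming the handler above lives in the view controller and the draggable view is the maskPreview from the question):

// e.g. in viewDidLoad
UIPanGestureRecognizer *pan =
    [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(dragging:)];
[maskPreview addGestureRecognizer:pan];
maskPreview.userInteractionEnabled = YES;   // UIImageView disables user interaction by default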

There are two problems with your code. First, this line is wrong:
CGPoint lastTouch = [touch previousLocationInView:self.view];
It should be this:
CGPoint lastTouch = [touch previousLocationInView:maskPreview];
Really, you shouldn't even be using previousLocationInView:. You should just be using locationInView: like this:
CGPoint lastTouch = [touch locationInView:maskPreview];
Second, you are getting the signs of movedDistanceX and movedDistanceY wrong. Change them to this:
CGFloat movedDistanceX = lastTouch.x - originalOrigin.x;
CGFloat movedDistanceY = lastTouch.y - originalOrigin.y;
Also, the documentation for touchesBegan:withEvent: says this:
If you override this method without calling super (a common use pattern), you must also override the other methods for handling touch events, if only as stub (empty) implementations.
So make sure you're also overriding touchesEnded:withEvent: and touchesCancelled:withEvent:.
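If you stick with raw touch handling, those stubs can be as simple as this (a minimal sketch; put any cleanup you need inside them):

-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Intentionally empty stub.
}
-(void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Intentionally empty stub.
}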
Anyway, you can do this quite a bit more simply. One way is to make touchesBegan:withEvent: empty and do all the work in touchesMoved:withEvent: by using both previousLocationInView: and locationInView:, and updating maskPreview.center instead of maskPreview.frame. You won't even need the originalOrigin instance variable:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if ([touches count] == 1) {
        UITouch *touch = [touches anyObject];
        CGPoint p0 = [touch previousLocationInView:maskPreview];
        CGPoint p1 = [touch locationInView:maskPreview];
        CGPoint center = maskPreview.center;
        center.x += p1.x - p0.x;
        center.y += p1.y - p0.y;
        maskPreview.center = center;
    }
}
Another way to do this is by using a UIPanGestureRecognizer. I leave that as an exercise for the reader.

Related

Drag and drop. Nudging other images while moving

I've implemented drag and drop for multiple images, which works fine, but I'm facing one issue. When I'm dragging one image, other images get nudged along when my finger moves over them while I'm still dragging the original image. I'd like only one image to be movable at a time.
Here's some of my code.
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    for (UIImageView *noseImage in noseArray) {
        if ([touch.view isEqual:noseArray]) {
            firstTouchPoint = [touch locationInView:[self view]];
            xd = firstTouchPoint.x - [[touch view] center].x;
            yd = firstTouchPoint.y - [[touch view] center].y;
        }
    }
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint oldPoint = [touch previousLocationInView:touch.view];
    CGPoint newPoint = [touch locationInView:touch.view];
    CGPoint diff = CGPointMake(newPoint.x - oldPoint.x, newPoint.y - oldPoint.y);
    for (UIImageView *noseImageView in noseArray) {
        if (CGRectContainsPoint(noseImageView.frame, newPoint)) {
            CGPoint cntr = [noseImageView center];
            [noseImageView setCenter:CGPointMake(cntr.x + diff.x, cntr.y + diff.y)];
        }
    }
}
Typically you would determine which image is being dragged in touchesBegan: and remember it; then in touchesMoved:, move only that remembered image by the given amount (a sketch of this follows below).
That said, a gesture recognizer is a lot easier to work with than these low-level methods, so I suggest you use one instead.
You wrote:
for (UIImageView *noseImage in noseArray) {
if ([touch.view isEqual:noseArray]) {
Are you sure the second line shouldn't be the following?
if ([touch.view isEqual: noseImage]) {
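For what it's worth, here is a minimal sketch of that "remember the dragged piece" approach. It assumes the nose views are direct subviews of self.view, and it introduces an activeNose instance variable (my name, not from the original code):

-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self.view];
    activeNose = nil;
    for (UIImageView *noseImageView in noseArray) {
        if (CGRectContainsPoint(noseImageView.frame, point)) {
            activeNose = noseImageView;   // remember the piece under the finger
            break;
        }
    }
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if (activeNose == nil) return;        // only move the remembered piece
    UITouch *touch = [touches anyObject];
    CGPoint oldPoint = [touch previousLocationInView:self.view];
    CGPoint newPoint = [touch locationInView:self.view];
    activeNose.center = CGPointMake(activeNose.center.x + (newPoint.x - oldPoint.x),
                                    activeNose.center.y + (newPoint.y - oldPoint.y));
}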

Objective-C SpriteKit Create dotted line to certain points

I have a certain point at the bottom of the screen. When I touch the screen somewhere, I'd like a dotted line to appear between that point and the point my finger is at. The length and rotation of the line will change based on where my finger is, or moves to.
I'm assuming I'd make the dotted line out of a repeated small line image, but I guess that's why I need your help!
Note that all this could be organized better, and I personally don't like SKShapeNode in any shape :) or form, but this is one way to do it:
#import "GameScene.h"
#implementation GameScene{
SKShapeNode *line;
}
-(void)didMoveToView:(SKView *)view {
/* Setup your scene here */
line = [SKShapeNode node];
[self addChild:line];
[line setStrokeColor:[UIColor redColor]];
}
-(void)drawLine:(CGPoint)endingPoint{
CGMutablePathRef pathToDraw = CGPathCreateMutable();
CGPathMoveToPoint(pathToDraw, NULL, CGRectGetMidX(self.frame),CGRectGetMidY(self.frame));
CGPathAddLineToPoint(pathToDraw, NULL, endingPoint.x,endingPoint.y);
CGFloat pattern[2];
pattern[0] = 20.0;
pattern[1] = 20.0;
CGPathRef dashed =
CGPathCreateCopyByDashingPath(pathToDraw,NULL,0,pattern,2);
line.path = dashed;
CGPathRelease(dashed);
}
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
/* Called when a touch begins */
for (UITouch *touch in touches) {
CGPoint location = [touch locationInNode:self];
[self drawLine:location];
}
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event{
for (UITouch *touch in touches) {
CGPoint location = [touch locationInNode:self];
[self drawLine:location];
}
}
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event{
line.path = nil;
}
Also, I'm not sure how performant this is, but you can test it, tweak it and improve it. Or even use an SKSpriteNode like you said. Happy coding!
EDIT:
I just noticed that you said dotted (not dashed) :)
You have to change pattern to something like:
pattern[0] = 3.0;
pattern[1] = 3.0;
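If you want round dots rather than short dashes, one option is a near-zero dash length combined with round line caps. A rough sketch, assuming SKShapeNode's lineWidth and lineCap properties are available on your deployment target:

// e.g. in didMoveToView:, after creating the node
line.lineWidth = 4.0;
line.lineCap = kCGLineCapRound;   // round caps turn tiny dashes into dots

// ...and in drawLine:, use a tiny "on" segment with a larger gap
pattern[0] = 0.01;
pattern[1] = 12.0;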

SpriteKit and touch event (coordinates seem flipped)

I have added an image to the SKScene using the following code:
char1 = [[SKSpriteNode alloc] initWithImageNamed:@"CH_designb_nerd.png"];
char1.position = CGPointMake(225.0, 65.0);
char1.anchorPoint = CGPointMake(0, 0);
[char1 setScale:0.35];
[self addChild:char1];
Then I was trying to create a touch event on the image, but the CGRect of the image seems totally flipped to the other side of the screen. I used the following code:
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self.view];
    if (CGRectContainsPoint(char1.frame, location)) {
        NSLog(@"yaaa!!");
    } else {
        NSLog(@"%.2f", location.x);
        NSLog(@"%.2f", char1.position.x);
    }
}
I'm totally lost on this, as the image is placed perfectly but the coordinates (or the CGRect) seem flipped. The image sits at the bottom of the screen, but the touch only registers when I touch the top of the screen.
I imagine this is due to getting the wrong touch location: UIKit view coordinates have the y axis pointing down, while SpriteKit scene coordinates have it pointing up, so locationInView: appears flipped. I would instead use the SKNode equivalent for hit testing. Something like this:
UITouch *touch = [touches anyObject];
CGPoint location = [touch locationInNode:self];
SKSpriteNode *char1 = (SKSpriteNode *)[self childNodeWithName:@"char1Name"];
if ([char1 containsPoint:location])
{
    ...
}
There might be some syntax errors here, I'm typing from memory, but you get the gist. Note: you would need to set the name property on char1 for the lookup to work, or you could just enumerate self.children if there is only that node.
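For example, wherever the sprite is created (the name string is just whatever you later pass to childNodeWithName:):

char1 = [SKSpriteNode spriteNodeWithImageNamed:@"CH_designb_nerd.png"];
char1.name = @"char1Name";   // makes the childNodeWithName: lookup above possible
[self addChild:char1];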
Try this.
UITouch * touch = [touches anyObject];
CGPoint location = [touch locationInNode:self];
Nice example of spriteKit Link

Prevent dragging UIButton outside of its UIView

I've got two UIViews on my window: a sidebar that holds player scores, and a main play area. They both fit on the UIWindow and neither scrolls. The user can drag UIButtons around on the main play area, but at present they can also drop them onto the sidebar. Once they do, they can't drag them again to bring them back, presumably because the touch then lands on the second view, which doesn't contain the button in question.
I'd like to prevent anything inside the main view from being moved onto the sidebar view. I've managed this, but I need the drag to be released if the player's finger moves off that view. With the code below, the button keeps moving with the finger, but just won't go past the X coordinate of the view. How can I go about this? Dragging is enabled using this call:
[firstButton addTarget:self action:@selector(wasDragged:withEvent:) forControlEvents:UIControlEventTouchDragInside];
To this method:
- (void)wasDragged:(UIButton *)button withEvent:(UIEvent *)event
{
    if (button == firstButton) {
        UITouch *touch = [[event touchesForView:button] anyObject];
        CGPoint previousLocation = [touch previousLocationInView:button];
        CGPoint location = [touch locationInView:button];
        CGFloat delta_x = location.x - previousLocation.x;
        CGFloat delta_y = location.y - previousLocation.y;
        if ((button.center.x + delta_x) < 352) {
            button.center = CGPointMake(button.center.x + delta_x, button.center.y + delta_y);
        } else {
            button.center = CGPointMake(345, button.center.y + delta_y);
        }
    }
}
Implement the
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
touch method, then check the location of the UITouch. If the location is outside the bounds you want to allow (the first view), don't move the button any further. You could also kill the touch at the point where the user drags outside the view, using a BOOL ivar:
//In the .h file
BOOL touchedOutside;

//In the .m file ("button" is the button being dragged, e.g. firstButton from the question)
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    touchedOutside = NO;
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!touchedOutside) {
        UITouch *touch = [[event allTouches] anyObject];
        CGPoint location = [touch locationInView:firstView];
        if (location.x < UPPER_XLIMIT && location.x > LOWER_XLIMIT) {
            if (location.y < UPPER_YLIMIT && location.y > LOWER_YLIMIT) {
                //Moved within acceptable bounds
                button.center = location;
            }
        } else {
            //This will end the touch sequence
            touchedOutside = YES;
            //This is optional really, but you can implement
            //touchesCancelled: to handle the end of the touch
            //sequence, and execute the code immediately rather than
            //waiting for the user to remove the finger from the screen
            [self touchesCancelled:touches withEvent:event];
        }
    }
}
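As an aside, rather than the hard-coded 352/345 from the question, you could clamp the proposed centre against the play area's bounds. A rough sketch, assuming the button is a subview of playAreaView (my name for the main play area), so its center is expressed in that view's coordinate system:

CGPoint proposed = CGPointMake(button.center.x + delta_x,
                               button.center.y + delta_y);
// Shrink the allowed area so the whole button stays visible, not just its center.
CGRect limits = CGRectInset(playAreaView.bounds,
                            button.bounds.size.width / 2.0,
                            button.bounds.size.height / 2.0);
proposed.x = MAX(CGRectGetMinX(limits), MIN(proposed.x, CGRectGetMaxX(limits)));
proposed.y = MAX(CGRectGetMinY(limits), MIN(proposed.y, CGRectGetMaxY(limits)));
button.center = proposed;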

Bug when moving a UIView horizontally

I've created a puzzle engine for iPad (iOS SDK 4.2).
I have a PuzzleViewController that controls a UIView (rootView) which holds a smaller UIView (puzzleView) and twelve puzzle pieces (a PuzzlePieceView class that extends UIImageView).
The idea is very simple. The puzzle pieces start out around the puzzleView and are picked up and dragged onto the puzzleView. When a piece is picked up, it doubles its size and is centered on the point where the finger is touching.
When a piece is dropped in the right place it stays put (it is removed from the rootView and added as a subview of the puzzleView); if it is dropped in the wrong place it returns to its original position and size.
Currently I'm overriding touchesBegan/Moved/Ended/Cancelled in the PuzzlePieceView class so that each puzzle piece controls its own movement. Here's what they do:
#pragma mark UIResponder overriding

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"PuzzlePiece touchesBegan");
    if (!isDeployed) {
        if (self.initialSize.width == 0 || self.initialSize.height == 0) { //if there is still no initial size
            self.initialSize = self.frame.size;
        }
        NSLog(@"self.initialLocation.x %f %f", self.initialLocation.x, self.initialLocation.y);
        if (self.initialLocation.x == 0. || self.initialLocation.y == 0.) { //if there is still no initial location
            self.initialLocation = self.center; //record the initial location
        }
        UITouch *touch = [[event allTouches] anyObject];
        self.center = [touch locationInView:self.superview]; //set the center of the view to the point where the finger touched it
        [self.superview bringSubviewToFront:self]; //bring the piece to the front so it is always visible and never behind any other piece while moving
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"PuzzlePiece touchesMoved");
    if (self.frame.size.width == self.initialSize.width) {
        self.frame = CGRectMake(self.frame.origin.x, self.frame.origin.y, self.image.size.width, self.image.size.height);
    }
    if (!isDeployed) {
        UITouch *touch = [[event allTouches] anyObject];
        self.center = [touch locationInView:self.superview]; //set the center of the view to the point where the finger moved to
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"PuzzlePiece touchesEnded");
    if (!isDeployed) {
        UITouch *touch = [[event allTouches] anyObject];
        NSLog(@"point %@ x%f y%f", self.puzzleView, [touch locationInView:self.puzzleView].x, [touch locationInView:self.puzzleView].y);
        if (CGRectContainsPoint(self.dropZone, [touch locationInView:self.puzzleView])) {
            [self removeFromSuperview];
            [self.puzzleViewController addPiece:self];
            self.center = self.finalCenter;
            self.isDeployed = YES;
        } else {
            [self restoreToInitialSize];
            [self restoreToInitialPosition];
        }
    }
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"PuzzlePiece touchesCancelled");
    if (!isDeployed) {
        [self restoreToInitialSize];
        [self restoreToInitialPosition];
    }
}

#pragma mark -
Seemed pretty easy, and it was.
I've got only one bug left.
When I try to make a horizontal move on any puzzle piece, the touch gets cancelled.
This only happens when the move starts quickly. If I touch the piece and wait for a moment (about a second), the pieces move perfectly.
The pieces also move perfectly in the vertical direction.
I've tried not changing the puzzle piece's size when the view is first touched, but the bug remains...
I found the solution to this.
One of my team-mates had registered a UISwipeGestureRecognizer on the rootView instead of on the specific subview where the swipe gesture should be recognized.
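That presumably explains the cancelled touches: once a swipe recognizer on the root view recognizes a fast horizontal movement, it cancels the touches being delivered to the piece underneath. Attaching the recognizer to the view that actually needs it would look roughly like this (swipeTargetView and handleSwipe: are hypothetical names):

UISwipeGestureRecognizer *swipe =
    [[UISwipeGestureRecognizer alloc] initWithTarget:self
                                              action:@selector(handleSwipe:)];
swipe.direction = UISwipeGestureRecognizerDirectionRight;   // or whichever direction is needed
[swipeTargetView addGestureRecognizer:swipe];               // the specific subview, not rootView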