Determine overlapping UIImageViews and exchange their locations - objective-c

I have around 16 UIImageViews that can be moved by touch. When I drop one UIImageView over another, the two views should exchange their positions. Being a newcomer to Objective-C programming, I am struggling to figure out which UIImageViews overlap and how to exchange their positions. So far I have been trying to implement this in the touchesEnded method:
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    [self.superview exchangeSubviewAtIndex:[[self.superview subviews] indexOfObject:self]
                        withSubviewAtIndex:[[self.superview subviews] indexOfObject:[touch view]]];
}
But this does not work. Any help would be appreciated.
The UIImageViews are created with the following code:
TouchImageView *touchImageView = [[TouchImageView alloc] initWithImage:displayImage];
touchImageView.identy = [NSString stringWithFormat:@"Image ID %d", i];
So each touch image view is associated with a string that identifies it.
In touchesEnded, I have added the following code just to log the IDs of the image views:
UITouch *touch = [touches anyObject];
TouchImageView *dragImage = (TouchImageView *)[touch view];
NSLog(@"Ended %a%a", [dragImage identy], [self identy]);
But the output I got is completely different:
2011-11-21 11:50:34.404 OrganizeMe[882:f803] Ended 0x1.6d96006a6d96p-9170x1.807p-1022
Final code:
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    __block int tag = -1;
    __block float distance = 100000.0;
    [[self.superview subviews] enumerateObjectsUsingBlock:^(UIView *view, NSUInteger index, BOOL *stop) {
        BOOL interSect = CGRectIntersectsRect([self frame], [view frame]);
        if (interSect && ([self tag] != [view tag])) {
            [self.image CGImage];
            CGPoint currPoint = [[touches anyObject] locationInView:[self superview]];
            CGPoint underPoint = view.center;
            if (distance >= [self distanceBetweenPoint1:currPoint Point2:underPoint]) {
                distance = [self distanceBetweenPoint1:currPoint Point2:underPoint];
                tag = [view tag];
            }
        }
    }];
    NSLog(@"Tag and Distance %d,%f", tag, distance);
    TouchImageView *imageView1 = (TouchImageView *)[self.superview viewWithTag:tag];
    CGRect point1 = [imageView1 frame];
    CGRect point2 = [self frame];
    if (tag != -1) {
        [imageView1 setFrame:point2];
        [self setFrame:point1];
    } else {
        CGPoint lastTouch = [[touches anyObject] previousLocationInView:[self superview]];
        self.center = lastTouch;
    }
    [self.superview setNeedsLayout];
}
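The distanceBetweenPoint1:Point2: helper used above is not shown in the question; a minimal sketch of what it presumably does (plain Euclidean distance, my assumption) would be:
- (CGFloat)distanceBetweenPoint1:(CGPoint)p1 Point2:(CGPoint)p2
{
    // Hypothetical reconstruction: straight-line distance between the two points
    CGFloat dx = p2.x - p1.x;
    CGFloat dy = p2.y - p1.y;
    return sqrtf(dx * dx + dy * dy);
}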

The index is just the UIView's position in the subviews array; exchanging indices does not change where the views are placed within the superview.
Re-lay out the superview based on each subview's index.
In your method above, also call setNeedsLayout so the superview lays its subviews out again.
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    [self.superview exchangeSubviewAtIndex:[[self.superview subviews] indexOfObject:self]
                        withSubviewAtIndex:[[self.superview subviews] indexOfObject:[touch view]]];
    [self.superview setNeedsLayout]; // Should trigger the method below
}
Implement the following method for the superview:
- (void)layoutSubviews
{
    // Iterate over all subviews
    [self.subviews enumerateObjectsUsingBlock:^(UIView *view, NSUInteger index, BOOL *stop) {
        // This block is executed for each subview, and you have the subview's index.
        // Now set the subview's position according to that index.
        // I don't know your layout logic, but assuming you have 14 subviews spread
        // over 4 columns, these will be the subviews' positions:
        /*  0  1  2  3
            4  5  6  7
            8  9 10 11
           12 13        */
        NSUInteger col = index % 4;          // Column from 0 to 3
        NSUInteger row = index / 4;          // Row starting from 0
        CGFloat w = view.frame.size.width;
        CGFloat h = view.frame.size.height;
        CGPoint p = CGPointMake(col * w + w / 2, row * h + h / 2);
        [view setCenter:p];
    }];
}
It's been a long time since I last used Cocoa Touch, so I can't guarantee that the code works.

Related

Adding and touching multiple sprites in cocos2d

I have a class where I add multiple sprites as shown in the code below:
CCSprite *b = [CCSprite spriteWithFile:@"b"];
b.position = ccp(100, 160);
CCSprite *b2 = [CCSprite spriteWithFile:@"b2.png"];
b2.position = ccp(115, 150);
CCSprite *b3 = [CCSprite spriteWithFile:@"b3.png"];
b3.position = ccp(200, 150);
CCSprite *b4 = [CCSprite spriteWithFile:@"b4.png"];
b4.position = ccp(220, 145);
b.anchorPoint = ccp(0.98, 0.05);
b2.anchorPoint = ccp(0.03, 0.05);
b3.anchorPoint = ccp(0.03, 0.05);
b4.anchorPoint = ccp(0.95, 0.05);
[self addChild:b z:1 tag:1];
[self addChild:b2 z:1 tag:2];
[self addChild:b3 z:1 tag:3];
[self addChild:b4 z:1 tag:4];
Here's the code for the touch event:
-(void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSSet *allTouch = [event allTouches];
    UITouch *touch = [[allTouch allObjects] objectAtIndex:0];
    CGPoint location = [touch locationInView:[touch view]];
    location = [[CCDirector sharedDirector] convertToGL:location];
    // Swipe detection - beginning point
    beginTouch = location;
    for (int i = 0; i < [hairArray count]; i++)
    {
        CCSprite *sprite = (CCSprite *)[hairArray objectAtIndex:i];
        if (CGRectContainsPoint([sprite boundingBox], location))
        {
            // selectedSprite is a sprite declared in the header file
            selectedSprite = sprite;
        }
    }
}
-(void)ccTouchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Move the touched sprite
    NSSet *allTouch = [event allTouches];
    UITouch *touch = [[allTouch allObjects] objectAtIndex:0];
    CGPoint location = [touch locationInView:[touch view]];
    location = [[CCDirector sharedDirector] convertToGL:location];
    if (selectedSprite != nil)
    {
        selectedSprite.position = ccp(location.x, location.y);
    }
}
-(void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // End point of the sprite after being dragged
    NSSet *allTouch = [event allTouches];
    UITouch *touch = [[allTouch allObjects] objectAtIndex:0];
    CGPoint location = [touch locationInView:[touch view]];
    location = [[CCDirector sharedDirector] convertToGL:location];
    endTouch = location;
    posX = endTouch.x;
    // Minimum swipe length
    posY = ccpDistance(beginTouch, endTouch);
    [self moveSprite];
}
Now, the actions themselves work just fine, but the trouble I'm having is that if I want to drag b2, I have to drag b3 and b4 first. I'm not sure if it has anything to do with the z-index or with the transparent areas present in each sprite. Is there something I'm missing here?
if (CGRectContainsPoint([sprite boundingBox], location))
{
    // selectedSprite is a sprite declared in the header file
    selectedSprite = sprite;
}
This code updates the currently selected sprite as soon as a matching one is found while looping over all the sprites. That means that if three sprites overlap, the selected one will be whichever comes last in the parent's array of nodes.
You can't make any assumptions about that order, so this is clearly not what you want; you have to decide on a policy that gives sprites priority. Also note that editing the anchorPoint can shift the sprite relative to its bounding box (so the bounding box may even lie outside the sprite).
To verify this you can enable:
#define CC_SPRITE_DEBUG_DRAW 1
in ccConfig.h. This will render the bounding boxes around the sprites.
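One possible priority policy, sketched here under the assumption that the sprite drawn on top (highest zOrder) should win when several bounding boxes contain the touch (hairArray is the array from the question):
CCSprite *topMost = nil;
for (int i = 0; i < [hairArray count]; i++)
{
    CCSprite *sprite = (CCSprite *)[hairArray objectAtIndex:i];
    if (CGRectContainsPoint([sprite boundingBox], location))
    {
        // Keep whichever matching sprite is rendered above the others
        if (topMost == nil || sprite.zOrder > topMost.zOrder)
        {
            topMost = sprite;
        }
    }
}
selectedSprite = topMost;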

How to return the subview that was touched

I've searched around on here and see a lot of answers that say to turn on the userInteractionEnabled property. I've already done that.
I create my subviews programmatically (not in Interface Builder). The subviews are a custom subclass of UIView (called PieceSuperClass).
All I want is something that would look like
UITouch *touch = [touches anyObject];
CGPoint currentPosition = [touch locationInView:self.view];
UIView *hitView = [self.view hitTest:currentPosition withEvent:event];
if ([hitView isKindOfClass:[PieceSuperClass class]]) {
    return hitView;
}
For some reason, hitView is a kind of UIImageView even though I most definitely declared it as a PieceSuperClass. PieceSuperClass is a subclass of UIImageView.
// Draw the proper piece
UIImage *pieceImage = [UIImage imageNamed:[NSString stringWithFormat:@"%@%@.png", pieceColor, pieceName]];
PieceSuperClass *pieceImageView = [[PieceSuperClass alloc] initWithFrame:CGRectMake(0, 0, 39, 38)];
pieceImageView.image = pieceImage;
pieceImageView.identifier = [NSString stringWithFormat:@"%@%@%@", pieceColor, pieceName, pieceNumber];
pieceImageView.userInteractionEnabled = YES;
[boardView addSubview:pieceImageView];
I think you are tracking the wrong view with currentPosition. Since the PieceSuperClass instances are subviews of boardView rather than of self.view, you should probably take the touch location in their actual superview (note that locationInView: expects a single UIView, not an array of subviews), for example:
CGPoint currentPosition = [touch locationInView:boardView];
That's from the UITouch documentation:
The view in which the touch initially occurred. (read-only)
@property(nonatomic, readonly, retain) UIView *view
So you can know which view was touched and simply do this:
UIView *hitView = touch.view;
if ([hitView isKindOfClass:[PieceSuperClass class]]) {
    return hitView;
}
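For context, a minimal sketch of how that check might sit inside a touch handler (the identifier property comes from the question's code; the cast and the log are my additions):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    UIView *hitView = touch.view; // the view the touch began in
    if ([hitView isKindOfClass:[PieceSuperClass class]]) {
        PieceSuperClass *piece = (PieceSuperClass *)hitView;
        NSLog(@"Touched piece %@", piece.identifier);
    }
}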
Figured it out!
- (id)whichPieceTouched:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint currentPosition = [touch locationInView:self.view];
    for (int i = 0; i < [piecesArray count]; i++) {
        if (CGRectContainsPoint([[piecesArray objectAtIndex:i] frame], currentPosition)) {
            return [piecesArray objectAtIndex:i];
        }
    }
    return nil; // no piece contains the touch point
}
Though I have one lingering problem: it seems to always select the object about 20 pixels below (or the next object down).
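One possible explanation, purely a guess from the code shown: the pieces' frames are expressed in boardView's coordinate system while the touch point is taken in self.view, so the comparison is off by boardView's origin. Converting the point into the pieces' superview first keeps both in the same space; a sketch of the loop body with that change:
// Hypothetical fix: take the touch location in the pieces' actual superview (boardView)
CGPoint positionInBoard = [touch locationInView:boardView];
for (PieceSuperClass *piece in piecesArray) {
    if (CGRectContainsPoint(piece.frame, positionInBoard)) {
        return piece;
    }
}
return nil;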

How do I add images to a view by touch?

I'm working on an image editing app. Right now I have the app built so a user can choose a photo from their library or take a photo with the camera. I also have another view (a picker view) that has other images a user can choose from. By selecting one of the images the app takes the user back to the main photo.
I want the user to be able to touch anywhere on screen and add the image they selected.
What is the best way to approach this?
touchesBegan? touchesMoved? UITapGestureRecognizer?
If anyone knows of any sample code or can give me a general idea of how to approach this that would be great!
EDIT:
Now I am able to see the coordinates and that my UIImage is getting the image I select from my Picker. But the image is not being displayed on the screen when I tap. Can someone help me troubleshoot my code please:
-(void)drawRect:(CGRect)rect
{
    CGRect currentRect = CGRectMake(touchPoint.x, touchPoint.y, 30.0, 30.0);
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextFillRect(context, currentRect);
}
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    touchPoint = [touch locationInView:imageView];
    NSLog(@"%f", touchPoint.x);
    NSLog(@"%f", touchPoint.y);
    if (touchPoint.x > -1 && touchPoint.y > -1)
    {
        stampedImage = _imagePicker.selectedImage;
        //[stampedImage drawAtPoint:touchPoint];
        [_stampedImageView setFrame:CGRectMake(touchPoint.x, touchPoint.y, 30.0, 30.0)];
        [_stampedImageView setImage:stampedImage];
        [imageView addSubview:_stampedImageView];
        NSLog(@"Stamped Image = %@", stampedImage);
        //[self.view setNeedsDisplay];
    }
}
For an example of my NSLogs I am seeing:
162.500000
236.000000
Stamped Image = <UIImage: 0xe68a7d0>
Thanks!
In your view controller, use the method -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event to get the X and Y coordinates of where a touch happened. Here is some sample code that shows how to get the x and y of the touch:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    /* Detect a touch anywhere */
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:self.view];
    NSLog(@"%f", touchPoint.x); // The x coordinate of the touch
    NSLog(@"%f", touchPoint.y); // The y coordinate of the touch
}
Once you have this x and y data, you can set the image that the user selected or shot with the built in camera to appear at those coordinates.
EDIT:
I think the issue MIGHT lie in how you create your UIImageView. Instead of this:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    touchPoint = [touch locationInView:imageView];
    CGRect myImageRect = CGRectMake(touchPoint.x, touchPoint.y, 20.0f, 20.0f);
    UIImageView *myImage = [[UIImageView alloc] initWithFrame:myImageRect];
    [myImage setImage:_stampedImageView.image];
    myImage.opaque = YES;
    [imageView addSubview:myImage];
    [myImage release];
}
Try this:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    touchPoint = [touch locationInView:imageView];
    myImage = [[UIImageView alloc] initWithImage:_stampedImageView.image];
    [imageView addSubview:myImage];
    [myImage release];
}
If this does not work, try checking whether _stampedImageView.image == nil. If it is, your UIImage may not have been created properly.
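A quick sanity check for that, as a small sketch:
// Sketch: log whether the source image actually exists before stamping it
if (_stampedImageView.image == nil) {
    NSLog(@"Stamped image is nil - the picker did not hand back an image");
}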

Getting location of sprite within array cocos2d

I need to be able to touch a specific moving sprite in my array and perform an action on it. However, when I run my MoveTo action, the sprite's location doesn't update. Help!
Array:
int numbreds = 7;
redBirds = [[CCArray alloc] initWithCapacity:numbreds];
for (int i = 1; i <= numbreds; i++) {
    int xvalue = ((-50 * i) + 320);
    int yvalue = 160;
    if (i == 4)
    {
        CCSprite *parrot = [CCSprite spriteWithFile:@"taco.png"];
        [birdLayer addChild:parrot];
        [self movement]; // the action that moves the array horizontally
        parrot.position = ccp(xvalue, yvalue);
        parrot.tag = 100;
    }
}
Touch
-(void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:[touch view]];
    location = [[CCDirector sharedDirector] convertToGL:location];
    CCSprite *mark = (CCSprite *)[birdLayer getChildByTag:100];
    if (CGRectContainsPoint([mark boundingBox], location))
    {
        CCLOG(@"YAY!");
    }
}
The problem is that the location of the CCSprite doesn't actually update or move. "YAY!" is only logged at the sprite's original location.
Try this:
CCSprite *temp = (CCSprite *)[birdLayer getChildByTag:100];
if (temp.position.x == location.x) {
    // do stuff...
}
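If the sprites only appear to move because their parent layer is being animated (an assumption based on the [self movement] comment in the question), their position stays fixed in birdLayer's coordinate space. In that case, converting the touch into the layer's node space before the bounding-box test keeps both points in the same coordinate system; a sketch:
CCSprite *mark = (CCSprite *)[birdLayer getChildByTag:100];
// location is already in GL (world) coordinates at this point
CGPoint locationInLayer = [birdLayer convertToNodeSpace:location];
if (CGRectContainsPoint([mark boundingBox], locationInLayer))
{
    CCLOG(@"YAY!");
}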

How to drag only one image with iPhone SDK

I want to create a little app that shows two images, and I want to make only the top image draggable.
After some research, I found this solution:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    image.alpha = 0.7;
    if ([touch view] == image) {
        CGPoint location = [touch locationInView:self.view];
        image.center = location;
    }
}
It works but the problem is that the image is draggable from its center and I don't want that.
So I found another solution:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Retrieve the touch point
    CGPoint pt = [[touches anyObject] locationInView:self.view];
    startLocation = pt;
    [[self view] bringSubviewToFront:self.view];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // Move relative to the original touch point
    CGPoint pt = [[touches anyObject] locationInView:self.view];
    frame = [self.view frame];
    frame.origin.x += pt.x - startLocation.x;
    frame.origin.y += pt.y - startLocation.y;
    [self.view setFrame:frame];
}
It works very well, but when I add another image, all the images in the view become draggable at the same time. I'm a beginner in iPhone development and I have no idea how to make only the top image draggable.
You can use an if condition in the touchesBegan: method. See this:
-(void)viewDidLoad
{
    // Your first image view
    // (imageView1 and imageView2 should be ivars/properties so the touch methods below can see them)
    UIImageView *imageView1 = [[UIImageView alloc] initWithFrame:CGRectMake(150.0, 100.0, 30.0, 30.0)];
    [imageView1 setImage:[UIImage imageNamed:@"anyImage.png"]];
    [imageView1 setUserInteractionEnabled:YES];
    [self.view addSubview:imageView1];

    // Your second image view
    UIImageView *imageView2 = [[UIImageView alloc] initWithFrame:CGRectMake(150.0, 200.0, 30.0, 30.0)];
    [imageView2 setImage:[UIImage imageNamed:@"anyImage.png"]];
    [imageView2 setUserInteractionEnabled:YES];
    [self.view addSubview:imageView2];
}
// Now use the touchesBegan method with a condition like this
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    if ([touch view] == imageView1)
    {
        CGPoint pt = [[touches anyObject] locationInView:imageView1];
        startLocation = pt;
    }
    if ([touch view] == imageView2)
    {
        CGPoint pt = [[touches anyObject] locationInView:imageView2];
        startLocation = pt;
    }
}
// Do the same thing in the touchesMoved method...
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    if ([touch view] == imageView1)
    {
        CGPoint pt = [[touches anyObject] previousLocationInView:imageView1];
        CGFloat dx = pt.x - startLocation.x;
        CGFloat dy = pt.y - startLocation.y;
        CGPoint newCenter = CGPointMake(imageView1.center.x + dx, imageView1.center.y + dy);
        imageView1.center = newCenter;
    }
    if ([touch view] == imageView2)
    {
        CGPoint pt = [[touches anyObject] previousLocationInView:imageView2];
        CGFloat dx = pt.x - startLocation.x;
        CGFloat dy = pt.y - startLocation.y;
        CGPoint newCenter = CGPointMake(imageView2.center.x + dx, imageView2.center.y + dy);
        imageView2.center = newCenter;
    }
}
These two images will move only when you touch them and drag. I hope this helps.
Simply use these two methods:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:touch.view];
    myImage.center = location;
}

-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self touchesBegan:touches withEvent:event];
}
Let's break it down. You should have a main container view (a UIView) in your view controller, and within that, DragableImageView objects.
DragableImageView inherits from UIImageView, and the touch-handling code goes inside it. (Right now you are doing self.frame = position, which is wrong, and it obviously moves all images at once because, if I understand correctly, you are handling these touches in your view controller.)
Create a single DragableImageView first and test it. Then add two DragableImageView instances and test them in different configurations. You might also want to hit test in the containerView to decide which DragableImageView should be dragged (e.g. with jigsaw pieces, you might want to pick the one at the back rather than the one in front). A sketch of such a class follows below.
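A minimal sketch of what such a DragableImageView might look like (the class name comes from this answer; the offset handling and the init override are assumptions of mine, not the asker's code):
// DragableImageView - drags itself without snapping its center to the finger
@interface DragableImageView : UIImageView
@end

@implementation DragableImageView
{
    CGPoint _touchOffset; // vector from the touch point to the view's center
}

- (instancetype)initWithImage:(UIImage *)image
{
    self = [super initWithImage:image];
    if (self) {
        self.userInteractionEnabled = YES; // UIImageView disables touches by default
    }
    return self;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint p = [[touches anyObject] locationInView:self.superview];
    _touchOffset = CGPointMake(self.center.x - p.x, self.center.y - p.y);
    [self.superview bringSubviewToFront:self];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint p = [[touches anyObject] locationInView:self.superview];
    self.center = CGPointMake(p.x + _touchOffset.x, p.y + _touchOffset.y);
}
@end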