Rotate UIView based on position in superview - objective-c

I have a UIImageView that contains an arrow image. I added this view to a 500x500 parent view. I want to rotate the UIImageView so that the arrow always points to the center of its parent while the user touches it and moves it around.
I was looking into CGAffineTransform and trigonometry rotations, but I'm having trouble getting started. Can anyone provide some insight?
Thanks!
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];

    pointerView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"pointer.png"]];
    pointerView.layer.anchorPoint = CGPointMake(0, 0.5);
    pointerView.center = point;
    [self setPointerRotation];
    [self addSubview:pointerView];
    [pointerView release];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];

    pointerView.center = point;
    [self setPointerRotation];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [pointerView removeFromSuperview];
    pointerView = nil;
}

- (void)setPointerRotation {
    CGAffineTransform transform = // calculate rotation so that pointerView points to the center of the superview
    pointerView.transform = transform;
}
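One way to fill in that placeholder is with atan2. A minimal sketch, assuming this code lives in the 500x500 parent view (as the locationInView:self / addSubview: calls above suggest) and that the arrow image points to the right (along +x) when un-rotated, so the (0, 0.5) anchor point sits at the arrow's tail:

- (void)setPointerRotation {
    // Vector from the pointer's position (its anchor point) to the center of the superview's bounds.
    CGPoint center = CGPointMake(CGRectGetMidX(self.bounds), CGRectGetMidY(self.bounds));
    CGFloat dx = center.x - pointerView.center.x;
    CGFloat dy = center.y - pointerView.center.y;

    // In UIKit's flipped coordinate system, atan2f(dy, dx) is the clockwise angle from
    // the +x axis to that vector, which is exactly the rotation a right-pointing arrow
    // needs in order to aim at the center.
    pointerView.transform = CGAffineTransformMakeRotation(atan2f(dy, dx));
}

Because the anchor point is at the arrow's tail, the tail stays under the finger and the head swings around to face the center as the touch moves.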

Drag multiple UIImageViews in an Array

This is what I have so far
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    for (UIImageView *imageView in _imageViewArray) {
        CGPoint location = [touch locationInView:touch.view];
        imageView.center = location;
    }
}
The problem I am facing is that when I move one image, they all jump to the same place.
Thanks to cyberpawn, this is what I did to get it to work:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint oldPoint = [touch previousLocationInView:touch.view];
    CGPoint newPoint = [touch locationInView:touch.view];
    CGPoint diff = CGPointMake(newPoint.x - oldPoint.x, newPoint.y - oldPoint.y);
    for (UIImageView *imageView in _imageViewArray) {
        if (CGRectContainsPoint(imageView.frame, newPoint)) {
            CGPoint cntr = [imageView center];
            [imageView setCenter:CGPointMake(cntr.x + diff.x, cntr.y + diff.y)];
        }
    }
}
That's because you are moving them all to the same location; you need to calculate the difference between touch locations and add that displacement to each view. The code below should solve your problem. Forget touchesBegan and just override touchesMoved like this:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint oldPoint = [touch previousLocationInView:touch.view];
    CGPoint newPoint = [touch locationInView:touch.view];
    CGPoint diff = CGPointMake(newPoint.x - oldPoint.x, newPoint.y - oldPoint.y);
    for (UIImageView *imageView in _imageViewArray) {
        CGPoint cntr = [imageView center];
        [imageView setCenter:CGPointMake(cntr.x + diff.x, cntr.y + diff.y)];
    }
}
If you want to move them separately when one of them is touched, use the code below instead:
float oldX, oldY;

-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint pt = [touch locationInView:touch.view];
    for (UIImageView *imageView in _imageViewArray) {
        if (CGRectContainsPoint(imageView.frame, pt)) {
            // Remember the offset of the image's center from the touch point.
            oldX = imageView.center.x - pt.x;
            oldY = imageView.center.y - pt.y;
            break;
        }
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint pt = [touch locationInView:touch.view];
    for (UIImageView *imageView in _imageViewArray) {
        if (CGRectContainsPoint(imageView.frame, pt)) {
            // Keep the image at the same offset from the finger.
            [imageView setCenter:CGPointMake(pt.x + oldX, pt.y + oldY)];
        }
    }
}
Enjoy Programming!
That is exactly what your code does: it assigns the same location to every image view. If you want to move a single image, you have to find which image was touched and move only that one.
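A minimal sketch of that idea, assuming the touch handlers live in the view controller and adding a hypothetical activeImageView variable to remember which image was hit:

UIImageView *activeImageView;   // hypothetical: the image currently being dragged

-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint pt = [[touches anyObject] locationInView:self.view];
    activeImageView = nil;
    for (UIImageView *imageView in _imageViewArray) {
        if (CGRectContainsPoint(imageView.frame, pt)) {
            activeImageView = imageView;   // remember which image was touched
            break;
        }
    }
}

-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint oldPoint = [touch previousLocationInView:self.view];
    CGPoint newPoint = [touch locationInView:self.view];
    if (activeImageView) {
        // Move only the image found in touchesBegan, by the finger's displacement.
        activeImageView.center = CGPointMake(activeImageView.center.x + (newPoint.x - oldPoint.x),
                                             activeImageView.center.y + (newPoint.y - oldPoint.y));
    }
}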

Objective C: fix distance between images in touches moved

How can I place my images (points) at the same fixed distance from the previous image (point) while a touch is moving?
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint touchLocation = [touch locationInView:touch.view];
    UIImageView *imageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"Crayon_Black.png"]];
    imageView.center = touchLocation;
    [drawImage addSubview:imageView];
}
I hope this makes sense. I just need to finish my school project. Thanks in advance.
This solution works as long as you don't move your finger too fast:
@interface ViewController : UIViewController {
    CGPoint lastLocation_;
    CGFloat accumulatedDistance_;
}
...

-(CGFloat)distanceFromPoint:(CGPoint)p1 ToPoint:(CGPoint)p2 {
    CGFloat xDist = (p2.x - p1.x);
    CGFloat yDist = (p2.y - p1.y);
    return sqrt((xDist * xDist) + (yDist * yDist));
}

-(void)addImageAtLocation:(CGPoint)location {
    UIImageView *imageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"Crayon_Black.png"]];
    imageView.center = location;
    [self.view addSubview:imageView];
}

-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    lastLocation_ = [[touches anyObject] locationInView:self.view];
    accumulatedDistance_ = 0;
    [self addImageAtLocation:lastLocation_];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint touchLocation = [touch locationInView:self.view];

    CGFloat distance = [self distanceFromPoint:touchLocation ToPoint:lastLocation_];
    accumulatedDistance_ += distance;

    CGFloat fixedDistance = 40;
    if (accumulatedDistance_ > fixedDistance) {
        [self addImageAtLocation:touchLocation];
        while (accumulatedDistance_ > fixedDistance) {
            accumulatedDistance_ -= fixedDistance;
        }
    }

    lastLocation_ = touchLocation;
}
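If fast swipes do become a problem, one option is to interpolate extra points along the segment between lastLocation_ and the new touch location, dropping an image every fixedDistance points so no gaps appear. A rough sketch of that variant of touchesMoved, reusing the same ivars and helpers:

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint touchLocation = [[touches anyObject] locationInView:self.view];
    CGFloat fixedDistance = 40;

    CGFloat segmentLength = [self distanceFromPoint:lastLocation_ ToPoint:touchLocation];
    if (segmentLength <= 0) {
        return;
    }

    // Distance along this segment to the next image, given what is still
    // "owed" from previous moves.
    CGFloat nextDrop = fixedDistance - accumulatedDistance_;
    while (nextDrop <= segmentLength) {
        CGFloat t = nextDrop / segmentLength;   // 0..1 along the segment
        CGPoint p = CGPointMake(lastLocation_.x + t * (touchLocation.x - lastLocation_.x),
                                lastLocation_.y + t * (touchLocation.y - lastLocation_.y));
        [self addImageAtLocation:p];
        nextDrop += fixedDistance;
    }

    // Carry whatever distance is left past the last image into the next move.
    accumulatedDistance_ = segmentLength - (nextDrop - fixedDistance);
    lastLocation_ = touchLocation;
}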

How do I add images to a view by touch?

I'm working on an image editing app. Right now I have the app built so a user can choose a photo from their library or take a photo with the camera. I also have another view (a picker view) that has other images a user can choose from. Selecting one of those images takes the user back to the main photo.
I want the user to be able to touch anywhere on screen and add the image they selected.
What is the best way to approach this?
touchesBegan? touchesMoved? UITapGestureRecognizer?
If anyone knows of any sample code or can give me a general idea of how to approach this that would be great!
EDIT:
Now I am able to see the coordinates, and my UIImage is getting the image I select from my picker, but the image is not being displayed on the screen when I tap. Can someone help me troubleshoot my code, please?
-(void)drawRect:(CGRect)rect
{
    CGRect currentRect = CGRectMake(touchPoint.x, touchPoint.y, 30.0, 30.0);
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextFillRect(context, currentRect);
}

-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    touchPoint = [touch locationInView:imageView];
    NSLog(@"%f", touchPoint.x);
    NSLog(@"%f", touchPoint.y);
    if (touchPoint.x > -1 && touchPoint.y > -1)
    {
        stampedImage = _imagePicker.selectedImage;
        //[stampedImage drawAtPoint:touchPoint];
        [_stampedImageView setFrame:CGRectMake(touchPoint.x, touchPoint.y, 30.0, 30.0)];
        [_stampedImageView setImage:stampedImage];
        [imageView addSubview:_stampedImageView];
        NSLog(@"Stamped Image = %@", stampedImage);
        //[self.view setNeedsDisplay];
    }
}
For example, my NSLogs show:
162.500000
236.000000
Stamped Image = <UIImage: 0xe68a7d0>
Thanks!
In your view controller, use the method -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event to get the X and Y coordinates of where a touch happened. Here is some sample code that shows how to get the x and y of the touch:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    /* Detect touch anywhere */
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:self.view];
    NSLog(@"%f", touchPoint.x); // The x coordinate of the touch
    NSLog(@"%f", touchPoint.y); // The y coordinate of the touch
}
Once you have this x and y data, you can set the image that the user selected or shot with the built in camera to appear at those coordinates.
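For instance, right after grabbing touchPoint in the touchesBegan above, something along these lines would drop the image at the tap (just a sketch; selectedImage stands for whatever UIImage the user picked):

UIImageView *stamp = [[UIImageView alloc] initWithImage:selectedImage];
stamp.center = touchPoint;      // place the stamp where the user tapped
[self.view addSubview:stamp];
[stamp release];                // the superview retains it (omit under ARC)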
EDIT:
I think the issue MIGHT lie in how you create your UIImageView. Instead of this:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    touchPoint = [touch locationInView:imageView];
    CGRect myImageRect = CGRectMake(touchPoint.x, touchPoint.y, 20.0f, 20.0f);
    UIImageView *myImage = [[UIImageView alloc] initWithFrame:myImageRect];
    [myImage setImage:_stampedImageView.image];
    myImage.opaque = YES;
    [imageView addSubview:myImage];
    [myImage release];
}
Try this:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    touchPoint = [touch locationInView:imageView];
    myImage = [[UIImageView alloc] initWithImage:_stampedImageView.image];
    myImage.center = touchPoint; // position the stamp at the touch location
    [imageView addSubview:myImage];
    [myImage release];
}
If this does not work, try checking whether _stampedImageView.image == nil. If it is, your UIImage may not have been created properly.
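For example, a quick diagnostic along those lines (just a sketch, placed right before creating the image view):

if (_stampedImageView.image == nil) {
    // The picker never handed over an image, so nothing will appear on screen.
    NSLog(@"stamped image is nil - check _imagePicker.selectedImage");
} else {
    NSLog(@"stamped image size: %@", NSStringFromCGSize(_stampedImageView.image.size));
}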

Prevent view object from jumping when moving with finger touch

I've created a 200x200 circle in a custom UIView. I am using the following code to move the view around the screen on the iPad.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.view];
    if ([touch view] == newShape)
    {
        newShape.center = currentPoint;
    }
    [self.view setNeedsDisplay];
}
Everything works fine; I can move the circle anywhere on the screen. However, if I don't touch dead center on the circle, it jumps slightly. Reading the code, the reason is obvious: newShape.center is set to wherever the touch happens, so the view snaps to that position.
I'm looking for a way to move an object without snapping to the touch position. I'm thinking I would use xy coordinates to achieve this, but I'm not sure how to implement it.
Thanks!
Declare CGPoint prevPos; in the .h file.
Here my view is _rectView.
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.view];
    prevPos = currentPoint;
}

-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.view];
    if ([touch view] == _rectView)
    {
        float delX = currentPoint.x - prevPos.x;
        float delY = currentPoint.y - prevPos.y;
        CGPoint np = CGPointMake(_rectView.frame.origin.x + delX, _rectView.frame.origin.y + delY);
        //_rect.center = np;
        CGRect fr = _rectView.frame;
        fr.origin = np;
        _rectView.frame = fr;
    }
    //[self.view setNeedsDisplay];
    prevPos = currentPoint;
}
Use the above code. You will not get that 'jump' effect.
An obvious way to do it is to store the offset of the touch from the shape's center in -touchesBegan:withEvent: and apply the offset in -touchesMoved:withEvent:.
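A minimal sketch of that offset approach, assuming a hypothetical CGPoint touchOffset ivar and the same newShape view from the question:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    if ([touch view] == newShape) {
        CGPoint p = [touch locationInView:self.view];
        // Remember how far from the shape's center the finger landed.
        touchOffset = CGPointMake(newShape.center.x - p.x, newShape.center.y - p.y);
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    if ([touch view] == newShape) {
        CGPoint p = [touch locationInView:self.view];
        // Re-apply the stored offset so the shape keeps its original position under the finger.
        newShape.center = CGPointMake(p.x + touchOffset.x, p.y + touchOffset.y);
    }
}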

Drawing a CGRect by touching

I was wondering how to draw a CGRect onto a UIImageView. Here is the code I've got, but it doesn't seem to be working. Any suggestions? touch1 and touch2 are both CGPoints, and point is a CGRect.
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    touch1 = [touch locationInView:self];
    touch2 = [touch locationInView:self];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    touch2 = [touch locationInView:self];
    point = CGRectMake(touch2.x, touch2.y, 50, 50);
    [self setNeedsDisplay];
}

- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(context, 2.0);
    CGContextSetStrokeColorWithColor(context, [UIColor blackColor].CGColor);
    CGContextSetFillColorWithColor(context, pointColor.CGColor);
    CGContextAddEllipseInRect(context, point);
    CGContextDrawPath(context, kCGPathFillStroke);
}
What do you see actually happening?
A few notes:
You shouldn't be doing this with a UIImageView; those are specifically intended to be containers for image files. You should use a subclass of a regular UIView for this.
In your touchesBegan, you know that touch1 and touch2 will always be set to the same thing, right? You don't seem to ever be using touch1.
point is a misleading variable name for something that is a rect.
Your drawRect is not unreasonable. What is pointColor? If you're drawing black-on-black that might be part of the problem.
Listing 3-1 Code that creates an ellipse by applying a transform to a circle
CGContextScaleCTM(context, 1,2);
CGContextBeginPath(context);
CGContextAddArc(context, 0, 0, 25, 0, 2*M_PI, false);
CGContextStrokePath(context);
Lots more example code in the Quartz2d Example
I took your code and it's working for me; just have a look at it (and thanks for your code). In this sample I am drawing a rect.
- (void)drawRect:(CGRect)rect
{
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(context, 2.0);
    CGContextSetStrokeColorWithColor(context, [UIColor blackColor].CGColor);
    CGContextSetFillColorWithColor(context, [UIColor redColor].CGColor);
    CGContextAddRect(context, rectFrame);
    CGContextDrawPath(context, kCGPathFillStroke);
}

-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    startPoint = [touch locationInView:self];
    rectFrame.origin.x = startPoint.x;
    rectFrame.origin.y = startPoint.y;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    endPoint = [touch locationInView:self];
    rectFrame.size.width = endPoint.x - startPoint.x;
    rectFrame.size.height = endPoint.y - startPoint.y;
    [self setNeedsDisplay];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    endPoint = [touch locationInView:self];
    rectFrame.size.width = endPoint.x - startPoint.x;
    rectFrame.size.height = endPoint.y - startPoint.y;
    [self setNeedsDisplay];
}