Problem with custom vertical UISlider - objective-c

I'm trying to implement a vertical slider with no thumb that reacts to touches on its track. I have subclassed UISlider and everything works fine as a horizontal slider, but when I transform the slider to vertical, weird stuff happens with the coordinates. Maybe there is another way to implement this? Please point me in the right direction, thanks!
I'm using this code for the slider now:
@implementation MMTouchSlider

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint touchLocation = [touch locationInView:self];
    self.value = self.minimumValue + (self.maximumValue - self.minimumValue) * (touchLocation.x / self.frame.size.width);
    [super touchesBegan:touches withEvent:event];
}

- (CGRect)thumbRectForBounds:(CGRect)bounds trackRect:(CGRect)rect value:(float)value {
    CGRect thumbRect = [super thumbRectForBounds:bounds trackRect:rect value:value];
    return thumbRect;
}

@end

It's easy to make a vertical slider - just apply a 270-degree rotation:
UISlider *slider = [[UISlider alloc] initWithFrame:CGRectMake(100, 100, 200, 10)];
float angleInDegrees = 270;
float angleInRadians = angleInDegrees * (M_PI / 180);
slider.transform = CGAffineTransformMakeRotation(angleInRadians);
[self.view addSubview:slider];
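Note that a rotated slider still receives touches in its own, untransformed coordinate space: locationInView: reports points relative to the slider's bounds, not its rotated frame. Once a transform is applied, frame reflects the rotated bounding box (Apple documents it as undefined), which is the likely source of the "weird" coordinates. A minimal sketch of the touch handler, assuming the slider is rotated by its owner as above - note the use of bounds rather than frame:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint touchLocation = [touch locationInView:self];
    // bounds is unaffected by the transform, unlike frame
    CGFloat fraction = touchLocation.x / CGRectGetWidth(self.bounds);
    self.value = self.minimumValue + (self.maximumValue - self.minimumValue) * fraction;
    [super touchesBegan:touches withEvent:event];
}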

Related

Objective C: fix distance between images in touches moved

How can I place my images (points) at a fixed distance from the previous image (point) while handling touchesMoved?
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint touchLocation = [touch locationInView:touch.view];
    UIImageView *imageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"Crayon_Black.png"]];
    imageView.center = touchLocation;
    [drawImage addSubview:imageView];
}
I hope this makes sense. I just need to finish my school project. Thanks in advance.
This solution works as long as you don't move your finger too fast:
@interface ViewController : UIViewController {
    CGPoint lastLocation_;
    CGFloat accumulatedDistance_;
}
...

- (CGFloat)distanceFromPoint:(CGPoint)p1 ToPoint:(CGPoint)p2 {
    CGFloat xDist = (p2.x - p1.x);
    CGFloat yDist = (p2.y - p1.y);
    return sqrt((xDist * xDist) + (yDist * yDist));
}

- (void)addImageAtLocation:(CGPoint)location {
    UIImageView *imageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"Crayon_Black.png"]];
    imageView.center = location;
    [self.view addSubview:imageView];
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    lastLocation_ = [[touches anyObject] locationInView:self.view];
    accumulatedDistance_ = 0;
    [self addImageAtLocation:lastLocation_];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint touchLocation = [touch locationInView:self.view];
    CGFloat distance = [self distanceFromPoint:touchLocation ToPoint:lastLocation_];
    accumulatedDistance_ += distance;
    CGFloat fixedDistance = 40;
    if (accumulatedDistance_ > fixedDistance) {
        [self addImageAtLocation:touchLocation];
        while (accumulatedDistance_ > fixedDistance) {
            accumulatedDistance_ -= fixedDistance;
        }
    }
    lastLocation_ = touchLocation;
}
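If fast finger movement is still a problem (touch events can arrive far apart), one possible refinement - not part of the original answer, just a sketch reusing the same ivars and helpers - is to interpolate along the segment between the last and current locations so stamps land at an even spacing:

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint touchLocation = [[touches anyObject] locationInView:self.view];
    CGFloat fixedDistance = 40;
    CGFloat segmentLength = [self distanceFromPoint:lastLocation_ ToPoint:touchLocation];
    if (segmentLength > 0) {
        // Walk along the segment, dropping a stamp each time the distance
        // travelled since the last stamp reaches fixedDistance.
        CGFloat remaining = fixedDistance - accumulatedDistance_;
        while (remaining <= segmentLength) {
            CGFloat t = remaining / segmentLength;
            CGPoint p = CGPointMake(lastLocation_.x + t * (touchLocation.x - lastLocation_.x),
                                    lastLocation_.y + t * (touchLocation.y - lastLocation_.y));
            [self addImageAtLocation:p];
            remaining += fixedDistance;
        }
        accumulatedDistance_ = fmod(accumulatedDistance_ + segmentLength, fixedDistance);
    }
    lastLocation_ = touchLocation;
}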

How do I add images to a view by touch?

I'm working on an image editing app. Right now I have the app built so a user can choose a photo from their library or take a photo with the camera. I also have another view (a picker view) that has other images a user can choose from. By selecting one of the images the app takes the user back to the main photo.
I want the user to be able to touch anywhere on screen and add the image they selected.
What is the best way to approach this?
touchesBegan? touchesMoved? UITapGestureRecognizer?
If anyone knows of any sample code or can give me a general idea of how to approach this that would be great!
EDIT:
Now I am able to see the coordinates and that my UIImage is getting the image I select from my Picker. But the image is not being displayed on the screen when I tap. Can someone help me troubleshoot my code please:
- (void)drawRect:(CGRect)rect
{
    CGRect currentRect = CGRectMake(touchPoint.x, touchPoint.y, 30.0, 30.0);
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextFillRect(context, currentRect);
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    touchPoint = [touch locationInView:imageView];
    NSLog(@"%f", touchPoint.x);
    NSLog(@"%f", touchPoint.y);
    if (touchPoint.x > -1 && touchPoint.y > -1)
    {
        stampedImage = _imagePicker.selectedImage;
        //[stampedImage drawAtPoint:touchPoint];
        [_stampedImageView setFrame:CGRectMake(touchPoint.x, touchPoint.y, 30.0, 30.0)];
        [_stampedImageView setImage:stampedImage];
        [imageView addSubview:_stampedImageView];
        NSLog(@"Stamped Image = %@", stampedImage);
        //[self.view setNeedsDisplay];
    }
}
As an example, in my NSLogs I am seeing:
162.500000
236.000000
Stamped Image = <UIImage: 0xe68a7d0>
Thanks!
In your view controller, use the -touchesBegan:withEvent: method to get the x and y coordinates of where a touch happened. Here is some sample code that shows how to get the x and y of the touch:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    /* Detect a touch anywhere */
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:self.view];
    NSLog(@"%f", touchPoint.x); // The x coordinate of the touch
    NSLog(@"%f", touchPoint.y); // The y coordinate of the touch
}
Once you have this x and y data, you can make the image that the user selected or shot with the built-in camera appear at those coordinates.
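For illustration, a minimal sketch of doing that inside touchesBegan: after computing touchPoint (reusing the _imagePicker.selectedImage property and the 30x30 size from the question; the local name stamp is just for the example):

UIImageView *stamp = [[UIImageView alloc] initWithImage:_imagePicker.selectedImage];
stamp.frame = CGRectMake(touchPoint.x - 15.0, touchPoint.y - 15.0, 30.0, 30.0); // centered on the touch
[self.view addSubview:stamp];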
EDIT:
I think the issue MIGHT lie in how you create your UIImageView. Instead of this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    touchPoint = [touch locationInView:imageView];
    CGRect myImageRect = CGRectMake(touchPoint.x, touchPoint.y, 20.0f, 20.0f);
    UIImageView *myImage = [[UIImageView alloc] initWithFrame:myImageRect];
    [myImage setImage:_stampedImageView.image];
    myImage.opaque = YES;
    [imageView addSubview:myImage];
    [myImage release];
}
Try this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    touchPoint = [touch locationInView:imageView];
    myImage = [[UIImageView alloc] initWithImage:_stampedImageView.image];
    myImage.center = touchPoint; // position the new view, otherwise it appears at the view's origin
    [imageView addSubview:myImage];
    [myImage release];
}
If this does not work, try checking whether _stampedImageView.image == nil. If it is, your UIImage may not have been created properly.

Prevent view object from jumping when moving with finger touch

I've created a 200x200 circle in a custom UIView. I am using the following code to move the view object around the screen on the iPad.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.view];
    if ([touch view] == newShape)
    {
        newShape.center = currentPoint;
    }
    [self.view setNeedsDisplay];
}
Everything works fine; I can move the circle anywhere on the screen. However, if I don't touch dead center on the circle, it jumps slightly. Reading the code, this is quite obvious: newShape.center is set to wherever the touch happens, so the circle snaps to that position.
I'm looking for a way to move an object without snapping to the touch position. I'm thinking I would use xy coordinates to achieve this, but I'm not sure how to implement it.
Thanks!
Declare CGPoint prevPos; in the .h file. Here my view is _rectView.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.view];
    prevPos = currentPoint;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.view];
    if ([touch view] == _rectView)
    {
        float delX = currentPoint.x - prevPos.x;
        float delY = currentPoint.y - prevPos.y;
        CGPoint np = CGPointMake(_rectView.frame.origin.x + delX, _rectView.frame.origin.y + delY);
        //_rect.center = np;
        CGRect fr = _rectView.frame;
        fr.origin = np;
        _rectView.frame = fr;
    }
    //[self.view setNeedsDisplay];
    prevPos = currentPoint;
}
Use the above code. You will not get that 'jump' effect.
An obvious way to do it is to store the offset of the touch from the shape's center in -touchesBegan:withEvent: and apply the offset in -touchesMoved:withEvent:.
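A minimal sketch of that offset approach, assuming a CGPoint ivar named touchOffset_ (not in the original code) alongside the newShape view from the question:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    if ([touch view] != newShape) return;
    CGPoint p = [touch locationInView:self.view];
    // Remember how far from the shape's center the finger landed.
    touchOffset_ = CGPointMake(newShape.center.x - p.x, newShape.center.y - p.y);
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    if ([touch view] != newShape) return;
    CGPoint p = [touch locationInView:self.view];
    // Re-apply the original offset so the grab point stays under the finger.
    newShape.center = CGPointMake(p.x + touchOffset_.x, p.y + touchOffset_.y);
}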

Rotate UIView based on position in superview

I have a UIImageView that is an arrow image. I added this view to a parent view with a frame of 500x500 dimensions. I want to rotate the UIImageView so that the arrow is always pointing to the center of its parent when the user touches and moves it around.
I was looking into CGAffineTransform and trigonometry rotations, but I'm having trouble getting started. Can anyone provide some insight?
Thanks!
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];
    pointerView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"pointer.png"]];
    pointerView.layer.anchorPoint = CGPointMake(0, 0.5);
    pointerView.center = point;
    [self setPointerRotation];
    [self addSubview:pointerView];
    [pointerView release];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];
    pointerView.center = point;
    [self setPointerRotation];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [pointerView removeFromSuperview];
    pointerView = nil;
}

- (void)setPointerRotation {
    CGAffineTransform transform = // calculate rotation so that pointerView points to the center of the superview
    pointerView.transform = transform;
}
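The question leaves the rotation calculation open. A possible sketch - assuming the arrow artwork points along the positive x axis at zero rotation; if it points another way, add a constant offset to the angle - is to use atan2 toward the superview's center (here self is the 500x500 parent view that owns these touch handlers):

- (void)setPointerRotation {
    CGPoint center = CGPointMake(CGRectGetMidX(self.bounds), CGRectGetMidY(self.bounds));
    CGFloat dx = center.x - pointerView.center.x;
    CGFloat dy = center.y - pointerView.center.y;
    // atan2 gives the angle of the vector from the pointer to the center,
    // measured from the positive x axis (UIKit's y axis points down).
    pointerView.transform = CGAffineTransformMakeRotation(atan2(dy, dx));
}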

How to drag only one image with iPhone SDK

I want to create a little app that shows two images, and I want to make only the top image draggable.
After some research, I found this solution:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    image.alpha = 0.7;
    if ([touch view] == image) {
        CGPoint location = [touch locationInView:self.view];
        image.center = location;
    }
}
It works but the problem is that the image is draggable from its center and I don't want that.
So I found another solution:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Retrieve the touch point
    CGPoint pt = [[touches anyObject] locationInView:self.view];
    startLocation = pt;
    [[self view] bringSubviewToFront:self.view];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // Move relative to the original touch point
    CGPoint pt = [[touches anyObject] locationInView:self.view];
    frame = [self.view frame];
    frame.origin.x += pt.x - startLocation.x;
    frame.origin.y += pt.y - startLocation.y;
    [self.view setFrame:frame];
}
It works very well, but when I add another image, all the images in the view are dragged at the same time. I'm a beginner with iPhone development and I have no idea how to make only the top image draggable.
You can use an if condition in the touchesBegan: method. See this:
- (void)viewDidLoad
{
    [super viewDidLoad];
    // imageView1 and imageView2 are assumed to be instance variables,
    // so the touch methods below can compare against them.
    // Your first image view
    imageView1 = [[UIImageView alloc] initWithFrame:CGRectMake(150.0, 100.0, 30.0, 30.0)];
    [imageView1 setImage:[UIImage imageNamed:@"anyImage.png"]];
    [imageView1 setUserInteractionEnabled:YES];
    [self.view addSubview:imageView1];
    // Your second image view
    imageView2 = [[UIImageView alloc] initWithFrame:CGRectMake(150.0, 200.0, 30.0, 30.0)];
    [imageView2 setImage:[UIImage imageNamed:@"anyImage.png"]];
    [imageView2 setUserInteractionEnabled:YES];
    [self.view addSubview:imageView2];
}
// Now use the touchesBegan: method with a condition like this
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    if ([touch view] == imageView1)
    {
        CGPoint pt = [[touches anyObject] locationInView:imageView1];
        startLocation = pt;
    }
    if ([touch view] == imageView2)
    {
        CGPoint pt = [[touches anyObject] locationInView:imageView2];
        startLocation = pt;
    }
}
// Do the same thing in the touchesMoved: method...
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    if ([touch view] == imageView1)
    {
        // Use the current location (in the image view's own coordinates);
        // the delta from startLocation is how far the finger has dragged.
        CGPoint pt = [[touches anyObject] locationInView:imageView1];
        CGFloat dx = pt.x - startLocation.x;
        CGFloat dy = pt.y - startLocation.y;
        CGPoint newCenter = CGPointMake(imageView1.center.x + dx, imageView1.center.y + dy);
        imageView1.center = newCenter;
    }
    if ([touch view] == imageView2)
    {
        CGPoint pt = [[touches anyObject] locationInView:imageView2];
        CGFloat dx = pt.x - startLocation.x;
        CGFloat dy = pt.y - startLocation.y;
        CGPoint newCenter = CGPointMake(imageView2.center.x + dx, imageView2.center.y + dy);
        imageView2.center = newCenter;
    }
}
Each image will move only when you touch it and drag it. I hope this helps.
Simply use these two methods:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:touch.view];
    myImage.center = location;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self touchesBegan:touches withEvent:event];
}
Let's break it down. You should have a main containerView (a UIView) in your view controller. Within that, you should have DragableImageView objects.
DragableImageView inherits from UIImageView, and within it you add your touches code. (Right now you are doing self.frame = position, which is wrong - and obviously it moves all the images at once, because, if I understand correctly, you are adding these touch handlers in your view controller.)
Create a single DragableImageView first and test it. Then add two DragableImageView instances and test them in different configurations. You might want to hit-test in the containerView to decide which DragableImageView should be dragged (e.g. with jigsaw pieces, you might want to pick the one at the back rather than the front one). A rough sketch follows below.
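A minimal sketch of that DragableImageView idea (the class name comes from the answer above; this is an illustration of the approach, not the poster's actual code):

@interface DragableImageView : UIImageView
@end

@implementation DragableImageView

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint current = [touch locationInView:self.superview];
    CGPoint previous = [touch previousLocationInView:self.superview];
    // Move by the delta so the grab point stays under the finger - no jump to center.
    self.center = CGPointMake(self.center.x + (current.x - previous.x),
                              self.center.y + (current.y - previous.y));
}

@end

Remember that UIImageView has userInteractionEnabled set to NO by default, so switch it on for every DragableImageView you add to the containerView; each instance then handles its own touches, and only the image under the finger moves.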