Drawing Straight Lines with Finger on iPhone - objective-c

Background: I am trying to create a really simple iPhone app that will allow the user to draw multiple straight lines on the screen with their finger.
I'm using these two methods in my UIViewController to capture the coordinates of each line's endpoints.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
Question:
I would like the line to show up as soon as touchesEnded has fired and then keep drawing more lines on the screen. How do I do this? I don't necessarily need the code, but I need help with the big picture idea of how to put it together. Also, I'm not a huge fan of xibs and like to do things all programmatically if that affects the answer.
What I've Tried: I've tried using Quartz 2D, but it seems that in order to use it, you have to do your drawing in the drawRect: method of a separate subclassed view. So would I have to create a new view for each line? And then my coordinates would be messed up, because I'd have to translate the touch positions from the UIViewController to the view.
I've also tried with OpenGL, which I've had a bit more success with (using the GLPaint sample as a template) but OpenGL seems like overkill for just drawing some straight lines on the screen.

You don't need multiple views, and you don't need OpenGL.
Make a subclass of UIView -- call it CanvasView.
Make an object to represent "a line" in your canvas -- it would be a subclass of NSObject with CGPoint properties for start and end.
CanvasView should keep an array of the lines that are in the canvas.
In -[CanvasView drawRect:], loop through the lines in the array, and draw each one.
In -[CanvasView touchesBegan:withEvent:], stash the start point in an instance variable.
In -[CanvasView touchesEnded:withEvent:], make a new line with the start and end points, and add it to your array of lines. Call [self setNeedsDisplay] to cause the view to be redrawn.
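The pieces above fit together in roughly this sketch (class and property names like LineSegment and CanvasView are just placeholders, not anything from UIKit):

```objc
// LineSegment: a model object representing one line.
@interface LineSegment : NSObject
@property (nonatomic) CGPoint start;
@property (nonatomic) CGPoint end;
@end

@implementation LineSegment
@end

@interface CanvasView : UIView
@end

@implementation CanvasView {
    NSMutableArray *_lines;  // of LineSegment
    CGPoint _startPoint;     // stashed in touchesBegan:
}

- (void)drawRect:(CGRect)rect {
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(ctx, 2.0);
    [[UIColor blackColor] setStroke];
    for (LineSegment *line in _lines) {
        CGContextMoveToPoint(ctx, line.start.x, line.start.y);
        CGContextAddLineToPoint(ctx, line.end.x, line.end.y);
    }
    CGContextStrokePath(ctx);
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    _startPoint = [[touches anyObject] locationInView:self];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    LineSegment *line = [[LineSegment alloc] init];
    line.start = _startPoint;
    line.end = [[touches anyObject] locationInView:self];
    if (!_lines) _lines = [NSMutableArray array];
    [_lines addObject:line];
    [self setNeedsDisplay];  // triggers drawRect:, which draws every line
}
@end
```

Since all the touch handling lives in the view itself, the coordinate-translation problem from the question disappears: locationInView:self is already in the view's coordinate space.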

You mentioned that you just need the big picture of the idea, so here it is:
Take a view, a subclass of UIView.
Capture the touch events, and then draw in the drawRect: method.
This is what @KurtRevis outlined in his answer.
Now, to keep the drawing from slowing down, maintain an offscreen image that holds everything drawn so far, and just keep adding lines on top of it. That way performance won't degrade when the array of lines to be drawn grows large.
Hope that makes sense.
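One way to do that caching (a sketch; the cachedImage and startPoint properties are assumptions about how you might structure the view): render the accumulated drawing into a UIImage when each line is finished, stroke only the newest line on top of it, and let drawRect: just blit the image.

```objc
// Inside the canvas view: everything drawn so far lives in self.cachedImage,
// so drawRect: never has to replay the whole array of lines.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint end = [[touches anyObject] locationInView:self];
    UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0);
    [self.cachedImage drawAtPoint:CGPointZero];        // previous contents
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(ctx, 2.0);
    CGContextMoveToPoint(ctx, self.startPoint.x, self.startPoint.y);
    CGContextAddLineToPoint(ctx, end.x, end.y);        // stroke just the new line
    CGContextStrokePath(ctx);
    self.cachedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    [self setNeedsDisplay];
}

- (void)drawRect:(CGRect)rect {
    [self.cachedImage drawInRect:self.bounds];         // one blit, not N strokes
}
```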

As far as I know you can also use Core Graphics to draw a line, and from your question you don't need to create a view for every single line; instead, your single view's graphics context serves as the drawing sheet for all the drawing, so you're already close to the solution. Just take the touch coordinates and draw the lines on the view.
CGPoint previousPoint; // Global scope; shared by all the touch event methods

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    previousPoint = [touch locationInView:self]; // take the starting touch point
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self];
    // Note: UIGraphicsGetCurrentContext() only returns a valid context inside
    // drawRect:, so in practice you should record these points here and do
    // the actual stroking in drawRect:.
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(context, 2.0);
    CGContextSetStrokeColorWithColor(context, [UIColor blueColor].CGColor);
    CGContextMoveToPoint(context, previousPoint.x, previousPoint.y);
    CGContextAddLineToPoint(context, currentPoint.x, currentPoint.y);
    CGContextStrokePath(context); // was "bluecontext", an undefined variable
    previousPoint = currentPoint; // continue from here on the next move
}
Hope this helps; let me know if you run into any issues.

Related

Objective-C – Making several views respond to the same touch event?

As you all know, in Cocoa Touch the GUI is made up of several UIViews. In my view I have a UIImageView (containing an image) aligned at x:10 y:10, then a UILabel aligned at x:30 y:10, and finally another UILabel aligned at x:50 y:10.
Now I want all of these UIViews to respond to the same touch event. What would be the easiest way to accomplish this? Would it be to create a UIView that spans from x:10 y:10 and covers all the views, and then place this view on top of them (-bringSubviewToFront)?
Cheers,
Peter
I think your variant is fine! Just create a UIView and catch all the touch events with it.
Or you can put all your subviews (image, label, label) on one subview, mergeView, of the main view, disable user interaction on those views (image, label, label), and add a gesture recognizer (or whatever you want to catch touches) to mergeView. With this approach it is also easier to reposition your views: just reposition mergeView. Don't forget to enable user interaction on mergeView.
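A sketch of that second approach (the frames, view variables, and the handleTap: selector are just illustrative):

```objc
// Container that owns the image and labels and receives all touches.
UIView *mergeView = [[UIView alloc] initWithFrame:CGRectMake(10, 10, 200, 40)];
mergeView.userInteractionEnabled = YES;  // YES is the UIView default, but be explicit

imageView.userInteractionEnabled = NO;   // let touches fall through to mergeView
label1.userInteractionEnabled = NO;      // (UILabel defaults to NO anyway)
label2.userInteractionEnabled = NO;
[mergeView addSubview:imageView];
[mergeView addSubview:label1];
[mergeView addSubview:label2];

UITapGestureRecognizer *tap =
    [[UITapGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(handleTap:)];
[mergeView addGestureRecognizer:tap];
[self.view addSubview:mergeView];
```

Repositioning the whole group is then just a matter of changing mergeView.frame.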
Catch the touch events on the top view and explicitly pass them on to the other views you are interested in. But make sure the touch point intersects the frames of the views you forward the events to:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[touches allObjects] objectAtIndex:0];
    CGPoint point = [touch locationInView:topView];
    if (CGRectContainsPoint(view1.frame, point)) {
        [view1 touchesBegan:[NSSet setWithObject:touch] withEvent:nil];
    }
    if (CGRectContainsPoint(view2.frame, point)) {
        [view2 touchesBegan:[NSSet setWithObject:touch] withEvent:nil];
    }
}

Drag and drop UIButton but limit to boundaries

I’m not asking for givemesamplecodenow responses, just a nudge in the right direction would be much appreciated. My searches haven’t been any good.
I have a UIButton that I am able to move freely around the screen.
I would like to limit the drag area of this object so it can only be moved up and down or side to side. I believe I need to get the x,y coordinates of the boundaries and then restrict movement outside that area. But that's as far as I've got; my knowledge doesn't stretch any further.
Has anyone implemented something similar in the past?
Adam
So let's say you're in the middle of the drag operation. You're moving the button instance around by setting its center to the center of whatever gesture is causing the movement.
You can impose restrictions by testing the gesture's center and resetting the center values if you don't like them. The below assumes a button wired to an action for all Touch Drag events but the principle still applies if you're using gesture recognizers or touchesBegan: and friends.
- (IBAction)handleDrag:(UIButton *)sender forEvent:(UIEvent *)event
{
    CGPoint point = [[[event allTouches] anyObject] locationInView:self.view];
    if (point.y > 200)
    {
        point.y = 200; // No dragging this button lower than y = 200!
    }
    sender.center = point;
}
If you want a button that slides only on one axis, that's easy enough:
- (IBAction)handleDrag:(UIButton *)sender forEvent:(UIEvent *)event
{
    CGPoint point = [[[event allTouches] anyObject] locationInView:self.view];
    point.y = sender.center.y; // Always stick to the same y value
    sender.center = point;
}
Or perhaps you want the button draggable only inside the region of a specific view. This might be easier to define if your boundaries are complicated.
- (IBAction)handleDrag:(UIButton *)sender forEvent:(UIEvent *)event
{
    CGPoint point = [[[event allTouches] anyObject] locationInView:self.someView];
    if ([self.someView pointInside:point withEvent:nil])
    {
        // Only if the gesture center is inside the specified view will the button be moved
        sender.center = point;
    }
}
Presumably you'd be using touchesBegan:, touchesMoved:, etc., so it should be as simple as testing whether the touch point is outside your view's bounds in touchesMoved:. If it is outside, ignore it, but if it's inside, adjust the position of the button.
I suspect you may find this function useful:
bool CGRectContainsPoint ( CGRect rect, CGPoint point );
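Putting those pieces together, a minimal touchesMoved: sketch might look like this (the button and allowedRect properties are assumptions, not part of the question's code):

```objc
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self.view];
    if (CGRectContainsPoint(self.allowedRect, point)) {
        self.button.center = point;  // inside the boundary: move the button
    }
    // outside the boundary: ignore the touch, leaving the button where it is
}
```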

How to determine the location of a touch using view property

I'm trying to develop some simple game apps. For a Pong-styled game, I have a moving ball that stays in bounds and two paddles. I implemented code that moves paddle 1 so that it reflects the ball as expected.
When I tried to add the same behavior to the other paddle, I tried this:
UITouch *touch = [[event allTouches] anyObject]; // Picks up the touch
CGPoint location = [touch locationInView:self.view]; // Gets coordinates
// to help move the paddle on the X axis; Y axis is determined by the paddle,
// so it only moves along one axis
if ([touch view] == paddle2) {
    // move the second paddle
    ...
}
else {
    // move the first paddle
    ...
}
However, any touch only moves paddle1, indicating that the condition is never true. Based on the documentation, I thought that sending the [touch view] message would return the view in which the touch occurred.
What am I doing wrong? Is a simpler way to do this?
Do not use different views for the paddles; use one host view with a different CALayer for drawing each object.
You can move the paddles by assigning new positions to their layers.
The host view should implement the following methods
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
to find out where it was touched.
Assuming you know the frames of your paddles, you can decide whether a touch hit them or not.
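For example, the hit test and the drag might look roughly like this (paddle1Layer, paddle2Layer, and draggedPaddle are assumed properties on the host view, not anything from the question):

```objc
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint location = [[touches anyObject] locationInView:self];
    // Decide which paddle (if any) the touch landed on.
    if (CGRectContainsPoint(self.paddle2Layer.frame, location)) {
        self.draggedPaddle = self.paddle2Layer;
    } else if (CGRectContainsPoint(self.paddle1Layer.frame, location)) {
        self.draggedPaddle = self.paddle1Layer;
    } else {
        self.draggedPaddle = nil;
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint location = [[touches anyObject] locationInView:self];
    // Paddles slide along one axis only, so keep the existing y.
    [CATransaction begin];
    [CATransaction setDisableActions:YES];  // no implicit animation while dragging
    self.draggedPaddle.position = CGPointMake(location.x,
                                              self.draggedPaddle.position.y);
    [CATransaction commit];
}
```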
I hope this helps.

containsPoint doesn't work with CAShapeLayer?

I have two CAShapeLayers inside the main layer of a UIView.
The CAShapeLayers have complex shapes, and I need to know whether a touch point lies within a shape's boundaries, and also which shape was touched.
I am trying containsPoint:, but nothing works.
After banging my head against this for two days I was able to produce this bizarre code, and it looks like it is working!
The goal was to hit-test a CAShapeLayer. The CAShapeLayer is moving on the screen, so the shape is not in a constant place, and hit-testing its CGPath is not straightforward.
Feel free to add any input...
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint p = [[touches anyObject] locationInView:self];
    // Translate the touch point into the shape's own coordinate space,
    // compensating for the layer's current position.
    CGAffineTransform transf = CGAffineTransformMakeTranslation(-shapeLayer.position.x,
                                                                -shapeLayer.position.y);
    if (CGPathContainsPoint(shapeLayer.path, &transf, p, NO)) {
        // the touch is inside the shape
    }
}

Problem with touchesMoved and drawRect

I have a sticky bug that I can't seem to figure out, and I think it has something to do with the way touchesMoved is implemented.
In touchesMoved, I check where the touch is (an if statement) and then call setNeedsDisplayInRect: on a 40x40 area near the touch point. In drawRect:, a black image is put down if there was a white image there before, and vice versa. At the same time as calling setNeedsDisplayInRect:, I set a boolean in an array of booleans so I can keep track of what the current image is and therefore display the opposite. (Actually, I don't always flip the image... I look at what the first touch is going to do, like switch from black to white, and then put white images on all the subsequent touches, so it's like drawing or erasing with the images.)
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:self];
    CGPoint lastTouchPoint = [touch previousLocationInView:self];
    touchX = touchPoint.x;
    touchY = touchPoint.y;
    int lastX = (int)floor((lastTouchPoint.x+0.001)/40);
    int lastY = (int)floor((lastTouchPoint.y+0.001)/40);
    int currentX = (int)floor((touchPoint.x+0.001)/40);
    int currentY = (int)floor((touchPoint.y+0.001)/40);
    if ((abs(currentX-lastX) >= 1) || (abs(currentY-lastY) >= 1))
    {
        if ([soundArray buttonStateForRow:currentX column:currentY] == firstTouchColor) {
            [soundArray setButtonState:!firstTouchColor row:currentX column:currentY];
            [self setNeedsDisplayInRect:CGRectMake(currentX*40.0, currentY*40.0, 40.0, 40.0)];
        }
    }
}
My problem is that the boolean array seems to be out of whack with the images I'm putting down. This only happens if I drag really fast across the screen. Eventually the boolean array and the images are no longer in sync, even though I set them at the same time. Any idea what is causing this, or what I can do to fix it?
Here's my drawRect:
- (void)drawRect:(CGRect)rect {
    int cellX = (int)floor((touchX+0.001)/40);
    int cellY = (int)floor((touchY+0.001)/40);
    CGPoint cellOrigin = CGPointMake(cellX*40.0, cellY*40.0);
    if ([soundArray buttonStateForRow:cellX column:cellY])
        [whiteImage drawAtPoint:cellOrigin];
    else
        [blackImage drawAtPoint:cellOrigin];
}
I figured out the answer to this. touchX and touchY were instance variables, and they were getting reset in touchesMoved before each call to drawRect had completed. So if I moved quickly across the screen, touchesMoved would get called, then call drawRect, then touchesMoved would get called again before drawRect had used touchX and touchY, and the drawing would fall out of sync with the boolean array backend.
To solve this, I stopped using touchX and touchY in drawRect and started deriving the same point from the dirty rect passed in from touchesMoved.
tada!
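Concretely, that fix amounts to something like the following sketch, reading the cell from the rect parameter (which was captured at setNeedsDisplayInRect: time) rather than from ivars that may have changed since (soundArray, whiteImage, and blackImage are the asker's own objects):

```objc
- (void)drawRect:(CGRect)rect {
    // Each dirty rect is a 40x40 grid cell, so its origin identifies the cell
    // that was touched when setNeedsDisplayInRect: was called.
    int cellX = (int)(rect.origin.x / 40);
    int cellY = (int)(rect.origin.y / 40);
    CGPoint cellOrigin = CGPointMake(cellX * 40.0, cellY * 40.0);
    if ([soundArray buttonStateForRow:cellX column:cellY])
        [whiteImage drawAtPoint:cellOrigin];
    else
        [blackImage drawAtPoint:cellOrigin];
}
```

Because the rect travels with the redraw request, fast successive touches can no longer desynchronize the drawing from the boolean array.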