I have an area where a user can add their signature to an app, to verify an order.
They sign with their touch.
The only problem is that the frame for the signature is large, so if the user draws a very small signature, the saved image is left with a wealth of empty space around it, which can look horrible when I add it to an email further down the process.
Is there any way I can crop the image to the bounds of the actual content, rather than the bounds of the box itself?
I would imagine the process involves somehow detecting the content within the space and building a CGRect to match its bounds, then passing this CGRect back to the context. But I'm not sure how to go about doing this in any way, shape or form; this is really my first time using CGContext and the graphics frameworks.
Here's my signature drawing code:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:signView];

    // Open an image context to draw into (without this, UIGraphicsGetCurrentContext() returns NULL)
    UIGraphicsBeginImageContext(drawView.frame.size);

    // Redraw the existing image, then define the stroke properties
    [drawView.image drawInRect:CGRectMake(0, 0, drawView.frame.size.width, drawView.frame.size.height)];
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
    CGContextSetLineJoin(UIGraphicsGetCurrentContext(), kCGLineJoinBevel);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 5.0);
    CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 0.0, 0.0, 0.0, 1.0);
    CGContextSetShouldAntialias(UIGraphicsGetCurrentContext(), true);
    CGContextSetAllowsAntialiasing(UIGraphicsGetCurrentContext(), true);

    // Stroke a segment from the last point to the current one
    CGContextBeginPath(UIGraphicsGetCurrentContext());
    CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
    CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
    CGContextStrokePath(UIGraphicsGetCurrentContext());

    // Save the result back to the image view and close the context
    drawView.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    lastPoint = currentPoint;
}
Thanks for your help if you can offer it.
You can do this by tracking the min and max touch values.
For example, make some min/max ivars:
@interface ViewController () {
    CGFloat touchRectMinX_;
    CGFloat touchRectMinY_;
    CGFloat touchRectMaxX_;
    CGFloat touchRectMaxY_;
    UIView *demoView_;
}
@end
Here's a demo view to visualize the rect:
- (void)viewDidLoad {
    [super viewDidLoad];
    demoView_ = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 10, 10)];
    demoView_.backgroundColor = [UIColor redColor];
    [self.view addSubview:demoView_];
}
Set them up with impossible values. You'll want to do this once per signature, not per stroke, but how you do that is up to you. I suggest a reset on 'clear', assuming you have a clear button.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    touchRectMinX_ = CGFLOAT_MAX;
    touchRectMinY_ = CGFLOAT_MAX;
    // Note: CGFLOAT_MIN is the smallest *positive* value, so use -CGFLOAT_MAX as the impossible maximum
    touchRectMaxX_ = -CGFLOAT_MAX;
    touchRectMaxY_ = -CGFLOAT_MAX;
}
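If you do reset on 'clear', a minimal sketch might look like this (clearTapped: and drawView are assumed names, not from the original answer):
- (IBAction)clearTapped:(id)sender {
    drawView.image = nil; // wipe the canvas
    touchRectMinX_ = CGFLOAT_MAX;
    touchRectMinY_ = CGFLOAT_MAX;
    touchRectMaxX_ = -CGFLOAT_MAX;
    touchRectMaxY_ = -CGFLOAT_MAX;
}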
Now record the changes
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.view];

    // Grow the tracked bounds to include the new point
    touchRectMinX_ = MIN(touchRectMinX_, currentPoint.x);
    touchRectMinY_ = MIN(touchRectMinY_, currentPoint.y);
    touchRectMaxX_ = MAX(touchRectMaxX_, currentPoint.x);
    touchRectMaxY_ = MAX(touchRectMaxY_, currentPoint.y);

    // You can use them like this:
    CGRect rect = CGRectMake(touchRectMinX_,
                             touchRectMinY_,
                             touchRectMaxX_ - touchRectMinX_,
                             touchRectMaxY_ - touchRectMinY_);
    demoView_.frame = rect;
    NSLog(@"Signature rect is: %@", NSStringFromCGRect(rect));
}
Hope this helps!
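To actually crop the saved signature to that rect, something like the following might work. This is a minimal sketch: it assumes the tracked min/max values are in the same coordinate space as drawView.image, that the image was created with UIGraphicsBeginImageContext (scale 1.0, so points equal pixels), and croppedSignatureImage is just an illustrative name:
- (UIImage *)croppedSignatureImage {
    CGRect contentRect = CGRectMake(touchRectMinX_,
                                    touchRectMinY_,
                                    touchRectMaxX_ - touchRectMinX_,
                                    touchRectMaxY_ - touchRectMinY_);
    // Pad by half the 5pt line width so the stroke edges aren't clipped
    contentRect = CGRectInset(contentRect, -2.5, -2.5);
    // CGImageCreateWithImageInRect crops in pixel coordinates
    CGImageRef croppedRef = CGImageCreateWithImageInRect(drawView.image.CGImage, contentRect);
    UIImage *cropped = [UIImage imageWithCGImage:croppedRef];
    CGImageRelease(croppedRef);
    return cropped;
}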
Related
I have an app that allows finger painting. For this I use touchesBegan, touchesMoved and touchesEnded. On iOS 6 it works smoothly: while moving the finger, the line is painted. But on iOS 7, only the first point from touchesBegan is painted, and the line only appears on touchesEnded.
Does anyone have a similar issue and/or solution?
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    mouseSwiped = YES;
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:drawImage];
    // currentPoint.y -= 20; // only for 'kCGLineCapRound'

    UIGraphicsBeginImageContext(drawImage.frame.size);
    [drawImage.image drawInRect:CGRectMake(0, 0, drawImage.frame.size.width, drawImage.frame.size.height)]; // originally self.frame.size.width, self.frame.size.height
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound); // kCGLineCapSquare, kCGLineCapButt, kCGLineCapRound
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 30.0); // brush size
    CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 0.0, 0.0, 1.0, 1.0); // values for R, G, B, and alpha
    CGContextBeginPath(UIGraphicsGetCurrentContext());
    CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
    CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
    CGContextStrokePath(UIGraphicsGetCurrentContext());
    drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    lastPoint = currentPoint;
    mouseMoved++;
    if (mouseMoved == 10) {
        mouseMoved = 0;
    }
}
EDIT and solution!
Don't try to draw or call draw routines inside the touch delegates. The newer devices might support faster and higher-resolution touch detection, and thus your app might be getting touch messages way too fast to draw in time, choking your UI run loop. Trying to update faster than 60 fps can do that.
Instead save your touch data points somewhere, and then draw later, for instance inside a polling animation loop callback (using CADisplayLink or a repeating timer), outside the touch handler, thus not choking the UI.
Comment from Lawrence Ingraham in the Apple Developer Forums
Based on this, I have made the following changes:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    currentPoint = [touch locationInView:self.view];
    [self performSelector:@selector(drawFinger) withObject:nil afterDelay:0.0];
}
- (void)drawFinger
{
    UIGraphicsBeginImageContext(imgDibujo.frame.size);
    [imgDibujo.image drawInRect:CGRectMake(0, 0, imgDibujo.frame.size.width, imgDibujo.frame.size.height)]; // originally self.frame.size.width, self.frame.size.height
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound); // kCGLineCapSquare, kCGLineCapButt, kCGLineCapRound
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), delegate.lineAncho); // brush size
    CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), delegate.colorR, delegate.colorG, delegate.colorB, delegate.colorS); // values for R, G, B, and alpha
    CGContextBeginPath(UIGraphicsGetCurrentContext());
    CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
    CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
    CGContextStrokePath(UIGraphicsGetCurrentContext());
    imgDibujo.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    lastPoint = currentPoint;
}
It now works perfectly on all iPad devices. Hope this is helpful for you.
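As an alternative to performSelector:, the CADisplayLink approach the quoted comment describes might look roughly like this. It's a sketch, not the poster's code; pendingPoints (an NSMutableArray ivar of NSValue-wrapped CGPoints) and renderPendingPoints: are assumed names:
- (void)viewDidLoad {
    [super viewDidLoad];
    pendingPoints = [[NSMutableArray alloc] init];
    CADisplayLink *link = [CADisplayLink displayLinkWithTarget:self selector:@selector(renderPendingPoints:)];
    [link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // Just record the point; no drawing in the touch handler
    CGPoint p = [[touches anyObject] locationInView:self.view];
    [pendingPoints addObject:[NSValue valueWithCGPoint:p]];
}

- (void)renderPendingPoints:(CADisplayLink *)link {
    if (pendingPoints.count == 0) return;
    UIGraphicsBeginImageContext(imgDibujo.frame.size);
    [imgDibujo.image drawInRect:CGRectMake(0, 0, imgDibujo.frame.size.width, imgDibujo.frame.size.height)];
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextSetLineCap(ctx, kCGLineCapRound);
    CGContextSetLineWidth(ctx, 30.0);
    CGContextSetRGBStrokeColor(ctx, 0.0, 0.0, 1.0, 1.0);
    // Stroke every queued segment in one pass, then clear the queue
    for (NSValue *v in pendingPoints) {
        CGPoint p = [v CGPointValue];
        CGContextMoveToPoint(ctx, lastPoint.x, lastPoint.y);
        CGContextAddLineToPoint(ctx, p.x, p.y);
        lastPoint = p;
    }
    CGContextStrokePath(ctx);
    [pendingPoints removeAllObjects];
    imgDibujo.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}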
I'm working on an image editing app. Right now I have the app built so a user can choose a photo from their library or take a photo with the camera. I also have another view (a picker view) with other images a user can choose from. Selecting one of those images takes the user back to the main photo.
I want the user to be able to touch anywhere on screen and add the image they selected.
What is the best way to approach this?
touchesBegan? touchesMoved? UITapGestureRecognizer?
If anyone knows of any sample code or can give me a general idea of how to approach this that would be great!
EDIT:
Now I am able to see the coordinates, and my UIImage is getting the image I select from my picker. But the image is not displayed on the screen when I tap. Can someone please help me troubleshoot my code:
-(void)drawRect:(CGRect)rect
{
    CGRect currentRect = CGRectMake(touchPoint.x, touchPoint.y, 30.0, 30.0);
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextFillRect(context, currentRect);
}
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    touchPoint = [touch locationInView:imageView];
    NSLog(@"%f", touchPoint.x);
    NSLog(@"%f", touchPoint.y);
    if (touchPoint.x > -1 && touchPoint.y > -1)
    {
        stampedImage = _imagePicker.selectedImage;
        //[stampedImage drawAtPoint:touchPoint];
        [_stampedImageView setFrame:CGRectMake(touchPoint.x, touchPoint.y, 30.0, 30.0)];
        [_stampedImageView setImage:stampedImage];
        [imageView addSubview:_stampedImageView];
        NSLog(@"Stamped Image = %@", stampedImage);
        //[self.view setNeedsDisplay];
    }
}
As an example of my NSLogs, I am seeing:
162.500000
236.000000
Stamped Image = <UIImage: 0xe68a7d0>
Thanks!
In your ViewController, use the method "-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event" to get the X and Y coordinates of where a touch happened. Here is some sample code that shows how to get the x and y of the touch:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    /* Detect touch anywhere */
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:self.view];
    NSLog(@"%f", touchPoint.x); // the x coordinate of the touch
    NSLog(@"%f", touchPoint.y); // the y coordinate of the touch
}
Once you have this x and y data, you can set the image that the user selected or shot with the built in camera to appear at those coordinates.
EDIT:
I think the issue MIGHT lie in how you create your UIImageView. Instead of this:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    touchPoint = [touch locationInView:imageView];
    CGRect myImageRect = CGRectMake(touchPoint.x, touchPoint.y, 20.0f, 20.0f);
    UIImageView *myImage = [[UIImageView alloc] initWithFrame:myImageRect];
    [myImage setImage:_stampedImageView.image];
    myImage.opaque = YES;
    [imageView addSubview:myImage];
    [myImage release];
}
Try this:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    touchPoint = [touch locationInView:imageView];
    myImage = [[UIImageView alloc] initWithImage:_stampedImageView.image];
    // initWithImage: sizes the frame to the image but leaves the origin at (0,0), so move it to the touch point
    myImage.frame = CGRectMake(touchPoint.x, touchPoint.y, myImage.frame.size.width, myImage.frame.size.height);
    [imageView addSubview:myImage];
    [myImage release];
}
If this does not work, try checking whether _stampedImageView.image == nil. If it is nil, your UIImage may not have been created properly.
I am new to Objective-C programming.
My problem is about drawing lines with touchesMoved.
My code is like this:
- (void)drawRect:(CGRect)rect
{
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetShouldAntialias(context, YES);
    CGContextSetLineWidth(context, 7.0f);
    CGContextSetRGBStrokeColor(context, 0.7, 0.7, 0.7, 1.0);
    CGContextMoveToPoint(context, self.touchedPoint2.x, self.touchedPoint2.y);
    CGContextAddLineToPoint(context, self.touchedPoint.x, self.touchedPoint.y);
    CGContextDrawPath(context, kCGPathFillStroke);
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    self.touchedPoint = [[touches anyObject] locationInView:self];
    [self setNeedsDisplay];
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    self.touchedPoint2 = [[touches anyObject] locationInView:self];
}
This is how drawRect: is supposed to work: the view is redrawn from scratch on every pass, so only the most recent segment survives.
You should draw everything into an offscreen buffer (e.g. a CGImage or CGLayer) and then use drawRect: only to draw the buffer.
Consider also looking into this question, which lists other possibilities.
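A minimal sketch of the buffer approach, reusing the stroke settings from your code (bufferImage is an assumed UIImage property, not from the answer):
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint p = [[touches anyObject] locationInView:self];

    // Stroke the new segment into the offscreen buffer
    UIGraphicsBeginImageContext(self.bounds.size);
    [self.bufferImage drawInRect:self.bounds]; // previous strokes
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextSetShouldAntialias(ctx, YES);
    CGContextSetLineWidth(ctx, 7.0f);
    CGContextSetRGBStrokeColor(ctx, 0.7, 0.7, 0.7, 1.0);
    CGContextMoveToPoint(ctx, self.touchedPoint2.x, self.touchedPoint2.y);
    CGContextAddLineToPoint(ctx, p.x, p.y);
    CGContextStrokePath(ctx);
    self.bufferImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    self.touchedPoint2 = p; // next segment starts here
    [self setNeedsDisplay];
}

- (void)drawRect:(CGRect)rect
{
    // Only blit the accumulated buffer
    [self.bufferImage drawInRect:self.bounds];
}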
Now, I have this code. When the touch moves, the view adds a line.
If I want to create an eraser for this line, how can I do it?
Please answer soon!
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:drawView];

    UIGraphicsBeginImageContext(drawView.frame.size);
    [drawView.image drawInRect:CGRectMake(0, 0, drawView.frame.size.width, drawView.frame.size.height)];
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), brushDimension);
    const CGFloat *components = CGColorGetComponents([brushColor CGColor]);
    CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), components[0], components[1], components[2], components[3]);
    CGContextBeginPath(UIGraphicsGetCurrentContext());
    CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
    CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
    CGContextStrokePath(UIGraphicsGetCurrentContext());
    drawView.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    lastPoint = currentPoint;
}
If you are looking for an erase function where the user can use touches to erase a portion of the line, instead of the undo provided by RickyTheCoder, you have two options.
The first option is to use a brush that has the same color as the background view, so the line is perceived as erased while it has actually just been painted over with the background color.
The second option is to use a brush with a clear color and set the blend mode to clear, so it truly erases the line and the background view remains visible.
if (isErase)
{
    CGContextSetLineWidth(currentContext, 10);
    CGContextSetStrokeColorWithColor(currentContext, [UIColor clearColor].CGColor);
    CGContextSetFillColorWithColor(currentContext, [UIColor clearColor].CGColor);
    CGContextSetBlendMode(currentContext, kCGBlendModeClear);
    CGContextDrawPath(currentContext, kCGPathStroke);
}
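For the first option, the only change is the stroke color; a minimal sketch, assuming a white background behind the canvas:
if (isErase)
{
    // Paint over the line with the (assumed white) background color
    CGContextSetLineWidth(currentContext, 10);
    CGContextSetStrokeColorWithColor(currentContext, [UIColor whiteColor].CGColor);
    CGContextDrawPath(currentContext, kCGPathStroke);
}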
I think this is what you are looking for:
http://soulwithmobiletechnology.blogspot.com/2011/06/redo-undo-in-paint-feature.html
I am building a brush app; it's almost finished, and what I have so far is just a basic brush/drawing tool. I want to give it a more brush-like feel, because in my current output the stroke has angles and doesn't look like real brush ink.
Here's my code:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    touchSwiped = YES;
    UITouch *touch = [touches anyObject];
    currentTouch = [touch locationInView:self.view];
    currentTouch.y -= 20;

    UIGraphicsBeginImageContext(self.view.frame.size);
    [touchDraw.image drawInRect:CGRectMake(0, 0, touchDraw.frame.size.width, touchDraw.frame.size.height)];
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 35.0);
    CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), redAmt, blueAmt, greenAmt, 1.0);
    CGContextBeginPath(UIGraphicsGetCurrentContext());
    CGContextMoveToPoint(UIGraphicsGetCurrentContext(), endingPoint.x, endingPoint.y);
    CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentTouch.x, currentTouch.y);
    CGContextStrokePath(UIGraphicsGetCurrentContext());
    touchDraw.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    endingPoint = currentTouch;
    touchMoved++;
    if (touchMoved == 10) {
        touchMoved = 0;
    }
}
I don't know that you'll be able to use touch events for this without getting the angles you describe. The resolution of the events just isn't fine enough.
OpenGL seems a better fit. Check out Apple's GLPaint sample code for an example.
Try using a quad curve instead of addLineToPoint. A quad curve draws a segment between two points without a sharp angle, turning your line into a smooth curve.
CGPoint midPoint(CGPoint p1, CGPoint p2)
{
    return CGPointMake((p1.x + p2.x) * 0.5, (p1.y + p2.y) * 0.5);
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event // upon moving
{
    mouseSwiped = YES;
    UITouch *touch = [touches anyObject];
    previousPoint2 = previousPoint1;
    previousPoint1 = currentTouch;
    currentTouch = [touch locationInView:self.view];

    // Midpoints between consecutive touches become the curve's endpoints
    CGPoint mid1 = midPoint(previousPoint2, previousPoint1);
    CGPoint mid2 = midPoint(currentTouch, previousPoint1);

    // Assumes a context has been set up as in your code
    // (UIGraphicsBeginImageContext, stroke color/width, etc.)
    CGContextRef context = UIGraphicsGetCurrentContext();

    // Here's your ticket to the finals: the previous touch is the control point
    CGContextMoveToPoint(context, mid1.x, mid1.y);
    CGContextAddQuadCurveToPoint(context, previousPoint1.x, previousPoint1.y, mid2.x, mid2.y);
    CGContextStrokePath(context);
}
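For this to work, the previous points need seeding when the touch starts, otherwise the first curve is drawn from stale or zero points; a minimal sketch:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    currentTouch = [touch locationInView:self.view];
    // Start all three points at the same place so the first segment has no jump
    previousPoint1 = currentTouch;
    previousPoint2 = currentTouch;
}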