Objective-C: How do I smooth the brush so it looks like a real brush? - objective-c

I am building a brush app. It's almost finished, but what I have so far is just a basic brush/drawing tool. I want to give it a more brush-like feel, because in my current output the stroke has angles and it doesn't look like real brush ink.
Here's my code:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    touchSwiped = YES;
    UITouch *touch = [touches anyObject];
    currentTouch = [touch locationInView:self.view];
    currentTouch.y -= 20;

    UIGraphicsBeginImageContext(self.view.frame.size);
    [touchDraw.image drawInRect:CGRectMake(0, 0, touchDraw.frame.size.width, touchDraw.frame.size.height)];
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 35.0);
    CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), redAmt, blueAmt, greenAmt, 1.0);
    CGContextBeginPath(UIGraphicsGetCurrentContext());
    CGContextMoveToPoint(UIGraphicsGetCurrentContext(), endingPoint.x, endingPoint.y);
    CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentTouch.x, currentTouch.y);
    CGContextStrokePath(UIGraphicsGetCurrentContext());
    touchDraw.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    endingPoint = currentTouch;
    touchMoved++;
    if (touchMoved == 10) {
        touchMoved = 0;
    }
}

I don't think you'll be able to use touch events for this without getting the angles you describe; the resolution of the touch events just isn't fine enough.
OpenGL seems a better fit. Check out Apple's GLPaint sample code for an example.

Try using a quad curve instead of addLineToPoint. A quad curve connects two points without a sharp corner, so your stroke comes out as a smooth curve:
CGPoint midPoint(CGPoint p1, CGPoint p2)
{
    return CGPointMake((p1.x + p2.x) * 0.5, (p1.y + p2.y) * 0.5);
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event // upon moving
{
    mouseSwiped = YES;
    UITouch *touch = [touches anyObject];

    // Keep the last two points so consecutive segments can be curved together.
    previousPoint2 = previousPoint1;
    previousPoint1 = currentTouch;
    currentTouch = [touch locationInView:self.view];

    CGPoint mid1 = midPoint(previousPoint2, previousPoint1);
    CGPoint mid2 = midPoint(currentTouch, previousPoint1);

    // Here's your ticket to the finals: stroke from midpoint to midpoint, using
    // the previous touch as the control point, so segments join without corners.
    // ('context' is the current drawing context, e.g. UIGraphicsGetCurrentContext().)
    CGContextMoveToPoint(context, mid1.x, mid1.y);
    CGContextAddQuadCurveToPoint(context, previousPoint1.x, previousPoint1.y, mid2.x, mid2.y);
    CGContextStrokePath(context);
}
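For reference, here is a minimal sketch of how that snippet could be dropped into the image-context drawing from the question. It is an assumption built from the two code blocks above, not tested code; touchDraw and redAmt/greenAmt/blueAmt come from the question, while previousPoint1/previousPoint2/currentTouch and midPoint come from the answer above, and the names may need adapting to your project.
// Hedged sketch (assumption, not from the original answers): the quad-curve
// stroke merged into the question's image-context drawing. Note that
// CGContextSetRGBStrokeColor takes red, green, blue, alpha in that order.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    previousPoint2 = previousPoint1;
    previousPoint1 = currentTouch;
    currentTouch = [touch locationInView:self.view];

    CGPoint mid1 = midPoint(previousPoint2, previousPoint1);
    CGPoint mid2 = midPoint(currentTouch, previousPoint1);

    UIGraphicsBeginImageContext(touchDraw.frame.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    [touchDraw.image drawInRect:CGRectMake(0, 0, touchDraw.frame.size.width, touchDraw.frame.size.height)];

    CGContextSetLineCap(context, kCGLineCapRound);
    CGContextSetLineWidth(context, 35.0);
    CGContextSetRGBStrokeColor(context, redAmt, greenAmt, blueAmt, 1.0);

    CGContextBeginPath(context);
    CGContextMoveToPoint(context, mid1.x, mid1.y);
    CGContextAddQuadCurveToPoint(context, previousPoint1.x, previousPoint1.y, mid2.x, mid2.y);
    CGContextStrokePath(context);

    touchDraw.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}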

Related

CGContextClearRect not working smoothly in touchesMoved event in objective c

I used this code and it partly works, but the image is not erased properly:
- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:_img.superview];

    UIGraphicsBeginImageContext(self.img.frame.size);
    [_img.image drawInRect:CGRectMake(0, 0, _img.frame.size.width, _img.frame.size.height)];
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 10);
    CGContextBeginPath(UIGraphicsGetCurrentContext());
    CGContextClearRect(UIGraphicsGetCurrentContext(), CGRectMake(currentPoint.x, currentPoint.y, 15, 15));
    CGContextStrokePath(UIGraphicsGetCurrentContext());
    _img.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}
I faced the same problem earlier; then I tried the code below and it worked for me. Instead of punching square holes with CGContextClearRect, it strokes the erase path with kCGBlendModeDestinationOut, so the erased area follows the stroke smoothly:
UITouch *touch = [touches anyObject];
CGPoint currentPoint = [touch locationInView:self.img_main];
// NSLog(@"Current point : %f,%f", currentPoint.x, currentPoint.y);

self.img_main.autoresizingMask = UIViewAutoresizingFlexibleHeight | UIViewAutoresizingFlexibleWidth;
self.img_main.contentMode = UIViewContentModeScaleAspectFit;

UIGraphicsBeginImageContextWithOptions(self.img_main.bounds.size, NO, 0.0);
[_img_main.image drawInRect:CGRectMake(0, 0, self.img_main.bounds.size.width, self.img_main.bounds.size.height)];
CGContextBeginPath(UIGraphicsGetCurrentContext());
CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
// For the clearing size and mode
CGContextSetLineWidth(UIGraphicsGetCurrentContext(), brush_Size);
CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeDestinationOut);
// For the clearing opacity
CGContextSetAlpha(UIGraphicsGetCurrentContext(), brush_opacity);
CGContextStrokePath(UIGraphicsGetCurrentContext());
_img_main.image = UIGraphicsGetImageFromCurrentImageContext();
CGContextFlush(UIGraphicsGetCurrentContext());
UIGraphicsEndImageContext();
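This snippet is presumably the body of touchesMoved:. As a hedged sketch (an assumption, not part of the original answer), a wrapper that also updates lastPoint so the next segment connects might look like this:
// Hypothetical wrapper around the snippet above. lastPoint, brush_Size and
// brush_opacity are assumed to be ivars set up elsewhere (e.g. in touchesBegan:).
- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.img_main];

    // ... erase-stroke drawing code from the snippet above ...

    lastPoint = currentPoint;   // remember this point so the next segment connects
}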

Rendering touchesMoved and Fingerpaint slow - iOS7

I have an app that allows finger painting. For this I use touchesBegan, touchesMoved, and touchesEnded. On iOS 6 it works smoothly: while moving the finger, the line is painted. But on iOS 7, only the first point from touchesBegan is painted, and the full line only appears on touchesEnded.
Does anyone have a similar issue and/or a solution?
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    mouseSwiped = YES;
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:drawImage];
    // currentPoint.y -= 20; // only for 'kCGLineCapRound'

    UIGraphicsBeginImageContext(drawImage.frame.size);
    [drawImage.image drawInRect:CGRectMake(0, 0, drawImage.frame.size.width, drawImage.frame.size.height)]; // originally self.frame.size.width, self.frame.size.height
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound); // kCGLineCapSquare, kCGLineCapButt, kCGLineCapRound
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 30.0); // for size
    CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 0.0, 0.0, 1.0, 1.0); // values for R, G, B, and alpha
    CGContextBeginPath(UIGraphicsGetCurrentContext());
    CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
    CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
    CGContextStrokePath(UIGraphicsGetCurrentContext());
    drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    lastPoint = currentPoint;
    mouseMoved++;
    if (mouseMoved == 10) {
        mouseMoved = 0;
    }
}
EDIT and solution!
Don't try to draw or call draw routines inside the touch delegates. The newer devices might support faster, higher-resolution touch detection, so your app might be getting touch messages way too fast to draw in time, choking your UI run loop. Trying to update faster than 60 fps can do that.
Instead, save your touch data points somewhere and draw later, for instance inside a polling animation loop callback (using CADisplayLink or a repeating timer) outside the touch handler, so the UI isn't choked.
Comment from Lawrence Ingraham in Apple Developer Forum
Based on this, I have made the following changes:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    currentPoint = [touch locationInView:self.view];
    [self performSelector:@selector(drawFinger) withObject:nil afterDelay:0.0];
}

- (void)drawFinger
{
    UIGraphicsBeginImageContext(imgDibujo.frame.size);
    [imgDibujo.image drawInRect:CGRectMake(0, 0, imgDibujo.frame.size.width, imgDibujo.frame.size.height)]; // originally self.frame.size.width, self.frame.size.height
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound); // kCGLineCapSquare, kCGLineCapButt, kCGLineCapRound
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), delegate.lineAncho); // for size
    CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), delegate.colorR, delegate.colorG, delegate.colorB, delegate.colorS); // values for R, G, B, and alpha
    CGContextBeginPath(UIGraphicsGetCurrentContext());
    CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
    CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
    CGContextStrokePath(UIGraphicsGetCurrentContext());
    imgDibujo.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    lastPoint = currentPoint;
}
It now works perfectly on all iPad devices. I hope this is helpful for you.
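A sketch that follows the quoted advice more literally uses CADisplayLink so drawing happens at most once per display refresh. This is an assumption layered on the solution above, not part of the original answer; displayLink_ and needsStroke_ are hypothetical ivars, while currentPoint and drawFinger are the ones shown above.
// Hedged CADisplayLink-based sketch: touchesMoved: only records the latest
// point, and drawTick: does the actual drawing once per frame.
// (CADisplayLink is declared in QuartzCore; import it if needed.)
- (void)viewDidLoad
{
    [super viewDidLoad];
    displayLink_ = [CADisplayLink displayLinkWithTarget:self selector:@selector(drawTick:)];
    [displayLink_ addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    currentPoint = [touch locationInView:self.view];
    needsStroke_ = YES;   // just record the point; don't draw here
}

- (void)drawTick:(CADisplayLink *)link
{
    if (!needsStroke_) return;
    needsStroke_ = NO;
    [self drawFinger];    // same drawing code as in the solution above
}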

UIGraphicsGetImageFromCurrentImageContext crop to content and not to bounding box

I have an area where a user can add their signature to an app, to verify an order.
They sign with their touch.
The only thing is, the frame for the signature is large, so if the user makes a very small signature, the saved image is left with a lot of empty space around the content, which can look horrible when I add it to an email further down the process.
Is there any way I can crop the image to the bounds of the actual content, rather than to the bounds of the box itself?
I would imagine the process would involve somehow detecting the content within the space, building a CGRect to match its bounds, and passing that CGRect back to the context. But I'm not sure how to go about doing this in any way, shape or form; this really is my first time using CGContext and the graphics frameworks.
Here's my signature drawing code:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:signView];

    // Define properties
    [drawView.image drawInRect:CGRectMake(0, 0, drawView.frame.size.width, drawView.frame.size.height)];
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
    CGContextSetLineJoin(UIGraphicsGetCurrentContext(), kCGLineJoinBevel);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 5.0);
    CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 0.0, 0.0, 0.0, 1.0);
    CGContextSetShouldAntialias(UIGraphicsGetCurrentContext(), true);
    CGContextSetAllowsAntialiasing(UIGraphicsGetCurrentContext(), true);

    // Start path
    CGContextBeginPath(UIGraphicsGetCurrentContext());
    CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
    CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
    CGContextStrokePath(UIGraphicsGetCurrentContext());

    // Save path to image
    drawView.image = UIGraphicsGetImageFromCurrentImageContext();

    lastPoint = currentPoint;
}
Thanks for your help if you can offer it.
You can do this by tracking the min and max touch values.
For example, make some min/max ivars:
@interface ViewController () {
    CGFloat touchRectMinX_;
    CGFloat touchRectMinY_;
    CGFloat touchRectMaxX_;
    CGFloat touchRectMaxY_;
    UIView *demoView_;
}
@end
Here's a demo rect:
- (void)viewDidLoad {
    [super viewDidLoad];
    demoView_ = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 10, 10)];
    demoView_.backgroundColor = [UIColor redColor];
    [self.view addSubview:demoView_];
}
Set them up with impossible values. You'll want to do this for each signature, not each stroke, but how you do that is up to you; I suggest a reset on 'clear', assuming you have a clear button.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    touchRectMinX_ = CGFLOAT_MAX;
    touchRectMinY_ = CGFLOAT_MAX;
    // CGFLOAT_MIN is the smallest positive value; it works here because touch
    // coordinates are non-negative (-CGFLOAT_MAX would be the strict sentinel).
    touchRectMaxX_ = CGFLOAT_MIN;
    touchRectMaxY_ = CGFLOAT_MIN;
}
Now record the changes
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.view];
    touchRectMinX_ = MIN(touchRectMinX_, currentPoint.x);
    touchRectMinY_ = MIN(touchRectMinY_, currentPoint.y);
    touchRectMaxX_ = MAX(touchRectMaxX_, currentPoint.x);
    touchRectMaxY_ = MAX(touchRectMaxY_, currentPoint.y);

    // You can use them like this:
    CGRect rect = CGRectMake(touchRectMinX_, touchRectMinY_, fabs(touchRectMinX_ - touchRectMaxX_), fabs(touchRectMinY_ - touchRectMaxY_));
    demoView_.frame = rect;
    NSLog(@"Signature rect is: %@", NSStringFromCGRect(rect));
}
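To then actually crop the saved signature image to that content rect (the part the question asks about), a hedged sketch could look like the following. It is an assumption, not part of the original answer; it reuses drawView.image from the question plus the tracked ivars above, and the croppedSignature method name is made up here.
// Hypothetical cropping step: crop the saved signature image to the tracked
// content rect. Assumes drawView.image holds the full-size signature.
- (UIImage *)croppedSignature
{
    CGRect contentRect = CGRectMake(touchRectMinX_,
                                    touchRectMinY_,
                                    touchRectMaxX_ - touchRectMinX_,
                                    touchRectMaxY_ - touchRectMinY_);

    UIImage *full = drawView.image;
    // Convert from points to pixels in case the image has a scale > 1.
    CGRect pixelRect = CGRectMake(contentRect.origin.x * full.scale,
                                  contentRect.origin.y * full.scale,
                                  contentRect.size.width * full.scale,
                                  contentRect.size.height * full.scale);

    CGImageRef croppedCGImage = CGImageCreateWithImageInRect(full.CGImage, pixelRect);
    UIImage *cropped = [UIImage imageWithCGImage:croppedCGImage
                                           scale:full.scale
                                     orientation:full.imageOrientation];
    CGImageRelease(croppedCGImage);
    return cropped;
}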
Hope this helps!

Rotating OpenGL-ES object

After many hours of struggle I was finally able to draw an object, but got stuck when it came to rotation. I am trying to rotate around the y-axis, but I cannot seem to make the rotation smooth in any way; it's pretty much jumping around. There might be some unnecessary bits and pieces in between, since I started from a template.
- (void)update
{
    float aspect = fabsf(self.view.bounds.size.width / self.view.bounds.size.height);
    GLKMatrix4 projectionMatrix = GLKMatrix4MakePerspective(GLKMathDegreesToRadians(65.0f), aspect, 0.1f, 100.0f);
    self.effect.transform.projectionMatrix = projectionMatrix;

    GLKMatrix4 baseModelViewMatrix = GLKMatrix4MakeTranslation(0.0f, 0.0f, -4.0f);

    // Compute the model view matrix for the object rendered with ES2
    GLKMatrix4 modelViewMatrix = GLKMatrix4MakeTranslation(0.0f, 0.0f, 0.0f);
    modelViewMatrix = GLKMatrix4Rotate(modelViewMatrix, startPoint.y, 0.0f, 1.0f, 0.0f);
    dx = dy = 0;
    modelViewMatrix = GLKMatrix4Multiply(baseModelViewMatrix, modelViewMatrix);

    _normalMatrix = GLKMatrix3InvertAndTranspose(GLKMatrix4GetMatrix3(modelViewMatrix), NULL);
    _modelViewProjectionMatrix = GLKMatrix4Multiply(projectionMatrix, modelViewMatrix);

    _rotation += self.timeSinceLastUpdate * 0.5f;
}
And my touch behaviours:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    startPoint = [touch locationInView:self.view];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self.view];
    dx = point.y - startPoint.y;
    dy = point.x - startPoint.x;
    startPoint = point;
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
}
Thanks in advance.
Going through the Apple library sample linked below may solve your problem:
http://developer.apple.com/library/ios/#samplecode/GLSprite/Introduction/Intro.html#//apple_ref/doc/uid/DTS40007325
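As an additional hedged sketch (an assumption, not from the original answer): the jumping usually comes from feeding raw screen coordinates into GLKMatrix4Rotate as if they were radians. Accumulating the touch delta into a separate angle ivar (_yRotation here is hypothetical) and scaling it down tends to smooth the rotation:
// Hypothetical smoothing: accumulate the horizontal touch delta into _yRotation
// (in radians) and use that angle in update, instead of passing startPoint.y directly.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self.view];

    // Roughly 0.01 rad of rotation per point of horizontal movement.
    _yRotation += (point.x - startPoint.x) * 0.01f;
    startPoint = point;
}

- (void)update
{
    float aspect = fabsf(self.view.bounds.size.width / self.view.bounds.size.height);
    GLKMatrix4 projectionMatrix = GLKMatrix4MakePerspective(GLKMathDegreesToRadians(65.0f), aspect, 0.1f, 100.0f);
    self.effect.transform.projectionMatrix = projectionMatrix;

    GLKMatrix4 modelViewMatrix = GLKMatrix4MakeTranslation(0.0f, 0.0f, -4.0f);
    modelViewMatrix = GLKMatrix4Rotate(modelViewMatrix, _yRotation, 0.0f, 1.0f, 0.0f);
    _modelViewProjectionMatrix = GLKMatrix4Multiply(projectionMatrix, modelViewMatrix);
}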

Drawing a CGRect by touching

I was wondering how to draw a CGRect onto a UIImageView. Here is the code I've got, but it doesn't seem to be working. Any suggestions? touch1 and touch2 are both CGPoints and point is a CGRect.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    touch1 = [touch locationInView:self];
    touch2 = [touch locationInView:self];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    touch2 = [touch locationInView:self];
    point = CGRectMake(touch2.x, touch2.y, 50, 50);
    [self setNeedsDisplay];
}

- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(context, 2.0);
    CGContextSetStrokeColorWithColor(context, [UIColor blackColor].CGColor);
    CGContextSetFillColorWithColor(context, pointColor.CGColor);
    CGContextAddEllipseInRect(context, point);
    CGContextDrawPath(context, kCGPathFillStroke);
}
What do you see actually happening?
A few notes:
You shouldn't be doing this with a UIImageView; those are specifically intended to be containers for image files. You should just use a subclass of a regular UIView for this.
In your touchesBegan, you know that touch1 and touch2 will always be set to the same thing, right? You don't seem to ever be using touch1.
point is a misleading variable name for something that is a rect.
Your drawRect is not unreasonable. What is pointColor? If you're drawing black-on-black that might be part of the problem.
Listing 3-1 Code that creates an ellipse by applying a transform to a circle
CGContextScaleCTM(context, 1,2);
CGContextBeginPath(context);
CGContextAddArc(context, 0, 0, 25, 0, 2*M_PI, false);
CGContextStrokePath(context);
Lots more example code in the Quartz2d Example
I took your code and it's working for me; just have a look at it (and thanks for your code). In this sample I am drawing a rect.
- (void)drawRect:(CGRect)rect
{
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(context, 2.0);
    CGContextSetStrokeColorWithColor(context, [UIColor blackColor].CGColor);
    CGContextSetFillColorWithColor(context, [UIColor redColor].CGColor);
    CGContextAddRect(context, rectFrame);
    CGContextDrawPath(context, kCGPathFillStroke);
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    startPoint = [touch locationInView:self];
    rectFrame.origin.x = startPoint.x;
    rectFrame.origin.y = startPoint.y;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    endPoint = [touch locationInView:self];
    // Width comes from the x delta and height from the y delta.
    rectFrame.size.width = endPoint.x - startPoint.x;
    rectFrame.size.height = endPoint.y - startPoint.y;
    [self setNeedsDisplay];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    endPoint = [touch locationInView:self];
    rectFrame.size.width = endPoint.x - startPoint.x;
    rectFrame.size.height = endPoint.y - startPoint.y;
    [self setNeedsDisplay];
}