RGBStroke - Image pixels instead of color (iOS) - objective-c

I am implementing an image-mask style feature in iOS, similar to what the Blender app offers, using two images. Here is my touchesMoved code:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
CGPoint currentPoint = [touch locationInView:staticBG1];
UIGraphicsBeginImageContext(view.frame.size);
[image_1 drawInRect:CGRectMake(0, 0, view.frame.size.width, view.frame.size.height)];
CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 20.0);
CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 1.0, 0.0, 0.0, 1.0);
CGContextBeginPath(UIGraphicsGetCurrentContext());
CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
CGContextStrokePath(UIGraphicsGetCurrentContext());
image_1 = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
lastPoint = currentPoint;
mouseMoved++;
if (mouseMoved == 10)
mouseMoved = 0;
}
Now what I really want is not the bright red line, but the pixels of another image in those places. Both images have the same dimensions. How do I do it?
I tried a manual image-processing approach with per-pixel access, but it was far too slow, and this has to happen in real time.
Is there any alternative to:
CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 1.0, 0.0, 0.0, 1.0);
?

Don't draw a colour or a pattern into the path, draw transparency. You need one image in its own layer behind the image that is being wiped out. Create the path as you are now, but instead of setting the colour, set the blend mode to clear (kCGBlendModeClear).
This will remove sections of the image to allow you to see through to the image below.
Replace:
CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 1.0, 0.0, 0.0, 1.0);
with:
CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeClear);
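Pulling that together, here is a minimal sketch of the layered setup. It assumes staticBG1 is the image view that displays image_1, and it introduces bottomImageView and image_2 as hypothetical names for the second image view and the image whose pixels should show through; none of those names come from the original code.
// Setup (sketch): stack the two image views, with the revealed image underneath.
// The top image view must not be opaque, otherwise the cleared pixels render as black.
bottomImageView.image = image_2;   // pixels that should appear along the stroke
staticBG1.image = image_1;         // image being wiped away on top
[view insertSubview:bottomImageView belowSubview:staticBG1];
// In touchesMoved, keep the existing drawing code but erase instead of painting red:
CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 20.0);
CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeClear);
CGContextBeginPath(UIGraphicsGetCurrentContext());
CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
CGContextStrokePath(UIGraphicsGetCurrentContext());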

Related

Core graphics drawing stretches when orientation changes

I am making an app that lets you draw, and it works well in portrait mode. However, when I rotate the screen, the drawn image becomes distorted (squashed and stretched). What would I need to do so the drawing keeps its dimensions even though the UIImageView's dimensions have changed?
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
mouseSwiped = NO;
UITouch *touch = [touches anyObject];
lastPoint = [touch locationInView:self.view]; // first point touched
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
mouseSwiped = YES;
UITouch *touch = [touches anyObject];
CGPoint currentPoint = [touch locationInView:self.view]; // second point touched (to be connected to first point)
// initialize the UIImageView that will be drawn on
UIGraphicsBeginImageContext(self.view.frame.size);
//UIGraphicsBeginImageContextWithOptions(self.view.frame.size, NO, 0.0); // use this instead of previous line to make smoother drawings
if (!erasing) // choose tempDrawImage if not erasing
[self.tempDrawImage.image drawInRect:CGRectMake(0, 0, self.view.bounds.size.width, self.view.bounds.size.height)];
else // choose mainImage if erasing
[self.mainImage.image drawInRect:CGRectMake(0, 0, self.view.bounds.size.width, self.view.bounds.size.height)];
// draw a line with CGContextAddLineToPoint from lastPoint to currentPoint
CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
if (!erasing) { // color selected
// set brush size, opacity, and stroke color
CGContextSetLineWidth(UIGraphicsGetCurrentContext(), brush );
CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), red, green, blue, 1.0);
CGContextSetBlendMode(UIGraphicsGetCurrentContext(),kCGBlendModeNormal);
// draw the path
CGContextStrokePath(UIGraphicsGetCurrentContext());
// draw on the tempDrawImage UIImageView
self.tempDrawImage.image = UIGraphicsGetImageFromCurrentImageContext();
[self.tempDrawImage setAlpha:opacity];
}
else { // eraser selected
// set brush size
CGContextSetLineWidth(UIGraphicsGetCurrentContext(), brush);
CGContextSetBlendMode(UIGraphicsGetCurrentContext(),kCGBlendModeClear);
// draw the path
CGContextStrokePath(UIGraphicsGetCurrentContext());
// draw on the tempDrawImage UIImageView
self.mainImage.image = UIGraphicsGetImageFromCurrentImageContext();
//[self.mainImage setAlpha:1.0];
}
UIGraphicsEndImageContext();
lastPoint = currentPoint;
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
// if screen was not swiped (i.e. the screen was only tapped), draw a single point
if(!mouseSwiped) {
UIGraphicsBeginImageContext(self.view.frame.size);
//UIGraphicsBeginImageContextWithOptions(self.view.frame.size, NO, 0.0); // use this instead of previous line to make smoother drawings
if (!erasing) {
[self.tempDrawImage.image drawInRect:CGRectMake(0, 0, self.view.bounds.size.width, self.view.bounds.size.height)];
CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
CGContextSetLineWidth(UIGraphicsGetCurrentContext(), brush);
CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), red, green, blue, opacity);
CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
CGContextStrokePath(UIGraphicsGetCurrentContext());
CGContextFlush(UIGraphicsGetCurrentContext());
self.tempDrawImage.image = UIGraphicsGetImageFromCurrentImageContext();
}
else {
[self.mainImage.image drawInRect:CGRectMake(0, 0, self.view.bounds.size.width, self.view.bounds.size.height)];
CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
CGContextSetLineWidth(UIGraphicsGetCurrentContext(), brush);
CGContextSetBlendMode(UIGraphicsGetCurrentContext(),kCGBlendModeClear);
CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
CGContextStrokePath(UIGraphicsGetCurrentContext());
CGContextFlush(UIGraphicsGetCurrentContext());
self.mainImage.image = UIGraphicsGetImageFromCurrentImageContext();
}
UIGraphicsEndImageContext();
}
///////BRUSH STROKE IS DONE; BEGIN UIIMAGEVIEW MERGE//////
if (!erasing) {
// initialize mainImage (for merge)
UIGraphicsBeginImageContext(self.mainImage.frame.size);
// merge tempDrawImage with mainImage
[self.mainImage.image drawInRect:CGRectMake(0, 0, self.view.bounds.size.width, self.view.bounds.size.height) blendMode:kCGBlendModeNormal alpha:1.0];
[self.tempDrawImage.image drawInRect:CGRectMake(0, 0, self.view.bounds.size.width, self.view.bounds.size.height) blendMode:kCGBlendModeNormal alpha:opacity];
self.mainImage.image = UIGraphicsGetImageFromCurrentImageContext();
// clear tempDrawImage
self.tempDrawImage.image = nil;
UIGraphicsEndImageContext(); // matches the UIGraphicsBeginImageContext above, which is only opened when not erasing
}
}

Rendering touchesMoved and Fingerpaint slow - iOS7

I have an app that allows finger painting. For this I use touchesBegan, touchesMoved and touchesEnded. On iOS 6 it works smoothly: while moving the finger, the line is painted. But on iOS 7, only the first point from touchesBegan is painted, and the line only appears on touchesEnded.
Does anyone have a similar issue and/or a solution?
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
mouseSwiped = YES;
UITouch *touch = [touches anyObject];
CGPoint currentPoint = [touch locationInView:drawImage];
// currentPoint.y -= 20; // only for 'kCGLineCapRound'
UIGraphicsBeginImageContext(drawImage.frame.size);
[drawImage.image drawInRect:CGRectMake(0, 0, drawImage.frame.size.width, drawImage.frame.size.height)]; //originally self.frame.size.width, self.frame.size.height)];
CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound); //kCGLineCapSquare, kCGLineCapButt, kCGLineCapRound
CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 30.0); // for size
CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 0.0, 0.0, 1.0, 1.0); //values for R, G, B, and Alpha
CGContextBeginPath(UIGraphicsGetCurrentContext());
CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
CGContextStrokePath(UIGraphicsGetCurrentContext());
drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
lastPoint = currentPoint;
mouseMoved++;
if (mouseMoved == 10) {
mouseMoved = 0;
}
}
EDIT and solution!
Don't try to draw or call draw routines inside the touch delegates. Newer devices may support faster, higher-resolution touch detection, so your app might be getting touch messages too fast to draw in time, choking your UI run loop. Trying to update faster than 60 fps can do that.
Instead, save your touch data points somewhere and draw later, for instance inside a polling animation loop callback (using CADisplayLink or a repeating timer), outside the touch handler, so the UI isn't choked.
Comment from Lawrence Ingraham in Apple Developer Forum
Based on this, I have made the following changes:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
currentPoint = [touch locationInView:self.view];
[self performSelector:@selector(drawFinger) withObject:nil afterDelay:0.0];
}
- (void)drawFinger
{
UIGraphicsBeginImageContext(imgDibujo.frame.size);
[imgDibujo.image drawInRect:CGRectMake(0, 0, imgDibujo.frame.size.width, imgDibujo.frame.size.height)]; //originally self.frame.size.width, self.frame.size.height)];
CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound); //kCGLineCapSquare, kCGLineCapButt, kCGLineCapRound
CGContextSetLineWidth(UIGraphicsGetCurrentContext(), delegate.lineAncho); // for size
CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), delegate.colorR, delegate.colorG, delegate.colorB, delegate.colorS); //values for R, G, B, and Alpha
CGContextBeginPath(UIGraphicsGetCurrentContext());
CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
CGContextStrokePath(UIGraphicsGetCurrentContext());
imgDibujo.image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
lastPoint = currentPoint;
}
It now works perfectly on all iPad devices. I hope this is helpful to you.
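For completeness, here is a minimal sketch of the CADisplayLink variant described in the quoted comment, using the same imgDibujo, delegate and lastPoint names as above; the pendingPoints array and displayLink ivar are assumptions added for the sketch.
// Buffer touch points in touchesMoved and draw them once per display refresh.
- (void)viewDidLoad
{
    [super viewDidLoad];
    pendingPoints = [NSMutableArray array];
    displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(drawPendingPoints)];
    [displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint p = [touch locationInView:self.view];
    [pendingPoints addObject:[NSValue valueWithCGPoint:p]]; // just record, no drawing here
}
- (void)drawPendingPoints
{
    if (pendingPoints.count == 0)
        return;
    UIGraphicsBeginImageContext(imgDibujo.frame.size);
    [imgDibujo.image drawInRect:CGRectMake(0, 0, imgDibujo.frame.size.width, imgDibujo.frame.size.height)];
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextSetLineCap(ctx, kCGLineCapRound);
    CGContextSetLineWidth(ctx, delegate.lineAncho);
    CGContextSetRGBStrokeColor(ctx, delegate.colorR, delegate.colorG, delegate.colorB, delegate.colorS);
    CGContextBeginPath(ctx);
    CGContextMoveToPoint(ctx, lastPoint.x, lastPoint.y);
    for (NSValue *value in pendingPoints) { // connect all buffered points in one stroke
        CGPoint p = [value CGPointValue];
        CGContextAddLineToPoint(ctx, p.x, p.y);
        lastPoint = p;
    }
    CGContextStrokePath(ctx);
    imgDibujo.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    [pendingPoints removeAllObjects];
}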

iOS map - draw a couple of lines while the finger moves

In my app, I want to draw a couple of lines while the user touches and moves on the map. How can I do this? I am using the code below to draw a single line.
Code:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
CGPoint currentPoint = [touch locationInView:drawImage];
currentPoint.y -= 20;
mapView.scrollEnabled = NO;
lastpinpoint1.x = 170.000000;
lastpinpoint1.y = 327.000000;
UIGraphicsBeginImageContext(drawImage.frame.size);
CGContextRef ctx = UIGraphicsGetCurrentContext();
[imgMap drawInRect:CGRectMake(0, 0, drawImage.frame.size.width,drawImage.frame.size.height)];
CGContextSetLineCap(ctx, kCGLineCapRound);
CGContextSetLineWidth(ctx, 2.0);
// CGContextSetRGBStrokeColor(ctx, 0.0, 0.5, 0.6, 1.0);
CGContextSetStrokeColorWithColor(ctx, [[UIColor redColor] CGColor]);
CGContextBeginPath(ctx);
CGContextMoveToPoint(ctx, lastpinpoint.x, lastpinpoint.y);
CGContextMoveToPoint(ctx, lastpinpoint1.x, lastpinpoint1.y);
CGContextAddLineToPoint(ctx, currentPoint.x, currentPoint.y);
CGContextStrokePath(ctx);
drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
}
Without the line CGContextMoveToPoint(ctx, lastpinpoint1.x, lastpinpoint1.y); the code above draws a single line. I added that call to draw a second line, but it does not work. What did I do wrong? The issue is: while I move my finger on the map, I want to draw a couple of lines from different points to the one current point. Thanks in advance.
Try this answer
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event{
UITouch *touch = [touches anyObject];
lastPoint = [touch locationInView:self];
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
CGPoint currentPoint = [touch locationInView:self];
UIGraphicsBeginImageContext(self.frame.size);
[self.image drawInRect:CGRectMake(0, 0, self.frame.size.width, self.frame.size.height)];
CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 5.0);
CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 1.0 , 0.0, 1.0, 1.0);
CGContextSetBlendMode(UIGraphicsGetCurrentContext(),kCGBlendModeNormal);
CGContextStrokePath(UIGraphicsGetCurrentContext());
self.image = UIGraphicsGetImageFromCurrentImageContext();
[self setAlpha:1.0];
UIGraphicsEndImageContext();
UIGraphicsBeginImageContext(self.frame.size);
[self.image drawInRect:CGRectMake(0, 0, self.frame.size.width, self.frame.size.height)];
CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x+20, lastPoint.y);
CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x+20, currentPoint.y);
CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 5);
CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 1.0 , 0.0, 1.0, 1.0);
CGContextSetBlendMode(UIGraphicsGetCurrentContext(),kCGBlendModeNormal);
CGContextStrokePath(UIGraphicsGetCurrentContext());
self.image = UIGraphicsGetImageFromCurrentImageContext();
[self setAlpha:1.0];
UIGraphicsEndImageContext();
lastPoint = currentPoint;
}
Hope it helps!!!
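As a side note on the original code: calling CGContextMoveToPoint a second time immediately after the first simply starts a new subpath and discards the previous empty one, so only the segment from lastpinpoint1 to currentPoint ever gets stroked. A possible alternative, shown purely as a sketch using the question's own variable names, strokes both segments as separate subpaths inside a single image context instead of two:
UIGraphicsBeginImageContext(drawImage.frame.size);
CGContextRef ctx = UIGraphicsGetCurrentContext();
[imgMap drawInRect:CGRectMake(0, 0, drawImage.frame.size.width, drawImage.frame.size.height)];
CGContextSetLineCap(ctx, kCGLineCapRound);
CGContextSetLineWidth(ctx, 2.0);
CGContextSetStrokeColorWithColor(ctx, [[UIColor redColor] CGColor]);
// First segment: previous touch point to the current point.
CGContextBeginPath(ctx);
CGContextMoveToPoint(ctx, lastpinpoint.x, lastpinpoint.y);
CGContextAddLineToPoint(ctx, currentPoint.x, currentPoint.y);
// Second segment: the fixed pin point to the same current point (a new subpath).
CGContextMoveToPoint(ctx, lastpinpoint1.x, lastpinpoint1.y);
CGContextAddLineToPoint(ctx, currentPoint.x, currentPoint.y);
CGContextStrokePath(ctx); // strokes both subpaths in one call
drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();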

How can I create an eraser for this line in Xcode?

I have this code. When the touch moves, the view adds a line.
Now, if I want to create an eraser for this line, how can I do it?
Please answer soon!
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch *touch = [touches anyObject];
CGPoint currentPoint = [touch locationInView:drawView];
UIGraphicsBeginImageContext(drawView.frame.size);
[drawView.image drawInRect:CGRectMake(0, 0, drawView.frame.size.width, drawView.frame.size.height)];
CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
CGContextSetLineWidth(UIGraphicsGetCurrentContext(), brushDimension);
const CGFloat *components = CGColorGetComponents([brushColor CGColor]);
CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), components[0], components[1], components[2], components[3]);
CGContextBeginPath(UIGraphicsGetCurrentContext());
CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
CGContextStrokePath(UIGraphicsGetCurrentContext());
drawView.image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
lastPoint = currentPoint;
}
If you are looking for an erase function where the user erases portions of the line with touches, instead of the undo provided by RickyTheCoder, you have two options.
The first option is to use a brush with the same color as the background view: the line appears to be erased, while it is actually just painted over with the background color.
The second option is to use a brush with a clear color and set the blend mode to clear, so it erases the line while the background view remains visible.
if (isErase)
{
CGContextSetLineWidth(currentContext, 10);
CGContextSetStrokeColorWithColor(currentContext, [UIColor clearColor].CGColor);
CGContextSetFillColorWithColor(currentContext, [UIColor clearColor].CGColor);
CGContextSetBlendMode(currentContext, kCGBlendModeClear);
CGContextDrawPath(currentContext, kCGPathStroke);
}
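For the first option, a minimal sketch would simply stroke with the background's color instead of clearing; self.view.backgroundColor is an assumption here and should be whatever color actually sits behind the drawing:
// Sketch of option 1: paint over the line with the background's color.
// The "erased" pixels are not transparent; they merely match the background.
CGContextSetLineWidth(currentContext, 10);
CGContextSetStrokeColorWithColor(currentContext, self.view.backgroundColor.CGColor);
CGContextSetBlendMode(currentContext, kCGBlendModeNormal);
CGContextDrawPath(currentContext, kCGPathStroke);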
I think this is what you are looking for:
http://soulwithmobiletechnology.blogspot.com/2011/06/redo-undo-in-paint-feature.html

Adding an Image to a Current UIGraphics Context

I've got a slideshow that allows users to annotate slides with a simple drawing tool. It just lets you draw on the screen with your finger and then 'save'. The save feature uses UIImagePNGRepresentation and works rather well. What I need to work out is how to 'continue' existing annotations, so that when a save happens it also takes into account what is already on the slide.
It works by using a UIGraphics image context and saving that context's image to a file. When an image is saved it opens onto an overlay UIImageView, so if you are 'continuing' you are drawing onto an existing PNG file.
Is there a way I can add an existing image to the UIGraphics image context? Here is where I control the addition of the lines on movement:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
if(drawToggle){
UITouch *touch = [touches anyObject];
CGPoint currentPoint = [touch locationInView:self.view];
currentPoint.y -= 40;
//Define Properties
[drawView.image drawInRect:CGRectMake(0, 0, drawView.frame.size.width, drawView.frame.size.height)];
CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
CGContextSetLineJoin(UIGraphicsGetCurrentContext(), kCGLineJoinBevel);
CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 5.0);
CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 1.0, 0.0, 0.0, 1.0);
//Start Path
CGContextBeginPath(UIGraphicsGetCurrentContext());
CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
CGContextStrokePath(UIGraphicsGetCurrentContext());
//Save Path to Image
drawView.image = UIGraphicsGetImageFromCurrentImageContext();
lastPoint = currentPoint;
}
}
Here is the magic saving line:
NSData *saveDrawData = UIImagePNGRepresentation(UIGraphicsGetImageFromCurrentImageContext());
NSError *error = nil;
[saveDrawData writeToFile:dataFilePath options:NSDataWritingAtomic error:&error];
Thanks for any help you can offer.
UPDATE:
Oops, I forgot to add: when an annotation is 'saved' the image context is ended, so I can't use any Get Current Image Context style methods.
I achieved this by adding this between my Start and End lines:
UIImage *image = [[UIImage alloc] initWithContentsOfFile:saveFilePath];
CGRect imageRect = CGRectMake(0, 0, image.size.width, image.size.height);
CGContextTranslateCTM(UIGraphicsGetCurrentContext(), 0, image.size.height);
CGContextScaleCTM(UIGraphicsGetCurrentContext(), 1.0, -1.0);
CGContextDrawImage(UIGraphicsGetCurrentContext(), imageRect, image.CGImage);
The context translate and scale are necessary because CGContextDrawImage flips the image: a CGImage is drawn from a bottom-left origin while a UIImage is drawn from a top-left origin, and the same coordinates in a flipped coordinate system produce a flipped image.
Because I drew the existing image into UIGraphicsGetCurrentContext(), the saved file takes it into account.
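As an aside, the flip can also be avoided by drawing the existing image with UIKit instead of Core Graphics; UIImage's drawInRect: already works in the top-left-origin coordinate system, so the following sketch is equivalent without any CTM translate or scale:
// Drawing via UIImage keeps UIKit's top-left origin, so no flip compensation is needed.
UIImage *image = [[UIImage alloc] initWithContentsOfFile:saveFilePath];
[image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];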