How can I zoom in and out in a UIImageView without UIScrollView? - objective-c

I'm developing an app for iOS 4.2 (iPhone). In this app I download images and save them to internal storage (the app's Documents directory).
I then show the first image in a UIImageView. The user can drag a finger across the UIImageView (touchesMoved); when they do, I load another image: dragging down loads one image, dragging up loads another, and likewise for right and left.
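For reference, a minimal sketch of how one of the saved images might be loaded from the Documents directory (the file name here is hypothetical, not from the question):
// Sketch: load a previously downloaded image from the app's Documents directory.
// "photo1.png" is a hypothetical file name.
NSString *documentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *path = [documentsDir stringByAppendingPathComponent:@"photo1.png"];
imagen.image = [UIImage imageWithContentsOfFile:path];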
This is all done, but now I want to implement zooming. This is my code so far:
initialDistance --> is the distance between the fingers at first touch
finalDistance --> is the distance between the fingers each time they move
x --> is 0
y --> is 0
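A minimal sketch of how these might be declared, assuming they are ivars on the view controller (the class name is hypothetical):
// Sketch: assumed ivar declarations; the question only lists the names.
@interface ImageBrowserViewController : UIViewController {
    UIImageView *imagen;      // the image view being zoomed
    CGFloat initialDistance;  // finger distance captured in touchesBegan
    CGFloat x;                // frame origin x, starts at 0
    CGFloat y;                // frame origin y, starts at 0
}
@end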
// This method calculates the distance between two fingers
- (CGFloat)distanceBetweenTwoPoints:(CGPoint)fromPoint toPoint:(CGPoint)toPoint {
    float xPoint = toPoint.x - fromPoint.x;
    float yPoint = toPoint.y - fromPoint.y;
    return sqrt(xPoint * xPoint + yPoint * yPoint);
}
//------------------- Finger movements ------------------------------------
#pragma mark -
#pragma mark UIResponder
// First Touch
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event{
    NSSet *allTouches = [event allTouches];
    switch ([allTouches count]) {
        case 1: { // Single touch
            // Get the first touch.
            UITouch *touch1 = [[allTouches allObjects] objectAtIndex:0];
            switch ([touch1 tapCount])
            {
                case 1: // Single tap.
                {
                    // Save the first finger location on the initial touch
                    //inicial = [touch1 locationInView:self.view];
                } break;
                case 2: { // Double tap.
                    //Track the initial distance between two fingers.
                    //if ([[allTouches allObjects] count] >= 2) {
                    // Show/hide the top bar when a double tap happens
                    //[self switchToolBar];
                } break;
            }
        } break;
        case 2: { // Double touch
            // Calculate the initial distance between the fingers when the touch begins
            UITouch *touch1 = [[allTouches allObjects] objectAtIndex:0];
            UITouch *touch2 = [[allTouches allObjects] objectAtIndex:1];
            initialDistance = [self distanceBetweenTwoPoints:[touch1 locationInView:[self view]]
                                                     toPoint:[touch2 locationInView:[self view]]];
        } break;
        default:
            break;
    }
}
// Called when the finger(s) move
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSSet *allTouches = [event allTouches];
    switch ([allTouches count])
    {
        case 1: {
        } break;
        case 2: {
            // The image is being zoomed in or out.
            UITouch *touch1 = [[allTouches allObjects] objectAtIndex:0];
            UITouch *touch2 = [[allTouches allObjects] objectAtIndex:1];
            // Calculate the distance between the two fingers.
            CGFloat finalDistance = [self distanceBetweenTwoPoints:[touch1 locationInView:[self view]]
                                                           toPoint:[touch2 locationInView:[self view]]];
            NSLog(@"Initial distance :: %.f, final distance :: %.f", initialDistance, finalDistance);
            float factorX = 20.0;
            float factorY = 11.0;
            // Save the position of the two fingers
            //CGPoint dedo1 = [[[touches allObjects] objectAtIndex:0] locationInView:self.view];
            //CGPoint dedo2 = [[[touches allObjects] objectAtIndex:1] locationInView:self.view];
            // Compare to find out whether the user is zooming in or out
            if (initialDistance < finalDistance) {
                NSLog(@"Zoom In");
                float newWidth = imagen.frame.size.width + (initialDistance - finalDistance + factorX);
                float newHeight = imagen.frame.size.height + (initialDistance - finalDistance + factorY);
                if (newWidth <= 960 && newHeight <= 640) {
                    /*
                    if (dedo1.x >= dedo2.x) {
                        x = (dedo1.x + finalDistance/2);
                        y = (dedo1.y + finalDistance/2);
                    } else {
                        x = (dedo2.x + finalDistance/2);
                        y = (dedo2.y + finalDistance/2);
                    }
                    */
                    //x = (dedo1.x);
                    //y = (dedo1.y);
                    imagen.frame = CGRectMake(x, y, newWidth, newHeight);
                } else {
                    imagen.frame = CGRectMake(x, y, 960, 640);
                }
            }
            else {
                NSLog(@"Zoom Out");
                float newWidth = imagen.frame.size.width - (finalDistance - initialDistance + factorX);
                float newHeight = imagen.frame.size.height - (finalDistance - initialDistance + factorY);
                if (newWidth >= 480 && newHeight >= 320) {
                    /*
                    if (dedo1.x >= dedo2.x) {
                        x = (dedo1.x + finalDistance/2);
                        y = (dedo1.y + finalDistance/2);
                    } else {
                        x = (dedo2.x + finalDistance/2);
                        y = (dedo2.y + finalDistance/2);
                    }
                    */
                    //x -= (finalDistance - initialDistance + factorX);
                    //y -= (finalDistance - initialDistance + factorX);
                    //x = (dedo1.x);
                    //y = (dedo1.y);
                    imagen.frame = CGRectMake(x, y, newWidth, newHeight);
                } else {
                    imagen.frame = CGRectMake(0, 0, 480, 320);
                }
            }
            initialDistance = finalDistance;
        } break;
    }
}
#pragma mark -
Thank you very much!!
PS: If there is a way with UIScrollView that lets me move between different images, I'm open to taking a look at that too.

Ok. You can consider using a UIScrollView if only to use it for its zoom functionality.
Say we have a scrollView and an imageView, both with the same bounds. Add the imageView as a subview of the scrollView.
[scrollView addSubview:imageView];
scrollView.contentSize = imageView.frame.size;
To support only zooming and not panning in the scrollView, your viewController will have to adopt the UIScrollViewDelegate protocol.
// Disabling panning/scrolling in the scrollView
scrollView.scrollEnabled = NO;
// For supporting zoom,
scrollView.minimumZoomScale = 0.5;
scrollView.maximumZoomScale = 2.0;
...
// Implement a single scroll view delegate method
- (UIView *)viewForZoomingInScrollView:(UIScrollView *)aScrollView {
    return imageView;
}
By now we have zooming available. For swipes, you can use appropriately configured UISwipeGestureRecognizers. Create a gesture recognizer for each swipe direction and add it to the scroll view.
UISwipeGestureRecognizer *rightSwipe = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleRightSwipe:)];
rightSwipe.direction = UISwipeGestureRecognizerDirectionRight;
rightSwipe.numberOfTouchesRequired = 1;
[scrollView addGestureRecognizer:rightSwipe];
[rightSwipe release];
And retrieve the appropriate image based on the gesture and set it using imageView.image = yourImage;.
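A minimal sketch of such a handler, assuming hypothetical images and currentIndex ivars (neither is in the original answer):
// Sketch only: `images` (NSArray) and `currentIndex` (NSInteger) are assumed ivars.
- (void)handleRightSwipe:(UISwipeGestureRecognizer *)gesture {
    if (currentIndex > 0) {
        currentIndex--;
        imageView.image = [images objectAtIndex:currentIndex];
        scrollView.zoomScale = 1.0;   // reset any zoom before showing the next image
    }
}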

Finally, with Deepak's help, I used the transform property of UIImageView to do the zooming.
I use the value at position [0,0] of the CGAffineTransform matrix (the .a component) to set the limits. When zooming out I have to convert the CGFloat to a string, because comparing it directly with 0 always evaluated as true.
This is my code in touchesMoved, in case it's useful for someone:
// Compare to find out whether the user is zooming in or out
if (initialDistance < finalDistance) {
    NSLog(@"Zoom In");
    CGAffineTransform transformer = CGAffineTransformScale(imagen.transform, 1.05, 1.05);
    NSLog(@"transformer :: A: %.f, B: %.f, C: %.f, D: %.f", imagen.transform.a, imagen.transform.b, imagen.transform.c, imagen.transform.d);
    if (transformer.a < 5) {
        [UIView beginAnimations:nil context:NULL];
        [UIView setAnimationDuration:0.2];
        imagen.transform = transformer;
        [UIView setAnimationDelegate:self];
        [UIView commitAnimations];
    }
}
else {
    NSLog(@"Zoom Out");
    CGAffineTransform transformer = CGAffineTransformScale(imagen.transform, 0.95, 0.95);
    NSLog(@"transformer :: A: %.f, B: %.f, C: %.f, D: %.f", transformer.a, transformer.b, transformer.c, transformer.d);
    NSString *num = [NSString stringWithFormat:@"%.f", transformer.a];
    if (![num isEqualToString:@"0"]) {
        [UIView beginAnimations:nil context:NULL];
        [UIView setAnimationDuration:0.2];
        imagen.transform = transformer;
        [UIView setAnimationDelegate:self];
        [UIView commitAnimations];
    }
}
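As a side note, a sketch of the same lower limit without the string round-trip, assuming a minimum scale of 0.5 (the threshold is an assumption, not from the post):
// Sketch: clamp the scale numerically instead of formatting it into a string.
// %.f rounds to the nearest integer, so comparing against 0.5 gives a similar cut-off.
if (transformer.a > 0.5) {
    imagen.transform = transformer;
}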
And of course, a lot of thanks to Deepak.

Related

UIView not dragging smoothly with UIPanGestureRecognizer when using an Auto Layout constraint

I want to drag a view up vertically using a UIPanGestureRecognizer. Since my project uses Auto Layout, I created 4 layout constraints [top, right, bottom, left] and made an outlet for the top-space constraint. When the gesture is recognized, the top-space constraint's constant is changed to move the view's origin Y. The code works, but the dragging is not smooth. How can I make it smooth?
-(void)gestureAction:(UIPanGestureRecognizer *)gesture
{
    CGFloat targetY = 0;
    if (gesture.state == UIGestureRecognizerStateBegan)
    {
        self.panCoord = [gesture locationInView:gesture.view];
    }
    CGPoint newCoord = [gesture locationInView:gesture.view];
    float dY = newCoord.y - self.panCoord.y;
    float newOriginY = (gesture.view.frame.origin.y + dY);
    dispatch_async(dispatch_get_main_queue(), ^(void){
        self.propertyDetailContentViewTopConstraint.constant = newOriginY;
    });
    if (gesture.state == UIGestureRecognizerStateEnded) {
        if (gesture.view.frame.origin.y * 0.8 < 100) {
            targetY = 0;
        } else if (gesture.view.frame.origin.y * 0.8 < 250) {
            targetY = 250;
        } else {
            targetY = MAIN_SCREEN_HEIGHT;
        }
        [self setPropertyDetailContentViewTopConstraintTop:targetY];
    }
}
-(void)setPropertyDetailContentViewTopConstraintTop:(CGFloat)top
{
    [self.view layoutIfNeeded];
    self.propertyDetailContentViewTopConstraint.constant = top;
    [UIView animateWithDuration:0.5
                     animations:^{
                         [self.view layoutIfNeeded];
                     }];
}
@Rokon please use the following code, maybe it helps you. Please use your view instead of "DrawImageView".
-(void)moveViewWithGestureRecognizer:(UIPanGestureRecognizer *)panGestureRecognizer{
    NSUInteger touches = panGestureRecognizer.numberOfTouches;
    CGPoint translation = [panGestureRecognizer translationInView:self.view];
    self.DrawImageView.center = CGPointMake(self.DrawImageView.center.x + translation.x,
                                            self.DrawImageView.center.y + translation.y);
    [panGestureRecognizer setTranslation:CGPointMake(0, 0) inView:self.view];
    if (panGestureRecognizer.state == UIGestureRecognizerStateEnded) {
        CGPoint velocity = [panGestureRecognizer velocityInView:self.view];
        CGFloat magnitude = sqrtf((velocity.x * velocity.x) + (velocity.y * velocity.y));
        CGFloat slideMult = magnitude / 200;
        // NSLog(@"magnitude: %f, slideMult: %f", magnitude, slideMult);
        float slideFactor = 0.1 * slideMult; // Increase for more of a slide
        CGPoint finalPoint = CGPointMake(self.DrawImageView.center.x + (velocity.x * slideFactor),
                                         self.DrawImageView.center.y + (velocity.y * slideFactor));
        finalPoint.x = MIN(MAX(finalPoint.x, 0), self.view.bounds.size.width);
        finalPoint.y = MIN(MAX(finalPoint.y, 0), self.view.bounds.size.height);
        [UIView animateWithDuration:slideFactor*2 delay:0 options:UIViewAnimationOptionCurveEaseOut animations:^{
            self.DrawImageView.center = finalPoint;
        } completion:nil];
        // [self.frontImageView setAlpha:1.0];
    }
}
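A small sketch of wiring that recognizer up (assumed, not shown in the original answer):
// Sketch: attach the pan recognizer to the view, e.g. in viewDidLoad.
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(moveViewWithGestureRecognizer:)];
[self.DrawImageView addGestureRecognizer:pan];
self.DrawImageView.userInteractionEnabled = YES;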

Tableview scroll to top not going all the way

I have a navigation bar whose frame I manually animate depending on the offset of the scroll view (a table view). (The original question included a screenshot of the unscrolled state.)
Now after setContentOffset:(0,0) is invoked (by scrollsToTop when tapping the status bar, not by me manually), the scroll view scrolls to the top, but to the position at which there used to be no navigation bar. I can manually scroll the last 44px or so after the animation finishes, but obviously that's not the behavior that's expected.
Here is my code for hiding the navbar:
- (void)scrollViewDidScroll:(UIScrollView *)scrollView
{
    CGRect frame = self.navigationController.navigationBar.frame;
    CGFloat size = frame.size.height - kMacro1;
    CGFloat framePercentageHidden = ((kMacro2 - frame.origin.y) / (frame.size.height - 1));
    CGFloat scrollOffset = scrollView.contentOffset.y;
    CGFloat scrollDiff = scrollOffset - self.previousScrollViewYOffset;
    CGFloat scrollHeight = scrollView.frame.size.height;
    CGFloat scrollContentSizeHeight = scrollView.contentSize.height + scrollView.contentInset.bottom;
    if (scrollOffset <= -scrollView.contentInset.top) {
        frame.origin.y = kMacro2;
    } else if ((scrollOffset + scrollHeight) >= scrollContentSizeHeight) {
        frame.origin.y = -size;
    } else {
        frame.origin.y = MIN(kMacro2, MAX(-size, frame.origin.y - scrollDiff));
    }
    [self.navigationController.navigationBar setFrame:frame];
    [self updateBarButtonItems:(1 - framePercentageHidden)];
    self.previousScrollViewYOffset = scrollOffset;
}
- (void)scrollViewDidEndDecelerating:(UIScrollView *)scrollView
{
    [self stoppedScrolling];
}
- (void)scrollViewDidEndDragging:(UIScrollView *)scrollView
                  willDecelerate:(BOOL)decelerate
{
    if (!decelerate) {
        [self stoppedScrolling];
    }
}
- (void)stoppedScrolling
{
    CGRect frame = self.navigationController.navigationBar.frame;
    if (frame.origin.y < kMacro2) {
        [self animateNavBarTo:-(frame.size.height - kMacro1)];
    }
}
- (void)animateNavBarTo:(CGFloat)y
{
    [UIView animateWithDuration:0.2 animations:^{
        CGRect frame = self.navigationController.navigationBar.frame;
        CGFloat alpha = (frame.origin.y >= y ? 0 : 1);
        frame.origin.y = y;
        [self.navigationController.navigationBar setFrame:frame];
        [self updateBarButtonItems:alpha];
    }];
}
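One hedged sketch of a direction to try (not from the original thread): restore the bar when the status-bar tap finishes scrolling, reusing the question's own helpers and the kMacro2 resting offset.
// Sketch: UIScrollViewDelegate calls this after a status-bar tap scrolls to the top.
- (void)scrollViewDidScrollToTop:(UIScrollView *)scrollView
{
    [self animateNavBarTo:kMacro2];   // bring the bar back to its visible position
    [self updateBarButtonItems:1.0];  // restore full alpha on the bar buttons
    self.previousScrollViewYOffset = scrollView.contentOffset.y;
}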

Bounce UIImageView back when dragged off screen

What I need is for a UIImageView to bounce back when it is dragged off of the screen and then let go. I have it working for the left and top sides; this is what I am doing.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!CGRectContainsPoint(self.view.frame, imageView.frame.origin)){
        CGFloat newX = 0.0f;
        CGFloat newY = 0.0f;
        // If off screen upper and left
        if (imageView.frame.origin.x < 0.0f){
            CGFloat negX = imageView.frame.origin.x * -1;
            newX = negX;
        } else {
            newX = imageView.frame.origin.x;
        }
        if (imageView.frame.origin.y < 0.0f){
            CGFloat negY = imageView.frame.origin.y * -1;
            newY = negY;
        } else {
            newY = imageView.frame.origin.y;
        }
        CGRect newPoint = CGRectMake(newX, newY, imageView.frame.size.width, imageView.frame.size.height);
        [UIView beginAnimations:@"BounceAnimations" context:nil];
        [UIView setAnimationDuration:.5];
        [letterOutOfBounds play];
        [imageView setFrame:newPoint];
        [UIView commitAnimations];
    }
}
So I would like to achieve the same thing for the right and bottom sides, but I have been stuck on this for a while. Any ideas?
How about something like the following?
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UIImageView *imageView = nil; // the image view being dragged
    BOOL moved = NO;
    CGRect newPoint = imageView.frame;
    // If off screen left
    if (newPoint.origin.x < 0.0f){
        newPoint.origin.x *= -1.0;
        moved = YES;
    }
    // if off screen up
    if (newPoint.origin.y < 0.0f){
        newPoint.origin.y *= -1.0;
        moved = YES;
    }
    // if off screen right
    CGFloat howFarOffRight = (newPoint.origin.x + newPoint.size.width) - imageView.superview.frame.size.width;
    if (howFarOffRight > 0.0)
    {
        newPoint.origin.x -= howFarOffRight * 2;
        moved = YES;
    }
    // if off screen bottom
    CGFloat howFarOffBottom = (newPoint.origin.y + newPoint.size.height) - imageView.superview.frame.size.height;
    if (howFarOffBottom > 0.0)
    {
        newPoint.origin.y -= howFarOffBottom * 2;
        moved = YES;
    }
    if (moved)
    {
        [UIView beginAnimations:@"BounceAnimations" context:nil];
        [UIView setAnimationDuration:.5];
        [letterOutOfBounds play];
        [imageView setFrame:newPoint];
        [UIView commitAnimations];
    }
}
As I read your code, the logic is: if the view is off the left side, move it back onto the view by the same distance it was off the screen. To be honest, that doesn't quite make sense to me (why, when bouncing back, should the coordinate depend upon how far off the screen it was?), but I've tried to honor it in the "off screen right" and "off screen bottom" logic. Obviously my logic is using the superview of imageView to determine the width of the containing view, but if that's not appropriate, replace it with whatever is.
Edit:
I personally do this stuff with gesture recognizers, such as:
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)];
[self.imageView addGestureRecognizer:pan];
self.imageView.userInteractionEnabled = YES;
Thus, a gesture recognizer to animate moving the image back would be:
- (void)handlePan:(UIPanGestureRecognizer *)gesture
{
    static CGRect originalFrame; // you could make this an ivar if you want, but just for demonstration purposes
    if (gesture.state == UIGestureRecognizerStateBegan)
    {
        originalFrame = self.imageView.frame;
    }
    else if (gesture.state == UIGestureRecognizerStateChanged)
    {
        CGPoint translate = [gesture translationInView:gesture.view];
        CGRect newFrame = originalFrame;
        newFrame.origin.x += translate.x;
        newFrame.origin.y += translate.y;
        gesture.view.frame = newFrame;
    }
    else if (gesture.state == UIGestureRecognizerStateEnded || gesture.state == UIGestureRecognizerStateCancelled)
    {
        CGRect newFrame = gesture.view.frame;
        newFrame.origin.x = fmaxf(newFrame.origin.x, 0.0);
        newFrame.origin.x = fminf(newFrame.origin.x, gesture.view.superview.bounds.size.width - newFrame.size.width);
        newFrame.origin.y = fmaxf(newFrame.origin.y, 0.0);
        newFrame.origin.y = fminf(newFrame.origin.y, gesture.view.superview.bounds.size.height - newFrame.size.height);
        // animate how ever you want ... I generally just do animateWithDuration
        [UIView animateWithDuration:0.5 animations:^{
            gesture.view.frame = newFrame;
        }];
    }
}
Or, if you want a gesture recognizer that just prevents the dragging of the image off the screen in the first place, it would be:
- (void)handlePan:(UIPanGestureRecognizer *)gesture
{
    static CGRect originalFrame;
    if (gesture.state == UIGestureRecognizerStateBegan)
    {
        originalFrame = self.imageView.frame;
    }
    else if (gesture.state == UIGestureRecognizerStateChanged)
    {
        CGPoint translate = [gesture translationInView:gesture.view];
        CGRect newFrame = originalFrame;
        newFrame.origin.x += translate.x;
        newFrame.origin.x = fmaxf(newFrame.origin.x, 0.0);
        newFrame.origin.x = fminf(newFrame.origin.x, gesture.view.superview.bounds.size.width - newFrame.size.width);
        newFrame.origin.y += translate.y;
        newFrame.origin.y = fmaxf(newFrame.origin.y, 0.0);
        newFrame.origin.y = fminf(newFrame.origin.y, gesture.view.superview.bounds.size.height - newFrame.size.height);
        gesture.view.frame = newFrame;
    }
}
By the way, in iOS 7 you can give the animation of the image view back to its original location a little bounciness by using the new animateWithDuration variant with the usingSpringWithDamping and initialSpringVelocity parameters:
[UIView animateWithDuration:1.0
                      delay:0.0
     usingSpringWithDamping:0.3
      initialSpringVelocity:0.1
                    options:0
                 animations:^{
                     // set the new `frame` (or update the constraint constant values that
                     // will dictate the `frame` and call `layoutIfNeeded`)
                 }
                 completion:nil];
Alternatively, in iOS7, you can also use UIKit Dynamics to add a UISnapBehavior:
self.animator = [[UIDynamicAnimator alloc] initWithReferenceView:self.view];
self.animator.delegate = self;
UISnapBehavior *snap = [[UISnapBehavior alloc] initWithItem:self.viewToAnimate snapToPoint:CGPointMake(self.viewToAnimate.center.x, self.view.frame.size.height - 50)];
// optionally, you can control how much bouncing happens when it finishes, e.g., for a lot of bouncing:
//
//     snap.damping = 0.2;
//
// you can also slow down the snap by adding some resistance
//
//     UIDynamicItemBehavior *resistance = [[UIDynamicItemBehavior alloc] initWithItems:@[self.viewToAnimate]];
//     resistance.resistance = 20.0;
//     resistance.angularResistance = 200.0;
//     [self.animator addBehavior:resistance];
[self.animator addBehavior:snap];
I think the easiest way is to check whether your imageView has gone out of your self.view.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!CGRectContainsRect(self.view.frame, imageView.frame)){
        // Your animation to bounce.
    }
}

How to move UITableView vertically

I have created a small iPhone application. I put a UITableView in it and placed some UIViews inside the cells. I added functionality to move the view in a cell horizontally. Now I want to add logic to move the whole UITableView vertically when I move my finger vertically in my UIView. This is my code:
-(void)moveView:(UIPanGestureRecognizer *)recognizer
{
    if ([recognizer state] == UIGestureRecognizerStateBegan)
    {
        CGPoint location = [recognizer locationInView:[self superview]];
        prevPos = location;
        startPosition = self.frame.origin.x;
    }
    else if ([recognizer state] == UIGestureRecognizerStateChanged)
    {
        CGPoint location = [recognizer locationInView:[self superview]];
        float delX = location.x - prevPos.x;
        float delY = location.y - prevPos.y;
        if (delX > 0 && delX * delX > delY * delY)
        {
            CGPoint np = CGPointMake(self.frame.origin.x + delX, self.frame.origin.y);
            CGRect fr = self.frame;
            fr.origin = np;
            self.frame = fr;
            prevPos = location;
        }
        else if (delX * delX < delY * delY)
        {
        }
    }
    else if ([recognizer state] == UIGestureRecognizerStateEnded ||
             [recognizer state] == UIGestureRecognizerStateFailed ||
             [recognizer state] == UIGestureRecognizerStateCancelled)
    {
        [UIView beginAnimations:nil context:NULL];
        [UIView setAnimationDuration:0.5];
        CGRect rect = self.frame;
        rect.origin.x = startPosition;
        self.frame = rect;
        startPosition = 0;
        [UIView commitAnimations];
    }
}
What code should I put in this condition: else if(delX * delX < delY * delY)?
Thanks!
If you want to scroll a TableView, you need to alter the TableView's contentOffset property.
Look at the setContentOffset:animated: documentation
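A minimal sketch of what that empty branch might look like, assuming a hypothetical tableView reference from this view back to the enclosing UITableView (delY, prevPos and location come from the question's handler):
// Sketch: scroll the enclosing table view by the vertical finger movement.
else if (delX * delX < delY * delY)
{
    CGPoint offset = tableView.contentOffset;
    CGFloat maxOffsetY = MAX(0, tableView.contentSize.height - tableView.bounds.size.height);
    offset.y = MIN(MAX(offset.y - delY, 0), maxOffsetY); // dragging down lowers the offset, like normal scrolling
    [tableView setContentOffset:offset animated:NO];
    prevPos = location;
}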

Paint app for ipad [opengl-es] line strokes not proper

I have a strange problem with OpenGL. I am working on a paint app for both iPhone and iPad, using OpenGL ES. In my app I fill colors into outline images, drawing a line on screen based on where the user touches. I just use the touchesMoved function to draw a line between two points, and I would like the lines to stay on screen. But in my renderLineFromPoint function, for some reason, some of the points of the line just drop out, and it appears completely random. However, the iPad simulator and the iPhone device/simulator give the desired output. (The original question included a figure of the broken line strokes.)
I create the framebuffer using the following code:
- (BOOL)createFramebuffer{
    // Generate IDs for a framebuffer object and a color renderbuffer
    glGenFramebuffersOES(1, &viewFramebuffer);
    glGenRenderbuffersOES(1, &viewRenderbuffer);
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
    // This call associates the storage for the current render buffer with the EAGLDrawable (our CAEAGLLayer)
    // allowing us to draw into a buffer that will later be rendered to screen wherever the layer is (which corresponds with our view).
    [context renderbufferStorage:GL_RENDERBUFFER_OES fromDrawable:(id<EAGLDrawable>)self.layer];
    glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, viewRenderbuffer);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);
    NSLog(@"Backing Width:%i and Height: %i", backingWidth, backingHeight);
    // For this sample, we also need a depth buffer, so we'll create and attach one via another renderbuffer.
    glGenRenderbuffersOES(1, &depthRenderbuffer);
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, depthRenderbuffer);
    glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_DEPTH_COMPONENT16_OES, backingWidth, backingHeight);
    glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES, GL_RENDERBUFFER_OES, depthRenderbuffer);
    if (glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES) != GL_FRAMEBUFFER_COMPLETE_OES)
    {
        NSLog(@"failed to make complete framebuffer object %x", glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES));
        return NO;
    }
    return YES;
}
I am using the following code snippet for renderLineFromPoint:
- (void) renderLineFromPoint:(CGPoint)start toPoint:(CGPoint)end{
    static GLfloat* vertexBuffer = NULL;
    static NSUInteger vertexMax = 64;
    NSUInteger vertexCount = 0,
               count,
               i;
    [EAGLContext setCurrentContext:context];
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
    // Convert locations from Points to Pixels
    //CGFloat scale = self.contentScaleFactor;
    CGFloat scale;
    if ([self respondsToSelector:@selector(contentScaleFactor)])
    {
        scale = self.contentScaleFactor;
    }
    else {
        //scale = 1.000000;
        if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)] == YES && [[UIScreen mainScreen] scale] == 2.00) {
            // RETINA DISPLAY
            scale = 2.000000;
        }
        else {
            scale = 1.000000;
        }
    }
    NSLog(@"start point %@", NSStringFromCGPoint(start));
    NSLog(@"End Point %@", NSStringFromCGPoint(end));
    start.x *= scale;
    start.y *= scale;
    end.x *= scale;
    end.y *= scale;
    // Allocate vertex array buffer
    if (vertexBuffer == NULL)
        // vertexBuffer = malloc(vertexMax * 2 * sizeof(GLfloat));
        vertexBuffer = malloc(vertexMax * 2 * sizeof(GLfloat));
    // Add points to the buffer so there are drawing points every X pixels
    count = MAX(ceilf(sqrtf((end.x - start.x) * (end.x - start.x) + (end.y - start.y) * (end.y - start.y)) / kBrushPixelStep), 1);
    NSLog(@"count %d", count);
    for (i = 0; i < count; ++i) {
        if (vertexCount == vertexMax) {
            vertexMax = 2 * vertexMax;
            vertexBuffer = realloc(vertexBuffer, vertexMax * 2 * sizeof(GLfloat));
            NSLog(@"if loop");
        }
        vertexBuffer[2 * vertexCount + 0] = start.x + (end.x - start.x) * ((GLfloat)i / (GLfloat)count);
        vertexBuffer[2 * vertexCount + 1] = start.y + (end.y - start.y) * ((GLfloat)i / (GLfloat)count);
        vertexCount += 1;
    }
    NSLog(@"Scale vertex %f", scale);
    //NSLog(@"%i",vertexCount);
    // Render the vertex array
    glVertexPointer(2, GL_FLOAT, 0, vertexBuffer);
    glDrawArrays(GL_POINTS, 0, vertexCount);
    // Display the buffer
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
    [context presentRenderbuffer:GL_RENDERBUFFER_OES];
}
touchesBegan function code:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event{
    CGRect bounds = [self bounds];
    UITouch* touch = [[event touchesForView:self] anyObject];
    firstTouch = YES;
    // Convert touch point from UIView referential to OpenGL one (upside-down flip)
    location = [touch locationInView:self];
    location.y = bounds.size.height - location.y;
}
touchesMoved function code:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event{
    CGRect bounds = [self bounds];
    UITouch* touch = [[event touchesForView:self] anyObject];
    // Convert touch point from UIView referential to OpenGL one (upside-down flip)
    if (firstTouch) {
        firstTouch = NO;
        previousLocation = [touch previousLocationInView:self];
        previousLocation.y = bounds.size.height - previousLocation.y;
    } else {
        location = [touch locationInView:self];
        location.y = bounds.size.height - location.y;
        previousLocation = [touch previousLocationInView:self];
        previousLocation.y = bounds.size.height - previousLocation.y;
    }
    // Render the stroke
    [self renderLineFromPoint:previousLocation toPoint:location];
}
I had a similar issue with missing points, and it was related to the drawableProperties of my CAEAGLLayer.
Within the CAEAGLLayer's drawableProperties, do you have kEAGLDrawablePropertyRetainedBacking set to YES? If not, the backing of your drawing is not being retained from frame to frame, which can cause missing points and flickering.
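For reference, a sketch of how that property is typically set on the layer (for example in the view's initializer), following the usual GLPaint-style setup:
// Sketch: retain the backing so previously drawn points survive presentRenderbuffer: calls.
CAEAGLLayer *eaglLayer = (CAEAGLLayer *)self.layer;
eaglLayer.opaque = YES;
eaglLayer.drawableProperties = [NSDictionary dictionaryWithObjectsAndKeys:
                                [NSNumber numberWithBool:YES], kEAGLDrawablePropertyRetainedBacking,
                                kEAGLColorFormatRGBA8, kEAGLDrawablePropertyColorFormat,
                                nil];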