GLKBaseEffect prepareToDraw Performance - opengl-es-2.0

I want to render many objects with the same effect, so I change the transform property and call prepareToDraw for each one, e.g.:
GLKMatrix4 baseModelViewMatrix = GLKMatrix4MakeTranslation(0.0f, position, -1.0f);
self.effect.transform.modelviewMatrix = baseModelViewMatrix;
[self.effect prepareToDraw];
glDrawElements(GL_TRIANGLES, sizeof(SquareIndices)/sizeof(SquareIndices[0]), GL_UNSIGNED_BYTE, 0);
baseModelViewMatrix = GLKMatrix4MakeTranslation(0.0f, position+2.0f, -1.0f);
self.effect.transform.modelviewMatrix = baseModelViewMatrix;
[self.effect prepareToDraw];
glDrawElements(GL_TRIANGLES, sizeof(SquareIndices)/sizeof(SquareIndices[0]), GL_UNSIGNED_BYTE, 0);
Is this efficient, or is there a better approach? Will it result in extra glUseProgram calls and the like?
I have an older shader manager class that I built, but I was hoping to use GLKit instead.
Thanks in advance for any hints...
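For reference, the pattern I'd otherwise fall back to with my own shader class binds the program once and only updates a matrix uniform per object. A minimal sketch, assuming a custom program whose vertex shader declares a hypothetical uniform u_mvpMatrix (program, aspect, position, and objectCount stand in for your own state):

glUseProgram(program);  // bind the program once for the whole batch
GLint mvpLoc = glGetUniformLocation(program, "u_mvpMatrix");
GLKMatrix4 projection = GLKMatrix4MakePerspective(GLKMathDegreesToRadians(65.0f), aspect, 0.1f, 100.0f);
for (int i = 0; i < objectCount; i++) {
    GLKMatrix4 modelView = GLKMatrix4MakeTranslation(0.0f, position + 2.0f * i, -1.0f);
    GLKMatrix4 mvp = GLKMatrix4Multiply(projection, modelView);
    // Only the uniform changes per object; the program stays bound.
    glUniformMatrix4fv(mvpLoc, 1, GL_FALSE, mvp.m);
    glDrawElements(GL_TRIANGLES, sizeof(SquareIndices)/sizeof(SquareIndices[0]), GL_UNSIGNED_BYTE, 0);
}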

Related

OpenGL texture not showing up (with cocoa)

It's a very strange problem. I create a texture using a method copied from Apple's sample. It works fine in Apple's sample, but not in my project: the texture does not show up, only the color defined by glColor4f. I used glIsTexture and glGetError to check, and they report that nothing is wrong. This only happens the first time I load the texture; if I release the texture and reload it, it works, with the same code. Are there any other ways to check for OpenGL errors?
Here's the code I'm using to load the texture:
- (FETexture *)loadTextureWithPath:(NSString *)name
{
    NSURL *url = nil;
    CGImageSourceRef src;
    CGImageRef image;
    CGContextRef context = nil;
    CGColorSpaceRef colorSpace;
    GLuint textureId;
    GLuint pboId;
    GLubyte *data;

    url = [NSURL fileURLWithPath:name];
    src = CGImageSourceCreateWithURL((__bridge CFURLRef)url, NULL);
    if (!src) {
        NSLog(@"No image");
        return nil;
    }

    image = CGImageSourceCreateImageAtIndex(src, 0, NULL);
    CFRelease(src);

    GLuint width = (GLuint)CGImageGetWidth(image);
    GLuint height = (GLuint)CGImageGetHeight(image);

    data = (GLubyte *)calloc(width * height * 4, sizeof(GLubyte));
    colorSpace = CGColorSpaceCreateDeviceRGB();
    context = CGBitmapContextCreate(data, width, height, 8, 4 * width, colorSpace,
                                    kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Host);
    CGColorSpaceRelease(colorSpace);

    // Core Graphics' coordinate system is upside-down compared to OpenGL's,
    // so flip the Core Graphics context here. An alternative is to use flipped
    // OpenGL texture coordinates when drawing textures.
    CGContextTranslateCTM(context, 0.0, height);
    CGContextScaleCTM(context, 1.0, -1.0);

    // Set the blend mode to copy before drawing, since the previous contents
    // of the buffer aren't used. This avoids unnecessary blending.
    CGContextSetBlendMode(context, kCGBlendModeCopy);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), image);
    CGContextRelease(context);
    CGImageRelease(image);

    glGenTextures(1, &textureId);
    glGenBuffers(1, &pboId);

    // Bind the texture
    glBindTexture(GL_TEXTURE_2D, textureId);
    // Bind the PBO
    glBindBuffer(GL_PIXEL_UNPACK_BUFFER, pboId);
    // Upload the texture data to the PBO
    glBufferData(GL_PIXEL_UNPACK_BUFFER, width * height * 4 * sizeof(GLubyte), data, GL_STATIC_DRAW);

    // Set up texture parameters
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glPixelStorei(GL_UNPACK_ROW_LENGTH, 0);

    // OpenGL likes the GL_BGRA + GL_UNSIGNED_INT_8_8_8_8_REV combination.
    // Pass an offset instead of a pointer to indicate that the data should be
    // sourced from the bound PBO.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, 0);

    // The application copy of the texture data can be freed now
    free(data);

    glBindTexture(GL_TEXTURE_2D, 0);
    glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);

    FETexture *texture = [FETexture new];
    texture.textureID = textureId;
    texture.bufferID = pboId;
    texture.width = width;
    texture.height = height;
    return texture;
}
You've not shown any of the relevant code for the draw site. Perhaps you're not binding this texture at draw time.
You can use the OpenGL Profiler in "Graphics tools for Xcode" at https://developer.apple.com/downloads/index.action to debug GL errors and state.
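If it helps while debugging, you can also drain the GL error queue near the draw call and make sure the texture is bound there; a minimal sketch (CheckGLErrors is a hypothetical helper, not a framework call):

// Hypothetical helper: glGetError returns one flag per call, so loop until clean.
static void CheckGLErrors(const char *label)
{
    GLenum err;
    while ((err = glGetError()) != GL_NO_ERROR) {
        NSLog(@"GL error 0x%04X at %s", err, label);
    }
}

// At the draw site, make sure this texture is actually bound before drawing:
glEnable(GL_TEXTURE_2D);  // fixed-function desktop GL, as in this Cocoa project
glBindTexture(GL_TEXTURE_2D, texture.textureID);
CheckGLErrors("after bind");
// ... submit geometry with texture coordinates here ...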

glColor4f Stops Textures From Loading Using GLKTextureLoader

I'm currently using OpenGL ES 2.0 in a GLKView on the new iPad. It seems that whenever I call glColor4f, nothing happens (i.e. it doesn't color the polygons I want it to) other than causing a GL_INVALID_OPERATION error. Then, as soon as I try to load a texture, I get this error message:
Error loading file: The operation couldn't be completed. (GLKTextureLoaderErrorDomain error 8.)
Before glColor4f is called, everything works fine. Here's my code for drawInRect:
- (void)glkView:(GLKView *)view drawInRect:(CGRect)rect
{
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);

    // Render the object with GLKit
    self.effect.texture2d0.enabled = NO;
    self.effect.transform.modelviewMatrix = _defaultModelMatrix;
    [self.effect prepareToDraw];

    // Render the grid
    glBindBuffer(GL_ARRAY_BUFFER, _gridVertexBuffer);
    glEnableVertexAttribArray(GLKVertexAttribPosition);
    glVertexAttribPointer(GLKVertexAttribPosition, 2, GL_FLOAT, GL_FALSE, 0, 0);
    glColor4f(1.0f, 0.0f, 0.0f, 1.0f);
    glDrawArrays(GL_TRIANGLES, 0, TRIANGLES * 6);
    glDisableVertexAttribArray(GLKVertexAttribPosition);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
}
If anyone can help, I'd really appreciate it. One thing to point out: the object I'm trying to color is stored in a VBO and renders fine, just in white (even though I'm trying to color it red).
Thanks
glColor4f is not an OpenGL ES 2.0 command; it comes from the OpenGL ES 1.1 fixed-function pipeline, which is why it raises GL_INVALID_OPERATION here.
What you want to do is declare a uniform in your shader called 'color' (or something similar), set that uniform as you typically would with any shader uniform, and multiply your fragment color by that uniform before writing the pixel.
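A rough sketch of that approach (the uniform name u_color and this trivial fragment shader are illustrative, not from the question's project):

// Fragment shader source (ES 2.0): output a uniform color.
// When texturing, you would instead multiply the sampled texel by u_color.
static const char *kFragmentShader =
    "precision mediump float;              \n"
    "uniform vec4 u_color;                 \n"
    "void main()                           \n"
    "{                                     \n"
    "    gl_FragColor = u_color;           \n"
    "}                                     \n";

// At draw time, with the program bound via glUseProgram, replace glColor4f with:
GLint colorLoc = glGetUniformLocation(program, "u_color");
glUniform4f(colorLoc, 1.0f, 0.0f, 0.0f, 1.0f);  // red, like glColor4f(1, 0, 0, 1)

With GLKBaseEffect specifically, the built-in equivalent is setting effect.useConstantColor = GL_TRUE and effect.constantColor = GLKVector4Make(1.0f, 0.0f, 0.0f, 1.0f) before calling prepareToDraw.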

Objective-C: Most efficient way to load a texture to OpenGL

Currently I load a texture on iOS using Image I/O and extract its image data with Core Graphics. Then I can send the image data to OpenGL like this:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, texture->width, texture->height, 0, GL_RGBA, GL_UNSIGNED_BYTE, texture->imageData);
The problem is that the Core Graphics part is really slow: I need to set up and draw with Core Graphics just to extract the image data, even though I don't want to show it on screen. Is there a more efficient way to extract image data on iOS?
Here is my code:
...
myTexRef = CGImageSourceCreateWithURL((__bridge CFURLRef)url, myOptions);
...
MyTexture2D *texture;
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
void *imageData = malloc(tileSize.width * tileSize.height * 4);
CGContextRef imgContext = CGBitmapContextCreate(imageData, tileSize.width, tileSize.height, 8,
                                                4 * tileSize.width, colorSpace,
                                                kCGImageAlphaNoneSkipLast | kCGBitmapByteOrder32Big);
CGColorSpaceRelease(colorSpace);
CGContextClearRect(imgContext, CGRectMake(0, 0, tileSize.width, tileSize.height));
CGContextTranslateCTM(imgContext, 0, 0);
...
CGImageRef tiledImage = CGImageCreateWithImageInRect(imageRef, tileArea);
CGRect drawRect = CGRectMake(0, 0, tileSize.width, tileSize.height);
// *** THIS CALL IS REALLY EXPENSIVE!
CGContextDrawImage(imgContext, drawRect, tiledImage);
CGImageRelease(tiledImage);
// MyTexture2D takes ownership of imageData and will be responsible for freeing it
texture = new MyTexture2D(tileSize.width, tileSize.height, imageData);
CGContextRelease(imgContext);
If you are developing for iOS 5 and above, GLKTextureLoader is what you're looking for.
GLKTextureLoader is orders of magnitude slower than using Core Graphics.
I logged this as a bug with Apple DTS and got it thrown back as "yes, we know, don't care, not going to fix it; you should be using Core Graphics instead if you want your textures to load fast" (almost that wording).
glTexImage2D is immensely fast if you give it the raw buffer that the CGContext* methods create for you / allow you to pass in, assuming you get the RGBA/ARGB/etc. color spaces correct, of course.
CGContextDrawImage is also super fast. My guess is that the time is going into loading your data over the web...
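For reference, the GLKTextureLoader route mentioned above is roughly one call, so it is easy to benchmark both paths in your own app (path stands in for whatever file path you already have):

NSError *error = nil;
// Flip the image so its origin matches OpenGL's bottom-left convention.
NSDictionary *options = @{ GLKTextureLoaderOriginBottomLeft : @YES };
GLKTextureInfo *info = [GLKTextureLoader textureWithContentsOfFile:path
                                                           options:options
                                                             error:&error];
if (info == nil) {
    NSLog(@"Error loading file: %@", error.localizedDescription);
} else {
    glBindTexture(info.target, info.name);  // texture is ready to use
}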

ios CCLabelTTF colored subclass with Core Text

Good day to all.
At the moment I am trying to implement a CCLabelTTF subclass with support for NSAttributedString to get a multi-colored label, and I am hampered by a lack of Core Text and Core Graphics knowledge.
After reading a few guides, I created a CCTexture2D category that creates a texture from an NSAttributedString object.
Here is my drawing code:
data = calloc(POTHigh, POTWide * 2);
colorSpace = CGColorSpaceCreateDeviceGray();
context = CGBitmapContextCreate(data, POTWide, POTHigh, 8, POTWide, colorSpace, kCGImageAlphaNone);
CGColorSpaceRelease(colorSpace);

if (!context) {
    free(data);
    [self release];
    return nil;
}

UIGraphicsPushContext(context);
CGContextTranslateCTM(context, 0.0f, POTHigh);
CGContextScaleCTM(context, 1.0f, -1.0f);

// Draw the attributed string into the context
CTFramesetterRef frameSetter = CTFramesetterCreateWithAttributedString((CFAttributedStringRef)string);
CGMutablePathRef path = CGPathCreateMutable();
CGPathAddRect(path, NULL, CGRectMake(0.f, 0.f, dimensions.width, dimensions.height));
CTFrameRef frame = CTFramesetterCreateFrame(frameSetter, CFRangeMake(0, 0), path, NULL);
CTFrameDraw(frame, context);
UIGraphicsPopContext();

CFRelease(frame);
CGPathRelease(path);
CFRelease(frameSetter);
Now I have a few problems.
The first: my texture is drawn flipped vertically. I thought that these lines
CGContextTranslateCTM(context, 0.0f, POTHigh);
CGContextScaleCTM(context, 1.0f, -1.0f);
would prevent this.
The second: if I create an RGB context, I cannot see anything on screen. I tried to create the RGB context with these lines:
colorSpace = CGColorSpaceCreateDeviceRGB();
context = CGBitmapContextCreate(data, POTWide, POTHigh, 8, POTWide * 4, colorSpace, kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Big);
I tried to Google this but couldn't find anything related to my issues. Any help (links or suggestions) is appreciated.
A couple of things to try:
Your data allocation isn't big enough for RGB. Try data = calloc(POTHigh, POTWide * 4); for an RGB color space.
CTFrameDraw draws relative to GL coordinates, so you don't need CGContextScaleCTM(context, 1.0f, -1.0f); that line was in the original CCTexture2D creation for CCLabelTTF because it used NSString's drawInRect:, which draws relative to UIKit coordinates.
Maybe try other alpha mask flags? Check out Apple's documentation on supported pixel formats for iOS to see what your options are.
You may want to take a look at ActiveTextView-iOS (https://github.com/storify/ActiveTextView-iOS). It may be of use.
Use this to get a color texture (note that bytesPerRow must be POTWide * 4 for a 32-bit RGBA context):
context = CGBitmapContextCreate(data, POTWide, POTHigh, 8, POTWide * 4, colorSpace, kCGImageAlphaPremultipliedLast);
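Putting those suggestions together, the RGBA setup might look like this (a sketch combining the fixes above, untested against the original project):

// 4 bytes per pixel for RGBA, as noted above.
data = calloc(POTHigh, POTWide * 4);
colorSpace = CGColorSpaceCreateDeviceRGB();
// bytesPerRow must be POTWide * 4 for a 32-bit RGBA context.
context = CGBitmapContextCreate(data, POTWide, POTHigh, 8, POTWide * 4,
                                colorSpace, kCGImageAlphaPremultipliedLast);
CGColorSpaceRelease(colorSpace);
// No flip before CTFrameDraw: Core Text already draws with a bottom-left origin.
CTFrameDraw(frame, context);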

Ribbon Graphs (charts) with Core Graphics

I am generating a classic line graph using Core Graphics, which renders and works very well.
There are several lines stacked one behind another using layer.zPosition.
- (void)drawRect:(CGRect)rect
{
    float colorChange = (0.1 * [self tag]);
    theFillColor = [UIColor colorWithRed:(colorChange) green:(colorChange * 0.50) blue:colorChange alpha:0.75f].CGColor;

    CGContextRef c = UIGraphicsGetCurrentContext();
    CGFloat white[4] = {1.0f, 1.0f, 1.0f, 1.0f};
    CGContextSetFillColorWithColor(c, theFillColor);
    CGContextSetStrokeColor(c, white);
    CGContextSetLineWidth(c, 2.0f);
    CGContextBeginPath(c);

    CGContextMoveToPoint(c, 0.0f, 200 - [[array objectAtIndex:0] floatValue]);
    CGContextAddLineToPoint(c, 0.0f, 200 - [[array objectAtIndex:0] floatValue]);

    distancePerPoint = (rect.size.width / [array count]);
    float lastPointX = 750.0;
    for (int i = 0; i < [array count]; i++) {
        CGContextAddLineToPoint(c, (i * distancePerPoint), 200 - [[array objectAtIndex:i] floatValue]);
        lastPointX = (i * distancePerPoint);
    }

    // Close the shape along the bottom edge
    CGContextAddLineToPoint(c, lastPointX, 200.0);
    CGContextAddLineToPoint(c, 0, 200);
    CGContextClosePath(c);

    //CGContextFillPath(c);
    CGContextDrawPath(c, kCGPathFillStroke);
    //CGContextDrawPath(c, kCGPathStroke);
}
(The above code generates the stacked line graph shown in the original screenshot.)
(I can post the code I am using for the 3D effect if needed; generically, it starts from CATransform3D rotationAndPerspectiveTransform = CATransform3DIdentity;)
Question:
How can I transform my line graph to have depth?
I would like the lines to have "depth" (making each one a ribbon), since later I would like to present them using the rotation-and-perspective transform mentioned above.
You can't easily do this with Core Graphics or Core Animation, because CALayers are "flat": they work like origami, where you can make 3D structures by connecting rectangles in 3D space, but you can't have arbitrary polygonal 3D shapes.
Actually, that's not strictly true; you could look at using CAShapeLayers to do your drawing and then manipulate them in 3D, but I think it would be very hard to calculate where to position each shape and to get the edges to line up correctly.
Really the way to make this kind of 3D structure is to use OpenGL directly.
If you're not too familiar with low-level OpenGL programming, you might want to check out Galaxy Engine or Cocos3D.
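If you do go the OpenGL route, the ribbon geometry itself is straightforward: extrude each data point along Z into two vertices and draw the result as a triangle strip. A sketch reusing the question's array and distancePerPoint (ribbonDepth is an illustrative constant):

// Each data point becomes two vertices: one on the front edge of the ribbon
// and one pushed back along -Z. Drawn as GL_TRIANGLE_STRIP, this forms the surface.
const GLfloat ribbonDepth = 20.0f;
NSUInteger count = [array count];
GLfloat *vertices = malloc(count * 2 * 3 * sizeof(GLfloat));
for (NSUInteger i = 0; i < count; i++) {
    GLfloat x = i * distancePerPoint;
    GLfloat y = 200.0f - [[array objectAtIndex:i] floatValue];
    GLfloat *v = vertices + i * 6;
    v[0] = x; v[1] = y; v[2] = 0.0f;          // front edge
    v[3] = x; v[4] = y; v[5] = -ribbonDepth;  // back edge
}
// ... upload to a VBO and draw count * 2 vertices as GL_TRIANGLE_STRIP ...
free(vertices);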