OpenGL-ES 2.0: blend 2 textures and save the result into the original texture - opengl-es-2.0

I am a newbie to OpenGL ES 2.0. I have read this thread about how to blend 2 textures into a final framebuffer result: How do I blend two textures with different co-ordinates in OpenGL ES 2.0 on iPhone?
My current requirement is a little different.
I want to blend a texture called inputTextureTop (varying) with a texture called inputTextureBot (constant) and save the result back into inputTextureTop.
It should be simple enough. How do I modify the sample code from that thread to do this?
Objective-C code:
- (void) display {
    [EAGLContext setCurrentContext:context];
    glBindFramebuffer(GL_FRAMEBUFFER, targetFBO);
    glUseProgram(program);
    glActiveTexture(GL_TEXTURE2);
    glBindTexture(GL_TEXTURE_2D, textureTop);
    glActiveTexture(GL_TEXTURE3);
    glBindTexture(GL_TEXTURE_2D, textureBot);
    glUniform1i(inputTextureTop, 2);
    glUniform1i(inputTextureBot, 3);
    glUniform1f(alphaTop, alpha);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glVertexAttribPointer(position, 2, GL_FLOAT, 0, 0, imageVertices);
    glVertexAttribPointer(inputTextureCoordinate, 2, GL_FLOAT, 0, 0, textureCoordinates);
    glClearColor(0.0, 0.0, 0.0, 0.0);
    glClear(GL_COLOR_BUFFER_BIT);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    glBindRenderbuffer(GL_RENDERBUFFER, viewRenderbuffer);
    [context presentRenderbuffer:GL_RENDERBUFFER];
}
Vertex Shader code:
attribute vec4 position;
attribute vec4 inputTextureCoordinate;
varying vec2 textureCoordinate;
void main()
{
    gl_Position = position;
    textureCoordinate = inputTextureCoordinate.xy;
}
Fragment shader:
varying highp vec2 textureCoordinate;
uniform sampler2D inputTextureTop;
uniform sampler2D inputTextureBot;
uniform highp float alphaTop;
void main()
{
    lowp vec4 pixelTop = texture2D(inputTextureTop, textureCoordinate);
    lowp vec4 pixelBot = texture2D(inputTextureBot, textureCoordinate);
    gl_FragColor = someBlendOperation(pixelTop, pixelBot);
}
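For example, someBlendOperation could simply be a plain alpha blend driven by the alphaTop uniform, using the GLSL mix() builtin (just a sketch, not tied to any particular blend mode):
gl_FragColor = mix(pixelBot, pixelTop, pixelTop.a * alphaTop); // top over bottom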
Thanks so much!
Regards,
Howard
========================
Updated on August 22nd.
After some investigation, I found it's a dead end. There is no way to composite texture (top) + texture (bottom) and write the result back into texture (top).
However, there is a way to composite texture (top) + texture (bottom) and save the result into a texture (target).
The way to do it is to bind a target texture to the same framebuffer that texture top and texture bottom use.
Adding the code below will do the job:
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, textureTarget);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, textureTarget, 0);
The result will be stored in textureTarget.
Bingo.
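For completeness, here is a minimal sketch of the whole target-texture setup (texWidth and texHeight are placeholder dimensions; error handling elided):
GLuint textureTarget;
glGenTextures(1, &textureTarget);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, textureTarget);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
// Allocate storage only; no pixel data is uploaded
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, texWidth, texHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
// Redirect the framebuffer's colour attachment to the target texture
glBindFramebuffer(GL_FRAMEBUFFER, targetFBO);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, textureTarget, 0);
if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
    // handle incomplete framebuffer
}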

Rendering into a texture is more complicated because you must use an FBO (framebuffer object). This can be done with either OpenGL ES 2.0 or 1.1 (with extensions). This article explains it in detail:
http://processors.wiki.ti.com/index.php/Render_to_Texture_with_OpenGL_ES
Also, see my answer here for an example of combining 2 textures in shader code:
OpenGL ES 2 : use an additional texture unit to blend this image with the current one

Related

OpenGL ES 2.0, quad does not change colour when background is changed

I have drawn a circle in a quad in OpenGL ES 2.0. The code in the fragment shader takes the centre and radius that have been set and creates a circle within the quad. This worked fine until I tried to change the colour of the background: the quad still shows up blank and does not get filled with the background colour/texture. Is there an easy way to make the quad fill with the same colour/texture as the background whilst also keeping the circle on show?
The code in the fragment shader is as follows:
"varying highp vec2 textureCoordinate;\n"
"const highp vec2 center = vec2(0.5, 0.5);\n"
"const highp float radius = 0.5;\n"
"void main()\n"
"{\n"
"highp float distanceFromCenter = distance(center, textureCoordinate);\n"
"lowp float checkForPresenceWithinCircle = step(distanceFromCenter, radius);\n"
"gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0) * checkForPresenceWithinCircle;\n"
"}\n"
The trick is not to fill the quad with the background, but to avoid replacing it outside the area covered by the circle.
Your fragment shader will always output a value - even if it's outside the circle. That is, if checkForPresenceWithinCircle is 0.0, gl_FragColor gets assigned vec4(0.0, 0.0, 0.0, 0.0) - transparent black.
I think what you are looking for is the discard keyword, which prevents the shader from outputting anything for that fragment. Something like:
if ( checkForPresenceWithinCircle > 0.0 )
gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
else
discard;
Since you know that the alpha will be 0.0 outside the circle and 1.0 within, you could also achieve the same effect using alpha blending from the API-side:
draw_background();
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
draw_circle();
glDisable(GL_BLEND);
The alpha part of gl_FragColor is normally read from the texture, like this:
vec4 vTexture = texture2D(gsuTexture0, gsvTexCoord);
gl_FragColor = vTexture;
Also, be sure your color buffer clear has the alpha set to 0.0, like this:
glClearColor(fRed, fGreen, fBlue, 0.0f);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
And there is also the problem that you can't use alpha textures with the Android Bitmap class as discussed here:
http://software.intel.com/en-us/articles/porting-opengl-games-to-android-on-intel-atom-processors-part-1/

OpenGL ES 2.0: Rendering a texture to screen without a ModelView or Projection matrix

I'm trying to render a texture to the bottom left corner of the screen, so something like:
glViewport(0, 0, width/4, height/4);
I'm having trouble performing that rendering. I draw a quad and do some basic stuff. I've tried all variants of rendering to a quad as discussed in these links and others on SO:
How to draw a texture as a 2D background in OpenGL ES 2.0?
OpenGL ES 2.0 Rendering with a Texture
My vertex and fragment shaders are as follows:
const char * VERTEX_SHADER =
" \n\
attribute vec4 a_position; \n\
attribute vec4 a_texCoord; \n\
\n\
\n\
#ifdef GL_ES \n\
varying mediump vec2 v_texCoord; \n\
#else \n\
varying vec2 v_texCoord; \n\
#endif \n\
\n\
void main() \n\
{ \n\
gl_Position = a_position; \n\
v_texCoord = a_texCoord.xy; \n\
} \n\
";
const char * FRAGMENT_SHADER =
" \n\
#ifdef GL_ES \n\
precision lowp float; \n\
#endif \n\
\n\
varying vec2 v_texCoord; \n\
uniform sampler2D u_texture; \n\
\n\
void main() \n\
{ \n\
gl_FragColor = texture2D(u_texture, v_texCoord); \n\
} \n\
";
And my rendering code is this:
static const GLfloat squareVertices[] = {
-1.0f, -1.0f,
1.0f, -1.0f,
-1.0f, 1.0f,
1.0f, 1.0f,
};
static const GLfloat textureVertices[] = {
0.0f, 0.0f,
1.0f, 0.0f,
0.0f, 1.0f,
1.0f, 1.0f,
};
// ...
glUseProgram(myProgram_);
glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
glViewport(0, 0, width/4, height/4); // width and height are defined elsewhere
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, myTexture_); // myTexture_ has been successfully loaded and checked
// Update uniform values
glUniform1i(textureUniformLocation_, 0);
// Update attribute values.
glEnableVertexAttribArray(a_position_location_);
glEnableVertexAttribArray(a_texcoord_location_);
glVertexAttribPointer(a_position_location_, 2, GL_FLOAT, 0, 0, squareVertices);
glVertexAttribPointer(a_texcoord_location_, 2, GL_FLOAT, 0, 0, textureVertices);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
The vertex and fragment shaders compile and the glProgram links successfully. All I'm getting is a white rectangle in the bottom left of my screen (on top of the already rendered scene, of course). Am I missing something here?
Maybe you forgot to set your minification filter? Saving the texture as an image would probably not alert you to sampling errors like this.
http://www.opengl.org/wiki/Common_Mistakes#Creating_a_Texture
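If that is the cause, a sketch of the fix (run once after creating myTexture_); without mipmaps, the default GL_NEAREST_MIPMAP_LINEAR minification filter leaves the texture incomplete, so it will not sample correctly:
glBindTexture(GL_TEXTURE_2D, myTexture_);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);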
You should change
attribute vec4 a_position;
attribute vec4 a_texCoord;
to
attribute vec2 a_position;
attribute vec2 a_texCoord;
and
gl_Position = a_position;
to
gl_Position = vec4(a_position,0.0,1.0);
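Putting both changes together (precision qualifiers omitted for brevity), the vertex shader would read:
attribute vec2 a_position;
attribute vec2 a_texCoord;
varying vec2 v_texCoord;
void main()
{
    gl_Position = vec4(a_position, 0.0, 1.0);
    v_texCoord = a_texCoord;
}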

filled antialiased poly cocos2d

How can I draw a filled poly in the Cocos2D framework?
The code below draws the poly, but without antialiasing. What should I change?
void ccFillPoly( CGPoint *poli, int points, BOOL closePolygon )
{
    // Default GL states: GL_TEXTURE_2D, GL_VERTEX_ARRAY, GL_COLOR_ARRAY, GL_TEXTURE_COORD_ARRAY
    // Needed states: GL_VERTEX_ARRAY
    // Unneeded states: GL_TEXTURE_2D, GL_TEXTURE_COORD_ARRAY, GL_COLOR_ARRAY
    glDisable(GL_TEXTURE_2D);
    glDisableClientState(GL_TEXTURE_COORD_ARRAY);
    glDisableClientState(GL_COLOR_ARRAY);
    glVertexPointer(2, GL_FLOAT, 0, poli);
    if( closePolygon )
        // glDrawArrays(GL_LINE_LOOP, 0, points);
        glDrawArrays(GL_TRIANGLE_FAN, 0, points);
    else
        glDrawArrays(GL_LINE_STRIP, 0, points);
    // restore default state
    glEnableClientState(GL_COLOR_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glEnable(GL_TEXTURE_2D);
}
One good way to emulate antialiasing is to add transparent vertices around your polygon.
This method is fast and fine-looking, but is a little hard to implement.
Here is a solution for antialiased lines.
If you don't worry about performance, you may render the polygon multiple times with some transparency and a 1-pixel offset. This would work for non-textured polygons.
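A rough sketch of that multi-pass idea, in the same ES 1.1 style as the code above (the offsets, pass count and alpha are arbitrary, and r/g/b stand for your fill colour):
static const GLfloat offsets[4][2] = {
    { 0.5f, 0.0f }, { -0.5f, 0.0f }, { 0.0f, 0.5f }, { 0.0f, -0.5f }
};
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
for (int i = 0; i < 4; i++) {
    glPushMatrix();
    // nudge the poly by a fraction of a pixel each pass
    glTranslatef(offsets[i][0], offsets[i][1], 0.0f);
    glColor4f(r, g, b, 0.25f);
    ccFillPoly(poli, points, YES);
    glPopMatrix();
}
glDisable(GL_BLEND);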

Unusual Lighting Effects - Random Polygons Coloured

I am working on creating an object loader for use with iOS. I have managed to load the vertices, normals and face data from an OBJ file, and then place this data into arrays for reconstructing the object. But I have come across an issue with the lighting; at the bottom is a video from the simulation of my program - this is with the lighting in the following position:
CGFloat position[] = { 0.0f, -1.0f, 0.0f, 0.0f };
glLightfv(GL_LIGHT0, GL_POSITION, position);
This is specified both in the render method (each frame) and in the setup view method, which is called once at setup.
Various other lighting details are here, these are called once during setup:
glEnable(GL_LIGHTING);
glEnable(GL_LIGHT0);
CGFloat ambientLight[] = { 0.2f, 0.2f, 0.2f, 1.0f };
CGFloat diffuseLight[] = { 1.0f, 0.0f, 0.0, 1.0f };
glLightfv(GL_LIGHT0, GL_AMBIENT, ambientLight);
glLightfv(GL_LIGHT0, GL_DIFFUSE, diffuseLight);
CGFloat position[] = { 0.0f, -1.0f, 0.0f, 0.0f };
glLightfv(GL_LIGHT0, GL_POSITION, position);
glEnable(GL_COLOR_MATERIAL);
glEnable(GL_NORMALIZE);
The video of the issue can be found here:
http://youtu.be/dXm4wqzvO5c
Thanks,
Paul
[EDIT]
For further info, normals are also supplied by the following code; they are currently in a large normals array of XYZ XYZ XYZ etc...
// FACE SHADING
glColorPointer(4, GL_FLOAT, 0, colors);
glEnableClientState(GL_COLOR_ARRAY);
glNormalPointer(GL_FLOAT, 3, normals);
glEnableClientState(GL_NORMAL_ARRAY);
glDrawArrays(GL_TRIANGLES, 0, 3*numOfFaces);
glDisableClientState(GL_COLOR_ARRAY);
I now feel incredibly stupid... All part of being a student programmer I guess. I will leave an answer to this so if anyone else gets this problem they can solve it too! The mistake was simply down to a typo:
glNormalPointer(GL_FLOAT, 3, normals);
Should have read
glNormalPointer(GL_FLOAT, 0, normals);
The second argument is the STRIDE, which is only used if the array contains other values interleaved, e.g. vert coords / normals / texture coords. As mine are in separate single-purpose arrays, the stride between the values should be 0.
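For contrast, a sketch of the interleaved case, where a non-zero stride would be needed (the Vertex struct here is hypothetical):
typedef struct {
    GLfloat position[3];   // X Y Z
    GLfloat normal[3];     // Nx Ny Nz
} Vertex;
Vertex verts[3 * numOfFaces];  // filled in by the loader
// stride = size of one whole record, so GL skips past the other fields
glVertexPointer(3, GL_FLOAT, sizeof(Vertex), verts[0].position);
glNormalPointer(GL_FLOAT, sizeof(Vertex), verts[0].normal);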

Flipping OpenGL texture

When I load textures from images normally, they are upside down because of OpenGL's coordinate system. What would be the best way to flip them?
- glScalef(1.0f, -1.0f, 1.0f);
- mapping the y coordinates of the textures in reverse
- vertically flipping the image files manually (in Photoshop)
- flipping them programmatically after loading them (I don't know how)
This is the method I'm using to load png textures, in my Utilities.m file (Objective-C):
+ (TextureImageRef)loadPngTexture:(NSString *)name {
    CFURLRef textureURL = CFBundleCopyResourceURL(
        CFBundleGetMainBundle(),
        (CFStringRef)name,
        CFSTR("png"),
        CFSTR("Textures"));
    NSAssert(textureURL, @"Texture name invalid");
    CGImageSourceRef imageSource = CGImageSourceCreateWithURL(textureURL, NULL);
    NSAssert(imageSource, @"Invalid Image Path.");
    NSAssert((CGImageSourceGetCount(imageSource) > 0), @"No Image in Image Source.");
    CFRelease(textureURL);
    CGImageRef image = CGImageSourceCreateImageAtIndex(imageSource, 0, NULL);
    NSAssert(image, @"Image not created.");
    CFRelease(imageSource);
    GLuint width = CGImageGetWidth(image);
    GLuint height = CGImageGetHeight(image);
    void *data = malloc(width * height * 4);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    NSAssert(colorSpace, @"Colorspace not created.");
    CGContextRef context = CGBitmapContextCreate(
        data,
        width,
        height,
        8,
        width * 4,
        colorSpace,
        kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Host);
    NSAssert(context, @"Context not created.");
    CGColorSpaceRelease(colorSpace);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), image);
    CGImageRelease(image);
    CGContextRelease(context);
    return TextureImageCreate(width, height, data);
}
Where TextureImage is a struct that has a height, width and void *data.
Right now I'm just playing around with OpenGL, but later I want to try making a simple 2d game. I'm using Cocoa for all the windowing and Objective-C as the language.
Also, another thing I was wondering about: If I made a simple game, with pixels mapped to units, would it be alright to set it up so that the origin is in the top-left corner (personal preference), or would I run into problems with other things (e.g. text rendering)?
Thanks.
Any of those:
- flip the texture during the texture load,
- OR flip the model texture coordinates during model load,
- OR set the texture matrix to flip y (glMatrixMode(GL_TEXTURE)) during render.
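A sketch of the texture-matrix option (fixed-function; the scale mirrors t, the translate shifts it back into [0, 1]):
glMatrixMode(GL_TEXTURE);
glLoadIdentity();
glScalef(1.0f, -1.0f, 1.0f);      // mirror the t coordinate
glTranslatef(0.0f, -1.0f, 0.0f);  // shift back into [0, 1]
glMatrixMode(GL_MODELVIEW);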
Also, another thing I was wondering about: If I made a simple game, with pixels mapped to units, would it be alright to set it up so that the origin is in the top-left corner (personal preference), or would I run into problems with other things (e.g. text rendering)?
Depends on how you are going to render text.
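If you do go with a top-left origin, one common fixed-function approach is an orthographic projection with the y axis flipped (windowWidth and windowHeight are placeholders here); anything drawn through it, text included, then uses the same pixel coordinates:
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, windowWidth, windowHeight, 0, -1, 1); // y grows downward
glMatrixMode(GL_MODELVIEW);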
Jordan Lewis pointed out that CGContextDrawImage draws an image upside down when passed a UIImage.CGImage. There I found a quick and easy solution: before calling CGContextDrawImage,
CGContextTranslateCTM(context, 0, height);
CGContextScaleCTM(context, 1.0f, -1.0f);
Does the job perfectly well.
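In context, the drawing part of loadPngTexture then becomes:
CGContextTranslateCTM(context, 0, height);
CGContextScaleCTM(context, 1.0f, -1.0f); // flip the CTM so the image lands right-side up
CGContextDrawImage(context, CGRectMake(0, 0, width, height), image);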