I want to specify both an interpolation qualifier (flat) and layout(location = 0) for an input variable in my Vulkan fragment shader, but compiling it to a SPIR-V file throws an error.
Shader:
#version 450
#extension GL_ARB_separate_shader_objects : enable
flat in vec3 fragColor;
layout(location = 0) in vec3 fragColor;
layout(location = 0) out vec4 outColor;
void main() {
outColor = vec4(fragColor, 1.0);
}
How do I keep both of the following declarations?
flat in vec3 fragColor;
layout(location = 0) in vec3 fragColor;
The flat modifier needs to be part of your already declared input layout:
layout (location = 0) flat in vec3 fragColor;
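Putting it together, a minimal corrected fragment shader for the question above (merging the two duplicate declarations into a single one) would look like this:

```glsl
#version 450
#extension GL_ARB_separate_shader_objects : enable

// One declaration carrying both the location and the interpolation qualifier.
layout(location = 0) flat in vec3 fragColor;

layout(location = 0) out vec4 outColor;

void main() {
    outColor = vec4(fragColor, 1.0);
}
```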
In Vulkan, I wrote a simple program to draw lines with a fixed color, using a simple vertex shader and fragment shader. But the colors arriving at the fragment shader differ from what is set on the vertices. I checked with RenderDoc: the colors passed into the vertex shader are correct, (1,1,1,1) for both vertices of a line, and its output is the same. But in the fragment shader, the color I get is (1,1,0,1). I don't understand why this is happening. Irrespective of what color the vertex shader emits, the input in the fragment shader is always yellow.
Vertex shader:
layout(location = 0) in vec4 position;
layout(location = 1) in vec4 color;
layout(location = 2) in vec2 texcoord;
out vec4 io_color;
out vec2 io_uv;
out vec4 io_position2;
layout(std140, binding = 0) uniform UniformBlock_uTransform
{
mat4 uTransform;
};
layout(std140, binding = 1) uniform UniformBlock_uTransform2
{
mat4 uTransform2;
};
void main ()
{
io_uv = texcoord;
io_color = vec4(1,1,1,1); //Just to debug it
gl_Position = uTransform * position;
io_position2 = uTransform2 * position;
}
//Fragment shader:
in vec4 io_color;
layout(location = 0) out vec4 result;
void main ()
{
result = io_color;
}
Try adding output and input layout qualifiers to the values you pass from one shader to the other to ensure that they actually point to the same location:
VS:
layout (location = 0) out vec4 io_color;
FS:
layout (location = 0) in vec4 io_color;
I recommend always using that syntax to connect shader outputs and inputs.
Also check that the color write mask does not have the blue channel disabled.
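To sketch what that check means: the pipeline's per-attachment colorWriteMask is a bitmask of VkColorComponentFlagBits, and if the blue bit is missing, a white (1,1,1,1) fragment lands in the framebuffer as yellow (1,1,0,1). A minimal illustration using the flag values from vulkan_core.h (the class and method names here are just for the example):

```java
public class ColorMask {
    // VkColorComponentFlagBits values, as defined in vulkan_core.h.
    public static final int R_BIT = 0x1, G_BIT = 0x2, B_BIT = 0x4, A_BIT = 0x8;

    // True if the given colorWriteMask lets blue reach the framebuffer.
    public static boolean writesBlue(int colorWriteMask) {
        return (colorWriteMask & B_BIT) != 0;
    }

    public static void main(String[] args) {
        int missingBlue = R_BIT | G_BIT | A_BIT;          // blue writes disabled
        int allChannels = R_BIT | G_BIT | B_BIT | A_BIT;  // the usual default
        System.out.println(writesBlue(missingBlue));  // false -> white renders as yellow
        System.out.println(writesBlue(allChannels));  // true
    }
}
```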
I'm implementing a custom filter by creating two strings, one for the vertex shader and one for the fragment shader.
I'm in the early stages and was just trying to compile and run a program with custom shaders, simply copying the ones used in GPUImage.
Here the shaders:
NSString *const CustomShaderV = SHADER_STRING
(
attribute vec4 position;
attribute vec4 inputTextureCoordinate;
attribute vec4 inputTextureCoordinate2;
varying vec2 textureCoordinate;
varying vec2 textureCoordinate2;
void main()
{
gl_Position = position;
textureCoordinate = inputTextureCoordinate.xy;
textureCoordinate2 = inputTextureCoordinate2.xy;
}
);
NSString *const CustomShaderF = SHADER_STRING
(
varying highp vec2 textureCoordinate;
varying highp vec2 textureCoordinate2;
uniform sampler2D inputImageTexture;
uniform sampler2D inputImageTexture2;
uniform highp float bVal;
uniform highp float hVal;
uniform highp float sVal;
void main()
{
lowp vec4 textureColor = texture2D(inputImageTexture, textureCoordinate);
lowp vec4 alphaColor = texture2D(inputImageTexture2, textureCoordinate2);
lowp vec4 outputColor;
outputColor = textureColor;
gl_FragColor = outputColor;
}
);
However, whenever I initialize a filter with these shaders, my app crashes with this log.
2013-11-24 17:58:40.243[865:60b] Shader compile log:
ERROR: 0:1: 'CustomShaderV' : syntax error syntax error
2013-11-24 17:58:40.245[865:60b] Failed to compile vertex shader
2013-11-24 17:58:40.248[865:60b] Shader compile log:
ERROR: 0:1: 'CustomShaderF' : syntax error syntax error
2013-11-24 17:58:40.249[865:60b] Failed to compile fragment shader
2013-11-24 17:58:40.250[865:60b] Program link log: ERROR: One or more attached shaders not successfully compiled
2013-11-24 17:58:40.252[865:60b] Fragment shader compile log: (null)
2013-11-24 17:58:40.253[865:60b] Vertex shader compile log: (null)
The shaders look correct to me, since they are just cut and pasted from other working shaders included in the GPUImage project.
The call inside the code to the two shaders above is the following:
customFilter = [[GPUImageFilter alloc] initWithVertexShaderFromString:@"CustomShaderV" fragmentShaderFromString:@"CustomShaderF"];
My goal is to blend two videos that's why two textures are present in the shaders.
Based on a comment by the OP, the call was:
customFilter = [[GPUImageFilter alloc] initWithVertexShaderFromString:@"CustomShaderV" fragmentShaderFromString:@"CustomShaderF"];
The error was that the variable names were quoted, so the literal strings "CustomShaderV" and "CustomShaderF" were compiled as shader source instead of the contents of those variables. The fix is to drop the quotes:
customFilter = [[GPUImageFilter alloc] initWithVertexShaderFromString:CustomShaderV fragmentShaderFromString:CustomShaderF];
I'm trying to figure out how to put different textures into different texture units and choose which texture to draw with. I have the following code in my onDrawFrame() method
int[] texture = new int[7];
texture[0] =TextureHelper.loadTexture(mActivityContext,R.drawable.texture1);
texture[1] =TextureHelper.loadTexture(mActivityContext,R.drawable.texture2);
texture[2] =TextureHelper.loadTexture(mActivityContext,R.drawable.texture3);
texture[3] =TextureHelper.loadTexture(mActivityContext,R.drawable.texture4);
texture[4] =TextureHelper.loadTexture(mActivityContext,R.drawable.texture5);
texture[5] =TextureHelper.loadTexture(mActivityContext,R.drawable.texture6);
texture[6] =TextureHelper.loadTexture(mActivityContext,R.drawable.texture7);
for (int i = 0; i < 7; i++) {
GLES20.glActiveTexture(GLES20.GL_TEXTURE0 + i);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, texture[i]);
GLES20.glUniform1i(mTextureUniformHandle, i);
Matrix.setIdentityM(mModelMatrix, 0);
Matrix.translateM(mModelMatrix, 0, -0.60f + 0.2f * i, 0.0f, 0.0f);
draw();
}
What this is supposed to do is load seven different textures into separate texture units and draw cubes, each cube with a different texture. However, what ends up happening is that all of the cubes end up being drawn with the first texture.
It works correctly if I change GLES20.glActiveTexture(GLES20.GL_TEXTURE0 + i) to GLES20.glActiveTexture(GLES20.GL_TEXTURE0) and GLES20.glUniform1i(mTextureUniformHandle, i) to GLES20.glUniform1i(mTextureUniformHandle, 0), but that just uses a single texture unit and replaces the texture in that unit every time, which is not what I want to do.
What am I doing wrong?
Thanks in advance.
EDIT:
Vertex shader:
"uniform mat4 u_MVPMatrix;" + // A constant representing the
// combined
// model/view/projection matrix.
"uniform mat4 u_MVMatrix;" + // A constant representing the
// combined model/view matrix.
"attribute vec4 a_Position;" + // Per-vertex position
// information we will pass in.
"attribute vec4 a_Color;" + // Per-vertex color information we
// will pass in.
"attribute vec2 a_TexCoordinate;" + // Per-vertex texture
// coordinate information we
// will pass in.
"varying vec3 v_Position;" + // This will be passed into the
// fragment shader.
"varying vec4 v_Color;" + // This will be passed into the
// fragment shader.
"varying vec2 v_TexCoordinate;" + // This will be passed into
// the fragment shader.
// The entry point for our vertex shader.
"void main()" + "{" +
// Transform the vertex into eye space.
"v_Position = vec3(u_MVMatrix * a_Position);" +
// Pass through the color.
"v_Color = a_Color;" +
// Pass through the texture coordinate.
"v_TexCoordinate = a_TexCoordinate;" +
// gl_Position is a special variable used to store the final
// position.
// Multiply the vertex by the matrix to get the final point in
// normalized screen coordinates.
"gl_Position = u_MVPMatrix * a_Position;" + "} ";
Fragment shader:
"precision mediump float;" + // Set the default precision to medium. We don't need as high of a
// precision in the fragment shader.
"uniform sampler2D u_Texture;" + // The input texture.
"varying vec3 v_Position;" + // Interpolated position for this fragment.
"varying vec4 v_Color;" + // This is the color from the vertex shader interpolated across the
// triangle per fragment.
"varying vec2 v_TexCoordinate;" + // Interpolated texture coordinate per fragment.
// The entry point for our fragment shader.
"void main()" +
"{" +
// Multiply the color by the diffuse illumination level and texture value to get final output color.
"gl_FragColor = (v_Color * texture2D(u_Texture, v_TexCoordinate));" +
"}";
draw() method:
public void draw() {
// Pass in the position information
mCubePositions.position(0);
GLES20.glVertexAttribPointer(mPositionHandle, mPositionDataSize, GLES20.GL_FLOAT, false, 0, mCubePositions);
GLES20.glEnableVertexAttribArray(mPositionHandle);
// Pass in the color information
mCubeColors.position(0);
GLES20.glVertexAttribPointer(mColorHandle, mColorDataSize, GLES20.GL_FLOAT, false, 0, mCubeColors);
GLES20.glEnableVertexAttribArray(mColorHandle);
// Pass in the texture coordinate information
mCubeTextureCoordinates.position(0);
GLES20.glVertexAttribPointer(mTextureCoordinateHandle, mTextureCoordinateDataSize, GLES20.GL_FLOAT, false, 0, mCubeTextureCoordinates);
GLES20.glEnableVertexAttribArray(mTextureCoordinateHandle);
// This multiplies the view matrix by the model matrix, and stores the
// result in the MVP matrix
// (which currently contains model * view).
Matrix.multiplyMM(mMVPMatrix, 0, mViewMatrix, 0, mModelMatrix, 0);
// Pass in the modelview matrix.
GLES20.glUniformMatrix4fv(mMVMatrixHandle, 1, false, mMVPMatrix, 0);
// This multiplies the modelview matrix by the projection matrix, and
// stores the result in the MVP matrix
// (which now contains model * view * projection).
Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mMVPMatrix, 0);
// Pass in the combined matrix.
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mMVPMatrix, 0);
// Draw the cube.
GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, 6);
}
Assigning mTextureUniformHandle :
mTextureUniformHandle = GLES20.glGetUniformLocation(mProgramHandle, "u_Texture");
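For reference, the two multiplyMM calls in draw() stage the matrices as MV = view · model and then MVP = projection · MV. A tiny pure-Java column-major multiply (mirroring android.opengl.Matrix.multiplyMM's result = lhs · rhs layout; the class name is made up) lets that staging be checked off-device:

```java
import java.util.Arrays;

public class Mat4 {
    // Column-major 4x4 multiply: out = a * b, same layout and order as
    // android.opengl.Matrix.multiplyMM(out, 0, a, 0, b, 0).
    public static float[] mul(float[] a, float[] b) {
        float[] out = new float[16];
        for (int col = 0; col < 4; col++)
            for (int row = 0; row < 4; row++) {
                float s = 0f;
                for (int k = 0; k < 4; k++)
                    s += a[k * 4 + row] * b[col * 4 + k];
                out[col * 4 + row] = s;
            }
        return out;
    }

    public static void main(String[] args) {
        float[] id = { 1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1 };
        float[] m  = { 2,0,0,0, 0,3,0,0, 0,0,4,0, 1,2,3,1 }; // arbitrary model matrix
        // Staging as in draw(): mv = view * model, then mvp = proj * mv.
        float[] mv  = mul(id, m);  // view = identity here
        float[] mvp = mul(id, mv); // projection = identity here
        System.out.println(Arrays.equals(mvp, m)); // true with identity view/projection
    }
}
```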
Lately I've been searching for how to use multiple textures in a fragment shader and came across this: Binding textures to samplers,
from which I got the following to work:
In onSurfaceCreated or onSurfaceChanged:
Load shaders (attach and link) and get uniform locations for sampler2D (and other variables):
normalMapLoc = GLES20.glGetUniformLocation(shaderProgram, "normalMap");
shadowMapLoc = GLES20.glGetUniformLocation(shaderProgram, "shadowMap");
Load textures:
GLES20.glGenTextures(2, textures, 0);
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textures[0]);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
bitmap.recycle();
GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textures[1]); // bind as a 2D texture (GL_TEXTURE_COORD_ARRAY is not a valid glBindTexture target)
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height, 0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, mColorBuffer);
GLES20.glUniform1i(normalMapLoc, 0); // Texture unit 0 is for normal images.
GLES20.glUniform1i(shadowMapLoc, 1); // Texture unit 1 is for shadow maps.
In onDrawFrame:
GLES20.glClearColor(0f, 0f, 0f, 0f);
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
// pass variables to the fragment shader
...
// get handle to vertex shader's Position member, etcetera
int mPositionHandle = GLES20.glGetAttribLocation(shaderProgram, "vPosition");
GLES20.glEnableVertexAttribArray(mPositionHandle);
GLES20.glVertexAttribPointer(mPositionHandle, 3, GLES20.GL_FLOAT, false, 0, mVertexBuffer);
GLES20.glDrawElements(GLES20.GL_TRIANGLE_STRIP, 4, GLES20.GL_UNSIGNED_SHORT, mIndexBuffer);
and finally the fragment shader looks like this (only relevant portion of code):
uniform sampler2D normalMap, shadowMap;
varying vec2 pos;
void main() {
vec4 color = texture2D(normalMap, pos);
vec4 shadow = texture2D(shadowMap, pos);
// do stuff with the colors
...
gl_FragColor = ...;
}
This way I was finally able to access both textures!
Hope this helps.
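One detail worth underlining in the answer above: glActiveTexture takes the GL_TEXTUREi enum, while the sampler uniform set via glUniform1i takes the plain unit index (0, 1, ...). Mixing the two up is a classic multi-texture bug. A small sketch of the mapping, using GLES20.GL_TEXTURE0's actual value (class and method names are made up):

```java
public class TextureUnits {
    // Same value as GLES20.GL_TEXTURE0.
    public static final int GL_TEXTURE0 = 0x84C0;

    // What to pass to glActiveTexture for texture unit `unit`.
    public static int enumForUnit(int unit) { return GL_TEXTURE0 + unit; }

    // What to pass to glUniform1i for a sampler reading from that unit.
    public static int uniformValueForUnit(int unit) { return unit; }

    public static void main(String[] args) {
        // Unit 1: activate with 0x84C1, but set the sampler uniform to 1.
        System.out.println(Integer.toHexString(enumForUnit(1))); // 84c1
        System.out.println(uniformValueForUnit(1));              // 1
    }
}
```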
I'm trying to code a squeeze effect like the one in the Photo Booth iOS/OS X app. I'm new to shaders, but I was able to integrate the GPUImage library and tweak some shaders; however, all I was able to obtain is a spherical distortion, and the final result is a bit different from what I would like to achieve.
Here is some code I modified from GPUImageBulgeDistortionFilter.h;
in particular I was using this:
NSString *const kGPUImageBulgeDistortionFragmentShaderString = SHADER_STRING
(
varying highp vec2 textureCoordinate;
uniform sampler2D inputImageTexture;
uniform highp vec2 center;
uniform highp float radius;
uniform highp float scale;
void main()
{
highp vec2 textureCoordinateToUse = textureCoordinate;
highp float dist = distance(center, textureCoordinate);
highp float PI = 3.14159265358979323846264;//stone added
textureCoordinateToUse -= center;
textureCoordinateToUse.x = 0.5+textureCoordinateToUse.x*cos(textureCoordinateToUse.x*PI/radius)*scale;
textureCoordinateToUse.y = 0.5 + textureCoordinateToUse.y*cos(textureCoordinateToUse.y*PI/radius)*scale;
gl_FragColor = texture2D(inputImageTexture, textureCoordinateToUse );
}
);
This code uses cos and PI, so it's definitely in the spherical range; any hint to make it more planar, with a small unstretched part in the middle, would be a great help!
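As an aside on the distortion math: the coordinate remapping in that fragment shader can be reproduced on the CPU to inspect what it does to a given texture coordinate. A pure-Java sketch of the same formula (names are mine; note the shader's dist is computed but never used):

```java
public class SqueezeRemap {
    static final double PI = 3.14159265358979323846264;

    // Mirrors the shader: offset from center, then a per-axis cosine falloff.
    public static double[] remap(double x, double y, double cx, double cy,
                                 double radius, double scale) {
        double dx = x - cx, dy = y - cy; // textureCoordinateToUse -= center
        return new double[] {
            0.5 + dx * Math.cos(dx * PI / radius) * scale,
            0.5 + dy * Math.cos(dy * PI / radius) * scale
        };
    }

    public static void main(String[] args) {
        // At the center the offset is zero, so the sample stays at (0.5, 0.5):
        double[] p = remap(0.5, 0.5, 0.5, 0.5, 0.25, 1.0);
        System.out.println(p[0] + ", " + p[1]); // 0.5, 0.5
    }
}
```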