I have been struggling with this for days and I still can't figure out what I'm doing wrong.
I have a vertex attribute consisting of a single float and I want to compare its value against a constant in an if statement, but the statement always evaluates to true even when it shouldn't.
Here is my vertex shader, where the problem occurs:
attribute vec4 a_Position;
attribute vec3 a_Normal;
attribute vec2 a_TextureCoord;
attribute highp float a_Bone;

uniform mat4 bone_1;
uniform mat4 bone_0;

varying vec2 v_TextureCoord;

void main() {
    v_TextureCoord = a_TextureCoord;
    vec4 posy;
    float a = a_Bone;
    if (20.0 < a)
        posy = bone_0 * a_Position;
    else
        posy = bone_1 * a_Position;
    gl_Position = posy;
}
If I replace the condition in the if statement with "true" or "false", everything works as expected... but if I try to use that attribute value in the comparison, the if statement acts as if the condition were always true, even when the a_Bone value is 1.0 or 2.0 (clearly smaller than 20.0).
We've been using attributes in if statements and the code works fine on most OpenGL ES 2.0 devices, so using attributes is not the problem in itself. Here is working code showing how we have done it:
int jointId = int(optionalData1.x);
if (jointId > 0)
    finalMatrix = jointTransforms[jointId - 1] * optionalData2.x;
This code works fine on all iOS and Android devices. optionalData1 is the attribute, and we even use this value to index into a uniform array of bone transform matrices.
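Applied to the shader in the question, the same int-conversion approach might look like this (a sketch only; it assumes a_Bone carries the bone index and that bone_0/bone_1 keep their original meaning):

```glsl
attribute vec4 a_Position;
attribute highp float a_Bone;

uniform mat4 bone_0;
uniform mat4 bone_1;

void main() {
    // Convert the float attribute to int before comparing; some ES 2.0
    // drivers handle integer comparisons on attribute-derived values
    // more reliably than float comparisons.
    int bone = int(a_Bone);
    mat4 boneMatrix = (bone > 20) ? bone_0 : bone_1;
    gl_Position = boneMatrix * a_Position;
}
```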
In Vulkan, I wrote a simple program to draw lines with a fixed color, with a simple vertex shader and fragment shader. But the colors arriving at the fragment shader are different from what is set on the vertices. I checked with RenderDoc: the colors passed to the vertex shader are correct, (1,1,1,1) for both vertices of a line, and its output is the same as well. But in the fragment shader, the color I get is (1,1,0,1). I don't understand why this is happening. Irrespective of what color the vertex shader emits, the input in the fragment shader is always yellow.
Vertex shader:
layout(location = 0) in vec4 position;
layout(location = 1) in vec4 color;
layout(location = 2) in vec2 texcoord;

out vec4 io_color;
out vec2 io_uv;
out vec4 io_position2;

layout(std140, binding = 0) uniform UniformBlock_uTransform
{
    mat4 uTransform;
};

layout(std140, binding = 1) uniform UniformBlock_uTransform2
{
    mat4 uTransform2;
};

void main()
{
    io_uv = texcoord;
    io_color = vec4(1, 1, 1, 1); // just to debug it
    gl_Position = uTransform * position;
    io_position2 = uTransform2 * position;
}
// Fragment shader:
in vec4 io_color;

layout(location = 0) out vec4 result;

void main()
{
    result = io_color;
}
Try adding output and input layout qualifiers to the variables you pass from one shader to the other, to ensure that they actually end up at the same location:
VS:
layout (location = 0) out vec4 io_color;
FS:
layout (location = 0) in vec4 io_color;
I recommend always using that syntax to connect shader outputs and inputs.
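Applied to the shaders above, that might look like this (a sketch; the location numbers are arbitrary, but they must match between the two stages):

```glsl
// Vertex shader
layout(location = 0) out vec4 io_color;
layout(location = 1) out vec2 io_uv;
layout(location = 2) out vec4 io_position2;

// Fragment shader: only the consumed input needs to be declared,
// but its location must match the corresponding vertex-shader output.
layout(location = 0) in vec4 io_color;
```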
Check that the color write mask is not disabled for the blue channel.
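In Vulkan that mask is part of the pipeline's color blend state; if only the R, G and A bits are set, blue is dropped and white comes out yellow. A sketch of the correct setup (field and flag names are from the Vulkan API):

```c
VkPipelineColorBlendAttachmentState blend = {0};
/* Write all four channels; omitting VK_COLOR_COMPONENT_B_BIT here
   would turn (1,1,1,1) into (1,1,0,1) in the framebuffer. */
blend.colorWriteMask = VK_COLOR_COMPONENT_R_BIT | VK_COLOR_COMPONENT_G_BIT |
                       VK_COLOR_COMPONENT_B_BIT | VK_COLOR_COMPONENT_A_BIT;
```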
I am currently working on depth shadow mapping, learning from this tutorial:
(http://www.ogre3d.org/tikiwiki/Depth+Shadow+Mapping)
I have 3 questions:
(1) Is it right that when I use a custom shadow caster, I can read the depth in the shadow receiver through "uniform sampler2D shadowMap"?
void casterVP(float4 position : POSITION,
              uniform float4x4 worldViewProj,
              uniform float4 depthRange,
              out float4 outPos : POSITION,
              out float2 outDepth : TEXCOORD0)
{
    outPos = mul(worldViewProj, position);
    outDepth.x = (outPos.z - depthRange.x) * depthRange.w;
}

void casterFP(float2 depth : TEXCOORD0,
              out float4 result : COLOR)
{
    float finalDepth = depth.x;
    result = float4(finalDepth, finalDepth, finalDepth, 1);
}

// shadow receiver fragment program
void receiverFP(uniform sampler2D shadowMap : register(s0))
{
}
(2)
I am not quite sure what this matrix (texture_viewproj_matrix) is used for.
My guess:
texture coordinate -> camera coordinate -> screen coordinate??
And texture coordinates should be 2D.
Am I right?
(3)
In the shadow receiver fragment shader, I don't know what this line means.
And do these 3 variables (finalCenterDepth, shadowUV.z and vertexColour) all stand for depth?
result = (finalCenterDepth > shadowUV.z) ? vertexColour : float4(0,0,0,1);
Thank you~
Any advice is useful for a newbie :D
(1)
Not sure if I understood the question correctly. If you wrote depth into that render target, then you can read it from the associated texture.
(2)
texture_viewproj_matrix transforms from world space into the light's screen space and rescales the resulting xy from [-1;1] to [0;1]. Basically, in xy/w you get the shadow map UV coordinates of the receiver, and in z the shadow map depth of the receiver.
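As a sketch, the receiver vertex program uses it roughly like this (Cg, following the tutorial's naming; world and position are the usual auto-bound parameters and are assumed here):

```
// Project the receiver's world-space position with texture_viewproj_matrix;
// the matrix already rescales xy from [-1;1] to [0;1].
float4 worldPos = mul(world, position);
float4 shadowUV = mul(texture_viewproj_matrix, worldPos);
// Later, in the fragment program:
//   shadowUV.xy / shadowUV.w  -> UV coordinates into the shadow map
//   shadowUV.z                -> receiver depth to compare against
```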
(3)
finalCenterDepth - the depth read from the shadow map, adjusted by the depth bias in order to fix acne artifacts.
shadowUV.z - the depth of the receiver, also adjusted by the depth bias.
vertexColour - the lit color, which was calculated in the vertex shader (see outColour in receiverVP).
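Putting the three together, the comparison line can be read roughly like this (a sketch; the bias value is purely illustrative):

```
float bias = 0.0005; // illustrative; tune to your depth range
float finalCenterDepth = tex2D(shadowMap, shadowUV.xy / shadowUV.w).x + bias;
// Lit if the closest occluder stored in the shadow map is not in front of
// the receiver; otherwise the receiver is in shadow (black).
result = (finalCenterDepth > shadowUV.z) ? vertexColour : float4(0, 0, 0, 1);
```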
I'm trying to code a squeeze effect like the one in the Photo Booth iOS or OS X app. I'm new to shaders, but I was able to integrate the GPUImage library and tweak some shaders; however, all I was able to obtain is a spherical distortion, and the final result is a bit different from what I would like to achieve.
Here is some code I modded from #import "GPUImageBulgeDistortionFilter.h"
In particular, I was using this code:
NSString *const kGPUImageBulgeDistortionFragmentShaderString = SHADER_STRING
(
 varying highp vec2 textureCoordinate;

 uniform sampler2D inputImageTexture;
 uniform highp vec2 center;
 uniform highp float radius;
 uniform highp float scale;

 void main()
 {
     highp vec2 textureCoordinateToUse = textureCoordinate;
     highp float dist = distance(center, textureCoordinate);
     highp float PI = 3.14159265358979323846264; // stone added

     textureCoordinateToUse -= center;
     textureCoordinateToUse.x = 0.5 + textureCoordinateToUse.x * cos(textureCoordinateToUse.x * PI / radius) * scale;
     textureCoordinateToUse.y = 0.5 + textureCoordinateToUse.y * cos(textureCoordinateToUse.y * PI / radius) * scale;

     gl_FragColor = texture2D(inputImageTexture, textureCoordinateToUse);
 }
);
This code uses cos and PI, so it's definitely in the spherical range; any hint on making it more planar, with a tiny unstretched part in the middle, would be a great help!
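One way to get a flat, unstretched center is to remap the distance from the center piecewise: identity inside an inner radius, then a compression toward the center outside it. A sketch in the same fragment-shader style (innerRadius and squeeze are illustrative parameters I'm introducing, not part of GPUImage):

```glsl
varying highp vec2 textureCoordinate;

uniform sampler2D inputImageTexture;
uniform highp vec2 center;
uniform highp float radius;      // outer radius of the effect
uniform highp float innerRadius; // flat, undistorted zone (assumption)
uniform highp float squeeze;     // < 1.0 pulls pixels inward (assumption)

void main()
{
    highp vec2 d = textureCoordinate - center;
    highp float dist = length(d);
    highp float newDist = dist;
    if (dist > innerRadius && dist < radius)
    {
        // Compress the ring between innerRadius and radius; the factor
        // eases from 1.0 at the inner edge toward squeeze at the outer edge.
        // Note: this leaves a seam at the outer radius; blend the factor
        // back toward 1.0 near the edge if that matters for your use case.
        highp float t = (dist - innerRadius) / (radius - innerRadius);
        newDist = innerRadius + (dist - innerRadius) * mix(1.0, squeeze, t);
    }
    highp vec2 uv = (dist > 0.0) ? center + d * (newDist / dist) : center;
    gl_FragColor = texture2D(inputImageTexture, uv);
}
```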
Vertex shader:
#version 150

in vec3 MCVertex;
in float pointvar;

uniform mat4 MVMatrix;
uniform mat4 MPMatrix;

void main()
{
    gl_Position = MPMatrix * MVMatrix * vec4(MCVertex, 1.0);
}
I need the 'pointvar' attribute, but when I call:
glGetProgramiv(program, GL_ACTIVE_ATTRIBUTES, &numAttributes);
numAttributes comes back as 1. There are 2 attributes in my code, so numAttributes should be 2.
If I do it like this, the attribute becomes active:
gl_Position = MPMatrix * MVMatrix * vec4(MCVertex + vec3(pointvar), 1.0);
and then numAttributes = 2. Is there any other way to make this attribute active?
I have tried #pragma optimize(off), but it doesn't work.
I'm pretty sure that the GLSL compiler will "erase/forget" any uniform or attribute not used in its code.
All the info is here.
EDIT:
Like uniforms, attributes can be active or inactive. Attributes that
are unused are inactive; they do not have a binding.
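In practice, the usual fix on the application side is not to force the attribute to stay active, but to tolerate its absence: query the location and skip the vertex-attribute setup when it returns -1. A sketch in plain OpenGL (program and attribute names taken from the question):

```c
/* Query the attribute; the compiler may have optimized it away. */
GLint loc = glGetAttribLocation(program, "pointvar");
if (loc != -1) {
    glEnableVertexAttribArray((GLuint)loc);
    glVertexAttribPointer((GLuint)loc, 1, GL_FLOAT, GL_FALSE, 0, NULL);
}
/* loc == -1 simply means the attribute is inactive; that's not an error. */
```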
I'm working on a custom filter that should combine two pictures using a blending mode, but I have a problem.
The kernel looks like this:
kernel vec4 brightnessEffect(sampler background_src, sampler foreground_src)
{
    vec4 colorDodgeValue;
    vec4 currentSourceBackground;
    vec4 currentSourceForeground;

    currentSourceBackground = sample(background_src, samplerCoord(background_src));
    currentSourceForeground = sample(foreground_src, samplerCoord(foreground_src));

    colorDodgeValue = currentSourceBackground;   //1
    //colorDodgeValue = currentSourceForeground; //2
    return colorDodgeValue;
}
I get a crash when I run it; if I comment line 2 and uncomment line 1, it works just fine.
Any ideas?