When trying to compile GLSL shaders in C/C++ using GLFW/GLEW I get the following error:
0(12) : error C5052: gl_Position is not accessible in this profile
I followed a tutorial from learnopengl.com. The code runs and displays an empty white square, with the above error message printed to the command line. Any ideas what is happening and how I might fix it?
The fragment shader is:
#version 410
layout (location = 0) in vec3 aPos;
layout (location = 1) in vec3 aColor;
layout (location = 2) in vec2 aTexCoord;
out vec3 ourColor;
out vec2 TexCoord;
void main()
{
    gl_Position = vec4(aPos, 1.0);
    ourColor = aColor;
    TexCoord = aTexCoord;
}
And the vertex shader is:
#version 410
out vec4 FragColor;
in vec3 ourColor;
in vec2 TexCoord;
uniform sampler2D ourTexture;
void main()
{
    FragColor = texture(ourTexture, TexCoord);
}
If you would like to see the rest of the code please refer to the tutorial link above.
Looks like you loaded the fragment shader as the vertex shader and vice versa. gl_Position can only be written from within the vertex shader, since it is a per-vertex output. Loading the shaders in the correct order should get rid of that problem.
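For reference, a minimal sketch of the intended loading order, with vertexSource and fragmentSource standing in for however you read the two files:
// The source that writes gl_Position must be compiled as GL_VERTEX_SHADER,
// and the source that writes FragColor as GL_FRAGMENT_SHADER.
GLuint vs = glCreateShader(GL_VERTEX_SHADER);
glShaderSource(vs, 1, &vertexSource, NULL);
glCompileShader(vs);

GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
glShaderSource(fs, 1, &fragmentSource, NULL);
glCompileShader(fs);

GLuint program = glCreateProgram();
glAttachShader(program, vs);
glAttachShader(program, fs);
glLinkProgram(program);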
I have been trying to learn OpenGL for a while to create a simple 3D game, but upon trying to set uniforms nothing works anymore. I am using a quite old Mac, but I don't think that has anything to do with it.
This is my code for setting the uniform:
Texture texture = createTexture("./res/images/atlas.png");
bindTexture(&texture, 1);
setUniform1i(&shader, "u_Texture", 1);
The code for setUniform1i is:
void setUniform1i(const Shader *shader, char *name, int i1)
{
    int loc = getUniformLocation(shader, name);
    bindShader(shader); // glUniform1i affects the currently bound program, so bind first
    glUniform1i(loc, i1);
}
This is my fragment shader:
#version 120
uniform sampler2D u_Texture;
varying vec2 v_texCoor;
void main()
{
    vec4 texColor = texture2D(u_Texture, v_texCoor);
    gl_FragColor = texColor;
}
One thing to note is that I can set a model-view-projection matrix uniform in my vertex shader just fine, so I have no idea why setting another uniform would result in an error.
Are you binding your shader program correctly? I had the same problem, but it was resolved once I bound the program before setting the uniform: glUseProgram(program);
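To make the ordering explicit, a minimal sketch (program stands in for your linked program handle):
glUseProgram(program);                                // glUniform* targets the program currently in use
int loc = glGetUniformLocation(program, "u_Texture");
if (loc == -1) {
    // the uniform is inactive or the name is misspelled
}
glUniform1i(loc, 1);                                  // texture unit 1, matching bindTexture(&texture, 1)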
I have a program that simply draws a cube. When applying transformations such as rotation, scaling, etc., the program works. When I attempt to apply any perspective matrix, such as perspective, frustum, or ortho, the cube becomes very distorted in undefined ways. What I am getting confused about is why the program works fine with the other transformations but breaks when applying any sort of perspective view.
Additionally, when I change the GL_TRUE parameter to GL_FALSE in glUniformMatrix4fv, the cube stops moving around the screen but still has strange distortion. Here is the display function; just applying the perspective gives that same distortion.
void display()
{
    vec4 e2 = { 0.0, 0.0, 1.0, 0.0 };
    vec4 at = { 0.0, 0.0, 0.0, 0.0 };
    vec4 up = { 0.0, 1.0, 0.0, 0.0 };
    mat4 rx = RotateX(theta);
    mat4 ry = RotateY(theta);
    mat4 ps = Perspective(zoom*45.0, aspect, 0.5, 10.0);
    mat4 rxry = multiplymat4(rx, ry);
    mat4 mv = LookAt(e2, at, up);
    glUniformMatrix4fv(ModelView, 1, GL_TRUE, &mv.m[0][0]);
    mat4 p = multiplymat4(rxry, ps);
    glUniformMatrix4fv(projection, 1, GL_TRUE, &p.m[0][0]);
}
I do not believe the problem is in my Perspective function, since Ortho and Frustum do the same thing, but the perspective code is below.
mat4 Perspective(float fovy, float aspect, float zNear, float zFar)
{
    float top = tan(fovy*DegreesToRadians/2) * zNear;
    float right = top * aspect;
    mat4 c = ZERO_MATRIX;
    c.m[0][0] = zNear/right;
    c.m[1][1] = zNear/top;
    c.m[2][2] = -(zFar + zNear)/(zFar - zNear);
    c.m[2][3] = -2.0*zFar*zNear/(zFar - zNear);
    c.m[3][2] = -1.0;
    c.m[3][3] = 0.0;
    return c;
}
And the vertex shader:
#version 120
attribute vec4 vPosition;
attribute vec4 vColor;
varying vec4 color;
uniform mat4 ModelView;
uniform mat4 projection;
void main()
{
    gl_Position = projection*ModelView*vPosition/vPosition.w;
    color = vec4(vColor);
}
I can spot two places that seem wrong to me.
1) You are multiplying your perspective matrix by a rotation matrix. Why? Camera movements belong in the LookAt matrix. So I suggest this simpler code:
mat4 ps = Perspective(zoom*45.0, aspect, 0.5, 10.0);
glUniformMatrix4fv(projection, 1, GL_TRUE, &ps.m[0][0]);
2) The perspective divide is done automatically by the GPU, so it seems to me your vertex shader is wrong too:
gl_Position = projection*ModelView*vPosition;
color = vec4( vColor);
Matrix multiplication order matters, and from what I see your p matrix should be
mat4 p = multiplymat4(ps, rxry);
Though logically the rotation belongs in the view matrix.
Also, /vPosition.w likely does nothing in the shader, since w equals 1.0 unless you actually supplied four-dimensional position data. Nor do you need a perspective divide in your vertex shader; the hardware performs it after the vertex stage.
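Putting both points together, a sketch of the corrected end of display(), keeping the rotation on the model-view side (multiplymat4, LookAt and Perspective are the helpers from the question):
mat4 rxry = multiplymat4(rx, ry);          // model rotation
mat4 view = LookAt(e2, at, up);
mat4 mv   = multiplymat4(view, rxry);      // fold the rotation into the model-view matrix
mat4 ps   = Perspective(zoom*45.0, aspect, 0.5, 10.0);
glUniformMatrix4fv(ModelView, 1, GL_TRUE, &mv.m[0][0]);
glUniformMatrix4fv(projection, 1, GL_TRUE, &ps.m[0][0]);  // projection on its own
And in the vertex shader, simply gl_Position = projection*ModelView*vPosition;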
I wrote a little program to display a 32-bit float texture on a simple quad. When displaying the quad, the texture color is always black. I have experimented with a lot of things, but I couldn't make it work, and I'm really at a loss as to what the problem is.
The code for creating the OpenGL texture goes like this:
glEnable(GL_TEXTURE_2D);
glGenTextures(1, &textureID);
glBindTexture(GL_TEXTURE_2D, textureID);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, width, height, 0, GL_RGBA, GL_FLOAT, textureData);
Using the debugger, I see no error in any of these calls. I also examined the textureData pointer and got the expected results (in my simplified program it is just a gradient texture).
This is the vertex shader code in GLSL:
#version 400
in vec4 vertexPosition;
out vec2 uv;
void main() {
    gl_Position = vertexPosition;
    uv.x = (vertexPosition.x + 1.0) / 2;
    uv.y = (vertexPosition.y + 1.0) / 2;
}
It's kind of a simple generation of the UV coordinates without taking them as vertex attributes. The corresponding vertex buffer object is really simple:
GLfloat vertices[4][4] = {
    { -1.0,  1.0, 0.0, 1.0 },
    { -1.0, -1.0, 0.0, 1.0 },
    {  1.0,  1.0, 0.0, 1.0 },
    {  1.0, -1.0, 0.0, 1.0 },
};
I've tested the solution, and it displays the quad covering the entire window as I wanted. Displaying the UV coordinates as color in the fragment shader reproduces the gradient I expected to get. Now here's the fragment shader:
#version 400
uniform sampler2D myTex;
in vec2 uv;
out vec4 fragColor;
void main() {
    fragColor = texture(myTex, uv);
    // fragColor += vec4(uv.x, uv.y, 0, 1);
}
The commented-out line displays the UV coordinates as color for debugging purposes. What am I doing wrong here? I just can't see why the texture() call returns 0 when the texture seems completely right and the UV coordinates are also proper. I link the full code here in case there's something else I'm doing wrong: gl-view.c
EDIT: This is how I set up the myTex sampler:
glEnable(GL_TEXTURE_2D);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, textureID);
glUniform1i(glGetUniformLocation(shaderProgram, "myTex"), 0);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
EDIT: Cleared up the vertex shader code.
I've found the issue: I didn't set any MAG or MIN filter on the texture. Setting the MIN filter to GL_NEAREST solved the problem.
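For anyone else hitting this: with no mipmaps uploaded, the default GL_TEXTURE_MIN_FILTER (GL_NEAREST_MIPMAP_LINEAR) leaves the texture incomplete, and sampling an incomplete texture returns black. Two lines right after glTexImage2D are enough:
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);  // no mipmaps uploaded, so don't ask for them
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);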
I have this code calling glGetUniformLocation, but it returns -1 even though I'm using the uniform in my vertex shader. I don't get any errors from glGetError, glGetProgramInfoLog, or glGetShaderInfoLog, and the shaders/program are all created successfully. I also only call this after everything is compiled and linked.
int projectionUniform = glGetUniformLocation( shaderProgram, "dfProjection" );
#version 410
uniform float dfRealTime;
uniform float dfGameTime;
uniform mat4 dfProjection;
uniform mat4 dfModelView;
layout(location = 0) in vec3 vertPosition;
layout(location = 1) in vec4 vertColor;
smooth out vec4 color;
out vec4 position;
void main() {
    color = vertColor;
    position = (dfModelView * dfProjection) * vec4(vertPosition, 1.0);
}
This is the fragment shader:
smooth in vec4 color;
out vec4 fragColor;
void main() {
    fragColor = color;
}
There are three possibilities:
You have misspelled dfProjection in glGetUniformLocation, but that doesn't seem to be the case.
You are not binding the correct program with glUseProgram before calling glGetUniformLocation.
Or you are not using position in your fragment shader, which means dfProjection is not really active.
Another thing: from the code, it seems you are passing the shader handle to glGetUniformLocation; you should pass the linked program handle instead.
After your edit: you are still not using position in your fragment shader. Try something like this:
smooth in vec4 color;
out vec4 fragColor;
in vec4 position;
void main() {
    // do something with position here
    fragColor = color*position;
}
Keep in mind that you still need to write gl_Position in order for the pipeline to know the final vertex position; I was just answering the question of why a uniform variable is not being detected.
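One way to check whether a uniform survived linking is to enumerate the active uniforms (a sketch; shaderProgram is assumed to be the linked program handle and <stdio.h> to be included):
GLint count = 0;
glGetProgramiv(shaderProgram, GL_ACTIVE_UNIFORMS, &count);
for (GLint i = 0; i < count; ++i) {
    char name[256];
    GLint size;
    GLenum type;
    glGetActiveUniform(shaderProgram, i, sizeof(name), NULL, &size, &type, name);
    printf("active uniform %d: %s\n", i, name);  // dfProjection will be missing if it was optimized out
}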
I need to draw a polygon that has boundary lines in one color and an interior filled with another color. Is there an easy way to do this? I currently draw two polygons, one for the interior color and one for the boundary. I think there must be a better way to do this. Thanks for your help.
glColor3d(1, 1., .7);
glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);
glBegin(GL_TRIANGLES);
    glVertex3f(-0.8f, 0.0f, 0.0f);
    glVertex3f(-0.6f, 0.0f, 0.0f);
    glVertex3f(-0.7f, 0.2f, 0.0f);
glEnd();
glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);
glColor3d(.5, .5, .7);
glBegin(GL_TRIANGLES);
    glVertex3f(-0.8f, 0.0f, 0.0f);
    glVertex3f(-0.6f, 0.0f, 0.0f);
    glVertex3f(-0.7f, 0.2f, 0.0f);
glEnd();
Thank you everyone for answering my question. I am fairly new to OpenGL and was looking for a simple answer to a simple question. The answer appears to be not so simple and could probably take a semester's worth of study.
A more modern approach would be to implement this via geometry shaders. This works on OpenGL 3.2 and above as part of core functionality, or on OpenGL 2.1 with the GL_EXT_geometry_shader4 extension.
This paper has all the relevant theory: Shader-Based Wireframe Drawing. It also provides a sample implementation of the simplest technique in GLSL.
Here is my own stab at it, basically a port of their implementation for OpenGL 3.3, limited to triangle primitives:
Vertex shader: (replace the inputs with whatever you use to pass in vertex data and the model-view and projection matrices)
#version 330
layout(location = 0) in vec4 position;
layout(location = 1) in mat4 mv;
out Data
{
    vec4 position;
} vdata;
uniform mat4 projection;
void main()
{
    vdata.position = projection * mv * position;
}
Geometry shader:
#version 330
layout(triangles) in;
layout(triangle_strip, max_vertices = 3) out;
in Data
{
    vec4 position;
} vdata[3];
out Data
{
    noperspective vec3 dist;
} gdata;
void main()
{
    vec2 scale = vec2(500.0f, 500.0f); // scaling factor to make 'd' in the fragment shader big enough to show something
    vec2 p0 = scale * vdata[0].position.xy/vdata[0].position.w;
    vec2 p1 = scale * vdata[1].position.xy/vdata[1].position.w;
    vec2 p2 = scale * vdata[2].position.xy/vdata[2].position.w;
    vec2 v0 = p2 - p1;
    vec2 v1 = p2 - p0;
    vec2 v2 = p1 - p0;
    float area = abs(v1.x*v2.y - v1.y*v2.x);
    gdata.dist = vec3(area/length(v0), 0, 0);
    gl_Position = vdata[0].position;
    EmitVertex();
    gdata.dist = vec3(0, area/length(v1), 0);
    gl_Position = vdata[1].position;
    EmitVertex();
    gdata.dist = vec3(0, 0, area/length(v2));
    gl_Position = vdata[2].position;
    EmitVertex();
    EndPrimitive();
}
Fragment shader: (replace the colors with whatever you need!)
#version 330
in Data
{
    noperspective vec3 dist;
} gdata;
out vec4 outputColor;
uniform sampler2D tex;
const vec4 wireframeColor = vec4(1.0f, 0.0f, 0.0f, 1.0f);
const vec4 fillColor = vec4(1.0f, 1.0f, 1.0f, 1.0f);
void main()
{
    float d = min(gdata.dist.x, min(gdata.dist.y, gdata.dist.z));
    float I = exp2(-2*d*d);
    outputColor = mix(fillColor, wireframeColor, I);
}
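To hook these up, the geometry shader is compiled and attached like the other two stages (a sketch; gsSource stands in for however you load the geometry shader text, and a 3.2+ core context is assumed):
GLuint gs = glCreateShader(GL_GEOMETRY_SHADER);
glShaderSource(gs, 1, &gsSource, NULL);
glCompileShader(gs);
glAttachShader(program, gs);   // alongside the vertex and fragment shaders
glLinkProgram(program);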
You can switch the fill mode between polygons, lines, and points using glPolygonMode.
In order to draw the polygon outline in a different color, you can do the following:
glPolygonMode(GL_FRONT_AND_BACK,GL_FILL);
draw_mesh( fill_color );
glPolygonMode(GL_FRONT_AND_BACK,GL_LINE);
glEnable(GL_POLYGON_OFFSET_LINE);
glPolygonOffset(-1.f,-1.f);
draw_mesh( line_color );
The line offset may be needed because OpenGL doesn't guarantee that the edges of polygons will be rasterized in exactly the same pixels as the lines. So, without an explicit offset you may end up with lines hidden by polygons due to a failed depth test.
There are two ways to do this:
the one you use at the moment (two polygons, one a little larger than the other, or drawn after it)
a texture
To my knowledge there are no other possibilities, and from a performance standpoint both, especially the first one as long as you only color-fill, are extremely fast.
I think you should see this answer: fill and outline.
First draw your triangle using glPolygonMode(GL_FRONT_AND_BACK, GL_FILL) with your desired fill color, then draw the triangle again using glPolygonMode(GL_FRONT_AND_BACK, GL_LINE) with your outline color.