3D pyramid appears scattered, with mixed-up sides (C)

First of all, I defined a structure to express the coordinates of a pyramid:
typedef struct
{
    GLfloat xUp;
    GLfloat yUp;
    GLfloat zUp;
    GLfloat base;
    GLfloat height;
} pyramid;
Pretty self-explanatory here: I store the coordinates of the topmost point, the base side length and the height.
Then I wrote a function to draw a pyramid:
void drawPyramid(pyramid pyr)
{
    /* the four base corners, derived from the apex, base and height */
    GLfloat p1[]= {pyr.xUp+pyr.base/2.0, pyr.yUp-pyr.height, pyr.zUp-pyr.base/2.0};
    GLfloat p2[]= {pyr.xUp+pyr.base/2.0, pyr.yUp-pyr.height, pyr.zUp+pyr.base/2.0};
    GLfloat p3[]= {pyr.xUp-pyr.base/2.0, pyr.yUp-pyr.height, pyr.zUp+pyr.base/2.0};
    GLfloat p4[]= {pyr.xUp-pyr.base/2.0, pyr.yUp-pyr.height, pyr.zUp-pyr.base/2.0};
    GLfloat up[]= {pyr.xUp, pyr.yUp, pyr.zUp};

    /* the four side faces, one color each */
    glBegin(GL_TRIANGLES);
        glColor4f(1.0, 0.0, 0.0, 0.0);
        glVertex3fv(up);
        glVertex3fv(p1);
        glVertex3fv(p2);

        glColor4f(0.0, 1.0, 0.0, 0.0);
        glVertex3fv(up);
        glVertex3fv(p2);
        glVertex3fv(p3);

        glColor4f(0.0, 0.0, 1.0, 0.0);
        glVertex3fv(up);
        glVertex3fv(p3);
        glVertex3fv(p4);

        glColor4f(1.0, 1.0, 0.0, 0.0);
        glVertex3fv(up);
        glVertex3fv(p4);
        glVertex3fv(p1);
    glEnd();

    /* the square base */
    glColor4f(0.0, 1.0, 1.0, 0.0);
    glBegin(GL_QUADS);
        glVertex3fv(p1);
        glVertex3fv(p2);
        glVertex3fv(p3);
        glVertex3fv(p4);
    glEnd();
}
I struggled to draw all the vertices in anti-clockwise order, but I probably messed something up.
This is how I display the pyramid in my rendering function:
void display()
{
    glClearColor(0.0, 0.0, 0.0, 0.0);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    glTranslatef(0.0, -25.0, 50.0);
    glRotatef(-angle, 0.0, 1.0, 0.0);
    glTranslatef(0.0, 25.0, -50.0);

    pyramid pyr;
    pyr.xUp = 0.0;
    pyr.yUp = 10.0;
    pyr.zUp = 50.0;
    pyr.base = 10.0;
    pyr.height = 18.0;

    glColor4f(1.0, 0.0, 0.0, 0.0);
    drawPyramid(pyr);
    glutSwapBuffers();
}
I also use an init method called before the glut main loop:
void init()
{
    glEnable(GL_DEPTH);
    glViewport(-1.0, 1.0, -1.0, 1.0);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(35.0, 1.0, 1.0, 100.0);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    gluLookAt(0.0, 1.0, 0.0,  0.0, 1.0, 30.0,  0.0, 1.0, 0.0);
}
angle is just a double that I use to rotate the pyramid, changeable by pressing 'r', but this is not relevant. It appears that the real problem is how I draw the vertices.
The problem is that the faces of the pyramid appear scattered, messed up. I can best describe this situation with an image:
There's a face being displayed that is too small, and I don't know why.
If I rotate the pyramid it appears messed up; I even recorded a video to describe this.
Later I could upload it if the problem is not totally clear.
PS: Many people have noticed that I am using outdated techniques. But unfortunately this is what my university offers.
EDIT
I forgot to mention the main function:
int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitWindowPosition(100, 100);
    glutInitWindowSize(500, 500);
    glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE | GLUT_DEPTH);
    glutCreateWindow("Sierpinsky Pyramid");
    glutDisplayFunc(display);
    glutKeyboardFunc(keyboard);
    init();
    glutMainLoop();
    return 0;
}

It looks like the depth buffer isn't initialized.
Calling glEnable(GL_DEPTH_TEST) is not enough. You must correctly initialize GLUT and specify that you want depth buffer support, otherwise you won't get a depth buffer. If I remember correctly, this is done using glutInitDisplayMode(GLUT_DEPTH | ...). See the GLUT documentation on opengl.org; additional info can be found using Google.
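For example, requesting a depth buffer alongside the usual color and double-buffer flags:
glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE | GLUT_DEPTH);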
--EDIT--
You're passing an invalid parameter to glEnable. Call glEnable(GL_DEPTH_TEST) instead of glEnable(GL_DEPTH).
Also:
The matrix code in the display function isn't protected by glPushMatrix/glPopMatrix, which means the rotation is applied on top of the previous transform; i.e., every call to the display function rotates the pyramid a bit further.
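For example, a sketch reusing the transforms from your display():
glPushMatrix();
glTranslatef(0.0, -25.0, 50.0);
glRotatef(-angle, 0.0, 1.0, 0.0);
glTranslatef(0.0, 25.0, -50.0);
drawPyramid(pyr);
glPopMatrix();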
glViewport is called with invalid parameters. glViewport takes 4 integer arguments (x, y, width, height), but you're trying to pass floats. Also, what is a "width of -1.0" supposed to mean?
You have not checked any error codes (glGetError). If you had called glGetError after the glEnable call, you'd have seen that it returns GL_INVALID_ENUM.
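Putting those fixes together, your init() might look something like this (a sketch; the 500x500 size is taken from your main()):
void init()
{
    glEnable(GL_DEPTH_TEST);              /* GL_DEPTH is not a valid glEnable() argument */
    glViewport(0, 0, 500, 500);           /* integers: x, y, width, height */
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(35.0, 1.0, 1.0, 100.0);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    gluLookAt(0.0, 1.0, 0.0,  0.0, 1.0, 30.0,  0.0, 1.0, 0.0);
    if (glGetError() != GL_NO_ERROR)      /* needs <stdio.h> for fprintf */
        fprintf(stderr, "GL error in init()\n");
}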
OpenGL has documentation, available on opengl.org. Use it and read it. I'd also recommend reading the "OpenGL Red Book".

Related

Simple primitive rotation in OpenGL (C)

I'm trying to do a simple rotation in OpenGL of my primitive object in the projection plane. I want to rotate the object like a propeller, but I can't seem to get it going right. When I run the code, my object looks like it shrinks into itself (I know it's not that, but it's rotating funny).
void rotateStuff()
{
    spin = spin - .5;   // decrement the spin angle
    if (spin < -360)    // wrap around after a full turn
    {
        spin = spin + 360;
    }
    glPushMatrix();
    glTranslatef(150, 95, 0.0);
    glRotatef(spin, 1.0, 0.0, 0.0);
    glTranslatef(-150, -95, 0);
    displayStuff();
    glPopMatrix();
    drawButton();
    glutSwapBuffers();
}
Here's a snippet of my object:
glBegin(GL_POLYGON);
    glVertex2i(50, 0);
    glVertex2i(50, 75);
    glVertex2i(150, 75);
    glVertex2i(150, 0);
glEnd(); // end current shape
I think something is wrong with the setting of my origin, but what exactly? Am I translating to a wrong origin?
This is a rotation around the x-axis: glRotatef(spin, 1.0, 0.0, 0.0).
Presumably you want things in the x-y plane to stay in the x-y plane,
so you want rotation around the z-axis: glRotatef(spin, 0.0, 0.0, 1.0).
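For example, the same block with only the axis changed:
glPushMatrix();
glTranslatef(150, 95, 0.0);       // move to the pivot point (as in your code)
glRotatef(spin, 0.0, 0.0, 1.0);   // z-axis: propeller-style spin in the x-y plane
glTranslatef(-150, -95, 0);
displayStuff();
glPopMatrix();
As for the pivot: you spin around whatever point you translate to. From the vertex snippet, the rectangle's center is (100, 37.5), so translating to (150, 95) rotates it around an off-center point.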

Can't spot the issue with my GLSL/OpenGL code

I wrote a little program to display a 32-bit float texture on a simple quad. When displaying the quad, the texture color is always black. I experimented with a lot of things, but I couldn't make it work, and I'm really at a loss as to what the problem is.
The code that creates the OpenGL texture goes like this:
glEnable(GL_TEXTURE_2D);
glGenTextures(1, &textureID);
glBindTexture(GL_TEXTURE_2D, textureID);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, width, height, 0, GL_RGBA, GL_FLOAT, textureData);
Using the debugger, there's no error in any of these calls. I also examined the textureData pointer, and got the expected results (in my simplified program, it is just a gradient texture).
This is the vertex shader code in GLSL:
#version 400
in vec4 vertexPosition;
out vec2 uv;
void main() {
    gl_Position = vertexPosition;
    uv.x = (vertexPosition.x + 1.0) / 2;
    uv.y = (vertexPosition.y + 1.0) / 2;
}
It's kind of a simple generation of the UV coordinates without taking them as vertex attributes. The corresponding vertex buffer object is really simple:
GLfloat vertices[4][4] = {
    { -1.0,  1.0, 0.0, 1.0 },
    { -1.0, -1.0, 0.0, 1.0 },
    {  1.0,  1.0, 0.0, 1.0 },
    {  1.0, -1.0, 0.0, 1.0 },
};
I've tested this setup, and it displays the quad covering the entire window, as I wanted. Displaying the UV coordinates in the fragment shader reproduces the gradient that I expected to get. Now here's the fragment shader:
#version 400
uniform sampler2D myTex;
in vec2 uv;
out vec4 fragColor;
void main() {
    fragColor = texture(myTex, uv);
    // fragColor += vec4(uv.x, uv.y, 0, 1);
}
The commented-out line displays the UV coordinates as color for debugging purposes. What am I doing wrong here? I just can't see why the texture() call returns 0 when the texture seems completely right and the UV coordinates are also proper. I link the full code here in case there's something else I'm doing wrong: gl-view.c
EDIT: This is how I set up the myTex sampler:
glEnable(GL_TEXTURE_2D);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, textureID);
glUniform1i(glGetUniformLocation(shaderProgram, "myTex"), 0);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
EDIT: Cleared up the vertex shader code.
I've found the issue: I didn't set any MAG or MIN filter on the texture. Setting the MIN filter to GL_NEAREST solved the problem.
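For reference, the default GL_TEXTURE_MIN_FILTER is GL_NEAREST_MIPMAP_LINEAR, so a texture without mipmaps is incomplete and samples as black. Two lines right after the glTexImage2D call fix it:
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);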

OpenGL Lighting Failing when Scaling

I have to read a 3D object from an ASE file. This object turns out to be too big for the world I have to create, so I must scale it down.
At its original size, it is properly lit.
However, once I scale it down, the lighting becomes oversaturated.
The world is centered around (0, 0, 0); it is 100 meters long (y axis) and 50 meters wide (x axis), and my upVector is (0, 0, 1). There are two lights: light0 at (20, 35, 750) and light1 at (-20, -35, 750).
Relevant parts of the code:
void init(void){
    glClearColor(0.827, 0.925, 0.949, 0.0);
    glEnable(GL_DEPTH_TEST);
    glEnable(GL_COLOR_MATERIAL);
    glColorMaterial(GL_FRONT, GL_DIFFUSE);
    glEnable(GL_LIGHT0);
    glEnable(GL_LIGHT1);
    glEnable(GL_LIGHTING);
    glShadeModel(GL_SMOOTH);
    GLfloat difusa[] = { 1.0f, 1.0f, 1.0f, 1.0f }; // white light
    glLightfv(GL_LIGHT0, GL_DIFFUSE, difusa);
    glLightfv(GL_LIGHT1, GL_DIFFUSE, difusa);
    loadObjectFromFile("objeto.ASE");
}
void display(void) {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    gluLookAt(eyeX, eyeY, eyeZ, atX, atY, atZ, 0.0, 0.0, 1.0);

    GLfloat posicion0[] = { 20.0f, 35.0f, 750.0f, 1.0f };
    glLightfv(GL_LIGHT0, GL_POSITION, posicion0);
    GLfloat posicion1[] = { -20.0f, -35.0f, 750.0f, 1.0f };
    glLightfv(GL_LIGHT1, GL_POSITION, posicion1);

    glColor3f(0.749, 0.918, 0.278);
    glPushMatrix();
    glTranslatef(0.0, 0.0, 1.5);
    // Here comes the problem
    glScalef(0.08, 0.08, 0.08);
    glBegin(GL_TRIANGLES);
    for (int i = 0; i < numFaces; i++) {
        glNormal3d(faces3D[i].n.nx, faces3D[i].n.ny, faces3D[i].n.nz);
        glVertex3d(vertex[faces3D[i].s.A].x, vertex[faces3D[i].s.A].y, vertex[faces3D[i].s.A].z);
        glVertex3d(vertex[faces3D[i].s.B].x, vertex[faces3D[i].s.B].y, vertex[faces3D[i].s.B].z);
        glVertex3d(vertex[faces3D[i].s.C].x, vertex[faces3D[i].s.C].y, vertex[faces3D[i].s.C].z);
    }
    glEnd();
    glPopMatrix();
    glutSwapBuffers();
}
Why does lighting fail when the object is scaled down?
The problem you're running into is that scaling the modelview matrix also influences the "normal matrix" that normals are transformed with. The normal matrix is actually the transpose of the inverse of the modelview matrix (its upper-left 3x3 part). So by scaling down the modelview matrix, you're scaling up the normal matrix (because of the inversion step used to obtain it): with your glScalef(0.08, 0.08, 0.08), the transformed normals come out 1/0.08 = 12.5 times too long, which is why the lighting oversaturates.
Because of that, the transformed normals must be rescaled or renormalized whenever the scale of the modelview matrix is not unitary. In fixed-function OpenGL there are two methods to do this: normal normalization (sounds funny, I know) and normal rescaling. You can enable either with:
glEnable(GL_NORMALIZE);
glEnable(GL_RESCALE_NORMAL);
In a shader you'd simply normalize the transformed normal:
#version ...
uniform mat3 mat_normal;
in vec3 vertex_normal;
void main()
{
    ...
    vec3 view_normal = normalize( mat_normal * vertex_normal );
    ...
}
Depending on the setting of GL_NORMALIZE and GL_RESCALE_NORMAL, your normals are corrected by the OpenGL pipeline after transformation.
Start with glEnable(GL_NORMALIZE) and see if that solves your problem.
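In the init() above, that is one extra line next to the other glEnable calls:
glEnable(GL_LIGHTING);
glEnable(GL_NORMALIZE); // renormalize transformed normals, compensating for the glScalef(0.08, ...) in display()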

Graphics not centered

I have been working on a graphics project in C using Code::Blocks and the GLUT library.
Everything was going well until I tried it in Visual Studio Express 2013 RC.
In VSE my graphics are no longer centered in the window; they look to be shifted about 2% to the left and top.
I have defined everything I can think of:
glutInitWindowSize(1000, 500);
glutInitWindowPosition(100, 100); // Set the position of the window
My reshape function looks like this:
glViewport(0, 0, (GLsizei)width, (GLsizei)height);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluOrtho2D(-1.0, 1.0, -1.0, 1.0); // multiply by new coordinates.
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
Initialization is this:
glMatrixMode(GL_PROJECTION); // Select the matrix to change,
glLoadIdentity(); // clear it,
gluOrtho2D(-1.0, 1.0, -1.0, 1.0); // multiply by new coordinates.
glEnable(GL_LINE_SMOOTH);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glHint(GL_LINE_SMOOTH_HINT, GL_DONT_CARE);
glLineWidth(1.0);
glClearColor(1.0, 1.0, 1.0, 0.0);

Trouble Understanding glOrtho

I'm new to OpenGL and I'm having trouble understanding the concept of glOrtho. For instance, I have:
void display(void)
{
    /* clear all pixels */
    glClear(GL_COLOR_BUFFER_BIT);

    /* draw black polygon (rectangle) with corners at
     * (0.25, 0.25, 0.0) and (0.75, 0.75, 0.0) */
    glColor3f(0.0, 0.0, 0.0);
    glBegin(GL_POLYGON);
        glVertex3f(-.25, 0, 0.0);
        glVertex3f(.25, 0, 0.0);
        glVertex3f(.25, .25, 0.0);
        glVertex3f(-.25, .25, 0.0);
    glEnd();

    /* don't wait!
     * start processing buffered OpenGL routines */
    glFlush();
}
This produces a rectangle, and then this "morphs" the rectangle:
/* this function sets the initial state */
void init(void)
{
    /* select clearing (background) color to white */
    glClearColor(1.0, 1.0, 1.0, 0.0);

    /* initialize viewing values */
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0, 1, 1, 0.0, -1.0, 1.0);
}
and this pretty much makes it a square and puts it up in the top-left corner. I'm not sure how it does that. Are the points in the rectangle transformed?
EDIT:
Figured it out. This was very helpful: http://elvenware.sourceforge.net/OpenGLNotes.html#Ortho
glOrtho is used to define an orthographic projection volume:
The signature is glOrtho(GLdouble left, GLdouble right, GLdouble bottom, GLdouble top, GLdouble near, GLdouble far);
left and right specify the x-coordinate clipping planes, bottom and top specify the y-coordinate clipping planes, and near and far specify the distances to the z-coordinate clipping planes. Together these coordinates provide a box-shaped viewing volume.
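For example, the call from the question defines this volume:
glOrtho(0, 1, 1, 0.0, -1.0, 1.0); /* x runs 0..1 left to right, y runs 0..1 top to bottom (bottom=1, top=0 flips y), z is clipped to [-1, 1] */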
The way you have defined your projection volume, it is not centered around the 3D point (0, 0, 0) but around (0.5, 0.5, 0). Since your polygon is centered around the 3D point (0, 0, 0), you should have defined your glOrtho this way instead: glOrtho(-.5, .5, -.5, .5, -1.0, 1.0); (You can also change the coordinates of your polygon to match the center of your projection volume.)
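That is, a sketch of init() with the centered volume:
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(-0.5, 0.5, -0.5, 0.5, -1.0, 1.0); /* volume centered on the origin, where the polygon sits */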
Your glOrtho call sets up the projection such that the top-left of the viewport is (0, 0) and the bottom-right is (1, 1), with a valid z-range of (-1, 1).
Now, you drew a rectangle from (-0.25, 0) to (0.25, 0.25); everything with x < 0 is clipped away, leaving a 0.25 x 0.25 square at the top-left corner.
Also, the glVertex calls do not match the comment just above them. Either change the vertices to the values stated in the comment, or change the glOrtho call:
glOrtho(-0.5, 0.5, 0.5, -0.5, -1.0, 1.0 );
