GLUT timer loop stopping prematurely - c

I've encountered a strange issue where glutTimerFunc seems to randomly stop working when I call it with a zero delay.
Here is my code:
#include <Windows.h>
#include <GL/gl.h>
#include <GL/glut.h>

int x = 0;

void init(void)
{
    glClearColor(0.0, 0.0, 0.0, 1.0);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0.0, 1.0, 0.125, 0.875, -1.0, 1.0);
}

void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glBegin(GL_POLYGON);
    glColor3f(1.0, x ? 1.0 : 0.0, 0.0);
    glVertex3f(0.25, 0.25, 0.0);
    glVertex3f(0.75, 0.25, 0.0);
    glVertex3f(0.75, 0.75, 0.0);
    glVertex3f(0.25, 0.75, 0.0);
    glEnd();
    glFlush();
    glutSwapBuffers();
}

void timer(int value)
{
    x = !x;
    glutPostRedisplay();
    glutTimerFunc(0, timer, 0); // The line in question
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutInitWindowSize(800, 600);
    glutInitWindowPosition(200, 200);
    glutCreateWindow("hello");
    init();
    glutDisplayFunc(display);
    glutTimerFunc(0, timer, 0);
    glutMainLoop();
    return 0;
}
I expected this to show a flickering square that changes color as fast as the GPU can keep up.
That is what it actually does initially, but the timer loop seems to randomly stop, and the square stops changing color. Sometimes it doesn't flicker perceptibly at all, and sometimes it flickers for several seconds before stopping.
It doesn't stop if I set the delay to 1ms (glutTimerFunc(1, timer, 0);).
Why does the timer loop stop unexpectedly?
I don't really care about how to fix it, just why it happens.

Your GPU is changing the value faster than your monitor can draw it.
If you had a monitor with an extremely high refresh rate, you could probably see it, but unfortunately we're limited to 60Hz/120Hz/240Hz for now.
When you remove the 1 ms forced delay, you make the timing non-deterministic (dependent on the scheduling of other programs, not just yours), and that's why you're getting the random behavior.
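For completeness, the usual way to make the loop deterministic again is to derive the delay from the display's refresh rate rather than passing 0; a sketch (REFRESH_HZ is an assumed constant, not queried from the OS; 60 Hz is a guess):

```c
/* Sketch: re-register the timer roughly once per expected frame
   instead of immediately. REFRESH_HZ is an assumption. */
#define REFRESH_HZ 60

void timer(int value)
{
    x = !x;                                     /* toggle as before */
    glutPostRedisplay();
    glutTimerFunc(1000 / REFRESH_HZ, timer, 0); /* ~16 ms at 60 Hz */
}
```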

Related

glFlush() does not show anything

My OpenGL glFlush() doesn't show anything when I run a GLUT project in Code::Blocks on Windows 7.
Here is my code:
#include <windows.h>
#include <GL/glut.h>
#include <stdlib.h>
#include <stdio.h>

float Color1=0.0, Color2=0.0, Color3=0.0;
int r,p,q;

void keyboard(unsigned char key, int x, int y)
{
    switch (key)
    {
    case 27: // ESCAPE key
        exit(0);
        break;
    case 'r':
        Color1=1.0, Color2=0.0, Color3=0.0;
        break;
    case 'g':
        Color1=0.0, Color2=1.0, Color3=0.0;
        break;
    case 'b':
        Color1=0.0, Color2=0.0, Color3=1.0;
        break;
    }
    glutPostRedisplay();
}

void Init(int w, int h)
{
    glClearColor(1.0, 1.0, 1.0, 1.0);
    glViewport(0, 0, (GLsizei)w, (GLsizei)h);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluOrtho2D((GLdouble)w/-2, (GLdouble)w/2, (GLdouble)h/-2, (GLdouble)h/2);
}

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    int i=0;
    glColor4f(0,0,0,1);
    glPointSize(1);
    glBegin(GL_POINTS);
    for (i=-320; i<=320; i++)
        glVertex2f(i, 0);
    for (i=-240; i<=240; i++)
        glVertex2f(0, i);
    glEnd();
    glColor4f(Color1, Color2, Color3, 1);
    glPointSize(1);
    glBegin(GL_POINTS);
    int x=0, y = r;
    int d = 1-r;
    while (y>=x)
    {
        glVertex2f(x+p, y+q);
        glVertex2f(y+p, x+q);
        glVertex2f(-1*y+p, x+q);
        glVertex2f(-1*x+p, y+q);
        glVertex2f(-1*x+p, -1*y+q);
        glVertex2f(-1*y+p, -1*x+q);
        glVertex2f(y+p, -1*x+q);
        glVertex2f(x+p, -1*y+q);
        if (d<0)
            d += 2*x + 3;
        else
        {
            d += 2*(x-y) + 5;
            y--;
        }
        x++;
    }
    glEnd();
    glFlush();
    //glutSwapBuffers();
}

int main(int argc, char *argv[])
{
    printf("Enter the center point and radius: ");
    scanf("%d %d %d", &p, &q, &r);
    glutInit(&argc, argv);
    glutInitWindowSize(640, 480);
    glutInitWindowPosition(10, 10);
    glutInitDisplayMode(GLUT_RGB | GLUT_SINGLE);
    glutCreateWindow("Circle drawing");
    Init(640, 480);
    glutKeyboardFunc(keyboard);
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
But when I change these two lines, it simply works fine.
glFlush(); to glutSwapBuffers(); and
glutInitDisplayMode(GLUT_RGB | GLUT_SINGLE); to glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE | GLUT_DEPTH);
Can anyone tell me what the problem with my code is, and why glFlush() didn't work?
Modern graphics systems (Windows DWM/Aero, MacOS Quartz Extreme, X11 Composite) are built around the concept of composition. Composition always implies double buffering and hence relies on the buffer swap to initiate a composition refresh.
You can disable DWM/Aero on Windows and refrain from using a compositing window manager on X11, and then single-buffered OpenGL should work as expected.
But why exactly do you want single-buffered drawing? Modern GPUs actually assume that double buffering is used to pump their presentation pipeline efficiently. There's zero benefit in being single buffered.
glFlush works as documented:
The glFlush function forces execution of OpenGL functions in finite time.
This forces all outstanding OpenGL operations to complete rendering to the back buffer. It will not magically display the back buffer; to do that, you need to swap the front buffer and the back buffer.
So the correct use of glFlush, is in conjunction with glutSwapBuffers. But that is redundant, since glutSwapBuffers will flush all outstanding rendering operations anyway.
It appears that you are using an old OpenGL 1.1 tutorial, where double buffering was an expensive novelty. Today double buffering is the norm, and you have to jump through quite a few hoops to get single buffering.
Since OpenGL is currently at version 4.6, I would encourage you to at least start using 4.0.
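In code, the recommended pattern boils down to the two changes the question already found; a sketch (fragments, not a complete program):

```c
/* Sketch: the double-buffered pattern. In main(), request a back buffer: */
glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE);

/* In the display callback: draw, then swap. glutSwapBuffers implies a
   flush of all outstanding rendering, so a separate glFlush is redundant. */
glClear(GL_COLOR_BUFFER_BIT);
/* ... draw the axes and the circle ... */
glutSwapBuffers();  /* present the finished back buffer */
```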

Lighting not working in opengl [closed]

Closed. This question needs debugging details. It is not currently accepting answers.
Edit the question to include desired behavior, a specific problem or error, and the shortest code necessary to reproduce the problem. This will help others answer the question.
Closed 5 years ago.
I am having trouble getting the lighting to work on just the decline portion of my ground. Below is my code for the ground and the decline (making it a ditch):
static void ground(double x, double y, double z, double dx, double dy, double dz)
{
    float white[] = {1,1,1,1};
    float Emission[] = {0.0, 0.0, 0.01*emission, 1.0};
    glMaterialf(GL_FRONT_AND_BACK, GL_SHININESS, shiny);
    glMaterialfv(GL_FRONT_AND_BACK, GL_SPECULAR, white);
    glMaterialfv(GL_FRONT_AND_BACK, GL_EMISSION, Emission);
    // Save transformation
    glPushMatrix();
    // Offset, scale and rotate
    glTranslated(x, y, z);
    glScaled(dx, dy, dz);
    glEnable(GL_TEXTURE_2D);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
    glBindTexture(GL_TEXTURE_2D, textures[2]);
    glBegin(GL_QUADS);
    glColor3f(0.5, 1.0, 0.5);
    glNormal3f(0, 0, 0);
    glTexCoord2f(0.0, 0.0);     glVertex3f(-100, 0, -300);
    glTexCoord2f(300.0, 0.0);   glVertex3f(-100, 0, 300);
    glTexCoord2f(300.0, 300.0); glVertex3f(-3, 0, 300);
    glTexCoord2f(0.0, 300.0);   glVertex3f(-3, 0, -300);
    glEnd();
    glBindTexture(GL_TEXTURE_2D, textures[2]);
    glBegin(GL_QUADS);
    glColor3f(1.0, 1.0, 0.5);
    glNormal3f(0, 0, -1);
    glTexCoord2f(0.0, 0.0);     glVertex3f(-2.99, 0, -300);
    glTexCoord2f(300.0, 0.0);   glVertex3f(-2.99, 0, 300);
    glTexCoord2f(300.0, 300.0); glVertex3f(-1, -1, 300);
    glTexCoord2f(0.0, 300.0);   glVertex3f(-1, -1, -300);
    glEnd();
    glBindTexture(GL_TEXTURE_2D, textures[2]);
    glBegin(GL_QUADS);
    glColor3f(1.0, 1.0, 0.5);
    glNormal3f(0, 0, 1);
    glTexCoord2f(0.0, 0.0);     glVertex3f(0.99, -1, -300);
    glTexCoord2f(300.0, 0.0);   glVertex3f(0.99, -1, 300);
    glTexCoord2f(300.0, 300.0); glVertex3f(2.99, 0, 300);
    glTexCoord2f(0.0, 300.0);   glVertex3f(2.99, 0, -300);
    glEnd();
    glBindTexture(GL_TEXTURE_2D, textures[2]);
    glBegin(GL_QUADS);
    glColor3f(0.5, 1.0, 0.5);
    glNormal3f(0, 0, 0);
    glTexCoord2f(0.0, 0.0);     glVertex3f(2.99, 0, -300);
    glTexCoord2f(300.0, 0.0);   glVertex3f(2.99, 0, 300);
    glTexCoord2f(300.0, 300.0); glVertex3f(100, 0, 300);
    glTexCoord2f(0.0, 300.0);   glVertex3f(100, 0, -300);
    glEnd();
    glPopMatrix();
    glDisable(GL_TEXTURE_2D);
}
So the code where lighting doesn't work correctly is the middle two glBegin(GL_QUADS) blocks.
This is just an instance of GIGO (garbage in, garbage out):
glNormal3f(0,0,0);
Nope. That is not a valid normal vector, and will totally break any lighting calculation.
The next one
glNormal3f(0,0,-1);
glTexCoord2f(0.0, 0.0); glVertex3f(-2.99,0,-300);
glTexCoord2f(300.0,0.0); glVertex3f(-2.99,0,300);
glTexCoord2f(300.0,300.0); glVertex3f(-1,-1,300);
glTexCoord2f(0.0,300.0); glVertex3f(-1,-1,-300);
is at least a non-zero vector, but it isn't normal to the face you are describing, so the lighting will still be wrong.

Use values instead of -1...1 for OpenGL drawing shapes?

If I wanted to draw a plane in OpenGL, I would do something like the below:
glBegin(GL_POLYGON);
glColor3f(1.0, 1.0, 1.0);
glVertex3f(0.5, -0.5, 0.5);
glVertex3f(0.5, 0.5, 0.5);
glVertex3f(-0.5, 0.5, 0.5);
glVertex3f(-0.5, -0.5, 0.5);
glEnd();
This draws a white plane that covers 50% of the canvas (from -0.5 to 0.5 on two axes). I want to use numbers instead, however. I don't want to use -1 to 1, but instead something like 0 to n, where n is the dimension of my canvas. For the above example, something like 250 to 750 on two axes on a 1000 pixel canvas rather than -0.5 to 0.5.
That's what the transformation matrices are for. In your case you'd set up an orthographic projection matrix with the limits you desire. In your example,
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, 1000, 0, 1000, -1, 1);
would set up a viewing volume so that the boundaries are at 0,0 for the lower left corner and 1000,1000 for the upper right.
Note that this (and the code you've given) uses the old, deprecated fixed-function pipeline. You should drop that in favour of a shader-based approach.
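To see what that projection does, here is the same mapping written out by hand (pixel_to_ndc is a hypothetical name, plain C, not part of any GL API):

```c
/* Map a pixel coordinate on an n-pixel canvas to OpenGL's default
   -1..1 clip-space range -- the same mapping the glOrtho call above
   sets up for the x and y axes. Hypothetical helper for illustration. */
static float pixel_to_ndc(float px, float n)
{
    return 2.0f * px / n - 1.0f;
}

/* e.g. on a 1000-pixel canvas: 250 maps to -0.5 and 750 maps to 0.5,
   matching the -0.5..0.5 quad from the question. */
```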

OpenGL 3.3 won't draw my triangle

I'm following these tutorials using C instead of C++:
Tutorial 2, short extension to tutorial 2.
The only change I made to port it was changing Vector3f[3] into GLfloat[9]. The version with GLfloat[1] instead of Vector3f[1] works correctly. I think this change might be the reason glDrawArrays isn't working, but I don't know how to fix it.
#include <stdio.h>
#include <GL/glew.h>
#include <GL/freeglut.h>

GLuint VBO_id;

static void RenderSceneCB()
{
    glClear(GL_COLOR_BUFFER_BIT);
    glEnableVertexAttribArray(0);
    glBindBuffer(GL_ARRAY_BUFFER, VBO_id);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, 0);
    glDrawArrays(GL_TRIANGLES, 0, 3);
    glDisableVertexAttribArray(0);
    glutSwapBuffers();
}

static void CreateVertexBuffer()
{
    GLfloat Vertices[9] = { -1.0f, -1.0f, 0.0f,
                             1.0f, -1.0f, 0.0f,
                             0.0f,  1.0f, 0.0f };
    glGenBuffers(1, &VBO_id);
    glBindBuffer(GL_ARRAY_BUFFER, VBO_id);
    glBufferData(GL_ARRAY_BUFFER, sizeof(Vertices), Vertices, GL_STATIC_DRAW);
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
    glutInitWindowSize(600, 600);
    glutInitWindowPosition(100, 100);
    glutCreateWindow("Tutorial 03");
    glutDisplayFunc(RenderSceneCB);
    GLenum res = glewInit();
    if (res != GLEW_OK) {
        fprintf(stderr, "Error: '%s'\n", glewGetErrorString(res));
        return 1;
    }
    glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
    CreateVertexBuffer();
    glutMainLoop();
    return 0;
}
From here.
OpenGL 3.0 was the last revision of the specification which fully supported both fixed and programmable functionality. Even so, most hardware since the OpenGL 2.0 generation lacked the actual fixed-function hardware. Instead, fixed-function processes are emulated with shaders built by the system.
In OpenGL 3.2, the Core Profile lacks these fixed-function concepts. The compatibility profile keeps them around. However, most newer features of OpenGL cannot work with fixed function, even when it might seem theoretically possible for them to interact.
Sounds like your version of OpenGL doesn't support the fixed function pipeline. Either use an older version of OpenGL that does or write and load a shader as shown in Tutorial 4.
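If you take the "older version" route, freeglut (which the code already includes) lets you request a compatibility-profile context explicitly; a sketch using freeglut-specific calls, placed before glutCreateWindow:

```c
/* Sketch: ask freeglut for a context that keeps the fixed-function
   (compatibility) semantics. glutInitContextVersion and
   glutInitContextProfile are freeglut extensions, not classic GLUT. */
glutInit(&argc, argv);
glutInitContextVersion(3, 3);
glutInitContextProfile(GLUT_COMPATIBILITY_PROFILE);
glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
```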

OpenGL Lighting struggles

I'm in the middle of a project teaching the basics of OpenGL. I've got most of the requirements working fine in terms of camera rotation, translation etc. However I'm struggling a lot with the lighting.
This picture is a comparison of my current program (left) vs the sample solution (right).
In case you can't tell, I'm getting very monochrome colours on the truck. The shadows are very sharp and dark, the high points are singly coloured instead of specular.
The project calls for the use of textures; the one I've shown here is a basic texture of plain grey pixels, but I could use any texture (including the beach sand one being used for the ground).
I'm drawing the object from a mesh:
GLfloat ambient[] = {0.1, 0.1, 0.1, 1};
GLfloat diffuse[] = {0.1, 0.1, 0.1, 1};
GLfloat specular[] = {1.0, 1.0, 1.0, 1.0};
GLfloat shine = 100.0;
glMaterialfv(GL_FRONT, GL_AMBIENT, ambient);
glMaterialfv(GL_FRONT, GL_DIFFUSE, diffuse);
glMaterialfv(GL_FRONT, GL_SPECULAR, specular);
glMaterialf(GL_FRONT, GL_SHININESS, shine);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, textureNumber);
glBegin(GL_TRIANGLES);
for (int i = 0; i < meshes[n]->nTriangles; i++) {
    for (int j = 0; j < 3; j++) {
        glNormal3fv(mesh->normals[mesh->triangles[i][j]]);
        glTexCoord2fv(mesh->texCoords[mesh->triangles[i][j]]);
        glVertex3fv(mesh->vertices[mesh->triangles[i][j]]);
    }
}
glEnd();
There is one light in the scene:
glEnable(GL_LIGHTING);
glEnable(GL_LIGHT0);
glEnable(GL_DEPTH_TEST);
GLfloat diffuse0[]={1.0, 1.0, 1.0, 1.0};
GLfloat ambient0[]={1.0, 1.0, 1.0, 1.0};
GLfloat specular0[]={1.0, 1.0, 1.0, 1.0};
GLfloat light0_pos[]={1.0, 1.0, 1,0, 1.0};
glLightfv(GL_LIGHT0, GL_POSITION, light0_pos);
glLightfv(GL_LIGHT0, GL_AMBIENT, ambient0);
glLightfv(GL_LIGHT0, GL_DIFFUSE, diffuse0);
glLightfv(GL_LIGHT0, GL_SPECULAR, specular0);
glLightf(GL_LIGHT0, GL_CONSTANT_ATTENUATION, 2.0);
glLightf(GL_LIGHT0, GL_LINEAR_ATTENUATION, 1.0);
glLightf(GL_LIGHT0, GL_QUADRATIC_ATTENUATION, 2.0);
Is there something major that I'm missing that could be causing this severe difference? Particular values I should play with? Or a glEnable call I've missed?
Any help, advice or pointers to elsewhere much appreciated.
To eliminate the sharp drop-off, amp up the ambient light. There's no global illumination model in OpenGL so that parameter has absolutely no effect beyond being the colour the face will be if no other light falls upon it.
Since you're using glVertex-type calls, I'll go out on a limb and guess you're using the fixed functionality pipeline? If so then lighting is calculated at vertices, then interpolated across polygon surfaces. That means that specular highlights don't work very well on 'large' polygons — roughly speaking, the highlight can't appear in the middle of a polygon. GPUs with programmable pipelines (which includes the ones in mobile phones nowadays) can calculate lighting per pixel instead of per vertex but OpenGL doesn't do this for you, so you'd need to delve into shader programming yourself. Or just ensure that your model is made up of small enough polygons.
Your shininess exponent is also quite high - have you tried dialling that down a few notches?
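Concretely, the two suggestions amount to changes like these (the numbers are illustrative guesses to tune by eye, not values taken from the sample solution):

```c
/* Sketch: soften the shadow drop-off and tame the highlight. */
GLfloat ambient[] = {0.4, 0.4, 0.4, 1};   /* raised from 0.1 */
GLfloat shine = 20.0;                     /* dialled down from 100 */
glMaterialfv(GL_FRONT, GL_AMBIENT, ambient);
glMaterialf(GL_FRONT, GL_SHININESS, shine);
```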
Looking at the background, it looks like a spotlight on the right bathing the scene in warm light, and an anti-aircraft searchlight (Flakscheinwerfer) on the left flooding everything with harsh light, eliminating every shadow.
