How to create a depth texture on iOS6. Is R16F an option?

I am trying to write depth to a texture. I would like to have linear depth, so I tried using an R16F texture. I defined the texture like this:
glTexImage2D(GL_TEXTURE_2D, 0, GL_R16F_EXT, g_bufferWidth, g_bufferHeight, 0,
             GL_RED_EXT, GL_HALF_FLOAT_OES, NULL);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, g_texture, 0);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16,
                      g_bufferWidth, g_bufferHeight);
But when debugging in Xcode with frame capture on an iPhone 5, I get an Unknown texture in the color buffer, and nothing is written to the depth buffer.
I've also tried just creating a depth texture:
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, g_bufferWidth, g_bufferHeight, 0, GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, NULL);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_TEXTURE_2D, g_texture, 0);
But in this case, too, nothing seems to get written to the depth buffer.
The only way I can get anything rendered to the depth buffer seems to be defining the first texture as RGBA32...
Aren't the EXT_color_buffer_half_float and depth texture extensions available in iOS 6?
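Both extensions should be advertised on an iPhone 5 under iOS 6, so it is worth verifying at runtime. Here is a minimal sketch of checking the extension string and the FBO's completeness; the framebuffer name g_framebuffer is illustrative, not from the question:
// Sketch: verify extension support and framebuffer completeness.
// Assumes an OpenGL ES 2 context is current (and <string.h>/<stdio.h>).
const char *ext = (const char *)glGetString(GL_EXTENSIONS);
if (ext == NULL || strstr(ext, "GL_EXT_color_buffer_half_float") == NULL)
    printf("half-float color attachments not supported\n");
if (ext == NULL || strstr(ext, "GL_OES_depth_texture") == NULL)
    printf("depth textures not supported\n");

glBindFramebuffer(GL_FRAMEBUFFER, g_framebuffer);
GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
if (status != GL_FRAMEBUFFER_COMPLETE)
    printf("FBO incomplete: 0x%04x\n", status);
If either check fails, the corresponding attachment setup will silently produce an incomplete or unusable framebuffer.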

Related

OpenGL ES 2 - glFramebufferTexture2D with Incomplete Missing Attachment error

I'm new to OpenGL/GLES. I get an Incomplete Missing Attachment error when generating a framebuffer from an EGLImageKHR with the code below:
GLuint texture;
GLuint framebuffer;
EGLImageKHR image = eglCreateImageKHR(display,
                                      EGL_NO_CONTEXT,
                                      EGL_NATIVE_PIXMAP_KHR,
                                      (EGLClientBuffer)&pixmap,
                                      NULL);
assert(image != EGL_NO_IMAGE_KHR);
glGenTextures(1, &texture);
glGenFramebuffers(1, &framebuffer); // note: framebuffer names come from glGenFramebuffers, not glGenTextures
glBindTexture(GL_TEXTURE_EXTERNAL_OES, texture); // bind before setting parameters
glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glEGLImageTargetTexture2DOES(GL_TEXTURE_EXTERNAL_OES, image);
glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
glFramebufferTexture2D(GL_FRAMEBUFFER,
                       GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_EXTERNAL_OES,
                       texture,
                       0);
glCheckFramebufferStatus(GL_FRAMEBUFFER);
glBindFramebuffer(GL_FRAMEBUFFER, 0);
eglDestroyImageKHR(display, image);
glBindTexture(GL_TEXTURE_EXTERNAL_OES, 0);
I got GL_FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT when using:
glFramebufferTexture2D(GL_FRAMEBUFFER,
                       GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_EXTERNAL_OES,
                       texture,
                       0);
and GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT when changing the target to GL_TEXTURE_2D:
glFramebufferTexture2D(GL_FRAMEBUFFER,
                       GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D,
                       texture,
                       0);
The image and texture are correct, as I can display them properly. I don't know what I'm missing here.
I just found the answer; it is explained on the Raspberry Pi forum:
Can't render to render buffer
We have to use GL_TEXTURE_2D in this function:
glFramebufferTexture2D(GL_FRAMEBUFFER,
                       GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D,
                       texture,
                       0);
and we have to create an empty GL_TEXTURE_2D texture and bind our framebuffer to it before rendering, as in the sketch below. GL_TEXTURE_EXTERNAL_OES and GL_TEXTURE_2D are different texture targets; you cannot mix them.
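A minimal sketch of that workaround (fboTexture, width, and height are illustrative names, not from the question):
// Create an ordinary, empty GL_TEXTURE_2D to serve as the color attachment.
GLuint fboTexture;
glGenTextures(1, &fboTexture);
glBindTexture(GL_TEXTURE_2D, fboTexture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL); // empty storage
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

// GL_TEXTURE_2D is a valid target for glFramebufferTexture2D.
glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, fboTexture, 0);
// The EGLImage-backed GL_TEXTURE_EXTERNAL_OES texture is then only
// sampled from in a shader, never attached to the framebuffer.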

glDrawArrays throws GL_INVALID_VALUE on GLESv2

I'm trying to port an OpenGL program to GLESv2. The program uses the following code to render a texture to the default framebuffer (it also fails if I render to an FBO, which also works under OpenGL).
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glVertexAttribPointer(bgra_texcoords, 2, GL_FLOAT, GL_FALSE, 0, display_texcoords);
glEnableVertexAttribArray(bgra_texcoords);
DEBUG_ERROR_CHECK();
glBindBuffer(GL_ARRAY_BUFFER, vertex_buffer);
glUseProgram(bgra_program);
glBindTexture(GL_TEXTURE_2D, inst->texture);
glUniform1i(bgra_texture, 0);
glViewport(inst->x, root_surface->h - (inst->y + inst->h), inst->w, inst->h);
DEBUG_ERROR_CHECK();
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
DEBUG_ERROR_CHECK();
glBindBuffer(GL_ARRAY_BUFFER, 0);
glFlush();
This works fine with OpenGL, but it fails on glDrawArrays() under GLESv2. I read this question: glDrawElements throws GL_INVALID_VALUE error, which is very similar to my problem, but I can't figure out how to apply the solution to my code since I'm not using vertex arrays and I'm very new to GL.
inst->texture is a texture uploaded with glTexImage2D(). I created the vertex_buffer right after initializing EGL and compiling the shaders:
glGenBuffers(1, &vertex_buffer);
glBindBuffer(GL_ARRAY_BUFFER, vertex_buffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(vertices),
             vertices, GL_STATIC_DRAW);
glBindBuffer(GL_ARRAY_BUFFER, vertex_buffer);
glVertexAttribPointer(bgra_pos, 2, GL_FLOAT, GL_FALSE, 0, 0);
glEnableVertexAttribArray(bgra_pos);
DEBUG_ERROR_CHECK();
Edit: You can look at the whole source file here: https://github.com/fernando-rodriguez/mediabox/blob/bc4135d9568b2c5b4e8f39ac63ded2cb66023bcd/src/lib/ui/video-opengl.c. The file is a video "driver" for a compositor; all it does is create 2D surfaces and render them to the screen. If there is anything wrong with the question or I am missing something, please post a comment so I can fix it. Thanks.
I figured it out. I was calling eglBindAPI() after eglCreateContext(), so I think I was actually creating a GLES 1 context.
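For anyone hitting the same problem, a sketch of the ordering that works (display and config are assumed to be set up elsewhere):
// Bind the client API *before* creating the context; otherwise the
// context may be created for the wrong API or version.
eglBindAPI(EGL_OPENGL_ES_API);

static const EGLint ctx_attribs[] = {
    EGL_CONTEXT_CLIENT_VERSION, 2, // request an ES 2 context
    EGL_NONE
};
EGLContext ctx = eglCreateContext(display, config, EGL_NO_CONTEXT, ctx_attribs);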

Geometry Image Creation and general handling

I would like some entry-level information about the OpenGI library, which converts a 3D mesh to a 2D texture map.
http://opengi.sourceforge.net/doc/index.html
At this point I have implemented an example on a specific mesh and it seems to work perfectly. I would like to process the image further, so I need to export it as a BMP file.
I have imported a mesh and created:
a texture with the parameterized geometry
a texture with the normals
At this point I am confused about how to handle them. They are stored as OpenGL textures.
Here's the source code:
glBindTexture(GL_TEXTURE_2D, uiTex[0]);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, res, res, 0, GL_RGBA, GL_FLOAT, NULL);
giGenImages(3, gim);
giBindImage(gim[0]);
giImageGLTextureData(res, res, 4, GL_FLOAT, uiTex[0]);
giAttribImage(0, gim[0]);
giAttribSamplerParameteri(0, GI_SAMPLING_MODE, GI_SAMPLE_DEFAULT);
giSample();
How can I export the texture?
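Since the result lives in a regular OpenGL texture, one option is to read the pixels back with glGetTexImage (available in desktop OpenGL, which OpenGI targets) and hand them to any image-writing code. A sketch reusing res and uiTex[0] from the question; the BMP writing itself is left to whatever image I/O you prefer:
// Read the RGBA float texture back into client memory.
// (Assumes <stdlib.h> for malloc.)
GLfloat *pixels = malloc(res * res * 4 * sizeof(GLfloat));
glBindTexture(GL_TEXTURE_2D, uiTex[0]);
glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_FLOAT, pixels);

// Convert to 8-bit RGB before handing off to an image library.
unsigned char *bytes = malloc(res * res * 3);
for (int i = 0; i < res * res; ++i) {
    for (int c = 0; c < 3; ++c) {
        float v = pixels[i * 4 + c];
        if (v < 0.0f) v = 0.0f;   // clamp to [0, 1]
        if (v > 1.0f) v = 1.0f;
        bytes[i * 3 + c] = (unsigned char)(v * 255.0f + 0.5f);
    }
}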

GLSL Texture Mapping Results in a Solid Color

I'm trying to write some basic shaders to map a ppm file to my shapes. Unfortunately, instead of a nice multicoloured texture (I'm using a stone brick pattern), I get a solid shade of dark purple.
Here's my code:
Init:
printf("Using %d: Texture shading\n", shaderType);
glEnable(GL_TEXTURE_2D);
glGenTextures(1, &textName);
int w, h;
texture = glmReadPPM("brick.ppm", &w, &h);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameterf( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
glTexParameterf( GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT );
glTexParameterf( GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT );
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
printf("W%dH%d\n", w, h);
glTexImage2D(GL_TEXTURE_2D, 0, 3, w, h, 0, GL_RGB, GL_UNSIGNED_BYTE, texture);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, textName);
programID = LoadShaders("text.vert", "text.frag");
Render:
glClearColor( 0.6f, 0.85f, 1.0f, 1.0f );
glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );
glMatrixMode (GL_MODELVIEW);
glLoadIdentity();
/*Unrelated code here*/
glUseProgram(programID);
varloc = glGetUniformLocation(programID,"texture1");
glUniform1i(varloc, textName);
glLightfv(GL_LIGHT0, GL_SPOT_CUTOFF, &cutOff);
gluLookAt(posx, posy, zoom,
          lookx, looky, 0,
          0, 1, 0);
glRotatef(anglex,0.0f,1.0f,0.0f);
glRotatef(angley,1.0f,0.0f,0.0f);
renderTriangles(); //Renders mountains from a list using intermediate mode
// Yes, I know it's deprecated
glutSwapBuffers();
glui->sync_live();
glUseProgram(0);
Vertex Shader:
varying vec2 uv;
void main() {
    uv = vec2(gl_MultiTexCoord0.st);
    gl_Position = ftransform();
}
Fragment Shader:
uniform sampler2D texture1;
varying vec2 uv;
void main() {
    gl_FragColor = texture2D(texture1, uv);
}
Does anyone see any problems here? I can't seem to figure it out.
I tried with a basic white-and-red 2x2 float texture, but again I got one colour: a light red.
If you're getting a single colour for the whole object, there might be something wrong with the texture coordinates. I would try visualizing them to see whether they're correct. You can do that by modifying your fragment shader like this:
gl_FragColor = vec4(uv, 0.0, 1.0);
If your whole image is still rendered in one colour, there is something wrong with the way you're sending texture coordinates across. You're using some deprecated functionality (immediate mode, gl_MultiTexCoord0); maybe it's not working together as you would expect:
"Keep in mind that for GLSL 1.30, you should define your own vertex attribute." http://www.opengl.org/wiki/GLSL_:_common_mistakes
It looks like you are binding the texture after all of the other texture calls. Move the glBindTexture call right after glGenTextures, because you have to bind a texture before you can upload an image into it or set its parameters. The other problem is the sampler uniform: instead of setting it to textName in glUniform1i(varloc, textName), set it to 0, because that variable holds the index of the active texture unit, and you used glActiveTexture(GL_TEXTURE0).
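Putting both fixes together, the init code would look roughly like this (a sketch based on the question's code, not a drop-in replacement):
glGenTextures(1, &textName);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, textName); // bind *before* configuring or uploading
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, w, h, 0, GL_RGB, GL_UNSIGNED_BYTE, texture);

// At draw time: the sampler uniform takes the texture *unit*, not the texture name.
glUniform1i(varloc, 0); // unit 0 matches glActiveTexture(GL_TEXTURE0)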

OpenGL 2D texture rendering too large, glViewport broken

I've written a small tiling game engine with OpenGL and C, and I can't seem to figure out what the problem is. My main loop looks like this:
void main_game_loop()
{
    (poll for events and respond to them)
    glClear(GL_COLOR_BUFFER_BIT);
    glPushMatrix();
    draw_block(WALL, 10, 10);
}
draw_block:
void draw_block(block b, int x, int y)
{
    (load b's texture from a hash and store it in GLuint tex)
    glPushMatrix();
    glTranslatef(x, y, 0);
    glBindTexture(GL_TEXTURE_2D, tex);
    glBegin(GL_QUADS);
    // BLOCK_DIM is 32, the width and height of the texture
    glTexCoord2i(0, 0); glVertex3f(0, 0, 0);
    glTexCoord2i(1, 0); glVertex3f(BLOCK_DIM, 0, 0);
    glTexCoord2i(1, 1); glVertex3f(BLOCK_DIM, BLOCK_DIM, 0);
    glTexCoord2i(0, 1); glVertex3f(0, BLOCK_DIM, 0);
    glEnd();
    glPopMatrix(); // note: was "glPopMatrix;", which compiles but does nothing
}
initialization function: (called before main_game_loop)
void init_gl()
{
    glViewport(0, 0, SCREEN_WIDTH, SCREEN_HEIGHT);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluOrtho2D(0, SCREEN_WIDTH, SCREEN_HEIGHT, 0);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glClearColor(0, 0, 0, 0);
    glEnable(GL_TEXTURE_2D);
    glEnable(GL_BLEND);
    glDisable(GL_DEPTH_TEST);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
}
When run, this displays a black screen. However, if I remove the glViewport call, it seemingly displays the texture, but huge and in the corner of the window. Screenshot: (image omitted)
The texture IS being drawn correctly, because if I scale out by a huge factor, I can see the entire image. The y-axis also seems to be flipped relative to what I specified in the gluOrtho2D call (discovered by making events add or subtract from the x/y coordinates of the image; subtracting from the y coordinate moves the image downward). I'm starting to get frustrated, because this is the simplest possible example I can think of. I'm using SDL and passing SDL_OPENGL to SDL_SetVideoMode. What is going on here?
Looks like a problem with glViewport, but just to be sure, did you try clearing the color buffer to purple?
I've always thought of glViewport as a video/windowing function rather than part of OpenGL proper, because it is the intermediary between the window manager and the OpenGL subsystem, and it uses window coordinates. As such, you should probably look at it alongside the other SDL video calls. I suggest updating the question with the full code, or at least the parts relevant to the video/window subsystem.
Or is it that you omitted to call glViewport after a resize?
You should also try your code without SDL_FULLSCREEN and/or with a smaller window. I usually start with a 512x512 or 640x480 window until I get the viewport and some basic controls right.
The first two parameters of glViewport specify the lower-left corner of the viewport:
http://www.opengl.org/sdk/docs/man/xhtml/glViewport.xml
You can try
glViewport(0, SCREEN_HEIGHT, SCREEN_WIDTH, SCREEN_HEIGHT);
For gluOrtho2D, the parameters are left, right, bottom, top,
so I would probably use
gluOrtho2D(0, SCREEN_WIDTH, 0, SCREEN_HEIGHT);
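Note that the viewport and the projection just have to agree on a convention; the y-flip the asker observed comes from choosing a top-left versus a bottom-left origin. A short sketch of both, using the question's SCREEN_WIDTH/SCREEN_HEIGHT:
// Bottom-left origin (OpenGL's native window coordinates):
glViewport(0, 0, SCREEN_WIDTH, SCREEN_HEIGHT);
gluOrtho2D(0, SCREEN_WIDTH, 0, SCREEN_HEIGHT); // left, right, bottom, top

// Top-left origin (y grows downward, common for 2D tile engines):
glViewport(0, 0, SCREEN_WIDTH, SCREEN_HEIGHT);
gluOrtho2D(0, SCREEN_WIDTH, SCREEN_HEIGHT, 0); // bottom = SCREEN_HEIGHT, top = 0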
