How can I mix noise and texture in an OpenGL shader?

I want to overlay a noise texture (Perlin noise or something similar) on an image texture.
When I add the noise texture and the image texture directly, the source image becomes too bright:
fragColor = vec4(image_texture + noise_texture, 1.0);
When I use the mix function, some areas become darker:
fragColor = mix(image_texture, noise_texture, factor);
Neither effect looks good.
I've referred to code that uses the alpha channel for overlaying directly, but it seems to change the background of the GLFW window to the image texture, because its shader doesn't sample a texture at all; it only writes the alpha channel, like this:
gl_FragColor = vec4(1.0, 1.0, 1.0, alpha);
I don't know how to adapt that either. So, is there a good way to mix noise and texture?
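One approach that avoids both artifacts is to treat the noise as a zero-centered perturbation rather than a straight addition or blend. Here is a minimal fragment-shader sketch of that idea (the names uImage, uNoise, uStrength, and vUV are illustrative, not from the original post):
#version 330 core
in vec2 vUV;
out vec4 fragColor;

uniform sampler2D uImage;  // base image texture
uniform sampler2D uNoise;  // noise texture
uniform float uStrength;   // noise strength in [0, 1]

void main()
{
    vec3 image = texture(uImage, vUV).rgb;
    float noise = texture(uNoise, vUV).r;

    // Centering the noise around zero keeps the average brightness
    // unchanged: pure addition brightens, a plain mix darkens.
    vec3 result = image + (noise - 0.5) * uStrength;

    fragColor = vec4(clamp(result, 0.0, 1.0), 1.0);
}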

Related

OpenGL lighting has bright surfaces

If you look at this picture, you can see that the left and right walls are brighter than the others, along with the faces of the chair.
I was wondering: is this an issue with the normals, or could it just be the position of the light that is illuminating these surfaces?
In my main method I just do this:
// enable lighting
glEnable(GL_LIGHTING);
glEnable(GL_LIGHT0);
// set up lighting
float lightColor[] = {1.0f, 0.8f, 0.8f, 1.0f};
glMaterialfv(GL_FRONT_AND_BACK, GL_AMBIENT_AND_DIFFUSE, lightColor);
GLfloat lightpos[] = {2, 2, 4, 4};
glLightfv(GL_LIGHT0, GL_POSITION, lightpos);
If you need to see the normals I can upload them, but I'm not sure whether the problem is with them or not.
It seems your normals are not computed as they should be. Notice how sides of different objects that face the same direction are lit differently.
I would guess that:
you are not transforming the normals correctly when transforming your objects;
your normals are not normalized to unit length (do you have glEnable(GL_NORMALIZE) in your code?);
the normal computation is wrong in some other way (e.g. you round the values before sending them to the renderer).
It is hard to suggest more possible causes without seeing your actual code, but the sketch below shows the basics to check.
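For reference, a minimal fixed-function sketch of those two basics: enabling automatic renormalization and supplying one correct unit normal per face (the quad and its normal here are illustrative):
// Enable automatic renormalization so that scaling transforms
// do not break the lighting calculations.
glEnable(GL_NORMALIZE);

// Supply one unit-length normal per face before its vertices.
// For an axis-aligned wall facing +X, the normal is (1, 0, 0).
glBegin(GL_QUADS);
    glNormal3f(1.0f, 0.0f, 0.0f);
    glVertex3f(0.0f, 0.0f, 0.0f);
    glVertex3f(0.0f, 1.0f, 0.0f);
    glVertex3f(0.0f, 1.0f, 1.0f);
    glVertex3f(0.0f, 0.0f, 1.0f);
glEnd();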

How can I use ARGB color in OpenGL/SDL?

I am rendering SVG using Cairo. The Cairo output format is ARGB. I then put the rendered image in an SDL_Surface so I can use it as an OpenGL texture.
The rendered image looked just fine when I used the SDL_Surface directly, but I had to use the surface as an OpenGL texture because I needed some OpenGL functionality. The problem is that all the colors are flipped: OpenGL uses RGBA, not ARGB.
I was wondering if anybody could help me convert an ARGB SDL_Surface to RGBA.
Useful information:
I used this tutorial to render my SVG.
http://tuxpaint.org/presentations/sdl_svg_svgopen2009_kendrick.pdf
My software is written in C.
EDIT:
I used this tutorial to use an SDL_Surface as an OpenGL texture.
http://www.sdltutorials.com/sdl-tip-sdl-surface-to-opengl-texture
Both the rendering process and the OpenGL texture setup follow those tutorials.
Judging by your Tux example code, you can skip SDL completely and feed OpenGL the pixel data manually using the following code:
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0, GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, image);
The important details here are the GL_BGRA format for the pixel data and the GL_UNSIGNED_INT_8_8_8_8_REV data type (this reverses the order of the channels during pixel transfer operations). OpenGL will take care of converting the pixel data into the appropriate texel format for you. OpenGL ES, on the other hand, will not do this; to make this portable you may want to convert the pixel data to RGBA or BGRA yourself, as sketched below.
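A minimal in-place conversion in C might look like this (it assumes 32-bit pixels and that Cairo's ARGB32 is stored in native little-endian order, i.e. BGRA byte order in memory; the function name is illustrative):
#include <stddef.h>
#include <stdint.h>

/* Convert BGRA byte order (Cairo ARGB32 on little-endian machines)
   to RGBA byte order by swapping the red and blue bytes. */
static void bgra_to_rgba(uint8_t *pixels, size_t pixel_count)
{
    for (size_t i = 0; i < pixel_count; ++i) {
        uint8_t blue = pixels[i * 4 + 0];
        pixels[i * 4 + 0] = pixels[i * 4 + 2]; /* red into byte 0 */
        pixels[i * 4 + 2] = blue;              /* blue into byte 2 */
    }
}
After the conversion, the data can be uploaded with the plain GL_RGBA / GL_UNSIGNED_BYTE combination that OpenGL ES accepts.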

Using GLKBaseEffect, is it possible to colorize a texture?

I have a sheet of black shapes surrounded by transparency. I have successfully loaded this texture with GLKit and I can draw the shapes using GLKBaseEffect into rectangles. Is there a way to change the color of the black (i.e. non-transparent) pixels, so I can draw yellow shapes or blue shapes, etc.? Or do I need a custom shader to do this?
It seems like you'll need a custom shader (which I highly recommend working with) as you need to check individual texel color values, but here are some suggestions to try first:
You can pass color as per-vertex data in a vertex attribute array pointing to GLKVertexAttribColor. This will allow you to individually set the color of each vertex (and ultimately, faces) but it will be difficult to see where they line up against your texture.
You can try enabling the following property on your effect:
effect.colorMaterialEnabled = YES;
But in both cases, if your texels are completely black, I don't think any changes in color will show.
I think a custom shader is definitely the way to go, as you'll need to do something like this:
highp vec4 finalColor;
highp vec4 textureColor = texture2D(uTexture, vTexel);
highp vec4 surfaceColor = uColor;

// If the texel is non-transparent (check the alpha channel)
if (textureColor.a > 0.001) {
    finalColor = surfaceColor;
} else {
    finalColor = vec4(0.0, 0.0, 0.0, 0.0);
}

gl_FragColor = finalColor;
Where anything prefixed with u is a uniform variable passed into the shader.
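Wrapped into a complete fragment shader, that logic might look like this (the precision, varying, and uniform declarations are assumptions implied by the snippet, not shown in the original):
precision highp float;

varying vec2 vTexel;        // texture coordinate from the vertex shader
uniform sampler2D uTexture; // the sheet of black shapes
uniform vec4 uColor;        // the tint color for this draw

void main()
{
    vec4 textureColor = texture2D(uTexture, vTexel);

    // Replace opaque texels with the tint color; keep transparent
    // texels fully transparent.
    if (textureColor.a > 0.001) {
        gl_FragColor = uColor;
    } else {
        gl_FragColor = vec4(0.0, 0.0, 0.0, 0.0);
    }
}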
To get fully colorized textures, use:
self.effect.texture2d0.envMode = GLKTextureEnvModeModulate;
This tells OpenGL to take whatever color is in your texture and multiply it by whatever color the underlying geometry has. You can then use vertex coloring to get neat fades and other effects.
NOTE: You'll want to change your texture from black to white (1, 1, 1, 1) so the multiplication works correctly.
NOTE: Here are some other settings that you should already have in place:
self.effect.texture2d0.enabled = GL_TRUE;
self.effect.texture2d0.target = GLKTextureTarget2D;
self.effect.texture2d0.name = self.texture.name;
self.effect.colorMaterialEnabled = GL_TRUE;
NOTE: You can experiment with GLKTextureEnvModeDecal, too, which blends your texture on top of colored geometry (as when applying a decal), so the transparent parts of the texture show the geometry underneath.

How do you access a previously shaded texture in a Pixel Shader?

In WPF, I want to use a pixel shader to modify a composite image, i.e. a new image overlaid on top of a previously shaded image. The new image comes in as a largely transparent image except where there is data (think mathematical functions: a sine wave, etc.). Anyway, this process needs to repeat pretty rapidly: compose the currently shaded texture with a new image, then shade the composite image. The problem is that I don't know how to access the previously shaded texture from within my shader.
Basically, you need to add a texture variable to your shader, then set that parameter to the texture you need to access before drawing the new one (I'm unsure of the exact process in WPF). You do something like this:
// ...other variables here...
texture PreviousTexture;
sampler2D PreviousTextureSampler = sampler_state
{
    Texture = <PreviousTexture>;
};
// ...shader code here...
Then you can sample the texture with a tex2D call, as sketched below.
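For illustration, the sampling and composite might look like this (CurrentTextureSampler and the lerp-based "over" blend are assumptions, not part of the original answer):
float4 main(float2 uv : TEXCOORD) : COLOR
{
    // Previously shaded result and the new, mostly transparent image.
    float4 previous = tex2D(PreviousTextureSampler, uv);
    float4 current  = tex2D(CurrentTextureSampler, uv);

    // Standard "over" composite: the new image where it has data,
    // the previous result everywhere else.
    return lerp(previous, current, current.a);
}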

Handling alpha channel in WPF pixel shader effect

Is there something unusual about how the alpha component is handled in a pixel shader? I have a WPF application for which my artist is giving me grayscale images to use as backgrounds, and the application colorizes those images according to the current state. So I wrote a pixel shader (using the WPF Pixel Shader Effects Library infrastructure) to use as an effect on an Image element. The shader takes a color as a parameter, which it converts to HSL so it can manipulate brightness. Then for each grey pixel, it computes a color whose brightness is interpolated between the color parameter and white in proportion to the brightness of the source pixel.
float4 main(float2 uv : TEXCOORD) : COLOR
{
    float4 src = tex2D(implicitInputSampler, uv);
    // ...do messy computation involving src brightness and the color parameter...
    float4 dst;
    dst.r = ...
    dst.g = ...
    dst.b = ...
    dst.a = src.a;
    return dst;
}
This works just fine on the pixels where alpha = 1. But where alpha = 0, the resultant pixels come out white, rather than having the window's background show through. So I made a tiny change:
float4 main(float2 uv : TEXCOORD) : COLOR
{
    float4 src = tex2D(implicitInputSampler, uv);
    if (src.a == 0)
        return src;
    ...
and now the transparent parts really are transparent. Why? Why didn't the dst.a = src.a statement in the first version accomplish that? Unfortunately, even this is only a partial fix, because it looks to me like the pixels with 0 < alpha < 1 are coming out white.
Does anyone know what I'm not understanding about alpha?
After some more web searching, I discovered the piece I was missing.
According to an article on MSDN: "WPF uses pre-multiplied alpha everywhere internally for a number of performance reasons, so that's also the way we interpret the color values in the custom pixel shader."
So the fix turns out to be to throw in a multiplication by alpha:
float4 main(float2 uv : TEXCOORD) : COLOR
{
    ...
    dst.rgb *= src.a;
    return dst;
}
And now my output looks as I expect it to.
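Putting the pieces together, the corrected shader has roughly this shape (the colorization math stays elided, as in the question):
float4 main(float2 uv : TEXCOORD) : COLOR
{
    float4 src = tex2D(implicitInputSampler, uv);
    float4 dst;
    // ...compute dst.r, dst.g, dst.b from src brightness
    // and the color parameter, as before...
    dst.a = src.a;
    // WPF uses premultiplied alpha internally, so scale the
    // color channels by alpha before returning.
    dst.rgb *= src.a;
    return dst;
}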
Regarding "pixels with 0 < alpha < 1 are coming out white": what ranges are you expecting here?
All values are going to be in the range 0.0 to 1.0; pixel shaders do not work in discrete 256-step color ranges, they are floating point, where 1.0 is the maximum intensity.
If your calculations end up setting r/g/b values above 1.0, you are going to get white.
http://www.facewound.com/tutorials/shader1/
I am working on an XNA game, and I had to use a grayscale pixel shader and ran into the same problem you are facing.
I don't know if you are familiar with the XNA environment or not, but I solved the problem by changing the SpriteBatch drawing SpriteBlendMode from SpriteBlendMode.None to SpriteBlendMode.AlphaBlend. I hope this helps you understand the reason.
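In XNA 3.x terms, that change is just the blend mode passed to Begin (a sketch, assuming a standard SpriteBatch named spriteBatch):
// Draw with alpha blending instead of overwriting the destination.
spriteBatch.Begin(SpriteBlendMode.AlphaBlend);
// ...spriteBatch.Draw calls here...
spriteBatch.End();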
