GDI Monochrome Bitmap flips bits on each creation of HBITMAP - c

I've been trying to load a 16-bit (A1R5G5B5) BMP from file and use its alpha channel as a bit mask. Everything works except for one problem that has been troubling me for the past week: when I use CreateDIBitmap to create the 1-bit mask from a buffer of bytes, the bitmap that is created uses the inverse of all its bits, but only on the first draw. On the next paint the bits flip to match the data supplied and stay that way for every draw after that. This behavior is very strange and occurs on every Windows version I've tried. I've tracked it down to some kind of state on the HDC, possibly interacting with CreateDIBitmap. I've tried many things, including setting the foreground and background colors of both HDCs to various values before and after the call, but everything I have tried still shows this behavior.
Here is a POC to try:
BITMAPINFOHEADER bmih;
BITMAPINFO bmi;
HBITMAP mask;
PBYTE data;
PBYTE alpha;
SIZE dimension;

void WhenCreated() // WM_CREATE
{
    dimension.cx = 3;
    dimension.cy = 1;
    // 1bpp scan lines passed to CreateDIBitmap/GetDIBits are DWORD-aligned,
    // so allocate 4 bytes per row rather than 1
    alpha = (PBYTE)malloc(4);
    data = (PBYTE)malloc(4);
    alpha[0] = 0xA0; // 0b10100000
}
#define BIN_SCAPE(B,A) (B[0]&(1<<A))?1:0
void WhenPresenting(HDC H) // WM_PAINT
{
    printf(
        "ALPHA:\t%i %i %i\n",
        BIN_SCAPE(alpha,7),
        BIN_SCAPE(alpha,6),
        BIN_SCAPE(alpha,5)
    );
    HDC memory;
    HBITMAP matter;
    memory = CreateCompatibleDC(NULL);
    memset(&bmi, 0x0, sizeof(BITMAPINFO));
    bmi.bmiHeader.biSize = sizeof(BITMAPINFOHEADER);
    bmi.bmiHeader.biWidth = dimension.cx;
    bmi.bmiHeader.biHeight = dimension.cy;
    bmi.bmiHeader.biPlanes = 1;
    bmi.bmiHeader.biBitCount = 1;
    bmi.bmiHeader.biCompression = BI_RGB;
    memset(&bmih, 0x0, sizeof(BITMAPINFOHEADER));
    bmih.biSize = sizeof(BITMAPINFOHEADER);
    bmih.biWidth = bmi.bmiHeader.biWidth;
    bmih.biHeight = bmi.bmiHeader.biHeight;
    mask = CreateDIBitmap(
        memory,
        &bmih,
        CBM_INIT,
        alpha,
        &bmi,
        DIB_RGB_COLORS
    );
    SelectObject(memory, mask);
    GetDIBits(memory, mask, 0, 1, data, &bmi, DIB_RGB_COLORS);
    printf(
        "DATA:\t%i %i %i\n",
        BIN_SCAPE(data,7),
        BIN_SCAPE(data,6),
        BIN_SCAPE(data,5)
    );
    StretchBlt(
        H,
        0, 0, 128, 128,
        memory,
        0, 0, dimension.cx, dimension.cy,
        SRCCOPY
    );
    DeleteDC(memory);
    DeleteObject(mask);
}
When the program starts, the data displayed is the inverse of what was given; subsequent paints show the data matching what was supplied, as seen in the console output, so there is definitely a flipping of bits happening. My guess is that the first HDC supplied may use a different palette than all the ones after it, which causes this behavior?

Now it all makes sense: it's the palette that is changing.
"biBitCount member is less than 16, the biClrUsed member specifies the actual number of colors the graphics engine or device driver accesses." (from MSDN)
If you use a color HDC in CreateDIBitmap without supplying a color table, you'll get some color paired with black, and that color will change on each repaint, which will start freaking you out until you understand it is because you have not set a palette for the HBITMAP: when each HDC is made, its color palette is undefined unless specified. You could use SetDIBits, but if you want to have it done during CreateDIBitmap, try this:
PBITMAPINFO pbmi;
RGBQUAD palette[2];
{
    // this will give you white (1) and black (0)
    palette[0].rgbBlue  = 0x00;
    palette[0].rgbGreen = 0x00;
    palette[0].rgbRed   = 0x00;
    palette[1].rgbBlue  = 0xFF;
    palette[1].rgbGreen = 0xFF;
    palette[1].rgbRed   = 0xFF;
    // using a PBITMAPINFO in order to allocate room for the palette
    pbmi = (PBITMAPINFO)LocalAlloc(LPTR, sizeof(BITMAPINFO) + sizeof(RGBQUAD) * 2); // this technically allocates one extra RGBQUAD
    pbmi->bmiHeader.biSize = sizeof(BITMAPINFOHEADER);
    pbmi->bmiHeader.biWidth = dimension.cx;
    pbmi->bmiHeader.biHeight = dimension.cy;
    pbmi->bmiHeader.biPlanes = 1;
    pbmi->bmiHeader.biBitCount = 1;
    pbmi->bmiHeader.biCompression = BI_RGB;
    pbmi->bmiHeader.biClrUsed = 2; // palette is two colors long
    pbmi->bmiHeader.biClrImportant = 2;
    memcpy(pbmi->bmiColors, palette, sizeof(RGBQUAD) * 2);
    mask = CreateDIBitmap(
        memory,
        &bmih,
        CBM_INIT,
        alpha,
        pbmi,
        DIB_RGB_COLORS
    );
}
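For completeness, here is a minimal sketch of the SetDIBits route mentioned above, assuming the same alpha buffer, dimension, memory DC, and pbmi color table from the snippets above (mask2 is just an illustrative name; error handling omitted):
// Sketch only: create the monochrome bitmap empty, then push the bits in
// against the explicit two-color table so no palette guessing happens.
HBITMAP mask2 = CreateBitmap(dimension.cx, dimension.cy, 1, 1, NULL);
SetDIBits(
    memory,          // DC used for color matching
    mask2,           // destination bitmap
    0, 1,            // first scan line, number of scan lines
    alpha,           // DWORD-aligned 1bpp source bits
    pbmi,            // BITMAPINFO carrying the two-entry color table
    DIB_RGB_COLORS
);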

Related

Win32 GDI color palette transparency bug

This question could be considered more of a bug report on an unsightly and time-wasting issue I've recently encountered while using Win32/GDI:
That is, loading a bitmap image into a static control (a bitmap static control, not an icon). I'll demonstrate with the following code (this follows the creation of the main window):
HBITMAP hbmpLogo;
/* Load the logo bitmap graphic, compiled into the executable file by a resource compiler */
hbmpLogo = (HBITMAP)LoadImage(
wc.hInstance, /* <-- derived from GetModuleHandle(NULL) */
MAKEINTRESOURCE(ID_LOGO), /* <-- ID_LOGO defined in a header */
IMAGE_BITMAP,
0, 0,
LR_CREATEDIBSECTION | LR_LOADTRANSPARENT);
/* We have a fully functioning handle to a bitmap at this line */
if (!hbmpLogo)
{
/* Thus this statement is never reached */
abort();
}
We then create the control, which is a child of the main window:
/* Add static control */
m_hWndLogo = CreateWindowExW(
0, /* Extended styles, not used */
L"STATIC", /* Class name, we want a STATIC control */
(LPWSTR)NULL, /* Would be window text, but we would instead pass an integer identifier
* here, formatted (as a string) in the form "#100" (let 100 = ID_LOGO) */
SS_BITMAP | WS_CHILD | WS_VISIBLE, /* Styles specified. SS = Static Style. We select
* bitmap, rather than other static control styles. */
32, /* X */
32, /* Y */
640, /* Width. */
400, /* Height. */
hMainParentWindow,
(HMENU)ID_LOGO, /* hMenu parameter, repurposed in this case as an identifier for the
* control, hence the obfuscatory use of the cast. */
wc.hInstance, /* Program instance handle appears here again ( GetModuleHandle(NULL) )*/
NULL);
if (!m_hWndLogo)
{
abort(); /* Also never called */
}
/* We then arm the static control with the bitmap by the, once more quite obfuscatory, use of
* a 'SendMessage'-esque interface function: */
SendDlgItemMessageW(
hMainParentWindow, /* Window containing the control */
ID_LOGO, /* The identifier of the control, passed in via the HMENU parameter
* of CreateWindow(...). */
STM_SETIMAGE, /* The action we want to effect, which is, arming the control with the
* bitmap we've loaded. */
(WPARAM)IMAGE_BITMAP, /* Specifying a bitmap, as opposed to an icon or cursor. */
(LPARAM)hbmpLogo); /* Passing in the bitmap handle. */
/* At this line, our static control is sufficiently initialised. */
What is not impressive about this segment of code is the mandated use of LoadImage(...) to load the graphic from the program resources, where it is otherwise seemingly impossible to specify that our image will require transparency. Both flags LR_CREATEDIBSECTION and LR_LOADTRANSPARENT are required to effect this (once again, very ugly and not very explicit behavioural requirements. Why isn't LR_LOADTRANSPARENT good on its own?).
I will elaborate: the bitmap has been tried at different bit depths, each less than 16 bits per pixel (id est, using colour palettes), which produces distractingly inconsistent results between them. [Edit: See further discoveries in my answer]
What exactly do I mean by this?
A bitmap loaded at 8 bits per pixel, thus having a 256-length colour palette, renders with the first colour of the bitmap deleted (that is, set to the window class background brush colour); in effect, the bitmap is now 'transparent' in the appropriate areas. This behaviour is expected.
I then recompile the executable, now loading a similar bitmap but at (a reduced) 4 bits per pixel, thus having a 16-length colour palette. All is good and well, except I discover that the transparent region of the bitmap is painted with the WRONG background colour, one that does not match the window background colour. My wonderful bitmap has an unsightly grey rectangle around it, revealing its bounds.
What should the window background colour be? All documentation leads back, very explicitly, to this (HBRUSH)NULL-inclusive eyesore:
WNDCLASSEX wc = {}; /* Zero initialise */
/* initialise various members of wc
* ...
* ... */
wc.hbrBackground = (HBRUSH)(COLOR_WINDOW+1); /* Here is the eyesore. */
Where a certain colour preset must be incremented, then cast to a HBRUSH typename, to specify the desired background colour. 'Window colour' is an obvious choice, and a fragment of code very frequently recurring and reproducible.
You may note that when this is not done, the Window instead assumes the colour of its preceding number code, which on my system happens to be the 'Scroll' colour. Indeed, and alas, if I happen to forget the notorious and glorious +1 appended to the COLOR_WINDOW HBRUSH, my window will become the unintended colour of a scroll bar.
And it seems this mistake has propagated within Microsoft's own library. Evidence? A 4-bpp bitmap, when loaded, will also have its transparent areas erased to the wrong background colour, where an 8-bpp bitmap does not.
TL;DR
It seems the programmers at Microsoft themselves do not fully understand their own Win32/GDI interface jargon, especially regarding the peculiar design choice behind adding 1 to the Window Class WNDCLASS[EX] hbrBackground member (supposedly to support (HBRUSH)NULL).
This is unless, of course, anyone can spot a mistake on my part?
Shall I submit a bug report?
Many thanks.
As though to patch over a hole in a parachute, there is a solution that produces consistency, implemented in the window callback procedure:
LRESULT CALLBACK WndProc(HWND hWnd, UINT uiMsg, WPARAM wp, LPARAM lp)
{
/* ... */
switch (uiMsg)
{
/* This message is sent to us as a 'request' for the background colour
* of the static control. */
case WM_CTLCOLORSTATIC:
/* WPARAM will contain the handle of the DC */
/* LPARAM will contain the handle of the control */
if (lp == (LPARAM)g_hLogo)
{
SetBkMode((HDC)wp, TRANSPARENT);
return (LRESULT)GetSysColorBrush(COLOR_WINDOW); /* Here's the magic */
}
break;
}
return DefWindowProc(hWnd, uiMsg, wp, lp);
}
It turns out the problem was not reproducible when other transparent bitmaps of varying sizes (not only bit depths) were loaded.
This is horrible. I am not sure why this happens. Insights?
EDIT: All classes have been removed to produce a neat 'minimal reproducible example'.

Font layouting & rendering with cairo and freetype

I have a system that has only the freetype2 and cairo libraries available. What I want to achieve is:
getting the glyphs for a UTF-8 text
layouting the text, storing position information (by myself)
getting cairo paths for each glyph for rendering
Unfortunately the documentation doesn't really explain how it should be done, as they expect one to use a higher level library like Pango.
What I think could be right is: create a scaled font with cairo_scaled_font_create and then retrieve the glyphs for the text using cairo_scaled_font_text_to_glyphs. cairo_glyph_extents then gives the extents for each glyph. But how can I then get things like kerning and the advance? Also, how can I then get paths for each glyph?
Are there some more resources on this topic? Are these functions the expected way to go?
Okay, so I found what's needed.
You first need to create a cairo_scaled_font_t, which represents a font at a specific size. To do so, one can simply use cairo_get_scaled_font after setting a font; it creates a scaled font for the current settings in the context.
Next, you convert the input text using cairo_scaled_font_text_to_glyphs, this gives an array of glyphs and also clusters as output. The cluster mappings represent which part of the UTF-8 string belong to the corresponding glyphs in the glyph array.
To get the extents of glyphs, cairo_scaled_font_glyph_extents is used. It gives dimensions, advances and bearings of each glyph/set of glyphs.
Finally, the paths for glyphs can be put in the context using cairo_glyph_path. These paths can then be drawn as wished.
The following example converts an input string to glyphs, retrieves their extents and renders them:
const char* text = "Hello world";
int fontSize = 14;
cairo_font_face_t* fontFace = ...;

// get the scaled font object
cairo_set_font_face(cr, fontFace);
cairo_set_font_size(cr, fontSize);
auto scaled_face = cairo_get_scaled_font(cr);

// get glyphs for the text
cairo_glyph_t* glyphs = NULL;
int glyph_count;
cairo_text_cluster_t* clusters = NULL;
int cluster_count;
cairo_text_cluster_flags_t clusterflags;

auto stat = cairo_scaled_font_text_to_glyphs(scaled_face, 0, 0, text, strlen(text),
    &glyphs, &glyph_count, &clusters, &cluster_count, &clusterflags);

// check if conversion was successful
if (stat == CAIRO_STATUS_SUCCESS) {
    // text paints on bottom line
    cairo_translate(cr, 0, fontSize);

    // draw each cluster
    int glyph_index = 0;
    int byte_index = 0;
    for (int i = 0; i < cluster_count; i++) {
        cairo_text_cluster_t* cluster = &clusters[i];
        cairo_glyph_t* clusterglyphs = &glyphs[glyph_index];

        // get extents for the glyphs in the cluster
        cairo_text_extents_t extents;
        cairo_scaled_font_glyph_extents(scaled_face, clusterglyphs, cluster->num_glyphs, &extents);
        // ... for later use

        // put paths for current cluster to context
        cairo_glyph_path(cr, clusterglyphs, cluster->num_glyphs);

        // draw black text with green stroke
        cairo_set_source_rgba(cr, 0.2, 0.2, 0.2, 1.0);
        cairo_fill_preserve(cr);
        cairo_set_source_rgba(cr, 0, 1, 0, 1.0);
        cairo_set_line_width(cr, 0.5);
        cairo_stroke(cr);

        // advance glyph/byte position
        glyph_index += cluster->num_glyphs;
        byte_index += cluster->num_bytes;
    }
}
Those functions seem to be the best way, considering Cairo's text system. It just shows even more that Cairo isn't really meant for text: it won't really be able to do kerning or paths properly. Pango, I believe, has its own complex code for doing those things.
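If kerning pairs are genuinely needed, one possible workaround is to drop down to the FT_Face behind the scaled font. This is a sketch only, assuming the scaled font is backed by cairo's FreeType backend and that the face actually carries kerning data (kerning_x is an illustrative helper name):
#include <ft2build.h>
#include FT_FREETYPE_H
#include <cairo-ft.h>

// Sketch: query the kerning between two glyph indices via the FT_Face
// backing a cairo scaled font created through the FreeType backend.
static double kerning_x(cairo_scaled_font_t *scaled_font,
                        unsigned left_glyph, unsigned right_glyph)
{
    FT_Face face = cairo_ft_scaled_font_lock_face(scaled_font);
    FT_Vector delta = {0, 0};
    if (face && FT_HAS_KERNING(face))
        FT_Get_Kerning(face, left_glyph, right_glyph, FT_KERNING_DEFAULT, &delta);
    cairo_ft_scaled_font_unlock_face(scaled_font);
    return delta.x / 64.0; // FreeType reports kerning in 26.6 fixed point
}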
For best advancement of Ghost, I would recommend porting Pango, since you (or someone else) will probably eventually want it anyway.

Why is pango_layout_get_pixel_size slightly wrong on Ubuntu Linux in C

I'm experiencing a frustrating issue trying to draw text using Pango and Cairo libraries in C in a Gtk application running on Ubuntu Linux.
I'm creating a Pango layout and then drawing it at a given location which is determined by the size of the text as reported by pango_layout_get_pixel_size, but the size returned by that function is wrong in both width and height, especially in height. Here is my full code:
// Create a cairo context with which to draw
// Note that we already have a GtkDrawingArea widget m_pGtkDrawingArea
cairo_t *cr = gdk_cairo_create(m_pGtkDrawingArea->window);
// Text to draw
std::string szText("NO DATA AVAILABLE");
// Create the layout
PangoLayout *pLayout = gtk_widget_create_pango_layout(m_pGtkDrawingArea, szText.c_str());
// Set layout properties
pango_layout_set_alignment(pLayout, PANGO_ALIGN_LEFT);
pango_layout_set_width(pLayout, -1);
// The family to use
std::string szFontFamily("FreeSans");
// The font size to use
double dFontSize = 36.0;
// Format the font description string
char szFontDescription[32];
memset(&(szFontDescription[0]), 0, sizeof(szFontDescription));
snprintf(szFontDescription, sizeof(szFontDescription) - 1, "%s %.1f", szFontFamily.c_str(), dFontSize);
// Get a new pango font description
PangoFontDescription *pFontDescription = pango_font_description_from_string(szFontDescription);
// Set up the pango font description
pango_font_description_set_weight(pFontDescription, PANGO_WEIGHT_NORMAL);
pango_font_description_set_style(pFontDescription, PANGO_STYLE_NORMAL);
pango_font_description_set_variant(pFontDescription, PANGO_VARIANT_NORMAL);
pango_font_description_set_stretch(pFontDescription, PANGO_STRETCH_NORMAL);
// Set this as the pango font description on the layout
pango_layout_set_font_description(pLayout, pFontDescription);
// Use auto direction
pango_layout_set_auto_dir(pLayout, TRUE);
// Get the pixel size of this text - this reports a size of 481x54 pixels
int iPixelWidth = 0, iPixelHeight = 0;
pango_layout_get_pixel_size(pLayout, &iPixelWidth, &iPixelHeight);
// Calculate the text location based on iPixelWidth and iPixelHeight
double dTextLocX = ...;
double dTextLocY = ...;
// Set up the cairo context for drawing the text
cairo_set_source_rgba(cr, 1.0, 1.0, 1.0, 1.0);
cairo_set_antialias(cr, CAIRO_ANTIALIAS_BEST);
// Move into place
cairo_move_to(cr, dTextLocX, dTextLocY);
// Draw the layout
pango_cairo_show_layout(cr, pLayout);
//
// pango_layout_get_pixel_size() reported a size of 481x54 pixels,
// but the actual size when drawn is 478x37 pixels!
//
//
// Clean up...
//
So, as described at the bottom of the above code, the pango_layout_get_pixel_size function reports a size of 481x54 pixels, but the size of the text on the screen is actually 478x37 pixels.
What am I doing wrong here? How can I get the actual correct pixel size?
Thanks in advance for your help!
The text you are displaying ("NO DATA AVAILABLE") is all-caps, and consequently has no descenders (letters which are partly below the baseline, like j, p and q.) Normally, when you measure the extent of a text box, you include room for the descenders whether or not they are present; otherwise, you will see odd artifacts such as inconsistent line separation depending on whether or not a given line has a descender.
Pango provides APIs which return both the logical extent (which includes the full height of the font) and the ink extent (which is the bounding box of the inked part of the image). I suspect you are looking for the ink extent.
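A minimal sketch of reading both rectangles from the layout in the question (pLayout as above; no API beyond pango_layout_get_pixel_extents is assumed):
// Sketch: compare the logical and ink extents of the layout.
PangoRectangle inkRect, logicalRect;
pango_layout_get_pixel_extents(pLayout, &inkRect, &logicalRect);
// logicalRect.width/height matches what pango_layout_get_pixel_size() returns
//   (full font height, including room for descenders)
// inkRect.width/height is the bounding box of the pixels actually drawn
printf("logical: %dx%d  ink: %dx%d\n",
       logicalRect.width, logicalRect.height,
       inkRect.width, inkRect.height);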

WinAPI get mouse cursor icon

I want to get the cursor icon in Windows.
I think the language I use isn't very important here, so I will just write pseudo code with the WinAPI functions I'm trying to use:
c = CURSORINFO.new(20, 1, 1, POINT.new(1,1));
GetCursorInfo(c); #provides correctly filled structure with hCursor
DrawIcon(GetWindowDC(GetForegroundWindow()), 1, 1, c.hCursor);
So this part works fine; it draws the current cursor on the active window.
But that's not what I want. I want to get an array of pixels, so I should draw it in memory.
I'm trying to do it like this:
hdc = CreateCompatibleDC(GetDC(0)); #returns non-zero int
canvas = CreateCompatibleBitmap(hdc, 256, 256); #returns non-zero int too
c = CURSORINFO.new(20, 1, 1, POINT.new(1,1));
GetCursorInfo(c);
DrawIcon(hdc, 1, 1, c.hCursor); #returns 1
GetPixel(hdc, 1, 1); #returns -1
Why doesn't GetPixel() return COLORREF? What am I missing?
I'm not very experienced with WinAPI, so I'm probably doing some stupid mistake.
You have to select the bitmap you create into the device context. If not, the GetPixel function will return CLR_INVALID (0xFFFFFFFF):
A bitmap must be selected within the device context, otherwise, CLR_INVALID is returned on all pixels.
Also, the pseudo-code you've shown is leaking objects badly. Whenever you call GetDC, you must call ReleaseDC when you're finished using it. And whenever you create a GDI object, you must destroy it when you're finished using it.
Finally, you appear to be assuming that the coordinates for the point of origin—that is, the upper left point—are (1, 1). They are actually (0, 0).
Here's the code I would write (error checking omitted for brevity):
// Get your device contexts.
HDC hdcScreen = GetDC(NULL);
HDC hdcMem = CreateCompatibleDC(hdcScreen);
// Create the bitmap to use as a canvas.
HBITMAP hbmCanvas = CreateCompatibleBitmap(hdcScreen, 256, 256);
// Select the bitmap into the device context.
HGDIOBJ hbmOld = SelectObject(hdcMem, hbmCanvas);
// Get information about the global cursor.
CURSORINFO ci;
ci.cbSize = sizeof(ci);
GetCursorInfo(&ci);
// Draw the cursor into the canvas.
DrawIcon(hdcMem, 0, 0, ci.hCursor);
// Get the color of the pixel you're interested in.
COLORREF clr = GetPixel(hdcMem, 0, 0);
// Clean up after yourself.
SelectObject(hdcMem, hbmOld);
DeleteObject(hbmCanvas);
DeleteDC(hdcMem);
ReleaseDC(NULL, hdcScreen);
But one final caveat—the DrawIcon function will probably not work as you expect. It is limited to drawing an icon or cursor at the default size. On most systems, that will be 32x32. From the documentation:
DrawIcon draws the icon or cursor using the width and height specified by the system metric values for icons; for more information, see GetSystemMetrics.
Instead, you probably want to use the DrawIconEx function. The following code will draw the cursor at the actual size of the resource:
DrawIconEx(hdcMem, 0, 0, ci.hCursor, 0, 0, 0, NULL, DI_NORMAL);
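Since the original goal was an array of pixels rather than individual GetPixel calls, here is a rough sketch of reading the canvas back in one go with GetDIBits. It reuses hdcMem, hbmCanvas, and hbmOld from the listing above and assumes it runs before the clean-up block; the 256x256 size and the top-down 32-bit layout are just assumptions carried over from the question:
// Sketch: read the whole canvas back as 32bpp pixels after drawing the cursor.
BITMAPINFO bi = {0};
bi.bmiHeader.biSize = sizeof(BITMAPINFOHEADER);
bi.bmiHeader.biWidth = 256;
bi.bmiHeader.biHeight = -256;          // negative height = top-down rows
bi.bmiHeader.biPlanes = 1;
bi.bmiHeader.biBitCount = 32;
bi.bmiHeader.biCompression = BI_RGB;
DWORD *pixels = (DWORD*)malloc(256 * 256 * sizeof(DWORD));
// The bitmap should not be selected into a DC while GetDIBits is called,
// so select the old bitmap back in first.
SelectObject(hdcMem, hbmOld);
GetDIBits(hdcMem, hbmCanvas, 0, 256, pixels, &bi, DIB_RGB_COLORS);
// pixels[y * 256 + x] now holds one 32-bit pixel in BGRA byte order
// (blue in the lowest-addressed byte).
free(pixels);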

LoadImage is working differently based on win color setting

Hi,
I am using LoadImage to load a 24-bit BMP file and then try to get the bitmap info:
hBitmap = (HBITMAP)LoadImage(NULL, "logo.bmp", IMAGE_BITMAP, 0, 0,
                             LR_LOADFROMFILE | LR_DEFAULTSIZE);
GetObject(hBitmap, sizeof(BITMAP), &bm);
When I do this with the Windows display color setting at 32-bit (high color), I get
bmBitsPixel = 32, but if I set the Windows display color setting to 16-bit, I get
bmBitsPixel = 16.
Can anyone please explain what this means?
If I use the following formula to calculate the size of the bitmap, then the size of the bitmap depends on the Windows color setting:
size = bmWidth * bmHeight * bmBitsPixel / 8
Thanks and Regards
An HBITMAP is a device-dependent bitmap: its internal representation depends on the color format of your screen.
Accordingly, if you set your display color format to 32 bits per pixel (bpp), then your bitmap will use 32 bpp. If you switch your color format to 16 bpp, the bitmap will follow and use 16 bpp.
Your formula is correct: you have to take bmBitsPixel into account when computing the bitmap size.
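One caveat worth noting here, as a side remark (assuming hBitmap and bm as in the question): GDI scan lines can be padded, so a safer buffer size comes from the bmWidthBytes field that GetObject fills in:
// Sketch: size the pixel buffer from the BITMAP structure GetObject filled in.
BITMAP bm;
GetObject(hBitmap, sizeof(BITMAP), &bm);
// bmWidthBytes already includes any per-scan-line padding,
// so this avoids rounding bmWidth * bmBitsPixel / 8 by hand.
size_t size = (size_t)bm.bmWidthBytes * (size_t)bm.bmHeight;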
An HBITMAP can be loaded as a device-independent bitmap when using the LoadImage API by specifying the LR_CREATEDIBSECTION flag along with the other flags; without it, Windows converts to a device-dependent bitmap. This will only work when the source image is a 32-BPP bitmap. Lower bit depths (8 BPP, 16 BPP, 24 BPP, etc.) will be loaded with the EXACT bit planes/color depths, which would then have to be converted to the monitor's color depth to actually display them.
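For illustration, a hedged variant of the question's call with the DIB-section flag added (same logo.bmp, nothing else changed):
hBitmap = (HBITMAP)LoadImage(NULL, "logo.bmp", IMAGE_BITMAP, 0, 0,
                             LR_LOADFROMFILE | LR_CREATEDIBSECTION);
GetObject(hBitmap, sizeof(BITMAP), &bm);
// For a DIB section, bm.bmBits points at the pixel data (it is NULL for a
// plain device-dependent bitmap), which is what the code below relies on.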
Because no processing occurs, you may get a 32-BPP BMP that is not pre-multiplied for alpha rendering (the AlphaBlend() function), so you will get color fringing and other unwanted artifacts. For those cases, you need to do the pre-multiply on every pixel. The following is a small snip of code, but it does not do much error checking... you will need to test that the BITMAP is of the correct plane/color depth before allowing this code to execute. There are a few ways to optimize the code below (such as using a lookup table), but this is mainly for explanation purposes.
This code can only work IF the bm.bmBits pointer is not NULL, bm.bmPlanes equals 1, and bmBitsPixel equals 32:
RGBQUAD* lprgbSrc = (RGBQUAD*)bm.bmBits;
if( lprgbSrc )
{
    RGBQUAD* lprgbEnd = (RGBQUAD*)((size_t)lprgbSrc + (size_t)bm.bmHeight * bm.bmWidthBytes);
    while( lprgbSrc != lprgbEnd )
    {
        switch( lprgbSrc->rgbReserved )
        {
        case 255: // Pixel at full opacity - no color shift required...
            break;
        case 0:   // Pixel at full transparency - must go full black
            *(DWORD*)lprgbSrc = 0;
            break;
        // Need to pre-multiply by the alpha (rgbReserved) and
        // divide by 255 to get a correct brightness level for correct
        // rendering of the color when mixed on top of the background
        default:
            lprgbSrc->rgbRed   = ((size_t)lprgbSrc->rgbRed   * (size_t)lprgbSrc->rgbReserved) / 255;
            lprgbSrc->rgbBlue  = ((size_t)lprgbSrc->rgbBlue  * (size_t)lprgbSrc->rgbReserved) / 255;
            lprgbSrc->rgbGreen = ((size_t)lprgbSrc->rgbGreen * (size_t)lprgbSrc->rgbReserved) / 255;
            break;
        }
        lprgbSrc++;
    }
}
Note that certain Windows GDI functions accept non-premultiplied HBITMAP (ImageList for instance) when certain flags are applied.
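For reference, a rough sketch of how the now pre-multiplied bitmap would typically be composed with AlphaBlend (hdcDest, x and y are placeholders; hdcSrc is assumed to be a memory DC with the 32-BPP bitmap selected in; link against msimg32):
// Sketch: per-pixel alpha composition using the premultiplied source.
BLENDFUNCTION bf;
bf.BlendOp = AC_SRC_OVER;
bf.BlendFlags = 0;
bf.SourceConstantAlpha = 255;   // no extra constant fade, per-pixel alpha only
bf.AlphaFormat = AC_SRC_ALPHA;  // source carries a (premultiplied) alpha channel
AlphaBlend(hdcDest, x, y, bm.bmWidth, bm.bmHeight,
           hdcSrc, 0, 0, bm.bmWidth, bm.bmHeight, bf);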
The LoadImage function may also fail because it needs a positive height. Some bitmap images are saved with a negative height value so that the image is stored top-down (the first row is the top of the image). The LoadImage function in VC++ 6.0 MFC was not programmed for negative heights, so it fails and just returns NULL. Just change biHeight in the BITMAPINFOHEADER structure to a positive value. LoadImage will then open just about any bitmap (8-bit, 24-bit or 32-bit) with a positive biHeight.
BITMAPFILEHEADER m_bmfHeader;
BITMAPINFOHEADER m_bi;
HANDLE hFile = CreateFile(image_filename,
                          GENERIC_READ,
                          0,
                          NULL, OPEN_EXISTING,
                          FILE_ATTRIBUTE_NORMAL, NULL);
if(hFile == INVALID_HANDLE_VALUE)
{
    AfxMessageBox("Cannot Open a New File");
    return;
}
DWORD dwBytesWritten = 0;
ReadFile(hFile, (LPSTR)&m_bmfHeader, sizeof(BITMAPFILEHEADER), &dwBytesWritten, NULL);
ReadFile(hFile, (LPSTR)&m_bi, sizeof(BITMAPINFOHEADER), &dwBytesWritten, NULL);
// Note: biSizeImage may be 0 for uncompressed (BI_RGB) bitmaps; in that case it
// would have to be computed from the width, height and bit count instead.
int m_nSizeImage = m_bi.biSizeImage;
BYTE *lpbitmap;
lpbitmap = (BYTE*)malloc(m_nSizeImage);
ReadFile(hFile, (LPSTR)lpbitmap, m_nSizeImage, &dwBytesWritten, NULL);
CloseHandle(hFile);
hFile = CreateFile(image_filename, GENERIC_WRITE, 0, NULL, CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
dwBytesWritten = 0; // reuse the counter for the writes
m_bi.biHeight = (int)fabs((double)m_bi.biHeight); // Height always positive!
WriteFile(hFile, (LPSTR)&m_bmfHeader, sizeof(BITMAPFILEHEADER), &dwBytesWritten, NULL);
WriteFile(hFile, (LPSTR)&m_bi, sizeof(BITMAPINFOHEADER), &dwBytesWritten, NULL);
WriteFile(hFile, (LPSTR)lpbitmap, m_bi.biSizeImage, &dwBytesWritten, NULL);
CloseHandle(hFile);
free(lpbitmap); // Now you can use LoadImage(...)
