I'm trying to get an OpenGL context for the entire screen.
Here's my current code (currently only this file):
#include <stdio.h>
#include <Windows.h>
#include <GL/gl.h>
#include <gl/glu.h>
#include <GL/glext.h>
int main(int argc, char *argv[]) {
    HDC hdc = GetDC(NULL);
    HGLRC hglrc;
    hglrc = wglCreateContext(hdc);
    // Handle errors
    if (hglrc == NULL) {
        DWORD errorCode = GetLastError();
        LPVOID lpMsgBuf;
        FormatMessage(
            FORMAT_MESSAGE_ALLOCATE_BUFFER |
            FORMAT_MESSAGE_FROM_SYSTEM |
            FORMAT_MESSAGE_IGNORE_INSERTS,
            NULL,
            errorCode,
            MAKELANGID(LANG_NEUTRAL, SUBLANG_DEFAULT),
            (LPTSTR)&lpMsgBuf,
            0, NULL);
        printf("Failed with error %lu: %s", errorCode, (char *)lpMsgBuf);
        LocalFree(lpMsgBuf);
        ExitProcess(errorCode);
    }
    wglMakeCurrent(hdc, hglrc);
    printf("%s\n", (const char *)glGetString(GL_VENDOR));
    wglMakeCurrent(NULL, NULL);
    wglDeleteContext(hglrc);
    return 0;
}
The problem is in this code at the start:
HDC hdc = GetDC(NULL);
HGLRC hglrc;
hglrc = wglCreateContext(hdc);
and the program's output (printed in the error handling if statement) is
Failed with error 2000: The pixel format is invalid.
Calling GetDC(NULL) is specified as retrieving the DC of the entire screen, so I'm not sure what is going wrong here. How do I fix this?
EDIT: added more information
You didn't set the pixel format.
Have a look at the documentation here.
You should declare a pixel format descriptor, for example:
PIXELFORMATDESCRIPTOR pfd =
{
    sizeof(PIXELFORMATDESCRIPTOR),
    1,
    PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER,  // Flags
    PFD_TYPE_RGBA,   // The kind of framebuffer: RGBA or palette.
    32,              // Color depth of the framebuffer.
    0, 0, 0, 0, 0, 0,
    0,
    0,
    0,
    0, 0, 0, 0,
    24,              // Number of bits for the depth buffer
    8,               // Number of bits for the stencil buffer
    0,               // Number of aux buffers in the framebuffer.
    PFD_MAIN_PLANE,
    0,
    0, 0, 0
};
Then use the ChoosePixelFormat function to obtain the pixel format number, e.g.:
int iPixelFormat = ChoosePixelFormat(hdc, &pfd);
and finally call the SetPixelFormat function to set the correct pixel format, e.g.:
SetPixelFormat(hdc, iPixelFormat, &pfd);
Only then can you call the wglCreateContext function.
UPDATE
As pointed out by user Chris Becke, one cannot call SetPixelFormat on the screen DC (obtained with GetDC(NULL), as in the OP's code). This is also noted in the Khronos wiki here.
Therefore, you must also create your own window, obtain its DC, and then use that DC to set the pixel format and create the GL context. If you want to render "fullscreen", simply create a borderless window the same size as the screen. I suggest having a look at the answers to this old question here on SO about this matter.
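For illustration, here is a minimal sketch of that approach; it is not a complete program, it assumes the pfd declared above, uses a placeholder window class name, and omits error checking:
// Create a borderless window covering the screen and set the pixel format on ITS DC.
// "GLFullscreen" is just a placeholder class name.
HINSTANCE hInstance = GetModuleHandle(NULL);

WNDCLASS wc = {0};
wc.style = CS_OWNDC;                      // give the window its own DC, as OpenGL expects
wc.lpfnWndProc = DefWindowProc;
wc.hInstance = hInstance;
wc.lpszClassName = TEXT("GLFullscreen");
RegisterClass(&wc);

HWND hwnd = CreateWindow(TEXT("GLFullscreen"), TEXT(""), WS_POPUP | WS_VISIBLE,
                         0, 0, GetSystemMetrics(SM_CXSCREEN), GetSystemMetrics(SM_CYSCREEN),
                         NULL, NULL, hInstance, NULL);

HDC hdc = GetDC(hwnd);                    // the window's DC, not the screen DC

int iPixelFormat = ChoosePixelFormat(hdc, &pfd);   // pfd as declared above
SetPixelFormat(hdc, iPixelFormat, &pfd);

HGLRC hglrc = wglCreateContext(hdc);      // now this should succeed
wglMakeCurrent(hdc, hglrc);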
Related
I'm using some C code to take a screenshot of a window, translated directly from a PowerBuilder sample that works perfectly. Here's the part up to where there's a problem:
extern "C" __declspec(dllexport) BOOL __stdcall WindowScreenShot(const wchar_t* fileName, unsigned long x, unsigned long y,
unsigned long width, unsigned long height)
{
HWND ll_hWnd;
HDC ll_hdc, ll_hdcMem;
HBITMAP ll_hBitmap;
HANDLE hDib = NULL, hFile = NULL;
char* lpBitmap = NULL;
BOOL lb_result, lb_ok = FALSE;
BITMAPINFO lstr_Info;
BITMAPFILEHEADER lstr_Header;
int li_pixels;
DWORD dwBmpSize, dwBytesWritten;
// get handle to windows background
ll_hWnd = GetDesktopWindow();
// Get the device context of window and allocate memory
ll_hdc = GetDC(ll_hWnd);
ll_hdcMem = CreateCompatibleDC(ll_hdc);
ll_hBitmap = CreateCompatibleBitmap(ll_hdc, width, height);
if (ll_hBitmap != 0)
{
// Select an object into the specified device context
SelectObject(ll_hdcMem, ll_hBitmap);
// Copy the bitmap from the source to the destination
lb_result = BitBlt(ll_hdcMem, 0, 0, width, height, ll_hdc, x, y, SRCCOPY);
lstr_Info.bmiHeader.biSize = sizeof(BITMAPINFOHEADER);
// Get the bitmapinfo (THIS LINE IS FAILING CURRENTLY)
if (GetDIBits(ll_hdcMem, ll_hBitmap, 0, height, NULL, &lstr_Info, DIB_RGB_COLORS) > 0)
{ ...
That last call to GetDIBits always fails with the only documented error it can give, ERROR_INVALID_PARAMETER. Since this code is essentially identical to the PowerBuilder code in terms of the structures used and the Windows APIs called, I have no idea how to resolve it. I've also read the API docs carefully, and it all looks like it should work.
Any bright ideas? Thanks.
Paul Ogilvie's answer resolved it - I was using uninitialized structures because they were automatic variables. As soon as I fixed that the code worked.
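For reference, a minimal sketch of that fix, using the same variable names as the snippet above: zero the structure before setting biSize, so GetDIBits does not see stack garbage in the remaining header fields.
BITMAPINFO lstr_Info;
ZeroMemory(&lstr_Info, sizeof(lstr_Info));              // clear the automatic variable
lstr_Info.bmiHeader.biSize = sizeof(BITMAPINFOHEADER);  // GetDIBits requires biSize to be set

// First call with a NULL bits pointer just fills in the BITMAPINFOHEADER.
if (GetDIBits(ll_hdcMem, ll_hBitmap, 0, height, NULL, &lstr_Info, DIB_RGB_COLORS) > 0)
{
    // lstr_Info.bmiHeader now describes the bitmap; allocate a buffer and fetch the pixels next.
}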
I keep getting an error at line 123:
123 C:\Dev-Cpp\Window_main_2.c invalid conversion from `void*' to `HBITMAP__*'
I don't know what to make of this and it's driving me crazy.
void DrawBitmap(HDC hdcDest, char *filename, int x, int y)
{
    HBITMAP image;
    BITMAP bm;
    HDC hdcMem;

    // This is the line that causes the issue (just ask me if more code is required, because
    // there is a lot more). Essentially this whole function points to a file, and I call this
    // function in another function that fills a window with the image at the path below.
    // But I can't get this HBITMAP to agree with the image datatype. Please let me
    // know if more info is required, and thank you. The line is below.
    image = LoadImage(0, "C:\\Users\\Lillian\\Pictures\\c.bmp", IMAGE_BITMAP, 0, 0, LR_LOADFROMFILE);
    GetObject(image, sizeof(BITMAP), &bm);
    hdcMem = CreateCompatibleDC(global_hdc);
    SelectObject(hdcMem, image);
    BitBlt(
        global_hdc,
        x,
        y,
        bm.bmWidth,
        bm.bmHeight,
        hdcMem,
        0,
        0,
        SRCCOPY);
    DeleteDC(hdcMem);
    DeleteObject((HBITMAP)image);
}
LoadImage() returns a HANDLE. You need a cast when assigning the result to your variable:
image = (HBITMAP) LoadImage(0, "C:\\Users\\Lillian\\Pictures\\c.bmp", IMAGE_BITMAP, 0, 0, LR_LOADFROMFILE);
Also, before you call DeleteDC(), you need to select the original HBITMAP back into hdcMem, which means saving the bitmap returned by SelectObject() when you call it earlier.
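For example, a minimal sketch of that cleanup pattern, with the same names as in the question:
// Keep the DC's stock 1x1 bitmap so it can be restored before the DC is destroyed.
HBITMAP hbmOld = (HBITMAP)SelectObject(hdcMem, image);

BitBlt(global_hdc, x, y, bm.bmWidth, bm.bmHeight, hdcMem, 0, 0, SRCCOPY);

SelectObject(hdcMem, hbmOld);   // restore the original bitmap
DeleteDC(hdcMem);
DeleteObject(image);            // safe to delete now that it is no longer selected into a DC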
I am attempting to use GDI+ in my C application to take a screenshot and save it as a JPEG. I am using GDI+ to convert the BMP to JPEG, but apparently when calling the GdiplusStartup function the return code is 2 (InvalidParameter) instead of 0:
int main()
{
    GdiplusStartupInput gdiplusStartupInput;
    ULONG_PTR gdiplusToken;
    //if(GdiplusStartup(&gdiplusToken, &gdiplusStartupInput, NULL) != 0)
    //    printf("GDI NOT WORKING\n");
    printf("%d", GdiplusStartup(&gdiplusToken, &gdiplusStartupInput, NULL));

    HDC hdc = GetDC(NULL);               // get the desktop device context
    HDC hDest = CreateCompatibleDC(hdc); // create a device context to use yourself

    // get the height and width of the screen
    int height = GetSystemMetrics(SM_CYVIRTUALSCREEN);
    int width = GetSystemMetrics(SM_CXVIRTUALSCREEN);

    // create a bitmap
    HBITMAP hbDesktop = CreateCompatibleBitmap(hdc, width, height);

    // use the previously created device context with the bitmap
    SelectObject(hDest, hbDesktop);

    // copy from the desktop device context to the bitmap device context
    // call this once per 'frame'
    BitBlt(hDest, 0, 0, width, height, hdc, 0, 0, SRCCOPY);

    // after the recording is done, release the desktop context you got..
    ReleaseDC(NULL, hdc);

    // ..and delete the context you created
    DeleteDC(hDest);

    SaveJpeg(hbDesktop, "a.jpeg", 100);

    GdiplusShutdown(gdiplusToken);
    return 0;
}
I am trying to figure out why the GdiplusStartup function is not working.
Any thoughts?
Initialize the gdiplusStartupInput variable with the following values: GdiplusVersion = 1, DebugEventCallback = NULL, SuppressBackgroundThread = FALSE, SuppressExternalCodecs = FALSE.
According to the MSDN article on the GdiplusStartup function (http://msdn.microsoft.com/en-us/library/windows/desktop/ms534077%28v=vs.85%29.aspx), the GdiplusStartupInput structure has a default constructor which initializes it with these values. Since you call the function from C, the constructor never runs and the structure remains uninitialized. Provide your own initialization code to solve the problem.
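For example, a minimal sketch of such initialization from C (assuming the GdiplusStartupInput structure and the flat GdiplusStartup function are visible to your C compiler):
GdiplusStartupInput gdiplusStartupInput;
gdiplusStartupInput.GdiplusVersion = 1;                 // must be 1
gdiplusStartupInput.DebugEventCallback = NULL;
gdiplusStartupInput.SuppressBackgroundThread = FALSE;
gdiplusStartupInput.SuppressExternalCodecs = FALSE;

ULONG_PTR gdiplusToken;
if (GdiplusStartup(&gdiplusToken, &gdiplusStartupInput, NULL) != 0)  // 0 == Ok
    printf("GdiplusStartup failed\n");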
// As a global
ULONG_PTR gdiplusToken;

// At the top of main
GdiplusStartupInput gdiplusStartupInput;
GdiplusStartup(&gdiplusToken, &gdiplusStartupInput, NULL);
works for me.
I want to add a new font and draw text on the screen.
But when I draw text using this font, it comes out as Arial.
I don't know the reason.
Here is my code.
Please take a look and help me.
HANDLE hFind;
WIN32_FIND_DATA wfd;
WCHAR szFontPath[MAX_PATH];
int nNum;

swprintf(szFontPath, L"%s\\Fonts\\*.*", m_szAppPath);
hFind = FindFirstFile(szFontPath, &wfd);
if(hFind == INVALID_HANDLE_VALUE)
    return;

do
{
    if(wfd.cFileName[0] == L'.')
        continue;
    swprintf(szFontPath, L"%s\\Fonts\\%s", m_szAppPath, wfd.cFileName);
    nNum = AddFontResource(szFontPath);
}
while(FindNextFile(hFind, &wfd));

PostMessage(HWND_BROADCAST, WM_FONTCHANGE, 0, 0);
FindClose(hFind);
---------------------- In another function --------------------------------
int nHeight;
LPDIRECT3DSURFACE9 pSurface;
HDC hDC;

m_pDevice = pDevice;
m_pDevice->GetBackBuffer(0, 0, D3DBACKBUFFER_TYPE_MONO, &pSurface);
pSurface->GetDC(&hDC);
nHeight = -MulDiv(dwSize, GetDeviceCaps(hDC, LOGPIXELSY), 72);
pSurface->ReleaseDC(hDC);

m_hFont = CreateFont(nHeight, 0, 0, 0, bBold, bItalic, false, false,
                     HANGUL_CHARSET, OUT_DEFAULT_PRECIS, CLIP_DEFAULT_PRECIS, ANTIALIASED_QUALITY,
                     DEFAULT_PITCH/* | FF_DONTCARE*/, L"Helvetica-Condensed-Black-Se");
pSurface->Release();
I'm not sure you are allowed to specify a path for your font. You may need to "install" it by copying it to C:\Windows\Fonts (or the XP equivalent).
It seems this font may not support HANGUL_CHARSET. Try using DEFAULT_CHARSET.
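For example, a minimal sketch of the same CreateFont call with DEFAULT_CHARSET; the face name here is assumed to be the family name actually embedded in the font file (if the name or charset cannot be matched, GDI's font mapper substitutes another font, which may be why you are seeing Arial):
m_hFont = CreateFont(nHeight, 0, 0, 0, bBold, bItalic, FALSE, FALSE,
                     DEFAULT_CHARSET,                    // let the font mapper accept any charset
                     OUT_DEFAULT_PRECIS, CLIP_DEFAULT_PRECIS, ANTIALIASED_QUALITY,
                     DEFAULT_PITCH,
                     L"Helvetica-Condensed-Black-Se");   // assumed to match the font's internal family name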
Hi,
I am using LoadImage to load a 24-bit BMP file and then try to get the bitmap info:
hBitmap = (HBITMAP)LoadImage(NULL, "logo.bmp", IMAGE_BITMAP, 0, 0,
                             LR_LOADFROMFILE | LR_DEFAULTSIZE);
GetObject(hBitmap, sizeof(BITMAP), &bm);
When I do this with the Windows display color setting at 32-bit high color, I get
bmBitsPixel = 32, but if I set the Windows display color setting to 16-bit, I get
bmBitsPixel = 16.
Can anyone please explain what this means?
If I use the following formula to calculate the size of the bitmap, then the size of the bitmap depends on the Windows color setting:
size = bmWidth * bmHeight * bmBitsPixel / 8
Thanks and Regards
An HBITMAP is a device-dependent bitmap: its internal representation depends on the color format of your screen.
Accordingly, if you set your display color format to 32 bits per pixel (bpp), then your bitmap will use 32 bpp. If you switch your color format to 16 bpp, the bitmap will follow and use 16 bpp.
Your formula is correct; you have to take bmBitsPixel into account when computing the bitmap size.
An HBITMAP can be loaded as a device-independent bitmap (DIB) with the LoadImage API by specifying the LR_CREATEDIBSECTION flag along with the other flags; without it, Windows converts the image to a device-dependent bitmap. This will only work when the source image is a 32 BPP bitmap. Lower bit depths (8 BPP, 16 BPP, 24 BPP, etc.) will be loaded with their EXACT bit planes/color depths, which would then have to be converted to the monitor's color depth to actually display them.
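For example, a minimal sketch reusing the file name from the question; for a DIB section, GetObject can also fill a DIBSECTION, whose dsBm.bmBitsPixel reflects the file's bit depth rather than the screen's:
HBITMAP hBitmap = (HBITMAP)LoadImage(NULL, "logo.bmp", IMAGE_BITMAP, 0, 0,
                                     LR_LOADFROMFILE | LR_CREATEDIBSECTION);
DIBSECTION ds;
if (hBitmap && GetObject(hBitmap, sizeof(ds), &ds))
    printf("bmBitsPixel = %d\n", ds.dsBm.bmBitsPixel);   // depth of the DIB, not the display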
Because no processing occurs, you may get a 32 BPP BMP that is not pre-multiplied for alpha rendering (the AlphaBlend() function), so you will get color fringing and other unwanted artifacts. In those cases, you need to do the pre-multiply on every pixel. The following is a small snippet of code; it does not do much error checking, so you will need to verify that the BITMAP has the correct plane/color depth before letting this code execute. There are a few ways to optimize the code below (such as using a lookup table), but it is mainly for explanation purposes.
This code can only work IF the bm.bmBits pointer is not NULL, bm.bmPlanes equals 1, and bmBitsPixel equals 32:
RGBQUAD* lprgbSrc = (RGBQUAD*)bm.bmBits;
if( lprgbSrc )
{
    RGBQUAD* lprgbEnd = (RGBQUAD*)((size_t)lprgbSrc + (size_t)bm.bmHeight * bm.bmWidthBytes);
    while( lprgbSrc != lprgbEnd )
    {
        switch( lprgbSrc->rgbReserved )
        {
        case 255:   // Pixel at full opacity - no color shift required...
            break;
        case 0:     // Pixel at full transparency - must go full black
            *(DWORD*)lprgbSrc = 0;
            break;
        // Need to pre-multiply by the alpha (rgbReserved) and
        // divide by 255 to get a correct brightness level for correct
        // rendering of the color when mixed on top of the background
        default:
            lprgbSrc->rgbRed   = ((size_t)lprgbSrc->rgbRed   * (size_t)lprgbSrc->rgbReserved) / 255;
            lprgbSrc->rgbBlue  = ((size_t)lprgbSrc->rgbBlue  * (size_t)lprgbSrc->rgbReserved) / 255;
            lprgbSrc->rgbGreen = ((size_t)lprgbSrc->rgbGreen * (size_t)lprgbSrc->rgbReserved) / 255;
            break;
        }
        lprgbSrc++;
    }
}
Note that certain Windows GDI functions accept non-premultiplied HBITMAP (ImageList for instance) when certain flags are applied.
The LoadImage function is not working because it needs a positive height. Some bitmap images are saved with a negative height value so that the image is stored top-down (origin at the upper-left corner). The LoadImage function in VC++ 6.0 MFC was not programmed for negative heights, so it fails and just returns NULL. Just change biHeight in the BITMAPINFOHEADER structure to a positive value. LoadImage will then open just about any 8-bit, 24-bit or 32-bit bitmap with a positive biHeight.
BITMAPFILEHEADER m_bmfHeader;
BITMAPINFOHEADER m_bi;

HANDLE hFile = CreateFile(image_filename,
                          GENERIC_READ,
                          0,
                          NULL, OPEN_EXISTING,
                          FILE_ATTRIBUTE_NORMAL, NULL);
if(hFile == INVALID_HANDLE_VALUE)
{
    AfxMessageBox("Cannot Open a New File");
    return;
}

DWORD dwBytesWritten = 0;
ReadFile(hFile, (LPSTR)&m_bmfHeader, sizeof(BITMAPFILEHEADER), &dwBytesWritten, NULL);
ReadFile(hFile, (LPSTR)&m_bi, sizeof(BITMAPINFOHEADER), &dwBytesWritten, NULL);

int m_nSizeImage = m_bi.biSizeImage;
BYTE *lpbitmap = (BYTE*)malloc(m_nSizeImage);
ReadFile(hFile, (LPSTR)lpbitmap, m_nSizeImage, &dwBytesWritten, NULL);
CloseHandle(hFile);

hFile = CreateFile(image_filename, GENERIC_WRITE, 0, NULL, CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
dwBytesWritten = 0;                  // reuse the counter; do not redeclare it
m_bi.biHeight = abs(m_bi.biHeight);  // Height always positive!!!
WriteFile(hFile, (LPSTR)&m_bmfHeader, sizeof(BITMAPFILEHEADER), &dwBytesWritten, NULL);
WriteFile(hFile, (LPSTR)&m_bi, sizeof(BITMAPINFOHEADER), &dwBytesWritten, NULL);
WriteFile(hFile, (LPSTR)lpbitmap, m_bi.biSizeImage, &dwBytesWritten, NULL);
CloseHandle(hFile);

free(lpbitmap);   // Now you can use LoadImage(...)