How to change a console program for Unicode support in Windows? - c

The following program can be compiled with MSVC or MinGW. However, the MinGW version cannot display Unicode correctly. Why, and how can I fix that?
Code:
#include <stdio.h>
#include <stdlib.h>
#include <windows.h>
#include <io.h>
#include <fcntl.h>

int wmain(void)
{
    _setmode(_fileno(stdout), _O_U16TEXT);
    _putws(L"哈哈哈");
    system("pause");
    return 0;
}
Mingw64 Compile Command:
i686-w64-mingw32-gcc -mconsole -municode play.c
(MSVC and MinGW output screenshots omitted.)
Edit:
After some testing, the problem does not seem to be caused by MinGW. If I run the program directly by double-clicking the executable, the Unicode string is not displayed correctly either, even though the code page is the same (437).
It turns out the problem is related to the console font rather than the compiler. See the demo code below for changing the console font.

This is happening because of the missing #define UNICODE and #define _UNICODE. You should try adding them along with the other headers. The _UNICODE symbol is used with headers such as tchar.h to direct standard C functions such as printf() and fopen() to their Unicode versions.
Please note: the -municode option is still required when linking if Unicode mode is used.
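A minimal sketch of that suggestion applied to the program above (the two defines are the only addition here, and they must appear before any headers):

/* Sketch of the suggested fix: define UNICODE/_UNICODE before any headers so
   TCHAR-based Windows and CRT declarations resolve to their wide versions. */
#define UNICODE
#define _UNICODE
#include <stdio.h>
#include <stdlib.h>
#include <windows.h>
#include <io.h>
#include <fcntl.h>

int wmain(void)
{
    _setmode(_fileno(stdout), _O_U16TEXT); /* stdout now expects UTF-16 */
    _putws(L"哈哈哈");
    system("pause");
    return 0;
}

The link step stays the same as before, with -municode still on the command line: i686-w64-mingw32-gcc -mconsole -municode play.c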

After doing some research, it turns out the default console font does not support Chinese glyphs. One can change the console font using the SetCurrentConsoleFontEx function.
Demo Code:
#ifdef _MSC_VER
#define _CRT_SECURE_NO_WARNINGS
#endif
#include <stdio.h>
#include <stdlib.h>
#include <wchar.h>
#include <io.h>
#include <fcntl.h>
#include <windows.h>

#define FF_SIMHEI 54

int main(int argc, char const *argv[])
{
    CONSOLE_FONT_INFOEX cfi = {0};
    cfi.cbSize = sizeof(CONSOLE_FONT_INFOEX);
    cfi.nFont = 0;
    cfi.dwFontSize.X = 8;
    cfi.dwFontSize.Y = 16;
    cfi.FontFamily = FF_SIMHEI;
    cfi.FontWeight = FW_NORMAL;
    wcscpy(cfi.FaceName, L"SimHei");
    SetCurrentConsoleFontEx(GetStdHandle(STD_OUTPUT_HANDLE), FALSE, &cfi);

    /* UTF-8 string */
    SetConsoleOutputCP(CP_UTF8); /* Thanks to Eryk Sun's note: remove this line on Windows 7 or 8 */
    puts(u8"UTF-8你好");

    /* UTF-16 string */
    _setmode(_fileno(stdout), _O_U16TEXT);
    _putws(L"UTF-16你好");

    system("pause");
    return 0;
}

Related

Why does the agwrite function in the cgraph library unexpectedly fail on any config/platform but Win64 release?

I've been trying to get cgraph (https://graphviz.gitlab.io/_pages/pdf/cgraph.pdf) working so I can read and write some graph files. I tried writing some very basic code:
#include <assert.h>
#include <ctype.h>
#include <errno.h>
#include <float.h>
#include <limits.h>
#include <math.h>
#include <memory.h>
#include <stdarg.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <Windows.h>
#include <mysql.h>
#include <graphviz/cgraph.h>

int main() {
    FILE *fp = NULL;
    fp = fopen("test.dot", "w+");
    if (fp == NULL) {
        return -1;
    }
    Agraph_t *g;
    g = agopen("test", Agdirected, NULL);
    Agnode_t *signal1;
    signal1 = agnode(g, "Signal1_ON", TRUE);
    Agnode_t *signal2;
    signal2 = agnode(g, "Signal1_OFF", TRUE);
    Agedge_t *link = agedge(g, signal1, signal2, "link1", TRUE);
    agattr(g, AGEDGE, "label", "transitionlink");
    agwrite(g, fp);
    fclose(fp);
    system("pause");
    return 0;
}
What should happen is that the graph gets written to test.dot. This code works perfectly fine in the Win64 release configuration, but fails on Win64 debug, Win32 debug, and Win32 release. I have double-checked the .lib and .dll file settings in Visual Studio and in the file directories, making sure to copy the release and debug versions for each platform correctly. However, agwrite keeps causing a "Microsoft Visual Studio C Runtime Library has detected a fatal error" crash on Win64 debug, Win32 debug, and Win32 release. The weird thing is that if I change agwrite(g, fp); to agwrite(g, stdout);, the code works on all platforms/configurations. I am so confused why this is happening. Here is the source file which contains the code for agwrite if that helps: https://github.com/ellson/MOTHBALLED-graphviz/blob/master/lib/cgraph/write.c
I cannot debug the issue because the source has been compiled into .dlls and .libs for each platform/configuration.
I appreciate any suggestions/feedback.
Thank you
Edit:
For anyone godly enough to try and get this working on their own system, here are all my binaries, libs, and include files: https://www.dropbox.com/sh/o9tjz7txu4m0k5q/AAAnp8Wu99q9IsFN7kvqZP7Ta?dl=0
Edit 2:
The compiler I am using is MSVC 14 on Windows 10.
I found out that using cgraph directly results in an error when trying to use agwrite(). The solution is to use the GVC abstraction layer, which comes with the Graphviz C API, to do the file I/O. Here is the code that worked:
#include <assert.h>
#include <ctype.h>
#include <errno.h>
#include <float.h>
#include <limits.h>
#include <math.h>
#include <memory.h>
#include <stdarg.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <Windows.h>
#include <mysql.h>
#include <graphviz/gvc.h>

int main() {
    GVC_t *gvc;
    gvc = gvContext();
    Agraph_t *g;
    g = agopen("test", Agdirected, NULL);
    Agnode_t *signal1;
    signal1 = agnode(g, "Signal1_ON", TRUE);
    Agnode_t *signal2;
    signal2 = agnode(g, "Signal1_OFF", TRUE);
    Agedge_t *link = agedge(g, signal1, signal2, "link1", TRUE);
    agattr(g, AGEDGE, "label", "transitionlink");
    gvLayout(gvc, g, "dot");
    gvRenderFilename(gvc, g, "dot", "test.dot");
    gvFreeLayout(gvc, g);
    agclose(g);
    gvFreeContext(gvc);
    system("pause");
    return 0;
}
Edit:
Here is the documentation for GVC: https://graphviz.gitlab.io/_pages/pdf/gvc.3.pdf
The reason for the crash is described on the official Graphviz site:
This usually happens when the Graphviz library is built using one version of the stdio library, and the user’s program is compiled using another. If the FILE structure of stdio is different, the call to agread() will cause a crash. This is mainly a problem on Windows where we just provide a binary release built with one version of Visual Studio and stdio changes depending on the version of Visual Studio. It can also occur if the user tries to use cygwin or something similar which may also use an incompatible stdio.
https://graphviz.org/faq/#FaqAgreadCrash
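One way to sidestep that mismatch without the GVC layer is to hand cgraph a custom I/O discipline, so the FILE* is only ever dereferenced by callbacks compiled with your own C runtime. The sketch below is an untested illustration of the FAQ's point, based on the Agiodisc_t/Agdisc_t declarations in cgraph.h, and is not from the original post:

#include <stdio.h>
#include <graphviz/cgraph.h>

/* Callbacks compiled in the application, so they use the application's stdio. */
static int my_afread(void *chan, char *buf, int bufsize)
{
    return (int)fread(buf, 1, (size_t)bufsize, (FILE *)chan);
}

static int my_putstr(void *chan, const char *str)
{
    return fputs(str, (FILE *)chan);
}

static int my_flush(void *chan)
{
    return fflush((FILE *)chan);
}

int main(void)
{
    Agiodisc_t io = { my_afread, my_putstr, my_flush };
    Agdisc_t disc = { &AgMemDisc, &AgIdDisc, &io }; /* default memory/id disciplines */

    Agraph_t *g = agopen("test", Agdirected, &disc);
    agnode(g, "Signal1_ON", 1); /* 1 = create the node if it does not exist */

    FILE *fp = fopen("test.dot", "w");
    if (fp != NULL) {
        agwrite(g, fp); /* fp is only touched by the callbacks above */
        fclose(fp);
    }
    agclose(g);
    return 0;
}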

How can I print card suit characters in C Win32 console application?

I have seen a few questions on how to print these characters, but none of the methods appear to be working. I suspect it is because I am making a Win32 console application, based on some of the comments I read.
Here is an example of what I have tried in my code currently. It only prints question-mark boxes, or if I change it around I get question marks or random symbols.
I have tried defining these at the top:
#define SPADE '\x06'
#define CLUB '\x05'
#define HEART '\x03'
#define DIAMOND '\x04'
Inside the function, these are some of the things I've tried. I have left S, D, H, C in place in case I can't figure it out.
printf("%lc", SPADE);
//printf("♠");
//printf("S");
printf("%lc", HEART);
//printf("♥");
//printf("H");
printf("%lc", DIAMOND);
//printf("♦");
//printf("D");
printf("%lc", CLUB);
//printf("♣");
//printf("C");
UTF-16 wchar_t and wide-character functions are needed on Windows:
#include <windows.h>
#include <wchar.h>

int main()
{
    DWORD n;
    HANDLE hout = GetStdHandle(STD_OUTPUT_HANDLE);
    const wchar_t *buf = L"♠♥♦♣\n";
    WriteConsoleW(hout, buf, (DWORD)wcslen(buf), &n, 0);
    return 0;
}
The following code will compile with Visual Studio:
#include <stdio.h>
#include <io.h>    // for _setmode
#include <fcntl.h> // for _O_U16TEXT

int main()
{
    _setmode(_fileno(stdout), _O_U16TEXT);
    wprintf(L"♠♥♦♣\n");
    return 0;
}
After setting the mode to UTF-16, you have to call _setmode(_fileno(stdout), _O_TEXT) if you wish to use printf again.
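A minimal sketch of that toggle (an assumed usage example, not part of the original answer):

#include <stdio.h>
#include <wchar.h>
#include <io.h>
#include <fcntl.h>

int main(void)
{
    _setmode(_fileno(stdout), _O_U16TEXT); /* wide-character (UTF-16) output */
    wprintf(L"♠♥♦♣\n");

    _setmode(_fileno(stdout), _O_TEXT);    /* back to narrow text mode for printf */
    printf("done\n");
    return 0;
}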

Setting Console Font in C on Windows 10

I am trying to change the size of console font in my console application during runtime.
Looking into the Windows documentation, I found the following functions:
GetCurrentConsoleFont (and GetCurrentConsoleFontEx)
and SetCurrentConsoleFontEx.
CONSOLE_FONT_INFO prevFontInfo = {sizeof(prevFontInfo)};
GetCurrentConsoleFont(GetStdHandle(STD_OUTPUT_HANDLE), FALSE, &prevFontInfo);

CONSOLE_FONT_INFO fontInfo = {0};
//fontInfo.cbSize = sizeof(prevFontInfo);
fontInfo.dwFontSize.X = 30;
fontInfo.dwFontSize.Y = 70;
//fontInfo.FontWeight = 700;
SetCurrentConsoleFontEx(GetStdHandle(STD_OUTPUT_HANDLE), FALSE, &fontInfo);

... (some irrelevant code here)

SetCurrentConsoleFontEx(GetStdHandle(STD_OUTPUT_HANDLE), FALSE, &prevFontInfo);
Currently I am getting the following warnings:
implicit declaration of function 'GetCurrentConsoleFont'
and
implicit declaration of function 'SetCurrentConsoleFont'
I tried using GetConsoleFontEx (the reason FontWeight and cbSize are commented out) to no avail. I used CONSOLE_FONT_INFOEX as well, which resulted in the following error:
Unknown type name 'CONSOLE_FONT_INFOEX'
I read that it is necessary to use the following:
#define _WIN32_WINNT 0x0500
I also tried several variations of this (0x0502, 0x0600, etc.), but nothing seemed to change the warnings/errors.
I include the following windows headers, and I compile with -lkernel32.
#define _WIN32_WINNT 0x0500
#include <windows.h>
#include <Wincon.h>
#include <stdio.h>
#include <conio.h>
#include <stdlib.h>
#include <time.h>
I wish to change the font size of the Windows console at runtime in C.
How do I do that? Why is this not working?
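A sketch along the lines of the SetCurrentConsoleFontEx demo earlier on this page, assuming the Ex console-font declarations are only available when _WIN32_WINNT targets Vista (0x0600) or later, so the macro must be defined before windows.h is included:

/* Sketch based on the earlier SetCurrentConsoleFontEx demo; the _WIN32_WINNT
 * value is the assumption here: the Ex APIs require a Vista (0x0600) target. */
#define _WIN32_WINNT 0x0600
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HANDLE hout = GetStdHandle(STD_OUTPUT_HANDLE);

    CONSOLE_FONT_INFOEX prev = {0};
    prev.cbSize = sizeof prev;
    GetCurrentConsoleFontEx(hout, FALSE, &prev);   /* remember the current font */

    CONSOLE_FONT_INFOEX cfi = prev;
    cfi.dwFontSize.X = 30;
    cfi.dwFontSize.Y = 70;
    SetCurrentConsoleFontEx(hout, FALSE, &cfi);    /* enlarge the font */

    printf("Large font... press Enter to restore.\n");
    getchar();

    SetCurrentConsoleFontEx(hout, FALSE, &prev);   /* restore the previous font */
    return 0;
}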

Printing a unicode box in C

I'm trying to print this medium shade Unicode box in C: ▒
(I'm doing the exercises in K&R and got sidetracked on the one about making a histogram...). I know my Unix terminal (Mac OS X) can display the box, because I saved a text file containing the box, ran cat textfilewithblock, and it printed the block.
So far I initially tried:
#include <stdio.h>
#include <wchar.h>

int main(){
    wprintf(L"▒\n");
    return 0;
}
and nothing printed
iMac-2$ ./a.out
iMac-2:clang vik$
I did a search and found this: unicode hello world for C?
And it seems like I still have to set a locale (even though the execution environment is UTF-8? I'm still trying to figure out why this step is necessary). But anyway, it works! (After a bit of a struggle, finally realizing that the proper string was en_US.UTF-8 rather than en_US.utf8, which I had read somewhere...)
#include <stdio.h>
#include <wchar.h>
#include <locale.h>

int main(){
    setlocale(LC_ALL, "en_US.UTF-8");
    wprintf(L"▒\n");
    return 0;
}
Output is as follows:
iMac-2$ ./a.out
▒
iMac-2$
But when I try the following code, putting in the UTF-8 hex (which I got from here: http://www.utf8-chartable.de/unicode-utf8-table.pl?start=9472&unicodeinhtml=dec), which is 0xe29692 for the box, rather than pasting in the box itself, it doesn't work.
#include <stdio.h>
#include <wchar.h>
#include <locale.h>

int main(){
    setlocale(LC_ALL, "en_US.UTF-8");
    wchar_t box = 0xe29692;
    wprintf(L"%lc\n", box);
    return 0;
}
I'm clearly missing something but can't quite figure out what it is.
The Unicode value of the MEDIUM SHADE code point is not 0xe29692; it is 0x2592. <E2><96><92> is the 3-byte UTF-8 encoding of this code point.
You can print this thing either using the wide char APIs:
#include <stdio.h>
#include <wchar.h>
#include <locale.h>

int main(void) {
    setlocale(LC_ALL, "en_US.UTF-8");
    wchar_t box = 0x2592;
    wprintf(L"%lc\n", box); // or simply printf("%lc\n", box);
    return 0;
}
Or simply by printing the UTF-8 encoding directly:
#include <stdio.h>

int main(void) {
    printf("\xE2\x96\x92\n");
    return 0;
}
Or if your text editor encodes the source file in UTF-8:
#include <stdio.h>

int main(void) {
    printf("▒\n");
    return 0;
}
But be aware that this will not work: putchar('▒');
Also, for full Unicode support and a few more goodies, I recommend using iTerm2 on macOS.
The box character is U+2592, which translates to 0xE2 0x96 0x92 in UTF-8. This adaptation of your third program mostly works for me:
#include <stdio.h>
#include <wchar.h>
#include <locale.h>

int main(void)
{
    setlocale(LC_ALL, "en_US.UTF-8");
    wchar_t box = 0xe29692;
    wprintf(L"%lc\n", box);
    wprintf(L"\n\nX\n\n");
    box = L'\u2592'; // 0xE2 0x96 0x92 = U+2592
    wprintf(L"%lc\n", box);
    wprintf(L"\n\n0x%.8X\n\n", box);
    box = 0x2592;
    wprintf(L"%lc\n", box);
    return 0;
}
The output I get is:
X
▒
0x00002592
▒
The first print operation produces nothing of use; the others work.
Tested on Mac OS X 10.10.5. I happen to be compiling with GCC 5.3.0 (which I compiled myself), but I got the same output with Xcode 7.0.2 and clang.

Code compiles on AIX 5.3 but not AIX 7.1 something to do with struct shl_descriptor where is this defined?

I have some code that looks similar to the following:
#include <stdio.h>
#include <sys/types.h>
#include <sys/stat.h>
#include <sys/errno.h>
#include <fcntl.h>
#include <sys/types.h>
#include <sys/stat.h>
#include <errno.h>
#include <stdarg.h>
#include <sys/ldr.h>

int main (int argc, char **argv)
{
    int liRC = 0;
    struct shl_descriptor *lstModDesc;
    int liEach;
    char lsBaseName[513];
    char *lsTheName;

    for( liEach = 0; liRC == 0; liEach++ )
    {
        liRC = shl_get( liEach, &lstModDesc );
        if( liRC == 0 )
        {
            strcpy( lsBaseName, lstModDesc->filename );
            lsTheName = (char *)basename( lsBaseName );
            /* do more stuff */
        }
    }
    return 0;
}
What it is doing is enumerating all the shared libraries attached to the binary. This compiles fine on AIX 5.3, but on AIX 7.1 I get the following error concerning lstModDesc:
"modulename.c", line 2553.30: 1506-285
(S) The indirection operator cannot be
a pplied to a pointer to an incomplete
struct or union.
I cannot find where shl_get is defined on my AIX 5.3 box, nor can I find where struct shl_descriptor is defined. I am stumped. I even tried examining the preprocessed output with the compiler's -E flag, with no luck, and I did a recursive grep in /usr/include. Is there somewhere else I should be searching? Where are those definitions?
Are you sure that bit of the code was included in the compilation on AIX 5.3? I just went Google-whacking with 'site:ibm.com shl_descriptor' and there is precisely one item found:
http://www-01.ibm.com/support/docview.wss?uid=swg21212239
It is pointing to a problem on HP-UX with WAS (WebSphere Application Server). There is sample code which uses <dl.h> (the dynamic loader) and shows shl_descriptor, shl_gethandle(), and shl_load().
Given the complete absence of hits for anything on AIX and the presence of the HP-UX platform, you have a slightly different problem to resolve. The question is:
Why did the conditional compilation on AIX 5.3 exclude the section that uses shl_descriptor, while on AIX 7.1 it does not? Look at the conditions wrapped around that code in the #ifdef line and see what is used to trigger the HP-UX-only compilation on AIX 5.3, as in the sketch below.
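A small illustration of the kind of guard being described (not from the original answer), assuming the usual predefined macros __hpux and _AIX:

#include <stdio.h>

/* The shl_* loader interface is HP-UX specific, so code using it is normally
 * fenced off from other platforms; an AIX build would take a different path. */
#if defined(__hpux)
#include <dl.h>       /* shl_get() and struct shl_descriptor live here on HP-UX */
#elif defined(_AIX)
#include <sys/ldr.h>  /* AIX enumerates loaded modules with loadquery() instead */
#endif

int main(void)
{
#if defined(__hpux)
    puts("HP-UX build: shl_* interface available");
#elif defined(_AIX)
    puts("AIX build: shl_* excluded; use loadquery(L_GETINFO, ...)");
#else
    puts("other platform");
#endif
    return 0;
}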
