I am trying to learn how to use the Cairo 2D drawing library with Xlib surfaces.
I wrote a little test program that allows creating multiple windows. Each window may have a custom paint() function that is called regularly to add some graphics content to the window, or to redraw it completely if desired. There is also an option to define mouse and key listeners. The main routine checks for X events (to delegate them to the mouse and key listeners) and for timeouts that trigger the periodic calls of those paint() functions.
I tried the 1.14.6 version of Cairo (the version currently available as a package in Ubuntu 16.04) and the latest 1.15.12, but the results are the same.
The expected behavior of this demo is to open 3 windows. One will have random rectangles added to it, another random text, and the third random circles.
In addition, clicking in the windows should produce lines (connecting to the mouse click, or random ones), and using the arrow keys should draw a red line in the window with circles.
The circles and text seem to show up regularly, as expected. All three windows should have a white background, but two of them are black. Worst of all, the window with rectangles hardly gets updated (and it does not matter whether it is the first window created or not; it is always the rectangles that do not show up properly).
They are only shown when the focus changes to or from that window; then the remaining rectangles that should have been drawn in the meantime suddenly appear.
I am calling cairo_surface_flush() on the surface of each window after adding any content, but that does not help. I also tried posting XEvents of various kinds (such as focus events) to that window; they arrive, but the rectangles do not show up.
Furthermore, even though drawing lines with the mouse works fine, drawing a line with the arrow keys suffers from the same problem: it is drawn, but not shown properly.
I am obviously wrong in some of my assumptions about what this library can do, but I am not sure where.
It seems that there are two competing versions of the drawing being shown, since it sometimes happens that one or two rectangles, or pieces of the red line, flash briefly. Some kind of strange buffering or caching?
It may just be some bug in my program, I do not know.
Another observation: the black background appears because drawing the white background happens before the window is shown, and thus those cairo_paint calls are somehow discarded. I do not know how to make the window appear earlier; it seems to appear only after some later changes on the screen.
I have been stuck on this for a couple of desperate days; could you help me out, at least in part, please?
The program is here: test_cairo.c
An example screenshot (with a broken red line drawn by keys, and rectangles not showing up properly): test_cairo.png
To compile (on Ubuntu 16.04 or similar system):
gcc -o test_cairo test_cairo.c -I/usr/include/cairo -lX11 -lcairo
X11 does not retain window content for you. When you get an Expose event, you have to repaint the area described by that event completely.
All three windows should have a white background, but two of them are black.
You create your windows with XCreateSimpleWindow, so their background attribute is set to black. The X11 server will fill exposed areas with black for you before sending an Expose event. Since you do not tell Cairo to draw a white background, the black stays.
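One way to avoid the black fill while the program has not repainted yet (a minimal sketch with placeholder names dpy, scr, width and height, not taken from the posted program) is to pass WhitePixel as the background pixel when creating the window, so the server clears exposed areas to white instead of black:

/* Sketch: create the window with a white background pixel. */
Window win = XCreateSimpleWindow(dpy, DefaultRootWindow(dpy),
                                 0, 0, width, height,
                                 0,                      /* border width */
                                 BlackPixel(dpy, scr),   /* border color */
                                 WhitePixel(dpy, scr));  /* background   */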
Try this:
--- test_cairo.c.orig 2018-07-28 09:53:10.000000000 +0200
+++ test_cairo.c 2018-07-29 10:52:43.268867754 +0200
@@ -63,6 +63,7 @@ static gui_mouse_callback mouse_callback
static cairo_t *windows[MAX_GUI_WINDOWS_COUNT];
static cairo_surface_t *surfaces[MAX_GUI_WINDOWS_COUNT];
+static cairo_surface_t *real_surfaces[MAX_GUI_WINDOWS_COUNT];
static Window x11windows[MAX_GUI_WINDOWS_COUNT];
static char *window_names[MAX_GUI_WINDOWS_COUNT];
@@ -79,7 +80,12 @@ long long usec()
void repaint_window(int window_handle)
{
draw_callbacks[window_handle](windows[window_handle]);
- cairo_surface_flush(surfaces[window_handle]);
+
+ cairo_t *cr = cairo_create(real_surfaces[window_handle]);
+ cairo_set_source_surface(cr, surfaces[window_handle], 0, 0);
+ cairo_paint(cr);
+ cairo_destroy(cr);
+ cairo_surface_flush(real_surfaces[window_handle]);
}
int gui_cairo_check_event(int *xclick, int *yclick, int *win)
@@ -149,7 +155,6 @@ void draw_windows_title(int window_handl
sprintf(fullname, "Mikes - %d - [%s]", window_handle, context_names[current_context]);
else
sprintf(fullname, "Mikes - %s - [%s]", window_names[window_handle], context_names[current_context]);
- cairo_surface_flush(surfaces[window_handle]);
XStoreName(dsp, x11windows[window_handle], fullname);
}
@@ -179,20 +184,17 @@ int gui_open_window(gui_draw_callback pa
}
if (window_handle < 0) return -1;
- surfaces[window_handle] = gui_cairo_create_x11_surface(&width, &height, window_handle);
+ real_surfaces[window_handle] = gui_cairo_create_x11_surface(&width, &height, window_handle);
+ surfaces[window_handle] = cairo_surface_create_similar(real_surfaces[window_handle], CAIRO_CONTENT_COLOR, width, height);
windows[window_handle] = cairo_create(surfaces[window_handle]);
mouse_callbacks[window_handle] = 0;
draw_callbacks[window_handle] = paint;
window_update_periods[window_handle] = update_period_in_ms;
window_names[window_handle] = 0;
-
- cairo_surface_flush(surfaces[window_handle]);
cairo_set_source_rgb(windows[window_handle], 1, 1, 1);
cairo_paint(windows[window_handle]);
-
- cairo_surface_flush(surfaces[window_handle]);
draw_callbacks[window_handle](windows[window_handle]);
@@ -201,7 +203,6 @@ int gui_open_window(gui_draw_callback pa
else next_window_update[window_handle] = 0;
draw_windows_title(window_handle);
- cairo_surface_flush(surfaces[window_handle]);
window_in_use[window_handle] = 1;
return window_handle;
@@ -213,6 +214,7 @@ void gui_close_window(int window_handle)
cairo_destroy(windows[window_handle]);
cairo_surface_destroy(surfaces[window_handle]);
+ cairo_surface_destroy(real_surfaces[window_handle]);
window_in_use[window_handle] = 0;
int no_more_windows = 1;
for (int i = 0; i < MAX_GUI_WINDOWS_COUNT; i++)
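With an offscreen surface like the one added above, handling Expose becomes a plain copy: everything drawn so far already lives in surfaces[...], so an exposed window only needs that content blitted back to the screen. A minimal sketch, reusing dsp, surfaces and real_surfaces from the patched program (the helper name and the window-handle lookup are placeholders, not part of the original code):

/* Sketch: copy the offscreen surface of window 'h' back onto the screen. */
static void blit_window(int h)
{
    cairo_t *cr = cairo_create(real_surfaces[h]);
    cairo_set_source_surface(cr, surfaces[h], 0, 0);
    cairo_paint(cr);                        /* copy everything drawn so far */
    cairo_destroy(cr);
    cairo_surface_flush(real_surfaces[h]);  /* push the drawing to the X server */
}

/* In the X event loop, something like:                                     */
/*   if (event.type == Expose && event.xexpose.count == 0)                  */
/*       blit_window(lookup_handle(event.xexpose.window));  // placeholder  */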
I'm investigating the possibilities Processing offers for generative art, and I stumbled upon a problem:
I'd like to generate multiple Bezier curves using a while loop. However, the program skips parts of some curves, while others are drawn properly.
Here's a working example:
void setup() {
  size(1000, 500);
  background(#ffffff);
}

float[] i_x = {1, 1};
float[] i_y = {1, 1};

void draw() {
  while (i_y[0] < height)
  {
    bezier(0, i_y[0], 100, height-100, width - 100, height-100, width, i_y[0]);
    i_y[0] = i_y[0] * 1.1;
  }
  save("bezier.jpg");
}
And here is the output. As you can see, only a few of the curves are drawn in their full shape.
Also, when I draw one of the 'broken' curves outside the loop, it works fine.
I'd appreciate any help. I'm having a good time learning coding concepts through the visual output that Processing provides.
It works as intended. Look what happens when you change the background color (great post, by the way; the working example made it easy to dig in and debug!):
If you look closely, you'll notice that the "inside" of each curve has a fill color; it just happens to be white for now. That's why only the topmost curves are "invisible": you draw them one after the other, starting with the topmost, so every new curve eats the previous one by painting over it, but only "inside" the curve. See what happens when I apply some color to differentiate the fill from the background better:
Now that the problem is obvious, here's the answer: transparency.
while (y < height)
{
  fill(0, 0, 0, 0); // this is the important line, you can keep your algo for the rest
  bezier(0, y, offset, height-offset, width - offset, height-offset, width, y);
  y *= 1.1;
}
Which gives us this result:
Have fun!
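A small aside (an alternative sketch of my own, not part of the original answer): since the real culprit is the white fill, calling noFill() once in setup() achieves the same thing as the transparent fill inside the loop:

void setup() {
  size(1000, 500);
  background(#ffffff);
  noFill();   // curves are drawn as outlines only, so later curves
              // no longer paint over the earlier ones
}

float y = 1;

void draw() {
  while (y < height) {
    bezier(0, y, 100, height - 100, width - 100, height - 100, width, y);
    y *= 1.1;
  }
  noLoop();   // the image is complete after a single frame
}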
I have the strict requirement to have a texture with a resolution of (let's say) 512x512, always (even if the window is bigger; SDL basically scales the texture for me on rendering). This is because it's an emulator of a classic old computer that assumes a fixed texture, and I can't rewrite the code to adapt to multiple texture sizes and/or texture ratios dynamically.
I use SDL_RenderSetLogicalSize() for the purpose I've described above.
Surely, when this is rendered into a window, I can get the mouse coordinates (window relative), and I can "scale" back to the texture position by getting the real window size (since the window can be resized).
However, there is a big problem here. As soon as the window's width:height ratio is not the same as the texture's ratio (for example in full-screen mode, since the aspect ratio of modern displays will not match the ratio I want to use), there are "black bars" at the sides or at the top/bottom. That in itself is nice, since I always want the same fixed texture ratio and SDL does it for me. However, I cannot find a way to ask SDL where exactly my texture is rendered inside the window, given the fixed ratio I forced. I only need the position within the texture, and the exact texture origin is chosen by SDL itself, not by me.
Surely I can write some code to figure out how those "black bars" change the origin of the texture, but I hope there is a simpler and more elegant way to "ask" SDL about this, since it surely already has code that positions my texture somewhere, so I could reuse that information.
My very ugly solution (it can be optimized, and I think the floating-point math can be avoided, but it is a first try):
static void get_mouse_texture_coords ( int x, int y )
{
    int win_x_size, win_y_size;
    SDL_GetWindowSize(sdl_win, &win_x_size, &win_y_size);
    // I don't know if there is a saner way to do this ...
    // But we must figure out where the texture is within the window,
    // which can shift because of the fixed ratio versus the window ratio (especially in full-screen mode)
    double aspect_tex = (double)SCREEN_W / (double)SCREEN_H;
    double aspect_win = (double)win_x_size / (double)win_y_size;
    if (aspect_win >= aspect_tex) {
        // bars at the sides (ratio correction) must be taken into account
        double zoom_factor = (double)win_y_size / (double)SCREEN_H;
        int bar_size = win_x_size - (int)((double)SCREEN_W * zoom_factor);
        mouse_x = (x - bar_size / 2) / zoom_factor;
        mouse_y = y / zoom_factor;
    } else {
        // bars at the top/bottom (ratio correction) must be taken into account
        double zoom_factor = (double)win_x_size / (double)SCREEN_W;
        int bar_size = win_y_size - (int)((double)SCREEN_H * zoom_factor);
        mouse_x = x / zoom_factor;
        mouse_y = (y - bar_size / 2) / zoom_factor;
    }
}
Here SCREEN_W and SCREEN_H are the dimensions of my texture (admittedly misleading names). The input parameters x and y are the window-relative mouse position (as reported by SDL), and mouse_x and mouse_y are the result, the texture-based coordinates. This seems to work nicely. However, is there a saner or better solution?
The code that calls the function above is in my event handler loop (which I call regularly, of course); it looks something like this:
void handle_sdl_events ( void ) {
    SDL_Event event;
    while (SDL_PollEvent(&event)) {
        switch (event.type) {
            case SDL_MOUSEMOTION:
                get_mouse_texture_coords(event.motion.x, event.motion.y);
                break;
            [...]
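Update (a possibility I have not verified, so treat it as an assumption): if a recent enough SDL 2 is available (around 2.0.18 or newer, I believe), SDL_RenderWindowToLogical() seems to do exactly this mapping, from window coordinates into the logical coordinate space set by SDL_RenderSetLogicalSize(), letterboxing included. With the renderer handle (called sdl_ren here as a placeholder), the function above could shrink to roughly:

/* Sketch: let SDL do the window-to-logical mapping (SDL >= 2.0.18 assumed). */
static void get_mouse_texture_coords ( int x, int y )
{
    float lx, ly;
    SDL_RenderWindowToLogical(sdl_ren, x, y, &lx, &ly);
    mouse_x = (int)lx;
    mouse_y = (int)ly;
}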
I have an ncurses program, with multiple child windows acting as columns. Each child window has a fixed width and the height of the parent terminal window.
However, I've found that if the width of the terminal is reduced, one of the windows loses its fixed width and seems to "overflow" its previously set bounds, using the entire remaining width of the terminal.
This is hard to explain, so I made a quick GIF showing the problem:
This is the code I've used for the above:
#include <ncurses.h>

int main() {
    WINDOW * winA, * winB;
    int i = 0, width = 30;

    initscr();
    // Do not echo typed characters
    noecho();
    // Enable keypad
    keypad(stdscr, true);
    // interrupt, quit, suspend, and flow control characters are all passed through uninterpreted
    raw();

    winA = newwin(0, width, 0, 0);
    winB = newwin(0, width, 0, width);

    timeout(50);
    while (getch() != 'q') {
        i = width * getmaxy(stdscr);
        werase(winA);
        werase(winB);
        while (i--) {
            waddstr(winA, "0");
            waddstr(winB, "1");
        }
        wnoutrefresh(stdscr);
        wnoutrefresh(winA);
        wnoutrefresh(winB);
        doupdate();
    }
    endwin();
    return 0;
}
Here is another screenshot showing the issue in my actual program. The terminal on the left is correct, the one on the right shows the result after resizing the window and triggering this issue:
How can I prevent the windows from losing their fixed width when the terminal is resized to a small width?
Rather than using windows for everything, you could use pads. Windows are limited to the screen-size, while pads are not.
When ncurses gets a SIGWINCH, it resizes stdscr and everything contained within stdscr, shrinking those windows as needed to fit in the new screen-size, or (if their margins match the old screen-size), increasing their size. That's automatic. Your program could check for KEY_RESIZE returned by getch and call wresize to change window sizes (and redraw their contents).
If you used pads, those do not get resized (pads are shown through a viewport that the caller can adjust).
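If you do stay with windows, a minimal sketch of the KEY_RESIZE handling mentioned above (reusing the winA, winB and width names from the question; this is an illustration, not a drop-in fix) could look like this:

/* Sketch: restore the fixed column widths after the terminal was resized. */
int ch = getch();
if (ch == KEY_RESIZE) {
    int rows = getmaxy(stdscr);
    wresize(winA, rows, width);   /* force the fixed width back */
    wresize(winB, rows, width);
    mvwin(winB, 0, width);        /* keep the second column at its x offset */
    /* ...then redraw the window contents as usual... */
}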
How do I get a bright white background with black text on it in ncurses, similar to the title bar in nano? Despite following the advice in another question (which deals with getting bright white text on a black background, the opposite of what I want), all I seem to be able to achieve is an ugly beige-colored background.
Images:
GNU nano's titlebar, what I want.
What I get with the program below. (Build with gcc -lncursesw -I/usr/include minimal_example.c)
#include <locale.h>
#include <ncurses.h>

int main() {
    setlocale(LC_ALL, "");
    // Initialize curses library
    initscr();
    // Enable colors
    start_color();
    // Attempt recommendation in https://stackoverflow.com/questions/1896162/how-to-get-a-brightwhite-color-in-ncurses and other places on the web
    use_default_colors();
    // Make the COLOR_PAIR 0xFF refer to a white foreground, black background window.
    // Using -1 will not work in my case, because I want the opposite of the default (black text on white bg), not the default (white text on black bg).
    init_pair(0xFF, COLOR_BLACK, COLOR_WHITE);
    refresh();
    // Get our term height and width.
    int x;
    int y;
    // & not required because this is a macro
    getmaxyx(stdscr, y, x);
    // Create a new window.
    // TODO: Resize the window when the term resizes.
    WINDOW *window = newwin(y, x, 0, 0);
    // Try some other attributes recommended online, no dice. Putting this after the call to wbkgd() just makes the text look strange, does not change the background.
    wattron(window, A_BOLD | A_STANDOUT);
    // Set window color.
    wbkgd(window, COLOR_PAIR(0xff));
    // Draw a nice box around the window.
    box(window, 0, 0);
    // Write some text.
    mvwprintw(window, 1, 1, "背景:不白");
    wrefresh(window);
    // Wait for keypress to exit.
    getch();
    // De-initialize ncurses.
    endwin();
    return 0;
}
I thought that perhaps there was something wrong with my terminal configuration (termite), but I was able to reproduce the problem in xfce4-terminal and xterm, both with their default configurations. The only way I have found to fix this is to set my color7 and color15 to the same color as the foreground, which I obviously do not want to do, because that is non-standard and I want to distribute the larger application this code is used in.
(xfce4-terminal with the bug)
My recommendation is to define the bright colors (9 to 15) if there are at least 16 colors and can_change_color() returns true. Otherwise fall back to non-bright colors:
#include <stdlib.h>
#include <locale.h>
#include <ncurses.h>

#define PAIR_BW 1
#define BRIGHT_WHITE 15

int main(void) {
    int rows, cols;

    setlocale(LC_ALL, "");
    initscr();
    start_color();
    use_default_colors();
    if (can_change_color() && COLORS >= 16)
        init_color(BRIGHT_WHITE, 1000, 1000, 1000);
    if (COLORS >= 16) {
        init_pair(PAIR_BW, COLOR_BLACK, BRIGHT_WHITE);
    } else {
        init_pair(PAIR_BW, COLOR_BLACK, COLOR_WHITE);
    }
    refresh();
    getmaxyx(stdscr, rows, cols);
    WINDOW *window = newwin(rows, cols, 0, 0);
    wbkgd(window, COLOR_PAIR(PAIR_BW));
    box(window, 0, 0);
    mvwprintw(window, 1, 1, "背景:不白");
    wrefresh(window);
    getch();
    endwin();
    return EXIT_SUCCESS;
}
This is tested to work in Gnome Terminal 3.18.3 and XTerm 322, and it should work in all color-capable terminals if using ncursesw (although on some weird ones you might still get the non-bright-white background).
Colors in terminal emulators are a bit of a mess. Your problem is that the gray background really is "white" according to your terminal! Check out this table on Wikipedia. See how gray-looking the "White" row is across all the different terminal emulators? What you really want is "bright white", which is beyond the 8 initial colors. The problem is that, according to curses:
Some terminals support more than the eight (8) “ANSI” colors. There are no standard names for those additional colors.
So you just have to use them by number and hope that everyone's terminal conforms to the tables on Wikipedia (I think most do).
init_pair(0xFF, COLOR_BLACK, COLORS > 15 ? 15 : COLOR_WHITE);
That's all you need, so you can get rid of all that other use_default_colors and A_BOLD stuff.
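Put together, a condensed version of that suggestion might look like this (my own minimal sketch, keeping the pair number 0xFF from the question):

#include <ncurses.h>

int main(void) {
    initscr();
    start_color();
    /* bright white (color 15) when the terminal has it, plain white otherwise */
    init_pair(0xFF, COLOR_BLACK, COLORS > 15 ? 15 : COLOR_WHITE);
    bkgd(COLOR_PAIR(0xFF));
    mvprintw(1, 1, "black on (bright) white");
    refresh();
    getch();
    endwin();
    return 0;
}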
Old answer:
The curs_color manual says
Note that setting an implicit background color via a color pair affects only character cells that a character write operation explicitly touches.
...
The A_BLINK attribute should in theory cause the background to go bright.
Indeed, if you just change the wbkgd call to
wbkgd(window, COLOR_PAIR(0xff) | A_BLINK);
you get a bright white background, but only in areas where text was drawn (including the window border). I'm not sure how to get the same effect over the entire window background, but hopefully this can get you started.
This change (assuming TERM=xterm-256color) does what was asked:
> diff -u foo.c foo2.c
--- foo.c 2017-10-06 15:59:56.000000000 -0400
+++ foo2.c 2017-10-06 16:10:11.893529758 -0400
@@ -7,10 +7,9 @@
initscr();
// Enable colors
start_color();
- // Attempt recommendation in https://stackoverflow.com/questions/1896162/how-to-get-a-brightwhite-color-in-ncurses and other places on the web
- use_default_colors();
- // Make the COLOR_PAIR 0xFF refer to a white foreground, black background window.
- // Using -1 will not work in my case, because I want the opposite of the default (black text on white bg), not the default (white text on black bg).
+ // redefine colors, using the initc capability in TERM=xterm-256color
+ init_color(COLOR_BLACK, 0, 0, 0);
+ init_color(COLOR_WHITE, 999, 999, 999);
init_pair(0xFF, COLOR_BLACK, COLOR_WHITE);
refresh();
// Get our term height and width.
But nano doesn't do that. You might find it helpful to read its source-code, to see that it solves the problem by using the terminal's default colors.
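To illustrate the default-colors idea (a rough sketch of my own, not nano's actual code): after use_default_colors(), the value -1 in init_pair() stands for the terminal's own default foreground or background, and A_REVERSE swaps the two, which is one common way to get a title-bar-style highlight that follows the user's terminal colors:

/* Sketch: a "title bar" using the terminal's default colors, reversed. */
use_default_colors();
init_pair(1, -1, -1);                       /* default fg on default bg */
wbkgd(window, COLOR_PAIR(1) | A_REVERSE);   /* swap fg/bg for the bar   */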
I'm trying to learn X11. It's very hard for me, because I don't have experience with window applications on Linux.
I wrote some simple code, and I can't resolve this problem of the text not being visible.
Everything else probably works; when I tried drawing a rectangle with the DrawRectangle function, it worked.
Here is the code:
#include <stdlib.h>
#include <stdio.h>
#include <string.h>
#include <X11/Xlib.h>

int main()
{
    Display* myDisplay;
    Window myWindow;
    int myScreen;
    GC myGC;
    XEvent myEvent;
    unsigned long black, white;
    char* hello = "Hello world!";
    XFontStruct* myFont;

    if((myDisplay = XOpenDisplay(NULL)) == NULL)
    {
        puts("Error in connecting to X Server!");
        return -1;
    }

    myScreen = DefaultScreen(myDisplay);
    black = BlackPixel(myDisplay, myScreen);
    white = WhitePixel(myDisplay, myScreen);

    myWindow = XCreateSimpleWindow(myDisplay, RootWindow(myDisplay, myScreen), 0, 0, 640, 320, 5, black, white);
    XSelectInput(myDisplay, myWindow, ExposureMask);
    XClearWindow(myDisplay, myWindow);
    XMapWindow(myDisplay, myWindow);

    myGC = XCreateGC(myDisplay, myWindow, 0, 0);
    XSetForeground(myDisplay, myGC, black);
    XSetBackground(myDisplay, myGC, white);

    myFont = XLoadQueryFont(myDisplay, "-Misc-Fixed-Medium-R-Normal--7-70-75-75-C-50-ISO10646-1");
    XSetFont(myDisplay, myGC, myFont->fid);

    while(1)
    {
        XNextEvent(myDisplay, &myEvent);
        if(myEvent.type == Expose)
        {
            XClearWindow(myDisplay, myWindow);
            // HERE I DONT KNOW WHY IT DOESNT WORK!
            XDrawString(myDisplay, myWindow, myGC, 0, 0, hello, strlen(hello));
        }
    }

    XFreeGC(myDisplay, myGC);
    XDestroyWindow(myDisplay, myWindow);
    XCloseDisplay(myDisplay);
    return 0;
}
Thank you for your help!
Your font name argument to XLoadQueryFont is wrong (on my Linux/Debian desktop). Check the correct names with the xlsfonts command (they are all lowercase).
With
myFont = XLoadQueryFont
(myDisplay,
"-misc-fixed-medium-r-normal--9-90-75-75-c-60-iso10646-1");
it should work better. You could also try "lucidasanstypewriter-bold-14".
And most importantly, the coordinates passed to XDrawString are wrong. Remember that they are the coordinates of the baseline of your text, that x=0, y=0 is the top-left corner of the window, and that y grows downwards while x grows to the right. Hence your text is drawn off-window, above its top edge. So y should be positive and larger than the font height.
Try
XDrawString (myDisplay, myWindow, myGC, 15, 20, hello,
strlen (hello));
As I commented, you need to handle a lot more events.
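For illustration, a slightly more complete event loop (my own sketch, reusing the variables from your program; a real client needs even more, e.g. full ICCCM/EWMH handling) would select more event masks, register WM_DELETE_WINDOW, and exit cleanly when the window is closed:

/* Sketch: handle Expose, key presses and the window-manager close button. */
XSelectInput(myDisplay, myWindow, ExposureMask | KeyPressMask | StructureNotifyMask);
Atom wmDelete = XInternAtom(myDisplay, "WM_DELETE_WINDOW", False);
XSetWMProtocols(myDisplay, myWindow, &wmDelete, 1);

int running = 1;
while (running) {
    XNextEvent(myDisplay, &myEvent);
    switch (myEvent.type) {
    case Expose:
        XDrawString(myDisplay, myWindow, myGC, 15, 20, hello, strlen(hello));
        break;
    case KeyPress:            /* quit on any key, just for the demo */
        running = 0;
        break;
    case ClientMessage:       /* the window manager asked us to close */
        if ((Atom)myEvent.xclient.data.l[0] == wmDelete)
            running = 0;
        break;
    }
}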
I don't have experience with window applications on Linux.
And to learn about GUI programming, I strongly recommend first using some toolkit like GTK or Qt or perhaps SDL.
Raw X11 programming is too hard (and by the time you have learned it, it will be obsolete, e.g. replaced by Wayland), in particular because an X11 application needs to be ICCCM & EWMH compliant. Notice that the complete X11 documentation runs to nearly ten thousand pages.
See also https://tronche.com/gui/x/xlib/
BTW, most Linux GUI applications draw a pixmap client-side and send it to the X11 server. Read about compositing window managers. Drawing requests like XDrawString are hardly used anymore in practice. Recent font-related libraries such as libfontconfig and libXft work on the client side.