Request terminal size [C - Linux]

I'm trying to make a nice-looking terminal game, but a lot of the things I'd like to do need a constant screen size. So I need the program to request a certain size every time it is run. Is this possible, and if so, how?

The ncurses library has functionality for handling terminal sizes. Questions about reading terminal dimensions have been answered here and here.

You can only get the existing size; you can't ask for a specific size. This terminal machinery was invented at a time when these sizes were hardcoded to be the physical size of the screen.
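For reading the current size, a minimal sketch using the Linux TIOCGWINSZ ioctl (ncurses' getmaxyx() reports the same values) might look like this:

/* Query the size of the terminal attached to stdout via TIOCGWINSZ (Linux). */
#include <stdio.h>
#include <sys/ioctl.h>
#include <unistd.h>

int main(void) {
    struct winsize ws;
    if (ioctl(STDOUT_FILENO, TIOCGWINSZ, &ws) == -1) {
        perror("ioctl");    /* fails if stdout is not a terminal */
        return 1;
    }
    printf("%d rows x %d columns\n", ws.ws_row, ws.ws_col);
    return 0;
}

Your program can then check the reported size at startup and refuse to run (or adapt) if the terminal is too small.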

Related

How to print in C without jumping to the end of the text?

I'm very new to coding in general, so please excuse any stupid things I say.
I am trying to print (in C) a rather long text using printf(), but since it can't all fit on the screen, it jumps to the end of the text and the beginning is not visible unless you scroll up. Is there an easy way to print the long text but stay at the beginning, letting the user scroll down as they read before they enter the next command?
On Unix (including Linux and Mac), there are built-in command-line programs called more and less that do exactly what you describe. more simply waits for the user to press Enter or Space before showing the next page of output. less is slightly improved in that it allows vi editor keystrokes (such as j and k) to scroll up and down in the output.
more is also available on the Windows command line. You might even be able to find a version of less for Windows.
c:\users\selbie> your_program.exe | more
$> ./your_program | less
As to how to do this programmatically, that's a bit more difficult as it would involve measuring the console width and implementing your own scroll buffers. There might be open-source libraries that provide this functionality, but the console environment already has a solution for apps that produce long output.
Not really, though you may find a reasonable and simple solution is to print only a certain number of lines (say 30), then prompt the user to press Enter before displaying more lines.
You can even find out the current size of the terminal. That's platform-specific; for Linux it's explained here: How to get terminal window width?
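A minimal sketch of that approach, assuming a hardcoded page height of 30 lines (a real pager would query the terminal height instead, e.g. via the TIOCGWINSZ ioctl on Linux):

/* Print long output one page at a time, pausing for Enter between pages. */
#include <stdio.h>

#define PAGE_LINES 30   /* assumed page height; query the terminal in practice */

int main(void) {
    for (int i = 1; i <= 100; i++) {
        printf("line %d of some long output\n", i);
        if (i % PAGE_LINES == 0) {
            fputs("-- More -- (press Enter)", stdout);
            fflush(stdout);
            int c;
            while ((c = getchar()) != '\n' && c != EOF)
                ;       /* discard input until Enter */
        }
    }
    return 0;
}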
Not in a standard way, no.
Your output stream in C is just a stream of characters -- scrolling is handled by your terminal.
Depending on your terminal, it may be possible to control scrolling by outputting special characters, like ANSI escape codes. The ncurses library provides a portable way to manipulate terminals.
However, if you just want a more convenient way to look through your output (or really any command's output), @selbie's answer is the best: use more or less. This avoids any extra complexity in your program.

Using the ALSA API in MATLAB - buffer issue

As part of a lab course, I have to update the simulation of Pulse Code Modulation. The simulation was originally written in 1998 using OSS (Open Sound System) and was never updated thereafter. I have rewritten the entire code and ported it to ALSA.
The code itself is a bit long; that's why I haven't put it here but am providing a link instead.
Now to my issue: whenever I play a vector of arbitrary length containing many samples, I start hearing weird periodic random noises. I have a feeling it's due to a buffer underrun. For a better understanding, I have recorded the output.
I believe it has something to do with the parameters I've set. Even though I tried many combinations, I didn't arrive at a solution.
Just take a look at the period size, buffer size, periods, and the sbplay(..) function. P.S.: My hardware parameters are set such that buffer size = period size * periods.
I hope you can help me somehow! Thanks in advance.
Code
Output WAV
BTW: ALSA: buffer underrun on snd_pcm_writei call
didn't help me much...
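For reference, establishing the buffer size = period size * periods relationship with ALSA's hw-params API looks roughly like the sketch below; the device name, format, rate, and sizes here are illustrative assumptions, not the values from the linked code.

/* Minimal ALSA playback setup where buffer size = period size * periods. */
#include <alsa/asoundlib.h>

int main(void) {
    snd_pcm_t *pcm;
    snd_pcm_hw_params_t *hw;
    snd_pcm_uframes_t period = 1024;             /* frames per period (assumed) */
    unsigned int periods = 4;                    /* periods per buffer (assumed) */
    snd_pcm_uframes_t buffer = period * periods;
    unsigned int rate = 44100;
    int dir = 0;

    if (snd_pcm_open(&pcm, "default", SND_PCM_STREAM_PLAYBACK, 0) < 0)
        return 1;

    snd_pcm_hw_params_alloca(&hw);
    snd_pcm_hw_params_any(pcm, hw);
    snd_pcm_hw_params_set_access(pcm, hw, SND_PCM_ACCESS_RW_INTERLEAVED);
    snd_pcm_hw_params_set_format(pcm, hw, SND_PCM_FORMAT_S16_LE);
    snd_pcm_hw_params_set_rate_near(pcm, hw, &rate, &dir);
    snd_pcm_hw_params_set_channels(pcm, hw, 1);
    snd_pcm_hw_params_set_period_size_near(pcm, hw, &period, &dir);
    snd_pcm_hw_params_set_buffer_size_near(pcm, hw, &buffer);
    if (snd_pcm_hw_params(pcm, hw) < 0)
        return 1;

    /* Playback loop would call snd_pcm_writei(pcm, samples, period);
       on -EPIPE (underrun), call snd_pcm_prepare(pcm) and retry. */

    snd_pcm_close(pcm);
    return 0;
}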
Efe,
Why don't you try the audioplayer/audiorecorder functions in MATLAB? They use ALSA on Linux. If you want greater control over the latency, try the dsp.AudioPlayer/dsp.AudioRecorder System objects.
Dinesh

C ncurses prevent resize

I am starting to learn how to use ncurses right now, and I do some calculations based on the number of lines and columns when the program starts.
It would be too much work for me to do dynamic calculations to manage the display, so I would need to find a way to block resizing of the terminal during execution. Is this possible?
There is certainly no portable or general-purpose way of blocking display-size changes. Specific terminal emulators might offer this feature, but I don't know of any that do. It is generally possible to create a window of fixed size, but the terminal emulator would have to do that; it is invisible to the console code running inside the terminal.
If you find it difficult to respond to dynamic display-size changes, you probably need to restructure your code. Otherwise, you can just ignore the size change, which might make for a confusing experience for your users: depending on the nature of the resize, they will see either a portion of the output or a lot of blank space. (To get the latter effect, you need to avoid relying on automatic line wrapping and scrolling. On the other hand, automatic wrapping and scrolling are often just what you need to make your application window-size-independent.)
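A minimal sketch of the restructuring approach: let ncurses deliver KEY_RESIZE (which it synthesizes from SIGWINCH) and redraw from the current dimensions. The redraw() helper here is just an illustrative name.

/* Redraw the screen from the current terminal size on every resize. */
#include <ncurses.h>

static void redraw(void) {
    int rows, cols;
    getmaxyx(stdscr, rows, cols);   /* query the current size */
    clear();
    mvprintw(0, 0, "terminal is %d columns x %d rows (q quits)", cols, rows);
    refresh();
}

int main(void) {
    initscr();
    cbreak();
    noecho();
    keypad(stdscr, TRUE);
    redraw();
    int ch;
    while ((ch = getch()) != 'q') {
        if (ch == KEY_RESIZE)       /* delivered when the terminal is resized */
            redraw();
    }
    endwin();
    return 0;
}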

Display pixel on screen in C

How would I change a pixel on a display, in C?
Assume NOTHING: I am using a Linux machine from the console to do this. I do not want to use GUI toolkits or frameworks to draw the pixel. I do not want to draw the pixel in a window. I want to draw the pixel directly to the screen.
EDIT: I have a screen. I'm on a laptop running Linux from the console. I'd prefer a solution not using X, as I'd rather learn how X works than how to use X.
If there's more information you need, ask, but don't assume. I'm not trying to build a GUI; the main purpose of blocking assumptions is that I don't want people to assume I'm doing things the long way when in reality I'm just tinkering.
EDIT 2: You may use any X11 related libraries provided that you can explain how they work.
If we really assume nothing, can we even assume that X is running? For that matter, can we even assume that there is a video card? Perhaps Linux is running headless and we're accessing it over a serial console.
If we are allowed to assume a few things, let's assume that Linux has booted with framebuffer support. (It's been a couple of years since I worked with Linux framebuffers; I may get some of the details wrong.) There will be a device created, probably /dev/fb or /dev/fb0. Open that file and start writing RGB values at an offset, and the screen will change, pretty much regardless of anything: text console, graphical console, full-fledged desktop environment, etc. If you want to see if framebuffer support is working, do dd if=/dev/zero of=/dev/fb on the command line, and the display should go all black.
C doesn't have any graphics capabilities - you'd need to use a third-party library for this.
You cannot assume a display in C. There is literally no way to do what you ask.
Edit: Okay, you have a display, but again, there's not a whole lot you can get from there. The point is that there are a TON of competing standards for graphics displays, and while some of them (VGA interfaces, for example) are standardized, a lot of the others (display driver interfaces, for example) are NOT. Much of what X (and other display systems, such as Windows') does is provide specific interface code for talking to display drivers; it abstracts away the complexity of dealing with them. The windowing systems have HUGE libraries of complicated, hardware-specific code for this; the fact that they feel relatively transparent is an indication of just how much work has gone into them over time.
Very primitive and making a lot of assumptions:
fd = open("/dev/fb0", O_RDWR);
lseek(fd, 4 * (640*y + x), SEEK_SET);  /* 4 bytes per pixel at 32bpp */
write(fd, "\377\377\377\377", 4);      /* one white pixel */
In reality, you would use mmap rather than write, and use the appropriate ioctl to query the screen mode rather than assuming 640xHHH at 32bpp. There are also endianness issues, etc.
So in real reality, you might use some sort of library code that handles this kind of thing for you.
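A slightly fuller sketch along those lines, using mmap and the FBIOGET_VSCREENINFO/FBIOGET_FSCREENINFO ioctls, and still assuming a 32bpp mode:

/* Set one pixel via the Linux framebuffer device, assuming 32bpp. */
#include <fcntl.h>
#include <linux/fb.h>
#include <stdint.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void) {
    int fd = open("/dev/fb0", O_RDWR);
    if (fd < 0)
        return 1;

    struct fb_var_screeninfo var;   /* resolution, bits per pixel, ... */
    struct fb_fix_screeninfo fix;   /* line length in bytes, ... */
    if (ioctl(fd, FBIOGET_VSCREENINFO, &var) < 0 ||
        ioctl(fd, FBIOGET_FSCREENINFO, &fix) < 0)
        return 1;

    size_t len = (size_t)fix.line_length * var.yres;
    uint8_t *fb = mmap(NULL, len, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    if (fb == MAP_FAILED)
        return 1;

    unsigned x = 100, y = 100;      /* arbitrary example coordinates */
    if (var.bits_per_pixel == 32 && x < var.xres && y < var.yres) {
        uint32_t *px = (uint32_t *)(fb + (size_t)y * fix.line_length + (size_t)x * 4);
        *px = 0xFFFFFFFF;           /* white */
    }

    munmap(fb, len);
    close(fd);
    return 0;
}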
I suppose you could paint to the terminal program that you are using as your console. All you have to do is figure out which one that is and look it up.
Whoops, I assumed a terminal. :P
I think what you are looking for is information on how to write to the framebuffer. The easiest way would be to use SDL and render to the framebuffer, or else use GTK+ with DirectFB, although that goes against your edict about not using toolkits or frameworks.

What C library allows scaling of ginormous images?

Consider the following file:
-rw-r--r-- 1 user user 470886479 2009-12-15 08:26 the_known_universe.png
How would you scale the image down to a reasonable resolution, using no more than 4GB of RAM?
For example:
$ convert -scale 7666x3833 the_known_universe.png
What C library would handle it?
Thank you!
I believe libpng has a streaming interface. It can be used to read parts of the image at a time; depending on the image file, you might be able to get the lines in order. You could then shrink each line (e.g. for 50% shrinking, shrink the line horizontally and discard every second line) and write it to an output file.
Using libpng in C can take a fair amount of code, but the documentation guides you through it pretty well.
http://www.libpng.org/pub/png/libpng-1.2.5-manual.html#section-3.8
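A rough sketch of that row-at-a-time approach, assuming a non-interlaced 8-bit RGB PNG and leaving out the output-writing side:

/* Stream a PNG one row at a time with libpng, halving it as we go. */
#include <png.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv) {
    if (argc < 2)
        return 1;
    FILE *in = fopen(argv[1], "rb");
    if (!in)
        return 1;

    png_structp png = png_create_read_struct(PNG_LIBPNG_VER_STRING, NULL, NULL, NULL);
    png_infop info = png_create_info_struct(png);
    if (setjmp(png_jmpbuf(png)))        /* libpng reports errors via longjmp */
        return 1;

    png_init_io(png, in);
    png_read_info(png, info);
    png_uint_32 w = png_get_image_width(png, info);
    png_uint_32 h = png_get_image_height(png, info);

    png_bytep row = malloc(png_get_rowbytes(png, info));
    for (png_uint_32 y = 0; y < h; y++) {
        png_read_row(png, row, NULL);   /* only one row in memory at a time */
        if (y % 2 == 0) {
            /* keep every second pixel of every second row (3 bytes per RGB pixel) */
            for (png_uint_32 x = 0; x < w / 2; x++) {
                row[x * 3 + 0] = row[x * 6 + 0];
                row[x * 3 + 1] = row[x * 6 + 1];
                row[x * 3 + 2] = row[x * 6 + 2];
            }
            /* ...write the first w/2 pixels of row to the output image... */
        }
    }

    free(row);
    png_destroy_read_struct(&png, &info, NULL);
    fclose(in);
    return 0;
}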
You could try making a 64-bit build of ImageMagick, or seeing if one exists. My colleague wrote a blog post with a super-simple PNG decoder (it assumes you have zlib or equivalent), so you can see the kind of code you'd need to roll your own.
http://www.atalasoft.com/cs/blogs/stevehawley/archive/2010/02/23/libpng-you-re-doing-it-wrong.aspx
You would need to do the resampling as you read the image in.
I used cximage a few years ago. I think the latest version is at
http://www.xdp.it/cximage.htm
after moving off of CodeProject.
Edit: sorry, it's C++, not C.
You could use an image-processing library that is intended for complex operations on large (and small) images. One example is the IM imaging toolkit. It links well with C (but is implemented at least partly in C++) and has a good binding to Lua. From the Lua binding, it should be easy to experiment.
libvips is comfortable with huge images. It's a streaming image-processing library, so it can read from the source, process, and write to the destination simultaneously and in parallel. It's typically 3x to 5x faster than ImageMagick and needs very little memory.
For example, with the largest PNG I have on my laptop (1.8GB), I can downsize 10x with:
$ vipsheader huge.png
huge.png: 72000x72000 uchar, 3 bands, srgb, pngload
$ ls -l huge.png
-rw-r--r-- 1 john john 1785845477 Feb 19 09:39 huge.png
$ time vips resize huge.png x.png 0.1
real 1m35.279s
user 1m49.178s
sys 0m1.208s
peak RES 230mb
Not fast, but not too shabby either. PNG is rather a slow format; it would be much quicker with TIFF.
libvips is installable by most package managers (e.g. Homebrew on macOS, apt on Debian), there's a Windows binary, and it's free (LGPL). As well as the command line, there are bindings for C, C++, Python, Ruby, Lua, Node, PHP, and others.
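The same kind of downsize through libvips's C API might look roughly like this; sequential access tells libvips to stream the source rather than load it whole:

/* Shrink an image to 10% of its size with libvips, streaming the input. */
#include <vips/vips.h>

int main(int argc, char **argv) {
    if (VIPS_INIT(argv[0]))
        vips_error_exit(NULL);
    if (argc < 3)
        vips_error_exit("usage: %s IN OUT", argv[0]);

    VipsImage *in = vips_image_new_from_file(argv[1],
        "access", VIPS_ACCESS_SEQUENTIAL,   /* stream instead of loading fully */
        NULL);
    if (!in)
        vips_error_exit(NULL);

    VipsImage *out;
    if (vips_resize(in, &out, 0.1, NULL))   /* scale factor 0.1 = 10x smaller */
        vips_error_exit(NULL);

    if (vips_image_write_to_file(out, argv[2], NULL))
        vips_error_exit(NULL);

    g_object_unref(out);
    g_object_unref(in);
    vips_shutdown();
    return 0;
}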
Have you considered exploring pyramid-based images? Imagine a pyramid where the image is divided into multiple layers, each layer at a different resolution. Each layer is split up into tiles.
This way you can display a zoomed-out version of the image, and also a zoomed-in partial view, without having to re-scale.
See the Wikipedia entry.
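To make the layout concrete, here is a small sketch that computes the levels and tile counts of such a pyramid; the 256-pixel tile size is an arbitrary assumption:

/* Enumerate the levels of an image pyramid, halving resolution each time. */
#include <stdio.h>

#define TILE 256    /* assumed tile edge in pixels */

int main(void) {
    int w = 72000, h = 72000;   /* example full-resolution dimensions */
    for (int level = 0; ; level++) {
        int tx = (w + TILE - 1) / TILE;     /* tiles across at this level */
        int ty = (h + TILE - 1) / TILE;     /* tiles down at this level */
        printf("level %d: %d x %d px, %d x %d tiles\n", level, w, h, tx, ty);
        if (w <= TILE && h <= TILE)
            break;                          /* top of the pyramid: one tile */
        w = (w + 1) / 2;                    /* next level is half resolution */
        h = (h + 1) / 2;
    }
    return 0;
}

A viewer then picks the level whose resolution best matches the zoom factor and fetches only the tiles intersecting the viewport.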
One of the original formats was FlashPix, which I wrote a renderer for.
I've also created a new pyramid format, along with a converter and renderer, which was used for a medical application. An actual scanner would produce 90GB+ scans of a slice of an organ for cancer research.
The converter's algorithm was actually pretty tricky to get right, to produce the pyramid images efficiently. Believe it or not, it was Java-based, and it performed much better than you'd think. It used multithreading, and benchmarking showed it was unlikely that a C version would do a whole lot better. This was 6-ish years ago; the original renderer I wrote over 10 years ago.
You don't hear much about pyramid-based images these days, but it's really the only efficient way to produce scaled images on demand without having to generate cached scaled versions.
JPEG 2000 may or may not have an optional pyramid feature as well.
I recall that ImageMagick's supported formats and conversions may include FlashPix.
Googling for "image pyramid" reveals some interesting results. Brings back some memories ;-)
If you can move it to a 64-bit OS, you can open it as a memory-mapped file (or equivalent) and use pretty much any library you want. It won't be fast, and it may require increasing the page/swap file (depending on the OS and what else you want to do with it), but in return you won't be limited to streaming libraries, so you'll be able to do more operations before resorting to resolution reduction or slicing.
