Calculating stack memory in C

I'm trying to calculate the stack memory used in my program.
Should I add 4 for each integer I have defined?
What about something like char str[128], should I add 128 or 129?
#define ARRAY1_LIMIT 200
#define ARRAY2_LIMIT 100
char* array1[ARRAY1_LIMIT];
char* array2[ARRAY2_LIMIT];
int i = 0;
int j = 0;
array1[i] = (char *)malloc(sizeof(char)*5);
array2[j] = (char *)malloc(sizeof(char)*10);
I know that the heap memory is 5+10 = 15, but I don't know how to calculate the stack memory. Is it 200 + 100?

If you're doing anything more than simply trying to learn about stack usage and how variable allocations affect it, you'll want to use a stack depth analysis tool. Such a tool can help you determine if your program could possibly overflow its stack under any possible sequence of events (excepting unexpected or unbounded recursion). You can write your own (I have, in C#, for embedded programs compiled in C for M16C and MIPS targets using GCC and IAR compilers), but it's really complex and not something for beginners to attempt.
Look for a "stack usage analyzer" or "stack usage analysis tool" for your particular processor and toolchain (e.g. x86/x64/ARM/etc and GCC/VisualStudio/IAR/etc).
If you're using GCC, you may be able to use the -fstack-usage option, but that only gives you the maximum stack usage on a per-function basis. By itself that's not terribly helpful, since to verify that a program won't blow its stack, you have to recursively walk the calltree to see what the maximum stack depth could be at any level of the call tree. If you also use the -Wstack-usage option, you can get a warning if any subprogram's stack usage can possibly exceed a specified stack depth, which is more useful than the information you get with merely the -fstack-usage option.
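For concreteness, here is a minimal sketch of how that might look with GCC (the file name, the 128-byte threshold and the exact .su output are just assumptions for the example; the precise report format varies between GCC versions and targets):
/* stack_demo.c */
#include <string.h>

void fill(void)
{
    char buf[256];              /* 256 bytes of automatic storage */
    memset(buf, 0, sizeof buf); /* use it so it isn't optimized away */
}

int main(void)
{
    fill();
    return 0;
}
Compiling with gcc -c -fstack-usage -Wstack-usage=128 stack_demo.c should produce a stack_demo.su file listing the per-function usage (something along the lines of stack_demo.c:4:6:fill <bytes> static) and warn about fill(), since its frame needs more than the 128-byte threshold. The reported numbers depend on the compiler, target and optimization level.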

If you're trying to work out the amount of space allocated to those variables in a function, use the sizeof() operator.
Notice that the arrays are arrays of char *, which are pointers to characters, not characters themselves.
#include <stdio.h>

int main(void) {
    size_t total = 0;

    size_t first_array_size = sizeof(char *[200]);
    printf("first array: %zu\n", first_array_size);
    total += first_array_size;

    size_t second_array_size = sizeof(char *[100]);
    printf("second array: %zu\n", second_array_size);
    total += second_array_size;

    size_t int_size = sizeof(int);
    printf("int size: %zu * 2 = %zu\n", int_size, int_size * 2);
    total += int_size * 2;

    printf("total=%zu\n", total);
    return 0;
}
Typical output (on a 64-bit architecture):
first array: 1600
second array: 800
int size: 4 * 2 = 8
total=2408
Results may vary.
Footnote: it's also worth understanding that the amount of space allocated to the stack frame may be greater than the sum of the local variables.
For example, arguments passed in are typically copied to the stack, as is the return value, along with a return address representing the execution point to return to after the function call.
There's also the complexity of alignment. For example on many modern machines space may be left between variables to make sure they're aligned. However optimizations may take some of that space back depending on how it re-orders the variables. It's also possible that values (particularly integer values) may be allocated to registers and not take up space on the stack.
Finally an implementation could in principle allocate the arrays on the heap. That's certainly a possible implementation for variable length arrays but I'm not aware of it being strictly disallowed for fixed size arrays.
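If you want to see some of that frame layout yourself, here is a tiny, entirely implementation-dependent sketch: it just prints the addresses of a few locals, and any gaps you observe are an artifact of how your particular compiler and optimization level laid out the frame, not something the language guarantees.
#include <stdio.h>

int main(void) {
    char c;
    int i;
    char str[13];
    double d;

    /* Addresses and the gaps between them depend entirely on the
       compiler, target and optimization level. */
    printf("c   at %p\n", (void *)&c);
    printf("i   at %p\n", (void *)&i);
    printf("str at %p\n", (void *)str);
    printf("d   at %p\n", (void *)&d);
    return 0;
}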

Related

Static Allocation - C language [duplicate]

As far as I know, the C compiler (I am using GCC 6) will scan the code in order to:
Find syntax issues;
Allocate memory for the program (the static allocation concept).
So why does this code work?
#include <stdio.h>

int main(void) {
    int integers_amount; // each int has 4 bytes
    printf("How many integers do you wanna store? \n");
    scanf("%d", &integers_amount);
    int array[integers_amount];
    printf("Size of array: %zu\n", sizeof(array)); // Should be 4 times integers_amount
    for (int i = 0; i < integers_amount; i++) {
        int integer;
        printf("Type the integer: \n");
        scanf("%d", &integer);
        array[i] = integer;
    }
    for (int j = 0; j < integers_amount; j++) {
        printf("Integer typed: %d \n", array[j]);
    }
    return 0;
}
My point is:
How does the C compiler infer the size of the array during compilation time?
I mean, it was declared, but its value has not been provided yet at compilation time. I really believed that the compiler allocated the needed amount of memory (in bytes) at compilation time; that is, as a matter of fact, the concept of static allocation.
From what I could see, the allocation for the variable 'array' is done during runtime, only after the user has informed the 'size' of the array. Is that correct?
I thought that dynamic allocation was used to use the needed memory only (let's say that I declare an integer array of size 10 because I don't know how many values the user will need to hold there, but I ended up only using 7, so I have a waste of 12 bytes).
If during runtime I am given those bytes, I can allocate only the memory needed. However, that doesn't seem to be the case, because from the code we can see that the array is only allocated during runtime anyway.
Can I have some help understanding that?
Thanks in advance.
How does the C compiler infer the size of the array during compilation time?
It's what's called a variable length array, or VLA for short. The size is determined at runtime, but it's a one-off: you cannot resize it afterwards. Some compilers even warn you about the usage of such arrays, as they are stored on the stack, which has a very limited size, so they can potentially cause a stack overflow.
From what I could see, the allocation for the variable 'array' is done during runtime, only after the user has informed the 'size' of the array. Is that correct?
Yes, that is correct. That's why these can be dangerous: the compiler doesn't know the size of the array at compile time, so if it's too large there is nothing it can do to avoid problems. For that reason C++ forbids VLAs.
let's say that I declare an integer array of size 10 because I don't know how many values the user will need to hold there, but I ended up only using 7, so I have a waste of 12 bytes
Contrary to fixed size arrays, a variable length array's size can be determined at runtime, but once it is set you can no longer change it; for that you have dynamic memory allocation (discussed ahead), if you are really set on having exactly the size needed and not one byte more.
Anyway, if you are expecting an outside value to set the size of the array, odds are that it is the size you need; if not, well, there is nothing you can do aside from the mentioned dynamic memory allocation. In any case it's better to have a little wasted space than too little space.
Can I have some help understanding that?
There are three concepts I find relevant to the discussion:
Fixed size arrays, i.e. int array[10]:
Their size is defined at compile time; they cannot be resized and are useful if you already know the size they should have.
Variable length arrays, i.e. int array[size], size being a non-constant variable:
Their size is defined at runtime, but can only be set once; they are useful if the size of the array depends on external values, e.g. a user input or some value retrieved from a file.
Dynamically allocated arrays, i.e. int *array = malloc(sizeof *array * size), size may or may not be a constant:
These are used when your array will need to be resized, or if it's too large to store on the stack, which has limited size. You can change its size at any point in your code using realloc, which may simply resize the array in place or, as @Peter reminded, may allocate a new array and copy the contents of the old one over (see the sketch just below).
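As a rough sketch of that third case (sizes picked arbitrarily for the example, error handling kept minimal):
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    size_t size = 10;
    int *array = malloc(sizeof *array * size);   /* exactly 10 ints on the heap */
    if (array == NULL)
        return 1;

    /* ... later we discover we need room for 7 more ... */
    size_t new_size = size + 7;
    int *tmp = realloc(array, sizeof *tmp * new_size);  /* may move the block */
    if (tmp == NULL) {        /* on failure the old block is still valid */
        free(array);
        return 1;
    }
    array = tmp;

    printf("now holding %zu ints\n", new_size);
    free(array);
    return 0;
}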
Variables defined inside functions, like array in your snippet (main is a function like any other!), have "automatic" storage duration; typically, this translates to them being on the "stack", a universal concept for a first in/last out storage which gets built and unbuilt as functions are entered and exited.
The "stack" simply is an address which keeps track of the current edge of unused storage available for local variables of a function. The compiler emits code for moving it "forward" when a function is entered in order to accommodate the memory needs of local variables and to move it "backward" when the program flow leaves the function (the double quotes are there because the stack may as well grow towards smaller addresses).
Typically these stack adjustments upon entering into and returning from functions are computed at compile time; after all, the local variables are all visible in the program code. But in principle, nothing keeps a program from changing the stack pointer "on the fly". Very early on, Unixes made use of this and provided a function which dynamically allocates space on the stack, called alloca(). The FreeBSD man page says: "The alloca() function appeared in Version 32V AT&T UNIX" (which was released in 1979).
alloca() behaves very much like malloc() except that the storage is released when the current function returns, and that it is subject to the usual stack size restrictions.
So the first part of the answer is that your array does not have static storage duration. The memory where local variables will reside is not known at compile time (for example, a function with local variables in it may or may not be called at all, depending on run-time user input!). If it were, your astonishment would be entirely justified.
The second part of the answer is that array is a variable length array, a fairly new feature of the C programming language which was only added in 1999. It declares an object on the stack whose size is not known until run time (leading to the anti-paradigmatic consequence that sizeof(array) is not a compile time constant!).
One could argue that variable length arrays are only syntactic sugar around an alloca call; but alloca is, although widely available, not part of any standard.
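To make that concrete, here is a small sketch contrasting the two; note that <alloca.h> and alloca() are non-standard (glibc/BSD) and that, exactly like a VLA, the allocation dies with the function and can blow the stack for large n.
#include <alloca.h>   /* non-standard header */
#include <stdio.h>

void demo(int n) {
    int vla[n];                           /* C99 variable length array */
    int *al = alloca(n * sizeof *al);     /* the older, non-standard idiom */

    printf("sizeof vla = %zu (computed at run time!)\n", sizeof vla);
    vla[0] = al[0] = 42;
}   /* both allocations disappear here, when the frame is torn down */

int main(void) {
    demo(8);
    return 0;
}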

Why can't the size of stack be determined at compile time?

I have read that the size of heap and stack cannot be computed at compile time and needs to be evaluated at runtime.
I can think of this code which allocates heap based on user input and needs the runtime:
int size;
scanf("%d", &size);
void *ptr= malloc(size);
But aren't all the stack variables already present in a function? Given their data types (int, char, long, etc.), why can't the compiler calculate the size?
With C99, it is possible to create a variable length array (VLA) on the stack. Those arrays have a dynamic size based on runtime parameters or calculated expressions. In those cases, it is not possible to calculate the stack size until runtime.
For example:
int f(int n) {
    // Size based on input
    int x[n];

    // Dynamic size
    int m = n + 5000;
    int y[m];

    // Use the arrays so the example compiles cleanly
    x[0] = 1;
    y[0] = 2;
    return x[0] + y[0];
}
Needless to say, if the stack allocation of a single function cannot be calculated, it is not possible to calculate the stack size of a complete program.
Stack memory is allocated at the entry of a function. That's why the stack size depends on the sequence of function calls, which is not defined at compile time (e.g. any if or switch can change the function call sequence).
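A tiny sketch of that point: which of the two helpers runs, and therefore how deep the stack gets, is only known at run time (the buffer sizes below are arbitrary for the example):
#include <stdio.h>
#include <string.h>

static void small_frame(void) { char buf[64];   memset(buf, 0, sizeof buf); }
static void big_frame(void)   { char buf[4096]; memset(buf, 0, sizeof buf); }

int main(void) {
    int n;
    if (scanf("%d", &n) != 1)
        return 1;
    if (n > 0)          /* the call sequence depends on the input */
        big_frame();
    else
        small_frame();
    return 0;
}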
Why can't the size of stack be determined at compile time?
But aren't all the stack variables already present in a function?
Two things prevent compile-time stack size computation:
Variable length arrays allow for a run-time determined amount of "stack" memory usage. Non-standard functions like alloca() do so also.
Recursion allows for a run-time determined depth of function calls, and thus a run-time determined amount of memory usage, even if each function's own usage is constant (credit: @Weather Vane; see the sketch below).
Were it not for these two, and maybe others, code could be analysed at compile time to determine stack usage and maximum call depth, and then possibly not even use a "stack" in the classic sense, storing all "stack" memory in a fixed space. Some compilers provide for this.
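For the recursion point, a minimal sketch: the depth, and therefore the total stack consumption, depends on a value that only exists at run time (and an optimizer may even turn this into a loop, which is yet another reason static stack analysis is hard):
#include <stdio.h>

/* Each call adds another frame; the compiler cannot know how many. */
static unsigned long depth(unsigned long n) {
    if (n == 0)
        return 0;
    return 1 + depth(n - 1);
}

int main(void) {
    unsigned long n;
    if (scanf("%lu", &n) != 1)
        return 1;
    printf("recursed %lu times\n", depth(n));
    return 0;
}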
Recursive functions need the stack size to be dynamic, I believe.
Also, some versions of C allow variable length arrays.

How to get maximum possible size of array?

I have this code which segfaults when run.
#include <stdlib.h> // size_t
int main()
{
    size_t s = 0xFFF;
    char a[ s * s ];
    a[ 0 ] = 'a';
    return 0;
}
When I change the value of variable s to 0xFF it works correctly.
How can I determine the maximum possible size of an array in a portable way?
I use 64 bit linux and code is compiled with
gcc file.c
Update:
I do understand that large objects should be allocated with malloc() and small arrays can be allocated locally on the stack. However I want to know exactly how large these small arrays can be. I also want the solution to be portable.
Also, what if I have some other stuff on stack before my array is allocated? In that case, the maximum size of stack can't be used as the size of my array, because the stack will overflow.
The problem is not caused by "the maximum size of arrays" (I don't think such a limit even exists), but by the limit on stack size, as you're declaring the array on the stack.
If you're on Linux, there is getrlimit to query the limit with. You may also find getrusage helpful. However, I believe large items should be allocated on the heap (via something like malloc).
For changing the maximum stack size, see Change stack size for a C++ application in Linux during compilation with GNU compiler
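A minimal sketch of querying the limit with getrlimit (Linux/POSIX; RLIM_INFINITY means the soft limit is unlimited):
#include <stdio.h>
#include <sys/resource.h>

int main(void) {
    struct rlimit rl;
    if (getrlimit(RLIMIT_STACK, &rl) != 0) {
        perror("getrlimit");
        return 1;
    }
    if (rl.rlim_cur == RLIM_INFINITY)
        printf("soft stack limit: unlimited\n");
    else
        printf("soft stack limit: %llu bytes\n",
               (unsigned long long)rl.rlim_cur);
    return 0;
}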
This is not a question of maximum array size; it is a question of maximum stack frame size. The maximum will depend on the OS, the architecture and the maximum stack size. For example, on Linux you can adjust it with setrlimit(RLIMIT_STACK, ...), and on Windows there are several means to control it; see Thread Stack Size.
But none of these really apply. If you want to allocate a large array, allocate it from the heap, not on the stack.
The array lives on the stack. You can check the default stack size with the command ulimit -s (it reports the limit in kilobytes). Say the limit works out to 1024 bytes (in reality it should be some value like 8 MB); in your case s uses 8 bytes, so a can be at most 1016 bytes then.

What maximum size of static arrays are allowed in C?

In my algorithm I now work with static arrays, not dynamic ones. But I sometimes
reach the limit of the stack. Am I right that static arrays are stored on the stack?
Which parameters affect my maximum stack size for one C program?
Are there many system parameters that affect the maximum array size? Does the maximum number of elements depend on the array type? Does it depend on the total system RAM? Or does every C program have a static maximum stack size?
Am I right that static arrays are stored on the stack?
No, static arrays are stored in the static storage area. The automatic ones (i.e. ones declared inside functions, with no static storage specifier) are allocated on the stack.
Which parameters affect my maximum stack size for one C program?
This is system-dependent. On some operating systems you can change stack size programmatically.
Running out of stack space due to automatic storage allocation is a clear sign that you need to reconsider your memory strategy: you should either allocate the buffer in the static storage area if re-entrancy is not an issue, or use dynamic allocation for the largest of your arrays.
Actually, it depends on the C compiler for the platform you use.
As an example, there are even systems which don't have a real stack so recursion won't work.
A static array is compiled as a contiguous memory area accessed through pointers. The pointers might be two or four bytes in size (or maybe even only one on exotic platforms).
There are platforms which use memory pages which have "near" and "far" pointers which differ in size (and speed, of course). So it could be the case that the pointers representing the array and the objects need to fit into the same memory page.
On embedded systems, static data usually is collected in the memory area which will later be represented by the read-only memory. So your array will have to fit in there.
On platforms which run arbitrary applications, RAM is the limiting factor if none of the above applies.
Most of your questions have been answered, but just to give an answer that made my life a lot easier:
Qualitatively, the maximum size of a non-dynamically allocated array depends on the amount of RAM that you have. It also depends on the type of the array: e.g. an int may be 4 bytes while a double may be 8 bytes (sizes are also system dependent), so you will be able to have an array with twice the number of elements if you use int instead of double.
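To put rough numbers on that, assuming you have settled on some stack budget for the array (the 64 KB figure below is an arbitrary assumption, not a real limit):
#include <stdio.h>

int main(void) {
    size_t budget = 64 * 1024;   /* hypothetical stack budget in bytes */
    printf("ints:    %zu elements\n", budget / sizeof(int));
    printf("doubles: %zu elements\n", budget / sizeof(double));
    return 0;
}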
Having said that and keeping in mind that sometimes numbers are indeed important, here is a very noobish code snippet to help you extract the maximum number in your system.
#include <stdio.h>
#include <stdlib.h>
#define UPPER_LIMIT 10000000000000 // a very big number
int main (int argc, const char * argv[])
{
    long int_size = sizeof(int);
    // Use a 64-bit loop counter so the comparison against UPPER_LIMIT is meaningful.
    for (long long i = 1; i < UPPER_LIMIT; i++)
    {
        int c[i];
        for (int j = 0; j < i; j++)
        {
            c[j] = j;
        }
        printf("You can set the array size at %d, which means %ld bytes. \n", c[i-1], int_size * c[i-1]);
    }
    return 0;
}
P.S.: It may take a while until you reach your system's maximum and produce the expected Segmentation Fault, so you may want to change the initial value of i to something closer to your system's RAM, expressed in bytes.

Segmentation fault due to lack of memory in C

This code gives me segmentation fault about 1/2 of the time:
int main(int argc, char **argv) {
    float test[2619560];
    int i;
    for (i = 0; i < 2619560; i++)
        test[i] = 1.0f;
}
I actually need to allocate a much larger array, is there some way of allowing the operating system to allow me get more memory?
I am using Linux Ubuntu 9.10
You are overflowing the default maximum stack size, which is 8 MB.
You can either increase the stack size - eg. for 32 MB:
ulimit -s 32767
... or you can switch to allocation with malloc:
float *test = malloc(2619560 * sizeof test[0]);
Right now you're allocating (or at least trying to) 2619560*sizeof(float) bytes on the stack. At least in most typical cases, the stack can use only a limited amount of memory. You might try defining it static instead:
static float test[2619560];
This gets it out of the stack, so it can typically use any available memory instead. In other functions, defining something as static changes the semantics, but in the case of main it makes little difference (other than the mostly theoretical possibility of a recursive main).
Don't put such a large object on the stack. Instead, consider storing it in the heap, by allocation with malloc() or its friends.
2.6M floats isn't that many, and even on a 32-bit system you should be ok for address space.
If you need to allocate a very large array, be sure to use a 64-bit system (assuming you have enough memory!). 32-bit systems can only address about 3G per process, and even then you can't allocate it all as a single contiguous block.
It is a stack overflow.
You'd better use the malloc function to get memory larger than the stack size, which you can check with ulimit -s.
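Putting that advice together, a minimal heap-based sketch of the original program (same element count, with an error check added):
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    size_t count = 2619560;
    float *test = malloc(count * sizeof *test);   /* heap, not stack */
    if (test == NULL) {
        fprintf(stderr, "out of memory\n");
        return 1;
    }
    for (size_t i = 0; i < count; i++)
        test[i] = 1.0f;
    free(test);
    return 0;
}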
