Memory issue with C program

So, I was solving a problem on SPOJ where I have to handle a large number of values, on the order of 2^18. I am on a system with 4 GB (2^32 bytes) of RAM. While allocating memory for my program, I observed something strange. I am using the long long type to store the input in two arrays, sum1[1<<17] and sum2[1<<17], so the total memory taken by the two arrays is 2^17*8 + 2^17*8 = 2^21 bytes. The part of the code I want to refer to is:
#include <stdio.h>

int main()
{
    // some code
    int a = 17, b = 17;
    long long sum1[1 << a], sum2[1 << b];
    // some more code
}
The problem is that whenever a+b >= 34, the program stops working; otherwise it works fine. I guess it is due to the unavailability of that much space. But if I make the two arrays global, like this:
#include <stdio.h>

long long sum1[1 << 18], sum2[1 << 18];

int main()
{
    // some code
}
It works great and no longer cares about the a+b limit; as you can see, it works fine even for 1<<18. So what's happening under the hood?
Regards.

The local variables for your function go on the stack. Your stack size is not large enough to hold your arrays.
Additional info:
You can see your stack size with the following:
ulimit -s

Local variables are usually allocated on the stack, and most systems have a relatively small limit on the size of a stack frame. Large arrays should be either global or allocated dynamically with malloc().
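For example, a minimal sketch of the dynamic-allocation approach (reusing the array names and the 1<<18 size from the question; this is not the poster's actual code):
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    size_t n = 1 << 18;                          /* number of elements per array */
    long long *sum1 = malloc(n * sizeof *sum1);  /* lives on the heap, not the stack */
    long long *sum2 = malloc(n * sizeof *sum2);
    if (sum1 == NULL || sum2 == NULL) {
        fprintf(stderr, "allocation failed\n");
        return 1;
    }
    /* ... use sum1[i] and sum2[i] exactly like the original arrays ... */
    free(sum1);
    free(sum2);
    return 0;
}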

Related

Declare a huge char array but see no RAM usage - it is only used when there are characters inside it [duplicate]

This question already has answers here: Getting a stack overflow exception when declaring a large array (8 answers). Closed 4 years ago.
I have a question about C (memory). This is my source:
#include <unistd.h>

int main() {
    char ___storage___[1073741824];
    sleep(30);
    return 0;
}
RAM usage: 10 bytes
When I run this program I expect it to take 1 GB of my PC's RAM for 30 seconds, but it takes practically nothing! However, if I copy characters into this array like this:
#include <stdio.h>
#include <unistd.h>

int main() {
    char ___storage___[1073741824];
    for (int i = 0; i < 536870912; i++)  // 512 MB of characters!
        ___storage___[i] = 'h';
    sleep(30);
    return 0;
}
RAM usage: 512 MB
When I run this program, it takes 512 MB of my RAM, even though I declared a variable of size 1 GB. Why? And if an array only takes physical RAM once we write something into it, why do we need dynamic allocation at all? We could just declare a huge array and fill it as needed, without any dynamic allocation or reallocation.
Most compilers allocate local variables on the stack, but most operating systems limit the stack to something modest, typically a few megabytes. So let's assume the compiler does some magic to hide this limit, and that the array effectively ends up on the heap rather than the stack.
Long story short, your program doesn't get physical RAM until it actually reads or writes the memory. The first access to an unmapped address causes a page fault, which is handled by a component of the operating system called the virtual memory manager. It then maps RAM into a segment of your address space (program memory) in units called pages. Page sizes vary by processor architecture; on x86-64, for example, the base page size is 4 KB, with larger 2 MB (and even 1 GB) pages available, depending on what the operating system chooses to use.
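A rough illustration of this lazy-commit behaviour (just a sketch: it assumes a Linux-like system where resident memory can be read from /proc/self/statm, and it reuses the 1 GB size from the question):
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Print the resident set size (RSS); Linux-specific, assuming 4 KB pages. */
static void print_rss(const char *label)
{
    FILE *f = fopen("/proc/self/statm", "r");
    long pages_total, pages_resident;
    if (f && fscanf(f, "%ld %ld", &pages_total, &pages_resident) == 2)
        printf("%s: ~%ld KB resident\n", label, pages_resident * 4);
    if (f)
        fclose(f);
}

int main(void)
{
    size_t size = 1073741824;               /* 1 GiB, as in the question */
    char *storage = malloc(size);           /* reserves address space, no physical RAM yet */
    if (storage == NULL)
        return 1;
    print_rss("after malloc (untouched)");  /* stays small: pages not yet faulted in */
    memset(storage, 'h', size / 2);         /* touch 512 MB: pages now get mapped */
    print_rss("after writing 512 MB");      /* grows by roughly 512 MB */
    free(storage);
    return 0;
}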

showing error for large arrays in C

When I run the following code in C, I get the error "xxx has stopped working".
However, when I take array sizes as 1000 instead of 100000 it runs fine. What is the problem and how can I fix it? If there is some memory problem then how can I take input of 100000 numbers in these arrays without exceeding it?
Code I tried:
int main()
{
    int a[100000], l[100000], r[100000], ans[100000], x[100000], y[100000];
    /*
       some code
    */
    return 0;
}
Declare a, l, r, ans, x and y as global variables so that they will be allocated in static storage (the data/BSS segment) instead of on the stack.
int a[100000], l[100000], r[100000], ans[100000], x[100000], y[100000];

int main()
{
    /* ... */
}
The stack is typically a limited resource. Use dynamic allocation (such as malloc) instead.
Most systems limit the stack to something between one and four megabytes. Since your arrays total well over 2 MB, you are most likely going over the stack limit of your system.
In C there are a couple of ways to solve that problem:
Make the arrays global
Make the arrays static
Dynamically allocate the memory for them on the heap (e.g. with malloc and friends), as sketched after this list
Simply make the arrays smaller
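A minimal sketch of the malloc approach (it reuses the array names and sizes from the question; error handling is kept deliberately short):
#include <stdio.h>
#include <stdlib.h>

#define N 100000

int main(void)
{
    /* Allocate each array on the heap instead of the stack. */
    int *a   = malloc(N * sizeof *a);
    int *l   = malloc(N * sizeof *l);
    int *r   = malloc(N * sizeof *r);
    int *ans = malloc(N * sizeof *ans);
    int *x   = malloc(N * sizeof *x);
    int *y   = malloc(N * sizeof *y);
    if (!a || !l || !r || !ans || !x || !y) {
        fprintf(stderr, "out of memory\n");
        return 1;
    }
    /* some code */
    free(a); free(l); free(r); free(ans); free(x); free(y);
    return 0;
}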
Welcome to Stack Overflow ;)
Use dynamic allocation (malloc/free) in order to make use of your machine's full RAM.
Most systems have a limited stack size, and since your arrays are local (automatic) variables they will be allocated on the stack, so you are very likely overflowing the stack. If you need to allocate large arrays, malloc is going to be the better choice.

How to prevent the excessive usage of program-stack memory while initializing arrays?

#include <stdio.h>

int main()
{
    int Testcase;
    scanf("%d", &Testcase);
    while (Testcase--) {
        int a[100000] = {0};
        /* Other statements */
    }
}
In the above program, for every test case, the program allocates 100000*sizeof(int) bytes of memory. But on CodeChef the maximum memory we can use is about 10 MB. So, is there an optimal way to reduce the memory usage?
P.S. I have tried declaring the array as a global variable, but the problem with that is that after every test case the old test case's values interfere with the new test case's values.
Also, I have tried re-initializing the entire array to 0 after every test case using a for loop, but that takes too long and exceeds the time limit, which is 3 seconds.
The problem I'm trying to solve is http://www.codechef.com/MARCH13/problems/FIRESC
Edit: The total allowable memory limit is actually about 10 MB
If you declare the array as a global variable, it will be placed in the .bss section, which again is not ideal. If you wish to allocate a large block of memory, malloc is the preferred way; it allocates the memory in the heap section.
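One way to address the "old values interfere" problem from the question while keeping peak usage to a single array (a sketch only, using calloc rather than the malloc mentioned above, since calloc returns zero-initialized memory):
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int Testcase;
    if (scanf("%d", &Testcase) != 1)
        return 1;
    while (Testcase--) {
        /* calloc hands back zeroed memory, so no manual reset loop is needed
           and values from the previous test case cannot leak into this one. */
        int *a = calloc(100000, sizeof *a);
        if (a == NULL)
            return 1;
        /* Other statements */
        free(a);
    }
    return 0;
}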

Cannot declare array of size 400000 in C

I am trying to do the following:
#include <windows.h>
#include <stdio.h>

#define N 400000

void main() {
    int a[N];
}
I get a stack overflow exception. My computer has 6 GB of main memory, so I can't be using it all up. How do I solve this problem? I am using VS 2008 on Windows 7 and coding in C.
The amount of stack size you're allowed to use is never going to be the full amount of main memory.
There is a linker flag to set the stack size, which defaults to 1 MB. To store 400,000 ints you'll need at least 1.526 MB.
Why not allocate this on the heap instead of the stack?
When you define a variable like that, you're requesting space on the stack. This is the managed section of memory that's used for variables in function calls, but isn't meant to store large amounts of data.
Instead, you'd need to allocate the memory manually, on the heap.
int *a = (int *) malloc(sizeof(int) * N);
This defines a as a pointer to the memory on the heap. This will behave the same as the array, except you will need to manually
free(a);
when you finish using it or you'll create a memory leak.
Automatic variables are allocated on the stack, which is usually limited to about 1 MB. To solve this, allocate the memory on the heap:
int *a = (int*)malloc(sizeof(int) * N);
When you're done with that memory, you can deallocate it:
free(a);
That will return the memory to the system.
You need a stack size larger than 400000*4 = 1,600,000 bytes ≈ 1.6 MB, but the default stack size in Visual Studio is 1 MB. There are two solutions:
1. Change the stack size of your program:
Right-click the project and choose Properties from the menu.
Go to Configuration Properties -> Linker -> Command Line and add this parameter:
/STACK:2000000
2. Use a dynamically allocated array on the heap instead of a local array, as others have said.

Segmentation Fault on Small(ish) 2d array

I keep getting a segmentation fault with the following code. Changing the 4000 to 1000 makes the code run fine. I would think that I have enough memory here... How can I fix this?
#include <stdio.h>
#include <math.h>
#include <stdlib.h>
#include <string.h>

#define MAXLEN 4000

void initialize_mx(float mx[][MAXLEN])
{
    int i, j;
    float c = 0;
    for (i = 0; i < MAXLEN; i++) {
        for (j = 0; j < MAXLEN; j++) mx[i][j] = c;
    }
}

int main(int ac, char *av[])
{
    int i, j;
    float confmx[MAXLEN][MAXLEN];
    initialize_mx(confmx);
    return 0;
}
The problem is you're overflowing the stack.
When a function is called, stack space is allocated for its local variables (confmx in your main, in this case). This space, which is limited by your OS (check ulimit if you're on Linux), can overflow if the local variables are too big.
Basically you can:
Declare confmx as a global variable as cnicutar suggests.
Allocate memory for your array dynamically and pass a pointer to initialize_mx() (a sketch follows below).
EDIT: Just realized you must still allocate memory even when you pass a pointer, so you have those two options :)
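A sketch of option 2, dynamic allocation (it reuses MAXLEN and the function from the question, and allocates the whole matrix as one contiguous heap block):
#include <stdio.h>
#include <stdlib.h>

#define MAXLEN 4000

/* Same signature as in the question: a pointer to rows of MAXLEN floats. */
void initialize_mx(float mx[][MAXLEN])
{
    int i, j;
    for (i = 0; i < MAXLEN; i++)
        for (j = 0; j < MAXLEN; j++)
            mx[i][j] = 0.0f;
}

int main(void)
{
    /* One contiguous heap block of MAXLEN*MAXLEN floats (about 61 MB),
       viewed through a pointer to arrays of MAXLEN floats. */
    float (*confmx)[MAXLEN] = malloc(sizeof(float) * MAXLEN * MAXLEN);
    if (confmx == NULL) {
        fprintf(stderr, "allocation failed\n");
        return 1;
    }
    initialize_mx(confmx);
    free(confmx);
    return 0;
}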
You are using 4000*4000*4 bytes on your stack; if I didn't make any calculation errors, that's about 61 MB, which is a lot. It works with 1000 because in that case you are only using about 4 MB of stack.
4000*4000*sizeof(float) == 64000000 bytes. I suspect your operating system limits the stack size to somewhere between 4 and 64 MB.
As others have noted, small(ish) isn't small for automatic variables, which are allocated on the stack.
Depending on your needs, you could
static float confmx[MAXLEN][MAXLEN];
which would allocate the storage in the BSS segment. You might also want to consider a different storage scheme: often one only needs a sparse matrix, and there are more efficient ways to store and access matrices where most of the cells are zero.
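As a rough illustration of that last point (a hypothetical coordinate-list layout, not something given in the answer above), the non-zero cells can be stored as (row, column, value) triples:
#include <stdio.h>
#include <stdlib.h>

/* One non-zero cell of the matrix. */
struct entry {
    int row;
    int col;
    float value;
};

/* A growable coordinate-list (COO) sparse matrix. */
struct sparse_mx {
    struct entry *cells;
    size_t count;
    size_t capacity;
};

/* Record a non-zero value; zero cells are simply never stored. */
static int sparse_set(struct sparse_mx *m, int row, int col, float value)
{
    if (m->count == m->capacity) {
        size_t new_cap = m->capacity ? m->capacity * 2 : 64;
        struct entry *p = realloc(m->cells, new_cap * sizeof *p);
        if (p == NULL)
            return -1;
        m->cells = p;
        m->capacity = new_cap;
    }
    m->cells[m->count++] = (struct entry){ row, col, value };
    return 0;
}

int main(void)
{
    struct sparse_mx m = { NULL, 0, 0 };
    /* Only the non-zero cells cost memory, not 4000*4000 floats. */
    sparse_set(&m, 10, 20, 1.5f);
    sparse_set(&m, 3999, 0, -2.0f);
    printf("stored %zu non-zero cells\n", m.count);
    free(m.cells);
    return 0;
}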
