This question already has answers here:
Segmentation Fault, large arrays
(1 answer)
Getting a stack overflow exception when declaring a large array
(8 answers)
Closed 6 years ago.
This is the part of my code where the segmentation fault occurred:
int main (int argc, char *argv[]) {
    printf ("====================================================\n");
    double pointArray[MAX_NUM_OF_POINTS][DIMENSION];
    double range;
    int num_of_nearest;
    double queryPoint[DIMENSION];
    int counter;
    int dist;
    int num;
    printf ("====================================================\n");
}
where MAX_NUM_OF_POINTS was defined to be 100,000,000.
However, when I changed this number to be smaller like 100,000, the segmentation fault disappeared.
Could anyone tell me the reason?
Local variables are created on the stack, which has a limited amount of space. An array of at least 100,000,000 doubles, each of which is typically 8 bytes, needs upwards of 800 MB; that is far too large for the stack and causes the segfault.
If you declare the array as a global, it will not reside on the stack but in the data segment instead, which can handle much larger objects. Alternatively, you can create the array dynamically using malloc, in which case it lives on the heap.
This however raises the question of why you need an array that large. You may need to rethink your design to see if there is a more memory-efficient way of doing what you want.
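If the array really is needed at that size, a heap-based version might look like the following sketch. MAX_NUM_OF_POINTS is the value from the question; the value of DIMENSION is only an assumption for illustration.

#include <stdio.h>
#include <stdlib.h>

#define MAX_NUM_OF_POINTS 100000000
#define DIMENSION 3               /* assumed value; not given in the question */

int main (int argc, char *argv[]) {
    /* one malloc for the whole 2D array; pointArray[i][j] still works as before */
    double (*pointArray)[DIMENSION] = malloc(MAX_NUM_OF_POINTS * sizeof *pointArray);
    if (pointArray == NULL) {
        fprintf(stderr, "not enough memory for pointArray\n");
        return 1;
    }

    /* ... fill and query pointArray ... */

    free(pointArray);
    return 0;
}

Note that at this size the allocation is roughly 2.4 GB (with DIMENSION == 3), so checking the return value of malloc is essential.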
This question already has answers here:
Why do I get a segfault in C from declaring a large array on the stack?
(5 answers)
Closed 4 years ago.
I am trying to read a data file containing about 10^7 values and perform some calculations on them. I create an array of that size and use fscanf to read the values into its elements. The gist of the program looks like this:
#include <stdio.h>
#include <math.h>

int main()
{
    int L = 10000000;
    float array[L];
    FILE *fp;
    fp = fopen("datafile.txt", "r");
    /*
       reading values into the array from datafile.txt using fscanf
       and doing some operations on array elements
    */
    fclose(fp);
    return 0;
}
But the same program works if I use an L of lesser magnitude, i.e. for L = 10^6 and below it runs fine.
At first I thought my laptop simply had too little main memory (~4 GB), so I tried running the program on high-end machines with 16 GB and 128 GB of main memory, but I got a segmentation fault (core dumped) there as well.
I used gcc to compile the program, and it compiled without any errors or warnings.
gcc my_program.c -lm
./a.out
The output was a segmentation fault, as I mentioned.
You're probably blowing your stack. For anything "big", allocate dynamically using something like calloc:
#include <stdio.h>
#include <stdlib.h>   /* for calloc and free */

int main()
{
    int L = 10000000;
    float *array = calloc(L, sizeof(float));  /* heap allocation; returns NULL on failure */
    FILE *fp;
    fp = fopen("datafile.txt", "r");
    /*
       reading values into the array from datafile.txt using fscanf
       and doing some operations on array elements
    */
    fclose(fp);
    free(array);
    return 0;
}
Local variables are limited in size, so trying to create a local variable that's "too big" will result in unpredictable behaviour or even crashes. The memory remaining for local variables depends on how deeply nested your code is, so it can fluctuate wildly. This is why keeping your local variables to a minimum is important. Pointers and integers are really cheap, but arrays of a consequential size are trouble.
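As a purely hypothetical illustration of that point (the function and the sizes here are made up):

#include <stdlib.h>

void example(void)
{
    float small[1000];                         /* ~4 KB on the stack: fine                 */
    /* float big[10000000]; */                 /* ~40 MB on the stack: likely to crash     */
    float *p = malloc(10000000 * sizeof *p);   /* the pointer costs ~8 bytes of stack;
                                                  the 40 MB buffer lives on the heap
                                                  (check p for NULL before using it)       */
    (void)small;
    free(p);
}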
This question already has answers here:
Segmentation fault on large array sizes
(7 answers)
Closed 3 years ago.
Program with large global array:
int ar[2000000];
int main()
{
}
Program with large local array:
int main()
{
    int ar[2000000];
}
When I declare an array with large size in the main function, the program crashes with "SIGSEGV (Segmentation fault)".
However, when I declare it as global, everything works fine. Why is that?
Declaring the array globally causes the compiler to include the space for the array in the data section of the compiled binary. In this case you have increased the binary size by 8 MB (2000000 * 4 bytes per int). However, this does mean that the memory is available at all times and does not need to be allocated on the stack or heap.
EDIT: @Blue Moon rightly points out that an uninitialized array will most likely be allocated in the bss segment and may, in fact, take up no additional disk space. An explicitly initialized array would be stored in the data segment and would add to the binary size.
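A small sketch of that distinction (exactly where each object lands is toolchain-dependent, so treat this as an illustration rather than a guarantee):

/* bss_vs_data.c */
int zeros[2000000];          /* uninitialized: typically placed in .bss, so it adds
                                essentially nothing to the size of the binary        */
int ones[2000000] = { 1 };   /* explicitly initialized: typically placed in .data,
                                so the binary grows by roughly 8 MB                  */

int main(void)
{
    return zeros[0] + ones[0];
}

Compiling this and running the size utility on the result should show around 8 MB in the data column and another 8 MB in the bss column, with only the former contributing to the file size.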
When you declare an array that large in your program you have probably exceeded the stack size of the program (and ironically caused a stack overflow).
A better way to allocate a large array dynamically is to use a pointer and allocate the memory on the heap like this:
#include <stdlib.h>   /* for malloc and free */

int main() {
    int *ar = malloc(2000000 * sizeof(int));
    if (ar != NULL) {
        // Do something
        free(ar);
    }
    return 0;
}
A good tutorial on the Memory Layout of C Programs can be found here.
This question already has answers here:
Memory allocation for global and local variables
(3 answers)
Segmentation fault on large array sizes
(7 answers)
Closed 7 years ago.
I'm not too experienced in C, but I've recently been rewriting a program in it to speed things up a bit (it was originally written in Python). I don't really have a problem anymore, since I already managed to solve it, but I would like to know why the solution works.
I have a data structure representing complex numbers defined as
typedef struct _fcomplex { float re, im; } fcomplex;
Then I want to create an array of complex numbers as:
fcomplex M[N];
Here N is a large number (something like ~10^6). Then I initialize the array with zeros in a function that essentially runs through all the indices and sets the values in the array. It's something like:
fcomplex num = {0.0, 0.0};
int i;
for (i = 0; i < N; i++) {
    M[i] = num;
}
However, when I run the code, it results in a segmentation fault. But if I instead use malloc() to allocate space for the array, as in
fcomplex* M = malloc(N*sizeof(fcomplex));
and then do everything as before, the code works fine. Also, for smaller values of N, the code runs fine either way.
As I said, using malloc() already solved the problem, but I would like to know why.
It depends on where you allocate the array. If it's inside a function, the variable is allocated on the stack, and by default (I assume you're running Linux) the stack size limit is 8 MB.
You can check this with ulimit -s and also modify the value, for instance with ulimit -s 1000000.
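On POSIX systems the same limit can also be queried from inside a C program with getrlimit; a minimal sketch:

#include <stdio.h>
#include <sys/resource.h>

int main(void)
{
    struct rlimit rl;
    if (getrlimit(RLIMIT_STACK, &rl) == 0) {      /* soft stack limit for this process */
        if (rl.rlim_cur == RLIM_INFINITY)
            printf("stack size: unlimited\n");
        else
            printf("stack size: %llu bytes\n", (unsigned long long)rl.rlim_cur);
    }
    return 0;
}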
You may want to have a look at these questions:
Memory allocation for global and local variables
Segmentation fault on large array sizes (suggested by @Ed)
This question already has answers here:
Getting a stack overflow exception when declaring a large array
(8 answers)
Closed 8 years ago.
Why does int array[1000][1000] cause a memory problem in a C program when it is declared inside main instead of globally?
The stack has a limited size, and consequently can only hold a limited amount of information. If the program tries to put too much information on the stack, stack overflow will result. Stack overflow happens when all the memory in the stack has been allocated.
The program
int main()
{
    int array[1000][1000];
    return 0;
}
tries to allocate a huge array on the stack.
Because the stack is not large enough to handle this array, the array allocation overflows into portions of memory the program is not allowed to use. Consequently, the program crashes.
Further reading: The stack and the heap.
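One minimal fix, shown here only as a sketch, is to give the array static storage duration so that it no longer occupies the stack frame:

int main()
{
    static int array[1000][1000];  /* lives in the bss segment, not on the stack */
    array[0][0] = 42;              /* use it exactly as before                   */
    return 0;
}

Heap allocation with malloc is the other common option, particularly when the dimensions are not known at compile time.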
This question already has answers here:
Segmentation fault on large array sizes
(7 answers)
Closed 9 years ago.
If I keep ROWS at 100000, the program works fine, but if I make it one million (1000000), the program gives me a segmentation fault. What is the reason? I am running the code below on a Linux 2.6.x RHEL kernel.
#include <stdio.h>

#define ROWS 1000000
#define COLS 4

int main(int argc, char **argv)
{
    int matrix[ROWS][COLS];
    for (int col = 0; col < COLS; col++)
        for (int row = 0; row < ROWS; row++)
            matrix[row][col] = row * col;
    return 0;
}
The matrix is a local variable inside your main function. So it is "allocated" on the machine call stack.
This stack has some limits.
You should make your matrix a global or static variable, or make it a pointer and heap-allocate the memory (with e.g. calloc or malloc). Don't forget that calloc or malloc may fail (by returning NULL).
A better reason to heap-allocate such a thing is that the dimensions of the matrix should really be variables or input; there is rarely a good reason to hard-wire them in the source code.
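A sketch of that approach, taking the row count from the command line instead of a compile-time constant (the argument handling here is purely illustrative):

#include <stdio.h>
#include <stdlib.h>

#define COLS 4

int main(int argc, char **argv)
{
    long rows = (argc > 1) ? strtol(argv[1], NULL, 10) : 1000000;
    if (rows <= 0) {
        fprintf(stderr, "invalid row count\n");
        return 1;
    }

    /* calloc zero-fills the memory and, unlike the stack, reports failure via NULL */
    int (*matrix)[COLS] = calloc(rows, sizeof *matrix);
    if (matrix == NULL) {
        fprintf(stderr, "out of memory\n");
        return 1;
    }

    for (long col = 0; col < COLS; col++)
        for (long row = 0; row < rows; row++)
            matrix[row][col] = (int)(row * col);

    free(matrix);
    return 0;
}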
Heuristic: don't let a local frame (the cumulative size of all local variables) grow bigger than a kilobyte or two.
[of course, there are valid exceptions to that heuristic]
You are allocating a stack variable, and the stack of each program is limited.
When you try to allocate too much stack memory, the kernel will kill your program by sending it a SIGSEGV signal, i.e. a segmentation fault.
If you want to allocate bigger chunks of memory, use malloc; this function gets memory from the heap.
Your system evidently does not allow a stack allocation that large. Make matrix global or use dynamic allocation (via malloc and free) and you should be OK.