Reason why this happens in C [duplicate]

I have this code in C, which reads in a bunch of chars:
#include <stdio.h>

#define NEWLINE '\n'

int main()
{
    char c;
    char str[6];
    int i = 0;

    while ((c = getchar()) != NEWLINE)
    {
        str[i] = c;
        ++i;
        printf("%d\n", i);
    }
    return 0;
}
Input is: testtesttest
Output:
1
2
3
4
5
6
7
8
117
118
119
120
My questions are:
Why don't I get an out-of-bounds error (a segmentation fault), although I clearly exceed the capacity of the array?
Why do the numbers in the output suddenly jump to very big numbers?
I tried this in C++ and got the same behavior. Could anyone please explain the reason for this?

C doesn't check array boundaries. A segmentation fault will only occur if you try to dereference a pointer to memory that your program doesn't have permission to access. Simply going past the end of an array is unlikely to cause that behaviour. Undefined behaviour is just that - undefined. It may appear to work just fine, but you shouldn't be relying on its safety.
Your program causes undefined behaviour by accessing memory past the end of the array. In this case, it looks like one of your str[i] = c writes overwrote the value of i itself: the jump from 8 to 117 is exactly what you would expect if str[8] landed on the low byte of i on a little-endian machine, since 't' is 116 in ASCII and the following ++i produces 117.
C++ has the same rules as C does in this case.
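You can check whether that layout explanation holds on your machine with a small diagnostic sketch that just prints where the two variables live. This is only a sketch; the actual layout is compiler- and ABI-dependent, and the variables may not be adjacent at all:

#include <stdint.h>
#include <stdio.h>

int main(void)
{
    char str[6];
    int i = 0;

    /* If i sits a few bytes past the start of str, an out-of-bounds
       write such as str[8] = 't' can land directly on a byte of i,
       which would explain the jump from 8 to 117. */
    printf("str starts at %p\n", (void *)&str[0]);
    printf("i lives at    %p\n", (void *)&i);
    printf("offset of i from str: %ld bytes\n",
           (long)((uintptr_t)&i - (uintptr_t)&str[0]));
    return 0;
}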

When you access an array index, C and C++ don't do bounds checking. Segmentation faults only happen when you try to read or write to a page that was not allocated to your process (or when you try to do something the page's permissions forbid, e.g. writing to a read-only page). Since pages are usually pretty big (multiples of a few kilobytes; 4 KB is a common page size), that often leaves you with lots of room to overflow before you leave your own pages.
If your array is on the stack (like yours), it can be even worse as the stack is usually pretty large (up to several megabytes). This is also the cause of security concerns: writing past the bounds of an array on the stack may overwrite the return address of the function and lead to arbitrary code execution (the famous "buffer overflow" security breaches).
The values you get when you read are just what happens to exist at this particular place. They are completely undefined.
If you use C++ (and are lucky enough to be on C++11 or later), the standard defines the std::array<T, N> type, which is an array that knows its bounds. Its at method will throw a std::out_of_range exception if you try to access past the end of it.
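Plain C has no equivalent of at, but the same idea is easy to sketch by hand. Here checked_get is a made-up helper, not a standard function; it aborts on a bad index because C has no exceptions to throw:

#include <stdio.h>
#include <stdlib.h>

/* Hypothetical bounds-checked accessor, mimicking std::array::at. */
static char checked_get(const char *arr, size_t len, size_t idx)
{
    if (idx >= len) {
        fprintf(stderr, "index %zu out of bounds (size %zu)\n", idx, len);
        abort();
    }
    return arr[idx];
}

int main(void)
{
    char str[6] = "hello";
    printf("%c\n", checked_get(str, sizeof str, 1)); /* prints e */
    checked_get(str, sizeof str, 8);                 /* aborts with a message */
    return 0;
}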

C does not check array bounds.
In fact, a segmentation fault isn't specifically a runtime error generated by exceeding array bounds. Rather, it is a result of the memory protection provided by the operating system: it occurs when your process tries to access memory that does not belong to it, or tries to access an address that isn't mapped at all.
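If you want to see that protection fire deliberately instead of relying on a lucky overflow, you can map a page read-only and then try to write to it. This is a POSIX-only sketch; on a typical Linux or macOS machine the write is killed with SIGSEGV:

#include <stdio.h>
#include <sys/mman.h>

int main(void)
{
    /* Ask the OS for one anonymous page with read-only permission. */
    char *page = mmap(NULL, 4096, PROT_READ,
                      MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (page == MAP_FAILED) {
        perror("mmap");
        return 1;
    }
    printf("reading is allowed: %d\n", page[0]); /* anonymous pages read as 0 */
    page[0] = 'x'; /* writing violates the page's protection: SIGSEGV here */
    return 0;
}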

Writing outside array bounds (in fact, even just forming the out-of-bounds pointer via pointer arithmetic or array subscripting, without using the result to read or write anything) results in undefined behavior. Undefined behavior is not a reported or reportable error; it means your program may do anything at all. It's very dangerous, and you are fully responsible for avoiding it. C is not Java or Python.
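As a concrete illustration of how strict this rule is: the standard allows you to form a pointer to any element of an array and to one element past its end, but nothing beyond that, even if you never dereference it.

int main(void)
{
    char a[6];
    char *ok  = a + 6; /* one-past-the-end: may be formed and compared, not dereferenced */
    char *bad = a + 7; /* merely computing this pointer is already undefined behaviour */
    (void)ok;
    (void)bad;
    return 0;
}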

Memory allocation is more complicated than it seems. The variable str, in this case, is on the stack, next to other variables, so it isn't followed by unallocated memory. Memory is also usually word-aligned (one "word" being four to eight bytes), so there may be padding around variables. You were possibly overwriting the value of another variable, or some of that padding, or something else entirely.
Like R.. said, it's undefined behavior. An out-of-bounds access could cause a segfault... or it could cause silent memory corruption. If you're modifying memory that has already been allocated to your process, the operating system won't catch it. That's why out-of-bounds errors are so insidious in C.

Because C and C++ don't check bounds.
An array expression decays to a pointer to its first element, so when you write arr[index], what actually happens is:
type value = *(arr + index);
The results are big numbers (though not necessarily) because they're garbage values, just like an uninitialized variable.
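That equivalence is easy to verify, and it's also why the odd-looking index[arr] spelling compiles:

#include <stdio.h>

int main(void)
{
    int arr[3] = {10, 20, 30};
    /* a[i] is defined as *(a + i), and addition commutes,
       so all three expressions denote the same element. */
    printf("%d %d %d\n", arr[1], *(arr + 1), 1[arr]); /* prints: 20 20 20 */
    return 0;
}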

You can catch this kind of error at run time by compiling with AddressSanitizer:
gcc -fsanitize=address -ggdb -o test test.c
Running the instrumented binary on the same input should then abort with a stack-buffer-overflow report at the first out-of-bounds write, instead of silently corrupting i.
