Two-dimensional array increment - C

I need to increment a hist variable like this, inside a loop over p and bin:
hist[p][bin] = hist[p][bin] + 1;
When I comment out this line, the code works (I verified this by printing the p and bin variables). However, when I include this line, the program terminates with a segmentation fault. Examining the bin variable further gives me a huge negative integer (-214733313), which leads to the segmentation fault. The program runs normally when I comment out this line, and the bin values are normal integers. Am I missing something obvious here?
Thanks

If you're getting a value around -2147..., you are basically at the limit of a signed integer, 2^31 - 1 (32 bits, 4 bytes, a C int).
If we assume this, then it's safe to say you're hitting memory that contains something like 0xFFFFFFFF. I generally only see that in unallocated, essentially random memory, so it's reasonable to assume you are going out of bounds with your access. hist[p][bin] could be sitting at the very end of your array's memory, and indexing one step further goes out of bounds.
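For what it's worth, here is a minimal sketch of a defensive check before the increment. NP, NBINS and add_to_hist are placeholders invented for the illustration, since the real declaration of hist isn't shown:

#include <stdio.h>

#define NP    4    /* assumed number of p values */
#define NBINS 64   /* assumed number of bins     */

static int hist[NP][NBINS];

static void add_to_hist(int p, int bin)
{
    /* Refuse to write outside the declared bounds of hist. */
    if (p < 0 || p >= NP || bin < 0 || bin >= NBINS) {
        fprintf(stderr, "out of range: p=%d bin=%d\n", p, bin);
        return;
    }
    hist[p][bin] = hist[p][bin] + 1;
}

int main(void)
{
    add_to_hist(1, 3);       /* fine */
    add_to_hist(0, -214);    /* caught instead of corrupting memory */
    printf("hist[1][3] = %d\n", hist[1][3]);
    return 0;
}

If the check fires with a huge negative bin, the bug is in however bin is computed, not in the increment itself.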

Related

Why am I not getting a segfault error with this simple code?

I have to show an error when I access an item outside of an array (without creating my own function for it). So I thought it was enough to access a value outside the array to trigger a segfault, but this code does not crash at all:
#include <stdio.h>

int main(void) {
    int tab[4];
    printf("%d", tab[7]);
    return 0;
}
Why can't I get an error when I do this?
When you invoke undefined behavior, anything can happen. Your program may crash, it may display strange results, or it may appear to work properly.
Also, making a seemingly unrelated change such as adding an unused local variable or a simple call to printf can change the way in which undefined behavior manifests itself.
When I ran this program, it completed and printed 63. If I change the referenced index from 7 to 7000, I get a segfault.
In short, just because the program can crash doesn't mean it will.
Because the behavior when you do things not allowed by the spec is "undefined". And because there are no bounds checks required in C. You got "lucky".
int tab[4]; says to allocate memory for 4 integers on the stack. tab is just a memory address, a number. It doesn't know anything about what it's pointing at or how much space has been allocated.
printf("%d", tab[7]); says to print out the 8th element of tab. So the compiler does...
tab is set to 1000 (for example) meaning memory address 1000.
tab represents a list of int, so each element will be sizeof(int), probably 4 or 8 bytes. Let's say 8.
Therefore tab[7] means to start reading at memory position (7 * 8) + 1000 = 1056 and read 8 more bytes, i.e. bytes 1056 to 1063.
That's it. No bounds checks by the program itself. The hardware or OS might do a bounds check to prevent one process from reading arbitrary memory (have a look into protected memory), but nothing is required by C.
So tab[7] faithfully reproduces whatever garbage is in 1056 to 1063.
You can write a little program to see this.
#include <stdio.h>

int main(void) {
    int tab[4];

    printf("sizeof(int): %zu\n", sizeof(int));
    printf("tab: %d\n", (int)tab);
    printf("&tab[7]: %d\n", (int)&tab[7]);

    /* Note: tab must be cast to an integer, else C will do pointer
       math on it. `7 + tab` gives the same result as `&tab[7]`.
       Casting a pointer to int is only for this demonstration; it can
       truncate on 64-bit systems, where %p and (void *) are the proper
       way to print addresses. */
    printf("(7 * sizeof(int)) + (int)tab: %d\n", (int)((7 * sizeof(int)) + (int)tab));
    printf("7 + tab: %d\n", (int)(7 + tab));
    return 0;
}
The exact results will vary, but you'll see that &tab[7] is just some math done on tab to figure out what memory address to examine.
$ ./test
sizeof(int): 4
tab: 1595446448
&tab[7]: 1595446476
(7 * sizeof(int)) + (int)tab: 1595446476
7 + tab: 1595446476
1595446476 - 1595446448 is 28. 7 * 4 is 28.
An array in C is essentially just a starting address for a block of memory, in this case the arbitrary location of tab[0]. Sure, you've declared a bound of 4, but if you go past that, you're just accessing whatever values happen to sit beyond that block of memory (which is why it is probably printing out weird numbers).

Array & segmentation fault

I'm creating the below array:
int p[100];

int main(void)
{
    int i = 0;

    while (1)
    {
        p[i] = 148;
        i++;
    }
    return 0;
}
The program aborts with a segmentation fault after writing about 1000 positions of the array, instead of at 100. I know that C doesn't check whether the program writes out of bounds; this is left to the OS. I'm running it on Ubuntu, and the size of the stack is 8 MB (limit -s). Why does it abort after about 1000? How can I check how much memory my OS allocates for an array?
Sorry if it's been asked before, I've been googling this but can't seem to find a specific explanation for this.
Accessing an invalid memory location leads to Undefined Behavior, which means that anything can happen. A segmentation fault is not guaranteed to occur.
...the size of the stack is 8MB (limit -s)...
The variable int p[100]; is not on the stack but in the data area, because it is defined as a global. Since it is not explicitly initialized, it is placed in the BSS section and filled with zeros. You can check that by printing the array values right at the beginning of the main() function.
As others said, with p[i] = 148; you produced undefined behaviour. After filling roughly 1000 positions you most probably reached the end of the BSS area and got a segmentation fault.
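For the first part, a minimal sketch of that zero check, printing a few elements right at the start of main before anything is written to them:

#include <stdio.h>

int p[100];

int main(void)
{
    int i;

    /* p is global and not explicitly initialized, so it lives in BSS
       and these all print 0 before anything is written to the array. */
    for (i = 0; i < 5; i++)
        printf("p[%d] = %d\n", i, p[i]);

    return 0;
}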
It appears that you clearly go past the 100 elements defined (int p[100];), since your loop has no termination condition (while (1)).
I would suggest using a for loop instead:
for (i = 0; i < 100; i++) {
    // do your stuff...
}
Regarding your more specific question about the memory: any out-of-range access (in your situation, anything beyond the 100 elements of the array) can produce an error. The fact that you noticed it at around 1000 in your case can change depending on how the surrounding memory happens to be laid out and used.
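Putting the suggestion together, a complete bounded version of the program might look like this (a sketch; the final printf is only there to make the result observable):

#include <stdio.h>

int p[100];

int main(void)
{
    int i;

    /* Bounded loop: stays inside the 100 declared elements. */
    for (i = 0; i < 100; i++)
        p[i] = 148;

    printf("p[0] = %d, p[99] = %d\n", p[0], p[99]);
    return 0;
}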
It will fail once the CPU says:
HEY! That's not your memory, leave it!
The fact that some memory is not inside the array does not mean that the application is not allowed to touch it.
The program aborts with a segmentation fault after writing 1000 positions of the array, instead of the 100.
You cannot reason about Undefined Behavior. It's like asking: if 1000 people are standing under a coconut tree and a coconut smacks each of their heads, will exactly 700 of them always fall unconscious?

Segmentation Fault & printing an array without loops

I'm just trying to run a simple program that counts the number of spaces, digits and other characters using arrays. Below is my program:
#include <stdio.h>
#include <ctype.h>

int main(void) {
    int digit_holders[10] = {0};
    int ch;
    int i, white_space = 0, other = 0;

    while ((ch = getchar()) != EOF) {
        if (isspace(ch))
            white_space++;
        else if (isdigit(ch))
            digit_holders[ch - '0']++;
        else
            other++;
    }
    digit_holders[12] = 20;   /* out-of-bounds write that the question asks about */
    printf("\n White spaces=%d\n Other=%d\n", white_space, other);
    for (i = 0; i <= 9; i++)
        printf("\ndigit_holders[%d]=%d\n", i, digit_holders[i]);
    printf("\n digit_holder[12]=%d\n", digit_holders[12]);
    return 0;
}
2 Questions:
Why does digit_holders[12] still manage to print the assigned value despite being outside the range? Why doesn't it cause a segmentation fault? The same happens when I change the for loop condition to i <= 11: it manages to print digit_holders[11]=0 (which it shouldn't). However, when I replace 11/10 with 1100, i.e. digit_holders[1100], in either case the program crashes with a segmentation fault. Why so?
Is there an easier way to print the elements of this array without using a for loop?
-Thanks!
There is no range checking in C, so it gives it its best shot (i.e. enough rope to hang yourself and the rest of the family).
The segmentation fault comes from the OS, i.e. from trying to access memory not assigned to the process.
As I wrote in the comment:
You will need a loop of some kind to print the content of your array in C.
You are assigning to the 13th element of an array with only 10 elements declared. It's risky and unstable, but it won't necessarily result in a seg fault, because if the OS hasn't modified that memory in the time between the write and the read, your access will resolve to the value without error. But again, risky.
If you had declared an array with 13 elements, all 13 would be reserved in memory and there would be no chance of a seg fault there. You are likely to get a seg fault if you interrogate an array outside of its declared limits, and more so the further you go beyond the range you defined.
It's possible to print "out-of-range" array indices in C because digit_holders[12] is handled as though digit_holders were a pointer with an offset of 12 * sizeof(int) added on to it.
This won't necessarily cause a segmentation fault, since the memory address being read might still be earmarked as being "available" to the program. A good example of this is a struct containing an array:
typedef struct {
    int some_array[12];
    int another_int;
} my_struct;
If you were to read the value of some_array[12], you'd get the contents of another_int instead, since it's stored directly after the last "valid" index of some_array.
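A small demonstration of that layout (a sketch; the read is still undefined behaviour, so nothing guarantees this result, it merely tends to hold on common compilers):

#include <stdio.h>

typedef struct {
    int some_array[12];
    int another_int;
} my_struct;

int main(void)
{
    my_struct s = { .some_array = {0}, .another_int = 42 };

    /* Undefined behaviour: reads one element past the end of some_array.
       On many compilers this happens to print 42, because another_int is
       laid out directly after the array, but the standard promises nothing. */
    printf("some_array[12] = %d\n", s.some_array[12]);
    return 0;
}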
As for the printing out of arrays: I'm not aware of any way to print out an array without using a loop in C, though it is possible in some other languages. For example, you could easily print out a list of strings in Python using print(", ".join(["This", "That", "The other"]))
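If the goal is just to keep main tidy, the loop can be hidden inside a small helper (a sketch; print_int_array is a made-up name for illustration):

#include <stdio.h>
#include <stddef.h>

/* C still needs a loop somewhere; wrapping it in a function only hides it. */
static void print_int_array(const char *name, const int *a, size_t n)
{
    size_t i;
    for (i = 0; i < n; i++)
        printf("%s[%zu] = %d\n", name, i, a[i]);
}

int main(void)
{
    int digit_holders[10] = {0};
    print_int_array("digit_holders", digit_holders, 10);
    return 0;
}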

Will this loop run infinitely?

I am sitting in a class and I am told by a very experienced teacher that the following code will terminate when the STACK memory gets completely filled by the program. I am not able to understand why. Below is the source code:
#include <stdio.h>

int main()
{
    char i;

    for (i = 120; i < 130; i++)
        printf("\n%d", i);

    return 0;
}
Now, the reason I feel that this loop will not terminate is that once the program runs, the variable is allocated at one memory location, which does not change for the life of the program; we are only changing the value of that already declared variable. So I wanted to ask for the answer to this question. Also, if you think the teacher is right, please explain as well :)
Also, I tried running the program for a long time, but the memory consumption did not increase by even a bit :|
The actions of the program depend on how your implementation defines char: it may be a signed or an unsigned type.
If it is unsigned, it outputs 10 numbers and terminates.
If it is signed, it will wrap at 127 and the next value is -128 in most implementations. Strictly speaking, the standard makes the result of converting the out-of-range value back to a signed char implementation-defined.
I don't see why it should eat up the complete stack: there is no recursion and no additional memory allocation, so
told by a very experienced teacher that the following code will terminate when the STACK memory gets completely filled by the program
means "never", because the program simply doesn't fill up the stack. Either the teacher is not as experienced as claimed, or the OP is not an experienced listener and misunderstood something the teacher told him.
The reason is simple as well as tricky :)
i is not an int, but a char.
This means that its range goes from -128 to +127.
When the loop increments i past +127, the value wraps around, so the value in memory becomes -128. This means that i is again smaller than 130! The loop continues on and on...
Now continue cheating :P
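A tiny demonstration of the wrap-around (a sketch; the out-of-range conversion is implementation-defined, so this merely reflects common two's-complement behaviour):

#include <stdio.h>

int main(void)
{
    signed char c = 127;

    /* The addition happens in int; storing the result back into a signed
       char is implementation-defined, and on typical systems yields -128. */
    c = c + 1;
    printf("%d\n", c);

    return 0;
}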
char is 1 byte long, so a signed char ranges from -2^7 to 2^7 - 1 (-128 to 127). If you try to add 1 to 127, it wraps to -128: an overflow occurs.
Printing the variable, you will see the same.
Change the declaration from
char i to int i
It never fills the stack, as you are not declaring new variables or calling functions that would grow the stack. So it's just an infinite loop, nothing more.
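A minimal fixed version, with i declared as an int so that it can actually reach 130 and the loop terminates:

#include <stdio.h>

int main(void)
{
    int i;  /* int can represent 130, so the loop condition eventually fails */

    for (i = 120; i < 130; i++)
        printf("\n%d", i);

    return 0;
}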
Yeah, it will lead to an infinite loop, since i has been declared as a char, which ranges from -128 to +127, so it never reaches 130.
The moment i reaches 127, the next increment takes it back to -128, so it never gets to 130.
http://ideone.com/iVLoHe

How is this loop ending and are the results deterministic?

I found some code and I am baffled as to how the loop exits and how it works. Does the program produce deterministic output?
The reason I am baffled is:
1. `someArray` is of size 2, but the loop clearly goes up to index 3,
2. the value is deterministic, and it always exits when `someNumber` reaches 4
Can someone please explain how this is happening?
#include <stdlib.h>
#include <time.h>
#include <stdio.h>

int main() {
    int someNumber = 97;
    int someArray[2] = {0, 1};
    int findTheValue;

    for (findTheValue = 0; (someNumber -= someArray[findTheValue]) > 0; findTheValue++) {
    }
    printf("The crazy value is %d", findTheValue);
    return EXIT_SUCCESS;
}
Accessing an array element beyond its bounds is undefined behavior. That is, the program is allowed to do anything it pleases: reply 42, eat your hard disk, or spend all your money. In other words, what happens in such cases is entirely platform-dependent. It may look "deterministic", but that is just because you are lucky, and probably also because you are only reading from that location and not writing to it.
This kind of code is just bad. Don't do that.
Depending on your compiler, someArray[2] may effectively be findTheValue!
Because these variables are declared one after another, it's entirely possible that they are positioned consecutively in memory (on the stack). C doesn't do any memory management or error checking here, so someArray[2] just means the memory at the address of someArray[0] plus 2 * sizeof(int).
So when findTheValue is 0, we subtract 0; when findTheValue is 1, we subtract 1. When findTheValue is 2, we subtract someNumber (which is now 94) and exit.
This behavior is by no means guaranteed. Don't rely on it!
EDIT: It is probably more likely that someArray[2] just points to garbage (unspecified) values in your RAM. These values are likely more than 93 and will cause the loop to exit.
EDIT2: Or maybe someArray[2] and someArray[3] are large negative numbers, and subtracting both causes someNumber to roll over to negative.
The loop exits because (someNumber -= someArray[findTheValue]) eventually stops being greater than zero.
Adding a debug line, you can see
value 0 number 97 array 0
value 1 number 96 array 1
value 2 number 1208148276 array -1208148180
That is printing out findTheValue, someNumber, and someArray[findTheValue].
It's not the answer I would have expected at first glance.
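For reference, a sketch of the whole program with such a debug line added to the loop body (the out-of-bounds reads are undefined behaviour, so the garbage values will differ from machine to machine):

#include <stdio.h>
#include <stdlib.h>

int main(void) {
    int someNumber = 97;
    int someArray[2] = {0, 1};
    int findTheValue;

    for (findTheValue = 0;
         (someNumber -= someArray[findTheValue]) > 0;
         findTheValue++) {
        /* Debug line: the index, the running total, and the array element
           at that index (garbage once the index goes past 1). */
        printf("value %d number %d array %d\n",
               findTheValue, someNumber, someArray[findTheValue]);
    }

    printf("The crazy value is %d\n", findTheValue);
    return EXIT_SUCCESS;
}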
Checking addresses:
printf("&someNumber = %p\n", &someNumber);
printf("&someArray[0] = %p\n", &someArray[0]);
printf("&someArray[1] = %p\n", &someArray[1]);
printf("&findTheValue = %p\n", &findTheValue);
gave this output:
&someNumber = 0xbfc78e5c
&someArray[0] = 0xbfc78e50
&someArray[1] = 0xbfc78e54
&findTheValue = 0xbfc78e58
It seems that, for some reason, the compiler puts the array at the beginning (lowest addresses) of the stack area, then the variable declared after it, and then the one declared before it. So someArray[2] effectively overlaps findTheValue, and someArray[3] effectively points at someNumber.
I really do not know the reason, but I tried gcc on Ubuntu 32 bit and Visual Studio with and without optimisation and the results were always similar.
