Here is my code :
#include <stdio.h>
#include <stdlib.h>
#define LEN 2

int main(void)
{
    char num1[LEN], num2[LEN]; // works fine with
    // char *num1 = malloc(LEN), *num2 = malloc(LEN);
    int number1, number2;
    int sum;

    printf("first integer to add = ");
    scanf("%s", num1);
    printf("second integer to add = ");
    scanf("%s", num2);

    // adds integers
    number1 = atoi(num1);
    number2 = atoi(num2);
    sum = number1 + number2;

    // prints sum
    printf("Sum of %d and %d = %d \n", number1, number2, sum);
    return 0;
}
Here is the output :
first integer to add = 15
second integer to add = 12
Sum of 0 and 12 = 12
Why is it taking 0 instead of the first value, 15?
I could not understand why this is happening.
It is working fine if I am using
char *num1= malloc(LEN), *num2= malloc(LEN);
instead of
char num1[LEN],num2[LEN];
But it should work fine with this.
Edit:
Yes, it worked with LEN 3, but why did it show this undefined behaviour? I mean, why did it not work with the plain arrays but work with malloc? I now understand that it should not work with malloc either. But why did it work for me? Please be specific so that I can debug more accurately.
Is there any issue with my system or compiler or IDE ?
Please explain a bit more, as it will be helpful, or provide links to resources, because I don't want to be unlucky anymore.
LEN is 2, which is enough to store both digits but not the required null terminating character. You are therefore overrunning the arrays (and the heap allocations, in that version of the code!) and this causes undefined behavior. The fact that one works and the other does not is simply a byproduct of how the undefined behavior plays out on your particular system; the malloc version could indeed crash on a different system or a different compiler.
Correct results, incorrect results, crashing, or something completely different are all possibilities when you invoke undefined behavior.
Change LEN to 3 and your example input would work fine.
I would suggest indicating the size of your buffers in your scanf() line to avoid the undefined behavior. You may get incorrect results, but your program at least would not crash or have a security vulnerability:
scanf("%2s", num1);
Note that the number you use there must be one less than the size of the array -- in this example it assumes an array of size 3 (so you read a maximum of 2 characters, because you need the last character for the null terminating character).
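For reference, a hedged sketch of the corrected program with LEN raised to 3 and a matching field width (this is just one way to apply the fix described above):

#include <stdio.h>
#include <stdlib.h>
#define LEN 3   // two digits plus the null terminator

int main(void)
{
    char num1[LEN], num2[LEN];
    int number1, number2, sum;

    printf("first integer to add = ");
    scanf("%2s", num1);            // field width = LEN - 1
    printf("second integer to add = ");
    scanf("%2s", num2);

    number1 = atoi(num1);
    number2 = atoi(num2);
    sum = number1 + number2;

    printf("Sum of %d and %d = %d\n", number1, number2, sum);
    return 0;
}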
LEN is defined as 2. You left no room for a null terminator. In the array case you would overrun the array end and damage your stack. In the malloc case you would overrun your heap and potentially damage the malloc structures.
Both are undefined behaviour. You are unlucky that your code works at all: if you were "lucky", your program would decide to crash in every case just to show you that you were triggering undefined behaviour. Unfortunately that's not how undefined behaviour works, so as a C programmer, you just have to be defensive and avoid entering into undefined behaviour situations.
Why are you using strings, anyway? Just use scanf("%d", &number1) and you can avoid all of this.
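A minimal sketch of that approach, reusing the prompts from the question (the early return on bad input is just one reasonable way to handle errors):

#include <stdio.h>

int main(void)
{
    int number1, number2;

    printf("first integer to add = ");
    if (scanf("%d", &number1) != 1)
        return 1;                      // bail out on bad input
    printf("second integer to add = ");
    if (scanf("%d", &number2) != 1)
        return 1;

    printf("Sum of %d and %d = %d\n", number1, number2, number1 + number2);
    return 0;
}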
Your program does not "work fine" (and should not "work fine") with either explicitly declared arrays or malloc-ed arrays. Strings like 15 and 12 require char buffers of size 3 at least. You provided buffers of size 2. Your program overruns the buffer boundary in both cases, thus causing undefined behavior. It is just that the consequences of that undefined behavior manifest themselves differently in different versions of the code.
The malloc version has a greater chance of producing the illusion of "working", since the sizes of dynamically allocated memory blocks are typically rounded up to the nearest implementation-dependent "round" boundary (like 8 or 16 bytes). That means that your malloc calls actually allocate more memory than you ask for. This might temporarily hide the buffer overruns present in your code, producing the illusion that your program "works fine".
Meanwhile, the version with explicit arrays uses local arrays. Local arrays often have precise size (as declared) and also have a greater chance to end up located next to each other in memory. This means that buffer overrun in one array can easily destroy the contents of the other array. This is exactly what happened in your case.
However, even in the malloc-based version I'd still expect a good debugging version of standard library implementation to catch the overrun problems. It is quite possible that if you attempt to actually free these malloc-ed memory blocks (something you apparently didn't bother to do), free will notice the problem and tell you that heap integrity has been violated at some point after malloc.
P.S. Don't use atoi to convert strings to integers; it gives you no way to detect conversion errors. The function to use is strtol.
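A short sketch of strtol usage (the error handling shown is just one reasonable pattern, not taken from the question):

#include <errno.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    const char *num1 = "15";
    char *end;

    errno = 0;
    long number1 = strtol(num1, &end, 10);   // base 10
    if (end == num1 || *end != '\0' || errno == ERANGE)
        fprintf(stderr, "not a valid integer: %s\n", num1);
    else
        printf("parsed %ld\n", number1);
    return 0;
}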
I have some trouble with strncat(). The book Pointers On C says that the function strncat() always adds a NUL at the end of the character string. To understand it better, I did an experiment.
#include <stdio.h>
#include <string.h>

int main(void)
{
    char a[14] = "mynameiszhm";
    strncat(a, "hello", 3);
    printf("%s", a);
    return 0;
}
The result is mynameiszhmhel
In this case the array has room for 14 chars, and there were originally 11 characters in the array, not counting the NUL. Thus when I add three more characters, all 14 characters fill up the array, so when the function wants to add a NUL, the NUL takes up memory outside the array. This causes the array to go out of bounds, but the program above runs without any warning. Why? Will this cause something unexpected?
So when we use strncat, should we account for the NUL, in case it causes the array to go out of bounds?
And I also notice that the function strncpy doesn't add a NUL. Why do these two string functions do different things about the same issue? And why did the designers of C design it this way?
This causes the array to go out of bounds, but the program above runs without any warning. Why?
Maybe. With strncat(a,"hello",3);, the code attempted to write beyond the 14 bytes of a[]. It might go out of bounds, it might not. It is undefined behavior (UB). Anything is allowed.
Will this cause something unexpected?
Maybe, the behavior is not defined. It might work just as you expect - whatever that is.
So when we use strncat, should we account for the NUL, in case it causes the array to go out of bounds?
Yes, the size parameter needs to account for appending a null character, else UB.
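For instance, a common pattern (a sketch under the assumption that a[] is the destination buffer from the question) is to compute the remaining space explicitly so the terminating null always fits:

#include <stdio.h>
#include <string.h>

int main(void)
{
    char a[14] = "mynameiszhm";               // 11 chars + null, 2 bytes spare

    // leave room for the terminating null that strncat always writes
    strncat(a, "hello", sizeof a - strlen(a) - 1);

    printf("%s\n", a);                        // prints "mynameiszhmhe"
    return 0;
}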
I also notice that the function strncpy doesn't add a NUL. Why do these two string functions do different things about the same issue? And why did the designers of C design it this way?
The two functions strncpy()/strncat() simply share similar names; they are not a closely paired counterpart of strcpy()/strcat().
Consider that in the early 1970s memory was far more expensive, and many design decisions can be traced back to a time when a byte of memory cost more than an hour's wages. Uniformity of functionality and naming was of lesser importance.
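As an illustration (a sketch, not taken from the question), this is the kind of manual termination strncpy forces on the caller when the source is too long:

#include <stdio.h>
#include <string.h>

int main(void)
{
    char dst[4];

    // strncpy stops after n bytes and does NOT add a null if the source is too long
    strncpy(dst, "hello", sizeof dst - 1);
    dst[sizeof dst - 1] = '\0';               // terminate it yourself

    printf("%s\n", dst);                      // prints "hel"
    return 0;
}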
And there were originally 11 characters in the array, not counting the NUL.
More like "and there were originally 11 characters in the array, followed by 3 NULs". There is no partial initialization in C; the remaining elements are zero-filled.
This is not really an answer, but a counterexample.
Observe the following modification to your program:
#include <stdio.h>
#include <string.h>

int main(void)
{
    char p[] = "***";
    char a[14] = "mynameiszhm";
    char q[] = "***";
    strncat(a, "hello", 3);
    printf("%s%s%s", p, a, q);
    return 0;
}
The result of this program depends on where p and q are located in memory compared to a. If they are not adjacent, the result is not so clear, but if either p or q immediately follows a, then your strncat will overwrite its first *, so that one of them is no longer printed, because it has become a string of length 0.
So the results are dependent on memory layout, and it should be clear that the compiler can put the variables in memory in any order it likes. And they can be adjacent or not.
So the problem is that you are not keeping to your promise not to put more than 14 bytes into a. The compiler did what you asked, and the C standard only guarantees behaviour as long as you keep to your promises.
And now you have a program that may or may not do what you wanted it to do.
I am writing code to take a user's input from the terminal as a string. I've read online that the correct way to create a string in C is to use an array of characters. My question is: if I instantiate an array of size [10], is that 10 indexes? 10 bits? 10 bytes? See the code below:
#include <stdio.h>

int main(int argc, char **argv) {
    char str[10] = "Jessica";
    scanf("%s", &str);
    printf("%c\n", str[15]);
}
In this example "str" is initialized to size 10 and I am able to to print out str[15] assuming that when the user inputs a a string it goes up to that index.
My questions are:
Does the size of the "str" array increase after taking a value from scanf?
At what amount of string characters will my original array have overflow?
When you declare an array of char as you have done:
char str[10] = "Jessica";
then you are telling the compiler that the array will hold up to 10 values of the type char (generally - maybe even always - this is an 8-bit character). When you then try to access a 'member' of that array with an index that goes beyond the allocated size, you will get what is known as Undefined Behaviour, which means that absolutely anything may happen: your program may crash; you may get what looks like a 'sensible' value; you may find that your hard disk is entirely erased! The behaviour is undefined. So, make sure you stick within the limits you set in the declaration: for str[n] in your case, the behaviour is undefined if n < 0 or n > 9 (array indexes start at ZERO). Your code:
printf("%c\n", str[15]);
does just what I have described - it goes beyond the 'bounds' of your str array and, thus, will cause the described undefined behaviour (UB).
Your scanf("%s", &str); may also cause such UB, if the user enters a string of more than 9 characters (one must be reserved for a terminating nul character)! You can prevent this by telling the scanf function to accept a maximum number of characters:
scanf("%9s", str);
where the integer given after the % is the maximum input length allowed (anything after this will be ignored). Also, as str is defined as an array, you don't need the explicit "address of" operator (&) in scanf - the array name itself decays to a pointer to its first element!
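Putting those two fixes together, a corrected sketch of your snippet (printing the whole string instead of str[15]) might look like this:

#include <stdio.h>

int main(void)
{
    char str[10] = "Jessica";

    // read at most 9 characters, leaving room for the null terminator
    if (scanf("%9s", str) == 1)
        printf("%s\n", str);
    return 0;
}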
Hope this helps! Feel free to ask for further clarification and/or explanation.
One of C's funny little foibles is that in almost all cases it does not check to make sure you are not overflowing your arrays.
It's your job to make sure you don't access outside the bounds of your arrays, and if you accidentally do, almost anything can happen. (Formally, it's undefined behavior.)
About the only thing that can't happen is that you get a nice error message
Error: array out-of-bounds access at line 23
(Well, theoretically that could happen, but in practice, virtually no C implementation checks for array bounds violations or issues messages like that.)
See also this answer to a similar question.
An array declaration reserves space for the given number of whatever you are declaring. So in the case of:
char str[10]
You are declaring an array of ten chars.
Does the size of the "str" array increase after taking a value from scanf?
No, the size does not change.
At what amount of string characters will my original array have overflow?
An array of 10 chars will hold nine characters and the null terminator. So, technically, it limits the string to nine characters.
printf("%c\n", str[15]);
This code references the 16th character in your array. Because your array only holds ten characters, you are accessing memory outside of the array. It's anyone's guess whether your program even owns that memory and, if it does, you are referencing memory that is part of another variable. This is a recipe for disaster.
I have written a simple string program using an array. I allocated a character array of 10 bytes, but when I give input, the program accepts an input string longer than 10 bytes. I get a segmentation fault only when I give an input string of some 21 characters. Why is there no segmentation fault when my input exceeds my allocated array limit?
Program:
#include <stdio.h>
#include <string.h>

void main() {
    char str[10];
    printf("\n Enter the string: ");
    gets(str);
    printf("\n The value of string=%s", str);
    int str_len;
    str_len = strlen(str);
    printf("\n Length of String=%d\n", str_len);
}
Output:
Enter the string: n durga prasad
The value of string=n durga prasad
Length of String=14
As you can see, the string length is shown as 14, but I allocated only 10 bytes. How can the length be more than my allocated size?
Please don't use gets(); it suffers from buffer overflow issues, which in turn invoke undefined behaviour.
Why is there no segmentation fault when my input exceeds my allocated array limit?
Once your input exceeds the allocated array size (i.e., 9 valid characters + 1 null terminator), the very next access to the array location becomes illegal and invokes UB. A segmentation fault is one possible side effect of UB; it is not a must.
Solution: Use fgets() instead.
When you declare an array like char str[10];, your compiler won't always set aside precisely the number of bytes that you asked for. It often reserves more, usually rounding up to a multiple of 8 on a 64-bit system; for instance it might be 16 in your case.
So even though you asked for 10 bytes, you can sometimes touch a few more. But of course this is strongly discouraged because, as you said, it can produce segmentation faults.
And, as said in the other answers from Sourav and Gopi, using fgets instead of gets will also help you avoid undefined behavior.
When you enter more characters than the array can hold, you have undefined behavior. Your array can hold 9 characters followed by a null terminator, so any deviation from this is UB.
Don't use gets() use fgets() instead
char a[10];
fgets(a,sizeof(a),stdin);
By using fgets() you are avoiding buffer overflow issue and avoiding undefined behavior.
PS: fgets() keeps the newline character in the buffer.
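If that trailing newline is unwanted, a common way to strip it (a sketch, not part of the original answer) is:

#include <stdio.h>
#include <string.h>

int main(void)
{
    char a[10];

    if (fgets(a, sizeof(a), stdin) != NULL) {
        a[strcspn(a, "\n")] = '\0';   // strip the trailing newline, if any
        printf("The value of string=%s\n", a);
        printf("Length of String=%zu\n", strlen(a));
    }
    return 0;
}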
As you already know, your input causes a buffer overflow, so I'm not going to repeat the reason. Instead I would like to answer the particular question,
"Why there is no segmentation fault when my input exceed allocated my array limit?"
Whether or not there is a segmentation fault depends on something called undefined behaviour. Once you overrun the allocated memory boundary, you are not guaranteed to get a segmentation fault. Rather, what you'll be facing is UB (as told earlier). Now, quoting the results of UB,
[...] programs invoking undefined behavior may compile and run, and produce correct results, or undetectably incorrect results, or any other behavior.
So it is not a given that you'll get a segmentation fault immediately on accessing the very next memory location. The program may run perfectly well until it touches some memory that is actually inaccessible to the particular process, and then the SIGSEGV signal (11) will be raised.
However, after running into UB, the output of any subsequent statement cannot be trusted. So the output of strlen() is invalid here.
#include <stdio.h>
#include <stdlib.h>

int main()
{
    char *p;
    p = (char *) malloc(sizeof(char) * 0);
    printf("Hello Enter the data without spaces :\n");
    scanf("%s", p);
    printf("The entered string is %s\n", p);
    //puts(p);
    return 0;
}
On compiling the above code and running it, the program is able to read the string even though we allocated zero bytes of memory for the pointer p.
What actually happens in the statement p = (char *) malloc(0)?
It is implementation-defined what malloc(0) will return, but it is undefined behavior to use that pointer to access an object. And undefined behavior means that literally anything can happen, from the program working without a glitch to a crash; all safe bets are off.
C99 standard, 7.20.3 Memory management functions, paragraph 1:
If the size of the space requested is zero, the behavior is implementation-defined: either a null pointer is returned, or the behavior is as if the size were some nonzero value, except that the returned pointer shall not be used to access an object.
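In practical terms, if you intend to store anything through the pointer you have to ask malloc for that many bytes and check the result. A hedged sketch (the size 32 is an arbitrary example, not something from the question):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    // ask for at least as many bytes as you intend to use
    char *p = malloc(32);
    if (p == NULL)
        return 1;                      // allocation failed

    printf("Hello Enter the data without spaces :\n");
    if (scanf("%31s", p) == 1)         // field width = size - 1
        printf("The entered string is %s\n", p);

    free(p);
    return 0;
}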
In addition to Als's answer - what happens is: you write somewhere into memory and retrieve the data from there. So depending on your system and OS, you get an exception or just some undefined behaviour.
Just out of curiosity, I tested your code using gcc on Linux, and it's a lot more robust than I would expect (after all, writing data to a character buffer of length 0 is undefined behavior... I would expect it to crash).
Here's my modification of your code:
#include <stdio.h>
#include <stdlib.h>

int main()
{
    char *p;
    p = malloc(sizeof(char) * 0);
    printf("Hello Enter some without spaces :\n");
    scanf("%s", p);

    char *q;
    q = malloc(sizeof(char) * 0);
    printf("Hello Enter more data without spaces :\n");
    scanf("%s", q);

    printf("The first string is '%s'\n", p);
    printf("The second string is '%s'\n", q);
}
My first thought was that you might be saved by the fact that you're only reading data into a single memory location -- if you use two buffers, the second might overwrite the first... so I broke the code into input and output sections:
Hello Enter some without spaces :
asdf
Hello Enter more data without spaces :
tutututu
The first string is 'asdf'
The second string is 'tutututu'
If the first buffer had been overwritten, we would see
The first string is 'tutututu'
The second string is 'tutututu'
So that's not the case. [but this depends on how much data you pack into each buffer... see below]
Then, I pasted a crazy amount of data into both variables:
perl -e 'print "c" x 5000000 . "\n" ' | xsel -i
(This put 4+ MB of 'c's into the copy buffer.) I pasted this into both the first and second scanf calls. The program took it without a segmentation fault.
Even though I didn't have a segmentation fault, the first buffer did get overwritten. I couldn't tell at first because so much data went flying up the screen. Here's a run with less data:
$ ./foo
Hello Enter some without spaces :
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
Hello Enter more data without spaces :
ccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccc
The first string is 'aaaaaaaaaaaa'
The second string is 'ccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccc'
There was a little glyph after aaaaaaaaaaaa, which is how my terminal represents a unicode character that it can't display. This is typical of overwritten data: you don't know what is going to overwrite your data... it's undefined behavior, so you're prone to nasal demons.
The bottom line is that when you write to memory that you haven't allocated space for (either explicitly using malloc or implicitly with an array), you're playing with fire. Sooner or later, you'll overwrite memory and cause yourself all sorts of grief.
The real lesson here is that C doesn't do bounds checking. It will happily let you write to memory that you don't own. You can do it all day long. Your program may run correctly, and it may not. It may crash, it may write back corrupted data, or it might work until you scan in one more byte than you used while testing. It doesn't care, so you have to.
The case of malloc(0) is simply a special case of this question.
Let me preface this by saying that I am a newbie, and I'm in an entry-level C class at school.
I'm writing a program that requires me to use malloc, and malloc is allocating 8x the space I expect it to in all cases. Even when I just call malloc(1), it is allocating 8 bytes instead of 1, and I am confused as to why.
Here is the code I tested with. This should only allow one character to be entered, plus the terminating character. Instead I can enter 8, so it is allocating 8 bytes instead of 1; this is the case even if I just use an integer in malloc(). Please ignore the x variable; it is used in the actual program, but not in this test:
#include <stdio.h>
#include <string.h>
#include <stdlib.h>

int main(int argc, char* argv[]) {
    int x = 0;
    char *A = NULL;
    A = (char*) malloc(sizeof(char) + 1);
    scanf("%s", A);
    printf("%s", A);
    free(A);
    return 0;
}
A=(char*)malloc(sizeof(char)+1);
is going to allocate at least 2 bytes (sizeof(char) is always 1).
I don't understand how you are determining that it is allocating 8 bytes; however, malloc is allowed to allocate more memory than you ask for, just never less.
The fact that you can use scanf to write a longer string to the memory pointed to by A does not mean that you have that memory allocated. It will overwrite whatever is there, which may result in your program crashing or producing unexpected results.
malloc is allocating as much memory as you asked for.
If you can read in more than the allocated bytes (using scanf), it's because scanf is writing past the memory you own: it's a buffer overflow.
You should limit the data scanf can read this way:
scanf( "%10s", ... ); // scanf will read a string no longer than 10
I'm writing a program that requires me to use malloc, and malloc is allocating 8x the space I expect it to in all cases. Even when I just call malloc(1), it is allocating 8 bytes instead of 1, and I am confused as to why.
Theoretically speaking, the way you do things in the program does not allocate 8 bytes.
You can still type in 8 bytes (or any number of bytes) because in C there is no check that you are still writing to a valid place.
What you see is Undefined Behaviour, and the reason for that is that you write in memory that you shouldn't. There is nothing in your code that stops the program after n byte(s) you allocated have been used.
You might get a seg fault now, or later, or never. This is undefined behaviour. Just because it appears to work does not mean it is right.
Now, your program could indeed allocate 8 bytes instead of 1.
The reason for that is alignment.
The same program might allocate a different size on a different machine and/or a different operating system.
Also, since you are using C, you don't really need to cast the result of malloc. See this for a start.
In your code, there is no limit on how much data you can load in with scanf, leading to a buffer overflow (security flaw/crash). You should use a format string that limits the amount of data read in to the one or two bytes that you allocate. The malloc function will probably allocate some extra space to round the size up, but you should not rely on that.
malloc is allowed to allocate more memory than you ask for. It's only required to provide at least as much as you ask for, or fail if it can't.
Using malloc or creating a buffer on the stack will allocate memory in words.
On a 32-bit system the word size is 4 bytes, so when you ask for
A = (char*) malloc(sizeof(char) + 1);
(which is essentially A = (char*) malloc(2);)
the system will actually give you 4 bytes. On a 64-bit machine you should get 8 bytes.
The way you use scanf there is dangerous, as it will overflow the buffer if a string longer than the allocated size is entered, leaving a heap overflow vulnerability in your program. scanf in this case will attempt to stuff a string of any length into that memory, so using it to count the allocated size will not work.
What system are you running on? If it's 64 bit, it is possible that the system is allocating the smallest possible unit that it can. 64 bits being 8 bytes.
EDIT: Just a note of interest:
char *s = malloc (1);
Causes 16 bytes to be allocated on iOS 4.2 (Xcode 3.2.5).
If you enter 8, it will just allocate 2 bytes (sizeof(char) == 1, unless you are on some obscure platform) and you will write your number into those chars. Then printf will output the number you stored there. So if you store the number 8, it'll display 8 on the command line. It has nothing to do with the count of chars allocated.
Unless of course you looked up in a debugger or somewhere else that it is really allocating 8 bytes.
scanf has no idea how big the target buffer actually is. All it knows is the starting address of the buffer. C does no bounds checking, so if you pass it the address of a buffer sized to hold 2 characters, and you enter a string that's 10 characters long, scanf will write those extra 8 characters to the memory following the end of the buffer. This is called a buffer overrun, which is a common malware exploit. For whatever reason, the six bytes immediately following your buffer aren't "important", so you can enter up to 8 characters with no apparent ill effects.
You can limit the number of characters read in a scanf call by including an explicit field width in the conversion specifier:
scanf("%2s", A);
but it's still up to you to make sure that the target buffer is large enough to accommodate that width. Unfortunately, there's no way to specify the field width dynamically as there is with printf:
printf("%*s", fieldWidth, string);
because %*s means something completely different in scanf (basically, skip over the next string).
You could use sprintf to build your format string:
sprintf(format, "%%%ds", max_bytes_in_A);
scanf(format, A);
but you have to make sure the format buffer is wide enough to hold the result, etc., etc., etc.
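For illustration, a self-contained sketch of that approach (the buffer size 16 is an arbitrary assumption, not from the question):

#include <stdio.h>

int main(void)
{
    char A[16];                        // example buffer; size is an assumption
    char format[16];

    // build "%15s" at run time: field width = buffer size - 1
    sprintf(format, "%%%ds", (int)sizeof A - 1);
    if (scanf(format, A) == 1)
        printf("read: %s\n", A);
    return 0;
}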
This is why I usually recommend fgets() for interactive input.