Difference between 'A' and "A" literals in C [duplicate] - c

Am I right in saying that the difference between 'A' and "A" in C is:
'A' - means a character, 1 byte (8 bits) long, containing A.
"A" - means a string, 2 bytes (16 bits) long, holding an A and a terminating null character.
Are my explanations correct?
Thanks.

You are absolutely correct, but you've got to bear in mind that when you use those as rvalues they are different. In C, 'A' actually has type int (its value is the character's code, 65 in ASCII), while "A" is an array of char that decays to a char * in most expressions.
Let me illustrate my point with some code:
#include <stdio.h>

int main(void)
{
    int num = 'A';       // Valid: assigns 65 to num (on ASCII systems)
    char test = 65;      // Valid: test will be 'A' after this (on ASCII systems)
    char *ptr = "A";     // Valid: ptr points to the string literal "A"

    printf("%c,%d\n", 'A', 'A');   // Output: A,65
    printf("%p\n", (void *)"A");   // Prints the address where string "A" is stored
    // printf("%c", "A");          // WRONG: %c expects a char, not a pointer
    printf("%s\n", "A");           // Works: prints A
    return 0;
}
Edit: For the finer nuances, refer to Mat's comment; if it doesn't all make sense yet, come back to it after a few weeks when you have advanced further in your study of C.

Basically yes. The main nuance is byte vs. char: where you say byte, you should say char. In C, sizeof(char) is 1 by definition, and on most systems a byte is 8 bits, but the standard permits a byte (and hence a char) to be wider than 8 bits on some hardware.
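To see this concretely, here is a minimal sketch; the exact numbers printed are platform-dependent:
#include <limits.h>
#include <stdio.h>

int main(void)
{
    printf("CHAR_BIT = %d\n", CHAR_BIT);           // bits per byte: 8 on most systems
    printf("sizeof(char) = %zu\n", sizeof(char));  // always 1, by definition
    return 0;
}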

Related

Why sizeof() operator gives different value for 'a' and "a" in C? [duplicate]

As far as I know, a character constant, viz. 'a', is stored as its ASCII code and treated internally as an integer (97 in the case of 'a'); that's why sizeof('a') returns 4. But when I use sizeof("a") it returns 2, and I have not found any explanation for that yet.
My code:
#include <stdio.h>

int main(void)
{
    int x, y;
    x = sizeof('a');
    y = sizeof("a");
    printf("%d\n", x);
    printf("%d", y);
    return 0;
}
which gives the output:
4
2
'a' is an integer. It has a size of 4 on most computers, but it may be something else; 2 is also common on more specialized hardware.
"a" is a string literal. It contains two characters, a and \0, each of size 1, for a total size of 2. However, if you assign it to a pointer, what you get is a char * (often declared const char *), and a pointer may have a different size again.

Trouble comparing value in C [duplicate]

I am having a problem comparing an int value and a char value in C. Let's say I have an int variable value1 that is 0 and a char that has the value '0'. I know that the char's value is actually an ASCII code, and that '0' is 48, but how do I compare the int's 0 with the char's '0', in an if statement for example?
You can use the idiom c - '0' to convert a char c to its equivalent digit.
This is an expression of type int.
Note that it works in any character encoding supported by the C Standard, since such an encoding must encode the digits '0' through '9' consecutively.
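For example, a minimal sketch using the names from the question:
#include <stdio.h>

int main(void)
{
    int value1 = 0;
    char c = '0';
    if (value1 == c - '0')   // c - '0' yields the digit 0 as an int
        printf("equal\n");
    return 0;
}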

Unable to get a char two characters long in C [duplicate]

The following statement in C gives no error:
char p='-1';
but the following gives error:
char p='-12';
ERROR: character can be one or two characters long.
I never knew that a char in C could ever be two characters long. However, printf("%c",p) gives - as output. Where can I use such a char in C?
In C, a character constant like 'A' does not have type char, but rather type int. This creates the possibility that, even on a system where char is only 8 bits wide (and so int is wider than char), character constant notations can exist which provide integer values wider than char.
The C standard requires implementations to support multi-character constants, but their values are implementation-defined.
Your compiler likely allows only two characters because its int type is only 16 bits wide. Perhaps a constant like 'AB' is encoded similarly to, say, the expression ('A' << 8 | 'B'). By the obvious extension of this scheme, 'ABC' would then have to be ('A' << 16 | 'B' << 8 | 'C'), which doesn't fit into 16 bits and calls for out-of-range shifts; hence the two-character limit.
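You can test that hypothesis on your own compiler with a sketch like the following. The encoding is implementation-defined, so the check is not guaranteed to hold everywhere; it does hold for GCC on common platforms, which typically just warns about the multi-character constant:
#include <stdio.h>

int main(void)
{
    // Implementation-defined: expect a -Wmultichar warning from GCC
    if ('AB' == (('A' << 8) | 'B'))
        printf("'AB' is encoded as ('A' << 8) | 'B'\n");
    else
        printf("this compiler uses a different encoding\n");
    return 0;
}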
In the GNU C compiler, four characters can be used:
#include <stdio.h>

int main(void)
{
    printf("%x\n", (unsigned) 'ABCD');
    return 0;
}
int is 32 bits wide, and this program prints 41424344 which, by golly, is hexadecimal for the ASCII characters ABCD. So this feature is useful for int-wide magic constants which are readable. Instead of:
#define MAGIC 0x41424344 /* This spells ABCD; easy to spot in memory dumps */
You can do this, which is nice, but less portable:
#define MAGIC 'ABCD'
What if we use five or more characters, like 'ABCDE'? Then GCC responds similarly to how Turbo C++ responds for three or more:
test.c:5:35: warning: character constant too long for its type [enabled by default]
It so happens that the program still compiles, and its output is unchanged: the E was truncated.
There is an important difference. The old Borland compiler rejects the excessively long constant as an error. Though that is probably a good idea, it is not standard-conforming: when some value is implementation-defined, the implementation's response cannot be failure, such as stopping the translation or execution of the program. Issuing a diagnostic is fine, of course.
char p='-517';
printf("%c\n", p);
Running the above code gave me output 7 and a warning: overflow in implicit constant conversion [-Woverflow]
A char cannot contain more than 1 byte of information.
You want an array of characters, also known as a C string:
// Note: if you initialize a character array with a string literal,
// there is no need for a size specifier.
char c[] = "-12";

// Alternatively, this copies one character array into another:
#include <string.h>

char c[4];
strcpy(c, "-12");
You'll notice that char c[4] has a declared size of 4, meaning the array can hold 4 characters. In C, character arrays have a special property: a null terminator (the char '\0') is a sentinel value that C string functions use to recognize the end of your string. So, in reality, the character string "-12" occupies 4 bytes: '-', '1', '2', and '\0'.
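A minimal sketch that makes the hidden terminator visible:
#include <stdio.h>
#include <string.h>

int main(void)
{
    char c[] = "-12";
    printf("%zu\n", sizeof(c));  // 4: the '\0' terminator is counted
    printf("%zu\n", strlen(c));  // 3: strlen stops at the '\0'
    return 0;
}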
You can also access individual elements of the array by using an index with the subscript operator [].
printf("%s\n", c);
printf("%c\n", c[0]);
Notice the c[0] expression: it accesses the character '-' of the string "-12".
Hope I helped.

why sizeof('a') is 4 in C? [duplicate]

#include <stdio.h>

int main(void)
{
    char b = 'c';
    printf("here size is %zu\n", sizeof('a'));
    printf("here size is %zu", sizeof(b));
    return 0;
}
Here the output is:
here size is 4
here size is 1
I don't get why sizeof('a') is 4.
Because in C character constants, such as 'a', have the type int.
There's a C FAQ entry about this subject:
Perhaps surprisingly, character constants in C are of type int, so
sizeof('a') is sizeof(int) (though this is another area where C++
differs).
The following is the famous line from the famous C book - The C programming Language by Kernighan & Ritchie with respect to a character written between single quotes.
A character written between single quotes represents an integer value equal to the numerical value of the character in the machine's character set.
So sizeof('a') is equivalent to sizeof(int).
'a' is by default an integer, and because of that you get the size of an int on your machine: 4 bytes.
char is 1 byte, and because of this you get 1 byte.

Why sizeof('c') is returning 4 instead of 1? [duplicate]

http://ideone.com/lHYY8
#include <stdio.h>

int main(void)
{
    printf("%zu %zu\n", sizeof('c'), sizeof(char));
    return 0;
}
Why does sizeof('c') return 4 instead of 1?
Because in C character constants have the type int, not char. So sizeof('c') == sizeof(int). Refer to this C FAQ
Perhaps surprisingly, character constants in C are of type int, so
sizeof('a') is sizeof(int) (though this is another area where C++
differs).
One (possibly even more extreme) oddity that also somehow justifies this is the fact that character literals are not limited to a single character.
Try this:
printf("%d\n", 'xy');
This is sometimes useful when dealing with e.g. binary file formats that use 32-bit "chunk" identifiers, such as PNG. You can do things like this:
const int chunk = read_chunk_from_file(...);
if(chunk == 'IHDR')
process_image_header(...);
There might be portability issues with code like this though, of course the above snippet assumes that read_chunk_from_file() magically does the right thing to transform the big-endian 32-bit value found in the PNG file into something that matches the value of the corresponding multi-character character literal.
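For illustration only, here is a hedged sketch of what a helper like read_chunk_from_file might do for the big-endian case (the function in the snippet above is hypothetical; this version assembles the identifier byte by byte, so the result does not depend on the host's endianness):
#include <stdint.h>
#include <stdio.h>

/* Hypothetical helper: reads a 4-byte big-endian chunk identifier. */
static uint32_t read_chunk_id(FILE *f)
{
    uint32_t id = 0;
    for (int i = 0; i < 4; i++) {
        int c = fgetc(f);
        if (c == EOF)
            return 0;                  /* simplistic error signal for this sketch */
        id = (id << 8) | (uint32_t)c;  /* most significant byte first */
    }
    return id;
}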
The K&R quote cited in the previous question applies here as well: a character written between single quotes represents an integer value equal to the numerical value of the character in the machine's character set, so sizeof('a') is equivalent to sizeof(int). Indeed, this question is a duplicate of "why sizeof('a') is 4 in C?" above.
cnicutar is completely right, of course. I just wanted to add the reason for this. If you look at functions like fgetc, you'll notice that they also return an int. That's because a char can represent any character from 0x00 to 0xFF, but an additional value is needed to represent EOF. So functions that return a character from input or a file often return an int, which can be compared with EOF; EOF is usually defined to be -1, but it can be anything that isn't a valid character.
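A minimal sketch of that idiom, copying stdin to stdout:
#include <stdio.h>

int main(void)
{
    int ch;                              // int, not char, so EOF stays distinguishable
    while ((ch = fgetc(stdin)) != EOF)   // fgetc returns an int for exactly this reason
        fputc(ch, stdout);
    return 0;
}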
