Practice.c
#include <stdio.h>
int main(void)
{
    char a;
    printf("\nEnter Anything = ");
    scanf("%c", &a); /* Line 1 */
    printf("\n%d", a);
    printf("\n%c", a);
    return 0;
}
Output 1 : Enter Anything = 5
53
5
Output 2 : Enter Anything = a
97
a
This program runs exactly as the book describes.
New.c
#include <stdio.h>
int main(void)
{
    char a;
    printf("\nEnter Character = ");
    scanf("%d", &a); /* Line 1 */
    printf("\n%d", a);
    printf("\n%c", a);
    return 0;
}
Output 1 : Enter Character = 5
5
♣
Output 2 : Enter Character = a
0
This is the same program as Practice.c with a minor change in it. It was not intentional; I mistakenly typed %d instead of %c in the line denoted by Line 1 of the program. This mistake of mine produced two very different outputs. What is the exact reason behind it?
Below ASCII 32, all characters are non-printable. In the second program you are reading an int and then trying to print the equivalent character, which is non-printable.
For the second input, a, scanf doesn't read the character and leaves it in the buffer, because it expects an integer, not a character. The variable a therefore remains uninitialized, and accessing an uninitialized variable invokes undefined behavior.
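A minimal sketch of how you could detect that failed conversion by checking scanf's return value (here I use an int variable so %d has a matching object; the prompt text is just for illustration):
#include <stdio.h>
int main(void)
{
    int a = 0;                          /* initialised, so a failed read is harmless */
    printf("\nEnter a number = ");
    if (scanf("%d", &a) != 1) {         /* scanf returns the number of items it converted */
        printf("\nNot a number; the offending character is still in the input buffer.\n");
        return 1;
    }
    printf("\n%d\n", a);
    return 0;
}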
The char data type has a size of 1 byte, whereas int is usually 4 bytes. Here you have tried to read an integer into a character variable, so there is not enough storage to handle this properly. Which result you get depends on whether your architecture uses little-endian or big-endian byte order.
When you use %c, the input is taken as a character and its ASCII value is stored.
When you use %d, the input is taken as an integer.
When you print with %c, it prints the character whose code is the given value.
When you print with %d, it prints the number.
What really happened in your NEW program is
scanf("%d",&a);
The format specifier for reading is given as %d, so scanf treats the input as an integer even though it is stored in a char variable, so the value of the variable will be
a = 0x05, which is the integer 5
printf("\n%d",a);
printing variable a as an integer prints 5 on the output
printf("\n%c",a);
I think you know that while reading a digit as a character (using %c), its ASCII value is stored. For example, with scanf("%c", &a) and an input of 5, the value stored in the variable a is 0x35 in hex, or 53 in integer representation.
Printing the variable as a character: since 0x05 is not a printable character or digit in ASCII, a special character is printed on the output.
For the full ASCII table, refer to: http://www.asciitable.com/
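To see both behaviours in one run, here is a minimal sketch (the names as_char and as_int are mine, just for illustration); it reads a digit once with %c into a char and once with %d into an int, then prints what each variable actually holds:
#include <stdio.h>
int main(void)
{
    char as_char;
    int  as_int;
    printf("Enter the same digit twice, e.g. 5 5: ");
    scanf(" %c %d", &as_char, &as_int);      /* first read as a character, then as a number */
    printf("%%c stored %d (0x%02X)\n", as_char, (unsigned)as_char);   /* '5' -> 53, i.e. 0x35 */
    printf("%%d stored %d (0x%02X)\n", as_int, (unsigned)as_int);     /*  5  ->  5, i.e. 0x05 */
    return 0;
}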
Related
I wrote this code to "find if the given character is a digit or not":
#include <stdio.h>
int main()
{
    char ch;
    printf("enter a character");
    scanf("%c", &ch);
    printf("%c", ch>='0'&&ch<='9');
    return 0;
}
This compiled, but after taking the input it didn't give any output.
However, on changing the %c in the second-to-last line to the %d format specifier, it worked. I'm a bit confused as to why %d worked but %c didn't, even though the variable is of character data type.
Characters in C are really just numbers in a token table. The %c is mainly there to do the translation between the alphanumeric token table that humans like to read/write and the raw binary that the C program uses internally.
The expression ch>='0'&&ch<='9' evaluates to 1 or 0 which is a raw binary integer of type int (it would be type bool in C++). If you attempt to print that one with %c, you'll get the symbol table character with index 0 or 1, which isn't even a printable character (0-31 aren't printable). So you print a non-printable character... either you'll see nothing or you'll see some strange symbols.
Instead you need to use %d for printing an integer; printf will then do the correct conversion to the printable symbols '1' and '0'.
As a side-note, make it a habit to always end your (sequence of) printf statements with \n, since on many systems that flushes the output buffer, i.e. actually prints to the screen. See Why does printf not flush after the call unless a newline is in the format string? for details.
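A minimal corrected sketch of the program above, printing the 0-or-1 result with %d and ending with a newline:
#include <stdio.h>
int main(void)
{
    char ch;
    printf("enter a character: ");
    scanf("%c", &ch);
    /* the comparison evaluates to the int 1 or 0, so print it with %d */
    printf("%d\n", ch >= '0' && ch <= '9');
    return 0;
}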
An int takes 4 bytes of memory, while a char takes only 1. Here you are printing an int value: with %c it is interpreted as a character code (an ASCII value), not as a number, whereas %d prints the int value itself.
So I need to create a program in C that reads a string character by character and outputs each character's ASCII representation in decimal, hexadecimal, or octal. This program should stop reading the string upon reaching a newline character.
For example given "Hi" the program should print "72 105" if I chose the decimal option.
So my idea is to read the provided string character by character into a large character array, checking each character to see if it is the newline, and then afterwards loop through elements 0 to N, where N is the index at which char_array[N] == '\n', printing each element as its decimal representation.
So far, I've come up with this code which should cover the decimal representation case but I've come across an error I'm unsure how to troubleshoot.
#include <stdio.h>
int main() {
    char char_arr[1000], conversion, temp;
    int i = -1, j = 0;
    printf("Integer conversion in decimal (d), octal (o), or hexadecimal (h)? ");
    scanf("%c", &conversion);
    printf("Enter a message: ");
    do {
        i++;
        if (scanf("%c", &char_arr[i]) != 1) {
            break;
        }
    } while (char_arr[i] != '\n');
    for (j = 0; j < i; j++) {
        printf("%d", char_arr[j]);
    }
    return 0;
}
When I run this program it simply prints
Integer conversion in decimal (d), octal (o), or hexadecimal (h)? d
Enter a message:
and gives me no option to enter a message.
Does anyone have any thoughts or explanations for why this would occur? I'm new to C.
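One likely cause (an assumption based on the buffer behaviour described earlier on this page): the first scanf("%c", &conversion) reads only the letter d and leaves the Enter keypress, a newline character, in the input buffer, so the first %c inside the loop immediately reads that newline and the do-while exits. A minimal sketch that discards the leftover newline with getchar() before reading the message (variable names follow the code above; only the decimal case is handled, as in the original):
#include <stdio.h>
int main(void)
{
    char char_arr[1000], conversion;
    int i = -1, j;
    printf("Integer conversion in decimal (d), octal (o), or hexadecimal (h)? ");
    scanf("%c", &conversion);
    getchar();                              /* discard the newline left behind by the scanf above */
    printf("Enter a message: ");
    do {
        i++;
        if (scanf("%c", &char_arr[i]) != 1) {
            break;
        }
    } while (char_arr[i] != '\n');
    for (j = 0; j < i; j++) {
        printf("%d ", char_arr[j]);         /* decimal codes, e.g. "72 105 " for "Hi" */
    }
    printf("\n");
    return 0;
}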
I had a project to write this program that reads a character, an integer, and a floating-point number. It also had to convert the character into its ASCII integer. I vaguely understand most of it except the code regarding the ASCII number. I created most of it on my own, but when it came to converting the character I had a buddy help me, and he is not the best at explaining this.
Any explanation would be very helpful, thank you. I did this all by trial and error for... probably far too long :P
#include <stdio.h>
#include <stdlib.h>
int main(void)
{
    char c;
    int i;
    float f;
    printf ("Please enter a character: ");
    scanf ("%c", &c);
    printf ("Please enter an integer: ");
    scanf ("%d", &i);
    printf ("Please enter a floating point number: ");
    scanf ("%f", &f);
    printf ("\n");
    printf ("The character you entered is: %c which is %d in integer-specification\n", c, c);
    printf ("The integer you entered is: %d.\n", i);
    printf ("The floating-point number you entered is: %f.\n", f);
    return 0;
}
It will be helpful to have the reference documentation for printf handy.
When the format specifier is %c, printf expects an int. A char works since it is promoted to an int.
When the format specifier is %d, printf expects an int. A char works since it is promoted to an int.
For the first case, printf prints the character corresponding to the value of the int.
For the second case, printf just prints the number.
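A minimal sketch of both cases (the printed values assume an ASCII system):
#include <stdio.h>
int main(void)
{
    char c = 'A';
    int  n = 66;
    printf("%c %d\n", c, c);   /* char promoted to int: prints "A 65" */
    printf("%c %d\n", n, n);   /* int passed directly:  prints "B 66" */
    return 0;
}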
Essentially a char is just a small integer, typically with a range of -128 to 127 or 0 to 255; the only thing that makes it a 'character' is that we have agreed that certain numbers represent certain characters.
What's going on in the printf() is that first you tell it that you're going to pass a char and that it should just output that byte for display; then you tell it to convert an integer to a string and output that.
What it boils down to is that the ASCII code is the character.
Note: for reasons that are not important at the moment when being passed into a vararg function like printf() chars and shorts are promoted to ints.
Here is my attempt at explaining this.
Everything in a program memory is stored as sequences of 0s and 1s (binary numbers). You are probably familiar with bits and bytes, but just in case: a binary digit (0 or 1) is a bit. A sequence of 8 bits is a byte. Characters, such as 'A', 'b', '1', etc. have to be stored as numbers in memory. There are different ways to map a character to a number: EBCDIC (very rare these days, was used on large mainframes in the old days), UTF-8, UTF-16, UTF-32 (types of Unicode), ASCII. These mappings are called character encodings, and ASCII is the simplest of those commonly used.
So, when you enter a character, it gets stored in the variable c as an integer number encoded using ASCII. The content of memory location corresponding to the variable can be thus viewed both as a character and an integer.
When using the printf() function you can request the same variable to be printed using different possible representations. In this case, since c is a character variable and can be represented as a character or a number, you can use the different format specifiers, %c and %d, to have it printed as character and integer, respectively.
The 1st of the printf() calls above takes 3 arguments. The first argument is a string that is printed out, with the items that start with % being used to interpret the remaining arguments. %c corresponds to the 2nd argument, %d to the 3rd. %c says: "treat the 2nd argument as a character and print it out." %d says: "treat the 3rd argument as an integer and print it out."
The other 2 printf() calls in your code take only 2 arguments. The 1st argument, the string, contains only one thing starting with % (format specifier), so we only need 1 additional argument to be interpreted using the format specifier.
Hope this is helpful.
Running man ascii will make you understand.
What happens if I read an integer like 20, 30, 10000 ... 9999 into variable a? It only prints the first digit of the number that I've read. Why is that?
For example, if I read 123, it prints 1 on the screen. Isn't it supposed to convert the integer 123 into its equivalent ASCII character representation?
#include <stdio.h>
int main() {
    char a;
    scanf("%c", &a);
    printf("%c", a);
    return 0;
}
This is an exam question on the C language.
No, it reads the character, which is represented by the machine as a small integer, into the variable.
If you enter 100 (the number 100, three keypresses and thus three characters), it will only store the first character of that, i.e. the leading 1.
If you wanted to convert a number to an actual integer, you should use %d and an int variable of course.
Printing with %c will print back a single character, by interpreting the small integer value as a character (rather than as an integer). So for an input of 100 you will see 1 printed back out, i.e. the character that represents the decimal digit one.
If you want to print out the numeric representation of the character you read in, scan with %c but print with %d, and cast the char to (int) in the printf() call.
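For example, a minimal sketch of that last suggestion:
#include <stdio.h>
int main(void)
{
    char a;
    scanf("%c", &a);            /* reads a single character, e.g. '1' */
    printf("%d\n", (int)a);     /* prints its character code, e.g. 49 */
    return 0;
}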
The problem is that %c parses a single char from console input. From a number like 123, it takes only the first character and leaves the rest in the input buffer. The way to parse an int value is to use %d in the scanf call.
No, it will read only the first character into the char variable. How can a char variable store more than one character at an instant? It can't.
So if you want the ASCII value, input as an integer instead.
int a;
scanf("%d", &a); // suppose input is 65
printf("%c", a); // prints 'A'
printf("%d", a); // prints 65
Whereas
char a;
scanf("%c", &a); // suppose input is 65
printf("%c", a); // prints '6'
printf("%d", a); // prints 54 which is the ASCII value of '6'
I want to type in a number that is 10 digits long, then put the digits into an array. But for some reason, I get these random 2-digit numbers that seem to have nothing to do with my input (??).
char number[10]; //number containing 10 digits
scanf("%s",number); //store digits of number
printf("%d\n",number[0]); //print the 1st digit in the number
printf("%d\n",number[1]); //print the 2nd digit in the number
Here is what I got:
Input:
1234567890
Output:
49
50
Actually, 49 should be 1, and 50 should be 2.
You are getting the ASCII values of the characters 1 and 2. Use the %c specifier to print the digits.
printf("%c\n",number[0]);
Warning! Your code may invoke undefined behaviour!
But we'll talk about it later. Let us address your actual question first.
Here is a step by step explanation of what is going on. The first thing you need to know is that every character literal in C is actually an integer for the compiler.
Try this code.
#include <stdio.h>
int main()
{
    printf("%zu\n", sizeof '1');
    return 0;
}
The output is:
4
This shows that the character literal '1' is represented as a 4-byte integer (an int) by the compiler. Now, let us see what this 4-byte integer for '1' is, using the next code.
#include <stdio.h>
int main()
{
    int a = '1';
    printf("a when interpreted as int : %d\n", a);
    printf("a when interpreted as char: %c\n", a);
    return 0;
}
Compile it and run it. You'll see this output.
a when interpreted as int : 49
a when interpreted as char: 1
What do we learn?
The character '1' is represented as the integer 49 on my system. This is so for your system too. That's because on my system as well as yours, the compiler is using ASCII codes for the characters, where '1' is 49, '2' is 50, 'A' is 65, 'B' is 66, and so on. Note that the mapping of characters to these codes could be different on another system. You should never rely on these integer codes to identify the characters.
So when I try to print this value as integer (using %d as the format specifier), well what gets printed is the integer value of '1' which is 49. However, if we print this value as a character (using %c as the format specifier), what gets printed is the character whose integer code is 49. In other words, 1 gets printed.
Now try this code.
#include <stdio.h>
int main()
{
    char s[] = "ABC123";
    int i;
    printf("char %%d %%c\n");
    printf("---- -- --\n");
    for (i = 0; i < 6; i++) {
        printf("s[%d] %d %c\n", i, s[i], s[i]);
    }
    return 0;
}
Now you should see this output.
char %d %c
---- -- --
s[0] 65 A
s[1] 66 B
s[2] 67 C
s[3] 49 1
s[4] 50 2
s[5] 51 3
Does it make sense now? You need to use the %c format specifier when you want to print the character. You should use %d only when you want to see the integer code that represents that character.
Finally, let us come back to your code. This is how you fix it.
#include <stdio.h>
int main()
{
    char number[10];
    scanf("%9[^\n]", number);
    printf("%c\n", number[0]);
    printf("%c\n", number[1]);
    return 0;
}
There are two things to note.
I have used %c as the format specifier to print the character representation of the digits read.
I have altered the format specifier for scanf to accept at most 9 characters, where the characters are not newline characters. This is to make sure that a user cannot crash your program by inputting a string that is far longer than 9 characters. Why 9 instead of 10? Because we need to leave one cell of the array for the null terminator. A longer input would overwrite memory locations beyond the 10 bytes allocated for the number array. Such buffer overruns lead to code that invokes undefined behaviour, which could either cause a crash or kill your cat.
printf("%c\n",number[0]); //print the 1st digit in the number
printf("%c\n",number[1]);
should do the job for you; what you see are ASCII values.
Your number array is an array of char, and so every element of it is a char.
When you type:
printf("%d\n",number[0]);
you are printing the chars as integers, and so you get the ASCII code for each char.
Change your statement to printf("%c\n",number[0]); to print the chars as chars, not as ints.
Warning! Your code invokes undefined behaviour!
char number[10]; // Can only store 9 digits and nul character
scanf("%s",number); // Inputting 1234567890 (11 chars) will overflow the array!
Use fgets instead:
#define MAX_LEN 10
char number[MAX_LEN];
if(fgets(number, MAX_LEN, stdin)) {
// all went ok
}
Once you have fixed this, you can fix the printing problem. You are printing the character code (a number), not the actual character. Use a different format specifier:
printf("%c\n",number[0]);