LOGIC of Converting a "char" into an "int" in C

Hi everyone!
Please help me understand the following problem...
So I will have a STRING-type input of a note, which looks like "A5" or "G#2" or "Cb4" etc. And I need to extract the octave index, which is the last digit: "5" or "2" or "4"... And after extraction I need it as an int-type.
So I did this:
string note = get_string("Input: ");
int octave = atoi(note[strlen(note) - 1]);
printf("your octave is %i \n", octave);
But it gave me an error "error: incompatible integer to pointer conversion passing 'char' to parameter of type 'const char *'; take the address with & [-Werror,-Wint-conversion]"
Then I tried to throw the math out of the function, and did this:
int extnum = strlen(note) - 1;
int octave = atoi(note[extnum]);
It didn't work either. So I did my research on the atoi function, and I don't get it...
ATOI expects a string (CHECK)
Converts it to an integer, not the ASCII meaning (CHECK)
Library for the atoi function included (CHECK)
What I am doing is basically asking "take the n-th character of that string and make it an int".
After googling for some time I found another code example where a guy uses atoi with the '&' symbol. So I did this:
int octave = atoi(&note[strlen(note) - 1]);
And IT WORKED! But I can't understand WHY it worked with the & symbol and didn't work without it... Because it always worked without it! There were a million times I gave it a single-character string like "5" and just used atoi, and it worked perfectly...
Please help me: why does it act so weird in this case?

C does not have a native string type. Strings are usually represented as a char array or a pointer to char.
I'm assuming that string is just a typedef for char *.
If note is an array of chars, note[strlen(note) - 1] is just the last character. Since atoi expects a pointer to char (which has to be null-terminated), you have to pass the address of the char and not its value.
The task of converting one digit character to an int can also be solved more simply:
int octave = note[strlen(note) - 1] - '0';
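Putting that together, here is a minimal sketch of the whole extraction, assuming CS50's cs50.h (where string is a typedef for char * and get_string reads a line), as in the original question:

#include <cs50.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    string note = get_string("Input: ");        // e.g. "G#2"
    int octave = note[strlen(note) - 1] - '0';  // last character minus '0'
    printf("your octave is %i\n", octave);
}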

The function atoi takes a pointer to a character array as its input parameter (const char *). The expression note[strlen(note) - 1] is a single character (char); to make atoi work you need to provide a pointer, which you do by adding & as you've done. This then works because right after that single digit there is a null character \0 that terminates the string, since your original string was null-terminated.
Note however that doing something like this would not be a good idea:
char a = '7';
int b = atoi(&a);
as there is no way to be sure what the next byte in memory is (following the byte that belongs to a), but the function will try to read it anyway, which can lead to undefined behaviour.
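For contrast, a minimal sketch of a safe variant: store the digit in a two-element array ending in '\0', so atoi sees a genuine null-terminated string:

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    char a[2] = {'7', '\0'};  // the string "7", properly terminated
    int b = atoi(a);          // well-defined: atoi reads '7', then stops at '\0'
    printf("%d\n", b);        // prints 7
    return 0;
}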

The last character is... well, a character! Not a string. So by adding the & sign, you made it a pointer to a character (char *)!
You can also try this code:
char a = '5';
int b = a - '0';
This gives you the ASCII code of '5' minus the ASCII code of '0', which is the integer 5.

From the manual pages, the signature of atoi is: int atoi(const char *nptr);. So, you need to pass the address of a char.
When you do this: atoi(note[strlen(note) - 1]), you pass the char itself, thus invoking UB.
When you use the &, you are passing what the function expects - the address. Hence, that works.

atoi expects a string (a pointer to characters), not a single character.
However, you should never use atoi, since that function has bad error handling. The function strtol performs the same conversion but safely.
You need to do this in two steps:
1) Find the first digit in the string.
2) From there, convert the text to an integer by calling strtol.
Step 1) is solved by looping through the string, checking whether each character is a digit by calling isdigit from ctype.h. Simultaneously, check for the end of the string, the null terminator \0. When you find the first digit, save a pointer to that address.
Step 2) is solved by passing the saved pointer to strtol, as in result = strtol(pointer, NULL, 10);. A sketch of both steps follows below.
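A minimal sketch of both steps, assuming a note string such as "G#2":

#include <ctype.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    const char *note = "G#2";
    const char *p = note;

    // Step 1: advance to the first digit, stopping at '\0' if there is none
    while (*p != '\0' && !isdigit((unsigned char)*p))
        p++;

    // Step 2: convert from the first digit onwards
    if (*p != '\0')
    {
        long octave = strtol(p, NULL, 10);
        printf("octave: %ld\n", octave);
    }
    return 0;
}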

Related

Scanf a Character into an Array in C

I'm trying to take a single character in an array and then print that character using a specific syntax. Here's my code:
int main(){
    char in[18];
    scanf("%c", in);
    printf("%c", in);
    return 0;
}
I know how to take a character from the user in C, and many other ways to do the same task, but I'm curious to know why this code prints nothing on the screen. Here's my explanation for this code. Kindly correct me if wrong.
First of all, an array of 18 characters is declared.
Using scanf, the character is stored in the 1st position of the array ("in" refers to the address of its first element).
Then, when I try to print that character, it prints nothing.
When I changed "in" to "in[0]", the character prints on the screen.
I think "in" also points to the 1st element, just as in[0] does. Then why am I getting two different answers?
Thanks in advance!!
in[0] does not point to the first element in the array. It is the first element in the array.
in has type char * (when passed to a function) while in[0] has type char. And the %c format specifier to printf expects a char, not a char *.
Your code invokes undefined behavior. The compiler might be warning you about the fact that the "%c" specifier expects a char (rigorously speaking, it expects an int parameter that is then converted to unsigned char), but you passed a char * (an array of char).
To make it print the character use
printf("%c", in[0]);
Passing the wrong type for a given format specifier in both printf() and scanf() is undefined behavior.
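A minimal sketch of the corrected program (the check of scanf's return value is my addition, not part of the original code):

#include <stdio.h>

int main(void)
{
    char in[18];
    if (scanf("%c", in) == 1)   // reads one character into in[0]
        printf("%c\n", in[0]);  // pass the char itself, not the array
    return 0;
}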

Understanding atoi(var-1) versus atoi(var)-1?

I had an issue where my C program allocated input data correctly only for values less than 5. I found the error in the creation of the int array holding the values: I had used atoi(var-1) instead of atoi(var)-1.
When var is "5", atoi(var-1) is 0 when printed out. Why does the erroneous char-to-int conversion break at the number "5"? And why does it become zero at that point?
I'm just curious about what actually happens with this.
When you write atoi(var - 1), where var is a char*, you are asking the function atoi to read the string which begins at the memory location one lower than var and convert that to an integer.
In general, the character that is at the lower memory address could be anything. You just happened to have it break when your char* was '5', but it could have happened anywhere.
On the other hand atoi(var) - 1 does exactly what you would expect, converting var to an int and then subtracting 1 numerically.
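A minimal sketch of the contrast; the buffer here is hypothetical, chosen so that var - 1 points at known memory (in the original code, that byte was arbitrary):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    const char buf[] = "45";
    const char *var = &buf[1];      // var points at the string "5"

    printf("%d\n", atoi(var) - 1);  // convert "5", then subtract: prints 4
    printf("%d\n", atoi(var - 1));  // reads from one byte earlier: prints 45 here,
                                    // but in general that byte could be anything
    return 0;
}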
Pointer arithmetic. If var is a string (char *), then var + n is the substring starting at offset n.
const char* s = "12345";
printf("%d\n", atoi(s + 2)); // prints 345
Subtraction is allowed as well: var - 1 is a pointer to one character before the string. This may be anything, but is probably a non-digit character, so atoi returns 0.

atoi() doesn't like what I am doing

I am trying to solve a homework problem. The instructions are to code a Vigenere cipher in C.
C is not liking the following piece of code:
rot = atoi(argv[1][index]) - 'A';
rot has been declared as an integer;
index is also an integer;
argv[1] is a string (char array) passed to C from the command-line by the user;
A bit of an explanation of the code above.
argv[1] is a string array (passed from the user at the command-line). I am using each character in this string to encode the user's message. To 'get' to each character, I am using the code above. When some branches of the code are executed, I increase index by 1 (not directly since it needs to wrap around the key if the message has more characters) to get to the next char.
I get the following error when I try to compile: http://ideone.com/pjPGlT
atoi() expects a "string", really a char *; you are passing it a char.
You may try with:
rot = argv[1][index] - 'A';
Also, atoi() expects a "string"; presently you are passing a single character to it.
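A minimal sketch of the fixed line in context (the assumption that the key is uppercase letters is mine, not from the original post):

#include <stdio.h>

int main(int argc, char *argv[])
{
    if (argc < 2)
        return 1;

    int index = 0;
    int rot = argv[1][index] - 'A';  // 'A' -> 0, 'B' -> 1, ... for an uppercase key
    printf("rot = %d\n", rot);
    return 0;
}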

New to C: what's wrong with my program?

I know my way around ruby pretty well and am teaching myself C starting with a few toy programs. This one is just to calculate the average of a string of numbers I enter as an argument.
#include <stdio.h>
#include <string.h>

int main(int argc, char *argv[])
{
    char *token;
    int sum = 0;
    int count = 0;

    token = strtok(argv[1], ",");
    while (token != NULL)
    {
        count++;
        sum += (int)*token;
        token = strtok(NULL, ",");
    }
    printf("Avg: %d", sum / count);
    printf("\n");
    return 0;
}
The output is:
mike#sleepycat:~/projects/cee$ ./avg 1,1
Avg: 49
Which clearly needs some adjustment.
Any improvements and an explanation would be appreciated.
Look for sscanf or atoi as functions to convert from a string (array of characters) to an integer.
Unlike higher-level languages, C doesn't automatically convert between string and integral/real data types.
49 is the ASCII value of the char '1'.
That should be helpful to you... :D
The problem is the character "1" is 49. You have to convert the character value to an integer and then average.
In C, if you cast a char to an int you just get its ASCII value. So you're averaging the ASCII value of the character '1' twice, and 49 is exactly what you'd expect.
You probably want to use atoi().
EDIT: Note that this is generally true of all typecasts in C. C doesn't reinterpret values for you, it trusts you to know what exists at a given location.
strtok()
Please, please do not use this. Even its own documentation says never to use it. I don't know how you, as a Ruby programmer, found out about its existence, but please forget about it.
(int)*token
This is not even close to doing what you want. There are two fundamental problems:
1) A char* does not "contain" text. It points at text. token is of type char*; therefore *token is of type char. That is, a single byte, not a string. Note that I said "byte", not "character", because the name char is actually wrong - an understandable oversight on the part of the language designers, because Unicode did not exist back then. Please understand that char is fundamentally a numeric type. There is no real text type in C! Interpreting a sequence of char values as text is just a convention.
2) Casting in C does not perform any kind of magical conversions.
What your code does is to grab the byte that token points at (after the strtok() call), and cast that numeric value to int. The byte that is rendered with the symbol 1 actually has a value of 49. Again, interpreting a sequence of bytes as text is just a convention, and thus interpreting a byte as a character is just a convention - specifically, here we are using the convention known as ASCII. When you hit the 1 key on your keyboard, and later hit enter to run the program, the chain of events set in motion by the command window actually passed a byte with the value 49 to your program. (In the same way, the comma has a value of 44.)
Both of the above problems are solved by using the proper tools to parse the input. Look up sscanf(). However, you don't even want to pass the input to your program this way, because you can't put any spaces in the input - each "word" on the command line will be passed as a separate entry in the argv[] array.
What you should do, in fact, is take advantage of that, by just expecting each entry in argv[] to represent one number. You can again use sscanf() to parse each entry, and it will be much easier.
Finally:
printf("Avg: %d", sum/count)
The quotient sum/count will not give you a decimal result. Dividing an integer by another integer yields an integer in C, discarding the remainder.
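A minimal sketch of the approach suggested above: one number per command-line argument, parsed with sscanf (the exact argument handling is my assumption):

#include <stdio.h>

int main(int argc, char *argv[])
{
    int sum = 0, count = 0;

    for (int i = 1; i < argc; i++)
    {
        int value;
        if (sscanf(argv[i], "%d", &value) == 1)  // parse one integer per argument
        {
            sum += value;
            count++;
        }
    }

    if (count > 0)
        printf("Avg: %d\n", sum / count);  // note: integer division truncates
    return 0;
}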
In this line: sum += (int)*token;
Casting a char to an int takes the ASCII value of the char. For '1', this value is 49.
Use the atoi function instead:
sum += atoi(token);
Note that atoi is declared in stdlib.h, so you'll need to #include that as well.
You can't convert a string to an integer via
sum += (int)*token;
Instead you have to call a function like atoi():
sum += atoi(token);
When you cast a char (which is what *token is) to int, you get its ASCII value in C, which is 49... so the average of the chars' ASCII values is in fact 49. You need to use atoi to get the value of the number represented.

String parsing in C

How would you parse the string "1234567" into individual numbers?
char mystring[] = "1234567";
Each digit is going to be mystring[n] - '0'.
What Delan said. Also, it's probably bad practice for maintainability to use ASCII-dependent trickery. Try using this one from the standard library:
int atoi(const char *str);
EDIT: Much better idea (the one above has been pointed out to me as a slow way to do it): put a function like this in:
int ASCIIdigitToInt(char c){
    return (int) c - '0';
}
and iterate this along your string.
Don't forget that a string in C is actually an array of type char. You could walk through the array, grab each individual character by array index, and subtract the ASCII value of '0' (which can be written as '0') from that character's ASCII value.
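A minimal sketch of that walk (the printing is just for illustration):

#include <stdio.h>
#include <string.h>

int main(void)
{
    char mystring[] = "1234567";

    for (size_t n = 0; n < strlen(mystring); n++)
    {
        int digit = mystring[n] - '0';  // '1' - '0' == 1, and so on
        printf("%d\n", digit);
    }
    return 0;
}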
