I am trying to solve a homework problem. The instructions are to code a Vigenere cipher in C.
C is not liking the following piece of code:
rot = atoi(argv[1][index]) - 'A';
rot is declared as an int;
index is also an int;
argv[1] is a string (char array) passed to the program from the command line by the user.
A bit of explanation of the code above:
argv[1] is the key string (passed by the user at the command line). I am using each character in this string to encode the user's message, and the line above is how I 'get' to each character. In some branches of the code I advance index by 1 (not directly, since it needs to wrap around the key if the message has more characters than the key) to move to the next character.
I get the following error when I try to compile: http://ideone.com/pjPGlT
atoi() expects a "string" (really a char *), but you are passing it a char.
You may try with:
rot = argv[1][index] - 'A';
Also note that atoi() expects a "string"; here you are passing it a single character.
I have been using this code:
char options_string[96];
sprintf(options_string,"%s_G%u", options_string, options.allowed_nucleotide_gap_between_CpN);
which just writes unsigned integers, mixed with some letters, into a string.
But the new version 9 of GCC, which I just started using, warns me:
warning: passing argument 1 to restrict-qualified parameter aliases with argument 3 [-Wrestrict]
 1012 |     sprintf(options_string, "%s_G%u", options_string,
      |             ^~~~~~~~~~~~~~            ~~~~~~~~~~~~~~
      |             options.allowed_nucleotide_gap_between_CpN);
I've read that the best way to make a string like this is to use sprintf, as I have: How to convert an int to string in C?
I've re-checked the code, and I'm not using any restrict keywords.
How can I write to this string without the warning?
The code causes undefined behaviour because the same char buffer is used as both the input and the output of sprintf. The warning is useful information in this case. To be correct, you must change the code so there is no overlap between inputs and outputs.
For example, you could find the end of the current string and start writing from there. It would also be wise to guard against overflowing the buffer with the output.
Possible code:
char options_string[96];
// ...assume some omitted code writes a valid string here
size_t upto = strlen(options_string);
int written = snprintf(options_string + upto, sizeof options_string - upto,
                       "_G%u", options.allowed_nucleotide_gap_between_CpN);
if (written < 0 || written + upto >= sizeof options_string)
{
    // ...whatever you want to do if the options don't fit in the buffer
}
A conforming implementation of sprintf could start by writing a zero byte to the destination, then replacing that byte with the first byte of output (if any) and writing a zero after that, then replacing that second zero byte with the next byte of output and writing a third zero, etc. Such an approach would avoid the need to have it take any particular action (such as writing a terminating zero) after processing the last byte. An attempt to use your code with such an implementation, however, would fail since options_string would effectively get cleared before code could read it.
The warning you're receiving is thus an indication that your code may not work as written.
In your case it is better to use the strcat function instead of sprintf, since what you want is to concatenate onto a string.
Hi everyone!
Please help me understand the following problem.
I have a string input for a note, which looks like "A5" or "G#2" or "Cb4", etc. I need to extract the octave index, which is the last digit: "5" or "2" or "4"... And after extraction I need it as an int.
So I did this:
string note = get_string("Input: ");
int octave = atoi(note[strlen(note) - 1]);
printf("your octave is %i \n", octave);
But it gave me an error "error: incompatible integer to pointer conversion passing 'char' to parameter of type 'const char *'; take the address with & [-Werror,-Wint-conversion]"
Then I tried moving the math out of the function call, like this:
int extnum = strlen(note) - 1;
int octave = atoi(note[extnum]);
It didn't work either. So I did my research on the atoi function and I don't get it...
atoi expects a string (CHECK)
Converts it to an integer, not the ASCII value (CHECK)
Included the library for atoi (CHECK)
What I am doing is basically asking "take the n-th character of that string and make it an int".
After googling for some time I found another code example where someone uses atoi with the & symbol. So I did this:
int octave = atoi(&note[strlen(note) - 1]);
And IT WORKED! But I can't understand WHY it worked with the & symbol and didn't work without it... It always worked without it before! There were a million times I passed a single-character string like "5" to atoi and it worked perfectly...
Please help me: why does it act so weird in this case?
C does not have a native string type. Strings are usually represented as char arrays or pointers to char.
Assuming that string is just a typedef for char *:
If note is an array of chars, note[strlen(note) - 1] is just the last character. Since atoi expects a pointer to char (pointing to a null-terminated sequence), you have to pass the address of the char and not its value.
The task to convert one char digit to int could also be solved easier:
int octave = note[strlen(note) - 1] - '0';
The function atoi takes a pointer to a character array as the input parameter (const char*). When you call note[strlen(note) - 1] this is a single character (char), in order to make atoi work you need to provide the pointer. You do that by adding & as you've done. This then works, because right after that single digit there is a null character \0 that terminates the string - because your original string was null-terminated.
Note however that doing something like this would not be a good idea:
char a = '7';
int b = atoi(&a);
as there is no way to be sure what the next byte in memory is (following the byte that belongs to a), but the function will try to read it anyway, which can lead to undefined behaviour.
The last character is... well, a character! Not a string. So by adding the & sign, you made it a pointer to char (char *)!
You can also try this code:
char a = '5';
int b = a - '0';
This gives you the ASCII code of '5' minus the ASCII code of '0', which is the digit value 5.
From the manual pages, the signature of atoi is: int atoi(const char *nptr);. So, you need to pass the address of a char.
When you do this: atoi(note[strlen(note) - 1]) you pass the char itself. Thus, invoking UB.
When you use the &, you are passing what the function expects - the address. Hence, that works.
atoi expects a string (a pointer to char), not a single character.
However, you should avoid atoi, since that function has no error handling. The function strtol does the same conversion but safely.
You need to do this in two steps:
Find the first digit in the string.
From there, convert it to integer by calling strtol.
1) is solved by looping through the string, checking whether each character is a digit by calling isdigit from ctype.h. Simultaneously, check for the end of the string, the null terminator \0. When you find the first digit, save a pointer to that address.
2) is solved by passing the saved pointer to strtol, such as result = strtol(pointer, NULL, 10);.
I have an Arduino project with a string, called string, which is four digits, each between 0 and 9. So for example, a possible value is 1200. I'd like to take the first character, 1, and assign it to another string, called xCo.
String string = String(c);
String xCo = String(string[0]);
Serial.print(xCo);
Strangely, the Serial.print(xCo); line doesn't print just the first character, 1; it prints the whole string. I've read answers to other questions saying that to reference a particular character, you just index into the string with something like string[0]. Yet this isn't working for me.
What am I doing wrong here?
Edit: As the commenters have pointed out, String is an Arduino type, at least I'm pretty sure. My C and Arduino experience is very limited, so I can't be sure.
If you need to get the value of a character at a given position in a string, use charAt().
String string = "1200";
char singleCharacter = string.charAt(0);
Serial.print(singleCharacter);
A lot of people recommend not using String. The simplest alternative is a plain char *:
const char *foo = "1200";
char c = foo[0];
Can someone explain how to convert a string of decimal ASCII codes into its character representation in C? For example, the user could type 097 and the function would print 'a' on the screen; the user could also type in 097100101 and the function would have to print 'ade', etc. I have written something clunky that does the opposite operation:
char word[30];
int i = 0;

scanf("%s", word);
while (word[i] != 0)
{
    if (word[i] < 'd')
        printf("0%d", (int)word[i]);
    if (word[i] >= 'd')
        printf("%d", (int)word[i]);
    i++;
}
but it works. Now I want a function that works in a similar way but of course does the decimal > char conversion. The point is, I cannot use any functions like atoi or anything like that (not sure about the names, never used them ;)).
You can use this function instead of atoi:
char a3toc(const char *ptr)
{
    return (ptr[0] - '0') * 100 + (ptr[1] - '0') * 10 + (ptr[2] - '0');
}
So, a3toc("102") will return the same thing as (char) 102, which is an 'f'.
If you don't see why, substitute in the values: ptr[0] is '1', so the first part becomes ('1'-'0')*100 or 1*100 or 100, which is what that first 1 in 102 represents.
Tokenize the input string. I'm assuming you require that every letter be represented by exactly 3 characters, so break the string up that way, and simply use an explicit cast to get the desired character.
I don't think I should be giving you the code for this, since it is pretty easy and seems more like a Homework question.
I know my way around ruby pretty well and am teaching myself C starting with a few toy programs. This one is just to calculate the average of a string of numbers I enter as an argument.
#include <stdio.h>
#include <string.h>

int main(int argc, char *argv[])
{
    char *token;
    int sum = 0;
    int count = 0;

    token = strtok(argv[1], ",");
    while (token != NULL)
    {
        count++;
        sum += (int)*token;
        token = strtok(NULL, ",");
    }
    printf("Avg: %d", sum / count);
    printf("\n");
    return 0;
}
The output is:
mike#sleepycat:~/projects/cee$ ./avg 1,1
Avg: 49
Which clearly needs some adjustment.
Any improvements and an explanation would be appreciated.
Look for sscanf or atoi as functions to convert from a string (array of characters) to an integer.
Unlike higher-level languages, C doesn't automatically convert between string and integral/real data types.
49 is the ASCII value of the char '1'.
Hope that helps! :D
The problem is that the character '1' has the value 49. You have to convert the character value to an integer and then average.
In C, if you cast a char to an int you just get its ASCII value. So you're averaging the ASCII value of the character '1' twice, and getting exactly that: 49.
You probably want to use atoi().
EDIT: Note that this is generally true of all typecasts in C. C doesn't reinterpret values for you, it trusts you to know what exists at a given location.
strtok()
Please, please do not use this. Even its own documentation says never to use it. I don't know how you, as a Ruby programmer, found out about its existence, but please forget about it.
(int)*token
This is not even close to doing what you want. There are two fundamental problems:
1) A char* does not "contain" text. It points at text. token is of type char*; therefore *token is of type char. That is, a single byte, not a string. Note that I said "byte", not "character", because the name char is actually wrong - an understandable oversight on the part of the language designers, because Unicode did not exist back then. Please understand that char is fundamentally a numeric type. There is no real text type in C! Interpreting a sequence of char values as text is just a convention.
2) Casting in C does not perform any kind of magical conversions.
What your code does is to grab the byte that token points at (after the strtok() call), and cast that numeric value to int. The byte that is rendered with the symbol 1 actually has a value of 49. Again, interpreting a sequence of bytes as text is just a convention, and thus interpreting a byte as a character is just a convention - specifically, here we are using the convention known as ASCII. When you hit the 1 key on your keyboard, and later hit enter to run the program, the chain of events set in motion by the command window actually passed a byte with the value 49 to your program. (In the same way, the comma has a value of 44.)
Both of the above problems are solved by using the proper tools to parse the input. Look up sscanf(). However, you don't even want to pass the input to your program this way, because you can't put any spaces in the input - each "word" on the command line will be passed as a separate entry in the argv[] array.
What you should do, in fact, is take advantage of that, by just expecting each entry in argv[] to represent one number. You can again use sscanf() to parse each entry, and it will be much easier.
Finally:
printf("Avg: %d", sum/count)
The quotient sum/count will not give you a decimal result. Dividing an integer by another integer yields an integer in C, discarding the remainder.
In this line: sum += (int)*token;
Casting a char to an int gives the ASCII value of the char; for '1', this value is 49.
Use the atoi function instead:
sum += atoi(token);
Note that atoi is declared in stdlib.h, so you'll need to #include <stdlib.h> as well.
You can't convert a string to an integer via
sum += (int)*token;
Instead you have to call a function like atoi():
sum += atoi (token);
When you cast a char (which is what *token is) to int, you get its ASCII value in C, which is 49 for '1'... so the average of the characters' ASCII values is in fact 49. You need to use atoi to get the value of the number represented.