I have vague memories of suggestions that sscanf was bad. I know it won't overflow buffers if I use the field-width specifier, so is my memory just playing tricks on me?
I think it depends on how you're using it: if you're scanning for something like an int, it's fine. If you're scanning for a string, it's not (unless there's a width field I'm forgetting?).
Edit:
It's not always safe for scanning strings.
If your buffer size is a constant, then you can certainly write it into the format string as something like %20s. But if it's not a constant, you need to build the format string at run time, and you'd need to do something like:
char format[80]; //Make sure this is big enough... kinda painful
sprintf(format, "%%%ds", cchBuffer - 1); //Don't miss the percent signs and - 1!
sscanf(input, format, buffer); //Good luck
which is possible but very easy to get wrong, like I did in my previous edit (forgot to take care of the null-terminator). You might even overflow the format string buffer.
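For completeness, here is a hedged sketch of the whole approach done in one place; input, buffer, and format are just illustrative names, and snprintf is used instead of sprintf so the format buffer itself can't overflow:

#include <stdio.h>

int main(void) {
    /* 'input' and 'buffer' are illustrative names, not from the post above */
    const char *input = "a-possibly-very-long-token";
    char buffer[64];
    char format[16];

    /* build "%63s" at run time; snprintf can't overflow 'format' */
    snprintf(format, sizeof format, "%%%zus", sizeof buffer - 1);

    if (sscanf(input, format, buffer) == 1) {
        /* buffer holds at most sizeof buffer - 1 characters plus the '\0' */
        printf("read: %s\n", buffer);
    }
    return 0;
}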
The reason why sscanf might be considered bad is that it doesn't require you to specify a maximum field width for string arguments, which can result in overflows if the input read from the source string is longer than the destination buffer. So the precise answer is: it is safe if you specify widths properly in the format string, otherwise not.
Note that as long as your buffers are at least as long as strlen(input_string)+1, there is no way the %s or %[ specifiers can overflow. You can also use field widths in the specifiers if you want to enforce stricter limits, or you can use %*s and %*[ to suppress assignment and instead use %n before and after to get the offsets in the original string, and then use those to read the resulting sub-string in-place from the input string.
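For illustration, a minimal sketch of the %n/in-place technique just described (the input string here is made up):

#include <stdio.h>

int main(void) {
    const char *input = "  hello,world";
    int start = -1, end = -1;

    /* skip leading whitespace, record offsets around a suppressed %*[ scan */
    sscanf(input, " %n%*[^,]%n", &start, &end);
    if (start >= 0 && end > start) {
        /* the field lives at input+start and is end-start bytes long */
        printf("%.*s\n", end - start, input + start);   /* prints "hello" */
    }
    return 0;
}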
Yes it is, if you specify the string width so there are no buffer-overflow-related problems.
Anyway, as @Mehrdad showed us, there can be problems if the buffer size isn't established at compile time. I suppose that putting a limit on the length of a string that can be supplied to sscanf could eliminate the problem.
All of the scanf functions have fundamental design flaws, only some of which could be fixed. They should not be used in production code.
Numeric conversion has full-on demons-fly-out-of-your-nose undefined behavior if a value overflows the representable range of the variable you're storing the value in. I am not making this up. The C library is allowed to crash your program just because somebody typed too many input digits. Even if it doesn't crash, it's not obliged to do anything sensible. There is no workaround.
As pointed out in several other answers, %s is just as dangerous as the infamous gets. It's possible to avoid this by using either the 'm' modifier, or a field width, but you have to remember to do that for every single text field you want to convert, and you have to wire the field widths into the format string -- you can't pass sizeof(buff) as an argument.
If the input does not exactly match the format string, sscanf doesn't tell you how many characters into the input buffer it got before it gave up. This means the only practical error-recovery policy is to discard the entire input buffer. This can be OK if you are processing a file that's a simple linear array of records of some sort (e.g. with a CSV file, "skip the malformed line and go on to the next one" is a sensible error recovery policy), but if the input has any more structure than that, you're hosed.
In C, parse jobs that aren't complicated enough to justify using lex and yacc are generally best done either with POSIX regexps (regex.h) or with hand-rolled string parsing. The strto* numeric conversion functions do have well-specified and useful behavior on overflow and do tell you how many characters of input they consumed, and string.h has lots of handy functions for hand-rolled parsers (strchr, strcspn, strsep, etc.).
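As a hedged illustration of the hand-rolled style recommended here, parsing a made-up "key=value" line with strchr and strtoul:

#include <errno.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void) {
    const char *line = "count=123";              /* illustrative input */
    const char *eq = strchr(line, '=');
    if (!eq) {
        fprintf(stderr, "no '=' found\n");
        return 1;
    }

    errno = 0;
    char *end;
    unsigned long value = strtoul(eq + 1, &end, 10);
    if (errno == ERANGE || end == eq + 1 || *end != '\0') {
        /* strtoul tells us exactly where and why the conversion failed */
        fprintf(stderr, "bad number at offset %td\n", (eq + 1) - line);
        return 1;
    }

    printf("key length %td, value %lu\n", eq - line, value);
    return 0;
}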
There are two points to take care of.
The output buffer(s).
As mentioned by others, if the width you specify in the format string is smaller than the output buffer size (leaving room for the terminating null), you are safe.
The input buffer.
Here you need to make sure that it is a null-terminated string, or that sscanf will not read past the end of the input buffer.
If the input string is not null-terminated, sscanf may read past the boundary of the buffer and crash if that memory is not allocated.
Related
Is this ever acceptable?
fprintf(fp,"Just a string");
or
fprintf(fp,stringvariable);
versus
fprintf(fp,"%s","Just a string");
It seems confusing to me, as the string variable (or constant) is used as the format string rather than as the output itself. If the string variable had format-specific content ('%s', etc.) then the output would not be as intended.
For string-only output (no formatting) which is better?
fprintf(fp,"%s",stringvariable);
or
fputs(stringvariable,fp);
It is acceptable if you "know" the string variable to be "clean", and if you don't care about the warning most modern compilers generate for that construct. Because:
If your string contains conversion specifiers "by accident", you are invoking undefined behaviour.
If you read that string from somewhere, a malicious attacker could exploit point 1. above to his ends.
It's generally better to use puts() or fputs() as they avoid this problem, and consequently don't generate a warning. (puts() also tosses in an automatic '\n'.)
The *puts() functions also have (marginally) better performance. *printf(), even on nothing more than "%s" as format string, still has to parse that conversion specifier, and count the number of characters printed for its return value.
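A small hedged illustration of the difference (the variable userinput is made up):

#include <stdio.h>

int main(void) {
    /* a string that happens to contain a '%' */
    const char *userinput = "100% done";

    /* fprintf(stdout, userinput);  -- undefined behavior: "% d" is parsed
       as a conversion specifier with no matching argument */
    fprintf(stdout, "%s\n", userinput);  /* safe: the string is data        */
    fputs(userinput, stdout);            /* safe: no format parsing at all  */
    fputc('\n', stdout);
    return 0;
}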
Thanks to users 'rici' and 'Grady Player' for pointing out the character counting and compiler warning. My C got a bit rusty it seems. ;-)
My understanding is that most implementations of printf rely on something like
vsnprintf( _acBuffer[0], sizeof( _acBuffer[0] ), pcFormat, *ptArgList );
to actually handle the formatting, and then output the result to the stream via puts.
Are there any implementations that minimize the size of _acBuffer[0] required while maintaining the ability to print any string?
Obviously something like :
printf("%s", pcReallyLongString);
would be a problem.
Your thoughts are much appreciated!
Your understanding is just wrong. I've never seen or heard of a printf implementation that works by first formatting the entire output to a temporary string buffer. Usually printf is done the other way around: the fundamental building block is vfprintf and vsnprintf is a wrapper for that which creates a fake FILE whose buffer is the destination string.
Edit: Some popular (e.g. glibc) implementations do make some use of unboundedly-large intermediate buffers for certain formats, especially wide character conversions, and will fail unpredictably when they cannot allocate sufficient memory for the buffer. This is purely a low-quality-implementation issue, however; there's no fundamental reason any of the printf functions should require any more than a small constant amount of working space, regardless of what they're printing.
I'd say that the whole point of the fprintf (or printf) specification being the way it is is to allow a "bufferless" one-pass implementation of this function. I.e. it converts the data sequentially, piece by piece (if conversion is required), immediately sends it to the output and forgets about it for good. The function can use an intermediate buffer for numeric data conversion, but that is a temporary buffer of fixed and insignificant compile-time size.
Unless I'm missing something, a properly implemented fprintf function should impose absolutely no limitations on how long the resultant string is. Your hypothetical implementation through vsnprintf would violate that principle.
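To make the idea concrete, here is a minimal, hedged sketch of such a one-pass formatter; it handles only %s and %d and is an illustration, not how any real libc implements vfprintf:

#include <stdarg.h>
#include <stdio.h>

/* The only buffer is a small fixed-size one for numeric conversion, so
   arbitrarily long %s arguments pose no problem. */
static void tiny_fprintf(FILE *fp, const char *fmt, ...) {
    va_list ap;
    va_start(ap, fmt);
    for (const char *p = fmt; *p; ++p) {
        if (*p != '%' || p[1] == '\0') {
            fputc(*p, fp);                            /* ordinary character */
            continue;
        }
        switch (*++p) {
        case 's':
            fputs(va_arg(ap, const char *), fp);      /* streamed directly  */
            break;
        case 'd': {
            char num[32];                             /* fixed-size scratch */
            snprintf(num, sizeof num, "%d", va_arg(ap, int));
            fputs(num, fp);
            break;
        }
        default:
            fputc(*p, fp);                            /* echo unknown spec  */
        }
    }
    va_end(ap);
}

Calling tiny_fprintf(stdout, "%d items, %s\n", 3, pcReallyLongString) streams the long string straight to the FILE without ever staging it in an intermediate buffer.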
I have a file like this:
10 15
something
I want to read this into three variables, let's say number1, number2, and mystring. I have doubts about what kind of pattern to give to fscanf. I am thinking of something like this:
fscanf(fp,"%i %i\n%s",number1,number2,mystring);
Should this work, and also, is this the correct way of reading this file? If not, what would you suggest?
fscanf(fp,"%i %i\n%s",&number1,&number2,mystring);
fscanf takes pointers.
Read each line with fgets (or getline if you have it), split up the line with strsep (better, if available) or strtok_r (more awkward API but more portable), and then use strtoul to convert strings to numbers as necessary.
*scanf should never be used, because:
Some format strings (e.g. a bare "%s") are just as eager to overflow your buffers as gets is.
Behavior on integer overflow is undefined -- invalid input can potentially crash your program.
They do not report the character position of the first scan error, making it nigh-impossible to recover from a parse error. (This can be somewhat mitigated by using fgets and then sscanf instead of fscanf.)
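A hedged sketch of that approach for the file in the question (error handling kept minimal; the file name is an assumption):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void) {
    FILE *fp = fopen("input.txt", "r");       /* file name is made up */
    if (!fp) return 1;

    char line[256];
    long number1 = 0, number2 = 0;
    char mystring[256] = "";

    if (fgets(line, sizeof line, fp)) {
        char *end;
        number1 = strtol(line, &end, 10);     /* first number  */
        number2 = strtol(end, &end, 10);      /* second number */
    }
    if (fgets(line, sizeof line, fp)) {
        line[strcspn(line, "\n")] = '\0';     /* strip trailing newline */
        snprintf(mystring, sizeof mystring, "%s", line);
    }
    fclose(fp);

    printf("%ld %ld %s\n", number1, number2, mystring);
    return 0;
}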
Besides the problem with pointers, generally using spaces in the scanf format is a mistake -- in most cases scanf skips whitespace automatically. So I would use something like:
int number1, number2;
char mystring[32];
fscanf("%i%i%31s", &number1, &number2, &mystring)
This will read two numbers followed by a string of up to 31 non-whitespace characters, all separated by any whitespace. Note that "whitespace" includes spaces, tabs, and newlines, so it doesn't matter whether it's all on one line, spread out over 3 lines, or anything in between.
Note also the use of a limit on the size of the string -- without it, the input might overflow any fixed-size buffer you provide (and there's no way to pass a variable-sized buffer to scanf).
I wrote up a quick memory reader class that emulates the same functions as fread and fscanf.
Basically, I used memcpy and advanced an internal pointer to read the data like fread does, but I also need an fscanf_s equivalent. I used sscanf_s, except that it doesn't tell me how many bytes it read out of the data.
Is there a way to tell how many bytes sscanf_s read in the last operation in order to increase the internal pointer of the string reader? Thanks!
EDIT:
An example format I am reading is:
|172|44|40|128|32|28|
fscanf reads that fine, and so does sscanf. The only issue is that if it were instead:
|0|0|0|0|0|0|
The length would be different. What I'm wondering is how fscanf knows where to put the file pointer, but sscanf doesn't.
With scanf and family, use %n in the format string. It won't read anything in, but it will cause the number of characters read so far (by this call) to be stored in the corresponding parameter (expects an int*).
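A hedged sketch of how %n can drive an internal pointer, using a record like the one in the question (the exact names are illustrative):

#include <stdio.h>

int main(void) {
    const char *data = "|172|44|40|128|32|28|";
    int a, b, consumed = 0;

    /* %n records how many characters the call consumed so far */
    if (sscanf(data, "|%d|%d|%n", &a, &b, &consumed) == 2 && consumed > 0) {
        data += consumed;        /* data now points at "40|128|32|28|" */
        printf("read %d and %d, consumed %d bytes\n", a, b, consumed);
    }
    return 0;
}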
Maybe I'm silly, but I'm going to try anyway. It seems from the comment threads that there's still some misconception: you need to know the number of bytes consumed, but the function returns only the number of fields read, or EOF.
To get at the number of bytes, either scan something you can easily count, or use a size specifier in the format string. Otherwise you won't stand a chance of finding out how many bytes were read, other than going over the fields one by one. Also, what you may mean is that
sscanf_s(source, "%d%d"...)
will succeed on both inputs "123 456" and "10\t30", which have different lengths. In these cases there's no way to tell the size unless you convert the values back. So: use a fixed-size field, or be left in oblivion...
Important note: remember that %c is the only way to include the field separators (newline, tab and space) in the output. All other specifiers skip the field boundaries, making it harder to find the right number of bytes.
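A quick illustrative snippet of that difference (made-up input):

#include <stdio.h>

int main(void) {
    char c1, c2, c3, word[8];

    sscanf("a b", "%c%c%c", &c1, &c2, &c3);  /* c2 is the space character  */
    sscanf("a b", "%7s", word);              /* word is "a": stops at ' '  */

    printf("[%c][%c][%c] [%s]\n", c1, c2, c3, word);
    return 0;
}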
EDIT:
From "C++ The Complete Reference" I just read that:
%n Receives an integer value equal to the number of characters read so far
Isn't that precisely what you were after? Just add it in the format string. This is confirmed here, but I haven't tested it with sscanf_s.
From MSDN:
sscanf_s, _sscanf_s_l, swscanf_s, _swscanf_s_l
Each of these functions returns the number of fields successfully converted and assigned; the return value does not include fields that were read but not assigned. A return value of 0 indicates that no fields were assigned. The return value is EOF for an error or if the end of the string is reached before the first conversion.
From man gets:
Never use gets(). Because it is impossible to tell without knowing the data in advance how many characters gets() will read, and because gets() will continue to store characters past the end of the buffer, it is extremely dangerous to use. It has been used to break computer security. Use fgets() instead.
Almost everywhere, I see scanf being used in a way that should have the same problem (buffer overflow/buffer overrun): scanf("%s", string). Does this problem exist in this case? Why are there no references to it in the scanf man page? Why does gcc not warn when compiling this with -Wall?
ps: I know that there is a way to specify in the format string the maximum length of the string with scanf:
char str[10];
scanf("%9s",str);
edit: I am not asking you to determine whether the preceding code is right or not. My question is: if scanf("%s", string) is always wrong, why are there no warnings, and why is there nothing about it in the man page?
The answer is simply that no-one has written the code in GCC to produce that warning.
As you point out, a warning for the specific case of "%s" (with no field width) is quite appropriate.
However, bear in mind that this is only the case for scanf(), vscanf(), fscanf() and vfscanf(). This format specifier can be perfectly safe with sscanf() and vsscanf(), so the warning should not be issued in that case. This means that you cannot simply add it to the existing "scanf-style-format-string" analysis code; you would have to split that into "fscanf-style-format-string" and "sscanf-style-format-string" options.
I'm sure if you produce a patch for the latest version of GCC it stands a good chance of being accepted (and of course, you will need to submit patches for the glibc header files too).
Using gets() is never safe. scanf() can be used safely, as you said in your question. However, determining if you're using it safely is a more difficult problem for the compiler to work out (e.g. if you're calling scanf() in a function where you pass in the buffer and a character count as arguments, it won't be able to tell); in that case, it has to assume that you know what you're doing.
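For example (a hedged sketch; read_word and its parameters are made up, not from any real API):

#include <stdio.h>

/* Inside this helper, the compiler cannot know whether 'len' really
   matches the size of the buffer the caller passed in. */
static void read_word(char *buf, size_t len) {
    char fmt[16];
    snprintf(fmt, sizeof fmt, "%%%zus", len - 1);  /* e.g. "%31s" for len 32 */
    if (scanf(fmt, buf) != 1)
        buf[0] = '\0';
}

A caller would do something like char name[32]; read_word(name, sizeof name); -- and only the caller knows whether those two really match.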
When the compiler looks at the format string of scanf, it sees just a string! That's assuming the format string is not constructed at run time. Some compilers, like GCC, have extra functionality to analyze the format string if it is known at compile time. That extra functionality is not comprehensive, because in some situations a run-time overhead would be needed, which is a no-no for a language like C. For example, can you detect an unsafe usage without inserting some extra hidden code in this case:
char* str;
size_t size;
scanf("%zu", &size);
str = malloc(size);
scanf("%9s", str); // how can the compiler determine if this is a safe call?!
Of course, there are ways to write safe code with scanf if you specify the number of characters to read, and that there is enough memory to hold the string. In the case of gets, there is no way to specify the number of characters to read.
I am not sure why the scanf man page doesn't mention the possibility of a buffer overrun, but vanilla scanf is not a secure option. A rather dated link - Link - shows this to be the case. Also, check this (not gcc, but informative nevertheless) - Link
It may be simply that scanf will allocate space on the heap based on how much data is read in. Since it doesn't allocate the buffer and then read until the null character is read, it doesn't risk overwriting the buffer. Instead, it reads into its own buffer until the null character is found, and presumably copies that buffer into another of the correct size at the end of the read.