While checking the return value of strcmp function, I found some strange behavior in gcc. Here's my code:
#include <stdio.h>
#include <string.h>
char str0[] = "hello world!";
char str1[] = "Hello world!";
int main() {
    printf("%d\n", strcmp("hello world!", "Hello world!"));
    printf("%d\n", strcmp(str0, str1));
}
When I compile this with clang, both calls to strcmp return 32. However, when compiling with gcc, the first call returns 1, and the second call returns 32. I don't understand why the first and second calls to strcmp return different values when compiled using gcc.
Below is my test environment.
Ubuntu 18.04 64bit
gcc 7.3.0
clang 6.0.0
It looks like you didn't enable optimizations (e.g. -O2).
From my tests it looks like gcc always recognizes strcmp with constant arguments and optimizes it, even with -O0 (no optimizations). Clang needs at least -O1 to do so.
That's where the difference comes from: The code produced by clang calls strcmp twice, but the code produced by gcc just does printf("%d\n", 1) in the first case because it knows that 'h' > 'H' (ASCIIbetically, that is). It's just constant folding, really.
Live example: https://godbolt.org/z/8Hg-gI
As the other answers explain, any positive value will do to indicate that the first string is greater than the second, so the compiler optimizer simply chooses 1. The strcmp library function apparently uses a different value.
The standard defines the result of strcmp to be negative, if lhs appears before rhs in lexical order, zero if they are equal, or a positive value if lhs appears lexically after rhs.
It's up to the implementation how to implement that and what exactly to return. You must not depend on a specific value in your programs, or they won't be portable. Simply check with comparisons (<, >, ==).
See https://en.cppreference.com/w/c/string/byte/strcmp
Background
One simple implementation might just calculate the difference of each pair of characters, c1 - c2, and do that until the result is not zero or one of the strings ends. The result is then the numeric difference between the first pair of characters at which the two strings differ.
For example, this GLibC implementation: https://sourceware.org/git/?p=glibc.git;a=blob_plain;f=string/strcmp.c;hb=HEAD
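A minimal sketch of such a byte-difference implementation (a simplified illustration, not the actual glibc code, which is heavily optimized; my_strcmp is my own name):

```c
/* Naive strcmp sketch: walk both strings in lockstep and return the
 * numeric difference of the first pair of bytes that differ (compared
 * as unsigned char, as the standard requires). */
int my_strcmp(const char *s1, const char *s2) {
    while (*s1 != '\0' && *s1 == *s2) {
        s1++;
        s2++;
    }
    return (unsigned char)*s1 - (unsigned char)*s2;
}
```

With the strings from the question, my_strcmp("hello world!", "Hello world!") yields 'h' - 'H' = 32, matching the library's behavior.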
The strcmp function is only specified to return a value larger than zero, zero, or less than zero. There's nothing specified what those positive and negative values have to be.
The exact values returned by strcmp in the case of the strings not being equal are not specified. From the man page:
#include <string.h>
int strcmp(const char *s1, const char *s2);
int strncmp(const char *s1, const char *s2, size_t n);
The strcmp() and strncmp() functions return an integer less than,
equal to, or greater than zero if s1 (or the first n bytes thereof) is
found, respectively, to be less than, to match, or be greater than s2.
Since the first string compares greater than the second, the value must be positive, which it is in both cases.
As for the difference between the two compilers, it appears that clang is returning the difference between the ASCII values for the corresponding characters that mismatched, while gcc is opting for a simple -1, 0, or 1. Both are valid, so your code should only need to check if the value is 0, greater than 0, or less than 0.
My code for testing strcmp is as follows:
char s1[10] = "racecar";
char *s2 = "raceCar"; //yes, a capital 'C'
int diff;
diff = strcmp(s1,s2);
printf(" %d\n", diff);
So I am confused about why the output is 32. What exactly is it comparing to get that result? I appreciate your time and help.
Whatever it wants. In this case, it looks like the value you're getting is 'c' - 'C' (the difference between the two characters at the first point where the strings differ), which is equal to 32 on many systems, but you shouldn't by any means count on that. The only thing that you can count on is that the return will be 0 if the two strings are equal, negative if s1 comes before s2, and positive if s1 comes after s2.
The man page states that the output will be greater than 0 or less than 0 if the strings are not the same. It doesn't say anything else regarding the exact value (if not 0).
That being said, the ASCII codes for c and C differ by 32. That's probably where the result is coming from. You can't depend on this behavior being identical in any two given implementations however.
It is not specified. According to the standard:
7.24.4.2 The strcmp function
#include <string.h>
int strcmp(const char *s1, const char *s2);
Description
The strcmp function compares the string pointed to by s1 to the string pointed to by
s2.
Returns
The strcmp function returns an integer greater than, equal to, or less than zero,
accordingly as the string pointed to by s1 is greater than, equal to, or less than the string pointed to by s2.
According to the C standard (N1570 7.24.4.2):
The strcmp function returns an integer greater than, equal to,
or less than zero, accordingly as the string pointed to by s1 is
greater than, equal to, or less than the string pointed to by
s2.
It says nothing about which positive or negative value it will return if the strings are unequal, and portable code should only check whether the result is less than, equal to, or greater than zero.
Having said that, a straightforward implementation of strcmp would likely return the numeric difference in the values of the first characters that don't match. In your case, the first non-matching characters are 'c' and 'C', which happen to differ by 32 in ASCII.
Don't count on this.
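If you do need a canonical -1/0/+1 (say, to compare results across platforms), normalize the sign yourself instead of trusting the magnitude. A small helper sketch (strcmp_sign is my own name):

```c
#include <string.h>

/* Collapse strcmp's implementation-defined magnitude to -1, 0 or +1. */
int strcmp_sign(const char *s1, const char *s2) {
    int r = strcmp(s1, s2);
    return (r > 0) - (r < 0);
}
```

The expression `(r > 0) - (r < 0)` is a common branch-free idiom for the sign of an integer.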
strcmp compares the strings, and in this implementation, when it reaches a pair of characters that differ, it returns the difference between them.
In your case, it reaches 'c' in your first string, and 'C' in your second string. 'c' in hex is 0x63 while 'C' is 0x43. Subtract and you get 0x20, which is 32 in decimal.
We use strcmp to check whether strings are equal: they are equal exactly when the function returns 0.
strcmp compares the strings character by character until it reaches characters that don't match or the terminating null-character.
So the strcmp function sees that 'c' (which is 99 in ASCII) is greater than 'C' (which is 67 in ASCII), so it returns a positive integer. Exactly which positive integer it returns is left to the implementation (the C library and compiler you are using).
The following piece of code behaves differently in 32-bit and 64-bit operating systems.
char *cat = "v,a";
if (strcmp(cat, ",") == 1)
...
The above condition is true in 32-bit but false in 64-bit. I wonder why this is different?
Both 32-bit and 64-bit OS are Linux (Fedora).
The strcmp() function is only defined to return a negative value if argument 1 precedes argument 2, zero if they're identical, or a positive value if argument 1 follows argument 2.
There is no guarantee of any sort that the value returned will be +1 or -1 at any time. Any equality test based on that assumption is faulty. It is conceivable that the 32-bit and 64-bit versions of strcmp() return different numbers for a given string comparison, but any test that looks for +1 from strcmp() is inherently flawed.
Your comparison code should be one of:
if (strcmp(cat, ",") > 0) // cat > ","
if (strcmp(cat, ",") == 0) // cat == ","
if (strcmp(cat, ",") >= 0) // cat >= ","
if (strcmp(cat, ",") <= 0) // cat <= ","
if (strcmp(cat, ",") < 0) // cat < ","
if (strcmp(cat, ",") != 0) // cat != ","
Note the common theme — all the tests compare with 0. You'll also see people write:
if (strcmp(cat, ",")) // != 0
if (!strcmp(cat, ",")) // == 0
Personally, I prefer the explicit comparisons with zero; I mentally translate the shorthands into the appropriate longhand (and resent being made to do so).
Note that the specification of strcmp() says:
ISO/IEC 9899:2011 §7.24.4.2 The strcmp function
¶3 The strcmp function returns an integer greater than, equal to, or less than zero,
accordingly as the string pointed to by s1 is greater than, equal to, or less than the string pointed to by s2.
It says nothing about +1 or -1; you cannot rely on the magnitude of the result, only on its signedness (or that it is zero when the strings are equal).
Standard functions don't exhibit different behaviour based on the "bittedness" of your OS unless you're doing something silly like, for example, not including the relevant header file. They are required to exhibit exactly the behaviour specified in the standard, unless you violate the rules. Otherwise, your compiler, while close, will not be a C compiler.
However, as per the standard, the return value from strcmp() is either zero, positive or negative, it's not guaranteed to be +/-1 when non-zero.
Your expression would be better written as:
strcmp (cat, ",") > 0
The faultiness of using strcmp (cat, ",") == 1 has nothing to do with whether your OS is 32 or 64 bits, and everything to do with the fact you've misunderstood the return value. From the ISO C11 standard (my bold):
The strcmp function returns an integer greater than, equal to, or less than zero, accordingly as the string pointed to by s1 is greater than, equal to, or less than the string pointed to by s2.
The semantics guaranteed by strcmp() are well explained above in Jonathan's answer.
Coming back to your original question i.e.
Q. Why strcmp() behaviour differs in 32-bit and 64-bit systems?
Answer: strcmp() is implemented in glibc, wherein there exist different implementations for various architectures, all highly optimised for the corresponding architecture.
strcmp() on x86
strcmp() on x86-64
As the spec simply defines that the return value is one of 3 possibilities (-ve, 0, +ve), the various implementations are free to return any value as long as the sign indicates the result appropriately.
On certain architectures (in this case x86), it is faster to simply compare each byte without storing the result. Hence it's quicker to simply return -/+1 on a mismatch.
(Note that one could use subb instead of cmpb on x86 to obtain the difference in magnitude of the non-matching bytes. But this would require 1 additional clock cycle per byte. This would mean an additional 3% increase in total time taken, as each complete iteration runs in less than 30 clock cycles.)
On other architectures (in this case x86-64), the difference between the byte values of the corresponding characters is already available as a by-product of the comparison. Hence it's faster to simply return it rather than test it again and return -/+1.
Both are perfectly valid output as the strcmp() function is ONLY guaranteed to return the result using the proper sign and the magnitude is architecture/implementation specific.
From the man page:
The strcmp() and strncmp() functions return an integer less than, equal
to, or greater than zero if s1 (or the first n bytes thereof) is found,
respectively, to be less than, to match, or be greater than s2.
Example code in C (prints -15 on my machine, swapping test1 and test2 inverts the value):
#include <stdio.h>
#include <string.h>
int main() {
char* test1 = "hello";
char* test2 = "world";
printf("%d\n", strcmp(test1, test2));
}
I found this code (taken from this question) that relies on the values of strcmp being something other than -1, 0 and 1 (it uses the return value in qsort). To me, this is terrible style and depends on undocumented features.
I guess I have two, related questions:
Is there something in the C standard that defines what the return values are besides less than, greater than, or equal to zero? If not, what does the standard implementation do?
Is the return value consistent across the Linux, Windows and the BSDs?
Edit:
After leaving my computer for 5 minutes, I realized that there is in fact no error with the code in question. I struck out the parts that I figured out before reading the comments/answers, but I left them there to keep the comments relevant. I think this is still an interesting question and may cause hiccups for programmers used to other languages that always return -1, 0 or 1 (e.g. Python seems to do this, but it's not documented that way).
FWIW, I think that relying on something other than the documented behavior is bad style.
Is there something in the C standard that defines what the return values are besides less than, greater than, or equal to zero?
No. The tightest constraint is that it should be zero, less than zero or more than zero, as specified in the documentation of this particular function.
If not, what does the standard implementation do?
There's no such thing as "the standard implementation". Even if there was, it would probably just
return zero, less than zero or more than zero;
:-)
Is the return value consistent across the Linux, Windows and the BSDs?
I can confirm that it's consistent across Linux and OS X as of 10.7.4 (specifically, it's -1, 0 or +1). I have no idea about Windows, but I bet Microsoft guys use -2 and +3 just to break code :P
Also, let me also point out that you have completely misunderstood what the code does.
I found this code (taken from this question) that relies on the values of strcmp being something other than -1, 0 and 1 (it uses the return value in qsort). To me, this is terrible style and depends on undocumented features.
No, it actually doesn't. The C standard library is designed with consistency and ease of use in mind. That is, what qsort() requires is that its comparator function returns a negative or a positive number or zero - exactly what strcmp() is guaranteed to do. So this is not "terrible style", it's perfectly standards-conformant code which does not depend upon undocumented features.
In the C99 standard, §7.21.4.2 The strcmp function:
The strcmp function returns an integer greater than, equal to, or less than zero,
accordingly as the string pointed to by s1 is greater than, equal to, or less than the string pointed to by s2.
Emphasis added.
It means the standard makes no guarantee about -1, 0 or 1; the value may vary from one implementation to another.
The value you are getting is the difference between h and w, which is -15.
In your case the strings are hello and world, so 'h' - 'w' = 104 - 119 = -15 < 0, and that's why strcmp returns -15.
• Is there something in the C standard that defines what the return values are besides less than, greater than, or equal to zero? If not, what does the standard implementation do?
No, as you mentioned yourself the man page says less than, equal to, or greater than zero and that's what the standard says as well.
• Is the return value consistent across the Linux, Windows and the BSDs?
No.
On Linux (OpenSuSE 12.1, kernel 3.1) with gcc, I get -15/15 depending on if test1 or test2 is first. On Windows 7 (VS 2010) I get -1/1.
Based on the loose definition of strcmp(), both are fine.
...that relies on the values of strcmp being something other than -1, 0 and 1 (it uses the return value in qsort).
An interesting side note for you... if you take a look at the qsort() man page, the example there is pretty much the same as the Bell code you posted using strcmp(). The reason being the comparator function that qsort() requires is actually a great fit for the return from strcmp():
The comparison function must return an integer less than, equal to, or
greater than zero if the first argument is considered to be
respectively less than, equal to, or greater than the second.
In reality, the return value of strcmp is likely to be the difference between the values of the bytes at the first position that differed, simply because returning this difference is a lot more efficient than doing an additional conditional branch to convert it to -1 or 1. Unfortunately, some broken software has been known to assume the result fits in 8 bits, leading to serious vulnerabilities. In short, you should never use anything but the sign of the result.
For details on the issues, read this article:
https://communities.coverity.com/blogs/security/2012/07/19/more-defects-like-the-mysql-memcmp-vulnerability
In this page:
The strcmp() function compares the string pointed to by s1 to the string pointed to by s2.
The sign of a non-zero return value is determined by the sign of the difference between the values of the first pair of bytes (both interpreted as type unsigned char) that differ in the strings being compared.
Here is an implementation of strcmp in FreeBSD.
#include <string.h>
/*
* Compare strings.
*/
int
strcmp(s1, s2)
    register const char *s1, *s2;
{
    while (*s1 == *s2++)
        if (*s1++ == 0)
            return (0);
    return (*(const unsigned char *)s1 - *(const unsigned char *)(s2 - 1));
}
From the manual page:
RETURN VALUE
The strcmp() and strncmp() functions return an integer less than, equal to, or greater than zero if s1 (or the first n bytes
thereof) is found, respectively, to
be less than, to match, or be greater than s2.
It only specifies that the result is greater or less than 0; it doesn't say anything about specific values. Those are implementation-specific, I suppose.
CONFORMING TO
SVr4, 4.3BSD, C89, C99.
This says in which standards it is included. The function must exist and behave as specified, but the specification doesn't say anything about the actual returned values, so you can't rely on them.
There's nothing in the C standard that talks about the value returned by strcmp() (that is, other than the sign of that value):
7.21.4.2 The strcmp function
Synopsis
#include <string.h>
int strcmp(const char *s1, const char *s2);
Description
The strcmp function compares the string pointed to by s1
to the string pointed to by s2.
Returns
The strcmp function returns an integer greater than, equal
to, or less than zero, accordingly as the string pointed to by s1 is
greater than, equal to, or less than the string pointed to by s2.
It is therefore pretty clear that using anything other than the sign of the returned value is a poor practice.
I thought strcmp was supposed to return a positive number if the first string was larger than the second string. But this program
#include <stdio.h>
#include <string.h>
int main()
{
char A[] = "A";
char Aumlaut[] = "Ä";
printf("%i\n", A[0]);
printf("%i\n", Aumlaut[0]);
printf("%i\n", strcmp(A, Aumlaut));
return 0;
}
prints 65, -61 and -1.
Why? Is there something I'm overlooking?
I thought that maybe the fact that I'm saving as UTF-8 would influence things, you know, because the Ä consists of 2 bytes there. But saving in an 8-bit encoding and making sure that both strings have length 1 doesn't help; the end result is the same.
What am I doing wrong?
Using GCC 4.3 under 32 bit Linux here, in case that matters.
strcmp and the other string functions aren't actually UTF-aware. On most POSIX machines, C/C++ strings in char arrays are typically UTF-8 encoded, which makes most things "just work" with regard to reading and writing, and leaves the option of a library that understands and manipulates the UTF code points. But the default string.h functions are not culture-sensitive and do not know anything about comparing UTF strings. You can look at the source code for strcmp and see for yourself: it's about as naïve an implementation as possible (which also means it's faster than an internationalization-aware compare function).
I just answered this in another question - you need to use a UTF-aware string library such as IBM's excellent ICU - International Components for Unicode.
The strcmp and similar comparison functions treat the bytes in the strings as unsigned chars, as specified by the standard in section 7.24.4, point 1 (was 7.21.4 in C99)
The sign of a nonzero value returned by the comparison functions memcmp, strcmp, and strncmp is determined by the sign of the difference between the values of the first pair of characters (both interpreted as unsigned char) that differ in the objects being compared.
(emphasis mine).
The reason is probably that such an interpretation maintains the ordering between code points in the common encodings, while interpreting them as signed chars doesn't.
strcmp() takes chars as unsigned ASCII values. So, your A-with-double-dots isn't char -61, it's char 195 (or maybe 196, if I've got my math wrong).
Saving as an 8-bit ASCII encoding, 'A' == 65 and 'Ä' equals whatever -61 is if you consider it to be an unsigned char. Anyway, 'Ä' is strictly positive and greater than 2^7-1, you're just printing it as if it were signed.
If you consider 'Ä' to be an unsigned char (which it is), its value is 195 in your charset. Hence, strcmp(65, 195) correctly reports -1.
Check the strcmp manpage:
The strcmp() function compares the two strings s1 and s2. It returns
an integer less than, equal to, or greater than zero if s1 is found,
respectively, to be less than, to match, or be greater than s2.
To do string handling correctly in C when the input character set goes beyond ASCII, you should use the standard library's wide-character facilities for strings and I/O. Your program should be:
#include <wchar.h>
#include <stdio.h>
int main()
{
wchar_t A[] = L"A";
wchar_t Aumlaut[] = L"Ä";
wprintf(L"%i\n", A[0]);
wprintf(L"%i\n", Aumlaut[0]);
wprintf(L"%i\n", wcscmp(A, Aumlaut));
return 0;
}
and then it will give the correct results (GCC 4.6.3). You don't need a special library.