What do atoi, atol, and stoi stand for? [duplicate]

I understand what these functions do, but I can't work out how their names were formed, except that the trailing letters come from the return type.

atoi -> ASCII to integer.
atol -> ASCII to long.
atof -> ASCII to floating.
stoi -> string to integer.
stol -> string to long.
stoll -> string to long long.
stof -> string to float.
stod -> string to double.
stold -> string to long double.
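To make the mapping concrete, here is a minimal C sketch exercising the C-side functions from the list (the sto* functions are their C++11 counterparts, taking a std::string instead of a char pointer, and are not shown in this C-only sketch):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int    i = atoi("42");       /* ASCII to integer */
    long   l = atol("1234567");  /* ASCII to long */
    double d = atof("3.14");     /* ASCII to floating (note: returns double) */

    printf("%d %ld %f\n", i, l, d);
    return 0;
}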
atoi, atol, and atof come from C, and their godfather is most probably Ken Thompson, co-creator of the UNIX operating system and creator of the B programming language, the predecessor of C. The names appear in the first UNIX Programmer's Manual (November 3, 1971), whose owner's label reads "ken", Ken Thompson's username.
stoi, stol, stoll, stof, stod, and stold entered C++ with C++11, so the naming was a decision of the C++ committee. The original proposal, N1803, dates back to 2005, though. I couldn't find anything in the proposal explaining why these names were chosen; my guess is that the committee wanted to keep uniformity with the C "equivalents" mentioned above.

Related

Convert number in scientific notation string to float with limited C-like library

I'm writing a program in CAPL (which is based on C, minus some concepts) to convert a string containing a number in scientific notation to a float (it doesn't strictly need to be a float, but I think it's an appropriate type for this). For example:
-7.68000000E-06 should be converted to -0.00000768
I've done some looking around for this and atof() comes up a lot but this is not supported in CAPL so I cannot use that.
A list of the other C concepts not supported in CAPL:
Update: Thanks everyone for the help. M. Spiller's answer proved to be the easiest solution. I have accepted this answer.
In CAPL the function is called atodbl with the same signature as C's atof.
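For reference, here is one way the conversion could have been done by hand when atof is unavailable. This is a plain C sketch of the algorithm; CAPL restricts some of these constructs (e.g. pointers), so treat it as an illustration of the technique rather than drop-in CAPL code:

#include <stdio.h>

/* Illustrative sketch: parse a string like "-7.68000000E-06" into a
   double using only basic constructs, no atof/strtod. */
double parse_sci(const char *s)
{
    double sign = 1.0, value = 0.0, scale = 0.1;
    int exp_sign = 1, exponent = 0;

    if (*s == '-') { sign = -1.0; s++; }
    else if (*s == '+') { s++; }

    /* integer part */
    while (*s >= '0' && *s <= '9')
        value = value * 10.0 + (*s++ - '0');

    /* fractional part */
    if (*s == '.') {
        s++;
        while (*s >= '0' && *s <= '9') {
            value += (*s++ - '0') * scale;
            scale *= 0.1;
        }
    }

    /* exponent part */
    if (*s == 'E' || *s == 'e') {
        s++;
        if (*s == '-') { exp_sign = -1; s++; }
        else if (*s == '+') { s++; }
        while (*s >= '0' && *s <= '9')
            exponent = exponent * 10 + (*s++ - '0');
    }

    /* apply the exponent by repeated multiplication/division */
    while (exponent-- > 0)
        value = (exp_sign > 0) ? value * 10.0 : value / 10.0;

    return sign * value;
}

int main(void)
{
    printf("%.8f\n", parse_sci("-7.68000000E-06")); /* -0.00000768 */
    return 0;
}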

Where did the name `atol` come from? [duplicate]

Does anyone know the source of the name of the function atol, which converts a string to a long?
I thought it might be "array to long", but that doesn't sound right to me.
ASCII to long is what atol(3) means (in the early days of Unix, only ASCII was used, and IIRC the name was mentioned in the K&R book).
Today we usually use UTF-8 everywhere, but atol still works, since UTF-8 encodes the digits exactly as ASCII does.
On C implementations using another encoding (e.g. EBCDIC), atol should still do what is expected (so atol("345") would give 345), since the C standard requires the digit characters to have consecutive codes. Its implementation might be more complex (or encoding specific).
So today the atol name no longer refers to ASCII. The C11 standard (n1570) doesn't mention ASCII as mandatory, IIRC. You might rewrite history by reading atol as "anything to long", even though historically it was "ASCII to long".
It's "ASCII to long"; the same convention is used for atoi etc.
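A small C sketch of the point both answers make: atol relies only on the guarantee that '0' through '9' have consecutive codes, not on ASCII itself:

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* Works on any conforming implementation, ASCII or EBCDIC,
       because the C standard requires '0'..'9' to be consecutive. */
    long n = atol("345");
    printf("%ld\n", n);         /* 345 */
    printf("%d\n", '5' - '0');  /* 5 in every conforming character set */
    return 0;
}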

Converting a Letter to a Number in C [duplicate]

Alright so pretty simple, I want to convert a letter to a number so that a = 0, b = 1, etc. Now I know I can do
number = letter + '0';
so when I input the letter 'a' it gives me the number 145. My question is: if I run this on a different computer or OS, would it still give me the same number, 145, when I input the letter 'a'?
It depends on what character encoding you are using. If you're using the same encoding and compiler on both computers, yes, it will be the same. But if you're using a different encoding, like EBCDIC on one computer and ASCII on the other, you cannot guarantee they will be the same.
Also, you can use atoi.
If you do not want to use atoi, see: Converting Letters to Numbers in C
It depends on what character encoding you are using.
It is also important to note that with ASCII the value will fit in a byte.
With a wider encoding such as UTF-16, values outside the ASCII range need at least two bytes; in UTF-8, the ASCII characters still fit in a single byte.
Now, provided you make sure you use one specific character encoding, the value will be the same no matter the system.
Yes; the number used to represent 'a' is defined in the American Standard Code for Information Interchange (ASCII). This is the encoding that most C compilers use by default, so on other OSes you will usually get the same result.
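Putting the answers together, here is a small C sketch of what the asker presumably wants (a = 0, b = 1, ...), with the portability caveat spelled out:

#include <stdio.h>

int main(void)
{
    char letter = 'a';

    /* Unlike the digits '0'..'9', the C standard does NOT guarantee
       that the letters are consecutive (EBCDIC has gaps), so this is
       only safe on ASCII-based systems. */
    int number = letter - 'a';
    printf("%d\n", number);        /* 0 on ASCII systems */

    /* The code in the question merely adds two character codes:
       'a' + '0' is 97 + 48 = 145 in ASCII. */
    printf("%d\n", letter + '0');  /* 145 on ASCII systems */
    return 0;
}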

ASCII char to int conversions in C [duplicate]

I remember learning in a course a long time ago that converting from an ASCII char to an int by subtracting '0' is bad.
For example:
int converted;
char ascii = '8';
converted = ascii - '0';
Why is this considered a bad practice? Is it because some systems don't use ASCII? The question has been bugging me for a long time.
While you probably shouldn't use this as part of a hand-rolled strtol (that's what the standard library is for), there is nothing wrong with this technique for converting a single digit to its value. It's simple and clear, even idiomatic. You should, though, add range checking if you are not absolutely certain that the given char is in range.
It's a C language guarantee that this works.
5.2.1/3 says:
In both the source and execution basic character sets, the value of each character after 0 in the above list [which includes the sequence 0,1,2,3,4,5,6,7,8,9] shall be one greater than the value of the previous.
Character sets may exist where this isn't true but they can't be used as either source or execution character sets in any C implementation.
Edit: Apparently the C standard guarantees consecutive 0-9 digits.
ASCII is not guaranteed by the C standard, in effect making it non-portable. You should use a standard library function intended for conversion, such as atoi.
However, if you wish to make assumptions about where you are running (for example, an embedded system where space is at a premium), then by all means use the subtraction method. Even on systems not using the US-ASCII code page (UTF-8, other code pages) this conversion will work. It will even work on EBCDIC (amazingly).
This is a common trick taught in C classes primarily to illustrate the notion that a char is a number and that its value is different from the corresponding int.
Unfortunately, this educational toy somehow became part of the typical arsenal of most C developers, partly because C doesn't provide a convenient call for this (what exists is often platform specific).
Generally, this code is not portable to non-ASCII platforms or robust against future transitions to other encodings. It's also not very readable. At a minimum, wrap this trick in a function.
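As a summary of the thread, here is a sketch of the subtraction trick wrapped in a function with the range check the first answer recommends; its portability rests on the consecutive-digit guarantee quoted above:

#include <stdio.h>

/* Convert a decimal digit character to its value, or return -1 if the
   character is not a digit. Portable to any conforming implementation,
   since the standard guarantees '0'..'9' are consecutive. */
int digit_value(char c)
{
    if (c >= '0' && c <= '9')
        return c - '0';
    return -1;
}

int main(void)
{
    printf("%d\n", digit_value('8'));  /* 8 */
    printf("%d\n", digit_value('x'));  /* -1 */
    return 0;
}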

Where did the name `atoi` come from?

In the C language, where did the name atoi for converting a string to an integer come from? The only expansion I can think of is "array to integer", but that doesn't really make sense.
It means ASCII to integer. Likewise, you have atol for ASCII to long, atof for ASCII to float, etc.
A Google search for 'atoi "ascii to integer"' confirms this on several pages.
I'm having trouble finding any official source on it... but this listing of man pages from Third Edition Unix (1973), collected by Dennis Ritchie himself, does contain the line:
atoi(III): convert ASCII to integer
In fact, even the first edition Unix (ca. 1971) man pages list atoi as meaning "ASCII to integer".
So even if there isn't any documentation more official than man pages indicating that atoi means "ASCII to integer" (I suspect there is and I just haven't been able to locate it), it has meant "ASCII to integer" by convention at least since 1971.
In brief, I believe the function atoi means "ASCII to integer".
