Right now I am only trying to get my getline() function to work. I have the code from the book and it seems to be identical, but I can't get it to compile. This is homework, but this part should be just copying from the book.
#include <stdio.h>
#include <stdlib.h>
#include <math.h>
//error list
#define ENDOFFILE = -1;
#define TOOMANYNUMS = -2;
#define LIMIT = 256;
//functions declared
int get_line(char line[], int);
//main
main(){
char line[255];
int num[6];
printf("Please input numbers %c: ", line);
get_line(line,LIMIT);
}
//functions
int get_line(char s[],int lim){
int c, i;
for (i=0;i<lim-1 && (c=getchar())!=EOF && c!='\n'; ++i)
s[i] = c;
if(c=='\n'){
s[i]=c;
++i;
}
s[i]='\0';
return i;
}
Now (edited at 10:22) I only get one error:
18 - expected expression before equal
conflicting types for 'getline'
getline might be a function in your standard library, e.g. this one. If you want to reimplement it, give it a different name.
too few arguments to function 'getline'
You are calling getline() in main() without any arguments, but a few lines above you state that getline takes a char[] and an int. Call it like getline(line, sizeof line);
Right now I am only trying to get my getline() function to work.
getline() is the name of a Linux function, declared in stdio.h. The C compiler complains that there are two conflicting declarations.
Simply give your getline() function a different name.
Edit1: That:
#define ENDOFFILE = -1;
Should be
#define ENDOFFILE -1
No =, no ; needed for preprocessor directives.
The problem is that the system you are compiling on appears to have a getline() function already defined, and your definition conflicts with it. glibc, the C library used on Linux, has a non-standard getline() function declared in stdio.h. It shouldn't be visible unless you include a line like #define _GNU_SOURCE to opt in to non-standard functions, but it may be that this is pre-defined based on how you are compiling your code.
The easiest solution would be to rename your function to something else, but you could also try and find in your compiler options why GNU extensions are being turned on.
Now that you've edited your code, your second problem is that your #define lines are wrong. You don't need an equals sign or a semicolon; these lines are processed by the preprocessor, which has a different syntax than C, and all you need to write is #define NAME VALUE.
The proper syntax would be:
#define ENDOFFILE -1
#define TOOMANYNUMS -2
#define LIMIT 256
You need to realize that it is generally an error to write a #define the way you did, because a #define macro is simple string replacement. The statement get_line(line, LIMIT) is actually seen by the compiler as get_line(line, =256;), hence the compile error. Just change #define LIMIT =256; to #define LIMIT 256 and it will be ok.
As mentioned in the previous replies, never reuse names that the library provides: getline is a function declared in stdio.h.
Hope this helps.
putchar(char) writes a character to standard output and is normally provided by stdio.h.
How do I write a character to standard output without using stdio.h or any other standard library file (that is: no #include:s allowed)?
Or, phrased differently, how do I implement my own putchar(char) with zero #include statements?
This is what I want to achieve:
/* NOTE: No #include:s allowed! :-) */
void putchar(char c) {
/*
* Correct answer to this question == Code that implements putchar(char).
* Please note: no #include:s allowed. Not even a single one :-)
*/
}
int main() {
putchar('H');
putchar('i');
putchar('!');
putchar('\n');
return 0;
}
Clarifications:
Please note: No #include:s allowed. Not even a single one :-)
The solution does not have to be portable (inline assembler is hence OK), but it must compile with gcc under MacOS X.
Definition of correct answer:
A working putchar(char c) function. Nothing more, nothing less :-)
On a POSIX system, such as Linux or OSX, you could use the write system call:
/*
#include <unistd.h>
#include <string.h>
*/
int main(int argc, char *argv[])
{
char str[] = "Hello world\n";
/* Possible warnings will be encountered here, about implicit declaration
* of `write` and `strlen`
*/
write(1, str, strlen(str));
/* `1` is the standard output file descriptor, a.k.a. `STDOUT_FILENO` */
return 0;
}
On Windows there are similar functions. You probably have to open the console with OpenFile and then use WriteFile.
void putchar(char c) {
extern long write(int, const char *, unsigned long);
(void) write(1, &c, 1);
}
There is no platform-independent way of doing this.
Of course, on any specific platform you can trivially achieve this by reimplementing or copy-and-pasting the implementation from stdio.h (and anything it in turn relies on). But this won't be portable. Nor will it be useful.
Since providing a full solution is probably unsporting, this is the next best thing: the source code for Apple's implementation, which is open source.
I think reducing this to a minimum case is an exercise for the OP. Apple have even kindly provided an XCode project file.
void fun()
{
// Essentially this is a function with an empty body
// And I don't care about () in a macro
// Because this is evil, regardless
#define printf(a, b) (printf)(a, b*2)
}
void main() // I know this is not a valid main() signature
{
int x = 20;
fun();
x = 10;
printf("%d", x);
}
I am having doubts about the #define line! Can you please give me some links to documentation for understanding this line of code? The answer is 20.
The #define defines a preprocessor macro, which is processed by the preprocessor before the compiler does anything.
The preprocessor doesn't even know if the line of code is inside or outside a function.
Generally macros are defined after inclusion of header files.
i.e. after #include statements.
Preprocessor macros are not part of the C language proper; handling of macros and other preprocessor directives is a separate step performed before the compiler runs¹. This means that macros do not follow the rules of C, especially with regard to scoping: macros are always "global".
That means the printf function you think you call in the main function is not actually the printf function, it's the printf macro.
The code you show will look like this after preprocessing (and removal of comments):
void fun()
{
}
void main()
{
int x = 20;
fun();
x = 10;
(printf)("%d", x*2);
}
What happens is that the invocation of the printf macro is replaced with a call to the printf function. And since the second argument of the macro is multiplied by two, the output will be 10 * 2 which is 20.
This program illustrates a major problem with macros: it's too easy to make a program look like a normal program while it does something unexpected. It's simple to define a macro true that actually evaluates to false, and the opposite, completely changing the meaning of comparisons against true or false. The only thing you should learn from an example like this is how bad macros can be, and that you should never use macros to "redefine" the language or standard functions. Used sparingly and well, macros are good and will make programming in C easier. Used wrongly, like in this example, they will make the code unreadable and unmaintainable.
¹ The preprocessor used to be a separate program that ran before the compiler. Modern compilers have the preprocessing step built in, but it is still a separate step before the actual C code is parsed.
Let me put this in another way.
printf() is a built-in standard library function that sends formatted output to stdout (your console screen). The printf() call itself is executed at run time. The syntax looks like this:
int printf(const char *format, ...)
But this program of yours replaces the printf() function with your macro before the compilation.
void fun(){
#define printf(a, b) printf(a, b*2)
}
void main() {
int x = 20;
fun();
x = 10;
printf("%d", x);
}
So what happens is that, before compilation, the preprocessor replaces the library function call with your own user-defined macro, which takes two arguments:
a = "%d", the format specifier, and
b = the value of x, which is 10.
So the value printed is x*2 = 20.
So I'm new to C, and I was playing around with functions in the GNU C Library when I came across https://www.gnu.org/software/libc/manual/html_node/strfry.html#strfry
Intrigued, I wrote a little tester program:
1 #include <stdio.h>
2 #include <string.h>
3
4 main ()
5 {
6 char *str = "test123abc";
7 char *other;
8
9 other = strfry(str);
10 printf("%s\n", other);
11 return 0;
12 }
gcc test.c outputs test.c:9: warning: assignment makes pointer from integer without a cast
Why?
/usr/include/string.h has the following entry:
extern char *strfry (char *__string) __THROW __nonnull ((1));
How can a char *function(...) return int?
Thanks
Since strfry is a GNU extension, you need to #define _GNU_SOURCE to use it. If you fail to provide that #define, the declaration will not be visible and the compiler will automatically assume that the function returns int.
A related problem, as pointed out by perreal, is that it is undefined behavior to modify a literal string. Once you make the declaration of strfry visible to the compiler, this will be duly reported.
Do note that the strfry function and its cousin memfrob are not entirely serious and are rarely used in production.
To have strfry available, you need
#define _GNU_SOURCE
otherwise the prototype is not exposed and the implicit declaration is assumed to return an int.
The problem is you don't have a prototype in scope for strfry() and the compiler assumes it returns an int. When it wants to assign that int to a char* it complains with the message you specify.
According to my man pages, you need to #define _GNU_SOURCE at the very top of your source code, especially before standard #includes
#define _GNU_SOURCE
/* rest of your program */
You can't modify a literal string:
#define _GNU_SOURCE
#include <stdio.h>
#include <string.h>
int main () {
char *str = "test123abc";
char other[256];
strcpy(other, str);
strfry(other);
printf("%s\n", other);
return 0;
}
I was writing some test code in C. By mistake I had inserted a ; after a #define, which gave me errors. Why is a semicolon not required for #defines?
More specifically :
Method 1: works
const int MAX_STRING = 256;
int main(void) {
char buffer[MAX_STRING];
}
Method 2: Does not work - compilation error.
#define MAX_STRING 256;
int main(void) {
char buffer[MAX_STRING];
}
What is the reason for the different behavior of these two codes? Aren't both of those MAX_STRINGs constants?
#define MAX_STRING 256;
means:
whenever you find MAX_STRING when preprocessing, replace it with 256;. In your case it'll make method 2:
#include <stdio.h>
#include <stdlib.h>
#define MAX_STRING 256;
int main(void) {
char buffer [256;];
}
which isn't valid syntax. Replace
#define MAX_STRING 256;
with
#define MAX_STRING 256
The difference between your two codes is that in the first method you declare a constant equal to 256, but in the second you define MAX_STRING to stand for the text 256; in your source file.
The #define directive is used to define values or macros that are used by the preprocessor to manipulate the program source code before it is compiled. Because preprocessor definitions are substituted before the compiler acts on the source code, any errors that are introduced by #define are difficult to trace.
The syntax is:
#define CONST_NAME VALUE
If there is a ; at the end, it's considered part of VALUE.
To understand how exactly #defines work, try defining:
#define FOREVER for(;;)
...
FOREVER {
/* perform something forever */
}
Interesting remark by John Hascall:
Most compilers will give you a way to see the output after the preprocessor phase, this can aid with debugging issues like this.
In gcc it can be done with flag -E.
#define is a preprocessor directive, not a statement or declaration as defined by the C grammar (both of those are required to end with a semicolon). The rules for the syntax of each one are different.
define is a preprocessor directive, and is a simple replacement, it is not a declaration.
BTW, as a replacement it may contain some ; as part of it:
// Ugly as hell, but valid
#define END_STATEMENT ;
int a = 1 END_STATEMENT // preprocessed to -> int a = 1;
Both constants? No.
The first method does not produce a constant in the C language. Const-qualified variables do not qualify as constants in C. Your first method works only because C99 and later compilers support variable-length arrays (VLAs): your buffer is a VLA in the first case precisely because MAX_STRING is not a constant. Try declaring the same array at file scope and you'll get an error, since VLAs are not allowed there.
The second method can be used to assign names to constant values in C, but you have to do it properly. The ; in macro definition should not be there. Macros work through textual substitution and you don't want to substitute that extra ; into your array declaration. The proper way to define that macro would be
#define MAX_STRING 256
In C language, when it comes to defining proper named constants, you are basically limited to macros and enums. Don't try to use const "constants", unless you really know that it will work for your purposes.
Because that is how the syntax was decided for preprocessor directives.
Only statements end with a ; in C/C++; #define is a preprocessor directive, not a statement.
The second version does not define a constant as far as the language is concerned, just a substitution rule for a block of text. Once the preprocessor has done its job, the compiler sees
char buffer [256;];
which is not syntactically valid.
The moral of the story: prefer the const int MAX_STRING = 256; way as that helps you, the compiler, and the debugger.
This preprocessor directive:
#define MAX_STRING 256;
tells the preprocessor to replace every MAX_STRING with 256; — including the semicolon. Preprocessor directives don't need a semicolon at the end; if you put one there, the preprocessor assumes you really mean it to be part of the replacement text.
If you are confused with #defines for constants, const int would probably be easier to comprehend.
If you want to learn more on how to properly use these preprocessor directives, try looking at this website.
I was wondering whether there is anything wrong with passing a pointer to putchar, or any other standard function which can be implemented as a macro, to a function accepting a function pointer. Below is an example of what I'm doing.
#include <stdio.h>
static int print(const char *s, int (*printc)(int))
{
int c;
while (*s != '\0') {
if ((c = printc(*s++)) < 0)
return c;
}
return printc('\n');
}
int main(void)
{
print("Hello, world", putchar);
return 0;
}
I don't have any problems compiling this with GCC and Clang under GNU/Linux, as well as GCC under OpenBSD. I'm wondering whether it will have the same behaviour under every other standard compliant implementation, since putchar can be implemented as a macro. I've searched through the standard, particularly the sections on function pointers and putchar, and haven't been able to find anything that specifies whether this is legal or not.
Thanks.
Following on from my comments-based discussion with @pmg, I found the relevant section of the standard (C99, 7.1.4 p.1):
... it is permitted to take the address of a library function even if it is also defined as
a macro.161)
161) This means that an implementation shall provide an actual function for each library function, even if it also provides a macro for that function.
It is legal, because it refers to the library function, not the macro. When it is defined as a macro, that macro takes an argument, and therefore putchar with no parenthesis following does not refer to it.
This is an instance of using a macro to inline functions, and should not be taken as good practice now that inlining is supported by compilers.
The macro is only "invoked" when its identifier is followed by an open-parenthesis (. Otherwise the real function of the same name is invoked/used.
#include <stdio.h>
#define macro(x) 42
int (macro)(int x) { return x; }
int main(void) {
printf("%d %d\n", macro(1), (macro)(1));
}
The function definition has "extra" parentheses to prevent the macro from being expanded.
See code running at ideone: http://ideone.com/UXvjX