In C I have noticed that pointers result in faster program execution. How is that possible, given that the program must first fetch the pointer variable before it can reach the actual variable?
Pointers don't result in faster program execution. Smart algorithms result in faster program execution. Sometimes algorithms can be made smarter by using pointers in the right way. Pointers are never a magic wand to throw at problems to make the solutions faster.
Pointers are just one design paradigm, though; in functional programming you do not use any pointers at all.
This is not true. The reason for faster program execution is not the availability of pointers. It's a question of what you do with the pointers. The (possibly) faster program execution stems from the fact that no hidden functionality is introduced with C.
Take a string, for example. Common implementations in other languages store a length field alongside the string in order to keep track of its length. This "bookkeeping" (although hidden from the programmer) costs extra cycles.
Another example is the fact that C does not check if the pointer you are dereferencing is valid or not. This evaluation would also cost extra cycles.
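As a rough sketch (the counted_string type and the function names below are made up for illustration, not taken from any particular language runtime), this is the kind of hidden bookkeeping a counted string carries compared with a raw pointer walk:

#include <stddef.h>

/* Hypothetical counted-string type, similar to what a higher-level
   language might maintain behind the scenes. */
struct counted_string {
    char  *data;
    size_t length;   /* bookkeeping that must be updated on every change */
};

/* Appending a character must also update the length field
   (capacity checks omitted for brevity). */
void cs_push(struct counted_string *s, char c) {
    s->data[s->length] = c;
    s->length++;
}

/* A raw C pointer walk does no hidden work and no validity checks:
   dereferencing p is a plain memory access, whether p is valid or not. */
size_t raw_strlen(const char *p) {
    const char *start = p;
    while (*p) p++;
    return (size_t)(p - start);
}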
The C standard does not specify any required speed, so it doesn't make sense to attribute speed to features of C. Consider that some C implementations produce more efficient machine code than others, and it might make more sense to attribute speed to aspects of specific implementations of C [1]. Don't confuse implementation and specification.
[1]: To make a meaningful comparison of the speed of specific implementations of C, you'd probably want to mention your OS (major and minor version), your compiler (major and minor version), your CPU (model), mainboard, memory (model and configuration) and the command line arguments you used.
While I am aware that every other answer to this question comes from people far more knowledgeable than me in C (I am actually out of my league here), in my very humble opinion and limited knowledge, pointers do improve efficiency.
To answer the OP's question (and ignoring the rest about program execution and fetching):
How can pointers improve program efficiency?
By avoiding duplication of data, although this gain may only be noticeable when dealing with user-defined types, i.e. structures.
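A minimal sketch of what I mean (struct record and the function names are made up for this example): passing a large structure by pointer avoids copying all of its data on every call.

#include <stdio.h>

/* A hypothetical large user-defined structure. */
struct record {
    double values[1024];
    char   name[64];
};

/* Pass by value: the whole struct (over 8 KB here) is copied for the call. */
double sum_by_value(struct record r) {
    double s = 0.0;
    for (int i = 0; i < 1024; i++) s += r.values[i];
    return s;
}

/* Pass by pointer: only an address is copied, the data is not duplicated. */
double sum_by_pointer(const struct record *r) {
    double s = 0.0;
    for (int i = 0; i < 1024; i++) s += r->values[i];
    return s;
}

int main(void) {
    static struct record rec = { .name = "example" };
    printf("%f %f\n", sum_by_value(rec), sum_by_pointer(&rec));
    return 0;
}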
Here is a nice read I found on C pointers: Why C has Pointers
I have been told by more senior, experienced and better-educated programmers than myself that the use of function pointers in C should be avoided. I have seen the fact that some code contains function pointers used as a rationale not to reuse that code, even when the only alternative is complete re-implementation. Upon further discussion I haven't been able to determine why this would be. I am happy to use function pointers where appropriate, and I like the interesting and powerful things they allow you to do, but am I throwing caution to the wind by using them?
I see the pros and cons of function pointers as follows:
Pros:
Great opportunity for code modularity
OO-like features in non-OO C (i.e. code and data in the same object)
How else could you reasonably implement a callback?
Cons:
Negative impact on code readability - not always obvious what function is actually called when a function pointer is invoked
Minor performance hit compared to a direct function call
I think con #1 can usually be reasonably mitigated by well-chosen symbol names and good comments, and con #2 will in general not be a big deal. Am I missing something - are there other reasons to avoid function pointers like the plague?
This question looks a little discussion-y, but I'm looking for good reasons why I shouldn't use function pointers, not opinions.
Function pointers are not evil. The main times you "shouldn't" use them are when either:
The use is gratuitous, i.e. not actually needed for what you're doing, or
In situations where you're writing hardened code and the function pointer might be stored at a location you're concerned may be a likely candidate for buffer overflow attacks.
As for when function pointers are needed, Adam's answer provided some good examples. The common theme in all those examples is that the caller needs to be able to provide part of the code that runs from the called function. Without function pointers, the only way you could do this would be to copy and paste the implementation of the function and change part of it to call a different function, for every individual use case. For qsort and bsearch, which can be implemented portably, this would just be a nuisance and hideously ugly. For thread creation, on the other hand, without function pointers you would have to copy and paste part of the system implementation for the particular OS you're running on, and adapt it to call the function you want called. This is obviously unacceptable; your program would then be completely non-portable.
As such, function pointers are absolutely necessary for some tasks, and for other tasks, they are a major convenience which allows general code to be reused. I see no reason why they should not be used in such cases.
No, they're not evil. They're absolutely necessary in order to implement various features such as callback functions in C.
Without function pointers, you could not implement:
qsort(3)
bsearch(3)
Window procedures
Threads
Signal handlers
And many more.
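For example, qsort(3) only works because the caller hands it a pointer to a comparison function; here is a minimal, self-contained sketch (the comparator name cmp_int is of course just an example):

#include <stdio.h>
#include <stdlib.h>

/* Comparator passed to qsort; the library calls back into our code. */
static int cmp_int(const void *a, const void *b) {
    int x = *(const int *)a;
    int y = *(const int *)b;
    return (x > y) - (x < y);
}

int main(void) {
    int v[] = { 42, 7, 19, 3, 25 };
    qsort(v, sizeof v / sizeof v[0], sizeof v[0], cmp_int);
    for (size_t i = 0; i < sizeof v / sizeof v[0]; i++)
        printf("%d ", v[i]);
    putchar('\n');
    return 0;
}

The standard library's sort loop never needs to know anything about the element type; it just calls back through the pointer it was given.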
My question is specifically about function calls in C. We can call functions either directly or through function pointers. Function pointers are used when your interface remains the same but has different implementations, but even if you have a single implementation, function pointers can improve the readability of the code.
So, what are the benefits of having static calls rather than dynamic calls through function pointers? An indirect call will obviously take an extra instruction or two, since the address of the function needs to be fetched first, but the return will take the same number of cycles. I just want to understand how, if at all, the processor and compiler can optimize static calls over dynamic calls through function pointers.
Thanks,
Calling a function through a pointer will in most cases prevent the compiler from inlining the call (if it had determined that inlining would be beneficial). In some cases, the compiler is even able to determine the result of a function call at compile time, and thus optimize away the function's entire body; function pointers also prevent this from happening.
That doesn't mean that the impact will be noticeable in any way that actually matters though. The only way to determine that, is to go and benchmark/profile your code.
However, I don't see how function pointers would be able to provide better code readability. You might want to give an example of that.
In general, a direct call gives the compiler far more opportunity for optimization.
If there is really only one implementation and the compiler can see it [*], it can optimize and generate exactly the same code as for a direct call. But that of course depends on how smart the compiler is and what optimization options you use.
[*] i.e. if the pointer value is known at compile time; i.e. if it is a 'static' variable, its address is never passed outside the compilation unit, etc.
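A small sketch of the difference (the names add_one and op are made up for illustration); whether the indirect call actually gets inlined depends entirely on the compiler and the optimization options:

#include <stdio.h>

static int add_one(int x) { return x + 1; }

/* File-scope pointer whose value never escapes this translation unit;
   a good optimizer may prove it always points at add_one and inline the call. */
static int (*op)(int) = add_one;

int main(void) {
    int a = add_one(41);   /* direct call: trivially inlinable */
    int b = op(41);        /* indirect call: inlinable only if the compiler
                              can prove op is always add_one */
    printf("%d %d\n", a, b);
    return 0;
}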
I was solving a practice problem on a site which states that
The purpose of this problem is to verify whether the method you are
using to read input data is sufficiently fast to handle problems
branded with the enormous Input/Output warning. You are expected to be
able to process at least 2.5MB of input data per second at runtime.
Also, how do I optimize input/output with routines other than printf and scanf?
It is operating system specific (because the C standard only knows about <stdio.h>). On Linux, consider using low-level syscalls for efficiency, like open(2), mmap(2), read(2), pread(2), write(2). You might also want to use readahead(2). Don't forget to do I/O in rather large blocks (e.g. 128 Kbytes), page aligned if possible. Read the Advanced Linux Programming book.
If restricted to standard C99 functions, use fread(3) on rather big chunks. Consider also increasing the internal buffer with setvbuf(3).
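For instance, something along these lines (the buffer sizes are arbitrary guesses; tune and benchmark them for your machine):

#include <stdio.h>

int main(void) {
    static char iobuf[1 << 17];            /* 128 KB stdio buffer */
    setvbuf(stdin, iobuf, _IOFBF, sizeof iobuf);

    static char chunk[1 << 16];            /* 64 KB read chunks */
    size_t total = 0, n;
    while ((n = fread(chunk, 1, sizeof chunk, stdin)) > 0)
        total += n;                        /* process the chunk here */

    printf("read %zu bytes\n", total);
    return 0;
}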
And 2.5 Mbytes/sec is not very impressive. The bottleneck is probably the hardware, but you should be able to get perhaps 20 or 50 Mbytes/sec on standard desktop hardware. Using an SSD would help a great deal.
I'm using the Pelles C compiler. Sometimes my code randomly stops working. A particular statement can trigger it. For example, I multiplied a variable by sin(c) (c is a double) and my code seemed to just finish execution with no result. Sometimes it freezes, sometimes it appears to just return, but I can always fix it by removing the offending statement or by disabling compiler optimizations, specifically "maximize speed" or "maximize speed more". The freezing will also go away nearly 100% of the time if I add a printf statement somewhere near the point at which it crashes. I've never found anything to suggest that I am accessing memory improperly, so I'm fairly sure it's a compiler issue. I was wondering if anybody could shed some light on this. Is it possible that I am, in fact, doing something wrong? Or is this a known issue with the Pelles C compiler?
Edit:
Changing
canvas->pixels[(y*canvas->pitch)+(x*canvas->Bpp)+2]=(unsigned char)(255.0*dtempA*(1-sin(c)));
canvas->pixels[(y*canvas->pitch)+(x*canvas->Bpp)+1]=(unsigned char)(255.0*dtempA*(1+cos(c)));
canvas->pixels[(y*canvas->pitch)+(x*canvas->Bpp)]=(unsigned char)(255.0*dtempA*(1+sin(c)));
to (difference at the end of the last line)
canvas->pixels[(y*canvas->pitch)+(x*canvas->Bpp)+2]=(unsigned char)(255.0*dtempA*(1-sin(c)));
canvas->pixels[(y*canvas->pitch)+(x*canvas->Bpp)+1]=(unsigned char)(255.0*dtempA*(1+cos(c)));
canvas->pixels[(y*canvas->pitch)+(x*canvas->Bpp)]=(unsigned char)(255.0*dtempA*(1+1));
makes it work.
It could be either, but a good bet is that it's you :) Variables that are not explicitly initialized will often get different values in an optimized vs an un-optimized build, because the stack layout can change subtly depending on how aggressively the compiler removes temporaries, as well as other factors.
You're probably accidentally relying on undefined behavior somewhere, and changing random instructions in the program is breaking the very fragile stack layout that happens to make the program work.
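A contrived illustration (not your actual code, just a guess at the kind of bug involved): an uninitialized variable like this can appear to work at one optimization level and misbehave at another.

#include <stdio.h>

/* 'scale' is never initialized, so its value is whatever garbage happens to
   be in that stack slot; optimization level, an added printf, or reordered
   temporaries can all change which garbage you get. */
double broken(double c) {
    double scale;              /* BUG: should have been given a value */
    return scale * c;          /* reading it is undefined behavior */
}

int main(void) {
    printf("%f\n", broken(3.14));   /* may "work", print nonsense, or worse */
    return 0;
}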
In a recent question I was encouraged to try using some basic data structures such as binary trees, red-black trees, et cetera, before tackling other things like quadtrees.
My experience in C is fairly limited and I am fearful of using pointers for anything but simple data (like 2D grids, image storage and strings). Although I am familiar with referencing, malloc, realloc and other trivial actions, I am not used to the "hard" parts of C, which makes such structures hard to tackle from theory alone, and I don't want to just copy working code into this.
What I'd like to know, in order to tackle basic trees, is a practical application for them: a sort of exercise with some guidelines ("don't do this or you will kill performance", "don't do that or you will leak memory"), just so I can grasp the practical purpose. Even if I memorize the theory, I still don't know what sort of experiment to conduct in order to understand their application.
I am mostly attempting to use plain C; I don't really understand C++/C# code when reading it, although I have some mastery of the Lua language, in case that helps.
So far I've been combining Lua for dictionary searches, data design and some logic, and leaving all video and audio storage, heavy math and "world" storage in C (using grid structures and a not-too-brute-forced collision detection approach: a linear array that places objects in 1/24 of the map, nothing complex in code terms). Because I could always rely on Lua's solid code for some functions, I neglected learning more of C, and now I am paying for it with a lack of knowledge.
So, to formulate a question: "What is the basic use case for data trees?" The only idea I have so far is using a splay tree to match strings (filenames?) to textures. Is that a valid use? Should I begin with that?
One of the uses for data trees is when writing a parser/compiler. After breaking up the source (lexical analysis) and parsing it (verifying the grammar), you build a tree structure of the source code (a syntax tree), which is then repeatedly visited by the later stages of the compiler.
Another use of trees is when you need to test very quickly, using very little memory, whether strings belong to a set of words; for that you can use a DAWG (directed acyclic word graph).
Finally, a classic use is writing a solver for a Travelling Salesman problem using a data tree to store your cities in memory.
The classic Sedgewick Algorithms in C, Part 5: Graph Algorithms is also full of examples, if you have access to it.
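To make the texture-lookup idea concrete, here is a sketch of a plain (unbalanced) binary search tree keyed by filename; the struct texture payload is hypothetical, and a splay tree would add rebalancing on top of the same basic structure:

#include <stdlib.h>
#include <string.h>

/* Hypothetical payload: whatever the program uses to represent a texture. */
struct texture { int id; };

struct node {
    char           *key;      /* filename */
    struct texture *value;
    struct node    *left, *right;
};

/* Insert keeps the ordering invariant: smaller keys to the left, larger to the right. */
struct node *tree_insert(struct node *root, const char *key, struct texture *value) {
    if (root == NULL) {
        struct node *n = malloc(sizeof *n);
        n->key = malloc(strlen(key) + 1);   /* must be freed along with the node */
        strcpy(n->key, key);
        n->value = value;
        n->left = n->right = NULL;
        return n;
    }
    int c = strcmp(key, root->key);
    if (c < 0)      root->left  = tree_insert(root->left,  key, value);
    else if (c > 0) root->right = tree_insert(root->right, key, value);
    return root;                            /* duplicate keys are ignored here */
}

/* Walk down the tree comparing keys; roughly O(log n) while the tree stays balanced. */
struct texture *tree_lookup(struct node *root, const char *key) {
    while (root) {
        int c = strcmp(key, root->key);
        if (c == 0) return root->value;
        root = (c < 0) ? root->left : root->right;
    }
    return NULL;
}

Once this works, you can layer splay or red-black rebalancing on top without changing the lookup logic, and compare the two on a realistic set of filenames.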