Segmentation fault in fscanf() function - c

I am writing some code for a C word-search program and, while running one of the files, I get a segmentation fault at the fscanf() call, but I don't know where the error is.
I already searched for an answer, and I understood that integer variables must be initialized (which I have done) and that the variables passed to fscanf() must be prefixed with '&' (done too).
Here is the code (in the main() function):
int i;
int nl = 0;
int nc = 0;
int a, b, d;
char mat[1000][1000]; // character matrix for wordsearch letters
int x[1000], y[1000];
int l, c, n;

printf("Choose the wordsearch you want to use.\n For example: t01.in, t02.in, t03.in, t04.in, (...), t30.in\n");
char wordsearch_file[8];
fgets(wordsearch_file, 8, stdin); // user enters e.g. 't01.in'

FILE *stream;
char buffer[1024];
sprintf(buffer, "home/adminuser/Desktop/LA/ETAPA2/sopas/%s", wordsearch_file);
stream = fopen(buffer, "r");

if ((fscanf(stream, "%d%d", &nl, &nc)) > 0) { // SEG. FAULT happens here
    for (l = 0; l < nl; l++) {
        for (c = 0; c < nc; c++)
            mat[l][c] = fgetc(stream) != EOF;
        fgetc(stream);
    }
}
I wanted 'nl' (number of lines) to read the first 3 and 'nc' (number of columns) to read the other 3.
The 't01.in' file:
3 3
SIA
ORR
EDI

Any time you open an external resource (file, database, socket), or make any system call whatsoever, you should always check for a valid stream or return code.
The first thing you should do is add a check on stream instead of blindly calling fscanf(stream, ...) without knowing whether the fopen() call succeeded.
Then work out why fopen() failed. I suggest printing out the filename, checking that it exists, and/or using perror() to print the system error. perror() will tell you exactly what is wrong, and if I had to guess, it would be, as @BLUEPIXY mentioned, a newline in the filename.
stream = fopen(buffer, "r");
if (!stream) {
    perror("fopen failed");
}
Lastly, learn how to use the debugger to analyze the core file. If you aren't getting a core dump, set your ulimit correctly. From memory, you want "ulimit -c unlimited". Find out your current ulimits by typing simply "ulimit" at the shell prompt. Then re-run your crashing program. Once you get a core file, run GNU debugger on it.
gdb program.exe core

Related

C Error During Debugger (Cannot open file: ../../../src/gcc-6.3.0/libgcc/config/i386/cygwin.S)

I saw a similar question, but the answer didn't make any sense to me. Basically, during debugging I get the error Cannot open file: ../../../src/gcc-6.3.0/libgcc/config/i386/cygwin.S and the program crashes. Oddly, when I run the getFromMemory() function by itself without debug mode, it runs without crashing, but the output is unexpected (weird characters). I'm using the Code::Blocks IDE and the GNU GCC compiler. Not really sure what's going on. Here is the code:
int main(){
    char output[200000];
    getFromMemory(0, output, 2);
    printf("output: %s\n", output);
    getFromMemory(0, output, 40000);
    printf("output: %s\n", output);
}

void getFromMemory(int lineNum, char *output, int lines){
    Sleep(10); // sleeps to create a delay when accessing main memory
    // gets bits from memory starting at the specified line for a specified number of lines
    // there is a new line every 8 bits
    // not working right now, appears to be doing nothing
    lineNum--;
    FILE *fh;
    fh = fopen("Main Memory.txt", "r");
    char toCombine[lines][10];
    char useless[9];
    for(int i = 0; i < lineNum; i++){
        fgets(useless, 9, fh); // this is done to increment the file handle by the number of lines specified in lines
    }
    for(int i = 0; i < lines; i++){
        fgets(toCombine[i], 9, fh);
    }
    int index = 0; // this is used to keep track of which index in output is being written to
    char character; // character is which character in toCombine[] is being read
    int y = 0;
    for(int i = 0; i < lines; i++){
        character = toCombine[i][0];
        while(character != '\0'){
            y++;
            output[index] = character;
            index++;
            character = toCombine[i][y];
        }
        if(y == 1){
            // if there is a null terminator at the start of a line (technically there shouldn't even be lines, just a long line of 1's and 0's)
            flags[5] = '1';
            break;
        }
        y = 0;
    }
    output[index] = '\0';
    fclose(fh);
}
I put the breakpoint at the line under int main(). I have no idea what the error Cannot open file: ../../../src/gcc-6.3.0/libgcc/config/i386/cygwin.S
represents. Not really sure what to do about this. Thanks.
cygwin.S is the startup code used by the cygwin linker to prepare the CPU/environment for the execution of your application.
I strongly suggest a debugger run that looks like this:
gdb ./nameOfExecutable
break main
run
Execution stops at the first line of main(). From this point, do whatever is needed to test the executable. One common scenario is to run the program to completion:
continue
or single-step through the code:
step
or step over a function call:
next
or display a (visible) variable:
print variableName
or quit:
quit
Use the built-in help system to get the syntax for any command:
help gdbCommand
By passing various commands to gdb, you can set start-up (command-line) parameters, display/change variable values, and much more.

How to properly call an executable from a C program at runtime?

I have a C application, one of whose jobs is to call an executable file. That file has performance-measurement routines inserted during compilation, at the level of intermediate code. It can measure time or L1/L2/L3 cache misses. In other words, I have modified the LLVM compiler to insert a call to that function and print the result to stdout for any compiled program.
Now, as I mentioned at the beginning, I would like to execute the program (with this result returned to stdout) from a separate C application and save that result. The way I'm doing it right now is:
void executeProgram(const char* filename, char* time) {
    printf("Executing selected program %s...\n", filename);

    char filePath[100] = "/home/michal/thesis/Drafts/output/";
    strcat(filePath, filename);

    FILE *fp;
    fp = popen(filePath, "r");

    char str[30];
    if (fp == NULL) {
        printf("Failed to run command\n");
        exit(1);
    }
    while (fgets(str, sizeof(str) - 1, fp) != NULL) {
        strcat(time, str);
    }
    pclose(fp);
}
where filename is the name of the compiled executable to run. The result is saved to time string.
The problem is, that the results I'm getting are pretty different and unstable compared to those that are returned by simply running the executable 'by hand' from the command line (./test16). They look like:
231425
229958
230450
228534
230033
230566
231059
232016
230733
236017
213179
90515
229775
213351
229316
231642
230875
So they're mostly around 230000 us, with some occasional drops. The same executable, run from within the other application, produces:
97097
88706
91418
97970
97972
94597
95846
95139
91070
95918
107006
89988
90882
91986
90997
88824
129136
94976
102191
94400
95215
95061
92115
96319
114091
95230
114500
95533
102294
108473
105730
Note that it is the same executable that's being called. Yet the measured time it returns is different. The program that is being measured consists of a function call to a simple nested loop, accessing array elements. Here is the code:
#include "test.h"
#include <stdio.h>

float data[1000][1000] = {0};

void test(void)
{
    int i0, i1;
    int N = 80;
    float mean[1000];

    for (i0 = 0; i0 < N; i0++)
    {
        mean[i0] = 0.0;
        for (i1 = 0; i1 < N; i1++) {
            mean[i0] += data[i0][i1];
        }
        mean[i0] /= 1000;
    }
}
I'm suspecting that there is something wrong in the way the program is invoked in the code, maybe the process should be forked or something? Any ideas?
You didn't specify where exactly your time-measuring subroutines are inserted, so all I can really offer is guesswork.
The results seem to hint to the exact opposite - running the application from shell is slower, so I wouldn't worry about the way you're starting the process from the C code. My guess would be - when you run your program from shell, it's the terminal that's slowing you down. When you're running the process from your C code, you pipe the output back to your 'starter' application, which is already waiting for input on the pipe.
As a side note, consider switching from strcat to something safer, like strncat.

Segmentation Fault 11 in C File I/O

I'm writing a function which searches a text file formatted like this:
#User1\pass\
#User2\pass\
#User3\pass\
I have written the function Check_User:
int Check_User(char input[20], FILE *userlist)
{
    int c, i;

    fseek(userlist, 0, SEEK_SET);
    while(1)
    {
        while((c = fgetc(userlist)) != '#')
        {
            if(c == EOF)
                return 1;
        }
        while(input[i] == (c = fgetc(userlist)))
        {
            i++;
        }
        i = 0;
        if(c == '\\')
            return 0;
    }
}
Which is called from here:
while(Check_User(username, userlist) == 0)
{
    printf("Username already in use. Please select another:");
    Get_Input(username);
}
Check_User checks the file pointed to by userlist to see whether the username in username[20] is already in use. If it is already in use, it calls Get_Input for a new username.
The program makes it all the way to the while loop and then gives me a segmentation fault: 11. I have read that this often stems from trying to write beyond the end of an array, or generally doing things to memory you don't have access to. Userlist.txt has been opened in r+ mode.
Thanks for the help.
You did not initialize your variable i before its first use. In C, local (automatic) variables are not initialized to zero (as they would be in C# or Java); they simply hold whatever value was present at their memory location before. Thus, the value of i may be far bigger than the length of the string input, and input[i] may access an invalid memory location.
As a side note: to quickly debug this yourself under Linux, compile the program with debug symbols (gcc -g) and then use valgrind --tool=memcheck <your program> to let valgrind find the source of error for you.

Can I parse ngrep's output with popen()?

I tried running this code, but nothing is ever shown. (Yes, I ran it as root.) If I can't get ngrep's output, I guess I'll try to figure out how to use libpcap with C++, although I haven't been able to find any good examples.
int main(void)
{
    FILE* fproc = popen("ngrep -d wlan0 GET");
    char c;

    do {
        printf("%c", fgetc(fproc));
    } while (c != EOF);
}
So what about this code causes nothing to be shown? And what do you suggest for easily parsing ngrep's output, or some other way of capturing GET requests, maybe with libpcap?
I see several potential problems:
You have no open mode for the popen call. Leaving this off is likely to result in either a core dump or a random value on the stack deciding whether it's a read or write pipe.
The c variable should be an int rather than a char, since it has to be able to hold all characters plus the EOF indicator.
You're not actually assigning anything to c, so the loop condition can never become true and the loop may never exit.
And with that do loop, you're trying to output the EOF value to the output stream at the end. I don't know off the top of my head whether that is harmful, but it's certainly not necessary.
Try this:
int main(void) {
    int ch;
    FILE* fproc;

    /* popen() returns NULL on failure, so test against NULL */
    if ((fproc = popen("ngrep -d wlan0 GET", "r")) == NULL) {
        fprintf(stderr, "Cannot open pipe\n");
        return 1;
    }
    while ((ch = fgetc(fproc)) != EOF) {
        printf("%c", ch);
    }
    pclose(fproc);
    return 0;
}
You should also be aware that the pipe is fully buffered by default so you may not get any information until the buffer is full.

When compiling a line counting program in Solaris, an extra three lines are being counted as opposed to MacOSX

I wrote the following code under Mac OS X in Xcode. When I move the code over to a Solaris server, three extra lines are counted and I cannot figure out why.
#include <stdio.h>

#define MAXLINE 281 // 281 is a prime number!!

char words[4][MAXLINE];            // words array to hold menu items
char displayfilename[4][MAXLINE];  // filename array to hold filename for display function
char exit_choice[4][MAXLINE];      // for user interaction at the end of each function
int i;                             // standard array variable
int loop = 1;                      // control variable for my loop

int main()
{
    printf("Enter filename: ");
    scanf("%s", displayfilename[i]);

    FILE *fp;
    int clo_c, clo_nc, clo_nlines;

    fp = fopen(*displayfilename, "r"); // open for reading
    if (fp == NULL)
    {
        printf("Cannot open for reading!\n");
    }

    clo_c = getc(fp);
    while (clo_c != EOF)
    {
        if (clo_c == '\n')
            clo_nlines++;
        clo_nc++;
        clo_c = getc(fp);
    }
    fclose(fp);

    if (clo_nc != 0)
    {
        printf("There are %d lines in this file.\n", clo_nlines);
    }
    else
        printf("File is empty, exiting!\n");
}
Can anyone explain to me why Solaris is adding three to clo_nlines?
You didn't initialize clo_nlines - therefore you got 'undefined behavior'.
Declaring a variable in C doesn't set its value to anything - it just allocates some memory for that variable, and whatever junk happens to be in that bit (well, not bit, but you get the idea >.>) of memory is what the variable starts out as.
There are a couple of issues here.
The first, from a bulletproof-code point of view, is @Zilchonum's point: clo_nc and clo_nlines aren't being initialized. In old C, that means you have no idea what they hold to start with, and so no idea what you'll end with.
However, in practice the stack of a freshly started process is often zeroed by the OS, so uninitialized locals frequently do happen to start at 0, and that's probably not what you're seeing here. (Strictly, the C standard only guarantees zero-initialization for variables with static storage duration.)
More likely is Auri's point, that different machines use different newline conventions. However, I believe that Mac OS X uses a single character for newline, just as Solaris does.
Which brings us to the file itself. Try using od -c to see what's actually in the file. My guess is that you'll find the file on one system has \r\n line endings but on the other has \n, probably as a result of the settings of the file-transfer program you used. It has probably converted to UNIX format on one but not the other.
Did you make sure you're not counting CRLF as two line breaks?
