Code::Blocks, C. I'm trying to write characters to a .txt file using fwrite. The first couple of characters get written correctly, but after them the file says: _mcleanup: tos ov. I think it might be a buffer overflow. Any ideas?
#include <stdio.h>
#include <stdlib.h>
#include <sys/stat.h>
#include <fcntl.h>
int main()
{
    FILE *p1;
    p1 = fopen("Tomi.txt", "w+");
    fseek(p1, 0, SEEK_SET);
    // fwrite("Toth Tamas", sizeof(char), 30, p1);
    while (a < 10)
    {
        fwrite("Toth Tamas", sizeof("Toth Tamas"), 1, p1);
        a++;
    }
    return 0;
}
If I add int a = 0; immediately after FILE *p1;, the while-loop version of the program compiles without complaint even with maximum warnings, and executes without error on my computer, even under valgrind. You want sizeof("Toth Tamas") - 1 in the fwrite call to keep the terminating NUL byte from being written to the file, and maybe there should be a \n after "Tamas" (see my earlier comments for other minor problems), but that code seems mostly okay.
However, the commented-out line //fwrite("Toth Tamas",sizeof(char),30,p1); does have a bug: the string constant "Toth Tamas" is only 11 bytes long (counting the terminating NUL), but you have asked fwrite to write 30 bytes, so it will access 19 more bytes beyond the end of the string, triggering undefined behavior. In context, one would expect either a segmentation fault, or garbage written to the file.
And "_mcleanup: tos ov" is 17 bytes long, so if that is (the beginning of) the next string constant in the read-only data segment of your executable, it's a plausible thing to show up as garbage written to the file, which I think is what you are saying happened.
So I've been tasked to write my own system call in Linux. This system call will take a pointer to a character array and replace all o's with 0's. The system call will return the number of replacements performed. If the string is larger than 128 bytes, it will return -1. I've already implemented a different system call that works completely fine following the steps in this link:
https://medium.com/anubhav-shrimal/adding-a-hello-world-system-call-to-linux-kernel-dad32875872 . I have double-checked everything and everything seems fine, so I believe something is wrong in either the system call I wrote or the way I test it. Assuming this is the case, is there anything wrong with my system call? I think it may have something to do with the character pointer or something of that sort.
Here is my system call.
#include <linux/kernel.h>
asmlinkage int sys_my_syscall2(char *string){
    if(sizeof(string) > 128){
        return -1;
    }
    int x, count = 0;
    for(x = 0; x < sizeof(string); x++){
        if(string[x] == 'o'){
            string[x] = '0';
            count++;
        }
    }
    return count;
}
Here is my test file.
#include <stdio.h>
#include <linux/kernel.h>
#include <sys/syscall.h>
#include <unistd.h>
int main() {
    int syscall2 = syscall(333, "Hello World");
    printf("Syscall 333 printed %d", syscall2);
    return 0;
}
333 is the number of my syscall. After running the test file, it pauses for a very long time and seems to freeze. Even after pressing Ctrl-C, the program still seems to run. Is there a problem with these files?
The correct way is to:
Pass pointer to string to the syscall
do a strlen_user(string) to find out the size of the string
Provide a buffer large enough to hold the string (for example with kmalloc())
use copy_from_user() to copy the string from userspace into a kernel buffer (check if the call succeeded!)
Perform the substitution operation on the kernel buffer
use copy_to_user() to copy the string back to userspace
Free the buffer if you allocated one with kmalloc().
You should not access userspace memory directly: the kernel may oops if the memory your pointer refers to is unavailable (e.g. swapped out) or protected against reading or writing. That is the case in your program; writing to this memory directly (without copy_to_user) will result in a kernel page fault. Note: direct access can sometimes appear to work, but it is never safe!
Furthermore, you have to use strlen(string) (or strlen(string) + 1 to account for the terminating zero byte) when operating on strings; sizeof(string) only evaluates to the size of the pointer itself, not the length of the string it points to.
I've got a little problem while experimenting with some C code. I've tried to use read() to read text out of a file and store the result in a char array. But when I print the result, it's always different from the file.
Here is the code:
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
void main() {
    int fd = open("file", 2);
    char buf[2];
    printf("Read elements: %ld\n", read(fd, buf, 2));
    printf("%s\n", buf);
    close(fd);
}
The file "file" was created in the same directory using the following UNIX commands:
cat > file
Hi
So it contains just the word "Hi". When I run it, I expect it to read 2 bytes from the file ('H' and 'i') and store them at buf[0] and buf[1]. But when I print the result, it appears that something went wrong, because besides the word "Hi" there are several weird characters printed (indicating a memory reading/writing problem, I guess, due to a bad buffer size). I've tried increasing the size of the buf array, and it appears that when I change the size, the weird characters printed change. The problem goes away once the size reaches 32 bytes.
Can someone explain to me in detail why this is happening?
I've understood so far that read() does not append a '\0' when it reads something, and that the third parameter of read() indicates the maximum number of bytes to read.
Another thing I've noticed while experimenting with the above code: let's assume one changes the third parameter (maximum bytes to read) of read() to 3, and the size of the buf array to 512 (overkill, I know, but I really wanted to see what would happen). Now read() will actually read a third character (in my case 'e') and store it in the buffer, even though this third character does not exist.
I've searched Stack Overflow for a while now and found many similar cases, but none of them made me understand my problem. If there is another thread I missed, I'd appreciate a link to it.
Lastly: sorry for my bad English, it's not my native language.
Clearly you need to make buf 3 bytes long and use the last byte as the null byte (0 or '\0'). That way, when you print the string, your computer doesn't carry on reading memory until it happens to find a 0 somewhere else!
The way strings (char arrays, really) are handled in C is quite straightforward. When dealing with strings, most if not all functions operate under the assumption that string parameters are null terminated (puts) or return null-terminated strings (strdup).
The point is that, by default, the computer can't tell where a string ends unless it is given the string's size every time it processes it. The easiest way around this was to append a 0 (the null byte) after each string. That way, the computer just needs to iterate over the string's characters and stop when it finds the termination character (another name for the null byte).
I'm new to C, so I apologize if the answer is obvious, I've searched elsewhere.
the libraries I'm including are:
#include <unistd.h>
#include <stdio.h>
#include <string.h>
#include <stdlib.h>
#include <sys/wait.h>
#include <fcntl.h>
#include <sys/stat.h>
the code that is failing is:
char *USER = getlogin();
char CWD[128];
if (USER == NULL)
    printf("cry\n");
getcwd(CWD, sizeof(CWD));
printf("this prints\n");
printf(USER);
printf("this does not\n");
printf("%s#myshell:%s> ", USER, CWD);
"cry" does not print, so that should mean getlogin() is successful. The segfault is caused by printf(USER);
Further testing shows that the following block prints entirely:
printf("this prints\n");
printf(USER);
printf("this prints\n");
but the following block will print "this prints" and then segfault without showing USER:
printf("this prints\n");
printf(USER);
EDIT:
Sorry for wasting your time. I accidentally deleted an fgets that was supposed to follow it and that was causing the segfault. I've been on this bug for a couple hours now, I love it when the problem is so small.
Thanks
You should check getcwd's return value. According to the man page of getcwd:
If the length of the absolute pathname of the current working
directory, including the terminating null byte, exceeds size bytes,
NULL is returned, and errno is set to ERANGE; an application should
check for this error, and allocate a larger buffer if necessary.
If USER is null, printf will dereference a null pointer, which is undefined behavior. The compiler isn't required to do something that makes sense when undefined behavior occurs, so it would be allowed to not print "cry" when USER is null. You'll want to avoid undefined behavior.
Something else that might be causing your result is the fact that data sent to stdout is usually buffered. If the program crashes before the data is flushed from the buffer, the data will be lost instead of printed.
So here is how printf works...
it has to read in the format string.
it has to grab values off of the stack in order to fulfill format specifiers, format them and output them.
examples of things that could cause this...
char stringOfTest[5] = {'1','2','3','4','5'};
or
char * stringOfTest = "here are some formats that will be unsatisfied: %d%f%i%s%x";
So the first one could crash because the string isn't null terminated; depending on the state of the application, printf could basically read until it overruns a buffer (a good implementation should guard against this), or until it happens to run into a format specifier that causes a crash... This goes for any garbage data as well.
And the second one deals with how variadic functions work: all of the arguments are pushed onto the stack in some order, and the function has no safe way to know which is the last one... so it will keep grabbing a value for each format specifier until it grabs something invalid off the stack and (maybe) crashes.
The third way is also shown in the second example: a %s causes a pointer to be dereferenced, so that can also crash.
This program crashes at the point i=(strlen(data)); with the message
No source available for "strlen() "
But Why?
#include <stdio.h>
#include <string.h>
#include <stdlib.h>
int main (void) {
    char data[] = "Hallo";
    char buffer[100];
    if (strlen(data) != 0)
    {
        size_t i = 0;
        i = (strlen(data));
        snprintf(buffer, i, "Data: %s \n", data);
        return strlen(data) + 1;
    }
    return -1;
}
The error message you cite does not sound like a crash. More like a debugger trying to step into a system library function.
I suspect the cause of the problem is
snprintf(buffer,i,"Data: %s \n",data);
The i here is the "buffer size". i is also the length of data. So you're writing a string to a buffer which is longer than the buffer size. The effect is that snprintf() truncates the output, so not the entire data string will be written.
In fact, "Data: " is six characters long, which is longer than i (5). So maybe what's happening is that snprintf never makes use of the %s specifier, which somehow breaks the stack?
Try replacing i with sizeof(buffer) and see whether that works better.
I just ran this program in Eclipse, and it works fine. It sounds like you are stepping through the code line-by-line and when you get to the strlen call you do a "Step-Into"(F5) instead of "Step Over"(F6). So Eclipse is trying to debug strlen.
Either way, this is an Eclipse issue and I suggest you add an Eclipse tag to the question.
I'm kind of new to C programming... I'm trying to read a user name from the user and another from a text file. The first one is working, but when reading from the text file and storing into user_name it gives me a segmentation error. What's wrong here?
char user_in[10];
char user_name[10];
scanf("%s",user_in);
FILE *users_file;
users_file=fopen("users.txt","r");
fscanf(users_file,"%s",user_name);// segmentation error
(EDITED) :
The file does exist (I've tested it). the first content is a 5 character long string followed by a white space;
Sarah Mary Sally
You should ensure that you do not write beyond the allocated size of user_name.
You allocated user_name room for 10 characters. If your file contains a string longer than the memory allocated for user_name, there is not enough space to store that content, and writing past the bounds of the allocated memory is undefined behavior, which can very well lead to a segmentation fault.
Also, there is no error handling in your program. For example, you should be checking whether the fopen call succeeded: if users.txt cannot be opened, users_file is NULL, and passing that to fscanf is itself undefined behavior.
In short, whenever you use C standard library functions, always check whether the call succeeded.
If the content the program reads from the file is larger than user_name, it will cause a buffer overflow and can corrupt the function's stack.
You'd be better giving a size in scanf()
scanf("%9s", user_in);
and
fscanf(users_file, "%9s", user_name);
You specify 9 as the length, since the final, tenth character has the value zero to represent the end of the string.
Also, as others have said, check that you successfully opened the file:
#include <errno.h> /* errno */
#include <string.h> /* strerror() */
users_file = fopen("users.txt","r");
if(!users_file){
    fprintf(stderr, "couldn't open users.txt: %s\n", strerror(errno));
    return 1;
}