I am trying to copy data from a file containing hex characters to my embedded device's SRAM over UART from MATLAB. The problem is that I don't know how to stop the program from treating some of the received characters as special commands.
For example, the space character ' ' has the hex value 0x20 in the ASCII table. However, my data may contain 0x20 anywhere, so I can't use it as a delimiter for my program.
Please suggest a way to read all the data from the hex file without causing any such collisions.
This is the relevant part of my code:
memcpy(((uint8_t*)(SRAM_BASE+i)),&cThisChar,1);
UARTCharPut(UART0_BASE, cThisChar);
i++;
//
// Stay in the loop until either a CR or LF is received.
//
}
while((cThisChar != '\n') && (cThisChar != '\r')); // this is where the problem happens!
So what condition should I put on the while loop so that it accepts all characters?
Thanks!
Do not try to detect the end of the file from its content.
Get its size and use a counter instead.
FILE *fp = NULL;
long int fsize = 0;
long int fptr = 0;
/* Open file. */
/* todo: Open file here. */
/* Get file size. */
fseek(fp, 0L, SEEK_END);
fsize = ftell(fp);
fseek(fp, 0L, SEEK_SET);
/* Process file data. */
while (fptr < fsize) {
/* todo: Do your stuff here. */
++fptr;
}
/* Close file. */
/* todo: Close file here. */
Basically, this is exactly what I wanted to achieve! Thanks for your effort anyway!
signed char cThisChar;
while (cThisChar != EOF);
I am posting the solution.
while (j < 2048)
{
    do
    {
        //
        // Read a character using the blocking read function. This function
        // will not return until a character is available.
        //
        cThisChar = UARTCharGet(UART0_BASE);

        //
        // Write the same character using the blocking write function. This
        // function will not return until there was space in the FIFO and
        // the character is written.
        //
        memcpy(((uint8_t*)(SRAM_BASE+i)), &cThisChar, 1);
        UARTCharPut(UART0_BASE, cThisChar);
        i++;

        //
        // Stay in the loop until either a CR or LF is received.
        //
    } //while((cThisChar != '\n') && (cThisChar != '\r'));
    while (i < 17);
    j++;
}
This is the part of the code on the microcontroller side. After repeated iterations, I found that a buffer size of 16 was the best option. j is the outer loop counter: 32768/16 = 2048, so I can write all 32768 bytes in packets of 16 bytes.
And now the corresponding MATLAB version of the code:
while(true)
    txdata = fread(A,16,'uint8','ieee-be');
    %[my_count_rows, my_count_columns]=size(txdata);
    %Convert to decimal format
    %txdata_dec = hex2dec(txdata);
    %Write using the UINT8 data format
    fwrite(obj1,txdata(1:16),'uint8');
    if txdata > 32768
        break;
    end
end
I have to write a program in C that reads some values from stdin, and it has to use the fastest input method available in C. stdin is preloaded in a form like this:
int,int\n
char\n
int,int,int\n etc.
I'm asking for help because scanf is too slow for the project's time requirements, and also because the ',' separators, which I don't actually need, cause me problems when reading.
I've tried gets and getchar, but I didn't manage to make them work.
The fastest way to read stdin ("standard input", file descriptor 0) is the read function from <unistd.h>:
char buff[1024] = {0}; /* zero-initialize */
ssize_t res = read(0, buff, sizeof(buff));
/* res is the number of bytes read; -1 on error */
Here is an example program that reads up to 1024 bytes from stdin and echoes them to stdout (file descriptor 1), with no error handling for simplicity:
#include <unistd.h>
#define BUF_SIZE 1024 /* BUFSIZ is already a stdio macro, so use a distinct name */
int main() {
char buff[BUF_SIZE] = {0}; /* zero-initialize */
ssize_t nread = read(0, buff, BUF_SIZE);
/* pass amount of bytes read as a byte amount to write */
write(1, buff, nread);
return 0;
}
This is the fastest way to read from stdin because read is native libc wrapper for a kernel syscall. By the way, you can use -O3, or even -Ofast compiler options to make it optimize the code.
Also, keep in mind that read and write are not guaranteed to read/write exactly as many bytes as you request; you should call them in a loop like this:
size_t to_write = sizeof(buff); /* example, can be the result of read() */
int fd = 1;
size_t nwrote = 0;
while (nwrote < to_write) {
    /* pointer arithmetic to create an offset from the start of buff */
    ssize_t n = write(fd, buff + nwrote, to_write - nwrote);
    if (n < 0)
        break; /* a real program would check errno (e.g. EINTR) here */
    nwrote += (size_t)n;
}
I am reading data from an input file and compressing it in C with the bzip2 library function BZ2_bzCompress. The compression itself succeeds, but I cannot write all of the compressed data to the output file: only the first compressed line gets written. Am I missing something here?
int main()
{
    bz_stream bz;
    FILE* f_d;
    FILE* f_s;
    BZFILE* b;
    int bzerror = -10;
    unsigned int nbytes_in;
    unsigned int nbytes_out;
    char buf[3000] = {0};
    int result = 0;
    char buf_read[500];
    char file_name[] = "/path/file_name";
    long int save_pos;

    f_d = fopen("myfile.bz2", "wb+");
    f_s = fopen(file_name, "r");
    if ((!f_d) && (!f_s)) {
        printf("Cannot open files");
        return(-1);
    }

    bz.opaque = NULL;
    bz.bzalloc = NULL;
    bz.bzfree = NULL;
    result = BZ2_bzCompressInit(&bz, 1, 2, 30);

    while (fgets(buf_read, sizeof(buf_read), f_s) != NULL)
    {
        bz.next_in = buf_read;
        bz.avail_in = sizeof(buf_read);
        bz.next_out = buf;
        bz.avail_out = sizeof(buf);
        printf("%s\n", buf_read);
        save_pos = ftell(f_d);
        fseek(f_d, save_pos, SEEK_SET);
        while ((result == BZ_RUN_OK) || (result == 0) || (result == BZ_FINISH_OK))
        {
            result = BZ2_bzCompress(&bz, (bz.avail_in) ? BZ_RUN : BZ_FINISH);
            printf("2 result:%d,in:%d,outhi:%d, outlo:%d \n", result, bz.total_in_lo32, bz.total_out_hi32, bz.total_out_lo32);
            fwrite(buf, 1, bz.total_out_lo32, f_d);
        }
        if (result == BZ_STREAM_END)
        {
            result = BZ2_bzCompressEnd(&bz);
        }
        printf("3 result:%d, out:%d\n", result, bz.total_out_lo32);
        result = BZ2_bzCompressInit(&bz, 1, 2, 30);
        memset(buf, 0, sizeof(buf));
    }

    fclose(f_d);
    fclose(f_s);
    return(0);
}
TL;DR: there are multiple problems, but the main one that explains the problem you asked about is likely that you compress each line of the file independently, instead of the whole file as a unit.
According to the docs of BZ2_bzCompressInit, the bz_stream argument should be allocated and initialized before the call. Yours is (automatically) allocated, but not (fully) initialized. It would be clearer and easier to change to
bz_stream bz = { 0 };
and then skip the assignments to bz.opaque, bz.bzalloc, and bz.bzfree.
You store but do not really check the return value of your BZ2_bzCompressInit call. It does eventually get tested in the inner while loop's condition, but there you detect only the success and normal-completion values, not error conditions.
Your handling of the input buffer is significantly flawed.
In the first place, you set the number of available input bytes incorrectly:
bz.avail_in = sizeof(buf_read);
Since you're using fgets() to read data into the buffer, under no circumstances is the full size of the buffer occupied by input data, because fgets() ensures that a string terminator is written into the array. In fact, it can be worse: fgets() stops after a newline, so it may provide as little as one input byte on a successful read.
If you want to stick with fgets() then you need to use strlen() to determine the number of bytes available from each read, but I would suggest that you instead switch to fread(), which will more reliably fill the buffer, indicate with its return value how many bytes were read, and correctly handle inputs containing null bytes.
In the second place, you use BZ2_bzCompress() to compress each buffer of input as if it were a complete file. When you come to the end of the buffer, you finish a compression run and reinitialize the bz_stream. This will definitely interfere with decompressing, and it may explain why your program (seems to) compress only the first line of its input. You should be reading the whole content of the file (in suitably-sized chunks) and feeding all of it to BZ2_bzCompress(... BZ_RUN) before you finish up. There should be one sequence of calls to BZ2_bzCompress(... BZ_FINISH) and finally one call to BZ2_bzCompressEnd() for the whole file, not per line.
You do not perform error detection or handling for any of your calls to standard library or bzip functions. You do handle the expected success-case return values for some of these, but you need to be prepared for errors, too.
There are some additional oddities:
you have unused variables nbytes_in, nbytes_out, bzerror, and b.
you open the input file as a text file, though whether that makes any difference is platform-dependent.
the ftell() / fseek() pair has no overall effect other than setting save_pos, which is not otherwise used.
although it is not harmful, it also is not useful to memset() the output buffer to all-zeroes at the end of each line (or initially).
Given that you're compressing the input, it's odd (but again not harmful) that you provide six times as much output buffer as you do input buffer.
Using C, is there a way to read only the last line of a file without looping over its entire content?
The thing is the file contains millions of lines, each holding an integer (long long int). The file itself can be quite large, I presume up to 1000 MB. I know for sure that the last line won't be longer than 55 digits, but it could be as short as 2 digits. Using any kind of database is out of the question... I've considered it already.
Maybe it's a silly question, but coming from a PHP background I find it hard to answer. I looked everywhere but found nothing clean.
Currently I'm using:
if ((fd = fopen(filename, "r")) != NULL) // open file
{
fseek(fd, 0, SEEK_SET); // make sure start from 0
while(!feof(fd))
{
memset(buff, 0x00, buff_len); // clean buffer
fscanf(fd, "%[^\n]\n", buff); // read file *prefer using fscanf
}
printf("Last Line :: %d\n", atoi(buff)); // for testing I'm using small integers
}
This way I'm looping over the file's content, and as soon as the file gets bigger than ~500k lines, things slow down pretty badly...
Thank you in advance.
maxim
Just fseek to fileSize - 55 and read forward?
If there is a maximum line length, seek to that distance before the end.
Read up to the end, and find the last end-of-line in your buffer.
If there is no maximum line length, guess a reasonable value, read that much at the end, and if there is no end-of-line, double your guess and try again.
In your case:
/* max length including newline */
static const long max_len = 55 + 1;
/* space for all of that plus a nul terminator */
char buf[max_len + 1];
/* now read that many bytes from the end of the file */
fseek(fd, -max_len, SEEK_END);
size_t len = fread(buf, 1, max_len, fd);
/* don't forget the nul terminator */
buf[len] = '\0';
/* and find the last newline character (there must be one, right?) */
char *last_newline = strrchr(buf, '\n');
char *last_line = last_newline+1;
Open with "rb" to make sure you're reading in binary mode. Then fseek(..., SEEK_END) and read bytes from the back until you find the first line separator (if you know the maximum line length is 55 characters, read the last 55 characters ...).
OK, it all worked for me, and I learned something new. The last line of a 41 MB file with >500k lines was read instantly. Thanks to you all, especially 'Useless' (love the irony of your nickname, btw). I will post the code here in the hope that someone else can benefit from it in the future:
Reading ONLY the last line of the file:
the file is structured so that a newline is appended at the end, and I am sure that every line is shorter than, in my case, 55 characters:
file contents:
------------------------
2943728727
3129123555
3743778
412912777
43127787727
472977827
------------------------
notice the trailing newline.
FILE *fd; // File pointer
char filename[] = "file.dat"; // file to read
static const long max_len = 55 + 1; // define the max length of the line to read
char buff[max_len + 1]; // define the buffer and allocate the length
if ((fd = fopen(filename, "rb")) != NULL) { // open file. I omit error checks
fseek(fd, -max_len, SEEK_END); // set pointer to the end of file minus the length you need; presumably there can be more than one newline character
fread(buff, max_len-1, 1, fd); // read the contents of the file starting from where fseek() positioned us
fclose(fd); // close the file
buff[max_len-1] = '\0'; // close the string
char *last_newline = strrchr(buff, '\n'); // find the last occurrence of newline
char *last_line = last_newline+1; // jump to it
printf("captured: [%s]\n", last_line); // captured: [472977827]
}
cheers!
maxim
I have a program which writes results to a file, and I would like to read from that file in real time. It is a normal text file, and the external program always writes a whole line. I only need it to run on a Linux system.
int last_read = 0;
int res;
FILE *file;
char buf[MAXLINE];
char *end = buf + MAXLINE - 1;
int c;
int fileSize;
char *dst;
while (external_programme_is_running()) {
file = fopen(logfile, "r"); //without opening and closing it's not working
fseek(file, 0, SEEK_END);
fileSize = ftell(file);
if (fileSize > last_read) {
fseek(file, last_read, SEEK_SET);
while (!feof(file)) {
dst = buf;
while ((c = fgetc(file)) != EOF && c != '\n' && dst < end)
*dst++ = c;
*dst = '\0';
res = ((c == EOF && dst == buf) ? EOF : dst - buf);
if (res != -1) {
last_read = ftell(file);
parse_result(buf);
}
}
}
fclose(file);
}
Is this a correct approach? Or would it be better to check the modification time and then open the file? Is it possible that reading would crash if the file were modified at the very same time?
To avoid the need to close, re-open, and re-seek for every loop iteration, call clearerr on the stream after reading EOF.
You shouldn't have any problems if you read at the same time the other program writes. The worst that would happen is that you wouldn't get to see what was written until the next time you open the file.
Also, your approach of comparing the last seek position to the end of the file is a fine way to look for additions to the file, since the external program is simply writing additional lines. I would recommend adding a sleep(1) at the end of your loop, though, so you don't use a ton of CPU.
There's no problem in reading a file while another process is writing to it. The standard tail -f utility is used often for that very purpose.
The only caveat is that the writing process must not exclusively lock the file.
Regarding your read code (ie. using fgetc()), since you said that the writing process will be writing a line at a time, you might want to look at fgets() instead.
I want to know how to calculate a CRC16 for any type of file. I have an idea of CRC16 and its code logic.
I made a function that takes a file path as input and computes the CRC value of that file. I want it to work for all file types, like .tar, .tar.gz, .txt, .bin, .scr, etc., so I open each file in "rb" mode, take one character at a time, and update the CRC.
Is this the right way, or am I missing something? It works fine for .txt and other files, but it creates problems for .tar and .tar.gz files. For example, I have an 890 MB file named file.tar.gz that takes 203 microseconds, and another 382 MB file named file2.tar that takes 6097850 microseconds. That's unbelievable to me; how is that possible?
My questions are these:
1. Is something wrong with my CRC code?
2. Is there a problem with the way I read the file data? Maybe I am reading .tar.gz data in the wrong manner.
Here is my code :
int CRC16(const char* filePath) {
//Declare variable to store CRC result.
unsigned short result;
//Declare loop variables.
int intOuterLoopIndex, intInnerLoopIndex, nLen;
result = 0xffff; //initialize result variable to perform CRC checksum calculation.
//Store message which read from file.
//char content[2000000];
//Create file pointer to open and read file.
FILE *readFile;
//Use to read character from file.
char readChar;
//open a file for Reading
readFile = fopen(filePath, "rb");
//Checking file is able to open or exists.
if (!readFile) {
fputs("Unable to open file %s", stderr);
}
/*
Here reading file and store into variable.
*/
int chCnt = 0;
while ((readChar = getc(readFile)) != EOF) {
result ^= (short) (readChar);
for (intInnerLoopIndex = 0; intInnerLoopIndex < 8; intInnerLoopIndex++) {
if ((result & 0x0001) == 0x0001) {
result = result >> 1; //Perform bit shifting.
result = result ^ 0xa001; //Perform XOR operation on result.
} else {
result = result >> 1; //Perform bit shifting.
}
}
//content[chCnt] = readChar;
chCnt++;
}
printf("CRC data length in file: %d", chCnt);
return (result);
}
char readChar; is wrong, it needs to be int readChar;
getc() returns an int, so it can signal EOF to you. EOF is likely the value -1
If you convert the return value to a char and then read a byte with the value 255, that 255 will be interpreted as -1, and you stop reading at the first byte that has the value 255.
This code is wrong at several points:
if (!readFile) {
fputs("Unable to open file %s", stderr);
}
If opening the file fails you don't return, but continue! And the fputs is wrong as well: the %s is never expanded, so the file name isn't printed.
Also, getc() returns an int, but you store it in a char, so reading may stop early when a byte happens to look like EOF. Don't ignore compiler warnings...
I would download a small CRC calculator program, so that you can test if your code is working properly.