I want to write a ".hex" file of size 28 kb flash on to that Internal flashROM of size 32 kb.The above mentioned question is for initialising the flashROM. What i am doing for writing that file and the code is mentioned below:
I am having a .hex file in Intel hex format, that i have to read and write to internal
FlashROM of At91(8051) microcontroller.
Open the .hex file in "rb+" mode.
Get the length of the file and set the pointer to start address(zeroth address).
As I need to write that file page by page and pagesize in my case is 256 bytes, I have
divide the file by 256.
After that I have try to write that file.
Please let me know where I am going wrong. The code is given below.
int a, b;
int size, last_chunk;
FILE *file;
char *buffer1, name[20];
unsigned long fileLen;

file = fopen("flashROM.hex", "rb+");
if (!file)
{
    fprintf(stderr, "Unable to open file %s", name);
    return;
}

fseek(file, 0, SEEK_END);
fileLen = ftell(file);
printf("the file length is:%d\n", fileLen);
fseek(file, 0, SEEK_SET);

//Allocate memory
buffer1 = (char *)malloc(fileLen + 1);
if (!buffer1)
{
    fprintf(stderr, "Memory error!");
    fclose(file);
    return;
}

//Read file contents into buffer
fread(buffer1, fileLen, 1, file);

/* We have to divide the entire file into chunks of 256 bytes */
size = fileLen / 256;
printf("\nsize = %d\n", size);
last_chunk = fileLen % 256;
printf("\nlast chunk = %d bytes\n", last_chunk);

address = 0x0000;
printf("File upgradation should start from :%.4x", address);

for (a = 0; a <= size; a++)
{
    write(fd, &buffer1, size);
    printf("Iteration=[%d]\t Data=[%x]\n", a, *buffer1);
    usleep(5000);
}
for (b = 0; b <= last_chunk; b++)
{
    write(fd, &buffer1, 1);
    usleep(5000);
}
After running the program, the output is as follows:
Writing upgrade file
the file length is:30855
size = 120
last chunk = 135 bytes
File upgradation should start from :0000
Iteration=[0] Data=[3a]
Iteration=[1] Data=[3a]
Iteration=[2] Data=[3a]
Iteration=[3] Data=[3a]
Iteration=[4] Data=[3a]
Iteration=[5] Data=[3a]
Iteration=[6] Data=[3a]
Iteration=[7] Data=[3a]
I don't understand why the data is always "3a".
Please let me know what I have done wrong in the program.
You need a special tool that reads the .hex file and does whatever is necessary to write it into the flash memory of your controller (JTAG, talking to a bootloader over some means of communication, ...).
The tool depends on your specific microcontroller family. "8051" alone is not enough information; there is a huge variety of 8051 derivatives from many vendors.
Try using "wb+" as your mode for fopen
Opening the file as writable will create it if it doesn't already exist.
The code where you write data also looks suspect. You're passing a pointer to your pointer (&buffer1) into write rather than a simple pointer to your buffer, and I don't see the pointer being incremented, so you're writing the same data repeatedly. That also explains your output: the printf always prints *buffer1, which never changes, and 0x3a is the ASCII code for ':', the character every Intel HEX record starts with.
You could try replacing your writing code with something like the following:
char *ptr = buffer1;
for (a = 0; a < size; a++)
{
    write(fd, ptr, 256);        /* write one full 256-byte page */
    printf("Iteration=[%d]\t Data=[%x]\n", a, *ptr);
    ptr += 256;
    usleep(5000);
}
for (b = 0; b < last_chunk; b++)
{
    write(fd, ptr, 1);          /* write the remaining bytes one at a time */
    ptr++;
    usleep(5000);
}
The program works correctly on Linux, but I get extra characters after the end of the file when running on Windows or through Wine. It's not garbage, but repeated text that was already written. The issue occurs whether I write to stdout or to a file, but it does not occur with small files; a few hundred KB are needed to trigger it.
I have narrowed the issue down to this function:
static unsigned long read_file(const char *filename, const char **output)
{
    struct stat file_stats;
    int fdescriptor;
    unsigned long file_sz;
    static char *file;

    fdescriptor = open(filename, O_RDONLY);
    if (fdescriptor < 0 || (fstat(fdescriptor, &file_stats) < 0))
    {
        printf("Error opening file: %s \n", filename);
        return (0);
    }
    if (file_stats.st_size < 0)
    {
        printf("file %s reports an Incorrect size", filename);
        return (0);
    }
    file_sz = (unsigned long)file_stats.st_size;

    file = malloc((file_sz) * sizeof(*file));
    if (!file)
    {
        printf("Error allocating memory for file %s of size %lu\n", filename, file_sz);
        return (0);
    }

    read(fdescriptor, file, file_sz);
    *output = file;
    write(STDOUT_FILENO, file, file_sz), exit(1); // this statement added for debugging
    return (file_sz);
}
I can't debug through Wine, much less on Windows, but using printf statements I can tell that the reported file size is correct. The issue is either in the reading or the writing, and without a debugger I can't look at the contents of the buffer in memory.
The program was compiled with x86_64-w64-mingw32-gcc version 8.3, which is the same version of gcc on my system.
At this point I'm just perplexed; I would love to hear any ideas you may have.
Thank you.
Edit: The issue was that fewer bytes were being read than the reported file size and I was writing more than necessary. Thanks to Matt for telling me where to look.
read() can return a size different from the one reported by fstat(). I was writing the reported file size instead of the actual number of bytes read, which caused the issue. When writing, use the byte count returned by read() to avoid this.
It is always best to check the return values of read()/write() for failure and to make sure all bytes have actually been read: read() can return fewer bytes than requested, for example when reading from a pipe or when interrupted by a signal, in which case multiple calls are necessary.
Thanks to Mat and Felix for the answer.
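For reference, a minimal sketch of that read-then-write pattern, assuming a POSIX read()/write() pair and a buffer already sized from fstat(); the helper name read_all is just for illustration:

#include <sys/types.h>
#include <unistd.h>

/* Read up to count bytes, retrying on short reads until EOF or an error. */
static ssize_t read_all(int fd, char *buf, size_t count)
{
    size_t total = 0;
    while (total < count) {
        ssize_t n = read(fd, buf + total, count - total);
        if (n < 0)
            return -1;      /* real error */
        if (n == 0)
            break;          /* end of file reached earlier than expected */
        total += (size_t)n;
    }
    return (ssize_t)total;
}

/* Later, write only what was actually read, not the size fstat() reported:
 *     ssize_t nread = read_all(fdescriptor, file, file_sz);
 *     if (nread > 0)
 *         write(STDOUT_FILENO, file, (size_t)nread);
 */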
We use LoadRunner to run performance tests for an application. The tests are generally written in ANSI C.
We have a simple base64 encoding function:
void base64encode(char *src, char *dest, int srcLen, int destLen)
{
    int i = 0;
    int slen = srcLen;
    for (i = 0; i < slen && i < destLen; i += 3, src += 3)
    {
        *(dest++) = base64encode_lut[(*src & 0xFC) >> 0x2];
        *(dest++) = base64encode_lut[(*src & 0x3) << 0x4 | (*(src+1) & 0xF0) >> 0x4];
        *(dest++) = ((i+1) < slen) ? base64encode_lut[(*(src+1) & 0xF) << 0x2 | (*(src+2) & 0xC0) >> 0x6] : '=';
        *(dest++) = ((i+2) < slen) ? base64encode_lut[*(src+2) & 0x3F] : '=';
    }
    *dest = '\0';
}
When run on a developer machine (64-bit Windows 10), this code completes in under a second for a simple image (srcLen around 7 KB).
When run on the production server (32-bit Windows 2012, a VM), the execution takes between 10 and 20 minutes for the same image.
Can anyone explain why, and how to avoid this? I'm not sure whether LoadRunner or the code is to blame.
EDIT: adding the code that is calling the encoding function:
long infile; // file pointer
char *buffer; // buffer to read file contents into
char *filename = "DFC_COLOR.jpg"; // file to read
int fileLen; // file size
int bytesRead; // bytes read from file
char *encoded;
int dest_size;
web_set_max_html_param_len("999999999");
infile = fopen(filename, "rb");
if (!infile) {
lr_error_message("Unable to open file %s", filename);
return;
}
// get the file length
fseek(infile, 0, SEEK_END);
fileLen = ftell(infile);
fseek(infile, 0, SEEK_SET);
lr_log_message("File length is: %9d bytes.", fileLen);
// Allocate memory for buffer to read file
buffer = (char *)malloc(fileLen + 1);
if (!buffer) {
lr_error_message("Could not malloc %10d bytes", fileLen + 1);
fclose(infile);
return;
}
// Read file contents into buffer
bytesRead = fread(buffer, 1, fileLen, infile);
if (bytesRead != fileLen)
{
lr_error_message("File length is %10d bytes but only read %10d bytes", fileLen, bytesRead);
}
else
{
lr_log_message("Successfully read %9d bytes from file: ", bytesRead);
}
fclose(infile);
// Save the buffer to a loadrunner parameter
lr_save_var(buffer, bytesRead, 0, "fileDataParameter");
// calculate the destination size
dest_size = 1 + ((bytesRead + 2) / 3 * 4);
encoded = (char *)malloc(dest_size);
memset(encoded, 0, dest_size);
// encode the buffer
base64encode(buffer, encoded, bytesRead, dest_size);
Run a single virtual user on a physical machine whose resources are not shared or brokered and see whether you get the same performance. You have no way to determine what your priority and resource pool are within the virtual machine environment: you could be running on a VM host that is heavily overloaded, and you are simply waiting while the hypervisor decides who gets a given resource set at which time. Moving a single virtual user to a dedicated hardware host gives you a common, like-for-like basis for the comparison.
LoadRunner also includes an RFC-compliant base64 encoding and decoding implementation as part of its core set, so you do not need to recode this. Normally it is used by the SMTP or DNS virtual users, but you can load the DLL, prototype the functions and move forward. You can find the existing function prototypes in the \include\mic_socket.h header file. Here is a great article on how this is accomplished; I know the editors are going to scream and demote this as an external link, so you may want to capture the link fast:
https://northwaysolutions.com/blog/loadrunner-vugen-encoding-and-decoding-base64/#.WL1vZxLytlc
The other way to handle this is as a Data Format Extension (DFE). Decode is already covered as a base extension, so you would only have to handle the encode if you so desired. Search for LoadRunner, DFE and base64 to find the appropriate references. You can also find a DFE developer's guide in your local documentation set, installed with your copy of LoadRunner.
I am trying to build an antivirus in C.
It works like this:
Read the data of the virus file and of the picture file to be scanned.
Check whether the virus data appears inside the picture data.
I read the data of the scanned file and the virus file like this (I open the file in binary mode because it is a picture, .png):
// open file
file = fopen(filePath, "rb");
if (!file)
{
    printf("Error: can't open file.\n");
    return 0;
}

// Allocate memory for fileData
char* fileData = calloc(fileLength + 1, sizeof(char));

// Read data of file.
fread(fileData, fileLength, 1, file);
After I read the file data and the virus data, I check whether the virus appears in the file like this:
char* ret = strstr(fileData, virusID);
if (ret != NULL)
printf("Infetecd file");
It does not work, even though my picture does contain the VirusID.
I want to check whether the binary data of the virus appears in the binary data of the picture.
For example, the binary data of my virus: http://pastebin.com/xZbWA9qu
And the binary data of my picture (with the virus embedded): http://pastebin.com/yjXr84kr
First, note the order of the arguments of fread: fread(void *ptr, size_t size, size_t nmemb, FILE *stream). To get the number of bytes read, it is better to call fread(fileData, 1, fileLength, file); your current call returns 0 or 1 depending on whether there was enough data in the file, not the number of bytes it has read.
Second, strstr() searches strings, not memory blocks; it stops at the first null byte, and a .png file contains plenty of them. To search binary blocks you need to write your own routine, or you can use the GNU extension function memmem().
// Allocate memory for fileData
char *fileData = malloc(fileLength);
// Read data of file.
size_t nread = fread(fileData, 1, fileLength, file);
void *ret = memmem(fileData, nread, virusID, virusLen);
if (ret != NULL)
printf("Infetecd file");
Search for the first byte of the virus signature; if you find it, check whether the next byte matches the second byte of the signature, and so on until you have checked and matched all bytes of the signature. Then the file is infected. If not all bytes match, resume searching for the first byte of the signature.
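Since memmem() is a GNU extension, a minimal, portable sketch of that byte-by-byte search might look like the following, assuming the file data and the signature are already in memory (the function name find_signature is just for illustration):

#include <stddef.h>

/* Return a pointer to the first occurrence of needle (needleLen bytes)
   inside haystack (haystackLen bytes), or NULL if it is not present. */
static const char *find_signature(const char *haystack, size_t haystackLen,
                                  const char *needle, size_t needleLen)
{
    size_t i, j;

    if (needleLen == 0 || haystackLen < needleLen)
        return NULL;

    for (i = 0; i + needleLen <= haystackLen; i++) {
        j = 0;
        while (j < needleLen && haystack[i + j] == needle[j])
            j++;                    /* keep matching while the bytes agree */
        if (j == needleLen)
            return haystack + i;    /* every byte of the signature matched */
    }
    return NULL;
}

Usage would then be something like: if (find_signature(fileData, nread, virusID, virusLen) != NULL) printf("Infected file");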
I am working on an assignment in socket programming in which I have to send a file between a SPARC and a Linux machine. Before sending the file as a character stream, I have to get the file size and tell it to the client. Here are some of the ways I tried to get the size, but I am not sure which one is the proper one.
For testing purposes, I created a file with the content " test" (a space followed by the string "test").
Method 1 - Using fseeko() and ftello()
This is a method I found on https://www.securecoding.cert.org/confluence/display/c/FIO19-C.+Do+not+use+fseek()+and+ftell()+to+compute+the+size+of+a+regular+file
While fseek() has the problem that "setting the file position indicator to end-of-file, as with fseek(file, 0, SEEK_END), has undefined behavior for a binary stream", fseeko() is said to address this, although it only works on POSIX systems (which is fine, because my environment is SPARC and Linux).
fd = open(file_path, O_RDONLY);
fp = fopen(file_path, "rb");
/* Ensure that the file is a regular file */
if ((fstat(fd, &st) != 0) || (!S_ISREG(st.st_mode))) {
/* Handle error */
}
if (fseeko(fp, 0 , SEEK_END) != 0) {
/* Handle error */
}
file_size = ftello(fp);
fseeko(fp, 0, SEEK_SET);
printf("file size %zu\n", file_size);
This method works fine and gets the size correctly. However, it is limited to regular files only. I tried to google the term "regular file", but I still don't understand it thoroughly, and I don't know whether this approach is reliable for my project.
Method 2 - Using strlen()
Since the maximum size of a file in my project is 4 MB, I can just calloc() a 4 MB buffer. After that, the file is read into the buffer, and I use strlen() to get the file size (or, more correctly, the length of the content). Since strlen() is portable, can I use this method instead? The code snippet looks like this:
fp = fopen(file_path, "rb");
fread(file_buffer, 1024*1024*4, 1, fp);
printf("strlen %zu\n", strlen(file_buffer));
This method works too and returns
strlen 8
However, I couldn't find any similar approach on the Internet, so I am wondering whether I have missed something, or whether there are limitations of this approach that I haven't realized.
"Regular file" means that it is nothing special like a device, socket, pipe, etc., but a "normal" file.
From your task description, it seems that before sending you must retrieve the size of a normal file.
So your way is right:
FILE* fp = fopen(...);
if (fp) {
    fseek(fp, 0, SEEK_END);
    long fileSize = ftell(fp);
    fseek(fp, 0, SEEK_SET); // needed for next read from beginning of file
    ...
    fclose(fp);
}
But you can also do it without opening the file:
#include <sys/types.h>
#include <sys/stat.h>
#include <fcntl.h>
struct stat buffer;
int status;
status = stat("path to file", &buffer);
if(status == 0) {
// size of file is in member buffer.st_size;
}
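If you also want to be sure the path refers to a regular file before trusting st_size (which ties back to the "regular file" question above), a small sketch combining the same stat() call with the S_ISREG() macro might look like this; the helper name regular_file_size is just an illustration:

#include <sys/types.h>
#include <sys/stat.h>

/* Store the size of a regular file in *size and return 0 on success,
   or return -1 if the path cannot be stat'ed or is not a regular file. */
static int regular_file_size(const char *path, off_t *size)
{
    struct stat st;

    if (stat(path, &st) != 0)
        return -1;              /* stat failed (missing file, permissions, ...) */
    if (!S_ISREG(st.st_mode))
        return -1;              /* device, socket, pipe, directory, ... */

    *size = st.st_size;
    return 0;
}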
The OP can do it the easy way, since the "max. size of a file in my project is 4MB".
Rather than using strlen(), use the return value from fread(): strlen() stops at the first null character, so it may report too small a value, and (as @Sami Kuhmonen notes) we do not know whether the data read contains a null character at all, so it may not be a string. Append a null character (and allocate one extra byte) if the code needs to use the data as a string; but in that case, I'd expect the file to need to be opened in text mode.
Note that many OSes do not even commit allocated memory until it is written to:
Why is malloc not "using up" the memory on my computer?
fp = fopen(file_path, "rb");
if (fp) {
#define MAX_FILE_SIZE 4194304
char *buf = malloc(MAX_FILE_SIZE);
if (buf) {
size_t numread = fread(buf, sizeof *buf, MAX_FILE_SIZE, fp);
// shrink if desired
char *tmp = realloc(buf, numread);
if (tmp) {
buf = tmp;
// Use buf with numread char
}
free(buf);
}
fclose(fp);
}
Note: Reading the entire file into memory may not be the best idea to begin with.
I'm developing a very simple FTP client. I have created the data connection socket, but I can't transfer a file successfully:
FILE *f = fopen("got.png", "w");
int total = 0;
while (1){
memset(temp, 0, BUFFSIZE);
int got = recv(data, temp, sizeof(temp), 0);
fwrite(temp, 1, BUFFSIZE, f);
total += got;
if (total == 1568){
break;
}
}
fclose(f);
BUFFSIZE = 1568
I know that my file is 1568 bytes in size, so I try to download it just as a test. Everything is fine when I download .xml or .html files, but nothing good happens when I download .png or .avi files: the original file size is 1568 bytes, but the got.png file size is 1573 bytes. I can't figure out what might cause that.
EDIT:
I have modified my code, so now it looks like this (it can accept any file size):
FILE *f = fopen("got.png", "w");
while (1){
char* temp = (char*)malloc(BUFFSIZE);
int got = recv(data, temp, BUFFSIZE, 0);
fwrite(temp, 1, got, f);
if (got == 0){
break;
}
}
fclose(f);
The received file is still 2 bytes too long.
You are opening the file in text mode, so bare LF characters are going to get translated to CR/LF pairs when written to the file. You need to open the file in binary mode instead:
FILE *f = fopen("got.png", "wb");
You are always writing a whole buffer even if you have received only a partial one; this is the same problem as in roughly 50% of all TCP questions.
The memset is not necessary. And I hope that temp is an array, so that sizeof(temp) does not evaluate to the size of a pointer; better to use BUFFSIZE there as well.
Seeing your edit: after fixing the first problem there is another one, open the file in binary mode.
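Putting the two answers together, a minimal sketch of the receive loop might look like this, assuming data is the connected data socket and BUFFSIZE is a compile-time constant (error handling kept to a minimum):

#include <stdio.h>
#include <sys/socket.h>

#define BUFFSIZE 1568

/* Receive the data connection's payload into "got.png".
   Returns 0 on success, -1 on error. */
static int receive_file(int data)
{
    char temp[BUFFSIZE];
    FILE *f = fopen("got.png", "wb");    /* binary mode: no newline translation */
    if (!f)
        return -1;

    for (;;) {
        int got = recv(data, temp, sizeof(temp), 0);
        if (got < 0) {                   /* socket error */
            fclose(f);
            return -1;
        }
        if (got == 0)                    /* server closed the data connection */
            break;
        fwrite(temp, 1, (size_t)got, f); /* write only the bytes actually received */
    }

    fclose(f);
    return 0;
}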