Writing ByteArray to file in C

I have a struct which holds some ByteArray data:
typedef struct {
    uint32_t length;
    uint8_t* bytes;
} FREByteArray;
And here is how I am trying to save it to a file:
FREByteArray byteArray;
if ((fileToWrite = fopen(filePath, "wb+")) != NULL) {
    fwrite(&byteArray.bytes, 1, byteArray.length, fileToWrite);
    fclose(fileToWrite);
}
But this doesn't seem to save all of the data: the saved file is 16 KB, while the actual data is about 32 KB. I think fwrite is not writing the whole byte array to the file.
Is this the correct way to save the ByteArray? Is there a limit to how much fwrite can handle in a single call?

Replace
fwrite(&byteArray.bytes, 1, byteArray.length, fileToWrite);
with
fwrite(byteArray.bytes, 1, byteArray.length, fileToWrite);
&byteArray.bytes is the address of the bytes pointer itself, not of the data it points to, so the original call wrote the pointer value plus whatever memory happens to follow it. And as pointed out by @Sourav Ghosh, make sure that byteArray.bytes is pointing to the correct source location.
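For completeness, here is a minimal sketch of the corrected write that also checks fwrite's return value; the save_byte_array wrapper is illustrative, and filePath plus a populated byteArray are assumed to exist:

#include <stdio.h>
#include <stdint.h>

typedef struct {
    uint32_t length;
    uint8_t* bytes;
} FREByteArray;

/* Illustrative helper: returns 0 on success, -1 on failure. */
int save_byte_array(const char *filePath, const FREByteArray *byteArray)
{
    FILE *fileToWrite = fopen(filePath, "wb");
    if (fileToWrite == NULL)
        return -1;

    /* Pass the pointer itself, not its address; fwrite returns the
       number of elements it actually wrote. */
    size_t written = fwrite(byteArray->bytes, 1, byteArray->length, fileToWrite);

    if (fclose(fileToWrite) != 0 || written != byteArray->length)
        return -1;
    return 0;
}

There is no practical per-call size limit in fwrite at these sizes; the return value is what tells you whether everything was written.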

Related

How to allocate memory when using custom reading functions in libpng?

I need to read a base64-encoded PNG image, stored as a char array (null-terminated string), and I'm stuck. Here is what I have found out so far:
Libpng allows its I/O to be customized using png_set_*_fn().
Read functions must have a prototype like this one: void user_read_data(png_structp png_ptr, png_bytep data, size_t length); and must check for EOF errors.
The original read function (which reads from the PNG file directly) calls fread and dumps everything into the memory pointed to by data. I have no idea how libpng knows the image size.
So, here is my implementation of the read function:
size_t base64_to_PNG(const char *const base64_png, png_bytep out)
{
    size_t encoded_size, decoded_count;
    size_t decoded_size = base64_decoded_block_size(base64_png, &encoded_size);
    decoded_count = base64_decode_block(base64_png, encoded_size, (char*)out);
    if (decoded_count != decoded_size)
        return 0;
    return decoded_size;
}
void my_read_png_from_data_uri(png_structp png_ptr, png_bytep data, size_t length)
{
    const char *base64_encoded_png = NULL;
    size_t PNG_bytes_len;
    if (png_ptr == NULL)
        return;
    base64_encoded_png = png_get_io_ptr(png_ptr);
    PNG_bytes_len = base64_to_PNG(base64_encoded_png, data);
    if (PNG_bytes_len != length)
        png_error(png_ptr, "Error occurred during decoding of the image data");
}
I believe the information about the decoded image size is lost, and I'm heading straight for a segfault, since I'll be writing to memory beyond the buffer, but I have no idea how to tell libpng how much memory I need. Can you please help me with that?
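No answer is quoted above, but one detail worth noting: libpng calls the read callback many times, each time asking for exactly length bytes of the decoded PNG stream, so the callback has to hand out successive chunks rather than decode everything on every call. Libpng learns the image dimensions itself from the IHDR chunk it reads through this callback, so the callback never needs to know the image size, only how much of the stream is left. A minimal sketch of that pattern, assuming the base64 data has been decoded once up front into a memory buffer (the read_state struct and all names here are illustrative, not part of libpng):

#include <png.h>
#include <string.h>

/* Illustrative state: the fully decoded PNG bytes plus a cursor. */
struct read_state {
    const unsigned char *data;  /* decoded PNG bytes */
    size_t size;                /* total decoded size */
    size_t offset;              /* how much libpng has consumed so far */
};

static void my_read_fn(png_structp png_ptr, png_bytep out, size_t length)
{
    struct read_state *st = (struct read_state *)png_get_io_ptr(png_ptr);

    if (st == NULL || st->offset + length > st->size)
        png_error(png_ptr, "read past end of decoded PNG data");

    /* Hand libpng the next chunk and advance the cursor. */
    memcpy(out, st->data + st->offset, length);
    st->offset += length;
}

/* Registered once, before the header is read:
   struct read_state st = { decoded_buf, decoded_len, 0 };
   png_set_read_fn(png_ptr, &st, my_read_fn);
*/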

How to get BYTE from bytea?

In a PostgreSQL extension (all in C), I encrypt a char*, get a BYTE buffer, write it to a file, and return it as a bytea* via PG_RETURN_BYTEA_P. When I try to read the same data back through bytea* b = PG_GETARG_BYTEA_P, what I read does not match what I wrote to the file. How do I read a bytea correctly, so I can get the BYTE buffer back and decrypt it again?
Part of the code:
FILE* log = AllocateFile("C:\\pg\\log.txt", PG_BINARY_A);
// get the data to decrypt
bytea* dataToDecrypt = PG_GETARG_BYTEA_P(0);
FILE* tempFile = AllocateFile("C:\\pg\\file1.txt", PG_BINARY_A);
fwrite(dataToDecrypt, sizeof(dataToDecrypt), 1, tempFile);
FreeFile(tempFile);
I put the data in file1 so I could use an example from Microsoft. Everything else is as in that example: https://learn.microsoft.com/en-us/windows/win32/seccrypto/example-c-program-decrypting-a-file
At the end:
FreeFile(log);
PG_RETURN_BYTEA_P(pbBuffer);
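No answer is quoted above, but a likely source of the mismatch is that sizeof(dataToDecrypt) is the size of the pointer, not of the data, and that a bytea begins with a varlena length header, so the file receives a pointer-sized chunk of the struct rather than the payload. A hedged sketch of the usual varlena access pattern (the function name is hypothetical; the macros are PostgreSQL's):

#include "postgres.h"
#include "fmgr.h"
#include "storage/fd.h"

PG_FUNCTION_INFO_V1(write_bytea_payload); /* hypothetical name */

Datum
write_bytea_payload(PG_FUNCTION_ARGS)
{
    bytea *dataToDecrypt = PG_GETARG_BYTEA_P(0);

    /* VARDATA skips the varlena header; VARSIZE includes it. */
    char  *payload = VARDATA(dataToDecrypt);
    size_t nbytes  = VARSIZE(dataToDecrypt) - VARHDRSZ;

    /* PG_BINARY_W overwrites; the question's PG_BINARY_A appends across
       calls, which by itself makes reads disagree with any single write. */
    FILE *tempFile = AllocateFile("C:\\pg\\file1.txt", PG_BINARY_W);
    if (tempFile != NULL)
    {
        fwrite(payload, 1, nbytes, tempFile);
        FreeFile(tempFile);
    }

    PG_RETURN_BYTEA_P(dataToDecrypt);
}

Reading the value back symmetrically (palloc VARHDRSZ + nbytes, SET_VARSIZE the result, and fread into its VARDATA) keeps both directions in agreement.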

How to set jbytearray directly from file

I have the following native code that copies from a file into a buffer and then copies the contents of that buffer into a jbyteArray.
JNIEXPORT void JNICALL Java_com_test(JNIEnv *env, jobject) {
    int file_descriptor = 100;
    FILE *file = fdopen(file_descriptor, "r");
    int size_of_file = 1000000;
    // the buffer must point at real storage before fread can fill it
    unsigned char *buffer = new unsigned char[size_of_file];
    fread(buffer, 1, static_cast<size_t>(size_of_file), file);
    jbyteArray imageArr = env->NewByteArray(static_cast<jsize>(size_of_file));
    env->SetByteArrayRegion(imageArr, 0, static_cast<jsize>(size_of_file), (jbyte *)buffer);
    delete[] buffer;
}
As this code runs in a loop, I would like to optimize it as much as possible. Is there any way to read directly from the file into the jbyteArray? I am aware a jbyteArray is a pointer to a struct. Is there any way to set the fields of this struct directly instead of using the SetByteArrayRegion() function?
If not, is there any other function that I can use to read from a file into a jbyteArray?
In short, no. You can probably do it, but it probably won't be much faster, and if something in the JVM's implementation changed, your code would stop working. You are dealing with file I/O, so I don't think SetByteArrayRegion is your real bottleneck here.
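That said, if the extra copy is a concern, standard JNI does let you fread straight into the array's element storage via GetByteArrayElements / ReleaseByteArrayElements; whether that actually saves a copy depends on whether the JVM pins the array or hands back a copy. A hedged sketch, reusing the names from the question:

#include <cstdio>
#include <jni.h>

JNIEXPORT void JNICALL Java_com_test(JNIEnv *env, jobject) {
    int file_descriptor = 100;   // as in the question
    int size_of_file = 1000000;  // as in the question

    FILE *file = fdopen(file_descriptor, "r");
    if (file == nullptr) return;

    jbyteArray imageArr = env->NewByteArray(static_cast<jsize>(size_of_file));

    // isCopy reports whether the JVM pinned the real storage or copied it;
    // only in the pinned case is the intermediate copy actually avoided.
    jboolean isCopy = JNI_FALSE;
    jbyte *elems = env->GetByteArrayElements(imageArr, &isCopy);
    if (elems != nullptr) {
        // Read directly into the array's elements; a real implementation
        // should check the return value of fread.
        fread(elems, 1, static_cast<size_t>(size_of_file), file);
        // Mode 0: copy the data back (if it was a copy) and free the buffer.
        env->ReleaseByteArrayElements(imageArr, elems, 0);
    }
    fclose(file);
}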

Reading unsigned chars with Qt

Hi fellow stack overflowers,
I'm currently parsing a file which contains both text and binary data. Currently, I'm reading the file in the following manner:
QTextStream in(&file);
int index = 0;
while (!in.atEnd()) {
    if (index == 0) {
        QString line = in.readLine(); // parse file here
    } else {
        QByteArray raw_data(in.readAll().toAscii());
        data = new QByteArray(raw_data);
    }
    index++;
}
where data refers to the binary data I'm looking for. I'm not sure this is what I want, since the QString is converted to ASCII and I have no idea whether some bytes are lost.
I checked the documentation, and it recommends using a QDataStream. How can I combine both approaches, i.e. read lines with an encoding and also read the binary dump that follows after one line break?
Help is greatly appreciated!
This will do what you want.
QTextStream t(&in);
QString line;
QByteArray raw_data;
if (!in.atEnd()) { line = t.readLine(); }
in.reset();
int lineSize = line.toLocal8Bit().size() + 1;
in.seek(lineSize);
if (!in.atEnd())
{
    int len = in.size() - lineSize;
    QDataStream d(&in);
    char *raw = new char[len]();
    d.readRawData(raw, len);
    raw_data = QByteArray(raw, len);
    delete[] raw; // array form of delete, since raw was allocated with new[]
}
PS: if the file format is your own, it is better to create the file with QDataStream, write the data with <<, and read it back with >>, as sketched below. This way you can store a QByteArray and a QString in the file without such problems.
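A short sketch of that suggestion (the file name and values are illustrative): QDataStream records each value's length when serializing, so reading back is symmetric and no manual seeking is required:

#include <QFile>
#include <QDataStream>
#include <QString>
#include <QByteArray>

int main()
{
    // Writing: the stream stores each value together with its length.
    QFile out("data.bin");
    if (out.open(QIODevice::WriteOnly)) {
        QDataStream s(&out);
        s << QString("header line") << QByteArray("\x00\x01\x02\x03", 4);
        out.close();
    }

    // Reading: extract in the same order; no seeking needed.
    QString line;
    QByteArray blob;
    QFile inFile("data.bin");
    if (inFile.open(QIODevice::ReadOnly)) {
        QDataStream s(&inFile);
        s >> line >> blob;
        inFile.close();
    }
    return 0;
}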

fwrite() and file corruption

I'm trying to write a wchar array to a file in C; however, there is some sort of corruption, and irrelevant data such as variables and paths like this
c.:.\.p.r.o.g.r.a.m. .f.i.l.e.s.\.m.i.c.r.o.s.o.f.t. .v.i.s.u.a.l. .s.t.u.d.i.o. 1.0...0.\.v.c.\.i.n.c.l.u.d.e.\.x.s.t.r.i.n.g..l.i.s.t...i.n.s.e.r.t
are written to the file along with the correct data (see the example above). I have confirmed that the buffer is null-terminated and contains proper data.
Here's my code:
myfile = fopen("logs.txt", "ab+");
fseek(myfile, 0, SEEK_END);
long int size = ftell(myfile);
fseek(myfile, 0, SEEK_SET);
if (size == 0)
{
    wchar_t bom_mark = 0xFFFE;
    size_t written = fwrite(&bom_mark, sizeof(wchar_t), 1, myfile);
}

// in another func
while (true)
{
    [..]
    unsigned char Temp[512];
    iBytesRcvd = recv(sclient_socket, (char*)&Temp, iSize, NULL);
    if (iBytesRcvd > 0)
    {
        WCHAR* unicode_recv = (WCHAR*)&Temp;
        fwrite(unicode_recv, sizeof(WCHAR), wcslen(unicode_recv), myfile);
        fflush(myfile);
    }
    [..]
}
What could be causing this?
recv() will not null-terminate Temp, so wcslen() runs past the bytes actually written by recv(). You will get correct results if you just use iBytesRcvd as the byte count for fwrite() instead of using wcslen() and hoping the received data is correctly null-terminated (wide-null-terminated, that is):
fwrite(unicode_recv, 1, iBytesRcvd, myfile);
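Put back into the receive loop from the question, the fix looks like this sketch (variable names are the question's; iBytesRcvd is the number of bytes recv() actually delivered):

if (iBytesRcvd > 0)
{
    /* Write exactly what arrived; the data does not need to be
       (wide-)null-terminated for this to be correct. */
    fwrite(Temp, 1, (size_t)iBytesRcvd, myfile);
    fflush(myfile);
}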
