I use the method below for reading a binary file:
public void readFile()
{
    try
    {
        Reader in = new InputStreamReader(this.getClass().getResourceAsStream(this.fileName));
        int count = (in.read() * 0xFF) + in.read();
        int heights = in.read();
        this.shapes = new int[count][];
        for (int ii = 0; ii < count; ii++)
        {
            int gwidth = in.read();
            int[] tempG = new int[gwidth * heights];
            int len = (in.read() * 0xff) + in.read();
            for (int jj = 0; jj < len; jj++)
            {
                tempG[top++] = in.read() * 0x1000000;
            }
            this.shapes[ii] = tempG;
        }
        in.close();
    } catch (Exception e) {}
}
It works perfectly in the NetBeans emulator and on some devices, but on other devices and in KEmulator it seems that in.read() reads a char (two bytes), which makes my app crash on those devices and emulators.
What is the best method for reading a file as bytes?
Since you are always dealing with bytes, you should use an InputStream rather than an InputStreamReader.
As the Javadoc says:
An InputStreamReader is a bridge from byte streams to character streams: It reads bytes and decodes them into characters using a specified charset. The charset that it uses may be specified by name or may be given explicitly, or the platform's default charset may be accepted.
And its read() method reads a single character, not a byte. An InputStream, on the other hand, represents an input stream of bytes, and its read() method is documented as:
public abstract int read() throws IOException
Reads the next byte of data from the input stream. The value byte is returned as an int in the range 0 to 255. If no byte is available because the end of the stream has been reached, the value -1 is returned. This method blocks until input data is available, the end of the stream is detected, or an exception is thrown.
(And for kicks, here's a dated article about "buffered readers" in j2me)
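For completeness, here is a minimal sketch of the same loop with the Reader swapped for a raw InputStream. The arithmetic is kept exactly as in the question (whether 0xFF is really the right multiplier for your format is a separate issue), and the undefined top index is assumed to just be the inner loop counter:
public void readFile() {
    InputStream in = null;
    try {
        in = this.getClass().getResourceAsStream(this.fileName);
        // read() now returns one unsigned byte (0..255) on every device
        int count = (in.read() * 0xFF) + in.read();
        int heights = in.read();
        this.shapes = new int[count][];
        for (int ii = 0; ii < count; ii++) {
            int gwidth = in.read();
            int[] tempG = new int[gwidth * heights];
            int len = (in.read() * 0xff) + in.read();
            for (int jj = 0; jj < len; jj++) {
                tempG[jj] = in.read() * 0x1000000;
            }
            this.shapes[ii] = tempG;
        }
    } catch (IOException e) {
        // at least log the failure instead of swallowing it
        e.printStackTrace();
    } finally {
        if (in != null) {
            try { in.close(); } catch (IOException ignored) {}
        }
    }
}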
The best example I could find comes directly from Nokia:
public Image readFile(String path) {
    try {
        FileConnection fc = (FileConnection) Connector.open(path, Connector.READ);
        if (!fc.exists()) {
            System.out.println("File doesn't exist!");
        }
        else {
            int size = (int) fc.fileSize();
            InputStream is = fc.openInputStream();
            byte bytes[] = new byte[size];
            is.read(bytes, 0, size);
            image = Image.createImage(bytes, 0, size);
        }
    } catch (IOException ioe) {
        System.out.println("IOException: " + ioe.getMessage());
    } catch (IllegalArgumentException iae) {
        System.out.println("IllegalArgumentException: " + iae.getMessage());
    }
    return image;
}
http://www.developer.nokia.com/Community/Wiki/How_to_read_an_image_from_Gallery_in_Java_ME
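One caveat with the Nokia snippet: a single InputStream.read(byte[], int, int) call is not guaranteed to fill the whole buffer. A sketch of the same idea using DataInputStream.readFully (assuming the same FileConnection setup and the same image field as above, inside the same try/catch) avoids that:
FileConnection fc = (FileConnection) Connector.open(path, Connector.READ);
if (fc.exists()) {
    int size = (int) fc.fileSize();
    DataInputStream dis = fc.openDataInputStream();
    byte[] bytes = new byte[size];
    dis.readFully(bytes);   // blocks until all 'size' bytes have been read
    image = Image.createImage(bytes, 0, size);
    dis.close();
}
fc.close();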
I'm trying to fetch 3 keys from SPIFFS that I previously stored as files from strings (fetched from an API endpoint).
I am able to read the files using the SPIFFS library and print the values in the serial console, and the contents look fine. But when I pass the contents to the secureClient.setCACert() function, they "break" and throw an exception:
[E][ssl_client.cpp:36] _handle_error(): [start_ssl_client():138]: (-8576) X509 - The CRT/CRL/CSR format is invalid, e.g. different type expected
My code looks like the following:
#include <WiFiClientSecure.h>
#include <FS.h>
#include <SPIFFS.h>

WiFiClientSecure net = WiFiClientSecure();

String read2String(const char * path) {
    String output = "";
    File file = SPIFFS.open(path, "r");
    if (!file || file.isDirectory()) {
        return "";
    }
    while (file.available()) {
        char c = file.read();
        output.concat(c);
    }
    file.close();
    return output;
}

void readFilesToVariables() {
    String awsRootCa = read2String("/AmazonRootCA1.pem");
    String privateKey = read2String("/private.pem.key");
    String certificate = read2String("/certificate.pem.crt");
    net.setCACert(awsRootCa.c_str());
    net.setCertificate(certificate.c_str());
    net.setPrivateKey(privateKey.c_str());
}

void setup() {
    Serial.begin(115200);
    SPIFFS.begin();
    readFilesToVariables();
    mqttClient.begin(AWS_IOT_ENDPOINT, AWS_IOT_ENDPOINT_PORT, net); // crashes here (I think)
    ...
}
The MQTT client and the other variables on the mqttClient.begin(...) line (everything except the net var) are defined elsewhere in my sketch but not shared here.
Does anyone have an idea what I'm doing wrong here?
The exact same code works when I define the certificates and private key using the following approach:
static const char AWS_CERT_CA[] PROGMEM = R"EOF(
-----BEGIN CERTIFICATE-----
certificate_content_here
-----END CERTIFICATE-----
)EOF";
What am I doing wrong?
The same thing drove me crazy for the last few days.
Then I found a solution, though I can only guess at the underlying problem.
I think this issue has something to do with memory handling: where and how you store the certificates after you read them from flash.
Here's how I read the certificates. Using calloc I allocate the memory and never free it, so that the certificate stays available to the MQTT client:
char* getFileAsString(String path) {
    File file = SPIFFS.open(path);
    if (!file) {
        Serial.println("Failed to open " + path);
        return NULL;
    }
    char* buffer = (char*)calloc(file.size(), sizeof(char));
    file.readBytes(buffer, file.size());
    file.close();
    return buffer;
}
Then I just use it in the config struct:
const esp_mqtt_client_config_t mqttConf = {
    .uri = MQTT_BROKER,
    .client_id = MQTT_CLIENT_ID,
    .cert_pem = getFileAsString("/caFile"),
    .client_cert_pem = getFileAsString("/certificateFile"),
    .client_key_pem = getFileAsString("/privateFile"),
};
Note: I didn't set the *_len attributes, because I stored my certificates with a trailing null character:
dd if=/dev/zero bs=1 count=1 >> [my cert name].pem
You can use regular certificates, but then you must either set the certificate lengths or add 1 byte to the calloc call (calloc sets all the allocated memory to 0x00, so the buffer ends with a null character):
char* buffer = (char*)calloc(file.size() + 1, sizeof(char));
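Applied to the WiFiClientSecure setup from the original question, a sketch of the same idea (never-freed, NUL-terminated buffers handed straight to setCACert and friends) could look like this; readFileToBuffer is just a renamed variant of getFileAsString with the extra byte:
// Sketch: read a PEM file from SPIFFS into a buffer that is never freed.
// The extra byte from calloc(file.size() + 1, ...) keeps the buffer NUL-terminated.
char* readFileToBuffer(const char* path) {
    File file = SPIFFS.open(path, "r");
    if (!file || file.isDirectory()) {
        Serial.printf("Failed to open %s\n", path);
        return NULL;
    }
    char* buffer = (char*)calloc(file.size() + 1, sizeof(char));
    if (buffer != NULL) {
        file.readBytes(buffer, file.size());
    }
    file.close();
    return buffer;
}

void readFilesToVariables() {
    net.setCACert(readFileToBuffer("/AmazonRootCA1.pem"));
    net.setCertificate(readFileToBuffer("/certificate.pem.crt"));
    net.setPrivateKey(readFileToBuffer("/private.pem.key"));
}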
Input: a stream of ogg/vorbis coming from an encoder chip of an embedded system.
Problem: create output chunks of one second without transcoding.
Issue: the stream is being read "in the middle", so the first page with the BOS (Beginning of Stream) flag is not available. Since the encoder chip always uses the same parameters, I'd like to recreate the BOS page using the BOS page of a stream that was read from the start (a reference stream).
I am trying to use vcut. I modified it so that it creates an endless series of one-second chunks. That was easy, and it works with files and streams that have a BOS.
I also hacked it so that the first pages of the reference stream are written to a file and read back before reading the production stream that has no BOS. This way, vs->headers are populated. When I detect a change in the page serial number, I patch it so that vcut and libogg do not complain:
int process_page(vcut_state *s, ogg_page *page) {
    ...
    else if(vs->serial != ogg_page_serialno(page))
    {
        // fprintf(stderr, _("Multiplexed bitstreams are not supported.\n"));
        vs->stream_in.serialno = ogg_page_serialno(page);
        vs->serial = ogg_page_serialno(page);
        vs->granulepos = -1;
        vs->initial_granpos = 0;
        // ogg_stream_init(&vs->stream_in, vs->serial);
        // vorbis_info_init(&vs->vi);
        // vorbis_comment_init(&vs->vc);
        s->vorbis_init = 1;
    }
However, this gigantic hack does not work. How can I solve this issue?
It actually works: see VS1053 split ogg.
What I needed to do was account for the fact that, since reading starts in the middle of the stream, granulepos is naturally high. So it was my own logical mistake.
In process_audio_packet, I added:
int process_audio_packet(vcut_state *s,
        vcut_vorbis_stream *vs, ogg_packet *packet)
{
    ...
    if(packet->granulepos >= 0)
    {
        if (!firstNonZeroGranule) { // my addition
            firstNonZeroGranule = 1;
            vs->initial_granpos = packet->granulepos - bs;
            if(vs->initial_granpos < 0)
                vs->initial_granpos = 0;
        } else if(vs->granulepos == 0 && packet->granulepos != bs) {
            ...
I'm trying to write an X509 cert to DER format in memory.
Writing it to a file works perfectly.
I need the cert in PEM format without the "-----BEGIN PRIVATE KEY-----" header, footer or newlines. I can't figure out how to do that directly, so I'm outputting to DER and base64-encoding it.
THIS WORKS.
int X509_to_DER_file(X509 *cert) {
    int res = 0;
    BIO *out = BIO_new(BIO_s_file());
    if (NULL != out) {
        if (BIO_write_filename(out, "my.der") > 0) {
            res = i2d_X509_bio(out, cert);
        }
        BIO_free_all(out);
    }
    return (res);
}
THIS DOES NOT.
It returns and mallocs the correct number of bytes and appears to write out to memory correctly, but the resulting string is incorrect (only the first 15 or so positions are correct).
char *X509_to_DER_mem(X509 *cert) {
    char *der = NULL;
    BIO *bio = BIO_new(BIO_s_mem());
    if (NULL != bio) {
        // load cert into bio
        if (0 == i2d_X509_bio(bio, cert)) {
            BIO_flush(bio);
            BIO_free(bio);
            return NULL;
        }
        der = (char *) malloc(bio->num_write + 1);
        if (NULL == der) {
            BIO_free(bio);
            return NULL;
        }
        memset(der, 0, bio->num_write + 1);
        BIO_read(bio, der, bio->num_write);
        // Appears to work, but "der" is incomplete.
        BIO_free(bio);
    }
    return der;
}
It returns and mallocs the correct number of bytes and appears to
write out to memory correctly but the resulting string is incorrect
The result of i2d_X509_bio() is not a (zero-terminated) string, but a bunch of bytes. If you try to write it to a file as a string, it might look incomplete because you might encounter a 0-byte at a location before you reach the end. So in addition to the char * result, your function X509_to_DER_mem() will have to return the number of bytes that make up the result.
With regard to the memory BIO, another way of obtaining its data is with the BIO_get_mem_data() function. Something like this:
char *ptr = NULL;
long len = BIO_get_mem_data(bio, &ptr);
der = malloc(len);
memcpy(der, ptr, len);
Finally, your actual question is:
I need the Cert in PEM format without the "-----BEGIN PRIVATE
KEY-----" header, footer or newlines.
Writing the certificate in DER format does not seem to give you what you need. This answer to another SO question explains how you could use the function PEM_read_bio() in combination with EVP_EncodeBlock() for that purpose.
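An alternative that skips PEM parsing entirely is to take the DER bytes from i2d_X509() and base64-encode them in one go with EVP_EncodeBlock(), which emits no line breaks. A rough, untested sketch (error handling trimmed; the caller frees the returned string):
#include <openssl/x509.h>
#include <openssl/evp.h>
#include <openssl/crypto.h>
#include <stdlib.h>

/* Sketch: DER-encode a certificate and return its base64 body as a
 * single NUL-terminated line (no header, footer or newlines). */
char *X509_to_base64_mem(X509 *cert) {
    unsigned char *der = NULL;
    int der_len = i2d_X509(cert, &der);   /* OpenSSL allocates 'der' */
    if (der_len <= 0)
        return NULL;

    /* base64 output is 4 bytes per 3 input bytes, plus the NUL terminator. */
    char *b64 = malloc(((der_len + 2) / 3) * 4 + 1);
    if (b64 != NULL)
        EVP_EncodeBlock((unsigned char *)b64, der, der_len);

    OPENSSL_free(der);
    return b64;
}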
We use testing equipment (manufactured in 1995) powered by MS-DOS. An analog-to-digital converter records information into a file.
[picture1] shows the structure of that file.
[picture2] shows the oscillogram constructed from the data in the file (using the program that opens the file on MS-DOS).
Below I have placed a link to this file (Google Drive).
This file contains the data I need: the array of points of the oscillogram. I want to be able to keep, analyze and print this chart on Windows or Linux (not MS-DOS), so I need to extract the data from the file.
But I can't manage it, and no program known to me can open this file. I analyzed the first few bytes and they point to the program 'TRAS v4.99', which runs on MS-DOS.
But I really hope it is possible to get the data without this program.
P.S. If anyone says it is impossible, that would be fine too, because I haven't found an answer yet :)
Thank you for your time! Best regards!
LINK TO FILE ON GOOGLE DRIVE - 00014380.K00
STRUCTURE OF FILE
OPENING FILE VIA PROGRAM IN MS DOS
Here is an idea of how you could tackle this problem. Since the format is relatively well specified in the handbook, you can use the Java programming language, for example with something like java.io.RandomAccessFile, to read arrays of bytes. These byte arrays can then be converted to Java primitive types or to strings, according to the data type. After this conversion you can print out the data in a human-readable format.
Below you can find some sample code to give you an idea of what you could do with this approach (I have not tested the code and it is not complete; it is just to give you an idea):
public static void readBinaryfile() throws IOException {
    java.io.RandomAccessFile randomAccessFile = new RandomAccessFile("test.bin", "r");

    byte[] addKenStrBytes = new byte[12];
    randomAccessFile.read(addKenStrBytes);
    String addKenStr = new String(addKenStrBytes, "UTF-8");
    // TODO: Do something with addKenStr.
    System.out.println(addKenStr);

    byte[] kopfSizeBytes = new byte[2];
    randomAccessFile.read(kopfSizeBytes);
    // TODO: Do something with kopfSizeBytes
    System.out.println(convertToInt(kopfSizeBytes));

    byte[] addRufNrCounterBytes = new byte[6];
    randomAccessFile.read(addRufNrCounterBytes);
    long addRufNrCounter = convertToLong(addRufNrCounterBytes);
    // TODO: Do something with addRufNrCounter
    System.out.println(addRufNrCounter);

    byte[] endAdrBytes = new byte[4];
    randomAccessFile.read(endAdrBytes);
    // TODO: Do something with endAdrBytes
    System.out.println(convertToLong(endAdrBytes));

    // Continue here, and after you have reached the end of the record, repeat until you reach the end of the file.
}
private static int convertToInt(byte[] bytes) {
    if (bytes.length > 4) {
        throw new IllegalArgumentException();
    }
    int buffer = 0;
    for (byte b : bytes) {
        // Shift first, then OR in the unsigned byte value.
        buffer = (buffer << 8) | (b & 0xFF);
    }
    return buffer;
}

private static long convertToLong(byte[] bytes) {
    if (bytes.length > 8) {
        throw new IllegalArgumentException();
    }
    long buffer = 0L;
    for (byte b : bytes) {
        buffer = (buffer << 8) | (b & 0xFF);
    }
    return buffer;
}
Note that fields with more than 8 bytes most probably need to be converted to strings. This is not complete code, just an example to give you an idea of how to tackle this problem.
I'm trying to send an image from C++ to C# with managed C++ interop (marshaling). image->getStream() returns a const char* obtained from a string.
I'm getting an exception from my Marshal::Copy call:
An unhandled exception of type 'System.AccessViolationException' occurred in mscorlib.dll
Additional information: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
Am I doing the right thing for the copy from a const char* to a byte array? My DLL is compiled with the ASCII character set in VS2010.
array<System::Byte>^ OsgViewer::getLastImage()
{
    array<Byte>^ byteArray;
    m_ImageQueue->lock();
    int index = m_ImageQueue->getCurrentImageIndex();
    std::shared_ptr<Image> image = m_ImageQueue->getImage(static_cast<unsigned int>(index));
    if (image && image->isValid() == true)
    {
        int wLen = image->getStreamSize();
        char* wStream = const_cast<char*>(image->getStream());
        byteArray = gcnew array<Byte>(wLen);
        // convert native pointer to System::IntPtr with C-style cast
        Marshal::Copy((IntPtr)wStream, byteArray, 0, wLen);
    }
    m_ImageQueue->unlock();
    return byteArray;
}
Image is a homemade C++ class:
class ADAPTER Image
{
public:
    Image();
    ~Image();

    const char* getStream() const;
    int getStreamSize();
    bool setStringStream(std::ostringstream* iStringStream);
    void setIsValid(bool isValid) { m_isValid = isValid; }
    bool isValid() const { return m_isValid; }
    std::ostringstream* getOStringStream() { return m_StringStream; }

private:
    std::ostringstream* m_StringStream;
    bool m_isValid;
};
I wouldn't use Marshal::Copy. Since you have the array locally, why not just pin it and use memcpy?
pin_ptr<Byte> ptrBuffer = &byteArray[byteArray->GetLowerBound(0)];
You can now memcpy into ptrBuffer; a minimal sketch follows below.
When the scope ends, the pinning is automatically undone.
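Here is how that could look inside getLastImage(), assuming wLen and image are as in the question:
// Sketch: pin the managed array, then copy the native bytes into it.
int wLen = image->getStreamSize();
const char* wStream = image->getStream();
array<Byte>^ byteArray = gcnew array<Byte>(wLen);
if (wLen > 0)
{
    pin_ptr<Byte> ptrBuffer = &byteArray[byteArray->GetLowerBound(0)];
    memcpy(ptrBuffer, wStream, wLen);   // pinning ends when ptrBuffer goes out of scope
}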