Is this a bug in the Log class in CodenameOne? - codenameone

I've been trying to use the log class to capture some strange device-specific failures using local storage. When I went into the Log class and traced the code I noticed what seems to be a bug.
When I call the p(String) method, it calls getWriter() to get the 'output' instance of the Writer. It notices that output is null, so it calls createWriter() to create it. Since I haven't set a file URL, the following code gets executed:
    if(getFileURL() == null) {
        return new OutputStreamWriter(Storage.getInstance().createOutputStream("CN1Log__$"));
    }
On the Simulator, I notice this file is created and contains log info.
So in my app I want to display the logs after an error is detected (to debug). I call getLogContent() to retrieve them as a string, but it does some strange things:
    if(instance.isFileWriteEnabled()) {
        if(instance.getFileURL() == null) {
            instance.setFileURL("file:///" + FileSystemStorage.getInstance().getRoots()[0] + "/codenameOne.log");
        }
        Reader r = new InputStreamReader(FileSystemStorage.getInstance().openInputStream(instance.getFileURL()));
The main problem I see is that it uses a different file URL than the default writer location. Since creating the Writer never set the file URL, getLogContent() will never see the logged data. (The other issue I have is a style one: a method that gets content shouldn't persistently set the location of that content on the instance, but that's another story.)
As a workaround, I think I can just call getLogContent() at the beginning of the application, which should set the file URL to a location it will read from later. I'll test that next.
In the meantime, is this a bug, or is it functionality I don't understand from my user perspective?

It's more like "unimplemented functionality". This specific API dates back to LWUIT.
The main problem with that method is that we are currently writing into a log file, and getting the contents of a file we might be in the middle of writing into can be a problem and might actually cause a failure. So this approach was mostly abandoned in favor of the more robust crash-protection approach.
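The hazard described here, reading a log file that is still being written, is easy to see with plain java.io (this is a generic illustration, not CodenameOne code): a buffered writer keeps recent entries in memory, so a concurrent read misses them until the writer flushes.

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class MidWriteRead {
    public static void main(String[] args) throws IOException {
        Path log = Files.createTempFile("cn1log", ".txt");
        BufferedWriter writer = Files.newBufferedWriter(log);
        writer.write("first entry\n");

        // The entry is still sitting in the writer's buffer,
        // so a reader opening the file now sees nothing.
        String beforeFlush = new String(Files.readAllBytes(log));
        System.out.println("before flush: [" + beforeFlush + "]");

        writer.flush();
        // Only after the flush does the data reach the file.
        String afterFlush = new String(Files.readAllBytes(log));
        System.out.println("after flush: [" + afterFlush + "]");
        writer.close();
    }
}
```

This is why reading back the live log mid-write can return empty or truncated content even when logging itself works.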

Related

Overwrite a json-file in the resources folder with Kotlin

The context is an Android application written in Kotlin.
I have a variable that my application reads from in order to set the application's default language. To change the default language of the application, I have to overwrite this variable in the JSON file.
I am able to read from it, utilizing:
val fileContent = getResources().openRawResource(R.raw.FILE_NAME)
which returns the content as an InputStream. However, it seems I can't overwrite the contents of the file using this method.
I have also tried to access the file via its file path, but this results in a file-not-found exception. (I've tried several different combinations for this one, but they all yield the same result.)
The file is located as a resource. After searching a bit, I found this thread from 11 years ago, which says that "resources cannot be overwritten". Is this still true?
Overwrite resources
Any suggestions on how to solve the problem?
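Packaged resources are indeed still read-only: res/raw is baked into the APK. The usual workaround is to treat the bundled JSON as a default, copy it into app-private storage on first run, and read/write the copy from then on. A plain-Java sketch of that pattern, using temp files in place of the Android entry points (on a device, the source would come from openRawResource and the destination from the context's filesDir):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class WritableDefaults {
    /**
     * Returns a writable copy of the bundled defaults, creating it on first call.
     * 'bundled' stands in for the read-only packaged resource; 'dir' stands in
     * for the app's private files directory.
     */
    static Path writableCopy(Path bundled, Path dir) throws IOException {
        Path copy = dir.resolve("language.json");
        if (!Files.exists(copy)) {
            // First run: seed the writable file from the read-only default.
            Files.copy(bundled, copy);
        }
        return copy; // all later reads and writes go through this file
    }

    public static void main(String[] args) throws IOException {
        Path bundled = Files.createTempFile("defaults", ".json");
        Files.write(bundled, "{\"lang\":\"en\"}".getBytes());
        Path dir = Files.createTempDirectory("appdata");

        Path cfg = writableCopy(bundled, dir);
        // The user changes the language: overwrite the copy, not the resource.
        Files.write(cfg, "{\"lang\":\"de\"}".getBytes());
        System.out.println(new String(Files.readAllBytes(cfg)));
    }
}
```

On subsequent launches the app reads the copy, so the changed language survives restarts while the packaged default stays untouched.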

New Google Realtime Document timing issue

I am creating a new in-memory realtime document from a large JSON:
var newDoc = gapi.drive.realtime.loadFromJson(jsonData);
Then saving that new document to a newly created drive file:
newDoc.saveAs(file.id);
I also monitor "isSaving" on the document:
newDoc.addEventListener(gapi.drive.realtime.EventType.DOCUMENT_SAVE_STATE_CHANGED, onSaveStateChange);
function onSaveStateChange(e) {
blah...
}
The problem is that when, according to the API, the document is no longer saving, it IS in fact still uploading (as verified with Resource Monitor), and if I try to open that document during that time, I get unpredictable results.
This appears to be a realtime bug with the setting of isSaving, or the triggering of DOCUMENT_SAVE_STATE_CHANGED.
I badly need a way to determine when the new file is ACTUALLY available for use. A hack of some sort, or having to make an extra call would be fine... but as is, putting in any arbitrarily long delay won't account for a slow or intermittent network.
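Until the save-state event can be trusted, one common hack is to poll an independent readiness check (for example a metadata request against the new file, or an attempted load) against a deadline, rather than sleeping a fixed amount. A generic sketch of such a helper; the readiness predicate here is a stand-in for whatever check your API offers:

```java
import java.util.function.BooleanSupplier;

public class WaitUntilReady {
    /** Polls 'ready' every pollMs until it returns true or timeoutMs elapses. */
    static boolean waitUntil(BooleanSupplier ready, long timeoutMs, long pollMs)
            throws InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMs;
        while (System.currentTimeMillis() < deadline) {
            if (ready.getAsBoolean()) {
                return true; // the resource answered; safe to proceed
            }
            Thread.sleep(pollMs); // not ready yet; back off and retry
        }
        return false; // give up at the deadline instead of hanging forever
    }

    public static void main(String[] args) throws InterruptedException {
        // Stand-in for "the file is fully uploaded": becomes true on the 3rd check.
        int[] calls = {0};
        boolean ok = waitUntil(() -> ++calls[0] >= 3, 2000, 10);
        System.out.println("ready after " + calls[0] + " checks: " + ok);
    }
}
```

Unlike a fixed delay, this adapts to a slow or intermittent network: it proceeds as soon as the check succeeds and fails loudly if it never does.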

Save to .settings file: difference between 2 different ways of saving

I was reading about the .settings file on MSDN and noticed they give 2 examples of how to set the value of an item in the settings. My question is: what is the real difference between the 2, and when would you use one instead of the other? To me they seem pretty much the same.
To Write and Persist User Settings at Run Time
Access the user setting and assign it a new value, as shown in the following example:
Properties.Settings.Default.myColor = Color.AliceBlue;
If you want to persist changes to user settings between application sessions, call the Save method, as shown in the following code:
Properties.Settings.Default.Save();
The first statement updates the value of the setting in memory. The second statement updates the persisted value in the user.config file on the disk. That second statement is required to get the value back when you restart the program.
It is very, very important to realize that these two statements must be separate and never be written close together in your code. Keeping them close is harakiri-code. Settings tend to implement unsubtle features in your code, making it operate differently. Which isn't always perfectly tested. What you strongly want to avoid is persisting a setting value that subsequently crashes your program.
That's the harakiri angle, if you saved that value then it is highly likely that the program will immediately crash again when the user restarts it. Or in other words, your program will never run correctly again.
The Save() call must be made when you have a reasonable guarantee that nothing bad happened when the new setting value was used. It belongs at the end of your Main() method, reached only when the program terminates normally.
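The same separation can be sketched in Java with java.util.Properties standing in for Properties.Settings: changing the in-memory value is one step, and writing the file is a deliberate second step that belongs at a point where the program is known to be healthy.

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Properties;

public class SettingsDemo {
    public static void main(String[] args) throws IOException {
        Path userConfig = Files.createTempFile("user", ".config");
        Properties settings = new Properties();

        // Step 1: update the value in memory only
        // (the analogue of Settings.Default.myColor = ...).
        settings.setProperty("myColor", "AliceBlue");

        // ... the program runs with the new value; if it crashed here,
        // the previously persisted value would survive on disk ...

        // Step 2: persist at normal shutdown (the analogue of Save()).
        try (OutputStream out = Files.newOutputStream(userConfig)) {
            settings.store(out, "saved on clean exit");
        }

        // On the next start, the saved value is read back.
        Properties reloaded = new Properties();
        try (InputStream in = Files.newInputStream(userConfig)) {
            reloaded.load(in);
        }
        System.out.println(reloaded.getProperty("myColor"));
    }
}
```

Keeping step 2 out of the normal code path means a setting that destabilizes the program is never written to disk, so a restart recovers cleanly.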

Extend Store class to always execute a function after load on ExtJS

I am working on a project where we were asked to "patch" (they don't want a lot of time spent on development, as they will soon replace the system) a system implemented in ExtJS 4.1.0.
That system is used over a very slow and unstable network connection, so sometimes the stores don't get the expected data.
First two things that come to my mind as patches are:
1. Every time a store is loaded for the first time, wait 5 seconds and try again. Most times, a page refresh fixes the problem of stores not loading.
2. Somehow detect that no data was received after loading a store and try to get it again.
These patches should be executed only once, to avoid infinite loops or unnecessary recursion, given that sometimes it's OK for stores to not get any data back.
I don't like this kind of solutions but it was requested by the client.
This link should help with your question.
One of the posters suggests adding the below in an overrides.js file, which is loaded between the ExtJS source code and your application's code.
    Ext.util.Observable.observe(Ext.data.Connection);
    Ext.data.Connection.on('requestexception', function(dataconn, response, options) {
        if (response.responseText != null) {
            window.document.body.innerHTML = response.responseText;
        }
    });
Using this example, on any error, instead of echoing the error you could log the error details for later debugging and try the load again. I would suggest adding some additional logic so that it only retries a certain number of times; otherwise it could run indefinitely while the browser window is open, more than likely crash the browser, and put additional load on your server.
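The "retry only a bounded number of times" advice can be captured in a small helper. A language-neutral sketch of the pattern in Java (the real fix would live in the ExtJS exception listener above; the loader here is a stand-in for a store load):

```java
import java.util.function.Supplier;

public class BoundedRetry {
    /** Calls 'load' up to maxAttempts times, returning the first non-null result. */
    static <T> T loadWithRetry(Supplier<T> load, int maxAttempts) {
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                T result = load.get();
                if (result != null) {
                    return result; // got data; stop retrying
                }
            } catch (RuntimeException e) {
                // Log and fall through to the next attempt.
                System.err.println("attempt " + attempt + " failed: " + e.getMessage());
            }
        }
        return null; // bounded: give up instead of looping forever
    }

    public static void main(String[] args) {
        // Stand-in for a flaky store load: returns nothing twice, then data.
        int[] calls = {0};
        String data = loadWithRetry(() -> ++calls[0] < 3 ? null : "rows", 5);
        System.out.println(data + " after " + calls[0] + " attempts");
    }
}
```

The attempt cap is the important part: with a flaky network, an unbounded retry loop turns one slow request into a self-inflicted denial of service.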
Obviously the root cause of the issue is not the code itself, rather your slow connection. I'd try to address this issue rather than any other.

XmlSerializer and/or StreamWriter not freeing their contents

I have an issue whilst streaming to a file; I'm sure there is a simple solution, but I'm struggling to find it! What I'm attempting to do is very straightforward: I'm getting the contents of a class, serializing it into XML, and streaming it to a file. The code I'm using is:
    ObservableCollection<RackItemViewModel> rackVMs = ProjectTree.GetTreeData();
    XmlSerializer serializer = new XmlSerializer(typeof(RackItem));
    using (TextWriter tw = new StreamWriter(filename, false))
    {
        foreach (RackItemViewModel VM in rackVMs)
            serializer.Serialize(tw, VM.RackItem);
    }
ProjectTree.GetTreeData() just returns the data to be serialized. If I run the program and save the data, it all works as expected: the data is saved and can be read back with Deserialize. The problem I'm having is when I perform more than one save. If I save one set of data to one file and then another set of data to another file, the first file is correct but the second file is a concatenation of the first and the second! It seems that either the stream or the XmlSerializer is not releasing its contents between saves. I've tried using writefile instead of a stream and I still get the same issue; I've also tried flushing and closing the stream, but this has no effect. Needless to say, if I close and restart the application between saves, it all works fine.
Could anyone tell me what I'm doing wrong please?
Before writing to the new file, try flushing the stream: tw.Flush().
I thought I'd tidy up this thread, as I've managed to solve the problem. It turns out it was nothing to do with the serializing or streaming of the data. The data buffer being written wasn't fully released between writes: I was checking the View Model object, which was OK, but the object being written (RackItem) wasn't following suit. Silly error on my part. Thanks for the suggestions.
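The root cause described here, a reused buffer that is never cleared between saves, is easy to reproduce in miniature. A Java sketch of the anti-pattern and its fix (plain lists and files standing in for the serializer):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class StaleBufferDemo {
    public static void main(String[] args) throws IOException {
        List<String> buffer = new ArrayList<>(); // reused across saves

        // First save: works fine.
        buffer.add("project A");
        Path first = Files.createTempFile("save1", ".txt");
        Files.write(first, buffer);

        // Second save WITHOUT clearing: the file is a
        // concatenation of the first save and the second.
        buffer.add("project B");
        Path second = Files.createTempFile("save2", ".txt");
        Files.write(second, buffer);
        System.out.println("second file: " + Files.readAllLines(second));

        // Fix: clear the buffer before collecting data for the next save.
        buffer.clear();
        buffer.add("project B");
        Path fixed = Files.createTempFile("save3", ".txt");
        Files.write(fixed, buffer);
        System.out.println("fixed file:  " + Files.readAllLines(fixed));
    }
}
```

The symptom matches the question exactly: the first file is correct, the second contains both data sets, and restarting the application "fixes" it because the buffer starts empty again.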
