I'm new to Dart and I'm trying to run multiple algorithms in isolates so I can run them several times more efficiently. However, when I call more than one function with Isolate.spawn I start getting 'Malformed message' in the console. Everything still works, but I would like to know why I'm getting this message. Any help improving the code is also welcome, since I'm still learning the intricacies of isolates.
void stressTest() {
  ReceivePort rpDouble = new ReceivePort();
  ReceivePort rpString = new ReceivePort();
  ReceivePort rpInteger = new ReceivePort();
  int counter = 0;
  int tempoTotal = 0;

  rpDouble.listen((data) {
    counter++;
    tempoTotal += data; // data is a stopwatch elapsed time in milliseconds
    setState(() { // updating the "progress" and the algorithm's run time in the UI
      test = counter.toString() + '%';
      _counter = tempoTotal.toString();
    });
  });

  rpInteger.listen((data) {
    counter++;
    tempoTotal += data;
    setState(() {
      test = counter.toString() + '%';
      _counter = tempoTotal.toString();
    });
  });

  rpString.listen((data) {
    counter++;
    tempoTotal += data;
    setState(() {
      test = counter.toString() + '%';
      _counter = tempoTotal.toString();
    });
  });

  for (int i = 0; i < 5; i++) {
    Isolate.spawn(DoubleTest, rpDouble.sendPort);
    Isolate.spawn(StringStress, rpString.sendPort);
    Isolate.spawn(integerTest, rpInteger.sendPort);
  }
}
DoubleTest, StringStress and integerTest are functions that send a stopwatch elapsed-milliseconds integer back to the SendPort.
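For context, each of those entry functions is shaped roughly like this (simplified sketch; the real workload is different):

import 'dart:isolate';

// Simplified shape of the entry functions (the real workload differs):
void DoubleTest(SendPort sendPort) {
  final stopwatch = Stopwatch()..start();
  double x = 0.0;
  for (int i = 0; i < 1000000; i++) {
    x += i * 0.5; // placeholder work
  }
  stopwatch.stop();
  // send the elapsed time back to the main isolate
  sendPort.send(stopwatch.elapsedMilliseconds);
}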
Thanks in advance, any help is appreciated
This happens when you have the Flutter performance view open. During a reload the performance tooling resyncs its data, and if it fails to get an address or the progress reader is interrupted you get that message. It does not affect your application and is not an error in your application. If you need to read "Memory usage" or "frame rendering times", just restart the IDE and start again.
Since your code is running fine, you don't need to worry much about the error. However, try reformatting your code to avoid this error message.
I had a similar problem in my program. Just saving the file before 'hot restart' did the job.
I want to batch my records for uploading so I don't create server issues. I would like to be able to push 10 records at a time, every five seconds, just to prove the concept for now. I've put setInterval functions all over my code but can't get it to run at the right time. I've been at this for days but can't figure it out.
chunkData(data) {
  const maxRecords = 10;
  const loops = (data.length % maxRecords > 0) ? Math.floor(data.length / maxRecords) + 1 : data.length / maxRecords;
  //console.log('data: ', data);
  //console.log('loops: ', loops);
  //setInterval(() => {
  for (let loop = 0; loop < loops; loop++) {
    console.log('loop: ', loop);
    let start = loop * maxRecords;
    //setInterval(() => {
    for (let batch = start; batch < start + maxRecords; batch++) {
      // the line below will become the upload function once I get this to work
      if (data[batch] !== undefined) console.log('data[batch]: ', data[batch]);
    }
    //start = start + 10;
    //}, 5000);
  }
  //}, 5000);
}
I'm certain it's a simple tweak I need but I'm clueless as to how to make it happen right now. Any advice would be greatly appreciated.
The bigger problem you'll have to figure out is that a client-side change will not help in this case. If you're trying to help your server this way, what happens when there is more than 1 concurrent user uploading? 2? 3? 100? 1000? This solution isn't scalable. You'll eventually (or very quickly) have to make sure your server is robust enough to handle upload traffic.
As for your specific code: the problem is that you're using setInterval inside a for loop with the same delay value every time. Remember, uploading (or any XHR/fetch request) is an asynchronous action. Right now, you're scheduling all the intervals to fire at basically the same time.
To get actual intervals between uploads, you'd need something like this:
for (let loop = 0; loop < loops; loop++) {
  console.log('loop: ', loop);
  let start = loop * maxRecords;
  for (let i = 1, batch = start; batch < start + maxRecords; i++, batch++) {
    // the line below will become the upload function once it works
    if (data[batch] !== undefined) {
      // setTimeout rather than setInterval, so each upload fires once
      // after its own staggered delay instead of repeating forever
      setTimeout(() => {
        // make upload request here
      }, (loop + 1) * i * 5000);
    }
  }
}
I'm not sure what your "start" variable is supposed to be.
In any case, this code is really error prone and fragile. I'd really advise reconsidering your approach and looking into fixing your server side.
If you still wish to go with this client-side hack, or even if not and you're just looking for a more stable client-side solution, I'd advise going with react-uploady. It takes care of the uploads for you and all the edge cases that come with managing uploads in React.
You can even do your intervals easily:
import ChunkedUploady, { useChunkStartListener } from "@rpldy/chunked-uploady";
import UploadButton from "@rpldy/upload-button";

const CHUNK_SIZE = 1e+6;

const UploadButtonDelayedChunks = () => {
  useChunkStartListener(() => new Promise((resolve) => {
    // delays each chunk upload by 5 seconds
    setTimeout(resolve, 5000);
  }));

  return <UploadButton/>;
};

export const ChunkedExample = () => {
  return (
    <ChunkedUploady
      destination={{ url: "https://my-server/upload" }}
      chunkSize={CHUNK_SIZE}>
      <UploadButtonDelayedChunks/>
    </ChunkedUploady>
  );
};
I am trying to make a while loop run a statement for exactly one second and then stop. I have tried this in DartPad, but it crashes the browser window.
void main() {
  var count = 0.0;
  bool flag = true;
  Future.delayed(Duration(seconds: 1), () {
    flag = false;
  });
  while (flag) {
    count++;
  }
  print(count);
}
Am I doing something wrong?
I like how you are trying to figure Futures out. I was exactly where you were before I understood this stuff. It's kind of like threads, but quite different in some ways.
The Dart code you wrote is single threaded. Writing Future.delayed schedules a job, but its callback can't run until you let go of the thread, for example by returning from this main function.
(main doesn't have to literally return if it is marked async; awaiting inside it also hands control back to the event loop.)
Two actions have to run "concurrently" to be able to interact with each other like you are trying to do. The way to do it is to call Future.wait to get a future that depends on the two futures. Edit: Both of these actions have to let go of execution at every step so that the other can get control of the single thread. So, if you have a loop, you have to have some kind of await call in it to yield execution to other actions.
Here's a modified version of your code that counts up to about 215 for me:
Future main() async {
  var count = 0.0;
  bool flag = true;

  var futureThatStopsIt = Future.delayed(Duration(seconds: 1), () {
    flag = false;
  });

  var futureWithTheLoop = () async {
    while (flag) {
      count++;
      print("going on: $count");
      await Future.delayed(Duration(seconds: 0));
    }
  }();

  await Future.wait([futureThatStopsIt, futureWithTheLoop]);
  print(count);
}
I'm still a beginner with Protractor, so forgive me if this isn't the most optimized code. Any advice and help are appreciated.
var orderno = ["100788743", "100788148", "100788087", "100000000", "100786703"];

for (var i = 0; i < orderno.length; i++) {
  (function(Loop) {
    element(by.css('[autoid="_is_3"]')).sendKeys(orderno[i]);
    browser.actions().sendKeys(protractor.Key.ENTER).perform();

    expect(element(by.cssContainingText("span.Class", "Store A")).waitReady()).toBeTruthy();

    element(by.cssContainingText("span.Class", "Store A")).isDisplayed().then(function(pickfail) {
      if (pickfail) {
        element(by.css('[class="highlight"]')).getText().then(function(text) {
          console.log(text + "-" + "Pass");
        });
      } else {
        console.log("Order Number: Missing");
      }
    });

    element(by.css('[autoid="_is_3"]')).clear();
  })([i]);
}
*waitReady is there to wait for the element to come up, but I believe it tries to find it, can't, and so it times out.
I've created a test where I input a value into a textbox, which searches and checks whether it exists. If it does, the test passes; if it doesn't, it fails. But I want the loop to continue even when one item fails, so it still checks the remaining items.
I think the expect fails when it can't find the value, which stops the whole test. Is there another way to do the check and keep the whole run going?
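For example, something like this is the shape of what I'm after (an untested sketch, using browser.wait with ExpectedConditions instead of my custom waitReady):

// Resolve the wait to a boolean instead of asserting on it,
// so a miss doesn't fail the spec and the loop can keep going.
var EC = protractor.ExpectedConditions;
var storeA = element(by.cssContainingText("span.Class", "Store A"));

browser.wait(EC.visibilityOf(storeA), 5000).then(
  function() { return true; },
  function() { return false; }
).then(function(found) {
  if (found) {
    element(by.css('[class="highlight"]')).getText().then(function(text) {
      console.log(text + "-" + "Pass");
    });
  } else {
    console.log("Order Number: Missing");
  }
});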
I am trying to save documents using pouchdb's bulkSave() function.
However, when these documents are saved it starts to sync with the master database through Sync Gateway, and while that happens the web app slows down; when I navigate to a different tab, no content is displayed on it.
Below is an example of how the documents are being created:
for (var i = 0; i <= instances; i++) {
  if (i > 0) {
    advTask.startDate = new Date(new Date(advTask.startDate).setHours(new Date(advTask.startDate).getHours() + offset));
  }

  if (advTask.estimatedDurationUnit == 'Minutes') {
    advTask = $Date.getAdvTaskEndTimeIfMinutes(advTask);
  } else if (advTask.estimatedDurationUnit == 'Hours') {
    advTask = $Date.getAdvTaskEndTimeIfHours(advTask);
  } else if (advTask.estimatedDurationUnit == 'Days') {
    advTask = $Date.getAdvTaskEndTimeIfDays(advTask);
  }

  if (new Date(advTask.endDate).getTime() >= new Date($scope.advTask.endDate).getTime()) {
    // here save the task array using the bulkSave() function
    $db.bulkSave(tasks).then(function (res) {
      $db.sync();
    });
    break;
  }

  advTask.startDate = $Date.toGMT(advTask.startDate);
  advTask.endDate = $Date.toGMT(advTask.endDate);

  var adv = angular.copy(advTask);
  tasks.push(adv); // here pushing the documents to an array

  offset = advTask.every;
}
Thanks in advance!
bulkSave is not a core PouchDB API; are you using a plugin?
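(The core bulk-write API is bulkDocs; a minimal sketch, assuming tasks is your array of plain documents and the database name here is just an example:)

// core PouchDB bulk write
var db = new PouchDB('tasks');
db.bulkDocs(tasks).then(function (results) {
  console.log('saved', results.length, 'docs');
}).catch(function (err) {
  console.error(err);
});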
Also one piece of advice I'd give is that Couchbase Sync Gateway does not have 100% support for PouchDB and is known to be problematic in some cases.
Another piece of advice is that running PouchDB in a web worker can prevent your UI thread from getting overloaded, which would fix the problem of tabs not showing up.
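A rough sketch of that setup (the file names and message shape here are just placeholders, not anything from your app):

// worker.js -- run PouchDB off the UI thread
importScripts('pouchdb.min.js');
var db = new PouchDB('tasks');

self.onmessage = function (e) {
  if (e.data.type === 'bulkSave') {
    db.bulkDocs(e.data.docs).then(function (res) {
      self.postMessage({ type: 'saved', result: res });
    }).catch(function (err) {
      self.postMessage({ type: 'error', error: err.message });
    });
  }
};

// main thread -- hand the heavy write off to the worker
var worker = new Worker('worker.js');
worker.onmessage = function (e) {
  console.log('worker replied:', e.data);
};
worker.postMessage({ type: 'bulkSave', docs: tasks });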
Do you have a live test case to demonstrate?
I want to get all the names of files and directories under a path and recognize which are files and which are directories. But when I run my code, sometimes it works and sometimes it reports directories as files. Here is the code:
socket.on('data', function(path) {
  fs.readdir(path, function(err, data) {
    var filestatus = [];
    var z = 0;
    var i = data.length - 1;
    data.forEach(function(file) {
      fs.stat(file, function(err, stats) {
        filestatus[z] = stats.isDirectory();
        if (z == i) {
          socket.emit('backinfo', { names: data, status: filestatus });
        }
        z++;
      });
    });
  });
});
During testing I realized that when I slow down the data.forEach loop (using console.log(something)) it works better (fewer misses), which is strange.
This is about 96% incorrect; thank you to JohnnyHK for pointing out my mistake. See the comments below for the real problem/solution.
Because the fs.stat() function call is asynchronous, the operations on the filestatus array are overlapping. You should either use the async library as elmigranto suggested, or switch to using fs.statSync.
More details on what's happening:
When you call fs.stat(), it basically runs in the background and then immediately goes onto the next line of code. When it has got the details of the file, it then calls the callback function, which in your code is the function where you add the information to the filestatus array.
Because fs.stat() doesn't wait before returning, your program goes through the data array very quickly, and multiple callbacks run at roughly the same time and cause issues because the z variable isn't incremented straight away, so
filestatus[z] = stats.isDirectory()
could be executed multiple times by different callbacks before z gets incremented.
Hope that makes sense!
You are using a for statement in Node.js; this will work if you turn the for statement into a recursive function. Please see the attached code for help.
const fs = require('fs');
const path = require('path');

function load_Files(pat, callback) {
  console.log("Searching Path is: " + pat);
  fs.readdir(pat, (err, files) => {
    if (err) {
      callback(err);
    } else {
      var onlydir = [];
      var onlyfiles = [];

      // process one entry, then recurse to the next index
      var d = (index) => {
        if (index == files.length) {
          console.log("last index: " + index);
          // concat returns a new array; it must be assigned
          var ar = onlydir.concat(onlyfiles);
          callback(null, ar);
          return;
        }
        // join with the parent path so stat looks at the right file
        fs.stat(path.join(pat, files[index]), (err, status) => {
          if (err) {
            callback(err);
            return;
          }
          console.log("the needed file " + files[index]);
          if (status.isDirectory()) {
            onlydir.push(files[index]);
          } else {
            onlyfiles.push(files[index]);
          }
          console.log("only Directory: " + onlydir.length);
          console.log("index: " + index);
          d(index + 1);
        });
      };
      d(0);
    }
  });
}
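A quick usage example (the path here is just an example):

// example call: directories come back first, then files
load_Files('/tmp', (err, names) => {
  if (err) return console.error(err);
  console.log(names);
});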