Flutter - Why do I get 'Unhandled Exception: Concurrent modification during iteration' from a database query?

I'm having some trouble understanding an issue with Flutter's database handling (at least I assume that's where the issue is, since that's where it seems to come from). When I run my app I get the following error message:
E/flutter (26185): [ERROR:flutter/lib/ui/ui_dart_state.cc(209)] Unhandled Exception: Concurrent modification during iteration: Instance(length:13) of '_GrowableList'.
E/flutter (26185): #0 ListIterator.moveNext (dart:_internal/iterable.dart:336:7)
E/flutter (26185): #1 _MyHomePageState._processBle.<anonymous closure> (package:blescanner/screens/home_view.dart:283:27)
E/flutter (26185): <asynchronous suspension>
I've put a lot of effort into finding out why this happens. Line 283 referenced in the trace is my "for (var element in list) {" line in MyHomePage, but as far as I know I'm not modifying that list anywhere. After a while I started commenting out line after line to find the cause, and it turned out the error disappears when I comment out the "dbHelper.getBlItemLog" call. That led me into that function (code included further down), where the issue seems to be the database query itself. I thought the database might be getting modified during the query, so I removed all other calls to the database, but the issue remained. How can a database query cause this kind of error? I just can't understand why. Does anyone have any insight or suggestions?
From MyHomePage (which extends StatefulWidget):
void _processBle() {
  FlutterBluePlus.instance.scanResults.forEach((list) async {
    for (var element in list) {
      String bleName = element.device.name;
      dev.log('Found device: ' + bleName);
      String blItemId = bleName.replaceAll('XXX', '');
      var blItemData = CustomBluetoothItem.fromBle(
        bleName,
        blItemId,
        element.advertisementData.manufacturerData,
        _geoPos.lat,
        _geoPos.long,
      );
      int tempStatus = blItemData.status;
      try {
        if (blItemData.status > 1) {
          await lock.synchronized(() async {
            dev.log('${DateTime.now().toIso8601String()} _processBle .. start $bleName');
            // here's where I call the troubling function
            await dbHelper
                .getBlItemLog(bleName: bleName, startDate: blItemData.sDate)
                .then((queueItem) async {
              // the stuff in here is commented away for now and thus doesn't matter
            });
            dev.log('${DateTime.now().toIso8601String()} _processBle .. stop ${element.device.name}');
          });
        }
      } catch (e) {
        dev.log('error : $e');
      }
    }
  });
  firstStartup = false;
  dev.log('_processBle - ### done');
}
From my dbHelper:
Future<CustomBluetoothItem?> getBlItemLog({required String bleName, required String startDate}) async {
  Database? db = await database;
  // the following line seems to be where the issue is, since there's no issue if I comment it away.
  List<Map<String, dynamic>> maps = await db!.rawQuery(
    'SELECT * FROM ble_table WHERE bleName = ? AND startDate = ? ORDER BY createDate desc limit 1 offset 0',
    [bleName, startDate],
  );
  if (maps.isNotEmpty) {
    return CustomBluetoothItem.fromJson(maps[0]);
  }
  return null;
}
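One hypothesis that fits the stack trace: the awaited database call suspends the for-in loop, and while it is suspended the scanning plugin may append new results to the same list instance the loop is iterating, which is what trips the ConcurrentModificationError; the query itself is merely slow enough to expose the race. A minimal guard, assuming that is the cause, is to iterate over a snapshot of the list:

void _processBle() {
  FlutterBluePlus.instance.scanResults.forEach((list) async {
    // Iterate over a copy so that new scan results appended to the original
    // list while this closure is suspended at an await cannot invalidate the iterator.
    for (var element in List.of(list)) {
      // ... the existing per-device handling, including the awaited
      // dbHelper.getBlItemLog(...) call, stays unchanged ...
    }
  });
}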

Related

I am receiving a TypeError: cannot read properties of null (reading 'push') for a clock in/out schedule

I am a new dev taking over for another developer at work, and like them I am a solo dev. I am using React as my code base. We had six weeks of turnover before they left, and as time goes on I realize how little knowledge was passed along to me. This is my first job, but I am not looking for sympathy points, just guidance on the best way to ask questions and how to find what I am looking for.
The code has not been touched since I completed it, but now I keep getting this error where it worked before. I don't currently know whether it's because my Twilio trial period has ended or if it's another issue. I have troubleshot this code with the knowledge I have (which isn't much), but I am not sure where else to get answers.
The code I am currently getting the error from:
SendTextMessage(_type, _work_time, _notes, _week, _month) {
  debugger;
  let month = ["Jan.", "Feb.", "Mar.", "Apr.", "May.", "Jun.", "Jul.", "Aug.", "Sep.", "Oct.", "Nov.", "Dec."];
  let msgList = JSON.parse((this.props.lists).MsgTo);
  msgList.push(this.props.user.phoneNumber);
  let date = "";
  let mobileModeSend = msgList;
  let workTime = `Last Worked ${_work_time}.`;
  if (this.state.fridayTime !== null) {
    date = FormatDateTime(new Date());
    if (_type === "Punching Out") {
      workTime = `Worked for ${_work_time}. \n` +
        `Total time worked this week: (${_week}). \n` +
        `Total time worked since ${month[(new Date()).getMonth()]} 1st (${_month}).`;
    }
  } else {
    date = FormatDateTime(new Date());
    if (_type === "Punching Out") {
      workTime = `Worked for ${_work_time}. \n` +
        `Total time worked this week: (${_week}). \n` +
        `Total time worked since ${month[(new Date()).getMonth()]} 1st (${_month}).`;
    }
  }
  `Total time worked this week: (${_week}). \n`
  let msg = `${this.props.user.fullName} \n` +
    `${_type} \n` +
    `${date} \n` +
    `${workTime} \n\n` +
    `${_notes} \n`;
  for (let i = 0; i < mobileModeSend.length; i++) {
    const mobileModeMsg = {
      body: `Time Entry \n ${msg}`,
      to: mobileModeSend[i],
      from: `Auto Msg`,
      time: CurrentTime(),
      type: 'admin'
    };
    $.post('/api/send_message_async', { messages: mobileModeMsg }, (response) => {})
  }
}
I have run the debugger to test everything I am aware of how to do:
1. Ran the debugger on the line it said was having the issue; it returns null when I step over the line and when I console.log the data.
2. Added an empty array named msgList to pull from, to see if it needed that first, but that produced the same issue, and it wasn't needed when the code was working before.
3. Restarted the servers and FTP instances that are connected, in case the upload had an issue, and reset and relaunched the instance. This cleared up some other issues, but this one remains.
Before the "let msgList = JSON.parse((this.props.lists).MsgTo);" line, if you console.log this.props.lists, what do you get?
Where is lists populated?
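A quick way to act on those comments, as a sketch only (it assumes MsgTo can come back missing, null, or as the string "null" when the backend call fails, which would make JSON.parse return null and break the push):

// Log what actually arrives, then parse defensively so push never sees null.
console.log('this.props.lists:', this.props.lists);
const rawMsgTo = this.props.lists ? this.props.lists.MsgTo : null;
let msgList = rawMsgTo ? JSON.parse(rawMsgTo) : null;
if (!Array.isArray(msgList)) {
  msgList = [];
}
msgList.push(this.props.user.phoneNumber);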

Gatling how to store and load a value for a later request

I'd like to build a load test where the second request is fed from the first response. The data extraction is done in a method because it is more than one line of code. My problem is storing the value (id) and loading it later. How should the value be stored and loaded? I tried several different approaches and came up with this code. The documentation has not helped me.
object First {
  val first = {
    exec(http("first request")
      .post("/graphql")
      .headers(headers_0)
      .body(RawFileBody("computerdatabase/recordedsimulation/first.json"))
      .check(bodyString.saveAs("bodyResponse"))
    )
    .exec { session =>
      val response = session("bodyResponse").as[String]
      session.set("Id", getRandomValueForKey("id", response))
      session
    }
    .pause(1)
  }
}

object Second {
  val second = {
    exec(http("Second ${Id}")
      .post("/graphql")
      .headers(headers_0)
      .body(RawFileBody("computerdatabase/recordedsimulation/second.json"))
    )
    .pause(1)
  }
}

val user = scenario("User")
  .exec(
    First.first,
    Second.second
  )

setUp(user.inject(
  atOnceUsers(1),
)).protocols(httpProtocol)
Your issue is that you're not using the Session properly.
From the documentation:
Warning
Session instances are immutable!
Why is that so? Because Sessions are messages that are dealt with in a multi-threaded concurrent way, so immutability is the best way to deal with state without relying on synchronization and blocking.
A very common pitfall is to forget that set and setAll actually return new instances.
This is exactly what you're doing:
exec { session =>
  val response = session("bodyResponse").as[String]
  session.set("Id", getRandomValueForKey("id", response))
  session
}
It should be:
exec { session =>
  val response = session("bodyResponse").as[String]
  session.set("Id", getRandomValueForKey("id", response))
}
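In other words, the last expression of the block must be the new Session returned by set, not the original one. To then load the stored id in the later request, the request body can reference the session attribute; a small sketch, assuming second.json is meant to be templated (ElFileBody resolves ${Id} placeholders from the session, whereas RawFileBody sends the file verbatim):

object Second {
  val second = exec(
    http("Second ${Id}")
      .post("/graphql")
      .headers(headers_0)
      .body(ElFileBody("computerdatabase/recordedsimulation/second.json"))
  ).pause(1)
}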

how to test that the function under test has thrown an exception

In my Scala and Play code, a function throws an exception:
case None => {
  println("error in updating password info")
  throw new Exception("error in updating password info") // TODOM - refine errors. Make errors well defined. Pick from config/env file
}
I want to test the above code, but I don't know how to test that the exception was thrown. The spec I have written is:
"PasswordRepository Specs" should {
"should not add password for non-existing user" in {
val newPassword = PasswordInfo("newHasher","newPassword",Some("newSalt"))
when(repoTestEnv.mockUserRepository.findOne(ArgumentMatchers.any())).thenReturn(Future{None}) //THIS WILL CAUSE EXCEPTION CODE TO GET EXECUTED
val passwordRepository = new PasswordRepository(repoTestEnv.testEnv.mockHelperMethods,repoTestEnv.mockUserRepository)
println(s"adding password ${newPassword}")
val passwordInfo:PasswordInfo = await[PasswordInfo](passwordRepository.add(repoTestEnv.testEnv.loginInfo,newPassword))(Timeout(Duration(5000,"millis"))) //add SHOULD THROW AN EXCEPTION BUT HOW DO I TEST IT???
}
}
Thanks JB Nizet. I must confess I was being lazy! The correct way is to use either assertThrows or intercept. E.g.
val exception: scalatest.Assertion = assertThrows[java.lang.Exception](
  await[PasswordInfo](passwordRepository.add(repoTestEnv.testEnv.loginInfo, newPassword))(Timeout(Duration(5000, "millis")))
)
or
val exception = intercept[java.lang.Exception](await[Unit](passwordRepository.remove(repoTestEnv.testEnv.loginInfo))(Timeout(Duration(5000,"millis"))))
println(s"exception is ${exception}")
exception.getMessage() mustBe repoTestEnv.testEnv.messagesApi("error.passwordDeleteError")(repoTestEnv.testEnv.langs.availables(0))

how do you detect createUpdate() fails?

public Connection executeUpdate() {
    long start = System.currentTimeMillis();
    try {
        this.logExecution();
        PreparedStatement statement = this.buildPreparedStatement();
        this.connection.setResult(statement.executeUpdate());
        this.connection.setKeys(this.returnGeneratedKeys ? statement.getGeneratedKeys() : null);
        this.connection.setCanGetKeys(this.returnGeneratedKeys);
    } catch (SQLException var7) {
        this.connection.onException();
        throw new Sql2oException("Error in executeUpdate, " + var7.getMessage(), var7);
    } finally {
        this.closeConnectionIfNecessary();
    }

    long end = System.currentTimeMillis();
    logger.debug("total: {} ms; executed update [{}]", new Object[]{end - start, this.getName() == null ? "No name" : this.getName()});
    return this.connection;
}
I'm wondering how to test an update failing. The query I'm using is:
update my_table set some_field=:some_value
where the_key = :the_key_value
And, immediately before executeUpdate() runs, I am deleting the record where the_key == "the_key_value".
Does anyone know the correct way to determine if the update failed?
Also, when I go to the javadoc and click on anything, I get:
CDN object Not Found - Request ID: c6a9ba5f-f8ea-46ae-bf7a-efc084971878-19055563
Is there a way to build javadoc locally?
EDIT: is the way to check this through the use of Connection.getResult()? Does it return the number of records updated/inserted, etc.?
I went and RTFM, which explained how to do this.
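For reference, a sketch of that check: in the executeUpdate() source above, setResult() stores the value returned by PreparedStatement.executeUpdate(), i.e. the number of affected rows, so the returned Connection exposes it via getResult(). The query and parameter names are the ones from the question; an existing Sql2o instance named sql2o is assumed.

try (Connection con = sql2o.open()) {
    con.createQuery("update my_table set some_field = :some_value where the_key = :the_key_value")
        .addParameter("some_value", someValue)
        .addParameter("the_key_value", theKeyValue)
        .executeUpdate();

    if (con.getResult() == 0) {
        // No row matched the_key_value, so the update changed nothing.
    }
}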

How to stream 2 million rows from SQL Server without crashing Node?

I am using Node to copy 2 million rows from SQL Server to another database, so of course I use the "streaming" option, like this:
const sql = require('mssql')
...
const request = new sql.Request()
request.stream = true
request.query('select * from verylargetable')
request.on('row', row => {
  promise = write_to_other_database(row);
})
My problem is that I have to do an asynchronous operation with each row (an insert into another database), which takes time.
The reading is faster than the writing, so the "row" events just keep coming, memory eventually fills up with pending promises, and Node crashes. This is frustrating; the whole point of "streaming" is to avoid this, isn't it?
How can I solve this problem?
To stream millions of rows without crashing, intermittently pause your request.
sql.connect(config, err => {
  if (err) console.log(err);

  const request = new sql.Request();
  request.stream = true; // You can set streaming differently for each request
  request.query('select * from dbo.YourAmazingTable'); // or: request.execute(procedure)

  request.on('recordset', columns => {
    // Emitted once for each recordset in a query
    //console.log(columns);
  });

  let rowsToProcess = [];
  request.on('row', row => {
    // Emitted for each row in a recordset
    rowsToProcess.push(row);
    if (rowsToProcess.length >= 3) {
      request.pause();
      processRows();
    }
    console.log(row);
  });

  request.on('error', err => {
    // May be emitted multiple times
    console.log(err);
  });

  request.on('done', result => {
    // Always emitted as the last one
    processRows();
    //console.log(result);
  });

  const processRows = () => {
    // process the buffered rows here, then resume the stream
    rowsToProcess = [];
    request.resume();
  };
});
The problem seems to be caused by reading the stream with "row" events, which don't let you control the flow of the stream. This should be possible with the "pipe" method, but then you end up with an object-mode data stream and have to implement a writable stream yourself, which may be tricky.
A simple solution would be to use Scramjet, so your code would be complete in a couple of lines:
const sql = require('mssql')
const {DataStream} = require("scramjet");
//...
const request = new sql.Request()
request.stream = true
request.query('select * from verylargetable')

request.pipe(new DataStream({maxParallel: 1})) // pipe to a new DataStream with no parallel processing
  .batch(64)                                   // optionally batch the requests that someone mentioned
  .consume(async (row) => write_to_other_database(row)); // flow control will be done automatically
Scramjet uses promises to control the flow. You can also try increasing the maxParallel option, but keep in mind that in that case the consume callback in the last line could start processing several rows simultaneously.
My own answer: instead of writing to the target database at the same time, I convert each row into an "insert" statement and push the statement to a message queue (RabbitMQ, a separate process). This is fast and can keep up with the rate of reading. Another Node process pulls from the queue (more slowly) and writes to the target database. Thus the big "backlog" of rows is handled by the message queue itself, which is good at that sort of thing.
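A sketch of the producer side of that setup, assuming the amqplib package, a local RabbitMQ broker, and a made-up queue name row_inserts; the consumer process would read from the same queue and do the slow writes:

const amqp = require('amqplib');

async function streamToQueue(request) {
  const conn = await amqp.connect('amqp://localhost');
  const channel = await conn.createChannel();
  await channel.assertQueue('row_inserts', { durable: true });

  request.on('row', row => {
    // Serialize the row (or build the "insert" statement here) and hand it to
    // the broker; publishing is fast, so it keeps up with the reads while the
    // queue absorbs the backlog for the slower writer process.
    channel.sendToQueue('row_inserts', Buffer.from(JSON.stringify(row)), { persistent: true });
  });

  request.on('done', () => channel.close().then(() => conn.close()));
}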
