I have this script that reads from an object array and loads it into an MS SQL database. Without sequelize.close() the code seems to get stuck waiting on the next object when none exists. I would like to close the connection once it's done reading the object array.
async.eachSeries(grouparrobj, (item, cb) => {
  async.each(item, (items, callback) => {
    var queryvar = `INSERT INTO ${tablename} (ID,NAME) VALUES ('${items.ID}','${items.name}')`;
    sequelize.query(queryvar).then(results => {
      callback();
    }).catch(callback); // propagate query errors instead of stalling the series
  }, (err) => {
    if (err) {
      console.log('Error', err);
      cb(err);
    } else {
      console.log(`Finished processing level ${item}`);
      cb();
    }
  });
});
How do I achieve this?
I found an alternative way to do it: I moved the async.each into another function, passed the callback there, and closed the Sequelize connection once that callback fired, as sketched below.
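A minimal sketch of that approach, reusing the names from the question (the loadGroups wrapper and its done callback are assumptions, not part of the original code):

function loadGroups(grouparrobj, done) {
  async.eachSeries(grouparrobj, (item, cb) => {
    async.each(item, (items, callback) => {
      sequelize.query(`INSERT INTO ${tablename} (ID,NAME) VALUES ('${items.ID}','${items.name}')`)
        .then(() => callback())
        .catch(callback);
    }, cb);
  }, done);
}

loadGroups(grouparrobj, (err) => {
  if (err) console.log('Error', err);
  sequelize.close(); // no open handles remain, so the process can exit
});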
So I am trying to push an item to a list (joinedGrps) every time a user creates a group, but when I try to push any item to the list, the value is stored under an automatically generated ID.
But I want it stored like an array, with indexing starting at 0, as in the example below.
CODE
const RDBuserRef = realDB.ref(`users/${userInfo.userId}/joinedGrps`);
const newRDBref = RDBuserRef.push(grpName).catch((err) => {
  console.log(err);
});
So is it possible to do this?
Alternative methods are welcome, but it should use the Realtime Database.
You need to use the set() method instead of the push() one, which "generates a unique key every time a new child is added to the specified Firebase reference".
In order to know which key to use, you need to first read the array, add the new element, and then write it back.
The following will do the trick:
const groupRef = realDB.ref(`users/${userInfo.userId}/joinedGrps`);

groupRef
  .get()
  .then(snapshot => {
    if (snapshot.exists()) {
      // Existing data: append to the array and write the whole array back
      const grpArray = snapshot.val();
      grpArray.push('...');
      groupRef.set(grpArray);
    } else {
      // No data yet: create the array with its first element at index 0
      groupRef.set({ 0: 'Elem#1' });
    }
  })
  .catch(function (error) {
    console.error(error);
  });
HOWEVER, if several users are able to update the database node, you need to use a transaction. Since you need to read the array before writing it back, a transaction is the only way to ensure there are no conflicts with other users writing to the same location at the same time.
groupRef.transaction(currentArray => {
  if (currentArray === null) {
    // Node doesn't exist yet: seed the array with its first element at index 0
    return { 0: 'Elem#1' };
  } else {
    currentArray.push('....');
    return currentArray;
  }
});
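Note that transaction() also returns a promise, which you can use to check whether the write was actually committed; a small usage sketch with the same groupRef:

groupRef
  .transaction(currentArray => {
    if (currentArray === null) {
      return { 0: 'Elem#1' };
    }
    currentArray.push('....');
    return currentArray;
  })
  .then(({ committed, snapshot }) => {
    // committed is false if the update function aborted the transaction
    console.log('committed:', committed, 'new value:', snapshot.val());
  })
  .catch(error => console.error(error));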
I'm new to Node.js and am only familiar with Java. I'm trying to create a file that creates objects based on a database and adds them to an array. I want to be able to export this array so that I can use it throughout the whole program, but when I try to export the array it doesn't work. I've tried googling and reading up, but unfortunately haven't come across anything helpful.
I hope that someone can help me understand.
I've tried calling module.exports after the ".then" call, but it just returns an empty array because it's async.
I've also tried calling module.exports = teams inside the .then call, but that didn't work either.
initialize.js
var teams = [];

function assignTeamsToClasses() {
  return new Promise((resolve, reject) => {
    getAllTeamsInDb((teamList) => {
      teamList.forEach((aTeam) => {
        let newTeam = new Team(aTeam['teamid'], aTeam['teamname'], aTeam['teamrank']);
        teams.push(newTeam);
      });
      resolve();
    });
  });
}

assignTeamsToClasses().then(() => {
  module.exports = teams;
});
main.js
var teams = require('./initialize.js');
console.log(teams);
I expect it to return all teams that are in the database. I know the array is not empty when accessed within the ".then" call, but the exported value is.
Simple:
the sequence require() + console.log() is synchronous;
assignTeamsToClasses() is asynchronous, i.e. it updates teams at some unknown later point in time.
You'll have to design your module API to be asynchronous, e.g. by providing an event-listener or Promise interface that clients can subscribe to in order to receive the "database update complete" event.
A proposal:
module.exports = {
  completed: new Promise(resolve =>
    getAllTeamsInDb(teams => {
      const result = [];
      teams.forEach(aTeam =>
        result.push(new Team(aTeam.teamid,
                             aTeam.teamname,
                             aTeam.teamrank))
      );
      resolve(result);
    })
  ),
};
How to use it:
const dbAPI = require('./initialize.js');

dbAPI
  .completed
  .then(teams => console.log(teams))
  .catch(error => { /* handle DB error here? */ });
Every caller who uses this API will
either wait (asynchronously, without blocking) until the database access has completed, or
receive the result from the already-resolved promise and proceed with its then() callback.
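The same consumption pattern with async/await, just as a usage sketch equivalent to the then() version above:

const dbAPI = require('./initialize.js');

async function main() {
  // Resolves once the database read has completed
  const teams = await dbAPI.completed;
  console.log(teams);
}

main().catch(console.error);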
I'm using Papa Parse to read a CSV file and fetch the records, but the issue is that it is too fast for the database. It reads the CSV file in an instant, while each individual record is still being processed by the database API. As a result the records hit the database asynchronously in a random sequence, and not all of them make it in. My code is:
onSubmit() {
  this.papa.parse(this.csvFile, {
    step: (row) => {
      var jsonObj = this.arrayToJSON(row.data[0]);
      console.log(jsonObj);
      this.apiService.addEmployees(jsonObj)
        .subscribe(
          add => this.arrayToJSON(jsonObj),
          error => console.log("Error :: " + error));
    }
  });
}
I want Papa Parse to wait for the DB request to be processed before fetching the next record.
You can use Observable.concat for that:
onSubmit() {
  // First of all, prepare an array of observables
  const databaseWrites = [];
  this.papa.parse(this.csvFile, {
    step: (row) => {
      let jsonObj = this.arrayToJSON(row.data[0]);
      console.log(jsonObj);
      // Nothing is sent yet: the request only fires on subscription
      databaseWrites.push(this.apiService.addEmployees(jsonObj));
    },
    complete: () => {
      // Once everything is prepared, use concat to run each write one after
      // another (parsing a file is itself async, so wait for `complete`)
      Observable.concat(...databaseWrites).subscribe(() => { console.log('write success!'); });
    }
  });
}
Because Observables are cold, you can prepare your requests up front and subscribe to the concat, which executes each operation one after another.
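If you're on RxJS 6+, Observable.concat has been replaced by a standalone concat function; the equivalent, sketched with the same databaseWrites array as above, would be:

import { concat } from 'rxjs';

// Same idea: concat subscribes to each write in order, waiting for one
// to complete before starting the next
concat(...databaseWrites).subscribe({
  next: () => console.log('write success!'),
  error: err => console.log('Error :: ' + err)
});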
Been going round in circles for 2 days now.
I am getting some data from an Azure SQL database (the connection parameters are in sqlconfig):
function getCategories(callback) {
  var conn = new mssql.ConnectionPool(sqlconfig);
  var req = new mssql.Request(conn);
  console.log('in getCategories');
  conn.connect((err) => {
    if (err) {
      console.log('Connection Error:', err);
      return; // don't attempt the query on a failed connection
    }
    req.query("Select top 3 * from Categories", (err, rs) => {
      if (err) {
        console.log('Select error: ', err);
      } else {
        callback(rs.recordsets[0]);
      }
      conn.close();
    });
  });
}
I know the data is being returned correctly because when I do
getCategories((rs) => console.log('Get-Categories', rs));
I get the expected results
I am struggling to get the dataset to pass through to the view
app.get('/categories', (req, res) => {
  res.render('categories.hbs', {
    pageTitle: 'Categories',
    currentYear: new Date().getFullYear(),
    categories: getCategories((rs) => rs)
  });
});
This returns nothing in categories, as it is undefined: getCategories returns before its callback has finished running.
How do I make the app.get wait until getCategories has completed, so that the data is ready to pass back to the view?
I found this post, which led me to understand how this works:
Need to keep promise result in a variable for Nodejs mssql validation
and have put my own answer in there. The short version is that in Node you have to set the variable value INSIDE the callback stack, rather than returning it from the function to assign to the variable.
fetchDataFromDB('Select top 10 * from X', (err, res)=>{myvar = res})
How do I make the app.get wait until getCategories has completed, so that the data is ready to pass back to the view?
You could make the "getCategories" function a middleware that places the result on the request object, where the view can then pick it up. Simply call next() when the operation is complete and then render the view (a sketch of such a middleware follows the route below).
app.get('/categories', getCategories, (req, res) => {
  res.render('categories.hbs', {
    pageTitle: 'Categories',
    currentYear: new Date().getFullYear(),
    categories: req.categories
  });
});
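For this to work, getCategories needs the Express middleware signature and must stash the rows on req; that part isn't shown above, so here is a sketch of one way it could look (the req.categories property name matches the route; the rest is an assumption built from the question's code):

function getCategories(req, res, next) {
  var conn = new mssql.ConnectionPool(sqlconfig);
  var request = new mssql.Request(conn);
  conn.connect((err) => {
    if (err) { return next(err); }
    request.query("Select top 3 * from Categories", (err, rs) => {
      conn.close();
      if (err) { return next(err); }
      req.categories = rs.recordsets[0]; // picked up by the route handler
      next();
    });
  });
}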
I am building an API integration application in Node.js using the "mssql" package. I have the data pulled from the third-party API and stored in my SQL Server. However, my DB connection stays open forever and keeps my app running. Everything that I have tried ends the connection before the data can be stored. So I can either store my data and keep the connection open forever, or end my connection and not store the data. The best that I have found is something like this answer: https://stackoverflow.com/a/45681751/5552707.
And I have tried that in my app, which still kills my connection before the data is stored:
sql.connect(sqlConfig).then(pool => {
  var request = new sql.Request(pool);
  var result = request.bulk(table, (err, result) => {
    if (err) {
      console.log("fail. " + err);
      return;
    }
  });
}).catch(err => {
  console.log('There was an error processing the request. ' + err);
}).then(() => {
  // runs as soon as the first then() returns, without waiting for bulk() to finish
  console.log('done');
  process.exit(1);
});
The docs don't explain how to do this, which is frustrating.
Any ideas would be awesome!
Thanks!
Adding
process.exit();
to the callback function did the trick.
var request = new sql.Request(pool);
var result = request.bulk(table, (err) => {
  if (err) {
    console.log("fail. " + err);
    return;
  }
  process.exit(1);
});
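As an aside, a gentler alternative to killing the process is to close the pool once the bulk load finishes; with no open connections left, the event loop drains and Node exits on its own. A sketch using the same request and pool as above:

var request = new sql.Request(pool);
request.bulk(table, (err, result) => {
  if (err) {
    console.log("fail. " + err);
  }
  // Closing the pool releases the connection, letting the process exit naturally
  pool.close();
});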