Handle connection losses: re-perform the request or resend the response

Using angular-sails, the Sails.js backend is usually called this way:
this.doSomethingWithItem = function (itemID, callback) {
  $sails.put('/item/doSomething', { itemID: itemID })
    .success(function (data, status, headers, config) {
      callback(data);
    })
    .error(function (data, status, headers, config) {
      alert('Error!');
    });
};
In the backend, most of the Sails (v0.11.0) controller functions are rather simple. An example might look like this:
doSomething: function (req, res) {
  var p = req.params.all();
  postgresClientPool.connect(function (err, client, done) {
    client.query("SELECT item_do_something_in_this_awesome_function($1) AS dbreturn", [p.itemID], function (err, result) {
      done();
      if (err) {
        res.status(500).json({ success: false });
      } else {
        res.json(result.rows[0].dbreturn[0]);
      }
    });
  });
}
Now, for reasons we can't influence, we're experiencing quite frequent but rather short connection losses (between the client/browser and the Node.js/Sails server). The task is to handle them as smoothly as possible for the user and to avoid any further inconvenience.
So, if during an ongoing request the connection is interrupted, the logic has to be something like:
Check if the connection interruption happened before or after the request reached the server.
If it happened before: re-perform the request.
If it happened afterwards: tell the backend to re-send the result of the request.
Now, how to achieve that?
I don't know if registering a $sails.on('disconnect', ...) handler in each service function is the best idea. And anyway, I haven't figured out yet how to de-register them after the function has finished executing.
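For the register/de-register part, here is one hedged sketch. It assumes angular-sails exposes on/off wrappers around the sails.io.js socket events; the method names are the assumption here, so verify them against your angular-sails version:
// Sketch only: $sails.on/$sails.off are assumed to mirror socket.io's API.
this.doSomethingWithItem = function (itemID, callback) {
  var interrupted = false;
  var onDisconnect = function () {
    // the connection dropped while the request was in flight
    interrupted = true;
  };
  $sails.on('disconnect', onDisconnect);
  $sails.put('/item/doSomething', { itemID: itemID })
    .success(function (data) {
      $sails.off('disconnect', onDisconnect); // de-register when done
      callback(data);
    })
    .error(function () {
      $sails.off('disconnect', onDisconnect); // de-register when done
      if (interrupted) {
        // decide here: re-perform the request or ask the backend to re-send
      }
    });
};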

First of all, I would recommend separating your business logic out of the controller.
You can check this other answer: sails.js access controller method from controller method
By doing so, you would be able to actually make the call again from the controller, without having to do much.
Also, we will use the async.retry and async.apply functions from the async module.
For example, imagine you move your code to a service in api/services/CustomerService.js:
module.exports = {
  get: function (customerId, done) {
    // postgresClientPool should be available in a param or globally?
    // I'd prefer assigning it to the sails object...
    // and using it like sails.postgresClientPool
    postgresClientPool.connect(function (err, client, release) {
      if (err) {
        release();
        return done(err); // return so we don't run the query anyway
      }
      client.query("SELECT getCustomer($1) AS dbreturn", [customerId],
        function (err, result) {
          release();
          if (err) {
            done(err);
          } else {
            done(undefined, result.rows[0].dbreturn[0]);
          }
        });
    });
  }
};
Then, in your controller, for example, api/controllers/Customer.js:
get: function (req, res) {
  var p = req.params.all();
  async.retry(3, async.apply(CustomerService.get, p.customerID), function (err, result) {
    if (err) {
      res.status(500).json({
        success: false
      });
    } else {
      res.json(result);
    }
  });
}
You should require async at the top of your controller.
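For instance (assuming the async module is listed as a dependency of your app):
// at the top of api/controllers/Customer.js
var async = require('async');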


put operation in Node.js

For the PUT request:
router.put('/:id', controller.update);
My update method looks like this:
exports.update = function (req, res) {
  if (req.body._id) { delete req.body._id; }
  Thing.findById(req.params.id, function (err, thing) {
    if (err) { return handleError(res, err); }
    if (!thing) { return res.status(404).send('Not Found'); }
    var updated = _.merge(thing, req.body);
    updated.save(function (err) {
      if (err) { return handleError(res, err); }
      return res.status(200).json(thing);
    });
  });
};
Making the request:
$http.put('/api/things/' + thing._id, updatedThingObject)
  .success(function (update) {
    console.log("update", update);
  })
  .error(function (err) {
    console.log("err", err);
  });
It gives a connection error when I pass the object while making the request in Angular.
The error looks like this:
PUT http://localhost:9000/api/things/56c8325b9a0ee7d00d266495 net::ERR_CONNECTION_REFUSED
(anonymous function) @ angular.js:11442 sendReq @
If I take off the updated object, it makes the request just fine, but of course nothing gets updated in that case. What might be wrong here, please?
I figured it out.
The reason the functions weren't being called is that I have a function being called repetitively in Node:
var autoCreate = function () {
  console.log("THING CREATED AUTOMATICALLY");
  var randomNumb = 0;
  clearTimeout(randomNumb);
  randomNumb = (Math.random() * (10 - 5) + 5).toFixed(0);
  console.log("random number", randomNumb);
  var randomThing = randomstring({
    length: randomNumb,
    numeric: false,
    letters: true,
    special: false
  });
  console.log("random thing", randomThing);
  Thing.create({
    name: randomThing,
    readByUser: false
  }, function (err, thing) {
    console.log("THING IS", thing);
    //setTimeout(autoCreate, randomNumb * 1000);
  });
};
setTimeout(autoCreate, 10 * 1000);
Since this runs while a POST/PUT request is made, I get the connection error. How do I handle this so that the function keeps running and I can still make PUT/POST requests?
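A hedged guess at a safer shape: if an error inside autoCreate goes unhandled and crashes the Node process, every request made afterwards will see net::ERR_CONNECTION_REFUSED. Handling the error and rescheduling only after the previous create has finished avoids both the crash and overlapping writes (sketch only, reusing the names from the snippet above):
var autoCreate = function () {
  var randomNumb = (Math.random() * (10 - 5) + 5).toFixed(0);
  var randomThing = randomstring({
    length: randomNumb,
    numeric: false,
    letters: true,
    special: false
  });
  Thing.create({
    name: randomThing,
    readByUser: false
  }, function (err, thing) {
    if (err) {
      // don't let a failed insert take the whole process down
      console.error("autoCreate failed", err);
    }
    // reschedule only after the previous create has finished
    setTimeout(autoCreate, randomNumb * 1000);
  });
};
setTimeout(autoCreate, 10 * 1000);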

How to reject a deferred promise with a status in AngularJS

I am not able to find similar questions (hopefully I am right), so I decided to ask.
I have a lot of deferred-promise methods in my own services, something like this:
function GetSomething(id) {
  var deferred = $q.defer();
  $http.get(url + "/" + id)
    .success(function (response) {
      deferred.resolve(response);
    })
    .error(function (err, status) {
      // I wish I won't need to do something like this
      if (status === 404)
        deferred.reject("Not Found");
      else
        deferred.reject(err);
    });
  return deferred.promise;
}
When I call this, I do:
myService.getSomething(id).then(
  function (response) {
    // normal
  },
  function (err, status) {
    // HERE: no additional input parameter any more
  }
);
So, my question is: if I use a deferred promise, how can I reject it with all the additional parameters, such as status, so that I don't have to manually construct a "reason" for the reject method?
I don't think it's a big deal; actually, while typing this, I feel it's probably fine: my service will handle the 404 and redirect to a specific page when it encounters such a status. Initially I was thinking of making all my services dumb, simply returning the promise without any UI logic.
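One common workaround (a sketch, not the only way): reject with a single object that carries both the status and the error payload, so callers can branch on it without the service containing any UI logic:
function GetSomething(id) {
  var deferred = $q.defer();
  $http.get(url + "/" + id)
    .success(function (response) {
      deferred.resolve(response);
    })
    .error(function (err, status) {
      // pack everything the caller might need into one rejection reason
      deferred.reject({ status: status, data: err });
    });
  return deferred.promise;
}

myService.getSomething(id).then(
  function (response) {
    // normal
  },
  function (reason) {
    if (reason.status === 404) {
      // redirect to the not-found page, show a message, ...
    }
  }
);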

Updating database with node.js and angular

I have an app which posts, gets and deletes data, and I would like to add update functionality as well, but I can't figure it out.
I have a Node.js server with this API:
app.get('/api/feedbacks', function(req, res) {
  // use mongoose to get all feedbacks in the database
  getfeedbacks(res);
});
// create feedback and send back all feedbacks after creation
app.post('/api/feedbacks', function(req, res) {
  // create a feedback; information comes from an AJAX request from Angular
  FeedBack.create(req.body, function(err, feedback) {
    if (err)
      res.send(err);
    // get and return all the feedbacks after you create another
    getfeedbacks(res);
  });
});
// delete a feedback
app.delete('/api/feedbacks/:feedback_id', function(req, res) {
  FeedBack.remove({
    _id: req.params.feedback_id
  }, function(err, feedback) {
    if (err)
      res.send(err);
    getfeedbacks(res);
  });
});
and this Angular service, which talks to the Node API:
service.factory('FeedBacks', ['$http', function($http) {
  return {
    create: function(feedBackData) {
      return $http.post('/api/feedbacks', feedBackData);
    },
    get: function() {
      return $http.get('/api/feedbacks');
    },
    delete: function(id) {
      return $http.delete('/api/feedbacks/' + id);
    }
  };
}]);
That way I can post, get and delete data.
My goal is to add an update function as well.
What I have tried on the Node side:
// update a feedback
app.put('/api/feedbacks/:feedback_id', function(req, res) {
  // edit a feedback; information comes from an AJAX request from Angular
  FeedBack.put(req.body, function(err, feedback) {
    if (err)
      res.send(err);
    // get and return all the feedbacks after you edit one
    getfeedbacks(res);
  });
});
In the Angular service:
update: function(editFeedId, editedFeed) {
  return $http.put('/api/feedbacks/' + editFeedId, editedFeed);
}
The controller looks like:
$scope.editFeed = function(id) {
  $scope.editFeedId = id;
  $scope.editedFeed = 'replace this txt';
  FeedBacks.update($scope.editFeedId, $scope.editedFeed)
    // if the update is successful, call our get function to fetch all the feedbacks
    .success(function(data) {
      console.log('updated');
      $scope.feedbacks = data;
    });
};
I get a 500 error when I execute editFeed(). I couldn't figure out how to configure this! Where am I going wrong? Any tips?
Thanks a lot in advance!
I'm assuming you're using Mongo here, in which case your update statement is incorrect.
It should be something like:
app.put('/api/feedbacks/:feedback_id', function(req, res) {
  FeedBack.update({ _id: req.params.feedback_id }, req.body, function(err, feedback) {
    if (err)
      res.send(err);
    // get and return all the feedbacks after you edit one
    getfeedbacks(res);
  });
});
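One caveat worth flagging: res.send(err) above is not followed by a return, so getfeedbacks(res) still runs on errors and will try to send a second response (Express then complains that headers were already sent). A guarded variant:
app.put('/api/feedbacks/:feedback_id', function(req, res) {
  FeedBack.update({ _id: req.params.feedback_id }, req.body, function(err, feedback) {
    if (err)
      return res.send(err); // stop here so we don't answer twice
    getfeedbacks(res);
  });
});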

AngularJS - why promises ($q) with $http?

I am learning AngularJS after using jQuery for a few years, and some bits are much more intuitive. Some not so much :).
I am trying to get my head around the use of promises, particularly $q in combination with $http, and there does not seem to be much information around these two combined that I can find.
Why would I use promises in place of the success/error callbacks? They both make use of callbacks in reality, so why is a promise considered better? E.g. I could set up a get(...) function as follows:
function get(url, success, error) {
  success = success || function () {};
  error = error || function () {};
  $http.get(url)
    .success(function (data) {
      success(data);
    })
    .error(function (err) {
      error(err); // the parameter must not shadow the error callback
    });
}
get('http://myservice.com/JSON/',
  function (data) {
    // do something with data
  },
  function (err) {
    // display an error
  }
);
Which is good(?) because it gives me complete control over what is happening. If I call get(...) then I can control any success/errors wherever get is called.
If I convert this to use promises, then I get:
function get(url) {
  return $http.get(url)
    .then(function (data) {
      return data;
    },
    function (error) {
      return error;
    });
}
get('http://myservice.com/JSON/')
  .then(function (data) {
    // do something with data
  });
// cannot handle my errors?
Which is more condensed, I agree; we also do not have to explicitly worry about the success/error callbacks. But I seem to have lost control over my error callback for a start, because I cannot configure a second callback to handle an error.
Which means that if I use this function in a service which is used by multiple controllers, I cannot update the UI to alert the user to an error.
Am I missing something? Is there a reason why promises are preferred? I cannot find an example showing why.
Usually you'll deal with asynchronous tasks in JavaScript with callbacks:
$.get('path/to/data', function(data) {
  console.log(data);
});
It works fine, but things start to get complicated when you descend into what's called 'callback hell':
$.get('path/to/data', function(data) {
  $.get('path/to/data2' + data, function(data2) {
    $.get('path/to/data3' + data2, function(data3) {
      manipulate(data, data2, data3);
    }, errorCb);
  }, errorCb);
}, errorCb);
The alternative is working with promises and deferred objects:
Deferreds - representing units of work
Promises - representing data from those Deferreds
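For contrast, here is a minimal sketch of the nested example above flattened into a promise chain (getData is a hypothetical helper that returns a promise, e.g. a thin wrapper over $http.get):
getData('path/to/data')
  .then(function (data) {
    return getData('path/to/data2' + data);
  })
  .then(function (data2) {
    return getData('path/to/data3' + data2);
  })
  .then(function (data3) {
    manipulate(data3); // each step receives the previous step's result
  })
  .catch(errorCb); // a single handler covers a failure at any step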
Sticking to this approach can assist you in every extreme async-task case:
You have a regular call that needs to get data from the server, manipulate it, and return it to the scope.
You have multiple calls, each depending on the previous one (chain strategy).
You want to send multiple (parallel) calls and handle their success in one block.
You want your code to be organized (and avoid handling results inside controllers).
Your task is the easiest one to handle with $q and $http:
function get(url) {
  var deferred = $q.defer();
  $http.get(url)
    .success(function (data) {
      deferred.resolve(data);
    })
    .error(function (error) {
      deferred.reject(error);
    });
  return deferred.promise;
}
And calling the service function is the same:
get('http://myservice.com/JSON/')
  .then(function (data) {
    // do something with data
  });
// cannot handle my errors?
You can handle the error like this:
get('http://myservice.com/JSON/')
  .then(function (data) {
    // do something with data
  },
  function (error) {
    // do something with error
  });
But unfortunately, since you have already caught the error, the final error handler won't be triggered. You will also have the same problem with success.
To get that to work you need to use $q:
function get(url) {
  var deferred = $q.defer();
  $http.get(url)
    .success(function (data) {
      deferred.resolve(data);
    })
    .error(function (error) {
      deferred.reject(error);
    });
  return deferred.promise;
}
Also there is no need to pass in success and error functions because you can use promises instead.
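A side note on that last point: $http.get itself already returns a promise, so in many cases you can skip the $q.defer() wrapper entirely and just return (and optionally transform) the call:
function get(url) {
  // $http.get already returns a promise; just return it
  return $http.get(url).then(function (response) {
    return response.data; // hand callers only the payload
  });
}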

calling angularjs-service iteratively

I have an array of ids and would like to iterate over them and pass them to a service to fetch some data. But I would like to only move to the next id after the processing of the previous id has finished. After all the data has been fetched I need to call a specific function.
My code (without the iteration) would be something like:
MyService.fetch(id)
  .success(function (data, status, headers, config) {
    doSomething();
  });
What I want to achieve is something like this but in a way which can handle an unknown number of items in my array of ids:
MyService.fetch(id).success(function (data, status, headers, config) {
  MyService.fetch(id2).success(function (data, status, headers, config) {
    doSomething();
  });
});
Any ideas how to achieve this?
thanks
Thomas
Angular comes with a lightweight promise library: $q.
It's actually quite simple to do.
Service
myApp.factory('theProcessor', function($q, $timeout) {
  return {
    fetch: function(queue, results, defer) {
      defer = defer || $q.defer();
      var self = this;
      // Continue fetching if we still have ids left
      if (queue.length) {
        var id = queue.shift();
        // Replace this with your http call
        $timeout(function() {
          // Don't forget to call defer.resolve if you add logic here
          // that decides not to continue the process loop.
          self.fetch(queue, results, defer);
          results.push({ id: id, value: Math.floor((Math.random() * 100) + 1) });
        }, 500);
      } else {
        // We're done -- inform our caller
        defer.resolve(results);
      }
      // Return the promise, which we will resolve when we're done
      return defer.promise;
    }
  };
});
See it in action at this plunker.
Try the following approach:
var idsArray = [], result = [];
// ...After filling the array
function nextIteration(index) {
  MyService.fetch(idsArray[index]).success(function (data, status, headers, config) {
    result.push(data);
    if (++index < idsArray.length) {
      nextIteration(index);
    } else {
      console.log('Task complete');
    }
  });
}
nextIteration(0);
You could use $q's all() method to bundle all the promises that you define and then do something after all of them are resolved, e.g.:
$q.all([promise1, promise2, ...]).then(...)
You may want to consider implementing this feature in your controller or your service.
Take a look at the $q documentation for a complete API reference and details.
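For instance, a sketch using the names from the question (note that $q.all fires all the requests in parallel rather than one after another, so only use it if strict ordering isn't actually required):
var promises = idsArray.map(function (id) {
  return MyService.fetch(id); // every fetch starts immediately
});
$q.all(promises).then(function (results) {
  doSomething(results); // runs once every fetch has resolved
});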
UPDATE
Just thinking that your service could accept an array of ids, and it could have a method which recursively fetches the data in the order that you want. Look at the following code; it's an idea, so it may not work as-is:
function fetchAll(idArr /* this is your ID array */) {
  var result = [];
  var fetch = function (idArr) {
    if (!idArr.length) { return; } // nothing left to fetch
    // (a simple guess at what you want to do with that ID)
    $http.get('yourURL?id=' + idArr[0])
      .success(function (data) {
        // some logic
        result.push(data);
        idArr.splice(0, 1); // drop the id we just processed
        fetch(idArr);
      });
  };
  fetch(idArr);
}
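Another common idiom for running an unknown number of promise-returning calls strictly in sequence is folding the array into a promise chain with reduce (a sketch, assuming MyService.fetch returns an $http-style promise):
var results = [];
idsArray.reduce(function (previous, id) {
  return previous.then(function () {
    return MyService.fetch(id).then(function (response) {
      results.push(response.data); // collect each result in order
    });
  });
}, $q.when()).then(function () {
  doSomething(results); // every id processed, strictly in order
});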
