How to reject a deferred promise with a status in AngularJS

I was not able to find a similar question (hopefully I'm right), so I decided to ask.
I have a lot of deferred promise methods in my own services, something like this:
function GetSomething(id) {
    var deferred = $q.defer();
    $http.get(url + "/" + id)
        .success(function (response) {
            deferred.resolve(response);
        })
        .error(function (err, status) {
            // I wish I didn't need to do something like this
            if (status === 404)
                deferred.reject("Not Found");
            else
                deferred.reject(err);
        });
    return deferred.promise;
}
When I call this, I do:
myService.getSomething(id).then(
    function (response) {
        // normal
    },
    function (err, status) {
        // HERE: the additional parameter no longer arrives
    }
);
So, my question is: if I use a deferred promise, how can I reject it with all the additional parameters, such as "status", so that I don't have to manually construct a "reason" for the reject method?
I don't think it's a big deal; actually, while typing this, I started to feel it's probably fine for my service to handle the 404 and redirect to a specific page when it encounters such a status. Initially I wanted to make all my services dumb, simply returning the promise without any UI logic.
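If I did want the extra parameters, I suppose one option is to pack everything into the single rejection reason, since $q's reject() only accepts one argument. A minimal sketch (the reason shape is just my own convention, nothing Angular prescribes):
function GetSomething(id) {
    var deferred = $q.defer();
    $http.get(url + "/" + id)
        .success(function (response) {
            deferred.resolve(response);
        })
        .error(function (err, status) {
            // reject() takes exactly one argument, so bundle the error
            // body and the HTTP status code into a single reason object
            deferred.reject({ data: err, status: status });
        });
    return deferred.promise;
}

myService.getSomething(id).then(
    function (response) {
        // normal
    },
    function (reason) {
        if (reason.status === 404) {
            // redirect to the "not found" page
        }
    }
);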

Related

Handle connection losses: re-perform request or resend response

Using angular-sails, the Sails.js backend is usually called this way:
this.doSomethingWithItem = function (itemID, callback) {
    $sails.put('/item/doSomething', { itemID: itemID })
        .success(function (data, status, headers, config) {
            callback(data);
        })
        .error(function (data, status, headers, config) {
            alert('Error!');
        });
};
In the backend, most of the Sails (v0.11.0) controller functions are rather simple. An example might look like this:
doSomething: function (req, res) {
    var p = req.params.all();
    postgresClientPool.connect(function (err, client, done) {
        client.query("SELECT item_do_something_in_this_awesome_function($1) AS dbreturn", [p.itemID], function (err, result) {
            done();
            if (err) {
                res.status(500).json({ success: false });
            } else {
                res.json(result.rows[0].dbreturn[0]);
            }
        });
    });
}
Now, for reasons we cannot influence, we are experiencing quite frequent but rather short connection losses (between the client/browser and the Node.js/Sails server). The task is to handle them as smoothly as possible for the user and avoid any further inconvenience.
So, if during an ongoing request the connection is interrupted, the logic has to be something like:
Check if the connection interruption happened before or after the request reached the server.
If it happened before: re-perform the request.
If it happened afterwards: tell the backend to re-send the result of the request.
Now, how to achieve that?
I don't know if registering a $sails.on('disconnect', ...) in each service function is the best idea, and anyway, I haven't figured out yet how to de-register them after the function finishes executing.
First of all, I would recommend separating your business logic out of the controller.
You can check this other answer: sails.js access controller method from controller method
By doing so, you will be able to make the call again from the controller without having to do much.
We will also use the async.retry and async.apply functions from the async module.
For example, imagine you move your code to a service in api/services/CustomerService.js:
module.exports = {
    get: function (customerId, done) {
        // postgresClientPool should be available in a param or globally?
        // I'd prefer assigning it to the sails object...
        // and using it like sails.postgresClientPool
        postgresClientPool.connect(function (err, client, release) {
            if (err) {
                release();
                return done(err); // return, so we don't fall through to the query
            }
            client.query("SELECT getCustomer($1) AS dbreturn", [customerId],
                function (err, result) {
                    release();
                    if (err) {
                        done(err);
                    } else {
                        done(undefined, result.rows[0].dbreturn[0]);
                    }
                });
        });
    }
};
Then, in your controller, for example, api/controllers/Customer.js:
get: function (req, res) {
    var p = req.params.all();
    async.retry(3, async.apply(CustomerService.get, p.customerID), function (err, result) {
        if (err) {
            res.status(500).json({
                success: false
            });
        } else {
            res.json(result);
        }
    });
}
You need to require async at the top of your controller.
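For completeness, a sketch of that require plus an alternative call; newer versions of async also accept an options object in place of the bare count, in case you want a pause between attempts (the interval value here is arbitrary):
// top of api/controllers/Customer.js
var async = require('async');

// same as async.retry(3, ...), but waits 200 ms between attempts
async.retry({ times: 3, interval: 200 },
    async.apply(CustomerService.get, p.customerID),
    function (err, result) {
        // identical handling to the controller code above
    });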

Force rejecting Angular $http call

I am using $http to make a call. Based on the result of a successful call, I may decide to throw an error / reject and have it trickle down to the next call as an error. However, if an error is thrown, it just halts the process. How can I force the $http promise to reject without wrapping it in some $q code?
// A service
angular.module('app').factory('aService', function ($http, config) {
    return {
        subscribe: function (params) {
            return $http({
                url: '...',
                method: 'JSONP'
            }).then(function (res) {
                // This is a successful HTTP call, but it may be a failure as far as
                // I am concerned, so I want the calling code to treat it as one.
                if (res.data.result === 'error') throw new Error('Big Error')
            }, function (err) {
                return err
            })
        }
    }
})
// Controller
aService.subscribe({
    'email': '...'
}).then(function (result) {
}, function (result) {
    // I want this to be the Big Error message. How do I get here from the success call above?
})
In the code above I would like the Big Error message to end up as a rejection. However, in this case it just dies with the error. This is how I would handle things in, say, Bluebird, but it's a no-go here.
To continue the chain in a rejected state, just return a rejected promise, $q.reject('reason'), from your $http result handler, something like:
$http.get(url).then(
    function (response) {
        if (something) {
            return $q.reject('reason');
        }
        return response;
    }
);
That way you'll get a rejected promise and can react to it even when the API call is successful.
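Applied to the service from the question, a sketch might look like this (note that the error handler returns $q.reject(err) rather than err; returning a plain value from a rejection handler would convert the failure back into a success):
angular.module('app').factory('aService', function ($http, $q) {
    return {
        subscribe: function (params) {
            return $http({
                url: '...',
                method: 'JSONP'
            }).then(function (res) {
                if (res.data.result === 'error') {
                    // turn the "successful" HTTP call into a rejection
                    return $q.reject('Big Error');
                }
                return res;
            }, function (err) {
                // re-reject so transport errors keep propagating
                return $q.reject(err);
            });
        }
    };
});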

AngularJS - why promises ($q) with $http?

I am learning AngularJS after using jQuery for a few years, and some bits are much more intuitive. Some not so much :).
I am trying to get my head around the use of promises, particularly $q in combination with $http, and there does not seem to be much information about these two combined that I can find.
Why would I use promises in place of the success/error callbacks? They both make use of callbacks in reality, so why is a promise considered better? E.g. I could set up a get(...) function as follows:
function get(url, success, error) {
    success = success || function () {};
    error = error || function () {};
    $http.get(url)
        .success(function (data) {
            success(data);
        })
        .error(function (err) {
            error(err); // parameter renamed so it doesn't shadow the error callback
        });
}
get('http://myservice.com/JSON/',
    function () {
        // do something with data
    },
    function () {
        // display an error
    }
);
Which is good(?) because it gives me complete control over what is happening: if I call get(...), then I can handle any success/error wherever get is called.
If I convert this to use promises, then I get:
function get(url) {
    return $http.get(url)
        .then(function (data) {
            return data;
        },
        function (error) {
            return error;
        });
}

get('http://myservice.com/JSON/')
    .then(function (data) {
        // do something with data
    });
    // cannot handle my errors?
Which is more condensed, I agree; we also do not have to explicitly worry about the success/error callbacks. But I seem to have lost control over my error callback, because I cannot configure a second callback to handle an error.
Which means that if I use this function in a service used by multiple controllers, I cannot update the UI to alert the user to an error.
Am I missing something? Is there a reason why promises are preferred? I cannot find an example of why.
Usually you deal with asynchronous tasks in JavaScript with callbacks:
$.get('path/to/data', function (data) {
    console.log(data);
});
This works fine, but things start to get complicated when you run into what's called "callback hell":
$.get('path/to/data', function (data) {
    $.get('path/to/data2' + data, function (data2) {
        $.get('path/to/data3' + data2, function (data3) {
            manipulate(data, data2, data3);
        }, errorCb);
    }, errorCb);
}, errorCb);
The alternative is working with promises and deferred objects:
Deferreds - representing units of work
Promises - representing data from those deferreds
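For comparison, here is a sketch of the same sequence flattened into a chain (assuming jQuery 1.8+, where .then returns a new promise; if the final step needs all three results, you would carry the earlier ones along explicitly):
$.get('path/to/data')
    .then(function (data) {
        return $.get('path/to/data2' + data);
    })
    .then(function (data2) {
        return $.get('path/to/data3' + data2);
    })
    .then(function (data3) {
        manipulate(data3);
    })
    .fail(errorCb); // a single error handler covers every step of the chain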
Sticking to this pattern can assist you in every extreme async-task case:
You have a regular call that needs to get data from the server, manipulate it, and return it to the scope
You have multiple calls where each depends on the previous one (chain strategy)
You want to send multiple (parallel) calls and handle their success in one block (see the sketch after this list)
You want your code to be organized (preventing result handling from leaking into controllers)
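For the parallel case (third point above), a sketch using $q.all together with the get() helper defined below; the URLs are placeholders:
$q.all([
    get('http://myservice.com/JSON/a'),
    get('http://myservice.com/JSON/b')
]).then(function (results) {
    // results[0] and results[1] arrive together, in request order
}, function (error) {
    // fires as soon as any one of the calls fails
});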
Your case is the easiest one to handle with $q and $http:
function get(url) {
    var deferred = $q.defer();
    $http.get(url)
        .success(function (data) {
            deferred.resolve(data);
        })
        .error(function (error) {
            deferred.reject(error);
        });
    return deferred.promise;
}
And calling the service function stays the same:
get('http://myservice.com/JSON/')
    .then(function (data) {
        // do something with data
    });
    // cannot handle my errors?
You can handle the error like this:
get('http://myservice.com/JSON/')
    .then(function (data) {
        // do something with data
    },
    function (error) {
        // do something with error
    });
But unfortunately, since you have already caught the error, the final error callback won't be triggered. You will have the same problem with success, too.
To get that to work you need to use $q:
function get(url) {
    var deferred = $q.defer();
    $http.get(url)
        .success(function (data) {
            deferred.resolve(data);
        })
        .error(function (error) {
            deferred.reject(error);
        });
    return deferred.promise;
}
Also, there is no need to pass in success and error functions, because you can use promises instead.

Mark a notification as read in the Facebook Graph API

I want to mark a user's notification as read using the Facebook Graph API, but I am now starting to wonder whether that is possible at all. Here is what I am trying now, which is a solution I found in this question on Stack Overflow.
$http({ method: 'POST', url: 'https://graph.facebook.com/' + item.id + '?unread=0' })
    .success(function (data, status, headers, config) {
        // this callback will be called asynchronously
        // when the response is available
        deferred.resolve(status);
    })
    .error(function (data, status, headers, config) {
        // called asynchronously if an error occurs
        // or the server returns a response with an error status
    });
Of course, item.id is the id of the notification.
I am using Angular for my HTTP requests, but I don't mind other methods too; Angular's is just the easiest for me. I am also looking forward to hearing any ideas on how to mark notifications as read; I don't prefer any particular way, I just want it to happen somehow.
This is the solution I found a while ago and forgot to post. (Note that I am using Angular.js; if you are not, either make the promise with the q or rsvp libraries, or don't.)
function (notificationId) {
    var deferred = $q.defer();
    FB.api(
        'https://graph.facebook.com/' + notificationId, 'post', {
            unread: 0
        },
        function (response) {
            if (!response || response.error) {
                // guard: response may be missing entirely, not just carry an error
                deferred.reject(response && response.error);
            } else {
                deferred.resolve(response);
            }
        });
    return deferred.promise;
}
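A usage sketch, assuming the function above is exposed as markAsRead on a service (the service and function names are mine):
notificationService.markAsRead(item.id).then(
    function (response) {
        // the notification is now marked as read
    },
    function (error) {
        // response.error from the Graph API ends up here
        console.log(error);
    });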

Implementing sockets into $http in AngularJS

I implemented Primus (sockets) on my server and would like to access it via the client, which uses AngularJS. I would still like to be able to use libraries like Restangular or Angular's $resource, so IMHO the best way to achieve this is to extend the $http service, which most libraries use as their basis.
I would like this new service to gracefully fall back to normal $http when there is no socket connection available.
In pseudocode:
socketHttpService = function (config) {
    if (socketEnabled) {
        var message = buildMessageFromConfig(config);
        primus.write(message);
        return promise;
    }
    return $http(config);
}
Call it like you would call $http:
socketHttpService({ method: 'GET', url: '/someUrl' }).then(function () {
    // do whatever
});
My question is: how can I replace the standard $http service with this newly created one? Is there an elegant way to do it while still retaining the default $http behaviour?
In the meantime, I found a solution to the problem:
.config(function ($provide) {
    $provide.decorator('$httpBackend', function ($delegate, $q, $log, SocketService) {
        // do not blast the mock $httpBackend if it exists
        if (angular.isDefined(angular.mock)) {
            return $delegate;
        }
        var httpBackendSocket = function (method, url, post, callback, headers, timeout, withCredentials, responseType) {
            if (SocketService.isOpen) {
                console.log('open');
                method = method.toLowerCase();
                // we only know get, post, put, delete
                if (method === 'get' || method === 'post' || method === 'put' || method === 'delete') {
                    // we cannot handle the authentication links via sockets, so exclude them
                    if (url.substring(0, '/api/v1/currentuser'.length) !== '/api/v1/currentuser' &&
                        !angular.equals(url, '/api/v1/login') &&
                        !angular.equals(url, '/api/v1/logout') &&
                        !angular.equals(url, '/api/v1/session')) {
                        var promise = SocketService.writeRest(method, url, post || {});
                        return promise.then(function promiseSuccess(response) {
                            return callback(response.status, response.data, response.headers, '');
                        }, function promiseError(response) {
                            // is caught via the http handlers
                            // LATER: on error, retry with $httpBackend ($delegate)
                            return callback(response.status, response.data, response.headers, '');
                        });
                    }
                }
            }
            return $delegate(method, url, post, callback, headers, timeout, withCredentials, responseType);
        };
        return httpBackendSocket;
    });
})
Why? Because it feels about five times faster than HTTP, since there is a standing connection, and I am not losing any of the realtime options. It's the cherry on top.
Kind Regards
