Multiple requests, same response - angularjs

I'm facing a strange issue with the $http service: every request made to the API provided by the company I work for, when wrapped in a $q.all(), returns the same response.
var promises = [
    $httpPromiseA(),
    $httpPromiseB(),
    $httpPromiseC()
];

$q.all(promises)
    .then(function (response) {
        $log.log(response);
    });
// $log.log logs [expectedObjectFromA, expectedObjectFromA, expectedObjectFromA]
This occurs randomly: the expectedObjectFromA might instead be expectedObjectFromB or expectedObjectFromC. The point is that all three positions hold the same object, or two of one and one of another, and so on.
If I chain one request after another, in a linear manner, they all work perfectly, but it takes more time to accomplish the task, of course.
var def = $q.defer();
var resolvedData = [];

$httpPromiseA()
    .then(function (response) {
        $log.log(response);
        resolvedData.push(response);
        return $httpPromiseB();
    })
    .then(function (response) {
        $log.log(response);
        resolvedData.push(response);
        return $httpPromiseC();
    })
    .then(function (response) {
        $log.log(response);
        resolvedData.push(response);
        return def.resolve(resolvedData);
    });

return def.promise;
// $log.log logs [expectedObjectFromA],
// [expectedObjectFromB] and [expectedObjectFromC],
// as expected. resolvedData ends up as [expectedObjectFromA,
// expectedObjectFromB, expectedObjectFromC].
Can you give me some directions on what may be happening here?
I'm using the core implementations of $http and $q. I've also tried a $q.allSettled, applied as a decorator, based on the API of Kris Kowal's Q, but that didn't work either.
Thanks.
EDIT 1:
I cannot pass the arguments to the functions separately because, in my app, a wrapper function is what calls the $http service. The wrapper function expects a String as the first argument and an Object as the second, and it returns the $http call.
EDIT 2:
This Plunker makes 2 concurrent requests, one to the Instagram API and the other to the Flickr API. The problem doesn't occur in this Plunker. I really don't know how to deal with this; it is REALLY annoying.

In $q.all you'll want to pass in an array of promises, instead of executing the functions: $httpPromiseA.$promise instead of $httpPromiseA(), and so on.
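For illustration, a minimal sketch of what this answer suggests, assuming the three calls are $resource actions that expose a $promise property (plain $http calls already return promises, so that is an assumption about the wrapper):

// Hypothetical sketch of the suggestion: pass the promises themselves to $q.all,
// assuming $httpPromiseA/B/C expose a $promise property (as $resource results do).
$q.all([
    $httpPromiseA.$promise,
    $httpPromiseB.$promise,
    $httpPromiseC.$promise
]).then(function (responses) {
    // responses[0], responses[1] and responses[2] correspond to A, B and C in order
    $log.log(responses);
});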

The problem was on the server side, which had trouble with simultaneous requests and answered all of the requests made together with the same response.
Thanks for all support and attention.

Related

How to stop double request to same url

Angularjs app here.
There are 2 controllers that do similar things.
In particular, they each have an interval: every 10 seconds they call their own service.
These 2 different services also do similar things. Most importantly, they hit a URL that looks like this:
example.com/fromTimestamp=2019-11-21T15:13:51.618Z
As the two controllers start more or less at the same time, in the example above they could generate something like:
controller/service 1: example.com/fromTimestamp=2019-11-21T15:13:51.618Z
controller/service 2: example.com/fromTimestamp=2019-11-21T15:13:52.898Z
This is because the parameter is created in the service with this line:
var timestamp = fromTimestamp ? '&fromTimestamp=' + fromTimestamp.toISOString() : '';
So there may be a difference of a few seconds between the two URLs, or even only a difference of milliseconds.
I would like to make only one request and share the data fetched over http between the two services.
The most natural approach would seem to be using a cache.
From what I could understand, this call should do the trick:
return $http.get('example.com/fromTimestamp=2019-11-21T15:13:51.618Z', {cache: true});
But looking at the dev tools, it is still making 2 requests to the server. I guess this is because they have 2 different URLs?
If that is the problem, what would be another approach to it?
In my apps, when faced with this problem, I use the $q provider and a promise object to suspend all calls to the same endpoint while a singleton promise is unresolved.
So, if the app makes two calls very close together, the second call will not be attempted until the promise created by the first call has resolved. Further, I check the parameters of the calls, and if they are the same, the original promise object is returned for both requests. In your case, your parameters are always different because of the timestamp. In that case, you could compare the difference in time between the two calls, and if it is under a certain threshold in milliseconds, just return that first promise object. Something like this:
var promiseKeeper = {}; // singleton object in the service, keyed by endpoint name

// lastRequestTime, currentRequestTime, apiUrl, params and extractError
// come from the surrounding service in the original code.
function request(endpointName, dataAsJson) {
    var kept = promiseKeeper[endpointName];
    if (
        angular.isDefined(kept) &&
        angular.isDefined(kept.promise) &&
        /* kept.dataAsJson == dataAsJson && */
        lastRequestTime - currentRequestTime < 500
    ) {
        // A matching request is already in flight: hand back the same promise.
        return kept.promise;
    } else {
        var deferred = $q.defer();
        var postRequest = $http.post(apiUrl, dataAsJson);
        postRequest
            .then(function (response) {
                promiseKeeper[endpointName] = {}; // clear the keeper once settled
                if (params.special) {
                    deferred.resolve(response);
                } else {
                    deferred.resolve(response.data.result);
                }
            })
            .catch(function (errorResponse) {
                promiseKeeper[endpointName] = {};
                console.error("Error making API request");
                deferred.reject(extractError(errorResponse));
            });
        promiseKeeper[endpointName] = {
            promise: deferred.promise,
            dataAsJson: dataAsJson
        };
        return deferred.promise;
    }
}
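A hypothetical usage sketch (the endpoint name, the payload and the request wrapper name are assumptions): if two services hit the same endpoint within the threshold, only one HTTP request is made and both callers receive the same promise.

// Hypothetical usage: two callers hit the same endpoint within 500 ms,
// so the second call reuses the promise created by the first one and
// only one HTTP request is actually made.
request('feed', { fromTimestamp: new Date().toISOString() })
    .then(function (data) { /* service 1 uses the data */ });

request('feed', { fromTimestamp: new Date().toISOString() })
    .then(function (data) { /* service 2 gets the very same promise */ });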

Make Angular $http service process results one after another

I have a very large angularjs app that sells stuff and has filters.
It seems that we need to support people on flaky connections.
That means that if the user selects the 'used product' filter and then unselects it, there will be 2 calls to the server via $http.
$http.get("reloadresults?used=true").then(function (response) { $scope.items = response.items; }); // sent at 12:03:04 hours
$http.get("reloadresults?used=false").then(function (response) { $scope.items = response.items; }); // sent at 12:03:05
Now, imagine there is a bottleneck or something and the first call with 'used=true' returns last; then there is a problem with the filters.
I know there is an $http interceptor in angularjs, based on promises. How would I fix this problem, so that responses are processed in the order the requests were sent, meaning 'used=true' first and only then 'used=false'?
Edit: I can't block the thread and can't refactor; I just need the promises to be fulfilled in the order they were first sent. I think I'll post an answer later.
I didn't understand your question well, but I think you are looking for
$q.all(valid_request)
You could indeed ensure that the success handlers are called in the correct order by forming a queue (a promise chain; a sketch of that approach follows the notes below). However, it is simpler, and better in this case, to nullify the previous request each time a new request is made.
There are a number of ways in which this could be achieved. Here's one ...
function cancelPrevious(fn) {
    var reject_ = null;
    return function (x) {
        if (reject_) reject_(new Error('cancelled'));
        return Promise.race([fn(x), new Promise(function (_, reject) {
            reject_ = reject; // if reject_ is called before fn(x) settles, the returned promise will be rejected
        })]);
    };
}
which allows you to write ...
var get = cancelPrevious(function(str) {
return $http.get(str);
});
... and to make requests from any number of event threads :
get('reloadresults?used=true').then(function (response) {
    // This function will be reached only if
    // $http.get('reloadresults?used=true') fulfills
    // before another get() call is made.
    $scope.items = response.items;
});
...
// This call causes the then callback above to be suppressed if not already called
get('reloadresults?used=false').then(function (response) {
    $scope.items = response.items;
});
Notes :
http requests are not cancelled. Instead, the success path stemming from each request is made "cancelable" by racing against a rejectable promise.
side effects included in the function passed to cancelPrevious() may still be executed; in general, don't include such side effects.
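For completeness, a minimal sketch of the queue alternative mentioned above; the helper names are assumptions. Requests still fire immediately, but their success handlers are applied strictly in the order the requests were issued:

// Minimal sketch of the promise-chain queue (helper names are assumptions).
// Requests fire immediately; handlers are chained so results are applied
// in request order even if an earlier response arrives last.
function makeOrderedHandler() {
    var queue = Promise.resolve();
    return function inOrder(promise, handler) {
        queue = queue
            .then(function () { return promise; })
            .then(handler, function () { /* swallow errors to keep the chain alive */ });
        return queue;
    };
}

var inOrder = makeOrderedHandler();

inOrder($http.get('reloadresults?used=true'), function (response) {
    $scope.items = response.items;
});
inOrder($http.get('reloadresults?used=false'), function (response) {
    $scope.items = response.items;
});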

$q.all slower than sequential .then()?

I have some angular code that calls two separate backend services via $http.get. The backend is ASP.NET MVC 5.
I call the services via $http.get, and since I need a response from both services before continuing, I wrap the returned promises in $q.all. However, there seems to be a massive overhead when resolving the promises via $q.all compared to resolving the promises sequentially (i.e. calling the second service in the .then callback of the first promise).
The overhead appears in the TTFB (Time to first byte).
I can't figure out why $q.all would be slower than sequentially waiting for one promise to resolve before starting the next. In fact, I thought $q.all would be faster since it would allow me to start the second service call before the first has resolved.
Read on for implementation details.
These backend services are fairly lightweight:
ProductsController:
[HttpGet]
public Dictionary<string, PriceListTypeDto> GetPriceListTypesForProducts([FromUri] List<string> productErpIds)
{
    // Work to get PriceListTypes. Work takes 40 ms on avg.
}
UserController:
[HttpGet]
public int? GetUserOrganizationId()
{
    // Work to get orgId. 1-10 ms runtime on avg.
}
Javascript functions that call these services:
var addPriceListTypes = function (replacementInfoObjects, productErpIds) {
    return productService.getPriceListTypesForProducts(productErpIds) // Returns promise from $http.get
        .then(function (response) {
            // Simple work, takes 1 ms.
        })
        .catch(function () {
        });
};

var addOrganizationSpecificDetails = function (replacementInfoObjects) {
    return userContextService.getUserOrganizationId() // Returns promise from $http.get
        .then(function (response) {
            // Simple work, takes 1 ms.
        })
        .catch(function () {
        });
};
Handling the promises:
Option 1: Takes ~600 ms before $q.all.then is called.
mapping-service.js:
var deferredResult = $q.defer();

var orgDetailsPromise = addOrganizationSpecificDetails(productInfoObjects);
var priceListPromise = addPriceListTypes(products, productErpIds);

$q.all([orgDetailsPromise, priceListPromise])
    .then(function () {
        deferredResult.resolve(productInfoObjects);
    }).catch(function () {
        deferredResult.reject();
    });

return deferredResult.promise;
Performance via Chrome devtools:
Option 2: Takes ~250 ms before both promises are resolved:
mapping-service.js:
var deferredResult = $q.defer();

addOrganizationSpecificDetails(productInfoObjects)
    .then(function () {
        addPriceListTypes(productInfoObjects, productErpIds)
            .then(function () {
                deferredResult.resolve(productInfoObjects);
            })
            .catch(function () {
                deferredResult.reject();
            });
    })
    .catch(function () {
        deferredResult.reject();
    });

return deferredResult.promise;
Performance via Chrome devtools:
Where does the overhead in option 1 come from? What have I missed? I'm completely stumped here. Please let me know if you need more information.
I had a very similar problem when building a custom screen for Microsoft CRM a while back. I was using $q.all() and realized that by hitting the server with multiple requests at the same time, some of them failed or took really long to resolve. Eventually we did the same thing you did: chain the requests rather than firing them all at once.
I believe this might be a similar issue to the one we had. What I am trying to say is that I am not very surprised this is the case. I am not sure what exactly caused our problem, but it was there and out of our hands (it was the online CRM, not hosted).
I know my answer does not really offer a solution, but I thought it is an insight that might give you some peace of mind.

angularjs chain http post sequentially

In my application, I store data in local storage and trigger an async http post in the background. Once it is successfully posted, the posted data is removed from local storage. While an http post is in progress, more data may be added to local storage, so I need to queue up the posts and process them sequentially, because I need to wait for local storage to be cleared by the successful posts. The task should be called recursively as long as there is data in local storage.
taskQueue: function () {
    var deferred = $q.defer();
    queue.push(deferred);
    var promise = deferred.promise;

    if (!saveInProgress) {
        // get post data from storage
        var promises = queue.map(function () {
            return $http.post(<post url>, <post data>).then(function (result) {
                // clear successful data
                deferred.resolve(result);
            }, function (error) {
                deferred.reject(error);
            });
        });
        return $q.all(promises);
    }
}
As an angular newbie, I am having problems with the above code not being sequential. How can I achieve what I intend? The queue length is unknown, and it also grows while the process is in progress. Please point me to other answers if this is a duplicate.
Async.js sounds like a good idea, but if you don't want to use a library ...
$q.all will batch up an array of requests and fire them at the same time, and it will resolve when ALL of the promises in the array resolve - WHICH IS NOT WHAT YOU WANT.
To make $http calls SEQUENTIALLY from an array, do this ....
var request0 = $http.get('/my/path0');
var request1 = $http.post('/my/path1', {data:'fred'});
var request2 = $http.get('/my/path2');
var requestArray = [];
then ...
requestArray.push(request0);
requestArray.push(request1);
requestArray.push(request2);
then ...
requestArray[0].then(function (response0) {
    // do something with response0
    return requestArray[1];
}).then(function (response1) {
    // do something with response1
    return requestArray[2];
}).then(function (response2) {
    // do something with response2
}).catch(function (failedResponse) {
    console.log("i will be displayed when a request fails (if ever)", failedResponse);
});
While having a library solution would be great (per @nstoitsev's answer), you can do this without one.
sequential requests of unknown length
Just to recap:
we do not know the number of requests
each response may enqueue another request
A few assumptions:
all requests will be working on a common data object (local storage in your case)
all requests are promises
running the queue
function postMyData(data) {
    return $http.post(<url>, data);
}

var rqsts = [];

function executeQueue() {
    if (!rqsts.length)
        // we're done
        return;
    var rqst = rqsts.shift();
    rqst.then(function (rsp) {
        // based on how you're determining if you need to do another request...
        if (keepGoing)
            rqsts.push(postMyData(<more data>));
        // keep draining the queue, one response at a time
        executeQueue();
    });
}
codepen - http://codepen.io/jusopi/pen/VaYRXR?editors=1010
I intentionally left this vague because I don't understand what the conditions for failure are, and because, if you wanted to vary the requests beyond the same $http.post call, you could pass that in some way.
and just a suggestion
As angular newbie...
Many things are moving towards this whole functional, reactive programming paradigm. Since you're relatively new to Angular, and NG2 already has some of this built in, it might be worth your attention. I think rxjs is already in many NG2 example bundles.
The easiest way to achieve this is by using Async.js. There you can find a method called mapSeries. You can run it over the queue and it will process all the elements of the array sequentially, one by one, continuing to the next element only when the corresponding callback is called.
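For illustration, a minimal sketch of that approach, assuming Async.js is loaded, queue is an array of payloads to post, and the URL and storage-clearing step are placeholders:

// Minimal sketch, assuming Async.js is available and `queue` is an array of payloads.
// mapSeries posts one item at a time and only moves on once the callback fires.
async.mapSeries(queue, function (postData, callback) {
    $http.post('/my/post/url', postData).then(function (result) {
        // remove this entry from local storage here, then continue
        callback(null, result);
    }, function (error) {
        callback(error);
    });
}, function (err, results) {
    if (err) {
        console.error('a post failed', err);
        return;
    }
    // all posts completed in order; `results` holds the responses
});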

AngularJS Execute function after a Service request ends

I am using AngularJS services in my application to retrieve data from the backend, and I would like to show a loading mask, so the loading mask starts just before sending the request. But how can I know when the request ends?
For example, I defined my services as:
angular.module('myServices', ['ngResource'])
    .factory('Clients', function ($resource) {
        return $resource('getclients');
    })
    .factory('ClientsDetails', function ($resource) {
        return $resource('getclient/:cltId');
    });
So I use them in my controller as:
$scope.list = Clients.query();
and
$scope.datails = ClientsDetails.get({
    date: $scope.selectedId
});
So the question would be: how do I know when the query and get requests end?
Edit:
As a side note, in this question I've been using angularjs 1.0.7.
In AngularJS 1.2 automatic unwrapping of promises is no longer supported unless you turn on a special feature for it (and no telling for how long that will be available).
So that means if you write a line like this:
$scope.someVariable = $http.get("some url");
When you try to use someVariable in your view code (for example, "{{ someVariable }}") it won't work anymore. Instead attach functions to the promise you get back from the get() function like dawuut showed and perform your scope assignment within the success function:
$http.get("some url").then(function successFunction(result) {
    $scope.someVariable = result;
    console.log(result);
});
I know you probably have your $http.get() wrapped inside of a service or factory of some sort, but you've probably been passing the promise you got from using $http out of the functions on that wrapper so this applies just the same there.
My old blog post on AngularJS promises is fairly popular, it's just not yet updated with the info that you can't do direct assignment of promises to $scope anymore and expect it to work well for you: http://johnmunsch.com/2013/07/17/angularjs-services-and-promises/
You can use promises to manage it, something like:
Clients.query().then(function (res) {
    // Content loaded
    console.log(res);
}, function (err) {
    // Error
    console.log(err);
});
Another way (more robust and a 'best practice') is to have Angular intercept your requests automatically by using an interceptor (see the doc here: http://docs.angularjs.org/api/ng.$http).
This can help too : Showing Spinner GIF during $http request in angular
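For illustration, a minimal sketch of the interceptor approach. It assumes the interceptors API added in later AngularJS versions (newer than the 1.0.7 mentioned in the question) and a hypothetical loadingMask service with show()/hide() methods; it counts requests in flight and shows the mask while that count is non-zero:

// Minimal sketch of an $http interceptor that toggles a loading mask.
// Assumes the `interceptors` API (AngularJS 1.1.4+) and a hypothetical
// `loadingMask` service with show()/hide() methods.
angular.module('myServices')
    .factory('loadingInterceptor', function ($q, loadingMask) {
        var pending = 0;
        return {
            request: function (config) {
                if (++pending === 1) loadingMask.show();
                return config;
            },
            response: function (response) {
                if (--pending === 0) loadingMask.hide();
                return response;
            },
            responseError: function (rejection) {
                if (--pending === 0) loadingMask.hide();
                return $q.reject(rejection);
            }
        };
    })
    .config(function ($httpProvider) {
        $httpProvider.interceptors.push('loadingInterceptor');
    });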
As pointed out in a comment by Pointy, I solved my problem by giving a second parameter (a success callback) to the get function, as follows:
$scope.datails = ClientsDetails.get({
    date: $scope.selectedId
}, function () {
    // do my stuff here
});
