I'm wondering if it's possible to manage the following scenario:
I'm using Laravel as an API and AngularJS as the frontend. I have a feature that lets the user search over a list of customers, but I'm running into a problem: every letter the user types triggers a request to the Laravel API, and sometimes the last request gets its response back faster than the first one.
So the final result displayed to the user is the response to the first request, which is wrong because the user has already finished typing the entire customer name.
My objective is this:
If the user types the first letter, send a request to the API.
If the user types a second letter and the previous response has not arrived yet, cancel the previous request and send a new one.
If the user types a third letter and the second request has already responded, just send a new request; if the previous response has not arrived yet, cancel the previous request and send a new one.
I'm not sure if my example is clear, but I have seen similar behaviour on many websites.
I found this: Cancelling $http request, but it looks like that is bad practice.
How can I do this? Any clue will be really appreciated. Thanks in advance!
Your requirement can be met quite easily by setting a boolean variable to true right before the API call is started, and setting it to false when it responds. And then you can use this variable to check if a request is already in progress.
However, I think what you want is actually a little different than what you describe. It makes more sense to wait until the user stops typing before sending the request. In Angular you can do that quite easily with the debounce setting:
<input ng-model="params.q" ng-change="doApiCall()" ng-model-options="{ debounce: 500 }" type="text">
Edit: I now see that your requirement is a little different: you want to cancel the running request. There may be ways to cancel a running AJAX call, but you certainly can't guarantee it. What should be doable is to queue the calls so that a new request starts right after the last one has finished.
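For what it's worth, AngularJS does document a way to abort a running $http call: pass a promise as the timeout config option and resolve it when a newer request supersedes the old one. A minimal sketch, assuming $http and $q are injected and reusing the doApiCall/params.q names from above (the endpoint is an assumption):

var canceller = null;

$scope.doApiCall = function () {
  if (canceller) {
    canceller.resolve();               // abort the previous, still-running request
  }
  canceller = $q.defer();

  $http.get('/api/customers', {        // endpoint name is an assumption
    params: { q: $scope.params.q },
    timeout: canceller.promise         // $http aborts the call when this promise resolves
  }).then(function (response) {
    $scope.customers = response.data;
  }, function () {
    // cancelled or failed: ignore, a newer request is already on its way
  });
};

Even if the server still processes the aborted call, the stale response never reaches the success handler, which is what matters for the UI.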
If a request fails, an HTTP POST is normally not idempotent (executing the failed request again might cause multiple inserts). What do you think about using the user's session id as the UUID v5 "namespace" and the JSON payload as the "name"? That would produce the same ID for repeated requests, and the database would reject the additional insert.
There are APIs that specifically mark HTTP methods that are otherwise non-idempotent as idempotent.
POST being non-idempotent by default does not mean it isn't allowed to be idempotent; it just means that generic clients can't assume it is.
The best implementation I've seen is the Stripe API, which uses an Idempotency-Key HTTP header. The client defines it, and if two requests arrive with an identical key, Stripe knows how to handle the second one. I think this is a better approach than trying to construct a hash of the request: two requests looking identical does not mean their effects are the same. Consider, for example, this POST request:
POST /increment
Content-Type: application/json
{ "increment-by": 2 }
If I send this request twice, I expect some counter to be incremented to 4, even though the request body was the same both times.
The Idempotency-Key lets the client control and inform the server whether two requests were actually the same.
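To make that concrete, a retry of the example above would carry the same client-generated key on both attempts, so the server can recognize the second one as a duplicate (the key value here is just a made-up UUID):

POST /increment
Content-Type: application/json
Idempotency-Key: 0d2c8f6a-91b4-4c2e-8d3f-5a7e1b9c4d20

{ "increment-by": 2 }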
https://stripe.com/blog/idempotency
Followups:
Do I store the Idempotency-key as a separate column on the record?
I would be inclined to implement this feature globally as some kind of middleware.
Storing the Idempotency-Key in something like Redis carries the risk of two realities (e.g. the server creates the db record and crashes before writing to Redis).
Use a transaction.
All you have to store about the key is that you've seen it before, and you only have to store it if the request was successful.
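A rough sketch of that combination (middleware-ish helper plus a transaction), written here as Express-style JavaScript with a hypothetical db helper standing in for your data layer; every name below is an assumption, not a reference implementation:

// Record the key inside the same transaction as the real write, so there is
// no window where the record exists but the key was never stored (or vice versa).
function withIdempotency(key, doWrite) {
  return db.transaction(function (tx) {
    // A UNIQUE constraint on idempotency_keys.key makes a replayed request
    // fail right here, before any other write has happened.
    return tx.insert('idempotency_keys', { key: key })
      .then(function () { return doWrite(tx); });
  });
}

// Usage in a hypothetical route handler:
app.post('/orders', function (req, res) {
  withIdempotency(req.get('Idempotency-Key'), function (tx) {
    return tx.insert('orders', req.body);
  }).then(function (order) {
    res.status(201).json(order);
  }).catch(function () {
    res.status(409).json({ error: 'duplicate request' });
  });
});

Only a successful transaction commits the key, which matches the point above: a failed request leaves no trace, so the client can safely retry with the same key.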
I have a page with multiple widgets, each receiving data from a different query in the backend. Issuing a request per widget will exhaust the limit the browser puts on the number of parallel connections and will serialize some of them. On the other hand, issuing one request that returns one combined response means it will be as slow as the slowest query (and I have no a priori knowledge of which query will be slowest).
So I want to make one request, have the backend run the queries in parallel and write each result as it becomes ready, and have the frontend handle each result as it arrives. At the HTTP level I believe this could be one body containing several JSON documents, or maybe a multipart response.
Is there an AngularJS extension that handles the frontend side of this? Ideally something that works well with whatever can be done in a Java backend (I haven't started investigating my options there).
I have another suggestion to solve your problem, although I am not sure you would be able to implement it, since it is not very clear from your question what you can or cannot do.
You could implement WebSockets: the server could either notify the front-end as each piece of data is fetched, or send the data over the WebSocket right away.
In the first approach, you would send a request to the server to fetch all the data for your dashboard. Once a piece of data is available, you would make a request for that particular piece; since the data was fetched a couple of seconds earlier, it could be cached on the server and the response would be fast.
The second approach seems the more reasonable one: you make an HTTP/WebSocket request to the server and wait for the data to arrive over the WebSocket.
I believe this would be the most robust and efficient way to implement what you are asking for.
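A minimal client-side sketch of that second approach, assuming the server tags every message with the widget it belongs to (the endpoint, message shape, and renderWidget function are assumptions):

// One socket, one logical request; each widget renders as soon as its query finishes.
var socket = new WebSocket('wss://example.com/dashboard');

socket.onopen = function () {
  socket.send(JSON.stringify({ type: 'load-dashboard' }));
};

socket.onmessage = function (event) {
  var message = JSON.parse(event.data);          // e.g. { widget: 'sales', data: {...} }
  renderWidget(message.widget, message.data);    // hypothetical per-widget render function
};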
https://github.com/dfltr/jQuery-MXHR
This plugin lets you parse a response that contains several parts (multipart) by providing a callback for each part. It can be used in all our frontends to support responses that carry data for multiple widgets in one request. The server side receives one request and uses Servlet 3 async support (or whatever exists in other languages) to 'park' it, fires off the queries, and writes each result to the response as each query returns (with the right multipart boundary).
Another example can be found here: https://github.com/anentropic/stream.
While neither of these may be directly compatible with AngularJS, the code does not look complex to port.
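The porting effort is essentially this: read responseText on every progress event, split on the agreed boundary, and hand each completed part to a callback. A rough sketch (not the plugin's actual code; the boundary string, endpoint, and handler are assumptions):

// Process each part of a single streaming response as it arrives.
var BOUNDARY = '--PART--';            // must match whatever the backend writes between parts
var xhr = new XMLHttpRequest();
var processed = 0;

xhr.open('GET', '/api/dashboard');
xhr.onprogress = function () {
  var parts = xhr.responseText.split(BOUNDARY);
  // The last element may still be a half-received part, so leave it for later.
  while (processed < parts.length - 1) {
    if (parts[processed].trim()) {
      handleWidgetData(JSON.parse(parts[processed]));   // hypothetical per-widget handler
    }
    processed++;
  }
};
xhr.send();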
I'm using an API data feed for the first time, and it has a request limit of 1,000 per hour. That seemed like a lot and I didn't expect to ever exceed it, but just while testing the site I have.
Could this be anything to do with how it is being requested?
It is an Angular application that uses a service to call the API endpoint, but the specific endpoint is dictated by a 'team_id' property on an object selected on the 'parent' page. I've used $routeParams to pull out that variable and then populate the request URL with it.
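Roughly, the setup is something like this (simplified, with assumed names rather than the actual code):

// Controller: read team_id from the route and hand it to the service.
app.controller('TeamCtrl', function ($scope, $routeParams, TeamService) {
  TeamService.getTeam($routeParams.team_id).then(function (response) {
    $scope.team = response.data;
  });
});

// Service: one $http call per lookup, with team_id interpolated into the endpoint.
app.factory('TeamService', function ($http) {
  return {
    getTeam: function (teamId) {
      return $http.get('/api/teams/' + teamId);
    }
  };
});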
Is this a clumsy way of doing it that could be causing 'phantom' requests? As I said, I have no idea how the limit was exceeded, so is there anything else happening here that could cause unnecessary API requests in the background?
Thanks in advance.
It's going to be tough to troubleshoot unless you post the code that makes the actual API requests. If you have any loops, intervals, or AJAX calls involved, they could easily multiply your requests into the hundreds on every page view.
Your best bet is to look at the HTTP requests in the 'Network' tab of your browser's dev tools. It lists each individual request as it happens (as long as it is done as an HTTP request / AJAX).
https://developer.chrome.com/devtools
https://developer.mozilla.org/en-US/docs/Tools
https://msdn.microsoft.com/en-us/library/dd565628(v=vs.85).aspx
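If the network tab doesn't make it obvious, another cheap debugging aid is an Angular $http interceptor that logs and counts every outgoing request the app itself fires; a minimal sketch, purely for debugging (remove it once you've found the culprit):

// Log every outgoing $http request with a running counter.
app.config(function ($httpProvider) {
  $httpProvider.interceptors.push(function () {
    var count = 0;
    return {
      request: function (config) {
        console.log('API request #' + (++count) + ': ' + config.method + ' ' + config.url);
        return config;            // always hand the config back, or the call never fires
      }
    };
  });
});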
I need to know how to update the UI with the status of a batch update operation. For example, I send a request to a Web API endpoint to update multiple records (could be any number of them), and I want to show the status of each record on the client side.
Please suggest the best way of doing that. I am using Web API and Angular. I am thinking about implementing SignalR to update the client UI with the status, but is there another way of doing it?
SignalR may be overkill for what could be a simple polling request to your API.
It's a little ambiguous exactly what your batch update operation is doing, but since you're looking to get status back, I'm going to assume it's a long, slow operation that you're not waiting on for a response in the initial request.
Web API has REST principles baked in from the start, so I imagine your batch update operation is a PUT with a set of objects that need to be updated. If so, you could simply request those objects back from your API to check their state and see whether the operation has updated them yet.
If you're not doing a simple PUT on an entity and it's more like a POST that submits a batch operation, you should persist an operation entity, return a reference to it in the initial call, and then poll for that operation by id to get its current status.
SignalR might make things feel a little more real-time by immediately pushing completed events down to the client, but it also brings a lot more overhead for what you are trying to achieve.
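A minimal sketch of the polling variant, assuming $http and $interval are injected and the API returns an operation id on submit plus a status endpoint for it (the endpoint paths and response shape are assumptions):

// Submit the batch, then poll the operation until every record reports a final status.
$http.post('/api/batch-updates', records).then(function (response) {
  var operationId = response.data.id;

  var poll = $interval(function () {
    $http.get('/api/batch-updates/' + operationId).then(function (res) {
      $scope.recordStatuses = res.data.records;   // e.g. [{ id: 1, status: 'done' }, ...]
      if (res.data.completed) {
        $interval.cancel(poll);                   // stop polling once the batch is finished
      }
    });
  }, 2000);
});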
I have implemented Google App Engine's Channel API in my application. Everything runs smoothly. I create a new channel every hour for every user, and I have managed to maintain one channel per session (the same channel for different tabs in a browser). I have implemented the onerror and onclose methods so that every time they are invoked, a call is made to the server requesting a valid token.
Sometimes, after the channel has been alive for a while, it gets disconnected. I can see failed HTTP calls to talkgadget.google.com in the JavaScript console. The URLs look something like this:
https://129.talkgadget.google.com/talkgadget/dch/bind?VER=8&clid=.....
These calls have responses like "401 (Token timed out)" or "401 (Token invalid)".
Which is indeed true: the token used by the client is invalid. It should get updated with the new token, but the onerror and onclose methods aren't invoked. How am I supposed to detect when this happens, or handle it? There is no real way to tell whether a client is disconnected except through the onerror and onclose methods. The issue goes away if I refresh the page (I fetch the valid token from the database every time the user refreshes).
I checked the socket object's readyState property and it had the value 1. Many people face this issue, and as of today there seems to be no valid solution offered by the folks at GAE.
Edit: I'm a premium account holder and this issue is holding back our deployments.
Edit 2: Having one channel per tab reduces the frequency of this happening. But it doesn't solve the problem completely.
It has been six days since I posted the question and there has been no response from the AppEngine team or any other users.
The workaround I applied was to add a button on the site that fetches the (valid) token from the database, closes the channel, and then opens it again with the token received.
Sometimes it's a new token that should have been received earlier; sometimes it's the same token that had been valid all along.
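For reference, the reconnect button boils down to something like this; the token endpoint, the $http-style fetch, and the handler names are assumptions, while the Channel API calls (goog.appengine.Channel, channel.open(), socket.close()) are the documented ones:

// Fetch the currently valid token, drop the stale socket, open a fresh channel.
var activeSocket = null;

function reconnectChannel() {
  $http.get('/api/channel-token').then(function (response) {
    if (activeSocket) {
      activeSocket.close();                        // close the old channel first
    }
    var channel = new goog.appengine.Channel(response.data.token);
    activeSocket = channel.open();
    activeSocket.onmessage = handleMessage;        // existing handlers, re-attached
    activeSocket.onerror = requestNewToken;
    activeSocket.onclose = requestNewToken;
  });
}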
I agree this issue cannot be replicated often, but when it does happen it causes a lot of damage. I hope I find a solution soon.