I'm using an API data feed for the first time, and it has a request limit of 1000 per hour. This seemed like a lot and I didn't expect to ever exceed it, but just in testing the site I have.
Could this have anything to do with how it is being requested?
It is an Angular application, which uses a Service to call the API endpoint, but the specific endpoint is dictated by a 'team_id' property that exists in an object selected on the 'parent' page. I've used $routeParams to pull out that variable and then populate the request URL with it.
Is this a clumsy way of doing this that is causing 'phantom' requests to occur? As I said, I have no idea how the limit was exceeded, so is there anything else that could be happening here to cause unnecessary API requests in the background?
Thanks in advance.
It's going to be tough to troubleshoot unless you post the code that is doing the actual API requests. If you have any loops, intervals, or watchers triggering AJAX calls, they could easily multiply your requests into the hundreds on every page view.
Your best bet for troubleshooting is to watch the HTTP requests in the 'Network' tab of your browser's dev tools. It will list each individual request for you as it happens (as long as it is made as an HTTP request / AJAX call).
https://developer.chrome.com/devtools
https://developer.mozilla.org/en-US/docs/Tools
https://msdn.microsoft.com/en-us/library/dd565628(v=vs.85).aspx
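As a quick sanity check you can also count requests in code. Here is a minimal, framework-agnostic sketch (the names `countingWrapper` and the wrapped function are made up for illustration; in an AngularJS app you would typically register an `$http` interceptor instead):

```javascript
// Hypothetical helper: wrap any request-issuing function so every call
// is logged and counted. Useful for spotting 'phantom' requests fired
// by loops, intervals, or watchers.
function countingWrapper(requestFn) {
  let count = 0;
  const wrapped = (url, ...args) => {
    count += 1;
    console.log(`API request #${count}: ${url}`);
    return requestFn(url, ...args);
  };
  wrapped.getCount = () => count;
  return wrapped;
}
```

If the counter climbs faster than you expect on a single page view, the call sites it logs will tell you which code path is responsible.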
Related
I have the following problem: I have an API written in Symfony 3, and I have written an AngularJS front end for it. The response times are very long. For example, when I ask the API for a list of 500 events, I wait almost 30 s for them to be returned; for a list of 300 orders, I wait 17 s.
The AngularJS front end is probably not to blame, because even Postman asking for the list of events also takes 30 s. The next case is asking the API whether a ticket code has been used; the answer comes back only after about 15 s, which for such a simple request is a tragic time.
I am seriously wondering what the problem is that it takes this long. When using Postman, the frontend is ruled out entirely, because it is not involved at all.
I have no idea what the problem may be.
I'm still very new to React, so forgive me if the question is too naive. To my understanding, a React app usually talks to an API via XHR requests, so someone with a very basic tech background can easily figure out what the API looks like by watching the Network tab in the browser's debug console.
For example, people might find a page that calls the API at https://www.example.com/api/v1/product/1, and then they can just brute-force scrape product IDs 1 - 10000 to get the data for all products:
https://www.example.com/api/v1/product/1
https://www.example.com/api/v1/product/2
https://www.example.com/api/v1/product/3
https://www.example.com/api/v1/product/4
https://www.example.com/api/v1/product/5
https://www.example.com/api/v1/product/6
...
Even with user auth, one can just reuse the same cookie or token from their login session to make the calls and get the data.
So what is the best way to prevent scraping of a React app? Or maybe the API shouldn't be designed this way, and I'm just asking the wrong question?
Here are some suggestions to address the issue you're facing:
This is a common problem. You can mitigate it by using IDs that are GUIDs rather than sequentially generated integers, so the URLs cannot be enumerated.
Restricting to the same origin won't work, because someone can make the request through Postman, Insomnia, or curl.
You can also introduce rate limiting.
In addition, you can invalidate your token after a certain number of requests, or require it to be renewed, say, every 10 requests.
I think no matter what you do to the JavaScript code, reading your API endpoints is the easiest thing in the world once they are called from the browser (Wireshark is one crude example). Expect the API to be public. That said, protecting it is easier than you might anticipate.
Access-Control-Allow-Origin is your friend
Only allow requests that come from given origins. Keep in mind that CORS only restricts cross-origin requests made by browsers; it does not prevent direct access to GET routes (typing the URL into the address bar, or curl, will always work).
PHP Example
$origin = isset($_SERVER['HTTP_ORIGIN']) ? $_SERVER['HTTP_ORIGIN'] : '';
$allowed_domains = [
'http://mysite1.com',
'https://www.mysite2.com',
'http://www.mysite2.com',
];
if (in_array($origin, $allowed_domains)) {
header('Access-Control-Allow-Origin: ' . $origin);
}
Use some form of token that can be validated
This is another conventional approach, and you can find more about this here: https://www.owasp.org/index.php/REST_Security_Cheat_Sheet
Cheers!
I have a page with multiple widgets, each receiving data from a different query in the backend. Doing a request for each will consume the browser's limit on the number of parallel connections and will serialize some of them. On the other hand, doing one request that returns one combined response means it will be as slow as the slowest query (and I have no a priori knowledge of which query will be slowest).
So I want to create one request such that the backend runs the queries in parallel and writes each result as it becomes ready, and the frontend handles each result as it arrives. At the HTTP level I believe this can be just one body with several JSON documents, or maybe a multipart response.
Is there an AngularJS extension that handles the frontend side of things? Ideally something that works well with whatever can be done in a Java backend (I haven't started investigating my options there).
I have another suggestion to solve your problem, though from your question it is not very clear what you can or cannot do, so I am not sure you would be able to implement such a thing.
You could implement WebSockets and the server would be able to notify the front-end about the data being fetched or it could send the data via WebSockets right away.
In the first approach, you would send a request to the server to fetch all the data for your dashboard. Once a piece of data is available, the server would notify you and you could make a request for that particular piece; given that the data was fetched a couple of seconds ago, it could be cached on the server and the response would be fast.
The second approach seems the more reasonable one. You would make an HTTP/WebSocket request to the server and wait for the data to arrive over the WebSocket.
I believe this would be the most robust and efficient way to implement what you are asking for.
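The front-end side of the second approach can be sketched as a message router: the server pushes each widget's data over the WebSocket as soon as its query finishes, tagged with a widget name. The `{ widget, data }` message shape and the function names here are assumptions, not a fixed protocol:

```javascript
// Route each incoming WebSocket message to the widget it belongs to,
// so every widget renders as soon as its own query completes.
function makeDashboardHandler(widgets) {
  return function onMessage(rawMessage) {
    const msg = JSON.parse(rawMessage);
    if (widgets[msg.widget]) {
      widgets[msg.widget].render(msg.data); // update just that widget
    }
  };
}

// Wiring it up in the browser would look roughly like:
//   const ws = new WebSocket('wss://example.com/dashboard');
//   ws.onmessage = (e) => handler(e.data);
```

In an AngularJS app the `render` calls would typically update scope properties inside `$scope.$apply` so the bindings refresh.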
https://github.com/dfltr/jQuery-MXHR
This plugin allows you to parse a response that contains several parts (multipart) by providing a callback to parse each part. This can be used in all our frontends to support responses carrying multiple pieces of data (widgets) in one request. The server side will receive one request and use Servlet 3 async support (or whatever exists in other languages) to 'park' it, sending multiple queries and writing each response to the request as each query returns (with the right multipart boundary).
Another example can be found here: https://github.com/anentropic/stream.
While neither of these may be directly compatible with AngularJS, the code does not look complex to port.
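The core of what jQuery-MXHR does — buffering an in-progress response and emitting each part as its boundary arrives — is small enough to sketch directly. This is an assumption-laden toy (a plain boundary string between JSON parts, not a real MIME multipart parser):

```javascript
// Feed chunks of a streamed response (e.g. from xhr.onprogress or a
// fetch() reader) and invoke onPart for each complete boundary-delimited
// JSON part, as soon as it is complete.
function makePartParser(boundary, onPart) {
  let buffer = '';
  return function feed(chunk) {
    buffer += chunk;
    let idx;
    while ((idx = buffer.indexOf(boundary)) !== -1) {
      const part = buffer.slice(0, idx).trim();
      buffer = buffer.slice(idx + boundary.length);
      if (part) onPart(JSON.parse(part)); // one widget's data is ready
    }
  };
}
```

Note that a part split across two network chunks is handled naturally: it just sits in `buffer` until its boundary shows up.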
I have a cart in my application. It currently works for me, but the issue my colleague is having is that it is too "slow". I have been thinking of a better way to implement this to make it faster and more efficient.
Currently this is how my page loads:
Product/Ticket page loads.
An AJAX function gets products/tickets from the server and displays them on the page.
Each product has a buy button like this:
<button ng-click="buyTicket(id)" class="btn">Buy Ticket</button>
This is how buyTicket works:
I pass the id of the product/ticket to my function.
AJAX function posts id to server.
Server runs a database query to retrieve product/ticket information based on id.
Product/ticket information is saved into "cart" table.
AJAX function returns with data "true" as the response.
I broadcast this:
$rootScope.$broadcast('TICKET_ADDED', true);
Cart directive listens to broadcast and makes an AJAX call to server to get cart data:
$scope.$on('TICKET_ADDED', function(event, data) {
    $scope.loadCart();
});
Data returned is assigned to an array and displayed in the cart.
Each user cart is identified by a randomly generated string of length 25.
That is how my cart works for now; I would like a faster, better way to do this, please.
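One straightforward improvement to the flow described above is to cut the second round trip: instead of returning "true", broadcasting, and then re-fetching the cart, have the buy endpoint return the updated cart in its response. A framework-free sketch of the idea (`server` stands in for the real AJAX layer, and the response shape is an assumption about what your backend could return):

```javascript
// Before: POST buy -> "true" -> broadcast -> GET cart  (two round trips)
// After:  POST buy -> { success, cart }                (one round trip)
function buyTicket(id, server, scope) {
  const response = server.buy(id);   // the single AJAX request
  if (response.success) {
    scope.cart = response.cart;      // no follow-up loadCart() needed
  }
  return response.success;
}
```

In the real AngularJS version the `server.buy` call would be an `$http.post` whose `.then` handler assigns the returned cart, and the `TICKET_ADDED` broadcast/`loadCart` pair disappears entirely.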
Edit: (used a debug bar to get these statistics)
When page loads:
No. of queries run: 1
Memory Usage: 12.25 MB
Request Duration: 1.04s
No. of AJAX requests: 3
When buy ticket function is clicked:
No. of queries run: 5
Memory Usage: 12.75 MB
Request Duration: 934.41 ms
No. of AJAX requests: 2
The approach you are using is fine; the thing is you might not have used caching. Use caching and you will get good speed. Also check your server response time, speed, etc. with Google PageSpeed Insights. It will tell you how you can make it faster. Hope it helps.
vJ
You can improve performance by introducing caching, both server side and client side.
Server-side caching: instead of doing a DB query every time, keep objects in memory for some period of time. You can define a 'time to live' for each object, and if the object has 'expired', you requery the DB.
There are probably plenty of libraries out there that support this kind of functionality, but I would simply build it myself, because it's not that complicated. The trickiest part is making sure that nothing breaks when multiple threads try to modify your collection of cached objects at once.
Client side caching is a breeze when you use angular. This is from the documentation of $http:
To enable caching, set the request configuration cache property to true (to use the default cache) or to a custom cache object (built with $cacheFactory). When the cache is enabled, $http stores the response from the server in the specified cache. The next time the same request is made, the response is served from the cache without sending a request to the server.
I've been working on a REST implementation with my existing Cake install, and it's looking great except that I want to use HTTP Digest Authentication for all requests (Basic Auth won't cut it). So great, I'll generate a header in the client app (which is not cake) and send it to my cake install. Only problem is, I can't find a method for extracting that Digest from the request...
I've looked through the Cake API for something that I can use to get the Digest Header. You'd think that Request Handler would be able to grab it, but I can't find anything resembling that.
There must be another method of getting the digest that I am overlooking?
In the meantime I'm writing my own regex to parse it out of the Request... once I'm done I'll post it here so no one has to waste as much time as I did hunting for it.
Figured it out. It's already accessible via PHP as $_SERVER['PHP_AUTH_DIGEST']
So then you pass it to parseDigestAuthData($_SERVER['PHP_AUTH_DIGEST']);
<bangs head against wall>