My application has a cart that currently works, but my colleague's complaint is that it is too "slow". I have been thinking of a better way to implement it to make it faster and more efficient.
Currently this is how my page loads:
Product/Ticket page loads.
AJAX function gets products/tickets from server and displays on the page.
Each product has a buy button like this:
<button ng-click="buyTicket(id)" class="btn">Buy Ticket</button>
This is how buyTicket works:
I pass the id of the product/ticket to my function.
AJAX function posts id to server.
Server runs a database query to retrieve product/ticket information based on id.
Product/ticket information is saved into "cart" table.
AJAX function returns with data "true" as the response.
I broadcast this:
$rootScope.$broadcast('TICKET_ADDED', true);
The cart directive listens for the broadcast and makes an AJAX call to the server to get the cart data:
$scope.$on('TICKET_ADDED', function (event) {
    $scope.loadCart();
});
Data returned is assigned to an array and displayed in the cart.
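Put together, those pieces presumably look roughly like this (a sketch only; the endpoint URLs and the loadCart body are my assumptions, not the actual code):

// In the product/ticket controller: post the id, then notify listeners
$scope.buyTicket = function (id) {
    $http.post('/cart/add', { id: id }).then(function (response) {
        if (response.data === true) {
            $rootScope.$broadcast('TICKET_ADDED', true);
        }
    });
};

// In the cart directive: fetch the cart contents and assign them for display
$scope.loadCart = function () {
    $http.get('/cart').then(function (response) {
        $scope.cartItems = response.data;
    });
};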
Each user cart is identified by a randomly generated string of length 25.
That is how my cart works for now; I would like a faster, better way to do this, please.
Edit (I used a debug bar to get these statistics):
When page loads:
No. of queries run: 1
Memory Usage: 12.25 MB
Request Duration: 1.04s
No. of AJAX requests: 3
When buy ticket function is clicked:
No. of queries run: 5
Memory Usage: 12.75 MB
Request Duration: 934.41 ms
No. of AJAX requests: 2
The approach you are using is fine; the main thing you might be missing is caching. Use caching and you will get a good speed boost. Also check your server response time and page speed on Google PageSpeed Insights; it will tell you how you can make the page faster. Hope it helps.
You can improve performance by introducing caching, both server side and client side.
Server-side caching: instead of doing a DB query every time, keep objects in memory for some period of time. You can define a 'time to live' for an object, and if the object has 'expired', you re-query the database.
There are probably plenty of libraries out there that support this kind of functionality, but I would simply build it myself, because it's not that complicated. The trickiest part is making sure that nothing breaks down when multiple threads are trying to modify your collection of cached objects.
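A minimal sketch of such a time-to-live cache, assuming a JavaScript (Node-style) backend; the function and variable names are mine, and this single-threaded version sidesteps the locking concerns mentioned above:

// TTL cache: returns a cached value if it hasn't expired,
// otherwise calls loadFn to re-query the database and stores the result.
function createTtlCache(ttlMs) {
    var entries = {}; // key -> { value, expiresAt }
    return {
        get: function (key, loadFn) {
            var entry = entries[key];
            var now = Date.now();
            if (entry && entry.expiresAt > now) {
                return Promise.resolve(entry.value); // still fresh
            }
            // missing or expired: re-query and cache the result
            return Promise.resolve(loadFn()).then(function (value) {
                entries[key] = { value: value, expiresAt: now + ttlMs };
                return value;
            });
        }
    };
}

// Usage: cache a product row for 60 seconds
var productCache = createTtlCache(60 * 1000);
productCache.get('product:' + id, function () {
    return queryProductFromDb(id); // hypothetical DB call
});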
Client-side caching is a breeze when you use Angular. This is from the documentation of $http:
To enable caching, set the request configuration cache property to true (to use default cache) or to a custom cache object (built with $cacheFactory). When the cache is enabled, $http stores the response from the server in the specified cache. The next time the same request is made, the response is served from the cache without sending a request to the server.
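For the cart above, that could be as simple as the following (a sketch; assumes $http and $cacheFactory are injected, and the URLs are placeholders):

// Served from $http's default cache after the first call
$http.get('/products', { cache: true }).then(function (response) {
    $scope.products = response.data;
});

// Or use a custom cache that you can invalidate when the cart changes
var cartCache = $cacheFactory('cartCache');
$http.get('/cart', { cache: cartCache }).then(function (response) {
    $scope.cartItems = response.data;
});
cartCache.removeAll(); // e.g. after a ticket is added, so the next load is fresh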
Related
I'm sending data from my backend every 10 seconds and I want to display that data in React. I've searched the net and the common suggestion is to use socket.io to display real-time data. Is there a better way to do it?
If you're dead set on updating your data every 10 seconds, it would make more sense to poll from the client, since HTTP requests can only be opened from client to server. With plain HTTP polling you won't need socket.io, though socket.io is an easy alternative if you need much faster updates.
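A minimal polling sketch in React (the endpoint URL and data shape are assumptions):

import { useEffect, useState } from 'react';

function LiveData() {
    const [data, setData] = useState(null);

    useEffect(() => {
        // fetch once immediately, then every 10 seconds
        const fetchData = () =>
            fetch('/api/data') // assumed endpoint
                .then((res) => res.json())
                .then(setData)
                .catch(console.error);

        fetchData();
        const timer = setInterval(fetchData, 10000);
        return () => clearInterval(timer); // stop polling on unmount
    }, []);

    return <pre>{JSON.stringify(data)}</pre>;
}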
Depending on how you are generating the data being sent from your backend, specifically if you are using a database, there is most likely a way to subscribe to changes in the database. That would update the data in real time, without a 10-second delay.
If you want a more detailed answer, you'll have to provide more detail regarding your question: what data are you sending? where is it coming from or how are you generating it?
I'm working on an autodialer feature: when I trigger a button from the frontend (built with React), an agent gets a call, and then all the leads in that agent's assigned portal automatically receive back-to-back calls from the agent's number. Because the process is automatic, the agent won't know who has been called, so I want to establish a real-time connection so that I can show a popup on the frontend with information about the lead who was called.
I'm a basic Web API developer. I have almost 10,000 records of data. Because it is a lot of data, it takes a long time to load, so the front-end developer asked me to provide an API where he can pass the number of records to return per scroll.
So my question is: should data loading while scrolling be done by the front-end developer or the Web API developer? If it belongs on the Web API side, how can I do that?
Please help me!!!
Thanks in advance.
You need to do it on both the client side and the server side. Plan your database table so that it supports paging and you can retrieve the data in chunks. For example: select * from yourTable where id between 1 and 50.
In Angular, you have to use an event that fires every time you scroll down and calls the Web API service. You need to keep track of the data you already have and the data you are about to get, and send the indexes with every request; a sketch follows the link below.
A nice Angular library for this: https://sroze.github.io/ngInfiniteScroll/demo_basic.html
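The client-side bookkeeping could look something like this (a sketch; the endpoint and response shape are assumptions):

$scope.items = [];
$scope.page = 1;
$scope.loading = false;

// called from the scroll event (e.g. ngInfiniteScroll's infinite-scroll directive)
$scope.loadMore = function () {
    if ($scope.loading) return; // don't fire twice while a request is in flight
    $scope.loading = true;
    $http.get('/api/records', { params: { page: $scope.page } }).then(function (response) {
        $scope.items = $scope.items.concat(response.data); // keep what we already have
        $scope.page++; // the next scroll fetches the next page
        $scope.loading = false;
    });
};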
Basically, the front-end developer sends a request for data to the API with a pagination parameter. For example, the first request looks like:
http://example.com?page=1
Here the API should return, say, records 1-20. For the second request the page number is incremented (http://example.com?page=2), so the API returns records 21-40, and so on.
The front-end developer may also pass the number of records required per request, in which case you should return exactly as many records as were requested.
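On the server, the page number simply translates into an offset over the query. A sketch in Node/Express for illustration (the route, helper, and defaults are assumptions; the same offset/limit logic applies in ASP.NET Web API or any other stack):

const express = require('express');
const app = express();

app.get('/records', (req, res) => {
    const page = parseInt(req.query.page, 10) || 1;      // ?page=1, ?page=2, ...
    const pageSize = parseInt(req.query.size, 10) || 20; // optional per-request size
    const offset = (page - 1) * pageSize;

    // e.g. SQL: SELECT * FROM records ORDER BY id LIMIT ? OFFSET ?
    getRecords(offset, pageSize) // hypothetical data-access call
        .then((rows) => res.json(rows));
});

app.listen(3000);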
I'm using an API data feed for the first time, and it has a request limit of 1000 per hour. This seemed like a lot and I didn't expect to ever exceed it, but just in testing the site I have.
Could this be anything to do with how it is being requested?
It is an Angular application that uses a service to call the API endpoint, but the specific endpoint is dictated by a 'team_id' property on an object selected on the 'parent' page. I've used $routeParams to pull out that variable and populate the request URL with it.
Is this a clumsy way of doing it that is causing 'phantom' requests? As I said, I have no idea how the limit was exceeded, so is there anything else that could be happening here to cause unnecessary API requests in the background?
Thanks in advance.
It's going to be tough to troubleshoot unless you post your code that is doing the actual API requests. If you have any loops happening, intervals, or ajax, that could easily multiply your requests into the hundreds on every page view.
Your best bet to troubleshoot is to look at your browser debugger and just look at the http requests in the 'network' tab of your browser dev tools. It will list each individual request for you as they happen (if they are done as an http request / AJAX).
https://developer.chrome.com/devtools
https://developer.mozilla.org/en-US/docs/Tools
https://msdn.microsoft.com/en-us/library/dd565628(v=vs.85).aspx
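If you would rather count the requests from inside the app itself, one option (my suggestion, not something from the question) is an Angular $http interceptor that logs every outgoing request, which quickly exposes loops or intervals that multiply calls:

// Register in your module config ('app' is your module)
app.config(function ($httpProvider) {
    $httpProvider.interceptors.push(function () {
        var count = 0;
        return {
            request: function (config) {
                count++;
                console.log('Request #' + count + ': ' + config.url);
                return config;
            }
        };
    });
});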
I'm posting data on my ElasticSearch database.
I've noticed that data is not immediately available; it takes some milliseconds to show up in a GET request. I can live with that (after all, the calls are asynchronous, so this behavior is expected), but in my test code I need to POST some data and retrieve it immediately afterwards. At the moment I'm using a sleep(5) just to be sure the data is available, but how can I synchronize with the DB?
To ensure the data is available, you can make a refresh request to the corresponding index before the GET/SEARCH:
http://localhost:9200/your_index/_refresh
Or refresh all indexes:
http://localhost:9200/_refresh
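In test code that could look like this (a sketch using fetch; the index name and document body are placeholders, and the exact document endpoint varies by Elasticsearch version):

const base = 'http://localhost:9200';

async function postAndRead(doc) {
    // index the document
    await fetch(base + '/your_index/_doc', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(doc),
    });
    // force a refresh so the document is visible to search immediately
    await fetch(base + '/your_index/_refresh', { method: 'POST' });
    // now a GET/search will see it, no sleep(5) needed
    const res = await fetch(base + '/your_index/_search');
    return res.json();
}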
After reading about how CodeIgniter handles sessions, it has me concerned about the performance impact when sessions are configured to be stored and retrieved from the database.
This is from the CI documentation: "When session data is available in a database, every time a valid session is found in the user's cookie, a database query is performed to match it."
So every AJAX call, every HTML fragment I request is going to have this overhead? That is potentially a huge issue for systems that are trying to scale!
I would have guessed that CI would have implemented it better: include an MD5 hash covering both the session ID and timestamp when encoding them in the session record, then only check the database for the session record every X minutes, when the session ID gets regenerated. Am I missing something?
You can make your AJAX requests use a different controller, for example Ajax_Controller instead of MY_Controller. MY_Controller would load the Session class, but Ajax_Controller wouldn't. That way your AJAX calls never touch session data and therefore never make unnecessary database queries.
If you are autoloading the Session class, maybe you can try unloading it for AJAX requests. I've never tried it, but it's discussed here: http://codeigniter.com/forums/viewthread/65191/#320552. Then do something like this:
if ($this->input->is_ajax_request()) {
    // unload session class code goes here
}