Logic in Web API to load data while scrolling - AngularJS

I'm a basic Web API developer. I have almost 10,000 records of data. Because it is a large data set, it takes a long time to load. So the front-end developer asked me to provide an API in such a way that he can pass the number of records he wants per scroll.
So my question is: should data loading while scrolling be done by the front-end developer or by the Web API developer? And if it is on the Web API side, how can I do it?
Please help me!!!
Thanks in advance.

You need to do it on both the client side and the server side. You need to plan your database table so that it gives you options for paging, so you can retrieve the data in chunks. For example: select * from yourTable where id between 1 and 50.
In Angular, you have to use an event that fires every time you scroll down and then calls the Web API service. You need to keep track of the data you already have and the data you are about to get, and send the indexes every time.
A nice Angular link for this: https://sroze.github.io/ngInfiniteScroll/demo_basic.html
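A minimal sketch of the client side, assuming the API exposes page and pageSize query parameters (the endpoint and parameter names are illustrative, not from the question):

    // Illustrative AngularJS controller: ngInfiniteScroll calls loadMore()
    // whenever the user scrolls near the bottom, and each call fetches the
    // next chunk from the API.
    var app = angular.module('recordsApp', ['infinite-scroll']);

    app.controller('RecordsCtrl', function ($scope, $http) {
      $scope.records = [];
      var page = 1;
      var pageSize = 50;       // records requested per scroll
      var loading = false;

      $scope.loadMore = function () {
        if (loading) return;   // don't fire overlapping requests
        loading = true;
        $http.get('/api/records', { params: { page: page, pageSize: pageSize } })
          .then(function (response) {
            $scope.records = $scope.records.concat(response.data);
            page++;            // the next scroll fetches the following chunk
            loading = false;
          });
      };
    });

In the template, the list container gets the directive, e.g. <div infinite-scroll="loadMore()"> ... </div>, the same idea as the demo linked above.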

Basically, the front-end developer sends a request for data to the API with a pagination parameter. For example:
The first request looks like
http://example.com?page=1
Here the API should return, say, records 1-20. For the second request the page number is incremented, e.g. http://example.com?page=2, so the API returns records 21-40, and so on.
The front-end developer may also pass the number of records required for each request, so you have to send back exactly the data that was requested.
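Whatever the server stack, the arithmetic behind those requests is the same. A small sketch (page and pageSize are assumed names; in ASP.NET Web API the same idea is usually expressed with Skip/Take on the query):

    // Translate a 1-based page number and a page size into a record range.
    function pageToRange(page, pageSize) {
      var offset = (page - 1) * pageSize;  // records to skip
      return { offset: offset, limit: pageSize };
    }

    // page=1, pageSize=20 -> skip 0,  return records 1-20
    // page=2, pageSize=20 -> skip 20, return records 21-40 (as described above)
    pageToRange(2, 20); // { offset: 20, limit: 20 }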

Related

Display realtime data in reactjs

I'm sending data from my backend every 10 seconds and I want to display that data in React. I've searched the net and found suggestions to use socket.io for displaying real-time data. Is there a better way to do it?
If you're dead set on updating your data every 10 seconds, it would make more sense to make a request from the client to the server, as HTTP requests can only be opened from client to server. By using HTTP requests you won't need socket.io, but socket.io is an easy alternative if you need much faster updates.
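If you stay with plain HTTP, a minimal polling sketch could look like this (the /api/data endpoint is an assumption for illustration):

    // Illustrative React component that polls the backend every 10 seconds.
    import React, { useEffect, useState } from 'react';

    function LiveData() {
      const [data, setData] = useState(null);

      useEffect(() => {
        const load = () =>
          fetch('/api/data')                    // hypothetical endpoint
            .then((res) => res.json())
            .then(setData);

        load();                                 // initial fetch
        const id = setInterval(load, 10000);    // poll every 10 seconds
        return () => clearInterval(id);         // stop polling on unmount
      }, []);

      return <pre>{JSON.stringify(data, null, 2)}</pre>;
    }

    export default LiveData;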
Depending on how you are generating the data being sent from your backend, specifically if you are using a database, there is most likely a way to subscribe to changes in the database. This would actually update the data in realtime, without a 10 second delay.
If you want a more detailed answer, you'll have to provide more detail regarding your question: what data are you sending? where is it coming from or how are you generating it?
I'm working on an autodialer feature: an agent gets a call when I trigger the button from the frontend (built with React), and then all the leads in the agent's assigned portal automatically receive back-to-back calls from the agent's number. Because this process is automatic, the agent won't know who has been called, so I want to establish a real-time connection so that I can show a popup on the frontend with information about the lead that was called.

Pattern/approach for frontend-backend data integration and fetching

I have been applying a pattern in some small apps I have worked on, using the following tools:
Frontend: ReactJS, Redux, Firebase (only authentication stuff)
Backend: Node with Express and the library for the database I was using (mysql or mongo)
The flow used to be like this:
The page loads and validates whether the user is signed in. If yes, the app data is fetched from the backend. If not, it just checks that the user is not on a protected route and lets them navigate (create account, reset password, etc.).
When the fetch happens and the user is signed in, the backend sends all of that user's data (categories, stores, products, profile and any other info needed to let the application run fluidly), so that while navigating there is no continuous loading to fetch each chunk of data. It also works this way because it is not a big amount of data (at the beginning); let's say maybe 1 or 2 stores with 15-20 products each.
When updating data, like changing a product name, price or any other kind of data mutation, the frontend sends a request to the backend, the update is done, and a success true/false response is sent as the answer to the request.
Let's say the previous step is a store creation. The frontend receives the success response and then dispatches an APPEND_STORE action to the Redux store so it picks up the new store with its data (only the ID plus any other data created on the backend is received in the HTTP response; data that was generated on the frontend is taken from the app itself), appends it to the stores array, and then sets the stores array again.
The reason the previous step works like that is to avoid having the backend first perform the insert and then run a new query to fetch the newly created record again (e.g. create-store request -> backend creates the store and retrieves the new ID -> backend queries the new record by its ID -> responds with the data).
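A minimal sketch of the APPEND_STORE handling described above (the action shape and field names are assumptions):

    // Reducer: append the newly created store, merging the data the frontend
    // already had with the fields generated by the backend (ID, creation date, ...).
    const initialState = { stores: [] };

    function storesReducer(state = initialState, action) {
      switch (action.type) {
        case 'APPEND_STORE':
          return {
            ...state,
            stores: [...state.stores, action.payload],
          };
        default:
          return state;
      }
    }

    // After the HTTP request succeeds:
    // dispatch({ type: 'APPEND_STORE', payload: { ...localStoreData, id: response.id } });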
So basically the question is about the pattern. So far this pattern has not given me big trouble, but I'm pretty sure there are details I haven't seen from this point of view, and I also think there are recommended ways of handling this that I haven't been able to find on the internet yet.
I would like to know the best approach for this (or at least a better approach than the above) in order to apply it to bigger applications that handle more users and data, which will grow over time.
Main questions:
When loading, should the backend process all the data and send it to the frontend, or should it send only the minimal data needed to start the application and then send more as navigation requires it? I was also thinking that, as the data grows, I could send chunks of data with a limit of 10-20 records to keep the same approach, but I want to know the standard/correct way to handle it.
When data validation fails on the backend, should the backend answer with an OK status code plus a key called success: "true/false" and additional data so the frontend can handle it, or should the backend answer with an error status code?
As I said, the backend only answers with the data created by the backend itself (ID, creation date, etc.). Should the backend answer with only this, or should it run a new query to get the full record and send that as the response? I initially took this approach to optimize resources (minimum number of requests and minimum data sent in the response). I'm also thinking this may be a naive approach, because today's servers have plenty of resources, so the difference should not change performance much. Also, with that answer my Redux behavior would change. Do you have any comments on the Redux approach as well?
Is it okay to make one query to the database to perform the update and then another query to fetch the full record again? I know databases are built to handle many queries, but I don't know if there are drawbacks to keep in mind when doing it this way.
Really thanks!

AngularJS 1 - one request, multiple responses

I have a page with multiple widgets, each receiving data from a different query in the backend. Making a request for each one will hit the limit the browser puts on the number of parallel connections and will serialize some of them. On the other hand, making one request that returns one response means it will be as slow as the slowest query (I have no a priori knowledge about which query will be slowest).
So I want to create one request such that the backend runs the queries in parallel and writes each result as it is ready, and the frontend handles each result as it arrives. At the HTTP level I believe this can be a single body containing several JSON documents, or perhaps a multipart response.
Is there an AngularJS extension that handles the frontend side of things? Ideally something that works well with whatever can be done in the Java backend (I haven't started investigating my options there yet).
I have another suggestion to solve your problem, but I am not sure you would be able to implement such a thing, as from your question it is not very clear what you can or cannot do.
You could implement WebSockets, and the server would be able to notify the front end when data has been fetched, or it could send the data over the WebSocket right away.
In the first approach, you would send a request to the server to fetch all the data for your dashboard. Once a piece of data is available, you could make a request for that particular piece; given that the data was fetched a couple of seconds ago, it could be cached on the server and the response would be fast.
The second approach seems the more reasonable one. You would make an HTTP/WebSocket request to the server and wait for the data to arrive over the WebSocket.
I believe this would be the most robust and efficient way to implement what you are asking for.
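A bare-bones sketch of that second approach on the client (the endpoint, message shape and renderWidget function are assumptions for illustration):

    // One WebSocket per page: the server pushes each widget's result as soon
    // as its query finishes, and the client routes it to the right widget.
    var socket = new WebSocket('wss://example.com/dashboard'); // hypothetical endpoint

    socket.onopen = function () {
      // Ask the server to start all dashboard queries in parallel.
      socket.send(JSON.stringify({ type: 'loadDashboard' }));
    };

    socket.onmessage = function (event) {
      var message = JSON.parse(event.data);           // assumed shape: { widgetId, data }
      renderWidget(message.widgetId, message.data);   // your own rendering / $scope update
    };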
https://github.com/dfltr/jQuery-MXHR
This plugin allows parsing a response that contains several parts (multipart) by providing a callback for each part. It can be used in all our frontends to support responses carrying multiple pieces of data (widgets) in one request. The server side receives one request and uses Servlet 3 async support (or whatever the equivalent is in other languages) to 'park' it, issuing multiple queries and writing each result to the response as each query returns (with the right multipart boundary).
Another example can be found here: https://github.com/anentropic/stream.
While neither of these may be directly compatible with AngularJS, the code does not look complex to port.

Building a Cart with Laravel and Angular

My application has a cart. It currently works for me, but the issue my colleague has with it is that it is too "slow". I have been thinking of a better way to implement it, to make it faster and more efficient.
Currently this is how my page loads:
Product/Ticket page loads.
AJAX function gets products/tickets from server and displays on the page.
Each product has a buy button like this:
<button ng-click="buyTicket(id)" class="btn">Buy Ticket</button>
This is how buyTicket works:
I pass the id of the product/ticket to my function.
AJAX function posts id to server.
Server runs a database query to retrieve product/ticket information based on id.
Product/ticket information is saved into "cart" table.
AJAX function returns with data "true" as the response.
I broadcast this:
$rootScope.$broadcast('TICKET_ADDED', true);
The cart directive listens for the broadcast and makes an AJAX call to the server to get the cart data:
$scope.$on('TICKET_ADDED', function (event, data) {
    $scope.loadCart();
});
Data returned is assigned to an array and displayed in the cart.
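For reference, the buyTicket handler described in those steps is roughly this (a sketch; the route name is assumed):

    $scope.buyTicket = function (id) {
      $http.post('/api/cart/add', { ticket_id: id })  // hypothetical Laravel route
        .then(function (response) {
          if (response.data === true) {
            $rootScope.$broadcast('TICKET_ADDED', true); // cart directive reloads itself
          }
        });
    };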
Each user cart is identified by a randomly generated string of length 25.
That is how my cart works for now. I would like a faster, better way to do this, please.
Edit: (used a debug bar to get these statistics)
When page loads:
No. of queries run: 1
Memory Usage: 12.25 MB
Request Duration: 1.04s
No. of AJAX requests: 3
When buy ticket function is clicked:
No. of queries run: 5
Memory Usage: 12.75 MB
Request Duration: 934.41 ms
No. of AJAX requests: 2
The approach you are using is fine; it's just that you might not have used caching. Use caching and you will get good speed. Also check your server response time, speed, etc. on Google Speed Insights. It will tell you how you can make it faster. Hope it helps.
You can improve performance by introducing caching, both server side and client side.
Server side caching: instead of doing a DB query every time, keep objects in memory for some period of time. You can define a 'time to live' for an object and if the object has 'expired', you requery the db.
There are probably plenty of libraries out there that support this kind of functionality, but I would simply build it myself, because it's not that complicated. The trickiest part is making sure that nothing breaks down when multiple threads are trying to modify your collection of cached objects.
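A minimal sketch of the time-to-live idea (written in JavaScript for brevity; the same shape applies in PHP/Laravel or any other backend, and a real implementation needs eviction and concurrency handling):

    // Tiny in-memory cache with a time-to-live per entry (illustration only).
    var cache = {};

    function getCached(key, ttlMs, loadFn) {
      var entry = cache[key];
      var now = Date.now();
      if (entry && now - entry.storedAt < ttlMs) {
        return Promise.resolve(entry.value);          // still fresh: serve from memory
      }
      return loadFn().then(function (value) {         // expired or missing: re-query the DB
        cache[key] = { value: value, storedAt: now };
        return value;
      });
    }

    // Usage: getCached('cart:' + cartId, 30000, function () { return queryCartFromDb(cartId); });
    // (queryCartFromDb is a placeholder for whatever DB call you already have.)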
Client side caching is a breeze when you use angular. This is from the documentation of $http:
To enable caching, set the request configuration cache property to
true (to use default cache) or to a custom cache object (built with
$cacheFactory). When the cache is enabled, $http stores the response
from the server in the specified cache. The next time the same request
is made, the response is served from the cache without sending a
request to the server.
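In practice that can be as simple as the following sketch (endpoint name assumed):

    // Identical GET requests are answered from the cache instead of the server.
    $http.get('/api/tickets', { cache: true }).then(function (response) {
      $scope.tickets = response.data;
    });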

Retrieve data from server in single (one) page application - best practices

What's the best practice for requesting/sending data from/to the server in an SPA (Angular, Ember)?
Make one request and get everything we need in our SPA:
In a nested hash (per resource/object)
Structured exactly for that one specific request
Make multiple requests, one per resource/object.
Every answer is appreciated :)
I get everything I need in just ONE API call, as soon as possible - reducing round trips to the server and connection usage on mobile devices.
