I have two applications: a Spring Boot backend and a React frontend. I need to load a lot of data (let's say 100 000 objects, each with 3 Integer fields) and present it on a Leaflet map, but I don't know which protocol I should use. I thought about two approaches:
Do it with REST, 1 000 (or more/less) objects per request, with a progress bar on the frontend so the user doesn't keep refreshing the page because he thinks something is wrong.
Do it with a WebSocket, so it is faster? Same idea with the progress bar, but I am worried that if the user refreshes the page, the backend will keep streaming data over the crashed frontend connection while a new connection is established, the process begins again for that one, and so on.
In case it's worth mentioning: I am using Spring Boot 2.3.1 together with Spring Cloud (Eureka, Spring Cloud Gateway). The WebSocket library I chose is SockJS, and the data is streamed by org.springframework.messaging.simp.SimpMessagingTemplate.
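For reference, approach 1 on the client might look roughly like this (a sketch only; the /api/points endpoint, its paging parameters, and the Point shape are assumptions, not part of the question):

```typescript
// Sketch: fetch 100 000 points in pages of 1 000 and report progress.
interface Point { x: number; y: number; value: number; }

async function loadAllPoints(
  onProgress: (loaded: number, total: number) => void
): Promise<Point[]> {
  const total = 100_000;
  const size = 1_000;
  const pages = Math.ceil(total / size);
  const points: Point[] = [];
  for (let page = 0; page < pages; page++) {
    const res = await fetch(`/api/points?page=${page}&size=${size}`);
    if (!res.ok) throw new Error(`Request failed: ${res.status}`);
    points.push(...(await res.json() as Point[]));
    onProgress(points.length, total); // drive the progress bar
  }
  return points;
}
```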
If you have that amount of data and a lot of read/write operations, I would recommend not returning it in either a WebSocket or a REST call (Reactor or MVC); sending a big amount of data over TCP has its issues. What I would recommend is quite simple: save the data to storage (AWS S3, for example), return the S3 object URL, and have the client read the data from S3 directly.
Alternatively, you can have a message queue that the client subscribes to (pub/sub): publish the data on the server side and consume it on the client side. But this may be overkill.
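A sketch of the S3 approach from the client side (assuming a hypothetical /api/points-url endpoint that returns a presigned URL):

```typescript
// Sketch: the backend uploads the dataset to S3 and returns only a (presigned) URL;
// the client then downloads the data directly from S3, bypassing the app server.
async function loadPointsViaS3(): Promise<unknown[]> {
  // 1. Ask the backend where the data lives.
  const { url } = await (await fetch('/api/points-url')).json() as { url: string };
  // 2. Fetch the (potentially large) payload straight from S3.
  const res = await fetch(url);
  if (!res.ok) throw new Error(`S3 download failed: ${res.status}`);
  return await res.json() as unknown[];
}
```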
If you are set on REST, you can use multipart data; see the Stack Overflow question here:
Multipart example
I'm sending data from my backend every 10 seconds and I want to display that data in React. I've searched online and found socket.io recommended for displaying real-time data. Is there a better way to do it?
If you're dead set on updating your data every 10 seconds, it would make more sense to make a request from the client to the server, as HTTP requests can only be opened from client to server. By using HTTP requests you won't need socket.io, though socket.io is an easy alternative if you need much faster updates.
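A polling version might look roughly like this (a sketch, assuming a hypothetical /api/data endpoint):

```typescript
import { useEffect, useState } from 'react';

// Sketch: plain HTTP polling every 10 seconds from a React component.
function useLatestData(): unknown {
  const [data, setData] = useState<unknown>(null);
  useEffect(() => {
    const poll = async () => {
      const res = await fetch('/api/data');
      if (res.ok) setData(await res.json());
    };
    poll();                               // fetch immediately on mount
    const id = setInterval(poll, 10_000); // then every 10 seconds
    return () => clearInterval(id);       // stop polling on unmount
  }, []);
  return data;
}
```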
Depending on how you are generating the data being sent from your backend, and specifically if you are using a database, there is most likely a way to subscribe to changes in the database. This would update the data in real time, without a 10-second delay.
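For MongoDB, for instance, that subscription could be a change stream (a sketch only; it requires a replica set, and the connection string and collection names are assumptions, since the question does not say which database is used):

```typescript
import { MongoClient } from 'mongodb';

// Sketch: subscribe to database changes with a MongoDB change stream.
async function watchForChanges(): Promise<void> {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const collection = client.db('app').collection('measurements');
  const stream = collection.watch();
  stream.on('change', (change) => {
    // Push the change to connected clients here (e.g. over a WebSocket)
    // instead of waiting for the next 10-second poll.
    console.log('document changed:', change);
  });
}
```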
If you want a more detailed answer, you'll have to provide more detail in your question: What data are you sending? Where is it coming from, or how are you generating it?
I'm working on an autodialer feature: when I trigger a button from the frontend (built with React), an agent gets a call, and then all the leads in that agent's assigned portal automatically receive back-to-back calls from the agent's number. Because this process is automatic, the agent won't know who has been called, so I want to establish a real-time connection so that I can show a popup on the frontend containing information about the lead who was called.
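One way to push that information to the browser would be socket.io (a sketch only; the 'lead-called' event name and the payload shape are invented for illustration):

```typescript
import { Server } from 'socket.io';

// Sketch: server side (Node + socket.io) pushes an event whenever the
// autodialer places a call, so the agent's browser can show a popup.
const io = new Server(3001, { cors: { origin: '*' } });

io.on('connection', (socket) => {
  // Each agent's browser identifies itself and joins its own room.
  const agentId = socket.handshake.query.agentId as string;
  socket.join(`agent:${agentId}`);
});

// Call this from the autodialer logic after each outgoing call.
function onLeadCalled(agentId: string, lead: { name: string; phone: string }): void {
  io.to(`agent:${agentId}`).emit('lead-called', lead);
}
```

On the React side, a socket.io-client listener for 'lead-called' would then open the popup with the lead's details.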
I have been applying a pattern in some small apps I worked on, using the following tools:
Frontend: ReactJS, Redux, Firebase (only for authentication)
Backend: Node with Express and the library for the database I was using (MySQL or Mongo)
The flow used to be like this:
The page loads and validates whether the user is signed in. If yes, it fetches the app data from the backend. If not, it just checks that the user is not on a protected route and lets him navigate (create account, reset password, etc.).
When the fetch occurs and the user is signed in, the backend sends all of that user's data (categories, stores, products, profile and anything else that lets the application run fluidly), so there is no continuous loading to fetch each chunk of data while navigating. It also works this way because it is not a big amount of data (at the beginning), let's say 1 or 2 stores with 15-20 products each.
When updating data, like changing a product name, price or any other kind of data mutation, the frontend sends a request to the backend, the update is done, and a success true/false response is sent as the answer to the request.
Let's say the previous step is a store creation. The frontend receives the success response and then dispatches an APPEND_STORE event to the Redux store to catch the new store with its data (only the ID plus other data created on the backend is received in the HTTP response; data generated on the frontend is taken from the app itself), append it to the stores array, and set the stores array again.
The reason the previous step works like that is to avoid the backend first making the insert and then running a new query to fetch the new record again (e.g., create-store request -> backend creates the store and retrieves the new ID -> backend queries the new record by its ID -> responds with the data).
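The APPEND_STORE step described above might look roughly like this (a sketch; the Store fields and the action shape are assumptions):

```typescript
// Sketch: the backend returns only the generated fields (id, createdAt); the
// reducer merges them with the data the frontend already has and appends.
interface Store { id: string; createdAt: string; name: string; }

interface AppendStoreAction {
  type: 'APPEND_STORE';
  payload: Store; // frontend data merged with the backend-generated id/createdAt
}

function storesReducer(state: Store[] = [], action: AppendStoreAction): Store[] {
  switch (action.type) {
    case 'APPEND_STORE':
      return [...state, action.payload]; // append without refetching the whole list
    default:
      return state;
  }
}
```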
So, basically the question is about the pattern. So far this pattern has not given me big trouble, but I'm pretty sure there are details I haven't seen from this vantage point, and I also think there are recommended ways to handle this that I haven't been able to find on the internet yet.
I would like to know the best approach for this (or at least a better approach than the above) in order to apply it to bigger applications that handle more users and more data, which will increase as time goes on.
Main questions:
When loading, should the backend process all the data and send it to the frontend, or should it send minimal data to let the application start and then send more as navigation requires it? I was also thinking that when the data grows, I could send chunks with a limit of 10/20 records to keep the same approach, but I want to know the standard/correct way to handle it.
When a data validation fails on the backend, should the backend answer with an OK status code plus a success: true/false key and additional data for the frontend to handle it, or should the backend answer with an error status code?
As I said, the backend only answers with the data created by the backend itself (ID, creation date, etc.). Should it answer only with this, or should it run a new query to get the full record and send that as the response? I initially took this approach to optimize resources (the minimum number of requests and the minimum data sent per response). I'm also thinking this may be a dumb optimization, because with today's resources the difference shouldn't change performance at all. Also, with the fuller answer my Redux behavior would change. Do you have any comments on the Redux approach as well?
Is it OK to query the database first to update and then run another query to fetch the record again? I know databases are built to handle multiple queries, but I don't know if there are cons to keep in mind when doing it this way.
Thanks a lot!
Sorry for the general question. My situation is this: I have a MongoDB database and two React pages. On each page I want to fetch different information from the database. In your experience, what is the best way to fetch data from MongoDB in a React component?
I would recommend reading up on the MERN stack; there are tons of guides available online via Google and YouTube. The gist is that a typical web application consists of a few key components. In this case:
1 - (React) The client page rendered to the user
2 - (Node + Express) The server, which processes data and exposes endpoints for making changes to your application. These endpoints make the necessary database queries; you can use a database client to write these queries as JavaScript within your NodeJS endpoints.
3 - (MongoDB) Your database.
So for instance a typical CRUD app allows you to create, read, update, and delete. Let's say you are looking at making a standard TODO list app.
You would need to make requests to these endpoints to perform these operations.
You could have a POST to /todo which would then insert a new document into your database.
You would need a way to read the information from the page... say a GET request to /todos to read all items, or a GET request to /todo/:id to get a specific item.
You would need a way to update an existing item... say a PUT request to /todo/:id with the updates you want to take place.
Finally you would need a way to delete an item... a DELETE request to /todo/:id which would delete the item.
Each of these endpoints would insert / read / update / delete items in the database and return content to the client React code, which then displays it to the user.
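Put together, those endpoints might look roughly like this with Express and the official MongoDB driver (a sketch; database and collection names are assumptions):

```typescript
import express from 'express';
import { MongoClient, ObjectId } from 'mongodb';

// Sketch: the four TODO endpoints described above, backed by MongoDB.
const app = express();
app.use(express.json());

async function main(): Promise<void> {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const todos = client.db('app').collection('todos');

  // Create
  app.post('/todo', async (req, res) => {
    const result = await todos.insertOne(req.body);
    res.status(201).json({ _id: result.insertedId });
  });

  // Read all / read one
  app.get('/todos', async (_req, res) => res.json(await todos.find().toArray()));
  app.get('/todo/:id', async (req, res) =>
    res.json(await todos.findOne({ _id: new ObjectId(req.params.id) })));

  // Update
  app.put('/todo/:id', async (req, res) => {
    await todos.updateOne({ _id: new ObjectId(req.params.id) }, { $set: req.body });
    res.sendStatus(204);
  });

  // Delete
  app.delete('/todo/:id', async (req, res) => {
    await todos.deleteOne({ _id: new ObjectId(req.params.id) });
    res.sendStatus(204);
  });

  app.listen(3000);
}

main();
```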
On the frontend side, in React, call your API for data using the fetch() method, passing the URL of your API. If you want the data in slots (pages), use MongoDB's limit() and skip() functions for pagination.
Follow the MVC pattern, where your frontend only calls the controller API and the controller calls DAO methods for MongoDB; you can use MongoDB Stitch for a serverless app. This way data leaks can be avoided on the frontend side. MongoDB has a connection pool (max 100 by default), so on each client request a cached connection object is handed out from the pool, which further speeds up connection time.
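A paged read through such a controller might look roughly like this (a sketch; the /products route and database/collection names are assumptions):

```typescript
import express from 'express';
import { MongoClient } from 'mongodb';

// Sketch: slot-based (paged) reads using MongoDB's skip()/limit() behind a
// controller endpoint, so the frontend never touches the MongoDB URI directly.
const app = express();

MongoClient.connect('mongodb://localhost:27017').then((client) => {
  const products = client.db('app').collection('products');

  app.get('/products', async (req, res) => {
    const page = Math.max(0, Number(req.query.page ?? 0));
    const size = Math.min(100, Number(req.query.size ?? 20)); // cap the page size
    const items = await products.find()
      .skip(page * size) // skip the previous pages
      .limit(size)       // return one page
      .toArray();
    res.json(items);
  });

  app.listen(3000);
});
```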
This is more like a question about the right approach:
We have a single-page web application in AngularJS that loads a view containing multiple diagrams. Each diagram fetches the data it needs to display through a REST service. Chrome has a limit of 6 simultaneous connections, so with views of more than 10 diagrams the data fetches end up queued until previous calls resolve. To the user this looks as if the data fetch is slow.
Is there a way to execute all calls in parallel (same server, different REST endpoints)?
What would be a single-page solution that is not limited by the browser but provides faster throughput?
Caching on the frontend is only partially applicable, due to the user's active filtering of the data.
One solution would be to combine multiple requests into one request; that way the overhead of establishing multiple connections goes away.
You can build a proxy API that takes care of this, as sketched below.
The problem with combining endpoints is that if any one endpoint has a higher processing time, the combined response has to wait for it.
The best solution is to make the endpoints fast enough that 6 connections are enough.
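Such a proxy endpoint might look roughly like this (a sketch, assuming Node 18+ for the global fetch; the upstream URLs are placeholders):

```typescript
import express from 'express';

// Sketch: a proxy endpoint that fans out to several backend endpoints in
// parallel and combines the results, so the browser uses one connection.
const app = express();

const DIAGRAM_ENDPOINTS = [
  'http://backend/api/diagram/1',
  'http://backend/api/diagram/2',
  'http://backend/api/diagram/3',
];

app.get('/api/dashboard', async (_req, res) => {
  // All upstream calls run concurrently; the combined response is still as
  // slow as the slowest one, which is the trade-off mentioned above.
  const results = await Promise.all(
    DIAGRAM_ENDPOINTS.map(async (url) => (await fetch(url)).json()),
  );
  res.json(results);
});

app.listen(8080);
```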
I have a page with multiple widgets, each receiving data from a different query in the backend. Doing a request for each will hit the limit the browser puts on the number of parallel connections and will serialize some of them. On the other hand, doing one request that returns one response means it will be as slow as the slowest query (I have no a priori knowledge about which query will be slowest).
So I want to create one request such that the backend runs the queries in parallel and writes each result as it becomes ready, and have the frontend handle each result as it arrives. At the HTTP level I believe it can be one body with several JSON documents, or maybe a multipart response.
Is there an AngularJS extension that handles the frontend side of things? Ideally something that works well with whatever can be done in a Java backend (I haven't started investigating my options there).
I have another suggestion to solve your problem, but I am not sure you would be able to implement such a thing, as from your question it is not very clear what you can or cannot do.
You could implement WebSockets: the server would be able to notify the frontend when the data has been fetched, or it could send the data over the WebSocket right away.
In the first approach, you would send a request to the server to fetch all the data for your dashboard. Once a piece of data is available, you could make a request for that particular piece; given that the data was fetched a couple of seconds ago, it could be cached on the server and the response would be fast.
The second approach seems the more reasonable one. You would make an HTTP/WebSocket request to the server and wait for the data to arrive over the WebSocket.
I believe this would be the most robust and efficient way to implement what you are asking for.
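The second approach might look roughly like this on the server (a sketch using the ws package; runQuery() and the widget ids are placeholders for the real backend queries):

```typescript
import { WebSocketServer } from 'ws';

// Placeholder for the real per-widget database queries.
async function runQuery(widgetId: string): Promise<unknown> {
  return { widgetId, rows: [] };
}

// Sketch: the client opens one WebSocket; the server runs all widget queries
// in parallel and pushes each result the moment it is ready.
const wss = new WebSocketServer({ port: 8081 });

wss.on('connection', (socket) => {
  const widgetIds = ['sales', 'traffic', 'errors'];
  for (const id of widgetIds) {
    // Each query resolves independently; no widget waits for the slowest one.
    runQuery(id).then((data) => socket.send(JSON.stringify({ id, data })));
  }
});
```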
https://github.com/dfltr/jQuery-MXHR
This plugin lets you parse a response that contains several parts (multipart) by providing a callback for each part. It can be used in all our frontends to support responses carrying multiple pieces of data (widgets) in one request. The server side receives one request and uses Servlet 3 async support (or whatever exists in other languages) to 'park' it, sends off the queries, and writes each response to the request as each query returns (with the right multipart boundary).
Another example can be found here: https://github.com/anentropic/stream.
While both of these may not be compatible with AngularJS out of the box, the code does not seem complex to port.
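On modern browsers, the same effect can be had without a plugin by reading the response body incrementally (a sketch; the /api/dashboard-stream endpoint and the newline-delimited JSON framing are assumptions, whereas the plugins above use multipart boundaries):

```typescript
// Sketch: consume a chunked response with fetch() and ReadableStream,
// handling each newline-delimited JSON part as it arrives.
async function streamWidgets(onPart: (part: unknown) => void): Promise<void> {
  const res = await fetch('/api/dashboard-stream');
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = '';
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    let newline: number;
    while ((newline = buffer.indexOf('\n')) >= 0) {
      const line = buffer.slice(0, newline).trim();
      buffer = buffer.slice(newline + 1);
      if (line) onPart(JSON.parse(line)); // render each widget as its part arrives
    }
  }
}
```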