What's the best practice for requesting/sending data from/to the server in an SPA (Angular, Ember)?
Make one request, get everything we need in our SPA
In a nested hash (per resource/object)
Structured exactly for that one specific request
Make multiple requests, one per resource/object.
Every answer is appreciated :)
I get everything I need in just ONE API call, as soon as possible, reducing round trips to the server and connection usage on mobile devices.
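For illustration, a single aggregated response with a nested hash per resource might look like this (the endpoint and field names are made up):

```typescript
// Hypothetical shape of a single aggregated response, nested per
// resource so each part of the SPA can pick out what it needs.
interface DashboardResponse {
  user: { id: number; name: string };
  posts: Array<{ id: number; title: string }>;
  notifications: Array<{ id: number; text: string; read: boolean }>;
}

// One round trip instead of three separate resource calls.
async function loadDashboard(): Promise<DashboardResponse> {
  const res = await fetch('/api/dashboard'); // endpoint name is made up
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return res.json();
}
```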
I've been working as a full stack web developer for over a year now. Next.js/Golang is our stack. Today I realized that I have no idea whether we use CSR or SSR. I think we use CSR but I'm not sure. I've gone down rabbit holes on this topic like 100 times but it's never stuck.
First question:
When they say server side, do they mean on the backend, i.e. Golang? Or does that mean on the Next.js server? Whenever someone says "server" I think "backend", but I don't think this is correct.
Second question:
How does client-side rendering work? I know the server sends JavaScript to the client, and the client then uses that JavaScript to build the page. But all of this JavaScript must make a bunch of requests to the server to grab data and HTML, right? How would the JavaScript build the page otherwise? Is this why React/Next.js is written in JSX, so the server can send all the JSX (which is really just JavaScript) to the client, and the client can build the HTML?
Third Question:
If CSR has the client build the page, how would this ever work? What about all of the data that needs to be pulled from our database for specific users, etc.? That can't be done directly from the frontend.
I tried reading tons of articles online! Hasn't clicked yet
You said the essential thing yourself: in client-side rendering, "the server sends javascript to the client, then the client uses the javascript to build the page." Hold on to that one point – everything else is secondary. The client "renders."
Now that "client-side" rendering capabilities have become so powerful – and, so varied – it has generally become favored. If you "tell the client what to do and then let him do it," you are more likely to get a consistently favorable outcome for most clients. Yes, the client will issue perhaps-many AJAX requests in carrying out your instructions. That is irrelevant.
CSR - the server sends HTML that contains links to JavaScript (script tags). The client then loads and executes the JS (which typically contains the fetching code). That means each client performs several round trips to the server: first for the HTML, then for the data.
SSR - the server sends HTML with the necessary data already embedded in it.
The client already has the data and the HTML, so it can render immediately. SSR fetches on each request, meaning the client still gets the latest data.
The benefits of SSR compared to CSR are a lower load time, which makes the website feel "faster", and better ranking by search engine bots. On the other hand, the server does the rendering, which increases its burden (though handling fewer requests per page view partially offsets that).
SSG is the same as SSR except fetching occurs at build time: the resulting page is computed only once and returned for every request. SSG can be used with or without data.
Use SSG if possible, otherwise mostly SSR. On some occasions it may be better to use CSR instead of SSR, though I'm not experienced enough to say exactly when.
Now answering your questions:
Yes, SSR happens on the server (the Next.js server in your case; it may in turn call your Go backend). If you use the fetch function in a component, it runs on the client. But if you use getServerSideProps or getStaticProps, the code runs on the server. You can read from the file system, fetch a public API, or query the database: whatever you do in getStaticProps or getServerSideProps runs on the server, before the response is returned.
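A minimal sketch of the difference, assuming the pages router (the API URL and prop shapes are made up):

```typescript
// pages/profile.tsx -- SSR: this runs on the Next.js server on every
// request, before the HTML is sent to the browser.
export async function getServerSideProps() {
  const res = await fetch('https://api.example.com/user/42'); // e.g. your Go backend
  const user = await res.json();
  return { props: { user } };
}

export default function Profile({ user }: { user: { name: string } }) {
  // By the time the browser receives this page, `user` is already in the HTML.
  return <h1>{user.name}</h1>;
}
```

```typescript
// pages/profile-csr.tsx -- CSR variant: the fetch runs in the browser,
// after the (mostly empty) HTML and the JavaScript bundle have loaded.
import { useEffect, useState } from 'react';

export default function ProfileCsr() {
  const [user, setUser] = useState<{ name: string } | null>(null);
  useEffect(() => {
    fetch('https://api.example.com/user/42')
      .then((r) => r.json())
      .then(setUser);
  }, []);
  return user ? <h1>{user.name}</h1> : <p>Loading…</p>;
}
```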
Yes, you're correct. The client needs the data to render the page, so it has to send requests to the server and then render.
The third question is really the same as the second: the client never queries the database directly. It sends requests to your API (the Go backend), which queries the database and returns the result. I hope this clarified your confusion.
Sorry for the long answer.
This is more like a question about the right approach:
We have a single-page web application in AngularJS that loads a view containing multiple diagrams. Each diagram fetches the data it needs to display through a REST service. Chrome has a limitation of 6 simultaneous connections per host. As we have views with more than 10 diagrams, the data fetches end up queued until previous calls are resolved. To the user this makes the data fetch appear slow.
Is there a way to execute all calls in parallel (same server, different REST endpoints)?
What would be a single-page solution that is not limited by the browser but provides faster throughput?
Caching in the frontend is only partially applicable, due to the user actively filtering the data.
One solution would be to combine multiple requests into one request; that way the overhead of establishing multiple connections goes away.
You can make a proxy API which takes care of combining them.
The problem with combining endpoints is that if any one endpoint has a higher processing time, the combined response has to wait for it.
The best solution is to make the endpoints fast enough that 6 connections are enough.
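For what it's worth, a rough sketch of such a proxy/batch endpoint (Node/Express flavoured, Node 18+ for the built-in fetch; every name here is illustrative):

```typescript
import express from 'express';

const app = express();
app.use(express.json());

// POST /batch  { "endpoints": ["/api/diagram/1", "/api/diagram/2", ...] }
// Fans out to the real endpoints in parallel and returns all results in
// one response, so the browser only uses a single connection.
app.post('/batch', async (req, res) => {
  const endpoints: string[] = req.body.endpoints ?? [];
  const results = await Promise.all(
    endpoints.map(async (path) => {
      try {
        const r = await fetch(`http://localhost:8080${path}`); // internal API host is made up
        return { path, status: r.status, body: await r.json() };
      } catch (err) {
        return { path, status: 502, body: { error: String(err) } };
      }
    }),
  );
  res.json(results);
});

app.listen(3000);
```

Note the caveat above still applies: the combined response arrives only when the slowest endpoint has finished.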
So in an ideal world both client-side validation and server-side validation could be defined in one place, so the validation only has to be written once and can be reused wherever it's needed.
The idea I have to solve this is to do all the validation through an API using ASP.NET Core. When the form data on the client changes, it will send an AJAX request with the updated data model, which the API validates, returning any possible errors. The client then shows these errors to the user directly.
This way it still looks like the good old client-side validation, but it actually all happens on the server.
I can already imagine the server load is going to increase, since a lot more API calls will be sent. However, the question is:
Will this server load be manageable in, for example, a big enterprise application with huge forms and complex validation?
And are there any other big drawbacks of this solution which I have to watch out for?
You are talking about an API, not any other type of application with a back-end.
In this world, yes, validation of the payloads is important and needs to happen on the API side. In a way, validation is the easiest and least resource-consuming part, since it is the first thing you check: if it doesn't pass, the API returns a 400 Bad Request HTTP code and nothing else happens.
There are systems where the validation, especially business-rules validation, does not happen on the API side. You could have, for example, a financial platform where the API is simply the gateway into that world. In this case, the API acts as a pass-through and doesn't do much itself.
This being said, everything is susceptible to too much traffic, but you should be able to either throw enough resources at it or have it deployed in the cloud and let it scale based on demand. You can load test APIs as well, to see how well they do under pressure; you should have an idea of how many calls to expect in a given period of time.
I wouldn't worry too much about it. I'd say validate what you can client-side, so you don't even hit the API if there is no need for it, and leave the rest to the API.
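If it helps, here is a minimal client-side sketch of keeping that call volume down by debouncing; the /api/validate endpoint, the payload shape, and showErrors are all assumptions:

```typescript
// Debounce validation requests so the server is only hit once the user
// pauses typing, instead of on every keystroke.
let timer: ReturnType<typeof setTimeout> | undefined;

function validateDebounced(model: Record<string, unknown>, delayMs = 300): void {
  clearTimeout(timer);
  timer = setTimeout(async () => {
    const res = await fetch('/api/validate', { // hypothetical ASP.NET Core endpoint
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(model),
    });
    // Assume the API returns { fieldName: ["error 1", "error 2"], ... }
    const errors: Record<string, string[]> = await res.json();
    showErrors(errors);
  }, delayMs);
}

// Rendering the errors is left to your UI layer.
declare function showErrors(errors: Record<string, string[]>): void;
```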
I have a page with multiple widgets, each receiving data from a different query in the backend. Doing a request for each will exhaust the limit the browser puts on the number of parallel connections and will serialize some of them. On the other hand, doing one request that returns one response means it will be as slow as the slowest query (I have no a priori knowledge of which query will be the slowest).
So I want to create one request such that the backend runs the queries in parallel and writes each result as it is ready, and have the frontend handle each result as it arrives. At the HTTP level I believe it could be just one body with several JSON documents, or maybe a multipart response.
Is there an AngularJS extension that handles the frontend side of things? Ideally something that works well with whatever can be done in the Java backend (I haven't started investigating my options there).
I have another suggestion to solve your problem, but I am not sure you would be able to implement such a thing, as from your question it is not very clear what you can or cannot do.
You could implement WebSockets: the server would be able to notify the front-end when data has been fetched, or it could send the data over the WebSocket right away.
In the first case, you would send a request to the server to fetch all the data for your dashboard. Once a piece of data is available, you would make a request for that particular piece; given that the data was fetched a couple of seconds ago, it could be cached on the server and the response would be fast.
The second approach seems the more reasonable one. You would make an HTTP/WebSocket request to the server and wait for the data to arrive over the WebSocket.
I believe this would be the most robust and efficient way to implement what you are asking for.
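A minimal sketch of the client side of that second approach (the URL and message shapes are made up):

```typescript
// Open one WebSocket, ask for all widget data, and render each widget
// as its result arrives -- no per-widget HTTP connections needed.
const ws = new WebSocket('wss://example.com/dashboard'); // URL is illustrative

ws.onopen = () => {
  ws.send(JSON.stringify({ type: 'load', widgets: ['sales', 'traffic', 'errors'] }));
};

ws.onmessage = (event) => {
  // Assume the server sends one message per finished query:
  // { "widget": "sales", "data": {...} }
  const msg = JSON.parse(event.data as string);
  renderWidget(msg.widget, msg.data);
};

// Rendering is left to your UI layer.
declare function renderWidget(name: string, data: unknown): void;
```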
https://github.com/dfltr/jQuery-MXHR
This plugin lets you parse a response that contains several parts (multipart) by providing a callback for each part. It can be used in all our frontends to support responses carrying data for multiple widgets in one request. The server side receives one request and uses Servlet 3 async support (or whatever exists in other languages) to 'park' it, issue multiple queries, and write each result to the response as its query returns (with the right multipart boundary).
Another example can be found here: https://github.com/anentropic/stream.
While neither of these may be compatible with AngularJS out of the box, the code does not look complex to port.
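For what it's worth, modern browsers can also do this without a plugin by streaming the response body with fetch. A sketch, assuming the server writes one JSON document per line (NDJSON) rather than a true multipart body:

```typescript
// Read the response incrementally and hand each newline-delimited JSON
// document to a callback as soon as it arrives.
async function streamResults(url: string, onPart: (part: unknown) => void): Promise<void> {
  const res = await fetch(url);
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = '';

  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    let newline;
    while ((newline = buffer.indexOf('\n')) >= 0) {
      const line = buffer.slice(0, newline).trim();
      buffer = buffer.slice(newline + 1);
      if (line) onPart(JSON.parse(line)); // one widget's result per line
    }
  }
}

// Usage (endpoint name is made up):
// streamResults('/api/widgets', (part) => console.log('widget ready', part));
```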
I am building a website (probably in WordPress) which takes data from a number of different sources for display on various pages.
The sources:
A Twitter feed
A Flickr feed
A database on a remote server
A local database
From each source I will mainly retrieve
A short string, e.g. for Twitter the tweet text, and from the local database the title of a blog page.
An associated image, if one exists
A link identifying the content at its source
My question is:
What is the best way to a) store the data and b) retrieve the data
My thinking is:
i) Write a script that is run every 2 or so minutes on a cron job
ii) the script retrieves data from all sources and stores it in the local database
iii) application code can then retrieve all data from the one source, the local database
This should make application code easier to manage - we only ever draw data from one source in application code - and that's the main appeal. But is it overkill for a relatively small site?
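A rough sketch of what I have in mind for steps i) and ii), with every name made up (the real script would probably be PHP inside WordPress):

```typescript
// Runs every ~2 minutes via cron; normalizes each source into one shape
// and saves it into the local database.
interface FeedItem {
  source: 'twitter' | 'flickr' | 'remote' | 'local';
  text: string;      // tweet text, photo title, blog title...
  imageUrl?: string; // if one exists
  link: string;      // back to the content at its source
}

async function fetchTwitter(): Promise<FeedItem[]> {
  const res = await fetch('https://api.example.com/twitter-feed'); // placeholder URL
  const tweets: Array<{ text: string; url: string }> = await res.json();
  return tweets.map((t) => ({ source: 'twitter', text: t.text, link: t.url }));
}

// fetchFlickr(), fetchRemoteDb(), fetchLocalDb() would look similar.

async function run(): Promise<void> {
  const items = (await Promise.all([fetchTwitter() /* , fetchFlickr(), ... */])).flat();
  await saveToLocalDb(items); // the single table the application code reads from
}

declare function saveToLocalDb(items: FeedItem[]): Promise<void>;

run();
```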
I would recommend putting the Twitter feed and Flickr feed in JavaScript. Both Flickr and Twitter have REST APIs. By putting it on the client you free up resources on your server, create less complexity, your users won't be waiting around for your server to fetch the data, and you can let Twitter and Flickr cache the data for you.
This assumes you know JavaScript. Once you get past JavaScript's quirks, it's not a bad language. Give jQuery a try; there is a jQuery Twitter plugin and the Flickery jQuery plugin, among others. Those are just the first results from Google.
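If you'd rather not depend on a plugin, the client-side version is only a few lines (the feed URL and JSON shape are assumptions):

```typescript
// Fetch a JSON feed in the browser and render it into a list --
// no load on your own server at all.
async function renderFeed(feedUrl: string, listId: string): Promise<void> {
  const res = await fetch(feedUrl);
  const items: Array<{ text: string; link: string }> = await res.json();
  const ul = document.getElementById(listId)!;
  ul.innerHTML = items
    .map((i) => `<li><a href="${i.link}">${i.text}</a></li>`) // escape these in real code
    .join('');
}

renderFeed('https://api.example.com/feed.json', 'tweets'); // placeholder URL and element id
```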
As for your data on the local server and the remote server, that will depend more on the data being fetched. I would go with whatever you can develop the fastest that gives acceptable results. If that means making a REST call from server to server, then go for it. If the remote server is slow to respond, I would go with the AJAX REST API method.
And for the local database, you are going to have to write server-side code for that, so I would do that inside the WordPress "framework".
Hope that helps.