I am using Next.js, and my understanding is that both the front end and back end live in the same place. In development, for example, http://localhost:3000/about serves the about page to any visitor. However, this means that any API routes I have in 'pages/api' are visible whenever I append that path to my URL, displaying raw JSON.
How is it that some sites use the same domain but put their API on api.website.com while everything else lives on website.com? That way, any queries to the API and server go through api.website.com instead of exposing anything on the main URL.
It's because most websites run their API on a separate backend server built with libraries like Express. pages/api is just a Next.js utility that is served under localhost:3000/api/{get-user} (or your deployment URI followed by /api/), and it is typically used for development, testing, or even production when there is no dedicated backend server.
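For illustration, here is a minimal sketch of what such a separate backend could look like (assuming Express; the route, port, and payload are made up for the example). You would deploy something like this behind api.website.com while the Next.js frontend stays on website.com:

```js
// server.js - a separate backend that would sit behind api.website.com,
// while the Next.js app itself is deployed to website.com.
// The /get-user route and the payload are illustrative only.
const express = require("express");

const app = express();

app.get("/get-user", (req, res) => {
  res.json({ id: 1, name: "Jane Doe" });
});

app.listen(4000, () => {
  console.log("API listening on http://localhost:4000");
});
```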
I'm asking for some help; maybe I'm misunderstanding some concepts, and in the end I don't know how to solve this requirement:
I have a create-react-app application deployed using Netlify.
Also my backend is deployed on AWS ECS.
I'm using AWS Route 53 to route the frontend and backend to myapp.mydomain.com and api.mydomain.com respectively.
A client has a specific network configuration, so only requests to *.mydomain.com are allowed from their organization.
The problem lies in the frontend, because it uses many third-party libraries. Checking the network tab in the browser, I noticed the following:
I'm using a giphy library, so it makes requests to api.giphy.com.
I'm using some Google services like Analytics and Fonts, so I assume they will make requests to some Google domain.
And so on...
As I understand it, these kinds of fetches will be blocked by the client's network "firewall".
Adding more rules to that firewall is not an option (that was my first proposal to the client, but they only allow *.mydomain.com and nothing more).
So my plan B was to implement a proxy ... but I don't have any idea how to implement such a solution.
Is it possible to "catch" third-party fetches and redirect them to my backend at something like api.mydomain.com/forward, so my backend makes the real fetch and returns the response to the frontend?
The desired result, to use the same example, is that all fetches made to api.giphy.com are redirected to api.mydomain.com/forward/giphy, and the same for all other third-party fetches.
I Googled a lot and now I'm very confused; any help is welcome! Thanks, devs!
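For what it's worth, here is a rough sketch of what such a forwarding route could look like on api.mydomain.com (assuming an Express backend and Node 18+ for the global fetch; the /forward/giphy path mirrors the question, and header passthrough, streaming, and detailed error handling are omitted):

```js
// forward.js - sketch of a forwarding proxy route on the backend (api.mydomain.com).
const express = require("express");

const app = express();

app.get("/forward/giphy/*", async (req, res) => {
  // Rebuild the original GIPHY URL from the forwarded path and query string.
  const upstreamPath = req.params[0];
  const query = new URLSearchParams(req.query).toString();
  const upstreamUrl = `https://api.giphy.com/${upstreamPath}${query ? `?${query}` : ""}`;

  try {
    const upstreamRes = await fetch(upstreamUrl); // global fetch, Node 18+
    const body = await upstreamRes.text();
    res
      .status(upstreamRes.status)
      .set("Content-Type", upstreamRes.headers.get("content-type") || "application/json")
      .send(body);
  } catch (err) {
    res.status(502).json({ error: "Upstream request failed" });
  }
});

app.listen(4000);
```

On the frontend you would then point your own fetch calls (or the library, if it allows overriding its base URL) at api.mydomain.com/forward/giphy/... instead of api.giphy.com, and repeat the pattern for the other third-party hosts.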
Can anyone help with configuring nginx so it only accepts requests from the server IP where the ReactJS app is hosted?
I've tried many options to no avail. I always see that ReactJS uses the client IP of whoever is currently browsing (I guess because of its client-side nature). Unfortunately, I need to block all other requests to protect my Django REST API from external requests. My Django app sits behind this nginx reverse proxy, by the way. How do you guys do this?
I think you seriously misunderstood how your app works. Unless you are doing something really, really weird, it generally works like this:
The user's browser (the client) receives the ReactJS code from the server hosting it
The ReactJS code is executed in the user's browser
All requests for your REST API will originate from the user's browser executing the ReactJS code, i.e. coming from the client machine, not from the server hosting the ReactJS code (see the sketch after this list)
The server hosting the ReactJS code merely returns the ReactJS code to the client, and doesn't even interact with the server hosting the Django REST API
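A tiny sketch of why that is (the API host and endpoint are placeholders); this snippet ships inside the React bundle and therefore executes on the visitor's machine:

```js
// This runs in the visitor's browser, not on the host serving the React bundle,
// so the request below reaches the Django REST API from the visitor's IP address.
fetch("https://api.example.com/items/") // placeholder API host and path
  .then((res) => res.json())
  .then((data) => console.log(data));
```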
Thus, what you are attempting to do is misguided and in fact will just restrict your users.
As a side note, you may use several complicated techniques to be somewhat sure that the API requests do come from a browser executing your ReactJS code (i.e. a legitimate use of your API) rather than other tools, but that's far from a guarantee and in most cases serves no practical purpose anyway.
I am utterly confused about which platform configurations to use under Azure AD app's Authentication blade.
There are 2 platform configurations I am confused about:
"Web"
"Single-page application"
The app I have registered is a React JS app, which in my mind, is both a Web app AND a SPA.
This "rabbit hole" gets deeper as I try to configure redirect URIs so I can use MSAL.js to authenticate and authorize within the app.
Essentially, it comes down to this (for my http://localhost:5000 development environment):
If I specify my URI under Web, then I get this error:
AADSTS9002326: Cross-origin token redemption is permitted only for the 'Single-Page Application' client-type.
And from what I've been reading, the Web platform is the way to go (not SPA).
Can somebody shed any light onto this convoluted area?
Which platform configuration should I be using for a ReactJS app?
Thank you.
• React JS is mostly used to develop SPAs (single-page applications): web applications that interact with the browser by dynamically rewriting the current page with new data from the web server, instead of the browser's default behaviour of loading entire new pages. This means the URL of your website will not change completely (the page will not reload); instead, the app keeps fetching content and rewriting the DOM with it rather than loading a new page. The goal is faster transitions that make the website feel more like a native app.
• When building your React app, you can see that there is only one App.js, from which your entire web app is loaded in fragments and components. This behaviour of rendering components and pages on a single page and updating the DOM, instead of loading a new page with new content (hence the name "single page"), makes it feel like a single application.
• So, when you are building with React JS, I would suggest you choose SPA as the platform in the Azure AD app registration. That does not mean you cannot use React JS to create an app on a remote web server and deploy it; you can, but to host the React application code and run it as a worker process you need a backend runtime such as Node.js (and tooling like ngrok for exposing a local server) to supplement the execution and provide compatibility with the web server environment.
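As a rough sketch (not an official sample), registering the app under "Single-page application" with http://localhost:5000 as the redirect URI would pair with an MSAL.js configuration along these lines; the clientId, tenant, and scope values are placeholders:

```js
// auth.js - minimal @azure/msal-browser setup for a React SPA registered under
// the "Single-page application" platform. All IDs below are placeholders.
import { PublicClientApplication } from "@azure/msal-browser";

const msalInstance = new PublicClientApplication({
  auth: {
    clientId: "<application-client-id>",
    authority: "https://login.microsoftonline.com/<tenant-id>",
    redirectUri: "http://localhost:5000",
  },
});

export async function signIn() {
  // Newer @azure/msal-browser versions require initialize() before other calls.
  await msalInstance.initialize();
  const result = await msalInstance.loginPopup({ scopes: ["User.Read"] });
  console.log("Signed in as", result.account.username);
}
```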
Please refer to the links below for more information:
Why is React JS called a Single Page Application
https://learn.microsoft.com/en-us/answers/questions/315313/azure-app-registration-causing-the-following-error.html
I just read the following.
... your web app can be considered static. Some examples of this type of web application are a simple personal home page, an online games portal that doesn’t save data to the server on which it is hosted, or an AngularJS app that performs multiple calls to a RESTful API provided by another service.
So if a website is built purely on Angular, has no server-side code, and relies solely on API calls to retrieve and save data, is it considered static or dynamic?
A dynamic website means the site's content can change without modifying the source code. In your case, if the site shows data from a REST API, it depends on a server somewhere, so it is a dynamic website.
I read an article about social sharing issues in AngularJS and how to combat them by using Apache as a proxy.
The solution works for small websites, but if a web app has 20+ different pages, I have to URL-rewrite and create static files for all of them. Moreover, using PHP and Apache adds a different stack to the app.
Can we use NodeJS as the proxy and rewrite the URL, and what's the approach?
Is there a way to minimize static file creation?
Is there a way to remove the proxy, URL rewriting, and static files altogether? For example, inside our NodeJS app we could check the user agent, and if it is the Facebook bot, Twitter, or the like, use the request module to download our page and return the raw HTML to them. Is that a plausible solution?
Normally, when someone shares a URL on a social network, that social network requests the page to generate a preview/thumbnail (aka "scraping").
Most likely those scrapers won't run JavaScript, so they need a static HTML version of the page.
The same applies to search engines (even though Google and others are starting to support JavaScript sites).
Here's a good approach for an SPA to still support scrapers:
use history.pushState in Angular to get virtual URLs when navigating through your app (i.e. URLs without a #); a config snippet for this follows the list
server-side (Node.js or anything else), detect whether a request comes from a user or a bot (e.g. check the User-Agent using this lib: https://www.npmjs.com/package/is-bot)
if the request URL has a file extension, it's probably a static resource request (images, .css, .js): proxy or serve the static file directly
if the request URL is a page (i.e. not a static resource) and comes from a real user, always serve your index.html, which loads your Angular app (pro tip: keep this file cached in memory)
if the request URL is a page and comes from a bot, serve a pre-rendered version of the requested URL (bots won't run JavaScript). This is the hard part (side note: ReactJS makes this problem much simpler). You can use a service like https://prerender.io/; they take care of loading your Angular app and saving each page as HTML (if you're curious, they use a headless/virtual browser in memory called PhantomJS to do that, simulating what a real user would do clicking "Save As..."). You can then request and proxy those pre-rendered pages for bot requests (like social network scrapers). If you want, it's also possible to run a prerender instance on your own servers.
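For the first point, a minimal AngularJS (1.x) config sketch for enabling those "#"-less URLs could look like this (the module name is a placeholder; you also need a <base href="/"> tag in index.html and a server that returns index.html for these paths):

```js
// Enable HTML5-mode URLs in AngularJS so routes don't use the "#" prefix.
angular.module("app").config(["$locationProvider", function ($locationProvider) {
  $locationProvider.html5Mode(true);
}]);
```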
All of this server-side process I described is implemented in this express.js middleware by prerender:
https://github.com/prerender/prerender-node/blob/master/index.js
(even if you don't like prerender, you can use that code as an implementation guide)
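If it helps, here is a simplified sketch of that flow in plain Express (assuming Node 18+ for the global fetch; the bot regex is a stand-in for a proper user-agent library such as the one linked above, and PRERENDER_URL and example.com are placeholders):

```js
// server.js - serve static files as-is, index.html to real users,
// and a pre-rendered snapshot to bots/scrapers.
const express = require("express");
const path = require("path");

const app = express();

// Placeholder for wherever your pre-rendered pages come from
// (a prerender service or your own headless-browser renderer).
const PRERENDER_URL = "https://your-prerender-host";

// Simplified stand-in for a real bot-detection library.
const BOT_UA = /facebookexternalhit|twitterbot|googlebot|linkedinbot|slackbot/i;

// Requests with a file extension (.js, .css, images) are static resources.
app.use(express.static(path.join(__dirname, "public")));

app.get("*", async (req, res) => {
  if (BOT_UA.test(req.headers["user-agent"] || "")) {
    // Bots get the pre-rendered HTML of the requested page.
    const snapshot = await fetch(`${PRERENDER_URL}/https://example.com${req.url}`);
    res.send(await snapshot.text());
  } else {
    // Real users always get index.html, which boots the Angular app.
    res.sendFile(path.join(__dirname, "public", "index.html"));
  }
});

app.listen(3000);
```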
Alternatively, here's an implementation example using only nginx:
https://gist.github.com/thoop/8165802