Prerender unable to access relative URL assets - AngularJS

I am using ZF2 with AngularJS for my frontend. The ZF2 controller makes an API call and returns the data to the view (a .phtml file). In the .phtml file I use PHP to generate the meta tags, and the rest of the page content is generated by AngularJS. Do I still need to serve prerendered pages to Google's bots, given that the meta tags are already generated by PHP and I believe the bots mainly need meta tags? To test prerender.io, I installed it on my server to generate a static page, but the generated page is unable to load CSS, JS, and images because I reference them with relative URLs in my HTML.

Related

Next.js | How to dynamically update static folder content over a CDN? I am using assetPrefix in next.config

I am working on a Next.js application. Right now, static content is served from the .next/static folder, and I want to serve the static JS, CSS, and other files in .next/static via a CDN. For that, I added the following to my next.config.js:
module.exports = {
  assetPrefix: "https://myTestCDN.com"
}
as per the official Next.js docs.
According to the docs, the user has to upload the contents of the .next/static folder to the CDN. My question is: how can I update my static content on this CDN every time my application is built? If I build my app locally and transfer the content to the CDN, it would not work, because each deployment produces a different build.
Actually, I was able to achieve this. There are two ways to add content to a CDN so that it can be delivered for your site: either you manually upload the content that needs to be served to the CDN, or you add your site's link as the origin URL in the CDN configuration, and the CDN will then pull the static content from your site itself and serve it.
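For the first (manual upload) approach, a common pattern is to make the prefix conditional, so local development keeps serving assets from .next/static while production builds point at the CDN. A minimal next.config.js sketch, keeping the asker's placeholder CDN host:

const isProd = process.env.NODE_ENV === 'production';

module.exports = {
  // Only prefix asset URLs in production builds; in development,
  // assets keep being served from .next/static directly.
  assetPrefix: isProd ? 'https://myTestCDN.com' : '',
};

With the second (origin-pull) approach, no build-time upload step is needed at all: the CDN fetches the /_next/static/* files from your deployed site on first request and caches them, so each new deployment is picked up automatically.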

How to restrict access to my React build's static files?

I created the build with npm run build and hosted the build folder on a server.
My problem is that I can see the static files by their paths, e.g. https://[mydomain.com]/static/js/11.ba24d9f9.chunk.js.
Whereas if a file doesn't exist (hit a random URL on this domain, e.g. https://[mydomain.com]/abaknan), it will render my 404Component because of the react-router * entry.
Is it possible to block this chunk route and show the 404Component instead?
React is a JavaScript library for UI rendering on the client side, so it requires all of the compiled JS to be loaded into the document in order to render the components.
If you use any sensitive information on your page, please secure it in your backend application.
The compiled JS chunk files are necessary to render the page, so they must be accessible from wherever you're requesting the page. Other endpoints return your custom 404 page because the path doesn't match any React Router route, so the * entry renders. Blocking chunk routes is not possible from your UI framework; you would need to add those rules on your server.

How to deploy a React application on a static server

I have a React application built with create-react-app. It uses OctoberCMS as the backend, fetching data via Axios calls from the frontend. Until now I was developing with the React build content inside a directory named 'react' in the root directory of the OctoberCMS installation, so the URL I was hitting was http://example.com/react/.
The problem is that I am now done with the development phase and looking toward deployment. I want my frontend to be served at http://example.com and the backend at http://example.com/backend (the backend is already served where I want it). How can I achieve this? I am fairly new to both frameworks.
I have tried keeping the build content along with the rest of the OctoberCMS installation.
First, build your React app. That will give you vendor.js (third-party scripts) and app.js (your actual app).
Put them into your theme's assets directory.
Then, in OctoberCMS, make a page with the URL /:url? and paste your index.html content there. It will be your root div plus the JS includes; change the JS paths so they point to the build files you put in the theme directory.
Now, when anybody comes to the site:
- we serve the same content as we do in a dev build
- index.html with the root tag and the needed JS
Now if a user hits any other URL, like https://www.example.com/test/etc, it will also be caught by /:url? (as will all other requests), the home page is served, and our React app works as we need.
If you have any questions, please comment.

Spring Cloud Zuul + multiple UI bundles + AngularJS

I have multiple UI bundles.
My Zuul YML entry:
server:
  port: 8090
zuul:
  routes:
    ui:
      url: http://localhost:8091
      sensitive-headers:
When I try to hit the URL http://localhost:8090/ui, it loads my HTML code but does not include the JS and CSS files.
Thanks in advance.
I would want to have a closer look at the HTML that is returned when you go to http://localhost:8090/ui, or at least use the Chrome developer tools to see what URLs the page requests when trying to load the JS and CSS. I had a similar issue with how Zuul does its routing: it is not a full reverse proxy, in the sense that it doesn't inspect the HTML body of the response to rewrite embedded URLs.
Check out: https://github.com/spring-cloud/spring-cloud-netflix/issues/8

How to redirect crawlers' requests to pre-rendered pages when using Amazon S3?

Problem
I have a static SPA site built with Angular and hosted on Amazon S3. I'm trying to make my pre-rendered pages accessible to crawlers, but I can't redirect the crawlers' requests, since Amazon S3 does not offer a URL rewrite option and its redirect rules are limited.
What I have
I've added the following meta-tag to the <head> of my index.html page:
<meta name="fragment" content="!">
Also, my SPA is using pretty URLs (without the hash # sign) with HTML5 push state.
With this setup, when a crawler finds my http://mywebsite.com/about link, it will make a GET request to http://mywebsite.com/about?_escaped_fragment_=. This is a pattern defined by Google and followed by other crawlers.
What I need is to answer this request with a pre-rendered version of the about.html file. I've already done this pre-rendering with PhantomJS, but I can't serve the correct file to crawlers because Amazon S3 does not have a rewrite rule.
On an nginx server, the solution would be to add a rewrite rule like:
location / {
    if ($args ~ "_escaped_fragment_=") {
        rewrite ^/(.*)$ /snapshots/$1.html break;
    }
}
But in Amazon S3, I'm limited by their redirect rules based on KeyPrefixes and HttpErrorCodes. The ?_escaped_fragment_= is not a KeyPrefix, since it appears at the end of the URL, and it gives no HTTP error since Angular will ignore it.
What I've tried
I started by trying dynamic templates with ngRoute, but I soon realized that no Angular-side solution can work here, since I'm targeting crawlers that can't execute JavaScript.
With Amazon S3, I have to stick with their redirect rules.
I've managed to get it working with an ugly workaround: if I create a new rule for each page, it works:
<RoutingRules>
  <!-- each page needs its own rule -->
  <RoutingRule>
    <Condition>
      <KeyPrefixEquals>about?_escaped_fragment_=</KeyPrefixEquals>
    </Condition>
    <Redirect>
      <HostName>mywebsite.com</HostName>
      <ReplaceKeyPrefixWith>snapshots/about.html</ReplaceKeyPrefixWith>
    </Redirect>
  </RoutingRule>
</RoutingRules>
As you can see, with this solution each page needs its own rule, and since Amazon limits websites to only 50 redirect rules, this is not a viable solution.
Another solution would be to forget about pretty URLs and use hashbangs. With this, my link would be http://mywebsite.com/#!about and crawlers would request this with http://mywebsite.com/?_escaped_fragment_=about. Since the URL will start with ?_escaped_fragment_=, it can be captured with the KeyPrefix and just one redirect rule would be enough. However, I don't want to use ugly URLs.
So, how can I have a static SPA in Amazon S3 and be SEO-friendly?
Short Answer
Amazon S3 (and Amazon CloudFront) does not offer rewrite rules and has only limited redirect options. However, you don't need to redirect or rewrite your URL requests; just pre-render all HTML files and upload them following your website's paths.
Since a user browsing the webpage has JavaScript enabled, Angular will be triggered and will take control over the page, re-rendering the template. With this, all Angular functionality will be available to this user.
Regarding the crawler, the pre-rendered page will be enough.
Example
Suppose you have a website named www.myblog.com with a link to another page at www.myblog.com/posts/my-first-post. Your Angular app probably has the following structure: an index.html file in your root directory that is responsible for everything, and the page my-first-post as a partial HTML file located at /partials/my-first-post.html.
The solution in this case is to use a pre-rendering tool at deploy time. You can use PhantomJS for this, but you can't use a middleware tool like Prerender, because you have a static site hosted on Amazon S3.
You need to use this pre-render tool to create two files: index.html and my-first-post. Note that my-first-post will be an HTML file without the .html extension, but you will need to set its Content-Type to text/html when you upload it to Amazon S3 (see the upload sketch below).
You will place the index.html file in your root directory and my-first-post inside a folder named posts, to match your URL path /posts/my-first-post.
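For the upload itself, here is a minimal sketch using the AWS SDK for JavaScript (v2); the bucket name and local snapshot layout are assumptions matching the example above:

const fs = require('fs');
const AWS = require('aws-sdk');

const s3 = new AWS.S3();

// Upload the extensionless snapshot so that S3 serves it at
// /posts/my-first-post with an explicit HTML content type.
s3.putObject({
  Bucket: 'www.myblog.com',
  Key: 'posts/my-first-post',
  Body: fs.readFileSync('snapshots/posts/my-first-post'),
  ContentType: 'text/html',
}, function (err) {
  if (err) throw err;
  console.log('Snapshot uploaded.');
});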
With this approach, the crawler will be able to retrieve your HTML file and the user will be happy to use all Angular functionalities.
Note: this solution requires that all files be referenced using the root path; relative paths will not work when you visit the link www.myblog.com/posts/my-first-post directly.
By root path, I mean:
<script src="/js/myfile.js"></script>
The wrong way, using relative paths, would be:
<script src="js/myfile.js"></script>
EDIT:
Below is a small JavaScript script that I've used to prerender pages with PhantomJS. After installing PhantomJS and testing the script against a single page, add a step to your build process that prerenders all pages before deploying your site (a multi-page variant follows the script).
var fs = require('fs');
var webPage = require('webpage');
var page = webPage.create();

// since this tool will run before your production deploy,
// your target URL will be your dev/staging environment (localhost, in this example)
var path = 'pages/my-page';
var url = 'http://localhost/' + path;

page.open(url, function (status) {
    if (status !== 'success') {
        throw 'Error trying to prerender ' + url;
    }
    var content = page.content;
    fs.write(path, content, 'w'); // save the rendered HTML snapshot
    console.log("The file was saved.");
    phantom.exit();
});
Note: it looks like Node.js, but it isn't; it must be executed with the phantomjs executable, not with Node.
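To prerender all pages in one run, the same script can loop over a list of routes sequentially. A sketch, still for the phantomjs executable; the path list and the localhost base URL are assumptions you would adapt to your app:

var fs = require('fs');
var webPage = require('webpage');

var baseUrl = 'http://localhost/';
// List every route of your app that needs a snapshot
// (output folders like posts/ are assumed to already exist).
var paths = ['index.html', 'posts/my-first-post'];

function prerenderNext() {
    if (paths.length === 0) {
        phantom.exit();
        return;
    }
    var path = paths.shift();
    var page = webPage.create();
    page.open(baseUrl + path, function (status) {
        if (status !== 'success') {
            throw 'Error trying to prerender ' + baseUrl + path;
        }
        fs.write(path, page.content, 'w'); // save the snapshot next to the script
        page.close();
        prerenderNext(); // render pages one at a time
    });
}

prerenderNext();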
