I have a site at the moment with different kinds of "items", each with a unique URL:
www.example.com/mysite/items/banana-split/images
www.example.com/mysite/items/apple-pie/images
I'm using AngularJS with ngRoute and ngMeta (plus plain JS, HTML5, and CSS only), and I host my site at one.com.
It all works fine in the browser: when I click on an item the title changes, and I try to update the meta data as well.
However, when I share my product links on WhatsApp, Facebook, or similar services, they only display:
{{ngMeta.title}} {{ngMeta.description}}
I know this has to do with prerendering: my page isn't prerendered, so the crawlers only see the raw template. I've looked at prerender.io etc., but since it runs on Node.js I don't think I can host it on one.com. Maybe on AWS, but one.com is what I pay for.
Is there any way for me to fix this from the .htaccess file or from the frontend alone?
Related
I have a project in ReactJS with multiple pages. I use React Router to handle the different URL paths.
I have deployed the build using Tomcat 10 and made the following configuration in web.xml:
<error-page>
    <error-code>404</error-code>
    <location>/index.html</location>
</error-page>
I'm able to open any of the site's URLs directly in the address bar. However, I have a problem with meta tags, because the server first serves index.html for every request and only afterwards loads my sub page (for example /contact). If I put the /contact URL into Facebook, I see the meta tags from index.html, but I want to see the meta tags from the Contact.jsx page. I think Facebook only reads the meta tags from the initial server response and never follows the client-side redirect to the second page (/contact).
How can I resolve this?
I think I could update the meta tags using JS in index.html, but I don't like this idea, because I have already put the meta tags into Contact.jsx using react-meta-tags.
Does anybody know a better solution?
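For reference, the tags in Contact.jsx are set with react-meta-tags roughly like the sketch below (simplified; the actual titles and descriptions are illustrative):

import React from 'react';
import MetaTags from 'react-meta-tags';

function Contact() {
  return (
    <div>
      {/* These tags are injected into the document head on the client */}
      <MetaTags>
        <title>Contact us</title>
        <meta name="description" content="How to reach our team" />
        <meta property="og:title" content="Contact us" />
      </MetaTags>
      {/* page content */}
    </div>
  );
}

export default Contact;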
I have a website where users can create new questions. Whenever a new question is created, a new URL is generated. I want Google to crawl my website every time a new question is added and show the new page in search results.
My front end is in React and my backend is in Express.
The front end is hosted on Firebase and the backend on Heroku.
Since I am using JavaScript and my URLs are all dynamically generated, Google does not crawl or index them.
Currently I am writing all dynamically created URLs into a file called sitemap.txt in the backend's root folder.
What should I do to achieve this?
My sitemap link: https://ask-over.herokuapp.com/sitemap.txt
My React app's link: https://wixten.com
My Express.js link: https://ask-over.herokuapp.com
I want to add https://ask-over.herokuapp.com/sitemap.txt to Google Search Console.
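One way the Express backend could serve the sitemap is to generate it on request instead of rewriting a file on every new question; a minimal sketch (the route and the data source are assumptions):

// Simplified sketch: serve a dynamically generated sitemap from Express.
const express = require('express');
const app = express();

// Placeholder for however the question URLs are actually stored.
async function getAllQuestionUrls() {
  return ['https://wixten.com/question/example-question'];
}

app.get('/sitemap.txt', async (req, res) => {
  const urls = await getAllQuestionUrls();
  res.type('text/plain').send(urls.join('\n'));
});

app.listen(process.env.PORT || 3000);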
In fact, create-react-app is the wrong tool when SEO matters, because:
there is only one HTML file
there is no content inside the single HTML file
heavy first load
etc. (search for the reasons to use Next.js; there are good articles about it)
SPAs are best for PWAs, admin panels, and things like that.
But take a look at https://nextjs.org/docs/migrating/from-create-react-app; my suggestion is to make plans to fully migrate to Next.js.
Also, read up on React SEO best practices and use helpers and utilities like React Helmet.
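For example, a minimal React Helmet sketch for a per-page title and description (the component and field names are illustrative):

import React from 'react';
import { Helmet } from 'react-helmet';

function QuestionPage({ question }) {
  return (
    <div>
      {/* Helmet updates the document head for this route */}
      <Helmet>
        <title>{question.title}</title>
        <meta name="description" content={question.summary} />
      </Helmet>
      <h1>{question.title}</h1>
    </div>
  );
}

export default QuestionPage;

Keep in mind that a crawler still has to execute JavaScript (or be served pre-rendered HTML) to see tags set this way.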
create-react-app is not the way to go if you are building an SEO-friendly website.
If the site is behind a login screen, you can go with create-react-app.
If the site is a blog or documentation site, I would suggest migrating to Next.js or Gatsby; if it's a very small page, go with raw HTML, CSS, and JS.
Google and other web crawlers can't reliably crawl SPA websites (and social crawlers don't execute JavaScript at all). The best way to fix this is either to use a server-side framework like Next.js, or to use pre-rendering and redirect crawlers to the pre-rendering server instead of the main website.
You can check out prerender.io; it has an open-source version as well. You can run it on a separate server and use one of its snippets/plugins for your web server (Apache/nginx/others) to redirect crawler requests to that upstream server.
I've been using it for one of my projects (an e-commerce store) built on Vue.js, and it works like a charm.
To understand the basics: it loads your website in a browser and caches the rendered HTML in its database/cache; when a crawler visits your website, the crawler is redirected to that cache, i.e. the generated HTML page of your website, so it can read everything smoothly.
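If your site is fronted by an Express server, wiring this up with the prerender-node middleware looks roughly like the sketch below (the token, service URL, and build folder are placeholders):

// Rough sketch: send crawler requests to a Prerender server via prerender-node.
const express = require('express');
const prerender = require('prerender-node');

const app = express();

app.use(
  prerender
    .set('prerenderToken', 'YOUR_PRERENDER_TOKEN')         // hosted prerender.io
    .set('prerenderServiceUrl', 'http://localhost:3000/')  // or a self-hosted instance
);

// Normal users still get the regular SPA build.
app.use(express.static('build'));

app.listen(8080);

For Apache or nginx, prerender.io publishes equivalent configuration snippets that detect crawler user agents and proxy those requests to the Prerender server.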
I use DigitalOcean Spaces and its CDN to host a React SPA. Hitting the URL [host]/index.html with a browser works fine. However, hitting [host]/index.html/customers/one or any other subpath returns a 404; currently, any reload on any subpath returns that 404. Lastly, I use Terraform to upload the SPA artifacts to DO Spaces, and I have tried adding website_redirect="/index.html" to all the bucket objects (js, html and css), but with no success (more info if necessary here). And to be completely honest, I am not sure I understand that option in the Terraform DigitalOcean provider; I might be using it completely wrong.
I have seen this question asked in multiple places, but never with a clear answer.
Here is one on the DigitalOcean community (https://www.digitalocean.com/community/questions/is-it-possible-to-send-a-301-redirect-for-bucket-objects) where no answer is provided, but the issue seems similar.
There is a similar question on SO without an accepted answer: Redirect wrong URL/path DigitalOcean Spaces.
This DigitalOcean idea is also somewhat related: https://ideas.digitalocean.com/ideas/DO-I-318
Is there a way to achieve the goal of loading index.html for every route with DO Spaces + CDN, and let the app parse the rest of the path to display the right component subtree of the React app?
I was trying to prettify the URLs in my AngularJS app and remove the hash. What I did was add this line in my app.config function:
$locationProvider.html5Mode(true);
But these are the issues I am still facing:
If I open a page like this: $window.location.href = '#/sales'; the slash is encoded and the page does not open.
If I directly type localhost:9000/sales (without the hash) into my browser, the page does not open.
Can someone please help?
To add to this, my base URL is: http://localhost:9000
You should pick just one option: either you have hashes in the URL, or you don't.
If hashes are OK, then just remove $locationProvider.html5Mode(true); from your code.
If you really want your app to work without hashes in the URL, then follow this (probably incomplete) checklist:
Remove # from any URLs on your page
Configure your web server to serve the same web app for all requests your web app recognizes. I.e. if your app's routing knows what to do when the user agent requests /sales, then make sure your web server or backend platform serves the page containing your web app for that request (see the sketch below).
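A minimal sketch of that second point, assuming an Express server in front of the app (the folder name is an assumption):

// Rough sketch: serve index.html for any route so deep links like /sales
// load the single-page app instead of returning a 404.
const express = require('express');
const path = require('path');

const app = express();

// Serve static assets (JS, CSS, images) directly.
app.use(express.static(path.join(__dirname, 'app')));

// For everything else, fall back to index.html and let the Angular router take over.
app.get('*', (req, res) => {
  res.sendFile(path.join(__dirname, 'app', 'index.html'));
});

app.listen(9000);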
I'm getting really frustrated with configuring the routing in our app, which uses SailsJS and AngularJS.
The problem is that the server doesn't know about the Angular routes, so any direct request like /login returns a 404 error from Sails. I need a solution that keeps the Sails routes separate from the Angular ones.
One solution would be to disable html5Mode, but I really don't like the look of URLs with the /#/ that is typical for Angular.
I have researched this a lot and haven't yet found a good answer or a working example project.
Is what I am trying to do even possible right now?
If you're using HTML5 mode with Angular, then you need to configure your web server (in this case SailsJS) to respond with your index.html file for requests to /login or any arbitrary routes.
If you navigate directly to http://localhost:3000/login in your web browser (assuming you're running Sails on localhost:3000), Sails needs to respond with your index.html so that your Angular app can bootstrap and then display the appropriate route. Subsequent links the user clicks within your app will then be intercepted by the Angular router instead of hitting Sails.
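A rough sketch of how this could look in Sails' config/routes.js (the view name and the example API route are assumptions; your real API routes must be declared before the catch-all):

// config/routes.js -- rough sketch
module.exports.routes = {

  // Sails API routes go first so they aren't swallowed by the catch-all.
  'POST /api/login': 'AuthController.login',

  // Everything else serves the page that bootstraps the Angular app.
  // skipAssets keeps .js/.css/image requests out of this route.
  '/*': { view: 'index', skipAssets: true }
};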
Angular has documentation about making HTML5 mode work correctly here.
Using this mode requires URL rewriting on the server side: basically, you have to rewrite all your links to the entry point of your application (e.g. index.html). Requiring a <base> tag is also important for this case, as it allows Angular to differentiate between the part of the URL that is the application base and the path that should be handled by the application.
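On the Angular side, the corresponding client configuration looks roughly like this sketch (the module name is illustrative), and index.html needs a <base href="/"> tag in its <head>:

// Rough sketch of the client-side configuration (module name is illustrative).
angular.module('app').config(['$locationProvider', function ($locationProvider) {
  // Enable pretty URLs without the #; requireBase makes the <base> tag mandatory.
  $locationProvider.html5Mode({
    enabled: true,
    requireBase: true
  });
}]);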