I have a web application with login and role-based content, etc., created in Nuxt (a framework for Vue) using universal mode. Currently it is an SSR (server-side rendered) app, but is it correct to convert it into a static site using the nuxt generate command?
FYI: I have tried running nuxt generate, and it generates the appropriate pages inside dist, but my concern is that inside each HTML file there is only CSS and a script tag. I understand I cannot statically generate the content of each page, since it is user-specific. Knowing that, is it correct to go with SSG, or does it defeat its purpose?
Avoid SSG for sites whose content changes often (is dynamic) per logged-in user.
Update:
There are lots of great sites out there for helping developers new to this, such as
https://jamstack.org/
https://explorers.netlify.com/
In the end, SSG is a way of generating the HTML for each route, with its content, at build time (some new methods are also being worked on by Netlify and Vercel to improve build times for big Jamstack sites). Once a user visits the home page, its HTML is served, then the JavaScript kicks in and handles the SPA part, giving interactivity and further navigation without full page refreshes.
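To make the Nuxt side concrete, here is a minimal sketch (assuming Nuxt 2.13+ with target: 'static'; the /api/profile endpoint is a placeholder, not something from the question): the static build pre-renders the page shells at build time, and anything tied to the logged-in user is fetched in the browser after hydration.

    // nuxt.config.js - sketch: with target 'static', `nuxt generate`
    // pre-renders every route into dist/ at build time.
    export default {
      target: 'static'
    }

    // pages/dashboard.vue, <script> block - sketch: user-specific data is
    // loaded client-side after the static shell has been served.
    export default {
      data() {
        return { profile: null }
      },
      async mounted() {
        // '/api/profile' is a hypothetical endpoint for the logged-in user's data
        const res = await fetch('/api/profile', { credentials: 'include' })
        this.profile = await res.json()
      }
    }

So the generated HTML stays generic for user-dependent pages, and the role-based content only appears once the client-side code has run.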
I want to build a single-page application with React.js and React Router v6.
I have images in the public folder that I use inside my application.
The problem is that when I open the developer tools and look at the network tab, I see the photos downloading when I go to specific pages that use those photos. Imagine a profile image.
The project is deployed on Netlify.
So the project is indeed a single page: I clear the logs, click through the pages, and nothing downloads except the images being used.
Can I somehow put the images elsewhere so they don't need to be downloaded?
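One common way to attack this (a sketch, not from the question; the image paths are placeholders) is to warm the browser cache up front, so that navigating to a page later reuses the cached copy instead of triggering a download at that moment:

    // usePreloadImages.js - sketch: preload known images once when the app mounts.
    // React 18 and the /img/... paths are assumptions, not from the question.
    import { useEffect } from 'react';

    const IMAGE_URLS = ['/img/profile.png', '/img/banner.jpg']; // hypothetical files in public/

    export function usePreloadImages(urls = IMAGE_URLS) {
      useEffect(() => {
        urls.forEach((url) => {
          const img = new Image();
          img.src = url; // triggers the download once, up front
        });
      }, [urls]);
    }

The images still have to be downloaded at some point; preloading only moves that cost to the initial load instead of the page that uses them.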
We have a medium-sized website (> pages) and would like to prerender the entire site. When content changes in some external system, we would like to rebuild only the specific page that contains the updated content and deploy the updated version of that page as a static HTML page to some publishing service (web server, CDN, Netlify, etc.).
I have seen that SvelteKit supports this SSG rendering approach through adapter-static, but as far as I can tell it always builds all of the pages/routes on every build. This is currently a blocker, because if an editor updates some content it takes more than 15 minutes (a rough estimate for around 1,500 pages) before the update is visible online.
Is there already some way to build just one specific page or route?
It looks like this is still an open feature request, to be discussed here: https://github.com/sveltejs/kit/issues/2369
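For reference, narrowing the prerender entry points looks roughly like this (a sketch assuming SvelteKit's prerender options and @sveltejs/adapter-static; the route name is a placeholder). Note that this only limits which routes get crawled and prerendered; the build itself still compiles the whole app, which is why the linked issue about incremental builds remains relevant.

    // svelte.config.js - sketch: prerender only an explicit list of routes.
    import adapter from '@sveltejs/adapter-static';

    /** @type {import('@sveltejs/kit').Config} */
    const config = {
      kit: {
        adapter: adapter(),
        prerender: {
          crawl: false,                      // don't follow links out from the entries
          entries: ['/items/updated-page']   // hypothetical route whose content changed
        }
      }
    };

    export default config;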
I have created a very simple portfolio website using Angular.js, and I'm hosting it on GitHub Pages. I use Angular.js because I'd like it to have an SPA feel. It contains mostly images, as it is an artist portfolio app, but it also contains a description page with text, which is (probably) not crawled by the Google bot.
Is there a way to improve SEO for my Angular.js website when it's hosted on static website hosting like GitHub Pages? I've read about prerendering, but that's a server-side tool, which can't work on GitHub Pages. I don't control the server side (GitHub Pages doesn't allow that), so I can't use anything like server-side rendering.
I use some SEO-friendly markup in my index.html file, like a meta title with words of value to me, a description, etc., but I'm afraid this is not enough.
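For what it's worth, the kind of static markup being described is roughly this (a sketch; the title, description, and image URL are placeholders):

    <!-- index.html <head> - sketch of static, crawler-visible metadata -->
    <title>Artist Portfolio - Paintings and Illustrations</title>
    <meta name="description" content="Portfolio of paintings and illustrations by ...">
    <meta property="og:title" content="Artist Portfolio">
    <meta property="og:image" content="https://example.github.io/assets/cover.jpg">

Since this lives in the static index.html, it is the one thing crawlers and link scrapers are guaranteed to see without running JavaScript.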
Based on custom URL parameters that I process, I am trying to dynamically modify a meta tag that I have given an id in index.html, like so:
<meta name="og:image" content="http://example.com/someurl.jpg" id="ogImage"/>
The code below in my home.ts seems to be working
document.getElementById('ogImage').setAttribute("content", Media.ImageURL);
I can verify that it works via the browser dev console (Elements tab).
However, when I view the page from Facebook via their Open Graph object debugger at
https://developers.facebook.com/tools/debug/og/object/
it appears to see the default
http://example.com/someurl.jpg
as if index.html is shipped before my home.ts gets a chance to make the update.
Perhaps my understanding is flawed and there is a better way to do this.
Thank you.
Note 1: Initially I thought I had to create some Angular binding between index.html and one of my services, but I could not locate any sample code; the closest I came was this post:
How can I update meta tags in AngularJS?
But I don't know how to apply it to my Ionic 2/3 code, so I opted for the document.getElementById approach.
Note 2: The ultimate goal here is to share a link on social media (web or app) like Facebook, or in a messenger like Viber/Skype, etc., and have it resolve to a meaningful image, title, and description that drive the visit back to the site via the browser, or via the app if the user clicking the link is on a mobile device with my app version of the site installed.
Note 3: If you decide to point me to Ionic deep linking, please provide code that matches the above, because I could not work out how to apply it to my case.
If you are trying to implement dynamic Open Graph meta tag values in your pages, you will need a server-side scripting language like PHP. Such a script runs on the server and updates the pages as needed before they are served to the requesting site or application.
Client-side scripting (i.e. JavaScript) is usually ignored when a site or app merely visits your site/link for the purpose of extracting (i.e. scraping, parsing the HTML) information such as that provided by the Open Graph meta tags (og:title, og:description, og:image, ...).
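As an illustration of that idea (a sketch in Node/Express rather than PHP, since the rest of this thread is JavaScript; the route, port, and image URL are placeholders), the server rewrites the og: tag before the HTML ever reaches the scraper:

    // server.js - sketch: fill in og:image server-side so scrapers see the final value.
    const express = require('express');
    const fs = require('fs');

    const app = express();
    const template = fs.readFileSync('index.html', 'utf8');

    app.get('/items/:id', (req, res) => {
      // In a real app this URL would come from a database lookup for the item.
      const imageUrl = `http://example.com/images/${req.params.id}.jpg`;
      res.send(
        template.replace(
          /<meta name="og:image"[^>]*>/,
          `<meta property="og:image" content="${imageUrl}" id="ogImage"/>`
        )
      );
    });

    app.listen(3000);

The same replacement can of course be done in PHP; the point is simply that it happens before the page is sent, not in the browser.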
I have a personal project that has consumed my free time and effort for about a year without significant profit. I have problems with its appearance in Google and would really appreciate some help here.
This project (http://yuppi.com.ua - similar to Craigslist in the US) is a web-based AngularJS 1.2 application that uses a PHP REST API hosted on GoDaddy. In order to make this application popular, it has to be very visible on the internet and easily searchable in Google, and users have to be able to share pages via social networks or Skype.
According to Google's specification, Google's crawlers don't run JavaScript to get the content of a web page before indexing it, so I've added an _escaped_fragment_ page that displays the content of the web page without JavaScript. For example:
Page: http://yuppi.com.ua/#!/items/sub/18/_
Dirty: yuppi.com.ua/?_escaped_fragment_=/items/sub/18/_
This dirty page will be redirected here, where Google will see the content:
http://yuppi.com.ua/server/crawler_proxy/routee.php?path=/items/sub/18/
So basically I have two HTML versions of that page. One version is the one available to users, which has styles, a lot more HTML tags, etc. The second is the version for the Google crawler - very lightweight, without any styles. And I am expecting to see the clean link to my site in Google, not the dirty one.
So, if you search Google for all links to the web site, you will see that one of the links displays its "dirty" state.
Another problem is sharing links in Skype.
When I send a link to someone, I expect that the link will be transformed into a thumbnail image, but that does not happen. Instead I see an ugly link to my web site.
Please help me understand how to make everyone happy: users, the Google crawler, GoDaddy, and me.
I encountered the same problems last year with a big project, and we ended up using https://prerender.io/.
It's a prerendering service that uses a PhantomJS browser: it detects bot requests and renders the full HTML for the page. It also provides a cache so it doesn't re-render a page that hasn't changed.
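If there is a Node layer in front of the site, wiring prerender.io in is roughly one middleware call (a sketch using the prerender-node package and a placeholder token; the site in the question is PHP on GoDaddy, where the equivalent is usually done with an .htaccess rewrite):

    // server.js - sketch: send known crawler user-agents through prerender.io,
    // serve the normal SPA to everyone else.
    const express = require('express');
    const prerender = require('prerender-node');

    const app = express();
    app.use(prerender.set('prerenderToken', 'YOUR_TOKEN_HERE')); // placeholder token
    app.use(express.static('public')); // the built SPA
    app.listen(3000);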
Hope it helps.