Complete edit:
At the moment I have a small site. I am not restricted to any blogging platform, only a few server restrictions, some of which are fixed with .htaccess. Data URIs and CSS sprites will be used to mitigate the connection-time penalties.
Will creating an AMP entry page (or a few pages) be considered cheating, given that the site will mainly serve static HTML with adaptive/responsive CSS rather than AMP pages?
Will standalone pictures benefit from the advertised caching if they are referenced in the source but thumbnails are used to link to them? Do I have to build a gallery with full-size pictures to force caching?
Is it worth creating a small AMP subset just for promotion, or should I wait until I have a large content pool and many visitors?
To answer the question in your title: yes, you can have both AMP and non-AMP pages. See the WordPress plugin at https://wordpress.org/plugins/amp/ ; it currently generates an AMP page for each blog post in single view, while all other content types (pages, archive pages, category pages, the front page) remain non-AMP.
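For reference, this is how an AMP page and its non-AMP counterpart are usually linked so crawlers can discover both versions; the URLs here are placeholders, not taken from your site:

```html
<!-- On the regular (non-AMP) page: point crawlers at the AMP version -->
<link rel="amphtml" href="https://example.com/post/amp/">

<!-- On the AMP page: point back at the canonical non-AMP version -->
<link rel="canonical" href="https://example.com/post/">
```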
Could anyone please help with my website's loading times?
While the desktop PageSpeed score is 99, the mobile version scores only 75.
https://pagespeed.web.dev/report?url=https%3A%2F%2Fwww.muscle-cars.eu%2F&form_factor=mobile
What could I do? I already use next/image and WebP; do I have to compress the images further?
Yesterday I tried removing the Catamaran font from the website and the score went up to 96, but it is very unlikely that removing an 80 kB font would help that much; I think something is blocking while the font loads. I tried both the Google Fonts API and self-hosting, with similar results.
Thanks for the help.
Here is the result of your website test; we can see that the mobile version scores only 75.
I used Chrome DevTools to check this page and found some issues; fixing them should improve the user experience.
Your website serves raw (unminified) CSS and JavaScript files. They are larger than necessary and waste transfer time. You can reduce file size by serving minified (uglified) JavaScript and CSS on first load.
Your website loads too many JavaScript files on first load; I found more than 10. On mobile 3G networks this wastes even more time, so use webpack or another bundler to combine them into fewer files.
You can also reduce file size by tree-shaking the JavaScript and CSS.
Your home page could also use server-side rendering, which executes the JavaScript on the server and sends a complete HTML document to the mobile browser.
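As a rough illustration of the minify / bundle / tree-shake suggestions above, here is a minimal webpack production config; the entry point and output paths are assumptions, not taken from the site in question:

```js
// webpack.config.js - a minimal sketch; entry/output paths are hypothetical
const path = require('path');

module.exports = {
  mode: 'production',          // enables minification and tree shaking by default
  entry: './src/index.js',     // single entry so webpack can bundle everything it reaches
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'bundle.[contenthash].js', // one hashed bundle instead of 10+ separate files
    clean: true,
  },
  optimization: {
    usedExports: true,              // drop exports nothing imports (tree shaking)
    splitChunks: { chunks: 'all' }, // still split vendor code so it caches separately
  },
};
```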
I'm hosting a page built with React on GitHub Pages. My problem is that I'm displaying a video and it hurts performance. Should I buy a CDN to serve the videos and images, or should I change hosts while still serving static content? Would a CDN help, or is GitHub Pages slow either way?
How are you embedding the video? If you embed it via YouTube, the player should select a quality low enough for faster loading. Otherwise you can try lazy loading (loading="lazy" on an embedded iframe, or preload="none" on a video element) so the file isn't fetched until it's needed. Beyond that, the biggest performance win is to limit the amount of data the client has to download by compressing the video.
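A minimal sketch of the lazy-loading approaches mentioned above; the file paths, poster image and video ID are placeholders:

```html
<!-- Self-hosted video: nothing is downloaded until the user presses play -->
<video controls preload="none" poster="/images/video-poster.jpg" width="640">
  <source src="/videos/demo.mp4" type="video/mp4">
</video>

<!-- YouTube embed: the iframe is only loaded when it is near the viewport -->
<iframe src="https://www.youtube.com/embed/VIDEO_ID"
        loading="lazy" width="640" height="360"
        title="Demo video" allowfullscreen></iframe>
```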
We have a medium-sized website (> pages) and would like to prerender the entire site. When content changes in some external system, we would like to rebuild only the specific page that contains the updated content and deploy the updated version as a static HTML page to some publishing service (web server, CDN, Netlify, etc.).
I have seen that SvelteKit supports this SSG approach through adapter-static, but as far as I can tell it always builds all pages/routes on every build. This is currently a blocker, because when an editor updates some content it takes more than 15 minutes (a rough estimate for around 1,500 pages) before the update is visible online.
Is there already a way to build only a specific page or route?
It looks like this is still an open feature request, being discussed here: https://github.com/sveltejs/kit/issues/2369
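As a partial workaround (not the per-page incremental build that the issue above tracks), SvelteKit's prerender configuration lets you restrict which entry points are crawled and prerendered. A minimal sketch, where the adapter options and the route list are assumptions and the exact options depend on your SvelteKit version:

```js
// svelte.config.js - sketch only; the listed routes are hypothetical
import adapter from '@sveltejs/adapter-static';

export default {
  kit: {
    adapter: adapter({ pages: 'build', assets: 'build' }),
    prerender: {
      // Only these entry points (and pages reachable from them) are prerendered
      entries: ['/items/sub/18', '/about'],
    },
  },
};
```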
I have an e-commerce React app. As you know, every product has at least three or four images, so to show product images on my website I created a folder named "images" inside the public folder. Every time I want to show the images for a specific product I can fetch and display them very simply, and for now this works great.
The Problem:
As we know, every e-commerce website should have an admin area where new products can be published and new images uploaded, so over time I may end up with thousands of images on my website.
Question
What is the best practice for storing the images of my React app?
Do I need to use a third party like AWS or Firebase?
Thank you.
Storing images in the codebase's assets folder is not a good option for a large number of images; handling updates and inserts becomes a big problem. You have the following options.
Options: Cloud/On Premises
You can store them in the cloud, for example AWS S3 (see the upload sketch after this list).
If you want to store them on premises, you can use MongoDB GridFS,
or simply store them on the file system with the file path saved in the database.
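A minimal sketch of the S3 option using the AWS SDK v3; the bucket name, region and key layout are assumptions, not requirements:

```js
// Sketch: upload a product image to S3 and return the URL to store in the database
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';
import { readFile } from 'node:fs/promises';

const s3 = new S3Client({ region: 'eu-central-1' });

async function uploadProductImage(productId, filePath, fileName) {
  const body = await readFile(filePath);
  await s3.send(new PutObjectCommand({
    Bucket: 'my-shop-product-images',         // hypothetical bucket name
    Key: `products/${productId}/${fileName}`, // one "folder" per product
    Body: body,
    ContentType: 'image/jpeg',
  }));
  // Save this URL (or just the key) on the product record in your database
  return `https://my-shop-product-images.s3.eu-central-1.amazonaws.com/products/${productId}/${fileName}`;
}
```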
Step Ahead
Going forward you might also need responsive images sized to their placeholders. For example, for one image you might need thumbnails of different sizes, say for listing pages, Android apps and iOS apps. You might also need to compress images that are too heavy for the web.
In that case you may choose to store the images in the desired resolutions. For this you will have to store multiple versions of each image, for example product1/original.jpg, product1/compressed.jpg, product1/300x300.jpg, etc.
Alternatively, you can resize/crop the images on the fly. If you want to write your own resizing system you can build it on ImageMagick, libvips, PIL, etc. (see the sketch after this answer), or look for ready-made nginx plugins that serve responsive images.
If you do not want to do the resizing yourself, you can use image services such as imgix, Cloudinary or Akamai that provide a CDN plus image manipulation. Some of these offer storage and manipulation, while others offer manipulation only.
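For illustration, here is a small sketch of generating the stored variants mentioned above with sharp (a libvips binding for Node.js); the sizes and folder layout are just examples:

```js
// Sketch: derive the compressed and thumbnail variants from one uploaded original
import sharp from 'sharp';

async function createVariants(productDir, originalPath) {
  // Compressed full-size version for product detail pages
  await sharp(originalPath)
    .jpeg({ quality: 75 })
    .toFile(`${productDir}/compressed.jpg`);

  // 300x300 thumbnail for listing pages and mobile apps
  await sharp(originalPath)
    .resize(300, 300, { fit: 'cover' })
    .jpeg({ quality: 80 })
    .toFile(`${productDir}/300x300.jpg`);
}

await createVariants('uploads/product1', 'uploads/product1/original.jpg');
```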
I have a personal project that has consumed my free time and effort for about a year without significant profit. I have problems with how it appears in Google and would really appreciate some help here.
This project (http://yuppi.com.ua, similar to Craigslist in the US) is a web-based AngularJS 1.2 application that uses a PHP REST API hosted on GoDaddy. To make the application popular it has to be highly visible on the internet and easily searchable in Google, and users have to be able to share pages via social networks or Skype.
According to Google's specification, Google's crawlers don't run JavaScript to get the content of a web page before indexing it, so I've added an _escaped_fragment_ page that displays the page content without JavaScript. For example:
Page: http://yuppi.com.ua/#!/items/sub/18/_
Dirty: yuppi.com.ua/?_escaped_fragment_=/items/sub/18/_
This dirty page is redirected to the following URL, where Google will see the content:
http://yuppi.com.ua/server/crawler_proxy/routee.php?path=/items/sub/18/
So basically I have two HTML versions of that page. One is the version available to users, with styles and a lot more HTML tags; the second is the version for the Google crawler, very lightweight and without any styles. I expect to see a clean link to my site in Google, not a dirty one.
However, if you search Google for all links to the site, you will see that one of them is displayed in its "dirty" state.
Another problem is sharing links in Skype.
When I send a link to someone, I expect it to be turned into a thumbnail preview, but that does not happen. Instead, I see an ugly plain link to my website.
Please help me understand how to make everyone happy: users, the Google crawler, GoDaddy and me.
I ran into the same problems last year on a big project, and we ended up using https://prerender.io/.
It's a prerendering system that uses a PhantomJS browser: bot requests are detected and served a fully rendered HTML page. It also provides a cache so that pages that haven't changed are not rendered again.
Hope it helps.
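For illustration, this is roughly how the bot detection and proxying is wired up with the official prerender-node middleware in an Express app. The token and port are placeholders, and since your site actually runs on PHP/GoDaddy you would use an equivalent rewrite rule there instead:

```js
// Sketch: serving prerendered HTML to crawlers in a Node/Express front end
const express = require('express');
const prerender = require('prerender-node');

const app = express();

// Requests whose User-Agent looks like a crawler (Googlebot, link-preview bots, etc.)
// are proxied to the prerender service, which returns fully rendered HTML.
app.use(prerender.set('prerenderToken', 'YOUR_PRERENDER_TOKEN'));

// Everyone else gets the normal AngularJS single-page app
app.use(express.static('public'));

app.listen(3000);
```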