How do I use the hugo command to update a single file?

I have a working Hugo site with hundreds of pages. However, there are times when I just want to regenerate a single page.
I know that Hugo is super fast, often rendering hundreds or thousands of pages per second. In this case, though, I'm trying to optimize a particular situation, and the ability to generate just this one page would be the best option.

There is no way to ask Hugo to update a single file. This is mostly because many of Hugo's parameters and functions require analyzing the whole set of pages being rendered (internal linking, page counts, and so on).
The only workaround would be to mark all the pages you don't want to update as drafts, but that would have an impact on the site for the reason mentioned above.
You can disable some page kinds using hugo --disableKinds.
See here: https://gohugo.io/commands/hugo/
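For example, assuming a reasonably recent Hugo version (the kind names have varied across releases), the following would skip taxonomy and term pages during the build:

```
hugo --disableKinds taxonomy,term
```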
If it is a speed issue, the best solution is to use partialCached instead of partial, to avoid re-rendering the same partial for every page. This improves rendering speed significantly; see the sketch below the link.
https://gohugo.io/functions/partialcached/
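As a minimal sketch: a footer that is identical on every page can be rendered once and reused, and a variant key can scope the cache, e.g. per section (the partial file names here are placeholders):

```
{{/* Rendered once and reused on every page */}}
{{ partialCached "footer.html" . }}

{{/* Rendered once per section; .Section is the cache variant key */}}
{{ partialCached "sidebar.html" . .Section }}
```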

Related

What causes this warning on Safari? "This webpage is using significant energy. Closing it may improve the responsiveness of your Mac"

We have a React website running with lots of high-quality images that has been experiencing this warning. How do you begin debugging this warning message on Safari? Are there specific things that cause it?
This message is caused by a Safari watchdog process that monitors the JavaScript running on a page. It is there to notify the user when a script is using too many resources. When loaded on my computer, your page raises CPU utilization to 68 percent. Be wary of loops and custom render code.
Notes for improvement:
Make the rendering code as efficient as possible.
Combine your internal JavaScript files into a single file instead of 7 files. Major improvement.
Where possible (given licensing and update considerations), fold the 9 external scripts into the single file mentioned above. Minor improvement.
Split the main page into different sections, either as separate pages or loaded dynamically via AJAX. Major improvement.
Avoid SVG files. SVG files require a lot of computing power to rasterize and display; they are the main cause of the 7-second load times. Convert the files to PNG at the largest expected display resolution and offer the full SVG if more detail is wanted (on click or delayed mouse-over), as sketched below. Major improvement.
The number of images is not the issue; it is the number of SVG images (on load) and the scripts causing it.
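A minimal sketch of the PNG-first, SVG-on-demand idea (the class name and data attribute are hypothetical):

```js
// Each image ships as a cheap PNG preview:
// <img class="vector-preview" src="diagram.png" data-svg="diagram.svg">
document.querySelectorAll('img.vector-preview').forEach((img) => {
  img.addEventListener('click', () => {
    // Swap in the expensive-to-rasterize SVG only when the user asks for it.
    img.src = img.dataset.svg;
  });
});
```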
Open the page in Chrome, open the Developer Tools, and switch over to the "Performance" tab.
Then use the 2nd icon from the left, the one that looks like a "reload" button and says "Start profiling and reload page".
You will get a full rundown of what is taking how much time. At the top you can see what is eating up FPS and CPU, and you can select the timeframes that had especially high load.
In the bottom part, select the "Call Tree" or "Bottom-Up" tab to get a rundown of which scripts and function calls cause performance issues.
Usually "normal" websites (i.e. not games) do not have many frame redraws. You can then spot, for example, whether loading spinners are animated with JavaScript instead of CSS transforms and transitions, and whether they are still re-rendering even though they are out of view.
On a React-specific note: it might also make sense to inspect the page with the React Developer Tools. For example, you might be able to spot sub-trees that are re-rendering constantly for no reason.
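If the React Developer Tools do show a component re-rendering constantly with unchanged props, memoizing it is one common fix. A minimal sketch, with hypothetical component and prop names:

```js
import React from 'react';

// React.memo skips re-rendering when the props are shallow-equal,
// so this image stops re-rendering every time its parent updates.
const GalleryImage = React.memo(function GalleryImage({ src, alt }) {
  return React.createElement('img', { src, alt });
});

export default GalleryImage;
```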

Static Or Dynamic?

I am going to be hosting hundreds of thousands of pages, only a few KB each, all with the exact same format. I originally thought of using a central database, but then wondered: as the content on each page will never change, is it worth the unnecessary database requests? I also considered something like memcached but, as stated above, figured it wouldn't buy much since the content never changes.
Questions:
As the content is only a few KB per page, is it actually that inefficient to serve it as static pages?
Could it affect the ability to search through the content to find a page, and how? E.g.:
Each page will have a description, and when the user uses the search function it will need to search through the descriptions of every page.
One thing on the individual pages will change, but only once every few weeks/months. Should I use a database for that, or serve it statically?

Is there any way to improve a JavaScript-built app's web page loading time?

I found the first web page load for a CN1 JavaScript build taking too long, about 2 minutes.
I attached Chrome's network loading screenshot and found that classes.js is the heaviest file. Is it possible to zip it?
Second, there are 2 theme files that are downloaded sequentially; is it possible for them to load at the same time?
Kindly advise.
Normally I would answer that you can look at the performance section of the developer guide, but the relevant sections there relate to reducing the theme.res size, which seems pretty small in your case.
The largest portion of your code is the class files, so I'm guessing that the best way to reduce them is to further trim dependencies so the obfuscator can remove more dead code. Keep in mind that the classes.js file is cached and can be deployed via CDNs such as Cloudflare to improve download speeds. It can be served in gzipped form as well, which is part of the CDN repertoire.
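To check whether gzip is actually being applied, you can request the file with an Accept-Encoding header and look for Content-Encoding in the response (the URL is a placeholder):

```
curl -sI -H "Accept-Encoding: gzip" https://example.com/classes.js | grep -i content-encoding
```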

Angular JS app takes 3 minutes to load on Internet Explorer browser

Background:
I have built a tool using AngularJS. The user is able to view a dynamic page based on the data. The page can be really long, with lots of various types of fields within it. Many different AngularJS components are used within the app.
Issue:
If a user has a lot of data (shown in various input fields/date fields etc.; around 500 fields), then the page takes around 3 minutes to load in IE (IE11 is my browser). The same page loads within 20 seconds on Chrome and Firefox. I have been trying for almost a month to find the issue, but still no luck.
I am very desperate for a solution. I couldn't find any tool that would show me what is causing my page to take so long to load.
Well, first things first, you'll need to profile what is actually taking so long. I suggest you check out
https://github.com/angular/batarang
to do this. A good article that goes over its use is available at http://daginge.com/technology/2013/08/11/debugging-performance-problems-in-angularjs-with-batarang/
It's too long to include in this answer, but the general flow of resolving this is probably going to involve streamlining the watchers attached to each of those component fields. Rendering 500 fields at once seems somewhat unnecessary; pagination would probably be an easy fix, as in the sketch below. Limit the data to maybe 50 fields per page? You'll need to check whether it's the actual queries you're running to fetch the data that are taking so long, although based on the difference between IE and Chrome, I would guess it's something in the browser. It could also be that IE is being forced to use polyfills for functionality that Chrome and FF supply natively. Maybe link to your repo for us to have a look?
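A minimal sketch of the pagination idea in AngularJS (the module, controller, and field names are hypothetical; limitTo with a begin index requires AngularJS 1.4+):

```js
// Controller: expose a window of 50 fields instead of rendering all 500 at once.
angular.module('app').controller('FieldsCtrl', function ($scope) {
  $scope.fields = [];   // the ~500 fields loaded from the server
  $scope.page = 0;
  $scope.pageSize = 50;
  $scope.pageCount = function () {
    return Math.ceil($scope.fields.length / $scope.pageSize);
  };
});

// Template:
//   <div ng-repeat="field in fields | limitTo : pageSize : page * pageSize">
//     {{::field.label}}   <!-- one-time binding: no watcher after first render -->
//   </div>
```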

AngularJS SEO with PhantomJS: write the snapshots first or during the crawling?

I'm wondering what's the best option for writing the snapshots (a snapshot is a plain HTML version of an AngularJS state/route, built for bots for SEO purposes):
first (i.e. every time an author adds a post to a blog), or during the crawling.
http://www.yearofmoo.com/2012/11/angularjs-and-seo.html
Generally speaking there's no best option; there is the option that best fits your case.
It depends on your situation: for example, if your content is dynamically generated (I mean content generated by users, as on a blog or forum or whatever), you have to write your snapshots during the crawling; otherwise you can generate the snapshots beforehand.
I would try to go for the first approach in every case, because firing up a PhantomJS instance during the crawl is time-costly and search engines (Google) penalize long loading times. It is better to generate the static page when new content is created.
If you have too many create events to run the first approach on each one, you might consider skipping some of them. For example, if you have a heavily visited blog with many comments, you could run the generation for every new blog post, but only run it for every 20 comments or every 10 minutes, whichever comes first.
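A minimal sketch of that throttling rule (regenerateSnapshot is a hypothetical hook that would render the route headlessly and save the HTML):

```js
const SNAPSHOT_INTERVAL_MS = 10 * 60 * 1000; // 10 minutes
const COMMENTS_PER_SNAPSHOT = 20;
const state = new Map(); // postId -> { comments, lastRun }

function onNewComment(postId) {
  const s = state.get(postId) || { comments: 0, lastRun: 0 };
  s.comments += 1;
  const now = Date.now();
  // Regenerate after 20 new comments or 10 minutes, whichever comes first.
  if (s.comments >= COMMENTS_PER_SNAPSHOT || now - s.lastRun >= SNAPSHOT_INTERVAL_MS) {
    regenerateSnapshot(postId);
    s.comments = 0;
    s.lastRun = now;
  }
  state.set(postId, s);
}
```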
