Hugo is for people building a blog, company site, portfolio, tumblog, documentation, single page site or a site with thousands of pages.
http://gohugo.io/overview/introduction/
I'd like to make a single-page site. My content is written in Markdown, in a single index.md. How do I build it?
It's necessary to build with Hugo (rather than pandoc) because I want to use its CSV templating feature: http://gohugo.io/extras/dynamiccontent/
I tried hugo new to create a blank site. If I create layouts/index.html, then hugo server will show that. But I want to write the content in Markdown.
As of Hugo 0.18, this is now possible by creating a content/_index.md file, if the theme you are using supports it. You can create this file with the command hugo new _index.md. See the Hugo docs for more details.
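Since the question mentions the CSV templating feature, here is a rough sketch of what a single-page layouts/index.html could look like, assuming a layout that renders content/_index.md; the CSV path, separator, and column index are illustrative, using the getCSV function from the dynamic content docs linked above:

<!DOCTYPE html>
<html>
  <body>
    <!-- renders the Markdown from content/_index.md -->
    {{ .Content }}
    <!-- getCSV takes a separator and a path or URL; data.csv is hypothetical -->
    {{ $rows := getCSV "," "static/data.csv" }}
    {{ range $rows }}
      <p>{{ index . 0 }}</p>
    {{ end }}
  </body>
</html>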
Just write a post example.md as usual; Hugo will build it to public/example/index.html. Then, rather than pushing public to your web host, publish public/example. Voila, a single-page site.
The Hugo quickstart should have everything you need to get started, including adding Markdown posts. Specifically, hugo new POSTNAME.md will create a Markdown post with a proper header at content/POSTNAME.md.
So I have this website made with Next, and on one page there are some graphs (the graph content changes as it fetches an API) and some info.
I want to add a button to the page that, when pressed, downloads the page as an HTML file with all the JS and CSS included in the HTML file instead of loaded separately. Does anyone have any idea how to approach this problem? (The graph content should be the same as it was at the time of downloading.)
(The reason I want to do this is that I want to distribute these files to others and allow them to read the page without an internet connection.)
You can't really download a React 'page' because there are no pages in React to download.
Next further complicates this because it server-side renders everything and rehydrates client-side. If you inspect one of your pages, you'll see the JSON blocks Next uses for data. Look for the __NEXT_DATA__ script (usually in the footer of your page).
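For example, you can inspect that blob from the browser console with a couple of lines (a sketch; the shape of the parsed object depends on your app):

const el = document.getElementById('__NEXT_DATA__');
const nextData = el ? JSON.parse(el.textContent || '{}') : null;
console.log(nextData); // buildId, page, query, and the serialized page props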
I think there are two strategies you could use:
Screen-capture the graphs during your build sequence and push the images to an AWS S3 bucket or similar (cumbersome).
When I ran into a requirement like this, I just made the data for the graph available as a JSON download just below the graph, and it satisfied the use case sufficiently (see the sketch after this list).
If you just want to download the assets and take a look, a workaround is probably Next's static HTML export: run next build && next export to generate a static export of your entire site, which should include the file you're looking for.
Just some ideas to think through.
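For the JSON-download idea above, a minimal browser-side sketch (the function name and file name are arbitrary):

// Serialize whatever data the graph was rendered from and trigger a download.
function downloadJson(data: unknown, filename = 'graph-data.json') {
  const blob = new Blob([JSON.stringify(data, null, 2)], { type: 'application/json' });
  const url = URL.createObjectURL(blob);
  const a = document.createElement('a');
  a.href = url;
  a.download = filename;
  a.click();
  URL.revokeObjectURL(url); // release the object URL once the click has fired
}

Wire it to a button next to each graph and pass in the same data object the graph is currently rendering, so the download matches what was on screen.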
I have a web application with login and role-based content, created in Nuxt (a framework for Vue) using universal mode. Currently it is an SSR (server-side rendered) app, but is it correct to convert it into a static site using the nuxt generate command?
FYI: I have tried running nuxt generate; it generates the appropriate pages inside dist, but my concern is that inside each HTML file there is only CSS and a script tag. I understand I cannot statically generate the content for each page since it is user-specific. Knowing that, is it correct to go with SSG, or does it defeat its purpose?
Avoid SSG for sites whose content changes often (is dynamic) for logged-in users.
Update:
There are a lot of great sites out there to help developers new to this, such as:
https://jamstack.org/
https://explorers.netlify.com/
In the end, SSG is a way of generating the HTML for each route, with its content, at build time (some new methods are also being worked on by Netlify and Vercel to improve build times for big Jamstack sites). Once a user visits the home page, that HTML is served, then the JS kicks in and does the SPA part, handling further navigation without full page refreshes and adding interactivity.
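If you do go with nuxt generate, the usual compromise is exactly what you observed: the generated HTML is a static shell, and the user-specific content is fetched on the client after hydration. A rough sketch of a page component (the /api/dashboard endpoint is hypothetical):

export default {
  data() {
    return { dashboard: null };
  },
  async mounted() {
    // mounted() runs only in the browser, after the static HTML is served,
    // so role-based content never has to be baked in at build time
    const res = await fetch('/api/dashboard', { credentials: 'include' });
    this.dashboard = await res.json();
  },
};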
Based on the custom URL parameters I process, I am trying to dynamically modify a meta tag I have given an id in index.html, like so:
<meta property="og:image" content="http://example.com/someurl.jpg" id="ogImage"/>
The code below in my home.ts seems to be working
document.getElementById('ogImage').setAttribute("content", Media.ImageURL);
I can verify this via the browser dev console (Elements tab).
However, when I view the page from Facebook via their object graph debugger at
https://developers.facebook.com/tools/debug/og/object/
it appears to see the default
http://example.com/someurl.jpg
as if index.html is shipped before my home.ts gets a chance to make the update.
Perhaps my understanding is flawed and there is a better way to do this.
Thank you.
Note 1: initially, I was thinking I had to make some Angular binding between index.html and one of my services, but I could not locate any sample code; the closest I came to was this post:
How can I update meta tags in AngularJS?
But I don't know how to apply it to my Ionic 2/3 code, so I opted for the document.getElementById approach.
Note 2: the ultimate goal here is to share a link to social media (web or app) like Facebook, or a messenger like Viber/Skype, etc., and have it resolve to a meaningful image, title, and description that drive visits back to the site via the browser, or via the app if the user clicking the link is on a mobile device with my app version of the site installed.
Note 3: if you decide to point me to Ionic deep linking, please provide code that matches the above, because I could not understand how to apply it to my case.
If you are trying to implement dynamic Open Graph meta tag values in your pages, you will need a server-side scripting language like PHP. Such a script runs on the server and updates the page as needed before it is served to the requesting site or application.
Client-side scripting (i.e. JavaScript) is usually ignored when a site or app merely visits your site/link to extract (i.e. scrape, or parse out of the HTML) information such as that provided by the Open Graph meta tags (og:title, og:description, og:image...).
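To illustrate the server-side approach, here is a minimal sketch in Node/Express rather than PHP, since any server-side language works; the route and the image lookup are hypothetical:

import express from 'express';

const app = express();

// Serve the shared page with og:image already filled in, so scrapers
// see the final value without executing any JavaScript.
app.get('/share/:id', (req, res) => {
  const imageUrl = `http://example.com/images/${req.params.id}.jpg`; // hypothetical lookup
  res.send(`<!DOCTYPE html>
<html>
<head>
  <meta property="og:image" content="${imageUrl}" id="ogImage"/>
</head>
<body>Loading...</body>
</html>`);
});

app.listen(3000);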
I am trying to create buttons on a web page that allow users to share links to PDF documents on LinkedIn. LinkedIn loads a window without any errors but offers no link or preview of the PDF or any indication of what is being shared.
Here are the two methods I have tried. First, the plugin method.
<script type="in/share" data-url="http://example.net/DocumentDownload.aspx?Command=Core_Download&entryID=114"></script>
And, secondly, with a custom share URL:
<a href="https://www.linkedin.com/sharing/share-offsite/?url=http://example.net/DocumentDownload.aspx?Command=Core_Download&entryID=114">TEST</a>
Encoding the URL makes no difference.
The above links are direct document links from a DNN web site using Document Exchange. If I change the URLs to any HTML page, it works fine, and LinkedIn seems to be able to extract the useful information right from the page and use that for the share details.
Can LinkedIn handle this kind of thing? There is nothing to guide me on the type of links that can be shared, and I can't find any information about it. There are no errors in the web console.
Not sure, but you should try to provide LinkedIn with a link that has .pdf at the end, like http://example.com/documents/file1.pdf. I guess LinkedIn just checks whether the URL ends in .pdf to decide if it is a PDF document or not.
I have no problem sharing PDFs on LinkedIn. Check it out...
https://www.linkedin.com/sharing/share-offsite/?url=https://www.revoltlib.com/anarchism/the-conquest-of-bread/view.pdf
Works perfectly fine. And view.pdf is a script, not a file, so it's not looking for a PDF file to analyze so much as for headers that indicate a PDF file is available to analyze. So, in PHP, the equivalent of DocumentDownload.aspx would do...
header('Content-type: application/pdf; charset=utf-8');
This header lets the sharing app know that it can analyze the document as a PDF file and extract useful information from it.
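For comparison, the same idea as a minimal Node/Express sketch (the route and file location are hypothetical; the point, as with the PHP line above, is the Content-Type header):

import express from 'express';
import path from 'path';

const app = express();

// view.pdf is just a route, not a physical file; the scraper only
// cares that the response headers announce a PDF.
app.get('/view.pdf', (_req, res) => {
  res.type('application/pdf');
  res.sendFile(path.join(process.cwd(), 'files', 'document.pdf')); // hypothetical file
});

app.listen(3000);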
I need to remove sitemap.aspx from the site.
In DNN 6, there is a sitemap.aspx page that simply shows an XML sitemap. I cannot edit or remove that file, so I need to remove that page and recreate it as a simple HTML sitemap.
NOTE: the page name should still be sitemap.aspx.
Sitemap.aspx isn't a physical page you can delete.
You can, however, rename it to something else. It's in your web.config file, under the 'handlers' section. Just look for sitemap.aspx, and change it to something else, like 'searchenginesitemap.aspx'. Don't forget to update your robots.txt file to point to the new sitemap name, or go to the various webmaster console pages in search engines and advise them of the new location.
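For reference, the handler entry in web.config looks roughly like this (a sketch; the exact name and type attributes vary by DNN version):

<handlers>
  <!-- change path="sitemap.aspx" to the new name, e.g. -->
  <add name="Sitemap" verb="*" path="searchenginesitemap.aspx"
       type="DotNetNuke.Services.Sitemap.SitemapHandler, DotNetNuke" />
</handlers>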
The sitemap.aspx page is used to create the XML sitemap for search engines. By changing this, you break that functionality and limit the search-engine visibility of your site.
That being said, in Host Settings -> Advanced Settings you could set up a new Friendly URL that would match .*/sitemap.aspx to another URL/page on your site.
I have long stopped using DNN's native sitemap.aspx... IT'S BUGGY!... and here is how I found out.
I generated my own "clean" sitemap.xml using a free third-party tool and uploaded it to the root of my DNN website, then re-submitted domainname.com/sitemap.xml to Google via Webmaster Tools, and as a result we now get a first-page, top-10 ranking.
Mostly in the top 5, whereas before, using DNN's native sitemap.aspx, we would get random errors, which was pretty annoying, plus a very bad Google PageRank. But those were just my findings of better results. Note: I also placed the location of the sitemap within the robots.txt file...
Although I will admit it is extremely annoying that you cannot just edit the DNN sitemap URL. This creates an issue if you've built the site on a test server and then migrated to production: your DNN sitemap URL only reads the first portal alias from when you first developed the site.
Anyway, these were my findings... others' results may vary... just sharing.