Just a question out of curiosity: how do (or can) big online stores with lots of products store their product pages?
I know one solution might be a database and generating the content on demand, but how would they be able to handle URL requests like www.onlinestore.com/productname.htm? (Specifically, I mean this store: http://www.alza.cz/pevne-disky/18842851.htm.) Every product links to a productname.htm page.
Is it possible (and viable) to have all product pages pregenerated?
The answer will vary from vendor to vendor, but typically this is achieved with a combination of custom URL routing and a server-side, database-driven language like ASP.NET or PHP.
In the case of your example, it is likely that the "18842851" in http://www.alza.cz/pevne-disky/18842851.htm is actually a product Id, and their server parses that product Id in order to get the correct product content from a database. With the content in hand, the server can then render it into a reusable template.
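For illustration, here is a minimal sketch of that idea using Node/Express (the answer names ASP.NET and PHP; Express is just a stand-in here, and getProductById and the template name are hypothetical):

// Route every "/<category>/<digits>.htm" URL to one handler, treat the digits
// as a product Id, look the product up, and render a shared template.
const express = require('express');
const app = express();

app.set('view engine', 'ejs'); // any server-side template engine would do

app.get('/:category/:page', async (req, res) => {
  const match = req.params.page.match(/^(\d+)\.htm$/); // "18842851.htm" -> "18842851"
  if (!match) return res.sendStatus(404);

  const product = await getProductById(match[1]); // hypothetical database lookup
  if (!product) return res.sendStatus(404);

  res.render('product', { product }); // one reusable template for every product
});

app.listen(3000);

The key point is that no productname.htm file exists on disk; every product URL funnels into the same handler and template.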
Related
I am working on an application where a product detail page needs to be tracked without using Google Analytics. I have a scenario where a product is either trending or not. To determine that, I need to get the view count of that page. A product with more views is considered trending, and which products are trending may change from day to day.
I have done some research (e.g. https://github.com/nytimes/react-tracking) but didn't find a complete solution. I need help adding this functionality.
I have this big issue.
I have developed a Gatsby website for the car parts market.
Users can select a car model and a part type and receive a page with the parts for their specific car.
I have used React Router to dynamically create the new path, and an API that retrieves the data:
https://mysiteAPI.com/{car.id}/{part.id}
The paths that will be created look something like this: https://mysite.com/comparison/{car.name}-{part.name}
The big issue I have is that this path is created only when a user clicks: with this approach I cannot generate the pages when I build the website; they are generated only when users click.
I need the pages to be readable by search engine crawlers for SEO reasons.
I tried to create each page by calling createPage in gatsby-node, but the build crashed due to the huge number of pages (over 5 million).
I don't know how to generate pages that are readable by crawlers and persist. I hope someone can help me improve my site.
Thanks in advance
Take a look at the createPages API, available in the gatsby-node file of your project. If you would like to create static HTML files in the build process for each item, you need to fetch them first somehow. With the data in your possession, you can iterate over it and call createPage for each page you want to create.
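A minimal sketch of what that could look like in gatsby-node.js, assuming a hypothetical endpoint that returns all car/part combinations and a hypothetical comparison template:

// gatsby-node.js: create one static page per car/part combination at build time.
const fetch = require('node-fetch'); // or any other way to load data at build time

exports.createPages = async ({ actions }) => {
  const { createPage } = actions;

  // Hypothetical endpoint returning [{ carId, carName, partId, partName }, ...]
  const response = await fetch('https://mysiteAPI.com/all-combinations');
  const items = await response.json();

  items.forEach((item) => {
    createPage({
      path: `/comparison/${item.carName}-${item.partName}`,
      component: require.resolve('./src/templates/comparison.js'),
      context: { carId: item.carId, partId: item.partId }, // available to the template
    });
  });
};

Note that with over 5 million combinations a single build like this will still be very heavy, which matches the crash described above; the sketch only shows the mechanics of createPage.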
I have a quick question. I just built 3 websites for customers using React, and they're all deployed and working, but now one of these customers is asking me how they can update the content. All the websites are stored in one or many JS files after building them with npm run build and configuring them with webpack. How should I build a website where customers plan to change texts or add new images? The websites have a couple of map functions to display many pictures or cards based on products, so every time there's a change I have to go through the React files, change them, build again, and redeploy, which seems pretty static to me even though React is supposed to work with components and be dynamic. What would be the best approach and/or tools to build a website where a customer can change a .JS file with text in the cPanel file manager, or just add pictures to a directory in the cPanel file manager, and have it render automatically? I was thinking I should make it server-side so I can avoid having to rebuild and re-upload every time there are changes, but what would be the best options besides React to make it dynamic? Should I use Node, Express?
Thank you so much for your time !
One of the options could be the Contentful API: you can store all of your data there and give your customer permission to edit it. This will make it easier to change the content you mentioned ("the websites have a couple of map functions to display many pictures or cards based on products"). So what you will do is simply fetch the data from the API and render it on your website; this will allow your customers to CRUD the content of the website without your intervention. It is pretty easy to use, and you can find out more about it here: https://www.contentful.com/developers/docs/.
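For example, with the official contentful JS SDK (the space id, token, and the "product" content type are placeholders):

// Fetch all entries of a (hypothetical) "product" content type from Contentful
// and hand them to the existing map() calls instead of hard-coded data.
const contentful = require('contentful');

const client = contentful.createClient({
  space: 'YOUR_SPACE_ID',             // placeholder
  accessToken: 'YOUR_DELIVERY_TOKEN', // placeholder
});

client.getEntries({ content_type: 'product' }).then((response) => {
  // Each entry's editable data lives on entry.fields
  response.items.forEach((entry) => {
    console.log(entry.fields.productName, entry.fields.price);
  });
});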
Regarding the content of the website itself (pages, texts on pages, and other static stuff that does not depend on actual data but is only a part of your code), I think it could be hard to manage without you coding it manually. If there is a specific part of the content that is going to change, you could build a new admin page where the user can change the content and data on your website; in that case you will have to add some back-end (Node, Express, etc.), as sketched at the end of this answer. But what you're describing is actually a CMS, which is really popular nowadays; you might want to take a look at some of them:
https://www.wix.com/
https://wordpress.com/
But if the texts you've mentioned are just part of your models, e.g. you have a product with something like this:
const product = {
  productName: 'name',
  price: '100',
  image: 'img',
  available: false,
}
and clients want to change productName, price, image, or availability, then Contentful is pretty sufficient for you. If you need some help with this platform, you can DM me and I will try to help you.
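As mentioned above, here is a minimal sketch of the admin back-end idea with Node/Express, assuming the editable content lives in a JSON file (the routes and file name are hypothetical, and real use would need authentication):

// Minimal content-editing back-end: the site's texts/images list lives in a
// JSON file that the React app fetches, and an admin endpoint updates it.
const express = require('express');
const fs = require('fs');
const path = require('path');

const app = express();
app.use(express.json());

const CONTENT_FILE = path.join(__dirname, 'content.json');

// The React app fetches its texts and image lists from here instead of
// hard-coding them in the bundled JS files.
app.get('/api/content', (req, res) => {
  res.sendFile(CONTENT_FILE);
});

// The customer's admin page POSTs updated content here.
app.post('/api/content', (req, res) => {
  fs.writeFileSync(CONTENT_FILE, JSON.stringify(req.body, null, 2));
  res.sendStatus(204);
});

app.listen(3000);

With this setup the customer edits content through the admin page (or even the JSON file in the cPanel file manager) and the site re-renders without a rebuild.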
I’m in the process of building a new webapp which uses Wagtail. This is an architecture question.
There will be the standard About, Terms of Service, and Blog pages. All of which fit in with the Wagtail paradigm very nicely. The rest of the site content is location-based information about specific types of businesses. Think of a Foursquare-type app. The data for these pages will be very structured and updated via user-facing webpages and a mobile app, and the pages will be JavaScript-heavy.
For the regular Wagtail pages, there may be hundreds over time. For the location page types there will be (hopefully) tens of thousands of pages that will be nested to a max of four levels.
From a site-wide feature perspective, I’m looking to leverage Wagtail features like sitemaps and Elasticsearch integration.
My question is: should I use the Wagtail Page class for my location-based pages?
Pros:
Easy integration with Elasticsearch and Wagtail sitemaps
Leverage Wagtail editor on the admin side.
Consistent model and api structure throughout.
Cons:
Potentially more overhead for location pages.
Need to utilize more hooks to manipulate the view output (Lots of JS, so will be inserting non-model info into the template).
Potentially limits the use of some third-party modules.
If I went the non-Wagtail route, is it possible to add non-Wagtail models to the search index?
Are there other issues I should consider in this choice?
Any Django model can be indexed and searched by inheriting from index.Indexed and defining search_fields on the model.
As for using pages or not, as always it depends on many things. However, it looks like it would make sense to keep your location model as a plain model, not make it a page, and expose it in the admin (either the Django admin, if you need some extensions that already exist there, or Wagtail's modeladmin otherwise). Then you would create a LocationIndexPage which implements RoutablePageMixin to serve the location pages dynamically. The pages won't exist in the admin page tree but will be reachable anyway.
How do I get basic demographic data for personalizing a webpage?
There is Quantcast, but it doesn't provide this functionality.
To personalize a webpage it is necessary to get this information in real time, to adapt the webpage before sending it to the visitor.
It's called website personalization, and there are a lot of blog entries about how to gather this data, but none about how to use existing data.
Is there a solution ready to use?