If I have several sites in one Wagtail installation, is it possible to have one database for each site, or are all sites saved in the same database?
All sites are saved in the same database. Of course, if you really need separate databases, there's nothing to stop you from setting up a separate Wagtail installation for each one.
You can use Python scripts to fetch data from the main multisite database into a private database for each website. You do need some knowledge of working with databases from Python, and each database engine needs slightly different scripting.
Search Google or Stack Overflow for your database engine plus Python, and good luck.
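For example, here is a minimal sketch of that idea, assuming SQLite on both sides and made-up table/column names (a real Wagtail schema is more involved, and for PostgreSQL or MySQL you would swap in the matching driver, e.g. psycopg2 or mysqlclient):

    import sqlite3

    # Hypothetical paths, hostname and table layout, purely for illustration.
    SHARED_DB = "wagtail_shared.db"
    SITE_DB = "site_a.db"
    SITE_HOSTNAME = "site-a.example.com"

    src = sqlite3.connect(SHARED_DB)
    dst = sqlite3.connect(SITE_DB)

    # Create a matching table in the per-site database.
    dst.execute(
        "CREATE TABLE IF NOT EXISTS pages ("
        " id INTEGER PRIMARY KEY, title TEXT, body TEXT)"
    )

    # Copy only the rows that belong to this site.
    rows = src.execute(
        "SELECT p.id, p.title, p.body"
        " FROM pages AS p JOIN sites AS s ON p.site_id = s.id"
        " WHERE s.hostname = ?",
        (SITE_HOSTNAME,),
    )
    dst.executemany(
        "INSERT OR REPLACE INTO pages (id, title, body) VALUES (?, ?, ?)", rows
    )

    dst.commit()
    src.close()
    dst.close()

The same pattern (one read connection, one write connection per site, copy row by row) carries over to any DB-API driver; only the connection setup and the SQL dialect change.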
Hi guys, I'm working on an existing Episerver project (my first one).
One of the issues we are having is that we have three environments for our Episerver website: Developer / Staging / Live.
All have separate DBs. Lots of media items have been added to our live environment via the CMS, and we want to sync this with our staging environment.
However, when we use the export data feature from the live admin section and try to restore it to our staging environment, we end up with missing media, duplicate folders, etc.
Is there a tool/plugin available to manage content/media across multiple environments? Umbraco has something called "Courier" (Umbraco being another CMS I have used in the past); I'm looking for the Episerver equivalent.
Or is the best way to do this to export the live SQL database and overwrite my staging one? We have different user permissions set in these environments; how can we manage that?
How is this generally done in the world of Episerver?
Unfortunately, the most common way to handle this is, as you say, to do it manually: restore the DB, copy the fileshare, and set up the access rights on the staging environment after the restore.
Luc made a nice provider for keeping your local environment in sync. https://devblog.gosso.se/2017/09/downloadifmissingfileblob-provider-version-1-6-for-episerver/
I am using WSO2 API Manager 02.01.00 on a Linux system. The API Manager is deployed in folder A. The databases (H2) are deployed in folder B, which is not inside folder A. The datasources in /repository/conf/datasources/master-datasources.xml point correctly to the databases in folder B. I configured it like that because I want to preserve the databases across deployments (a few developers are using the API Manager and they don't want to lose their data).
But it seems that WSO2AM_DB.h2.db is created anew on each API Manager deployment. I think this because I looked at the DB size: I started with a size of 1750 KB for WSO2AM_DB.h2.db, I published a few APIs in the Manager and the size increased to 2774 KB, then I did a deployment and the size went back to 1750 KB.
The effect is that the API Store/Publisher says "There are no APIs published yet".
But I can still see the APIs under Application Subscriptions and in the Carbon resources at /_system/governance/apimgt/applicationdata/provider/admin.
I tried to force a re-indexing with this, but it doesn't change anything.
Can I configure somewhere that the database should not be created/manipulated at startup?
Meanwhile I'm getting really desperate about this problem.
Maybe you can help me.
Thank you for your time.
WSO2 does not recommend running on the H2 database. You need to use a production database such as MySQL, Oracle, etc.; H2 is only for trying things out.
Basically, WSO2 servers store data in databases as well as using the file system. For this kind of deployment, you need to do the following.
Point to an external database. If you are using this for demo purposes, you can still stay with the current mode (H2 database); a sample datasource entry is sketched after this list.
Use dep-sync. The content under the WSO2_HOME/repository/deployment/server location needs to be preserved. You can use SVN-based dep-sync or rsync. The basic idea is that a new deployment needs to carry over the data of the previous deployment.
Preserve the Solr indexes. If you have hundreds or thousands of APIs in the system, re-indexing takes time. To avoid that, you can copy the contents of WSO2_HOME/solr to the new deployment.
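As a rough illustration of the first point, a datasource entry in repository/conf/datasources/master-datasources.xml pointing WSO2AM_DB at an external MySQL server could look something like the sketch below (host, database name, credentials and pool settings are placeholders, not a tested configuration; check the WSO2 docs for your exact product version):

    <datasource>
        <name>WSO2AM_DB</name>
        <description>Datasource used by the API Manager</description>
        <jndiConfig>
            <name>jdbc/WSO2AM_DB</name>
        </jndiConfig>
        <definition type="RDBMS">
            <configuration>
                <!-- placeholder host/db/credentials; adjust to your environment -->
                <url>jdbc:mysql://db.example.com:3306/apim_db?autoReconnect=true</url>
                <username>apimuser</username>
                <password>changeme</password>
                <driverClassName>com.mysql.jdbc.Driver</driverClassName>
                <maxActive>50</maxActive>
                <maxWait>60000</maxWait>
                <testOnBorrow>true</testOnBorrow>
                <validationQuery>SELECT 1</validationQuery>
            </configuration>
        </definition>
    </datasource>

Remember to copy the MySQL JDBC driver into repository/components/lib and to run the matching scripts from the dbscripts folder before starting the server.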
I have been tasked with creating a CMS using Symfony 3, which will be offered to various clients to update their websites. There is one installation for our own use, to control the clients' sites, logins, etc., and then another installation which is the client CMS itself. But it only needs to be a single installation in one place that can be accessed by all clients via their own personal login.
This means I would like each client to have their own content stored in a separate database, so essentially there would be one CMS location and installation that hooks up to multiple databases. The database used would depend on the client that logs in. As far as they are concerned, it would be their own CMS, with their own data.
I cannot see an obvious way to set this up in Symfony 3, as it uses the parameters.yml file to reference the database setup. Also, how would it know which database to persist and flush the content to when it's being saved?
Help on this would be much appreciated.
I think the documentation is crystal clear:
http://symfony.com/doc/current/doctrine/multiple_entity_managers.html
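To make that concrete, a rough Symfony 3 sketch of one Doctrine connection and one entity manager per client could look like the snippet below in app/config/config.yml (the connection names, database names and parameters here are placeholders, not something taken from your project):

    doctrine:
        dbal:
            default_connection: default
            connections:
                default:
                    driver:   pdo_mysql
                    host:     '%database_host%'
                    dbname:   cms_admin
                    user:     '%database_user%'
                    password: '%database_password%'
                client_a:
                    driver:   pdo_mysql
                    host:     '%database_host%'
                    dbname:   client_a_cms
                    user:     '%database_user%'
                    password: '%database_password%'
        orm:
            default_entity_manager: default
            entity_managers:
                default:
                    connection: default
                    mappings:
                        AppBundle: ~
                client_a:
                    connection: client_a
                    mappings:
                        AppBundle: ~

In a controller you can then pick the manager for the logged-in client, e.g. $em = $this->getDoctrine()->getManager('client_a'), and persist/flush through that manager; which name you pass can be derived from the client's login.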
I have a WordPress blog that is currently running on Linux/MySQL. Now, I have seen a product called Brandoo WordPress which lets you run WordPress on IIS + MSSQL.
Since I am using Windows Server and MSSQL for all my other projects, I would very much like to use them for my WordPress blog too. The WordPress site is quite big and important. The blog is beloved for its adult content. It has a revenue of thousands of dollars per month, so I don't want to rush into anything here.
Brandoo WordPress is part of the application gallery in the Windows Platform Installer and also in Windows Azure.
So my questions are:
Since Brandoo WordPress is part of the apps in Azure, do you think it is quality-assured by Microsoft?
I guess before Microsoft adds a web app to Azure and the Platform Installer it has to be safe and bug-free, right?
I have tested my WordPress locally with Brandoo WordPress and it seems to work great so far.
I'm a member of the Brandoo WordPress team and I think I can help you. Brandoo WordPress is based on MSSQL. If you are using plugins that issue non-standard DB queries (ones that are not the same for MSSQL and MySQL), you may face a situation where you have to drop those plugins until we create translations for the queries that are not translated yet. There is also one more thing: Brandoo WordPress is currently one step behind the WordPress mainline, because of a MySQL-specific query in the on-site search function. We do not want to fork WP and change it to an MSSQL schema, so we are still working on translating or disabling that sub-function (if we agree that this is a safe way to do it). If this is OK for you, then Brandoo WordPress is good for your production site.
I wouldn't call this a guarantee, but one of the principles of submission to the web app store is to "Be Safe".
I've just started using Heroku with Django and it seems great. However, when I change my existing models, I'm not sure how to apply those changes to the Heroku environment. syncdb works just fine for adding brand-new database tables, but how should I modify existing tables?
I found out that Heroku provides psql access only on dedicated databases, so that's out of the question. I haven't tried South, but it seems like a solution.
So I guess I'm asking: how do I make database schema changes with Django on Heroku?
What you are asking for is called "schema migration" or even "schema evolution". Django has some documentation about it on the wiki.
Django's syncdb command does not support that. As a matter of fact, the documentation for syncdb is clear:
Creates the database tables for all apps in INSTALLED_APPS whose
tables have not already been created
Rather, Django proposes dropping the tables manually and then running syncdb again, in the documentation for the deprecated reset command:
You can also use ALTER TABLE or DROP TABLE statements manually.
But fear not: there are many reusable apps to help you with proper schema migrations, and hopefully you can pick the one that suits you best. Rather than elaborate here, let me link an article I wrote about Django schema migration which compares all the current solutions.
South works great on Heroku.
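For reference, a typical South workflow for an existing app on Heroku looks roughly like this (myapp is a placeholder app name, and 'south' also has to be added to INSTALLED_APPS):

    pip install South
    python manage.py syncdb                  # creates South's own bookkeeping tables
    python manage.py convert_to_south myapp  # fake initial migration for existing tables
    # ... change your models ...
    python manage.py schemamigration myapp --auto
    python manage.py migrate myapp           # apply locally
    git push heroku master
    heroku run python manage.py migrate myapp   # apply on Heroku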