Episerver - How to Manage Media Items on Multiple Environments

Hi guys, I'm working on an existing Episerver project (my first one).
One of the issues we are having is that we have three environments for our Episerver website: Developer / Staging / Live.
All have separate DBs. At the moment, lots of media items have been added to our live environment via the CMS, and we want to sync these with our staging environment.
However, when we use the export data feature from the live admin section and try to restore it to our staging environment, we end up with missing media, duplicate folders, etc.
Is there a tool/plugin available to manage content/media across multiple environments? Umbraco has something called "Courier" (Umbraco being another CMS I have used in the past); I'm looking for the Episerver equivalent.
Or is the best way to do this to export the live SQL database and overwrite my staging one? We have different user permissions set in these environments; how can we manage that?
How is this generally done in the world of Episerver?

Unfortunately the most common way to handle this is, as you say, to do it manually: restore the DB, copy the fileshare, and set up the access rights on the stage environment after the restore.

Luc made a nice provider for keeping your local environment in sync. https://devblog.gosso.se/2017/09/downloadifmissingfileblob-provider-version-1-6-for-episerver/

Related

wso2am deployment overrides database, APIs are lost

I am using WSO2 API Manager 02.01.00 on a Linux system. The API Manager is deployed in folder A. The databases (H2) are deployed in folder B, which is not inside folder A. The datasources in /repository/conf/datasources/master-datasources.xml point correctly to the databases in folder B. I configured it like that because I want to preserve the databases across deployments. (A few developers are using the API Manager and they don't want to lose their data.) But it seems that WSO2AM_DB.h2.db is created anew on every API Manager deployment. I think this because I looked at the DB size: I started with a size of 1750 KB for WSO2AM_DB.h2.db, published a few APIs in the Manager, and the size increased to 2774 KB. Then I did a deployment and the size returned to 1750 KB.
The effect is that the API Store/Publisher says "There are no APIs published yet".
But I can still see the APIs under Application Subscriptions and in the Carbon resources at /_system/governance/apimgt/applicationdata/provider/admin.
I tried to force a new indexing with this, but it doesn't change anything.
Can I configure somewhere that the database should not be created/manipulated at startup?
Meanwhile I'm really desperate about not solving this problem.
Maybe you could help me.
Thank you for your time.
WSO2 does not recommend running on the H2 database. You need to use a production database such as MySQL, Oracle, etc. H2 is only for tryouts.
Basically, WSO2 servers store data in databases as well as on the file system. For this kind of deployment, you need to do the following.
1) Point to an external database. If you are using this for demo purposes, you can still go with the current mode (H2 database).
2) Use dep-sync. The content under the WSO2_HOME/repository/deployment/server location needs to be preserved. You can use SVN-based dep-sync or rsync. The basic idea is that a new deployment needs the data of the previous deployment.
3) Preserve the Solr index. If you have hundreds or thousands of APIs in the system, indexing takes time. To avoid that, you can copy the content of WSO2_HOME/solr to the new deployment.
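For step 1, the change is made in the same repository/conf/datasources/master-datasources.xml file you mentioned. A hedged sketch of an external datasource entry, assuming a MySQL database called apim_db (URL, credentials, and pool settings below are placeholders to adapt, not values from your setup):

```xml
<datasource>
    <name>WSO2AM_DB</name>
    <description>The datasource used for the API Manager database</description>
    <jndiConfig>
        <name>jdbc/WSO2AM_DB</name>
    </jndiConfig>
    <definition type="RDBMS">
        <configuration>
            <!-- placeholder host/database/credentials: adjust for your environment -->
            <url>jdbc:mysql://localhost:3306/apim_db</url>
            <username>apim_user</username>
            <password>changeme</password>
            <driverClassName>com.mysql.jdbc.Driver</driverClassName>
            <maxActive>50</maxActive>
            <testOnBorrow>true</testOnBorrow>
            <validationQuery>SELECT 1</validationQuery>
        </configuration>
    </definition>
</datasource>
```

Because the database then lives outside the product directory entirely, a redeployment of the server no longer touches the data.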

Multi-site in one Wagtail installation

If I have several sites in one Wagtail installation, is it possible to have one database for each site or all sites are saved in the same database?
All sites are saved in the same database. Of course, if you really need separate databases, there's nothing to stop you from setting up a separate Wagtail installation for each one.
You can use Python scripts to fetch data from the main multi-site database into private DBs for each website. Of course, you need some knowledge of working with databases from Python, and each database engine needs different scripting.
Search on Google or Stack Overflow (DB & Python), and good luck.
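As a rough illustration of that approach, here is a minimal Python sketch using the standard sqlite3 module. The `pages` table and `site_id` column are hypothetical stand-ins; a real Wagtail schema is far more involved, and you would normally go through the Django ORM rather than raw SQL:

```python
import sqlite3

def export_site(main_db_path, site_db_path, site_id):
    """Copy one site's rows from the shared database into a private per-site DB."""
    main = sqlite3.connect(main_db_path)
    site = sqlite3.connect(site_db_path)
    # hypothetical schema: a "pages" table with a "site_id" column
    site.execute(
        "CREATE TABLE IF NOT EXISTS pages (id INTEGER PRIMARY KEY, site_id INTEGER, title TEXT)"
    )
    rows = main.execute(
        "SELECT id, site_id, title FROM pages WHERE site_id = ?", (site_id,)
    ).fetchall()
    site.executemany("INSERT OR REPLACE INTO pages VALUES (?, ?, ?)", rows)
    site.commit()
    main.close()
    site.close()
    return len(rows)  # number of rows copied
```

The same idea ports to other engines, but as noted above, each engine needs its own driver and SQL dialect.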

Using multiple databases and parameters with one symfony 3 installation

I have been tasked with the job of creating a CMS using Symfony 3, which will be offered to various different clients to update their websites. There is one installation for our own use, to control the clients' sites, logins etc. and then another installation which will be the client CMS itself. But it only needs to be a single installation in one place that can be accessed by all clients, via their own personal login.
This means I would like each client to have their own content stored in a separate database, so essentially there would be one CMS location and installation that hooks up to multiple databases. The database used would depend on the client that logs in. As far as they are concerned, it would be their own CMS, with their own data.
I cannot see an obvious way of being able to set this up in Symfony 3, as it uses the parameters.yml file to reference the database setup, and also, how would it know which database to use to persist and flush the content to when it's being saved?
Help on this would be much appreciated.
I think the documentation is crystal clear
http://symfony.com/doc/current/doctrine/multiple_entity_managers.html
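For quick reference, that page boils down to declaring one DBAL connection and one entity manager per database. A minimal sketch (connection names, parameter names, and bundle mappings here are illustrative, not from the question):

```yaml
# app/config/config.yml (Symfony 3) -- names below are placeholders
doctrine:
    dbal:
        default_connection: default
        connections:
            default:
                url: '%database_url%'            # your own control database
            client_a:
                url: '%client_a_database_url%'   # one connection per client DB
    orm:
        default_entity_manager: default
        entity_managers:
            default:
                connection: default
                mappings: { AppBundle: ~ }
            client_a:
                connection: client_a
                mappings: { ClientBundle: ~ }
```

That also answers the persist/flush part of the question: Doctrine never guesses. You request a manager by name, e.g. `$this->getDoctrine()->getManager('client_a')`, and anything persisted and flushed through that manager goes to that client's database. Choosing the name based on the logged-in client is then ordinary application logic.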

Clarification on splitting an Access Database

I've read multiple articles and watched videos but this is a big change to the structure so I want to confirm the idea that I have.
Splitting will separate the tables and forms/queries into separate files. I get that much. But two questions.
1) Should I backup my database beforehand?
2) Can I edit the forms in design/layout view while they're being used by, say, a data entry team?
The issue I'm running into now is that I created a simplistic front end for another team to use but I now need to buff it up while they use it. I heard this was the most efficient way to do so.
1) Yes, of course.
2) Every user should have their own local copy of the frontend.
You develop the new frontend version on your local computer, then when it's ready you put it on a network drive, and everyone gets their new local frontend from there.
Here are some ideas how to automate this: https://stackoverflow.com/a/33782644/3820271
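As one minimal illustration of that kind of automation (real-world setups often use a batch file on the launch shortcut instead), here is a Python sketch that refreshes the local frontend from the share only when the share copy is newer; both paths are placeholders:

```python
import shutil
from pathlib import Path

def update_frontend(share_path, local_path):
    """Copy the frontend from the network share when the share copy is newer."""
    share = Path(share_path)
    local = Path(local_path)
    if not local.exists() or share.stat().st_mtime > local.stat().st_mtime:
        shutil.copy2(share, local)  # copy2 preserves the timestamp, so each release copies once
        return True   # a new frontend was pulled down
    return False      # the local copy is already current
```

Run something like this from whatever launches the frontend, so every user picks up a new release automatically.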
You should never make design changes on a database that other users are currently using. (shudder)
Edit re. comment:
All forms are in the frontend, so yes. You work on your development frontend, connected to a development backend (a copy of the production backend).
The other users can meanwhile work with their local frontends on the production backend.
When you are ready for release, make the necessary changes in the production backend (if there are changes in table structures). Make a copy of your dev frontend and link the tables from prod backend. This is the new prod frontend, which is distributed to all users.

Integration of different works by different people in moodle

We are developing a Moodle site. We are a group of 5 people and each of us is working on a different module locally. Now we want to integrate everyone's work onto one machine or server. Is there any way to version control or integrate it, given that each person's database is different because of different data? Please provide the solution as early as possible.
It is not completely clear as to whether you are separately working on the content of the site or the code for the new site, so I will attempt to answer both questions.
For content the easiest way to integrate it all together into one site is to use the Moodle backup and restore mechanism ( http://docs.moodle.org/26/en/Course_backup ) - backup each of the courses and then restore them onto the main site. If you have a lot of courses to transfer, then it may make more sense to write some code to automate certain aspects of this, but that can be quite a bit of work, so usually it is easier to just manually do the backup and restore.
For code the answer is Git. All the core Moodle code is version controlled via Git. Make sure that each developer is working with their own clone of your main Git repository (you can find the core Moodle repository at …). Once they have committed each of their changes, they can be pushed to a central repository or pulled to your production site. Read more at http://docs.moodle.org/dev/Git_for_developers
Note that if the code for each module has been written with the proper DB installation / upgrade code ( http://docs.moodle.org/dev/Upgrade_API ) then it should simply be possible to take the code from each of the developed modules, put them together into one codebase and then create a fully-working fresh install. Once you have that, you should be able to use backup and restore to transfer any required courses from the development servers to the live server.