Deploying an Umbraco site to a production database

We have an Umbraco site connected to MSSQL databases, and it moves through three phases:
1) Local:
This site is connected to a database running on our syst server and is for our developers to experiment with; this database exists purely for testing.
2) Syst:
This site is essentially a deployed version of the local site, still connected to our syst database. It is for our testing team to verify that everything looks good (apart from things like product data, since we will be creating a lot of test products in syst).
3) Production:
This is where the magic happens. We have a separate database for production, which holds valid data for our company.
Now my question is:
How would I sync Umbraco changes to our production database while not syncing products? Is there a smart tool made for syncing only Umbraco data and keeping out custom data?

Umbraco Courier: https://umbraco.com/products/umbraco-courier/
You can pick exactly what you want to push to the production site.
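If you'd rather script it yourself, note that in the Umbraco versions I've worked with, Umbraco's own tables share common name prefixes (umbraco* and cms*), so a sync job can enumerate those and skip your custom product tables. Below is a minimal sketch in Python with pyodbc; the connection string, the table prefixes, and the EXCLUDED list are assumptions about your setup, and the actual copy/merge step is deliberately left out:

    import pyodbc

    # Assumed connection string -- adjust server/database/credentials to your setup.
    # The production connection string would look similar, pointing at prod-server.
    SYST = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=syst-server;DATABASE=umbraco_syst;Trusted_Connection=yes"

    # Hypothetical names for the custom tables that must never reach production.
    EXCLUDED = {"Products", "ProductPrices"}

    def umbraco_tables(cursor):
        """List base tables that carry the standard Umbraco name prefixes."""
        cursor.execute(
            "SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES "
            "WHERE TABLE_TYPE = 'BASE TABLE' "
            "AND (TABLE_NAME LIKE 'umbraco%' OR TABLE_NAME LIKE 'cms%')"
        )
        return [row.TABLE_NAME for row in cursor.fetchall()]

    with pyodbc.connect(SYST) as src:
        for table in umbraco_tables(src.cursor()):
            if table in EXCLUDED:
                continue
            print(f"would sync: {table}")  # the real copy/merge step goes here

That said, a tool like Courier works at the content/document-type level rather than the table level, which avoids the referential-integrity pitfalls of raw table copies.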

Related

1) How does Heroku interact with a database (SQLite)? 2) How can I obtain the version of the database that is on Heroku?

I had a very hard time trying to find solutions to this problem online, probably because I don't know how to phrase this weird problem. I also could not reach out to Heroku, as their support team only serves paying customers. I would really appreciate help from anyone.
Some context:
I recently started experimenting with Heroku and deployed my first web app (Python, Flask, SQLite3) via GitHub. It works as intended, and I am even able to interact with my database (add/delete/update). However, after making an update to my code locally and pushing it to GitHub, I realised that when Heroku rebuilt my web app with the updates, the changes made to the SQLite database were reverted to the initial, local version (i.e. I lost the updates I made during testing).
So here comes the first question:
How does Heroku actually interact with the database? Does it create a copy of all my code from GitHub and run it locally on their servers? (No changes to the database are reflected in my GitHub repo, despite successful updates to the database values through the web app.)
And if so, this is the second question:
Is there any way for me to get this "updated version" of my database from Heroku's side so I can download it locally before I make and push any updates to my code, so as not to erase the changes a user might have made through the web app?
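For context on the first question: Heroku dynos run your app from a slug built out of your repository, and their filesystem is ephemeral, so writes to an on-disk SQLite file are discarded whenever the dyno restarts or a new build is deployed; that is why the database keeps reverting. The durable fix is a hosted database such as Heroku Postgres. As a stopgap for the second question, you could add a token-guarded Flask route that streams the current SQLite file back to you, so you can download the live copy before pushing an update. A minimal sketch; the app.db file name and the BACKUP_TOKEN environment variable are assumptions:

    import os

    from flask import Flask, abort, request, send_file

    app = Flask(__name__)

    DB_PATH = "app.db"  # assumed name of the SQLite file the app writes to

    @app.route("/download-db")
    def download_db():
        # Guard the route with a secret token so only you can pull the database.
        if request.args.get("token") != os.environ.get("BACKUP_TOKEN"):
            abort(403)
        return send_file(DB_PATH, as_attachment=True, download_name="backup.db")

Bear in mind this only captures writes made since the dyno last restarted; anything lost to an earlier restart is already gone.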

How to include a database in website deployment?

I am trying to set up a deployment workflow, but I am completely new to it. I am considering using Git and Bamboo, and in this whole thing I am stuck on the database.
Let's say I want to make changes to a CMS website and keep my files versioned in Git (Bitbucket). I understand how to set up Bamboo so it can SCP the files to my web server, but I don't get how I can fit the database into this whole system. Are there any tools I am missing?
What I want:
I want to be able to check out my website files from the Git server, make changes, and send them back to the Git server, and this (via Bamboo) should push the files to the live or test server.
But even after searching for hours, I haven't found a smooth way to handle the database (getting it locally, making changes, and pushing it to the server via Git), or any other smooth way.
I know there are tools to quickly dump the database for WordPress sites, but for other CMSs there are no such tools.
Any advice on how to do this right?
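The usual approach is not to push the database itself through Git, but to version schema changes as numbered migration scripts that live in the repository alongside the code; the deployment step (Bamboo, here) then runs whichever scripts the target database hasn't seen yet. Dedicated tools such as Flyway or Liquibase do exactly this, but the core idea fits in a few lines. A minimal sketch of such a runner in Python; the migrations/ directory layout and the schema_migrations bookkeeping table are assumptions, and sqlite3 stands in for your real database driver:

    import pathlib
    import sqlite3  # stands in for your real database driver

    def migrate(conn, migrations_dir="migrations"):
        """Apply any numbered .sql scripts the database hasn't recorded yet."""
        conn.execute(
            "CREATE TABLE IF NOT EXISTS schema_migrations (name TEXT PRIMARY KEY)"
        )
        applied = {row[0] for row in conn.execute("SELECT name FROM schema_migrations")}
        for script in sorted(pathlib.Path(migrations_dir).glob("*.sql")):
            if script.name in applied:
                continue
            conn.executescript(script.read_text())  # e.g. 001_create_pages.sql
            conn.execute("INSERT INTO schema_migrations (name) VALUES (?)", (script.name,))
            conn.commit()
            print(f"applied {script.name}")

    if __name__ == "__main__":
        migrate(sqlite3.connect("site.db"))

Content, as opposed to schema, usually flows the other way: you periodically pull a sanitized dump of production down to the test server, rather than pushing test content up.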

Use different databases for different tables in WordPress

I am using the WordPress platform for my website, and it is growing constantly.
We often do lots of experiments and customization on it.
Now we would like to use a development site for all these experiments. The thing is, I want to use the same settings and everything for my development site, so everything matches the live website.
My problem is that I can set up a whole new mirror site, but since we will have a different database for the test site, posts published on the live site won't show up on it.
So is there a way I can use the posts table of the live site on the development site?
i.e. everything else will be the development site's own, but it will fetch posts from the live site.
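One pragmatic option, if you don't want the development site reading the live database directly, is a one-way job that periodically copies the posts tables from the live database into the development one. A rough sketch using PyMySQL; the credentials, the table list, and the truncate-and-reload strategy are all assumptions, and it deliberately never writes back to the live side:

    import pymysql

    # Assumed credentials; the live user should be read-only.
    LIVE = dict(host="live-db.example.com", user="ro_user", password="...", database="wp_live")
    DEV = dict(host="localhost", user="dev_user", password="...", database="wp_dev")

    TABLES = ["wp_posts", "wp_postmeta"]  # tables the development site should mirror

    def sync():
        live, dev = pymysql.connect(**LIVE), pymysql.connect(**DEV)
        try:
            with live.cursor() as src, dev.cursor() as dst:
                for table in TABLES:
                    src.execute(f"SELECT * FROM {table}")
                    rows = src.fetchall()
                    dst.execute(f"TRUNCATE TABLE {table}")  # dev copy is disposable
                    placeholders = ", ".join(["%s"] * len(src.description))
                    dst.executemany(f"INSERT INTO {table} VALUES ({placeholders})", rows)
            dev.commit()
        finally:
            live.close()
            dev.close()

    if __name__ == "__main__":
        sync()

Note that wp_posts references other tables (users, terms, and so on), so IDs need to stay aligned between the two sites; a plugin such as HyperDB can also route individual tables to different databases if you'd rather solve this inside WordPress itself.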

How does writing to a database work when a web app uses multiple databases, like Sitecore has?

In Sitecore you basically have three databases. The Core, Master and Web database.
Simply put, the Core database holds all Sitecore settings. The Master database is the authoring database, so it contains all versions of any content.
Then in Sitecore you can "publish" the contents and it will publish the latest version of each content to the Web database.
So suppose I have a website with a news page, and a user is able to edit a news item from the website (so not through the CMS). How would the database get updated when it's set up like this?
It would probably update the Web database, but then when I go into the CMS I don't see the latest changes, since the CMS reads from the Master database, right?
So does that mean that it should write twice? Once to the Web database and once to the Master database?
Can anyone tell me how this works in Sitecore or the like?
The reason I'd like to know this is because I'm thinking of creating a similar database setup, and I'm just not sure how to solve this issue.
When you have items that need to be updated by the website visitor, you need to use the SitecoreService SOAP web service, or create your own custom web service that runs on the Master instance and triggers a publish after updating.
Well, Sitecore has a publishing step. When the user publishes in Sitecore, it updates the Web database at that point. If you want to build a similar system, I would simply store all versions of an item in the Master database and copy the latest version to the Web database only when the user chooses to publish.
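To make that concrete, here is a toy sketch of the pattern in Python (Sitecore itself is .NET, and these class names are invented for illustration): the master store keeps every version, the web store keeps only the latest published content, and a visitor-submitted edit is written to master and then published immediately, rather than written twice independently:

    from dataclasses import dataclass, field

    @dataclass
    class Item:
        item_id: str
        versions: list = field(default_factory=list)  # full history lives in master

        @property
        def latest(self):
            return self.versions[-1]

    class MasterStore:  # plays the role of Sitecore's Master database
        def __init__(self):
            self.items = {}

        def save_version(self, item_id, content):
            self.items.setdefault(item_id, Item(item_id)).versions.append(content)

    class WebStore:  # plays the role of the Web database: latest published only
        def __init__(self):
            self.published = {}

    def publish(master, web, item_id):
        web.published[item_id] = master.items[item_id].latest

    # A visitor edit from the public site: write to master, then publish at once.
    master, web = MasterStore(), WebStore()
    master.save_version("news-1", "Initial story")
    publish(master, web, "news-1")
    master.save_version("news-1", "Visitor-corrected story")
    publish(master, web, "news-1")
    print(web.published["news-1"])  # -> Visitor-corrected story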
If your site
- generates a lot of comments
- generates the comments continuously
- uses multiple content delivery servers
- requires CMS users to manage them
I would not store the comments as content items.
The reason is HTML cache and publishing behavior.
On a high-volume site you'd almost certainly use HTML caching to achieve the best possible performance. If a publish is required to show comments, you'd need frequent publish actions, and thus the HTML caches would be cleared often.
You don't want that :-)
Modeling it after the DMS implementation is the safest approach (not the cheapest, and Datatables isn't something I recommend these days): store the comments in a separate database, possibly using queuing to prevent an overload if things get busy.
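To illustrate that last point, a tiny sketch of queue-buffered comment writes: request handlers only enqueue, and a single worker drains the queue in batches into the separate comments database, so a burst of comments never hammers the database or triggers a publish. The table layout and batch size are assumptions:

    import queue
    import sqlite3  # stands in for the separate comments database
    import threading
    import time

    comment_queue = queue.Queue()

    def enqueue_comment(page_id, author, body):
        """Called from the request handler: cheap, never touches the database."""
        comment_queue.put((page_id, author, body))

    def worker(db_path="comments.db"):
        conn = sqlite3.connect(db_path)
        conn.execute(
            "CREATE TABLE IF NOT EXISTS comments (page_id TEXT, author TEXT, body TEXT)"
        )
        while True:
            batch = [comment_queue.get()]  # block until something arrives
            while len(batch) < 100:  # then drain up to one batch
                try:
                    batch.append(comment_queue.get_nowait())
                except queue.Empty:
                    break
            conn.executemany("INSERT INTO comments VALUES (?, ?, ?)", batch)
            conn.commit()

    threading.Thread(target=worker, daemon=True).start()
    enqueue_comment("news-1", "alice", "Great article!")
    time.sleep(0.5)  # give the worker a moment in this toy example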

CMS with remote database: performance issues

My problem is a bit complicated. Basically, I created a client/server CMS architecture that worked very well for a while. Now that there are more customers, it is getting very slow and I don't really know how to fix it.
Let me explain the current architecture:
I've developed a content management system that serves various customers. There is a CMS server where each customer has an account to manage the content of his or her website. All customers work on the same interface and store their content in the same database on the CMS server. So, for every new customer, I just have to open a new account on the CMS server and they can start managing their content.
To display that content, I have to create a customized website for each customer. That website frontend can run on one of my servers, or the customer can host it himself. This frontend then has to connect to the CMS server to fetch the content.
On the CMS server, there is a PHP file called "share.php". It accepts parameters such as 'content_ID' to specify the content, and outputs that content in JSON format.
On the frontend, I use file_get_contents("{cms_server}/share.php?content_ID=34"); to retrieve the data from the CMS server.
As I said, it worked very well for some time, when there were few customers using the system. Now, however, a page load takes at least a few seconds, and it's getting worse.
Do I just need to increase performance on the CMS server, or does the concept of retrieving data with file_get_contents() just suck big time? :D
I'd appreciate your recommendations on how to fix this problem.
Cheers.
You probably need to look at your database: do you need to add indexes? Are you making redundant calls? Are you making many small SELECTs that could be combined into one big one? And so forth.
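Beyond the database, the architecture itself is a likely culprit: every page view on a frontend triggers a blocking HTTP round-trip to the CMS server via file_get_contents(). Caching the JSON responses on the frontend for a short TTL removes most of those round-trips. The original frontend is PHP, but the idea is language-agnostic; here is a sketch in Python, where the URL and the 60-second TTL are assumptions:

    import json
    import time
    import urllib.request

    CMS = "https://cms.example.com/share.php"  # hypothetical CMS server URL
    TTL = 60  # seconds to serve cached content before re-fetching

    _cache = {}  # content_id -> (fetched_at, data)

    def get_content(content_id):
        now = time.time()
        hit = _cache.get(content_id)
        if hit and now - hit[0] < TTL:
            return hit[1]  # cache hit: no round-trip to the CMS server
        with urllib.request.urlopen(f"{CMS}?content_ID={content_id}") as resp:
            data = json.loads(resp.read())
        _cache[content_id] = (now, data)
        return data

Even a one-minute TTL turns one CMS request per page view into roughly one request per content item per minute, regardless of traffic.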
