Backup and restore in SonarQube: database

I would like to back up my SonarQube production server. Can you please help with the two queries below:
1. What needs to be backed up (e.g. the database, or which folders from the SonarQube home directory)?
2. Is there an existing solution that can be used directly to take a backup of SonarQube?
Thanks and Regards,
Deepti

We regularly do backups in our company, and we only back up two things:
the database, because it contains all the data and, per the upgrade notes, is the only thing you need for a restore: https://docs.sonarqube.org/latest/setup/upgrading/
the ./extensions/plugins folder, because you never know afterwards which version of which plugin was installed, and you might have a custom plugin, or one that is not in the Marketplace, which you will surely forget about.
There is no reason to back up the Elasticsearch data, as SonarQube will recreate all the necessary indexes on startup. Just be aware that the first startup will take some time, depending on the amount of data you have stored.
As those two things are pretty straightforward, I am not sure there is really a tool to help you with that.
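A minimal backup sketch of the two items above, assuming a PostgreSQL backend and that $SONARQUBE_HOME points at the install directory (both are assumptions; adapt the dump command to your database vendor and the paths to your setup):

```shell
#!/bin/sh
# Dump the SonarQube database (assumes PostgreSQL; use your vendor's tool otherwise).
pg_dump -U sonar -d sonarqube | gzip > "sonarqube-db-$(date +%F).sql.gz"

# Archive the plugins folder so you can recover plugin versions and custom plugins.
tar czf "sonarqube-plugins-$(date +%F).tar.gz" -C "$SONARQUBE_HOME" extensions/plugins
```

Stop SonarQube first if you want a guaranteed-consistent dump, or rely on your database's online-backup support if downtime is not acceptable.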

Related

Refreshing lower environments with prod copy

I know this isn't anything new and people might already be doing this in their environments. I have a requirement to refresh the lower environments monthly, weekly, etc., and I wanted to know if there is a quicker approach. I know we can do a backup and restore through a SQL Agent job (I would love to know if there is an automated script that takes care of the entire process).

Also, instead of doing a full database restore every month, is there a way we can ship only the changes that happened during the month or week? That would save a lot of time and space. I am not sure how to achieve this second option of shipping only the changes.

We aren't considering any HA technologies, for various reasons, so please don't offer those options. If you have a script that can achieve this, or if you are doing something similar in your environment and have the details and scripts, please share them. Is there a tool that can do this? That wouldn't be my first option unless we can't do it with T-SQL. Also, our boxes are VMs, so could we leverage VM capabilities by taking file snapshots and delivering them to the lower environments (sorry, I am a bit naive about VM capabilities and techniques), rather than doing a native SQL backup and restore?

FYI: we want the complete data, not just the bare schema. Also, please do not share solutions using SSIS.
Any help would be much appreciated.
Thanks
Once you perform the recovery portion of a database restore, the database loses the ability to accept additional backups from the source. Depending on your setup, you may be able to get away with shipping only a differential backup from the source system each time, but you'd still need to restore the full backup again first.
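A minimal T-SQL sketch of that full-plus-differential sequence (database name and paths are placeholders): restore the full backup without recovering, then apply the latest differential and recover.

```sql
-- Restore the full backup, leaving the database able to accept more backups:
RESTORE DATABASE ProdCopy
  FROM DISK = N'D:\Backups\Prod_full.bak'
  WITH NORECOVERY, REPLACE;

-- Apply the latest differential and bring the database online:
RESTORE DATABASE ProdCopy
  FROM DISK = N'D:\Backups\Prod_diff.bak'
  WITH RECOVERY;
```

The full backup only has to be re-shipped when a new full backup is taken on the source; between those, only the (much smaller) differential needs to travel.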

How to do a database hot backup?

I am learning about databases and still fairly new.
From what I have learned so far, I gathered that if the database is large then a hot backup should be used, not a cold backup. (From my understanding, a hot backup is when you back up the database while users are still using it; a cold backup is when you need downtime, so users can't use it.)
But when you have somewhat large files (e.g. PDFs of around 20 MB) stored in some directory, and you store the paths to those files inside the database: if you need to do a hot backup, how would you go about backing up those files?
What approaches should be used, and do they have downsides?
Is it possible to do a hot backup for them? If not, why not?
Also, does it really matter what type of database is used (MySQL vs. SQL Server, ...), or is the approach general for any type?
I have already googled this and got no answers. (Maybe I am using the wrong terms? Please point out the right ones!)
If you think my question is too general, please point me in the right direction.
Please excuse my English, as it is not my first language.
I appreciate any help I can get.
How to make a backup of a live webpage:
1) Copy all the files from FTP to your local machine and write them to a DVD or put them somewhere safe.
2) Go to phpMyAdmin > Export > press Go. If the database is big, select Custom export and choose Compression under Output.
Now you have a backup of your webpage AND the webpage itself is still running. In your terms this already is a hot backup; for a simple site the cold/hot distinction hardly matters in practice.
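As an alternative to the phpMyAdmin export, a consistent hot backup of a MySQL database can be taken from the command line. This is a sketch only: the user, database name, and the /var/www/uploads path for the externally stored files are hypothetical placeholders.

```shell
#!/bin/sh
# Dump the database in one consistent snapshot while users keep working
# (--single-transaction gives a consistent view for InnoDB tables).
mysqldump --single-transaction -u backupuser -p mydb | gzip > mydb.sql.gz

# Copy the files the database points at; rsync only transfers what changed.
rsync -a /var/www/uploads/ /backups/uploads/
```

Note the caveat for your PDF question: the dump and the file copy do not happen at the same instant, so a file added between the two steps may be referenced by the database but missing from the file backup (or vice versa). Running the file copy immediately after the dump, or during a low-traffic window, keeps that gap small.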

Replicating / Cloning data from one MS SQL Server to another

I am trying to get the content of one MSSQL database into a second MSSQL database. There is no conflict management required and no schema updating; it is just a plain copy-and-replace of the data. The data in the destination database would be overwritten, even if somebody had changed something there.
Obviously, there are many ways to do that:
SQL Server Replication: Well established, but using old protocols. Besides that, a lot of developers keep telling me that the devil is in the details, that replication might not always work as expected, and that it is the best choice for an administrator, but not for a developer.
MS Sync Framework: MSF is said to be the cool new technology. Yes, it is the new stuff you love to get, because it sounds so innovative. There is a generic approach to synchronization, which sounds like: learn one technology and how to integrate a data source, and you will never have to learn how to develop syncing again. But on the other hand, you can read that the main usage scenario seems to be synchronizing MSSQL Compact databases with MSSQL.
SQL Server Integration Services: This sounds like a plan-for-emergencies solution: in case the connection fails, we have a package that can be executed again and again... until the firewall gives way or the authentication is fixed.
Brute-force copy and replace of the database files: Probably not the best choice.
Of course, when looking at the Microsoft websites, I read that every technology (apart from brute force, of course) is said to be a solid solution that can be applied in many scenarios. But that is, of course, not what I wanted to hear.
So what is your opinion? Which technology would you suggest?
Thank you!
Stefan
The easiest mechanism is log shipping. The primary server can put the log backups on any UNC path, and then you can use any file-sync tool to move the logs from one server to the other. The secondary just automatically restores any transaction log backups it finds in its local folder. This automatically handles not just data, but schema changes too.
The secondary will be read-only, but that's exactly what you want: if someone could update records on the secondary, you would be in a world of hurt.
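A minimal sketch of the restore side of such a setup (database name, paths, and file names are placeholders):

```sql
-- Restore the initial full backup without recovering, so log backups can follow:
RESTORE DATABASE ReportingDb
  FROM DISK = N'D:\LogShip\ReportingDb_full.bak'
  WITH NORECOVERY;

-- For each shipped log backup, in sequence. STANDBY keeps the database
-- readable between restores (the undo file holds uncommitted transactions):
RESTORE LOG ReportingDb
  FROM DISK = N'D:\LogShip\ReportingDb_0105.trn'
  WITH STANDBY = N'D:\LogShip\ReportingDb_undo.dat';
```

Using NORECOVERY instead of STANDBY is also valid; the database then stays inaccessible until you finally recover it, which suits a pure standby rather than a reporting copy.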
I'd add two techniques to your list.
Write T-SQL scripts to INSERT...SELECT the data directly
Create a full backup of the database and restore it onto the new server
If it's a big database and you're not going to be doing this too often, then I'd go for the backup and restore option. It does most of the work for you and is guaranteed to copy all the objects.
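A minimal sketch of the backup-and-restore option (database name, logical file names, and paths are placeholders; check the real logical names with RESTORE FILELISTONLY first):

```sql
-- On the source server:
BACKUP DATABASE SourceDb
  TO DISK = N'\\share\backups\SourceDb.bak'
  WITH INIT;

-- On the destination server; MOVE relocates the data/log files,
-- REPLACE overwrites an existing copy of the database:
RESTORE DATABASE SourceDb
  FROM DISK = N'\\share\backups\SourceDb.bak'
  WITH REPLACE,
       MOVE N'SourceDb'     TO N'D:\Data\SourceDb.mdf',
       MOVE N'SourceDb_log' TO N'E:\Logs\SourceDb_log.ldf';
```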
I've not heard of anyone using Sync Framework, so I'd be interested to hear if anyone has used it successfully.

Warm Standby SQL Server/Web Server

This question might fall into the IT category but as the lead developer I must come up with a solution to integration as well as pure software issues.
We just moved our web server off site and now I would like to keep a warm standby of both the website and database (SQL 2005) on site.
What are the best practices for going about doing this with the following environment?
Offsite: our website and database (SQL 2005) are on one Windows 2003 server. There is a firewall in front of the server which rules out replication and database mirroring. A VPN is also not an option.
My thought as a developer was to write a service that runs at the remote site, zips up the database backup, and FTPs it to an FTP server on site. Another process would then unzip the backup and restore it to the warm standby database here.
I assume I could do the same for the website.
I would be open to all ideas including third party solutions.
If you want a remote standby you probably want to look into a log shipping solution.
This article may help you out. In the past I had to develop one of these solutions for exactly this problem; writing it from scratch is not too hard. The advantage of log shipping is that you can restore to any point in time, and you avoid shipping big, heavy full backups around; instead you ship light transaction log backups and, occasionally, a full backup.
Keep in mind that transaction log backups are useless unless you have both the entire unbroken sequence of log backups and the full backup they follow.
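A minimal sketch of the primary/standby pair of statements involved (database name, paths, and file names are placeholders):

```sql
-- On the primary, take a periodic transaction log backup; the resulting
-- file is what gets zipped and FTPed to the standby site:
BACKUP LOG ProductionDb
  TO DISK = N'D:\LogShip\ProductionDb_20240101_0100.trn'
  WITH INIT;

-- On the standby, restore each shipped log in sequence, leaving the
-- database in the restoring state so the next log can be applied:
RESTORE LOG ProductionDb
  FROM DISK = N'D:\LogShip\ProductionDb_20240101_0100.trn'
  WITH NORECOVERY;

-- Only when you actually need to fail over to the standby:
RESTORE DATABASE ProductionDb WITH RECOVERY;
```

If a single log backup in the chain goes missing in transit, the standby cannot restore past that gap; a fresh full backup has to be shipped to restart the chain.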
You have exactly the right idea. You could maybe write a script that inserts the names of the files you moved into a table that your warm server can read. Your job could then just poll that table at intervals and not worry about timing.
Forget about that: I just found this, which sounds like exactly what you are setting out to do.
http://www.sqlservercentral.com/articles/Administering/customlogshipping/1201/
I've heard good things about Syncback:
http://www.2brightsparks.com/syncback/sbpro-features.html
Thanks for the link to the article, sambo99. Transaction log shipping was my original idea, but I dismissed it because the servers are not in the same domain, not even in the same time zone. My only method of moving the files from point A to point B is FTP. I am going to experiment with shipping just the transaction logs and see if there is a way to fire off a restore job at given intervals.
www.FolderShare.com is a good tool from Microsoft. You could log-ship to a local directory and then synchronize that directory to any other machine. You could also synchronize the website folders.
It is a "set it and forget it" type of solution. Set up your SQL jobs to clear out older files and you'll never have to edit anything after the initial install.
FolderShare (free, in beta) is currently limited to 10,000 files per library.
For all interested the following question also ties into my overall plan to implement log shipping:
SQL Server xp_cmdshell

Nightly importable or attachable copies of production database

We would like to be able to nightly make a copy/backup/snapshot of a production database so that we can import it in the dev environment.
We don't want to log-ship to the dev environment, because we need to be able to reset it, whenever we like, to the last copy taken of the production database.
We need to be able to clear certain logging and otherwise useless or heavy tables that would just bloat the copy.
We prefer the attach/detach method over something like the SQL Server Publishing Wizard because an attach is so much faster than an import.
I should mention we only have SQL Server Standard, so some features won't be available.
What's the best way to do this?
MSDN documents the detach/attach stored procedures. I'd say use those procedures inside a SQL Agent job (use master..xp_cmdshell to perform the file copy).
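A minimal sketch of the detach / copy / attach sequence from such a job (database name and paths are placeholders; xp_cmdshell must be enabled on the instance, and sp_attach_db is the era-appropriate form for SQL 2005):

```sql
-- Detach the copy so its files can be copied as plain files:
EXEC master.dbo.sp_detach_db @dbname = N'ProdCopy';

-- Copy the data and log files to the dev box via an OS command:
EXEC master.dbo.xp_cmdshell 'copy D:\Data\ProdCopy.mdf \\devserver\staging\ /Y';
EXEC master.dbo.xp_cmdshell 'copy E:\Logs\ProdCopy_log.ldf \\devserver\staging\ /Y';

-- On the dev server, attach the copied files:
EXEC master.dbo.sp_attach_db @dbname = N'ProdCopy',
     @filename1 = N'D:\Staging\ProdCopy.mdf',
     @filename2 = N'D:\Staging\ProdCopy_log.ldf';
```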
You might want to put the big, heavy tables in their own partition and have that partition belong to a different filegroup. You would then back up and restore only the main filegroup.
You might also want to consider differential backups: say, a full backup every weekend and a differential every night. I haven't done filegroup backups, so I don't know how well the two work together.
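A minimal sketch of both ideas (database, filegroup, and path names are placeholders):

```sql
-- Back up only the primary filegroup, skipping the filegroup
-- that holds the big logging tables:
BACKUP DATABASE ProdDb
  FILEGROUP = N'PRIMARY'
  TO DISK = N'D:\Backups\ProdDb_primary.bak'
  WITH INIT;

-- A differential backup: only changes since the last full backup:
BACKUP DATABASE ProdDb
  TO DISK = N'D:\Backups\ProdDb_diff.bak'
  WITH DIFFERENTIAL, INIT;
```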
I'm guessing that you are already doing regular backups of your production database? If you aren't, stop reading this reply and go set that up right now.
I'd recommend that you write a script that runs automatically, say once a day, and:
Drops your current test database.
Restores the current production backup into your test environment.
You can write a simple script to do this and execute it using the isql.exe command-line tool.
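A minimal sketch of the T-SQL that such a nightly script would run (database, path, logical file, and table names are placeholders; the TRUNCATE addresses the earlier wish to clear heavy logging tables after the refresh):

```sql
-- Drop the old dev copy, kicking out any open connections first:
IF DB_ID(N'DevDb') IS NOT NULL
BEGIN
    ALTER DATABASE DevDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
    DROP DATABASE DevDb;
END;

-- Restore last night's production backup as the dev database:
RESTORE DATABASE DevDb
  FROM DISK = N'\\prodserver\backups\ProdDb.bak'
  WITH MOVE N'ProdDb'     TO N'D:\Data\DevDb.mdf',
       MOVE N'ProdDb_log' TO N'E:\Logs\DevDb_log.ldf';

-- Clear the bloat tables so the copy stays lean (hypothetical table name):
TRUNCATE TABLE DevDb.dbo.AppLog;
```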