I am still fairly new to databases and learning.
From what I have learned so far, I gathered that if the database is large, a hot backup should be used rather than a cold backup. (My understanding: a hot backup is taken while users are still using the database; a cold backup requires downtime, during which users can't use it.)
But when you have somewhat large files (e.g. PDFs of around 20 MB) stored in some directory, with only their paths stored inside the database... if you need to do a hot backup, how would you go about backing up those files?
What approaches should be used, and do they have downsides?
Is it possible to do a hot backup for them? If not, why not?
Also, does it really matter what type of database is used (MySQL vs. SQL Server, ...)?
Or is there a general approach that works for any type?
I have already googled this and got no answers! (Maybe I am using the wrong terms? Please point out the right ones!)
If you think my question is too general, please point me in the right direction.
Please excuse my English, as it is not my first language.
I appreciate any help I can get.
How to make a backup of a live web page:
1) Copy all the files from FTP to your local machine and write them to a DVD or store them somewhere safe.
2) Go to phpMyAdmin > Export > press Go.
If the database is big, select Custom export and choose Compression under Output.
Now you have a backup of your web page AND the web page itself is still running. There is no such thing as a cold/hot backup in the sense you described.
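If you prefer the command line, the same idea works without phpMyAdmin. A minimal sketch, assuming MySQL with InnoDB tables (the database name, user, and paths are hypothetical placeholders): mysqldump with --single-transaction takes a consistent dump while users keep working, and the uploaded files (the PDFs from the question) are copied alongside it.

    # Hot dump of the database; --single-transaction keeps InnoDB data
    # consistent without locking users out. Names and paths are placeholders.
    mysqldump --single-transaction -u backup_user -p mydb | gzip > /backups/mydb.sql.gz

    # Copy the files whose paths are stored in the database.
    rsync -a /var/www/uploads/ /backups/uploads/

One caveat: the dump and the file copy are not atomic with each other, so a file uploaded between the two steps may be referenced by one backup and missing from the other.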
I would like to back up my SonarQube production server. Can you please help with the two queries below:
1. What needs to be backed up (e.g. the database, or which folders from the SonarQube home)?
2. Is there a solution already available that can be used directly to back up SonarQube?
Thanks and Regards,
Deepti
We regularly do backups in our company, and we only back up two things:
the database, because it contains all the data and, per the upgrade notes, it is the only thing you need for a restore: https://docs.sonarqube.org/latest/setup/upgrading/
the ./extensions/plugins folder, because you never know afterwards which version of which plugin was installed, and you might have a custom plugin, or one that is not in the marketplace, which you will surely forget about
There is no reason to back up the Elasticsearch data, as SonarQube will recreate all the necessary information on startup. Just be aware that the first startup will take some time, depending on the amount of data you have stored.
As those two things are actually pretty straightforward, I am not sure there is really a tool that can help you with that.
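A minimal cron-able sketch of that policy, assuming the SonarQube database is PostgreSQL (a common but not universal setup); all paths and credentials below are placeholders:

    #!/bin/sh
    # Dump the SonarQube database (swap pg_dump for your DBMS's tool).
    pg_dump -U sonarqube sonarqube | gzip > /backups/sonarqube-db-$(date +%F).sql.gz

    # Keep a copy of the plugins folder so plugin versions can be restored.
    tar czf /backups/sonarqube-plugins-$(date +%F).tar.gz -C /opt/sonarqube extensions/plugins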
I know this isn't anything new and people might already be doing it in their environments. I have a requirement to do refreshes of lower environments monthly, weekly, etc., and I wanted to know if there is a quicker approach. I know we can do a backup and restore through a SQL job (I would love to know if there is an automated script that takes care of the entire process).

Also, instead of doing a full database restore every month, is there a way to ship only the changes that happened during the month or week? That would save a lot of time and space. I am not sure how to achieve this second option of shipping only the changes. We aren't considering any HA technologies, for various reasons, so please do not offer those options.

If you have a script that can achieve this, or if you are doing something similar in your environment and have the necessary details and scripts, please share them. Is there a tool that can do this? Obviously that won't be my first option unless we can't do it by writing T-SQL code. Also, our boxes are VMs, so is there a possibility we can leverage VM features by taking file snapshots and delivering them to the lower environments (sorry, I am a bit naive about VM capabilities and techniques), rather than doing backup and restore natively through SQL?

FYI: we want the complete data, not just the bare schema. Also, please do not share solutions using SSIS.
Any help would be much appreciated.
Thanks
Once you perform the recovery portion of a database restore (RESTORE ... WITH RECOVERY), the database loses the ability to accept additional backups from the source database. Depending on your setup, you may be able to get away with shipping only an additional differential backup from the source system, but you'd still need to restore the full backup again first, this time WITH NORECOVERY, so the differential can be applied on top of it.
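A hedged T-SQL sketch of that pattern (database name and paths are placeholders):

    -- On the source server: one full backup, then periodic differentials.
    BACKUP DATABASE MyDb TO DISK = N'\\share\MyDb_full.bak';
    BACKUP DATABASE MyDb TO DISK = N'\\share\MyDb_diff.bak' WITH DIFFERENTIAL;

    -- On the target server: restore the full WITH NORECOVERY so the
    -- differential can still be applied, then finish with the differential.
    RESTORE DATABASE MyDb FROM DISK = N'\\share\MyDb_full.bak' WITH NORECOVERY, REPLACE;
    RESTORE DATABASE MyDb FROM DISK = N'\\share\MyDb_diff.bak' WITH RECOVERY;

The differential file is usually far smaller than the full, which is where the time and space savings come from; the cost is that the full still has to be restored once per refresh cycle.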
I am trying to get the content of one MSSQL database into a second MSSQL database. There is no conflict management required and no schema updating; it is just a plain copy-and-replace of the data. The data in the destination database would be overwritten, in case somebody had changed something there.
Obviously, there are many ways to do that:
SQL Server Replication: Well established, but using old protocols. Besides that, a lot of developers keep telling me that the devil is in the details, that replication might not always work as expected, and that it is the best choice for an administrator but not for a developer.
MS Sync Framework: MSF is said to be the cool new technology; it is the new stuff you love to get because it sounds so innovative. There is a generic approach to synchronisation, which sounds like: learn one technology and how to integrate a data source, and you will never have to learn how to develop syncing again. On the other hand, you can read that the main usage scenario seems to be synchronizing SQL Server Compact databases with SQL Server.
SQL Server Integration Services: This sounds like a solution for contingency planning. In case the firewall is not working, we have a package that can be executed again and again... until the firewall drops or the authentication is fixed.
Brute-force copy and replace of the database files: Probably not the best choice.
Of course, when looking at the Microsoft websites, I read that every technology (apart from brute force, of course) is said to be a solid solution that can be applied in many scenarios. But that is, of course, not what I wanted to hear.
So what is your opinion on this? Which technology would you suggest?
Thank you!
Stefan
The easiest mechanism is log shipping. The primary server can put the log backups on any UNC path, and then you can use any file-sync tool to get the logs from one server to the other. The secondary just automatically restores any transaction log backups it finds in its local folder. This automatically handles not just data, but schema changes too.
The secondary will be read-only, but that's exactly what you want; otherwise, if someone could update records on the secondary, you'd be in a world of hurt.
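A minimal T-SQL sketch of that loop (all names and paths are placeholders); the secondary is seeded once from a full backup and then kept unrecovered so it can keep applying logs, with STANDBY making it readable between restores:

    -- On the primary: periodic log backups to a UNC share.
    BACKUP LOG MyDb TO DISK = N'\\share\logs\MyDb_202401010900.trn';

    -- On the secondary: seed once from a full backup, left restoring...
    RESTORE DATABASE MyDb FROM DISK = N'\\share\MyDb_full.bak' WITH NORECOVERY;

    -- ...then apply each shipped log as it arrives.
    RESTORE LOG MyDb FROM DISK = N'\\share\logs\MyDb_202401010900.trn'
        WITH STANDBY = N'D:\MyDb_undo.dat';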
I'd add two techniques to your list.
Write T-SQL scripts to INSERT...SELECT the data directly (a sketch follows this list)
Create a full backup of the database and restore it onto the new server
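For the first option, a hedged sketch, assuming a linked server named SRC pointing at the source instance and tables that already exist on the destination (all names here are hypothetical):

    -- Overwrite the destination table with the source's rows.
    -- Use DELETE instead of TRUNCATE if foreign keys get in the way;
    -- wrap with SET IDENTITY_INSERT ... ON/OFF if Id is an IDENTITY column.
    TRUNCATE TABLE dbo.Customers;
    INSERT INTO dbo.Customers (Id, Name, Email)
    SELECT Id, Name, Email
    FROM SRC.SourceDb.dbo.Customers;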
If it's a big database and you're not going to be doing this too often, then I'd go for the backup and restore option. It does most of the work for you and is guaranteed to copy all the objects.
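And for the second option, a minimal sketch (paths and logical file names are placeholders; MOVE relocates the data and log files on the new server):

    BACKUP DATABASE SourceDb TO DISK = N'C:\backups\SourceDb.bak';

    -- Copy the .bak file to the new server, then:
    RESTORE DATABASE SourceDb FROM DISK = N'D:\backups\SourceDb.bak'
        WITH MOVE N'SourceDb' TO N'D:\data\SourceDb.mdf',
             MOVE N'SourceDb_log' TO N'D:\data\SourceDb_log.ldf',
             REPLACE;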
I've not heard of anyone using Sync Framework, so I'd be interested to hear if anyone has used it successfully.
I need to make some structural changes to a database (alter tables, add new columns, change some rows, etc.), but I need to make sure that if something goes wrong I can roll back to the initial state:
All the needed changes are inside a SQL script file.
I don't have administrative access to the database.
I really need to ensure the backup is done on the server side, since the DB has more than 30 GB of data.
I need to use sqlplus (in a dedicated SSH session over a VPN).
It's not possible to use "flashback database"! It's off, and I can't stop the database.
Am I in really deep $#$%?
Any ideas how to back up the database using sqlplus, leaving the backup on the DB server?
Better than exp/imp, you should use RMAN. It's built specifically for this purpose; it can do hot backup/restore, and if you completely screw up, you're still OK.
One gotcha is that you have to back up the $ORACLE_HOME directory too (in my experience), because you need that locally stored information to recover the control files.
A search for RMAN on Google gives some VERY good information on the first page.
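A minimal RMAN session sketch, assuming the database runs in ARCHIVELOG mode (hot backups require it); connection details are placeholders:

    rman target /
    RMAN> BACKUP DATABASE PLUS ARCHIVELOG;    # hot backup of datafiles plus archived logs
    RMAN> BACKUP CURRENT CONTROLFILE;         # a restorable copy of the control file
    RMAN> LIST BACKUP SUMMARY;                # verify what was written

Note, though, that connecting as RMAN's target normally requires SYSDBA-level privileges, which collides with the "no administrative access" constraint in the question.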
An alternate approach might be to create a new schema that contains your modified structures and data, and actually test with that. That presumes you have enough space on your DB server to hold all the test data. You really should have a pretty good idea that your changes are going to work before dumping them on a production environment.
I wouldn't use sqlplus for this. Take a look at export/import. The export utility will grab the definitions and data for your database (and can be run in read-consistent mode). The import utility will read this file and create the database structures from it. However, access to these utilities does require permissions to be granted, particularly if you need to back up the whole database rather than just a schema.
That said, it's somewhat troubling that you're expected to perform the tasks of a DBA (alter tables, back up the database, etc.) without administrative rights. I would at least ask for the help of a DBA to oversee your approach before you start, if not insist that the DBA (or someone with the appropriate privileges and knowledge) actually perform the modifications to the database and help recover if necessary.
Trying to back up 30 GB of data through sqlplus is insane. It would take several hours, require 3x to 5x as much disk space, and might not be restorable without more testing.
You need to use exp and imp. These are command-line tools designed to back up and restore a database. If you have access to sqlplus via your SSH session, you have access to imp/exp. You don't need administrator access to use them, and they will dump the database (with all tables, triggers, views, and procedures) for the user(s) you have access to.
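A hedged sketch of that workflow (username, password, and file names are placeholders; CONSISTENT=y gives the read-consistent export mentioned above):

    # On the DB server, inside your ssh session: export your schema before the changes.
    exp scott/tiger FILE=before_changes.dmp OWNER=scott CONSISTENT=y

    # If the changes go wrong, drop the modified objects, then import the dump back.
    imp scott/tiger FILE=before_changes.dmp FROMUSER=scott TOUSER=scott

Since the dump file is written on the machine where exp runs, running it on the DB server also satisfies the "backup must stay on the server" constraint.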
I am a developer, and an architect on good days. Somehow I also find myself being the DBA for my small company. My background in the DB arts is fair, but I have never been a full-fledged DBA. My question is: what do I have to do to ensure a reliable and reasonably functional database environment with as little actual effort as possible?
I am sure I need to make sure that backups are being performed, and that is being done; that one is easy. What else should I be doing on a consistent basis?
Who else is involved in the database? Are you the only person making schema changes (creating new objects, releasing new stored procedures, permissioning new users)?
Make sure that the number of users doing anything that could impact performance is reduced to as close to zero as possible, ideally including you.
Make sure that you're testing your backups. Ideally, run a DEV box that recreates the production environment periodically: 1. a DEV box is a good idea; 2. a backup is only useful if you can restore from it.
Create groups for the various apps that connect to your database, so when a new user comes along you don't have to guess what permissions they need; just add them to the group. Meanwhile, grant permissions on database objects only to the groups that need them (a sketch follows this list).
Use indices, primary keys, foreign keys, constraints, stats, and whatever other tools your database supports. Normalise.
Optimise the most common code against your box; bad stored procedures/data-access code will kill you.
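A minimal T-SQL sketch of that group/role approach (role, user, and schema names are hypothetical):

    -- One role per application; grant object permissions to the role only.
    CREATE ROLE invoicing_app;
    GRANT SELECT, INSERT, UPDATE ON SCHEMA::dbo TO invoicing_app;

    -- New users are simply added to the role; no per-user guessing.
    CREATE USER jsmith FOR LOGIN jsmith;
    ALTER ROLE invoicing_app ADD MEMBER jsmith;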
I've been there. I used to have a job where I wrote code, did all the infrastructure stuff, wore the DBA hat, did user support, fixed the electric stapler when it jammed, and whatever else came up that might be remotely associated with IT. It was great! I learned a little about everything.
As far as the care and feeding of your database box, I'd recommend that you do the following:
Perform regular full backups.
Perform regular transaction log backups.
Monitor your backup jobs. There are a bunch of relatively cheap utilities on the market that can automate this for you. In a small shop you're often too busy to remember to check on them daily.
Test your backups. Do a drill: restore an old copy of your most important databases (a sketch follows this list). Prove to yourself that your backups are working and that you know how to restore them properly. You'd be surprised how many people only think about this during their first real disaster.
Store backups off-site. With all the online backup providers out there today, there's not much excuse for not having an offsite backup.
Limit sa access to your boxes.
If your database platform supports it, use only role-based security. Resist the temptation to create one-off, user-specific permissions.
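For the restore drill, a minimal T-SQL sketch (paths and names are placeholders); RESTORE VERIFYONLY only checks that the backup file is readable, so the real proof is restoring it under a throwaway name:

    -- Quick sanity check that the backup file is complete and readable.
    RESTORE VERIFYONLY FROM DISK = N'E:\backups\Sales_full.bak';

    -- The actual drill: restore as a disposable copy and inspect it.
    RESTORE DATABASE Sales_drill FROM DISK = N'E:\backups\Sales_full.bak'
        WITH MOVE N'Sales' TO N'E:\drill\Sales_drill.mdf',
             MOVE N'Sales_log' TO N'E:\drill\Sales_drill.ldf';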
The basic idea here is that if you restrict who has access to the box, you'll have fewer problems. Secondly, if your backups are solid, there are few things that come up that you won't be able to deal with effectively.
I would suggest:
A script to quickly restore the latest backup of a database, in case it gets corrupted (a sketch of one is at the end of this answer)
What kind of backups are you doing? Full backups each day, or incremental every hour, etc?
Some scripts to create new users and grant them basic access.
However, the number-one suggestion is to limit as much as possible the power other users have; this will greatly reduce the chance of things getting badly messed up. Servers where everyone is sa tend to get screwed up quicker than servers that are locked down.
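A hedged sketch of such a restore script, assuming SQL Server and a hypothetical database name; it looks up the most recent full backup recorded in msdb and restores it over the damaged database:

    -- Find the most recent full backup of MyDb recorded in msdb.
    DECLARE @file nvarchar(260);

    SELECT TOP (1) @file = mf.physical_device_name
    FROM msdb.dbo.backupset AS bs
    JOIN msdb.dbo.backupmediafamily AS mf
        ON bs.media_set_id = mf.media_set_id
    WHERE bs.database_name = N'MyDb'
      AND bs.type = 'D'                 -- 'D' = full database backup
    ORDER BY bs.backup_finish_date DESC;

    -- Restore it over the (possibly corrupted) database.
    RESTORE DATABASE MyDb FROM DISK = @file WITH REPLACE;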