Best Way to Create Nightly Replication of a SQL Server Database

I want to create a job that runs every night. I have a database (MyDatabase) whose contents I want to copy over my staging database (MyDatabase_Stage), replacing it entirely.
I presume the easiest way is to do something with SQL Server Agent, but I have never done anything like this before. What is the best practice and easiest route to get this set up and tested?
I do not care if the data is 24 hours old; the most important criterion is that it does a full copy every night at the same time.

Copy the .bak file to your staging server and restore from there using a script.
Run the script on a schedule in an Agent job.
The benefit of a script is that you can add functionality later; for instance, you might not require the audit tables in staging, and these can be truncated.
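A minimal sketch of what that restore script might look like; the paths and logical file names here are assumptions (check yours with RESTORE FILELISTONLY):

    USE master;

    -- Kick out existing connections so the restore can take exclusive access.
    ALTER DATABASE MyDatabase_Stage SET SINGLE_USER WITH ROLLBACK IMMEDIATE;

    RESTORE DATABASE MyDatabase_Stage
    FROM DISK = N'D:\Backups\MyDatabase.bak'                 -- assumed path
    WITH REPLACE,
         MOVE N'MyDatabase'     TO N'D:\Data\MyDatabase_Stage.mdf',
         MOVE N'MyDatabase_log' TO N'D:\Logs\MyDatabase_Stage.ldf';

    ALTER DATABASE MyDatabase_Stage SET MULTI_USER;

    -- Post-restore cleanup: truncate tables staging does not need.
    TRUNCATE TABLE MyDatabase_Stage.dbo.AuditLog;            -- hypothetical table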

Check out snapshot replication. As part of the setup, it'll create a SQL Agent job to do the copy of the data and whatnot. You can then schedule that job at whatever time and frequency you like.

Related

SQL Server differential backup and restore

I have a scenario in which I need to maintain a replica of an existing database.
Is there a solution to achieve the approach mentioned below?
1. Take a full backup once and restore it to a destination database.
2. Take a scheduled (e.g. daily) differential backup (only the data that has changed since the last full backup) of the source database and restore it into the destination database.
This is to avoid taking a full backup and restore each time.
You can use Differential Backups, but you would need to ship a new Full backup periodically or the Differentials will continue to grow.
A better solution might be Log Shipping, where you can ship just the changes on whatever schedule you want.
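A sketch of the full-plus-differential cycle, with every database name and path invented for illustration:

    -- On the source: one full backup, then periodic differentials.
    BACKUP DATABASE SourceDb TO DISK = N'C:\Backups\SourceDb_full.bak'
        WITH INIT;
    BACKUP DATABASE SourceDb TO DISK = N'C:\Backups\SourceDb_diff.bak'
        WITH DIFFERENTIAL, INIT;

    -- On the destination: restore the full once; STANDBY leaves the copy
    -- readable while still able to accept further differential restores.
    RESTORE DATABASE DestDb FROM DISK = N'C:\Backups\SourceDb_full.bak'
        WITH REPLACE,
             MOVE N'SourceDb'     TO N'D:\Data\DestDb.mdf',
             MOVE N'SourceDb_log' TO N'D:\Logs\DestDb.ldf',
             STANDBY = N'D:\Data\DestDb_undo.dat';

    -- Each day: apply the newest differential on top.
    RESTORE DATABASE DestDb FROM DISK = N'C:\Backups\SourceDb_diff.bak'
        WITH STANDBY = N'D:\Data\DestDb_undo.dat';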
You can consider configuring an availability group and using a secondary SQL Server instance with asynchronous data sync. This should be considered only if the primary (the original live SQL Server) and secondary servers are in the same location/data centre. Then you don't need to take backups and restores or do any extra work beyond configuring it properly the first time.
If that is not the case (the copy should be available in another location/data centre), it would be better to go with configuring log shipping.
The first option is a lot better because the secondary would contain an exact copy of the primary database (with a sync delay depending on various factors... probably seconds), and you can fail over directly to the secondary in case of any issues with the primary server.
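For reference, a minimal sketch of creating such an asynchronous availability group in T-SQL; the mirroring endpoints, permissions, and the step of joining the secondary are omitted, and every name here is a placeholder:

    CREATE AVAILABILITY GROUP AG_Live
    FOR DATABASE LiveDb
    REPLICA ON
        N'PRIMARYSRV' WITH (
            ENDPOINT_URL      = N'TCP://primarysrv.example.local:5022',
            AVAILABILITY_MODE = ASYNCHRONOUS_COMMIT,
            FAILOVER_MODE     = MANUAL),
        N'SECONDARYSRV' WITH (
            ENDPOINT_URL      = N'TCP://secondarysrv.example.local:5022',
            AVAILABILITY_MODE = ASYNCHRONOUS_COMMIT,
            FAILOVER_MODE     = MANUAL);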

Automating a SQL Restore - but delayed by 24 hours

I need to be able to script the following process and would be grateful for any guidance in achieving it!
We have a live database which is backed up every night (SQL 2008 R2). We have a 3rd party which requires access to the database, but management have decided they can have access to a time-delayed copy instead. So I have been tasked with restoring the previous night's backup to another SQL instance, complete with login information, which the 3rd party can access without impacting the live database.
I believe a script could perform the task, except the backup file name is not constant (e.g. databasename_2015_02_15_223005_3661110) and I can't figure out a way of automating a restore without knowing the backup name. We keep 3 days of full backups on the server before they are archived.
Instead should I be looking at either snapshots or replication to produce the 24 hour delayed copy?
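One way around the changing file name, sketched here with assumed database names, is to look up the most recent full backup in msdb's backup history on the server that took it:

    DECLARE @BackupFile nvarchar(260);

    -- Find the newest full backup ('D') of the live database.
    SELECT TOP (1) @BackupFile = bmf.physical_device_name
    FROM msdb.dbo.backupset AS bs
    JOIN msdb.dbo.backupmediafamily AS bmf
        ON bs.media_set_id = bmf.media_set_id
    WHERE bs.database_name = N'LiveDatabase'          -- assumed name
      AND bs.type = 'D'
    ORDER BY bs.backup_finish_date DESC;

    RESTORE DATABASE LiveDatabase_Delayed
    FROM DISK = @BackupFile
    WITH REPLACE,
         MOVE N'LiveDatabase'     TO N'D:\Data\LiveDatabase_Delayed.mdf',
         MOVE N'LiveDatabase_log' TO N'D:\Logs\LiveDatabase_Delayed.ldf';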

Replicating a SQL Server database for read access

I have an application that is in production with its own database for more than 10 years.
I'm currently developing a new application (kind of a reporting application) that only needs read access to the database.
In order not to be too much linked to the database and to be able to use newer DAL (Entity Framework 6 Code First) I decided to start from a new empty database, and I only added the tables and columns I need (different names than the production one).
Now I need some way to update the new database with the production database regularly (would be best if it is -almost- immediate).
I hesitated to ask this question on http://dba.stackexchange.com but I'm not necessarily limited to only using SQL Server for the job (I can develop and run some custom application if needed).
I have already done some searching and found these (partial) solutions:
Using transactional replication to create a smaller database (with only the tables/columns I need). But as far as I can see, the fact that I have different table names/column names will be problematic. So I can use it to create a smaller database that is automatically replicated by SQL Server, but I would still need to replicate this database to my new one (it may keep my production database from being stressed too much?).
Using triggers to insert/update/delete the rows.
Creating some custom job (either a SQL job or a Windows service that runs every X minutes) that updates the necessary tables (I have a LastEditDate that is updated by a trigger on my tables, so I can know that a row has been updated since my last replication).
Do you have some advice, or maybe some other solutions that I didn't foresee?
Thanks
I think that transactional replication is better than using triggers.
Too many resources would be used on the source server/database, because a trigger fires for each DML transaction.
Transactional replication can be scheduled as a SQL job and run a few times a day/night, or as part of a nightly scheduled job. It really depends on how busy the source database is...
There is one more thing that you could try: database mirroring. It depends on your SQL Server version.
If it were me, I'd use transactional replication, but keep the table/column names the same. If you have some real reason why you need them to change (I honestly can't think of any good ones and a lot of bad ones), wrap each table in a view. At least that way, the view is the documentation of where the data is coming from.
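As an illustration of the view idea (all identifiers here are invented), the replicated table keeps its production name and the reporting application reads through a renaming view:

    CREATE VIEW dbo.ReportCustomer
    AS
    SELECT cust_id    AS CustomerId,     -- new names the reporting app wants
           cust_name  AS CustomerName,
           created_dt AS CreatedDate
    FROM dbo.tblCustomer;                -- the table as replicated from production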
I'm gonna throw this out there and say that I'd use transaction log shipping. You can even set the secondary databases to read-only. There would be some setting up for full recovery mode and transaction log backups, but that way you can automatically restore the transaction logs to the secondary database and be hands-off with it, and the secondary database would be as current as your last transaction log backup.
Depending on how current the data needs to be, if you only need it refreshed daily you can set up something that takes your daily backups and just restores them to the secondary.
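The restore side of that, done by hand (built-in log shipping generates equivalent steps for you); the names here are placeholders, and STANDBY is what keeps the secondary readable between restores:

    RESTORE LOG ProdDb_Secondary
    FROM DISK = N'\\Share\LogBackups\ProdDb_20150215_0100.trn'  -- assumed file
    WITH STANDBY = N'D:\Data\ProdDb_undo.dat';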
In the end, we went for the trigger solution. We don't have that many changes a day (maybe 500, 1,000 tops), and it didn't put too much pressure on the current database. Thanks for your advice.
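For completeness, a sketch of what such a sync trigger might look like; every identifier is invented, and the cross-database write happens inside the source transaction, which is exactly why this only suits low change volumes:

    CREATE TRIGGER trg_Customer_Sync
    ON dbo.Customer
    AFTER INSERT, UPDATE, DELETE
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Upsert rows that were inserted or updated.
        MERGE Reporting.dbo.Customer AS tgt
        USING inserted AS src
            ON tgt.CustomerId = src.CustomerId
        WHEN MATCHED THEN
            UPDATE SET tgt.Name = src.Name,
                       tgt.City = src.City
        WHEN NOT MATCHED THEN
            INSERT (CustomerId, Name, City)
            VALUES (src.CustomerId, src.Name, src.City);

        -- Remove rows that were deleted (and not re-inserted by an update).
        DELETE tgt
        FROM Reporting.dbo.Customer AS tgt
        JOIN deleted AS d
            ON d.CustomerId = tgt.CustomerId
        WHERE NOT EXISTS (SELECT 1 FROM inserted AS i
                          WHERE i.CustomerId = d.CustomerId);
    END;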

Trigger to take a daily backup of a SQL Server database

I am working on a database task. I have to take a database backup every day at 12:00 AM.
I can't find a trigger for it, but I did find a job for it.
The job backs up the database and works well, but I want a trigger if it is possible.
Any help would be appreciated.
You don't want to use triggers to initiate a database backup.
The correct approach for this is to put the SQL you have for taking the database backup into a SQL Server Agent job and to schedule that job at whatever time you want it to "trigger" off. I recommend you pick a scheduled time when you know there is minimal to no activity on your SQL Server, so the database backup does not impact SQL Server performance.
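The job step itself can be as simple as the following T-SQL (database name and path are assumptions); schedule it in SQL Server Agent to run daily at 12:00 AM:

    BACKUP DATABASE MyDatabase
    TO DISK = N'C:\Backups\MyDatabase_nightly.bak'
    WITH INIT,        -- overwrite the previous nightly file
         CHECKSUM;    -- verify page checksums while writing the backup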

Nightly importable or attachable copies of production database

We would like to be able to nightly make a copy/backup/snapshot of a production database so that we can import it in the dev environment.
We don't want to log ship to the dev environment because it needs to be something we can reset whenever we like to the last taken copy of the production database.
We need to be able to clear certain logging and/or otherwise useless or heavy tables that would just bloat the copy.
We prefer the attach/detach method as opposed to something like the SQL Server Publishing Wizard because of how much faster an attach is than an import.
I should mention we only have SQL Server Standard, so some features won't be available.
What's the best way to do this?
MSDN documents the detach/attach procedures.
I'd say use those procedures inside a SQL Agent job (use master..xp_cmdshell to perform the file copy).
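A sketch of that detach/copy/attach cycle, with every path and name a placeholder; note that xp_cmdshell must be explicitly enabled:

    USE master;

    -- Detach the database so its files can be copied.
    EXEC sp_detach_db @dbname = N'ProdCopy';

    -- Copy the files to the dev environment (xp_cmdshell runs as the
    -- SQL Server service account, which needs rights on the share).
    EXEC xp_cmdshell 'copy D:\Data\ProdCopy.mdf \\DevServer\Data\ /Y';
    EXEC xp_cmdshell 'copy D:\Logs\ProdCopy.ldf \\DevServer\Data\ /Y';

    -- Re-attach; the dev instance runs the same statement on its copies.
    CREATE DATABASE ProdCopy
    ON (FILENAME = N'D:\Data\ProdCopy.mdf'),
       (FILENAME = N'D:\Logs\ProdCopy.ldf')
    FOR ATTACH;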
You might want to put the big, heavy tables on their own partition and have this partition belong to a different filegroup. You would then back up and restore just the main filegroup.
You might also want to consider doing differential backups: say, a full backup every weekend and a differential every night. I haven't done filegroup backups, so I don't know if these work well together.
I'm guessing that you are already doing regular backups of your production database? If you aren't, stop reading this reply and go set it up right now.
I'd recommend that you write a script that automatically runs, say once a day, that:
Drops your current test database.
Restores your current production backup to your test environment.
You can write a simple script to do this and execute it using the isql.exe command-line tool (or its modern replacement, sqlcmd).
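A sketch of such a script, with the names and paths assumed; it could be saved as refresh_test.sql and run from the scheduler with, e.g., sqlcmd -S DevServer -i refresh_test.sql:

    USE master;

    -- Drop the existing test copy, kicking out any connected sessions.
    IF DB_ID(N'TestDb') IS NOT NULL
    BEGIN
        ALTER DATABASE TestDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
        DROP DATABASE TestDb;
    END;

    -- Restore last night's production backup as the new test database.
    RESTORE DATABASE TestDb
    FROM DISK = N'\\ProdServer\Backups\ProdDb.bak'       -- assumed path
    WITH MOVE N'ProdDb'     TO N'D:\Data\TestDb.mdf',
         MOVE N'ProdDb_log' TO N'D:\Logs\TestDb.ldf';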
