I have a SQL Server 2016 database of which I am the owner. It's an archive that's no longer accessible to anyone else. I still need to read this database by linking to it from other SQL Server databases and from Access.
I would like to be able to alter and create views in this database but I would like to prevent myself from inadvertently changing any data there.
What's the quickest, easiest, and easily reversible way to allow myself only select access to all the tables?
alter database current set read_only
To reverse, or if you want to modify views, run
alter database current set read_write
No permissions-based solution is effective against the database owner.
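If other connections are holding the database open when you flip the flag, the ALTER may block; a minimal sketch (assuming you're happy to kick those sessions off) is:

-- Force the switch even if other sessions are connected (their transactions are rolled back).
ALTER DATABASE CURRENT SET READ_ONLY WITH ROLLBACK IMMEDIATE;

-- Later, when you need to alter or create views again:
ALTER DATABASE CURRENT SET READ_WRITE WITH ROLLBACK IMMEDIATE;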
I want to build a process for recreating a database schema-only: not a backup/restore, but standing up a schema-only copy of a database on a different server, which we would then populate with data manually. I was thinking about using SMO in PowerShell. Does anyone know what the best approach for this would be? I'd prefer to stay away from 3rd-party options; I have the time to do it myself.
Thanks!
One way is to use DBCC CLONEDATABASE (it copies the schema and statistics, but no data):
DBCC CLONEDATABASE ('original_db_name', 'cloned_db');
ALTER DATABASE [cloned_db] SET READ_WRITE WITH NO_WAIT;
Back up the clone.
Restore it on the second server.
ALTER DATABASE [cloned_db] MODIFY NAME = original_db_name;
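The backup and restore steps in the middle could look roughly like this (the paths and file locations are assumptions you'd adjust; run RESTORE FILELISTONLY against the .bak if you're unsure of the logical file names):

-- On the source server: back up the clone.
BACKUP DATABASE [cloned_db]
TO DISK = 'C:\temp\cloned_db.bak'
WITH COPY_ONLY, INIT;

-- On the second server: restore it (the clone's logical names normally match the source database).
RESTORE DATABASE [cloned_db]
FROM DISK = 'C:\temp\cloned_db.bak'
WITH MOVE 'original_db_name' TO 'C:\Sql\cloned_db.mdf',
     MOVE 'original_db_name_log' TO 'C:\Sql\cloned_db_log.ldf',
     RECOVERY;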
Ignoring any reasons why I shouldn't be doing this ...
But what would be the easiest way to refresh a set of tables from another MSSQL database as a single transaction?
Context:
10 tables
DDL won't change
Refresh is 100% (a full reload)
~100 MB (relatively small)
I would want to do this as a script (TSQL or SQL), and avoid any advanced Server changes (replications, etc).
Will a simple INSERT INTO ... SELECT *, wrapped in a transaction, be the best thing to do?
If your sole question is how to migrate those tables from a different database within the same server, then there are multiple ways:
You can run an INSERT INTO ... SELECT statement like
insert into db1.dbo.mytable   -- three-part name: database.schema.table
select * from db2.dbo.mytable;
You can create a DB dump (using SSMS) that scripts the table schema along with all the data; that *.sql file can then be run against your other database, using SQLCMD or SSMS, whichever you prefer.
The third option is to do a full DB backup and restore it.
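If the goal is the single-transaction refresh from the original question, a minimal sketch along the lines of the first option (the table and database names here are placeholders; repeat the pattern for all ten tables, deleting child tables before parents if foreign keys exist) would be:

BEGIN TRANSACTION;

    DELETE FROM target_db.dbo.table1;
    DELETE FROM target_db.dbo.table2;

    INSERT INTO target_db.dbo.table1
    SELECT * FROM source_db.dbo.table1;

    INSERT INTO target_db.dbo.table2
    SELECT * FROM source_db.dbo.table2;

    -- ...and so on for the remaining tables...

COMMIT TRANSACTION;

At ~100 MB this is perfectly workable; TRUNCATE TABLE would be faster than DELETE, but it isn't allowed on tables referenced by foreign keys.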
Dumb question - what's the best way to copy instances in an environment where I want to refresh a development server with instances from a production server?
I've done backup-restore, but I've heard detach-copy-attach and one guy even told me he would just copy the datafiles between the filesystems....
Are these the three (or two, the last one sounds kind of suspect) accepted methods?
My understanding is that the second method is faster but requires downtime on the source because of the detach aspect.
Also, in this situation (wanting an exact copy of production on a dev server), what's the accepted practice for transferring logins, etc.? Should I just back up and restore the user databases + master + msdb?
Easiest way is actually a script.
Run this on production:
USE MASTER;
BACKUP DATABASE [MyDatabase]
TO DISK = 'C:\temp\MyDatabase1.bak' -- some writeable folder.
WITH COPY_ONLY
This one command makes a complete backup copy of the database onto a single file, without interfering with production availability or backup schedule, etc.
To restore, just run this on your dev or test SQL Server:
USE MASTER;
RESTORE DATABASE [MyDatabase]
FROM DISK = 'C:\temp\MyDatabase1.bak'
WITH
MOVE 'MyDatabase' TO 'C:\Sql\MyDatabase.mdf', -- or wherever these live on target
MOVE 'MyDatabase_log' TO 'C:\Sql\MyDatabase_log.ldf',
REPLACE, RECOVERY
Then save these scripts on each server. One-click convenience.
Edit:
If you get an error when restoring that the logical names don't match, you can get them like this:
RESTORE FILELISTONLY
FROM disk = 'C:\temp\MyDatabaseName1.bak'
If you use SQL Server logins (not Windows authentication), you can run this after restoring each time (on the dev/test machine):
USE MyDatabaseName;
EXEC sp_change_users_login 'Auto_Fix', 'userloginname', null, 'userpassword';
The fastest way to copy a database is the detach-copy-attach method, but the production users will not have database access while the prod DB is detached. You can do something like this if your production DB is, for example, a point-of-sale system that nobody uses during the night.
If you cannot detach the production DB, you should use backup and restore.
You will have to create the logins if they are not in the new instance. I do not recommend copying the system databases.
You can use SQL Server Management Studio to create the scripts that create the logins you need. Right-click the login you need to create and select Script Login As / Create.
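What you need on the target instance is essentially a CREATE LOGIN statement; for a SQL-authenticated login, a placeholder version (the name and password here are made up, and SSMS will script the real SID/password details for you) looks like:

CREATE LOGIN [app_login]
WITH PASSWORD = 'ChangeMe!123',
     DEFAULT_DATABASE = [master],
     CHECK_EXPIRATION = OFF,
     CHECK_POLICY = OFF;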
This will list the orphaned users:
EXEC sp_change_users_login 'Report'
If you already have a login id and password for this user, fix it by doing:
EXEC sp_change_users_login 'Auto_Fix', 'user'
If you want to create a new login id and password for this user, fix it by doing:
EXEC sp_change_users_login 'Auto_Fix', 'user', 'login', 'password'
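Note that sp_change_users_login is deprecated on newer SQL Server versions; the supported way to remap an orphaned user to an existing login is ALTER USER (the names below are placeholders):

ALTER USER [app_user] WITH LOGIN = [app_login];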
UPDATE:
My advice below tells you how to script a DB using SQL Server Management Studio, but the default settings in SSMS miss out all sorts of crucial parts of a database (like indexes and triggers!) for some reason. So, I created my own program to properly script a database including just about every type of DB object you may have added. I recommend using this instead. It's called SQL Server Scripter and it can be found here:
https://bitbucket.org/jez9999/sqlserverscripter
I'm surprised no-one has mentioned this, because it's really useful: you can dump out a database (its schema and data) to a script, using SQL Server Management Studio.
Right-click the database, choose "Tasks | Generate Scripts...", and then select to script specific database objects. Select the ones you want to copy over to the new DB (you probably want at least the Tables and Schemas). On the "Set Scripting Options" screen, click "Advanced", scroll down to "Types of data to script" and select "Schema and data". Click OK and finish generating the script. This produces a long script that creates the database's tables and inserts the data into them. You can then create a new database, change the USE [DbName] statement at the top of the script to the name of the new database you want to copy the old one to, and run the script; the old database's schema and data will be copied into the new one.
This allows you to do the whole thing from within SQL Server Management studio, and there's no need to touch the file system.
Below is what I do to copy a database from production env to my local env:
Create an empty database in your local sql server
Right click on the new database -> tasks -> import data
In the SQL Server Import and Export Wizard, select the production env's server name as the data source, and select your new database as the destination.
It's hard to detach your production DB or other running DBs and deal with that downtime, so I almost always use a backup/restore method.
If you also want to keep your logins in sync, check out the MS KB article on using the stored proc sp_help_revlogin to do this.
The detach/copy/attach method will take down the database. That's not something you'd want in production.
The backup/restore will only work if you have write permissions to the production server. I work with Amazon RDS and I don't.
The import/export method doesn't really work because of foreign keys, unless you do the tables one by one in the order they reference one another. You can do an import/export to a new database; that will copy all the tables and data, but not the foreign keys.
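If you do go the import/export route against a schema that already has its foreign keys, one workaround is to disable all of them, load the data in any order, and then re-enable them. A rough sketch, relying on the undocumented (but long-standing) sp_MSforeachtable procedure:

-- Disable every foreign key constraint before loading.
EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL';

-- ...run the import/export here...

-- Re-enable and re-validate the constraints afterwards.
EXEC sp_MSforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL';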
This sounds like a common operation one needs to do with database. Why isn't SQL Server handling this properly? Every time I had to do this it was frustrating.
That being said, the only painless solution I've encountered was the Sql Azure Migration Tool, which is maintained by the community. It works with SQL Server too.
I run an SP to DROP the table(s) and then use a DTS package to import the most recent production table(s) onto my development box.
Then I go home and come back the following morning. It's not elegant, but it works for me.
If you want to take a copy of a live database, do the Backup/Restore method.
[In SQLS2000, not sure about 2008:] Just keep in mind that if you are using SQL Server accounts in this database (as opposed to Windows accounts) and the master DB is different or out of sync on the development server, the user accounts will not translate when you do the restore. I've heard of an SP to remap them, but I can't remember which one it was.
Is there a way to replicate a sql server database but not push out deletes to the subscribers?
You don't mention which version of SQL Server you're running, but Andy Warren wrote an article on configuring INSERT, UPDATE, and DELETE behaviour in SQL Server 2005. You can configure this through the GUI, using his instructions:
http://www.sqlservercentral.com/articles/Replication/3202/
It's tempting to 'intervene' in a normal replication and 'disarm' the subscriber-side delete stored procedures, but this leaves no option to recover from replication failure. If the replication tries to recover, a reinitialize may be needed, and this will drop any 'stale' data that the replication agent considers deleted.
An alternative is to use normal replication, plus a script that generates insert and update triggers on all tables in the subscriber database which insert/update that data into yet a third database. This way the third DB collects all the data that ever existed, the second DB can reinitialize its subscription if it needs to (when you do, just remember that bulk inserts don't fire the insert trigger, so check for new data and add it to the third DB), and the first DB doesn't have to perform the extra work that the triggers add.
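A minimal sketch of one such trigger, assuming a hypothetical dbo.Orders table with primary key OrderID and a third database named ArchiveDB containing a table of the same shape:

CREATE TRIGGER dbo.trg_Orders_CopyToArchive
ON dbo.Orders
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- Update rows that already exist in the archive copy...
    UPDATE a
    SET    a.CustomerID = i.CustomerID,
           a.OrderDate  = i.OrderDate
    FROM   ArchiveDB.dbo.Orders AS a
    JOIN   inserted AS i ON i.OrderID = a.OrderID;

    -- ...and insert the ones the archive has never seen.
    INSERT INTO ArchiveDB.dbo.Orders (OrderID, CustomerID, OrderDate)
    SELECT i.OrderID, i.CustomerID, i.OrderDate
    FROM   inserted AS i
    WHERE  NOT EXISTS (SELECT 1 FROM ArchiveDB.dbo.Orders AS a
                       WHERE a.OrderID = i.OrderID);
END;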
Do this: drop the article. Create a new stored procedure in the corresponding database that mimics the system stored procedure (sp_del...) and takes the same parameters, but does nothing. Add the article again, and set the delete stored procedure under the article's properties to the new delete stored procedure that you created.
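A sketch of such a do-nothing delete procedure (the name and parameter list are placeholders; the parameters must match whatever the original sp_del... procedure for your article expects, typically the primary key columns):

CREATE PROCEDURE dbo.usp_noop_del_MyTable
    @pkc1 int   -- assumed single-column integer primary key
AS
BEGIN
    SET NOCOUNT ON;
    -- Intentionally do nothing: deletes from the publisher are ignored on this subscriber.
    RETURN 0;
END;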
Or you can select "Do not replicate DELETE statements". I think that works, but I haven't tried it.