After I access the database in VS 2015, I log into LocalDB in SSMS and find that the database is no longer there, and I have to right-click Databases and attach it. Is this normal? By the way, I am using SSMS 2016.
I have never heard of a disappearing database. I suppose you could have accidentally detached it, though.
What Are Detach and Attach, and How Do They Work?
We'll start with detach. When you detach a database in SQL Server, you are taking the database offline and removing it from the SQL Server instance from which you are detaching it. The database's data and log files remain intact and are left in a consistent state, so you can attach the database at a later point or to another SQL Server instance. Attach connects the data and log files from a database that has been properly detached (or that were copied from a cleanly shut down instance of SQL Server) to an instance of SQL Server and brings the database online.
How Do I Detach a Database?
You can do this in T-SQL or from the SQL Server Management Studio GUI.
In the GUI, you right-click on the database you wish to detach, select All Tasks, and click on Detach. From there you'll get the detach dialog. You can choose to drop connections first to forcibly disconnect any active connections and roll back the work they were in the middle of executing. You can also choose to update statistics before the detach.
In T-SQL:
-- You don't want to be in the database you are trying to detach
USE Master
GO
-- Optional step to drop all active connections and roll back their work
ALTER DATABASE DatabaseName
SET SINGLE_USER WITH ROLLBACK IMMEDIATE
GO
-- Perform the detach
EXEC sp_detach_db 'DatabaseName'
GO
For the system stored procedure sp_detach_db, there are two parameters that you can optionally pass in:
@skipchecks - acceptable input is 'true' or 'false'. If 'true', SQL Server will skip updating statistics during the detach. If 'false', it will update statistics. If you don't specify anything here, the statistics will be updated in SQL Server 2005 and later.
@keepfulltextindexfile - The default here is 'true'. If this is set to 'true', the full-text index metadata will not be dropped during the detach.
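For example, a detach that explicitly sets both of these optional parameters might look like this ('DatabaseName' is a placeholder):
-- Detach, running UPDATE STATISTICS first and keeping the full-text index metadata
EXEC sp_detach_db @dbname = 'DatabaseName',
    @skipchecks = 'false',
    @keepfulltextindexfile = 'true'
GO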
To see a lot more about detach and some more details on the risks I highlight below, the Books Online article for sp_detach_db is a good place to start.
How Do I Attach a Database?
You can also do this in T-SQL or from the SQL Server Management Studio GUI.
(NOTE: If you have the data and log files from a database that was not properly detached, your attach may not work. When a detach occurs, the database is brought offline and the log and data files are put into a consistent state. This also happens when the SQL Server service is cleanly shut down.)
In the GUI, you right click on the top level Databases folder for your instance and select Attach. In the next dialog you would then select the primary data file (.MDF) of the database you wish to attach and ensure you have the other files selected and their appropriate locations specified, and click OK, attaching your database.
In T-SQL, the best way to do this in SQL Server 2005 and later is through the CREATE DATABASE command; this is the method that remains supported beyond SQL Server 2012. If you want to see how to use sp_attach_db or sp_attach_single_file_db, see their Books Online articles.
When you have your log file and data files available and they are consistent this is the T-SQL approach:
-- Using Create Database and the FOR ATTACH clause to attach
CREATE DATABASE DatabaseName
ON (FILENAME = 'FilePath\FileName.mdf'), -- Main Data File .mdf
(FILENAME = 'FilePath\LogFileName.ldf'), -- Log file .ldf
(FILENAME = 'FilePath\SecondaryDataFile.ndf') -- Optional - any secondary data files
FOR ATTACH
GO
You can see more about the CREATE DATABASE statement in Books Online as well.
How Do I Detach/Attach in SQL Server Express?
It's actually the same. If you are using SQL Server Management Studio Express you can use the detach/attach dialog in the GUI described above or the T-SQL steps through SSMS Express described above as well. No difference with Express there.
If you don't have SSMS Express, you can download it (Here is the SQL Server 2012 Express version).
Or you can enter a SQLCMD session and use the same T-SQL constructs described above.
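A minimal sketch of that, assuming the default LocalDB instance name on recent versions and a placeholder database name (for SQL Server 2012 Express LocalDB the instance would be (localdb)\v11.0 instead):
sqlcmd -S "(localdb)\MSSQLLocalDB" -Q "USE master; EXEC sp_detach_db 'DatabaseName';"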
When Should I Consider Doing a Detach or Attach?
First, a word on what detach and attach are not meant to be used for: backup and recovery. Detach and attach is not a way to back up your database for routine recovery purposes. You get no transaction log backups this way, and it puts your database into a state where the database files can be accidentally deleted; it is not a good approach for this purpose at all.
That said, detach and attach are good for a few use cases (not exhaustive, feel free to edit to add or create a new answer with more):
Sometimes for migrations (although I prefer backup/restore for those as discussed in my answer here)
When you want to remove a database that is no longer actively used but keep the ability to attach it later as needed.
In certain troubleshooting situations, this may be called for.
When you don't have the space to back up or restore both the data and log files to another environment (you shouldn't ever be here, but I've used it at times to move dev databases around environments; I didn't want or need the log, so I did an attach and rebuilt the log file - see the sketch after this list).
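A minimal sketch of that last case, assuming the data file has been copied over and the old log is being deliberately thrown away (path and database name are placeholders):
-- Attach just the data file and have SQL Server build a new, empty log file
CREATE DATABASE DatabaseName
ON (FILENAME = 'FilePath\FileName.mdf')
FOR ATTACH_REBUILD_LOG
GO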
Risks and Warnings
Again, Books Online is a good resource here, but I'll call out some specific considerations to keep in mind when detaching or attaching a database:
Detach
You are taking your database offline. It won't be accessible anymore. This should be obvious, but worth calling out. This is why it isn't a great backup option.
When your database is online, SQL Server locks the files. I wouldn't recommend trying this to prove me wrong, because there could be some other situation at play, but you typically can't delete a database file (data, secondary data or log file) while SQL Server is online. This is a good thing. When you detach, you have no such protection - this can be a bad thing.
If you are dealing with database corruption and you find some article someplace that has a first step of Detach - it's wrong - if you detach a corrupt database, that may be it. You may not be attaching that database again.
Cutting and pasting your production database files around your network is a way to potentially introduce file-level corruption. Another reason I prefer backup/restore when doing migrations.
It might cause a maintenance plan to fail. The situation is that you have, as I did, set up a maintenance plan to carry out regular backups of all databases without checking best practice. This works fine so you stop thinking about it. Someone else then decides to take a database they're not using offline. The maintenance plan will fail from that point forwards until you modify the maintenance plan by checking the "ignore databases whose state is not online" option in the "Database(s)" dialog. Note that it won't just fail for the offline database - the maintenance plan will fail with an error at the point when it tries to backup the offline database so some online databases might not be backed up. (different author for this point so treat with suspicion)
Attach - Just like you shouldn't run scripts from the internet or accept packages from strangers at the airport, you shouldn't attach a database you got from someone else without taking some steps to verify it. This database could have code inside of it in triggers, stored procedures, etc. that could compromise your environment. You should review a database you want to attach in a safe, firewalled environment, not your production system.
What About Different Versions or Editions of SQL Server?
These are no different from the rules around restoring databases between versions. You can generally move a database forward across up to three versions (SQL Server 2008 to SQL Server 2012 will work, for example; SQL Server 2000 to SQL Server 2012 will not). You cannot go backwards at all via backup/restore or detach/attach - you'd have to script out the objects and the inserts and do it manually, or use a tool that does this for you. For editions, you can generally move between the main SKUs of SQL Server - for instance, you can move a database from Standard to Enterprise with no extra work. If you are using Enterprise features (say, compression or partitioning), you'll need to disable those features before you make the move, though. You can get an idea of the features you'd need to consider disabling by looking here.
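One way to get that idea from the database itself (assuming SQL Server 2008 or later) is the sys.dm_db_persisted_sku_features DMV, which lists edition-specific features in use:
-- Run in the database you want to move; no rows means no edition-specific features are in use
SELECT feature_name, feature_id
FROM sys.dm_db_persisted_sku_features;
GO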
https://dba.stackexchange.com/questions/30440/how-do-i-attach-a-database-in-sql-server
Related
In short, I need a way to copy a database from one server to another without access to backup functionality, i.e. read and recreate schema and other objects with statements like CREATE TABLE, copy data with INSERT, copy constraints with ALTER TABLE etc...
I have a database on SQL Server in a large enterprise. There is one PROD-like DB and multiple individual developer instances that are supposed to be kept in sync with it by way of running the same migration scripts. However, this is not always done and instances tend to fall behind up to the point where auto-migrate is impossible and manual migrate takes a lot of time.
Normally, we would just restore those databases from a backup, but most rights for managing the databases are reserved for DBAs; as a dev I can only read/write schema and data, but not make/restore backups, so this takes a lot of bureaucracy to do. I'm looking for a script or tool to clone the PROD-like database without using a backup.
P.S. We have SQL Source control from RedGate that we use for part of our migration process, I'm thinking if I could use it somehow?
SQL Source Control isn't the right tool to refresh dev environments from production. If you can't access a backup, you could try using a combination of SQL Compare and SQL Data Compare.
I am trying to use Copy Database Wizard to copy from my live server (shared hosting) to my local machine. Both the live and local servers are SQL 2008 R2.
I have used CDW for several years with perfect success when copying from a live SQL 2000 server to my local 2008 R2. But now that I have migrated my live database to SQL 2008 R2 the CDW is giving me this error:
Could not read metadata, possibly due to insufficient access rights.
I've learned that this error can be predicted before you even complete the CDW setup: On the page where the CDW asks you for your desired destination database name, it is SUPPOSED to populate the .mdf and .ldf files with their name-to-be and size (e.g. MB, GB).
But in my case these file names and sizes are not being shown (area is simply blank in the wizard) and then of course when I attempt to execute the package it gives me the error.
After much research I believe that reason for this error is due to the CDW requirement of "You must be a member of the sysadmin fixed server role on both the source and destination servers."
On my local server, my Windows Authentication login is listed as a Role Member for the sysadmin Server Role. However on my live server (keep in mind it is a shared SQL server with 250+ databases) the only Role Member listed is [sa].
Am I right in thinking that the only way to satisfy this requirement would be to add my specific SQL user to the live/source Server > Security > Server Roles > sysadmin role? I'm guessing that would never be done on a shared server right? Or is there some other way to make it work by messing with the specific database properties/users/roles?
I can't explain why CDW is working from the live SQL 2000 server and not the 2008 R2. I HOPE it is simply that something isn't set up right on the live database, but maybe it is due to changes that were made to SQL security over the years.
In case it matters, I must use the SMO method instead of detach/attach because it is a live database that I don't want to take down. Historically the CDW from SQL 2000 only takes 3 minutes with SMO method so speed isn't an issue anyway.
Here's my preference for a solution:
Find a way to get CDW to work, most likely by changing something on the live server. Is this possible? What would it be?
If that fails, then...
What about the idea of using CDW to create the package, but then going into BIDS and manipulating something in the package to circumvent the sysadmin role requirement? (Does it really need the metadata? I don't need anything besides the actual data tables.) Is this possible?
UPDATE 6/14/2016: Editing a CDW package in BIDS won't work as it appears to simply use the .mdf and .ldf files, which of course I don't have access to on the shared server. I think an alternative is to use Import/Export Wizard to create a package, then edit in BIDS. The annoying part is that without access to metadata the Import/Export Wizard doesn't seem to be aware of Foreign Keys, and thus doesn't know what order to process the tables in.
If that fails, then...
Is there any other way to easily automate a daily copy from my live server to local machine? The reason I like CDW is because it is super simple to use (when it works), it can be scheduled to run daily as a SQL agent job, and requires no manual work on my part. Is there a "next best thing" if CDW can't be made to work?
You'd think that a very common scenario for all websites out there would be "how do I get a copy of my live database onto my local SQL server, daily, automatically"? But maybe I'm the weird one!
Another simple solution would be the Import/Export Wizard.
In SSMS right-click the database you want transferred and select 'Tasks' and then 'Export Data...'. It will open a wizard that is very similar to that of CDW. The difference here is that I could not find a sysadmin requirement to use it.
At the end it will give the option to run immediately and/or save the SSIS package. If you save the SSIS package (I prefer to save it to disk) you can then create a schedule via a SQL Agent job.
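A rough sketch of that scheduling step through SQL Agent's msdb procedures (job name, schedule, and package path are all placeholders):
USE msdb
GO
-- Create the job and point a step at the saved SSIS package
EXEC sp_add_job @job_name = N'Nightly copy from live'
EXEC sp_add_jobstep @job_name = N'Nightly copy from live',
    @step_name = N'Run export package',
    @subsystem = N'SSIS',
    @command = N'/FILE "C:\Packages\LiveToLocal.dtsx"'
-- Schedule it daily at 2:00 AM and register it with the local SQL Agent
EXEC sp_add_schedule @schedule_name = N'Daily 2am',
    @freq_type = 4, @freq_interval = 1, @active_start_time = 020000
EXEC sp_attach_schedule @job_name = N'Nightly copy from live', @schedule_name = N'Daily 2am'
EXEC sp_add_jobserver @job_name = N'Nightly copy from live'
GO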
Seeking some advice before I dig in. Let me explain the ask and my current process.
We currently have Dev teams who require refreshed data from the Prod databases placed into their DEV databases. Their requirements change; sometimes they just need tables, and other times they need different subsets of the below:
Tables
Views
Stored Procedures
Users
Schemas
Currently the process is completely manual and is as outlined below
Disable Job responsible for replication of Prod DB (actually standby)
Highlight Prod DB and "Generate Scripts"
Select the Options that is required (see above e.g. tables views etc..)
Backup Dev DB (just in case)
Use sp_MSforeachtable to drop each table from the dev DB (see the sketch after this list)
Execute the script that was generated from step 2 on dev db
Then use the import wizard to pull the data from the prod source
An additional sql script is often required to run post import on new Dev DB (scrubber)
Enable Job on prod for repl
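A rough sketch of the table-drop step (the dev database name is a placeholder; foreign keys have to be dropped first or the DROP TABLE calls will fail on referenced tables, and sp_MSforeachtable is undocumented):
USE DevDatabaseName
GO
-- Drop all foreign key constraints so referenced tables can be dropped
DECLARE @sql nvarchar(max) = N'';
SELECT @sql += N'ALTER TABLE ' + QUOTENAME(OBJECT_SCHEMA_NAME(parent_object_id))
             + N'.' + QUOTENAME(OBJECT_NAME(parent_object_id))
             + N' DROP CONSTRAINT ' + QUOTENAME(name) + N';'
FROM sys.foreign_keys;
EXEC sp_executesql @sql;
-- Then drop every user table
EXEC sp_MSforeachtable 'DROP TABLE ?';
GO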
The SQL Server instance hosts can change as can the databases so variables will need to be passed. The SQL Servers are 2008 on Windows 2008. The box that I host the script/instance on can be any version of windows and any version of SQL Server.
I'm hoping to automate this process, at first just for the SA teams (so it could be PS or CLI). Eventually (hopefully sooner rather than later), however, I'd present this in some type of UI to the dev teams so they can manage it themselves.
I'd prefer if this all runs from a management box running SQL Server and not the SQL Server instances that host the databases. I'm not sure what options are available but I suspect SSIS could be used or PowerShell and SMO and I'm sure there are other crude ways.
I'd like this to be somewhat elegant so it's easily presentable to management. I'm comfortable with PowerShell and SQL but have no experience with SSIS.
Anyway looking for some suggestions.
EDIT:
So my requirements have actually changed. I now need to scrub the data, then back it up, then post it to a share for dev. I'm nearing completion of my script, which is PowerShell using SMO. I'll give a brief description below, and when I'm complete I'll post more details. Prod is over the WAN, as are the backups. We have log shipping enabled to my site, which is the data I have to work with. The steps are probably going to make some cringe, but it's necessary because my DB is in standby.
Create new database by looping through source DB using smo for files/file settings
Back up the newly created database with standby/read-only
Stop Job for log shipping to source db
Take source db offline
Take the newly created DB offline
Replace newly created DB files with source db files
Bring both DBs back online
Start Job for log shipping to source db
Restore the newly created / newly copied file DB with recovery
Execute .sql to scrub new DB
Backup new DB
Copy to share for dev
So that's it - all SQL-related work is done through SMO. I'm pretty much done; I've built out functions for each step, which all work, I just have to pull it all together (a rough T-SQL sketch of the final recovery and backup steps follows below).
Not pretty, but it does the job... damn that WAN!
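For reference, a minimal T-SQL sketch of those final recovery and backup steps (database name and share path are placeholders):
-- Bring the copied standby/read-only database fully online
RESTORE DATABASE DevCopyDB WITH RECOVERY
GO
-- Back it up with compression before posting it to the dev share
BACKUP DATABASE DevCopyDB
TO DISK = N'\\FileShare\DevRefresh\DevCopyDB.bak'
WITH COMPRESSION, INIT
GO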
EDIT 2:
I ended up backing up, copying local, scrubbing, backing up again with compression, then copying across the WAN overnight. I did this all through Task Scheduler / PS / SMO.
Thanks to all that offered advice.
"Automating" what you describe sounds very ambitious - probably impractical given the complexity of the requirements. I would aim for "streamlining" a process rather than total automation.
I would use the SQL Server Data Tools (SSDT) for steps 2, 3, 5 & 6. It can generate and run alter scripts for you, based on a schema compare.
https://msdn.microsoft.com/en-au/data/tools.aspx
You typically start with SSDT by "Importing" the entire database schema into a Visual Studio Project. You can then use the "Schema Compare" tool to see what the differences are between your project and various database environments. Combined with source control (e.g. Visual Studio Online), this will give you a much better handle on what the changes are between environments, avoiding surprises.
I like the Import Wizard for step 7 - I would save the generated SSIS packages and edit them to cover step 8 (if there is ever any reuse). SSDT's Schema Compare tool can tell you if there are any changes, implying you should regenerate the SSIS package for that table.
Needing to ship schema changes from Prod to Dev is presumably a red flag that developers/dbas are risking Prod-first changes. SSDT will help you monitor and quantify that.
I would agree with Mike Honey, there are some big red flags here that something isn't right.
To answer the specific question: if you need production data from all of your tables (even a subset), I would personally back up and restore the production data on whatever your schedule is (nightly?) - you should have a backup already, so restoring it should be simple.
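A minimal sketch of that restore, with placeholder paths and logical file names (check yours with RESTORE FILELISTONLY):
-- Restore last night's production backup as a dev copy, relocating the files
RESTORE DATABASE DevCopy
FROM DISK = N'\\BackupShare\Prod\Prod_Full.bak'
WITH MOVE N'Prod_Data' TO N'D:\Data\DevCopy.mdf',
     MOVE N'Prod_Log' TO N'L:\Log\DevCopy.ldf',
     REPLACE, STATS = 10
GO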
Once it has been restored, you can delete the bits you don't want, and if there is any personally identifiable information, anonymize it!
Once you are happy with the data you can apply the latest dacpac from the SSDT project to make sure it has all of the developer changes.
This approach has two benefits, firstly it is simpler than copying all of the tables individually and secondly it tests your deployment process pretty effectively for when you do go to production.
That being said, regarding the big red flags: I would really question why you need production data. The way it should generally work is that each developer has a test database with little to no data in it, plus unit tests that verify that everything works. If you need more data for performance testing, then use something like the Red Gate SQL Data Generator tool to generate realistic test data rather than full production data.
I have seen cases where it was thought production data was required but it actually wasn't. If it is too difficult to produce realistic test data, that in itself is a sign of some bad design, etc. Of course, every environment is unique, so maybe they do need it!
One of the tasks in a project that I'm working on is to migrate an existing database on SQL Server 2000, to a new server which runs SQL Server 2008. This database is extremely huge, with 23 million rows and a 78GB mdf file.
What is the best way to migrate a database of this size?
My current approach would be to:
allow for application downtime so that the application doesn't write records to the database
perform a full backup on SQL Server 2000 (see the T-SQL sketch after this list).
move backup file over to new server across the network.
restore full backup on SQL Server 2008.
configure the application to refer to the database on the new server
restart application.
decommission the database on SQL Server 2000.
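A minimal sketch of the backup and restore steps (2 and 4) as I would run them, with placeholder paths and logical file names:
-- On the SQL Server 2000 instance
BACKUP DATABASE BigDatabase
TO DISK = 'D:\Backups\BigDatabase_Full.bak'
WITH INIT
-- On the SQL Server 2008 instance, after copying the .bak file across
RESTORE DATABASE BigDatabase
FROM DISK = 'E:\Backups\BigDatabase_Full.bak'
WITH MOVE 'BigDatabase_Data' TO 'E:\Data\BigDatabase.mdf',
     MOVE 'BigDatabase_Log' TO 'F:\Log\BigDatabase.ldf',
     STATS = 10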
However, I'm not sure how much application downtime that would involve.
Are there any easier approaches, or an approach that involves very little downtime? Can a backup be taken while the application is running? Obviously I would need to stop the application when the backup file is transferred and the restore is completed. Interested to hear your approaches to a task like this.
If you're open to downtime:
Detach the database
Copy data file(s) and log file(s) to the new server
Attach the database on the new server instance
Detaching closes the database and finalizes the files so they safely can be moved (or backed up via filesystem backup). It will no longer be accessible on the server instance until you reattach it.
Don't cut and paste / move the data and log files, just in case something bombs during the copy.
There are several other things to keep in mind when migrating to a new server instance, like making sure logins exist on the new instance, features in use that might be deprecated, etc.
Here's the reference for detach/attach.
I have two SQL Servers and I want to do a backup on one of them and then restore that on the other server. The catch is that the database already exists on the server I'm restoring to, and I want to keep the security settings the way they are on the server I'm restoring to.
The other catch is that I want to do all of this from PowerShell, so no GUI operations.
Or is this maybe the wrong solution to the problem. Is there maybe another way to move the data without doing a backup/restore and keeping the security settings?
In my environment we use PowerShell scripts with Red Gate Compare Professional to restore databases and persist security plus database object differences. The process is fairly simple:
Create a Red Gate snapshot of the destination database using the Red Gate command-line tool. The file it generates is very small and only contains users, permissions, and objects - basically everything except the data.
Restore the source database over the destination database using T-SQL
Use the command-line Red Gate tools to compare and synchronize the snapshot created in step 1 to the newly restored database. Any security or object changes are restored.
This solution does require purchasing the Professional edition of SQL Compare and installing the tool on the development server from which the script can be executed. All of this can easily be put into a SQL Agent job. The use of PowerShell is really kind of basic, since we're just executing sqlcompare.exe.
There is an article here explaining how to script SQL Server permissions through SMO via PowerShell. Your scenario would then be to script the permissions from your initial DB before restoring the backup, then execute the generated script after the backup has been restored.
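As a rough alternative in plain T-SQL (not the SMO approach from that article), you could generate GRANT/DENY statements for the explicit object-level permissions before the restore and re-run them afterwards; this sketch covers object grants only, not users or roles:
-- Generate GRANT/DENY statements for explicit object-level permissions
SELECT dp.state_desc + N' ' + dp.permission_name
     + N' ON ' + QUOTENAME(OBJECT_SCHEMA_NAME(dp.major_id)) + N'.' + QUOTENAME(OBJECT_NAME(dp.major_id))
     + N' TO ' + QUOTENAME(USER_NAME(dp.grantee_principal_id)) + N';'
FROM sys.database_permissions AS dp
WHERE dp.class = 1              -- object or column permissions
  AND dp.state IN ('G', 'D');  -- skip WITH GRANT OPTION rows for simplicity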