I'm trying to back up my production database to my local dev machine with the following constraints:
It will be a regular backup, so using the UI should (ideally?) be avoided.
Some tables are marked to be deleted, so these should not be included in the backup.
I would like to be able to hand the solution (file/package/etc.) to other members of the team; they should only have to change a couple of variables in one file before they can execute it and get their own backup.
The DB is over 100 GB and contains data that I won't need. I have identified the largest tables and would like to take only, say, 5k rows from each; this should give me enough data for my purposes while limiting the space used on my local drives.
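For the data step, the shape of what I'm after per large table is roughly this (names are placeholders; I'd generate one such statement per identified table):

-- Take only ~5k of the newest rows from each oversized table
SELECT TOP (5000) *
INTO dbo.BigTable              -- target table in my local copy
FROM ProdCopy.dbo.BigTable     -- hypothetical source, e.g. via a linked server
ORDER BY SomeDateColumn DESC;  -- assumes a usable date or key column exists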
I have begun by trying to back up the schema only, using the following methods:
1. Using the UI to back up the schema only (Tasks -> Generate Scripts)
I get the following error:
Microsoft.SqlServer.Management.Smo.FailedOperationException: Discover dependencies failed. ---> System.ArgumentException: Item has already been added.
Key in dictionary: 'Server[@Name='PRODSQL']/Database[@Name='Database1']/UnresolvedEntity[@Name='SomeObjectName' and @Schema='Some.Schema.SomeObjectName']'
Key being added: 'Server[@Name='PRODSQL']/Database[@Name='Database1']/UnresolvedEntity[@Name='SomeObjectName' and @Schema='Some.Schema.SomeObjectName']'
   at System.Collections.SortedList.Add(Object key, Object value)
   at Microsoft.SqlServer.Management.Smo.DependencyTree..ctor(Urn[] urns, DependencyChainCollection dependencies, Boolean fParents, Server server)
   at Microsoft.SqlServer.Management.Smo.DependencyWalker.DiscoverDependencies(Urn[] urns, Boolean parents)
   --- End of inner exception stack trace ---
   at Microsoft.SqlServer.Management.SqlScriptPublish.GeneratePublishPage.worker_DoWork(Object sender, DoWorkEventArgs e)
   at System.ComponentModel.BackgroundWorker.OnDoWork(DoWorkEventArgs e)
   at System.ComponentModel.BackgroundWorker.WorkerThreadStart(Object argument)
So I moved on.
2. Tasks -> Copy Database
I get a message saying there is not enough space on disk
3. Extract Data-tier Application
I get the same error as in 1 above.
4. A PowerShell script, plus a batch file calling sqlcmd on the generated .sql files after the PS script has run.
I was sure this method would work; it took me two days to get this far, and I'm still working through multiple errors.
Basically I am doing the following:
Script out the objects from the source DB (schemas, stored procedures, tables, views, UDFs, triggers, indexes) to .sql files. I roughly followed http://cfmumbojumbo.com/index.cfm/coding/using-powershell-to-backup-your-stored-procedures-and-triggers/ with some more work added; a stripped-down sketch of this step follows.
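Stripped down, the scripting step looks roughly like this (SMO via PowerShell; server, database, and output path are the ones from my setup, and this is just the stored-procedure part):

[void][System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO')
$srv = New-Object Microsoft.SqlServer.Management.Smo.Server 'PRODSQL'
$db  = $srv.Databases['Database1']
$outDir = 'C:\Database\DatabaseScripts\TestDatabase\StoredProcedures'
foreach ($sp in ($db.StoredProcedures | Where-Object { -not $_.IsSystemObject })) {
    # Script() returns the CREATE statement(s); one .sql file per procedure
    $sp.Script() | Out-File -FilePath (Join-Path $outDir ($sp.Schema + '.' + $sp.Name + '.sql'))
}

(and similarly for tables, views, UDFs, and triggers).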
If the database already exists on my server, kill, drop, then recreate it (DropCreate.sql):
IF (DB_ID(@DatabaseName) IS NOT NULL)
BEGIN
    DECLARE @SQL VARCHAR(MAX)
    SELECT @SQL = COALESCE(@SQL, '') + 'KILL ' + CONVERT(VARCHAR, SPId) + ';'
    FROM master..sysprocesses
    WHERE DBId = DB_ID(@DatabaseName) AND SPId <> @@SPID
    EXEC(@SQL);
END
DROP DATABASE MYDATABASE
CREATE DATABASE MYDATABASE ON PRIMARY (...)
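An alternative I've been considering instead of the kill loop, since it rolls back open transactions and avoids touching system sessions (just a sketch, same placeholder database name):

IF DB_ID('MYDATABASE') IS NOT NULL
BEGIN
    -- Boot all other connections and roll back their work in one step
    ALTER DATABASE MYDATABASE SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
    DROP DATABASE MYDATABASE;
END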
The .bat file is essentially doing this:
sqlcmd -S %Server% -U %UserName% -P %Password% -i C:\Database\DatabaseScripts\TestDatabase\DropCreateDB.sql
:: loop through and execute multiple .sql files in the directory
for /f %f in (`dir /b C:\Database\DatabaseScripts\TestDatabase\2018-04-18-15-13-13\StoredProcedures\`) do sqlcmd -S %Server% -d %Database% -U %UserName% -P %Password% -i %f
:: just one .sql file in this directory, execute it
sqlcmd -S %Server% -d %Database% -U %UserName% -P %Password% -i C:\Database\DatabaseScripts\TestDatabase\2018-04-18-15-13-13\Schemas\AllSchemas.sql
sqlcmd -S %Server% -d %Database% -U %UserName% -P %Password% -i C:\Database\DatabaseScripts\TestDatabase\2018-04-18-15-13-13\Tables\AllTables.sql
.............
The latest error I'm experiencing is:
Changed database context to 'master'.
Msg 6107, Level 14, State 1, Server MYSERVER, Line 1
Only user processes can be killed.
do was unexpected at this time.
Everywhere I turn I hit new errors. I've spent over two days on this and haven't even gotten to extracting the data yet.
TL;DR: Is there an easier way to back up a SQL Server database's schema plus the top n rows of data from certain tables?
Related
I have to restore a database and am following the official documentation, which involves two steps:
- List the files contained in the backup.
- Run the RESTORE command, relocating those files as needed.
However, I am getting an "already claimed" error.
I tried using different names, but that is not possible since the backup contains specific files. I also tried answers from various other sites, but they all use the GUI.
The first command that I ran was:
sudo docker exec -it sql1 /opt/mssql-tools/bin/sqlcmd -S localhost \
-U SA -P '<YourStrong#Passw0rd>' \
-Q 'RESTORE FILELISTONLY FROM DISK = "/var/opt/mssql/backup/us_national_statistics.bak"' \
| tr -s ' ' | cut -d ' ' -f 1-2
I got the following output:
LogicalName                    PhysicalName
-----------------------------  ------------
us_national_statistics         C:\Program
us_national_statistics_log     C:\Program
Then, as per the documentation, I ran this command:
sudo docker exec -it sql1 /opt/mssql-tools/bin/sqlcmd \
-S localhost -U SA -P '<YourStrong#Passw0rd>' \
-Q 'RESTORE DATABASE US_NATIONAL FROM DISK = "/var/opt/mssql/backup/us_national_statistics.bak" WITH MOVE "us_national_statistics" TO "C:\Program", MOVE "us_national_statistics_log" TO "C:\Program"'
Here, I get the following error:
Msg 3176, Level 16, State 1, Server 0a6a6aac7476, Line 1
File 'C:\Program\New' is claimed by 'us_national_statistics_log'(2) and 'us_national_statistics'(1). The WITH MOVE clause can be used to relocate one or more files.
Msg 3013, Level 16, State 1, Server 0a6a6aac7476, Line 1
RESTORE DATABASE is terminating abnormally.
I expect the database to be restored.
You can't restore to C:\Program for multiple reasons. That's not a full path (your cut command truncated the PhysicalName at the first space in "Program Files"); the data and log can't both be put in the same file; you don't typically have write access to the root of any drive; and a C:\ path is not valid in Docker or Linux.
You need the LogicalName, but you should not use the PhysicalName directly: not when restoring to Docker or Linux, not when restoring a database alongside an existing copy you want to keep, and not when restoring to a different instance (which will more than likely have a different data folder structure).
Try:
RESTORE DATABASE US_NATIONAL_COPY
FROM DISK = '/var/opt/mssql/backup/us_national_statistics.bak'
WITH REPLACE, RECOVERY,
MOVE 'us_national_statistics' TO '/var/opt/mssql/data/usns_copy.mdf',
MOVE 'us_national_statistics_log' TO '/var/opt/mssql/data/usns_copy.ldf';
I have the following T-SQL code configured to run on a daily basis using a SQL Server Agent job. My database runs on SQL Server 2012.
INSERT INTO OPENROWSET('Microsoft.ACE.OLEDB.12.0','Text;Database=C:\;HDR=YES;FMT=Delimited','SELECT * FROM [myfile.csv]')
SELECT ReservationStayID,NameTitle,FirstName,LastName,ArrivalDate,DepartureDate FROM [GuestNameInfo]
My issue is that the output of this query is being appended to the existing records in the csv file. I want the output to overwrite the existing content each time the SQL Server Agent job is run.
How do I modify my query to achieve this?
I would recommend first renaming your existing myfile.csv to something else (like myfile_[DateOfLastRun].csv). Then start fresh with a new myfile.csv. That way if something goes wrong outside this process and you need whatever was in myfile.csv the day/week/month before, you have it.
You could use BCP for this in a BAT file:
set vardate=%DATE:~4,10%
set varDateWithoutSlashes=%vardate:/=-%
bcp "SELECT someColumns FROM aTable" queryout myFile_%varDateWithoutSlashes%.csv -t, -c -T
The example above creates your CSV with the date already in the name. You could also rename the existing file, then create your new myfile.csv without the date:
set vardate=%DATE:~4,10%
set varDateWithoutSlashes=%vardate:/=-%
ren myFile.csv myFile_%varDateWithoutSlashes%.csv
bcp "SELECT someColumns FROM aTable" queryout myFile.csv -t, -c -T
Be sure to build in cleanup of old files somewhere - that can even be done in the same batch process as this one.
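For example, a line like this in the same .bat could prune exports older than 30 days (a sketch; the path and file mask are assumptions to adjust for your setup):

forfiles /p "C:\exports" /m myFile_*.csv /d -30 /c "cmd /c del @file"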
You can add the database and server name to the bcp line; by default it connects to the local server and the user's default database (see the bcp documentation for even more options):
bcp "SELECT someColumns FROM aTable" queryout myFile.csv -t, -c -T -d databaseName -S serverName
I have a database which contains 50 tables (5 schemas, 5 tablespaces), and I tried to take a backup of a few tables (each in a different tablespace) using the following command:
$ pg_dump -U my_db_user my_db_name -t my_table_1 -t my_table_2 -t my_table_3 > ttables.sql
The above command works fine for taking the .sql backup, but some table columns contain NULL values. While restoring the dump using the following command, I get errors caused by the null (\N) values in the backup file (ttables.sql):
$ cat ttables.sql | psql -d new_db -U new_db_user
Is there any way to avoid the \N characters in the dump file? Or is there something wrong with the backup/restore commands I have used?
(Postgres version 9.1)
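One idea I have not tried yet: dumping with INSERT statements instead of the default COPY format, which should avoid the \N markers entirely. Would that be a reasonable workaround?

$ pg_dump -U my_db_user my_db_name --inserts -t my_table_1 -t my_table_2 -t my_table_3 > ttables.sql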
I have an existing database that I am trying to keep up to date on a daily basis. I get a daily dump of .sql files. The batch script below created and populated the database the first time I ran it, but it doesn't work when I try to update the database with it.
@echo off
ECHO %USERNAME% started the batch process at %TIME% >output.txt
for %%f in (*.sql) do (
    sqlcmd.exe -S servername -E -d DatabaseName -i "%%f" >>output.txt
)
pause
Is there a different command for updating a database with sql files?
This is the output I get:
HUTRC started the batch process at 9:55:12.25
Changed database context to 'master'.
Msg 15416, Level 16, State 1, Server HUTRC1-HP, Procedure sp_dbcmptlevel, Line 67
Usage: sp_dbcmptlevel [dbname [, compatibilitylevel]]
Valid values of the database compatibility level are 100, 110, or 120.
Here is what the SQL file looks like. It is very long; I've included just the first few lines:
USE [master]
GO
IF NOT EXISTS (SELECT [name] FROM sys.databases WHERE name = N'Migration')
BEGIN
CREATE DATABASE [Migration] COLLATE SQL_Latin1_General_CP1_CI_AS
END
GO
EXEC dbo.sp_dbcmptlevel @dbname=N'Migration', @new_cmptlevel=90
The error message tells you precisely where to look for the problem and what it is (emphasis added):
Procedure sp_dbcmptlevel, Line 67
Usage: sp_dbcmptlevel [dbname [, compatibilitylevel]]
Valid values of the database compatibility level are 100, 110, or 120.
Examining your SQL script for sp_dbcmptlevel shows that it uses a different value:
EXEC dbo.sp_dbcmptlevel @dbname=N'Migration', @new_cmptlevel=90
                                                             ^^
You'll need to either edit the SQL script to use a valid compatibility level, or downgrade your server to the same version as the source server (one that still accepts level 90).
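For example, on an instance that accepts the listed values, either of these would work (110 chosen arbitrarily; the ALTER DATABASE form is the non-deprecated equivalent of the procedure):

EXEC dbo.sp_dbcmptlevel @dbname = N'Migration', @new_cmptlevel = 110;
-- or, preferably:
ALTER DATABASE [Migration] SET COMPATIBILITY_LEVEL = 110;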
I have all the scripts to do:
Set up a database.
Create schema/s.
Create tables.
Create stored procedures.
I would like to write a batch file that will have SQL Server run those scripts, so that my database can be created more easily and quickly. For the sake of this example, let's assume that I have a folder C:\folder containing the files SetDatabase.sql, SetSchema.sql, SetTable.sql, and SetSP.sql. How would I set all that up on localhost\TSQL2012?
You can do this in PowerShell using sqlcmd:
sqlcmd -S serverName\instanceName -i scripts.sql
The above statement will execute a single script.
To run all your scripts in order, you can use the :r command in a wrapper file (scripts.sql):
:r C:\..\script1.sql
:r C:\..\script2.sql
....
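For the folder in the question, the wrapper file and the call might look like this (paths and instance name taken from the question):

-- C:\folder\scripts.sql: run the four setup scripts in dependency order
:r C:\folder\SetDatabase.sql
:r C:\folder\SetSchema.sql
:r C:\folder\SetTable.sql
:r C:\folder\SetSP.sql

and then, from a command prompt or batch file:

sqlcmd -S localhost\TSQL2012 -i C:\folder\scripts.sql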
set _connectionCredentialsMaster=-S MyServer\MyInstance -d Master -U sa -P mypassword
set _connectionCredentialsMyDatabase=-S MyServer\MyInstance -d MyDatabase -U sa -P mypassword
set _sqlcmd="%ProgramFiles%\Microsoft SQL Server\110\Tools\Binn\SQLCMD.EXE"
%_sqlcmd% -i MyFileCreateDatabase001.sql -b -o MyFileCreateDatabase001.Sql.log %_connectionCredentialsMaster%
%_sqlcmd% -i MyFile001.sql -b -o MyFile001.Sql.log %_connectionCredentialsMyDatabase%
%_sqlcmd% -i MyFile002.sql -b -o MyFile002.Sql.log %_connectionCredentialsMyDatabase%
set _connectionCredentialsMaster=
set _connectionCredentialsMyDatabase=
set _sqlcmd=
Just remember: when you run the CREATE DATABASE statement, you are actually using the master database; only after MyDatabase is created can you connect to it. That is why the first line in the example above connects to master.
This approach lets you set the credentials once at the top and keep one line per script file.
Use SQL Server Data Tools (SSDT) to implement this. It's worth studying before you start:
http://msdn.microsoft.com/en-in/data/tools.aspx