I run the query below (from inside PowerShell) from different servers, and all is fine:
Invoke-Sqlcmd -Query "
SELECT [ServerName]=@@servername,m.Match_id, m.HostfamilyId, hf.NIEFlag, ra.DS2019Sent
FROM APIA_Repl_pub.dbo.repl_HostFamily hf
INNER JOIN APIA_Repl_pub.dbo.repl_Match m ON m.HostfamilyId = hf.HostFamilyID
INNER JOIN APIA_Repl_Pub.dbo.repl_Aupair ra on ra.AuPairID = m.AupairId
WHERE ra.JunoCore_applicationID = 459630
" -ServerInstance "CTSTGDB"
I sometimes even run the same query against several servers to compare the results.
I just want to run the same query from inside SSMS. Here is what I do and what I get (I have even simplified the query, but I still get the error below):
GO
declare @sql varchar(8000)
SET @SQL=N'powershell.exe -command Invoke-Sqlcmd -Query "SELECT [ServerName]=@@servername" -ServerInstance "CTSTGDB"'
IF OBJECT_ID('tempdb..#Radhe','U') IS NOT NULL
DROP TABLE #RADHE
CREATE TABLE #RADHE(I INT NOT NULL IDENTITY(1,1) PRIMARY KEY CLUSTERED, OUTPUT NVARCHAR(4000))
INSERT INTO #RADHE
EXEC xp_cmdshell @sql
SELECT * FROM #RADHE
GO
OUTPUT
Invoke-Sqlcmd : A positional parameter cannot be found that accepts argument
'[ServerName]=@@servername'.
At line:1 char:1
+ Invoke-Sqlcmd -Query SELECT [ServerName]=@@servername -ServerInstance ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (:) [Invoke-Sqlcmd], ParameterBindingException
+ FullyQualifiedErrorId : PositionalParameterNotFound,Microsoft.SqlServer.Management.PowerShell.GetScriptCommand
NULL
I found an interesting article: "6 methods to write PowerShell output to a SQL Server table". It is related, and could be an alternative to the way I am working.
But what exactly is the problem with Invoke-Sqlcmd from inside SSMS?
As I understand it, you are struggling to execute a query against a different server than the one you are connected to. Well, I can think of at least two ways of doing this in SSMS:
If your current server has a linked server for the one you are after, you can use the AT clause of the EXEC statement, like this:
declare @Sql nvarchar(max) = N'SELECT [ServerName]=@@servername,m.Match_id,
m.HostfamilyId, hf.NIEFlag, ra.DS2019Sent
FROM APIA_Repl_pub.dbo.repl_HostFamily hf
INNER JOIN APIA_Repl_pub.dbo.repl_Match m ON m.HostfamilyId = hf.HostFamilyID
INNER JOIN APIA_Repl_Pub.dbo.repl_Aupair ra on ra.AuPairID = m.AupairId
WHERE ra.JunoCore_applicationID = 459630';
-- This example assumes that the linked server name is the same as the remote server itself
exec (@Sql) at [CTSTGDB];
If a linked server is not available, you can use SQLCMD mode for the query:
:connect CTSTGDB
go
SELECT [ServerName]=@@servername,m.Match_id, m.HostfamilyId, hf.NIEFlag, ra.DS2019Sent
FROM APIA_Repl_pub.dbo.repl_HostFamily hf
INNER JOIN APIA_Repl_pub.dbo.repl_Match m ON m.HostfamilyId = hf.HostFamilyID
INNER JOIN APIA_Repl_Pub.dbo.repl_Aupair ra on ra.AuPairID = m.AupairId
WHERE ra.JunoCore_applicationID = 459630;
go
:exit
go
This mode can be toggled by the Query -> SQLCMD Mode menu option.
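As for what exactly goes wrong in the original xp_cmdshell attempt: the double quotes around the -Query argument are consumed during command-line parsing before PowerShell evaluates the command, so Invoke-Sqlcmd receives the query text as several positional arguments (which is exactly what the error message shows). A minimal, untested sketch that keeps the query in one piece is to wrap the whole PowerShell command in double quotes and use single quotes (doubled for the T-SQL literal) inside it:
declare @sql varchar(8000)
SET @sql = 'powershell.exe -command "Invoke-Sqlcmd -Query ''SELECT [ServerName]=@@servername'' -ServerInstance ''CTSTGDB''"'
EXEC xp_cmdshell @sql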
I have a SQL stored procedure that executes a powershell file, and I want to log any errors that occur executing the powershell file into a SQL table.
My SQL stored procedure:
CREATE PROCEDURE [dbo].[sp_RemoveEmptyFiles]
@filePath varchar(260)
AS
DECLARE @sql as varchar(4000)
DECLARE @powershellFileLocation varchar(260)
SET @powershellFileLocation = '\\MyComputerName\Files\Powershell\cleandirectory.ps1'
SET @sql = 'powershell -c "& { . ' + @powershellFileLocation + '; clean-directory ' + @filePath + ' }"'
EXEC master..xp_cmdshell @sql
My powershell script:
function clean-directory {
param ([string]$path)
try
{
if ($path.Length -le 0 -or -not (test-path -literalPath $path)) {
throw [System.IO.FileNotFoundException] """$path"" not a valid file path."
}
#
#
# Clean directories here
#
#
}
catch
{
write-host $error
}
}
Right now, if the script is successful, it returns an output of NULL and a Return Value of 0. The goal is to replace that catch block with something that will save those errors to a SQL table.
My first (inefficient) thought is to then invoke a SQL command in that catch block, something like:
$commandText = "INSERT INTO ErrorLogTable (TimeStamp, ErrorMessage) VALUES ($(Get-Date), $error)"
$command = $conn.CreateCommand()
$command.CommandText = $commandText
$command.ExecuteNonQuery()
But this hardly seems like the best way to do it: connecting back to the SQL Server the stored procedure was called from, creating a new command, and so on. It should be noted that the powershell script, the file path argument of the stored procedure, and the SQL server are in different locations, so I do need to keep permission issues in mind (which is why I am trying to avoid calling Invoke-Sqlcmd from my powershell script).
Is there a way to get the output of the powershell file in the stored procedure, and then save the error message into a table from there?
Since I'm using xp_cmdshell, I can capture that output by changing the SQL script as follows:
SET @sql = 'powershell -c "& { . ' + @powershellFileLocation + '; clean-directory ' + @filePath + ' }"'
-- table to hold the output from the cmdshell
CREATE TABLE #PowershellOutput ([Output] varchar(1000))
-- execute actual powershell script
INSERT INTO #PowershellOutput ([Output]) EXEC master..xp_cmdshell @sql
This captures each line sent to the console as a separate row. It's not the best solution for capturing error messages from powershell, so I'm still looking for a better way to capture them. I'll either join these rows to a single output, or find a better way to capture only the powershell error (as opposed to the entire stacktrace).
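One idea I'm leaning towards (an untested sketch, not a final answer): have the catch block write a single prefixed line with Write-Output, so xp_cmdshell captures just the exception message rather than PowerShell's full error rendering, and the SQL side can pick it out with WHERE [Output] LIKE 'ERROR:%':
catch
{
    # emit one line per failure so the SQL side can filter on the prefix
    Write-Output "ERROR: $($_.Exception.Message)"
}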
I am using SQL Server 2005. I have a few SSIS packages located here: C:\SSIS
This code below is used to execute all the packages but I still need to place each package name in a table called Packages.
Can I execute all the packages without having to save the package name? I just want to supply the path of where they all are sitting and want SQL to do the rest.
DECLARE @package_name varchar(200)
Declare @PackageCount int
Declare @X int
Declare @FilePath varchar(200)
Declare @cmd varchar(1000)
Set @X = 1
Set @PackageCount = (Select COUNT(*) from Packages)
set @FilePath = 'C:\SSIS\'
While (@X <= @PackageCount)
Begin
;With PackageList as
(
Select PackageName, Row_Number() Over(Order by PackageName) as Rownum
From Packages
)
SELECT @package_name = PackageName
FROM PackageList
Where Rownum = @X
select @cmd = 'DTExec /F "' + @FilePath + @package_name + '"'
print @cmd
Set @X = @X + 1
exec master..xp_cmdshell @cmd
End
You would need to use xp_cmdshell to execute a loop on the folder and get the file names.
Here is an example of how to do it. Of course, you'll need to clean up the result and keep only the rows that matter.
SET NOCOUNT ON
DECLARE @Command VARCHAR(100)
SET @Command = 'dir C:\test'
DECLARE @Folder VARCHAR(100)
SET @Folder = 'C:\test'
DECLARE @FilesInAFolder TABLE (FileNamesWithFolder VARCHAR(500))
INSERT INTO @FilesInAFolder
EXEC MASTER..xp_cmdshell @Command
select * from @FilesInAFolder
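As a rough, untested sketch of how this could be tied into the DTExec loop so that the Packages table is not needed at all (it assumes the packages live in C:\SSIS and that dir /b returns bare file names):
DECLARE @Files TABLE (Id int IDENTITY(1,1), FileName varchar(500))
INSERT INTO @Files (FileName)
EXEC master..xp_cmdshell 'dir /b "C:\SSIS\*.dtsx"'
-- drop the NULL row and anything that is not a package file
DELETE FROM @Files WHERE FileName IS NULL OR FileName NOT LIKE '%.dtsx'
DECLARE @Id int, @cmd varchar(1000)
SELECT @Id = MIN(Id) FROM @Files
WHILE @Id IS NOT NULL
BEGIN
    SELECT @cmd = 'DTExec /F "C:\SSIS\' + FileName + '"' FROM @Files WHERE Id = @Id
    EXEC master..xp_cmdshell @cmd
    SELECT @Id = MIN(Id) FROM @Files WHERE Id > @Id
END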
How to enable xp_cmdshell:
EXEC sp_configure 'show advanced options', 1
GO
RECONFIGURE
GO
EXEC sp_configure 'xp_cmdshell', 1
GO
RECONFIGURE
GO
Why not just use SSIS to do this? Drop a Foreach Loop container onto a new package. Grab the fully qualified path for all .dtsx files. Inside the foreach loop, have an Execute Package Task and assign it the current package path.
This would reduce your problem to run a single package which you can solve through a host of measures (.NET, SQL Agent, windows scheduler, etc)
I'm not looking to relocate the database to another server entirely, but just move the data file(s) and log file to another drive with more space. I've seen conflicting directions on how to do this, so I'm looking for the recommended proper way of doing it.
Detach the Database:
use master
go
sp_detach_db 'mydb'
Move the Database files (Xcopy through xp_cmdshell shown):
DECLARE @SRCData nvarchar(1000)
SET @SRCData = N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\mydb.mdf';
DECLARE @SRCLog nvarchar(1000)
SET @SRCLog = N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\mydb_log.ldf';
DECLARE @FILEPATH nvarchar(1000);
DECLARE @LOGPATH nvarchar(1000);
SET @FILEPATH = N'xcopy /Y "' + @SRCData + N'" D:\Data';
SET @LOGPATH = N'xcopy /Y "' + @SRCLog + N'" E:\Log';
exec xp_cmdshell @FILEPATH;
exec xp_cmdshell @LOGPATH;
ReAttach Database:
sp_attach_db 'mydb', 'D:\Data\mydb.mdf', 'E:\Log\mydb_log.ldf'
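Note that sp_attach_db is deprecated in newer versions in favour of CREATE DATABASE ... FOR ATTACH; an equivalent attach step (a sketch, using the same file locations as above) would be:
CREATE DATABASE mydb
ON (FILENAME = 'D:\Data\mydb.mdf'),
   (FILENAME = 'E:\Log\mydb_log.ldf')
FOR ATTACH;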
There's more detail at this Microsoft KB article.
Another way: detach the database files (Database -> Tasks -> Detach), move them to the new drive, and then attach them again.
But the way described by Jay S is the simplest.
To be absolutely safe, I would do the following:
Backup the database to a BAK file.
Take the current database offline, or delete it if you want to.
Restore the database and change the location of the MDF and LDF files.
Sample scripts:
-- Get the file list from a backup file.
-- This will show you current logical names and paths in the BAK file
RESTORE FILELISTONLY FROM disk = N'C:\Backups\MyDatabaseName.bak'
-- Perform the restore of the database from the backup file.
-- Replace 'move' names (MDFLogicalName, LDFLogicalName) with those found in
-- the previous filelistonly command
restore database MyDatabaseName
from disk = N'C:\Backups\MyDatabaseName.bak'
with move 'MDFLogicalName' to 'D:\SQLData\MyDatabaseName.mdf',
move 'LDFLogicalName' to 'D:\SQLLogs\MyDatabaseName_log.ldf',
replace, stats=10;
Notes
The first script will get you the current names and paths that you'll need in the second script. The second script restores the database back to the name you want it to have, but lets you change where the files are stored. In the example above, it moves the MDF and LDF files to the D: drive.
I'd rather not enable xp_cmdshell on my SQL Server instance, so I wrote a function to do this using Powershell instead; it was especially useful when I had to move a large number of databases.
function Move-Database
{
param ($database, $newPath)
$paths = Invoke-SqlCmd "SELECT master_files.physical_name as Path
FROM sys.databases
JOIN sys.master_files ON master_files.database_id = databases.database_id
WHERE databases.name = '$database';";
$paths = $paths | % { $_.Path };
if (!$paths)
{
throw "Unknown database '$database'";
}
Write-Host "Setting $database to single-user mode...";
Invoke-SqlCmd "ALTER DATABASE [$database] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;";
Write-Host "Detaching $database";
Invoke-SqlCmd "EXEC sp_detach_db '$database';";
if (!(test-path $newPath))
{
[void](mkdir $newPath);
}
$clauses = @();
foreach ($oldFile in $paths)
{
$filename = [System.IO.Path]::GetFileName($oldFile);
$newFile = [System.IO.Path]::Combine($newPath, $filename);
$clauses += "(FILENAME = `"$newFile`")";
Write-Host "Moving $oldFile to $newFile";
mv $oldFile $newFile;
}
$clauses = $clauses -join ", ";
Write-Host "Re-attaching $database";
Invoke-SqlCmd "CREATE DATABASE [$database] ON $clauses FOR ATTACH;";
Write-Host "All done!";
}
You can use it like so:
Move-Database -database "MyDatabase" -newPath "D:\SqlData";
I also think this method is a bit more robust than the others: what if your database is split across many files, or you have an unusual naming convention for the logs, for example?
I needed to move multiple databases within same server, so I expanded the accepted solution a bit, to avoid copying and pasting or retyping commands. This allows moving data files in one script run, only changing the database name. Note this assumes that advanced commands are enabled; if not, use sp_configure. The data and log files are assumed to be in the same directory.
use master
DECLARE @DBName nvarchar(50)
SET @DBName = 'YOUR_DB_NAME'
DECLARE @RC int
EXEC @RC = sp_detach_db @DBName
DECLARE @NewPath nvarchar(1000)
SET @NewPath = 'E:\Data\Microsoft SQL Server\Data\';
DECLARE @OldPath nvarchar(1000)
SET @OldPath = 'C:\Program Files\Microsoft SQL Server\MSSQL11.SQLEXPRESS\MSSQL\DATA\';
DECLARE @DBFileName nvarchar(100)
SET @DBFileName = @DBName + '.mdf';
DECLARE @LogFileName nvarchar(100)
SET @LogFileName = @DBName + '_log.ldf';
DECLARE @SRCData nvarchar(1000)
SET @SRCData = @OldPath + @DBFileName;
DECLARE @SRCLog nvarchar(1000)
SET @SRCLog = @OldPath + @LogFileName;
DECLARE @DESTData nvarchar(1000)
SET @DESTData = @NewPath + @DBFileName;
DECLARE @DESTLog nvarchar(1000)
SET @DESTLog = @NewPath + @LogFileName;
DECLARE @FILEPATH nvarchar(1000);
DECLARE @LOGPATH nvarchar(1000);
SET @FILEPATH = N'xcopy /Y "' + @SRCData + N'" "' + @NewPath + '"';
SET @LOGPATH = N'xcopy /Y "' + @SRCLog + N'" "' + @NewPath + '"';
exec xp_cmdshell @FILEPATH;
exec xp_cmdshell @LOGPATH;
EXEC @RC = sp_attach_db @DBName, @DESTData, @DESTLog
go
You also need to make sure the account under which the SQL Server process is running has access to the new folder. For SQL 2014, the default service account is "NT Service\MSSQL$SQL2014".
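As a hedged sketch of granting that access from PowerShell (the folder and service account names are just the ones used in this example; adjust them to your instance):
$path = 'E:\Data\Microsoft SQL Server\Data'
$account = 'NT Service\MSSQL$SQL2014'
$acl = Get-Acl $path
# full control, inherited by subfolders and files
$rule = New-Object System.Security.AccessControl.FileSystemAccessRule($account, 'FullControl', 'ContainerInherit,ObjectInherit', 'None', 'Allow')
$acl.AddAccessRule($rule)
Set-Acl $path $acl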
I've been too lax with performing DB backups on our internal servers.
Is there a simple command line program that I can use to backup certain databases in SQL Server 2005? Or is there a simple VBScript?
To backup a single database from the command line, use osql or sqlcmd.
"C:\Program Files\Microsoft SQL Server\90\Tools\Binn\osql.exe"
-E -Q "BACKUP DATABASE mydatabase TO DISK='C:\tmp\db.bak' WITH FORMAT"
You'll also want to read the documentation on BACKUP and RESTORE and general procedures.
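For completeness, the matching restore can be run the same way from the command line; a sketch along the same lines (adjust paths, and add MOVE clauses if the file locations differ on the target machine):
"C:\Program Files\Microsoft SQL Server\90\Tools\Binn\osql.exe"
 -E -Q "RESTORE DATABASE mydatabase FROM DISK='C:\tmp\db.bak' WITH REPLACE"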
I use ExpressMaint.
To backup all user databases I do for example:
C:\>ExpressMaint.exe -S (local)\sqlexpress -D ALL_USER -T DB -BU HOURS -BV 1 -B c:\backupdir\ -DS
Schedule the following to backup all Databases:
Use Master
Declare @ToExecute VarChar(8000)
Select @ToExecute = Coalesce(@ToExecute,'') + 'Backup Database ' + [Name] + ' To Disk = ''D:\Backups\Databases\' + [Name] + '.bak'' With Format;' + char(13)
From
Master..Sysdatabases
Where
[Name] Not In ('tempdb')
and databasepropertyex ([Name],'Status') = 'online'
Execute(@ToExecute)
There are also more details on my blog: how to Automate SQL Server Express Backups.
I found this on a Microsoft Support page http://support.microsoft.com/kb/2019698.
It works great! And since it came from Microsoft, I feel like it's pretty legit.
Basically there are two steps.
Create a stored procedure in your master db. See msft link or if it's broken try here: http://pastebin.com/svRLkqnq
Schedule the backup from your task scheduler. You might want to put into a .bat or .cmd file first and then schedule that file.
sqlcmd -S YOUR_SERVER_NAME\SQLEXPRESS -E -Q "EXEC sp_BackupDatabases @backupLocation='C:\SQL_Backup\', @backupType='F'" 1>c:\SQL_Backup\backup.log
Obviously, replace YOUR_SERVER_NAME with your computer name (or optionally try .\SQLEXPRESS) and make sure the backup folder exists. In this case it's trying to put the backups into C:\SQL_Backup.
I'm using tsql on a Linux/UNIX infrastructure to access MSSQL databases. Here's a simple shell script to dump a table to a file:
#!/usr/bin/ksh
#
#.....
(
tsql -S {database} -U {user} -P {password} <<EOF
select * from {table}
go
quit
EOF
) >{output_file.dump}
You can use the backup application by ApexSQL. Although it's a GUI application, all of its features are supported in the CLI. It is possible to either perform one-time backup operations, or to create a job that backs up specified databases on a regular basis. You can check the switch rules and examples in these articles:
ApexSQL Backup CLI support
ApexSQL Backup CLI examples
If you don't have a trusted connection (which the -E switch assumes), use the following command line:
"[program dir]\[sql server version]\Tools\Binn\osql.exe" -Q "BACKUP DATABASE mydatabase TO DISK='C:\tmp\db.bak'" -S [server] -U [login id] -P [password]
Where
[program dir] is the directory where the osql.exe exists
On 32bit OS c:\Program Files\Microsoft SQL Server\
On 64bit OS c:\Program Files (x86)\Microsoft SQL Server\
[sql server version] is your SQL Server version directory (110, 100, 90, or 80); try the largest number first
[server] your servername or server ip
[login id] your ms-sql server user login name
[password] the required login password
Microsoft's answer to backing up all user databases on SQL Express is here:
The process is: copy, paste, and execute their code (see below; I've commented out some oddly non-commented lines at the top) as a query on your database server. That means you should first install SQL Server Management Studio (or otherwise connect to your database server with SSMS). Executing this code will create a stored procedure on your database server.
Create a batch file to execute the stored procedure, then use Task Scheduler to schedule a periodic (e.g. nightly) run of this batch file. My code (that works) is a slightly modified version of their first example:
sqlcmd -S .\SQLEXPRESS -E -Q "EXEC sp_BackupDatabases @backupLocation='E:\SQLBackups\', @backupType='F'"
This worked for me, and I like it. Each time you run it, new backup files are created. You'll need to devise a method of deleting old backup files on a routine basis. I already have a routine that does that sort of thing, so I'll keep a couple of days' worth of backups on disk (long enough for them to get backed up by my normal backup routine), then I'll delete them. In other words, I'll always have a few days' worth of backups on hand without having to restore from my backup system.
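For that cleanup step, here is a small sketch of the kind of thing I mean (assuming the E:\SQLBackups folder used above and a seven-day retention), which could be scheduled alongside the batch file:
Get-ChildItem 'E:\SQLBackups' -Filter '*.bak' |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-7) } |
    Remove-Item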
I'll paste Microsoft's stored procedure creation script below:
--// Copyright © Microsoft Corporation. All Rights Reserved.
--// This code released under the terms of the
--// Microsoft Public License (MS-PL, http://opensource.org/licenses/ms-pl.html.)
USE [master]
GO
/****** Object: StoredProcedure [dbo].[sp_BackupDatabases] ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
-- =============================================
-- Author: Microsoft
-- Create date: 2010-02-06
-- Description: Backup Databases for SQLExpress
-- Parameter1: databaseName
-- Parameter2: backupType F=full, D=differential, L=log
-- Parameter3: backup file location
-- =============================================
CREATE PROCEDURE [dbo].[sp_BackupDatabases]
@databaseName sysname = null,
@backupType CHAR(1),
@backupLocation nvarchar(200)
AS
SET NOCOUNT ON;
DECLARE @DBs TABLE
(
ID int IDENTITY PRIMARY KEY,
DBNAME nvarchar(500)
)
-- Pick out only databases which are online in case ALL databases are chosen to be backed up
-- If specific database is chosen to be backed up only pick that out from @DBs
INSERT INTO @DBs (DBNAME)
SELECT Name FROM master.sys.databases
where state=0
AND name=@DatabaseName
OR @DatabaseName IS NULL
ORDER BY Name
-- Filter out databases which do not need to backed up
IF @backupType='F'
BEGIN
DELETE @DBs where DBNAME IN ('tempdb','Northwind','pubs','AdventureWorks')
END
ELSE IF @backupType='D'
BEGIN
DELETE @DBs where DBNAME IN ('tempdb','Northwind','pubs','master','AdventureWorks')
END
ELSE IF @backupType='L'
BEGIN
DELETE @DBs where DBNAME IN ('tempdb','Northwind','pubs','master','AdventureWorks')
END
END
ELSE
BEGIN
RETURN
END
-- Declare variables
DECLARE @BackupName varchar(100)
DECLARE @BackupFile varchar(100)
DECLARE @DBNAME varchar(300)
DECLARE @sqlCommand NVARCHAR(1000)
DECLARE @dateTime NVARCHAR(20)
DECLARE @Loop int
-- Loop through the databases one by one
SELECT @Loop = min(ID) FROM @DBs
WHILE @Loop IS NOT NULL
BEGIN
-- Database Names have to be in [dbname] format since some have - or _ in their name
SET @DBNAME = '['+(SELECT DBNAME FROM @DBs WHERE ID = @Loop)+']'
-- Set the current date and time n yyyyhhmmss format
SET @dateTime = REPLACE(CONVERT(VARCHAR, GETDATE(),101),'/','') + '_' + REPLACE(CONVERT(VARCHAR, GETDATE(),108),':','')
-- Create backup filename in path\filename.extension format for full,diff and log backups
IF @backupType = 'F'
SET @BackupFile = @backupLocation+REPLACE(REPLACE(@DBNAME, '[',''),']','')+ '_FULL_'+ @dateTime+ '.BAK'
ELSE IF @backupType = 'D'
SET @BackupFile = @backupLocation+REPLACE(REPLACE(@DBNAME, '[',''),']','')+ '_DIFF_'+ @dateTime+ '.BAK'
ELSE IF @backupType = 'L'
SET @BackupFile = @backupLocation+REPLACE(REPLACE(@DBNAME, '[',''),']','')+ '_LOG_'+ @dateTime+ '.TRN'
-- Provide the backup a name for storing in the media
IF @backupType = 'F'
SET @BackupName = REPLACE(REPLACE(@DBNAME,'[',''),']','') +' full backup for '+ @dateTime
IF @backupType = 'D'
SET @BackupName = REPLACE(REPLACE(@DBNAME,'[',''),']','') +' differential backup for '+ @dateTime
IF @backupType = 'L'
SET @BackupName = REPLACE(REPLACE(@DBNAME,'[',''),']','') +' log backup for '+ @dateTime
-- Generate the dynamic SQL command to be executed
IF @backupType = 'F'
BEGIN
SET @sqlCommand = 'BACKUP DATABASE ' +@DBNAME+ ' TO DISK = '''+@BackupFile+ ''' WITH INIT, NAME= ''' +@BackupName+''', NOSKIP, NOFORMAT'
END
IF @backupType = 'D'
BEGIN
SET @sqlCommand = 'BACKUP DATABASE ' +@DBNAME+ ' TO DISK = '''+@BackupFile+ ''' WITH DIFFERENTIAL, INIT, NAME= ''' +@BackupName+''', NOSKIP, NOFORMAT'
END
IF @backupType = 'L'
BEGIN
SET @sqlCommand = 'BACKUP LOG ' +@DBNAME+ ' TO DISK = '''+@BackupFile+ ''' WITH INIT, NAME= ''' +@BackupName+''', NOSKIP, NOFORMAT'
END
-- Execute the generated SQL command
EXEC(@sqlCommand)
-- Goto the next database
SELECT @Loop = min(ID) FROM @DBs where ID>@Loop
END
Here is an example that backs up the database, compresses it with 7-Zip, and then deletes the backup file, so the storage issue is solved as well. In this example I use 7-Zip, which is free.
@echo off
CLS
echo Running dump ...
sqlcmd -S SERVER\SQLEXPRESS -U username -P password -Q "BACKUP DATABASE master TO DISK='D:\DailyDBBackup\DB_master_%date:~-10,2%%date:~-7,2%%date:~-4,4%.bak'"
echo Zipping ...
"C:\Program Files\7-Zip\7z.exe" a -tzip "D:\DailyDBBackup\DB_master_%date:~-10,2%%date:~-7,2%%date:~-4,4%_%time:~0,2%%time:~3,2%%time:~6,2%.bak.zip" "D:\DailyDBBackup\DB_master_%date:~-10,2%%date:~-7,2%%date:~-4,4%.bak"
echo Deleting the SQL file ...
del "D:\DailyDBBackup\DB_master_%date:~-10,2%%date:~-7,2%%date:~-4,4%.bak"
echo Done!
Save this as sqlbackup.bat and schedule it to be run everyday.
If you just want to take the backup, you can create the script without the zipping and deleting steps.
You could use a VB Script I wrote exactly for this purpose:
https://github.com/ezrarieben/mssql-backup-vbs/
Schedule a task in the "Task Scheduler" to execute the script as you like and it'll backup the entire DB to a BAK file and save it wherever you specify.
SET NOCOUNT ON;
declare @PATH VARCHAR(200)='D:\MyBackupFolder\'
-- path where you want to take backups
IF OBJECT_ID('TEMPDB..#back') IS NOT NULL
DROP TABLE #back
CREATE TABLE #back
(
RN INT IDENTITY (1,1),
DatabaseName NVARCHAR(200)
)
INSERT INTO #back
SELECT 'MyDatabase1'
UNION SELECT 'MyDatabase2'
UNION SELECT 'MyDatabase3'
UNION SELECT 'MyDatabase4'
-- your databases List
DECLARE @COUNT INT =0 , @RN INT =1, @SCRIPT NVARCHAR(MAX)='', @DBNAME VARCHAR(200)
PRINT '---------------------FULL BACKUP SCRIPT-------------------------'+CHAR(10)
SET @COUNT = (SELECT COUNT(*) FROM #back)
PRINT 'USE MASTER'+CHAR(10)
WHILE(@COUNT >= @RN)
BEGIN
SET @DBNAME =(SELECT DatabaseName FROM #back WHERE RN=@RN)
SET @SCRIPT ='BACKUP DATABASE ' +'['+@DBNAME+']'+CHAR(10)+'TO DISK =N'''+@PATH+@DBNAME+ N'_Backup_'
+ REPLACE ( REPLACE ( REPLACE ( REPLACE ( CAST ( CAST ( GETDATE () AS DATETIME2 ) AS VARCHAR ( 100 )), '-' , '_' ), ' ' , '_' ), '.' , '_' ), ':' , '' )+'.bak'''+CHAR(10)+'WITH COMPRESSION, STATS = 10'+CHAR(10)+'GO'+CHAR(10)
PRINT @SCRIPT
SET @RN=@RN+1
END
PRINT '---------------------DIFF BACKUP SCRIPT-------------------------'+CHAR(10)
SET @COUNT =0 SET @RN =1 SET @SCRIPT ='' SET @DBNAME =''
SET @COUNT = (SELECT COUNT(*) FROM #back)
PRINT 'USE MASTER'+CHAR(10)
WHILE(@COUNT >= @RN)
BEGIN
SET @DBNAME =(SELECT DatabaseName FROM #back WHERE RN=@RN)
SET @SCRIPT ='BACKUP DATABASE ' +'['+@DBNAME+']'+CHAR(10)+'TO DISK =N'''+@PATH+@DBNAME+ N'_Backup_'
+ REPLACE ( REPLACE ( REPLACE ( REPLACE ( CAST ( CAST ( GETDATE () AS DATETIME2 ) AS VARCHAR ( 100 )), '-' , '_' ), ' ' , '_' ), ':' , '' )+'.diff'''+CHAR(10)+'WITH DIFFERENTIAL, COMPRESSION, STATS = 10'+CHAR(10)+'GO'+CHAR(10)
PRINT @SCRIPT
SET @RN=@RN+1
END
END
This can be helpful when you are dealing with a dockerised MSSQL container in your day-to-day work and want to take a quick dump of the data from a table. I have found it especially useful when rebuilding the db container frequently and not wanting to lose the test data after the rebuild.
Export data using bcp utility
/opt/mssql-tools/bin/bcp <Table_Name> out /tmp/MyData.bcp -d <database_name> -c -U <user_name> -P "<password>" -S <server_name>
Import data using bcp utility
/opt/mssql-tools/bin/bcp <Table_Name> IN /tmp/MyData.bcp -d <database_name> -c -U <user_name> -P "<password>" -S <server_name>
If you can find the DB files... "cp DBFiles backup/"
Almost certainly not advisable in most cases, but it's about as simple as it gets.