How to copy tables plus data for testing purposes - sql-server

How can I copy a number of tables, plus the data they contain, from a live SQL Server database?
I want to get some data for basic testing on my machine.
I was originally going to do a simple backup of the database, until I noticed that it was over 100 GB in size; the tables I'm after are only a few of the smaller ones. I then tried exporting to Excel, but hit the 65K row limit (and we don't have Excel 2007).

You can try exporting data using the SQL Server Import and Export Wizard; there is an MSDN video showing how. You can export to a flat file, among other formats.
In Management Studio, select the database, right-click and select Tasks->Export Data. There you will see options to export to different kinds of formats including CSV.
You can also run your query from the Query window and save the results to CSV.
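If you want to script that last option rather than click through it, here is a minimal sketch using Invoke-Sqlcmd from the SqlServer module; the server, database, table, and output path are placeholders:
# Requires the SqlServer module: Install-Module SqlServer
Import-Module SqlServer
Invoke-Sqlcmd -ServerInstance 'MyServer\MyInstance' -Database 'MyDatabase' -Query 'SELECT * FROM dbo.MyTable' |
    Export-Csv -Path 'C:\temp\MyTable.csv' -NoTypeInformation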

Can't you use the Export Data wizard from your live server to your testing machine? Or use bcp? Or even use a simple PowerShell script?
$Server = "MyServer"
$ServerInstance = "$Server\MyInstance"
$database = "MyDatabase"
$BackupFile = "c:\MyBackupFile.sql"
$tables = #('TableBlah','TableBluh','TableBloh')
$server = New-Object (
'Microsoft.SqlServer.Management.Smo.Server') $ServerInstance
$scripter = New-Object ('Microsoft.SqlServer.Management.Smo.Scripter') $server
$scripter.Options.SchemaQualify = $false
$scripter.Options.ScriptSchema = $false
$scripter.Options.ScriptData = $true
$scripter.Options.NoCommandTerminator = $true
$scripter.Options.ToFileOnly = $true
$scripter.Options.FileName = $BackupFile
$ServerUrn=$server.Urn
$UrnsToScript = New-Object Microsoft.SqlServer.Management.Smo.UrnCollection
foreach ($t in $tables)
{
# Could use bcp here for dumping big tables (like archives)
# $ret = (bcp.exe "$database..$t" out `"$ConfigBackupDir\$t.bcp`"
# -S $ServerInstance -U sa -P $SAPWD -n)
$Urn = "$ServerUrn/Database[#Name='" +
$database + "']/Table[#Name='" + $t + "' and #Schema='dbo']"
$UrnsToScript.Add($Urn)
}
$scripter.EnumScript($UrnsToScript)
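The generated file contains only INSERT statements (ScriptSchema is $false), so create the empty tables on the test machine first (for example, script them out in SSMS), then load the file with sqlcmd. A hedged usage sketch; the test instance name and file paths are assumptions:
# Replay the scripted data into the test database
sqlcmd -S "MyTestMachine\SQLEXPRESS" -d "MyDatabase" -i "c:\MyBackupFile.sql"
# And for any big tables dumped with the commented-out bcp line, load them back in (native format, trusted connection)
bcp "MyDatabase.dbo.TableBlah" in "C:\Backup\TableBlah.bcp" -S "MyTestMachine\SQLEXPRESS" -T -n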

Related

Loop array and store all results

Admittedly I am not a strong developer, but I have done some research and I want to get a solid answer to my problem. I have seen multi-dimensional arrays, but I am not sure they are the right answer.
I have a three-part issue: database servers, databases, and units.
There are n units, there are 4 database servers, and there are n databases (one database per unit).
So for example:
Unit1 is on Database Server 4 using Database DB_Unit1
Unit2 is on Database Server 4 using Database DB_Unit2
Unit3 is on Database Server 2 using Database Unit3 (Some Databases are not named DB_Unit)
Unit4 is on Database Server 1 using Database XYZ
Unit5 is on Database Server 1 using Database DB_Unit5
I assumed I could use an array to store each string for each unit, but I'm not sure how that works.
So I am trying to write a PowerShell script that ties all of this together:
$units = ("Unit1","Unit2","Unit3","Unit4","Unit5")
foreach ($Unit in $units) {
    Invoke-Sqlcmd -ServerInstance $DatabaseServer -Database $Database -Query "Select * from tbl1"
}
The desired outcome is that it queries each database server with the assigned database for each unit.
Any ideas on how this works with an array, or is there a separate way to associate this data?
I think this might be more what you are after. It should run n times, where n is the number of strings in $Units. There should be 5 results added to $SQLResults.
$Units= ("Unit1","Unit2","Unit3","Unit4","Unit5")
$SQLResults = New-Object System.Collections.ArrayList
ForEach ($Unit in $Units){
switch ($Unit) {
"Unit1" { $DatabaseServer = "Database Server 4";$Database = "DB_Unit1" }
"Unit2" { $DatabaseServer = "Database Server 4";$Database = "DB_Unit2" }
"Unit3" { $DatabaseServer = "Database Server 2";$Database = "Unit3" }
"Unit4" { $DatabaseServer = "Database Server 1";$Database = "XYZ" }
"Unit5" { $DatabaseServer = "Database Server 1";$Database = "DB_Unit5" }
}
$UnitResults = Invoke-Sqlcmd -ServerInstance $DatabaseServer -Database $Database -Query "Select * from tbl1"
# Tag each result row with the unit it came from
$UnitResults | Add-Member -MemberType NoteProperty -Name "Unit" -Value $Unit
# Add() returns the new element's index; discard it so it doesn't pollute the output
$SQLResults.Add($UnitResults) | Out-Null
}
You should then be able to get the specific unit results by doing $SQLResults | where {$_.Unit -eq "Unit1"}
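As for whether there is a separate way to associate the data: a hashtable keyed on unit name avoids the switch entirely and keeps the mapping in one place. A sketch using the same placeholder names (Invoke-Sqlcmd needs the SqlServer or SQLPS module):
# Map each unit to its server/database pair
$UnitMap = @{
    'Unit1' = @{ Server = 'Database Server 4'; Database = 'DB_Unit1' }
    'Unit2' = @{ Server = 'Database Server 4'; Database = 'DB_Unit2' }
    'Unit3' = @{ Server = 'Database Server 2'; Database = 'Unit3' }
    'Unit4' = @{ Server = 'Database Server 1'; Database = 'XYZ' }
    'Unit5' = @{ Server = 'Database Server 1'; Database = 'DB_Unit5' }
}
$SQLResults = foreach ($Unit in $UnitMap.Keys) {
    $target = $UnitMap[$Unit]
    Invoke-Sqlcmd -ServerInstance $target.Server -Database $target.Database -Query 'SELECT * FROM tbl1' |
        Add-Member -MemberType NoteProperty -Name 'Unit' -Value $Unit -PassThru
}
Adding a new unit is then a one-line change to $UnitMap rather than a new switch branch.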

How to create a copy of existing SQL DB on same server

I need to perform an operation, part of which has me looking for a way to create a copy of a SQL DB on the same server. I tried the suggestion given at Copy SQL Server database with PowerShell script. However, the resulting copy is about a quarter of the size of the actual DB.
Ideas anyone?
Thanks
If your PowerShell solution is working except you are noticing a file-size discrepancy with the newly copied database compared to the source database, it may not be an actual problem.
SQL Server database and log sizes are variable and are typically not an exact indication of the amount of data they contain. The copied database may be "optimized" in terms of its disk file usage in a way that the source database is currently not.
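You can see this quickly with sp_spaceused, which reports allocated versus actually used space. A minimal sketch, assuming the SqlServer module and placeholder server/database names:
# Compare allocated vs. used space in both databases
Invoke-Sqlcmd -ServerInstance 'MyServer' -Database 'SourceDb' -Query 'EXEC sp_spaceused'
Invoke-Sqlcmd -ServerInstance 'MyServer' -Database 'CopyDb' -Query 'EXEC sp_spaceused'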
There are three things you can do to convince yourself you have a working solution.
1. Run DBCC SHRINKDATABASE on both databases to free unused space, then see whether the resulting disk files are closer in size: https://learn.microsoft.com/en-us/sql/t-sql/database-console-commands/dbcc-shrinkdatabase-transact-sql?view=sql-server-ver15
2. Write a benchmark script that compares record counts on all the tables. If you see a discrepancy in the record counts, you know you have a problem.
Example benchmark script:
declare @sourceCount int;
declare @copyCount int;
set @sourceCount = (select count(*) from SourceDb.dbo.SomeTable);
set @copyCount = (select count(*) from CopyDb.dbo.SomeTable);
if @sourceCount <> @copyCount
begin
    select 'PROBLEM!'
end
-- Now repeat for all the other tables
3. Use a SQL data comparison tool to automate the method in step 2 (a rough PowerShell sketch of such a row-count comparison follows below). There are many such tools, including the one built into Visual Studio 2019 (Tools menu) or otherwise: https://learn.microsoft.com/en-us/sql/ssdt/how-to-compare-and-synchronize-the-data-of-two-databases?view=sql-server-ver15
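Here is a rough sketch of that row-count comparison in PowerShell, assuming the SqlServer module and placeholder names. Note that sys.partitions row counts are metadata and can be slightly stale for heaps, so fall back to COUNT(*) where exactness matters:
# Approximate row counts per table, from partition metadata (index_id 0 = heap, 1 = clustered index)
$countQuery = @"
SELECT s.name + '.' + t.name AS TableName, SUM(p.rows) AS TableRows
FROM sys.tables t
JOIN sys.schemas s ON s.schema_id = t.schema_id
JOIN sys.partitions p ON p.object_id = t.object_id AND p.index_id IN (0, 1)
GROUP BY s.name, t.name
"@
$src = Invoke-Sqlcmd -ServerInstance 'MyServer' -Database 'SourceDb' -Query $countQuery
$copy = Invoke-Sqlcmd -ServerInstance 'MyServer' -Database 'CopyDb' -Query $countQuery
# Any output here is a table whose row count differs between the two databases
Compare-Object -ReferenceObject $src -DifferenceObject $copy -Property TableName, TableRows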
However, all these methods will only work reliably if you can ensure that either database isn't receiving updates during the copy/benchmarking process. If someone is accessing one or the other of the databases independently while you are measuring, and they alter the data independently while you are measuring, your results would be invalidated, possibly without your knowing.
EDIT
I managed to make it work as soon as I started using the SqlServer module instead of the SQLPS module, since the latter has long been deprecated. I have edited the answer I refer to in my initial post; see below.
I was having a similar error implementing this and tried literally everything, but it just wouldn't work. What did work for me was generating a script through the ScriptTransfer method, creating the new database, and then applying the script to the new database through Invoke-SqlCmd. I have posted a detailed explanation and code in this answer.
Okay, I managed to implement this. Anybody who needs to do this in the future, please try the following:
# SQLPS has long been deprecated; the SqlServer module replaces it
Import-Module SqlServer
$SQLInstanceName = "$env:COMPUTERNAME\sqlexpress"
$SourceDBName = "xxx"
$CopyDBName = "${SourceDBName}_copy"
$Server = New-Object -TypeName 'Microsoft.SqlServer.Management.Smo.Server' -ArgumentList $SQLInstanceName
$SourceDB = $Server.Databases[$SourceDBName]
$CopyDB = New-Object -TypeName 'Microsoft.SqlServer.Management.SMO.Database' -ArgumentList $Server , $CopyDBName
# Delete any existing copy
Try
{
# -ErrorAction Stop turns Invoke-Sqlcmd's non-terminating error into one the Catch block can see
Invoke-Sqlcmd -ServerInstance "$SQLInstanceName" -Query "Drop database $CopyDBName;" -Username "***" -Password "****" -Verbose -ErrorAction Stop
}
Catch
{
Write-Output 'Failed to delete database'
}
$CopyDB.create()
$ObjTransfer = New-Object -TypeName Microsoft.SqlServer.Management.SMO.Transfer -ArgumentList $SourceDB
$ObjTransfer.DestinationDatabase = $CopyDBName
$ObjTransfer.DestinationServer = $Server.Name
$ObjTransfer.DestinationLoginSecure = $true
$ObjTransfer.CopyData = $true
$ObjTransfer.CopyAllObjects = $false
$ObjTransfer.CopyAllDatabaseTriggers = $true
$ObjTransfer.CopyAllDefaults = $true
$ObjTransfer.CopyAllRoles = $true
$ObjTransfer.CopyAllRules = $true
$ObjTransfer.CopyAllSchemas = $true
$ObjTransfer.CopyAllSequences = $true
$ObjTransfer.CopyAllSqlAssemblies = $true
$ObjTransfer.CopyAllSynonyms = $true
$ObjTransfer.CopyAllTables = $true
$ObjTransfer.CopyAllViews = $true
$ObjTransfer.CopyAllStoredProcedures = $true
$ObjTransfer.CopyAllUserDefinedAggregates = $true
$ObjTransfer.CopyAllUserDefinedDataTypes = $true
$ObjTransfer.CopyAllUserDefinedTableTypes = $true
$ObjTransfer.CopyAllUserDefinedTypes = $true
$ObjTransfer.CopyAllUserDefinedFunctions = $true
$ObjTransfer.CopyAllUsers = $true
$ObjTransfer.PreserveDbo = $true
$ObjTransfer.Options.AllowSystemObjects = $false
$ObjTransfer.Options.ContinueScriptingOnError = $true
$ObjTransfer.Options.IncludeDatabaseRoleMemberships = $true
$ObjTransfer.Options.Indexes = $true
$ObjTransfer.Options.Permissions = $true
$ObjTransfer.Options.WithDependencies = $true
$ObjTransfer.TransferData()

How to use PowerShell to batch call Update-Database

We use an Azure Elastic Pool resulting in multiple client databases and one master database with references to the client database.
We already have multiple databases and are working on a new version of the code. We use EF6 Code-First.
When we make a change to our model (add a property) we create the migration file and need to call Update-Database for all existing client databases.
This is monkey work we want to skip.
I already have a PowerShell script that connects to the master database and executes a query on a table. This returns the names of the child databases.
With it I can change the Web.config and replace the template database name with the proper name of the child database.
Now I need to call Update-Database to execute the migration scripts. This last part is where I'm struggling: I'm running the ps1 script outside Visual Studio, so the Update-Database command is unknown. I tried using migrate.exe, but then I get lots of errors.
I think the easiest solution is to run my script within the Package manager console but I can't figure out how to do that.
I managed to get it working. After I placed the ps1-file in the root of my code folder I could run it in the Package Manager Console using .\UpdateDatabases.ps1.
For completeness here's the script I created. I'm new to PowerShell so some optimizations might be possible.
cls
$currentPath = (Get-Item -Path ".\" -Verbose).FullName
#Read Web.config
$webConfig = $currentPath + "\<your project>\Web.config"
$doc = (Get-Content $webConfig) -as [Xml]
$DatabaseNamePrefix = $doc.configuration.appSettings.add | where {$_.Key -eq 'DatabaseNamePrefix'}
#Get Master connectionstring
$root = $doc.get_DocumentElement();
foreach($connString in $root.connectionStrings.add | where {$_.Name -eq "Master"})
{
$masterConn = $connString.connectionString
}
#Connect to master database
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = $masterConn
#Query Client table for the child database names
$SqlQuery = "select Code from Clients"
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$SqlCmd.CommandText = $SqlQuery
$SqlCmd.Connection = $SqlConnection
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $SqlCmd
#Put query result in dataset
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet)
$SqlConnection.Close()
foreach ($row in $DataSet.Tables[0].Rows)
{
$clientDbName = $row[0].ToString().Trim()
#Change Web.Config
foreach($connString in $root.connectionStrings.add | where {$_.Name -eq "DevelopmentDb"})
{
$newDatabaseName = "Database=" + $DatabaseNamePrefix.value + $clientDbName + ";";
$newConn = $connString.connectionString -replace "(Database=.*?;)",$newDatabaseName
$connString.connectionString = $newConn;
}
$doc.Save($webConfig)
#Update database
Update-Database -ConfigurationTypeName Application
}
"Finished"
You may want to take a look at Azure Elastic Database Jobs, which is designed to work with elastic database pools.
The Elastic Database Jobs SDK also includes PowerShell components.

How do I execute a SELECT query against a SQL Server database and iterate the results using PowerShell

Say I have a table with 3 columns - "Column1", "Column2", and "Column3" - datatype is varchar(100) for all 3.
Using PowerShell, how do I connect to SQL Server and use a SqlDataReader with a foreach loop to view the contents of "Column2"?
Here's roughly how I'm doing it:
$SqlServer = 'sql.example.com';
$SqlDatabase = 'MyDB';
$SqlConnectionString = 'Data Source={0};Initial Catalog={1};Integrated Security=SSPI' -f $SqlServer, $SqlDatabase;
$SqlQuery = "SELECT Name FROM dbo.Person ORDER BY Name;";
$SqlConnection = New-Object -TypeName System.Data.SqlClient.SqlConnection -ArgumentList $SqlConnectionString;
$SqlCommand = $SqlConnection.CreateCommand();
$SqlCommand.CommandText = $SqlQuery;
$SqlConnection.Open();
$SqlDataReader = $SqlCommand.ExecuteReader();
#Fetch data and write out to files
while ($SqlDataReader.Read()) {
Write-Output $SqlDataReader['Name'];
}
$SqlConnection.Close();
$SqlConnection.Dispose();
If I remember right, I basically refactored the code from the MSDN example.
For those wondering why I'm using SqlDataReader: most of my scripts use SqlDataAdapter, but this one retrieves about 8,000 PDFs from a database, so I wasn't really interested in calling SqlDataAdapter.Fill(). In exchange for holding shared locks on the table much longer than SqlDataAdapter.Fill() would, SqlDataReader.Read() keeps memory usage at a manageable level on the client by fetching one record at a time.
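In case it helps anyone doing the same: here is a rough sketch of how the PDF part can be streamed to disk with the reader, reusing the connection setup above before it is closed. The dbo.Document table and its FileName/Pdf (varbinary(max)) columns are made-up names:
$SqlCommand.CommandText = 'SELECT FileName, Pdf FROM dbo.Document;';
# SequentialAccess streams large columns instead of buffering whole rows in memory
$SqlDataReader = $SqlCommand.ExecuteReader([System.Data.CommandBehavior]::SequentialAccess);
$buffer = New-Object byte[] 8192;
while ($SqlDataReader.Read()) {
    # With SequentialAccess, columns must be read in order: FileName (0) before Pdf (1)
    $path = Join-Path 'C:\temp\pdfs' $SqlDataReader.GetString(0);
    $fs = [System.IO.File]::OpenWrite($path);
    $offset = 0;
    while (($read = $SqlDataReader.GetBytes(1, $offset, $buffer, 0, $buffer.Length)) -gt 0) {
        $fs.Write($buffer, 0, $read);
        $offset += $read;
    }
    $fs.Dispose();
}
$SqlDataReader.Close();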

Any way to code import/export data in MS SQL Server 2012 without SSIS, openrowset, linked servers, or BCP?

My organization does not allow linked servers, BCP, openrowset, or the SSIS service.
I need to import and export data between SQL Servers, and from SQL Server to the PC file system (into Excel files), on a regular basis.
Is there a way to do this without using the import/export Wizard? I think I have exhausted all the other alternatives...
Use PowerShell.
This snippet dumps data from AdventureWorks2012.Person.Person to a pipe-delimited file; comma-delimited files are recognized by MS Excel, so switch the delimiter if that's the target.
PowerShell can do many other things:
1 - create directories
2 - date time stamp files
3 - move files to archive directories
4 - zip files
5 - ftp files
The limit is only how much you are willing to learn.
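For example, items 1 through 4 above take only a few lines (a sketch; the paths are placeholders, it assumes the dump script below has produced C:\temp\data.txt, and Compress-Archive requires PowerShell 5+):
# Timestamp, archive, and zip the dump file
$stamp = Get-Date -Format 'yyyyMMdd_HHmmss'
New-Item -ItemType Directory -Path 'C:\temp\archive' -Force | Out-Null
Move-Item 'C:\temp\data.txt' "C:\temp\archive\data_$stamp.txt"
Compress-Archive -Path "C:\temp\archive\data_$stamp.txt" -DestinationPath "C:\temp\archive\data_$stamp.zip"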
Here is an article on "Microsoft.Office.Interop.Excel" (a short sketch of that route follows after the script below). Unfortunately, the MS Office products have not been rewritten in managed code (.NET) yet.
http://import-powershell.blogspot.com/2012/03/excel-part-1.html
# ******************************************************
# *
# * Name: dump-sql-query-to-delimited-file.ps1
# *
# * Design Phase:
# * Author: John Miner
# * Date: 01-09-2014
# * Purpose: Given a sql query, store the data
# * in a delimited file.
# *
# ******************************************************
# Debug script?
[string]$debug = "T"
# Debug info
if ($debug -eq "T")
{
Write-Host "Starting [dump-sql-query-to-delimited-file]";
};
# Set these variables
[string]$server = ".";
[string]$database = "AdventureWorks2012";
[string]$query = "SELECT TOP 10 FirstName, LastName FROM Person.Person";
[string]$file = "C:\temp\data.txt"
[string]$delimiter = "|"
# Create connection
$con = New-Object System.Data.SqlClient.SqlConnection;
$con.ConnectionString = "Server=" + $server + "; Database=" + $database + ";Integrated Security=true;";
# Create command
$cmd = New-Object System.Data.SqlClient.SqlCommand;
$cmd.CommandText = $query;
$cmd.Connection = $con;
# Create adapter
$da = New-Object System.Data.SqlClient.SqlDataAdapter;
$da.SelectCommand = $cmd;
# Fill DataTable
$dt = New-Object System.Data.DataTable;
$da.Fill($dt) | Out-Null;
# Close connection
$con.close();
# Dump the data (-NoTypeInformation drops the "#TYPE ..." header Export-CSV adds by default)
$dt | Export-CSV -Delimiter $delimiter -Path $file -NoTypeInformation;
# Debug detailed info
if ($debug -eq "T")
{
Write-Host;
foreach ($row in $dt.rows)
{
for ($i=0;$i -lt $row.ItemArray.Count; $i++)
{ Write-Host $row.Table.Columns[$i].ToString() = $row.ItemArray[$i] };
Write-Host;
};
}
# Debug info
if ($debug -eq "T")
{
Write-Host "Ending [dump-sql-query-to-delimited-file]";
};
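Finally, if the people downstream insist on real Excel files rather than delimited text, here is a minimal sketch of the Microsoft.Office.Interop.Excel route mentioned above. It assumes Excel is installed on the machine running the script, and that you reran the dump with a comma delimiter and a .csv path (both assumptions):
# Convert the CSV dump to .xlsx via Excel COM automation
$excel = New-Object -ComObject Excel.Application;
$excel.Visible = $false;
$wb = $excel.Workbooks.Open("C:\temp\data.csv");
# 51 = xlOpenXMLWorkbook, the .xlsx format
$wb.SaveAs("C:\temp\data.xlsx", 51);
$wb.Close($false);
$excel.Quit();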
Various routines can be implemented and loaded as CLR assemblies. However, I'd wager that your organization does not allow the CLR to be enabled...
Having to do ETL without being allowed to use any ETL tools sucks. I have a CLR for exporting .xml workbooks, which management types can open in Excel, from SQL Server 2005+ databases if you'd like to have a look at it, but nothing already on hand for import. It sounds like you're going to have a ton of fun with your new project.
EXECUTE master.dbo.xp_ExcelGenerator
    @FileName = '\\<some unc path if you''re lucky>\shared\',
    @Statement = 'SELECT x = 1; SELECT y = 2;',
    @SheetNames = 'SheetTest1, SheetTest2';
This assumes whatever overbearing policy management going on over there lets you install CLRs, of course.
