I have more than a hundred encrypted procedures and functions that I want to decrypt (I am trying a bacpac file export, but it fails because of the encrypted procedures). I tried using dbForge SQL Decryptor's decryption wizard for an in-place ALTER, but I get the error:
Definition is invalid. Can't find CREATE keyword.
When I try to view the DDL script of a stored procedure (using dbForge SQL Decryptor), I get the error:
A definition for the object dbo.pt_blocks cannot be shown because it is encrypted by a third party tool
I cannot find a resolution to this. Are there any solutions or other tools available for this?
Edit: I found this resource, which mentions:
take the source code and issue an ALTER command without the encryption option. Just take the source code and remove the WITH ENCRYPTION
How could I achieve this?
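For illustration, the idea is: once you have the plain-text definition, change CREATE to ALTER, delete the WITH ENCRYPTION clause, and execute the result. A hedged sketch (the object name and body below are placeholders, not the real procedure):

# Hypothetical example: re-issue a decrypted definition without WITH ENCRYPTION
$alterSql = @"
ALTER PROCEDURE dbo.SomeProcedure   -- was: CREATE PROCEDURE dbo.SomeProcedure
-- WITH ENCRYPTION                  -- this clause is simply removed
AS
BEGIN
    SELECT 1;                       -- decrypted body goes here, unchanged
END
"@
$connection = New-Object System.Data.SqlClient.SqlConnection("Data Source=YourServer;Initial Catalog=YourDatabase;Integrated Security=SSPI")
$connection.Open()
$command = New-Object System.Data.SqlClient.SqlCommand($alterSql, $connection)
[void]$command.ExecuteNonQuery()
$connection.Close()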
EDIT: I have enabled remote DAC. How can I decrypt everything? The accepted answer to this question has a broken link.
Edit: The problem has been solved by uninstalling a third-party tool that was creating the encrypted procedures.
Below is a PowerShell example that creates a script file of all encrypted objects, gleaned from Paul White's The Internals of WITH ENCRYPTION article. Change the data source and initial catalog in the two connection strings to the desired server and database, and change the script file path as needed.
A DAC connection is used to retrieve values from system tables, so sysadmin server role membership is required. If run remotely, the SQL Server remote admin connections option must be enabled and TCP port 1434 allowed through the firewall.
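If remote admin connections are not yet enabled, the standard sp_configure option can be turned on once as a sysadmin (a minimal snippet, assuming the usual Integrated Security connection):

# Enable remote DAC; 'remote admin connections' is the standard sp_configure option
$conn = New-Object System.Data.SqlClient.SqlConnection("Data Source=YourServer;Initial Catalog=master;Integrated Security=SSPI")
$conn.Open()
$cmd = New-Object System.Data.SqlClient.SqlCommand("EXEC sp_configure 'remote admin connections', 1; RECONFIGURE;", $conn)
[void]$cmd.ExecuteNonQuery()   # dynamic option; takes effect without a restart
$conn.Close()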
The script can be run from the PowerShell ISE or from a command prompt after customization. Example command-line invocation, assuming the script was saved as "Decrypt-Objects.ps1":
powershell -ExecutionPolicy RemoteSigned -File C:\PowershellScripts\Decrypt-Objects.ps1
PowerShell script:
# PowerShell implementation of T-SQL code from https://sqlperformance.com/2016/05/sql-performance/the-internals-of-with-encryption
# RC4: derive the keystream from $pwd and XOR it with $data
Function Get-DecryptedString($pwd, $data) {
    $key = [System.Array]::CreateInstance([int], 256)
    $box = [System.Array]::CreateInstance([int], 256)
    $cipher = [System.Array]::CreateInstance([byte], $data.Length)
    # Key-scheduling: repeat the key to 256 bytes and permute the state box
    for ($i = 0; $i -lt 256; ++$i) {
        $key[$i] = $pwd[$i % $pwd.Length]
        $box[$i] = $i
    }
    for ($j = $i = 0; $i -lt 256; ++$i) {
        $j = ($j + $box[$i] + $key[$i]) % 256
        $tmp = $box[$i]
        $box[$i] = $box[$j]
        $box[$j] = $tmp
    }
    # Keystream generation: XOR each generated byte with the ciphertext
    for ($a = $j = $i = 0; $i -lt $data.Length; ++$i) {
        ++$a
        $a %= 256
        $j += $box[$a]
        $j %= 256
        $tmp = $box[$a]
        $box[$a] = $box[$j]
        $box[$j] = $tmp
        $k = $box[(($box[$a] + $box[$j]) % 256)]
        $cipher[$i] = ($data[$i] -bxor $k)
    }
    # Module definitions are stored as nvarchar, i.e. UTF-16
    $decryptedString = [System.Text.Encoding]::Unicode.GetString($cipher)
    return $decryptedString
}
Function Get-ClearObjectText($connectionString, $objectName) {
    # T-SQL from the article: derive the RC4 key from the database family GUID,
    # the object id, and the subobjid, and return the encrypted imageval
    $getRc4KeyQuery = @"
DECLARE
    @objectid integer = OBJECT_ID(@ObjectName),
    @family_guid binary(16),
    @objid binary(4),
    @subobjid binary(2);

-- Find the database family GUID
SELECT @family_guid = CONVERT(binary(16), DRS.family_guid)
FROM sys.database_recovery_status AS DRS
WHERE DRS.database_id = DB_ID();

-- Convert object ID to little-endian binary(4)
SET @objid = CONVERT(binary(4), REVERSE(CONVERT(binary(4), @objectid)));

SELECT
    -- Read the encrypted value
    @imageval = SOV.imageval,
    -- Get the subobjid and convert to little-endian binary
    @subobjid = CONVERT(binary(2), REVERSE(CONVERT(binary(2), SOV.subobjid)))
FROM sys.sysobjvalues AS SOV
WHERE
    SOV.[objid] = @objectid
    AND SOV.valclass = 1;

-- Compute the RC4 initialization key
SELECT @RC4key = HASHBYTES('SHA1', @family_guid + @objid + @subobjid);
"@

    # Use the connection string passed in (the DAC connection), not a global
    $connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
    $connection.Open()
    $command = New-Object System.Data.SqlClient.SqlCommand($getRc4KeyQuery, $connection)
    ($command.Parameters.Add("@ObjectName", [System.Data.SqlDbType]::NVarChar, 261)).Value = $objectName
    ($command.Parameters.Add("@imageval", [System.Data.SqlDbType]::VarBinary, -1)).Direction = [System.Data.ParameterDirection]::Output
    ($command.Parameters.Add("@RC4key", [System.Data.SqlDbType]::Binary, 20)).Direction = [System.Data.ParameterDirection]::Output
    [void]$command.ExecuteNonQuery()
    $imageval = $command.Parameters["@imageval"].Value
    $RC4key = $command.Parameters["@RC4key"].Value
    $connection.Close()
    $decryptedString = Get-DecryptedString -pwd $RC4key -data $imageval
    Return $decryptedString
}
# ############
# ### MAIN ###
# ############
# DAC connection string for decryption
$dacConnectionString = "Data Source=admin:YourServer;Initial Catalog=YourDatabase;Integrated Security=SSPI"
# normal connection string for encrypted object list
$connectionString = "Data Source=YourServer;Initial Catalog=YourDatabase;Integrated Security=SSPI"
# target file path for clear encrypted objects DDL
$scriptFilePath = "C:\Scripts\EncryptedObjects.sql"
[void](New-Item -Path $scriptFilePath -ItemType file -Force) # create directory (if needed) and empty script file
$EncryptedObjectQuery = @"
SELECT
    QUOTENAME(OBJECT_SCHEMA_NAME(object_id)) + '.' + QUOTENAME(name) AS QualifiedObjectName
FROM sys.objects
WHERE OBJECTPROPERTY(object_id, 'IsEncrypted') = 1;
"@
try {
    $connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
    $command = New-Object System.Data.SqlClient.SqlCommand($EncryptedObjectQuery, $connection)
    $connection.Open()
    $reader = $command.ExecuteReader()
    while ($reader.Read()) {
        # Decrypt each object over the DAC connection and append its DDL to the script file
        $createObjectScript = Get-ClearObjectText -connectionString $dacConnectionString -objectName $reader["QualifiedObjectName"]
        $createObjectScript | Out-File -FilePath $scriptFilePath -Append
        "GO" | Out-File -FilePath $scriptFilePath -Append
    }
    $connection.Close()
}
catch {
    throw
}
Related
I am trying to bulk load data from Oracle to SQL Server through the PowerShell SqlServer module using SqlBulkCopy.
On small data everything works fine, but on big datasets, even with BatchSize and streaming set, SqlBulkCopy takes all the available memory... until an out-of-memory error.
Also, the notify function seems to give no answer, so I guess that even with EnableStreaming = $true, the process first loads everything into memory...
What did I miss?
$current = Get-Date
#copy table from Oracle table to SQL Server table
add-type -path "D:\oracle\product\12.1.0\client_1\odp.net\managed\common\Oracle.ManagedDataAccess.dll";
# define Oracle connection string
$conn_str = "cstr"
# query for oracle table
$qry = "
SELECT
ID,CREATEDT,MODIFIEDDT
FROM MYTABLE
WHERE source.ISSYNTHETIC=0 AND source.VALIDFROM >= TO_Date('2019-01-01','yyyy-mm-dd')
";
# key (on the left side) is the source column while value (on the right side) is the target column
[hashtable] $mapping = @{'ID'='ID';'CREATEDT'='CREATEDT';'MODIFIEDDT'='MODIFIEDDT'};
$adapter = new-object Oracle.ManagedDataAccess.Client.OracleDataAdapter($qry, $conn_str);
#$info = new-object Oracle.ManagedDataAccess.Client;
#Write-Host ( $info | Format-Table | Out-String)
$dtbl = new-object System.Data.DataTable('MYTABLE');
#this Fill method will populate the $dtbl with the query $qry result
$adapter.Fill($dtbl);
#define sql server target instance
$sqlconn = "cstr";
$sqlbc = new-object system.data.sqlclient.Sqlbulkcopy($sqlconn)
$sqlbc.BatchSize = 1000;
$sqlbc.EnableStreaming = $true;
$sqlbc.NotifyAfter = 1000;
$sqlbc.DestinationTableName="DWHODS.MYTABLE";
#need to tell $sqlbc the column mapping info
foreach ($k in $mapping.keys)
{
    $colMapping = new-object System.Data.SqlClient.SqlBulkCopyColumnMapping($k, $mapping[$k]);
    $sqlbc.ColumnMappings.Add($colMapping) | out-null
}
$sqlbc.WriteToServer($dtbl);
$sqlbc.Close();
$end= Get-Date
$diff= New-TimeSpan -Start $current -End $end
Write-Output "import needed : $diff"
Thanks to Jeroen, I changed the code as follows. Passing an IDataReader to WriteToServer lets SqlBulkCopy pull rows as it sends them, instead of buffering the whole result set in a DataTable first, so it no longer consumes all the memory:
$oraConn = New-Object Oracle.ManagedDataAccess.Client.OracleConnection($conn_str);
$oraConn.Open();
$command = $oraConn.CreateCommand();
$command.CommandText=$qry;
$reader = $command.ExecuteReader()
...
$sqlbc.WriteToServer($reader);
I have put together a nice PowerShell script to script out the objects (tables, functions, sprocs, etc.) from a database, limiting it to the ones in a list.
But I am stuck trying to find a way to script the database itself. Each time I try, it seems to script out the whole database (it is way too large for that to go well).
Assuming I have a $db variable that is a reference to my database, how can I use SMO to script out that database, creating it with the same Properties and DatabaseScopedConfigurations, but none of the actual objects in it?
Update:
For reference, here is my current script. It takes a server and database name and scripts out all the objects listed in a file called DbObjectList.txt (assuming they exist in the database). But it does not actually create the database. The database I am running this on is a legacy one with a bunch of odd options set, and I would like to preserve those.
$serverName = "MyServerName"
$databaseName = "MyDbName"
$date_ = (date -f yyyyMMdd)
$path = ".\"+"$date_"
# Load the Sql Server Management Objects (SMO) and output to null so we don't show the dll details.
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') > $null
# Setup the scripting options
$scriptOptions = new-object ('Microsoft.SqlServer.Management.Smo.ScriptingOptions')
$scriptOptions.ExtendedProperties = $true
$scriptOptions.AnsiPadding = $true
$scriptOptions.ClusteredIndexes = $true
# Dri = Declarative Referential Integrity
$scriptOptions.DriAll = $true
$scriptOptions.Triggers = $true
$scriptOptions.NoCollation = $false
$scriptOptions.SchemaQualify = $true
$scriptOptions.ScriptSchema = $true
$scriptOptions.EnforceScriptingOptions = $true
$scriptOptions.SchemaQualifyForeignKeysReferences = $true
$scriptOptions.NonClusteredIndexes = $true
$scriptOptions.Statistics = $true
$scriptOptions.Permissions = $true
$scriptOptions.OptimizerData = $true
# get a reference to the database we are going to be scripting from
$serverInstance = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $serverName
$db=$serverInstance.Databases | Where-Object {$_.Name -eq $databaseName}
$dbname = "$db".replace("[","").replace("]","")
$dbpath = "$path"+ "\"+"$dbname" + "\"
if ( !(Test-Path $dbpath))
{
    $null = new-item -type directory -name "$dbname" -path "$path"
}
# Load the list of db objects we want to script.
$listPath = ".\DbObjectList.txt"
if ((Test-Path $listPath))
{
    $dbListItems = Get-Content -Path $listPath
}
else
{
    throw "Could not find DbObjectList.txt file (it should have a list of what to script)."
}
# Setup the output file, removing any existing one
$outFile = "$dbpath" + "FullScript.sql"
if ((Test-Path $outFile)){Remove-Item $outFile }
$typeDelimiter = "=========="
foreach ($dbListItem in $dbListItems)
{
    # Let the caller know which one we are working on.
    echo $dbListItem
    if ($dbListItem.StartsWith($typeDelimiter))
    {
        # Pull the type out of the header
        $startIndex = $typeDelimiter.Length;
        $stopIndex = $dbListItem.LastIndexOf($typeDelimiter)
        $type = $dbListItem.Substring($startIndex, $stopIndex - $startIndex).Trim()
        continue;
    }
    if ($type -eq $null)
    {
        throw "Types not included in DbObjectList.txt. Add types before groups of objects, surrounded by " + $typeDelimiter
    }
    foreach ($dbObjectToScript in $db.$type)
    {
        $objName = "$dbObjectToScript".replace("[","").replace("]","")
        $compareDbListItem = "$dbListItem".replace("[","").replace("]","")
        if ($compareDbListItem -eq $objName)
        {
            "-- " + $dbListItem | out-File -Append $outFile
            $dbObjectToScript.Script($scriptOptions)+"GO" | out-File -Append $outFile
        }
    }
}
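For the database itself, as asked above, a minimal sketch under these assumptions: $db is the SMO Database object from the script, and calling Script() directly on it emits only the CREATE DATABASE statement with its options (files, collation, database options), not the contained objects. Whether database-scoped configurations are included depends on the SMO version; they may need to be scripted separately.

# Hedged sketch: script just the database definition, none of its objects
$dbOptions = new-object ('Microsoft.SqlServer.Management.Smo.ScriptingOptions')
$dbOptions.ExtendedProperties = $true
$dbOptions.IncludeIfNotExists = $true   # wrap CREATE DATABASE in an existence check
"-- Database definition" | out-File -Append $outFile
$db.Script($dbOptions) | out-File -Append $outFile
"GO" | out-File -Append $outFile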
I am trying to import an XML file into an MS SQL table using PowerShell. I have modified a PowerShell script I already had that worked fine, but I could do with some help, as the new XML file has multiple entries for some elements (I have included a sample of the XML file).
Below are the table layout, a sample of the file I am trying to import, and the script I am trying to modify.
Table layout is:
------------------------------
:supplier_id : name : Symbol :
and here is a part of the XML file itself:
<SupplierMapping supplier_id="4536" name="Joby">
<Symbol>Joby</Symbol>
</SupplierMapping>
<SupplierMapping supplier_id="4537" name="ACT">
<Symbol>ACT</Symbol>
<Symbol>ADVANCED CABLE TECH</Symbol>
<Symbol>ADVANCED CABLE TECHNOLOGY</Symbol>
<Symbol>IEC LOCK</Symbol>
</SupplierMapping>
As you can see, some supplier IDs and names will have multiple <Symbol> names, so I would like to have multiple rows in my table for those. However, the script I have modified only seems to pull in the supplier_id and name attributes and misses the <Symbol> elements entirely. Any help or guidance would be much appreciated.
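For the ACT sample above, the desired result would be one row per <Symbol>:

------------------------------
:supplier_id : name : Symbol :
: 4537 : ACT : ACT :
: 4537 : ACT : ADVANCED CABLE TECH :
: 4537 : ACT : ADVANCED CABLE TECHNOLOGY :
: 4537 : ACT : IEC LOCK :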
Set-ExecutionPolicy Unrestricted -Scope LocalMachine
[String]$global:connectionString = "Data Source=Apps2\Apps2;Initial Catalog=DTEDATA;Integrated Security=SSPI";
[System.Data.DataTable]$global:dt = New-Object System.Data.DataTable;
[System.Xml.XmlTextReader]$global:xmlReader = New-Object System.Xml.XmlTextReader("C:\Scripts\icecat\supplier_mapping.xml");
[Int32]$global:batchSize = 100;
function Add-FileRow() {
    $newRow = $dt.NewRow();
    $null = $dt.Rows.Add($newRow);
    $newRow["supplier_id"] = $global:xmlReader.GetAttribute("supplier_id");
    $newRow["name"] = $global:xmlReader.GetAttribute("name");
    $newRow["Symbol"] = $global:xmlReader.GetAttribute("Symbol"); # Symbol is a child element, not an attribute, so this comes back empty
}
# init data table schema
$da = New-Object System.Data.SqlClient.SqlDataAdapter("SELECT * FROM Supplier_Mapping_Test WHERE 0 = 1", $global:connectionString);
$null = $da.Fill($global:dt);
$bcp = New-Object System.Data.SqlClient.SqlBulkCopy($global:connectionString);
$bcp.DestinationTableName = "dbo.Supplier_Mapping_Test";
$recordCount = 0;
try {
    while ($xmlReader.Read() -eq $true) {
        if (($xmlReader.NodeType -eq [System.Xml.XmlNodeType]::Element) -and ($xmlReader.Name -eq "SupplierMapping")) {
            Add-FileRow -xmlReader $xmlReader;
            $recordCount += 1;
            if (($recordCount % $global:batchSize) -eq 0) {
                $bcp.WriteToServer($dt);
                $dt.Rows.Clear();
                Write-Host "$recordCount file elements processed so far";
            }
        }
    }
    if ($dt.Rows.Count -gt 0) {
        $bcp.WriteToServer($dt);
    }
    $bcp.Close();
    $xmlReader.Close();
    Write-Host "$recordCount file elements imported";
}
catch {
    throw;
}
I need to back up several databases that are on the same server in Azure.
I currently have a script that backs them up, but only one at a time,
and I must keep changing the backup names by hand.
This is my code:
Import-Module $PSScriptRoot\..\util\utilConnection.ps1;
Import-Module $PSScriptRoot\..\util\utilDate.ps1;
#Import-Module $PSScriptRoot\..\logging\Logging_Functions.ps1;
Import-Module AzureRM.sql
$TIMESTAMP = getTimeStamp;
#$LogPath = getPathLog;
#$logFileName = "prueba_jobDatabaseBackup.log";
#Log-Start -LogPath $LogPath -LogName $logFileName -ScriptVersion "1.5"
#return;
#Login-AzureRmAccount
loginRMAccount;
#Set subscription Azure
prueba;
Write-Output "";
#Create credential Source DB Server (QA)
#$myPasswordDB = ConvertTo-SecureString $SQL_ACCOUNT_PASSWORD_QA -AsPlainText -Force;
#$myCredentialDB = New-Object System.Management.Automation.PSCredential ($SQL_ACCOUNT_NAME_QA, $myPasswordDB);
#$sqlCredential = Get-Credential -Credential $myCredentialDB;
#Create credential Source DB Server (Prod)
$myPasswordDB = ConvertTo-SecureString $SQL_ACCOUNT_PASSWORD_QA -AsPlainText -Force;
$myCredentialDB = New-Object System.Management.Automation.PSCredential ($SQL_ACCOUNT_NAME_QA, $myPasswordDB);
$sqlCredential = Get-Credential -Credential $myCredentialDB;
$resourceGroup = "resGroupDB";
$serverName = "domserverqa";
$database = "prueba"; **// here I have to change the name of the backup file**
$primarykey = $STORAGE_ACCOUNT_BACKUP_KEY; #strdatabasebackup
$StorageUri = ("https://strdatabasebackup.blob.core.windows.net/strdatabasebackupblob/(2018-01-09-07:00)dbdom_buin.bacpac"); # here I also have to change the final name individually for each database
#$sqlCredential = getCredentialSQLServerQA; #SQL Server target
$SQL_SERVER_FULLNAME_QA = getSQLServerFullNameAzureQA;
$TIMEOUT = 300;
$importRequest = New-AzureRmSqlDatabaseImport -ResourceGroupName $resourceGroup -ServerName $serverName -DatabaseName $database -StorageKeytype StorageAccessKey -StorageKey $primarykey -StorageUri $StorageUri -AdministratorLogin $sqlCredential.UserName -AdministratorLoginPassword $sqlCredential.Password -Edition Basic -ServiceObjectiveName basic -DatabaseMaxSizeBytes 2147483648 # 2GB -> 2 * 1024 MB -> 2 * 1024 * 1024 KB -> 2 * 1024 * 1024 * 1024 Bytes
$importStatus = Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink;
while ($importStatus.Status -eq "InProgress")
{
    $importStatus = Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink;
    Write-Output ".";
    [System.Threading.Thread]::Sleep(2000);
}
[System.Threading.Thread]::Sleep(4000);
How can I implement a foreach or an array to go through all the databases and back them up one by one without having to do it manually?
If someone has any ideas, please help. Thanks.
Pass your DB names as a list to a ForEach loop or a function.
Just pass in a list of database names to a ForEach loop, handing the DB name to your code in a variable:
$AzureDBNames = 'AzureDB01','AzureDB02','AzureDB03'
ForEach ($AzureDBName in $AzureDBNames)
{
    # Code begins here
    "Backing up $AzureDBName"
}
Turn your code into a function with a parameter that accepts one or more db names.
Function New-AzureDBBackup
{
    [CmdletBinding()]
    [Alias('NABB')]
    Param
    (
        [string[]]$AzureDBNames
    )
    # Code begins here
}
New-AzureDBBackup -AzureDBNames 'AzureDB01','AzureDB02','AzureDB03'
Read the online help on:
About_Functions
About_For
About_Loops
About_Variables
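Applied to the code in the question, the loop could look like the sketch below. Hedged: the database names and storage URI pattern are placeholders, the question's New-AzureRmSqlDatabaseImport call is kept as-is, and note that for exporting to a .bacpac the corresponding cmdlet is New-AzureRmSqlDatabaseExport.

$AzureDBNames = 'prueba', 'dbdom_buin'   # placeholder database names
foreach ($AzureDBName in $AzureDBNames)
{
    # Build a per-database bacpac URI so each backup gets its own name
    $stamp = Get-Date -Format 'yyyy-MM-dd-HH:mm'
    $StorageUri = "https://strdatabasebackup.blob.core.windows.net/strdatabasebackupblob/($stamp)$AzureDBName.bacpac"
    Write-Output "Backing up $AzureDBName to $StorageUri"
    $importRequest = New-AzureRmSqlDatabaseImport -ResourceGroupName $resourceGroup -ServerName $serverName -DatabaseName $AzureDBName -StorageKeytype StorageAccessKey -StorageKey $primarykey -StorageUri $StorageUri -AdministratorLogin $sqlCredential.UserName -AdministratorLoginPassword $sqlCredential.Password -Edition Basic -ServiceObjectiveName basic -DatabaseMaxSizeBytes 2147483648
    # Wait for this database to finish before starting the next one
    $importStatus = Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink
    while ($importStatus.Status -eq "InProgress")
    {
        $importStatus = Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink
        [System.Threading.Thread]::Sleep(2000)
    }
}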
Are there any limits to the size of a string you can assign to a variable in PowerShell, or any limits to the size of the text sent within a SQL INSERT query?
I have a big CSV file coming into PowerShell, and through string construction in a foreach loop I am generating SQL INSERT queries for each row. The resulting batch of INSERT queries is over about 4 MB.
The SQL Server has a suitable schema to receive the data; however, when sending the 4 MB collection of INSERT queries (each separated by ;), I get an error that looks to me like the long 4 MB set of INSERT queries was truncated somehow. I guess I have hit some kind of limit.
Is there a way of getting around this (programmatically in PowerShell), or a way of increasing the size limit of an acceptable collection of SQL INSERT queries?
My code is using System.Data.SqlClient.SqlConnection and System.Data.SqlClient.SqlCommand.
Smaller datasets work OK, but the larger datasets give an error like the following example. Each dataset gives a different "Incorrect syntax near" indicator.
Exception calling "ExecuteNonQuery" with "0" argument(s): "Incorrect syntax
near '('."
At C:\Users\stuart\Desktop\git\ADStfL\WorkInProgress.ps1:211 char:3
+ $SQLCommand.executenonquery()
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : SqlException
In my experience, the best performing way to do this is to load the CSV into a DataTable and then use SQLBulkCopy.
$ErrorActionPreference = 'Stop';
$Csv = Import-Csv -Path $FileName;
$SqlServer = 'MyServer';
$SqlDatabase = 'MyDatabase';
$DestinationTableName = 'MyTable';
# Create Connection String
$SqlConnectionString = 'Data Source={0};Initial Catalog={1};Integrated Security=SSPI' -f $SqlServer, $SqlDatabase;
# Define your DataTable. The column order of the DataTable must either match the table in the database, or
# you must specify the column mapping in SqlBulkCopy.ColumnMapping. If you have an IDENTITY column, it's a
# bit more complicated
$DataTable = New-Object -TypeName System.Data.DataTable -ArgumentList $DestinationTableName;
$NewColumn = $DataTable.Columns.Add('Id',[System.Int32]);
$NewColumn.AllowDBNull = $false;
$NewColumn = $DataTable.Columns.Add('IntegerField',[System.Int32]);
$NewColumn.AllowDBNull = $false;
$NewColumn = $DataTable.Columns.Add('DecimalField',[System.Decimal]);
$NewColumn.AllowDBNull = $false;
$NewColumn = $DataTable.Columns.Add('VarCharField',[System.String]);
$NewColumn.MaxLength = 50;
$NewColumn = $DataTable.Columns.Add('DateTimeField',[System.DateTime]);
$NewColumn.AllowDBNull = $false;
# Populate your datatable from the CSV file
# You may find that you need to type cast some of the fields.
$Csv | ForEach-Object {
    $NewRow = $DataTable.NewRow();
    $NewRow['Id'] = $_.Id;
    $NewRow['IntegerField'] = $_.IntegerField;
    $NewRow['DecimalField'] = $_.DecimalField;
    $NewRow['VarCharField'] = $_.VarCharField;   # column name must match the DataTable definition above
    $NewRow['DateTimeField'] = $_.DateTimeField;
    $DataTable.Rows.Add($NewRow);
}
# Create Connection
$SqlConnection = New-Object -TypeName System.Data.SqlClient.SqlConnection -ArgumentList $SqlConnectionString;
# Open Connection
$SqlConnection.Open();
# Start Transaction
$SqlTransaction = $SqlConnection.BeginTransaction();
# Double check the possible options at https://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopyoptions(v=vs.110).aspx
# If you need multiple then -bor them together
$SqlBulkCopyOptions = [System.Data.SqlClient.SqlBulkCopyOptions]::CheckConstraints;
# Create SqlBulkCopy class
$SqlBulkCopy = New-Object -TypeName System.Data.SqlClient.SqlBulkCopy -ArgumentList $SqlConnection, $SqlBulkCopyOptions, $SqlTransaction;
# Specify destination table
$SqlBulkCopy.DestinationTableName = $DestinationTableName;
# Do the insert; rollback on error
try {
    $SqlBulkCopy.WriteToServer($DataTable);
    $SqlTransaction.Commit();
}
catch {
    # Roll back transaction and rethrow error
    $SqlTransaction.Rollback();
    throw ($_);
}
finally {
    $SqlConnection.Close();
    $SqlConnection.Dispose();
}
The other method is to use a SqlCommand and do it row by row. Parameterized commands keep each statement small, so you never approach whatever batch-size limit the concatenated INSERTs were hitting:
$ErrorActionPreference = 'Stop';
$Csv = Import-Csv -Path $FileName;
$SqlServer = 'MyServer';
$SqlDatabase = 'MyDatabase';
# Create Connection String
$SqlConnectionString = 'Data Source={0};Initial Catalog={1};Integrated Security=SSPI' -f $SqlServer, $SqlDatabase;
# Create Connection
$SqlConnection = New-Object -TypeName System.Data.SqlClient.SqlConnection -ArgumentList $SqlConnectionString;
# Create Command
$InsertCommandText = 'INSERT INTO DestinationTable (Id, IntegerField, DecimalField, StringField, DateTimeField) VALUES (@Id, @IntegerField, @DecimalField, @StringField, @DateTimeField)';
$InsertCommand = New-Object -TypeName System.Data.SqlClient.SqlCommand -ArgumentList $InsertCommandText, $SqlConnection;
[void]$InsertCommand.Parameters.Add('@Id', [System.Data.SqlDbType]::Int);
[void]$InsertCommand.Parameters.Add('@IntegerField', [System.Data.SqlDbType]::Int);
[void]$InsertCommand.Parameters.Add('@DecimalField', [System.Data.SqlDbType]::Decimal);
[void]$InsertCommand.Parameters.Add('@StringField', [System.Data.SqlDbType]::VarChar, 50);
[void]$InsertCommand.Parameters.Add('@DateTimeField', [System.Data.SqlDbType]::DateTime);
# Open connection and start transaction
$SqlConnection.Open()
$SqlTransaction = $SqlConnection.BeginTransaction();
$InsertCommand.Transaction = $SqlTransaction;
$RowsInserted = 0;
try {
    $line = 0;
    $Csv | ForEach-Object {
        $line++;
        # Specify parameter values
        $InsertCommand.Parameters['@Id'].Value = $_.Id;
        $InsertCommand.Parameters['@IntegerField'].Value = $_.IntegerField;
        $InsertCommand.Parameters['@DecimalField'].Value = $_.DecimalField;
        $InsertCommand.Parameters['@StringField'].Value = $_.StringField;
        $InsertCommand.Parameters['@DateTimeField'].Value = $_.DateTimeField;
        $RowsInserted += $InsertCommand.ExecuteNonQuery();
        # Clear parameter values
        $InsertCommand.Parameters | ForEach-Object { $_.Value = $null };
    }
    $SqlTransaction.Commit();
    Write-Output "Rows affected: $RowsInserted";
}
catch {
    # Roll back transaction and rethrow error
    $SqlTransaction.Rollback();
    Write-Error "Error on line $line" -ErrorAction Continue;
    throw ($_);
}
finally {
    $SqlConnection.Close();
    $SqlConnection.Dispose();
}
Edit: Oh, I forgot one important point. If you need to set the value of a field to NULL in the database, you need to set its value to [System.DBNull]::Value, not $null.
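For example, in the row-population loop above (a small sketch, assuming the CSV's VarCharField may be empty):

if ([string]::IsNullOrEmpty($_.VarCharField)) {
    $NewRow['VarCharField'] = [System.DBNull]::Value;   # database NULL; $null would not work here
}
else {
    $NewRow['VarCharField'] = $_.VarCharField;
}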