In PowerShell, how to read database data into a variable - sql-server

In a PowerShell script I need to get a value from a SQL Server table. The value needs to be stored in a PowerShell variable. It's just one value -- not a whole dataset.
How can this be done?
Thanks.

Here's an example using SqlClient (in a C# style :-) ).
$cn = New-Object System.Data.SqlClient.SqlConnection("Data Source=.;Initial Catalog=tempdb;Integrated Security=SSPI");
$cmd = New-Object System.Data.SqlClient.SqlCommand("SELECT 1", $cn);
$cn.Open();
$yourVariable = $cmd.ExecuteScalar();
Write-Host "The value of yourVariable is $yourVariable";
$cn.Close();
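If the value you need depends on an input, the same SqlClient pattern can be parameterized. A minimal sketch (the table, column and @Id value here are placeholders, not from the original question), with try/finally so the connection always gets closed:
$cn = New-Object System.Data.SqlClient.SqlConnection("Data Source=.;Initial Catalog=tempdb;Integrated Security=SSPI");
$cmd = New-Object System.Data.SqlClient.SqlCommand("SELECT TOP (1) SomeColumn FROM dbo.SomeTable WHERE Id = @Id", $cn);
[void]$cmd.Parameters.AddWithValue("@Id", 42);
try {
    $cn.Open();
    # ExecuteScalar returns the first column of the first row, or $null if there is no row
    $yourVariable = $cmd.ExecuteScalar();
}
finally {
    $cn.Close();
}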

FWIW, here's how I'm applying that technique to get the path for the Output file for a SQL Agent job step, so I can write to that file.
$cn = New-Object System.Data.SqlClient.SqlConnection("Data Source=.;Initial Catalog=msdb;Integrated Security=SSPI");
$cmd = New-Object System.Data.SqlClient.SqlCommand("SELECT msdb.dbo.sysjobsteps.output_file_name FROM msdb.dbo.sysjobs LEFT JOIN msdb.dbo.sysjobsteps ON msdb.dbo.sysjobs.job_id = msdb.dbo.sysjobsteps.job_id WHERE name = 'TestPS1' AND step_name = 'MyJobStep'", $cn);
$cn.Open();
$OutputFile = $cmd.ExecuteScalar();
$cn.Close();
Add-Content $OutputFile "Starting DEPLOY_SANITIZE_SCHEMA.sql __test_b"
Add-Content $OutputFile "##########"

Related

SqlBulkCopy Excessive Memory Consumption even with EnableStreaming and low BatchSize

I'm trying to bulk load data from Oracle to SQL Server from PowerShell using SqlBulkCopy.
On small data everything works fine, but on big datasets, even when BatchSize and streaming are set, SqlBulkCopy takes all the available memory... until it runs out of memory.
Also, the notify callback seems to give no answer, so I guess that even with EnableStreaming = $true the process first loads everything into memory...
What did I miss?
$current = Get-Date
#copy table from Oracle table to SQL Server table
add-type -path "D:\oracle\product\12.1.0\client_1\odp.net\managed\common\Oracle.ManagedDataAccess.dll";
#define oracle connection string
$conn_str = "cstr"
# query for oracle table
$qry = "
SELECT
ID,CREATEDT,MODIFIEDDT
FROM MYTABLE source
WHERE source.ISSYNTHETIC=0 AND source.VALIDFROM >= TO_DATE('2019-01-01','yyyy-mm-dd')
";
# key (on the left side) is the source column while value (on the right side) is the target column
[hashtable] $mapping = @{'ID'='ID';'CREATEDT'='CREATEDT';'MODIFIEDDT'='MODIFIEDDT'};
$adapter = new-object Oracle.ManagedDataAccess.Client.OracleDataAdapter($qry, $conn_str);
#$info = new-object Oracle.ManagedDataAccess.Client;
#Write-Host ( $info | Format-Table | Out-String)
$dtbl = new-object System.Data.DataTable('MYTABLE');
#this Fill method will populate the $dtbl with the query $qry result
$adapter.Fill($dtbl);
#define sql server target instance
$sqlconn = "cstr";
$sqlbc = new-object System.Data.SqlClient.SqlBulkCopy($sqlconn);
$sqlbc.BatchSize = 1000;
$sqlbc.EnableStreaming = $true;
$sqlbc.NotifyAfter = 1000;
$sqlbc.DestinationTableName="DWHODS.MYTABLE";
#need to tell $sqlbc the column mapping info
foreach ($k in $mapping.keys)
{
$colMapping = new-object System.Data.SqlClient.SqlBulkCopyColumnMapping($k, $mapping[$k]);
$sqlbc.ColumnMappings.Add($colMapping) | out-null
}
$sqlbc.WriteToServer($dtbl);
$sqlbc.Close();
$end= Get-Date
$diff= New-TimeSpan -Start $current -End $end
Write-Output "import needed : $diff"
Thanks to Jeroen, I changed the code like this, and now it no longer consumes memory. The key point is that $adapter.Fill($dtbl) materializes the whole result set in a DataTable before the bulk copy even starts; handing SqlBulkCopy a data reader lets it stream rows instead:
$oraConn = New-Object Oracle.ManagedDataAccess.Client.OracleConnection($conn_str);
$oraConn.Open();
$command = $oraConn.CreateCommand();
$command.CommandText=$qry;
$reader = $command.ExecuteReader()
...
$sqlbc.WriteToServer($reader);
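For reference, a fuller sketch of that reader-based path, reusing $conn_str, $qry, $sqlconn and $mapping from the script above (the BulkCopyTimeout setting is an extra assumption for long loads):
$oraConn = New-Object Oracle.ManagedDataAccess.Client.OracleConnection($conn_str);
$oraConn.Open();
$command = $oraConn.CreateCommand();
$command.CommandText = $qry;
$reader = $command.ExecuteReader();
$sqlbc = New-Object System.Data.SqlClient.SqlBulkCopy($sqlconn);
$sqlbc.BatchSize = 1000;
$sqlbc.EnableStreaming = $true;
$sqlbc.NotifyAfter = 1000;
$sqlbc.BulkCopyTimeout = 0;             # 0 = no timeout, useful for big loads (assumption)
$sqlbc.DestinationTableName = "DWHODS.MYTABLE";
foreach ($k in $mapping.Keys) {
    [void]$sqlbc.ColumnMappings.Add($k, $mapping[$k]);   # source -> target column
}
try {
    # rows stream from the Oracle reader straight into SQL Server; no DataTable in between
    $sqlbc.WriteToServer($reader);
}
finally {
    $sqlbc.Close();
    $reader.Dispose();
    $oraConn.Close();
}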

How to loop over the result (System.Data.DataSet object) of SQL run from PowerShell

I am trying to run a SQL query from PowerShell (which is on my Windows 7 64-bit desktop); the remote database host is MS SQL Server 2012.
I am running SQL1 by calling the function Get-ODBC-Data, which gives me a single column of type string. It can have an unknown number of rows (up to 20). I then use each of these column values as a parameter ($var1) to the second function, Get-ODBC-Data-Count. SQL2 in Get-ODBC-Data-Count returns a count using $var1 in the WHERE clause.
The code is:
function Get-ODBC-Data{
param(
[string]$query=$('
SELECT col3
FROM [master].[sys].[table_name]'),
[string]$username='db_user_name',
[string]$password='db_password'
)
$conn = New-Object System.Data.Odbc.OdbcConnection
$conn.ConnectionString = "DRIVER={SQL Server};Server=123.456.78.90;Initial Catalog=master;Uid=$username;Pwd=$password;"
$conn.open()
$cmd = New-object System.Data.Odbc.OdbcCommand($query,$conn)
$ds = New-Object system.Data.DataSet
(New-Object system.Data.odbc.odbcDataAdapter($cmd)).fill($ds) | out-null
$conn.close()
$ds.Tables[0]
}
function Get-ODBC-Data-Count{
param(
[parameter(Mandatory=$true)][string]$var1,
[string]$query="
SELECT COUNT(*)
FROM [master].[sys].[table_name]
WHERE col2 = '$var1'
;
",
[string]$username='db_user_name',
[string]$password='db_password'
)
$conn = New-Object System.Data.Odbc.OdbcConnection
$conn.ConnectionString = "DRIVER={SQL Server};Server=123.456.78.90;Initial Catalog=master;Uid=$username;Pwd=$password;"
$conn.open()
$cmd = New-object System.Data.Odbc.OdbcCommand($query,$conn)
$ds = New-Object system.Data.DataSet
(New-Object system.Data.odbc.odbcDataAdapter($cmd)).fill($ds) | out-null
$conn.close()
$ds.Tables[0]
}
$result = Get-ODBC-Data
$count_1 = Get-ODBC-Data-Count -var1 $result[0][0]
Write-Host "`$count_1[0]:" $count_1[0];
Write-Host "Message: Count of " $result[0][0] " is" $count_1[0];
$count_2 = Get-ODBC-Data-Count -var1 $result[1][0]
Write-Host "`$count_2:" $count_2[0];
Write-Host "Message: Count of " $result[1][0] " is" $count_2[0];
$count_3 = Get-ODBC-Data-Count -var1 $result[2][0]
Write-Host "`$count_3:" $count_3[0];
Write-Host "Message: Count of " $result[2][0] " is" $count_3[0];
This code works if I know the number of rows in the result of SQL1.
My question: how can I modify this code so that an unknown number of rows in the result of SQL1 is handled, and Get-ODBC-Data-Count is called for each row of SQL1?
There are a lot of issues here.
You're building SQL strings. Don't do this! Use SQL parameters instead!
You're repeating a lot of code unnecessarily.
You're using DataTables, which I would avoid, at least in PowerShell.
You're not re-using the database connection.
Always try really hard to avoid loops that run a query on every iteration when working with SQL. Think about whether you can rewrite the SQL instead.
Try this SQL:
SELECT
col2,
COUNT(<thePrimaryKeyColumnOfTheTable>)
FROM [master].[sys].[table_name]
GROUP BY col2
That should give you the count of all the different values of col2.
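Putting that together, a sketch of how the single GROUP BY query could replace the per-row calls, reusing the question's ODBC connection string with $username and $password filled in (COUNT(*) stands in for the primary-key column):
$conn = New-Object System.Data.Odbc.OdbcConnection
$conn.ConnectionString = "DRIVER={SQL Server};Server=123.456.78.90;Initial Catalog=master;Uid=$username;Pwd=$password;"
$conn.Open()
$query = "SELECT col2, COUNT(*) AS cnt FROM [master].[sys].[table_name] GROUP BY col2"
$cmd = New-Object System.Data.Odbc.OdbcCommand($query, $conn)
$dt = New-Object System.Data.DataTable
(New-Object System.Data.Odbc.OdbcDataAdapter($cmd)).Fill($dt) | Out-Null
$conn.Close()
# one round trip to the server; loop over the counts locally
foreach ($row in $dt.Rows) {
    Write-Host "Message: Count of" $row.col2 "is" $row.cnt
}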

Powershell SQL Select statements using csv variables

In the code below, I'm trying to query a DB with multiple select statements using variables brought in from a csv and load a data-table using a reader.
The code runs without error but does not retrieve any data.
$csv = Import-Csv $filepath
$database = "DBNAME"
$connectionString = "Server=$dataSource;uid=$user; pwd=$pwd;Database=$database;Integrated Security=True;"
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$connection.Open()
$sqlCommand = $connection.CreateCommand()
$Datatable = New-Object System.Data.DataTable
ForEach ($row in $csv){
$query = "Select Emailaddress,Column2 from Users Where [Emailaddress] = '$row.Email'"
$sqlCommand.CommandText = $query
$DataReader = $sqlCommand.ExecuteReader()
$DataTable.Load($DataReader)
}
$DataTable | export-csv "c:\Output\Seereader.csv" -NoTypeInformation
$connection.Close()
This:
$query = "Select Emailaddress,Column2 from Users Where [Emailaddress] = '$row.Email'"
Should probably be this:
$query = "Select Emailaddress,Column2 from Users Where [Emailaddress] = '$($row.Email)'"
Whilst the previous answer works, it is vulnerable to SQL injection.
Obligatory xkcd
If you're not sure what "SQL Injection" is; it's only a very worthy Google away...
i.e. you really need to go and find out!
The correct way...
Parameterise your queries!
# Your query; with a @param
$Query = "SELECT Emailaddress, Column2 FROM [Users] WHERE [Emailaddress] = @emailAddress";
# Set up your basic command
$command = $connection.CreateCommand()
$command.CommandText = $Query
# Fill in the parameters!
$null = $command.Parameters.AddWithValue("@emailAddress", $row.Email)
# Run boy, run!
$results = $command.ExecuteReader()
# Resultification (that's definitely not a made up word)
$table = New-Object System.Data.DataTable
$table.Load($results)
Safe and sound :-)
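Applied to the original loop over the CSV, a sketch that builds the command once and only swaps the parameter value per row ($csv, $connection and the output path come from the question; the NVarChar size is an assumption):
$command = $connection.CreateCommand()
$command.CommandText = "SELECT Emailaddress, Column2 FROM [Users] WHERE [Emailaddress] = @emailAddress"
$null = $command.Parameters.Add("@emailAddress", [System.Data.SqlDbType]::NVarChar, 256)
$DataTable = New-Object System.Data.DataTable
ForEach ($row in $csv) {
    # only the parameter value changes per row; the query text is never rebuilt
    $command.Parameters["@emailAddress"].Value = $row.Email
    $reader = $command.ExecuteReader()
    $DataTable.Load($reader)
    $reader.Close()
}
$DataTable | Export-Csv "c:\Output\Seereader.csv" -NoTypeInformation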

How do I account for CSV column titles and null rows when trying to import a CSV to SQL Server using PowerShell

I have a large CSV file with 5 columns, the first row being the title of the columns. I'm trying to pass the values to a datatable using powershell that I then pass to a table-value parameter to load the data into my SQL Server instance. My SQL Server table has already been created with the 5 columns, but I am running into issues.
cls
#CSV variables
$csvfile = "C:\Students\test.csv"
$csvdelimiter = ","
$firstRowColumns = $true
#creating the datatable
$dt = New-Object System.Data.Datatable
$reader = New-Object System.IO.StreamReader $csvfile
$columns = (Get-Content $csvfile -First 1).Split($csvdelimiter)
foreach ($column in $columns)
{
if ($firstRowColumns -eq $true)
{
[void]$dt.Columns.Add($column)
$reader.ReadLine()
} else { [void]$dt.Columns.Add() }
}
# Read in the data, line by line
while (($line = $reader.ReadLine()) -ne $null)
{
[void]$dt.Rows.Add($line.Split($csvdelimiter))
}
function ExecSproc
{
param ($Conn, $Sproc, $Parameters=@{})
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$SqlCmd.CommandType = [System.Data.CommandType]::StoredProcedure
$SqlCmd.Connection = $Conn
$SqlCmd.CommandText = $Sproc
foreach($p in $Parameters.Keys){
[Void] $SqlCmd.Parameters.AddWithValue("@$p",$Parameters[$p])
}
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter($SqlCmd)
$DataSet = New-Object System.Data.DataSet
[Void] $SqlAdapter.Fill($DataSet)
$SqlConnection.Close()
return $DataSet.Tables[0]
}
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = "server='.';database='ActiveStudents';trusted_connection=true;"
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
ExecSproc -Conn $SqlConnection -Sproc spInsertStudents -Parameters @{TVP = $dt}
These are the errors being thrown:
Exception calling "Add" with "1" argument(s): "Input array is longer than the number of columns in this table."
At C:\Scripts\ActiveStudentInsert.ps1:24 char:2
Exception calling "Fill" with "1" argument(s): "Conversion failed when converting the nvarchar value '' to data type int.
The data for table-valued parameter "@TVP" doesn't conform to the table type of the parameter. SQL Server error is: 245, state: 1
The statement has been terminated."
Chad Miller made a great script in the Script Gallery that builds DataTables in a very similar way to what you are trying to do.
If you used his functions, it would look something like this:
$dt = Import-Csv $csvfile | Out-DataTable
If you have a really large amount of data, I would recommend you check out the SQL BCP command-line utility.
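For example, a bcp sketch (the database, table and path are placeholders; it assumes the CSV column order matches the table and that the data contains no quoted or embedded commas):
# -c = character data, -t "," = field terminator, -F 2 = start at row 2 (skip header), -T = trusted connection
bcp ActiveStudents.dbo.Students in "C:\Students\test.csv" -S . -T -c -t "," -F 2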

How to format output when exporting SQL query to CSV

I have a task to save the results of a SQL Server query into a .csv file. After some googling I decided to use PowerShell. I found a script, modified it a bit, and it works; almost everything is OK.
$server = "server"
$database = "database"
$query = "SELECT * from et_thanks"
$tod = Get-Date;
$file = "{0:yyyyMMdd}_go.csv" -f $tod;
$extractFile = #"
\\info\export_files\$file
"#
$connectionTemplate = "Data Source={0};Integrated Security=SSPI;Initial Catalog={1};"
$connectionString = [string]::Format($connectionTemplate, $server, $database)
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$command = New-Object System.Data.SqlClient.SqlCommand
$command.CommandText = $query
$command.Connection = $connection
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $command
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet)
$connection.Close()
$DataSet.Tables[0] | Export-Csv -Force -Delimiter ";" $extractFile
But I have 2 problems which I can't solve:
When I open the .csv file I see column headers and a commented type string on the first line:
#TYPE System.Data.DataRow
"ob_no","c_name","c_visible","c_fp","e_from","e_to"
"436439","09.09.2013 11:29:08","0","","10937","260153"
How can I get rid of it?
All values are surrounded with quotes. Is it possible to modify the script not to add them while exporting? Auto-replace isn't a good idea, because a quote symbol could appear in the SQL data.
I tried to find answers in the documentation (http://ss64.com/ps/export-csv.html) but with no success.
You might run into trouble removing the quotes, but if that's what you really want, then the following should achieve it.
-NoTypeInformation will remove the additional type information you are seeing.
($DataSet.Tables[0] | ConvertTo-Csv -Delimiter ";" -NoTypeInformation) -replace "`"", "" | `
Out-File -Force $extractFile
This uses ConvertTo-Csv to convert the data to a string representation of the CSV, replaces all instances of " with nothing, and pipes the final string to Out-File.
...and, to get rid of the header record, first convert the data to CSV (ConvertTo-Csv), then pipe those results to Select-Object to skip the first record:
($DataSet.Tables[0] | ConvertTo-Csv -Delimiter "`t" -NoTypeInformation ) -Replace "`"","" | Select -skip 1 | Out-File blahblahblah...
Agreed, Export-Csv isn't the best tool for the job. I would use sqlcmd.exe or bcp.exe, provided the SQL Server command-line tools are installed (a sqlcmd sketch follows the routine below). You could also build a simple routine to create a CSV from a DataTable:
$result = New-Object System.Text.StringBuilder
$dt = $DataSet.Tables[0]
foreach ($dr in $dt.Rows) {
for ($i = 0; $i -lt $dt.Columns.Count; $i++) {
# append the raw value with no quoting
$null = $result.Append($($dr[$i]).ToString())
# comma between fields, newline after the last column of the row
$null = $result.Append($(if ($i -eq $dt.Columns.Count - 1) {"`n" } else { ","} ))
}
}
$result.ToString()
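And for the sqlcmd route mentioned above, a sketch (server, database and output path are placeholders; SET NOCOUNT ON keeps the rows-affected line out of the file):
# -E = trusted connection, -s ";" = column separator, -W = trim trailing spaces, -h -1 = no header row
sqlcmd -S server -d database -E -Q "SET NOCOUNT ON; SELECT * FROM et_thanks" -s ";" -W -h -1 -o "\\info\export_files\export.csv"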
