I'm seeing some odd behavior. On my machine, PowerShell returns the recordset and I can iterate through the records with no problem. On my co-worker's machine (who has access to the file share that I need to copy the files from), a record count is returned instead of the actual records. I must be missing something easy. Any idea why I'm seeing this different behavior?
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = "Server = server; Database = db; Integrated Security = True"
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$SqlCmd.CommandText = "SELECT fileName from SomeTable"
$SqlCmd.Connection = $SqlConnection
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $SqlCmd
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet)
$Table = new-object data.datatable
$Table = $DataSet.tables[0]
$SqlConnection.Close()
function Out-FileForce {
    PARAM($path)
    PROCESS
    {
        if (Test-Path $path)
        {
            Out-File -InputObject $_ -Append -FilePath $path
        }
        else
        {
            New-Item -Force -Path $path -Value $_ -Type file
        }
    }
}
foreach ($Row in $Table.Rows)
{
    $fullPath = $Row.FullFilePathWithName
    $path = "\\server\folder\"
    $newPath = "C:\newFolder\"
    $newDestination = $fullPath -replace [regex]::Escape($path), $newPath
    #Write-Output $newDestination
    #Write-Output $fullPath
    # recurse should force the creation of the folder structure
    #Copy-Item $fullPath $newDestination -recurse
    Out-FileForce $newDestination
    Copy-Item $fullPath $newDestination -Force
    Write-Output $newDestination " done"
}
This line:
$SqlAdapter.Fill($DataSet)
returns the row count. If you would like it for later, assign it to something:
$rowCount = $SqlAdapter.Fill($DataSet)
Or, if you don't need it, cast the call to void:
[void]$SqlAdapter.Fill($DataSet)
Either of the above keeps the count out of the output and avoids the need to skip the first item.
Hope this helps
Figured out a fix, though I'm still not sure why. Looping over $DataSet.Tables[0].Rows directly was causing the issue.
Fixing the original script I posted.
Added this to the top:
$Table = new-object data.datatable
$Table = $DataSet.tables[0]
Then in my loop I used $Table.Rows
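Putting the fix together, a minimal sketch of the working section (assuming the same $SqlAdapter, $DataSet, and $SqlConnection objects from the original script):
[void]$SqlAdapter.Fill($DataSet)
$Table = $DataSet.Tables[0]
$SqlConnection.Close()
foreach ($Row in $Table.Rows)
{
    # copy logic for each row goes here, e.g. using $Row.FullFilePathWithName
}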
Hopefully, this is not a duplicate. I aggregated numerous solutions I came across from the last year or so to get where I am. This is all relatively new to me and I am looking for the most secure and effective solution. When I run this, nothing happens. The intended result is to execute the stored procedure.
$Server = 'Server Name'
$database = 'DBName'
$userName = 'un'
$password = 'pw'
$Name = 'Name'
$Job = '15'
$Logs = Get-Content -Path $global:LOGFILE
$StartTime ='time'
$End = 'End'
$Connection = New-Object System.Data.SQLClient.SQLConnection
$Connection.ConnectionString = "Server=$('$Server');Database=$('$Database');trusted_connection=true;User Id=$('$userName');Password=$('$password')"
$Connection.Open()
$Command = New-Object System.Data.SQLClient.SQLCommand
$Command.Connection = $Connection
$Command.CommandText ="EXEC dbo.UpdateOutput @Name,@Job,@StartTime,@End,@Status,@Logs"
$Command.Parameters.AddWithValue("@Name", $Name)| Out-Null
$Command.Parameters.AddWithValue("@Job", $Job)| Out-Null
$Command.Parameters.AddWithValue("@Start", $StartTime)| Out-Null
$Command.Parameters.AddWithValue("@End", $End)| Out-Null
$Command.Parameters.AddWithValue("@Status", $Status)| Out-Null
$Command.Parameters.AddWithValue("@Logs", $Logs)| Out-Null
$Command.ExecuteNonQuery()
$Connection.Close()
Get rid of the apostrophes in the ConnectionString line as such:
$Connection.ConnectionString = "Server=$($Server);Database=$($Database);trusted_connection=true;User Id=$($userName);Password=$($password)"
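For context, $('$Server') expands to the literal text $Server because the inner string is single-quoted, so the connection string never received the real server name. A quick way to see the difference (the 'MyServer' value is just illustrative):
$Server = 'MyServer'
"$('$Server')"   # literal text: $Server
"$($Server)"     # expanded value: MyServer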
I have the following contents in a PowerShell script (Process.ps1) that reads from SQL tables and assigns the results to the variables listed:
function Query($Query) {
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = "Server=$Server;Initial Catalog=$Database;Integrated Security=SSPI"
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$SqlCmd.Connection = $SqlConnection
$SqlCmd.CommandText = $Query
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $SqlCmd
$DataSet = New-Object System.Data.DataSet
$a=$SqlAdapter.Fill($DataSet)
$SqlConnection.Close()
$DataSet.Tables[0]
}
$Result = Query "SELECT * from [$cubeTable]" | Out-GridView -Wait;
$CUBE = Query "SELECT [cube_name] FROM [$cubeTable] WHERE [cube_name] = '$CUBE_input'" | Select -ExpandProperty cube_name;
$Destination_Server = Query "SELECT [destination_server] FROM [$cubeTable] WHERE [cube_name] = '$CUBE'" | Select -ExpandProperty destination_server;
$BasePath = Query "SELECT [variable_value] FROM [$pathTable] WHERE [variable_name] = 'base_path'" | Select -ExpandProperty variable_value;
$jsonPath = Join-Path -Path $BasePath -ChildPath $jsonDirectory
New-Item -ItemType Directory -Force -Path $jsonPath
$JSON_file = Join-Path $jsonPath $CUBE |
%{ ($_ + ".json") }
$processPATH = Join-Path -Path $BasePath -ChildPath $process_output_Directory
New-Item -ItemType Directory -Force -Path $processPATH
$process_output = Join-Path $processPATH $CUBE |
%{ ($_ + ".txt") }
$autosysPATH = Join-Path -Path $BasePath -ChildPath $AUTOSYS_output_Directory
New-Item -ItemType Directory -Force -Path $autosysPATH
$process_AUTOSYS_output = Join-Path $autosysPATH $CUBE |
%{ ($_ + "_process.txt") }
When I run the script in CMD through a batch file, it runs great, as it should; however, it's outputting directory information somewhere from the following variables:
json_file, process_output, and autosys_output
I have an output image here:
Where exactly is the writing to the console happening? I don't have any echo or Write-Host, let alone a function to output the directories...
And it's definitely not this part: $Result = Query "SELECT * from [$cubeTable]" | Out-GridView -Wait; because I commented it out and it still output the directory info, as the screenshot shows.
New-Item returns the created FileInfo or DirectoryInfo object. That's what you're seeing in your output. PowerShell's default output formatting just merges similar consecutive objects to provide more compact output, so you're getting a single table instead of three separate tables with one object each.
You can suppress the output by adding | Out-Null to the New-Item statements:
New-Item -ItemType Directory -Force -Path $jsonPath | Out-Null
Other options would be capturing the output in a variable or using redirection (> $null).
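To illustrate, the three approaches applied to the same statement ($jsonDir is just an illustrative variable name):
# Pipe the result to Out-Null
New-Item -ItemType Directory -Force -Path $jsonPath | Out-Null
# Capture the DirectoryInfo object in a variable
$jsonDir = New-Item -ItemType Directory -Force -Path $jsonPath
# Redirect the success stream to $null
New-Item -ItemType Directory -Force -Path $jsonPath > $null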
I have a Powershell Function that is being used to run multiple queries in SQL and export as CSVs. Each of these queries relies on a date variable. Is there a way to pass this date variable from Powershell into these SQL Scripts (not stored procedures) using my current setup? Any help is much appreciated!
Function Run-Query
{
param([string[]]$queries,[string[]]$sheetnames)
Begin
{
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = "Server = $SQLServer; Database = $Database; User ID = $uid; Password = $pwd;"
Write-host "Connection to database successful."
}#End Begin
Process
{
# Loop through each query
For($i = 0; $i -lt $queries.count; $i++)
{
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
# Use the current index ($i) to get the query
$SqlCmd.CommandText = $queries[$i]
$SqlCmd.Connection = $SqlConnection
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $SqlCmd
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet)
# Use the current index ($i) to get the sheetname for the CSV
$DataSet.Tables[0] #| Export-Csv -NoTypeInformation -Path "C:\Users\mbaron\Downloads\$($sheetnames[$i]).csv"
}
}#End Process
End
{
$SqlConnection.Close()
}
}#End function run-query.
You could add a marker in your queries where the date is being used, then do a replace with the relevant date, e.g.:
cls
$date = '1/1/2016'
$query = 'some $$marker$$ script'
$query = $query.replace('$$marker$$', $date )
$query
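One way this could plug into the Run-Query function above (the $$marker$$ token and $reportDate variable are illustrative, not part of the original script) is to replace the marker in every query before passing the array in:
$reportDate = '1/1/2016'
$queries = $queries | ForEach-Object { $_.Replace('$$marker$$', $reportDate) }
Run-Query -queries $queries -sheetnames $sheetnames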
I have a task to save the results of a SQL Server query into a .csv file. After some googling I decided to use PowerShell. I found a script and modified it a bit; it works, and almost everything is OK.
$server = "server"
$database = "database"
$query = "SELECT * from et_thanks"
$tod = Get-Date;
$file = "{0:yyyyMMdd}_go.csv" -f $tod;
$extractFile = @"
\\info\export_files\$file
"@
$connectionTemplate = "Data Source={0};Integrated Security=SSPI;Initial Catalog={1};"
$connectionString = [string]::Format($connectionTemplate, $server, $database)
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$command = New-Object System.Data.SqlClient.SqlCommand
$command.CommandText = $query
$command.Connection = $connection
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $command
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet)
$connection.Close()
$DataSet.Tables[0] | Export-Csv -Force -Delimiter ";" $extractFile
But I have 2 problems which I can't solve:
When I open the .csv file I see column headers and a commented string on the first line:
#TYPE System.Data.DataRow
"ob_no","c_name","c_visible","c_fp","e_from","e_to"
"436439","09.09.2013 11:29:08","0","","10937","260153"
How can I get rid of it?
All values are surrounded with quotes. Is it possible to modify the script not to add them while exporting? Auto-replacing isn't a good idea, because a quote symbol may appear in the SQL data.
I tried to find answers in documentation (http://ss64.com/ps/export-csv.html) but with no success.
You might run into trouble removing the quotes, but if that's what you really want then the following should achieve it.
-NoTypeInformation will remove the additional type information you are seeing.
($DataSet.Tables[0] | ConvertTo-Csv -Delimiter ";" -NoTypeInformation) -replace "`"", "" | `
Out-File -Force $extractFile
This uses ConvertTo-Csv to convert to a string representation of the CSV, replaces all instances of " with nothing, and pipes the final string to Out-File.
...and, to get rid of the header record, first convert the data to CSV (ConvertTo-Csv), then pipe those results to Select to skip the first record:
($DataSet.Tables[0] | ConvertTo-Csv -Delimiter "`t" -NoTypeInformation ) -Replace "`"","" | Select -skip 1 | Out-File blahblahblah...
Agreed, Export-Csv isn't the best tool for the job. I would use sqlcmd.exe or bcp.exe, provided the SQL Server command-line tools are installed. You could also build a simple routine to create a CSV from a datatable:
$result = New-Object Text.StringBuilder
$dt = $DataSet.Tables[0]
foreach ($dr in $dt.Rows) {
    for ($i = 0; $i -lt $dt.Columns.Count; $i++) {
        # Append the column value, then a comma, or a newline after the last column
        $null = $result.Append($($dr[$i]).ToString())
        $null = $result.Append($(if ($i -eq $dt.Columns.Count - 1) { "`n" } else { "," }))
    }
}
$result.ToString()
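For completeness, a rough sketch of the sqlcmd.exe route mentioned above (server name, database, and output path are placeholders; verify the switches against your installed version):
sqlcmd -S server -d database -E -Q "SET NOCOUNT ON; SELECT * FROM et_thanks" -s ";" -W -h -1 -o "\\info\export_files\go.csv"
Here -E uses integrated security, -s sets the column separator, -W trims trailing spaces, and -h -1 suppresses the header row; drop -h -1 if you want the headers, but note that sqlcmd then also prints a dashed underline row you may need to strip.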
I get this error message when I run the Powershell script at the bottom:
Exception calling "EnumScript" with "1" argument(s): "Script failed for Table 'dbo.Product'. "
At :line:48 char:35
+ foreach ($s in $scripter.EnumScript <<<< ($tbl)) { write-host $s }
However, when I comment out the output_file line
#$output_file="C:\Product.sql"
(which won't set the Scripter options to write to a file), it works fine and outputs the INSERT statements to the console.
Here's the failing script; is there something I'm missing?
# Script INSERTs for given table
param
(
[string] $server,
[string] $database,
[string] $schema,
[string] $table,
[string] $output_file
)
$server="devdidb02"
$database="EPCTrunk_EPC"
$schema="dbo"
$table="Product"
$output_file="C:\Product.sql"
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | out-null
$srv = New-Object "Microsoft.SqlServer.Management.SMO.Server" $server
$db = New-Object ("Microsoft.SqlServer.Management.SMO.Database")
$tbl = New-Object ("Microsoft.SqlServer.Management.SMO.Table")
$scripter = New-Object ("Microsoft.SqlServer.Management.SMO.Scripter") ($server)
# Get the database and table objects
$db = $srv.Databases[$database]
$tbl = $db.tables | Where-Object {$_.schema -eq $schema -and $_.name -eq $table}
# Set scripter options to ensure only data is scripted
$scripter.Options.ScriptSchema = $false;
$scripter.Options.ScriptData = $true;
#Exclude GOs after every line
$scripter.Options.NoCommandTerminator = $true;
if ($output_file -gt "")
{
$scripter.Options.FileName = $output_file
$scripter.Options.ToFileOnly = $true
}
# Output the script
foreach ($s in $scripter.EnumScript($tbl)) { write-host $s }
I ran both your script and Keith's, and it looks like the issue is the path you are setting for the file. I was able to reproduce your error with $output_file="C:\Product.sql". When I changed the path to $output_file="$home\Product.sql", it ran just fine and gave me the file.
I am guessing the reason is that I don't have permission to write to C:\, which may be the problem you are having as well.
BTW, my home directory in this case is my user folder for my login, so I was able to find the file there.
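As a quick sanity check (the write_test.tmp file name is just illustrative), you can verify the script can write to the target directory before handing the path to the Scripter:
$outDir = Split-Path $output_file
try {
    New-Item -Path (Join-Path $outDir "write_test.tmp") -ItemType File -ErrorAction Stop | Out-Null
    Remove-Item (Join-Path $outDir "write_test.tmp")
    Write-Host "Write access to $outDir confirmed"
}
catch {
    Write-Host "Cannot write to ${outDir}: $_"
}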
FWIW I'm not able to repro the error you see using the AdventureWorks DB. The following generates the foo.sql file without any errors:
Add-Type -AssemblyName ('Microsoft.SqlServer.Smo, Version=10.0.0.0, ' + `
'Culture=neutral, PublicKeyToken=89845dcd8080cc91')
$serverName = '.\SQLEXPRESS'
$smo = new-object Microsoft.SqlServer.Management.Smo.Server $serverName
$db = $smo.Databases['AdventureWorks']
$tbl = $db.Tables | Where {$_.Schema -eq 'Production' -and `
$_.Name -eq 'Product'}
$output_file = "$home\foo.sql"
$scripter = new-object Microsoft.SqlServer.Management.Smo.Scripter $serverName
$scripter.Options.ScriptSchema = $false;
$scripter.Options.ScriptData = $true;
$scripter.Options.NoCommandTerminator = $true;
if ($output_file -gt "")
{
$scripter.Options.FileName = $output_file
$scripter.Options.ToFileOnly = $true
}
# Output the script
foreach ($s in $scripter.EnumScript($tbl)) { write-host $s }