What part of the script is printing the directory information? - sql-server

I have the following PowerShell script (Process.ps1) that reads from SQL tables and assigns the results to the variables listed:
function Query($Query) {
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = "Server=$Server;Initial Catalog=$Database;Integrated Security=SSPI"
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$SqlCmd.Connection = $SqlConnection
$SqlCmd.CommandText = $Query
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $SqlCmd
$DataSet = New-Object System.Data.DataSet
$a=$SqlAdapter.Fill($DataSet)
$SqlConnection.Close()
$DataSet.Tables[0]
}
$Result = Query "SELECT * from [$cubeTable]" | Out-GridView -Wait;
$CUBE = Query "SELECT [cube_name] FROM [$cubeTable] WHERE [cube_name] = '$CUBE_input'" | Select -ExpandProperty cube_name;
$Destination_Server = Query "SELECT [destination_server] FROM [$cubeTable] WHERE [cube_name] = '$CUBE'" | Select -ExpandProperty destination_server;
$BasePath = Query "SELECT [variable_value] FROM [$pathTable] WHERE [variable_name] = 'base_path'" | Select -ExpandProperty variable_value;
$jsonPath = Join-Path -Path $BasePath -ChildPath $jsonDirectory
New-Item -ItemType Directory -Force -Path $jsonPath
$JSON_file = Join-Path $jsonPath $CUBE |
%{ ($_ + ".json") }
$processPATH = Join-Path -Path $BasePath -ChildPath $process_output_Directory
New-Item -ItemType Directory -Force -Path $processPATH
$process_output = Join-Path $processPATH $CUBE |
%{ ($_ + ".txt") }
$autosysPATH = Join-Path -Path $BasePath -ChildPath $AUTOSYS_output_Directory
New-Item -ItemType Directory -Force -Path $autosysPATH
$process_AUTOSYS_output = Join-Path $autosysPATH $CUBE |
%{ ($_ + "_process.txt") }
When I run the script from CMD through a batch file, it runs fine, but it's printing the directory information from the following variables somewhere:
$JSON_file, $process_output, and $process_AUTOSYS_output
I have a screenshot of the output here:
Where exactly is the writing to the console happening? I don't have any echo or Write-Host, let alone a function that outputs the directories...
And it's definitely not this part: $Result = Query "SELECT * from [$cubeTable]" | Out-GridView -Wait; because I commented it out and it still printed the directory info, as the screenshot shows.

New-Item returns the created FileInfo or DirectoryInfo object. That's what you're seeing in your output. PowerShell's default output formatting merges similar consecutive objects to provide more compact output, so you're getting a single table instead of three separate tables with one object each.
You can suppress the output by adding | Out-Null to the New-Item statements:
New-Item -ItemType Directory -Force -Path $jsonPath | Out-Null
Other options would be capturing the output in a variable or using redirection (> $null).
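For example, either of these (using the same $jsonPath variable from the script) keeps the DirectoryInfo object out of the output:
$null = New-Item -ItemType Directory -Force -Path $jsonPath    # capture/discard via assignment
New-Item -ItemType Directory -Force -Path $jsonPath > $null    # redirect the success stream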

Related

Duplicate SQL results from Powershell CSV export

I am trying to output SQL results to a .csv file using PowerShell, separated by their respective columns.
The script I wrote works, but it duplicates the same result three times in the CSV. Even if the SELECT statement returns only one row from the table, it is written three times to the .csv file.
I tried using PSCustomObject as well, but it throws an error and does not output anything.
Clear-Variable Results
Clear-Variable Report
[string] $query = "Select Name, Value From options with(nolock) where Name IN('ExportFolder','ImportFolder','GlobalExportFolder'); Select @@ROWCOUNT AS AffectedRows"
[string[]] $servers = @('sqlinstance=mytestdb')
foreach($server in $servers)
{
$instance = ($server -split '=')[0]
$db = ($server -split '=')[1]
Try{
$Results = Invoke-Sqlcmd -ServerInstance $instance -Database $db -Query $query
$ExportFolder = ($Results.ItemArray[1])
$GlobalExportFolder = ($Results.ItemArray[3])
$ImportFolder = ($Results.ItemArray[5])
$Array = '$ExportFolder','$GlobalExportFolder','ImportFolder'
$mail = $Array | Select-Object @{n="SQLServer";e={$instance}},@{n="DBName";e={$db}}, @{n="ExportFolder";e={$ExportFolder}}, @{n="GlobalExportFolder";e={$GlobalExportFolder}}, @{n="ImportFolder";e={$ImportFolder}}
$mail | Export-Csv -Path "C:\Users\localadmin\Documents\Logs\HostNameCheck.csv" -NoTypeInformation -Append -Verbose
$Report += $Results
$Report | Select Name, Value | Export-Csv -Path "C:\Users\localadmin\Documents\Logs\SFTPHostnameModificationCheck.csv" -NoTypeInformation -Append -Verbose
Clear-Variable Results
Clear-Variable Report
}
Catch {
Write-Host ("Error: Data retrieval failed against instance $instance for $db" + " - " + (Get-Date)) -ForegroundColor Red
Write-Output ("Error: Data retrieval failed against $instance on $db" + " - " + (Get-Date)) | Out-File -FilePath $PathFailedLogs -Append
}
}
$attachment = Get-ChildItem -Path "C:\Users\localadmin\Documents\Logs" -Include *.csv -Recurse -Force
Send-MailMessage -From "test@test.com" -To "localadmin@nonprod.com" -Subject "SFTPHostnameCheck" -SmtpServer "localrelay@local.com" -Attachments $attachment
Using PSCustomObject
$obj = New-Object [PSCustomObject] -Property @{
'SQLServer' = $instance
'DBName' = $db
'ExportFolder' = "$ExportFolder
'GlobalExportFolder' = $GlobalExportFolder
'ImportFolder' = $ImportFolder"
}
$list += $obj
$list | Export-Csv -Path "C:\Users\localadmin\Documents\Logs\HostNamecheck.csv" -NoTypeInformation -Append -Verbose
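As a side note on the triplication in the first export: piping the three-element $Array through Select-Object emits one output object per input element, each with the same calculated properties, which matches the three identical rows. A minimal sketch that builds a single object per server instead (reusing the variable names from the question) could look like this:
$mail = [PSCustomObject]@{
    SQLServer          = $instance
    DBName             = $db
    ExportFolder       = $ExportFolder
    GlobalExportFolder = $GlobalExportFolder
    ImportFolder       = $ImportFolder
}
$mail | Export-Csv -Path "C:\Users\localadmin\Documents\Logs\HostNameCheck.csv" -NoTypeInformation -Append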

how to execute multiple Invoke-Sqlcmd in one transaction?

I would like to perform a bunch of Invoke-Sqlcmd calls in one SQL transaction. Here's what I'm doing:
try{
$scope = New-Object -TypeName System.Transactions.TransactionScope
GetFiles $SqlFilesDirectory
$scope.Complete()
}
catch{
$_.exception.message
}
finally{
$scope.Dispose()
}
Here's how GetFiles is defined:
#
# Get SQL Files recursively
#
function GetFiles($path = $pwd)
{
$subFolders = Get-ChildItem -Path $path -Directory | Select-Object FullName,Name | Sort-Object -Property Name
$sqlFiles = Get-ChildItem -Path $path -Filter *.sql | Select-Object FullName,Name | Sort-Object -Property Name
foreach ($file in $sqlFiles)
{
Write-Host "file: " $file.Name
Invoke-Sqlcmd -ServerInstance $ServerInstance -Database $DBName -Username $SvcAdminAccount -Password $SvcAdminPassword -InputFile $file.FullName -QueryTimeout 65535
}
foreach ($folder in $subFolders)
{
Write-Host "`nGetting files for subfolder: " $folder.Name
GetFiles $folder.FullName
}
}
How do we perform a series of Invoke-Sqlcmd calls in one transaction?
Here's the output:
The behavior that I want is that ALL changes are rolled back if a single SQL script fails.
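One way to get that all-or-nothing behavior, sketched below, is to bypass Invoke-Sqlcmd (which opens its own connection per call) and run every script on a single SqlConnection/SqlTransaction from System.Data.SqlClient. This is only a sketch using the variables from the question; it assumes the .sql files contain no GO batch separators, which SqlCommand does not understand, and it flattens the recursive ordering into a simple sort by full path:
$conn = New-Object System.Data.SqlClient.SqlConnection "Server=$ServerInstance;Database=$DBName;User ID=$SvcAdminAccount;Password=$SvcAdminPassword"
$conn.Open()
$tran = $conn.BeginTransaction()
try {
    Get-ChildItem -Path $SqlFilesDirectory -Filter *.sql -Recurse | Sort-Object FullName | ForEach-Object {
        Write-Host "file: " $_.Name
        $cmd = $conn.CreateCommand()
        $cmd.Transaction = $tran
        $cmd.CommandText = Get-Content -Path $_.FullName -Raw
        [void]$cmd.ExecuteNonQuery()
    }
    $tran.Commit()      # every script succeeded
}
catch {
    $tran.Rollback()    # any failure undoes all of them
    $_.Exception.Message
}
finally {
    $conn.Close()
}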

What is the best format to export bigger output?

I have a big file server with a large file and folder tree, and I need to export the NTFS permissions. I used the following script:
$FolderPath = Get-ChildItem -Path C:\FS -Filter * -Recurse -Force
ForEach ($Folder in $FolderPath) {
$Acl = Get-Acl -Path $Folder.FullName
ForEach ($Access in $Acl.Access) {
$Properties = [ordered]@{'Folder Name'=$Folder.FullName;'Group/User'=$Access.IdentityReference;'Permissions'=$Access.FileSystemRights;'Inherited'=$Access.IsInherited}
New-Object -TypeName PSObject -Property $Properties
}
}
What format do you recommend for getting the results out of the script? I think CSV is a very good format, but I don't know if it is the right choice.
You can write the output in CSV format, which makes it easier to process later in Excel or elsewhere.
$FolderPath = Get-ChildItem -Path C:\FS -Filter * -Recurse -Force
$collection = @() # Define collection
ForEach ($Folder in $FolderPath) {
$Acl = Get-Acl -Path $Folder.FullName
ForEach ($Access in $Acl.Access) {
$Properties = [ordered]@{'Folder Name'=$Folder.FullName;'Group/User'=$Access.IdentityReference;'Permissions'=$Access.FileSystemRights;'Inherited'=$Access.IsInherited}
$collection += New-Object -TypeName PSObject -Property $Properties
}
}
$collection | Export-Csv -LiteralPath C:\ACLInformation.csv -NoTypeInformation -Encoding UTF8
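On a large tree the $collection += pattern rebuilds the array for every ACE, which gets slow. A streaming variant (same logic, nothing held in memory) pipes each object straight to Export-Csv:
Get-ChildItem -Path C:\FS -Recurse -Force | ForEach-Object {
    $Folder = $_
    (Get-Acl -Path $Folder.FullName).Access | ForEach-Object {
        [PSCustomObject]@{
            'Folder Name' = $Folder.FullName
            'Group/User'  = $_.IdentityReference
            'Permissions' = $_.FileSystemRights
            'Inherited'   = $_.IsInherited
        }
    }
} | Export-Csv -LiteralPath C:\ACLInformation.csv -NoTypeInformation -Encoding UTF8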

How to format output when exporting SQL query to CSV

I have a task to save the results of a SQL Server query to a .csv file. After some googling I decided to use PowerShell. I found a script and modified it a bit; it works and almost everything is OK.
$server = "server"
$database = "database"
$query = "SELECT * from et_thanks"
$tod = Get-Date;
$file = "{0:yyyyMMdd}_go.csv" -f $tod;
$extractFile = @"
\\info\export_files\$file
"@
$connectionTemplate = "Data Source={0};Integrated Security=SSPI;Initial Catalog={1};"
$connectionString = [string]::Format($connectionTemplate, $server, $database)
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$command = New-Object System.Data.SqlClient.SqlCommand
$command.CommandText = $query
$command.Connection = $connection
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $command
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet)
$connection.Close()
$DataSet.Tables[0] | Export-Csv -Force -Delimiter ";" $extractFile
But I have 2 problems which I can't solve:
When I open the .csv file I see column headers and a commented string on the first line:
#TYPE System.Data.DataRow
"ob_no","c_name","c_visible","c_fp","e_from","e_to"
"436439","09.09.2013 11:29:08","0","","10937","260153"
How can I get rid of it?
All values are surrounded by quotes. Is it possible to modify the script not to use them while exporting? A blanket replace isn't a good idea, because the quote character may appear in the SQL data.
I tried to find answers in documentation (http://ss64.com/ps/export-csv.html) but with no success.
You might run into trouble removing the quotes, but if that's what you really want then the following should achieve it.
-NoTypeInformation will remove the additional type information you are seeing.
($DataSet.Tables[0] | ConvertTo-Csv -Delimiter ";" -NoTypeInformation) -replace "`"", "" | `
Out-File -Force $extractFile
This uses ConvertTo-Csv to convert to a string representation of the CSV, replaces all instances of " with nothing, and pipes the final string to Out-File.
...and, to get rid of the header record, first convert the data to CSV (ConvertTo-Csv), then pipe those results to Select to skip the first record:
($DataSet.Tables[0] | ConvertTo-Csv -Delimiter "`t" -NoTypeInformation ) -Replace "`"","" | Select -skip 1 | Out-File blahblahblah...
Agreed, Export-Csv isn't the best tool for the job. I would use sqlcmd.exe or bcp.exe, provided the SQL Server command-line tools are installed. You could also build a simple routine to create a CSV from a DataTable:
$result = new-Object text.stringbuilder
$dt = $DataSet.Tables[0]
foreach ($dr in $dt.Rows) {
for ($i = 0; $i -lt $dt.Columns.Count; $i++) {
$null = $result.Append($($dr[$i]).ToString())
$null = $result.Append($(if ($i -eq $dt.Columns.Count - 1) {"`n" } else { ","} ))
}
}
$result.ToString()
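For reference, a rough sqlcmd.exe equivalent of the export might look like this (the output path is illustrative; -s sets the column separator, -W trims trailing spaces, and SET NOCOUNT ON suppresses the "rows affected" trailer, though sqlcmd still prints a dashed underline row beneath the headers):
sqlcmd -S server -d database -Q "SET NOCOUNT ON; SELECT * FROM et_thanks" -s ";" -W -o "\\info\export_files\go.csv"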

Powershell - DataSet contains the number of records

I'm seeing some odd behavior. On my machine, PowerShell returns the recordset and I can iterate through the records with no problem. My co-worker's machine (he has access to the file share that I need to copy the files from) is getting a record count returned instead of the actual records. I must be missing something easy. Any idea why I'm seeing this different behavior?
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = "Server = server; Database = db; Integrated Security = True"
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$SqlCmd.CommandText = "SELECT fileName from SomeTable"
$SqlCmd.Connection = $SqlConnection
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $SqlCmd
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet)
$Table = new-object data.datatable
$Table = $DataSet.tables[0]
$SqlConnection.Close()
function Out-FileForce {
PARAM($path)
PROCESS
{
if(Test-Path $path)
{
Out-File -inputObject $_ -append -filepath $path
}
else
{
new-item -force -path $path -value $_ -type file
}
}
}
foreach ($Row in $Table.Rows)
{
$fullPath = $Row.FullFilePathWithName
$path = "\\server\folder\"
$newPath = "C:\newFolder\"
$newDestination = $fullPath -replace [regex]::Escape($path), $newPath
#Write-Output $newDestination
#Write-Output $fullPath
# recurse should force the creation of the folder structure
#Copy-Item $fullPath $newDestination -recurse
Out-FileForce $newDestination
Copy-Item $fullPath $newDestination -force
Write-Output $newDestination " done"
}
This line:
$SqlAdapter.Fill($DataSet)
returns the row count. If you would like that for later, assign it to something:
$rowCount = $SqlAdapter.Fill($DataSet)
or, if you don't require it:
[void]$SqlAdapter.Fill($DataSet)
Both of the above will avoid the need to skip the first item.
Hope this helps
Figured out a fix, though I'm still not sure why.
$DataSet.Tables.Rows was causing the issue.
Fixing the original script I posted.
Added this to the top:
$Table = new-object data.datatable
$Table = $DataSet.tables[0]
Then in my loop I used $Table.Rows
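Putting the two answers together, a minimal shape for the fixed script (based on the code in the question) is:
[void]$SqlAdapter.Fill($DataSet)        # discard the row count Fill() returns
$SqlConnection.Close()
foreach ($Row in $DataSet.Tables[0].Rows)
{
    # ... existing copy logic using $Row ...
}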
