I would like to run a series of Invoke-Sqlcmd calls in one SQL transaction. Here's what I'm doing:
try {
    $scope = New-Object -TypeName System.Transactions.TransactionScope
    GetFiles $SqlFilesDirectory
    $scope.Complete()
}
catch {
    $_.Exception.Message
}
finally {
    $scope.Dispose()
}
Here's how GetFiles is defined:
#
# Get SQL Files recursively
#
function GetFiles($path = $pwd)
{
    $subFolders = Get-ChildItem -Path $path -Directory | Select-Object FullName, Name | Sort-Object -Property Name
    $sqlFiles   = Get-ChildItem -Path $path -Filter *.sql | Select-Object FullName, Name | Sort-Object -Property Name

    foreach ($file in $sqlFiles)
    {
        Write-Host "file: " $file.Name
        Invoke-Sqlcmd -ServerInstance $ServerInstance -Database $DBName -Username $SvcAdminAccount -Password $SvcAdminPassword -InputFile $file.FullName -QueryTimeout 65535
    }

    foreach ($folder in $subFolders)
    {
        Write-Host "`nGetting files for subfolder: " $folder.Name
        GetFiles $folder.FullName
    }
}
How do we perform a series of Invoke-Sqlcmd calls in one transaction? The behavior that I want is that ALL changes are rolled back if a single SQL script fails.
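One approach, shown here only as an untested sketch rather than a drop-in replacement, is to skip TransactionScope and run every script on a single connection inside an explicit SqlTransaction, rolling everything back if any script throws. It assumes the .sql files contain plain T-SQL without GO batch separators (GO is a client-side keyword that ExecuteNonQuery does not understand), and it orders the files by full path rather than folder by folder as GetFiles does.
$connStr = "Server=$ServerInstance;Database=$DBName;User ID=$SvcAdminAccount;Password=$SvcAdminPassword"
$conn = New-Object System.Data.SqlClient.SqlConnection $connStr
$conn.Open()
$tran = $conn.BeginTransaction()
try {
    Get-ChildItem $SqlFilesDirectory -Filter *.sql -Recurse |
        Sort-Object FullName |
        ForEach-Object {
            Write-Host "file: " $_.Name
            $cmd = $conn.CreateCommand()
            $cmd.Transaction    = $tran
            $cmd.CommandTimeout = 0                             # no timeout, like the original QueryTimeout 65535
            $cmd.CommandText    = Get-Content $_.FullName -Raw  # -Raw requires PowerShell 3+
            [void]$cmd.ExecuteNonQuery()
        }
    $tran.Commit()
}
catch {
    $_.Exception.Message
    $tran.Rollback()
}
finally {
    $conn.Dispose()
}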
I'm basically trying to build a routine that reads a named folder directory, builds a CSV file, then reads that CSV file back in, manipulates some of the properties to split data into new columns, and exports that to another CSV.
This I have achieved with the following code:
$Folder = Read-Host 'Please enter a folder path'
$File = Read-Host 'Please enter a filename'
$OutputFile = $Folder + '\' + $File + '.csv'
$SplitFile = $Folder + '\' + $File + '_Split.csv'
$CopyDir = $Folder + '\WantedDocs\'
Get-ChildItem $Folder -Recurse -Include *.* |
select Directory, FullName, Name |
Export-Csv -Delimiter ',' -NoTypeInformation $OutputFile
$a = Import-Csv $OutputFile
$values = $a.Name.Split("_")
$a | Add-Member 'CliCode' -NotePropertyValue $values[3]
$a | Add-Member 'CopyDir' -NotePropertyValue $CopyDir
$a | Select Directory, FullName, Name, CliCode, CopyDir |
Export-Csv -Delimiter ',' -NoTypeInformation $SplitFile
Excuse me if my terminology isn't right, but I am now looking to build a batch file full of xcopy commands, using item values from the properties, e.g.:
xcopy 'C:\Test\OriginalDocs\A\file1_A_A_12345678.txt' 'C:\Test\WantedDocs\*' /Y
When you use Import-Csv and assign it to a variable, that variable contains various properties, with each property containing an array of values taken from each line in the CSV file.
In my example the variable $a has properties called "Directory", "FullName" and "Name", the headers of the 3 columns in my CSV file.
If my CSV file contains these lines:
"Directory","FullName","Name"
"C:\Test\OriginalDocs\A","C:\Test\OriginalDocs\A\file1_A_A_12345678.txt","file1_A_A_12345678.txt"
"C:\Test\OriginalDocs\B","C:\Test\OriginalDocs\B\file2_B_B_43534554.txt","file1_B_B_43534554.txt"
The Directory property would be an array of 2 items: "C:\Test\OriginalDocs\A" and "C:\Test\OriginalDocs\B\"
The FullName property would be an array of 2 items: "C:\Test\OriginalDocs\A\file1_A_A_12345678.txt" and "C:\Test\OriginalDocs\B\file2_B_B_43534554.txt"
The Name property would be an array of 2 items: "file1_A_A_12345678.txt" and "file2_B_B_43534554.txt"
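Note that $a itself is an array of row objects, so indexing into $a keeps the values from the same CSV line paired up; for example, with the sample data above:
$a[0].FullName   # C:\Test\OriginalDocs\A\file1_A_A_12345678.txt
$a[0].Name       # file1_A_A_12345678.txt
$a[1].Name       # file2_B_B_43534554.txt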
What I want to know is how I would be able to select all the [0] items in the array for each property and build the xcopy command. For example, if I do this:
$xc1 = "xcopy '"
$xc2 = $a.FullName
$xc3 = "' '"
$xc4 = $a.CopyDir
$xc5 = $a.CliCode
$xc6 = "\*' /Y"
$xcopy = $xc1 + $xc2 + $xc3 + $xc4 + $xc5+ $xc6
The resulting $xcopy variable contains all the array values, e.g. for the example above the $xcopy variable ends up with the value:
xcopy 'C:\Test\OriginalDocs\A\file1_A_A_12345678.txt C:\Test\OriginalDocs\B\file2_B_B_43534554.txt' 'C:\Test\OriginalDocs\WantedDocs\ C:\Test\OriginalDocs\WantedDocs\12345678 43534554\*' /Y
What I want to achieve is to effectively do this with the [0] array values from each selected property:
$xc1 = "xcopy '"
$xc2 = $a.FullName[0]
$xc3 = "' '"
$xc4 = $a.CopyDir[0]
$xc5 = $a.CliCode[0]
$xc6 = "\*' /Y"
$xcopy = $xc1 + $xc2 + $xc3 + $xc4 + $xc5+ $xc6
Write the $xcopy variable to the text file (using Add-Content I believe)
Then do the same with the [1] array values:
$xc1 = "xcopy '"
$xc2 = $a.FullName[1]
$xc3 = "' '"
$xc4 = $a.CopyDir[1]
$xc5 = $a.CliCode[1]
$xc6 = "\*' /Y"
$xcopy = $xc1 + $xc2 + $xc3 + $xc4 + $xc5+ $xc6
And so on until all items in the arrays are dealt with.
So producing a text/batch file with a line for each item in the arrays, i.e. all the [0] items, all the [1] items, etc.
Using my example above I'd get a text file like below.
xcopy 'C:\Test\OriginalDocs\A\file1_A_A_12345678.txt' 'C:\Test\OriginalDocs\WantedDocs\12345678\*' /Y
xcopy 'C:\Test\OriginalDocs\B\file2_B_B_43534554.txt' 'C:\Test\OriginalDocs\WantedDocs\43534554\*' /Y
I've been looking at foreach and ForEach-Object but so far I've not found anything that works for my needs. Maybe it can't be done?
To work line by line, use foreach (DoSomethingLikeCopy below is just a placeholder for the copy operation):
foreach ($Line in $a){ DoSomethingLikeCopy $Line.FullName to "$CopyDir\$($Line.CliCode)" }
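Applied to the batch-file goal from the question, that loop might look something like the sketch below. It assumes $a already carries the CliCode and CopyDir properties added earlier, and the output file name is only an example; note also that cmd.exe/xcopy expect double quotes around paths, not the single quotes shown above.
$BatchFile = Join-Path $Folder 'copy_docs.bat'   # hypothetical output file name
foreach ($Line in $a) {
    # builds e.g.: xcopy "C:\Test\OriginalDocs\A\file1_A_A_12345678.txt" "C:\Test\OriginalDocs\WantedDocs\12345678\*" /Y
    $xcopy = 'xcopy "' + $Line.FullName + '" "' + $Line.CopyDir + $Line.CliCode + '\*" /Y'
    Add-Content -Path $BatchFile -Value $xcopy
}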
Instead of xcopy, use New-Item to create a new text file with the content of the old file, or to create the folder for the new item:
Get-Content -Path 'C:\Test\OriginalDocs\A\file1_A_A_12345678.txt' -raw | New-Item -Path 'C:\Test\OriginalDocs\WantedDocs\12345678\file1_A_A_12345678.txt' -Force
or
New-Item -Path 'C:\Test\OriginalDocs\WantedDocs\12345678' -ItemType Directory
It's pointless to export data to a CSV that you're reading back right away. Simply use a pipeline. Also, xcopy is an external command and can be run directly from PowerShell, so there's no need to have PowerShell create a batch file first.
This should be all you need:
$Folder = Read-Host 'Please enter a folder path'
Get-ChildItem $Folder -Recurse | ForEach-Object {
    $clicode = $_.BaseName.Split("_")[-1]
    & xcopy $_.FullName "${Folder}\WantedDocs\${clicode}\*" /y
}
If you indeed want output CSV files for every step of the way, you can do something like this:
# YOU NEED TO ADD CODE FOR CHECKING THE USER INPUT
# What I'm doing here is very rudimentary..
do {
    $Folder = Read-Host 'Please enter a folder path'
} while (-not (Test-Path -Path $Folder -PathType Container))

$File = Read-Host 'Please enter a filename (no extension)'
# at the very least sanitize the given filename, and get only the name without extension
$BaseName = [System.IO.Path]::GetFileNameWithoutExtension($File)

$OutputFile = Join-Path -Path $Folder -ChildPath ($BaseName + '.csv')
$SplitFile  = Join-Path -Path $Folder -ChildPath ($BaseName + '_Split.csv')
$CopyDir    = Join-Path -Path $Folder -ChildPath 'WantedDocs'

# collect the files and get the properties Directory, FullName and Name
$a = Get-ChildItem $Folder -Recurse -Include *.* -File | Select-Object Directory, FullName, Name

# write the first CSV file:
$a | Export-Csv -Path $OutputFile -Delimiter ',' -NoTypeInformation

# redefine the collection to add extra properties CliCode, CopyDir and Result
$a = $a | Select-Object *, CliCode, CopyDir, Result

# loop through the collection
$a | ForEach-Object {
    # the automatic variable $_ is a single object in the collection
    # get the CliCode from the Name property:
    # if the filename is "file1_A_A_12345678.txt", the CliCode will be "12345678"
    if ($_.Name -match '([^_.]+)\..*$') {
        $cliCode   = $matches[1]
        $targetDir = Join-Path -Path $CopyDir -ChildPath $cliCode
        $_.CliCode = $cliCode      # example: "12345678"
        $_.CopyDir = $targetDir    # example: "C:\Test\WantedDocs\12345678"
        # copy the file, but create the target folder first if it does not exist
        if (-not (Test-Path -Path $targetDir -PathType Container)) {
            New-Item -Path $targetDir -ItemType Directory | Out-Null
        }
        Copy-Item -Path $_.FullName -Destination $targetDir
        $_.Result = "OK"
    }
    else {
        # show the error and add "Failure" to the Result property
        Write-Warning "Skipped file '$($_.FullName)'. Reason: CliCode not found"
        $_.Result = "Failure"
    }
}

# output the results of the copy as CSV file
$a | Export-Csv -Path $SplitFile -Delimiter ',' -NoTypeInformation
When done, the files are copied to the new locations and you'll have two CSV files:
The first 'Something.csv' before the copy:
"Directory","FullName","Name"
"D:\Test\OriginalDocs\A","D:\Test\OriginalDocs\A\file1_A_A_12345678.txt","file1_A_A_12345678.txt"
"D:\Test\OriginalDocs\B","D:\Test\OriginalDocs\B\file2_B_B_43534554.txt","file2_B_B_43534554.txt"
and the second 'Something_Split.csv' after the copy:
"Directory","FullName","Name","CliCode","CopyDir","Result"
"D:\Test\OriginalDocs\A","D:\Test\OriginalDocs\A\file1_A_A_12345678.txt","file1_A_A_12345678.txt","12345678","D:\Test\OriginalDocs\WantedDocs\12345678","OK"
"D:\Test\OriginalDocs\B","D:\Test\OriginalDocs\B\file2_B_B_43534554.txt","file2_B_B_43534554.txt","43534554","D:\Test\OriginalDocs\WantedDocs\43534554","OK"
The 'Result' column will display "Failure" if the filename did not contain a CliCode in the name, otherwise "OK".
Thank you for all the replies. Using a combination of what has been advised, I now have the solution I need.
Many thanks for all the assistance. I've added an if/else section to the file processing because I am only interested in files that follow a specific naming convention (xx_xx_xx_clicode_xxx.ext). This is for a specific project where I'll be supplied with thousands of files, most of which should follow the naming convention. So I'm checking the number of elements in the $values array to make sure it has at least 4 values (i.e. that [3] exists). Where it doesn't, I write the filename out to a log file.
This is the completed solution:
do {
    $Folder = Read-Host 'Please enter a folder path'
} while (-not (Test-Path -Path $Folder -PathType Container))

$File = Read-Host 'Please enter a filename (no extension)'

$OutputFile = Join-Path -Path $Folder -ChildPath ($File + '.csv')
$SplitFile  = Join-Path -Path $Folder -ChildPath ($File + '_Split.csv')
$CopyDir    = Join-Path -Path $Folder -ChildPath 'WantedDocs'
$logfile    = "log.txt"
$log        = Join-Path -Path $Folder -ChildPath $logfile

Get-ChildItem $Folder -Recurse -Include *.* | Select-Object Directory, FullName, Name | Export-Csv -Delimiter ',' -NoTypeInformation $OutputFile

$a = Import-Csv $OutputFile
$a | Add-Member 'CopyDir' -NotePropertyValue $CopyDir
$a | Select-Object Directory, FullName, Name, CopyDir | Export-Csv -Delimiter ',' -NoTypeInformation $SplitFile

foreach ($Row in $a)
{
    $values = $Row.Name.Split("_")
    if ($values.Count -gt 3)
    {
        $tempfile     = Join-Path -Path $CopyDir -ChildPath $values[3]
        $OriginalFile = $Row.FullName
        $CopyFile     = $tempfile
        New-Item -ItemType Directory -Path $tempfile -Force
        Copy-Item $OriginalFile -Destination $CopyFile -Force -Recurse
    }
    else
    {
        Add-Content $log $Row.Name
    }
}

Write-Output "Finished"
Write-Output "Finished"
Many thanks once again. Much appreciated.
I'm working on a PowerShell script to read file attributes, filtered by CreationTime, on multiple shares. The script works sporadically: it works great when I use a single path, but I get mixed results when I add the folder paths to an array. The most disturbing result is when it successfully finds and reads all paths and then includes everything under C:\Windows\System32. The same anomaly occurs when the shares are empty.
So what I want to accomplish is:
Read the list of shares.
Read each share's content, filtered by the 'CreationTime' and 'Archive' attributes.
Save the results to a CSV file.
If the file is not empty, write the results to the event log.
Here is the code:
$timer = (Get-Date -Format yyy-MM-dd-HHmm)
$Date=(Get-Date).AddHours(-3)
$FolderList = "C:\Software\Scripts\FolderList.txt"
$Folders = get-content $FolderList
$Filepath = "C:\Software\Scripts"
$filename = "$Filepath\" + $timer + "OldFiles.csv"
foreach ($Folder in $Folders)
{
    Get-ChildItem $Folder |
        Where-Object { $_.CreationTime -lt $Date -and $_.Attributes -band [System.IO.FileAttributes]::Archive } |
        Select-Object Attributes, CreationTime, FullName |
        Export-Csv -Path $filename -NoTypeInformation
}
if ((Get-ChildItem $filename).Length -eq 0)
{
    exit
}
else
{
    # Write to OpsMgr Log
    $Message = Get-Content $filename
    Write-EventLog -LogName "Operations Manager" -Source "Health Service Script" -EventID 402 -EntryType Information -Message "Old files found. $Message"
}
This (untested) script might do what you want. It collects the results from all folders first and exports them once at the end, so one folder's Export-Csv no longer overwrites the output of the previous one:
$Date = (Get-Date).AddHours(-3)
$FolderList = "C:\Software\Scripts\FolderList.txt"
$Folders = Get-Content $FolderList
$Filepath = "C:\Software\Scripts"
$timer = (Get-Date -Format yyyy-MM-dd-HHmm)
$filename = Join-Path $Filepath ("{0}_OldFiles.csv" -f $timer)
$Data = foreach ($Folder in $Folders) {
    Get-ChildItem $Folder |
        Where-Object { $_.CreationTime -lt $Date -and
                       $_.Attributes -band [System.IO.FileAttributes]::Archive } |
        Select-Object Attributes, CreationTime, FullName
}

if ($Data.Count) {
    # Write to OpsMgr Log
    $Data | Export-Csv -Path $filename -NoTypeInformation
    $Message = $Data | ConvertTo-Csv
    Write-EventLog -LogName "Operations Manager" -Source "Health Service Script" `
                   -EventID 402 -EntryType Information `
                   -Message "Old files found. $Message"
}
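If you prefer to keep Export-Csv inside the loop instead, -Append (available from PowerShell 3.0) is another way to stop each folder from overwriting the previous one's rows; a minimal sketch:
foreach ($Folder in $Folders) {
    Get-ChildItem $Folder |
        Where-Object { $_.CreationTime -lt $Date -and $_.Attributes -band [System.IO.FileAttributes]::Archive } |
        Select-Object Attributes, CreationTime, FullName |
        Export-Csv -Path $filename -NoTypeInformation -Append
}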
I'm trying to create a script that will remove files from the destination path and then copy files from the source path to the destination path. I've created a .txt document on the build server with a list of files and their relative paths. When I run the block of code below, it removes all contents of Folder B and copies Folder A (without any contents) to Folder B.
This is what I'm running
$files = Get-Content "C:\files.txt"

foreach ($filepath in $files)
{
    $source = '\\Server1\Folder A' + $filepath
    $dest = '\\Server2\Folder B' + $filepath

    foreach ($removefile in $dest)
    {
        rd $removefile -Recurse -Force
    }

    foreach ($addfile in $source)
    {
        cp $addfile -Destination $dest
    }
}
Soda,
I've tried your suggestion but it's trying to remove from/copy to the incorrect directory.
Code:
$targetList = Get-Content "C:\MCSfiles.txt"

foreach ($target in $targetList) {
    $destPath = Join-Path "\\Server2\MCSWebTest" $target
    $destFiles = Get-ChildItem $destPath

    foreach ($file in $destFiles) {
        Remove-Item $file -Recurse -Force
    }

    $sourcePath = Join-Path "\\Server1\WebSites\McsWeb2" $target
    $sourceFiles = Get-ChildItem $sourcePath

    foreach ($file in $sourceFiles) {
        Copy-Item $file -Destination $destPath
    }
}
Error:
Remove-Item : Cannot find path 'C:\Program Files (x86)\Jenkins\jobs\MCSTest\workspace\App_Code' because it does not exist.
At C:\Users\SVC-VI~1\AppData\Local\Temp\jenkins5893875881898738781.ps1:9 char:19
+ Remove-Item <<<<  $file -Recurse -Force
    + CategoryInfo          : ObjectNotFound: (C:\Program File...kspace\App_Code:String) [Remove-Item], ItemNotFoundException
    + FullyQualifiedErrorId : PathNotFound,Microsoft.PowerShell.Commands.RemoveItemCommand

Copy-Item : Cannot find path 'C:\Program Files (x86)\Jenkins\jobs\MCSTest\workspace\App_Code' because it does not exist.
At C:\Users\SVC-VI~1\AppData\Local\Temp\jenkins5893875881898738781.ps1:16 char:18
+ Copy-Item <<<<  $file -Destination $destPath
    + CategoryInfo          : ObjectNotFound: (C:\Program File...kspace\App_Code:String) [Copy-Item], ItemNotFoundException
    + FullyQualifiedErrorId : PathNotFound,Microsoft.PowerShell.Commands.CopyItemCommand
Soda,
Neither of the suggestions works. It's still removing everything in the destination directory and adding the source directory folder, without files, to the destination directory. I'm a little lost here.
You are missing backslashes in your paths; use Join-Path to build your strings to avoid that (or maybe leading backslashes are included in files.txt).
You iterate over $dest and $source, but these are strings, not file collections; you should be able to retrieve the files with Get-ChildItem $dest and Get-ChildItem $source, for instance.
Also, for readability, you should not use aliases in scripts (rd and cp).
PS: I believe your script produces errors; you should include them in your question (you can edit it).
EDIT regarding comments:
Try this (untested, remove -WhatIf's to process):
$targetList = Get-Content "D:\files.txt"

foreach ($target in $targetList) {
    $destPath = Join-Path "D:\destination" $target
    Remove-Item $destPath -Recurse -Force -WhatIf

    $sourcePath = Join-Path "D:\source" $target
    Copy-Item $sourcePath -Destination $destPath -Recurse -WhatIf
}
EDIT2: I have corrected and simplified the code above; my logic was slightly off.
It could be even simpler, and better, to group the remove statements and run them all before the copy statements, like:
$targetList = Get-Content "D:\files.txt"

# remove all
$targetList | ForEach-Object {
    Remove-Item (Join-Path "D:\destination" $_) -Recurse -Force -WhatIf
}

# copy all
$targetList | ForEach-Object {
    Copy-Item (Join-Path "D:\source" $_) (Join-Path "D:\destination" $_) -Recurse -Force -WhatIf
}
Both snippets have been tested with a sample folder structure.
Just for the sake of completeness, the error you got with my first attempt was due to passing the $file objects to the processing cmdlets, instead of full paths ($file.FullName).
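To illustrate that last point, either pipe the items into the cmdlet (so the provider path is bound from the object itself) or pass $file.FullName explicitly; a short sketch of both forms:
# pipe the FileInfo/DirectoryInfo objects straight into Remove-Item...
Get-ChildItem $destPath | Remove-Item -Recurse -Force
# ...or hand over the full path instead of the object
foreach ($file in $destFiles) {
    Remove-Item -LiteralPath $file.FullName -Recurse -Force
}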
The issue is you are trying to loop too many times. The outer loop processes the files one at a time, so there should be no inner loop. Also, you aren't validating that the file exists before issuing a deletion.
$files = Get-Content "C:\test.txt"
$source_base = "C:\TEMP\source\"
$dest_base = "C:\TEMP\dest\"

foreach ($filepath in $files)
{
    $source = $source_base + $filepath
    $dest = $dest_base + $filepath

    if (Test-Path $dest) { rd $dest -Recurse -Force }
    cp $source -Destination $dest
}
I have this bit of PowerShell script, but I can't get the $_.DirectoryName part to behave as expected.
1,2,3 |
foreach {
    $count = $_;
    $x = gci -Path \\myserver-web$count\d$\IISLogs\ -include *.log -recurse
    $x | Copy-Item -Destination D:\ServerLogsAndBackups\IIS\w$count\$_.DirectoryName_$_.Name -whatIf
}
When I run this, though, I get:
What if: Performing operation "Copy File" on Target "Item: \\myserver-web1\d$\IISLogs\W3SVC1165836668\ex101224.log Destination: D:\ServerLogsAndBackups\IIS\w1\1.DirectoryName_1.Name".
What I want it to be is
W3SVC1165836668_ex101206.log
where my directory structure is like:
\\myserver-web1\d$\IISLogs\W3SVC1165836668
\\myserver-web1\d$\IISLogs\W3SVC1165837451
\\myserver-web1\d$\IISLogs\W3SVC1165836966
\\myserver-web1\d$\IISLogs\W3SVC1165812365
with files called ex101206.log in each folder
Cheers
You need to evaluate the $_.DirectoryName and $_.Name parts with subexpressions inside a double-quoted string, and the destination has to be built inside a ForEach-Object block where $_ refers to the current file. Like so:
1,2,3 |
foreach {
    $count = $_;
    gci -Path \\myserver-web$count\d$\IISLogs\ -include *.log -recurse | % {
        $dirName = $_.DirectoryName.Substring($_.DirectoryName.LastIndexOf("\") + 1)
        $logName = $_.Name
        $_ | Copy-Item -Destination $("D:\ServerLogsAndBackups\IIS\w" + $count + "\" + $dirName + "_" + $logName) -whatif
    }
}
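A minor variation on the same idea: the leaf folder name can also be taken with Split-Path -Leaf instead of Substring/LastIndexOf (an untested sketch, assuming the same servers and destination layout):
1,2,3 | ForEach-Object {
    $count = $_
    Get-ChildItem -Path "\\myserver-web$count\d$\IISLogs\" -Include *.log -Recurse | ForEach-Object {
        $dirLeaf = Split-Path $_.DirectoryName -Leaf   # e.g. W3SVC1165836668
        Copy-Item $_.FullName -Destination "D:\ServerLogsAndBackups\IIS\w$count\${dirLeaf}_$($_.Name)" -WhatIf
    }
}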
}