I am looking for some assistance with a PowerShell GUI I am building. I have a function that polls the bandwidth of a remote machine, and I am trying to figure out how to get it to output to a textbox. What I currently get back from running this function by itself is below; I would like just the value to display in the textbox.
InstanceName value
------------ -----
intel[r] i350 gigabit network connection 11.85
Also, the last line after the function (Start-Monitoring) starts the loop, which runs every 10 seconds. I have other functions in my PowerShell GUI, and with Start-Monitoring outside of the function brackets it does not work correctly. I am assuming that taking the Start-Monitoring call at the bottom out and attaching it to a textbox would fix this.
Function Start-Monitoring
{
$Username = 'domain\user'
$Password = 'password'
$pass = ConvertTo-SecureString -AsPlainText $Password -Force
$SecureString = $pass
# Uses your password securely
$MySecureCreds = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $Username, $SecureString
While ($true)
{
# Do things lots
Invoke-command {Get-Counter -Counter "\Network Interface(intel[r] i350 gigabit network connection)\Bytes Received/sec" -sampleinterval 6 |select -exp countersamples|ft -a instancename,@{l="value";e={[math]::round($_.cookedvalue/.1MB,2)}}} -Credential $MySecureCreds -Verbose -ComputerName ipaddress
# Add a pause so the loop doesn't run super fast and use lots of CPU
Start-Sleep 10
}
}
Start-Monitoring
This is the textbox I built for it
#BandTextBox
#
$BandTextBox.Location = '175, 75'
$BandTextBox.Name = 'BandTextBox'
$BandTextBox.Size = '40, 20'
$BandTextBox.TabIndex = 4
$BandTextBox.Text = ''
#
Any help is greatly appreciated!
Save the output in a variable and update the textbox.
Function Start-Monitoring
{
$Username = 'domain\user'
$Password = 'password'
$pass = ConvertTo-SecureString -AsPlainText $Password -Force
$SecureString = $pass
# Uses your password securely
$MySecureCreds = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $Username, $SecureString
While ($true) {
# Do things lots
Invoke-command {Get-Counter -Counter "\Network Interface(intel[r] i350 gigabit network connection)\Bytes Received/sec" -sampleinterval 6 |select -exp countersamples|ft -a instancename,@{l="value";e={[math]::round($_.cookedvalue/.1MB,2)}}} -Credential $MySecureCreds -Verbose -ComputerName ipaddress
# Add a pause so the loop doesn't run super fast and use lots of CPU
Start-Sleep 10
}
}
$Output = Start-Monitoring
$BandTextBox.Location = '175, 75'
$BandTextBox.Name = 'BandTextBox'
$BandTextBox.Size = '40, 20'
$BandTextBox.TabIndex = 4
$BandTextBox.Text = $Output.Value
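If it helps, here is a minimal sketch (my own, untested against your form) of how the pieces could be wired up so the GUI stays responsive: instead of a blocking While ($true) loop, a WinForms timer fires every 10 seconds, the counter sample is reduced to just the rounded number with Select-Object instead of Format-Table, and only that number goes into $BandTextBox.Text. The counter path, the ipaddress placeholder and $MySecureCreds are taken from your code; everything else is an assumption.
Add-Type -AssemblyName System.Windows.Forms   # usually already loaded by a WinForms GUI script
$timer = New-Object System.Windows.Forms.Timer
$timer.Interval = 10000   # fire every 10 seconds
$timer.Add_Tick({
    # $MySecureCreds is assumed to have been built as in Start-Monitoring
    $value = Invoke-Command -ComputerName ipaddress -Credential $MySecureCreds -ScriptBlock {
        Get-Counter -Counter "\Network Interface(intel[r] i350 gigabit network connection)\Bytes Received/sec" -SampleInterval 6 |
            Select-Object -ExpandProperty CounterSamples |
            Select-Object -First 1 -ExpandProperty CookedValue
    }
    # show only the rounded number, using the same scaling as the original code
    $BandTextBox.Text = [math]::Round($value / .1MB, 2)
})
$timer.Start()
Note that Get-Counter still blocks for its 6-second sample, so the form will freeze briefly on each tick; moving the call into a background job or runspace would avoid that, but that is beyond this sketch.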
Related
I am using PowerShell to check a folder; when a file gets added to the folder, it queries a SQL table, and if there hasn't been a file added in the last 6 hours it will send an email to several people letting them know that the file was copied/uploaded to that folder.
I can send the email when a file gets added, but when I added the code to check the SQL table, it stopped working. Can someone help me figure out the rest of this script?
# make sure you adjust this to point to the folder you want to monitor
$PathToMonitor = "U:\temp\test"
explorer $PathToMonitor
$FileSystemWatcher = New-Object System.IO.FileSystemWatcher
$FileSystemWatcher.Path = $PathToMonitor
$FileSystemWatcher.IncludeSubdirectories = $true
# make sure the watcher emits events
$FileSystemWatcher.EnableRaisingEvents = $true
# define the code that should execute when a file change is detected
$Action = {
$details = $event.SourceEventArgs
$Name = $details.Name
$FullPath = $details.FullPath
$OldFullPath = $details.OldFullPath
$OldName = $details.OldName
$ChangeType = $details.ChangeType
$Timestamp = $event.TimeGenerated
$JustPath = Path.GetDirectoryName($FullPath)
# SQL Work ---------------------------------------------------------------------
$Server = 'SQL01'
$Database = 'FTP_Upload'
$Connection = New-Object System.Data.SQLClient.SQLConnection
$Connection.ConnectionString = "server='$Server';database='$Database';trusted_connection=true;"
$Connection.Open()
$Command = New-Object System.Data.SQLClient.SQLCommand
$Command.Connection = $Connection
$text = "{0} was {1} at {2}" -f $FullPath, $ChangeType, $Timestamp
$sql = "IF Not Exists (Select 1 From Transmit Where DateDiff(IsNull(TimeGenerated, '01/01/2020 01:00:00 PM'), '$Timestamp') < 6 ) AND PathOnly = '$JustPath' )
BEGIN
Insert Transmit(FullPath, PathOnly, TimeGenerated)
Values('$FullPath', '$JustPath', '$Timestamp')
END "
Write-Host ""
Write-Host $text -ForegroundColor Green
Write-Host $sql
# you can also execute code based on change type here
switch ($ChangeType)
{
'Created' {
# Check SQL to see if there has been a file ftp'd in the last 6 hours ------
$Command.CommandText = $sql
$Command.ExecuteReader()
$Connection.Close()
# Send Email ---------------------------------
$EmailFrom = "email1@domain1.com"
$EmailTo = "email2@domain2.com, email3@domain3.com"
$Subject = "FTP Notification"
$Body = $text
$SMTPServer = "smtp.office365.com"
$SMTPClient = New-Object Net.Mail.SmtpClient($SmtpServer, 587)
$SMTPClient.EnableSsl = $true
$SMTPClient.Credentials = New-Object System.Net.NetworkCredential("email1@domain1.com", "password");
$SMTPClient.Send($EmailFrom, $EmailTo, $Subject, $Body)
Start-Sleep -Seconds 5
$SMTPClient.Dispose()
# this executes only when a file was created
$text = "File {0} was Created" -f $FullPath
Write-Host $text -ForegroundColor Yellow
}
default { Write-Host $_ -ForegroundColor Red -BackgroundColor White }
}
}
# add event handlers
$handlers = . {
Register-ObjectEvent -InputObject $FileSystemWatcher -EventName Changed -Action $Action -SourceIdentifier FSChange
Register-ObjectEvent -InputObject $FileSystemWatcher -EventName Created -Action $Action -SourceIdentifier FSCreate
Register-ObjectEvent -InputObject $FileSystemWatcher -EventName Deleted -Action $Action -SourceIdentifier FSDelete
Register-ObjectEvent -InputObject $FileSystemWatcher -EventName Renamed -Action $Action -SourceIdentifier FSRename
}
Write-Host "Watching for changes to $PathToMonitor"
try
{
do
{
Wait-Event -Timeout 1
Write-Host "." -NoNewline
} while ($true)
}
finally
{
# this gets executed when user presses CTRL+C
# remove the event handlers
Unregister-Event -SourceIdentifier FSChange
Unregister-Event -SourceIdentifier FSCreate
Unregister-Event -SourceIdentifier FSDelete
Unregister-Event -SourceIdentifier FSRename
# remove background jobs
$handlers | Remove-Job
# remove filesystemwatcher
$FileSystemWatcher.EnableRaisingEvents = $false
$FileSystemWatcher.Dispose()
"Event Handler disabled."
}
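Not a full answer, but one hedged sketch of how the 6-hour check might look: T-SQL DATEDIFF needs a datepart as its first argument, and SQL parameters avoid quoting problems with paths. The table and column names (Transmit, PathOnly, TimeGenerated, FullPath) are taken from the script above; the sketch also assumes $JustPath is built with [System.IO.Path]::GetDirectoryName($FullPath), since Path.GetDirectoryName on its own is not valid PowerShell.
$sql = @"
IF NOT EXISTS (
    SELECT 1 FROM Transmit
    WHERE PathOnly = @PathOnly
      AND DATEDIFF(HOUR, TimeGenerated, @Timestamp) < 6
)
BEGIN
    INSERT INTO Transmit (FullPath, PathOnly, TimeGenerated)
    VALUES (@FullPath, @PathOnly, @Timestamp)
END
"@
$Command.CommandText = $sql
[void]$Command.Parameters.AddWithValue('@FullPath',  $FullPath)
[void]$Command.Parameters.AddWithValue('@PathOnly',  $JustPath)
[void]$Command.Parameters.AddWithValue('@Timestamp', $Timestamp)
# ExecuteNonQuery returns 1 here when the row was actually inserted, i.e. nothing had
# been added for that folder in the last 6 hours, so the email would be sent only then
$inserted = $Command.ExecuteNonQuery()
This only sketches the SQL side; whether "no file in the last 6 hours" should insert-and-email or skip is an assumption about the intent.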
I need to back up several databases that are on the same server in Azure.
I currently have a script that makes the backups, but only individually;
for each one I must keep changing the names of the backups by hand.
This is my code:
Import-Module $PSScriptRoot\..\util\utilConnection.ps1;
Import-Module $PSScriptRoot\..\util\utilDate.ps1;
#Import-Module $PSScriptRoot\..\logging\Logging_Functions.ps1;
Import-Module AzureRM.sql
$TIMESTAMP = getTimeStamp;
#$LogPath = getPathLog;
#$logFileName = "prueba_jobDatabaseBackup.log";
#Log-Start -LogPath $LogPath -LogName $logFileName -ScriptVersion "1.5"
#return;
#Login-AzureRmAccount
loginRMAccount;
#Set subscription Azure
prueba;
Write-Output "";
#Create credential Source DB Server (QA)
#$myPasswordDB = ConvertTo-SecureString $SQL_ACCOUNT_PASSWORD_QA -AsPlainText -Force;
#$myCredentialDB = New-Object System.Management.Automation.PSCredential ($SQL_ACCOUNT_NAME_QA, $myPasswordDB);
#$sqlCredential = Get-Credential -Credential $myCredentialDB;
#Create credential Source DB Server (Prod)
$myPasswordDB = ConvertTo-SecureString $SQL_ACCOUNT_PASSWORD_QA -AsPlainText -Force;
$myCredentialDB = New-Object System.Management.Automation.PSCredential ($SQL_ACCOUNT_NAME_QA, $myPasswordDB);
$sqlCredential = Get-Credential -Credential $myCredentialDB;
$resourceGroup = "resGroupDB";
$serverName = "domserverqa";
$database = "prueba"; **// here I have to change the name of the backup file**
$primarykey = $STORAGE_ACCOUNT_BACKUP_KEY; #strdatabasebackup
$StorageUri = ("https://strdatabasebackup.blob.core.windows.net/strdatabasebackupblob/(2018-01-09-07:00)dbdom_buin.bacpac"); // here I also have to change the final name individually for each database
#$sqlCredential = getCredentialSQLServerQA; #SQL Server target
$SQL_SERVER_FULLNAME_QA = getSQLServerFullNameAzureQA;
$TIMEOUT = 300;
$importRequest = New-AzureRmSqlDatabaseImport -ResourceGroupName $resourceGroup -ServerName $serverName -DatabaseName $database -StorageKeytype StorageAccessKey -StorageKey $primarykey -StorageUri $StorageUri -AdministratorLogin $sqlCredential.UserName -AdministratorLoginPassword $sqlCredential.Password -Edition Basic -ServiceObjectiveName basic -DatabaseMaxSizeBytes 2147483648 # 2GB -> 2 * 1024 MB -> 2 * 1024 * 1024 KB -> 2 * 1024 * 1024 * 1024 Bytes
$importStatus = Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink;
while ($importStatus.Status -eq "InProgress")
{
$importStatus = Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink;
Write-Output ".";
[System.Threading.Thread]::Sleep(2000);
}
[System.Threading.Thread]::Sleep(4000);
How can I implement a foreach loop or an array to put all the databases together and back them up one by one without having to do it manually?
If someone has any ideas, please help. Thanks!
Pass your DB names as a list to a foreach loop or a function.
Just pass in a list of database names to a foreach loop, passing the DB name in a variable to your code.
$AzureDBNames = 'AzureDB01','AzureDB02','AzureDB03'
ForEach ($AzureDBName in $AzureDBNames)
{
# Code begins here
"Backing up $AzureDBName"
}
Turn your code into a function with a parameter that accepts one or more db names.
Function New-AzureDBBackup
{
[CmdletBinding()]
[Alias('NABB')]
Param
(
[string[]]$AzureDBNames
)
# Code begins here
}
New-AzureDBBackup -AzureDBNames 'AzureDB01','AzureDB02','AzureDB03'
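Putting that together with the code from the question, a rough, untested sketch might look like the following. The resource group, server, storage account and credential variables are the ones already defined above; New-AzureRmSqlDatabaseExport is used on the assumption that a .bacpac backup (export) is what is wanted, since the original snippet calls the import cmdlet; swap it back if importing really is the goal.
$AzureDBNames = 'AzureDB01','AzureDB02','AzureDB03'
$stamp = Get-Date -Format 'yyyy-MM-dd-HH-mm'
foreach ($AzureDBName in $AzureDBNames)
{
    # both the database name and the blob name come from the loop variable,
    # so nothing has to be renamed by hand
    $StorageUri = "https://strdatabasebackup.blob.core.windows.net/strdatabasebackupblob/($stamp)$AzureDBName.bacpac"
    $exportRequest = New-AzureRmSqlDatabaseExport -ResourceGroupName $resourceGroup -ServerName $serverName `
        -DatabaseName $AzureDBName -StorageKeytype StorageAccessKey -StorageKey $primarykey -StorageUri $StorageUri `
        -AdministratorLogin $sqlCredential.UserName -AdministratorLoginPassword $sqlCredential.Password
    # wait for this database to finish before starting the next one
    do {
        Start-Sleep -Seconds 2
        $status = Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $exportRequest.OperationStatusLink
    } while ($status.Status -eq 'InProgress')
    Write-Output "Finished $AzureDBName"
}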
Read the online help on:
About_Functions
About_For
About_Loops
About_Variables
I'm working on adapting what was a multi-class WMI query into the same thing, but rolling through multiple PCs listed in a txt file and then appending each line from the loop to a CSV.
It still works fine with one machine, but any time there is more than one machine name in the list the values don't come through as separate lines; instead they end up as arrays and each cell reads "System.Object[]".
I feel like the error is as simple as my formatting of the loop, but unfortunately my knowledge is limited enough that I'm not quite sure how to fix it. Here is the code I've adapted:
#Start
$computers = @(Get-Content -path "C:\users\kyle.ray5\desktop\TestHosts.txt")
$disk = Get-WmiObject Win32_DiskDrive -ComputerName $computers
$bios = Get-WmiObject Win32_BIOS -ComputerName $computers
$physicalMemory = Get-WmiObject Win32_PhysicalMemory -ComputerName $computers
$processor = Get-WmiObject Win32_Processor -ComputerName $computers
$video = Get-WmiObject Win32_VideoController -ComputerName $computers
$volume = Get-WmiObject Win32_Volume -ComputerName $computers
$os = Get-WmiObject Win32_OperatingSystem -ComputerName $computers
$computerSystem = Get-WmiObject Win32_ComputerSystem -ComputerName $computers
foreach($computer in $computers){
$line = @()
$o = [PSCustomObject]@{
'ComputerName' = $computerSystem.Name
'Manufacturer' = $bios.Manufacturer
'Serial Number' = $bios.SerialNumber
'Version' = $bios.Version
'Operating System' = $os.Name
'Service Pack' = $os.ServicePackMajorVersion
'CPU Manufacturer' = $processor.Manufacturer
'Processor Arch' = $processor.Architecture
'Family' = $processor.Family
'CPU Name' = $processor.NumberOfCores
'Mem Capacity' = $physicalMemory.Capacity
'Volume Label' = $volume.Label
'Volume Name' = $volume.Name
'Total Capacity' = $volume.Capacity
'Available Space' = $volume.Availability
'Disk Part' = $disk.Partitions
'Disk Size' = $disk.Size
'Disk Availability' = $disk.Availability
'Video Card' = $video.Name
'GPU Desc.' = $video.Description
'GPU' = $video.VideoProcessor
}
$line += $o
$line | Export-CSV -Append -Path "C:\users\kyle.ray5\desktop\TestQuery.CSV"
}
#end
This is the output I get when I put any more than one hostname in my list.
I tried this with and without adding the PSCustomObject to the array $line each loop and still receive this result.
Any help would be appreciated; as I said, my knowledge is still somewhat limited. Thanks!
Try the following:
$computers = #("Computer1", "Computer2")
$csvLines = #()
foreach($computer in $computers){
# IF you don't need any credentials, you can omit the -Credential parameter
$info = Invoke-Command -ComputerName $computer -Credential (Get-Credential) -ScriptBlock {
$disk = Get-WmiObject Win32_DiskDrive
$bios = Get-WmiObject Win32_BIOS
$physicalMemory = Get-WmiObject Win32_PhysicalMemory
$processor = Get-WmiObject Win32_Processor
$video = Get-WmiObject Win32_VideoController
$volume = Get-WmiObject Win32_Volume
$os = Get-WmiObject Win32_OperatingSystem
$computerSystem = Get-WmiObject Win32_ComputerSystem
$o = [PSCustomObject]@{
'ComputerName' = $computerSystem.Name
'Manufacturer' = $bios.Manufacturer
'Serial Number' = $bios.SerialNumber
'Version' = $bios.Version
'Operating System' = $os.Name
'Service Pack' = $os.ServicePackMajorVersion
'CPU Manufacturer' = $processor.Manufacturer
'Processor Arch' = $processor.Architecture
'Family' = $processor.Family
'CPU Name' = $processor.NumberOfCores
'Mem Capacity' = $physicalMemory.Capacity
'Volume Label' = $volume.Label
'Volume Name' = $volume.Name
'Total Capacity' = $volume.Capacity
'Available Space' = $volume.Availability
'Disk Part' = $disk.Partitions
'Disk Size' = $disk.Size
'Disk Availability' = $disk.Availability
'Video Card' = $video.Name
'GPU Desc.' = $video.Description
'GPU' = $video.VideoProcessor
}
# return content of $o
$o
}
$csvLines += $info
}
$csvLines | Export-Csv -LiteralPath C:\temp\test.csv
I'm using PowerShell remoting (see about_Remote), because it is a more "standardized" way to get information from remote machines. Remoting is done via Invoke-Command -ComputerName $computer; the content of -ScriptBlock is executed on the remote machine. Remoting also allows returning values from the ScriptBlock, which is done here simply by calling $o: since it is the last statement and its output is not captured or redirected anywhere, the remote PowerShell instance returns the content of $o over the network.
Be sure that you've enabled PowerShell remoting on your remote machines (see Enable-PSRemoting).
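For reference, on each target machine that usually just means running the following once from an elevated PowerShell session (assuming the default firewall rules it creates are acceptable in your environment):
# run once, elevated, on each remote machine
Enable-PSRemoting -Force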
Hope that helps.
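For completeness, a hedged sketch of the other way the original code could be fixed, without remoting: query one computer at a time inside the loop, so each property holds a single value instead of an array covering every machine. Only a few of the classes are shown; note that multi-instance classes such as Win32_PhysicalMemory or Win32_Volume can still return several objects per computer, which would need their own handling (joining the values, or one row per instance).
$computers = @(Get-Content -Path "C:\users\kyle.ray5\desktop\TestHosts.txt")
$report = foreach ($computer in $computers) {
    # one query per computer, so the properties below are single values
    $bios = Get-WmiObject Win32_BIOS -ComputerName $computer
    $os   = Get-WmiObject Win32_OperatingSystem -ComputerName $computer
    $cs   = Get-WmiObject Win32_ComputerSystem -ComputerName $computer
    [PSCustomObject]@{
        'ComputerName'     = $cs.Name
        'Manufacturer'     = $bios.Manufacturer
        'Serial Number'    = $bios.SerialNumber
        'Operating System' = $os.Name
        # ...remaining properties as in the original, each taken from a per-computer query...
    }
}
$report | Export-Csv -NoTypeInformation -Path "C:\users\kyle.ray5\desktop\TestQuery.CSV"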
With Vincent's help I was able to get this working. The issue was that my formatting of the loop was causing multiple properties to be put in one cell.
Thanks all!
I've been working on a little project in PowerShell.
My task was to create a script that will collect all files from mail attachments, merge all .pdf files into one and send the generated file to my email.
The script works completely fine in the PowerShell ISE, but when I try to run it from Task Scheduler, the merged .pdf file is corrupted without any data in it.
Keep in mind I am new to this stuff.
This is my main code that does all the heavy work:
function getAttachments
{
#########################################################
##-----------------------------------------------------##
## GET ATTACHMENTS ##
##-----------------------------------------------------##
#########################################################
##PATH TO CREDENTIAL
$credpath = "C:\Users\" + $env:UserName + "\Documents\myCred_${env:USERNAME}_${env:COMPUTERNAME}.xml"
#test variable
$test = Test-Path $credpath
##TEST IF CREDENTIAL EXISTS
if(!$test){
## USER PROMPT PSW CREDENTIAL ##
$cred = Get-Credential
#save credential in documents
$cred | Export-CliXml -Path $credpath
}else{
##READ USER CREDENTIAL FROM FILE
$cred = Import-CliXml -Path $credpath
}
##url and date variables
$url = "https://outlook.office365.com/api/v1.0/me/messages"
$d = [DateTime]::Today.AddDays(-1)
$global:date = $d.ToString("yyyy-MM-dd")
## Get all messages that have attachments where received date is greater than $date
$messageQuery = "" + $url + "?`$select=Id&`$filter=HasAttachments eq true and DateTimeReceived ge " + $date
$messages = Invoke-RestMethod $messageQuery -Credential $cred
## Loop through each results
foreach ($message in $messages.value)
{
# get attachments and save to file system
$query = $url + "/" + $message.Id + "/attachments"
$attachments = Invoke-RestMethod $query -Credential $cred
# in case of multiple attachments in email
foreach ($attachment in $attachments.value)
{
Write-Host "Found File :- " $attachment.Name
$path = "c:\Attachments\" + $attachment.Name
$Content = [System.Convert]::FromBase64String($attachment.ContentBytes)
Set-Content -Path $path -Value $Content -Encoding Byte
}
}
}
function sendAttachments
{
#############################################################
##---------------------------------------------------------##
## SEND ATTACHMENTS AND DELETE FILES ##
##---------------------------------------------------------##
#############################################################
#Connection Details
#PATH TO CREDENTIAL
$credpath = "C:\Users\" + $env:UserName + "\Documents\myCred_${env:USERNAME}_${env:COMPUTERNAME}.xml"
$cred = Import-CliXml -Path $credpath
$smtpServer = "smtp.office365.com"
$msg = new-object Net.Mail.MailMessage
#Change port number for SSL to 587
$smtp = New-Object Net.Mail.SmtpClient($SmtpServer, 25)
#Uncomment Next line for SSL
$smtp.EnableSsl = $true
$smtp.Credentials = $cred
$msg.IsBodyHtml = $true
#From Address
$msg.From = $cred.UserName
#To Address, Copy the below line for multiple recipients
$msg.To.Add("email@gmail.com")
#Message Body
$msg.Body = "<h2>All the attachments together are in the attachment of this email</h2> <br/><br/>"
#Message Subject
$msg.Subject = "no-reply: Email with all attachments"
#your file location
$files = Get-ChildItem "C:\Attachments\"
#attach the right file
$file = $global:pname
Write-Host "Attaching File :- " $file
$attachment = New-Object System.Net.Mail.Attachment -ArgumentList C:\Attachments\$file
$msg.Attachments.Add($attachment)
#send email
$smtp.Send($msg)
$attachment.Dispose();
$msg.Dispose();
#delete the files from the folder
Get-ChildItem -Path C:\Attachments -Include * -File -Recurse | foreach { $_.Delete()}
}
function mergePDF
{
#############################################################
##---------------------------------------------------------##
## MERGE ALL PDF FILES ##
##---------------------------------------------------------##
#############################################################
$workingDirectory = "C:\Attachments"
$itspath = $PSScriptRoot
$global:pname = $global:date + "_pdfAttachments.pdf"
$pdfs = ls $workingDirectory -recurse | where {-not $_.PSIsContainer -and $_.Extension -imatch "^\.pdf$"};
[void] [System.Reflection.Assembly]::LoadFrom([System.IO.Path]::Combine($itspath, 'itextsharp.dll'));
$output = [System.IO.Path]::Combine($workingDirectory, $pname);
$fileStream = New-Object System.IO.FileStream($output, [System.IO.FileMode]::OpenOrCreate);
$document = New-Object iTextSharp.text.Document;
$pdfCopy = New-Object iTextSharp.text.pdf.PdfCopy($document, $fileStream);
$document.Open();
foreach ($pdf in $pdfs) {
$reader = New-Object iTextSharp.text.pdf.PdfReader($pdf.FullName);
[iTextSharp.text.pdf.PdfReader]::unethicalreading = $true
$pdfCopy.AddDocument($reader);
$reader.Dispose();
}
$document.Close()
$pdfCopy.Dispose();
$document.Dispose();
$fileStream.Dispose();
}
getAttachments
Start-Sleep -s 10
mergePDF
Start-Sleep -s 10
sendAttachments
In this piece of code that I run in another script file, I create a new task:
#############################################################
##---------------------------------------------------------##
## SCHEDULE SCRIPTS IN WINDOWS TASKS ##
##---------------------------------------------------------##
#############################################################
##PATH TO CREDENTIAL
$credpath = "C:\Users\" + $env:UserName + "\Documents\myCred_${env:USERNAME}_${env:COMPUTERNAME}.xml"
#test variable
$test = Test-Path $credpath
##TEST IF CREDENTIAL EXISTS
if(!$test){
## USER PROMPT PSW CREDENTIAL ##
$cred = Get-Credential
#save credential in documents
$cred | Export-CliXml -Path $credpath
}
$taskName = "ManageEmailAttachments"
$taskExists = Get-ScheduledTask | Where-Object {$_.TaskName -like $taskName }
if($taskExists)
{
Get-ScheduledJob ManageEmailAttachments
Unregister-ScheduledJob ManageEmailAttachments
$wshell = New-Object -ComObject Wscript.Shell
$wshell.Popup("Task successfully deleted, run the script again to schedule the task",0,"Done",0x0)
}
else
{
$tt = Get-Date
$tt = $tt.AddMinutes(1)
$testtime = $tt.ToString("HH:mm:ss")
#set trigger
$trigger = New-JobTrigger -Daily -At "1:00"
$testtrigger = New-JobTrigger -Daily -At $testtime
#path to the scripts
$scriptPath = $PSScriptRoot + "\wps_manage_pdf_attachments.ps1"
#options(optional)
$option = New-ScheduledJobOption -WakeToRun: $true
#create a new task
Register-ScheduledJob -Name ManageEmailAttachments -FilePath $scriptPath -Trigger $testtrigger -ScheduledJobOption $option
}
The script works great when run in PowerShell: it gets all the attachments from the mailbox, merges them into one .pdf file and sends it to the requested email address. But when scheduled in Windows Task Scheduler it does the first step fine, yet the merged .pdf file is corrupted, without any content.
I couldn't figure out how to make it work so I posted a question on the forum.
Hope you guys find a way to figure it out.
Thanks in advance
Use the function below to get the script root directory.
Function Get-ScriptDirectory
{
$Invocation = (Get-Variable MyInvocation -scope 1).Value
Split-path $Invocation.MyCommand.Path
}
$scriptPath = Join-Path (Get-ScriptDirectory) 'wps_manage_pdf_attachments.ps1'
Apparently the problem was in the main code itself. I used:
Try{...}
Catch{$_ | Out-File C:\errors.txt}
in the mergePDF function to find out what the error was. It seems the path to my itextsharp.dll was incorrect: the $PSScriptRoot that I used pointed to "C:\Windows\System32" instead of where the script actually is.
So what I did instead was add a line in my batch file to copy the itextsharp.dll to %Temp%:
xcopy Scripts\itextsharp.dll %Temp% /D >NUL 2>NUL
and then read the file from there with:
$itsPath = [System.IO.Path]::GetTempPath()
And everything works as it should. I know this isn't the best way to do it, but I had this batch file before to make the script run by just double-clicking it.
So adding a little line won't hurt.
I hope this helps anyone with the same problem.
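For anyone who prefers not to copy the DLL around, a hedged alternative (assuming itextsharp.dll sits next to the .ps1 file): resolve the script's own folder once at script scope and load the assembly from there, instead of relying on $PSScriptRoot inside the function or on whatever working directory Task Scheduler happens to use.
# at the top of wps_manage_pdf_attachments.ps1, at script scope
$script:ScriptDir = Split-Path -Parent $MyInvocation.MyCommand.Path
# ...then inside mergePDF, instead of the $itspath/$PSScriptRoot line:
[void][System.Reflection.Assembly]::LoadFrom((Join-Path $script:ScriptDir 'itextsharp.dll'))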
I get this error message when I run the PowerShell script at the bottom:
Exception calling "EnumScript" with "1" argument(s): "Script failed for Table 'dbo.Product'. "
At :line:48 char:35
+ foreach ($s in $scripter.EnumScript <<<< ($tbl)) { write-host $s }
However, when I comment out the output_file line
#$output_file="C:\Product.sql"
(which won't set the Scripter options to write to file), it works fine and outputs the INSERT statements to the console.
Here's the failing script; is there something I'm missing?
# Script INSERTs for given table
param
(
[string] $server,
[string] $database,
[string] $schema,
[string] $table,
[string] $output_file
)
$server="devdidb02"
$database="EPCTrunk_EPC"
$schema="dbo"
$table="Product"
$output_file="C:\Product.sql"
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | out-null
$srv = New-Object "Microsoft.SqlServer.Management.SMO.Server" $server
$db = New-Object ("Microsoft.SqlServer.Management.SMO.Database")
$tbl = New-Object ("Microsoft.SqlServer.Management.SMO.Table")
$scripter = New-Object ("Microsoft.SqlServer.Management.SMO.Scripter") ($server)
# Get the database and table objects
$db = $srv.Databases[$database]
$tbl = $db.tables | Where-object {$_.schema -eq $schema -and $_.name -eq $table}
# Set scripter options to ensure only data is scripted
$scripter.Options.ScriptSchema = $false;
$scripter.Options.ScriptData = $true;
#Exclude GOs after every line
$scripter.Options.NoCommandTerminator = $true;
if ($output_file -gt "")
{
$scripter.Options.FileName = $output_file
$scripter.Options.ToFileOnly = $true
}
# Output the script
foreach ($s in $scripter.EnumScript($tbl)) { write-host $s }
I ran both yours and Keith's and it looks like the issue is in the path you are setting for the file. I was able to reproduce your error. You were using $output_file="C:\Product.sql". Then I changed the path to $output_file="$home\Product.sql" and it ran just fine and gave me the file.
I am guessing the reason is that I don't have permission to write to C:\, which may be the problem you are having.
BTW - my home dir in this case is my user folder for my login, so I was able to find the file there.
FWIW I'm not able to repro the error you see using the AdventureWorks DB. The following generates the foo.sql file without any errors:
Add-Type -AssemblyName ('Microsoft.SqlServer.Smo, Version=10.0.0.0, ' + `
'Culture=neutral, PublicKeyToken=89845dcd8080cc91')
$serverName = '.\SQLEXPRESS'
$smo = new-object Microsoft.SqlServer.Management.Smo.Server $serverName
$db = $smo.Databases['AdventureWorks']
$tbl = $db.Tables | Where {$_.Schema -eq 'Production' -and `
$_.Name -eq 'Product'}
$output_file = "$home\foo.sql"
$scripter = new-object Microsoft.SqlServer.Management.Smo.Scripter $serverName
$scripter.Options.ScriptSchema = $false;
$scripter.Options.ScriptData = $true;
$scripter.Options.NoCommandTerminator = $true;
if ($output_file -gt "")
{
$scripter.Options.FileName = $output_file
$scripter.Options.ToFileOnly = $true
}
# Output the script
foreach ($s in $scripter.EnumScript($tbl)) { write-host $s }