What I'd like to do is execute two stored procedures from PowerShell and save the results into a log file.
Since I have two procedures, I thought it would be easier to create one "master" procedure that executes both.
So far I have been able to execute the master procedure successfully (both child procedures ran), but only the output from the second is stored in the log file.
param([string]$datein = "string")

$OutputFile = "C:\...\ScriptOutput.txt"

# Handler that writes each InfoMessage (PRINT / RAISERROR output) to the log file
$handler = [System.Data.SqlClient.SqlInfoMessageEventHandler] { param($sender, $event) Out-File -FilePath $OutputFile -InputObject $event.Message }

$SqlConnection = New-Object System.Data.SqlClient.SqlConnection("server=ServerName;database=DB;integrated security=true")
$SqlConnection.add_InfoMessage($handler)
$SqlConnection.FireInfoMessageEventOnUserErrors = $true
$SqlConnection.Open() | Out-Null
# Execute procedures
$cmd = new-Object System.Data.SqlClient.SqlCommand("[dbo].[tst_master]", $SqlConnection)
$cmd.CommandType = [System.Data.CommandType]'StoredProcedure'
$cmd.Parameters.Add("#date_in",$datein) | Out-Null
$cmd.ExecuteNonQuery() | Out-Null
$SqlConnection.Close()
EDIT:
It's running on a virtual machine with Windows Server 2016, PowerShell 5.1, and SQL Server 2014.
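A likely culprit: Out-File overwrites its target by default, so each InfoMessage event replaces the file's contents and only the last message written survives. A minimal sketch of the handler with -Append (same $OutputFile as above, clearing the file first so repeated runs don't accumulate):

# Start each run with an empty log, then append every message as it arrives
Set-Content -Path $OutputFile -Value ''
$handler = [System.Data.SqlClient.SqlInfoMessageEventHandler] {
    param($sender, $event)
    Out-File -FilePath $OutputFile -InputObject $event.Message -Append
}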
Looking for a way to generate individual scripts for each table, including any relationships and extended properties. (After getting this working I will try to generate scripts for stored procedures and functions.)
I am aware of the SQL Server UI generator (Database => Tasks => Generate Scripts), but that produces one big file, not individual ones. If there is a way to make it produce individual files (without doing them one at a time), that would be best.
I have used the PowerShell module dbatools with some limited success. The following will create a file that contains the create script for the table and the table's extended properties, but not the column extended properties.
$server = "sql01"
$database = "MyDatabase"
$table = "MyTable"
Get-DbaDbTable -SqlInstance $server -Database $database -Table $table | Export-DbaScript -FilePath ($database + "\" + $table +".sql")
Get-DbaDbTable -SqlInstance $server -Database $database -Table $table | Get-DbaExtendedProperty | ForEach-Object { Export-DbaScript -InputObject $_ -FilePath ($database + "\" + $table +".sql") -Append }
The answer was given by Larnu in the comments: it pointed out the option in SSMS to save scripts individually, which I had been overlooking for years.
While you found the option in SSMS, I wanted to say that this is also possible using the dbatools approach you tried. The "secret sauce" is specifying a ScriptingOptions object that controls the scripting behavior.
$so = New-DbaScriptingOption
$so.ExtendedProperties = $true

foreach ($table in Get-DbaDbTable -SqlInstance . -Database AdventureWorks2019) {
    $path = '{0}.{1}.sql' -f $table.Schema, $table.Name
    Export-DbaScript -InputObject $table -FilePath $path -ScriptingOptionsObject $so
}
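The same pattern should extend to the stored procedures and functions mentioned in the question. A sketch, assuming the dbatools cmdlets Get-DbaDbStoredProcedure and Get-DbaDbUdf (verify the exact cmdlet and switch names against your dbatools version):

# Reuse the same ScriptingOptions object for programmable objects
foreach ($proc in Get-DbaDbStoredProcedure -SqlInstance . -Database AdventureWorks2019 -ExcludeSystemSp) {
    $path = '{0}.{1}.sql' -f $proc.Schema, $proc.Name
    Export-DbaScript -InputObject $proc -FilePath $path -ScriptingOptionsObject $so
}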
I have a PowerShell script that I am writing to extract all the jobs from a specific server, like:
$sqlserver = "Servername"
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo') | Out-Null
$srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $sqlserver
$jobs = $srv.JobServer.Jobs
ForEach ($job in $jobs)
{
    $jobname = $Servernaam.replace("\","_") + '_' + $job.Name.replace(" ","_").replace("\","_").replace("[","_").replace("]","_").replace(".","_").replace(":","_") + ".sql"
    $job.Script() | Out-File C:\Users\Desktop\Jobs_from_Server\Orgineel\$jobname
}
The code I have now gets all the jobs from the server and stores them in separate files.
The problem is that I only want the jobs on the Microsoft SQL Server whose step command contains .dtsx.
I tried, for example:
ForEach ($job in $jobs)
{
    $jobname = $Servernaam.replace("\","_") + '_' + $job.Name.replace(" ","_").replace("\","_").replace("[","_").replace("]","_").replace(".","_").replace(":","_") + ".sql"
    $job.Script() | Out-File C:\Users\Desktop\Jobs_from_Server\Orgineel\$jobname
    if ($jobs -like '*.dtsx*')
I also tried -Contains, and putting the condition in the foreach loop, like:
ForEach ($job in $jobs |if ($jobs -like '*.dtsx*'))
Your code has a number of typos and mixes up which objects are used where. Try this as a starter; it assumes that the first step in a job that uses SSIS is an SSIS step. Modify as needed.
The key here is checking the subsystem of the job step(s) to detect whether a step runs under the SSIS subsystem.
$sqlserver = "servername"
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo') | Out-Null
$srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $sqlserver
$jobs = $srv.JobServer.Jobs
ForEach ($job in $jobs)
{
if ($Job.JobSteps[0].Subsystem -eq "ssis") {
# Do SSIS stuff
write-output "Found an SSIS job $($job.name)"
}
}
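If the SSIS step is not guaranteed to be first, a variant that scans every step and also matches the .dtsx text from the question might look like this (a sketch; SubSystem and Command are the SMO JobStep property names, and the regex replace is just a compact form of the chained .replace calls):

ForEach ($job in $jobs)
{
    # Keep the job if any step runs under the SSIS subsystem
    # or references a .dtsx package in its command text
    $ssisSteps = $job.JobSteps | Where-Object {
        $_.SubSystem -eq "Ssis" -or $_.Command -like '*.dtsx*'
    }
    if ($ssisSteps) {
        $jobname = $sqlserver.Replace("\","_") + '_' + ($job.Name -replace '[ \\\[\]\.:]', '_') + ".sql"
        $job.Script() | Out-File C:\Users\Desktop\Jobs_from_Server\Orgineel\$jobname
    }
}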
We use an Azure Elastic Pool, resulting in multiple client databases and one master database with references to the client databases.
We already have multiple databases and are working on a new version of the code. We use EF6 Code-First.
When we make a change to our model (add a property) we create the migration file and need to call Update-Database for all existing client databases.
This is monkey work we want to skip.
I already have a Powershell script to connect to the master database and execute a query on a table. This returns the names of the child databases.
With it I can change the Web.config and replace the Template database name with the proper name of the child database.
Now I need to call Update-Database to execute the migration scripts. I'm struggling with this last part because I'm running the ps1 script outside Visual Studio, so the Update-Database command is unknown. I tried using migrate.exe, but then I get lots of errors.
I think the easiest solution is to run my script within the Package manager console but I can't figure out how to do that.
I managed to get it working. After I placed the ps1-file in the root of my code folder I could run it in the Package Manager Console using .\UpdateDatabases.ps1.
For completeness here's the script I created. I'm new to PowerShell so some optimizations might be possible.
cls
$currentPath = (Get-Item -Path ".\" -Verbose).FullName

# Read Web.config
$webConfig = $currentPath + "\<your project>\Web.config"
$doc = (Get-Content $webConfig) -as [Xml]
$DatabaseNamePrefix = $doc.configuration.appSettings.add | where {$_.Key -eq 'DatabaseNamePrefix'}

# Get the Master connection string
$root = $doc.get_DocumentElement()
foreach ($connString in $root.connectionStrings.add | where {$_.Name -eq "Master"})
{
    $masterConn = $connString.connectionString
}

# Connect to the master database
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = $masterConn

# Query the Clients table for the child database names
$SqlQuery = "select Code from Clients"
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$SqlCmd.CommandText = $SqlQuery
$SqlCmd.Connection = $SqlConnection
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $SqlCmd

# Put the query result in a dataset
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet)
$SqlConnection.Close()

foreach ($row in $DataSet.Tables[0].Rows)
{
    $clientDbName = $row[0].ToString().Trim()

    # Change Web.config
    foreach ($connString in $root.connectionStrings.add | where {$_.Name -eq "DevelopmentDb"})
    {
        $newDatabaseName = "Database=" + $DatabaseNamePrefix.value + $clientDbName + ";"
        $newConn = $connString.connectionString -replace "(Database=.*?;)", $newDatabaseName
        $connString.connectionString = $newConn
    }
    $doc.Save($webConfig)

    # Update the database
    Update-Database -ConfigurationTypeName Application
}
"Finished"
You may want to take a look at Azure Elastic Database Jobs, which is designed to work with elastic database pools.
The Elastic Database Jobs SDK also includes PowerShell components.
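As for the migrate.exe errors mentioned in the question: migrate.exe ships in the EF6 NuGet package under tools and can apply migrations without Visual Studio. A rough sketch, where the assembly name, server, and config path are placeholders to adjust (check the flag names against your EF version):

# Run from the folder holding the migrations assembly; all names below are placeholders
.\migrate.exe YourProject.dll /startUpConfigurationFile="Web.config" `
    /connectionString="Server=yourserver;Database=$($DatabaseNamePrefix.value)$clientDbName;Integrated Security=SSPI" `
    /connectionProviderName="System.Data.SqlClient"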
I'm using PowerShell and have to import data from a .csv file into an already created table in a SQL Server database. I don't need the header line from the CSV; I just want to write the data.
Here is what I have done so far:
#Setup for SQL Server connection
#SQL Server Name
$SQLServer = "APPLIK02\SQLEXPRESS"
#Database Name
$SQLDBName = "code-test"
#Create the SQL Connection Object
$SQLConn = New-Object System.Data.SQLClient.SQLConnection
#Create the SQL Command Object, to work with the Database
$SQLCmd = New-Object System.Data.SQLClient.SQLCommand
#Set the connection string on the SQL Connection Object
$SQLConn.ConnectionString = "Server=$SQLServer;Database=$SQLDBName; Integrated Security=SSPI"
#Open the connection
$SQLConn.Open()
#Handle the query with SQLCommand Object
$SQLCmd.CommandText = $query
#Provide the open connection to the Command Object as a property
$SQLCmd.Connection = $SQLConn
#Execute
$SQLReturn=$SQLCmd.ExecuteReader()
Import-module sqlps
$tablename = "dbo."+$name
Import-CSV .\$csvFile | ForEach-Object Invoke-Sqlcmd
-Database $SQLDBName -ServerInstance $SQLServer
#-Query "insert into $tablename VALUES ('$_.Column1','$_.Column2')"
#Close
$SQLReturn.Close()
$SQLConn.Close()
I wrote a blog post about using SQL with PowerShell, so you can read more about it here.
We can do this easily if you have the SQL-PS module available. Simply provide values for your database name, server name, and table, then run the following:
$database = 'foxdeploy'
$server = '.'
$table = 'dbo.powershell_test'
Import-CSV .\yourcsv.csv | ForEach-Object {Invoke-Sqlcmd `
    -Database $database -ServerInstance $server `
    -Query "insert into $table VALUES ('$($_.Column1)','$($_.Column2)')"
}
To be clear, replace Column1, Column2 with the names of the columns in your CSV.
Be sure that your CSV has the values in the same format as your SQL DB though, or you can run into errors.
When this is run, you will not see any output to the console. I would recommend querying afterwards to be certain that your values are accepted.
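For example, a quick read-back using the same values as above:

# Confirm the inserts landed by reading a few rows back
Invoke-Sqlcmd -Database $database -ServerInstance $server -Query "SELECT TOP 5 * FROM $table"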
I am trying to call a stored procedure (Ola Hallengren's maintenance script!) on a remote server (that part of the code works), and the output of the SP is the result of DBCC CHECKDB (so it appears in the Messages tab).
I tried to put together some code to capture this message output into a file on the remote server, but the file is not being created, though the SP completes fine.
$OutputFile = "\\XXX\E$\SQLAdmin\DatabaseCheckDB\ScriptOutput\ScriptOutput.txt"
$handler = [System.Data.SqlClient.SqlInfoMessageEventHandler] {param($sender, $event) Out-File -filepath $OutputFile -inputobject $event.Message };
$SqlConnection.add_InfoMessage($handler);
$SqlConnection.FireInfoMessageEventOnUserErrors = $true;
$SqlConnection = new-Object System.Data.SqlClient.SqlConnection("Server=XXX;DataBase=master;Integrated Security=SSPI")
$SqlConnection.Open() | Out-Null
$cmd = new-Object System.Data.SqlClient.SqlCommand("dbo.DatabaseIntegrityCheck", $SqlConnection)
$cmd.CommandType = [System.Data.CommandType]'StoredProcedure'
$cmd.Parameters.Add("#Databases","ALL_DATABASES") | Out-Null
$cmd.ExecuteNonQuery() | Out-Null
$SqlConnection.Close()
Can anyone see what I'm doing wrong here? Thanks in advance!
Do you have the SQL PowerShell module (sqlps) installed? If so, you can use it and pipe the output from the Verbose stream (which contains printed messages from SQL) to your file.
Invoke-Sqlcmd -Query 'DBCC CHECKDB' `
-ServerInstance '(local)' `
-Database 'tempdb' `
-Verbose 4>&1 |
Out-File c:\temp\test.txt
If that isn't an option, then I think I have spotted the problem in your original code - you wire up the InfoMessage event, but you then proceed to create a brand new SqlConnection. This new SqlConnection doesn't have the event handler on it, and so won't respond to any of the printed messages.
Try replacing
$SqlConnection.add_InfoMessage($handler);
$SqlConnection.FireInfoMessageEventOnUserErrors = $true;
$SqlConnection = new-Object System.Data.SqlClient.SqlConnection("Server=XXX;DataBase=master;Integrated Security=SSPI")
with
$SqlConnection = new-Object System.Data.SqlClient.SqlConnection("Server=XXX;DataBase=master;Integrated Security=SSPI")
$SqlConnection.add_InfoMessage($handler);
$SqlConnection.FireInfoMessageEventOnUserErrors = $true;
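One more thing to watch once the handler fires: Out-File overwrites by default, so when the procedure prints many messages only the last one will remain in the file. A small adjustment to the handler from the question (same $OutputFile):

# Append each InfoMessage instead of overwriting the file on every event
$handler = [System.Data.SqlClient.SqlInfoMessageEventHandler] {
    param($sender, $event)
    Out-File -FilePath $OutputFile -InputObject $event.Message -Append
}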