I am writing a Powershell script that does several things with a local SQL Server database.
One thing I am doing is running several SQL jobs, one after another. I run them like this:
sqlcmd -S .\ -Q "EXECUTE msdb.dbo.sp_start_job @job_name = 'Rebuild Content Asset Relationship Data'"
Is there a way to get Powershell to delay running the next job until the first one is completed?
Thanks
To get access to SQL Agent Jobs from PowerShell you can use SMO:
EDIT: For efficiency, if you are going to add this function to your script, I would take the SMO loading out of the function and place it near the top of your script (prior to this function). Reloading the assembly every time you call the function will probably slow your script down.
Function Get-SQLJobStatus
{
param ([string]$server, [string]$JobName)
# Load the SMO assembly
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | out-null
# Create object to connect to SQL Instance
$srv = New-Object "Microsoft.SqlServer.Management.Smo.Server" $server
# used to allow piping of more than one job name to function
if($JobName)
{
foreach($j in $jobName)
{
$srv.JobServer.Jobs | where {$_.Name -match $j} | Select Name, CurrentRunStatus
}
}
else #display all jobs for the instance
{
$srv.JobServer.Jobs | Select Name, CurrentRunStatus
}
} #end of Get-SQLJobStatus
Examples of ways you could use this function:
#will display all jobs on the instance
Get-SQLJobStatus MyServer
#pipe in more than one job to get status
"myJob","myJob2" | foreach {Get-SQLJobStatus -Server MyServer -JobName $_}
#get status of one job
Get-SQLJobStatus -Server MyServer -JobName "MyJob"
You could utilize this function in your script and just repeatedly call it in a while loop or something until your job status shows "Idle". At least in my head that is what I think could work :)
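For example, a minimal polling sketch built on that idea (the server name, job name, and 10-second interval are placeholders):
# Start the job, then poll the function above until the job reports Idle
sqlcmd -S .\ -Q "EXECUTE msdb.dbo.sp_start_job @job_name = 'Rebuild Content Asset Relationship Data'"
do {
    Start-Sleep -Seconds 10
    $status = (Get-SQLJobStatus -Server '.' -JobName 'Rebuild Content Asset Relationship Data').CurrentRunStatus
} while ($status -ne 'Idle')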
Sure, execute a Start-Sleep -seconds <nn> between invocations of sqlcmd.exe.
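For example (the 300-second delay is only a guess and does not confirm the first job actually finished; the second job name is a placeholder):
sqlcmd -S .\ -Q "EXECUTE msdb.dbo.sp_start_job @job_name = 'Rebuild Content Asset Relationship Data'"
Start-Sleep -Seconds 300
sqlcmd -S .\ -Q "EXECUTE msdb.dbo.sp_start_job @job_name = 'Next Job Name'"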
My suggestion would be to wrap your job in a new sproc that starts the job then waits for it to finish by continually polling its status. From the attached article, you can do something like this:
DECLARE @JobStatus INT
SET @JobStatus = 0
EXEC MSDB.dbo.sp_start_job @Job_Name = 'JobName'
SELECT @JobStatus = current_execution_status FROM OPENROWSET('SQLNCLI', 'Server=localhost;Trusted_Connection=yes;',
'EXEC MSDB.dbo.sp_help_job @job_name = ''JobName'', @job_aspect = ''JOB'' ')
WHILE @JobStatus <> 4
BEGIN
WAITFOR DELAY '00:00:05' -- short pause between polls so the loop does not hammer the server
SELECT @JobStatus = current_execution_status FROM OPENROWSET('SQLNCLI', 'Server=localhost;Trusted_Connection=yes;',
'EXEC MSDB.dbo.sp_help_job @job_name = ''JobName'', @job_aspect = ''JOB'' ')
END
Then, rather than calling sp_start_job from the command line, call your sproc from the command line and PowerShell will be blocked until that sproc finishes. Hope this helps!
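For example, if the wrapper procedure were called dbo.usp_RunJobAndWait (a hypothetical name; it would contain the T-SQL above), the command-line call becomes:
sqlcmd -S .\ -Q "EXECUTE dbo.usp_RunJobAndWait"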
Your job is executed by SQL Server Agent. Either you call the corresponding stored proc (or SQL statements) directly from your PowerShell script, or you implement something along the lines of what is explained here.
I am working on updating an agent job based on a certain string found in its job steps. I can see SMO finding the string and displaying the replacement object in memory. But when I run the replacement against the job retrieved with
$AgentJob = Get-SqlAgentjob -ServerInstance $InstanceName | where Name -Like "somestring*"
it does not actually update the agent job steps.
Foreach ($steps in $AgentJob.jobsteps)
{
$steps.Command -Replace("CurrentString1","$NewString2")
$steps.Alter()
$steps.Command -Replace("CurrentString2","$NewString2")
$steps.Alter()
$steps.Command -Replace("CurrentString3","$NewString3")
$steps.Alter()
}
You're not actually updating the command text, you're just outputting it.
Try
$steps.Command = $steps.Command -replace 'CurrentStringX', $NewStringX
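A fuller sketch of the corrected loop (assuming $NewString1, $NewString2 and $NewString3 from the question hold the replacement text):
Foreach ($steps in $AgentJob.jobsteps)
{
    # assign the result of -replace back to the Command property, then persist it
    $steps.Command = $steps.Command -replace 'CurrentString1', $NewString1
    $steps.Command = $steps.Command -replace 'CurrentString2', $NewString2
    $steps.Command = $steps.Command -replace 'CurrentString3', $NewString3
    $steps.Alter()
}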
I'm trying to figure out a good way to run insert/select/update statements against an MSSQL database through PowerShell scripting and log them thoroughly. I have managed to use Invoke-Sqlcmd, which handles the inserts and select statements fine. But my concern is capturing the logs somewhere as an output file, so that when doing inserts the log captures the number of rows affected/inserted along with the data that was inserted.
Invoke-Sqlcmd used:
PS C:\Users\moshink\Desktop\Powershell SQL> Invoke-Sqlcmd -ServerInstance $server -Database $db -Username $user -Password $pass -InputFile ".\test.sql" | Out-File -FilePath ".\test_$datetime.rpt" >> $LogFile
test.sql file query below:
use TCS_MIS
select * from tblMasterAccountType where acctType in ('0121') and intCat in ('0020')
--insert into tblMasterAccountType values ('DEP','0121','0020','PARENTHOOD ASSISTANCE ACCOUNT')
The test.sql file is the one passed to Invoke-Sqlcmd.
This file can be dynamic, with any query going into it.
The Out-File ".\test_$datetime.rpt" does capture the SELECT statement output if data matching the criteria exists, but it will be blank if there is no data.
Is there something that can be run to capture output immediately when running an INSERT .sql file/script? i.e. something that will say 20 rows inserted and list out the data that was inserted.
Basically what I'm after is thorough logging when running any .sql scripts through PowerShell. It should capture the number of rows affected, the data inserted/updated/deleted, and the user who performed it.
I want to create an MSSQL trigger that fires every day when the date changes.
For MS SQL Server Express editions, create an MS Windows job which will start sqlcmd (see https://technet.microsoft.com/en-us/library/ms165702(v=sql.105).aspx),
which will run a SQL script. Note that when sqlcmd is run from the command line, it uses the OLE DB provider.
How to create a sqlcmd job by using Windows Task Scheduler: https://support.microsoft.com/en-us/kb/2019698 . This article deals with a DB backup task; replace the SQL script at step A with the one you need and adjust the following steps accordingly.
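For example, the scheduled task could run something along these lines (the instance name and script path are assumptions):
sqlcmd -S .\SQLEXPRESS -E -i "C:\Scripts\daily_job.sql"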
You have to schedule a job in SQL Server Agent which will fire at the defined time, and put your query in that job:
Expand the SQL Server Agent node and right click the Jobs node in SQL Server Agent and select 'New Job'.
In the 'New Job' window enter the name of the job and a description on the 'General' tab.
Select 'Steps' on the left hand side of the window and click 'New' at the bottom.
In the 'Steps' window enter a step name and select the database you want the query to run against.
Paste in the T-SQL command you want to run into the Command window and click 'OK'.
Click on the 'Schedule' menu on the left of the New Job window and enter the schedule information (e.g. daily and a time).
Click 'OK' - and that should be it.
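If you prefer to script it instead of using the GUI, a rough equivalent via PowerShell and the msdb stored procedures might look like this (the job name, schedule, database and command are placeholders):
$createJob = @'
EXEC msdb.dbo.sp_add_job @job_name = N'DailyQuery';
EXEC msdb.dbo.sp_add_jobstep @job_name = N'DailyQuery', @step_name = N'Run query',
    @subsystem = N'TSQL', @database_name = N'MyDB',
    @command = N'EXEC dbo.MyDailyProcedure;';
EXEC msdb.dbo.sp_add_schedule @schedule_name = N'Daily at 00:01',
    @freq_type = 4, @freq_interval = 1, @active_start_time = 000100;
EXEC msdb.dbo.sp_attach_schedule @job_name = N'DailyQuery', @schedule_name = N'Daily at 00:01';
EXEC msdb.dbo.sp_add_jobserver @job_name = N'DailyQuery';
'@
Invoke-Sqlcmd -ServerInstance '.' -Query $createJob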
For that purpose you can use PowerShell and Task Scheduler. All actions below must be done on the machine where SQL Server is running.
First, create a .sql file with the batch to run. I call it my_batch.sql. For example, with this inside:
USE [MyDB]
INSERT INTO [dbo].[test]
([id]
,[somevalue]
,[New Column]
,[NewColumn])
VALUES
(NEWID()
,'testing'
,'test'
,'just a test')
Do not use GO in this script!
Then create a .ps1 script to run that batch file (my_batch.ps1):
$conn=new-object System.Data.SqlClient.SQLConnection
$ConnectionString = "Server=(local)\SQLEXPRESS;Database=MyDB;Integrated Security=True;Connect Timeout=0"
$conn.ConnectionString=$ConnectionString
$conn.Open()
$fileToGetContent = 'D:\my_batch.sql'
$commandText = Get-Content -Path $fileToGetContent -Raw # -Raw keeps the file as a single string so line breaks in the SQL are preserved
$command = $conn.CreateCommand()
$command.CommandText = $commandText
$command.ExecuteNonQuery()
$conn.Close()
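You can test the script manually before scheduling it (the path is the one used above):
powershell.exe -ExecutionPolicy Bypass -File D:\my_batch.ps1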
Then create a scheduled task. You can create it manually (here is a good sample) or via PowerShell (I prefer this way):
#Create a new trigger that fires daily at 00:01
$STTrigger = New-ScheduledTaskTrigger -Daily -At 00:01
#Name for the scheduled task
$STName = "Run SQL batch"
#Action to run as
$STAction = New-ScheduledTaskAction -Execute "PowerShell.exe" -Argument "D:\my_batch.ps1"
#Configure when to stop the task and how long it can run for. In this example it does not stop on idle and uses the maximum possible duration by setting a timelimit of 0
$STSettings = New-ScheduledTaskSettingsSet -DontStopOnIdleEnd -ExecutionTimeLimit ([TimeSpan]::Zero)
#Configure the principal to use for the scheduled task and the level to run as
$STPrincipal = New-ScheduledTaskPrincipal -User "DOMAIN\user" -RunLevel "Highest"
#Register the new scheduled task
Register-ScheduledTask $STName -Action $STAction -Trigger $STTrigger -Principal $STPrincipal -Settings $STSettings
I'm maintaining a large T-SQL based application.
It makes heavy use of bcp called through xp_cmdshell.
This is problematic, because xp_cmdshell runs in the same security context as the SQL Server service account, which is more privilege than the work requires.
My first idea for getting rid of this disadvantage is to use CLR code, since CLR code runs with the permissions of the user who called it.
I created the following procedure and it works fine. I can see that it uses the permissions of the account that runs the code:
public static void RunBCP(SqlString arguments, out SqlString output_msg, out SqlString error_msg, out SqlInt32 return_val) {
output_msg = string.Empty;
error_msg = string.Empty;
try {
var proc = new Process {
StartInfo = new ProcessStartInfo {
FileName = "bcp",
Arguments = arguments.ToString(),
UseShellExecute = false,
RedirectStandardOutput = true,
CreateNoWindow = true
}
};
proc.Start();
while (!proc.StandardOutput.EndOfStream) {
output_msg += proc.StandardOutput.ReadLine() + Environment.NewLine;
}
proc.WaitForExit(); // make sure the process has exited before reading ExitCode
return_val = proc.ExitCode;
}
catch (Exception e) {
error_msg = e.Message;
return_val = 1;
}
}
This is a good solution because I'm not messing with the BCP calls (the arguments are the same). There are no major changes in logic, so there is no risk of introducing an error.
The previous BCP call in T-SQL looked like this:
declare @ReturnCode int;
declare @cmd varchar(1000);
SELECT @cmd = 'bcp "select FirstName, LastName, DateOfBirth" queryout "c:\temp\OutputFile.csv" -c -t -T -S"(local)"'
EXEC @ReturnCode = xp_cmdshell @cmd, no_output
Now I call it this way:
declare @ReturnCode int;
declare @cmd varchar(1000);
SELECT @cmd = '"select FirstName, LastName, DateOfBirth" queryout "c:\temp\OutputFile.csv" -c -t -T -S"(local)"'
exec DataBase.dbo.up_RunBCP @arguments = @cmd;
So, the question is: is there any other way to get rid of the xp_cmdshell bcp code?
I heard that I can use PowerShell (sqlps), but the examples I found suggest creating a PowerShell script.
Can I call such a script from T-SQL code?
How should this code (the PowerShell script) be stored? As a database object?
Or maybe there is some other way? Not necessarily SSIS; what I'd most like to know about is PowerShell.
Thanks for any advice.
Your options for data EXPORT are the following:
using xp_cmdshell to call bcp.exe - your old way of bulk copying
using CLR - your new way of bulk copying
SSIS - my preferred way of doing this; here is the example
INSERT INTO OPENROWSET - the interesting alternative you can use if you are either working on 32-bit environment with text/Jet/whatever drivers installed, or you can install 64-bit drivers (e.g. 64-bit ODBC text driver, see Microsoft Access Database Engine 2010 Redistributable)
SQL Server Import/Export wizard - ugly manual way that seldom works in the way you want it to work
using external CSV table - not supported yet (SQL Server 2016 promises it will be...)
HTH
I would use a simple PowerShell script that does this, something like:
Invoke-Sqlcmd -Query '...' | Export-Csv ...
Generally, for administrative functions you could add this to Task Scheduler and be done with it. If you need to execute the task on demand, you can do it via xp_cmdshell using schtasks.exe /Run /TN "Task_NAME", which might be better for you since it might be easier to express yourself in PowerShell than in T-SQL in this context.
The other options mentioned all require extra tools (SSIS requires Visual Studio, for example); this is portable with no dependencies.
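A slightly fuller sketch of the PowerShell export above (the server, database, query and output path are assumptions):
Invoke-Sqlcmd -ServerInstance '(local)' -Database 'MyDB' `
    -Query 'SELECT FirstName, LastName, DateOfBirth FROM dbo.Person' |
    Export-Csv -Path 'C:\temp\OutputFile.csv' -NoTypeInformation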
To call the script without xp_cmdshell, you could create a SQL Agent job with a PowerShell step and run it from within T-SQL.
I'm looking for suggestions on either returning multiple datasets, or keeping the session open, with Invoke-SqlCmd?
I have a series of SQL queries that use temporary tables to generate a few related views of the data that I am sending on (via Excel) to a manager. As we work on what is required from the datasets, I am getting a little tired of cutting and pasting samples into Excel.
I thought to use Powershell to simply send the results to HTML files as output for the manager, however I ran into a couple of problems
If I put the final extracts into one SQL file, Powershell appends all of the data into a single result set (sort of a union of the tables)
If I attempt to build the temporary tables and then extract each query individually, each Invoke-Sqlcmd is a separate session, meaning my temporary tables get dropped.
I'm looking for suggestions on either returning multiple datasets, or keeping the session open?
Invoke-Sqlcmd -InputFile .\GenerateTimecard.sql -Variable $params | Out-Null;
@{
'Summary' = 'select * from #WeeklyTimeSummary;'
'ByDay' = 'select * from #WeeklyTimeDaily order by postdate desc;'
'ByTask' = 'select * from #WeeklyTimeEvents order by HoursSpent desc;'
'Detail' = 'select * from #WeeklyTimeDetail order by postdate desc;'
}.GetEnumerator() | ForEach-Object {
Write-Output $_.Name;
$fname = $_.Name + '.html';
Invoke-Sqlcmd -Query $_.Value | ConvertTo-Html | Out-File -Encoding ascii $fname;
};
The Description section from Get-Help Invoke-Sqlcmd says it supports GO commands so you could try running everything at once. Personally I'd use the -InputFile parameter and pipe the results to Out-File.
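For example (the server instance and output file name are placeholders; the input file is the one from the question):
Invoke-Sqlcmd -ServerInstance '.' -InputFile '.\GenerateTimecard.sql' |
    Out-File -Encoding ascii '.\TimecardResults.txt'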
You can specify the ApplicationName parameter for Invoke-SqlCmd, which results in a different SQL connection.
Omitting ApplicationName will result in the temp tables getting removed the second time you call Invoke-SqlCmd.
Something like:
Invoke-SqlCmd -ApplicationName CreateTable -Query 'CREATE TABLE ##FooTable (FooKey INT)'
Invoke-SqlCmd -ApplicationName SelectTable -Query 'SELECT * FROM ##FooTable'