Using a custom cmdlet from SQL Server

I want to execute a PowerShell script from a SQL Server maintenance plan. This is fine and perfectly possible, but what if I want to use a custom cmdlet? Can this still work from a SQL Server job step that runs a PowerShell script (in this case, I need to use the SCVMM cmdlets)?

No, this won't work from a SQL Agent PowerShell job step, because SQL Agent uses sqlps, the SQL Server mini-shell. Since the mini-shell does not support adding cmdlets via either Add-PSSnapin or Import-Module, there is no way to add the SCVMM cmdlets.
Instead, use a CmdExec (Operating System) job step and specify regular PowerShell. For example (not sure of the exact commands to add the SCVMM cmdlets):
C:\WINDOWS\system32\WindowsPowerShell\v1.0\powershell.EXE -command "add-pssnapin SCVMM;invoke-someCmd"
or put the commands in a script file and call the script:
C:\WINDOWS\system32\WindowsPowerShell\v1.0\powershell.EXE -file "C:\Scripts\Invoke-SCVMM.ps1"
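For illustration only, here is a minimal sketch of what Invoke-SCVMM.ps1 might contain. The module name, VMM server name, and cmdlets shown are assumptions for newer SCVMM releases (older releases exposed the cmdlets as a snap-in instead):
# Sketch only: module name, VMM server name, cmdlets, and output path are assumptions.
Import-Module virtualmachinemanager   # older SCVMM versions: Add-PSSnapin with the SCVMM snap-in name instead
# Connect to the (hypothetical) VMM management server, then do the SCVMM work.
Get-SCVMMServer -ComputerName "vmmserver.corp.local" | Out-Null
Get-SCVirtualMachine | Select-Object Name, Status | Export-Csv "C:\Scripts\vm-status.csv" -NoTypeInformation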

Related

Need to create a CSV per day for output from SQL Server

I need to create a CSV file from a stored procedure that runs every day on SQL Server with shipping details.
What would be the best way to go about this?
The solution we found best was to call the stored procedure from PowerShell and have a Windows Task Scheduler task run the PowerShell script every day.
Another option is BCP, the bulk copy program (the bcp utility). You can build this into the stored procedure itself, into the SQL Server Agent job that runs it, or into an SSIS package, wherever you prefer.
Here’s a link to Microsoft’s BCP utility:
https://learn.microsoft.com/en-us/sql/tools/bcp-utility?view=sql-server-ver15
Note that you may need to enable some advanced options in SQL Server to call it from T-SQL. bcp is highly configurable, though, including the ability to dump entire tables to a CSV file.
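As a hedged illustration (the server, database, procedure, and output path below are made up), a bcp call that writes a stored procedure's result set to a comma-separated file could be run from a CmdExec job step or from PowerShell:
# Hypothetical names: ShippingDB, usp_DailyShippingDetails, MyServer, D:\Exports.
# -c writes character data, -t"," sets a comma field terminator, -T uses Windows authentication.
bcp "EXEC ShippingDB.dbo.usp_DailyShippingDetails" queryout "D:\Exports\shipping.csv" -c -t"," -S MyServer -T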
I don't know about Oracle, but for SQL Server the easiest method I found was to use PowerShell.
$hostname = hostname
Invoke-Sqlcmd -ServerInstance $hostname -Database master -Query "select * from sys.sysdatabases" | Export-Csv "d:\result.csv" -NoTypeInformation
This gives a CSV file at the desired location.
If your SQL Server is at a remote location, make sure the machine running the script can connect to that SQL Server, and that any required certificates or encryption settings are in place.
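To get one file per day, a hedged variation of the same idea (the database, procedure, and output folder are hypothetical) is to call the stored procedure and date-stamp the output file name:
# Hypothetical names: ShippingDB, dbo.usp_DailyShippingDetails, D:\Exports.
$hostname = hostname
$outFile  = "D:\Exports\shipping_$(Get-Date -Format 'yyyyMMdd').csv"   # one CSV per day
Invoke-Sqlcmd -ServerInstance $hostname -Database ShippingDB -Query "EXEC dbo.usp_DailyShippingDetails" |
    Export-Csv $outFile -NoTypeInformation
Scheduled daily from Task Scheduler (or a SQL Agent job step), this produces the per-day CSV the question asks for.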

Run SSIS package using T-SQL under different account

I normally run an SSIS package using a Sql Agent Job and a proxy user as described here: https://www.sqlservercentral.com/articles/run-an-ssis-package-under-a-different-account
I now need to run the same package using the same proxy user using T-SQL. I've been trying to use the [catalog].[create_execution] and [catalog].[start_execution] procedures to do this but there doesn't seem to be a way to specify a user.
How do I execute a package as a different user?
Is my best recourse the use of T-SQL to execute a SQL Agent Job that is configured to use the proxy user instead?
I don't know how SQL Agent actually makes proxy users work, especially with regard to SSIS packages.
In a "normal" SQL session, say in SSMS, if I wanted to run a query as another user, I could write:
EXECUTE AS USER = 'TurgidWizard';
SELECT USER_NAME() AS WhoAmI;
REVERT;
That code would allow me to impersonate you until I hit the REVERT call.
But if you swap in calls to create_execution/start_execution, you'll run into the same issue a local SQL Server login hits when using the methods in SSISDB: it doesn't work. The SSISDB procedures all run checks before they begin to ensure the caller has the correct access level and that no impersonation is going on, because once those procedures start running they perform impersonation themselves, and the two don't mix.
So how can you run a package using T-SQL under a different account? I would start SSMS/sqlcmd under those credentials using RUNAS. For example, the following will open a new command window as you:
runas /netonly /user:corpdomain.com\turgiwizard "cmd"
From there, things I do will be under the aegis of your user, so I could run sqlcmd calls like:
sqlcmd -S TheServer -d SSISDB -Q "EXECUTE catalog.create_execution ...;"
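For completeness, a hedged sketch of what the elided catalog calls might look like; the folder, project, and package names are placeholders, and the T-SQL is wrapped in Invoke-Sqlcmd (assuming the SqlServer PowerShell module) so it can be run from the same runas'd window:
# Folder, project, and package names below are placeholders.
$tsql = @'
DECLARE @execution_id bigint;
EXEC SSISDB.catalog.create_execution
    @folder_name     = N'MyFolder',
    @project_name    = N'MyProject',
    @package_name    = N'MyPackage.dtsx',
    @use32bitruntime = 0,
    @reference_id    = NULL,
    @execution_id    = @execution_id OUTPUT;
EXEC SSISDB.catalog.start_execution @execution_id;
'@
Invoke-Sqlcmd -ServerInstance 'TheServer' -Database SSISDB -Query $tsql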
Via the mouse, it's Ctrl+Shift+right-click on the executable.
Your SSMS install location is version dependent, but try varying the 140 in the following path in increments of ten (130, 150, and so on):
C:\Program Files (x86)\Microsoft SQL Server\140\Tools\Binn\ManagementStudio\Ssms.exe
The downside for me with regard to runas is that I could not automate passing my credentials into it. I've seen articles about using AutoHotkey and the like, but never had any luck with it.
The cleanest/easiest approach for something that needs to run regularly is to use SQL Agent with a proxy, or you could use Windows Task Scheduler and create the task as the target user. For one-off executions, I'd likely use the runas approach.

Automate Running SQL Queries in SSMS

I have a SQL Server database set up that I manage using SQL Server Management Studio 17.
In that database, I have 27 tables that I maintain by running pretty simple OPENQUERY scripts every morning, something to the effect of:
DROP TABLE IF EXISTS [databasename].[dbo].[table27]
SELECT * INTO [databasename].[dbo].[table27] FROM OPENQUERY(OracleInstance, '
    SELECT
        table27.*
    FROM
        table27
        INNER JOIN table26 ON table27.criteria = table26.criteria
    WHERE
        < filter >
        < filter >
');
And this works great! But it is cumbersome to sign into SSMS every morning, right-click the database, hit "New Query", and copy in and run 27 individual SQL scripts. I am looking for a way to automate that. Each of these scripts lives as a separate .sql file in one directory.
I don't know if this is achievable in SSMS or in something like a batch script. For the latter, I would imagine pseudocode looking like:
connect to sql server instance
given instance:
    for each sql_script in directory:
        sql_script.execute
I have tried creating a script in SSMS by following:
Tasks -> Script Database ->
But there is no option to execute a .sql file on the tables in question.
I have tried looking at the following resources on using T-SQL to schedule nightly jobs, but have not had any luck conceiving of how to do so:
https://learn.microsoft.com/en-us/sql/ssms/agent/schedule-a-job?view=sql-server-2017
Scheduled run of stored procedure on SQL server
The expected result would be the ability to automatically run the 27 sql queries in the directory above to update the tables in SQL Server, once a day, preferably at 6:00 AM EST. My primary issue is that I cannot access anything but SQL Server Management Studio; I can't access the configuration manager to use things like SQL Server Agent. So if I am scheduling a task, I need to do so through SSMS.
You actually can't access the SQL Server Agent via Object Explorer?
It is located below "Integration Services Catalogs" in Object Explorer.
You say in the question that you can't access it; if that's the case, something is wrong with SQL Server, or perhaps you don't have the rights needed to do things like schedule jobs (a guess there).
In SSMS you would want to use the Execute T-SQL Statement Task and write your statements in the SQL Statement field on the General tab.
However, I would look at sqlcmd. Simply make a batch script and schedule it in Task Scheduler (if you're using Windows). For example:
for %%G in (*.sql) do sqlcmd /S servername /d databaseName -E -i"%%G"
pause
(From this post: Run all SQL files in a directory.)
So basically you have to create a PowerShell script that calls and executes the SQL scripts.
After that you can add your PowerShell script to the Task Scheduler.
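A minimal sketch of such a script, assuming the SqlServer (or SQLPS) module is installed and using made-up server and folder names:
# Hypothetical values: MyServer, databasename, C:\Scripts\Daily.
$server   = 'MyServer'
$database = 'databasename'
# Run every .sql file in the folder, one after another, in name order.
Get-ChildItem -Path 'C:\Scripts\Daily' -Filter '*.sql' | Sort-Object Name | ForEach-Object {
    Invoke-Sqlcmd -ServerInstance $server -Database $database -InputFile $_.FullName -QueryTimeout 600
}
Saved as a .ps1 file and pointed at by a daily Task Scheduler task (since SQL Agent is not available to you), this covers the "for each sql_script in directory" pseudocode above.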
I suggest you add these scripts as jobs for the SQL Server Agent.

Is it possible to upload ftp files from SQL server without xp_cmdshell?

Ideally would like to run something from a SQL query or SQL agent job to FTP upload a file to an external site, but cannot use xp_cmdshell.
Yes. You need to split your work into two separate tasks:
How to run an executable or a batch program from within SQL Server without resorting to xp_cmdshell.
An example of how to do it can be found in:
https://www.mssqltips.com/sqlservertip/2014/replace-xpcmdshell-command-line-use-with-sql-server-agent/.
You should modify this example to suit your particular needs. The suggested stored procedure would (see the hedged sketch after the note below):
run the command passed as a parameter in a SQL Agent job created on the fly (using the CmdExec subsystem)
wait for the job to complete (query msdb.dbo.sysjobactivity), or kill the job if a predefined timeout has been reached
return the results of the job execution (query msdb.dbo.sysjobhistory)
delete the job
Note: the full version of SQL Server is required. If you are using the Express edition, you would have to define a Windows scheduled task manually instead.
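A hedged sketch of the core of that approach, with a hypothetical job name and command; the wait, result-retrieval, and cleanup steps from the list above are omitted, and the SqlServer PowerShell module is assumed for Invoke-Sqlcmd:
# Job name and command below are placeholders; the real procedure would take the command as a parameter.
$tsql = @'
DECLARE @jobId uniqueidentifier;
EXEC msdb.dbo.sp_add_job       @job_name = N'OneOffFtpUpload', @job_id = @jobId OUTPUT;
EXEC msdb.dbo.sp_add_jobstep   @job_id = @jobId, @step_name = N'RunCmd',
                               @subsystem = N'CmdExec',
                               @command = N'C:\Scripts\UploadFile.bat';
EXEC msdb.dbo.sp_add_jobserver @job_id = @jobId;   -- target the local server
EXEC msdb.dbo.sp_start_job     @job_id = @jobId;
'@
Invoke-Sqlcmd -ServerInstance 'MyServer' -Database msdb -Query $tsql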
How to send a file via FTP using a batch program.
Please see:
How to ftp with a batch file?
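For the FTP half, the linked question covers uploading from a batch file; as an alternative sketch (the host name, credentials, and paths are all made up), a short PowerShell upload that the CmdExec job step could run instead:
# All values here are placeholders: ftp.example.com, ftpUser, and the file paths.
$client = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential('ftpUser', 'ftpPassword')
$client.UploadFile('ftp://ftp.example.com/inbound/export.csv', 'D:\Exports\export.csv')
$client.Dispose()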

Copy .bak file to another Windows server using job MS SQL Server agent

I am trying to copy a .bak file nightly from Server A to Server B.
Can I do that using a SQL Server Agent job that runs every night?
I am thinking of adding the copy command as a statement within a step of a job.
Something like: 'copy "G:\source\folder\" "\\target\folder\"'
inside the step and setting the type to Operating System(CmdExec).
Is there a way to do it?
Is this question about the command to copy the files?
If you want to copy an entire folder, use robocopy instead of copy.
You can make an SSIS package to do that, and then run it from SQL Agent.
However, don't use logical drive letters such as G:; if the server doesn't have the same mapping, it won't work. Use UNC paths to the actual named servers: \\serverA\source\folder to \\serverB\target\folder.
The short answer is yes. You can try an SSIS package as described here or here. Another option is to use Windows Task Scheduler (instead of SQL Server Agent) and a simple .bat script to do the same thing.
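For reference, a hedged sketch of the copy step itself (the UNC share names are made up), which a CmdExec or PowerShell job step, or a scheduled task, could run nightly; for whole folders, robocopy (as suggested above) is the more robust choice:
# Hypothetical UNC paths; the account running the step needs read access on ServerA and write access on ServerB.
Copy-Item -Path '\\ServerA\Backups\*.bak' -Destination '\\ServerB\BackupCopies\' -Force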
