When using a command like
powershell -command "\\%host1%\supportfiles\mypowershellscript"
from my central server, pointing at a script that lives on a remote computer, would it be using the PowerShell on the remote computer or on my own computer when I run it from the batch file?
When you execute:
PowerShell -command <path to a script>
The script, whether it is located on the local machine or on a remote machine, will execute locally. If you want to execute some PowerShell script remotely, you need to enable remoting on the remote machine with Enable-PSRemoting -Force. Then, on the local machine, you have to run your script as administrator, and your account also has to have admin privileges on the remote machine. Inside your script you can execute parts of the script remotely like so:
$session = New-PSSession -ComputerName remoteComputerName
Invoke-Command -Session $session -Scriptblock { ... script to execute on remoteComputerName ...}
...
Remove-PSSession $session
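Filling in the placeholders, a minimal end-to-end sketch (the computer name and the command inside the block are examples, not values from the question):
$session = New-PSSession -ComputerName remoteComputerName
Invoke-Command -Session $session -ScriptBlock {
    # Everything in this block runs on remoteComputerName, not on the local machine
    Get-Service -Name Spooler
}
Remove-PSSession $session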
Related
I am trying to run a SQL script from a PowerShell script. I am using the Invoke-Command below to run the SQL script:
Invoke-Command -Session $sess -ScriptBlock {D:\000000000041906_JCWJOUYKDB\dm_sql1.sql}
The SQL script does not execute, but if I run a bat script instead, it runs perfectly:
Invoke-Command -Session $sess -ScriptBlock {D:\000000000041906_JCWJOUYKDB\dm_os1.bat}
The bat script just makes directories by connecting to the remote machine, whereas in the SQL script I define a connection string for SQL Server and from there create tablespaces.
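As an aside, a .sql file is not executable by itself the way a .bat file is; something has to read it and send it to SQL Server. A minimal sketch of how it could be run inside the remote session, assuming the SqlServer/SQLPS module is available there (the instance name below is a placeholder):
Invoke-Command -Session $sess -ScriptBlock {
    # Invoke-Sqlcmd actually executes the .sql file; the file itself is only input
    Invoke-Sqlcmd -ServerInstance 'localhost' -InputFile 'D:\000000000041906_JCWJOUYKDB\dm_sql1.sql'
}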
I'm trying to run a PS1 file in T-SQL using XP_CMDSHELL, like so:
exec xp_cmdshell 'powershell -ExecutionPolicy bypass -command "C:\Users\sleven\Documents\DimAcctImport.ps1"'
The powershell script is as follows:
import-module dbatools
Import-DbaCsv -SqlInstance 'MSSQL' -Database 'Test' -Table 'Account' -Path "R:\Data\Account.csv" -Delimiter ',' -Quote '"' -KeepNulls -NoProgress
This script uses the Import-DbaCsv cmdlet from the dbatools module to import the CSV into the target table.
Here is the error I receive in SSMS:
import-module : The specified module 'dbatools' was not loaded because no valid module file was found in any module
directory.
At C:\Users\sleven\Documents\DimAcctImport.ps1:1 char:1
+ import-module dbatools
+ ~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ResourceUnavailable: (dbatools:String) [Import-Module], FileNotFoundException
+ FullyQualifiedErrorId : Modules_ModuleNotFound,Microsoft.PowerShell.Commands.ImportModuleCommand
The module is installed and runs as expected if I copy the PowerShell script and run it in PS ISE.
I'm using SQL Developer Edition on the same PC as I'm using to run the sp - my local PC.
What am I missing?
EDIT: Adding output from get-module and $PSVersionTable
PS> (Get-Module -ListAvailable dbatools).Path:
C:\Users\sleven\Documents\WindowsPowerShell\Modules\dbatools\0.9.834\dbatools.psd1
PS> $PSVersionTable
Name Value
---- -----
PSVersion 5.1.18362.145
PSEdition Desktop
PSCompatibleVersions {1.0, 2.0, 3.0, 4.0...}
BuildVersion 10.0.18362.145
CLRVersion 4.0.30319.42000
WSManStackVersion 3.0
PSRemotingProtocolVersion 2.3
SerializationVersion 1.1.0.1
Calling Import-Module with a module name tells PowerShell to look for the module only in certain folders. You can check those folders by inspecting the PSModulePath environment variable (split added for readability):
$env:PSModulePath -split ';'
In your case, when PowerShell is launched by xp_cmdshell, the folder containing dbatools is not included in PSModulePath (the same can happen when a different account is used, because your module is currently installed under your profile folder). You now have two options:
Reference the module by its path instead of its name. You can get the path using (Get-Module -ListAvailable dbatools).Path:
# Replace the path with the path you found with Get-Module
Import-Module 'C:\path\to\module\dbatools.psd1'
Modify your $env:PSModulePath. As that topic is broader, I'll refer you to the docs; a minimal sketch follows the note below. Remember that you should set that variable for the user running xp_cmdshell. As RThomas mentioned, by default that is not your user account; see the linked answer for an explanation of that topic.
NOTE (credit goes to @David Browne - Microsoft, from his comments):
If you're going to set the environment variable, it should be a system environment variable, as you wouldn't want this process to break if you change the SQL Server service account. And remember, setting a system environment variable requires a reboot for services to see the change.
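A minimal sketch of option 2, assuming the module stays where Get-Module reported it (the path is taken from the question; run from an elevated PowerShell session):
# Append the folder that contains the dbatools module to the machine-level PSModulePath
$moduleRoot = 'C:\Users\sleven\Documents\WindowsPowerShell\Modules'
$current = [Environment]::GetEnvironmentVariable('PSModulePath', 'Machine')
[Environment]::SetEnvironmentVariable('PSModulePath', "$current;$moduleRoot", 'Machine')
# Restart the SQL Server service (or reboot) so it picks up the new value, per the note above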
It's likely an account issue. Keep in mind that running xp_cmdshell by default runs everything as the service account behind the SQL Server instance. So you'll want to verify what account this is.
If it's a built in system account this can cause strange behavior when it comes to rights.
If it's a Windows or a domain account, then you'll want to test outside of SQL Server by running the PS file not as yourself but as the same account SQL Server uses as a service account. You can do this easily by opening the cmd shell with Shift+right-click and specifying the other account.
Your other option is to set up a proxy account for the xp_cmdshell call to use. Instructions on how to do this can be found in Microsoft documentation.
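If you go the proxy account route, the core of it is a one-time call to sp_xp_cmdshell_proxy_account. A minimal sketch run from PowerShell, assuming the SqlServer module is available and using placeholder credentials:
# Creates the credential that xp_cmdshell uses for non-sysadmin callers
Invoke-Sqlcmd -ServerInstance '.' -Query "EXEC sp_xp_cmdshell_proxy_account 'DOMAIN\ProxyUser', 'PlaceholderPassword';"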
I want to connect two VMs/servers: one has the data file and a batch file, the other has SQL installed. I want to trigger a command on server A that will hit server B's files.
SQL command on server A:
EXEC master..xp_cmdshell 'cd.. && "C:\Program Files\Powershell\6\pwsh.exe" -File "C:\Users\sprasad\Desktop\script\command1.ps1"'
The errors are:
1. import-module : The specified module 'C:\Program Files\Derivation_19_01_rev0\Core.PowershellModule.TradeLoader.dll' was not loaded because no valid module file was found in any module directory.
This is because the files are on server B.
I am running SQL scripts stored in files (sqlscript.sql) on serverA against a SQL installation on serverB using remote PowerShell. The PowerShell module SQLPS is needed on serverA.
see link
steps:
enable remote PowerShell on serverB:
Enable-PSRemoting -force
on your SQL Server, add the user that executes the script on serverA to the list of users with rights on the database.
use a script on serverA like:
Import-Module -Name SQLPS -NoClobber -DisableNameChecking -Scope Local
Invoke-Sqlcmd -ServerInstance 'serverB' -InputFile sqlscript.sql -Verbose
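If the whole thing has to be kicked off from yet another machine, the remoting pattern from the first answer applies; a sketch, assuming serverA and serverB are the names used above and the .sql file path is a placeholder:
$session = New-PSSession -ComputerName serverA
Invoke-Command -Session $session -ScriptBlock {
    Import-Module -Name SQLPS -NoClobber -DisableNameChecking -Scope Local
    # Runs on serverA and connects to the SQL instance on serverB
    Invoke-Sqlcmd -ServerInstance 'serverB' -InputFile 'C:\scripts\sqlscript.sql' -Verbose
}
Remove-PSSession $session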
I currently have Server A, which is where my TFS and build agent are located. I then have Server B, which is where my source code sits. I am trying to set up a build definition that copies files from one location on Server B to another and then builds the solution.
However, when I run this batch file as part of a build definition, it is not creating folders where they need to be. I believe this is due to the agent not having the correct permissions.
Is there a way to run the batch script with admin permissions from a build definition?
You can try the workarounds below:
Convert the batch script to a PowerShell script, then copy the PowerShell script to the target machine and use the PowerShell on Target Machines task to run it. You can enter your admin user and password in the task.
Add a PowerShell task and run the script below to call cmd.exe to run the batch script with an admin user account on the target machine (copy the batch script to the target machine first; in the sample below I copied the batch script to C:\Scripts\Test.bat):
Param(
[string]$computerName = "v-tinmo-12r2"
)
$Username = "Domain\user"
$Password = ConvertTo-SecureString "password-here" -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential($Username,$password)
Invoke-Command -ComputerName $computerName -Credential $cred -ErrorAction Stop -ScriptBlock {Invoke-Expression -Command:"cmd.exe /c 'C:\Scripts\Test.bat'"}
What I am looking to do seems fairly simple, but I can't figure it out. I am looking to run a PowerShell script that launches an RDP session, copies a file to the C:\ directory, then runs that file from a command line. I would like it to loop, getting the parameters from a CSV file, such as server IP, username, and password. So in essence the steps would be as follows...
import info from the CSV file to define variables
copy the specified file
(then loop)
launch mstsc.exe
enter server IP, username, password
paste the copied file into the C:\ directory
launch cmd.exe
run the file that was copied to the C:\ directory
log off the server
I wanted to see if someone could help me out with this. I am new to PowerShell and have been able to work through a lot of it. If someone could point me in the right direction, or even provide me the code to fill in the blanks, I would greatly appreciate it.
I have done remote installs using psexec: psexec \\servername -u domain\username -p password cmd /c "msiexec /i program.msi"
PSexec download: https://learn.microsoft.com/en-us/sysinternals/downloads/psexec
This means instead of RDP you will use psexec to run the install remotely.
I have created a small PowerShell script to get you started. So let's assume your CSV file (c:\info.csv) has three columns ServerName, UserName, Password.
Run the code below and it should work, but make sure to change the first four lines to match your environment. Start with a single server in the CSV to observe the script's behavior.
# Set intial variables
$CSVFile = "c:\info.csv"
$MSI = "\\servername\sharename\setup.msi"
$MSILog = "c:\Windows\temp\setup.log"
$Domain = "YourDomain"
# Import info from CSV file
$Servers = import-csv $CSVFile
# loop through each server
foreach ($server in $servers) {
# run psexec on each server to install a program
psexec \\$($server.ServerName) -u "$Domain\$($server.UserName)" -p $($server.Password) -h cmd /c "msiexec /i $MSI /quiet /l*v $MSILog"
}
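For a first test, the CSV could be created like this (server name, account, and password are placeholders):
# Hypothetical c:\info.csv with the three expected columns and a single test server
@"
ServerName,UserName,Password
server01,installuser,PlaceholderPassword
"@ | Set-Content -Path 'C:\info.csv'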
I recommend looking at these commands, because I don't know exactly what you are trying to do:
Get-Help Import-Csv
Get-Help about_Remoting (this will avoid the mstsc.exe step for you)
Enter the session and Invoke-Command against that session, and you can run commands on that server.
$session = New-PSSession -ComputerName Server1 -Credential (Get-Credential)
Invoke-Command -Session $session -ScriptBlock {
}
Inside the script block, specify your PowerShell commands to copy files and run them.
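A minimal sketch of what that could look like (computer name, file name, and paths are placeholders; Copy-Item -ToSession needs PowerShell 5.0 or later):
$session = New-PSSession -ComputerName Server1 -Credential (Get-Credential)
# Copy the file into C:\ on the remote machine over the session
Copy-Item -Path '.\install.cmd' -Destination 'C:\' -ToSession $session
Invoke-Command -Session $session -ScriptBlock {
    # Run the copied file on the remote machine
    cmd.exe /c 'C:\install.cmd'
}
Remove-PSSession $session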