PowerShell SqlPs module not importing properly

I have a (pretty much clean) Windows Server 2008 R2 Build with SQL Server 2012 installed. I'm having a problem with PowerShell (Version 3).
I am trying to use the Invoke-Sqlcmd cmdlet. However, when I call it I get a message saying that 'Invoke-Sqlcmd' is not recognized as the name of a cmdlet, function, script file, or operable program.
It works if I run Import-Module SqlPs -Verbose -Force (bypassing the security policy). However, this lasts only as long as the tab is open. If I open a new tab, try to run another script, or re-open PowerShell ISE, I have to import the module all over again.
Any ideas why this is happening?

This is more of a workaround, but you can add the import to your PowerShell profile:
'Import-Module SqlPs -Verbose -Force' > $profile
Obviously, if you already have a profile file you don't want to do it like that; instead open the file and add the line with Notepad or similar.
Also note that PowerShell and PowerShell ISE have different profiles, so you would need to add it in both if you use both hosts.
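If you want to avoid clobbering an existing profile, a minimal sketch along these lines should work ($PROFILE is the built-in variable for the current host's profile; adjust the module name if you are on the newer SqlServer module):
# Append the import to the current host's profile, creating the file only if needed
if (-not (Test-Path $PROFILE)) {
    New-Item -ItemType File -Path $PROFILE -Force | Out-Null
}
if (-not (Select-String -Path $PROFILE -Pattern 'Import-Module SqlPs' -Quiet)) {
    Add-Content -Path $PROFILE -Value 'Import-Module SqlPs'
}
Because $PROFILE resolves to a different file in the console host and in the ISE, run this once in each host if you use both.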

From my understanding, module loading is done on a per-session basis, so the behavior you're seeing is expected.

Related

Invoke-ASCmd not a cmdlet in Powershell Module sqlserver (version 21.1.18228)

I'm trying to do some automation with SQL Server Analysis Services using Powershell.
Research points me to "Invoke-ASCmd" command as the one to use as per this documentation: https://learn.microsoft.com/en-us/powershell/module/sqlserver/invoke-ascmd?view=sqlserver-ps#examples
However, even after installing the SqlServer module, I don't see "Invoke-ASCmd" in the list of commands. I'm assuming this comes by default with the module. If I run the command with no parameters, I get the following response:
"Invoke-ASCmd: The term 'Invoke-ASCmd' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again."
If I do a Get-Command -module sqlserver I don't see "Invoke-ASCmd" in the list.
First of all, you want to make sure that the SqlServer module is installed and available.
You can do this with the Get-Module cmdlet:
Get-Module SqlServer -ListAvailable
Once you are sure the SqlServer module is installed, import it with the Import-Module cmdlet, specifying the version:
Import-Module SqlServer -Version 21.1.18080
After that you can use the Get-Command cmdlet to see whether the module is loaded and exposes the commands you are trying to use, like so:
Get-Command -Module SqlServer
Note that there is still a difference between Windows PowerShell and PowerShell Core, including for the SqlServer module. Quite a few cmdlets have not yet been ported, and some may never be. See this GitHub issue for comments from the MSFT team, e.g. from Christian Wade in Jan 2020 and from Matteo-T in Jan 2021.
So if you need the "AS" commands, it is best to stick with Windows PowerShell if possible.
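As a quick check, a sketch like the following (assuming the SqlServer module is installed) shows which edition you are running and whether Invoke-ASCmd made it into the loaded module:
# Desktop = Windows PowerShell, Core = PowerShell (Core)
$PSVersionTable.PSEdition

Import-Module SqlServer
# Returns nothing if the AS cmdlets are not shipped for this edition
Get-Command -Module SqlServer -Name Invoke-ASCmd -ErrorAction SilentlyContinue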

SQL Server Agent Powershell Job Returns Blank

I have a PowerShell script:
Invoke-Sqlcmd -ServerInstance "PRODUCTION" -Database "DATABASE" -InputFile "E:\DW_Exports\cdd.sql" | Export-Csv "E:\DW_Exports\Pearsonvue\CDD.csv" -NoTypeInformation
When I run this manually in the ISE it works fine, no problems.
However, when I set it up as a SQL Agent job it just returns a blank file. No errors reported, it says it was successful, but all I end up with is a blank file.
I've tried the process with very simple queries (just changing the input file the PowerShell script points to), and it works fine. So we can rule out SQL Server Agent access issues to the file location or running PowerShell. It just doesn't work for this specific query.
What's also odd is that sometimes after I run the job, if I try to run the PowerShell script manually it says I don't have access to the file location unless I delete the blank file first; then it works fine again.
Any ideas?

Invoke-sqlcmd to remote server issue: Could not find file

I'm trying to write a PowerShell script which will execute T-SQL queries against a single remote server using Invoke-Sqlcmd. The queries are simple ones like backing up/restoring a database, creating a user, etc.
Below is an extract of it :
# 5 # Create clientdb database on secondary server by restoring the full backup for primary
Try {
    Invoke-Sqlcmd -ServerInstance 'REMOTESQLSRV' `
        -Username 'ts_sql' -Password 'somepassword' `
        -InputFile "$LScltid\__01_On_Secondary_CreateDB2_srv2.sql" `
        -ErrorAction Stop
    Write-Host " clt_$id is now restored to secondary server " `
        -ForegroundColor White -BackgroundColor Green
} Catch {
    Write-Host " Restore operation for clt_$id did not succeed. Check the error logs " -ForegroundColor Black -BackgroundColor Red
}
My script always breaks here. For some reason I can't put my finger on, Invoke-Sqlcmd does not use the variable "$LScltid" to resolve the path where it will find the .sql script.
Every time I run it, it changes the current directory to the SQLSERVER:\ provider or some other location, causing the script to fail at this step.
Am I doing this the right way? If so, how should I adapt the command to perform as I expect it to?
UPDATE
Forgot to mention: if I run the script with the variable values hard-coded, I'm able to get the result I need (in this case restoring a database from a device).
Thanks for your feedback.
Odd thing: the command is now working. I don't really know what I did wrong previously, but the exact same command now works.
Just a heads up for those facing the same issue:
If you have to use Invoke-Sqlcmd in your scripts, beware of the provider change, especially if the commands coming after Invoke-Sqlcmd are regular ones (Get, Set, Copy, New, etc.).
In my case, somewhere in my script between two Invoke-Sqlcmd commands I had to copy files from the local to the remote server. Every time, the command failed because the provider changed. As a workaround I call Set-Location before the Copy-Item command executes, and that maneuver seems to do the trick (I don't know if it's recommended, though). A sketch of the idea is below.
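Here is a minimal sketch of that workaround using Push-Location/Pop-Location, which is equivalent in spirit to the Set-Location approach described above (the paths, query, and instance name are placeholders, not taken from the original script):
# Remember the current filesystem location before Invoke-Sqlcmd switches providers
Push-Location

Invoke-Sqlcmd -ServerInstance 'REMOTESQLSRV' -Query 'SELECT @@VERSION'

# Return to the saved location so later Copy-Item / Get-* calls resolve paths as expected
Pop-Location
Copy-Item -Path '.\scripts\__01_On_Secondary_CreateDB2_srv2.sql' -Destination '\\REMOTESQLSRV\d$\scripts\'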
Thanks, Stack Overflow team.

Deploying a dacpac via Powershell caused error: "Unable to determine the identity of domain"

Has anyone else met a similar problem to the one described below?
I am having a problem deploying a SQL Server 2012 dacpac database upgrade with PowerShell. The details are as follows:
It's a dacpac file built for SQL Server 2012 and I'm trying to apply it to a SQL Server 2012 database via PowerShell run from the command line while logged in as administrator.
Exception calling "Deploy" with "4" argument(s): "Unable to determine the identity of domain."
At ... so.ps1:17 char:8
+ $d.Deploy($dp, $TargetDatabase,$true,$DeployOptions)
The redacted script (logging and literals changed) is as follows:
[System.Reflection.Assembly]::LoadFrom("C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\Microsoft.SqlServer.Dac.dll") | Out-Null
$d = new-object Microsoft.SqlServer.Dac.DacServices ("... Connection string ...")
$TargetDatabase = "databasename"
$fullDacPacPath = "c:\temp\...\databasename.dacpac"
# Load dacpac from file & deploy to database named pubsnew
$dp = [Microsoft.SqlServer.Dac.DacPackage]::Load($fullDacPacPath)
$DeployOptions = new-object Microsoft.SqlServer.Dac.DacDeployOptions
$DeployOptions.IncludeCompositeObjects = $true
$DeployOptions.IgnoreFileSize = $false
$DeployOptions.IgnoreFilegroupPlacement = $false
$DeployOptions.IgnoreFileAndLogFilePath = $false
$DeployOptions.AllowIncompatiblePlatform = $true
$d.Deploy($dp, $TargetDatabase,$true,$DeployOptions)
Here is some supporting information:
Dac framework version is 11.1
The script throws the error when run on the command line:
i.e. powershell -File databaseupgrade.ps1
but not when run in the PowerShell Integrated Scripting Environment (ISE)
Similar scripts work from the command line for other dacpacs.
Research on the web suggests that it might be something to do with the size of the dacpac. The ones that work are all smaller than the one that does not, and this link mentions a figure of 1.3 MB, which the failing dacpac's file size just exceeds. If anyone can confirm that this is the problem, can you also suggest a solution?
Update
The following script exhibits the same behavior, i.e. it works in the PowerShell ISE but not from the command line.
[Reflection.Assembly]::LoadWithPartialName("System.IO.IsolatedStorage")
$f = [System.IO.IsolatedStorage.IsolatedStorageFile]::GetMachineStoreForDomain();
Write-Host($f.AvailableFreeSpace);
I believe the issue here (at least in our case) actually occurs when the dacpac is working with a database that uses multiple filegroups. My hypothesis is that, when doing the comparison for deployment, it uses IsolatedStorage for the different files.
The link above was helpful, but it was not the entry so much as the last comment on that blog, by Tim Lewis. I modified his code to work in native PowerShell. Putting this above the SMO assembly load should fix the issue:
# Build replacement evidence that identifies the AppDomain as running from the local machine zone
$replacementEvidence = New-Object System.Security.Policy.Evidence
$replacementEvidence.AddHost((New-Object System.Security.Policy.Zone ([Security.SecurityZone]::MyComputer)))
# Overwrite the current AppDomain's private _SecurityIdentity field via reflection
$currentAppDomain = [System.Threading.Thread]::GetDomain()
$securityIdentityField = $currentAppDomain.GetType().GetField("_SecurityIdentity", ([System.Reflection.BindingFlags]::Instance -bOr [System.Reflection.BindingFlags]::NonPublic))
$securityIdentityField.SetValue($currentAppDomain, $replacementEvidence)
Edit - this answer is incorrect, see the link added in the original question for information about the real root cause.
It sounds like you're trying to connect with Windows Authentication and that's the cause of the failure (see this post as it seems to cover the error message you're getting). Change your connection string to use SQL Authentication, or ensure that the user your PowerShell script is running as both has a domain-joined identity and has permission to access the server. Basically, this is a SQL connection issue, not a DAC issue.
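For illustration, a connection string using SQL Authentication with the DacServices constructor from the question's script might look like this (server, database, and credentials are placeholders):
# Hypothetical SQL-authentication connection string; replace the placeholder values
$connectionString = "Data Source=MYSQLSERVER;Initial Catalog=master;User ID=deploy_user;Password=P@ssw0rd;"
$d = New-Object Microsoft.SqlServer.Dac.DacServices ($connectionString)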
It's been a few days now so I don't think a proper explanation will be forthcoming. I'll just post this as our workaround for anyone else who finds themselves in this situation.
There is a Microsoft command-line program, SqlPackage.exe, that is fairly easy to get hold of. It will silently deploy a dacpac, can be executed from PowerShell, and has parameters that support all the options we need.
If we use this instead of the DAC services assembly directly, the domain problem does not arise.
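As a rough sketch, a SqlPackage.exe publish call from PowerShell might look like the following (the install path, dacpac path, and server/database names are placeholders, and the /p: options mirror the DacDeployOptions used above):
# Publish the dacpac with SqlPackage.exe instead of calling the DAC assembly directly
& "C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\SqlPackage.exe" `
    /Action:Publish `
    /SourceFile:"c:\temp\databasename.dacpac" `
    /TargetServerName:"MYSQLSERVER" `
    /TargetDatabaseName:"databasename" `
    /p:IncludeCompositeObjects=True `
    /p:AllowIncompatiblePlatform=True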

Calling Powershell script via SQL Server agent job - start-transcript doesn't get output

There's a question on this already (here). But I'm hoping that because I'm using PowerShell 2 and SQL Server 2008, I might get different answers if I ask again.
Basically, I have a SQL Server Agent job, which runs a cmd script like so:
powershell "&D:\SQL\Job\JobDir\Upload.ps1 (various arguments including log file name)"
When it runs Start-Transcript, the file is created, but nothing except a header is written to it. However, when I run the script on the command line, everything is logged properly.
I've already got lots of logic to manage old log files in the powershell script, and don't want to just use the default logging that SQL Server Agent provides.
To write out info, I use invoke-sqlcmd with the verbose flag, and also write-message and write-error.
Thanks for any pointers!
Sylvia
Start-Transcript creates a text file.
It is likely that the account under which the agent job is running does not have permission to create the text file where PowerShell wants to create it.
Permissions were not an issue for me. I think there are ways of manipulating the output from PowerShell very precisely, with all the different output streams (error, verbose, etc.), but instead of getting into those details I just used the default logging that SQL Server Agent provides for each job step. It works okay. I still use the logic that I wrote to manage and archive log files in my script; I can do that even though the log file itself is generated by SQL Server.
I believe that Start-Transcript does not work when no console has been started (e.g. in a service): output is not generated, which would explain why the file is empty.
See start-transcript causes script to fail in a background job
As already stated above, Start-Transcript will create a file in the location specified (as long as the agent has permission to create files there). However, nothing will be streamed to the transcript file because technically no console window is running.
When running a PowerShell file through the SQL Agent (pre-2012) like
C:\WINDOWS\system32\WindowsPowerShell\v1.0\powershell.exe C:\myscript.ps1 "SOMEPARAMETER"
I just use the SQL Agent's native Output File option in the job step's advanced properties, like
C:\LOGGING\myfile_$(ESCAPE_SQUOTE(STRTDT))_$(ESCAPE_SQUOTE(STRTTM)).txt and skip powershell's transcript ability.
Include Write-Host where necessary. Write-Host "`r`n" will create a new line appropriately.
Also include things like -Verbose where allowed, and try/catch with error output, and the file will record everything appropriately.
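As a minimal sketch of that approach (the instance, database, and file paths are placeholders), a job-step script might look like:
# Output below is captured by the SQL Agent job step's Output File, not by Start-Transcript
try {
    Write-Host "Starting export at $(Get-Date)"
    Invoke-Sqlcmd -ServerInstance "MYSERVER" -Database "MYDB" `
        -InputFile "E:\jobs\export.sql" -Verbose -ErrorAction Stop |
        Export-Csv "E:\jobs\export.csv" -NoTypeInformation
    Write-Host "Export completed"
}
catch {
    # Writing the error makes the failure visible in the Output File and fails the step
    Write-Error $_
    throw
}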
