SSIS Execute Package Task on SQL Server 2016

In Visual Studio I am creating a master package that I will use to execute a number of child packages using the Execute Package Task.
The child packages have been deployed to the Integration Services Catalog SSISDB on SQL Server 2016.
When configuring my Execute Package Task to execute a child package I am not able to access the SSISDB on the server. How can I access the SSISDB?
It seems the task is expecting the child package to be located in the MSDB on the Integration Services Server.
The following are the configurations made in the Execute Package Task:
Reference Type: External Reference
Location: SQL Server
Connection: .SSISDB
PackageName: ? - Unable to select child package
Password: *********
ExecuteOutOfProcess: False
Update:
Changing to package deployment mode does not change the behavior of the Execute Package Task. I am still not able to access the packages stored in the SSISDB

It looks like the packages were deployed in package deployment mode. Convert the project to project deployment mode (Project > Convert to Project Deployment Model) and redeploy it.
To run a package from another project, place an "Execute SQL Task" on the control flow instead of an "Execute Package Task" and call the catalog.create_execution stored procedure (https://msdn.microsoft.com/en-us/library/ff878034(v=sql.110).aspx). Use the following T-SQL as an example (copied from the linked page):
DECLARE @execution_id bigint
EXEC [SSISDB].[catalog].[create_execution] @package_name=N'Child1.dtsx', @execution_id=@execution_id OUTPUT, @folder_name=N'TestDeply4', @project_name=N'Integration Services Project1', @use32bitruntime=False, @reference_id=Null
SELECT @execution_id
DECLARE @var0 sql_variant = N'Child1.dtsx'
EXEC [SSISDB].[catalog].[set_execution_parameter_value] @execution_id, @object_type=20, @parameter_name=N'Parameter1', @parameter_value=@var0
DECLARE @var1 sql_variant = N'Child2.dtsx'
EXEC [SSISDB].[catalog].[set_execution_parameter_value] @execution_id, @object_type=20, @parameter_name=N'Parameter2', @parameter_value=@var1
DECLARE @var2 smallint = 1
EXEC [SSISDB].[catalog].[set_execution_parameter_value] @execution_id, @object_type=50, @parameter_name=N'LOGGING_LEVEL', @parameter_value=@var2
EXEC [SSISDB].[catalog].[start_execution] @execution_id
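If you drive several child packages from one master, it can help to template that batch rather than hard-code it per child. A minimal sketch in Python (plain string templating, no SQL Server dependency; the folder, project, and package names are the sample placeholders from above):

```python
# Assemble a catalog.create_execution / start_execution batch for one child
# package. This only builds the T-SQL text; running it against SSISDB would
# be done by any client (sqlcmd, pyodbc, an Execute SQL Task, etc.).
def build_execution_batch(folder, project, package):
    """Return a T-SQL batch that starts an SSISDB catalog execution."""
    return (
        "DECLARE @execution_id bigint\n"
        f"EXEC [SSISDB].[catalog].[create_execution] @package_name=N'{package}', "
        "@execution_id=@execution_id OUTPUT, "
        f"@folder_name=N'{folder}', @project_name=N'{project}', "
        "@use32bitruntime=False, @reference_id=Null\n"
        "EXEC [SSISDB].[catalog].[start_execution] @execution_id\n"
    )

batch = build_execution_batch("TestDeply4", "Integration Services Project1", "Child1.dtsx")
print(batch)
```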

Related

Deploy a project to SSISDB using T-SQL Script

My project has 3 servers: Dev for development, Test for testing, and Prod for production. I have been asked to create SSISDB catalogs from scratch on all 3 servers.
The folder is named ABC_Dev. Now I have to deploy the SSIS project into the ABC_Dev folder using a T-SQL script. When the script runs on all 3 environments, the project and all the SSIS packages should be deployed into the catalog folder.
I tried something like this
DECLARE @ProjectBinary as varbinary(max)
DECLARE @operation_id as bigint
SET @ProjectBinary = (SELECT * FROM OPENROWSET(BULK '<projectpath.ispac>', SINGLE_BLOB) as BinaryData)
EXEC catalog.deploy_project @folder_name = 'ABC_Dev', @project_name = 'ABC_Dev',
    @project_stream = @ProjectBinary, @operation_id = @operation_id OUTPUT
'<projectpath.ispac>' - I gave the path as
C:\Users\ABC_Dev\SSIS\ABC_Dev\bin\Development\ABC_Dev.ispac
Somehow it's not taking that path on the servers.
Is there a way to declare the .ispac path that runs on all 3 servers when T-SQL script executes
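One workaround is to generate the deploy_project batch per environment from a map of server name to the .ispac path that is visible from that server. A sketch, in which the server names and the UNC share are hypothetical placeholders:

```python
# Build the T-SQL deployment batch for each environment. OPENROWSET(BULK ...)
# resolves its path on the machine running SQL Server, under the service
# account, so a shared UNC path readable by all three instances avoids
# maintaining per-server local paths.
ISPAC_PATHS = {
    "DevServer":  r"\\fileshare\ssis\ABC_Dev\ABC_Dev.ispac",
    "TestServer": r"\\fileshare\ssis\ABC_Dev\ABC_Dev.ispac",
    "ProdServer": r"\\fileshare\ssis\ABC_Dev\ABC_Dev.ispac",
}

def deploy_script(server):
    """Build the catalog.deploy_project batch for one environment."""
    path = ISPAC_PATHS[server]
    return (
        "DECLARE @ProjectBinary varbinary(max)\n"
        "DECLARE @operation_id bigint\n"
        f"SET @ProjectBinary = (SELECT * FROM OPENROWSET(BULK '{path}', SINGLE_BLOB) AS BinaryData)\n"
        "EXEC catalog.deploy_project @folder_name = 'ABC_Dev', @project_name = 'ABC_Dev',\n"
        "     @project_stream = @ProjectBinary, @operation_id = @operation_id OUTPUT\n"
    )

script = deploy_script("DevServer")
print(script)
```

The key point is that the path must be readable by the SQL Server service account on the target server, not by the machine submitting the script.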

SQL Server Machine Learning Services - Unable to launch the runtime. ErrorCode 0x80070032: 50(The request is not supported.)

Trying to configure MLS on SQL Server 2017, but when running a very basic external script like so:
EXEC sp_execute_external_script @language = N'R',
    @script = N'OutputDataSet <- InputDataSet;',
    @input_data_1 = N'SELECT 1 AS hello'
    WITH RESULT SETS (([hello] int not null));
GO
I get this error:
Msg 39021, Level 16, State 1, Line 1
Unable to launch runtime for 'R' script. Please check the configuration of the 'R' runtime.
Msg 39019, Level 16, State 2, Line 1
An external script error occurred:
Unable to launch the runtime. ErrorCode 0x80070032: 50(The request is not supported.).
And if I look at the log EXTLAUNCHERRORLOG I see:
2020-12-29 17:53:49.554 SetCpuRateCap failed with error code 0x80070032.
I can't find a reference to this error anywhere, and am very perplexed. Tried all kinds of things (checking permissions, turning the resource governor off, updating to the latest CU, reinstalling MLS, etc). We have a similar server running on the same Azure platform (same size, W2012R2, same memory/cpu config), configured about the same time, and it does not seem to have this issue. This happens with both the R and Python services.
Any help would be appreciated.
I have a solution we used today, after changing from R 3.3.3 and Python 3.5.2 to R 3.5.2 and Python 3.7.1.
SQL Server 2017 CU22 installs the newer R and Python versions, but you have to run an exe to tell SQL Server to use the newer R and Python versions that were installed on disk.
The fix that worked for us :
(1) Create a new directory e.g. D:\MLSTEMP
(2) Create subdirectories D:\MLSTEMP\<SQL_instance>00 and D:\MLSTEMP\<SQL_instance>01
(3) Make sure the local MSSQLLaunchpad service, the Everyone group, and the SQL Server service account have full access to both D:\MLSTEMP and the subdirectories.
Note: the < > brackets are not part of the directory name.
Note: the Launchpad service name is NT Service\MSSQLLaunchpad$<instance_name>.
(4) Go to your
D:\Program Files\Microsoft SQL Server\MSSQL14.SQL001\MSSQL\Binn\pythonlauncher.config
and
D:\Program Files\Microsoft SQL Server\MSSQL14.SQL001\MSSQL\Binn\rlauncher.config
files and alter the WORKING_DIRECTORY setting in each file to be WORKING_DIRECTORY=D:\MLSTEMP
(5) Restart the MSSQLLaunchpad service on the box.
Interestingly, the <SQL_instance>01 directory is the one that's used, not the <SQL_instance>00 one.
It should now work. You may have to play with permissions a bit.
HTH.
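Steps (1) and (2) above can be sketched as a small script. A minimal sketch, in which the base path and instance name are placeholders (and a temp directory stands in for D:\MLSTEMP so the demo runs anywhere):

```python
# Create the MLS working directory and its two per-instance subdirectories,
# mirroring steps (1)-(2) of the fix above.
import os
import tempfile

def create_mls_temp_dirs(base_dir, instance_name):
    """Create <base_dir>\\<instance>00 and <base_dir>\\<instance>01."""
    created = []
    for suffix in ("00", "01"):
        path = os.path.join(base_dir, f"{instance_name}{suffix}")
        os.makedirs(path, exist_ok=True)
        created.append(path)
    # Step (3), granting Full Control to the Launchpad service account and the
    # SQL Server service account, would be done separately (e.g. with icacls).
    return created

base = os.path.join(tempfile.mkdtemp(), "MLSTEMP")
dirs = create_mls_temp_dirs(base, "SQL001")
print(dirs)
```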

Using pandas 0.25.1 in MS SQL Server 2019

I am trying to update to pandas==0.25.1 on my MS SQL Server 2019.
import sqlmlutils
connection = sqlmlutils.ConnectionInfo(server=SERVER_NAME, database=DATABASE_NAME)
sqlmlutils.SQLPackageManager(connection).install('pandas', True, '0.25.1')
which successfully installs and updates pandas:
>>> Installing dependencies...
>>> Done with dependencies, installing main package...
>>> Installing pandas version: 0.25.1
However, when I execute a python script with sp_execute_external_script command
EXEC sp_execute_external_script @language = N'Python',
    @script = N'
import pandas as pd
print(pd.__version__)
'
I get the following output:
>>> STDOUT message(s) from external script:
>>> 0.23.4
i.e., that the instance is using pandas==0.23.4 rather than pandas==0.25.1.
Why is this? Is there a method for using pandas==0.25.1 within MS SQL Server 2019?
From cmd (as administrator; I also stopped the Launchpad service in SQL Server Configuration Manager, though that may not be needed):
Navigate to C:\Program Files\Microsoft SQL Server\xyz\PYTHON_SERVICES\condabin
and type: conda install pandas=0.25.1
After the package download and validation, you'll be asked for [y/n] confirmation of the installation.
If you get an SSL error, you'll need to install OpenSSL for Windows.
I had the same issue. I tried using sqlmlutils, pip, and conda, installing both online and offline, but pandas stayed at version 0.23.4 even though each install reported success.
One thing I noticed: you can install a new package, but you cannot upgrade an existing one.
In my case it was seaborn: I had installed version 0.9 and was trying to upgrade to 0.10, but it would not upgrade.
It seems SQL Server ML Services does not allow upgrading packages.
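One mechanism that can produce exactly this symptom (not MLS-specific, but worth understanding): Python imports the first matching package found on sys.path, so if the directory holding the bundled pandas 0.23.4 precedes the directory you installed 0.25.1 into, the old version shadows the new one. A self-contained illustration using a throwaway fake package:

```python
# Simulate two installs of the same package at different versions and show
# that the earlier sys.path entry wins, regardless of which was installed last.
import os
import sys
import tempfile

def make_fake_pkg(root, version):
    """Create a package 'fakepandas' with the given __version__ under root."""
    pkg = os.path.join(root, "fakepandas")
    os.makedirs(pkg)
    with open(os.path.join(pkg, "__init__.py"), "w") as f:
        f.write(f"__version__ = '{version}'\n")
    return root

old_root = make_fake_pkg(tempfile.mkdtemp(), "0.23.4")
new_root = make_fake_pkg(tempfile.mkdtemp(), "0.25.1")

sys.path.insert(0, new_root)   # the newer install...
sys.path.insert(0, old_root)   # ...shadowed by an entry earlier on sys.path
import fakepandas
print(fakepandas.__version__)  # the first hit on sys.path wins
```

Inside sp_execute_external_script you can print pd.__file__ alongside pd.__version__ to see which copy the instance is actually loading.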

Sqitch Snowflake deployment failing

I am using the Sqitch Snowflake Docker image sqitch/sqitch:1.0.0-snowflake. I can connect to the database without any issues.
However, when I run the following
./sqitch deploy
It just stays at the following forever.
Adding registry tables to test_db
I have checked the logs at snowflake side in information_schema.query_history() table and it is failing at the following
USE SCHEMA sqitch
SQL compilation error:
Object does not exist, or operation cannot be performed.
It seems that even when pointed to an empty database, Sqitch is assuming that the SQITCH schema already exists.
The private_key_path = "/home/bcg/rsa_key_poc.p8" setting in ~/.snowsql/config seems to be the issue.
I can docker exec -it container_id /bin/bash and deploy successfully from within the container, but it doesn't work from the host machine. I am using CentOS 7.7.
This was fixed by adding the SNOWSQL_PRIVATE_KEY_PASSPHRASE variable to the docker-sqitch script (under "Iterate over optional Sqitch and engine variables").

Octopus Deploy - SQL - Deploy DACPAC "Could not connect to database server."

Trying out Octopus Deploy for the first time. Trying to deploy a dacpac to a machine and it keeps failing. I keep getting the following error:
Exception calling "Extract" with "4" argument(s): "Could not connect to database server."
At C:\Octopus\Work\20191023152506-102-81\Script.ps1:394 char:13
+ $dacServices.Extract($dbDacPacFilepath, $TargetDatabase, ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], ParentContainsErrorRecordException
+ FullyQualifiedErrorId : DacServicesException
The remote script failed with exit code 1
The action SQL - Deploy DACPAC on Staging failed
I am currently using SQL Server 2017 and have the DacFramework installed for SQL Server 2016. For the connection string I tried using ., localhost, and the name of the server given in SQL Server Management Studio. I am not passing any credentials; I am using integrated security. I am also passing the database name.
I followed this youtube video also, just without using the project variables.
In my previous experience I just used SqlPackage.exe to deploy a dacpac. That helps with manually testing and ironing out permissions or other issues.
For example:
#example usage: Generate-DBUpdate-Script -server $dbServer -db $dbName -user $dbUser -psw $dbPassword -dacpacFilePath $dacpacFile -publishProfilePath ".\Publish\$dbPublishProfile" -outputFile $SqlUpgradeArtifactName
function Generate-DBUpdate-Script($server, $db, $user, $psw, $dacpacFilePath, $publishProfilePath, $outputFile)
{
    #generate an update script
    & 'C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\SqlPackage.exe' /Action:Script /OutputPath:$outputFile /SourceFile:$dacpacFilePath /Profile:$publishProfilePath /TargetServerName:$server /TargetDatabaseName:$db /TargetUser:$user /TargetPassword:$psw
    #save generated script as deployment artifact
    New-OctopusArtifact $outputFile
}
Can change the action to publish to avoid generating the script and just deploy straight away.
Hope that helps.
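The same pattern can be expressed as an argument-list builder, which makes the SqlPackage invocation easy to inspect or log before running it. A sketch, in which the install path, server, and file names are placeholders:

```python
# Build the SqlPackage.exe argument list for an /Action:Script run, mirroring
# the PowerShell Generate-DBUpdate-Script function above.
import subprocess

SQLPACKAGE = r"C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\SqlPackage.exe"

def sqlpackage_script_args(server, db, user, psw, dacpac, profile, output):
    """Produce an upgrade script rather than deploying directly."""
    return [
        SQLPACKAGE,
        "/Action:Script",  # change to /Action:Publish to deploy straight away
        f"/OutputPath:{output}",
        f"/SourceFile:{dacpac}",
        f"/Profile:{profile}",
        f"/TargetServerName:{server}",
        f"/TargetDatabaseName:{db}",
        f"/TargetUser:{user}",
        f"/TargetPassword:{psw}",
    ]

args = sqlpackage_script_args(
    "localhost", "MyDb", "deploy_user", "secret",
    r".\MyDb.dacpac", r".\Publish\MyDb.publish.xml", "upgrade.sql")
print(args[1])
# subprocess.run(args, check=True)  # would run SqlPackage on a machine that has it
```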
