Task does not execute in Snowflake

I created a task in Snowflake using the CREATE TASK command.
However, the task seems to be suspended, so I wanted to resume it with the following command:
ALTER TASK TASK_DELETE3 RESUME;
I'm receiving the following error message:
Cannot execute task , EXECUTE TASK privilege must be granted to owner role
Does anyone know how to solve this issue?

You need to run this command as ACCOUNTADMIN:
GRANT EXECUTE TASK ON ACCOUNT TO ROLE <your_role>;
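For reference, a minimal end-to-end sketch (TASK_OWNER_ROLE is a placeholder for whichever role actually owns TASK_DELETE3):
-- Run as ACCOUNTADMIN
USE ROLE ACCOUNTADMIN;
GRANT EXECUTE TASK ON ACCOUNT TO ROLE TASK_OWNER_ROLE;
-- Then, as the owner role, resume the task and confirm its state
USE ROLE TASK_OWNER_ROLE;
ALTER TASK TASK_DELETE3 RESUME;
SHOW TASKS LIKE 'TASK_DELETE3';  -- the "state" column should now read "started"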

Related

execute as user failed for the requested user 'dbo' in the database, but if I remove the run as user dbo, then there is no error

My job calls a stored procedure dbo.HBM_SYNC_QAS_QUARTERS in the database CSDHRMS_UAT, whose owner is HRMS-PRE-DB-1\Administrator. The stored procedure dbo.HBM_SYNC_QAS_QUARTERS selects records from another database, QASMGR, whose owner is also HRMS-PRE-DB-1\Administrator, on the same instance.
The job owner is also HRMS-PRE-DB-1\Administrator.
In the job step, if I set the run as user to dbo, the error 'execute as user failed for the requested user 'dbo' in the database' is thrown, but if I remove the run as user, the job runs successfully. Why?
And if I just run the stored procedure dbo.HBM_SYNC_QAS_QUARTERS in an SSMS query window against database CSDHRMS_UAT, there is no error at all. Why?
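One way to narrow this down (a diagnostic sketch, not specific to your setup) is to attempt the same impersonation the "Run as user" setting performs, and to check which login the database owner maps to, since an orphaned or mismatched dbo is a common cause of this error:
-- In CSDHRMS_UAT: try the impersonation the job step would perform
USE CSDHRMS_UAT;
EXECUTE AS USER = 'dbo';   -- if this fails here, the job step fails the same way
REVERT;
-- Check which server login the owner (dbo) of each database maps to
SELECT d.name, sp.name AS owner_login
FROM sys.databases AS d
LEFT JOIN sys.server_principals AS sp ON d.owner_sid = sp.sid
WHERE d.name IN ('CSDHRMS_UAT', 'QASMGR');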

Azure Data Factory pipelines confuse accounts for DB access

We faced the following problem:
In the same data factory we have pipeline A and pipeline B. They include Copy Data activities (A and B). Each Copy Data activity is linked to a different Data Set (A and B) and a different SQL Linked Service (A and B) with different service accounts (A and B). All Linked Services use the same Integration Runtime (IR) to connect to the on-premises database.
Activities A and B run stored procedures A and B. Service account A has execute permission for procedure A, and service account B for procedure B (obviously).
But when pipelines A and B are started at the same time by their timer triggers, we get these errors:
[The EXECUTE permission was denied on the Stored procedure 'A'] in the pipeline A log and [The EXECUTE permission was denied on the Stored procedure 'B'] in the pipeline B log.
When we stopped one of the time triggers, so the two pipelines no longer ran in parallel, everything worked fine!
I believe pipelines A and B confuse accounts in some way.
Does anyone know how to check and fix it?
It seems like the two pipelines are trying to use the same resources at the same time, which is why you are getting the error.
Please try the below solutions to resolve the issue.
You can run Pipeline A and Pipeline B in sequence: arrange the activities of pipeline A to run first, and at the end of pipeline A add an Execute Pipeline activity that runs Pipeline B.
Alternatively, you can create a custom "executor" role and a new user, then grant execute permissions to it. Use this user's credentials in your Pipeline B Linked Service to access the database.
In your target DB (the one the Linked Service connects to), first create a new user:
CREATE USER MyUser FOR LOGIN MyLogin WITH DEFAULT_SCHEMA=[dbo]
GO
Then, in the same DB:
CREATE ROLE [db_executor] AUTHORIZATION [dbo]
GO
GRANT EXECUTE TO [db_executor]
GO
EXEC sp_addrolemember @rolename = 'db_executor', @membername = 'MyUser'
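To confirm the setup worked (run in the same target database, using the names above):
-- Who is a member of db_executor?
SELECT r.name AS role_name, m.name AS member_name
FROM sys.database_role_members AS drm
JOIN sys.database_principals AS r ON drm.role_principal_id = r.principal_id
JOIN sys.database_principals AS m ON drm.member_principal_id = m.principal_id
WHERE r.name = 'db_executor';
-- What has db_executor been granted?
SELECT permission_name, state_desc
FROM sys.database_permissions
WHERE grantee_principal_id = DATABASE_PRINCIPAL_ID('db_executor');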

Command failed: The EXECUTE permission was denied on the object X, database Y, schema 'dbo'

I created a new database-scoped credential as well as a new target group to run my jobs on an Azure SQL Database with the help of an Elastic Job Agent. The Elastic Job Agent is simply tied to the database on which I am running the job; other than that, it does not play any part in running the following commands.
The job has only one step, and in that step I set @credential_name to the credential that I created and @target_group_name to the newly created target group.
EXEC jobs.sp_add_job @job_name = 'UpdatePowerBIReport'
, @description = 'Update the Power BI Report by calling the stored procedure';
-- And add the job step
EXEC jobs.sp_add_jobstep
@job_name = 'UpdatePowerBIReport'
, @command = N'EXEC dbo.UpdatePowerBI;'
, @credential_name = 'ElasticJobUserCredential'
, @target_group_name = 'MyTargetGroup';
As you can see in the command, the job only runs a stored procedure named UpdatePowerBI under the dbo schema.
When I run the job and monitor its execution, it fails with the following message:
Command failed: The EXECUTE permission was denied on the object
'UpdatePowerBI', database 'MYDBNAME', schema 'dbo'. (Msg 229, Level
14, State 5, Line 1)
I am running this command in SQL Server Management Studio.
On the target database, have you completed the following:
1. Created a USER with the same name and password as the credential in the job database?
2. GRANTED the EXECUTE permission to that USER on the stored procedure?
The error message is clear that the user context that is calling the procedure has not been granted the permissions to do so.
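A minimal sketch of those two steps, assuming the identity and secret stored in ElasticJobUserCredential are jobuser and its password (substitute your own names):
-- Run in the target database MYDBNAME
-- 1. Contained user whose name and password match the credential's identity and secret
CREATE USER [jobuser] WITH PASSWORD = '<same password as the credential>';
-- 2. Let that user execute the procedure the job step calls
GRANT EXECUTE ON OBJECT::dbo.UpdatePowerBI TO [jobuser];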

Grant monitor on snowflake task

I am trying to grant the MONITOR privilege on all current and future tasks in a Snowflake database to a particular role.
The documentation offers no examples.
I tried GRANT MONITOR ON ALL TASKS ON DATABASE MY_DB TO ROLE ROLE_OVER
Is something like that possible? Do you have to go schema by schema? Individual task by task?
Try this (IN instead of ON):
GRANT MONITOR ON ALL TASKS IN DATABASE MY_DB TO ROLE ROLE_OVER;
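ALL TASKS only covers tasks that exist at the time of the grant. For tasks created later, Snowflake's future grants can be used at the database or schema level (MY_SCHEMA is a placeholder):
GRANT MONITOR ON FUTURE TASKS IN DATABASE MY_DB TO ROLE ROLE_OVER;
-- or, per schema:
GRANT MONITOR ON FUTURE TASKS IN SCHEMA MY_DB.MY_SCHEMA TO ROLE ROLE_OVER;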

Execution permission is denied on a stored procedure which is created by me while running a report which uses the same sp

I have created a stored procedure and used it to create a report in Report Builder. When I run the report, it says that the EXECUTE permission was denied. But I can execute that stored procedure in SQL Server directly, and I could run the report when I supplied the same stored procedure as a query instead. Can you please suggest a solution? Thanks in advance.
Log in to SQL Server and do the following:
GRANT EXEC ON [YourStoredProcedure] TO [TheReportServerUser]
Make sure the account you are using for this process has the appropriate permissions.
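The grant goes to whatever account the report's data source connects as. A schema-qualified sketch with placeholder names (dbo.YourStoredProcedure, DOMAIN\ReportServerUser), plus an optional check from that account's point of view:
GRANT EXECUTE ON OBJECT::dbo.YourStoredProcedure TO [DOMAIN\ReportServerUser];
-- 1 means the account now holds EXECUTE on the procedure
EXECUTE AS USER = N'DOMAIN\ReportServerUser';
SELECT HAS_PERMS_BY_NAME('dbo.YourStoredProcedure', 'OBJECT', 'EXECUTE');
REVERT;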