Role 'DBT_DEV_ROLE' specified in the connect string does not exist or not authorized - database

I am following this tutorial: https://quickstarts.snowflake.com/guide/data_engineering_with_dbt/#1
I ran the following in my worksheet, first using the securityadmin role and then the sysadmin role:
-------------------------------------------
-- dbt credentials
-------------------------------------------
USE ROLE securityadmin;
-- dbt roles
CREATE OR REPLACE ROLE dbt_dev_role;
CREATE OR REPLACE ROLE dbt_prod_role;
------------------------------------------- Please replace with your dbt user password
CREATE OR REPLACE USER dbt_user PASSWORD = "<mysecretpassword>";
GRANT ROLE dbt_dev_role,dbt_prod_role TO USER dbt_user;
GRANT ROLE dbt_dev_role,dbt_prod_role TO ROLE sysadmin;
-------------------------------------------
-- dbt objects
-------------------------------------------
USE ROLE sysadmin;
CREATE OR REPLACE WAREHOUSE dbt_dev_wh WITH WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE MIN_CLUSTER_COUNT = 1 MAX_CLUSTER_COUNT = 1 INITIALLY_SUSPENDED = TRUE;
CREATE OR REPLACE WAREHOUSE dbt_dev_heavy_wh WITH WAREHOUSE_SIZE = 'LARGE' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE MIN_CLUSTER_COUNT = 1 MAX_CLUSTER_COUNT = 1 INITIALLY_SUSPENDED = TRUE;
CREATE OR REPLACE WAREHOUSE dbt_prod_wh WITH WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE MIN_CLUSTER_COUNT = 1 MAX_CLUSTER_COUNT = 1 INITIALLY_SUSPENDED = TRUE;
CREATE OR REPLACE WAREHOUSE dbt_prod_heavy_wh WITH WAREHOUSE_SIZE = 'LARGE' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE MIN_CLUSTER_COUNT = 1 MAX_CLUSTER_COUNT = 1 INITIALLY_SUSPENDED = TRUE;
GRANT ALL ON WAREHOUSE dbt_dev_wh TO ROLE dbt_dev_role;
GRANT ALL ON WAREHOUSE dbt_dev_heavy_wh TO ROLE dbt_dev_role;
GRANT ALL ON WAREHOUSE dbt_prod_wh TO ROLE dbt_prod_role;
GRANT ALL ON WAREHOUSE dbt_prod_heavy_wh TO ROLE dbt_prod_role;
CREATE OR REPLACE DATABASE dbt_hol_dev;
CREATE OR REPLACE DATABASE dbt_hol_prod;
GRANT ALL ON DATABASE dbt_hol_dev TO ROLE dbt_dev_role;
GRANT ALL ON DATABASE dbt_hol_prod TO ROLE dbt_prod_role;
GRANT ALL ON ALL SCHEMAS IN DATABASE dbt_hol_dev TO ROLE dbt_dev_role;
GRANT ALL ON ALL SCHEMAS IN DATABASE dbt_hol_prod TO ROLE dbt_prod_role;
I have this in my profiles.yml file:
dbt_hol:
  target: dev
  outputs:
    dev:
      type: snowflake
      ######## Please replace with your Snowflake account name
      account: xyz.eu-central-1
      user: TEST
      ######## Please replace with your Snowflake dbt user password
      password: password
      role: dbt_dev_role
      database: dbt_hol_dev
      warehouse: dbt_dev_wh
      schema: public
      threads: 200
    prod:
      type: snowflake
      ######## Please replace with your Snowflake account name
      account: xyz.eu-central-1
      user: TEST
      ######## Please replace with your Snowflake dbt user password
      password: password
      role: dbt_prod_role
      database: dbt_hol_prod
      warehouse: dbt_prod_wh
      schema: public
      threads: 200
Although I am following the tutorial, when I run dbt debug I get this error:
Connection:
account: xyz.eu-central-1
user: TEST
database: dbt_hol_dev
schema: public
warehouse: dbt_dev_wh
role: dbt_dev_role
client_session_keep_alive: False
Connection test: ERROR
dbt was unable to connect to the specified database.
The database returned the following error:
Database Error
250001 (08001): Failed to connect to DB: xyz.eu-central-1.snowflakecomputing.com:443. Role 'DBT_DEV_ROLE' specified in the connect string does not exist or not authorized. Contact your local system administrator, or attempt to login with another role, e.g. PUBLIC.
What could I be doing wrong?

As I can see, you are trying to connect with the user TEST:
Connection:
account: xyz.eu-central-1
user: TEST
database: dbt_hol_dev
schema: public
warehouse: dbt_dev_wh
role: dbt_dev_role
client_session_keep_alive: False
Connection test: ERROR
On the other hand, you granted dbt_dev_role only to the user dbt_user (and to the role sysadmin):
GRANT ROLE dbt_dev_role,dbt_prod_role TO USER dbt_user;
GRANT ROLE dbt_dev_role,dbt_prod_role TO ROLE sysadmin;
You need to grant the role to the user TEST as well.
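A minimal fix, assuming TEST really is the user you want to connect with (run as securityadmin, the same role that created the roles above):
USE ROLE securityadmin;
GRANT ROLE dbt_dev_role TO USER TEST;
GRANT ROLE dbt_prod_role TO USER TEST;
Alternatively, change user: TEST to user: dbt_user in profiles.yml, since dbt_user already holds both roles.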

Related

How to connect MS SQL Database with Azure Databricks and run command

I want to connect an Azure SQL database to Azure Databricks via Python/Spark. I can read data with a pushdown query when I run SELECT * FROM ..., but I need to run ALTER DATABASE to scale the database up or down.
I must change this part
spark.read.jdbc(url=jdbcUrl, table=pushdown_query, properties=connectionProperties)
otherwise I get the error: Incorrect syntax near the keyword 'ALTER'.
Can anyone help? Much appreciated.
jdbcHostname = "xxx.database.windows.net"
jdbcDatabase = "abc"
jdbcPort = 1433
jdbcUrl = "jdbc:sqlserver://{0}:{1};database={2}".format(jdbcHostname, jdbcPort, jdbcDatabase)
connectionProperties = {
    "user": "..............",
    "password": "............",
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver"
}
pushdown_query = "(ALTER DATABASE [DBNAME] MODIFY (SERVICE_OBJECTIVE = 'S0')) dual_down"
df = spark.read.jdbc(url=jdbcUrl, table=pushdown_query, properties=connectionProperties)
display(df)
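For reference, spark.read.jdbc wraps whatever you pass as table in a SELECT subquery, which is why DDL such as ALTER DATABASE fails with a syntax error. A minimal sketch of one workaround is to open a raw JDBC connection through the JVM and execute the statement directly; this reuses jdbcUrl and connectionProperties from above and assumes the SQL Server JDBC driver is available on the cluster (it ships with Databricks runtimes):
# Sketch: run DDL over a plain JDBC connection instead of spark.read.jdbc
driver_manager = spark.sparkContext._gateway.jvm.java.sql.DriverManager
conn = driver_manager.getConnection(
    jdbcUrl,
    connectionProperties["user"],
    connectionProperties["password"],
)
stmt = conn.createStatement()
# executeUpdate sends the statement as-is, with no SELECT wrapper around it
stmt.executeUpdate("ALTER DATABASE [DBNAME] MODIFY (SERVICE_OBJECTIVE = 'S0')")
stmt.close()
conn.close()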

Pyodbc - SQL Server database restore incomplete

I am trying to restore a database from Python 3.7 on Windows using the script below.
DROP DATABASE works as expected.
RESTORE DATABASE does not: the database always shows "Restoring...." and never completes.
The database files are created in the specified path, but the database is not usable.
How can I fix this?
import pyodbc

try:
    pyconn = pyodbc.connect('DRIVER={ODBC Driver 17 for SQL Server};SERVER=MY-LAPTOP\\SQLEXPRESS;DATABASE=master;UID=sa;PWD=sa123')
    cursor = pyconn.cursor()
    pyconn.autocommit = True
    sql = "IF EXISTS (SELECT 0 FROM sys.databases WHERE name = 'data_test') BEGIN DROP DATABASE data_test END"
    pyconn.cursor().execute(sql)
    sql = """RESTORE DATABASE data_test FROM DISK='G:\\dbbak\\feb-20-2020\\data_test_backup_2020_02_20_210010_3644975.bak' WITH RECOVERY,
    MOVE N'Omnibus_Data' TO N'd:\\db\\data_test.mdf',
    MOVE N'Omnibus_Log' TO N'd:\\db\\data_test_1.ldf';"""
    print(sql)
    pyconn.cursor().execute(sql)
    while pyconn.cursor().nextset():
        pass
    pyconn.cursor().close()
except Exception as e:
    print(str(e))
You're not using a single cursor, so your program exits before the restore is complete, aborting it in the middle.
Should be something like:
conn = pyodbc.connect(' . . .')
conn.autocommit = True  # RESTORE cannot run inside a transaction
cursor = conn.cursor()
cursor.execute(sql)
# Drain all informational result sets so the restore runs to completion
while cursor.nextset():
    pass
cursor.close()
After hours I found the solution. It must be performed on MASTER, other sessions must be terminated, the DB must be set OFFLINE, then RESTORE, and then set ONLINE again.
def backup_and_restore():
    server = 'localhost,1433'
    database = 'myDB'
    username = 'SA'
    password = 'password'
    cnxn = pyodbc.connect('DRIVER={ODBC Driver 17 for SQL Server};SERVER='+server+';DATABASE=MASTER;UID='+username+';PWD='+password)
    cnxn.autocommit = True

    def execute(cmd):
        cursor = cnxn.cursor()
        cursor.execute(cmd)
        while cursor.nextset():
            pass
        cursor.close()

    execute("BACKUP DATABASE [myDB] TO DISK = N'/usr/src/app/myDB.bak'")
    execute("ALTER DATABASE [myDB] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;")
    execute("ALTER DATABASE [myDB] SET OFFLINE;")
    execute("RESTORE DATABASE [myDB] FROM DISK = N'/usr/src/app/myDB.bak' WITH REPLACE")
    execute("ALTER DATABASE [myDB] SET ONLINE;")
    execute("ALTER DATABASE [myDB] SET MULTI_USER;")

Add-AzureADAdministrativeUnitMember throws error insufficient privileges

I'm trying to create a runbook to add users to an administrative unit in Azure AD.
Has anyone been able to add members to an administrative unit using a service principal, or does anyone know if it's even possible at this point? The module containing the commands is in preview. Any clues would be appreciated.
It works fine when running with my regular admin account, but not with the AzureRunAsAccount.
These are the API permissions added to the Automation service principal.
I've even tried granting the principal Global Administrator rights in the tenant.
Here is the script I'm running:
Import-Module -Name AzureADPreview
$Conn = Get-AutomationConnection -Name AzureRunAsConnection
Connect-AzureAD -Tenant $Conn.TenantID `
-ApplicationId $Conn.ApplicationID -CertificateThumbprint $Conn.CertificateThumbprint
$XXAU = Get-AzureADAdministrativeUnit -ObjectId "REMOVED"
$XXAUMembers = Get-AzureADAdministrativeUnitMember -ObjectId "REMOVED" | Select -ExpandProperty ObjectID
$XXUsers = Get-AzureADGroup -SearchString "REMOVED" | Get-AzureADGroupMember | Select -ExpandProperty ObjectID
Foreach ($XXuser in $XXUsers) {
    if ($XXAUMembers -notcontains $XXuser) {
        Add-AzureADAdministrativeUnitMember -ObjectId $XXAU.ObjectId -RefObjectId $XXuser
    }
}
This is the error message:
Add-AzureADAdministrativeUnitMember : Error occurred while executing
AddAdministrativeUnitMember
Code: Authorization_RequestDenied
Message: Insufficient privileges to complete the operation.
RequestId:
DateTimeStamp: Fri, 19 Jul 2019 20:14:39 GMT
HttpStatusCode: Forbidden
HttpStatusDescription: Forbidden
HttpResponseStatus: Completed

How to check in Access VBA whether ODBC SQL Server table has write access?

I have an ODBC linked table in Microsoft Access which is connected to SQL Server.
For some users, the login that connects Access to SQL Server has access to only one database, with the db_datareader role, so they cannot edit any data in the tables.
Other users have db_datareader + db_datawriter and can edit any data.
How can I check in VBA that my table is not editable for the db_datareader-only logins?
You can use passthrough queries to get user role membership, and querydefs to create or access them:
Public Function is_datawriter() As Boolean
    Dim qdef As DAO.QueryDef
    Dim rst As DAO.Recordset
    Set qdef = CurrentDb.CreateQueryDef("")
    qdef.Connect = "ODBC; MY_ODBC_CONN_STRING"
    qdef.SQL = "SELECT IS_ROLEMEMBER('db_datawriter')"
    Set rst = qdef.OpenRecordset(dbOpenDynaset)
    If rst.Fields(0).Value = 1 Then is_datawriter = True
End Function
Testing table-specific rights is somewhat more difficult, but in your case this will probably do.
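As a usage sketch, you could call it from a form's Load event and lock the form down for read-only logins (the form itself is hypothetical; AllowEdits, AllowAdditions and AllowDeletions are standard Access form properties):
Private Sub Form_Load()
    ' Hypothetical form bound to the linked table:
    ' block edits when the login lacks db_datawriter
    If Not is_datawriter() Then
        Me.AllowEdits = False
        Me.AllowAdditions = False
        Me.AllowDeletions = False
    End If
End Sub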

server properties in loops in Capistrano

I am new to Capistrano.
I need to get the server properties in tasks using a loop. I am using this code:
server 'IP_address', user: 'root', password: 'pass', roles: %w{web}, database: 'production1'
server 'IP_address', user: 'root', password: 'pass', roles: %w{web}, database: 'production2'

task :backup_FilesDatabaseServerfiles do
  on roles(:web) do |h|
    puts h.database
  end
end
How can I fetch the database option in the above task?
This should do it. Custom keys passed to server are collected on the host's properties object rather than exposed as methods on the host itself:
task :backup_FilesDatabaseServerfiles do
  on roles :web do |server|
    p server.properties.database
  end
end
Per Capistrano 3: use server custom variable in task

Resources