Cannot find the CREDENTIAL because it does not exist or you do not have permission - Azure SQL Server - sql-server

I have an Azure SQL database where I'm trying to create an external data source to load CSV data into the database from blob storage.
I created a database scoped credential using the following query:
CREATE DATABASE SCOPED CREDENTIAL [https://forecaststorage01.blob.core.windows.net]
WITH IDENTITY = 'SHARED ACCESS KEY',
SECRET = 'SAS Token';
Then I created an external data source:
CREATE EXTERNAL DATA SOURCE [demodata]
WITH (
TYPE = BLOB_STORAGE,
LOCATION = 'https://forecaststorage01.blob.core.windows.net',
CREDENTIAL = [https://forecaststorage01.blob.core.windows.net]
);
When I try to run the following query,
SET NOCOUNT ON;
BULK INSERT input.RawData
FROM 'csv-data/egypt_sales_data.csv'
WITH (DATA_SOURCE = 'demodata',
FORMAT = 'CSV');
It says:
"Failed to execute query. Error: Cannot find the CREDENTIAL 'https://forecaststorage01.blob.core.windows.net' because it does not exist or you do not have permission."
Then I granted the user 'dbo' access to the database scoped credential using the following query:
GRANT CONTROL ON DATABASE SCOPED CREDENTIAL::[https://forecaststorage01.blob.core.windows.net] TO [dbo]
Then I tried the bulk insert again, but it still shows the error:
"Failed to execute query. Error: Cannot find the CREDENTIAL 'https://forecaststorage01.blob.core.windows.net' because it does not exist or you do not have permission."
Why can't I access the database scoped credential?

If you try to run the first statement on Azure SQL you should get an error:
'CREATE CREDENTIAL' is not supported in this version of SQL Server.
You have to use the CREATE DATABASE SCOPED CREDENTIAL command, and the IDENTITY value must be 'SHARED ACCESS SIGNATURE', not 'SHARED ACCESS KEY':
CREATE DATABASE SCOPED CREDENTIAL [https://forecaststorage01.blob.core.windows.net]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'SAS Token';
That should fix your issue.
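If the credential has already been created with the wrong IDENTITY, it may be necessary to drop the dependent external data source and the credential before recreating them. A minimal cleanup sketch, using the object names from the question:
-- The data source depends on the credential, so drop it first
DROP EXTERNAL DATA SOURCE [demodata];
DROP DATABASE SCOPED CREDENTIAL [https://forecaststorage01.blob.core.windows.net];
-- Recreate the credential with IDENTITY = 'SHARED ACCESS SIGNATURE', then the data source.
-- Note: the SAS token used as the SECRET should not include the leading '?'.
CREATE DATABASE SCOPED CREDENTIAL [https://forecaststorage01.blob.core.windows.net]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'SAS Token';
CREATE EXTERNAL DATA SOURCE [demodata]
WITH (
TYPE = BLOB_STORAGE,
LOCATION = 'https://forecaststorage01.blob.core.windows.net',
CREDENTIAL = [https://forecaststorage01.blob.core.windows.net]
);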

Related

Why is my Azure SQL Elastic Query CREATE DATABASE SCOPED CREDENTIAL failing on "WITH Identity"?

I am trying to create an external table to be able to elastic query across databases on my Azure SQL server. I am following the steps outlined in:
https://learn.microsoft.com/en-us/azure/azure-sql/database/elastic-query-getting-started-vertical and https://www.mssqltips.com/sqlservertip/6445/azure-sql-cross-database-query/.
After creating the master key
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<master_key_password>';
I then try to create the Database Scoped Credential that I am going to use to access the server for the external database.
CREATE DATABASE SCOPED CREDENTIAL = ElasticDBQueryCred
WITH IDENTITY = '<username I use to login to server>',
SECRET = '<password I use to login to server>';
The error then pops up on IDENTITY that says
Incorrect syntax near 'IDENTITY'. Expecting CREATEDBOPT_BACKUPSTORAGEREDUNDANCY, CREATEDBOPT_CATALOGCOLLATION, CREATEDBOPT_FILESTREAM, CREATEDBOPT_LOGAPPLY, CREATEDBOPT_OTHER, or CREATEDBOPT_PERSISTENT_LOG_BUFFER
There is nothing in the documentation about this.
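For comparison, the documented syntax takes the credential name directly after CREATE DATABASE SCOPED CREDENTIAL, with no equals sign. A sketch using the placeholders from the question:
-- No '=' between CREDENTIAL and the credential name
CREATE DATABASE SCOPED CREDENTIAL ElasticDBQueryCred
WITH IDENTITY = '<username I use to login to server>',
SECRET = '<password I use to login to server>';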

How to connect database to data files in Azure Blob?

This resource states that an Azure SQL database may be attached to data files in Azure Blob by using the following command:
WITH IDENTITY='SHARED ACCESS SIGNATURE',
SECRET = '<your SAS key>'
CREATE DATABASE testdb
ON
( NAME = testdb_dat,
FILENAME = 'https://testdb.blob.core.windows.net/data/TestData.mdf' )
LOG ON
( NAME = testdb_log,
FILENAME = 'https://testdb.blob.core.windows.net/data/TestLog.ldf')
This results in Syntax Error near "ON". What is the issue here?
To connect Azure SQL Database to Azure blob storage, you need to create an external data source with a database scoped credential:
CREATE DATABASE SCOPED CREDENTIAL BlobCred
WITH
IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'SAS token' ;
CREATE EXTERNAL DATA SOURCE BlobStg
WITH
( LOCATION = 'https://storagename.blob.core.windows.net',
CREDENTIAL = BlobCred,
TYPE = BLOB_STORAGE
) ;
Refer to the official documentation for more details.
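Once the data source exists, it can be referenced from BULK INSERT or OPENROWSET. A minimal sketch, assuming a hypothetical target table dbo.MyTable and a file sales.csv in a container named csv-data:
-- dbo.MyTable and csv-data/sales.csv are placeholder names;
-- the file path is relative to the LOCATION of the external data source
BULK INSERT dbo.MyTable
FROM 'csv-data/sales.csv'
WITH (
DATA_SOURCE = 'BlobStg',
FORMAT = 'CSV',
FIRSTROW = 2 -- skip the header row, if the file has one
);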

Cannot perform CREATE TABLE using Liquibase update for Snowflake as this session does not have a current database?

I am trying to run a Snowflake update using Liquibase with the following command, which has the fully qualified name of the server, database, and schema.
liquibase --username=myusername --password=mypassword --url="jdbc:snowflake://myserver-name-europe.azure.snowflakecomputing.com/?db=mydb&schema=public" --changelog-file=/samplechangelog.snowflake.sql update
But it just gives me an error that I didn't specify the database, which I did.
Also, the same command works with another Snowflake account. I even copied and pasted it from another project; only the server URL and the database name are different.
Here is the error message:
Unexpected error running Liquibase: Cannot perform CREATE TABLE. This session does not have a current database. Call 'USE DATABASE', or use a qualified name. [Failed SQL: (90105) CREATE TABLE DATABASECHANGELOGLOCK (ID INT NOT NULL, LOCKED BOOLEAN NOT NULL, LOCKGRANTED TIMESTAMP_NTZ, LOCKEDBY VARCHAR(255), CONSTRAINT PK_DATABASECHANGELOGLOCK PRIMARY KEY (ID))]
I suggest setting up a default role for user "username":
ALTER USER myusername SET DEFAULT_ROLE = my_default_role;
If the user does not have a default warehouse assigned, then:
ALTER USER myusername SET DEFAULT_WAREHOUSE = my_warehouse_name;
It is also possible to set the database/schema/warehouse/... in the db.properties file. Related: Specifying Properties in a Connection Profile
This error is usually due to insufficient permissions - if the user cannot 'see' the database, liquibase silently sets it to null and then fails with this error. The best method of troubleshooting this is to connect via Snowflake web GUI with the same credentials (without setting any role or warehouse), then try running 'use database ...'.
The solution is to grant enough permissions to the user, either directly or via a role (but then the role name needs to be passed in the JDBC connection string as well).
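A sketch of what such grants might look like in Snowflake; the role and warehouse names here are hypothetical, and the exact privileges depend on what the changelog does:
-- Let the role see and use the target database, schema and warehouse
GRANT USAGE ON DATABASE mydb TO ROLE liquibase_role;
GRANT USAGE ON SCHEMA mydb.public TO ROLE liquibase_role;
GRANT CREATE TABLE ON SCHEMA mydb.public TO ROLE liquibase_role;
GRANT USAGE ON WAREHOUSE compute_wh TO ROLE liquibase_role;
-- Assign the role to the user that Liquibase connects as
GRANT ROLE liquibase_role TO USER myusername;
The role can then be passed in the JDBC URL as an additional parameter, e.g. &role=liquibase_role.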
Yes, the issue was in the permissions for the user that I am trying to use to connect to Snowflake. Here is what I did:
ALTER USER myuser SET DEFAULT_ROLE = SYSADMIN;
This led to another error because there was no active warehouse, so I set the warehouse in the JDBC connection string as follows:
liquibase --username=myusername --password=mypassword --url="jdbc:snowflake://myserver-name-europe.azure.snowflakecomputing.com/?warehouse=COMPUTE_WH&db=mydb&schema=public" --changelog-file=/samplechangelog.snowflake.sql update
So I can pass any of these values in the JDBC connection string, for example:
jdbc:snowflake://myorganization-myaccount.snowflakecomputing.com/?user=peter&warehouse=mywh&db=mydb&schema=public
Here is the page with more info about the JDBC driver connection string:
JDBC connection string to Snowflake

Permissions issue trying to create an external data source on Azure SQL Database

Please bear with me as I am trying to learn Azure. I have in my resource group a SQL Server database, and a blob storage account with a container. I am the owner of these resources.
I am trying to create an external data source on my SQL database to link to my blob storage account, but I am running into a permissions issue that I cannot seem to resolve. Running the query:
CREATE EXTERNAL DATA SOURCE MyAzureBlobStorage
WITH (
TYPE = BLOB_STORAGE,
LOCATION = 'https://[redacted].blob.core.windows.net/'
);
Returns this error message:
Msg 15247, Level 16, State 1, Line 1
User does not have permission to perform this action.
My Google-fu seems to be betraying me, as I can't find any references to this issue. Am I missing something basic? I'm browsing through my Azure dashboard but I can't find any obvious way to manage specific database permissions, although I would have assumed that, as the owner, I had the maximum possible permissions.
Please create the credential as shown below:
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'some strong password';
CREATE DATABASE SCOPED CREDENTIAL MyAzureBlobStorageCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'sv=2015-12-11&ss=b&srt=sco&sp=rwac&se=2017-02-01T00:55:34Z&st=2016-12-29T16:55:34Z&spr=https&sig=copyFromAzurePortal';
CREATE EXTERNAL DATA SOURCE MyAzureBlobStorage
WITH ( TYPE = BLOB_STORAGE,
LOCATION = 'https://myazureblobstorage.blob.core.windows.net',
CREDENTIAL= MyAzureBlobStorageCredential);
I was having the same error when trying to create an EXTERNAL DATA SOURCE. What worked for me was adding GRANT CONTROL for the database user:
GRANT CONTROL TO your_db_user;
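GRANT CONTROL on the database is quite broad. A narrower grant that is typically sufficient for creating external data sources is the ALTER ANY EXTERNAL DATA SOURCE permission; a sketch, reusing the same placeholder user name:
-- Narrower alternative to CONTROL for CREATE EXTERNAL DATA SOURCE
GRANT ALTER ANY EXTERNAL DATA SOURCE TO your_db_user;
Creating the master key and the database scoped credential may still require additional permissions, so CONTROL remains the simplest option for a database owner.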

SQL Azure cross DB Query Privileges

I'm currently working through an example on cross database queries on SQL Azure. http://www.c-sharpcorner.com/article/cross-database-queries-in-azure-sql/
I'm currently working on the following section -
CREATE EXTERNAL DATA SOURCE RefmyDemoDB2
WITH
(
TYPE=RDBMS,
LOCATION='your server name',
DATABASE_NAME='myDemoDB2',
CREDENTIAL = your "Server admin login"
);
I'm getting the following error: "The specified credential cannot be found or the user does not have permission to perform this action."
I have the correct LOCATION and DATABASE_NAME, however the CREDENTIAL seems wrong. I am using the Server-Admin account that is displayed in the overview of the database server on Azure; I also use it to log into Management Studio and can query both databases OK.
Can anyone please advise?
Try this, creating new credentials:
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'STrongPA5swor$';
CREATE DATABASE SCOPED CREDENTIAL MyLogin
WITH IDENTITY = 'MyLogin',
SECRET = 'STrongPA5swor$';
CREATE EXTERNAL DATA SOURCE PHPSTGRemoteReferenceData
WITH
(
TYPE=RDBMS,
LOCATION='servername',
DATABASE_NAME='DBName',
CREDENTIAL= MyLogin
);
This works for me.
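To actually run the cross-database query, an external table is then created against that data source and mapped to the remote table. A sketch with a hypothetical remote table; the column list must match the table in the remote database:
CREATE EXTERNAL TABLE [dbo].[RemoteCustomers]
(
[CustomerId] INT NOT NULL,
[CustomerName] NVARCHAR(100)
)
WITH
(
DATA_SOURCE = PHPSTGRemoteReferenceData
);
-- Queries against the external table are forwarded to the remote database
SELECT TOP (10) * FROM [dbo].[RemoteCustomers];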
