How to create a file in Azure Blob Storage using SQL Server - sql-server

I am looking for an approach to create a file in Azure Blob Storage using a SQL Server stored procedure.
I tried the approach below:
CREATE CREDENTIAL indcredential
WITH IDENTITY = 'storage account name', -- the name of the storage account you specified when creating the storage account
SECRET = 'key1';                        -- the storage account access key
GO
BACKUP DATABASE ILS
TO URL = 'blobpath/dbbackup.bacpac'
/* URL includes the endpoint for the BLOB service, followed by the container name and the name of the backup file */
WITH CREDENTIAL = 'indcredential'; /* name of the credential you created in the previous step */
GO
Using the above code, I am able to create the database backup file dbbackup.bacpac in Azure Storage.
Any idea how to create a simple text file in an Azure blob path?

I am looking for an approach to create a file in Azure Blob Storage using a SQL Server stored procedure.
There's no direct way to do this. You can read files from BLOB storage, but not write them. Instead, you can use an Azure Function, or perhaps a Logic App to copy data from Azure SQL Database to Azure Storage.

Related

Database-to-Database queries using Managed Identity between Azure SQL Databases

I'm currently trying to answer a problem that can only be answered by combining the datasets of two different Azure SQL databases (different servers, if that matters).
When using user+password authentication, there was a way to do cross-database queries like this (Azure SQL Database Elastic Queries):
CREATE DATABASE SCOPED CREDENTIAL RemoteCredential WITH
IDENTITY = '<remote database user name>',
SECRET = '<remote database user password>';

CREATE EXTERNAL DATA SOURCE RemoteDatabase WITH (
    Location = '<database server URL>',
    DATABASE_NAME = '<database name>',
    CREDENTIAL = RemoteCredential,
    TYPE = RDBMS
);

CREATE EXTERNAL TABLE [dbo].[RemoteTable] (
    <Remote table definition>
)
WITH ( DATA_SOURCE = RemoteDatabase );
SELECT TOP(1) * FROM [RemoteTable]
That worked very well before, but we have since migrated to managed-identity-only logins, and user + password authentication is no longer an option.
I've found the below snippet for changing the credential to a managed identity, in the context of accessing Azure Storage accounts, here:
CREATE DATABASE SCOPED CREDENTIAL RemoteCredential
WITH IDENTITY = 'Managed Identity'
But this results in the following error message:
Msg 33047, Level 16, State 5, Line 47
Fail to obtain or decrypt secret for credential 'RemoteCredential'.
I've also tried to provide my personal username or the source database server's name, but with the same result.
Some more details:
Both database servers are part of the same tenant and subscription
I've enabled system-assigned identity on the source database server that I am querying.
I've also created an external-provider user in the target database for use with managed identity and granted it the required roles.
My user has the required permissions on both databases.
Access with managed identity from my Management Studio works fine for both databases.
The final solution would have to work with Azure SQL databases in Azure China, but I would be grateful for a solution in Azure Global as well.
My current assumption is that managed identity authentication towards other Azure SQL databases from within a SQL query is not yet supported. But maybe someone else has found a way to make this work.
Have you tried Azure SQL Database elastic query?
It's buggy and slow, and it has been in preview for two years now, but it's the closest thing I could find.

How to connect a database to data files in Azure Blob?

This resource states that a database may be attached to data files in Azure Blob Storage by using the following command:
CREATE CREDENTIAL [https://testdb.blob.core.windows.net/data]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = '<your SAS key>'
CREATE DATABASE testdb
ON
( NAME = testdb_dat,
  FILENAME = 'https://testdb.blob.core.windows.net/data/TestData.mdf' )
LOG ON
( NAME = testdb_log,
  FILENAME = 'https://testdb.blob.core.windows.net/data/TestLog.ldf')
This results in a syntax error near "ON". What is the issue here?
To connect Azure SQL Database to Azure blob storage, you need to create an external data source with a database scoped credential.
CREATE DATABASE SCOPED CREDENTIAL BlobCred
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = 'SAS token';

CREATE EXTERNAL DATA SOURCE BlobStg
WITH (
    LOCATION = 'https://storagename.blob.core.windows.net',
    CREDENTIAL = BlobCred,
    TYPE = BLOB_STORAGE
);
Refer to the official documentation for more details.
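Once the data source exists, it can be used by bulk read operations. A minimal sketch, assuming a file named sample.csv in a container named data (the container, file, and table names are assumptions, not part of the answer):
-- Read the whole file as a single value through the BLOB_STORAGE data source.
-- 'data/sample.csv' is container/path, relative to the LOCATION of BlobStg.
SELECT BulkColumn AS FileContents
FROM OPENROWSET(
        BULK 'data/sample.csv',
        DATA_SOURCE = 'BlobStg',
        SINGLE_CLOB) AS f;

-- Or load it into a table.
BULK INSERT dbo.SampleImport
FROM 'data/sample.csv'
WITH (DATA_SOURCE = 'BlobStg', FORMAT = 'CSV', FIRSTROW = 2);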

PowerShell New-AzSqlDatabaseImport - 0: The storage account cannot be accessed. Please check the storage account name and key and try again

I've written a PowerShell script that copies a local bacpac file to an Azure storage account container. After the bacpac file has been copied from the local file system to the Azure blob container, I provide the storage account key and URI to the bacpac file to the New-AzSqlDatabaseImport command.
However, it never succeeds. I always get a 0: The storage account cannot be accessed. Please check the storage account name and key and try again error message. Unfortunately, no matter what I've tried, I cannot get it to work, and that error message isn't very descriptive or helpful at all. In the details response, I see severity "16" code "0" and the message of "The storage account cannot be accessed. Please check the storage account name and key and try again."
Does the -ResourceGroupName for the New-AzSqlDatabaseImport have to contain both the SQL server and the storage account? What permissions need to be enabled in order to allow the New-AzSqlDatabaseImport command to successfully access the storage account?
I do not understand what I'm doing wrong. I use the same key for the Set-AzStorageBlobContent command which works without a problem, so I know the key is correct. The URI to the bacpac file is correct as well. So what am I doing incorrectly? Does it matter that the Resource Group Name is not the same for the storage account and SQL server instance? Is there something in the firewall for the storage account I need to change?
Any help is appreciated!
We have tested in our local environment; using the cmdlet below we can successfully import the SQL database. The statements below are based on our analysis.
New-AzSqlDatabaseImport -ResourceGroupName "<resourceGroupName>" `
    -ServerName "bacapserverdb" -DatabaseName "<DatabaseName>" `
    -StorageKeyType "StorageAccessKey" -StorageKey "<storageAccountKey>" `
    -StorageUri "<bacpacFileUri>" `
    -AdministratorLogin "<SQLServerUserName>" -AdministratorLoginPassword $SecureString `
    -Edition Standard -ServiceObjectiveName S0 -DatabaseMaxSizeBytes 1073741824
You can find the access key for a storage account in the Azure portal, under the storage account's Access keys blade.
Does the -ResourceGroupName for the New-AzSqlDatabaseImport have to contain both the SQL server and the storage account?
You need to pass only the ResourceGroupName of the SQL database, as mentioned in the documentation.
What permissions need to be enabled in order to allow the New-AzSqlDatabaseImport command to successfully access the storage account?
You don't require any special permissions explicitly, as you are passing the storage account access key as part of the New-AzSqlDatabaseImport cmdlet.
Does it matter that the Resource Group Name is not the same for the storage account and SQL server instance?
You pass only the resource group name of the SQL database to the New-AzSqlDatabaseImport command; as per my understanding, the operation can succeed even if the two resources are in different resource groups.
Is there something in the firewall for the storage account I need to change?
When we tested in our local environment, we did not apply any changes to the firewall settings on the storage account.
Limitations for importing/exporting a SQL database:
The Import/Export Service does not work when Allow access to Azure services is set to OFF.
Import does not support specifying a backup storage redundancy while creating a new database, and creates the database with the default geo-redundant backup storage redundancy. To work around this, first create an empty database with the desired backup storage redundancy using the Azure portal or PowerShell, and then import the BACPAC into this empty database.
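The workaround above uses the Azure portal or PowerShell; the same empty database can also be created in T-SQL. A minimal sketch, run against the master database of the logical server (the database name, edition, and service objective are assumptions):
-- Create an empty target database with locally redundant backup storage,
-- then run the import (e.g. New-AzSqlDatabaseImport) against this database.
CREATE DATABASE [TargetDb]
( EDITION = 'Standard', SERVICE_OBJECTIVE = 'S0' )
WITH BACKUP_STORAGE_REDUNDANCY = 'LOCAL';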

I keep receiving an error while trying to load data into an Azure database using an Azure Function

I created an Azure Function to load data from Eudonet CRM into my Azure SQL Database. I have two databases:
one named Datawarehouse
one named Datawarehouse-Dev
Both databases are identical and are on the same server.
When I load data directly into "Datawarehouse" the copy works fine, but when I change the database name to "Datawarehouse-Dev", I receive the following error:
Index #0 Message: Login failed for user 'AzureFunction'. LineNumber: 65536 Source: Core .Net SqlClient Data Provider Procedure: Error Code:18456
-- Sql server error. If error code <17: => check sql transac code (user error). Else: => software or hardware errors (check availability of database)
Login failed for user 'AzureFunction'.
If anyone has an idea of where the problem could come from, I would be very grateful. I also don't understand why there is an authentication error, since both databases are on the same server and are accessed with the same user/password.
Thanks in advance
Though #adnane already resolved the issue by putting the connection string directly into the Function's application settings instead of storing it in Key Vault, that approach might compromise application security, because using the connection string directly might expose it to unauthorized people.
Azure Key Vault is a good place to keep application credentials in a secure and centralized manner. Moving secrets to Key Vault becomes even more important as an Azure solution grows.
In case anyone is still looking for a solution that stores the connection string in Azure Key Vault and then uses it in the Function, please follow the steps below.
Firstly, open the Azure Key Vault service and from the Settings menu select Access policies. Then select + Add new access policy.
Then choose Select principal and search for the name of the Function App as shown in below example.
Once your principal is selected choose the Secret permissions menu. In this case, we’ll only need to get the secret from the Key Vault (concretely read our connection string). Therefore, check Get permission only and then select OK.
At the end, select Save to store the new functionapp-demo-mw access policy.
ADDING SECRET TO AZURE KEY VAULT
Adding a secret to Azure Key Vault is straightforward. From the Key Vault, Settings menu select Secrets and then select + Generate/Import secret.
For Key Vault secret two values are required – name and the value. In this case we’ve called our secret OrderManagementDbConnectionString and as a value we put our SQL Database connection string. Select Create to save the secret.
By default, the secret is Enabled so it’s ready to use. Once the secret is created, we’ll need to get its URI (a unique location identifying the secret). Go to the Settings menu and select Secrets. We’ll find here our recently added secret (OrderManagementDbConnectionString). Select the secret and we’ll see it’s the only version in the list.
Select the current version of the secret and copy its secret identifier. The identifier is a URI with the pattern: https://<key_vault_url>/secrets/<secret_name>/<secret_version>.
GETTING SECRET FROM KEY VAULT IN AZURE FUNCTION APP
Go back to the Azure Function App (functionapp-demo-mw) and, on the Overview tab, select Configuration in the Configured features section.
Select + New application setting. Put in a name that describes the new setting (we've used OrderManagementConnectionString). Microsoft has added an option of sourcing Key Vault secrets directly from app settings, which greatly simplifies how secrets are consumed. Set the value of the setting to a Key Vault secret reference in the following format:
@Microsoft.KeyVault(SecretUri=<secret_uri_with_version>)
Just replace <secret_uri_with_version> with the secret identifier previously copied from Azure Key Vault.
In the Azure Function, you then retrieve the value from the application settings and work with it the same way as if it were a connection string stored directly in the application settings.
// get the value from application settings
var connectionString = Environment.GetEnvironmentVariable("OrderManagementConnectionString");
// create the connection (dispose it when done)
using var connection = new SqlConnection(connectionString);

Is there a way to delete a file in Azure blob storage from trigger delete in SQL Server Azure?

I have an ASP.NET MVC application where files are uploaded to Azure Blob Storage and the generated filenames are saved in a table in a SQL Server database in Azure. I want to delete the file record directly in the database and have a delete trigger remove the corresponding file from Azure Blob Storage.
Maybe PowerShell would work, but that does not seem to be supported by Azure SQL:
Reference to database and/or server name in 'master.dbo.xp_cmdshell' is not supported in this version of SQL Server.
Any ideas? Thanks
According to this article and this one, CLR stored procedures, Query Notifications, and extended stored procedures are all unavailable in Azure SQL Database.
So I don't think we can delete the blob directly from a SQL Server trigger.
One way:
You could use a queue trigger in an Azure WebJob to delete the SQL record and the blob file.
You could add a message (including the blob file name) to the queue from your web application or by using Azure Storage Explorer.
Then, in the WebJob, you could write code to delete the blob file and the SQL record according to this queue message.
For more details about how to use a queue trigger in an Azure WebJob, refer to this article: https://learn.microsoft.com/en-us/azure/app-service-web/websites-dotnet-webjobs-sdk-storage-queues-how-to
Another way:
You could enable change tracking on the specific table, so that a scheduled WebJob can discover the latest changes (i.e. deletes) to the table and delete the corresponding files in blob storage.
For more details, refer to the query and article below:
1. Enable change tracking:
ALTER DATABASE yourdatabase
SET CHANGE_TRACKING = ON
(CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);

ALTER TABLE YourBlobTable
ENABLE CHANGE_TRACKING
WITH (TRACK_COLUMNS_UPDATED = ON);
2. Select the changed records:
SELECT *
FROM CHANGETABLE(CHANGES YourBlobTable, 0) AS ChTbl;
Notice: I suggest using the blob name as the primary key, since when you delete a record the change tracking table records only the primary key.
For more details about how to use the change tracking table, you can refer to this article.
Then you could use a TimerTrigger or a scheduled WebJob to fetch the changed records every few minutes.
Finally, in the WebJob you could delete the blob corresponding to each changed record by using the Azure Storage SDK.
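Putting the change-tracking pieces together, a sketch of the polling query the scheduled WebJob could run; it assumes FileName is the table's primary key (the blob name, as suggested above) and that the job persists the last synchronized version between runs:
DECLARE @last_sync_version bigint = 0;  -- load the value stored by the previous run

-- Rows deleted since the last synchronization; only the primary key columns are returned.
SELECT ct.FileName,
       ct.SYS_CHANGE_VERSION
FROM CHANGETABLE(CHANGES dbo.YourBlobTable, @last_sync_version) AS ct
WHERE ct.SYS_CHANGE_OPERATION = 'D';

-- Store this value for the next run.
SELECT CHANGE_TRACKING_CURRENT_VERSION();
The WebJob would then call the Azure Storage SDK to delete each blob returned by the query.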
