I have a database in Azure SQL Database (single database). It is the Northwind sample, which I created in Azure SQL Database by using scripts. I am trying to export this database into blob storage in a Gen2 storage account. I have created the storage account in the same resource group where my Azure SQL single database resides, and I export the database via the portal. In the firewall settings of my database, I have already checked "Allow Azure services and resources to access this server", as shown below:
I have also added my IP address to have access.
When I click on Export, I can select my storage account and the container to save my backups (exported database files), as shown below:
However, when I click OK and my export request is submitted, after a few minutes I can see in "Import/Export History" that my request status remains at "Running, Progress = 1%", and later the status changes to "Failed".
When I check my blob container in the storage account, I can see the files are there, all with a size of 4 B, as shown below:
What is the cause of this, and how can I resolve it? Basically, I want to export the database into blob storage by following this link, but the export keeps failing. Thank you in advance.
You can use SQL Server Management Studio (SSMS) to export the database as a .bacpac file to the blob storage account.
Follow the steps below:
1. Open SSMS on your local machine and connect to your Azure SQL server. Fill in the required details and click Connect, as shown below.
2. Right-click the database you want to back up and select Tasks -> Export Data-tier Application.
3. Select the Save to Microsoft Azure option. Click Connect to sign in to your Azure account and provide the storage account details, then click Next.
4. On the next page, review the summary of your settings and click Finish.
5. Once finished, you can check the .bacpac file in your storage account. See the image below for reference.
An additional step may be required: if authentication was cached, run the DBCC FLUSHAUTHCACHE; command in Management Studio, and the export should then complete successfully.
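A minimal sketch of that command, run while connected to the Azure SQL database you are exporting:

-- Clears cached login/authentication entries for the current database;
-- retry the export afterwards.
DBCC FLUSHAUTHCACHE;
GO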
Related
Through SSMS, connecting to my on-prem server, you can right-click a database, select Tasks, and choose "Deploy Database to Microsoft Azure SQL Database". Is there a way to log/audit who did this from my on-prem server?
After doing this, I've checked the SQL Server logs and do not see any entries for this.
Thanks!
You should be able to investigate the logs using the Power BI content pack; refer to the link below.
https://powerbi.microsoft.com/fr-be/blog/monitor-azure-audit-logs-with-power-bi/
You can also use the Azure Activity Log API to check resource changes; refer to the API link below for details: https://learn.microsoft.com/en-us/rest/api/monitor/activity-logs/list?tabs=HTTP
I want to export my current database backups from Azure to my local on-premises environment. I have created a .bacpac file from the Azure SQL Database and stored it within my Azure Blob Storage.
However, whenever I download my backup, it is always downloaded as a .ZIP file and not a .bacpac file. How can I ensure that I download a .bacpac file?
When you are prompted to save the file locally, simply change the extension from .ZIP to .bacpac.
A .bacpac file is simply a ZIP file with a .bacpac extension; it contains the metadata and data from the SQL Server database.
To export your SQL Database from Azure, navigate to the Azure portal for your SQL Database. You should be able to simply click the 'export' button at the top of the 'Overview' blade.
On the next blade, you will then need to specify
The BACPAC filename
The Azure storage account
The container for export, and
The credentials to connect to the source database
You can then see the progress under "Import/Export history" towards the bottom of the 'Overview' blade.
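If you prefer T-SQL, a hedged alternative for watching progress: the sys.dm_operation_status DMV in the logical server's master database lists recent database operations, which should include the export request. The database name 'Northwind' below is an assumption; substitute your own:

-- Run in the master database of the Azure SQL logical server.
SELECT operation, state_desc, percent_complete,
       start_time, last_modify_time, error_desc
FROM sys.dm_operation_status
WHERE major_resource_id = 'Northwind'   -- assumed database name
ORDER BY start_time DESC;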
I have created an Extended Events session for an Azure SQL database from my local SQL Server Management Studio, storing the resulting file in Azure Blob Storage.
But when I start the Extended Events session, it gives me the following error:
I have followed all the steps for setting up the Extended Events storage account:
1. Created a shared access signature for the storage container.
2. Created a credential in SQL Server using the following script:
CREATE DATABASE SCOPED CREDENTIAL [Cre_Name]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'shared access signature of blob container';  -- the SAS token value, without the leading '?'
GO
3. In the Extended Events screen, I used the credential created in step 2.
I also found that Filestream data is not supported in the current version of Azure Storage. Is this the cause of the error?
https://msdn.microsoft.com/en-in/library/dn385720.aspx
You have created the wrong kind of storage account; create it as "General" rather than "Blob". I had the same problem, and this fixed it.
"I also found that Filestream data is not supported in the current version of Azure Storage. Is this the cause of the error?"
No, it is not the cause of the error, but it may mean that you can't view the session data directly in Azure Storage. As forester123 mentioned, the error may be related to the target data itself. The following are my detailed steps:
1. Create an Azure SQL virtual machine (SQL Server 2016).
2. Create an Azure SQL database.
3. On the Azure SQL virtual machine, connect to the Azure SQL database via Microsoft SQL Server Management Studio.
4. Create a database scoped credential for the Azure SQL database.
5. Create the session: Extended Events -> Sessions -> New Session Wizard (a T-SQL sketch of this step follows the list).
6. Start the session and check the file in Azure Storage; the file status is "locked".
The session starts correctly.
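For reference, a minimal T-SQL sketch of step 5, assuming the database scoped credential already exists; the storage account, container, and session names below are placeholders. Note that for the event_file target, the credential is expected to be named after the container URL itself:

-- Session and storage names are placeholders.
-- The database scoped credential used by event_file is conventionally
-- named after this same container URL.
CREATE EVENT SESSION [xe_demo] ON DATABASE
ADD EVENT sqlserver.sql_statement_completed
ADD TARGET package0.event_file (
    SET filename = 'https://mystorage.blob.core.windows.net/xevents/xe_demo.xel'
);
GO

ALTER EVENT SESSION [xe_demo] ON DATABASE STATE = START;
GO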
I'd like to browse data via Excel, with an Analysis Services database as the data source. After I add my account under Roles -> Membership in the AS database in SQL Server, I can connect to the AS database and browse data from Excel. But when I remove my account from the Membership list, I can still connect to the AS database and browse data, which confuses me. So I wonder whether there is a cache in the AS database, or whether I should take other actions to make the configuration take effect?
If you are able to modify the SSAS roles, you are probably in the admin list of the SSAS server. Here is a screenshot from Microsoft on how to edit the server admin list:
http://msdn.microsoft.com/en-us/library/ms174561.aspx
You can test the SSAS role functionality via MS SQL Server Management Studio by browsing the cube and changing the security context: http://easyroles.com/2014/02/testing-role-functionality/. This will allow you to have the 'look and feel' of the cube as if you were a user with limited access.
You should reconnect to the cube every time you change a role definition, to avoid weird caching issues.
I am trying to create a login system on an ASP.NET website which allows a user to register and log in to the website. I require the registered user details to be stored in a database which I already have on Azure.
I have so far created the login system as shown here: http://msdn.microsoft.com/en-us/library/windowsazure/hh508981.aspx and the form works. However, when I log in to the Azure Management Portal, I cannot find the user's registered details. I also need to add more fields to the registration form in the database.
Does anyone know where I can view the registered users and how I can add more fields?
If you follow the same link you used to create your application, you will see the schema that SQL Azure uses to store this information, as below:
Now, if you want to access this database directly, what you really need is SQL Server Management Studio (SSMS) to connect with this database from your local machine. Follow the link below to download and configure it locally so you can connect to the SQL database and modify the schema, look at the records, etc. Once you have SSMS connected to the SQL database, you can manage it like any other on-premises database.
Managing Windows Azure SQL Database using SQL Server Management Studio
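Once connected, a hedged example of inspecting the registered users, assuming the tutorial's providers created their default tables (dbo.Users and dbo.Memberships; your table names may differ), and of adding an extra field:

-- Table and column names assume the default universal-providers schema
-- and may differ in your database.
SELECT u.UserName, m.Email, m.CreateDate
FROM dbo.Users AS u
JOIN dbo.Memberships AS m ON m.UserId = u.UserId;

-- Example of adding an extra field for the registration form.
ALTER TABLE dbo.Memberships ADD PhoneNumber nvarchar(20) NULL;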