Automate the Upload of a .bak file to Azure - sql-server

I'm very new to Azure and have been tasked with automating the process of taking an existing version of our database, converting it to the newer version and then uploading that to Azure.
The conversion is done; that part's easy. What I'm struggling with is getting a .bacpac file out of SSMS using PowerShell. I know I can use the Export Data-tier Application function in SSMS to do this, but I need it to be automated. From there I can use something like the following to actually upload the database:
https://blogs.msdn.microsoft.com/brunoterkaly/2013/09/26/how-to-export-an-on-premises-sql-server-database-to-windows-azure-storage/
I have looked around and cannot find a solution to this, or even figure out where to start.

You can create bacpacs of your on-premises databases and place them in a local folder (c:\MyBacpacs) using SqlPackage.
sqlpackage.exe /Action:Export /SourceServerName:. /SourceDatabaseName:"DB_Foo" /TargetFile:"c:\MyBacpacs\DB_Foo.bacpac"
You can then use AzCopy to upload the bacpacs to Azure Blob storage:
AzCopy /Source:"c:\MyBacpacs" /Dest:"https://exampleaccount.blob.core.windows.net/bacpacs" /DestKey:storageaccountkey /Pattern:*.bacpac
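If you want one script that runs both steps, a minimal PowerShell sketch along these lines should work (the server name, storage account, container, key, and tool paths are placeholders to adjust for your environment):
# Export the database to a local bacpac, then push it to Blob storage.
$sqlPackage = "C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin\SqlPackage.exe"
$azCopy = "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\AzCopy.exe"
& $sqlPackage /Action:Export /SourceServerName:. /SourceDatabaseName:"DB_Foo" /TargetFile:"c:\MyBacpacs\DB_Foo.bacpac"
& $azCopy /Source:"c:\MyBacpacs" /Dest:"https://exampleaccount.blob.core.windows.net/bacpacs" /DestKey:storageaccountkey /Pattern:"*.bacpac"
You could then schedule that script with Task Scheduler or a SQL Agent job to make the whole process hands-off.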

Related

Automating import of data-tier application (SQL database) from Azure with a Master Key

When I extracted a data-tier application from a Microsoft Azure SQL database that has a Master Key, I was unable to import it into SQL Server on my local PC.
You will find others had this issue here: SSMS 2016 Error Importing Azure SQL v12 bacpac: master keys without password not supported
However, the steps provided in the answer did not work on my installation.
The steps are:
1. Disable auditing on the server (or database)
2. Drop the database master key with DROP MASTER KEY command.
Microsoft Tech Support verified that this solution did not work on my installation of SQL Server, and even after taking remote control of my PC and troubleshooting, they were unable to determine why this was occurring.
I needed to find a way to remove the Master Key from the bacpac file. I have a PowerShell script that removes the Master Key from the BACPAC file, but it requires extracting, renaming files, and running scripts from Windows PowerShell to get the DB imported.
Does anyone have a program or set of scripts which would automate the process of removing the Master Key and importing a SQL DB from Azure with a single command?
I am new to this forum. Please do not be harsh with this post. I am trying to do the best I can to help others save the many hours I spent coming up with this.
I have cobbled together a T-SQL script which calls a Windows PowerShell script (also cobbled together from multiple sources) to extract a data-tier application (database) from a Microsoft Azure SQL database and import it into a database on my local SQL Server by running ONE command. Over the months I found some of the code in my scripts on other blogs, etc., and I am not able to give credit to those folks because I didn't keep track of where I got the info. If you are reading this and you see your code, please take credit. I apologize for not being able to give you credit for your work.
There may be configuration settings on your PC and your local SQL Server that need adjusting, as this entire solution requires pretty much full access to your computer. If you run into compatibility trouble, let me know and I will do my best to describe how my system is configured, in case that helps you.
I am using Windows 10 Pro and Microsoft SQL Server Developer (64-bit) v12.0.5207.0
I have placed the two files that do all the work on GitHub here: https://github.com/Wingloader/Auto-Azure-BACPAC-Download.git
GetNewBacpac-forGitHub.sql
GetAzureDB-forGitHub.ps1
WARNING: The PowerShell script file will store your SQL sa password and your Azure SQL login in clear text!
If you don't want to do this, don't use this solution.
My computer is owned and controlled solely by me so I am able to open up the security in my system and I am willing to assume the responsibility of safeguarding it.
The basic steps of my solution are as follows (steps 1 and 2 are optional, as I like to keep a copy of the DB I am working with as of the point in time I pull down a clean production copy of my Azure DB; a rough sketch of steps 1-3 is shown after the list):
Back up the current DB as MyLocalDB.bak.
Restore that backup from step 1 to a new DB with the previous day stamped at the end of the DB name (e.g., MyLocalDB20171231)
Delete the original MyLocalDB database (needed so we can recreate the DB with the original name later on)
Pull down the production database from Azure and create a new database with the name MyLocalDB.
The original DB is deleted in step 3 so that the restored DB can use the original name (important when you have data connections referring to that DB name)
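For reference, here is the rough sketch of steps 1-3 mentioned above, using sqlcmd from PowerShell. The server, database, folder and logical file names are placeholders (check the real logical names with RESTORE FILELISTONLY before relying on the WITH MOVE clause):
# Step 1: back up the current local DB.
$stamp = (Get-Date).AddDays(-1).ToString('yyyyMMdd')
sqlcmd -S MyLocalSQLServer -Q "BACKUP DATABASE MyLocalDB TO DISK = N'C:\Backups\MyLocalDB.bak' WITH INIT"
# Step 2: restore that backup as a date-stamped copy (e.g., MyLocalDB20171231).
sqlcmd -S MyLocalSQLServer -Q "RESTORE DATABASE MyLocalDB$stamp FROM DISK = N'C:\Backups\MyLocalDB.bak' WITH MOVE N'MyLocalDB' TO N'C:\Data\MyLocalDB$stamp.mdf', MOVE N'MyLocalDB_log' TO N'C:\Data\MyLocalDB$stamp.ldf'"
# Step 3: drop the original so the Azure copy can be re-created under the original name.
sqlcmd -S MyLocalSQLServer -Q "ALTER DATABASE MyLocalDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE; DROP DATABASE MyLocalDB"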
In Step 4, the work of extracting the data-tier application DB from Azure is initiated by this line in the T-SQL:
EXEC MASTER..xp_cmdshell '%SystemRoot%\System32\WindowsPowerShell\v1.0\powershell.exe -File "C:\Git\GetUpdatedAzureDB\GetAzureDB.ps1"'
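Note: xp_cmdshell is disabled by default on most SQL Server installations, so this is one of the configuration settings mentioned earlier that may need adjusting. If it is off on your server, something like the following (run once as a sysadmin; the server name is a placeholder) turns it on:
sqlcmd -S MyLocalSQLServer -Q "EXEC sp_configure 'show advanced options', 1; RECONFIGURE; EXEC sp_configure 'xp_cmdshell', 1; RECONFIGURE;"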
The PowerShell script does the following:
The target for the extract is a file named today.bacpac (hardcoded). The first thing it does is delete that file if it already exists.
Extract the DB from Azure into the today.bacpac file. (This delete-and-export portion is sketched below, after the import command line.)
Note: my DB on Azure has a Master Key for encryption. This will need to be removed from the files prior to importing the bacpac file into your local DB or it will fail (this may not be required in SQL 2017 according to my previous conversations with MS Support). If you do not use a Master Key, you can either strip out the code that does this step or just leave it alone. It won't remove anything if it isn't there. It would just add a little overhead to the program.
Open the today.bacpac file (zip file) and remove the MasterKey node from the Origin.xml file.
Modify the Model.xml file to update the SHA hash length. This is required so that the file does not appear to have been tampered with when SQL Server opens the bacpac file.
Re-zip the files into a new file, today-patched.bacpac.
Run this line of code (from PowerShell) to import the bacpac file into SQL Server:
& "C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin\SqlPackage.exe" /Action:Import /SourceFile:"C:\Git\GetUpdatedAzureDB\today-patched.bacpac" /TargetConnectionString:"Data Source=MyLocalSQLServer;User ID=sa; Password=MySAPassword; Initial Catalog=MyLocalDB; Integrated Security=false;"
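The delete-and-export portion described above might look roughly like this in PowerShell (the Azure server name, credentials and paths are placeholders; the script on GitHub is the authoritative version):
$bacpac = "C:\Git\GetUpdatedAzureDB\today.bacpac"
$sqlPackage = "C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin\SqlPackage.exe"
# Delete any bacpac left over from a previous run.
if (Test-Path $bacpac) { Remove-Item $bacpac }
# Export the Azure SQL database to today.bacpac.
& $sqlPackage /Action:Export /SourceServerName:"myserver.database.windows.net" /SourceDatabaseName:"MyAzureDB" /SourceUser:"MyAzureLogin" /SourcePassword:"MyAzurePassword" /TargetFile:$bacpac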
After editing the two files to provide updated paths, usernames and passwords, run the SQL script. You do not need to edit the scripts again. You can run the SQL script again without modification and it will create a new copy of your Azure DB.
Done!

Upload Images to Azure Storage Using Portal (not programmatically)

I need a SQL Server database that stores images and their name, category, etc., so the SQL table will have 5 or so columns. I'm using Azure as my SQL Server host. I cannot insert image data into my VARBINARY(MAX) column from SQL Server Management Studio, which was my first plan, because I cannot seem to give my user permission to use BULK LOAD; Azure SQL seems to make this impossible. So I think I need to use Azure Storage and then, in the SQL Server database, just store a link to each image.
To be clear, I want the images in the database already, I do not want to add them from within the application I am developing. The application I'm developing will only download the images to the device, not upload them.
So how do I upload the images to Azure Storage using the portal, not using code?
Short Answer
You cannot. Neither the old portal nor the new one has a way to upload an image to a storage container.
Alternative
Use the AzCopy Command-Line Utility by Microsoft. It allows you to do what you want with just two command lines. There is a terrific tutorial here.
First, download and install the utility. Second, open a command prompt and navigate to the AzCopy install directory. Third, upload a file to your storage account. Here are the second and third steps.
> cd C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy
> AzCopy /Source:folder /Dest:account /DestKey:key /Pattern:file
And here are what the parameters mean.
Source: The folder on your computer that contains the images to upload.
Dest: The address of the storage container at which to store the images.
DestKey: The primary access key for your storage account.
Pattern: The name of the file to upload (or a pattern).
Example
This uploads an image named my-cat.png from the C:\temp folder on my computer to a storage container called mvp1. If you wanted to upload all the PNG images in that folder, you could replace my-cat.png with *.png and it would upload them all.
AzCopy /Source:C:\temp /Dest:https://my.blob.core.windows.net/mvp1 /DestKey:tLlbC59ggDdJ+Dg== /Pattern:my-cat.png
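That wildcard version would look like this (same placeholder account and key as above):
AzCopy /Source:C:\temp /Dest:https://my.blob.core.windows.net/mvp1 /DestKey:tLlbC59ggDdJ+Dg== /Pattern:*.png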
You might also want to take a look at the answers to this question: How do I upload some file into Azure blob storage without writing my own program?

Export MSSQL DB, Import in shared environment

We are in the process of trying to migrate from a VPS to a shared environment. The VPS is running Studio Express 2005 and is therefore quite limited in terms of export functionality.
I have managed to export a database in .bak format and upload (Restore) it to the shared environment.
However, here comes the problem: the schema has come along with the database, causing problems when connecting via ASP.
The table name structure is as follows [SCHEMA].[TABLE_NAME].
The shared environment does not allow changing the schema or many other advanced features. (It's running myLittleAdmin.)
So I guess the schema changes would have to be done on the database, which would then be exported and imported.
PS: I'm new to MSSQL and more experienced with MySQL.
OK, so I have found a solution to this.
Export the schema from Studio Express using Right Click > Tools > Generate Script.
Open the generated script file and find-and-replace the old user with your new one.
Execute this script on the server.
Use a tool such as this one http://sqldumper.ruizata.com/ (SQL Dumper) to export the DB to .SQL.
Do the same find-and-replace on this file, changing the old user to the new one.
Copy this SQL and execute it on the server.
Job done!
Joe
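As a side note on the idea above of making the schema change on the source database before exporting: one way to move every table from the old schema into dbo is a small dynamic T-SQL batch, here written to a file from PowerShell and run with sqlcmd. This is only a sketch of that approach (the server, database and OLDSCHEMA names are placeholders), not part of the answer above:
$tsql = @"
DECLARE @sql nvarchar(max) = N'';
SELECT @sql += N'ALTER SCHEMA dbo TRANSFER ' + QUOTENAME(s.name) + N'.' + QUOTENAME(t.name) + N'; '
FROM sys.tables t
JOIN sys.schemas s ON t.schema_id = s.schema_id
WHERE s.name = N'OLDSCHEMA';
EXEC sp_executesql @sql;
"@
$tsql | Set-Content fix-schema.sql
sqlcmd -S MyVpsServer -d MyDatabase -i fix-schema.sql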

How do you export settings from the Database Publishing Wizard?

I'm using the Database Publishing Wizard in VS2008 to push changes to my hosting provider. It doesn't look like those settings go with the project (which seems a little silly to me), but rather they go with the machine.
On each new machine that I'd like to publish from I need to go through the process of digging up my database settings.
Is there a way to backup/export/save these settings to a file?
If so, what is it?
Hmmm ... it appears that the Publishing Wizard uses two config files for its application configuration:
The Publish Database Wizard uses the files user.config and hoster.config to store configuration information. The directory in which these files are stored must have the appropriate NTFS file system permissions set. These files contain user names and encrypted passwords. The passwords are encrypted by using DPAPI.
user.config File
This file stores persisted host and configuration settings for the Publish Database Wizard. The user.config file is located at %SystemDrive%\Documents and Settings\%Username%\Application Data\Microsoft\Microsoft SQL Server\90\Tools\Publishing Wizard\user.config.
hoster.config File
This file stores options about Web service addresses, user names, and databases for shared hosting providers. The hoster.config file is located at %SystemDrive%\Documents and Settings\%Username%\Application Data\Microsoft\Microsoft SQL Server\90\Tools\Publishing Wizard\hoster.config.
I'm trying to access these files in Vista (with VS2008 and SQL Express installed) and striking out. I'll keep you posted.
Update: It looks like I didn't have the Publishing Wizard installed after all. After grabbing the version for VS2008 here and installing it (don't worry, you won't see any indication that it's getting installed, but it does), and then setting up my first database using the wizard, it looks like there are indeed some settings stored under the 'Application Data' directory listed above. However, it looks like it's just a user.config, and then an XML file for each database configured. I believe you can back up each of these files for later use.
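If the goal is just to carry those settings to another machine, a minimal PowerShell sketch like the following should do it (the destination path is a placeholder; $env:APPDATA resolves to the 'Application Data' directory listed above):
# Copy the Publishing Wizard settings (user.config, hoster.config and any per-database XML files) to a backup folder.
$src = Join-Path $env:APPDATA "Microsoft\Microsoft SQL Server\90\Tools\Publishing Wizard"
$dest = "D:\Backups\PublishingWizard"
New-Item -ItemType Directory -Force -Path $dest | Out-Null
Copy-Item -Path (Join-Path $src "*") -Destination $dest -Recurse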

Batch file to "Script" a Database

Is it possible to somehow use a .bat file to script the schema and/or content of a SQL Server database?
I can do this via the wizard, but would like to streamline the creation of this file for source control purposes.
I would like to avoid the use of 3rd party tools, just limiting myself to the tools that come with SQL Server.
There is a free tool called SubCommander that is a part of the open source SubSonic software. I have successfully used this tool myself to create both schema and data "dumps" each night.
You can script out your schema and data (and then version it in your favorite source control system) using SubCommander. Simply use the command "version" and tell SubCommander where to put the data:
sonic.exe version /out Scripts
This will output a script file (.sql) to the local scripts directory of your project.
You can also try using the Microsoft SQL Server Database Publishing Wizard, although I am not sure whether you can use it from a .bat file.
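For what it's worth, the Database Publishing Wizard does ship with a command-line front end (sqlpubwiz.exe) that can be called from a .bat file; if memory serves, the basic form is something like the following, with the database name and output path purely illustrative:
sqlpubwiz script -d MyDatabase C:\Scripts\MyDatabase.sql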
