Is it possible to convert an SDF Umbraco database to Azure SQL?

I have an Umbraco project, hosted on Azure, that uses a local SQL Server Compact database (.sdf). I would like to copy the data from that local database to a new database hosted on Azure SQL and then update the connection string to use it. Is this possible, or will I need to redo the website from scratch?
I've tried searching for the answer but can't seem to find one.
I have downloaded the website using WebMatrix, but the database doesn't come down with the rest of the project. :/

Yes. If you use this exporter, you can generate a script of the database:
http://exportsqlce.codeplex.com/releases/view/116839
ExportSqlCe40.exe "Data Source=Umbraco.sdf;" export.sql sqlazure
Initializing....
Generating the tables....
Generating the data....
Generating the indexes....
Sent script to output file(s) : export.sql in 760 ms
You must then add an extra line to create an index on the logins table:
CREATE CLUSTERED INDEX umbracoUserLogins_Index ON umbracoUserLogins (contextID);
The issues I had were with the order of the records generated for the umbracoNode table (you must ensure the item with key -1 is inserted first) and that the script did not mark identity columns, which I had to fix manually.
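To illustrate those manual fixes, here is a sketch of what the start of the patched script might look like. The umbracoNode column list and the root-node values shown are from a typical Umbraco schema and are only illustrative; take the actual rows from your export.sql:

```sql
-- Let the script supply explicit values for the identity column
SET IDENTITY_INSERT umbracoNode ON;

-- The root node (id -1) must go in first, since every other row
-- references it directly or indirectly through parentID
INSERT INTO umbracoNode (id, trashed, parentID, nodeUser, [level], path,
                         sortOrder, uniqueID, [text], nodeObjectType, createDate)
VALUES (-1, 0, -1, 0, 0, '-1', 0,
        '916724A5-173D-4619-B97E-B9DE133DD6F5',
        'SYSTEM DATA: umbraco master root',
        'EA7D8624-4CFE-4578-A871-24AA946BF34D', GETDATE());

-- ...the remaining umbracoNode INSERTs from export.sql go here,
-- in parent-before-child order...

SET IDENTITY_INSERT umbracoNode OFF;
```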
Once you have run this script on an Azure SQL server instance, change your connection string, making sure you update the provider name to System.Data.SqlClient:
<add name="umbracoDbDSN" providerName="System.Data.SqlClient" connectionString="[azureconnectionstring]" />

In the end I created a new Umbraco project using WebMatrix, created a new template site with a SQL database, and executed the SQL script from the old .sdf database against that database. I then migrated it to Azure with a few modifications regarding clustered keys, and it worked.

Related

how to mirror a whole database cluster in postgresql

I'm using a postgresql (9.6) database in my project which is currently in development stage.
For production I want to use an exact copy/mirror of the database-cluster with a slightly different name.
I am aware of the fact that I can make a backup and restore it under a different cluster-name, but is there something like a mirror function via the psql client or pgAdmin (v.4) that mirrors all my schemas and tables and puts it in a new clustername?
In PostgreSQL you can use any existing database on the server as a template when you want to create a new database with the same content (the template database needs to be idle for this to work). You can use the following SQL statement:
CREATE DATABASE newdb WITH TEMPLATE someDbName OWNER dbuser;
But you need to make sure no user is currently connected to or using that database; otherwise you will get the following error:
ERROR: source database "someDbName" is being accessed by other users
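If you cannot easily take the source database offline, one workaround (assuming you have sufficient privileges; pg_terminate_backend and pg_stat_activity behave as shown on 9.6) is to block and terminate its sessions first:

```sql
-- Stop new connections to the template database
ALTER DATABASE someDbName WITH ALLOW_CONNECTIONS false;

-- Kick out any sessions still connected to it
SELECT pg_terminate_backend(pid)
FROM pg_stat_activity
WHERE datname = 'someDbName'
  AND pid <> pg_backend_pid();

-- Now the copy should succeed
CREATE DATABASE newdb WITH TEMPLATE someDbName OWNER dbuser;

-- Re-enable connections on the source afterwards
ALTER DATABASE someDbName WITH ALLOW_CONNECTIONS true;
```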
Hope that helped ;)

Automating import of data-tier application (SQL database) from Azure with a Master Key

When I extracted a data-tier application from a Microsoft Azure SQL database that has a master key, I was unable to import it into SQL Server on my local PC.
You will find others had this issue here: SSMS 2016 Error Importing Azure SQL v12 bacpac: master keys without password not supported
However the steps provided as the answer did not work on my installation.
The steps are:
1. Disable auditing on the server (or database).
2. Drop the database master key with the DROP MASTER KEY command.
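For step 2, run this in the context of the Azure database itself (not master). The DROP MASTER KEY syntax is standard; the key name checked here is the fixed name SQL Server uses internally for the database master key:

```sql
-- Drop the database master key only if one exists
IF EXISTS (SELECT * FROM sys.symmetric_keys
           WHERE name = '##MS_DatabaseMasterKey##')
    DROP MASTER KEY;
```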
Microsoft Tech Support verified that this solution did not work on my installation of SQL Server and, after taking remote control of my PC and troubleshooting, they were unable to determine why this was occurring.
I needed to find a way to remove the master key from the bacpac file. I have a PowerShell script that removes it, but it requires extracting files, renaming them, and running scripts from Windows PowerShell to get the DB imported.
Does anyone have a program or set of scripts which would automate the process of removing the Master Key and importing a SQL DB from Azure with a single command?
I am new to this forum. Please do not be harsh with this post. I am trying to do the best I can to help others to save the many hours I spent coming up with this.
I have cobbled together a T-SQL script which calls a Windows PowerShell script (also cobbled together from multiple sources) to extract a data-tier application (database) from a Microsoft Azure SQL database and import it into a database on my local SQL Server by running ONE command. Over the months I found some of the code in my scripts on other blogs, etc. I am not able to give credit to those folks because I didn't keep track of where I got the info. If you are reading this and you see your code, please take credit. I apologize for not being able to credit you for your work.
There may be configuration settings on your PC and your local SQL server that need adjustments as this entire solution requires pretty much full access to your computer. If you run into trouble with compatibility, let me know and I will do the best I can to let you know how my system is configured in case it will help you.
I am using Windows 10 Pro and Microsoft SQL Server Developer (64-bit) v12.0.5207.0
I have placed the two files that do all the work on GitHub here: https://github.com/Wingloader/Auto-Azure-BACPAC-Download.git
GetNewBacpac-forGitHub.sql
GetAzureDB-forGitHub.ps1
WARNING: The Powershell script file will store your SQL sa password and your Azure SQL login in clear text!
If you don't want to do this, don't use this solution.
My computer is owned and controlled solely by me so I am able to open up the security in my system and I am willing to assume the responsibility of safeguarding it.
The basic steps of my solution are as follows (steps 1 and 2 are optional, as I like to keep a version of the DB I am working with as of the point in time I pull down a clean production copy of my Azure DB):
1. Back up the current DB as MyLocalDB.bak.
2. Restore that backup from step 1 to a new DB with the previous day's date stamped at the end of the DB name (e.g., MyLocalDB20171231).
3. Delete the original MyLocalDB database (needed so we can recreate the DB with the original name later on).
4. Pull down the production database from Azure and create a new database with the name MyLocalDB.
The original DB is deleted in step 3 so that the restored DB can use the original name (important when you have data connections referring to that DB name).
In Step 4, the work of extracting the data-tier application DB from Azure is initiated by this line in the T-SQL:
EXEC master..xp_cmdshell '%SystemRoot%\System32\WindowsPowerShell\v1.0\powershell.exe -File "C:\Git\GetUpdatedAzureDB\GetAzureDB.ps1"'
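Note that xp_cmdshell is disabled by default on a local SQL Server, so the EXEC above may fail until you enable it once (this allows arbitrary command execution from T-SQL, so only do it on a machine you fully control):

```sql
-- Enable xp_cmdshell (off by default for security reasons)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;
```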
The Powershell script does the following:
The target for the extract is a file named today.bacpac (hardcoded). The first thing to do is delete that file if it already exists.
Extract the DB from Azure into the today.bacpac file.
Note: my DB on Azure has a Master Key for encryption. This will need to be removed from the files prior to importing the bacpac file into your local DB or it will fail (this may not be required in SQL 2017 according to my previous conversations with MS Support). If you do not use a Master Key, you can either strip out the code that does this step or just leave it alone. It won't remove anything if it isn't there. It would just add a little overhead to the program.
Open the today.bacpac file (zip file) and remove the MasterKey node from the Origin.xml file.
Modify the Model.xml file to update the SHA hash length. This is required so the file does not appear to have been tampered with when SQL Server opens the bacpac file.
Re-zip the files back into a new file, today-patched.bacpac.
Run this line of code (from PowerShell) to import the bacpac file into SQL Server:
& "C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin\SqlPackage.exe" /Action:Import /SourceFile:"C:\Git\GetUpdatedAzureDB\today-patched.bacpac" /TargetConnectionString:"Data Source=MyLocalSQLServer;User ID=sa; Password=MySAPassword; Initial Catalog=MyLocalDB; Integrated Security=false;"
After editing the two files to provide updated paths, usernames and passwords, run the SQL script. You do not need to edit the scripts again. You can run the SQL script again without modification and it will create a new copy of your Azure DB.
Done!

SQL Server - copy table data from one server to a table in an Azure SQL Database

I'm migrating a database from Azure VM to Azure SQL Database. I tried to use the "Deploy Database to Azure SQL Database" function in SSMS but it failed several times, seemingly due to the size of the database (110 GB). So I made a copy of the source database on the same source server, truncated the table with the majority of the data in it, then tried the deploy again. Success.
Now I need to get that data from the original source table into the destination table. I've tried two different approaches, and both gave errors.
In SSMS, I connected to both SQL Servers. I ran the below while attached to the destination database:
INSERT INTO dbo.DestinationTable
SELECT *
FROM [SourceServer].[SourceDatabase].dbo.SourceTable
With that I was given the error:
Reference to database and/or server name in
'SourceServer.SourceDatabase.dbo.SourceTable' is not supported in this
version of SQL Server.
In SSMS, I used the Export Data Wizard from the source table. When trying to start that job, I received this error during the validation phase:
Error 0xc0202049: Data Flow Task 1: Failure inserting into the
read-only column "CaptureId"
How can I accomplish what should be this seemingly simple task?
Instead of using Deploy Database to Azure directly for large databases, you could try the steps below:
1. Extract a bacpac on your local machine.
2. Copy the bacpac to blob storage.
3. When creating the database, use the bacpac in blob storage as the source.
This approach worked for us and is very fast. You may also have to ensure that the blob is in the same region as your Azure SQL server.
I've solved the issue. Using option #2, you simply need to tick the checkbox for "Enable Identity Insertion" and it works with no errors. This checkbox is inside the Edit Mappings sub-menu of the Export Data Wizard.

ASP.NET Membership mdf to SQL Server

I had membership working fine locally, but now I need to deploy, so I created the membership tables and stored procedures using Aspnet_regsql.exe and changed my web.config to point to my local SQLEXPRESS database instance.
My membership is still working, but not using SQL Server. I went and deleted the .mdf file that my connection string was referring to earlier from the App_Data folder. My forms authentication still works, allowing me to register new users, but not using SQL Server!
What is it using?
<remove name="DefaultConnection"/>
<add name="DefaultConnection"
connectionString="server=PC\SQLEXPRESS;Trusted_Connection=true;database=adatabase;"/>
My connection string looks like above. I have one more connection string below this whose value is exactly like this for now, just FYI. All the membership sections like DefaultMembershipProvider use DefaultConnection connection string.
I was looking at a table called dbo.aspnet_Users all this time. There is another table further down called dbo.Users, and that is where users are getting created. So it IS using SQL Server! The Aspnet_regsql.exe tool installed the aspnet_* tables, and the actual table where users are created was created afterwards. I got that from the tables' timestamps; they are over 20 minutes apart.
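A quick way to confirm this is to compare creation times in the catalog views. A query along these lines, run against the membership database, shows the aspnet_* tables and the application tables being created at different times:

```sql
-- List user tables with their creation times, oldest first
SELECT name, create_date
FROM sys.tables
ORDER BY create_date;
```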

Sql Server keep old references to deleted databases

I have an ASP.NET MVC solution with a DB named LearningDB.
Here is my connectionstring in my web.config
<add name="EntityFrameworkDbContext"
connectionString="Data Source=.\SQL2008;Integrated Security=SSPI;Initial Catalog=LearningDB;MultipleActiveResultSets=True;User Instance=true"
providerName="System.Data.SqlClient"/>
My Sql Server is named SQL2008 and is a full SQL SERVER (not the express one).
Strange thing: when I run my solution, it successfully created the DB and allows me to add data to this DB. I didn't use the AttachDBFilename attribute (to specify where to store the DB) so the DB is located under C:\Users\Thierry\AppData\Local\Microsoft\Microsoft SQL Server Data\SQL2008
I have already queried sys.databases to list my databases, but I only see four (master, tempdb, model, msdb). I would like to delete these corrupted references to deleted databases, but I don't know where they are!
Any idea?
Thanks.
EDIT
As Mikael suggested, I am editing my question to be clear: I didn't want to delete the master tables, etc.
Let's take an example: I have a DB named LearningDB.mdf in my ASP.NET MVC solution. I deleted this DB physically (with Windows Explorer; I know this is not the best thing to do). Next, when I restart and run my solution, the system tries to recreate the DB and warns me that the name LearningDB already exists! How can this be? I already checked in SQL Server with Management Studio but didn't find anything!
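A plausible explanation, given User Instance=true in the connection string: a user instance keeps its own copies of the system databases (under the AppData path mentioned above), so the deleted LearningDB may still be registered there rather than in the parent SQL2008 instance. The DMV below is the documented way to locate a running user instance; the pipe name it returns will be specific to your machine:

```sql
-- On the parent instance: find the user instance's named pipe
SELECT owning_principal_name, heart_beat, instance_pipe_name
FROM sys.dm_os_child_instances;

-- Connect to that pipe name with SSMS or sqlcmd, then drop the stale database:
DROP DATABASE LearningDB;
```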
