We are testing the migration from a local SQL Server 2008R2 database to Azure, but have hit a bump in the road.
Process followed, based on SO articles:
Installed SQL Server 2012 Client tools
Amended DB to remove indexes with a fill factor specified, as well as invalid views and procedures (this was determined by using the Export Data-tier Application tool for SSMS, until it successfully created bacpac file)
uploaded the successfully created bacpac file to Azure
Went through steps to create new database using import method
The "bacpac file is retrieved from blob storage" status is shown, but then the following error occurs:
BadRequest; Request Error; Error Status Code: 'BadRequest'
Details: Error encountered during the service operation.
Exception Microsoft.SqlServer.Management.Dac.Services.ServiceException: Unable to authenticate request
Note: the error text above was trimmed to exclude URLs, as I don't have sufficient points.
I can't seem to find any info on this error or where there may be any additional log details to help determine why it will not import.
As the error mentions unable to authenticate, we also tried doing the following:
Created a new user and password on the local DB
Used this same new user and password for the definition of the new DB on Azure
This did not make any difference.
Would appreciate if someone could point us in the right direction to get this working, as we would need to replicate this process quite a few times.
Thanks.
We needed the same thing. Here are the steps we took and the results:
1) Exporting using the SQL Database Migration tool created by ghuey
You can download it here: https://sqlazuremw.codeplex.com/
It's a great tool and I really recommend you try this first. Depending on the complexity of your database, it may work just fine.
For us, unfortunately, it didn't work, so we moved to the next step.
2) DAC Package
SQL Server 2008 has the option to generate a DACPAC, which creates the structure of the database on Azure; you can then deploy to Azure by referencing a connection in 2008 Management Studio: right-click the Azure server, Deploy... See more details here: http://world.episerver.com/documentation/Items/Upgrading/EPiserver-Commerce/8/Migrating-Commerce-databases-to-Azure/
Well, if this works for you, TRY THIS. It's easier.
For us, unfortunately, it didn't work, so we moved to the next step.
3) Using a 2012 server to export a bacpac and then import it into Azure
This step requires multiple actions to complete. Here they are:
a. Generate a backup on 2008 and move the file to the 2012 server;
b. Restore the backup on 2012;
c. Run some SQL to:
c1. Set all owners of SCHEMAs to DBO. You can use SQL like this to move a schema: ALTER AUTHORIZATION ON SCHEMA::[db_datareader] TO [dbo]
c2. Remove all users that were created by you;
c3. Remove all MS_Description extended properties from all columns and tables;
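For step c3, here is a sketch of how the MS_Description properties could be dropped in bulk (untested against your schema; review the PRINTed output before executing it):

```sql
-- Generate sp_dropextendedproperty calls for every MS_Description
-- on user tables (minor_id = 0) and their columns (minor_id > 0).
DECLARE @sql nvarchar(max) = N'';

SELECT @sql = @sql +
    CASE WHEN ep.minor_id = 0 THEN
        'EXEC sp_dropextendedproperty ''MS_Description'', ''SCHEMA'', '''
        + OBJECT_SCHEMA_NAME(ep.major_id) + ''', ''TABLE'', '''
        + OBJECT_NAME(ep.major_id) + ''';' + CHAR(13)
    ELSE
        'EXEC sp_dropextendedproperty ''MS_Description'', ''SCHEMA'', '''
        + OBJECT_SCHEMA_NAME(ep.major_id) + ''', ''TABLE'', '''
        + OBJECT_NAME(ep.major_id) + ''', ''COLUMN'', '''
        + COL_NAME(ep.major_id, ep.minor_id) + ''';' + CHAR(13)
    END
FROM sys.extended_properties ep
WHERE ep.class = 1              -- object/column properties only
  AND ep.name = 'MS_Description';

PRINT @sql;                     -- inspect, then run: EXEC sp_executesql @sql;
```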
c4. Drop all constraints (tip: generate a complete script of the database with the drop-and-create option enabled and copy the "drop constraint" part);
c5. Remove the fill factor options from the indexes of your database. You can do that by re-creating the indexes (including PKs that have a clustered index associated). Dropping every clustered PK is not that easy, but with a little help from Google you will be able to find a script to create and drop them. Here is the script:
DECLARE @object_id int;
DECLARE @parent_object_id int;
DECLARE @TSQL NVARCHAR(4000);
DECLARE @COLUMN_NAME SYSNAME;
DECLARE @is_descending_key bit;
DECLARE @col1 BIT;
DECLARE @action CHAR(6);

SET @action = 'DROP';
--SET @action = 'CREATE';

DECLARE PKcursor CURSOR FOR
    select kc.object_id, kc.parent_object_id
    from sys.key_constraints kc
    inner join sys.objects o
        on kc.parent_object_id = o.object_id
    where kc.type = 'PK' and o.type = 'U'
        and o.name not in ('dtproperties', 'sysdiagrams') -- not true user tables
    order by QUOTENAME(OBJECT_SCHEMA_NAME(kc.parent_object_id))
            ,QUOTENAME(OBJECT_NAME(kc.parent_object_id));

OPEN PKcursor;
FETCH NEXT FROM PKcursor INTO @object_id, @parent_object_id;
WHILE @@FETCH_STATUS = 0
BEGIN
    IF @action = 'DROP'
        SET @TSQL = 'ALTER TABLE '
                  + QUOTENAME(OBJECT_SCHEMA_NAME(@parent_object_id))
                  + '.' + QUOTENAME(OBJECT_NAME(@parent_object_id))
                  + ' DROP CONSTRAINT ' + QUOTENAME(OBJECT_NAME(@object_id))
    ELSE
    BEGIN
        SET @TSQL = 'ALTER TABLE '
                  + QUOTENAME(OBJECT_SCHEMA_NAME(@parent_object_id))
                  + '.' + QUOTENAME(OBJECT_NAME(@parent_object_id))
                  + ' ADD CONSTRAINT ' + QUOTENAME(OBJECT_NAME(@object_id))
                  + ' PRIMARY KEY'
                  + CASE INDEXPROPERTY(@parent_object_id
                             ,OBJECT_NAME(@object_id), 'IsClustered')
                        WHEN 1 THEN ' CLUSTERED'
                        ELSE ' NONCLUSTERED'
                    END
                  + ' (';
        DECLARE ColumnCursor CURSOR FOR
            select COL_NAME(@parent_object_id, ic.column_id), ic.is_descending_key
            from sys.indexes i
            inner join sys.index_columns ic
                on i.object_id = ic.object_id and i.index_id = ic.index_id
            where i.object_id = @parent_object_id
                and i.name = OBJECT_NAME(@object_id)
            order by ic.key_ordinal;
        OPEN ColumnCursor;
        SET @col1 = 1;
        FETCH NEXT FROM ColumnCursor INTO @COLUMN_NAME, @is_descending_key;
        WHILE @@FETCH_STATUS = 0
        BEGIN
            IF (@col1 = 1)
                SET @col1 = 0
            ELSE
                SET @TSQL = @TSQL + ',';
            SET @TSQL = @TSQL + QUOTENAME(@COLUMN_NAME)
                      + ' '
                      + CASE @is_descending_key
                            WHEN 0 THEN 'ASC'
                            ELSE 'DESC'
                        END;
            FETCH NEXT FROM ColumnCursor INTO @COLUMN_NAME, @is_descending_key;
        END;
        CLOSE ColumnCursor;
        DEALLOCATE ColumnCursor;
        SET @TSQL = @TSQL + ');';
    END;
    PRINT @TSQL;
    FETCH NEXT FROM PKcursor INTO @object_id, @parent_object_id;
END;
CLOSE PKcursor;
DEALLOCATE PKcursor;
c6. Re-create the FKs;
c7. Remove all indexes;
c8. Re-create all indexes (without the fill factor options);
d. Now, right-click the database on 2012 and export the data-tier application to Azure Storage in BACPAC format. After it finishes, import it on Azure.
It should work :-)
For anyone who may stumble across this, we have been able to locate the issue by using the bacpac file to create a new database on the local 2008R2 server, through the 2012 Client tools.
The error relates to a delete trigger being fired; I don't understand why it is executed, but that's another question.
Hopefully this may help others with import errors on SQL Azure.
I'm preparing to test an application in development. The application uses SQL Server 2019 for backend databases. It allows users to maintain multiple databases (for compliance and regulatory reasons).
QA testing scenarios require databases to be restored frequently to a known state before a staff member performs test cases in sequence. They then note the results of the test scenario.
There are approximately a dozen test scenarios to work on for this release, and an average of 6 databases to be used for most scenarios. For every scenario, this setup takes about 10 minutes and involves over 20 clicks.
Since scenarios will be tested before and after code changes, this means a time commitment of about 8 hours on setup alone. I suspect this can be reduced to about 1 minute since most of the time is spent navigating menus and the file system while restorations only take a few seconds each.
So I'd like to automate restorations. How can I automate the following sequence of operations inside of SSMS?
Drop all user created databases on the test instance of SQL Server
Create new or overwritten databases populated from ~6 .BAK files. I currently perform this one-by-one using "Restore Database", then adding a new file device, and finally launching the restorations.
EDIT: I usually work with SQL, C#, Batchfiles, or Python. But this task allows flexibility as long as it saves time and the restoration process is reliable. I would imagine either SSMS or a T-SQL query are the natural first places for me to begin.
We are currently using full backups and these seem to remain connected to their parent SQL Server instance and database. This caused me to encounter an SSMS bug when attempting to overwrite an existing database with a backup from another database on the same instance -- the restore fails to overwrite the target database, and the database that created the backup becomes stuck "restoring" until SSMS is closed or I manually restore it with the correct backup.
So as a minor addendum, what backup settings are appropriate for creating these independent copies of databases that have been backed up from other SQL Server instances?
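For reference, the drop-and-restore sequence described above can also be scripted directly in T-SQL; this is a minimal sketch for one database, assuming hypothetical names and paths (use RESTORE FILELISTONLY to get the real logical file names). WITH REPLACE also addresses the addendum, since it lets a backup taken from one database overwrite another:

```sql
-- Sketch: restore one database from a .BAK over an existing copy.
-- [TestDb] and the file paths are placeholders for your environment.
USE master;

ALTER DATABASE [TestDb] SET SINGLE_USER WITH ROLLBACK IMMEDIATE; -- drop open connections

RESTORE DATABASE [TestDb]
FROM DISK = N'C:\Backups\TestDb.bak'
WITH REPLACE,   -- overwrite even though the backup came from a different database
     RECOVERY,
     MOVE N'TestDb_Data' TO N'C:\Data\TestDb.mdf',   -- logical names via RESTORE FILELISTONLY
     MOVE N'TestDb_Log'  TO N'C:\Data\TestDb_log.ldf';

ALTER DATABASE [TestDb] SET MULTI_USER;
```

Repeating this per database (from a small cursor, or a batch file calling sqlcmd) would cover all six restores in one run.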
I would suggest you utilize Database Snapshots instead. This allows you to take a snapshot of the database, and then revert back to it after changes are made. The disk space taken up by the snapshot is purely the difference in changes to pages, not the whole database.
Here is a script to create database snapshots for all user databases (you cannot do this for system DBs).
DECLARE @sql nvarchar(max);

SELECT @sql =
    STRING_AGG(CAST(CONCAT(
        'CREATE DATABASE ',
        QUOTENAME(d.name + '_snap'),
        ' ON ',
        f.files,
        ' AS SNAPSHOT OF ',
        QUOTENAME(d.name),
        ';'
    )
    AS nvarchar(max)), '
')
FROM sys.databases d
CROSS APPLY (
    SELECT
        files = STRING_AGG(CONCAT(
            '(NAME = ',
            QUOTENAME(f.name),
            ', FILENAME = ''',
            REPLACE(f.physical_name + 'snap', '''', ''''''),
            ''')'
        ), ',
')
    -- sys.master_files, filtered per database; sys.database_files would
    -- return only the current database's files for every row of d
    FROM sys.master_files f
    WHERE f.database_id = d.database_id
      AND f.type_desc = 'ROWS'
) f
WHERE d.database_id > 4; -- skip system DBs

PRINT @sql;
EXEC sp_executesql @sql;
And here is a script to revert to the snapshots
DECLARE @sql nvarchar(max);

SELECT @sql =
    STRING_AGG(CAST(CONCAT(
        'RESTORE DATABASE ',
        QUOTENAME(dSource.name),
        ' FROM DATABASE_SNAPSHOT = ',
        QUOTENAME(dSnap.name),
        ';'
    )
    AS nvarchar(max)), '
')
FROM sys.databases dSnap
JOIN sys.databases dSource ON dSource.database_id = dSnap.source_database_id;

PRINT @sql;
EXEC sp_executesql @sql;
And to drop the snapshots:
DECLARE @sql nvarchar(max);

SELECT @sql =
    STRING_AGG(CAST(CONCAT(
        'DROP DATABASE ',
        QUOTENAME(d.name),
        ';'
    )
    AS nvarchar(max)), '
')
FROM sys.databases d
WHERE d.source_database_id > 0; -- snapshots only

PRINT @sql;
EXEC sp_executesql @sql;
I have a SQL Server 2005 DB project and am looking to deploy the Schema over an existing DB that is on a later version of SQL Server. The issue I have is that Change Tracking is enabled on the DB I wish to deploy to and so the first thing SSDT wants to do is disable CT. This poses a problem as I get the error below:
(43,1): SQL72014: .Net SqlClient Data Provider: Msg 22115, Level 16,
State 1, Line 5 Change tracking is enabled for one or more tables in
database 'Test'. Disable change tracking on each table before
disabling it for the database. Use the sys.change_tracking_tables
catalog view to obtain a list of tables for which change tracking is
enabled. (39,0): SQL72045: Script execution error. The executed
script:
IF EXISTS (SELECT 1
FROM [master].[dbo].[sysdatabases]
WHERE [name] = N'$(DatabaseName)')
BEGIN
ALTER DATABASE [$(DatabaseName)]
SET CHANGE_TRACKING = OFF
WITH ROLLBACK IMMEDIATE;
END
In an effort to get around this I have created a PreDeployment script that executes the below:
/* Run pre-deployment scripts to resolve issues */
IF (SELECT SUBSTRING(@@VERSION, 29, 4)) = '11.0'
BEGIN
    PRINT 'Enabling Change Tracking';
    DECLARE @dbname VARCHAR(250)
    SELECT @dbname = DB_NAME()
    EXEC('
    IF NOT EXISTS(SELECT * FROM sys.change_tracking_databases WHERE database_id = DB_ID(''' + @dbname + '''))
        ALTER DATABASE [' + @dbname + ']
        SET CHANGE_TRACKING = ON
        (CHANGE_RETENTION = 5 DAYS, AUTO_CLEANUP = ON);
    ');
    EXEC('
    IF NOT EXISTS(SELECT * FROM sys.change_tracking_tables ctt
        INNER JOIN sys.tables t ON t.object_id = ctt.object_id
        INNER JOIN sys.schemas s ON s.schema_id = t.schema_id
        WHERE t.name = ''TableName'')
    BEGIN
        ALTER TABLE [dbo].[TableName] ENABLE CHANGE_TRACKING;
    END;');
END
So, based on the DB version, Change Tracking is enabled on the DB and relevant tables, assuming it is not already enabled. I got this idea from a previous post: # ifdef type conditional compilation in T-SQL sql server 2008 2005
Unfortunately this is still not working as SSDT is trying to disable Change Tracking before the PreDeployment script is executed.
Make sure change tracking is enabled in your database project.
Open your database project's properties > Project Settings > Database Settings... > Operational tab > check the "Change tracking" option
As Keith said, enable it in the project if you want it enabled. If you do want to disable it, then just run your script before doing the compare, so you have a pre-pre-deploy script like:
https://the.agilesql.club/Blog/Ed-Elliott/Pre-Compare-and-Pre-Deploy-Scripts-In-SSDT
If you are disabling it then it is a one off thing so pretty simple.
Other options are to write your own deployment contributor or to raise a bug via Connect.
Deployment Contributor:
https://the.agilesql.club/blog/Ed-Elliott/2015/09/23/Inside-A-SSDT-Deployment-Contributor
https://github.com/DacFxDeploymentContributors/Contributors
Ed
The following is the start of the standard Install Northwind SQL script, as provided by Microsoft in their "Install SQL Server 2000 sample databases" download.
SET NOCOUNT ON
GO
USE master
GO
if exists (select * from sysdatabases where name='Northwind')
drop database Northwind
go
DECLARE @device_directory NVARCHAR(520)
SELECT @device_directory = SUBSTRING(filename, 1, CHARINDEX(N'master.mdf', LOWER(filename)) - 1)
FROM master.dbo.sysaltfiles WHERE dbid = 1 AND fileid = 1
EXECUTE (N'CREATE DATABASE Northwind
  ON PRIMARY (NAME = N''Northwind'', FILENAME = N''' + @device_directory + N'northwnd.mdf'')
  LOG ON (NAME = N''Northwind_log'', FILENAME = N''' + @device_directory + N'northwnd.ldf'')')
go
exec sp_dboption 'Northwind','trunc. log on chkpt.','true'
exec sp_dboption 'Northwind','select into/bulkcopy','true'
GO
set quoted_identifier on
GO
Under normal circumstances I always use a full copy of SQL Server or SQL Server Express for development. However, an unrelated support issue with a third-party component has occurred that requires me to provide a self-contained sample application, with the basic Northwind database file contained within the sample, using LocalDB.
To that end, how should I adapt the EXECUTE CREATE DATABASE section of the SQL script so that it will create a copy of the Northwind .mdf in a given location (let's say C:\MyData), so that I can then use that file to send with the sample I need to build for the support team? Essentially it is vital that they have a completely self-contained sample to help narrow down the problem.
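One way to adapt it, as a sketch: replace the computed @device_directory with a fixed path so the files land in C:\MyData (the directory must already exist and be writable by the SQL Server/LocalDB instance):

```sql
DECLARE @device_directory NVARCHAR(520)
SET @device_directory = N'C:\MyData\'   -- fixed target instead of the master.mdf folder

EXECUTE (N'CREATE DATABASE Northwind
  ON PRIMARY (NAME = N''Northwind'', FILENAME = N''' + @device_directory + N'northwnd.mdf'')
  LOG ON (NAME = N''Northwind_log'', FILENAME = N''' + @device_directory + N'northwnd.ldf'')')
```

After the install script finishes, EXEC sp_detach_db 'Northwind' should release the .mdf/.ldf so they can be copied into the sample.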
Many Thanks
I use the handy Database Diagramming tool in SQL Server 2008 for creating and managing relationships. I have exported the sourceDB to the destinationDB but the diagram doesn't come across.
I am looking around trying to figure out how to export just the diagram I have in one database to another... This online KB article fails since select * from dtproperties doesn't exist anymore.
@Ash I was having the same problem. Here's what we did to get around it...
It seems that System Diagrams are stored within the "sysdiagrams" table. So the first thing you need to do is determine the diagram_id of the Diagram you wish to copy. Run the following query to list them all. ** Note you need to replace "SourceDB" with the name of your database.
-- List all database diagrams
SELECT * FROM [SourceDB].[dbo].sysdiagrams
Then you can use INSERT to duplicate the diagram from one database to another as follows. ** Note again, replace "SourceDB" with the name of the database containing the existing diagram and "DestinationDB" with the name of the database you wish to copy to. Also, @SourceDiagramId should be set to the id retrieved above.
-- Insert a particular database diagram
DECLARE @SourceDiagramId int = 1

INSERT INTO [DestinationDB].[dbo].sysdiagrams
SELECT [name], diagram_id, version, definition FROM [SourceDB].[dbo].sysdiagrams
WHERE diagram_id = @SourceDiagramId
Then you need to set the "principal_id" to 1 manually.
-- Update the principal_id (no idea why, but it set the owner as some asp_net user)
UPDATE [DestinationDB].[dbo].sysdiagrams
SET principal_id = 1
This worked for us, though it seems pretty hacky, especially since the diagram is stored entirely in a single binary field, "definition".
Answer comes from:
http://www.dotnetspider.com/resources/21180-Copy-or-move-database-digram-from-for.aspx
This generates an import string:
SELECT
    'DECLARE @def AS VARBINARY(MAX) ; ' +
    'SELECT @def = CONVERT(VARBINARY(MAX), 0x' + CONVERT(NVARCHAR(MAX), [definition], 2) + ', 2) ;' +
    ' EXEC dbo.sp_creatediagram' +
    ' @diagramname=''' + [name] + ''',' +
    ' @version=' + CAST([version] AS NVARCHAR(MAX)) + ',' +
    ' @definition=@def'
    AS ExportQuery
FROM
    [dbo].[sysdiagrams]
WHERE
    [name] = '' -- Diagram Name
Next, you run the generated string in other DB.
As PROCEDURE:
-- =============================================
-- Author:      Eduardo Cuomo
-- Description: Export Database Diagram to SQL Query
-- =============================================
CREATE PROCEDURE [dbo].[Sys_ExportDatabaseDiagram]
    @name SYSNAME -- Diagram Name
AS
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;

    SELECT
        'DECLARE @def AS VARBINARY(MAX) ; ' +
        'SELECT @def = CONVERT(VARBINARY(MAX), 0x' + CONVERT(NVARCHAR(MAX), [definition], 2) + ', 2) ; ' +
        ' EXEC dbo.sp_creatediagram' +
        ' @diagramname=''''' + [name] + ''''',' +
        ' @version=' + CAST([version] AS NVARCHAR(MAX)) + ',' +
        ' @definition=@def'
        AS ExportQuery
    FROM
        [dbo].[sysdiagrams]
    WHERE
        [name] = @name
You can get rid of the UPDATE statement by fixing your INSERT statement - specifically the select portion. You are inserting the diagram_id column into the principal_id column (diagram_id is an identity).
Change it to:
DECLARE @SourceDiagramId int = 1

INSERT INTO [DestinationDB].[dbo].sysdiagrams
SELECT [name], principal_id, version, definition FROM [SourceDB].[dbo].sysdiagrams
WHERE diagram_id = @SourceDiagramId
And presto, it's all in there right the first time.
As in C Isaze's answer, there are three simple steps:
1- Create the same number of "dummy" diagrams in the target server where you want to copy the diagrams
2- Add the target server as a Linked Server in the source server
3- run this script on source server
update [LINKEDSERVER].TARGETDB.[dbo].sysdiagrams set [definition]=
(SELECT [definition] from SOURCEDB.[dbo].sysdiagrams WHERE diagram_id = 1)
where diagram_id=1
If the databases are in different servers, there may be permission issues.
To copy the sysdiagrams, create the same number of "dummy" diagrams in the target server where you want to copy the diagrams, add the target server as a Linked Server in the source server and then run the script:
SELECT * from [LINKEDSERVER].TARGETDB.[dbo].sysdiagrams
SELECT * from SOURCEDB.[dbo].sysdiagrams
update [LINKEDSERVER].TARGETDB.[dbo].sysdiagrams set definition=
(SELECT definition from SOURCEDB.[dbo].sysdiagrams WHERE diagram_id = 1)
where diagram_id=1
-- the first 2 select commands will confirm that you are able to connect to both databases
-- then change the id as required to copy all the diagrams
There's a tool for exporting the diagrams to file and back into a database that you can find here: https://github.com/timabell/database-diagram-scm/
You'd be able to use this by pointing it at your original database and doing an export, and then pointing at your target database and doing an import.
I have an MS SQL 2000 database that was backed up from a public server and restored at a test location for an upgrade test. The problem is that the user that had access permission on the public server does not exist on the testing server, and now all tables are prefixed with that username (which requires ALL queries against those tables to be changed!)
Is there any quick way to fix this? I have changed the database owner but this did not help.
Create the login and users, but find out the SID from sysusers
EXEC sp_addlogin 'TheLogin', 'ThePassword', @sid = ???
EXEC sp_adduser 'TheLogin','TheUser'
Note: SQL Server 2000 so can't use CREATE LOGIN or CREATE USER
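As a sketch of the idea (the login/user/database names are placeholders): read the orphaned user's SID from sysusers in the restored database and pass it to sp_addlogin, so the new login maps onto the existing user:

```sql
USE RestoredDb;  -- placeholder for the restored database's name

-- Find the orphaned user's SID
DECLARE @sid varbinary(85)
SELECT @sid = sid FROM sysusers WHERE name = 'TheUser'

-- Create the login with that exact SID; the existing user then resolves to it
EXEC sp_addlogin 'TheLogin', 'ThePassword', @sid = @sid
```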
OK, found the answer: the OBJECT owner must be changed to DBO, negating the need to prefix references to the object in your SQL scripts/queries; the objects in this case being the database tables.
Here is a script that will change the owner for objects within a database (not my own code)
DECLARE @currentObject nvarchar(517)
DECLARE @qualifiedObject nvarchar(517)
DECLARE @currentOwner varchar(50)
DECLARE @newOwner varchar(50)

SET @currentOwner = 'old_owner'
SET @newOwner = 'dbo'

DECLARE alterOwnerCursor CURSOR FOR
    SELECT [name] FROM dbo.sysobjects
    WHERE (xtype = 'U' OR xtype = 'P')  -- parentheses needed: AND binds tighter than OR
      AND LEFT([name], 2) <> 'dt'

OPEN alterOwnerCursor
FETCH NEXT FROM alterOwnerCursor INTO @currentObject
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @qualifiedObject = CAST(@currentOwner AS varchar) + '.' + CAST(@currentObject AS varchar)
    EXEC sp_changeobjectowner @qualifiedObject, @newOwner
    FETCH NEXT FROM alterOwnerCursor INTO @currentObject
END

CLOSE alterOwnerCursor
DEALLOCATE alterOwnerCursor