Cannot Import DB using the Azure Portal - sql-server

I am trying to import a DB in the Azure Portal. The original DB I exported from was on a different server, but it is set up identically to the server I am importing to. I am importing by going to the target server and clicking the Import button. I then choose the storage account, container, and bacpac file I would like to import. I check that the database size and type for the import are the same as the bacpac file, and I also double-check that the collation on the import matches the bacpac. I then confirm. It attempts the import for about 20 minutes before giving the error message below. I can see the DB is created when I go to the SQL server and click the SQL databases blade, but the tables inside the DB are empty.
Could not import package.
Warning SQL72012: The object [data_0] exists in the target, but it will not be dropped even though you selected the 'Generate drop statements for objects that are in the target database but that are not in the source' check box.
Warning SQL72012: The object [log] exists in the target, but it will not be dropped even though you selected the 'Generate drop statements for objects that are in the target database but that are not in the source' check box.
Error SQL72014: .Net SqlClie
I have seen some responses regarding similar issues, but they all seem to be using SSMS. Does anyone have any ideas on how to fix this issue inside the Azure Portal? Also, does anyone know what check box they are talking about? There is no check box when I am doing the import setup.

The warnings you are getting are a bit of a red herring; the issue is with the error. The line you posted only shows a generic error, which should be followed by the actual error. Try going to the actual database server and checking the import/export history.
Also try importing using PowerShell, which may give you more details on the error you are getting:
# $importRequest is the object returned when the import was started (see the sketch below)
$importStatus = Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink
[Console]::Write("Importing")
# Poll the operation status every 10 seconds until the import is no longer in progress
while ($importStatus.Status -eq "InProgress") {
    $importStatus = Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink
    [Console]::Write(".")
    Start-Sleep -s 10
}
[Console]::WriteLine("")
# The final status object contains the error details if the import failed
$importStatus
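The polling loop above assumes $importRequest is the object returned when the import was started with New-AzSqlDatabaseImport. A rough sketch of that call for illustration only - the resource group, server, database, storage account and credentials below are all placeholders, not values from the question:
$importRequest = New-AzSqlDatabaseImport `
    -ResourceGroupName "MyResourceGroup" `
    -ServerName "my-target-server" `
    -DatabaseName "MyImportedDb" `
    -Edition "Standard" `
    -ServiceObjectiveName "S2" `
    -DatabaseMaxSizeBytes 268435456000 `
    -StorageKeyType "StorageAccessKey" `
    -StorageKey "<storage account key>" `
    -StorageUri "https://mystorageaccount.blob.core.windows.net/bacpacs/MyDb.bacpac" `
    -AdministratorLogin "serveradmin" `
    -AdministratorLoginPassword (ConvertTo-SecureString "<password>" -AsPlainText -Force)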
Without knowing the error you are actually getting, it's a bit of a stab in the dark trying to guess what the issue is. Given the version of SQL Server you are exporting from, I would guess this is an on-premises database server.
One of the common reasons bacpac files fail to import is that the source database server isn't configured to allow contained databases.
If that's the case, you need to go to your source database server (the one you are exporting from) and enable that option:
sp_configure 'contained database authentication', 1;
GO
RECONFIGURE;
GO
Once this has been run, recreate your bacpac file and try importing it again.
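If you'd rather recreate the bacpac from the command line than through SSMS, SqlPackage can do the export as well; a rough sketch with placeholder server, database, credentials and file path:
SqlPackage.exe /Action:Export /SourceServerName:"MySourceServer" /SourceDatabaseName:"MyDb" /SourceUser:"sa" /SourcePassword:"<password>" /TargetFile:"C:\temp\MyDb.bacpac"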
As I mentioned, this is a complete stab in the dark, as you haven't provided the error you are actually getting.

Related

Azure SQL Bacpac import continually fails

Created a new SQL Server and uploaded a bacpac file to storage. In the SQL Server area, I hit the Import button, filled in the fields, and the import starts. It always fails 5 minutes later, giving me an empty database which then has to be deleted, as you can't restore an Azure DB. The bacpac file was created with SSMS 2016 CTP3. It couldn't connect directly to the DB server; the error said "Bad request". Importing the bacpac from within the Azure UI also results in "Bad request":
Inner exception Microsoft.WindowsAzure.StorageClient.StorageClientException:
The value for one of the HTTP headers is not in the correct format.;
Inner exception System.Net.WebException:The remote server returned an
error: (400) Bad Request
The DB is an Umbraco website, so nothing huge or controversial, I wouldn't have thought. Or is Azure down today? Getting this far has been a trial, as the UI is the worst. However, if anyone can give me any advice on how to get this done, I'd appreciate it.
I eventually found an article that stated the blob storage had to be a "Classic" storage account. So I tried that and it all worked. The Azure UI is awful; it's too cryptic and there are far too many important things hidden by fancy widgets of limited use, IMHO. Anyway, sorted now, hope this helps someone else.
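For anyone wanting to script that workaround, a rough sketch using the classic-era Azure PowerShell cmdlets to create a classic storage account and upload the bacpac into it (the account, container and file names are placeholders, not values from the post):
# Create a classic (service management) storage account and upload the bacpac to it
New-AzureStorageAccount -StorageAccountName "mybacpacstorage" -Location "West Europe"
$key = (Get-AzureStorageKey -StorageAccountName "mybacpacstorage").Primary
$ctx = New-AzureStorageContext -StorageAccountName "mybacpacstorage" -StorageAccountKey $key
New-AzureStorageContainer -Name "bacpacs" -Context $ctx
Set-AzureStorageBlobContent -File "C:\temp\MySite.bacpac" -Container "bacpacs" -Context $ctx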

SQL Server 2008 R2 Errors running SSIS Package through the Job Agent

I have an SSIS package that I created through the Import and Export (32-bit) tool. When I execute the package manually through the Execute Package Utility, it runs successfully with no issues. However, when I try to run the package through a SQL Server Agent job in SSMS, I keep getting errors. The primary error I get seems to be:
Failed to decrypt protected XML node "DTS:Password" with error
0x8009000B "Key not valid for use in specified state" You may not be
authorized to access this information. This error occurs when there is
a cryptographic error. Verify that the correct key is available.
I'm using SQL Server 2008 R2.
I have researched this error to some degree, and I think it has something to do with the package protection level. I feel like I've tried the configurations that make the most sense, but none seem to be working for me. The options are:
Encrypt sensitive data with user key
Do not save sensitive data
Encrypt sensitive data with password
Encrypt all data with user key
Encrypt all data with password
Rely on server storage and roles for access control
I feel like I should be using the last option here (Rely on server storage...) because I prefer to use SQL Server Authentication. I use SQL Server Authentication on the 'Choose a Destination' window of the SQL Server Import and Export Wizard, and I use the same username and password when I create the job in SSMS, on the General tab of the Job Step Properties. Is it possible that there is something I need to add to the user I'm using in SSMS, even though it works outside of SSMS?
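As an aside, the protection level of an existing package can also be changed from the command line with dtutil, which makes it quicker to experiment with these options; a rough sketch only, where the path, level number and password are placeholders and the numeric level-to-name mapping should be double-checked against the dtutil documentation:
REM Re-save the package with protection level 2 (EncryptSensitiveWithPassword) and a password
dtutil /FILE C:\Packages\MyPackage.dtsx /ENCRYPT FILE;C:\Packages\MyPackage.dtsx;2;MyPassword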
Something else I wondered might have an impact is having the option "Drop and re-create destination table" checked in the Column Mappings window of the Import and Export tool. I was using a stored procedure to remove the tables before executing the job, and I feel like it was working at one point - could that have something to do with it?
Again, the thing that baffles me most is that it runs with no problem when I execute it manually through the Execute Package Utility.
I've included images of some of the windows I mentioned above if that helps.

How to import a database (or Schema) using impdp command in Oracle?

I've not worked with an Oracle database before; everything I've done with databases has been in MySQL. I got a .dpdmp file which I need to import into an Oracle database.
I tried the example provided in this link, but not a single statement is executed; in total it completes with 208 errors.
http://gerardnico.com/wiki/database/oracle/oracle_db_datapump
impdp system/root DIRECTORY=data_dump_dir DUMPFILE=MYDUMPFILE.DPDMP
If I look at the log files, I assume this is the root cause of the problem:
Processing object type SCHEMA_EXPORT/USER
ORA-39083: Object type USER:"MYUSER" failed to create with error:
ORA-65096: invalid common user or role name
Since the user creation failed, every statement executed after it also resulted in an error. The dump file was created on Oracle 10g, whereas my Oracle is 12c. Is this due to the version mismatch?
Please create the user "MYUSER" first and then try the import with additional parameters like:
remap_tablespace=old_tbsp:new_tbsp
This will import the data into your new tablespace.
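For illustration only, a sketch of what that could look like; the names, password and tablespaces are placeholders, and since ORA-65096 is raised when a regular user name is created in the 12c container root, the user would normally be created while connected to the pluggable database rather than the CDB root:
-- Placeholder password and tablespaces; run while connected to the pluggable database
CREATE USER MYUSER IDENTIFIED BY mypassword
  DEFAULT TABLESPACE users
  TEMPORARY TABLESPACE temp;
GRANT CONNECT, RESOURCE TO MYUSER;
The import can then be re-run against that pluggable database with the tablespace remapped, for example (pdb1 is a placeholder service name):
impdp system/root@pdb1 DIRECTORY=data_dump_dir DUMPFILE=MYDUMPFILE.DPDMP REMAP_TABLESPACE=old_tbsp:new_tbsp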

SSIS: Enabling transactions causes database connection to fail

I have a package that I'm using to load records from a CSV file into a table. It has three elements in the control flow:
Truncate table
Load File into Table
Verify that there are records on the table after the load or raise an error
The idea is to have a single transaction on the package, so if the load fails or the file is empty, the transaction is rolled back and the table isn't truncated.
To enable the transaction I go to the package properties and set TransactionOption=Required. Then, when I try to execute the package, I get this error while it tries to execute the first element (the SQL task that truncates the table):
[Execute SQL Task] Error: Failed to acquire connection "Database
Connection". Connection may not be configured correctly or you may not
have the right permissions on this connection.
If I go back and change the TransactionOption property of the package to the default (Supported), the package executes correctly, but if there's an error there's no rollback.
I am using ADO.NET to connect to a SQL Server DB.
Any idea what I am doing wrong? Is this the correct way to use transactions, or am I missing something?
Thanks!
I know this is an old topic, but I had the same problem as you - the package works fine until I set one of the containers' TransactionOption to Required.
From what I understand, this might be related to the Microsoft Distributed Transaction Coordinator (MSDTC) service not being started on the SQL server.
When I had this issue I checked whether MSDTC was started on the machine on which I was running the package - it was. Sadly, I couldn't access the SQL server to check the same thing there.
But, following these steps on the machine running the package solved the problem:
On Windows Server 2008 and Windows Vista:
Click Start, click Run, and type dcomcnfg to launch the Component Services Management console.
Click to expand Component Services and click to expand Computers.
Click to expand My Computer, click to expand Distributed Transaction Coordinator, right-click Local DTC, and click Properties.
Click the Security tab of the Local DTC Properties dialog.
In that dialog box, I had to enable "Network DTC Access" and also "Allow Inbound" and "Allow Outbound".
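If it is easier to script, the same security settings live under HKLM\SOFTWARE\Microsoft\MSDTC\Security in the registry; a rough PowerShell sketch (unverified, so check the value names on your server before relying on it):
# Check whether the MSDTC service is running at all
Get-Service -Name MSDTC
# Enable network DTC access plus inbound and outbound transactions
Set-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\MSDTC\Security' -Name 'NetworkDtcAccess' -Value 1
Set-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\MSDTC\Security' -Name 'NetworkDtcAccessInbound' -Value 1
Set-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\MSDTC\Security' -Name 'NetworkDtcAccessOutbound' -Value 1
# Restart MSDTC so the new settings take effect
Restart-Service -Name MSDTC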
Sources:
MSDN forum post about this
MSDN article about troubleshooting problems with MSDTC

What state is my SQL server database in when msdeploy fails on user creation?

I am using msdeploy (version 2) to transfer a database from machine A to machine B.
In the database on machine A there are some users that do not exist on machine B, so the transfer (partially) fails with the message:
Error Code: ERROR_SQL_EXECUTION_FAILURE
More Information: An error occurred during execution of the database script.
The error occurred between the following lines of the script: "3" and "5".
The verbose log might have more information about the error.
The command started with the following: "CREATE USER [someDomain\someUser] FOR LOGIN [someDomain"
Windows NT user or group 'someDomain\someUser' not found.
Check the name again. http://go.microsoft.com/fwlink/?LinkId=178587
The database seems to have been transferred, except for the user creation. Does anyone know what state the database is in after this failure?
Is there any way I can transfer the database without the users (or, better, without specific users) using msdeploy?
Web Deploy uses SMO (SQL Management Objects) to script out and apply the scripts for SQL databases, and exposes most of the SMO settings with the dbfullsql provider (so, most of these options: http://msdn.microsoft.com/en-us/library/microsoft.sqlserver.management.smo.transfer_properties.aspx). If you want to skip the users due to this kind of login-not-exists or user-not-found error, you should be able to do this by adding the scripting option: copyAllUsers=false to the source of the sync. For example:
msdeploy.exe -verb:sync -source:dbfullsql="Data Source=.\SQLExpress;Initial Catalog=MySourceDb;User Id=localUser;Password=LocalPass",copyAllUsers=false -dest:dbfullsql="Data Source=RemoteSQLServer;Initial Catalog=MyDestDb;User Id=remoteUser;Password=RemotePass"
Incidentally, I am surprised you note the db appears to have been sync'd - I would expect this is not actually the case. If you have the permissions for it, Web Deploy will create the database if it did not already exist when it initially tries to make the connection, but your failure occurred very early in the script execution, and I believe Web Deploy dbfullsql syncs are transacted by default (the db creation is separate from the script execution and is not transacted). Thus the db may exist where it did not pre-sync, but I wouldn't expect the data to be present in it.
