Import Wide World Importers DB into Azure Data Studio - sql-server

I would like to import the sample WideWorldImportersDW-Standard.bacpac database from:
https://github.com/Microsoft/sql-server-samples/releases/download/wide-world-importers-v1.0/WideWorldImportersDW-Standard.bacpac
into Azure Data Studio 1.32 on Windows 10. I am using the Data-tier Application Wizard -> Create a DB from a .bacpac file, and I get the following error, which has stumped me:
Import bacpac: Could not import package.
Error SQL72014: Core Microsoft SqlClient Data Provider: Msg 5105, Level 16, State 2, Line 1 A file activation error occurred. The physical file name 'WideWorldImportersDW-Standard_USERDATA_B694BC2.mdf' may be incorrect. Diagnose and correct additional errors, and retry the operation.
Error SQL72045: Script execution error. The executed script:
ALTER DATABASE [$(DatabaseName)]
ADD FILE (NAME = [USERDATA_B694BC2], FILENAME = N'$(DefaultDataPath)$(DefaultFilePrefix)_USERDATA_B694BC2.mdf') TO FILEGROUP [USERDATA];
Error SQL72014: Core Microsoft SqlClient Data Provider: Msg 5009, Level 16, State 8, Line 1 One or more files listed in the statement could not be found or could not be initialized.
Error SQL72045: Script execution error. The executed script:
ALTER DATABASE [$(DatabaseName)]
ADD FILE (NAME = [USERDATA_B694BC2], FILENAME = N'$(DefaultDataPath)$(DefaultFilePrefix)_USERDATA_B694BC2.mdf') TO FILEGROUP [USERDATA];
Any help would be appreciated.

I followed the same MS document and was able to import the WideWorldImporters-Full.bacpac database successfully using Azure Data Studio.
Please make sure you provide the correct file location in Step 2 when importing the .bacpac file.
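Another thing worth checking (an inference from the Msg 5105 text above, not something confirmed in this answer) is where the wizard's $(DefaultDataPath) resolves on the target instance, since the failing ALTER DATABASE ... ADD FILE statement builds the .mdf path from it. A quick check on SQL Server 2012 or later:
SELECT SERVERPROPERTY('InstanceDefaultDataPath') AS DefaultDataPath,
       SERVERPROPERTY('InstanceDefaultLogPath') AS DefaultLogPath;
If either value is NULL or points at a folder that does not exist or is not writable by the SQL Server service account, the ADD FILE step can fail with this kind of file activation error.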

Related

Could not import package. Warning SQL72012: The object exists in the target

I exported my Azure database to a .bacpac file using Tasks > Export Data-tier Application. Recently, when I tried to import it into my local database server (Tasks > Import Data-tier Application), I encountered this error:
Could not import package.
Warning SQL72012: The object [MyDatabase_Data] exists in the target, but it will not be dropped even though you selected the 'Generate drop statements for objects that are in the target database but that are not in the source' check box.
Warning SQL72012: The object [MyDatabase_Log] exists in the target, but it will not be dropped even though you selected the 'Generate drop statements for objects that are in the target database but that are not in the source' check box.
Error SQL72014: .Net SqlClient Data Provider: Msg 12824, Level 16, State 1, Line 5 The sp_configure value 'contained database authentication' must be set to 1 in order to alter a contained database. You may need to use RECONFIGURE to set the value_in_use.
Error SQL72045: Script execution error. The executed script:
IF EXISTS (SELECT 1
FROM [master].[dbo].[sysdatabases]
WHERE [name] = N'$(DatabaseName)')
BEGIN
ALTER DATABASE [$(DatabaseName)]
SET CONTAINMENT = PARTIAL
WITH ROLLBACK IMMEDIATE;
END
Error SQL72014: .Net SqlClient Data Provider: Msg 5069, Level 16, State 1, Line 5 ALTER DATABASE statement failed.
Error SQL72045: Script execution error. The executed script:
IF EXISTS (SELECT 1
FROM [master].[dbo].[sysdatabases]
WHERE [name] = N'$(DatabaseName)')
BEGIN
ALTER DATABASE [$(DatabaseName)]
SET CONTAINMENT = PARTIAL
WITH ROLLBACK IMMEDIATE;
END
(Microsoft.SqlServer.Dac)
I followed the advice on other posts and tried to run this on the SQL Azure database:
sp_configure 'contained database authentication', 1;
GO
RECONFIGURE;
GO
However, it says
Could not find stored procedure 'sp_configure'.
I understand the equivalent statement in Azure is:
https://learn.microsoft.com/en-us/sql/t-sql/statements/alter-database-scoped-configuration-transact-sql?view=sql-server-2017
What is the equivalent statement to "sp_configure 'contained database authentication', 1;"?
The solution is to execute this against the master database of your local/on-premises SQL Server:
sp_configure 'contained database authentication', 1;
GO
RECONFIGURE;
GO
Thank you to David Browne - Microsoft and Alberto Morillo for the quick solution.
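If you want to confirm the setting took effect before retrying the import, a quick check against the same local instance (just a sketch, not part of the answer above):
SELECT name, value, value_in_use
FROM sys.configurations
WHERE name = N'contained database authentication';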
I had the same issue, and it was fixed by importing the bacpac via the command prompt. In the Import Bacpac link, go to the SqlPackage section and run the command provided there.
sqlpackage.exe /a:import /tcs:"Data Source=<serverName>.database.windows.net;Initial Catalog=<migratedDatabase>;User Id=<userId>;Password=<password>" /sf:AdventureWorks2008R2.bacpac /p:DatabaseEdition=Premium /p:DatabaseServiceObjective=P6
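If the target is a local SQL Server rather than Azure, the same approach works with a different connection string. A minimal sketch, assuming Windows authentication, a local default instance, and placeholder database/file names (the DatabaseEdition and DatabaseServiceObjective properties apply only to Azure SQL Database, so they are omitted here):
sqlpackage.exe /a:import /tcs:"Data Source=localhost;Initial Catalog=MyDatabase;Integrated Security=True" /sf:MyDatabase.bacpac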
The solution that worked for me was to copy the DB directly from Azure to my localhost:
1. Create a new database on your local DB server.
2. Right-click the new database, then Tasks -> Import Data.
3. Provide the connection details for the Azure DB (source).
4. Provide the connection details for your localhost (destination).
5. Click Next, then check the first checkbox to select all.
6. On the next form, choose Run immediately.
7. Wait until the process is completed.
Using the Azure portal to import a bacpac:
The collation chosen in the first step of the Azure SQL import wizard is important!
It needs to match the collation of the source DB that was used to create the bacpac.
The sqlpackage and SSMS methods do NOT need this step.
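To check the collation of the source database before starting the portal import (the database name is a placeholder), something like this works on both SQL Server and Azure SQL Database:
SELECT DATABASEPROPERTYEX(N'MyDatabase', 'Collation') AS DatabaseCollation;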

Error Importing DB from SQL Azure to a localDB

I have an Azure SQL (PaaS) instance which we have exported into a .bacpac file. When I try to import the .bacpac into my local SQL Server 2017 CU14, I get an error on an external data source. I do not need this external data source, so I am trying to figure out a way to import this .bacpac anyway.
I have tried using sqlpackage.exe (the Import action doesn't let you exclude object types, and the Publish action requires a .dacpac, not a .bacpac).
The error is:
Error importing database:Could not import package.
Error SQL72014: .Net SqlClient Data Provider:
Msg 102, Level 15, State 1, Line 3
Incorrect syntax near 'RDBMS'.
Error SQL72045: Script execution error. The executed script: CREATE EXTERNAL DATA SOURCE [LocalLoopBack]
WITH (
TYPE = RDBMS,
LOCATION = N'xxxxxx.database.windows.net',
DATABASE_NAME = N'xxxxxxx',
CREDENTIAL = [xxxxxx]
);
Is there any way I can get this to import?
Can you copy that Azure SQL Database with a new name using the Azure portal, as explained here?
Remove the external data source from the new database and then export it as a bacpac. Delete the database created by the copy operation. Import the bacpac you created into your localDB instance.
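Sketching that suggestion in T-SQL, with placeholder names (the copy statement runs against the master database of the Azure logical server, the drops run inside the copied database, and the credential name here is hypothetical):
-- 1. Copy the database under a new name (run against master on the Azure server).
CREATE DATABASE MyDatabase_Copy AS COPY OF MyDatabase;
-- 2. In the copy, drop the external data source first, then the credential it uses.
DROP EXTERNAL DATA SOURCE [LocalLoopBack];
DROP DATABASE SCOPED CREDENTIAL [MyCredential];
-- 3. Export MyDatabase_Copy as a bacpac, import it locally, then delete the copy.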

Cannot Do BULK INSERT or cannot run BCP OpenRowSet from CSV File?

I have data in a CSV file. I am trying to insert data from this CSV file into a SQL Server database table.
I tried the two options below. Neither is working for me.
1. BULK INSERT
2. BCP OPENROWSET
For BULK INSERT, I am getting the error below:
Msg 4861, Level 16, State 1, Line 1
Cannot bulk load because the file "\\ATACLS001PVFS\userdata$\haritha.pinninty\work\Test\Test.csv" could not be opened. Operating system error code 5(Access is denied.).
For BCP OPENROWSET, I am getting the error below:
Msg 7403, Level 16, State 1, Line 1
The OLE DB provider "Microsoft.ACE.OLEDB.12.0" has not been registered.
How do I resolve these issues?
I am executing these queries/stored procedures from SQL Server Query Analyzer, where I logged on using Windows Authentication.
I do not have admin privileges on the machine yet.
Appreciate your responses.
Thanks,
Rita
The error message says it all: you need to download the ACE provider using the link below. Please note that it should be installed on the machine where you are trying to import.
https://www.microsoft.com/en-us/download/details.aspx?id=13255
You need admin permissions to install it.
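Once the ACE provider is installed, an OPENROWSET call against the CSV usually looks something like the sketch below. The folder path is taken from the error message above; the HDR option and the header-row assumption are guesses about the file layout, and 'Ad Hoc Distributed Queries' must be enabled on the server.
SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Text;Database=\\ATACLS001PVFS\userdata$\haritha.pinninty\work\Test\;HDR=YES',
                'SELECT * FROM Test.csv');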
"For Bulk Insert I am getting this below Error."
Did you have the file open (i.e. Excel)? If so, close it from your desktop application.
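For reference, a minimal BULK INSERT sketch against a hypothetical target table (the file path is the one from the error message; the terminators and FIRSTROW are assumptions about the CSV layout). Operating system error 5 on a UNC path is usually a permissions problem on the share rather than a syntax problem, so the account reading the file also needs access to it.
-- dbo.TestTable is hypothetical; the CSV is assumed to be comma-delimited with a header row.
BULK INSERT dbo.TestTable
FROM '\\ATACLS001PVFS\userdata$\haritha.pinninty\work\Test\Test.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);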

Azure Import Error: The internal target platform type SqlAzureV12DatabaseSchemaProvider does not support schema file version '3.1'

For some reason I cannot import new BACPACs from Azure. I can still import old ones.
This is the error message I get:
Internal Error. The internal target platform type SqlAzureV12DatabaseSchemaProvider does not support schema file version '3.1'.
I've tried this solution, but it didn't help, because all my settings are already set to the defaults.
I also downloaded the latest SSMS Preview, but on import it gives me other errors:
Warning SQL0: A project which specifies Microsoft Azure SQL Database v12 as the target platform may experience compatibility issues with SQL Server 2014.
Error SQL72014: .Net SqlClient Data Provider: Msg 102, Level 15, State 1, Line 1 Incorrect syntax near 'Admin'.
Error SQL72045: Script execution error. The executed script:
CREATE DATABASE SCOPED CREDENTIAL [Admin]
WITH IDENTITY = N'Admin';
Error SQL72014: .Net SqlClient Data Provider: Msg 319, Level 15, State 1, Line 2 Incorrect syntax near the keyword 'with'. If this statement is a common table expression, an xmlnamespaces clause or a change tracking context clause, the previous statement must be terminated with a semicolon.
Error SQL72045: Script execution error. The executed script:
CREATE DATABASE SCOPED CREDENTIAL [Admin]
WITH IDENTITY = N'Admin';
I have SSMS 2014 CU6 installed.
Any help would be much appreciated! Thank you!
Finally figured out what happened. It's a specific case, but maybe it will help someone else.
We tried to use elastic query to write queries across databases. To do that, you need to create database scoped credentials. When the package was imported, it tried to do the same locally and failed executing this:
CREATE DATABASE SCOPED CREDENTIAL [Admin]
WITH IDENTITY = N'Admin';
Since we decided to use a different approach, I dropped the scoped credential and the external data source (the credential can't be dropped without dropping the data source first):
DROP EXTERNAL DATA SOURCE [Source];
DROP DATABASE SCOPED CREDENTIAL [Admin];
Now everything is working again. Just be aware that you cannot import a database from Azure if it has scoped credentials created.
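If you are not sure whether a database contains such objects before exporting it, the catalog views list them (just a sketch; run it in the database you are about to export):
SELECT name FROM sys.database_scoped_credentials;
SELECT name FROM sys.external_data_sources;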
Make sure that you are using the new SQL Server Management Studio
https://msdn.microsoft.com/en-us/library/mt238290.aspx

Trying to Import FoxPro DBF File to SQL Server

As the title says, I'm trying to import a FoxPro DBF file into SQL Server using OPENROWSET. At first I tried to export the DBF to an XLS file and import it using the Import/Export Wizard. This works pretty well normally, but there is one field that sometimes holds a really long string, and this string is being truncated at 4096 characters during the export from DBF to XLS.
I found an old post with instructions on how to do this using OPENROWSET.
When I try the first answer:
select *
from openrowset('MSDASQL', 'Driver=Microsoft Visual FoxPro Driver;
SourceDB=\\path\;
SourceType=DBF',
'select * from TABLE.DBF')
I get the error:
OLE DB provider "MSDASQL" for linked server "(null)" returned message "[Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified".
Msg 7303, Level 16, State 1, Line 1
Cannot initialize the data source object of OLE DB provider "MSDASQL" for linked server "(null)".
When I try the second answer:
select *
from openrowset('VFPOLEDB',
'\\Path\';'';'',
'select * from TABLE.DBF')
I get the error:
Msg 7403, Level 16, State 1, Line 1
The OLE DB provider "VFPOLEDB" has not been registered.
I tried to register the OLE*.dll files manually with regsvr32, but only some of them worked. On ole32, oleacc, oleaut32, and oleprn I got a success message. On oleacchooks, oleaccrc, oledlg, and oleres I got this error:
The module "oleacchooks" was loaded but the
entry-point DllRegisterServer was not found.
Make sure that "oleacchooks" is a valid DLL or OCX file
and then try again
After some investigation I tried to install the component, but when I tried to install the MSI file for FoxPro (found here), I got this error:
An error occurred while processing the last operation.
Error code 80110408 - Error occurred reading the application file
The event log may contain additional troubleshooting information.
So, I'm officially lost here. Does anybody have suggestions on how to get OPENROWSET to work, or some other way of importing the DBF file?
Pat, you can use DBF Commander Pro for this task.
Download it, install it, then click File -> Export to DBMS. In the window that appears, click the Build button to build the connection string: select Microsoft OLE DB Provider for SQL Server, choose your server from the list, provide a login and password, select a database, and click OK:
In the Export to DBMS window, select the destination table you want to import the source DBF file into, then click Export.
More information on importing and exporting DBF files to a database can be found here.
P.S. The app has a fully functional 20-day free trial period.
Try using VFPOLEDB.1 as the provider. You may be experiencing driver version issues.
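In other words, keeping the rest of the second query from the question unchanged, only the provider name changes to the versioned one (still a sketch; it also assumes the installed VFP OLE DB provider's bitness matches your SQL Server instance):
SELECT *
FROM OPENROWSET('VFPOLEDB.1',
                '\\Path\';'';'',
                'select * from TABLE.DBF');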

Resources