Connecting SQL Server to Oracle database using Polybase - sql-server

Trying to create an External Data Source using PolyBase in SQL Server and got this error:
Msg 46505, Level 16, State 1, Line 1 Missing required external DDL option 'TYPE'
However, in the online examples I based this on, there was no TYPE option when creating the external data source.
CREATE EXTERNAL DATA SOURCE test_external_data_source_name WITH (
    LOCATION = 'oracle://10.170.35.249:1522',
    CREDENTIAL = credential_name
);
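For reference, here is a minimal sketch of the full sequence the oracle:// syntax expects on SQL Server 2019 PolyBase, where no TYPE option is needed; on SQL Server 2016/2017, PolyBase only supports Hadoop and Azure Blob Storage sources and TYPE is mandatory, which would explain this error on an older instance. The credential name, user and passwords below are placeholders, not values from the question:
-- Sketch only: names and secrets are placeholders.
-- A database master key must exist before a scoped credential can be created.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<StrongPassword123!>';

-- The credential holds the Oracle login PolyBase uses to connect.
CREATE DATABASE SCOPED CREDENTIAL oracle_credential
WITH IDENTITY = '<oracle_user>', SECRET = '<oracle_password>';

-- SQL Server 2019: the Oracle connector takes no TYPE option.
CREATE EXTERNAL DATA SOURCE test_external_data_source_name WITH (
    LOCATION = 'oracle://10.170.35.249:1522',
    CREDENTIAL = oracle_credential
);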

Related

"A transport-level error has occurred" with BULK INSERT in SQL Server Management Studio

I'm doing a very simple bulk insert:
CREATE EXTERNAL DATA SOURCE mysource WITH (
    TYPE = BLOB_STORAGE,
    LOCATION = 'https://xxxxxxx.blob.core.windows.net/zzzz'
);
BULK INSERT mytable FROM 'myfile.csv'
WITH (
    DATA_SOURCE = 'mysource',
    FORMAT = 'CSV',
    CODEPAGE = 65001,
    FIRSTROW = 2,
    TABLOCK,
    ROWTERMINATOR = '0x0a'
);
Which throws:
Msg 64, Level 20, State 0, Line 0 A transport-level error has occurred
when receiving results from the server. (provider: TCP Provider,
error: 0 - The specified network name is no longer available.)
More information:
I'm using Microsoft SQL Server Management Studio 17.5
Database is in Azure
Storage is in Azure, and the blob has public access
mytable is already large: it currently has 3,942,767 rows.
myfile.csv contains only 2 rows:
id,rbd,run,foo
"aaaabbbbb",4,0,5
In the past, this arrangement (storage + bulk insert) worked fine. Maybe this is happening because the table is too large?
Notice:
This question has been identified as a possible duplicate of this one: A transport-level error has occurred when receiving results from the server. They are very different: this is happening from SSMS sending the command directly to SQL Server, while in that post the error comes from a .NET application. Also, this happens exclusively with BULK INSERT; normal inserts work fine.

Issue creating external tables from sql server to hadoop using polybase

We have recently installed PolyBase in SQL Server. We are trying to use Hortonworks to get data. I am facing the issue below while creating an external table.
Msg 105019, Level 16, State 1, Line 1
EXTERNAL TABLE access failed due to internal error: 'Java exception raised on call to HdfsBridge_IsDirExist:
Error [End of File Exception between local host is: "xxxxx"; destination host is: "xxxxx":1111; :
java.io.EOFException; For more details see: http://wiki.apache.org/hadoop/EOFException] occurred while accessing external file.'
I tried changing the core-site.xml file in the SQL Server PolyBase location by uncommenting the Kerberos option, which now gives me the error below:
Msg 105019, Level 16, State 1, Line 1
EXTERNAL TABLE access failed due to internal error: 'Java exception raised on call to HdfsBridge_Connect:
Error [Unable to instantiate LoginClass] occurred while accessing external file.'
I suspect the issue might be with how I'm creating the database scoped credential.
CREATE DATABASE SCOPED CREDENTIAL HadoopUser3
WITH IDENTITY = '<user>', SECRET = '<Passw0rd>';
GO
Can someone help me understand the issue I'm having with the IDENTITY and SECRET? Is it related to the Hadoop credentials? What kind of secret do we have to give?
A few things to check (a credential and data source sketch follows the list):
In your core-site.xml file, try setting hadoop.security.authentication to lower case "kerberos". Also, make sure you have added polybase.kerberos.realm and polybase.kerberos.kdchost.
Make sure the data source you created uses the correct LOCATION pointing to your HDFS cluster. Make sure it is the IP address:port and not the name of the server itself.
Check the external table and ensure you have the right path to the physical file on the cluster itself.
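On the credential question, a minimal sketch of what a Kerberos-secured setup typically looks like: the IDENTITY is the Kerberos principal on the Hadoop cluster and the SECRET is that principal's password. The names, IP address and port below are placeholders, not values from the question:
-- Sketch only: user, password, IP and port are placeholders.
CREATE DATABASE SCOPED CREDENTIAL HadoopKerberosCred
WITH IDENTITY = '<kerberos_user>', SECRET = '<kerberos_password>';
GO

-- TYPE = HADOOP is required for an HDFS source; LOCATION uses IP:port, not the host name.
CREATE EXTERNAL DATA SOURCE HortonworksCluster
WITH (
    TYPE = HADOOP,
    LOCATION = 'hdfs://10.0.0.10:8020',
    CREDENTIAL = HadoopKerberosCred
);
GO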

Error Importing DB from SQL Azure to a localDB

I have a SQL PaaS instance which we have exported into a .bacpac file. When I try to import the .bacpac into my local SQL Server 2017 CU14, I get an error on an external data source. I do not need this external data source, so I am trying to figure out a way to import this .bacpac anyway.
I have tried using sqlpackage.exe (the Import action doesn't let you exclude object types, and the Publish action requires a .dacpac, not a .bacpac).
The error is:
Error importing database:Could not import package.
Error SQL72014: .Net SqlClient Data Provider:
Msg 102, Level 15, State 1, Line 3
Incorrect syntax near 'RDBMS'.
Error SQL72045: Script execution error. The executed script:
CREATE EXTERNAL DATA SOURCE [LocalLoopBack]
WITH (
    TYPE = RDBMS,
    LOCATION = N'xxxxxx.database.windows.net',
    DATABASE_NAME = N'xxxxxxx',
    CREDENTIAL = [xxxxxx]
);
Is there any way I can get this to import?
Can you copy that Azure SQL Database with a new name using the Azure portal, as explained here?
Remove the external data source from the new database and then export it as a bacpac. Delete the database created by the copy operation. Import the new bacpac into your localDB instance.
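A minimal sketch of that approach in T-SQL; the database names below are placeholders, and the export itself is done from the portal or with SqlPackage /Action:Export:
-- Run in master on the Azure SQL server: copy the database under a new name.
CREATE DATABASE [MyDb_Copy] AS COPY OF [MyDb];

-- In MyDb_Copy: drop the external data source the local import fails on.
DROP EXTERNAL DATA SOURCE [LocalLoopBack];
-- DROP DATABASE SCOPED CREDENTIAL [xxxxxx];  -- optionally, if nothing else references it

-- Export MyDb_Copy to a .bacpac, import it locally, then remove the copy:
-- DROP DATABASE [MyDb_Copy];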

CREATE EXTERNAL DATA SOURCE from SS2019 CTP2.2 not working

So ... I have two SQL Server 2019 (CTP 2.2) instances, one of which has PolyBase installed in a single-node configuration (call this SS-A). I have created a MASTER KEY in the master database of SS-A and a DATABASE SCOPED CREDENTIAL in a database on SS-A. When I try to do the following:
CREATE EXTERNAL DATA SOURCE acmeAzureDB WITH
(TYPE = RDBMS,
LOCATION = 'ss2019azure.database.windows.net',
DATABASE_NAME = 'dbAcmeAzure',
CREDENTIAL = acmeAzureCred
);
I get an error
Msg 102, Level 15, State 1, Line 6
Incorrect syntax near 'RDBMS'
I have tried to work with MS SQL Server SMEs without any luck (been working on this for many weeks to no avail).
Any ideas here -- plus a message to Microsoft -- your docs on this are AWFUL!!
You have two SQL Server 2019 instances (CTP 2.2), but they are not Azure SQL Database instances.
RDBMS external data sources are currently only supported on Azure SQL Database.
-- Elastic Database query only: a remote database on Azure SQL Database as data source
-- (only on Azure SQL Database)
CREATE EXTERNAL DATA SOURCE data_source_name
WITH (
TYPE = RDBMS,
LOCATION = '<server_name>.database.windows.net',
DATABASE_NAME = '<Remote_Database_Name>',
CREDENTIAL = <SQL_Credential>
)
Another way: you can create a linked server from your SQL Server 2019 instance to Azure SQL Database, and then query the Azure SQL Database data through the linked server instead of an external data source.
See the official tutorial: How to Create a Linked Server.
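A minimal sketch of what that linked-server setup might look like, reusing the server, database and login names from the question (the MSOLEDBSQL provider choice is an assumption):
-- Sketch only: run on the local SQL Server 2019 instance (SS-A).
EXEC sp_addlinkedserver
    @server     = N'AzureDB_Link',                -- local name for the linked server
    @srvproduct = N'',
    @provider   = N'MSOLEDBSQL',                  -- assumes the Microsoft OLE DB Driver for SQL Server is installed
    @datasrc    = N'ss2019azure.database.windows.net',
    @catalog    = N'dbAcmeAzure';

EXEC sp_addlinkedsrvlogin
    @rmtsrvname  = N'AzureDB_Link',
    @useself     = N'false',
    @rmtuser     = N'remoteAdmin',
    @rmtpassword = N'XXXXXXXXX';

-- Query through the four-part name.
SELECT ID FROM [AzureDB_Link].[dbAcmeAzure].[dbo].[tblAcmeDataAzure];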
Reference: Incorrect syntax near 'RDBMS'. When I try to create external data source, anyone having the same issue?
Hope this helps.
SO - worked with MS today - and success -- you can do a CREATE EXTERNAL DATA SOURCE in SS2019 and point to AZURE SQL -- here is the TSQL I used:
(MASTER KEY ALREADY CREATED)
CREATE DATABASE SCOPED CREDENTIAL acmeCred WITH IDENTITY = 'remoteAdmin', SECRET ='XXXXXXXXX';
go
CREATE EXTERNAL DATA SOURCE AzureDB
WITH (
LOCATION = 'sqlserver://ss2019azure.database.windows.net',
CREDENTIAL = acmeCred
);
go
CREATE EXTERNAL TABLE [dbo].[tblAcmeDataAzure]
(
ID varchar(10)
)
WITH (
LOCATION='dbAcmeAzure.dbo.tblAcmeDataAzure',
DATA_SOURCE=AzureDB
);
go

Azure Import Error: The internal target platform type SqlAzureV12DatabaseSchemaProvider does not support schema file version '3.1'

For some reasons I cannot import new BACPACs from Azure. I still can import old ones.
This is the error message I get:
Internal Error. The internal target platform type SqlAzureV12DatabaseSchemaProvider does not support schema file version '3.1'.
I've tried this solution, but it didn't help, because all my settings are already set to the defaults.
I also downloaded latest SSMS Preview, but on import it gives me other errors:
Warning SQL0: A project which specifies Microsoft Azure SQL Database v12 as the target platform may experience compatibility issues with SQL Server 2014.
Error SQL72014: .Net SqlClient Data Provider: Msg 102, Level 15, State 1, Line 1 Incorrect syntax near 'Admin'.
Error SQL72045: Script execution error. The executed script:
CREATE DATABASE SCOPED CREDENTIAL [Admin]
WITH IDENTITY = N'Admin';
Error SQL72014: .Net SqlClient Data Provider: Msg 319, Level 15, State 1, Line 2 Incorrect syntax near the keyword 'with'. If this statement is a common table expression, an xmlnamespaces clause or a change tracking context clause, the previous statement must be terminated with a semicolon.
Error SQL72045: Script execution error. The executed script:
CREATE DATABASE SCOPED CREDENTIAL [Admin]
WITH IDENTITY = N'Admin';
I have SSMS 2014 CU6 installed.
Any help would be much appreciated! Thank you!
Finally figured out what happened. It's a specific case, but maybe it helps someone else.
We tried to use elastic query to write queries across databases. To do that, you need to create database scoped credentials. When the package was imported, it tried to do the same locally and failed executing this:
CREATE DATABASE SCOPED CREDENTIAL [Admin]
WITH IDENTITY = N'Admin';
Since we decided to use a different approach, I dropped the scoped credential and the external data source (the credential can't be dropped without dropping the data source first):
DROP EXTERNAL DATA SOURCE Source;
DROP DATABASE SCOPED CREDENTIAL Admin;
Now everything is working again. Just be aware that you cannot import a database from Azure if it has scoped credentials created.
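If you want to check for such blocking objects before exporting, the catalog views list them; a minimal sketch, run against the Azure database:
-- List the external data sources and scoped credentials the bacpac would script out.
SELECT name, location FROM sys.external_data_sources;
SELECT name, credential_identity FROM sys.database_scoped_credentials;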
Make sure that you are using the new SQL Server Management Studio
https://msdn.microsoft.com/en-us/library/mt238290.aspx
