I am trying to access data in one Azure SQL database from another. For that I am using an Elastic Job. Using the Elastic Job I am able to create a table from one DB in the other, but I am not able to access or transfer the data. I tried it using an external data source and an external table.
I used the code below:
External Data Source
CREATE EXTERNAL DATA SOURCE RemoteReferenceData
WITH
(
TYPE=RDBMS,
LOCATION='myserver',
DATABASE_NAME='dbname',
CREDENTIAL= JobRun
);
CREATE EXTERNAL TABLE [tablename] (
[Id] int null,
[Name] nvarchar(max) null
)
WITH (
DATA_SOURCE = RemoteReferenceData,
SCHEMA_NAME = N'dbo',
OBJECT_NAME = N'mytablename'
);
I am getting the error below:
> Error retrieving data from server.dbname. The underlying error
> message received was: 'The server principal "JobUser" is not able to
> access the database "dbname" under the current security context.
> Cannot open database "dbname" requested by the login. The login
> failed. Login failed for user 'JobUser'.
There are some errors in your statements:
The LOCATION value should be: LOCATION='[servername].database.windows.net'
Make sure, when you create the CREDENTIAL, that the username and password are the ones used to log in to the remote database. Authentication using Azure Active Directory with elastic queries is not currently supported.
The whole T-SQL code example should be like this:
CREATE DATABASE SCOPED CREDENTIAL ElasticDBQueryCred
WITH IDENTITY = 'Username',
SECRET = 'Password';
CREATE EXTERNAL DATA SOURCE MyElasticDBQueryDataSrc WITH
    (TYPE = RDBMS,
     LOCATION = '[servername].database.windows.net',
     DATABASE_NAME = 'Mydatabase',
     CREDENTIAL = ElasticDBQueryCred
    ) ;
CREATE EXTERNAL TABLE [dbo].[CustomerInformation]
( [CustomerID] [int] NOT NULL,
[CustomerName] [varchar](50) NOT NULL,
[Company] [varchar](50) NOT NULL)
WITH
    ( DATA_SOURCE = MyElasticDBQueryDataSrc) ;

SELECT * FROM CustomerInformation
I am using this code to query the table in Mydatabase from DB1.
For more details, see: Get started with cross-database queries (vertical partitioning) (preview)
Hope this helps.
I tried to create an external table in SQL Server pointing to HDFS, but I am getting the error below:
Msg 110813, Level 16, State 1, Line 16
105019;External file access failed due to internal error: 'The Remote Java Bridge has not been attached yet.'
I have configured Hadoop and SQL Server on Ubuntu 20.04 and installed PolyBase as well.
-> SQL Server version - 2019
-> hadoop-3.3.0
Below are the queries I have executed.
CREATE EXTERNAL DATA SOURCE [HadoopDFS1]
WITH (
TYPE = Hadoop,
LOCATION = N'hdfs://localhost:9000'
)
CREATE EXTERNAL FILE FORMAT CSVFF WITH (
FORMAT_TYPE = DELIMITEDTEXT,
FORMAT_OPTIONS (FIELD_TERMINATOR =',',
USE_TYPE_DEFAULT = TRUE));
CREATE EXTERNAL TABLE [dbo].[Salary] (
[Company Name] nvarchar(200),
[Job Title] nvarchar(100),
[Salaries Reported] int,
[Location] nvarchar(50)
)
WITH (LOCATION = '/Data/input/',
      DATA_SOURCE = HadoopDFS1,
      FILE_FORMAT = CSVFF
);
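Before chasing the external table definition itself, it can help to confirm that PolyBase is installed, enabled, and healthy; a sketch of those checks (object names per the SQL Server 2019 PolyBase docs; the Java bridge error usually points at the PolyBase services rather than the table DDL):

```sql
-- Is PolyBase installed on this instance? (1 = yes)
SELECT SERVERPROPERTY('IsPolyBaseInstalled') AS IsPolyBaseInstalled;

-- Enable PolyBase if it is not already enabled
EXEC sp_configure @configname = 'polybase enabled', @configvalue = 1;
RECONFIGURE;

-- PolyBase service health
SELECT * FROM sys.dm_exec_compute_node_status;
```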
I am accessing the other database using elastic queries. The data source was created like this:
CREATE EXTERNAL DATA SOURCE TheCompanyQueryDataSrc WITH (
TYPE = RDBMS,
--CONNECTION_OPTIONS = 'ApplicationIntent=ReadOnly',
CREDENTIAL = ElasticDBQueryCred,
LOCATION = 'thecompanysql.database.windows.net',
DATABASE_NAME = 'TheCompanyProd'
);
To reduce the database load, a read-only replica was created and should be used. As far as I understand it, I should add CONNECTION_OPTIONS = 'ApplicationIntent=ReadOnly' (commented out in the code above). However, I only get the error: Incorrect syntax near 'CONNECTION_OPTIONS'.
Both databases (the one that defines the connection and the external tables, and the other, to-be-read-only one) are on the same server (thecompanysql.database.windows.net). Both are set to compatibility level SQL Server 2019 (150).
What else should I set to make it work?
The CREATE EXTERNAL DATA SOURCE syntax doesn't support the option CONNECTION_OPTIONS = 'ApplicationIntent=ReadOnly', so we can't use it in these statements.
If you want to achieve a read-only request, use a user account that has only read-only (db_datareader) permission to log in to the external database.
For example:
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<password>' ;

CREATE DATABASE SCOPED CREDENTIAL SQL_Credential
WITH
    IDENTITY = '<username>' , -- read-only user account
    SECRET = '<password>' ;

CREATE EXTERNAL DATA SOURCE MyElasticDBQueryDataSrc
WITH
    ( TYPE = RDBMS ,
      LOCATION = '<server_name>.database.windows.net' ,
      DATABASE_NAME = 'Customers' ,
      CREDENTIAL = SQL_Credential
    ) ;
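A sketch of creating such a read-only account on the remote server (ReadOnlyUser and the password are placeholders):

```sql
-- In the master database of the remote server
CREATE LOGIN ReadOnlyUser WITH PASSWORD = '<strong_password>';

-- In the remote user database
CREATE USER ReadOnlyUser FOR LOGIN ReadOnlyUser;
ALTER ROLE db_datareader ADD MEMBER ReadOnlyUser;
```

Use this account's name and password as the IDENTITY and SECRET of the database scoped credential.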
Since the option is not supported, we can't use it with an elastic query; the only place to specify ApplicationIntent=ReadOnly is in the client connection itself, e.g. in the Additional Connection Parameters tab when connecting to the Azure SQL database with SSMS.
HTH.
I'm trying to initialize a managed PostgreSQL database using Pulumi. The PostgreSQL server itself is hosted and managed by Google Cloud SQL, but I set it up using Pulumi.
I can successfully create the database, but I'm stumped how to actually initialize it with my schemas, users, tables, etc. Does anyone know how to achieve this?
I believe I need to use the Postgres provider, similar to what they do for MySQL in this tutorial or this example. The below code shows what I have so far:
# Create the database resource on Google Cloud
instance = sql.DatabaseInstance(  # This works
    "db-instance",
    name="db-instance",
    database_version="POSTGRES_12",
    region="europe-west4",
    project=project,
    settings=sql.DatabaseInstanceSettingsArgs(
        tier="db-g1-small",  # Or: db-n1-standard-4
        activation_policy="ALWAYS",
        availability_type="REGIONAL",
        backup_configuration={
            "enabled": True,
        }
    ),
    deletion_protection=False,
)

database = sql.Database(  # This works as well
    "db",
    name="db",
    instance=instance.name,
    project=project,
    charset="UTF-8",
)

# The below should create a table such as
# CREATE TABLE users (id uuid, email varchar(255), api_key varchar(255));
# How do I tell it to use this SQL script?
# How do I connect it to the PostgreSQL resource created above?
postgres = pg.Database(  # This doesn't work
    "users",
    name="users",
    is_template=False,
)
Here is sample code with an explanation of how we set everything up, including creating and deleting tables with Pulumi. The code will look like this:
# Postgres https://www.pulumi.com/docs/reference/pkg/postgresql/
# provider: https://www.pulumi.com/docs/reference/pkg/postgresql/provider/
postgres_provider = postgres.Provider("postgres-provider",
    host=myinstance.public_ip_address,
    username=users.name,
    password=users.password,
    port=5432,
    superuser=True)

# creates a database on the instance in google cloud with the provider we created
mydatabase = postgres.Database("pulumi-votes-database",
    encoding="UTF8",
    opts=pulumi.ResourceOptions(provider=postgres_provider)
)
# Table creation/deletion is done via pg8000 https://github.com/tlocke/pg8000
import pg8000.native

def tablecreation(mytable_name):
    print("tablecreation with:", mytable_name)
    create_first_part = "CREATE TABLE IF NOT EXISTS"
    create_sql_query = "(id serial PRIMARY KEY, email VARCHAR ( 255 ) UNIQUE NOT NULL, api_key VARCHAR ( 255 ) NOT NULL)"
    create_combined = f'{create_first_part} {mytable_name} {create_sql_query}'
    print("tablecreation create_combined_sql:", create_combined)
    myconnection = pg8000.native.Connection(
        host=postgres_sql_instance_public_ip_address,
        port=5432,
        user=postgres_sql_user_username,
        password=postgres_sql_user_password,
        database=postgres_sql_database_name
    )
    print("tablecreation starting")
    cursor = myconnection.run(create_combined)
    print("Table Created:", mytable_name)
    selectversion = 'SELECT version();'
    cursor2 = myconnection.run(selectversion)
    print("SELECT Version:", cursor2)

def droptable(table_to_drop):
    first_part_of_drop = "DROP TABLE IF EXISTS"
    last_part_of_drop = "CASCADE"
    combinedstring = f'{first_part_of_drop} {table_to_drop} {last_part_of_drop}'
    conn = pg8000.native.Connection(
        host=postgres_sql_instance_public_ip_address,
        port=5432,
        user=postgres_sql_user_username,
        password=postgres_sql_user_password,
        database=postgres_sql_database_name
    )
    print("droptable delete_combined_sql ", combinedstring)
    cursor = conn.run(combinedstring)
    print("droptable completed ", cursor)
After bringing the infrastructure up for the first time via pulumi up -y, you can uncomment the following code block in __main__.py, add the configs for the PostgreSQL server via the CLI, and then run pulumi up -y again:
create_table1 = "votertable"
creating_table = tablecreation(create_table1)
print("")
create_table2 = "regionals"
creating_table = tablecreation(create_table2)
print("")
drop_table = "table2"
deleting_table = droptable(drop_table)
The settings for the table are in the Pulumi.dev.yaml file and are set via pulumi config set.
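The string assembly inside tablecreation can be factored into a pure helper and checked without a database connection; a small sketch (build_create_table_sql is a name introduced here, not part of the answer's code):

```python
def build_create_table_sql(table_name: str) -> str:
    """Assemble the CREATE TABLE statement the way tablecreation() does."""
    prefix = "CREATE TABLE IF NOT EXISTS"
    columns = ("(id serial PRIMARY KEY, "
               "email VARCHAR ( 255 ) UNIQUE NOT NULL, "
               "api_key VARCHAR ( 255 ) NOT NULL)")
    return f"{prefix} {table_name} {columns}"

# The resulting string is what gets passed to pg8000's Connection.run()
print(build_create_table_sql("votertable"))
```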
Getting the error below while creating an External File Format in Azure SQL DB:
Incorrect syntax near 'EXTERNAL'.
I am using the following commands (with the T-SQL syntax from the Microsoft Docs link: https://learn.microsoft.com/en-us/sql/t-sql/statements/create-external-file-format-transact-sql?view=sql-server-ver15&tabs=delimited) but still getting the syntax error:
--Example 1
CREATE EXTERNAL FILE FORMAT textdelimited1
WITH ( FORMAT_TYPE = DELIMITEDTEXT
     , FORMAT_OPTIONS ( FIELD_TERMINATOR = '|')
     )
GO

--Example 2
CREATE EXTERNAL FILE FORMAT skipHeader_CSV
WITH (FORMAT_TYPE = DELIMITEDTEXT,
      FORMAT_OPTIONS(
          FIELD_TERMINATOR = ',',
          STRING_DELIMITER = '"',
          FIRST_ROW = 2,
          USE_TYPE_DEFAULT = True)
)
As @wBob mentioned, External File Format is not supported on Azure SQL DB and Managed Instance; we can use an EXTERNAL DATA SOURCE instead. There are many reasons for this problem (Cannot bulk load because the ... could not be opened):
Check whether the SAS key has expired, and check the allowed permissions.
Did you delete the leading question mark when you created the SECRET?
CREATE DATABASE SCOPED CREDENTIAL UploadInvoices
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'sv=2019-12-12******2FspTCY%3D'
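On the first point, whether a SAS key has expired can be checked offline by parsing the token's se (signed expiry) field; a minimal Python sketch (sas_is_expired is a name invented here, not an Azure API):

```python
from datetime import datetime, timezone
from typing import Optional
from urllib.parse import parse_qs

def sas_is_expired(sas_token: str, now: Optional[datetime] = None) -> bool:
    """Return True if the token's 'se' (signed expiry) field is in the past."""
    fields = parse_qs(sas_token.lstrip('?'))
    expiry = fields.get('se')
    if not expiry:
        raise ValueError("SAS token has no 'se' (signed expiry) field")
    # 'se' is an ISO-8601 UTC timestamp such as 2021-06-30T22:00:00Z
    expires_at = datetime.fromisoformat(expiry[0].replace('Z', '+00:00'))
    return (now or datetime.now(timezone.utc)) >= expires_at

print(sas_is_expired('sv=2019-12-12&se=2020-01-01T00%3A00%3A00Z&sp=rl'))  # → True
```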
I've tried the following test, and it works well:
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '***';
go
CREATE DATABASE SCOPED CREDENTIAL UploadInvoices
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'sv=2019-12-12&ss=bfqt&srt=sco&sp******%2FspTCY%3D'; -- dl
CREATE EXTERNAL DATA SOURCE MyAzureInvoices
WITH (
TYPE = BLOB_STORAGE,
LOCATION = 'https://***.blob.core.windows.net/<container_name>',
CREDENTIAL = UploadInvoices
);
BULK INSERT production.customer
FROM 'bs140513_032310-demo.csv'
WITH
(
DATA_SOURCE = 'MyAzureInvoices',
FORMAT = 'CSV',
FIRSTROW = 2
)
GO
I am trying to load data from a CSV file to a table in my Azure Database following the steps in https://learn.microsoft.com/en-us/sql/t-sql/statements/bulk-insert-transact-sql?view=sql-server-ver15#f-importing-data-from-a-file-in-azure-blob-storage, using the Managed Identity option. When I run the query, I receive this error:
Failed to execute query. Error: Referenced external data source "adfst" not found.
This is the name of the container I created within my storage account. I have also tried using my storage account, with the same error. Reviewing https://learn.microsoft.com/en-us/sql/relational-databases/import-export/examples-of-bulk-access-to-data-in-azure-blob-storage?view=sql-server-ver15 does not provide any further insight as to what may be causing the issue. My storage account does not have public (anonymous) access configured.
I'm assuming that I'm missing a simple item that would resolve this issue, but I can't figure out what it is. My SQL query is below, modified to not include content that should not be required.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '**************';
GO
CREATE DATABASE SCOPED CREDENTIAL msi_cred WITH IDENTITY = '***********************';
CREATE EXTERNAL DATA SOURCE adfst
WITH ( TYPE = BLOB_STORAGE,
LOCATION = 'https://**********.blob.core.windows.net/adfst'
, CREDENTIAL= msi_cred
);
BULK INSERT [dbo].[Adventures]
FROM 'Startracker_scenarios.csv'
WITH (DATA_SOURCE = 'adfst');
If you want to use Managed Identity to access Azure Blob Storage when you run the BULK INSERT command, you need to enable Managed Identity for the SQL server; otherwise, you will get the error Referenced external data source "***" not found. Besides, you also need to assign the Storage Blob Data Contributor role to the MSI; if you do not, you cannot access the CSV file stored in Azure Blob Storage.
For example:
Enable Managed Identity for the SQL server
Connect-AzAccount
#Enable MSI for SQL Server
Set-AzSqlServer -ResourceGroupName your-database-server-resourceGroup -ServerName your-SQL-servername -AssignIdentity
Assign the role via the Azure portal
Under your storage account, navigate to Access Control (IAM) and select Add role assignment. Assign the Storage Blob Data Contributor RBAC role to the server you've registered with Azure Active Directory (AAD).
Test
a. Data
1,James,Smith,19750101
2,Meggie,Smith,19790122
3,Robert,Smith,20071101
4,Alex,Smith,20040202
b. script
CREATE TABLE CSVTest
(ID INT,
FirstName VARCHAR(40),
LastName VARCHAR(40),
BirthDate SMALLDATETIME)
GO
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'YourStrongPassword1';
GO
--> Change to using Managed Identity instead of SAS key
CREATE DATABASE SCOPED CREDENTIAL msi_cred WITH IDENTITY = 'Managed Identity';
GO
CREATE EXTERNAL DATA SOURCE MyAzureBlobStorage
WITH ( TYPE = BLOB_STORAGE,
LOCATION = 'https://jimtestdiag417.blob.core.windows.net/test'
, CREDENTIAL= msi_cred
);
GO
BULK INSERT CSVTest
FROM 'mydata.csv'
WITH (
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
DATA_SOURCE = 'MyAzureBlobStorage');
GO
select * from CSVTest;
GO
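As a quick sanity check outside SQL Server, the sample rows above can be parsed against the CSVTest schema (ID as int, two name columns, BirthDate in yyyymmdd form); a small sketch:

```python
from datetime import datetime

# The sample data from the test above
rows = """\
1,James,Smith,19750101
2,Meggie,Smith,19790122
3,Robert,Smith,20071101
4,Alex,Smith,20040202"""

def parse_row(line: str):
    """Split one CSV line into (ID, FirstName, LastName, BirthDate)."""
    id_, first, last, birth = line.split(',')
    return int(id_), first, last, datetime.strptime(birth, '%Y%m%d')

parsed = [parse_row(line) for line in rows.splitlines()]
print(parsed[0])
```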