I'm trying to create Alembic migrations for Azure Synapse DW. I'm constantly getting the following error:
[Microsoft][ODBC Driver 17 for SQL Server][SQL Server]111214;An attempt to complete a transaction has failed. No corresponding transaction found.
My connection string in the alembic.ini file is:
sqlalchemy.url = mssql+pyodbc:///?odbc_connect=Driver={ODBC Driver 17 for SQL Server};Server=tcp:{host},1433;Database={database};Uid=sqladminuser;Pwd={Password};Encrypt=yes;TrustServerCertificate=no;Connection+Timeout=30;
The version I'm trying to migrate:
def upgrade():
    op.create_table(
        'test',
        sa.Column('id', sa.Integer, primary_key=True),
        sa.Column('description', sa.String, nullable=False),
        sa.Column('source_type', sa.String, nullable=False),
        sa.Column('source_schema', sa.String, nullable=False),
        sa.Column('source_entity', sa.String, nullable=False),
    )

def downgrade():
    op.drop_table('test')
I resolved this by following the GitHub link provided by Gord in the comments. I updated my sqlalchemy.url to
mssql+pyodbc://{user}:{password}@{host}:1433/{db}?autocommit=True&driver=ODBC+Driver+17+for+SQL+Server
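For anyone who wants the same behaviour without encoding it in the URL, the engine can also be put into autocommit mode from code. A minimal sketch, assuming SQLAlchemy 1.4 and placeholder credentials rather than the real ones:

from sqlalchemy import create_engine

# Synapse rejects some DDL inside explicit transactions, which is what
# triggers the "no corresponding transaction found" error. AUTOCOMMIT
# mode has the same effect as autocommit=True in the URL query string.
engine = create_engine(
    "mssql+pyodbc://user:password@host:1433/db"
    "?driver=ODBC+Driver+17+for+SQL+Server",
    isolation_level="AUTOCOMMIT",
)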
I'm trying to load a database dump into my Sybase backup server.
I'm running Sybase ASE 16.0 on both my primary and backup machines.
The import is done in the isql CLI via
load database DB from ./dumps/data_dump
The error message is the following:
Backup Server session id is: 22. Use this value when executing the
'sp_volchanged' system stored procedure after fulfilling any volume change
request from the Backup Server.
Backup Server: 4.141.2.40: [11] The 'open' call failed for database/archive
device while working on stripe device
'/opt/dumps/data_dump' with error number
13 (Permission denied). Refer to your operating system documentation for further
details.
I also found this SAP Knowledge Base article, but it's hidden behind a paywall:
https://userapps.support.sap.com/sap/support/knowledge/en/3140989
Setting up a new server fixed the issue for me. I'm still working on a possible answer for this problem.
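In the meantime, since error 13 comes straight from the OS, it is worth checking that the dump file (and every directory above it) is readable by the account the Backup Server runs under. A quick check sketched in Python; the sybase account name and running it as root are assumptions:

import os
import pwd

DUMP = "/opt/dumps/data_dump"
USER = "sybase"  # assumption: the OS account the Backup Server runs as

# Run as root: temporarily assume the backup user's identity and ask
# the kernel whether the dump file is actually readable for that user.
pw = pwd.getpwnam(USER)
os.setegid(pw.pw_gid)
os.seteuid(pw.pw_uid)
print("readable:", os.access(DUMP, os.R_OK, effective_ids=True))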
I have just installed MS SQL Server on my system and I am having trouble using it from Python. It works fine from the Management Studio, and I can see all my tables. I also have a MySQL server installed, and it works fine from Python using SQLAlchemy. I would now like to use MS SQL as well. Here is what I tried:
from sqlalchemy import create_engine
import pyodbc

server = 'NEW-OFFICE\\NEWOFFICE'
database = 'testdb'
username = 'NEW-OFFICE\\user'
password = 'password'
DRIVER = "ODBC Driver 17 for SQL Server"
SERVERNAME = "NEWOFFICE"
INSTANCENAME = "\\MSSQLSERVER_ZVI"
First I tried:
SQLALCHEMY_DATABASE_URI = "mssql+pyodbc://{username}:{password}#{hostname}/{database}".format(
username=username,
password=password,
hostname=server,
database=database,
)
Second I tried:
engine = create_engine(f'mssql+pyodbc://{server}/{database}?trusted_connection=yes&driver={DRIVER}')
Third I tried:
engine = create_engine(
    f"mssql+pyodbc://{username}:{password}@{SERVERNAME}{INSTANCENAME}/{database}?driver={DRIVER}",
    fast_executemany=True,
)
Then the next code, for all three attempts, is:
db = SQLAlchemy(app)
db.app = app
db.init_app(app)
engine_container = db.get_engine(app)
engine_container.dispose() # to close db connection
db.engine.connect()
hallDB.query.filter_by(client_id = 1).first()
At the last line I get an exception:
(sqlite3.OperationalError) no such table: hall
Of course table "hall" exists and I can query it in the Management Studio.
I am stuck, so any help will be much appreciated.
An Update
The following code did work, but I want the SQLAlchemy model approach, which I am not able to use, as shown above.
u = r'DRIVER=ODBC Driver 17 for SQL Server;SERVER=NEW-OFFICE\user;DATABASE=testdb;Trusted_Connection=yes;'
cursor = pyodbc.connect(u).cursor()
cursor.execute("SELECT * FROM hall;")
row = cursor.fetchone()
while row:
    print(row[0])
    row = cursor.fetchone()
Maybe this will show what needs to be done in order to connect using SQLAlchemy.
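It does: a pyodbc connection string that already works can be handed to SQLAlchemy verbatim through the odbc_connect query parameter. A minimal sketch reusing the string above, assuming SQLAlchemy 1.4:

import urllib.parse
from sqlalchemy import create_engine

# The exact pyodbc connection string that already works, URL-encoded
# and passed through to pyodbc untouched via odbc_connect.
u = r'DRIVER=ODBC Driver 17 for SQL Server;SERVER=NEW-OFFICE\user;DATABASE=testdb;Trusted_Connection=yes;'
engine = create_engine("mssql+pyodbc:///?odbc_connect=" + urllib.parse.quote_plus(u))

with engine.connect() as conn:
    print(conn.exec_driver_sql("SELECT COUNT(*) FROM hall").scalar())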
An Update 2
The error related to sqlite3 was resolved after I saw the following warning:
UserWarning: Neither SQLALCHEMY_DATABASE_URI nor SQLALCHEMY_BINDS is set. Defaulting SQLALCHEMY_DATABASE_URI to "sqlite:
I realized that although I set SQLALCHEMY_DATABASE_URI in the test code, I skipped the line app.config["SQLALCHEMY_DATABASE_URI"] = SQLALCHEMY_DATABASE_URI.
After fixing it, and using:
SQLALCHEMY_DATABASE_URI = r'mssql+pyodbc://NEW-OFFICE\user/testdb;driver=ODBC Driver 17 for SQL Server;Trusted_Connection=yes'
or
SQLALCHEMY_DATABASE_URI = r"mssql+pyodbc://NEW-OFFICE\user/testdb?driver=ODBC Driver 17 for SQL Server?trusted_connection=yes"
the error now is:
Message=(pyodbc.InterfaceError) ('IM002', '[IM002] [Microsoft][ODBC
Driver Manager] Data source name not found and no default driver
specified (0) (SQLDriverConnect)')
(Background on this error at: https://sqlalche.me/e/14/rvf5)
And also a warning:
No driver name specified; this is expected by PyODBC when using DSN-less connections
Finally, after many frustrating hours, I found the problem: the connection string requires @ at the beginning. The proper one is:
mssql+pyodbc://@NEW-OFFICE\user/testdb?driver=ODBC Driver 17 for SQL Server
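A less error-prone way to build this kind of DSN-less URL is sqlalchemy.engine.URL.create, which takes care of the @ separator and of escaping the backslash and the spaces in the driver name. A minimal sketch under the same assumptions (SQLAlchemy 1.4, the server and database names from this question):

from sqlalchemy import create_engine
from sqlalchemy.engine import URL

# Build the URL programmatically so special characters are escaped for us.
url = URL.create(
    "mssql+pyodbc",
    host=r"NEW-OFFICE\user",  # no username/password: Windows authentication
    database="testdb",
    query={
        "driver": "ODBC Driver 17 for SQL Server",
        "trusted_connection": "yes",
    },
)
engine = create_engine(url)

with engine.connect() as conn:
    print(conn.exec_driver_sql("SELECT 1").scalar())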
So ... I have two SQL Server 2019 instances (CTP 2.2), one of which is installed with PolyBase in a single-node config (call it SS-A). I have created a MASTER KEY in the master database of SS-A and a DATABASE SCOPED CREDENTIAL in a database on SS-A. When I try to do the following:
CREATE EXTERNAL DATA SOURCE acmeAzureDB WITH
(TYPE = RDBMS,
LOCATION = 'ss2019azure.database.windows.net',
DATABASE_NAME = 'dbAcmeAzure',
CREDENTIAL = acmeAzureCred
);
I get an error:
Msg 102, Level 15, State 1, Line 6
Incorrect syntax near 'RDBMS'
I have tried to work with MS SQL Server SMEs without any luck (been working on this for many weeks to no avail).
Any ideas here -- plus a message to Microsoft -- your docs on this are AWFUL!!
You have two SQL Server 2019 instances (CTP 2.2), but they are not Azure SQL Database instances.
RDBMS External Data Sources are currently only supported on Azure SQL Database.
-- Elastic Database query only: a remote database on Azure SQL Database as data source
-- (only on Azure SQL Database)
CREATE EXTERNAL DATA SOURCE data_source_name
WITH (
TYPE = RDBMS,
LOCATION = '<server_name>.database.windows.net',
DATABASE_NAME = '<Remote_Database_Name>',
CREDENTIAL = <SQL_Credential>
)
Alternatively, you can create a linked server from your SQL Server 2019 instance to Azure SQL Database and then query the Azure SQL DB data through the linked server instead of an external data source.
See this official tutorial: How to Create a Linked Server.
Reference: Incorrect syntax near 'RDBMS' when I try to create an external data source. Anyone having the same issue?
Hope this helps.
SO - worked with MS today - and success -- you can do a CREATE EXTERNAL DATA SOURCE in SS2019 and point to Azure SQL -- here is the T-SQL I used:
(MASTER KEY ALREADY CREATED)
CREATE DATABASE SCOPED CREDENTIAL acmeCred WITH IDENTITY = 'remoteAdmin', SECRET ='XXXXXXXXX';
go
CREATE EXTERNAL DATA SOURCE AzureDB
WITH (
LOCATION = 'sqlserver://ss2019azure.database.windows.net',
CREDENTIAL = acmeCred
);
go
CREATE EXTERNAL TABLE [dbo].[tblAcmeDataAzure]
(
ID varchar(10)
)
WITH (
LOCATION='dbAcmeAzure.dbo.tblAcmeDataAzure',
DATA_SOURCE=AzureDB
);
go
For some reason I cannot import new BACPACs from Azure. I can still import old ones.
This is the error message I get:
Internal Error. The internal target platform type SqlAzureV12DatabaseSchemaProvider does not support schema file version '3.1'.
I've tried this solution, but it didn't help, because all my settings are already set to the defaults.
I also downloaded the latest SSMS Preview, but on import it gives me other errors:
Warning SQL0: A project which specifies Microsoft Azure SQL Database v12 as the target platform may experience compatibility issues with SQL Server 2014.
Error SQL72014: .Net SqlClient Data Provider: Msg 102, Level 15, State 1, Line 1 Incorrect syntax near 'Admin'.
Error SQL72045: Script execution error. The executed script:
CREATE DATABASE SCOPED CREDENTIAL [Admin]
WITH IDENTITY = N'Admin';
Error SQL72014: .Net SqlClient Data Provider: Msg 319, Level 15, State 1, Line 2 Incorrect syntax near the keyword 'with'. If this statement is a common table expression, an xmlnamespaces clause or a change tracking context clause, the previous statement must be terminated with a semicolon.
Error SQL72045: Script execution error. The executed script:
CREATE DATABASE SCOPED CREDENTIAL [Admin]
WITH IDENTITY = N'Admin';
I have SSMS 2014 CU6 installed.
Any help would be much appreciated! Thank you!
Finally figured out what happened. It's a specific case, but maybe it helps someone else.
We tried to use elastic query to write queries across databases. To do that you need to create database scoped credentials. When the package was imported, it tried to do the same locally and failed executing this:
CREATE DATABASE SCOPED CREDENTIAL [Admin]
WITH IDENTITY = N'Admin';
Since we decided to use a different approach, I dropped the scoped credentials and the external data source (the credentials couldn't be dropped without dropping the data source first):
DROP EXTERNAL DATA SOURCE Source
DROP DATABASE SCOPED CREDENTIAL Admin
Now everything is working again. Just be aware that you cannot import a database from Azure if it has scoped credentials created.
Make sure that you are using the new SQL Server Management Studio
https://msdn.microsoft.com/en-us/library/mt238290.aspx
I am working on a database migration project, connecting to SQL Server using an Oracle gateway.
The image datatype in SQL Server is migrated to a BLOB column in Oracle, but when I try to insert the data using an INSERT command, it gives an error.
SQL Server table:
create table xyz ([Image_Data] [image] NULL )
Oracle table:
create table xyz (Image_data BLOB null)
Insert command used:
insert into xyz
select * from xyz#sqldb;
Error message:
SQL Error: ORA-00997: illegal use of LONG datatype
00997. 00000 - "illegal use of LONG datatype"
ODBC maps the SQL Server image column to Oracle's LONG data type, and LONG is not easy to handle.
The best way is to use the DBMS_SQL package.
Here you can find useful notes:
http://ellebaek.wordpress.com/2010/12/06/converting-a-long-column-to-a-clob-on-the-fly/
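If the gateway's LONG mapping cannot be worked around on the Oracle side, another option (distinct from the DBMS_SQL route above) is to copy the rows with a small client-side script, since bytes bound from a client are inserted as a BLOB directly. A sketch in Python, assuming pyodbc and the python-oracledb driver, with placeholder connection details:

import oracledb
import pyodbc

# Placeholder connection details, not taken from the question.
src = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=sqlhost;DATABASE=srcdb;Trusted_Connection=yes;"
)
dst = oracledb.connect(user="scott", password="tiger", dsn="orahost/orcl")

src_cur = src.cursor()
dst_cur = dst.cursor()

# Read the image bytes from SQL Server and bind them on the Oracle side;
# binding bytes from the client avoids the gateway's LONG conversion.
src_cur.execute("SELECT Image_Data FROM xyz")
for (blob,) in src_cur:
    dst_cur.execute("INSERT INTO xyz (Image_data) VALUES (:1)", [blob])

dst.commit()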
We successfully migrated the data with the DTS tool.