INSERT failure with SQLAlchemy ORM / SQL Server 2017 - Invalid Object Name - sql-server

Firstly, I have seen many answers specific to the Invalid Object Name error with SQL Server, but none of them seem to solve my problem. I don't have much experience with the SQL Server dialect, but here is the current setup required on the project:
SQL Server 2017
SQLAlchemy (pyodbc+mssql)
Python 3.9
I'm trying to insert a database row using the SQLAlchemy ORM, but it fails to resolve the schema and table, giving me an error of this type:
(pyodbc.ProgrammingError) ('42S02', "[42S02] [Microsoft][ODBC Driver
17 for SQL Server][SQL Server]Invalid object name 'agent.dbo.tasks'.
(208) (SQLExecDirectW); [42S02] [Microsoft][ODBC Driver 17 for SQL
Server][SQL Server]Statement(s) could not be prepared. (8180)")
I'm creating a session with the following code.
engine = create_engine(connection_string, pool_size=10, max_overflow=5,
pool_use_lifo=False, pool_pre_ping=True, echo=True, echo_pool=True)
db_session_manager = sessionmaker()
db_session_manager.configure(bind=engine)
session = db_session_manager()
I have a task object defined like
class Task(BaseModel):
    __tablename__ = "tasks"
    __table_args__ = {"schema": "agent.dbo"}
    # .. field defs
I'm filling the object fields and then doing the insert as usual:
task = Task()
task.tid = 1
...
session.add(task)
session.commit()
But this fails, with the error mentioned before. I tried to execute a direct query like
session.execute("SELECT * FROM agent.dbo.tasks")
And it returned a result set.
The connection string is a URL object, which prints like
mssql+pyodbc://task-user:******@CRD4E0050L/agent?driver=ODBC+Driver+17+for+SQL+Server
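As an aside, an equivalent URL can be assembled from a raw ODBC connection string, which sidesteps quoting problems when credentials contain special characters. A minimal sketch using only the standard library (server, database, and credentials here are placeholders, not the real values):

```python
from urllib.parse import quote_plus

# Raw ODBC connection string; all values are illustrative placeholders.
odbc_str = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=CRD4E0050L;"
    "DATABASE=agent;"
    "UID=task-user;"
    "PWD=secret#pass"  # special characters are safe once percent-encoded
)

# Percent-encode the string and embed it in a SQLAlchemy-style URL.
url = "mssql+pyodbc:///?odbc_connect=" + quote_plus(odbc_str)
print(url)
```

The resulting URL can be passed straight to create_engine(), since the mssql+pyodbc dialect accepts a pre-built connection string via the odbc_connect query parameter.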
I tried inserting a row manually from SQL Server Management Studio to check; there, the generated SQL used [] as identifier delimiters, like
INSERT INTO [dbo].[tasks]
([tid]..
but the SQL that SQLAlchemy echoed did not. Instead it used the unbracketed form I'm used to seeing with MySQL:
INSERT INTO agent.dbo.tasks (onbase_handle_id,..
What am I doing wrong? I thought that SQLAlchemy, when configured with a supported dialect, should just work (I use it with MySQL without issues). Am I missing any configuration? Any help appreciated.

Related

ORA-28500: [ODBC Driver 11 for SQL Server]: Attempt to access a column "UtilizaMetrica_DescontoComerci" {42S22, NativeErr = 207} - column name longer than 30 characters

I configured Heterogeneous Services from Oracle to access SQL Server using Microsoft's ODBC driver.
It works, but queries against a specific table return the following message:
ORA-28500: connection from ORACLE to a non-Oracle system returned this message:
[Microsoft][ODBC Driver 11 for SQL Server][SQL Server]Attempt to access a column 'UtilizaMetrica_DescontoComerci'. {42S22,NativeErr = 207}[Microsoft][ODBC Driver 11 for SQL Server][SQL Server]
The actual column name, 'UtilizaMetrica_DescontoComercial', has 32 characters, but it is truncated to 30 characters in the returned message.
It seems that Oracle Heterogeneous Services (OHS) has a limit on the length of a column name (30 characters).
The workaround is to shorten the name to an acceptable length, either by defining a shorter alias for that column or by using a view that does the same thing.
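A minimal sketch of the view-based workaround on the SQL Server side (the table name and the shortened column name here are illustrative):

```sql
-- Expose the long column under a name within OHS's 30-character limit.
CREATE VIEW dbo.vMetricas AS
SELECT UtilizaMetrica_DescontoComercial AS UtilizaMetrica_DescComl
FROM dbo.Metricas;
```

Oracle then queries the view instead of the base table, and the shortened name passes through OHS without truncation.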

Incorrect syntax near Go with Pypyodbc

I am using the pypyodbc library to establish a connection to a SQL Server 2008 R2 database and every time I try to execute a .sql file I encounter the following error:
pypyodbc.ProgrammingError: ('42000', "[42000] [Microsoft][ODBC SQL Server Driver][SQL Server]Incorrect syntax near 'Go'.")
This is the sql query I am trying to execute:
Use SL_Site1_App
Go
select emp_num,name, trans_num, job, trans_type
from Hours where trans_type like '1000%' order by trans_date desc
This is the python script that I am using:
import pypyodbc, ExcelFile

def main():
    # read the SQL queries externally
    queries = ['C:\\Temp\\Ready_to_use_queries\\Connection_sql_python.sql']
    for index, query in enumerate(queries):
        cursor = initiate_connection_db()
        results = retrieve_results_query(cursor, query)
        if index == 0:
            ExcelFile.write_to_workbook(results)
            print("The workbook has been created and data has been inserted.\n")

def initiate_connection_db():
    connection_live_db = pypyodbc.connect(driver="{SQL Server}", server="xxx.xxx.xxx.xxx",
                                          uid="my-name", pwd="try-and-guess",
                                          Trusted_Connection="No")
    connection = connection_live_db.cursor()
    return connection
The workaround for this problem is to delete the Use SL_Site1_App and Go lines, but I want to know whether this is a known problem with how the pypyodbc library processes these lines and, if so, where I should go to notify the developers about the issue.
GO is a batch separator used by sqlcmd and SSMS; it is not a T-SQL statement, so the driver passes it to the server, which rejects it.
Since you're connecting to SQL Server from an application, declare the database in the connection string by adding database="SL_Site1_App", and then remove the USE and GO lines from your SQL statement.
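If a multi-batch script must be run through a driver anyway, it can be split on GO lines client-side, which is roughly what sqlcmd and SSMS do before sending anything to the server. A simplified sketch (it only treats a line consisting solely of GO as a separator, ignoring GO inside strings or comments):

```python
def split_batches(script: str):
    """Split a T-SQL script into batches on lines containing only GO."""
    batches, current = [], []
    for line in script.splitlines():
        if line.strip().upper() == "GO":  # client-side batch separator, not T-SQL
            if current:
                batches.append("\n".join(current).strip())
                current = []
        else:
            current.append(line)
    if current and "\n".join(current).strip():
        batches.append("\n".join(current).strip())
    return batches

script = """Use SL_Site1_App
Go
select emp_num, name from Hours"""
batches = split_batches(script)
# each batch can then be passed to cursor.execute() separately
```

Each returned batch is a standalone statement the ODBC driver can prepare, which is why stripping GO (or splitting on it) resolves the 42000 error.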

Using stored procedure as a source of external dictionary

Due to security policy, I need to use a stored procedure (MS SQL Server) as the source of a ClickHouse external dictionary via ODBC.
According to the ClickHouse documentation, only a table (or view) can be used, although ODBC itself allows calling stored procedures.
<odbc>
    <db>DatabaseName</db>
    <table>TableName</table>
    <connection_string>DSN=some_parameters</connection_string>
    <invalidate_query>SQL_QUERY</invalidate_query>
</odbc>
When I tried "{CALL my_procedure_name}" in <table>, I got this error:
Poco::Exception. Code: 1000, e.code() = 0, e.displayText() = ODBC
handle exception: Failed to get number of columns:
Connection:NetConn: 000000001760E400
Server:OMEGA_DSN
===========================
ODBC Diagnostic record #1:
===========================
SQLSTATE = 42S02
Native Error Code = 208
[Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Invalid object name "{CALL my_procedure_name}".
(The driver's message text came back in a mangled Russian encoding; native error 208 is "Invalid object name".)
Can anybody suggest a solution or workaround?
This is most likely due to an incompatibility in the ODBC driver. You may try testing it via isql: https://www.mankier.com/1/isql
As for a workaround, I would start from wrapping the stored procedure in a view.
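If the procedure's result set is exposed through a view on the database side, the dictionary source can point at that view with the same configuration shape (the view name here is illustrative):

```xml
<odbc>
    <db>DatabaseName</db>
    <!-- view wrapping the procedure's result set -->
    <table>my_procedure_view</table>
    <connection_string>DSN=some_parameters</connection_string>
    <invalidate_query>SQL_QUERY</invalidate_query>
</odbc>
```

ClickHouse then sees an ordinary table-like object, which is the only source shape its ODBC dictionary layer supports.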

RODBCext: SQL 42000 402 Error when using sqlExecute()

I'm working on a Shiny app for updating entries maintained in a remote SQL Server 2008 database. I've been connecting to the DB using RODBC, and I'm attempting to use parameterized queries through RODBCext to support mass updates of information.
I'm able to get the parameterized queries to work from my Windows 7 machine running RStudio with R 3.2.3, but for some reason, when I try to run the same code from a Linux machine running the same version of R and connecting with the same version of the driver, I get the following error:
Error in sqlExecute(Connection, data = dat) :
42000 402 [Microsoft][ODBC Driver 13 for SQL Server][SQL Server]The data types char and text are incompatible in the equal to operator.
42000 8180 [Microsoft][ODBC Driver 13 for SQL Server][SQL Server]Statement(s) could not be prepared.
[RODBCext] Error: SQLExecute failed
In addition: Warning messages:
1: In sqlExecute(Connection, data = dat) :
42000 402 [Microsoft][ODBC Driver 13 for SQL Server][SQL Server]The data types char and text are incompatible in the equal to operator.
2: In sqlExecute(Connection, data = dat) :
42000 8180 [Microsoft][ODBC Driver 13 for SQL Server][SQL Server]Statement(s) could not be prepared.
Here is the simple example code that works properly on my windows machine, but not on the linux machine (I removed the connection string information):
library(RODBCext)
Connection <- odbcDriverConnect(paste('Driver=ODBC Driver 13 for SQL Server',
                                      'Server=<Server IP>', 'Port=<Port>',
                                      'Database=<Database>', 'UID=<UserID>',
                                      'PWD=<Password>', sep = ';'))
dat <- data.frame(Node_ID = "999", NodeGUID = "AF213171-201B-489B-B648-F7D289B735B1")
query <- "UPDATE dbo.Nodes SET Node_ID = ? WHERE NodeGUID = ?"
sqlPrepare(Connection, query)
sqlExecute(Connection, data = dat)
In this example, the dataframe is created with the columns as factors. I've tried explicitly casting the columns as characters first, as this seemed to work for the users having trouble with dates, but that still results in the same SQL error. I've also tried casting the Node_ID as numeric to match the SQL table, and I get the same error. The columns in the Nodes table in SQL are defined as:
NodeGUID (PK, char(36), not null)
Node_ID (int, null)
I've tried combining the sqlPrepare and sqlExecute calls by supplying the query argument to sqlExecute; from what I understand that's a trivial difference, and it results in the same error.
I suspect there must be a difference in the drivers and how they implement whatever SQL calls sqlExecute() makes. I also suspect sqlExecute() must handle the data types, as my results don't change regardless of the column types.
Thank you for any help you can provide!
Thanks to everyone who took a look at my question.
One of the SQL Server folks at my job was able to solve the issue. They suggested explicitly casting the arguments in the SQL query passed to sqlExecute(). Here's the code that works. Note that I know the GUID will always be 36 characters, and I'm confident the rest of the arguments I use this query for will be under 1000 characters when converted to strings:
my_query <- "UPDATE dbo.Nodes SET Node_ID = CAST(? As varchar(1000)) WHERE NodeGUID = CAST(? As varchar(36))"
sqlExecute(Connection, data = dat, query = my_query)
I'm guessing the driver for Windows is somehow handling the casting from text to varchar, but the linux driver does not.
I hope this helps others working with RODBCext. Thanks to Mateusz Zoltak and the team of contributors for a great package!

How do I convert an Oracle TIMESTAMP data type to the SQL Server DATETIME2 data type while connected via a linked server?

I have tried some examples, but so far nothing works.
I have a linked server (SQL Server 2014) to an Oracle 12c database.
The table contains a TIMESTAMP column with data like this:
22-MAR-15 04.18.24.144789000 PM
When attempting to query this table in SQL Server 2014 via the linked server, I get the following error with this code:
SELECT CAST(OracleTimeStampColumn AS DATETIME2(7)) FROM linkServerTable
Error:
Msg 7354, Level 16, State 1, Line 8
The OLE DB provider "OraOLEDB.Oracle" for linked server "MyLinkServer" supplied invalid metadata for column "MyDateColumn". The data type is not supported.
While the error is self explanatory, I am not certain how to resolve this.
I need to convert the timestamp to datetime2. Is this possible?
You can work around this problem by using OPENQUERY. For me, connecting to Oracle 12 from SQL 2008 over a linked server, this query fails:
SELECT TOP 10 TimestampField
FROM ORACLE..Schema.TableName
...with this error:
The OLE DB provider "OraOLEDB.Oracle" for linked server "ORACLE" supplied invalid metadata for column "TimestampField". The data type is not supported.
This occurs even if I do not include the offending column (which is of type TIMESTAMP(6)). Explicitly casting it to DATETIME does not help either.
However, this works:
SELECT * FROM OPENQUERY(ORACLE, 'SELECT "TimestampField" FROM SchemaName.TableName WHERE ROWNUM <= 10')
...and the data returned flows nicely into a DATETIME2() field.
One way to solve the problem is to create a view on the Oracle server that converts OracleTimeStampColumn into a form compatible with SQL Server's DATETIME2 data type. Use a 24-hour time format in the Oracle view and return the field as a character string. Then you can convert that string column to DATETIME2 when selecting it in SQL Server.
In Oracle Server
Create or Replace View VW_YourTableName As
select to_char(OracleTimeStampColumn, 'DD/MM/YYYY HH24:MI:SS.FF') OracleTimeStampColumn
from YourTableName
In SQL Server
SELECT CAST(OracleTimeStampColumn AS DATETIME2(7)) FROM linkServerView