Problems iterating through a list of SQL tables with Python - sql-server

I am trying to iterate through tables in an MS SQL database using a Python 3.5 script with pymssql. I am using the following, after connecting:
table = ("Accounts")
cursor.execute("SELECT TOP 1 * FROM %s",table)
If %s is replaced by the table name itself, say Accounts, it works:
cursor.execute("SELECT TOP 1 * FROM Accounts")
When I use the table variable it fails with the following error:
_mssql.MSSQLDatabaseException: (102, b"Incorrect syntax near
'Accounts'.DB-Lib error message 20018
The pymssql documentation shows
cursor.execute("select 'hello' where 1 = %d", 1)
as correct usage.
Please help if you can; I am somewhat confused by what should be a simple problem.
Best Regards Richard C

It looks like pymssql passes the value as a quoted string parameter, so the table name cannot be substituted directly; building the dynamic SQL on the server instead should work:
import pymssql
conn = pymssql.connect(".", "sa", "password", "AdventureWorks2014")
cursor = conn.cursor()
table = ("AWBuildVersion")
cursor.execute("declare @stmt nvarchar(400); set @stmt = 'select top 1 * from ' + %s; exec sp_executesql @stmt", table)
row = cursor.fetchone()
while row:
    print("ID=%d, Name=%s" % (row[0], row[1]))
    row = cursor.fetchone()
conn.close()
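Note that the concatenation inside sp_executesql still splices the value into a SQL string, so this is only safe when the table name is validated first. A minimal sketch of one way to do that (the whitelist contents and the helper name are hypothetical, not from the original question):

```python
# Hypothetical whitelist: only table names we already know about may be
# formatted into the dynamic SQL string.
KNOWN_TABLES = {"Accounts", "AWBuildVersion"}  # assumed table names

def build_top1_query(table):
    """Return a SELECT TOP 1 query for a whitelisted table name."""
    if table not in KNOWN_TABLES:
        raise ValueError("unknown table: %r" % table)
    return "SELECT TOP 1 * FROM %s" % table

print(build_top1_query("Accounts"))  # safe: the name came from the whitelist
```

Iterating the whitelist then gives the per-table loop the question was after, without ever formatting untrusted input into the query.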

Related

Why does Snowflake variable binding in query throw error with table name but not integer?

I am following the Snowflake Python Connector docs for variable binding to avoid SQL injection. I successfully set up a db connection with the following dict of credentials:
import snowflake.connector
CONN = snowflake.connector.connect(
    user=snowflake_creds['user'],
    password=snowflake_creds['password'],
    account=snowflake_creds['account'],
    warehouse=snowflake_creds["warehouse"],
    database=snowflake_creds['database'],
    schema=snowflake_creds['schema'],
)
cur = CONN.cursor(snowflake.connector.DictCursor)
The following block works fine and I get back query results, hard-coding the table name and using the standard format binding:
command = ("SELECT * FROM TEST_INPUT_TABLE WHERE batch_id = %s")
bind_params = (2)
results = cur.execute(command % bind_params).fetchall()
Similarly, this block works fine, using the pyformat binding:
command = ("SELECT * FROM TEST_INPUT_TABLE WHERE batch_id = %(id)s")
bind_params = {"id": 2}
results = cur.execute(command, bind_params).fetchall()
But the following two blocks both result in a ProgrammingError (pasted below the second block):
command = ("SELECT * FROM %s WHERE batch_id = %s")
bind_params = ("TEST_INPUT_TABLE", 2)
results = cur.execute(command, bind_params).fetchall()
command = ("SELECT * FROM %(tablename)s WHERE batch_id = %(id)s")
bind_params = {
    "tablename": "TEST_INPUT_TABLE",
    "id": 2,
}
results = cur.execute(command, bind_params).fetchall()
ProgrammingError: 001011 (42601): SQL compilation error:
invalid URL prefix found in: 'TEST_INPUT_TABLE'
Is there some difference between how strings and ints get interpolated? I would not think it would make a difference, but that is all I can think of. Am I missing something simple here? I don't want to have to choose between hard-coding the table name and putting the system at risk of SQL injection. Thanks for any guidance.
You should be wrapping your bind variables in an IDENTIFIER() function when they reference an object rather than a string literal. For example:
command = ("SELECT * FROM IDENTIFIER(%(tablename)s) WHERE batch_id = %(id)s")
https://docs.snowflake.com/en/sql-reference/identifier-literal.html
Give that a try.
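Putting both pieces together, the parameterized call would look like this (a sketch against the question's TEST_INPUT_TABLE; `cur` is the DictCursor created above, so the execute line needs a live connection):

```python
# Bind the table name through IDENTIFIER() and the batch_id as a value;
# the connector substitutes both pyformat placeholders safely.
command = "SELECT * FROM IDENTIFIER(%(tablename)s) WHERE batch_id = %(id)s"
bind_params = {"tablename": "TEST_INPUT_TABLE", "id": 2}
# results = cur.execute(command, bind_params).fetchall()  # requires a live connection
```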

I have a pymssql error in Python: Query processor could not produce a query plan

I have an issue with my index select statement, which raises this error; any solution?
File "src\pymssql.pyx", line 468, in pymssql.Cursor.execute
pymssql.OperationalError: (8622, b'Query processor could not produce a query plan because of the
hints defined in this query. Resubmit the query without specifying any hints and without using SET
FORCEPLAN.DB-Lib error message 20018, severity 16:\nGeneral SQL Server error: Check messages from
the SQL Server\n')
Code :
with pymssql.connect("*********", "*********", "**********", "*****") as myDbConn:
    with myDbConn.cursor() as cursor:
        cursor.execute("""if exists (select * from sys.indexes where name = 'Micros' and object_id('payrolldata') = object_id)
                          begin
                              drop index Micros on payrolldata
                          end""")
        sql = """create index Micros on payrolldata(stono, payrollid, busdate) where busdate >= 's' and busdate <= 's'""".format(dteStartDate, m0weekend)
        cursor.execute(sql)

DbConnect = 'Micros'
myDbConn = pymssql.connect("*******", "*******", "********", *******)
cursor = myDbConn.cursor()
cursor.execute("""select * from payrolldata with (INDEX(Micros));""")
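As an aside, independent of the query-plan error: the create-index string calls .format() but contains literal 's' characters instead of {} placeholders, so the dates are never interpolated. A sketch of what was presumably intended (dteStartDate and m0weekend are the question's variables; the date values here are made up):

```python
# The original string had literal 's' where the dates belong; str.format()
# only substitutes into {} placeholders.
dteStartDate, m0weekend = "2020-01-06", "2020-01-12"  # assumed sample values
sql = ("create index Micros on payrolldata(stono, payrollid, busdate) "
       "where busdate >= '{}' and busdate <= '{}'").format(dteStartDate, m0weekend)
print(sql)
```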

psycopg2 write list of strings (with text delimiter) to a postgres array

Objective:
I have a list containing strings, some of which have single quotes in them (as part of the string itself):
listOfStr = ['A sample string', "A second string with a ' single quote", 'a third string', ...]
Note that each entry does not necessarily use the same text delimiter: some are single-quoted, others (the ones containing a single quote as part of the string) are double-quoted.
I want to insert my list as a postgresql ARRAY using psycopg2:
import psycopg2
connString = (...) # my DB parameters here.
conn = psycopg2.connect(connString)
curs = conn.cursor()
update_qry = ("""UPDATE "mytable" SET arraycolumn = {listofStr}::varchar[],
    timestamp = now() WHERE id = {ID}""".format(listofStr=listOfStr, ID=ID))
curs.execute(update_qry)
The problem:
But I get this error:
SyntaxError: syntax error at or near "["
LINE 1: UPDATE "mytable" SET arraycolumn = ['A sample string'...
If I specify the ARRAY data type in the SQL query by adding the word 'ARRAY' in front of my list:
update_qry = ("""UPDATE "mytable" SET arraycolumn = ARRAY {listofStr}::varchar[],
    timestamp = now() WHERE id = {ID}""".format(listofStr=listOfStr, ID=ID))
I get this error:
UndefinedColumn: column "A second string with a ' single quote" does not exist
LINE 1: 'A sample string', "A second string with a '...
I don't know how to fix it.
Environment:
Ubuntu 18.04 64 bits 5.0.0-37-generic x86_64 GNU/Linux
Python 3.6.9 (default, Nov 7 2019, 10:44:02)
psycopg2 2.7.7
psycopg2-binary 2.8.4
"PostgreSQL 10.10 (Ubuntu 10.10-0ubuntu0.18.04.1) on x86_64-pc-linux-gnu, compiled by gcc (Ubuntu 7.4.0-1ubuntu1~18.04.1) 7.4.0, 64-bit"
Related threads:
Postgres/psycopg2 - Inserting array of strings
Doc:
http://initd.org/psycopg/docs/usage.html -> # list adaptation
Basically the question should have been closed as a duplicate. However, since you already know piro's answer, I think the problem is with interpreting it.
id = 1
list_of_str = ['A sample string', "A second string with a ' single quote", 'a third string']
update_qry = """
UPDATE mytable
SET arraycolumn = %s,
timestamp = now()
WHERE id = %s
"""
cur = conn.cursor()
cur.execute(update_qry, [list_of_str, id])
conn.commit()
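The root cause is visible without a database: str.format() pastes Python's repr of the list, brackets and mixed quoting included, into the SQL text, whereas passing the list as a bind parameter lets psycopg2 adapt it to a proper ARRAY. A quick illustration of what the original query actually sent:

```python
# What str.format() produces for a list: the Python repr, not SQL.
list_of_str = ['A sample string', "A second string with a ' single quote"]
fragment = "SET arraycolumn = {}::varchar[]".format(list_of_str)
print(fragment)
# The brackets and Python-style quoting land verbatim in the SQL text,
# which is what produced the syntax errors above.
```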
I agree with @piro that you really want bind parameters rather than attempting to do any crazy quoting. You already know how to accomplish that when inserting one simple VARCHAR row per list element.
I recommend you create a TEMP TABLE and send your data to the database that way. Then consult https://www.postgresql.org/docs/current/sql-expressions.html#SQL-SYNTAX-ARRAY-CONSTRUCTORS and use this example to munge rows of the temp table into an array:
SELECT ARRAY(SELECT oid FROM pg_proc WHERE proname LIKE 'bytea%');
You will want an expression like
SELECT ARRAY(SELECT my_text FROM my_temp_table);
It is possible that your temp table will also need an integer column, to preserve element order.

Better way to generate dates from start date and end date in POSTGRES

import psycopg2

def db_connect():
    # Please assign variables in dict db = {"database": "", ...}
    try:
        conn = psycopg2.connect(database=db["database"], user=db["user"],
                                password=db["password"], host=db["host"],
                                port=db.get("port", "5432"))
    except psycopg2.Error:
        conn = None
        print("I am unable to connect to the database")
    return conn

def db_query_time():
    conn = db_connect()
    if conn is not None:
        cur = conn.cursor()
        query_generate_date_series = '''SELECT to_char(day, 'YYYYMMDD') AS day_f FROM generate_series('2017-10-23'::timestamp, '2017-10-29'::timestamp, '1 day'::interval) day;'''
        cur.execute(query_generate_date_series)
        rows = cur.fetchall()
        print(rows)
Output looks like this: [('20171023',), ('20171024',), ('20171025',), ('20171026',), ('20171027',), ('20171028',), ('20171029',)]
I want the dates in a flat list, and I don't like how we are getting back tuples (two commas instead of one). Can anyone explain the reason behind this and how to fix it?
Note: Need postgres DB to run this.
Perhaps it's because you're using a table "day" and not a column.
This might work:
SELECT to_char(generate_series( '2017-10-23'::timestamp,'2017-10-29'::timestamp, '1 day'::interval), 'YYYYMMDD') AS day_f
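On the Python side, fetchall() always returns one tuple per row, even when the query selects a single column; that is where the "two commas" come from. Regardless of how the SQL is written, flattening is a one-line comprehension:

```python
# fetchall() yields one tuple per row; pull the single column out of each.
rows = [('20171023',), ('20171024',), ('20171025',)]  # shape returned by fetchall()
days = [row[0] for row in rows]
print(days)  # ['20171023', '20171024', '20171025']
```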

Sybase BulkCopy WriteToServer error: Incorrect syntax near ','

Is it possible to populate a temp table using AseBulkCopy.WriteToServer?
I am calling the below method twice in my test app: firstly with a non-temp table and secondly with a temp table. The code runs fine with the non-temp table, but when trying to populate a temp table the error:
Incorrect syntax near ','.
is raised.
In both cases the target and source tables have just a single column, defined as an INT with the same name.
I have tried using a DataTable and an IDataReader as the source of the data and both result in the same error being raised.
I have tried using both "EnableBulkLoad=1" and "EnableBulkLoad=2" in the connection string.
I have tried using both the raw temp table name and the name prefixed with "dbo."
The data to be loaded is a single int value (ie, 1 row, 1 column) although it also happens if have longer rows or multiple rows.
It's worth noting that I can insert data into the temp table (using AseCommand.ExecuteNonQuery) and can execute a 'SELECT COUNT (1)' from the temp table (using AseCommand.ExecuteScalar) successfully.
Here is the code:
private static void BulkCopyInsertIntoTable(string tableName)
{
    IDataReader dataSource = null;
    SqlConnection sourceConnection = null;
    MssqlCommand.GetDataReader(SqlServerConnectionString, out sourceConnection, out dataSource);
    AseConnection targetConnection = new AseConnection(SybaseConnectionString);
    try
    {
        targetConnection.Open();
        AseCommand cmd = new AseCommand();
        AseBulkCopy blk = new AseBulkCopy(targetConnection);
        blk.DestinationTableName = tableName;
        //blk.ColumnMappings.Clear();
        //blk.ColumnMappings.Add(new AseBulkCopyColumnMapping(0, 0)); // Doesn't make any difference with a data reader; causes an error to be raised with a DataTable.
        Console.WriteLine("bulkcopy insert into the table " + tableName + " ..starting: datasource");
        //blk.WriteToServer(dataSource);
        Console.WriteLine("bulkcopy insert into the table " + tableName + " ..starting: datatable");
        blk.ColumnMappings.Clear();
        DataTable dt = SybaseCommand.GetFakeDataTable();
        blk.WriteToServer(dt);
    }
    catch (AseException ex)
    {
        Console.WriteLine(ex.Message);
    }
    finally
    {
        targetConnection.Dispose();
        Console.WriteLine("bulkcopy insert into the table " + tableName + " ..ended");
    }
}
Firstly, is it possible to populate a temp table using WriteToServer?
Assuming it is, what might I be doing wrong?
UPDATE:
When I change the line
blk.DestinationTableName = tableName;
to
blk.DestinationTableName = "XXXX";
I get the same error, so are there rules about how the temp table is named when using WriteToServer? The value of tableName is what I was using for the direct INSERT and SELECT COUNT(1) queries so I expected it to be correct.
Thanks
In my experience, the answer is no, you can't use AseBulkCopy.WriteToServer to populate a temporary table.