Python parameterized query and insert into SQL Server - sql-server

I'm using the pyodbc connector to store data and an image in SQL Server. The insert function takes parameterized arguments whose values are supplied by global variables set in other functions.
With hard-coded values I am able to insert into the DB without any issue, but I seem to have no luck when trying to insert using the variable values.
What is the right way to execute this transaction in Python? Any help/advice is highly appreciated!
import pyodbc
# QtGui is assumed to come from the GUI toolkit already used by this class (e.g. PyQt4)

def convertToBinaryData(filename):
    # Convert digital data to binary format
    with open(filename, 'rb') as file:
        binaryData = file.read()
    return binaryData

def saveRecord1(self, DocumentType, FileName, DocumentContent, DocumentText, LastUpdate, UpdatedBy):
    print("Inserting into database")
    conn = pyodbc.connect('Driver={SQL Server};'
                          'Server=localhost;'
                          'Database=testDB;'
                          'uid=test;'
                          'pwd=test01;'
                          'Trusted_Connection=No;')
    cursor = conn.cursor()  # note: pyodbc's cursor() takes no 'prepared' argument
    sql_insert_blob_query = """INSERT INTO testDB.dbo.OCRDocuments (DocumentType, FileName, DocumentContent, DocumentText, LastUpdate, UpdatedBy) VALUES (?,?,?,?,?,?)"""
    pics = convertToBinaryData(DocumentContent)
    insert_blob_tuple = (DocumentType, FileName, pics, DocumentText, LastUpdate, UpdatedBy)
    result = cursor.execute(sql_insert_blob_query, insert_blob_tuple)
    QtGui.QMessageBox.warning(self, 'Status', 'Successfully saved!',
                              QtGui.QMessageBox.Cancel, QtGui.QMessageBox.Ok)
    conn.commit()
    conn.close()

#saveRecord( 'k1', 'imgFileType', "output.png", '2020-10-27 11:20:47.000', '2020-10-27 11:20:47.000','1000273868')
saveRecord1(self, docType, imgFileType, output, docNum, datetime, userID)
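For reference, a minimal sketch of the same insert done with plain pyodbc placeholders, keeping the parameter tuple aligned with the ? markers. This assumes the table and driver from the question, and that docType, imgFileType, output, docNum, datetime and userID hold ordinary str values; the bytes returned by read() are sent as varbinary:

import pyodbc

def convert_to_binary(filename):
    # Read the file as raw bytes; pyodbc binds a bytes value to a varbinary column.
    with open(filename, 'rb') as f:
        return f.read()

def save_record(doc_type, file_name, image_path, doc_text, last_update, updated_by):
    conn = pyodbc.connect('Driver={SQL Server};'
                          'Server=localhost;'
                          'Database=testDB;'
                          'uid=test;'
                          'pwd=test01;'
                          'Trusted_Connection=No;')
    try:
        cursor = conn.cursor()
        sql = ("INSERT INTO dbo.OCRDocuments "
               "(DocumentType, FileName, DocumentContent, DocumentText, LastUpdate, UpdatedBy) "
               "VALUES (?, ?, ?, ?, ?, ?)")
        params = (doc_type, file_name, convert_to_binary(image_path),
                  doc_text, last_update, updated_by)
        cursor.execute(sql, params)  # each ? is bound to the matching tuple element
        conn.commit()
    finally:
        conn.close()

# hypothetical call using the question's variables:
# save_record(docType, imgFileType, output, docNum, datetime, userID)

If the hard-coded call works but the variable call does not, the usual suspects are the variable values themselves (None, a Qt object such as QString or QDateTime instead of str, or arguments passed in the wrong order), so printing repr(params) right before execute() is a quick way to see what is actually being bound.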

Related

NIFI - upload binary.zip to SQL Server as varbinary

I am trying to upload binary.zip to SQL Server as the content of a varbinary column.
Target Table:
CREATE TABLE myTable ( zipFile varbinary(MAX) );
My NIFI Flow is very simple:
-> GetFile:
   filter: binary.zip
-> UpdateAttribute:
   sql.args.1.type = -3  # varbinary, according to the JDBC types enumeration
   sql.args.1.value = ???  # I don't know what to put here! (I've tried everything!)
   sql.args.1.format = ???  # Is it required? I tried 'hex'
-> PutSQL:
   SQLstatement = INSERT INTO myTable (zip_file) VALUES (?);
What should I put in sql.args.1.value?
I think it should be the flowfile payload, but would that work as part of the INSERT in PutSQL? It hasn't so far!
Thanks!
SOLUTION UPDATE:
Based on https://issues.apache.org/jira/browse/NIFI-8052
(Note that I'm sending some data as attribute parameters.)
import java.nio.charset.StandardCharsets
import org.apache.nifi.controller.ControllerService
import groovy.sql.Sql

def flowFile = session.get()
if (!flowFile) return

def lookup = context.controllerServiceLookup
def dbServiceName = flowFile.getAttribute('DatabaseConnectionPoolName')
def tableName = flowFile.getAttribute('table_name')
def fieldName = flowFile.getAttribute('field_name')
def dbcpServiceId = lookup.getControllerServiceIdentifiers(ControllerService).find { cs ->
    lookup.getControllerServiceName(cs) == dbServiceName
}
def conn = lookup.getControllerService(dbcpServiceId)?.getConnection()
def sql = new Sql(conn)

flowFile.read { rawIn ->
    def parms = [rawIn]
    sql.executeInsert "INSERT INTO " + tableName + " (date, " + fieldName + ") VALUES (CAST(GETDATE() AS Date), ?)", parms
}

conn?.close()
session.transfer(flowFile, REL_SUCCESS)
session.commit()
Maybe there is a NiFi-native way to insert a blob; however, you could use ExecuteGroovyScript instead of UpdateAttribute and PutSQL.
Add an SQL.mydb parameter at the processor level and link it to the required DBCP pool.
Use the following script body:
def ff = session.get()
if (!ff) return

def statement = "INSERT INTO myTable (zip_file) VALUES (:p_zip_file)"
def params = [
    p_zip_file: SQL.mydb.BLOB(ff.read())  // cast flow file content as a BLOB sql type
]
SQL.mydb.executeInsert(params, statement)  // committed automatically on flow file success

// transfer to success without changes
REL_SUCCESS << ff
Inside the script, SQL.mydb is a reference to a groovy.sql.Sql object.

KeyError while trying to connect to database using pymssql

The code below tries to connect to an MS SQL database using pymssql. I have a CSV file, and I am trying to push all of its rows into a single table in the database. I get a KeyError when I execute the code after opening the CSV file.
import csv
import pymssql

conn = pymssql.connect(host="host name",
                       database="dbname",
                       user="username",
                       password="password")
cursor = conn.cursor()
if conn:
    print("True")
else:
    print("False")

with open('path to csv file', 'r') as f:
    reader = csv.reader(f)
    columns = next(reader)
    query = "INSERT INTO Marketing({'URL', 'Domain_name', 'Downloadables', 'Text_without_javascript', 'Downloadable_Link'}) VALUES ({%s,%s,%s,%s,%s})"
    query = query.format(','.join('[' + x + ']' for x in columns), ','.join('?' * len(columns)))
    cursor = conn.cursor()
    for data in reader:
        cursor.execute(query, tuple(data))
    cursor.commit()
Below is the error that I get:
KeyError: "'URL', 'Domain_name', 'Downloadables', 'Text_without_javascript', 'Downloadable_Link'"
Using to_sql
import pandas as pd
from sqlalchemy import create_engine

file_path = "path to csv"
engine = create_engine("mssql://user:password@host/database")
df = pd.read_csv(file_path, encoding='latin')
df.to_sql(name='Marketing', con=engine, if_exists='append')
Output:
InterfaceError: (pyodbc.InterfaceError) ('IM002', '[IM002] [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified (0) (SQLDriverConnect)')
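That InterfaceError usually means SQLAlchemy fell back to its default pyodbc dialect without an ODBC driver or DSN being specified. Since pymssql is already installed here, one common workaround (a sketch, assuming the same credentials as above) is to name the dialect explicitly in the URL:

from sqlalchemy import create_engine

# 'mssql+pymssql://' selects the pymssql driver, so no ODBC DSN is required
engine = create_engine("mssql+pymssql://user:password@host/database")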
I tried everything, from converting the parameters being passed into a tuple to passing them as-is, but nothing helped. Below is the code that fixed the issue for me:
with open('path to csv file', 'r') as f:
    for row in f:
        reader = csv.reader(f)
        # print(reader)
        columns = next(reader)
        # print(columns)
        cursor = conn.cursor()
        for data in reader:
            # print(data)
            data = tuple(data)
            # print(data)
            query = ("INSERT INTO Marketing(URL, Domain_name, Downloadables, Text_without_javascript, Downloadable_Link) VALUES (%s,%s,%s,%s,%s)")
            parameters = data
            # query = query.format(','.join('?' * len(columns)))
            cursor.execute(query, parameters)

conn.commit()
Note: the part that connects to the database remains as in the question.
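As a side note on the original KeyError: str.format treats everything between { and } as the name of a replacement field, so the literal column list {'URL', 'Domain_name', ...} is looked up as a key, which is exactly the error reported above. A minimal sketch of building the same statement from the CSV header instead (table and column names as in the question; %s placeholders because pymssql uses a format-style paramstyle):

import csv
import pymssql

conn = pymssql.connect(host="host name", database="dbname",
                       user="username", password="password")
cursor = conn.cursor()

with open('path to csv file', 'r') as f:
    reader = csv.reader(f)
    columns = next(reader)                                  # header row from the CSV
    column_list = ','.join('[' + c + ']' for c in columns)
    placeholders = ','.join(['%s'] * len(columns))
    # Building the string directly avoids str.format altogether;
    # escaping the braces as {{ }} in the original string would also have worked.
    query = "INSERT INTO Marketing(" + column_list + ") VALUES (" + placeholders + ")"
    for data in reader:
        cursor.execute(query, tuple(data))                  # values stay parameterized

conn.commit()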

How to check whether a particular record already exists in the database before inserting a record in MVC?

I am using the following code to export data from Excel to a SQL Server database. As it stands, this code imports the complete data set into the database.
[HttpPost]
public ActionResult Importexcel()
{
    if (Request.Files["FileUpload1"].ContentLength > 0)
    {
        string extension = System.IO.Path.GetExtension(Request.Files["FileUpload1"].FileName);
        string path1 = string.Format("{0}/{1}", Server.MapPath("~/Content/UploadedFolder"), Request.Files["FileUpload1"].FileName);
        if (System.IO.File.Exists(path1))
            System.IO.File.Delete(path1);
        Request.Files["FileUpload1"].SaveAs(path1);
        string sqlConnectionString = @"Data Source=xyz-101\SQLEXPRESS;Database=PracDB;Trusted_Connection=true;Persist Security Info=True";
        string excelConnectionString = @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + path1 + ";Extended Properties=Excel 12.0;Persist Security Info=False";
        OleDbConnection excelConnection = new OleDbConnection(excelConnectionString);
        OleDbCommand cmd = new OleDbCommand("Select [ID],[Name],[Designation] from [Sheet1$]", excelConnection);
        excelConnection.Open();
        OleDbDataReader dReader;
        dReader = cmd.ExecuteReader();
        SqlBulkCopy sqlBulk = new SqlBulkCopy(sqlConnectionString);
        sqlBulk.DestinationTableName = "Excel_Table";
        sqlBulk.WriteToServer(dReader);
        excelConnection.Close();
    }
    return RedirectToAction("Index");
}
How can I check whether a particular record already exists in the database? If it does not, the record should be inserted; otherwise it should not.
Thanks in advance!
Since your target is SQL Server, you can use this to your advantage.
What I would do is read the data from the Excel file into a DataTable (instead of using a DataReader you can use a DataAdapter), send that DataTable to a stored procedure in SQL Server, and handle the insert there. In order to send a DataTable to a stored procedure, you first need to create a table-valued user-defined type in your SQL Server, like this:
CREATE TYPE MyType AS TABLE
(
    Id int,
    Name varchar(20),          -- use whatever length best fits your data
    Designation varchar(max)   -- use whatever length best fits your data
)
Then you can write a simple stored procedure with an argument of this type:
CREATE PROCEDURE InsertDataFromExcel
(
    @ExcelData dbo.MyType READONLY -- Note: READONLY is required!
)
AS
    INSERT INTO MyTable(Id, Name, Designation)
    SELECT a.Id, a.Name, a.Designation
    FROM @ExcelData a LEFT JOIN
         MyTable b ON (a.Id = b.Id)
    WHERE b.Id IS NULL -- this condition with the left join ensures you only select records whose Id values are not already in your database
In order to send this parameter to the stored procedure from your C# code, you will have to use a SqlCommand object and add the DataTable as a parameter, something like this:
using (SqlConnection Con = new SqlConnection(sqlConnectionString))
{
    using (SqlCommand InsertCommand = new SqlCommand("InsertDataFromExcel", Con))
    {
        InsertCommand.CommandType = CommandType.StoredProcedure; // the command is a stored procedure, not inline SQL
        SqlParameter MyParam = new SqlParameter("@ExcelData", SqlDbType.Structured);
        MyParam.Value = MyDataTable; // this is the DataTable filled from the Excel file
        InsertCommand.Parameters.Add(MyParam);
        Con.Open();
        InsertCommand.ExecuteNonQuery();
        Con.Close();
    }
}
Note: this code was written directly here, so there may be some errors.

Maximum length of string which can be returned from stored proc in SQL Server 2008 to .net apps

I am returning a static string from a stored procedure (in SQL Server 2008) as below:
select 'abcdefgh.........xyz'
If the length of the static string exceeds some limit (e.g. 8 KB), only a partial string (e.g. 7 KB) is returned to the .NET application.
Although I have tried different approaches, such as assigning the static string to a varchar(max) variable and selecting that variable, it still returns only a partial string.
I need to return the complete string, which could be up to 5 MB. So my main concerns are:
What is the maximum string length I can return from a stored procedure?
How do I return a 5 MB string from a stored procedure to a .NET application?
I would appreciate any help in resolving this issue.
Please find the code below:
using (SqlCommand command = new SqlCommand(Source.GetExportRecordSP, Connection))
{
    command.CommandType = CommandType.StoredProcedure;
    command.Parameters.Add(new SqlParameter("@CandidateRecordID", SqlDbType.NVarChar, 32)).Value = record;
    try
    {
        if (Connection.State != ConnectionState.Open)
        {
            Connection.Open();
        }
        using (SqlDataReader reader = command.ExecuteReader())
        {
            if (reader.Read())
            {
                xmlRecord = new XmlDocument();
                xmlRecord.LoadXml(reader.GetString(0));
            }
        }
    }
    catch (Exception Ex)
    {
        // string.Format needs arguments for both {0} and {1}
        Logging.WriteError(string.Format("Error while retrieving the Record \"{0}\" details from Database. Exception: {1}", record, Ex.ToString()));
        throw;
    }
}
Thanks in advance geeks.
Since you appear not to be using an OLEDB connection (which has an 8k limit), I think the problem is in your procedure code.
Or, perhaps, the compatibility version of your database is set to something other than SQL Server 2008 (SQL Server 2000 could not return more than 8k using GetString()).
Thanks for the support; I found one fix for this at
http://www.sqlservercentral.com/Forums/Topic350590-145-1.aspx
The fix is to declare a variable, initialize it to an empty string, and concatenate it with the main string:
DECLARE @test varchar(MAX);
SET @test = '';
SELECT @test = @test + '<Invoice>.....'
If the string length is <8000 it will work without the above approach.
Thanks all.

Insert data into SQL Server database table with inline query, not allowed to use stored proc

I have a table in C# whose data comes from an Excel file. I need this data to be inserted into a SQL Server 2000 table, and I am not allowed to use stored procedures. How do I program this? Any help would be appreciated.
Do you have a DataTable?
You'd need something like:
// set up connection to your database
using (SqlConnection con = new SqlConnection("your-connection-string-here"))
{
    // define the INSERT statement - of course, I don't know what your table name
    // is and which and how many fields you want to insert - adjust accordingly
    string insertStmt =
        "INSERT INTO dbo.YourTable(field1, field2, field3) " +
        "VALUES(@field1, @field2, @field3)";

    // create SqlCommand object
    using (SqlCommand cmd = new SqlCommand(insertStmt, con))
    {
        // set up the parameters - again: I don't know your parameter names
        // nor the parameters' types - adjust to your needs
        cmd.Parameters.Add("@field1", SqlDbType.Int);
        cmd.Parameters.Add("@field2", SqlDbType.VarChar, 100);
        cmd.Parameters.Add("@field3", SqlDbType.VarChar, 250);

        // open connection
        con.Open();

        // iterate over all the Rows in your data table
        foreach (DataRow row in YourDataTable.Rows)
        {
            // assign the values to the parameters, based on your DataRow
            cmd.Parameters["@field1"].Value = Convert.ToInt32(row["columnname1"]);
            cmd.Parameters["@field2"].Value = row["columnname2"].ToString();
            cmd.Parameters["@field3"].Value = row["columnname3"].ToString();

            // call INSERT statement
            cmd.ExecuteNonQuery();
        }

        // close connection
        con.Close();
    }
}
Of course, this has no error checking whatsoever; you will need to add some of that yourself (try...catch and so on). But basically, that's the way I would do it if I can't use stored procedures.
use System.Data.SqlClient.SqlCommand
