.NET: Using Transactions with Prepared Statements for SqlClient (SQL Server)

I'm trying to implement Transactions for a script, but I've run into a strange issue.
When I attempt to run a Prepared SQL Statement inside a Transaction, it fails with an error saying the command needs a Transaction when the connection assigned to it is in a pending local transaction.
How does this work with Prepared Statements, though, given that I intend to have multiple Transactions all using the same Prepared Statements?
My code is as follows:
class dbTest {
    public static SqlConnection db;
    public static SqlCommand query;

    static void Main(string[] args) {
        db = connect();
        prepare();
        transaction01();
        transaction02();
        transaction03();
    }

    public static void prepare() {
        query = new SqlCommand("select id from table where id = 1 for update", db);
        query.Prepare();
    }

    public static void transaction01() {
        SqlTransaction trans = db.BeginTransaction("Trn01");
        SqlDataReader result = query.ExecuteReader();
        while(result.Read()) { Console.WriteLine(result["id"]); }
        result.Close();
        trans.Commit();
    }

    public static void transaction02() {
        SqlTransaction trans = db.BeginTransaction("Trn02");
        SqlDataReader result = query.ExecuteReader();
        while(result.Read()) { Console.WriteLine(result["id"]); }
        result.Close();
        trans.Commit();
    }

    public static void transaction03() {
        SqlTransaction trans = db.BeginTransaction("Trn03");
        SqlDataReader result = query.ExecuteReader();
        while(result.Read()) { Console.WriteLine(result["id"]); }
        result.Close();
        trans.Commit();
    }
}
How do I assign the Transaction to an existing Prepared Statement?
UPDATE
Changed the above code to better show the issue. The SQL is prepared once, but I will be using it for multiple Transactions (or at least I want to).
UPDATE AGAIN
I have marked an answer below as the correct one because it looks like the best way to achieve this, but for my needs in this very small example, setting query.Transaction got it working:
public static void transaction01() {
    SqlTransaction trans = db.BeginTransaction("Trn01");
    query.Transaction = trans; // this line fixed it
    SqlDataReader result = query.ExecuteReader();
    while(result.Read()) { Console.WriteLine(result["id"]); }
    result.Close();
    trans.Commit();
}

When working with SqlTransaction, you must set SqlCommand.Transaction explicitly, even though enlisting in the connection's current transaction is not optional in SQL Server.
select ... for update is not valid SQL Server syntax. Instead, use the UPDLOCK table hint to read a row and retain a restrictive lock for the duration of the transaction, e.g.:
select id from table with (updlock) where id = 1
When I attempt to run a Prepared SQL Statement
It's rarely useful to use prepared statements with SQL Server. Query plan caching happens automatically even without it, and it really just reduces the size of the request on the network when you are executing a SqlCommand many times with differing parameters.
But a prepared SqlCommand is still bound to a single SqlConnection, which typically has a short lifetime, minimizing the potential benefits of preparing the SqlCommand.
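For illustration, here is a minimal sketch (assuming using System.Data and System.Data.SqlClient, with the table and column names taken from the question and connectionString as a placeholder) of reusing a single parameterized SqlCommand across several transactions without calling Prepare(), setting Transaction explicitly each time:
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("select id from [table] with (updlock) where id = @id", conn))
{
    cmd.Parameters.Add("@id", SqlDbType.Int);
    conn.Open();

    // The same command object is reused for each transaction; no Prepare() call is needed.
    for (int i = 0; i < 3; i++)
    {
        using (SqlTransaction tran = conn.BeginTransaction())
        {
            cmd.Transaction = tran;            // must be (re)assigned for every transaction
            cmd.Parameters["@id"].Value = 1;
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read()) Console.WriteLine(reader["id"]);
            }
            tran.Commit();
        }
    }
}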

You need to set SqlCommand.Transaction to your transaction object.
It is not necessary with SQL Server to prepare the statement; just keep executing.
Note also, as you can see in this post, that you must correctly dispose of all DB objects.
Here is your code cleaned up:
class dbTest {
    // DO NOT cache the connection object
    static void Main(string[] args) {
        using(var db = connect())
        using(var comm = GetCommand(db))
        {
            transaction01(comm);
            transaction02(comm);
            transaction03(comm);
        }
    }

    public static SqlCommand GetCommand(SqlConnection conn) {
        return new SqlCommand("select id from table with (updlock) where id = 1", conn);
    }

    public static void transaction01(SqlCommand comm) {
        using(SqlTransaction trans = comm.Connection.BeginTransaction("Trn01"))
        {
            comm.Transaction = trans;
            using(SqlDataReader result = comm.ExecuteReader())
                while(result.Read()) { Console.WriteLine(result["id"]); }
            trans.Commit();
        } // no need to close, using will sort that out
    }

    public static void transaction02(SqlCommand comm) {
        using(SqlTransaction trans = comm.Connection.BeginTransaction("Trn02"))
        {
            comm.Transaction = trans;
            using(SqlDataReader result = comm.ExecuteReader())
                while(result.Read()) { Console.WriteLine(result["id"]); }
            trans.Commit();
        } // no need to close, using will sort that out
    }

    public static void transaction03(SqlCommand comm) {
        using(SqlTransaction trans = comm.Connection.BeginTransaction("Trn03"))
        {
            comm.Transaction = trans;
            using(SqlDataReader result = comm.ExecuteReader())
                while(result.Read()) { Console.WriteLine(result["id"]); }
            trans.Commit();
        } // no need to close, using will sort that out
    }
}

Related

How to use scope_identity in jdbc [duplicate]

I want to INSERT a record in a database (which is Microsoft SQL Server in my case) using JDBC in Java. At the same time, I want to obtain the insert ID. How can I achieve this using JDBC API?
If it is an auto generated key, then you can use Statement#getGeneratedKeys() for this. You need to call it on the same Statement as the one being used for the INSERT. You first need to create the statement using Statement.RETURN_GENERATED_KEYS to notify the JDBC driver to return the keys.
Here's a basic example:
public void create(User user) throws SQLException {
try (
Connection connection = dataSource.getConnection();
PreparedStatement statement = connection.prepareStatement(SQL_INSERT,
Statement.RETURN_GENERATED_KEYS);
) {
statement.setString(1, user.getName());
statement.setString(2, user.getPassword());
statement.setString(3, user.getEmail());
// ...
int affectedRows = statement.executeUpdate();
if (affectedRows == 0) {
throw new SQLException("Creating user failed, no rows affected.");
}
try (ResultSet generatedKeys = statement.getGeneratedKeys()) {
if (generatedKeys.next()) {
user.setId(generatedKeys.getLong(1));
}
else {
throw new SQLException("Creating user failed, no ID obtained.");
}
}
}
}
Note that you're dependent on the JDBC driver as to whether it works. Currently, most recent versions will work, but if I am correct, the Oracle JDBC driver is still somewhat troublesome with this. MySQL and DB2 have already supported it for ages. PostgreSQL started to support it not long ago. I can't comment about MSSQL as I've never used it.
For Oracle, you can invoke a CallableStatement with a RETURNING clause or a SELECT CURRVAL(sequencename) (or whatever DB-specific syntax to do so) directly after the INSERT in the same transaction to obtain the last generated key. See also this answer.
Create Generated Column
String generatedColumns[] = { "ID" };
Pass this generated column array to your statement:
PreparedStatement stmtInsert = conn.prepareStatement(insertSQL, generatedColumns);
Use ResultSet object to fetch the GeneratedKeys on Statement
ResultSet rs = stmtInsert.getGeneratedKeys();
if (rs.next()) {
long id = rs.getLong(1);
System.out.println("Inserted ID -" + id); // display inserted record
}
When encountering an 'Unsupported feature' error while using Statement.RETURN_GENERATED_KEYS, try this:
String[] returnId = { "BATCHID" };
String sql = "INSERT INTO BATCH (BATCHNAME) VALUES ('aaaaaaa')";
PreparedStatement statement = connection.prepareStatement(sql, returnId);
int affectedRows = statement.executeUpdate();
if (affectedRows == 0) {
throw new SQLException("Creating user failed, no rows affected.");
}
try (ResultSet rs = statement.getGeneratedKeys()) {
if (rs.next()) {
System.out.println(rs.getInt(1));
}
rs.close();
}
Where BATCHID is the auto generated id.
I'm hitting Microsoft SQL Server 2008 R2 from a single-threaded JDBC-based application and pulling back the last ID without using the RETURN_GENERATED_KEYS property or any PreparedStatement. Looks something like this:
private int insertQueryReturnInt(String SQLQy) {
    ResultSet generatedKeys = null;
    int generatedKey = -1;
    try {
        Statement statement = conn.createStatement();
        statement.execute(SQLQy);
    } catch (Exception e) {
        errorDescription = "Failed to insert SQL query: " + SQLQy + "( " + e.toString() + ")";
        return -1;
    }
    try {
        generatedKey = Integer.parseInt(readOneValue("SELECT @@IDENTITY"));
    } catch (Exception e) {
        errorDescription = "Failed to get ID of just-inserted SQL query: " + SQLQy + "( " + e.toString() + ")";
        return -1;
    }
    return generatedKey;
}
This blog post nicely isolates three main SQL Server "last ID" options:
http://msjawahar.wordpress.com/2008/01/25/how-to-find-the-last-identity-value-inserted-in-the-sql-server/ - haven't needed the other two yet.
Instead of a comment, I just want to answer the post.
Interface java.sql.PreparedStatement
columnIndexes « You can use the prepareStatement overloads that accept an autoGeneratedKeys flag or an array of column indexes along with the SQL statement.
The allowed constant flags are Statement.RETURN_GENERATED_KEYS or Statement.NO_GENERATED_KEYS, and the SQL statement may contain one or more '?' IN parameter placeholders.
SYNTAX «
Connection.prepareStatement(String sql, int autoGeneratedKeys)
Connection.prepareStatement(String sql, int[] columnIndexes)
Example:
PreparedStatement pstmt =
conn.prepareStatement( insertSQL, Statement.RETURN_GENERATED_KEYS );
columnNames « List out the columnNames like 'id', 'uniqueID', .... in the target table that contain the auto-generated keys that should be returned. The driver will ignore them if the SQL statement is not an INSERT statement.
SYNTAX «
Connection.prepareStatement(String sql, String[] columnNames)
Example:
String columnNames[] = new String[] { "id" };
PreparedStatement pstmt = conn.prepareStatement( insertSQL, columnNames );
Full Example:
public static void insertAutoIncrement_SQL(String UserName, String Language, String Message) {
String DB_URL = "jdbc:mysql://localhost:3306/test", DB_User = "root", DB_Password = "";
String insertSQL = "INSERT INTO `unicodeinfo`( `UserName`, `Language`, `Message`) VALUES (?,?,?)";
//"INSERT INTO `unicodeinfo`(`id`, `UserName`, `Language`, `Message`) VALUES (?,?,?,?)";
int primkey = 0 ;
try {
Class.forName("com.mysql.jdbc.Driver").newInstance();
Connection conn = DriverManager.getConnection(DB_URL, DB_User, DB_Password);
String columnNames[] = new String[] { "id" };
PreparedStatement pstmt = conn.prepareStatement( insertSQL, columnNames );
pstmt.setString(1, UserName );
pstmt.setString(2, Language );
pstmt.setString(3, Message );
if (pstmt.executeUpdate() > 0) {
// Retrieves any auto-generated keys created as a result of executing this Statement object
java.sql.ResultSet generatedKeys = pstmt.getGeneratedKeys();
if ( generatedKeys.next() ) {
primkey = generatedKeys.getInt(1);
}
}
System.out.println("Record updated with id = "+primkey);
} catch (InstantiationException | IllegalAccessException | ClassNotFoundException | SQLException e) {
e.printStackTrace();
}
}
I'm using SQL Server 2008, but I have a development limitation: I cannot use a newer driver for it; I have to use "com.microsoft.jdbc.sqlserver.SQLServerDriver" (I cannot use "com.microsoft.sqlserver.jdbc.SQLServerDriver").
That's why the solution conn.prepareStatement(sql, Statement.RETURN_GENERATED_KEYS) threw a java.lang.AbstractMethodError for me.
In this situation, a possible solution I found is the old one suggested by Microsoft:
How To Retrieve @@IDENTITY Value Using JDBC
import java.sql.*;
import java.io.*;
public class IdentitySample
{
public static void main(String args[])
{
try
{
String URL = "jdbc:microsoft:sqlserver://yourServer:1433;databasename=pubs";
String userName = "yourUser";
String password = "yourPassword";
System.out.println( "Trying to connect to: " + URL);
//Register JDBC Driver
Class.forName("com.microsoft.jdbc.sqlserver.SQLServerDriver").newInstance();
//Connect to SQL Server
Connection con = null;
con = DriverManager.getConnection(URL,userName,password);
System.out.println("Successfully connected to server");
//Create statement and Execute using either a stored procedure or batch statement
CallableStatement callstmt = null;
callstmt = con.prepareCall("INSERT INTO myIdentTable (col2) VALUES (?);SELECT @@IDENTITY");
callstmt.setString(1, "testInputBatch");
System.out.println("Batch statement successfully executed");
callstmt.execute();
int iUpdCount = callstmt.getUpdateCount();
boolean bMoreResults = true;
ResultSet rs = null;
int myIdentVal = -1; //to store the @@IDENTITY
//While there are still more results or update counts
//available, continue processing resultsets
while (bMoreResults || iUpdCount!=-1)
{
//NOTE: in order for output parameters to be available,
//all resultsets must be processed
rs = callstmt.getResultSet();
//if rs is not null, we know we can get the results from the SELECT @@IDENTITY
if (rs != null)
{
rs.next();
myIdentVal = rs.getInt(1);
}
//Do something with the results here (not shown)
//get the next resultset, if there is one
//this call also implicitly closes the previously obtained ResultSet
bMoreResults = callstmt.getMoreResults();
iUpdCount = callstmt.getUpdateCount();
}
System.out.println( "##IDENTITY is: " + myIdentVal);
//Close statement and connection
callstmt.close();
con.close();
}
catch (Exception ex)
{
ex.printStackTrace();
}
try
{
System.out.println("Press any key to quit...");
System.in.read();
}
catch (Exception e)
{
}
}
}
This solution worked for me!
I hope this helps!
You can use the following Java code to get the newly inserted ID.
ps = con.prepareStatement(query, Statement.RETURN_GENERATED_KEYS);
ps.setInt(1, quizid);
ps.setInt(2, userid);
ps.executeUpdate();
ResultSet rs = ps.getGeneratedKeys();
if (rs.next()) {
lastInsertId = rs.getInt(1);
}
It is possible to use it with a normal Statement as well (not just a PreparedStatement):
Statement statement = conn.createStatement();
int updateCount = statement.executeUpdate("insert into x...)", Statement.RETURN_GENERATED_KEYS);
try (ResultSet generatedKeys = statement.getGeneratedKeys()) {
if (generatedKeys.next()) {
return generatedKeys.getLong(1);
}
else {
throw new SQLException("Creating failed, no ID obtained.");
}
}
Most others have suggested using the JDBC API for this, but personally, I find it quite painful to do with most drivers, when in fact you can just use a native T-SQL feature, the OUTPUT clause:
try (
Statement s = c.createStatement();
ResultSet rs = s.executeQuery(
"""
INSERT INTO t (a, b)
OUTPUT id
VALUES (1, 2)
"""
);
) {
while (rs.next())
System.out.println("ID = " + rs.getLong(1));
}
This is the simplest solution for SQL Server as well as a few other SQL dialects (e.g. Firebird, MariaDB, PostgreSQL, where you'd use RETURNING instead of OUTPUT).
I've blogged about this topic more in detail here.
With Hibernate's NativeQuery, you need to return a ResultList instead of a SingleResult, because Hibernate modifies a native query
INSERT INTO bla (a,b) VALUES (2,3) RETURNING id
like
INSERT INTO bla (a,b) VALUES (2,3) RETURNING id LIMIT 1
if you try to get a single result, which causes most databases (at least PostgreSQL) to throw a syntax error. Afterwards, you may fetch the resulting id from the list (which usually contains exactly one item).
In my case ->
ConnectionClass objConnectionClass=new ConnectionClass();
con=objConnectionClass.getDataBaseConnection();
pstmtGetAdd=con.prepareStatement(SQL_INSERT_ADDRESS_QUERY,Statement.RETURN_GENERATED_KEYS);
pstmtGetAdd.setString(1, objRegisterVO.getAddress());
pstmtGetAdd.setInt(2, Integer.parseInt(objRegisterVO.getCityId()));
int addId=pstmtGetAdd.executeUpdate();
if(addId>0)
{
ResultSet rsVal=pstmtGetAdd.getGeneratedKeys();
rsVal.next();
addId=rsVal.getInt(1);
}
If you are using Spring JDBC, you can use Spring's GeneratedKeyHolder class to get the inserted ID.
See this answer...
How to get inserted id using Spring Jdbctemplate.update(String sql, obj...args)
If you are using JDBC (tested with MySQL) and you just want the last inserted ID, there is an easy way to get it. The method I'm using is the following:
public static Integer insert(ConnectionImpl connection, String insertQuery){
Integer lastInsertId = -1;
try{
final PreparedStatement ps = connection.prepareStatement(insertQuery);
ps.executeUpdate(insertQuery);
final com.mysql.jdbc.PreparedStatement psFinal = (com.mysql.jdbc.PreparedStatement) ps;
lastInsertId = (int) psFinal.getLastInsertID();
connection.close();
} catch(SQLException ex){
System.err.println("Error: "+ex);
}
return lastInsertId;
}
Also, (and just in case) the method to get the ConnectionImpl is the following:
public static ConnectionImpl getConnectionImpl(){
ConnectionImpl conexion = null;
final String dbName = "database_name";
final String dbPort = "3306";
final String dbIPAddress = "127.0.0.1";
final String connectionPath = "jdbc:mysql://"+dbIPAddress+":"+dbPort+"/"+dbName+"?autoReconnect=true&useSSL=false";
final String dbUser = "database_user";
final String dbPassword = "database_password";
try{
conexion = (ConnectionImpl) DriverManager.getConnection(connectionPath, dbUser, dbPassword);
}catch(SQLException e){
System.err.println(e);
}
return conexion;
}
Remember to add the connector/J to the project referenced libraries.
In my case, the connector/J version is the 5.1.42. Maybe you will have to apply some changes to the connectionPath if you want to use a more modern version of the connector/J such as with the version 8.0.28.
In the file, remember to import the following resources:
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import com.mysql.jdbc.ConnectionImpl;
Hope this will be helpful.
Connection cn = DriverManager.getConnection("Host", "user", "pass");
Statement st = cn.createStatement();
boolean ret = st.execute("your SQL query");

Using lock in Hangfire executed ASP.Net code

I am using Hangfire in an ASP.NET MVC project to manage long-running background jobs.
I am trying to use a lock statement block for a database operation. Here is my lock statement code:
public class LockedTransaction
{
private Object thisLock = new Object();
public LockedTransaction() { }
public void UpdateCustomerBalance(long CustomerId, decimal AmountToDeduct, string ConnectionString)
{
lock (thisLock)
{
using (SqlConnection connection = new SqlConnection(ConnectionString))
{
connection.Open();
using (SqlTransaction transaction = connection.BeginTransaction(System.Data.IsolationLevel.ReadCommitted))
{
using (SqlCommand command = new SqlCommand())
{
command.Connection = connection;
command.Transaction = transaction;
command.CommandText = "SELECT Balance FROM Customer WHERE Id=" + CustomerId;
var userBalance = Convert.ToDecimal(command.ExecuteScalar());
userBalance = userBalance - AmountToDeduct;
command.CommandText = "UPDATE Customer SET Balance=" + userBalance + " WHERE Id=" + CustomerId;
command.ExecuteNonQuery();
transaction.Commit();
}
}
}
}
}
}
Here is how I'm calling the above code:
foreach (var queue in queues)
{
queue.Send();
LockedTransaction lockedTransaction = new LockedTransaction();
lockedTransaction.UpdateCustomerBalance(queue.CustomerId, queue.cost, "ConnectionString");
}
The problem is, the database value is not updated as expected. For example, I have 5 queues as follows:
queue[0].cost = 0.50;
queue[1].cost = 0.50;
queue[2].cost = 0.50;
queue[3].cost = 0.50;
queue[4].cost = 0.50;
The balance should be reduced by 2.5 (the cost total) after completing the loop, but that's not happening. Sometimes the deducted value is 2.00, sometimes 1.5, etc.
Any suggestions?
Your lock object (thisLock) is an instance field, and because the foreach loop creates a new instance of LockedTransaction for each element in the queue, the lock doesn't prevent concurrent executions (each call to UpdateCustomerBalance uses its own lock object).
Changing thisLock to a static field should help:
private static Object thisLock = new Object();
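For illustration, a minimal sketch of the class with the static lock in place (it assumes using System.Data and System.Data.SqlClient, and uses parameters instead of string concatenation, but otherwise keeps the same read-then-update logic as above):
public class LockedTransaction
{
    // Static: one lock object shared by every LockedTransaction instance,
    // so calls made through different instances still serialize.
    private static readonly object thisLock = new object();

    public void UpdateCustomerBalance(long customerId, decimal amountToDeduct, string connectionString)
    {
        lock (thisLock)
        {
            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();
                using (var transaction = connection.BeginTransaction(IsolationLevel.ReadCommitted))
                using (var command = connection.CreateCommand())
                {
                    command.Transaction = transaction;
                    command.CommandText = "SELECT Balance FROM Customer WHERE Id = @id";
                    command.Parameters.AddWithValue("@id", customerId);
                    decimal balance = Convert.ToDecimal(command.ExecuteScalar());

                    command.CommandText = "UPDATE Customer SET Balance = @balance WHERE Id = @id";
                    command.Parameters.AddWithValue("@balance", balance - amountToDeduct);
                    command.ExecuteNonQuery();

                    transaction.Commit();
                }
            }
        }
    }
}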

Multi-threading NpgsqlConnections and readers producing duplicate results. C#

I have thousands of queries I need to run against thousands of different schemas across 10 databases. I am trying to run these queries in parallel and write the results into a BlockingCollection, while another thread reads from this collection and writes the rows to disk, as the result sets of these queries are too large to store in memory.
Here is the problem area in my code:
public class Node {
public string ConnectionString;
public string Query;
public Node(string databaseDetails, string query) {
//Cannot put in actual logic, but this part is fine
ConnectionString = {logic for connection string}
Query = "set search_path to {schema from databaseDetails};" + query
}
}
public void runQuery(string query, BlockingCollection<Dictionary<string, object>> producer) {
List<Node> nodes = getNodes(query);
Parallel.ForEach(nodes, node => {
NpgsqlConnection conn = new NpgsqlConnection(node.ConnectionString);
conn.Open();
NpgsqlCommand npgQuery = new NpgsqlCommand(node.Query, conn);
NpgsqlDataReader reader = npgQuery.ExecuteReader();
while (reader.Read()) {
Dictionary<string, object> row = new Dictionary<string, object>();
for (int i = 0; i < reader.FieldCount; i++) {
row[reader.GetName(i)] = reader.GetValue(i);
}
producer.Add(row);
}
conn.Close();
});
producer.CompleteAdding();
}
This code runs and retrieves all of the results, but it duplicates a lot of the results as well, so the blocking collection has 5-10 times more records than it should. Any help would be greatly appreciated.
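For reference, the consumer side described above (the thread that drains the collection and writes rows to disk) might look something like this minimal sketch; the output path and the use of System.Text.Json for serialization are assumptions, not part of the original code:
public void WriteResults(BlockingCollection<Dictionary<string, object>> producer, string outputPath) {
    using (var writer = new StreamWriter(outputPath)) {
        // GetConsumingEnumerable blocks while the collection is empty and completes
        // once CompleteAdding() has been called and the remaining items are drained.
        foreach (Dictionary<string, object> row in producer.GetConsumingEnumerable()) {
            writer.WriteLine(System.Text.Json.JsonSerializer.Serialize(row));
        }
    }
}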
So I was just being an idiot: I was comparing my generated result set to the UNION of all the queries I was running, not the UNION ALL, so my "true" result set had no duplicates in it because the UNION was removing them :/

Dapper calls sp_executesql when I have parameters, is there a way around that?

When I call
connection.Execute(sql);
Dapper executes and everything is fine. When I call
connection.Execute(sql, new { UserId = _userId });
it executes with sp_executesql.
The issue is that when it uses sp_executesql, the SQL runs in its own scope. If it creates a temporary table, that table is not accessible to subsequent queries that use the same connection. I could get around it by using global temporary tables, but I don't want to risk having two processes interfere with each other.
Does anybody know a way around that?
Update: I have the same problem when I use SqlCommand objects without Dapper. I wrote a unit test that illustrates the problem I'm having. WorksWithParameters fails with System.Data.SqlClient.SqlException : Invalid object name '#TEMP_OBJECTLIST'.
[TestFixture]
public class DapperTest
{
private const string TestObjectType = "S";
private const string ConnectionString = "XXXXXXXXX";
private static void CreateTempTableWithoutParameters(SqlConnection connection)
{
const string sql = "SELECT TOP 10 * INTO #TEMP_OBJECTLIST FROM sys.objects WHERE TYPE = 'S'";
connection.Execute(sql);
}
private static void UseTempTableWithoutParameters(SqlConnection connection)
{
const int expectedCount = 10;
const string sql = "SELECT COUNT(*) FROM #TEMP_OBJECTLIST WHERE TYPE = 'S'";
var count = connection.Query<int>(sql).First();
Assert.AreEqual(expectedCount, count);
}
private static void CreateTempTableWithParameters(SqlConnection connection)
{
const string sql = "SELECT TOP 10 * INTO #TEMP_OBJECTLIST FROM sys.objects WHERE TYPE = #OBJECT_TYPE";
connection.Execute(sql, new {OBJECT_TYPE = TestObjectType});
}
private static void UseTempTableWithParameters(SqlConnection connection)
{
const int expectedCount = 10;
const string sql = "SELECT COUNT(*) FROM #TEMP_OBJECTLIST WHERE TYPE = #OBJECT_TYPE";
var param = new {OBJECT_TYPE = TestObjectType};
var count = connection.Query<int>(sql, param).First();
Assert.AreEqual(expectedCount, count);
}
[Test]
public void WorksWithParameters()
{
using (var connection = new SqlConnection(ConnectionString))
{
connection.Open();
CreateTempTableWithParameters(connection);
UseTempTableWithParameters(connection);
}
}
[Test]
public void WorksWithoutParameters()
{
using (var connection = new SqlConnection(ConnectionString))
{
connection.Open();
CreateTempTableWithoutParameters(connection);
UseTempTableWithoutParameters(connection);
}
}
}
One way around the temp table scope problem is to create the temp table with one dummy column in the outer scope, then use alter table statements to add all the desired columns and use it.
Additionally, How to share data between procedures by Erland Sommarskog may be useful to you or another person looking for different options for sharing data.
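A rough sketch of that dummy-column approach with Dapper (the table and column names here are made up for illustration, and connection is assumed to be an open SqlConnection):
// No parameters, so these run as plain batches and #Results is created
// in the connection's outer scope rather than inside sp_executesql.
connection.Execute("CREATE TABLE #Results (Dummy INT)");
connection.Execute("ALTER TABLE #Results ADD Name SYSNAME, ObjectType CHAR(2)");
connection.Execute("ALTER TABLE #Results DROP COLUMN Dummy");

// Parameterized statements are wrapped in sp_executesql, but they can still
// see #Results because it lives in the outer scope of the same connection.
connection.Execute(
    "INSERT INTO #Results (Name, ObjectType) SELECT name, type FROM sys.objects WHERE type = @type",
    new { type = "S" });

foreach (var row in connection.Query("SELECT Name, ObjectType FROM #Results"))
    Console.WriteLine($"{row.Name} ({row.ObjectType})");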
I ran into the same problem with Dapper, but it's not Dapper's fault. sp_executesql is called by ADO.NET, and it switches the "scope", so temp tables created inside it are not visible afterwards.
As a workaround:
//no parameters, so it runs without sp_executesql
conn.Execute("CREATE TABLE #temp BLAHBLAH");
//do your thing
conn.Execute("INSERT INTO #temp BLAHBLAH", parameters);
//cleanup (no parameters again)
conn.Execute("DROP TABLE #temp");

Is it possible to use `SqlDbType.Structured` to pass Table-Valued Parameters in NHibernate?

I want to pass a collection of ids to a stored procedure that will be mapped using NHibernate. This technique was introduced in SQL Server 2008 (more info here => Table-Valued Parameters). I just don't want to pass multiple ids within an nvarchar parameter and then chop its value on the SQL Server side.
My first, ad hoc, idea was to implement my own IType.
public class Sql2008Structured : IType {
private static readonly SqlType[] x = new[] { new SqlType(DbType.Object) };
public SqlType[] SqlTypes(NHibernate.Engine.IMapping mapping) {
return x;
}
public bool IsCollectionType {
get { return true; }
}
public int GetColumnSpan(NHibernate.Engine.IMapping mapping) {
return 1;
}
public void NullSafeSet(DbCommand st, object value, int index, NHibernate.Engine.ISessionImplementor session) {
var s = st as SqlCommand;
if (s != null) {
s.Parameters[index].SqlDbType = SqlDbType.Structured;
s.Parameters[index].TypeName = "IntTable";
s.Parameters[index].Value = value;
}
else {
throw new NotImplementedException();
}
}
#region IType Members...
#region ICacheAssembler Members...
}
No more methods are implemented; a throw new NotImplementedException(); is in all the rest. Next, I created a simple extension for IQuery.
public static class StructuredExtensions {
private static readonly Sql2008Structured structured = new Sql2008Structured();
public static IQuery SetStructured(this IQuery query, string name, DataTable dt) {
return query.SetParameter(name, dt, structured);
}
}
Typical usage for me is
DataTable dt = ...;
ISession s = ...;
var l = s.CreateSQLQuery("EXEC some_sp @id = :id, @par1 = :par1")
.SetStructured("id", dt)
.SetParameter("par1", ...)
.SetResultTransformer(Transformers.AliasToBean<SomeEntity>())
.List<SomeEntity>();
Ok, but what is an "IntTable"? It's the name of a SQL type created to pass table-valued arguments.
CREATE TYPE IntTable AS TABLE
(
ID INT
);
And some_sp could be like
CREATE PROCEDURE some_sp
@id IntTable READONLY,
@par1 ...
AS
BEGIN
...
END
It only works with SQL Server 2008 of course, and in this particular implementation, with a single-column DataTable:
var dt = new DataTable();
dt.Columns.Add("ID", typeof(int));
It's POC only, not a complete solution, but it works and might be useful when customized. If someone knows a better/shorter solution let us know.
A simpler solution than the accepted answer would be to use ADO.NET. NHibernate allows users to enlist IDbCommands into NHibernate transactions.
DataTable myIntsDataTable = new DataTable();
myIntsDataTable.Columns.Add("ID", typeof(int));
// ... Add rows to DataTable
ISession session = sessionFactory.OpenSession();
using(ITransaction transaction = session.BeginTransaction())
{
IDbCommand command = new SqlCommand("StoredProcedureName");
command.Connection = session.Connection;
command.CommandType = CommandType.StoredProcedure;
var parameter = new SqlParameter();
parameter.ParameterName = "IntTable";
parameter.SqlDbType = SqlDbType.Structured;
parameter.Value = myIntsDataTable;
command.Parameters.Add(parameter);
session.Transaction.Enlist(command);
command.ExecuteNonQuery();
}
For my case, my stored procedure needs to be called in the middle of an open transaction.
If there is an open transaction, this code works because it is automatically reusing the existing transaction of the NHibernate session:
NHibernateSession.GetNamedQuery("SaveStoredProc")
.SetInt64("spData", 500)
.ExecuteUpdate();
However, for my new Stored Procedure, the parameter is not as simple as an Int64. It's a table-valued-parameter (User Defined Table Type)
My problem is that I cannot find the proper Set function.
I tried SetParameter("spData", tvpObj), but it's returning this error:
Could not determine a type for class: …
Anyways, after some trial and error, this approach below seems to work.
The Enlist() function is the key in this approach. It basically tells the SQLCommand to use the existing transaction. Without it, there will be an error saying
ExecuteNonQuery requires the command to have a transaction when the
connection assigned to the command is in a pending local transaction…
using (SqlCommand cmd = NHibernateSession.Connection.CreateCommand() as SqlCommand)
{
cmd.CommandText = "MyStoredProc";
NHibernateSession.Transaction.Enlist(cmd); // Because there is a pending transaction
cmd.CommandType = CommandType.StoredProcedure;
cmd.Parameters.Add(new SqlParameter("#wiData", SqlDbType.Structured) { Value = wiSnSqlList });
int affected = cmd.ExecuteNonQuery();
}
Since I am using the SqlParameter class with this approach, SqlDbType.Structured is available.
This is the function that produces wiSnSqlList from wiSnList:
private IEnumerable<SqlDataRecord> TransformWiSnListToSql(IList<SHWorkInstructionSnapshot> wiSnList)
{
if (wiSnList == null)
{
yield break;
}
var schema = new[]
{
new SqlMetaData("OriginalId", SqlDbType.BigInt), //0
new SqlMetaData("ReportId", SqlDbType.BigInt), //1
new SqlMetaData("Description", SqlDbType.DateTime), //2
};
SqlDataRecord row = new SqlDataRecord(schema);
foreach (var wi in wiSnList)
{
row.SetSqlInt64(0, wi.OriginalId);
row.SetSqlInt64(1, wi.ShiftHandoverReportId);
if (wi.Description == null)
{
row.SetDBNull(2);
}
else
{
row.SetSqlString(2, wi.Description);
}
yield return row;
}
}
You can pass collections of values without the hassle.
Example:
var ids = new[] {1, 2, 3};
var query = session.CreateQuery("from Foo where id in (:ids)");
query.SetParameterList("ids", ids);
NHibernate will create a parameter for each element.
