I am using ServiceStack.OrmLite for SQL Server and just updated from 3.9.71 to 4.0.33.0, and now transactions for direct commands are failing. I can get OrmLite transactions working, or direct commands, but not both.
The complication is that I am doing some very complicated DB commands, and since Sql.In() is massively slow for a large list of GUIDs, I have a workaround that uses db.CreateCommand() and then passes the GUID list in as a custom table type.
Thus I need a single transaction that spans both OrmLite commands and direct DB commands.
For instance, the following code used to work. Now I get errors saying that the command created by CreateCommand() should use the transaction, and when I try to assign it directly I get the indicated cast exception:
using (var db = DB.Connection.OpenDbConnection())
{
using (var transaction = db.OpenTransaction())
{
// Some ORMLite code
db.Delete<SomeType>();
using (var command = db.CreateCommand())
{
// Direct DB command
command.CommandText = "Delete from SomeTable where ...";
command.Parameters.Add(GUIDList);
command.ExecuteNonQuery();
}
}
}
Clarification:
In the code above, OpenTransaction() works for the OrmLite code but fails on the CreateCommand() code, while BeginTransaction() fails for the OrmLite code.
The actual error is at command.ExecuteNonQuery(): ExecuteNonQuery requires the command to have a transaction when the connection assigned to the command is in a pending local transaction. The Transaction property of the command has not been initialized.
To use Transactions in OrmLite you should use the OpenTransaction() API, e.g:
using (var trans = db.OpenTransaction())
{
//...
}
I've added a couple of new APIs to be able to use an OrmLite transaction with a raw ADO.NET IDbCommand in this commit.
Use a managed OrmLite DB Command
Use a managed OrmLite command with OpenCommand(), which will automatically pre-populate the DB command with the current transaction, e.g.:
using (var trans = db.OpenTransaction())
using (var command = db.OpenCommand())
{
command.CommandText = "Delete from SomeTable where ...";
}
Manually assign underlying DB Transaction
When using the underlying ADO.NET IDbCommand you will also need to manually assign the Transaction to the command yourself, e.g.:
using (var trans = db.OpenTransaction())
using (var command = db.CreateCommand())
{
command.Transaction = trans.ToDbTransaction();
command.CommandText = "Delete from SomeTable where ...";
}
The ToDbTransaction() extension method lets you access the underlying ADO.NET IDbTransaction which is required when using the underlying ADO.NET IDbCommand.
Both of these new APIs are available from v4.0.34+, which is now available on MyGet.
Here is my suggestion that works. It is based on the previous answers.
IDbConnection conn = DB.Connection;
IDbCommand cmd = conn.CreateCommand();
using (IDbTransaction transaction = conn.OpenTransaction())
{
//ADO.NET code
cmd.Transaction = transaction.ToDbTransaction();
cmd.CommandText = "...Some sql text";
cmd.ExecuteNonQuery();
// Some ORMLite code
conn.Delete<SomeType>();
}
Related
I am using dependency injection with Entity Framework Core and the context is created through a scope factory:
using (var scope = this._scopeFactory.CreateScope())
{
var context = scope.ServiceProvider.GetRequiredService<thecontext>();
//run code here
}
A transaction is started after all the objects are created:
using (var trans = context.Database.BeginTransaction())
{
try
{
This works well, but during the process I need to insert records by running a stored procedure outside the transaction. In other words, even if the transaction is aborted, I still need the results of this stored procedure's inserts to persist. In addition, any inserts from the 'isolated' stored procedure must be available to the process running under the transaction, as well as outside the current scope.
The normal method of executing the stored procedure is to get the connection and attach the current transaction to a new command. But I need to either use a new connection that is outside the current scope, so it is not bound by the transaction, or perhaps there is another way?
string sql = $"theProc";
var cnn = _context.Database.GetDbConnection();
if (cnn.State == ConnectionState.Closed)
cnn.Open();
DbCommand cmd = cnn.CreateCommand();
cmd.CommandText = sql;
cmd.CommandType = CommandType.StoredProcedure;
if (_context.Database.CurrentTransaction != null)
cmd.Transaction = _context.Database.CurrentTransaction.GetDbTransaction();
I ended up doing something like:
var cnnstr = _context.Database.GetConnectionString();
using (var isolatedCnn = new SqlConnection(cnnstr))
{
isolatedCnn.Open();
// ... run the stored procedure on this separate connection, outside the ambient transaction
}
This works but is ugly for a number of reasons so I'd like to find a better solution.
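One way to make the workaround a little less ugly is to confine the isolated connection to a small helper method, so its lifetime is obvious and it is always disposed. This is only a sketch under the same assumption as above (a completely separate connection, and therefore a separate session and transaction, is acceptable); the method name RunIsolatedProc is illustrative, and GetConnectionString()/SqlConnection are used exactly as in the snippets above.

// Sketch only: requires System.Data, a SqlClient package, and EF Core's
// relational extensions for GetConnectionString().
private void RunIsolatedProc(string procName)
{
    var cnnstr = _context.Database.GetConnectionString();

    using (var isolatedCnn = new SqlConnection(cnnstr))
    using (var cmd = isolatedCnn.CreateCommand())
    {
        isolatedCnn.Open();

        cmd.CommandText = procName;
        cmd.CommandType = CommandType.StoredProcedure;

        // No Transaction is assigned here, so the proc runs and commits
        // independently of the ambient EF Core transaction; its inserts
        // persist even if that transaction is rolled back.
        cmd.ExecuteNonQuery();
    }
}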
When one executes a stored proc against MS SQL Server, the recordset comes with some type info. Clients are capable of retrieving that type info, e.g. like this (C#/.NET):
SqlCommand cmd = new SqlCommand("dbo.MyProc", conn);
SqlDataAdapter ada = new SqlDataAdapter(cmd);
DataSet ds = new DataSet();
ada.Fill(ds);
string ColName = ds.Tables[0].Columns[0].ColumnName;
Type ColType = ds.Tables[0].Columns[0].DataType;
What about nullability of those columns? While it's not knowable in the general case, sometimes it is - for example, when a recordset column comes straight from a table field. If SQL Server is smart enough, it can determine column nullability at least for those cases. Does SQL Server report nullability to the clients in whatever metadata it provides along with the recordset?
Specifically in the .NET client, there is AllowDBNull in retrieved column properties, but in the scenario above it's unreliable - it comes across as true both for columns that came from nullable fields, and for columns that came from nonnullable fields. Is this a limitation of the .NET client, or a shortcoming of the underlying protocol?
This is a limitation of SqlDataAdapter.
The TDS protocol does return this information, as you can see in the specification.
In turn, SqlDataReader will return this information via GetSchemaTable. But it does not make its way to a table filled using a DbDataAdapter.
You can see this with the following code:
using (var conn = new SqlConnection(YourConnectionString))
{
conn.Open();
using (var comm = new SqlCommand("select 1", conn))
using (var reader = comm.ExecuteReader())
{
reader.Read();
reader.GetSchemaTable().Dump(); // Dump() is LINQPad's extension method
}
using (var comm = new SqlCommand("select case when getdate() = 1 then 1 end", conn))
using (var reader = comm.ExecuteReader())
{
reader.Read();
reader.GetSchemaTable().Dump();
}
}
You will see that the first has AllowDBNull as false and the second as true.
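If you are not in LINQPad, the same information can be read programmatically from the schema table. Here is a minimal console sketch of that, assuming a placeholder connection string and reusing the two example queries from above:

using System;
using System.Data;
using System.Data.SqlClient;

class SchemaDemo
{
    static void Main()
    {
        using (var conn = new SqlConnection("YourConnectionString"))
        {
            conn.Open();
            using (var comm = new SqlCommand(
                "select 1 as A, case when getdate() = 1 then 1 end as B", conn))
            using (var reader = comm.ExecuteReader())
            {
                // GetSchemaTable returns one row per result column; ColumnName,
                // DataType and AllowDBNull are standard schema table columns.
                DataTable schema = reader.GetSchemaTable();
                foreach (DataRow col in schema.Rows)
                {
                    Console.WriteLine("{0} ({1}) AllowDBNull = {2}",
                        col["ColumnName"], col["DataType"], col["AllowDBNull"]);
                }
            }
        }
    }
}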
I would advise you in any case to avoid SqlDataAdapter, as it is only actually useful in data-binding UI scenarios. In back-end code, just use a SqlDataReader, or even better: use an ORM such as Dapper or Entity Framework, which will sort out all of this kind of thing for you.
I have a procedure in Snowflake and would like to call it from my Timer-Triggered Azure Function App.
That procedure expects a parameter of type string. The following is my code snippet that connects to Snowflake and calls that procedure with the parameter.
using (IDbConnection conn = new SnowflakeDbConnection())
{
//Connect to Snowflake
conn.ConnectionString = Environment.GetEnvironmentVariable("SnowflakeConnection");
conn.Open();
using (IDbCommand cmd = conn.CreateCommand())
{
if (conn.State == ConnectionState.Open)
{
cmd.CommandText = "SP_Snowflake_Procedure";
//cmd.CommandType = CommandType.StoredProcedure;
var date = cmd.CreateParameter();
date.ParameterName = "RUNDATE";
date.DbType = DbType.String;
date.Value = "2018-01-01";
cmd.Parameters.Add(date);
using (IDataReader dr = cmd.ExecuteReader())
{
/****************
Logic to work on data
received from SP
*****************/
}
}
}
}
When control reaches cmd.ExecuteReader(), it fails with the error:
Snowflake.Data: SQL compilation error: syntax error line 1 at position 0 unexpected 'SP_Snowflake_Procedure'.
I don't understand how to call a procedure in Snowflake. I assumed it would work much like MS SQL Server, but I was wrong, and I couldn't find any proper documentation on it either.
The same code works fine if I replace the procedure call with a simple SELECT statement.
Can anyone suggest what I should change here?
I can't tell from the code if you're using the ODBC driver for Snowflake or the .NET driver for Snowflake. The ODBC driver supports more features than the .NET driver, but I think executing SPs should be supported in both.
You'll need to make the call using a SQL statement that is executed as a query (as opposed to methods that execute a non-query). It will return a table with a single row holding the return value from the SP, in a single column named after the SP (basically what would be returned to the SQL worksheet if it were run in the web UI).
Here's a sample SP to test in case you need a simple one:
create or replace procedure EchoString(stringValue string)
returns VARCHAR
language JavaScript
as
$$
// Note that variables passed to Snowflake stored procedures
// must be all CAPITAL letters when used in the body of the
// procedure code.
return STRINGVALUE
$$;
--Run the stored procedure to echo the value.
call EchoString('Echo this string.');
Here's how to call the SP from a C# project using an ODBC connection:
OdbcConnection DbConnection = new OdbcConnection("DSN=Snowflake;pwd=******");
OdbcCommand DbCommandSetup = DbConnection.CreateCommand();
DbConnection.Open();
// These two lines are only required if you get a message about no running warehouse.
// It will depend on how your calling user is set up in Snowflake.
DbCommandSetup.CommandText = "use warehouse TEST;";
DbCommandSetup.ExecuteNonQuery();
OdbcCommand DbCommand = DbConnection.CreateCommand();
DbCommand.CommandText = "call TEST.PUBLIC.ECHOSTRING('Echo this string.')";
OdbcDataReader DbReader = DbCommand.ExecuteReader();
// Note: If you define a Snowflake SP, DB, or schema in mixed case without double quoting
// the name, Snowflake will uppercase it in the catalog. You can call it from here without
// converting to upper case as long as it's not double quoted (escaped \") in the string.
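For completeness, here is a hedged sketch of the same kind of call using the .NET connector the question already uses (SnowflakeDbConnection). The key point is simply that the CommandText must be a CALL statement executed as a query; whether bind variables are accepted inside CALL depends on your connector and Snowflake version, so the parameter value is inlined here as a literal.

using (IDbConnection conn = new SnowflakeDbConnection())
{
    conn.ConnectionString = Environment.GetEnvironmentVariable("SnowflakeConnection");
    conn.Open();

    using (IDbCommand cmd = conn.CreateCommand())
    {
        // Snowflake has no EXEC-style invocation; procedures are run with CALL.
        cmd.CommandText = "call SP_Snowflake_Procedure('2018-01-01')";

        // The result is a single-row, single-column table holding the
        // procedure's return value.
        using (IDataReader dr = cmd.ExecuteReader())
        {
            while (dr.Read())
            {
                Console.WriteLine(dr.GetValue(0));
            }
        }
    }
}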
I have used SqlDependency with SignalR to show alerts to users. The code is as follows:
public IEnumerable<AlertInfo> GetData(long UserId)
{
using (var connection = new SqlConnection(ConfigurationManager.ConnectionStrings["yafnet"].ConnectionString))
{
connection.Open();
using (SqlCommand command = new SqlCommand(@"SELECT [AlertID],[AlertNote],[AlertDetails],[AlertDate],[Location]
FROM [dbo].[Alerts] where [UserID]=" + UserId + " AND [IsViewed]=0", connection))
{
// Make sure the command object does not already have
// a notification object associated with it.
command.Notification = null;
SqlDependency.Stop(ConfigurationManager.ConnectionStrings["yafnet"].ConnectionString);
SqlDependency.Start(ConfigurationManager.ConnectionStrings["yafnet"].ConnectionString);
SqlDependency dependency = new SqlDependency(command);
dependency.OnChange += new OnChangeEventHandler(dependency_OnChange);
if (connection.State == ConnectionState.Closed)
connection.Open();
using (var reader = command.ExecuteReader())
return reader.Cast<IDataRecord>()
.Select(x => new AlertInfo()
{
AlertID = x.GetInt64(0),
AlertNote = x.GetString(1),
AlertDetails = x.GetString(2),
AlertDate = x.GetDateTime(3),
Location = x.GetString(4)
}).ToList();
}
}
}
It is working fine on localhost. But after uploading to Azure server, this method throws the following error:
Message":"An error has occurred.","ExceptionMessage":"Statement 'RECEIVE MSG' is not supported
in this version of SQL Server.","ExceptionType":"System.Data.SqlClient.SqlException","StackTrace":"
\r\nServer stack trace: \r\n at System.Data.SqlClient.SqlConnection.OnError(SqlException exception
, Boolean breakConnection, Action`1 wrapCloseInAction)
What could be the issue?
Actually your SQL Server database must have is_broker_enabled = 1.
You need to check whether it's enabled or not.
To verify this, use the command SELECT name, is_broker_enabled FROM sys.databases.
If your database shows "1" it's okay; if it shows "0" then you must enable it using the command ALTER DATABASE yourdb SET ENABLE_BROKER.
But the bad news is that Azure SQL Database may show it as enabled, yet it no longer supports the Service Broker feature that SqlDependency relies on.
To use SqlDependency, you need to install a full instance of SQL Server, for example on an Azure VM.
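If you want to fail fast rather than discover this at runtime, here is a small sketch that performs the suggested check from code before calling SqlDependency.Start. The connection string name is taken from the question's code; the exception thrown is illustrative.

using (var conn = new SqlConnection(ConfigurationManager.ConnectionStrings["yafnet"].ConnectionString))
{
    conn.Open();
    using (var cmd = new SqlCommand(
        "SELECT is_broker_enabled FROM sys.databases WHERE name = DB_NAME()", conn))
    {
        // is_broker_enabled is a bit column, so ExecuteScalar returns a boxed bool.
        bool brokerEnabled = (bool)cmd.ExecuteScalar();
        if (!brokerEnabled)
        {
            // Service Broker is off (or unsupported, as on Azure SQL Database),
            // so SqlDependency notifications will not work against this database.
            throw new InvalidOperationException("Service Broker is not enabled for this database.");
        }
    }
}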
As demonstrated by previous Stack Overflow questions (TransactionScope and Connection Pooling and How does SqlConnection manage IsolationLevel?), the transaction isolation level leaks across pooled connections with SQL Server and ADO.NET (also System.Transactions and EF, because they build on top of ADO.NET).
This means, that the following dangerous sequence of events can happen in any application:
A request happens which requires an explicit transaction (e.g. at the Serializable level) to ensure data consistency.
Any other request comes in which does not use an explicit transaction because it is only doing uncritical reads. This request will now execute as Serializable, potentially causing dangerous blocking and deadlocks.
The question: What is the best way to prevent this scenario? Is it really required to use explicit transactions everywhere now?
Here is a self-contained repro. You will see that the third query will have inherited the Serializable level from the second query.
using System;
using System.Data.SqlClient;
using System.Transactions;

class Program
{
static void Main(string[] args)
{
RunTest(null);
RunTest(IsolationLevel.Serializable);
RunTest(null);
Console.ReadKey();
}
static void RunTest(IsolationLevel? isolationLevel)
{
using (var tran = isolationLevel == null ? null : new TransactionScope(TransactionScopeOption.Required, new TransactionOptions() { IsolationLevel = isolationLevel.Value }))
using (var conn = new SqlConnection("Data Source=(local); Integrated Security=true; Initial Catalog=master;"))
{
conn.Open();
var cmd = new SqlCommand(@"
select
case transaction_isolation_level
WHEN 0 THEN 'Unspecified'
WHEN 1 THEN 'ReadUncommitted'
WHEN 2 THEN 'ReadCommitted'
WHEN 3 THEN 'RepeatableRead'
WHEN 4 THEN 'Serializable'
WHEN 5 THEN 'Snapshot'
end as lvl, @@SPID
from sys.dm_exec_sessions
where session_id = @@SPID", conn);
using (var reader = cmd.ExecuteReader())
{
while (reader.Read())
{
Console.WriteLine("Isolation Level = " + reader.GetValue(0) + ", SPID = " + reader.GetValue(1));
}
}
if (tran != null) tran.Complete();
}
}
}
Output:
Isolation Level = ReadCommitted, SPID = 51
Isolation Level = Serializable, SPID = 51
Isolation Level = Serializable, SPID = 51 //leaked!
The connection pool calls sp_resetconnection before recycling a connection. Resetting the transaction isolation level is not in the list of things that sp_resetconnection does. That would explain why "serializable" leaks across pooled connections.
I guess you could start each query by making sure it's at the right isolation level:
if not exists (
select *
from sys.dm_exec_sessions
where session_id = @@SPID
and transaction_isolation_level = 2
)
set transaction isolation level read committed
Another option: connections with a different connection string do not share a connection pool. So if you use another connection string for the "serializable" queries, they won't share a pool with the "read committed" queries. An easy way to alter the connection string is to use a different login. You could also add a random option like Persist Security Info=False;.
Finally, you could make sure every "serializable" query resets the isolation level before it returns. If a "serializable" query fails to complete, you could clear the connection pool to force the tainted connection out of the pool:
SqlConnection.ClearPool(yourSqlConnection);
This is potentially expensive, but failing queries are rare, so you should not have to call ClearPool() often.
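A minimal sketch of that last idea: reset the level on success, and clear the pool on failure so the tainted connection is not reused. RunSerializableWork is a hypothetical placeholder for your own transactional code; connString is whatever connection string you already use.

using (var conn = new SqlConnection(connString))
{
    conn.Open();
    try
    {
        using (var tran = conn.BeginTransaction(IsolationLevel.Serializable))
        {
            RunSerializableWork(conn, tran);   // hypothetical helper
            tran.Commit();
        }

        // Success path: put the session back to READ COMMITTED so the pooled
        // connection is not handed out in a Serializable state.
        using (var reset = new SqlCommand(
            "SET TRANSACTION ISOLATION LEVEL READ COMMITTED;", conn))
        {
            reset.ExecuteNonQuery();
        }
    }
    catch
    {
        // Failure path: evict the pool this connection belongs to, as suggested above.
        SqlConnection.ClearPool(conn);
        throw;
    }
}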
In SQL Server 2014 this seems to have been fixed, provided you are using TDS protocol 7.3 or higher.
Running on SQL Server version 12.0.2000.8 the output is:
ReadCommitted
Serializable
ReadCommitted
Unfortunately this change is not mentioned in any documentation such as:
Behavior Changes to Database Engine Features in SQL Server 2014
Breaking Changes to Database Engine Features in SQL Server 2014
But the change has been documented on a Microsoft Forum.
Update 2017-03-08
Unfortunately this was later "unfixed" in SQL Server 2014 CU6 and SQL Server 2014 SP1 CU1 since it introduced a bug:
FIX: The transaction isolation level is reset incorrectly when the SQL Server connection is released in SQL Server 2014
"Assume that you use the TransactionScope class in SQL Server client-side source code, and you do not explicitly open the SQL Server connection in a transaction. When the SQL Server connection is released, the transaction isolation level is reset incorrectly."
Workaround
It appears that, since passing through a parameter makes the driver use sp_executesql, this forces a new scope, similar to a stored procedure. The scope is rolled back after the end of the batch.
Therefore, to avoid the leak, pass through a dummy parameter, as shown below.
using (var conn = new SqlConnection(connString))
using (var comm = new SqlCommand(@"
SELECT transaction_isolation_level FROM sys.dm_exec_sessions where session_id = @@SPID
", conn))
{
conn.Open();
Console.WriteLine(comm.ExecuteScalar());
}
using (var conn = new SqlConnection(connString))
using (var comm = new SqlCommand(@"
SET TRANSACTION ISOLATION LEVEL SNAPSHOT;
SELECT transaction_isolation_level FROM sys.dm_exec_sessions where session_id = @@SPID
", conn))
{
comm.Parameters.Add("#dummy", SqlDbType.Int).Value = 0; // see with and without
conn.Open();
Console.WriteLine(comm.ExecuteScalar());
}
using (var conn = new SqlConnection(connString))
using (var comm = new SqlCommand(@"
SELECT transaction_isolation_level FROM sys.dm_exec_sessions where session_id = @@SPID
", conn))
{
conn.Open();
Console.WriteLine(comm.ExecuteScalar());
}
For those using EF in .NET, you can fix this for your whole application by setting a different appname per isolation level (as also stated by @Andomar):
//prevent isolationlevel leaks
//https://stackoverflow.com/questions/9851415/sql-server-isolation-level-leaks-across-pooled-connections
public static DataContext CreateContext()
{
string isolationlevel = Transaction.Current?.IsolationLevel.ToString();
string connectionString = ConfigurationManager.ConnectionStrings["yourconnection"].ConnectionString;
connectionString = Regex.Replace(connectionString, "APP=([^;]+)", "App=$1-" + isolationlevel, RegexOptions.IgnoreCase);
return new DataContext(connectionString);
}
Strange this is still an issue 8 years later ...
I just asked a question on this topic and added a piece of C# code which can help work around this problem (i.e. change the isolation level for one transaction only).
Change isolation level in individual ADO.NET transactions only
It is basically a class to be wrapped in a 'using' block, which queries the original isolation level beforehand and restores it afterwards.
It does, however, require two additional round trips to the DB to check and restore the default isolation level, and I am not absolutely sure that it will never leak the altered isolation level, although I see very little danger of that.
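For reference, here is a minimal sketch of the kind of wrapper described above. The class and member names are illustrative (this is not the code from the linked question), and the two SET statements account for the two extra round trips mentioned.

using System;
using System.Data.SqlClient;

public sealed class IsolationLevelScope : IDisposable
{
    private readonly SqlConnection _conn;
    private readonly string _originalLevel;

    public IsolationLevelScope(SqlConnection conn, string newLevel)
    {
        _conn = conn;

        // Round trip 1: read the session's current isolation level.
        using (var cmd = new SqlCommand(@"
            SELECT CASE transaction_isolation_level
                   WHEN 1 THEN 'READ UNCOMMITTED'
                   WHEN 2 THEN 'READ COMMITTED'
                   WHEN 3 THEN 'REPEATABLE READ'
                   WHEN 4 THEN 'SERIALIZABLE'
                   WHEN 5 THEN 'SNAPSHOT'
                   END
            FROM sys.dm_exec_sessions WHERE session_id = @@SPID", _conn))
        {
            // Fall back to the default if the level comes back as Unspecified.
            _originalLevel = cmd.ExecuteScalar() as string ?? "READ COMMITTED";
        }

        // Switch to the requested level for the duration of the scope.
        using (var cmd = new SqlCommand("SET TRANSACTION ISOLATION LEVEL " + newLevel, _conn))
        {
            cmd.ExecuteNonQuery();
        }
    }

    public void Dispose()
    {
        // Round trip 2: restore whatever level was in effect before the scope,
        // so the pooled connection is returned in a known state.
        using (var cmd = new SqlCommand("SET TRANSACTION ISOLATION LEVEL " + _originalLevel, _conn))
        {
            cmd.ExecuteNonQuery();
        }
    }
}

Usage would then look like: using (new IsolationLevelScope(conn, "SERIALIZABLE")) { /* your query */ }.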