Using lock in Hangfire-executed ASP.NET code - sql-server

I am using Hangfire in an ASP.NET MVC project to manage long-running background jobs.
I am trying to use a lock statement block around a database operation. Here is my lock statement code:
public class LockedTransaction
{
    private Object thisLock = new Object();

    public LockedTransaction() { }

    public void UpdateCustomerBalance(long CustomerId, decimal AmountToDeduct, string ConnectionString)
    {
        lock (thisLock)
        {
            using (SqlConnection connection = new SqlConnection(ConnectionString))
            {
                connection.Open();
                using (SqlTransaction transaction = connection.BeginTransaction(System.Data.IsolationLevel.ReadCommitted))
                {
                    using (SqlCommand command = new SqlCommand())
                    {
                        command.Connection = connection;
                        command.Transaction = transaction;
                        command.CommandText = "SELECT Balance FROM Customer WHERE Id=" + CustomerId;
                        var userBalance = Convert.ToDecimal(command.ExecuteScalar());
                        userBalance = userBalance - AmountToDeduct;
                        command.CommandText = "UPDATE Customer SET Balance=" + userBalance + " WHERE Id=" + CustomerId;
                        command.ExecuteNonQuery();
                        transaction.Commit();
                    }
                }
            }
        }
    }
}
Here is how I'm calling the above code:
foreach (var queue in queues)
{
    queue.Send();
    LockedTransaction lockedTransaction = new LockedTransaction();
    lockedTransaction.UpdateCustomerBalance(queue.CustomerId, queue.cost, "ConnectionString");
}
The problem is that the database value is not updated as expected. For example, I have 5 queues as follows:
queue[0].cost = 0.50;
queue[1].cost = 0.50;
queue[2].cost = 0.50;
queue[3].cost = 0.50;
queue[4].cost = 0.50;
The database value should be reduced by 2.50 (the cost total) after the loop completes. But that's not happening: sometimes the deducted value is 2.00, sometimes 1.50, etc.
Any suggestion?

Your lock object (thisLock) is an instance field. Because you create a new instance of LockedTransaction for each element in the foreach loop, the lock doesn't prevent concurrent executions: each call to UpdateCustomerBalance uses its own lock object.
Changing thisLock to a static field should help you:
private static Object thisLock = new Object();
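For illustration, here is the corrected class as a minimal sketch. The static lock is the fix described above; collapsing the read-modify-write into a single parameterized UPDATE is an extra hardening suggestion, not part of the original answer, and it also removes the SQL injection risk of the concatenated queries:
public class LockedTransaction
{
    // Static: shared across all instances, so concurrent jobs really do serialize here.
    private static readonly object thisLock = new object();

    public void UpdateCustomerBalance(long customerId, decimal amountToDeduct, string connectionString)
    {
        lock (thisLock)
        {
            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();
                // One atomic statement: no window between the read and the write,
                // so the deduction is safe even across multiple processes.
                using (var command = new SqlCommand(
                    "UPDATE Customer SET Balance = Balance - @Amount WHERE Id = @Id", connection))
                {
                    command.Parameters.AddWithValue("@Amount", amountToDeduct);
                    command.Parameters.AddWithValue("@Id", customerId);
                    command.ExecuteNonQuery();
                }
            }
        }
    }
}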

Related

Script task in SSIS package is executing but not performing the action

I have two SSIS packages which basically perform two actions, like below.
First the package truncates the contents of a table, and then it executes a script task, which basically calls an API and inserts the response into the table.
[Microsoft.SqlServer.Dts.Tasks.ScriptTask.SSISScriptTaskEntryPointAttribute]
public partial class ScriptMain : Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase
{
    public async void Main()
    {
        try
        {
            var sqlConn = new System.Data.SqlClient.SqlConnection();
            ConnectionManager cm = Dts.Connections["SurplusMouse_ADONET"];
            string serviceUrl = Dts.Variables["$Project::RM_ServiceUrl"].Value.ToString();
            ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12 | SecurityProtocolType.Tls11 | SecurityProtocolType.Tls;
            HttpClient client = new HttpClient();
            client.BaseAddress = new Uri(serviceUrl);
            client.DefaultRequestHeaders.Accept.Add(
                new MediaTypeWithQualityHeaderValue("application/json"));
            string APIUrl = string.Format(serviceUrl + "/gonogo");
            var response = await client.GetAsync(APIUrl);
            if (response.IsSuccessStatusCode)
            {
                var result = await response.Content.ReadAsStringAsync();
                try
                {
                    sqlConn = (System.Data.SqlClient.SqlConnection)cm.AcquireConnection(Dts.Transaction);
                    const string query = @"INSERT INTO [dbo].[RM_Approved_Room_State]
                        (APPROVED_ROOM_STATEID, SOURCE_ROOMID, DEST_ROOMID, ENTITY_TYPEID)
                        SELECT id, sourceRoomRefId, destinationRoomRefId, entityRefId
                        FROM OPENJSON(@json)
                        WITH (
                            id int,
                            sourceRoomRefId int,
                            destinationRoomRefId int,
                            entityRefId int
                        ) j;";
                    using (var sqlCmd = new System.Data.SqlClient.SqlCommand(query, sqlConn))
                    {
                        sqlCmd.Parameters.Add("@json", SqlDbType.NVarChar, -1).Value = result;
                        await sqlCmd.ExecuteNonQueryAsync();
                    }
                }
                catch (Exception ex)
                {
                    Dts.TaskResult = (int)ScriptResults.Failure;
                }
                finally
                {
                    if (sqlConn != null)
                        cm.ReleaseConnection(sqlConn);
                }
            }
        }
        catch (Exception ex)
        {
            Dts.TaskResult = (int)ScriptResults.Failure;
        }
    }

    #region ScriptResults declaration
    enum ScriptResults
    {
        Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success,
        Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
    };
    #endregion
}
Similar to the above package, I have another one which does much the same but inserts the response from a different endpoint into a different table. When I execute the packages locally, or execute them separately after deploying them to the server, they work fine. But when I add them to a SQL Server Agent job like below and run them on a schedule,
the jobs run successfully and don't show any errors, but I only see data from one package: the other one truncates the records, but I don't think its script task is getting executed, as I don't see any records inserted. I don't think there are any issues with access, because when I run them separately and manually the data gets inserted; it's just when running on a schedule that it doesn't work as expected. Any idea what could be happening here? Any help is greatly appreciated.

.Net Using Transactions with Prepared Statements for SqlClient

I'm trying to implement transactions for a script, but I've run into a strange issue.
When I attempt to run a prepared SQL statement inside a transaction, it fails, saying that the command needs a transaction when its connection is assigned one.
How does this work with prepared statements, though? I intend to have multiple transactions all using the same prepared statement.
My code is as follows:
class dbTest {
    public static SqlConnection db;
    public static SqlCommand query;

    static void Main(string[] args) {
        db = connect();
        prepare();
        transaction01();
        transaction02();
        transaction03();
    }

    public static void prepare() {
        query = new SqlCommand("select id from table where id = 1 for update", db);
        query.Prepare();
    }

    public static void transaction01() {
        SqlTransaction trans = db.BeginTransaction("Trn01");
        SqlDataReader result = query.ExecuteReader();
        while (result.Read()) { Console.WriteLine(result["id"]); }
        result.Close();
        trans.Commit();
    }

    public static void transaction02() {
        SqlTransaction trans = db.BeginTransaction("Trn02");
        SqlDataReader result = query.ExecuteReader();
        while (result.Read()) { Console.WriteLine(result["id"]); }
        result.Close();
        trans.Commit();
    }

    public static void transaction03() {
        SqlTransaction trans = db.BeginTransaction("Trn03");
        SqlDataReader result = query.ExecuteReader();
        while (result.Read()) { Console.WriteLine(result["id"]); }
        result.Close();
        trans.Commit();
    }
}
How do I assign the Transaction to an existing Prepared Statement?
UPDATE
Changed the above code to better show the issue. The SQL is prepared once, but I will be using it for multiple transactions (or at least I want to).
UPDATE AGAIN
I have marked an answer below as the correct one because it looks like the best way to achieve this, but for my needs in this very small example, using query.Transaction got it working:
public static void transaction01() {
    SqlTransaction trans = db.BeginTransaction("Trn01");
    query.Transaction = trans; // this line fixed it
    SqlDataReader result = query.ExecuteReader();
    while (result.Read()) { Console.WriteLine(result["id"]); }
    result.Close();
    trans.Commit();
}
When working with SqlTransaction, you must set SqlCommand.Transaction explicitly, even though enlisting in the current transaction is not optional in SQL Server.
select ... for update is not valid SQL Server syntax; instead, use the UPDLOCK hint to read a table and retain a restrictive lock for the duration of the transaction. E.g.
select id from table with (updlock) where id = 1
When I attempt to run a Prepared SQL Statement
It's rarely useful to use prepared statements with SQL Server. Query plan caching happens automatically even without them, and preparing really just reduces the size of the request on the network when you are executing a SqlCommand many times with differing parameters.
But a prepared SqlCommand is still bound to a single SqlConnection, which typically has a short lifetime, minimizing the potential benefit of preparing the SqlCommand.
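For reference, the pattern where preparing can pay off looks like this (a minimal sketch; the Foo table, its columns, and the foos collection are placeholders): prepare once on an open connection, with parameter types and sizes declared up front, then execute repeatedly with different values.
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("update Foo set Bar = @bar where Id = @id", conn))
{
    conn.Open();
    // Prepare() requires an open connection and explicitly typed/sized parameters.
    cmd.Parameters.Add("@bar", SqlDbType.NVarChar, 100);
    cmd.Parameters.Add("@id", SqlDbType.Int);
    cmd.Prepare();
    foreach (var foo in foos)
    {
        cmd.Parameters["@bar"].Value = foo.Bar;
        cmd.Parameters["@id"].Value = foo.Id;
        cmd.ExecuteNonQuery(); // reuses the prepared statement handle
    }
}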
You need to set SqlCommand.Transaction to your transaction object.
It is not necessary to prepare the statement with SQL Server; just keep executing.
Note also, as you can see in this post, that you must correctly dispose of all DB objects.
Here is your code cleaned up:
class dbTest {
    // DO NOT cache the connection object
    static void Main(string[] args) {
        using (var db = connect())
        using (var comm = GetCommand(db))
        {
            transaction01(comm);
            transaction02(comm);
            transaction03(comm);
        }
    }

    public static SqlCommand GetCommand(SqlConnection conn) {
        return new SqlCommand("select id from table with (updlock) where id = 1", conn);
    }

    public static void transaction01(SqlCommand comm) {
        using (SqlTransaction trans = comm.Connection.BeginTransaction("Trn01"))
        {
            comm.Transaction = trans;
            using (SqlDataReader result = comm.ExecuteReader())
                while (result.Read()) { Console.WriteLine(result["id"]); }
            trans.Commit();
        } // no need to close; using will sort that out
    }

    public static void transaction02(SqlCommand comm) {
        using (SqlTransaction trans = comm.Connection.BeginTransaction("Trn02"))
        {
            comm.Transaction = trans;
            using (SqlDataReader result = comm.ExecuteReader())
                while (result.Read()) { Console.WriteLine(result["id"]); }
            trans.Commit();
        } // no need to close; using will sort that out
    }

    public static void transaction03(SqlCommand comm) {
        using (SqlTransaction trans = comm.Connection.BeginTransaction("Trn03"))
        {
            comm.Transaction = trans;
            using (SqlDataReader result = comm.ExecuteReader())
                while (result.Read()) { Console.WriteLine(result["id"]); }
            trans.Commit();
        } // no need to close; using will sort that out
    }
}

NUnit database rollback

I'm fairly new to using NUnit as a test framework and have come across something I don't really understand.
I am writing an integration test which inserts a row into a database table. As I want to run this test repeatedly, I wanted to delete the row once the test has completed. The test runs fine; however, when I look in the database table, the row in question is still there, even though the delete command ran. I can even see it run when profiling the database during the test.
Does NUnit somehow roll back database transactions? If anyone has any ideas why I am seeing this happen, please let me know. I have been unable to find any information about NUnit rolling back transactions, which makes me think I'm doing something wrong. The test runs and the row appears in the database; it is just not deleted afterwards, even though the profiler shows the delete command being run.
Here is the test code I am running:
[Test]
[Category("Consultant split integration test")]
public void Should_Save_Splits()
{
    var splitList = new List<Split>();
    splitList.Add(new Split()
    {
        UnitGroupId = 69,
        ConsultantUserId = 1,
        CreatedByUserId = 1,
        CreatedOn = DateTime.Now,
        UpdatedByUserId = 1,
        UpdatedOn = DateTime.Now,
        Name = "Consultant1",
        Unit = "Unit1",
        Percentage = 100,
        PlacementId = 47
    });

    var connection = Helpers.GetCpeDevDatabaseConnection();
    var repository = new Placements.ConsultantSplit.DAL.PronetRepository();
    var notebookManager = Helpers.GetNotebookManager(connection);
    var userManager = Helpers.GetUserManager(connection);
    var placementManager = Helpers.GetPlacementManager(connection);
    var sut = new Placements.ConsultantSplit.ConsultantSplitManager(repository, connection, notebookManager, userManager, placementManager);
    IEnumerable<string> errors;
    sut.SaveSplits(splitList, out errors);

    try
    {
        using (connection.Connection)
        {
            using (var cmd = new SqlCommand("Delete from ConsultantSplit where placementid=47",
                connection.Connection))
            {
                connection.Open();
                cmd.CommandType = CommandType.Text;
                connection.UseTransaction = false;
                cmd.Transaction = connection.Transaction;
                cmd.ExecuteNonQuery();
                connection.Close();
            }
        }
    }
    catch (Exception exp)
    {
        throw new Exception(exp.Message);
    }
}

SQL Server: better to batch statements or foreach?

Hypothetically, is it better to send N statements to SQL Server (2008), or is it better to send one command comprising N statements? In either case, I am running the same statement over a list of objects, and in both cases I would be using named parameters. Suppose my use case is dumping a cache of log items every few hours.
foreach example
var sql = "update blah blah blah where id = @id";
using (var conn = GetConnection())
{
    foreach (var obj in myList)
    {
        var cmd = new SqlCommand()
            { CommandText = sql, Connection = conn };
        // add params from obj
        cmd.ExecuteNonQuery();
    }
}
batch example
var sql = @"
    update blah blah blah where id = @id1
    update blah blah blah where id = @id2
    update blah blah blah where id = @id3
    -- etc";
using (var conn = GetConnection())
{
    var cmd = new SqlCommand
        { CommandText = sql, Connection = conn };
    for (int i = 0; i < myList.Count; i++)
    {
        // add params: "id" + i from myList[i]
    }
    cmd.ExecuteNonQuery();
}
In time tests, the batch version took 15% longer than the foreach version for large inputs. I figure the batch version takes longer to execute because the server has to parse a huge statement and bind up to 2000 parameters. Supposing SQL Server is on the LAN, is there any advantage to using the batch method?
Your tests would seem to have given you the answer; however, let me add another consideration. It is preferable to encapsulate the update in a separate function and call it using a foreach:
private void UpdateFoo(int id)
{
    const string sql = "Update Foo Where Id = @Id";
    using (var conn = GetConnection())
    {
        conn.Open();
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@Id", id);
            cmd.ExecuteNonQuery();
        }
    }
}

private void UpdateLotsOfFoo()
{
    foreach (var foo in myList)
    {
        UpdateFoo(foo.Id);
    }
}
In this setup you are leveraging connection pooling, which mitigates the cost of opening and closing connections.
@Thomas - this design adds the overhead of opening and closing a connection on every iteration of the loop. That is not a preferred practice and should be avoided. The code below iterates the statements while using one connection, and will be easier on resources (both client- and server-side).
private void UpdateAllFoos()
{
    const string sql = "Update Foo Where Id = @Id";
    using (var conn = GetConnection())
    {
        conn.Open();
        foreach (var foo in myList)
        {
            using (var cmd = new SqlCommand(sql, conn))
            {
                cmd.Parameters.AddWithValue("@Id", foo.Id);
                cmd.ExecuteNonQuery();
            }
        }
        conn.Close();
    }
}
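A further refinement along the same lines (a sketch, not part of either original answer) is to create the command once and reuse it across iterations, swapping only the parameter value, so neither the connection nor the command is rebuilt per row:
private void UpdateAllFoosReusingCommand()
{
    const string sql = "Update Foo Where Id = @Id";
    using (var conn = GetConnection())
    using (var cmd = new SqlCommand(sql, conn))
    {
        conn.Open();
        cmd.Parameters.Add("@Id", SqlDbType.Int); // declared once
        foreach (var foo in myList)
        {
            cmd.Parameters["@Id"].Value = foo.Id; // only the value changes per row
            cmd.ExecuteNonQuery();
        }
    }
}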

Is it possible to use `SqlDbType.Structured` to pass Table-Valued Parameters in NHibernate?

I want to pass a collection of ids to a stored procedure that will be mapped using NHibernate. This technique was introduced in SQL Server 2008 (more info here => Table-Valued Parameters). I just don't want to pass multiple ids within an nvarchar parameter and then chop its value up on the SQL Server side.
My first, ad hoc, idea was to implement my own IType.
public class Sql2008Structured : IType {
    private static readonly SqlType[] x = new[] { new SqlType(DbType.Object) };

    public SqlType[] SqlTypes(NHibernate.Engine.IMapping mapping) {
        return x;
    }

    public bool IsCollectionType {
        get { return true; }
    }

    public int GetColumnSpan(NHibernate.Engine.IMapping mapping) {
        return 1;
    }

    public void NullSafeSet(DbCommand st, object value, int index, NHibernate.Engine.ISessionImplementor session) {
        var s = st as SqlCommand;
        if (s != null) {
            s.Parameters[index].SqlDbType = SqlDbType.Structured;
            s.Parameters[index].TypeName = "IntTable";
            s.Parameters[index].Value = value;
        }
        else {
            throw new NotImplementedException();
        }
    }

    #region IType Members...
    #region ICacheAssembler Members...
}
No other methods are implemented; a throw new NotImplementedException(); sits in all the rest. Next, I created a simple extension for IQuery:
public static class StructuredExtensions {
    private static readonly Sql2008Structured structured = new Sql2008Structured();

    public static IQuery SetStructured(this IQuery query, string name, DataTable dt) {
        return query.SetParameter(name, dt, structured);
    }
}
Typical usage for me is:
DataTable dt = ...;
ISession s = ...;
var l = s.CreateSQLQuery("EXEC some_sp @id = :id, @par1 = :par1")
    .SetStructured("id", dt)
    .SetParameter("par1", ...)
    .SetResultTransformer(Transformers.AliasToBean<SomeEntity>())
    .List<SomeEntity>();
OK, but what is an "IntTable"? It's the name of a SQL type created to pass table-valued arguments:
CREATE TYPE IntTable AS TABLE
(
    ID INT
);
And some_sp could be like:
CREATE PROCEDURE some_sp
    @id IntTable READONLY,
    @par1 ...
AS
BEGIN
    ...
END
It only works with SQL Server 2008 of course, and in this particular implementation with a single-column DataTable:
var dt = new DataTable();
dt.Columns.Add("ID", typeof(int));
It's a POC only, not a complete solution, but it works and might be useful when customized. If someone knows a better/shorter solution, let us know.
A simpler solution than the accepted answer would be to use ADO.NET directly. NHibernate allows users to enlist IDbCommands into NHibernate transactions:
DataTable myIntsDataTable = new DataTable();
myIntsDataTable.Columns.Add("ID", typeof(int));
// ... Add rows to DataTable

ISession session = sessionFactory.GetSession();
using (ITransaction transaction = session.BeginTransaction())
{
    IDbCommand command = new SqlCommand("StoredProcedureName");
    command.Connection = session.Connection;
    command.CommandType = CommandType.StoredProcedure;

    var parameter = new SqlParameter();
    parameter.ParameterName = "IntTable";
    parameter.SqlDbType = SqlDbType.Structured;
    parameter.Value = myIntsDataTable;
    command.Parameters.Add(parameter);

    session.Transaction.Enlist(command);
    command.ExecuteNonQuery();
}
For my case, my stored procedure needs to be called in the middle of an open transaction.
If there is an open transaction, this code works because it automatically reuses the existing transaction of the NHibernate session:
NHibernateSession.GetNamedQuery("SaveStoredProc")
    .SetInt64("spData", 500)
    .ExecuteUpdate();
However, for my new stored procedure the parameter is not as simple as an Int64; it's a table-valued parameter (a user-defined table type).
My problem is that I cannot find the proper Set function.
I tried SetParameter("spData", tvpObj), but it returns this error:
Could not determine a type for class: …
Anyway, after some trial and error, the approach below seems to work.
The Enlist() function is the key in this approach. It basically tells the SqlCommand to use the existing transaction. Without it, there will be an error saying
ExecuteNonQuery requires the command to have a transaction when the
connection assigned to the command is in a pending local transaction…
using (SqlCommand cmd = NHibernateSession.Connection.CreateCommand() as SqlCommand)
{
    cmd.CommandText = "MyStoredProc";
    NHibernateSession.Transaction.Enlist(cmd); // Because there is a pending transaction
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.Add(new SqlParameter("@wiData", SqlDbType.Structured) { Value = wiSnSqlList });
    int affected = cmd.ExecuteNonQuery();
}
Since I am using the SqlParameter class with this approach, SqlDbType.Structured is available.
This is the function that produces the wiSnSqlList records from wiSnList:
private IEnumerable<SqlDataRecord> TransformWiSnListToSql(IList<SHWorkInstructionSnapshot> wiSnList)
{
    if (wiSnList == null)
    {
        yield break;
    }
    var schema = new[]
    {
        new SqlMetaData("OriginalId", SqlDbType.BigInt),        //0
        new SqlMetaData("ReportId", SqlDbType.BigInt),          //1
        new SqlMetaData("Description", SqlDbType.NVarChar, -1), //2 nvarchar(max); must match the table type
    };
    SqlDataRecord row = new SqlDataRecord(schema);
    foreach (var wi in wiSnList)
    {
        row.SetSqlInt64(0, wi.OriginalId);
        row.SetSqlInt64(1, wi.ShiftHandoverReportId);
        if (wi.Description == null)
        {
            row.SetDBNull(2);
        }
        else
        {
            row.SetSqlString(2, wi.Description);
        }
        yield return row;
    }
}
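For completeness, the streamed records can be fed straight into the structured parameter (a sketch; dbo.WiSnapshotType is a hypothetical name that must match a user-defined table type already created on the server). Setting TypeName is required when the command is ad hoc SQL, and optional when it is a stored procedure:
var wiSnSqlList = TransformWiSnListToSql(wiSnList); // lazily enumerated by ADO.NET
cmd.Parameters.Add(new SqlParameter("@wiData", SqlDbType.Structured)
{
    TypeName = "dbo.WiSnapshotType", // hypothetical; must match the server-side table type
    Value = wiSnSqlList
});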
You can pass collections of values without the hassle.
Example:
var ids = new[] {1, 2, 3};
var query = session.CreateQuery("from Foo where id in (:ids)");
query.SetParameterList("ids", ids);
NHibernate will create a parameter for each element.
