Replacing SQL Server IDENTITY due to an autogenerated gap issue

I have an Invoice database with an ID IDENTITY column that SQL Server autogenerates, incrementing by one (+1) each time a new record is created by a LINQ insert.
The code I am currently using to create a new record is posted below; the incremental ID is autogenerated by SQL Server.
public async Task<IActionResult> Create([Bind(
    "PurchaseOrder,InvDate,DelDate,PaidDate,AgentName,FullName,FirstName,LastName,CustId,CompanyName,ClientRole,Email,Phone,Address," +
    "City,State,Zip,Country,ProdCode,Description,Quantity,UnitPrice,LineTotal,OrderTotal,Tax,Discount,Credit," +
    "Shipping,GrandTotal,Deposit,AmtDue,DiscAmt,Notes,Published,Year,Expenses,ProjectDescription,ClientProvision")]
    CreateNewOrderViewModel cnq)
{
    try
    {
        await _context.AddAsync(cnq);
        await _context.SaveChangesAsync();
    }
    catch (InvalidCastException e)
    {
        ViewBag.Result = $"Database insert failed with: {e}";
        return View();
    }
    return View(); // every code path must return an IActionResult
}
My issue is with the SQL Server ID IDENTITY. Every time the server is rebooted, my ID IDENTITY value jumps by 1000 instead of the default increment of 1. For example, if my last record was 1001, the next record created will be 2001 instead of 1002. This happens every time the server is updated and needs to be rebooted. I searched for an answer and discovered that this is known SQL Server behavior: identity values are pre-allocated in a cache for performance, and an unexpected shutdown can discard the cached values, leaving a gap.
Since I am on a shared hosting server, I do not have full control of the database; I only have dbo rights to my own database. I was wondering whether there is a way to use LINQ to generate the incremental value for a new InvID column that I can then use as the record ID, instead of the SQL Server generated value.

If you don't have control over the database, you cannot simply disable identity caching (on a server you control, ALTER DATABASE SCOPED CONFIGURATION SET IDENTITY_CACHE = OFF does this on SQL Server 2017+), so you need to manually read the largest Id from the table and increment it by 1 for the new record.
Sample code to retrieve the last Id:
// Max with a nullable cast is null-safe: an empty table yields 0 instead of throwing
int lastId = _context.TableA.Max(t => (int?)t.Id) ?? 0;
int newId = lastId + 1;
// Add the new record with newId
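Note that two concurrent requests could read the same maximum and compute the same newId. A minimal sketch of one way to guard against that, assuming an EF Core 3+ context, the TableA/Id names from above, and that Id is now a plain int column rather than an IDENTITY (explicit values cannot be inserted into an IDENTITY column without IDENTITY_INSERT):
using Microsoft.EntityFrameworkCore; // for MaxAsync and the transaction extensions

using (var tx = await _context.Database.BeginTransactionAsync(
           System.Data.IsolationLevel.Serializable))
{
    // read the current maximum inside a serializable transaction so a
    // concurrent request cannot compute the same value
    int lastId = await _context.TableA.MaxAsync(t => (int?)t.Id) ?? 0;

    entity.Id = lastId + 1; // entity is a placeholder for the record being created
    _context.TableA.Add(entity);

    await _context.SaveChangesAsync();
    await tx.CommitAsync();
}
Serializable isolation trades some concurrency for correctness here; two overlapping requests are serialized instead of both inserting the same Id.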

Reading several recommendations, I decided to check and reseed the ID before running the LINQ query.
using (var connection = new SqlConnection(_data.DATA))
{
    connection.Open();
    try
    {
        var seed = new SqlCommand("CreateIDSeed", connection)
        {
            CommandType = CommandType.StoredProcedure
        };
        seed.ExecuteNonQuery();
        await _context.AddAsync(cnq);
        await _context.SaveChangesAsync();
    }
    catch (InvalidCastException e)
    {
        ViewBag.Result = $"Database insert failed with: {e}";
        return View();
    }
}
Where the CreateIDSeed looks like this:
CREATE PROCEDURE [dbo].[CreateIDSeed]
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @newId int;
    -- ISNULL guards against an empty table, where MAX(ID) returns NULL
    SELECT @newId = ISNULL(MAX(ID), 0) FROM dbo.CreateNewOrderViewModel;
    DBCC CHECKIDENT('dbo.CreateNewOrderViewModel', RESEED, @newId);
END
GO
I tried inserting a few test records and it seems to be working, but I will know more after the next server reboot.
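As a side note, since the insert already goes through the EF context, the reseed proc can also be invoked on that same context instead of opening a second SqlConnection. A sketch, assuming EF Core 3+ where ExecuteSqlRawAsync is available:
// run the reseed proc on the context's own connection, then insert
await _context.Database.ExecuteSqlRawAsync("EXEC dbo.CreateIDSeed");
await _context.AddAsync(cnq);
await _context.SaveChangesAsync();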

Related

Entity Framework Insert Into Table With AFTER INSERT Trigger

I am working on a Web API with Entity Framework 6 that does a "bulk" insert of under 500 records at any given time into a Microsoft SQL Server table. The DbContext.SaveChanges() method inserts all the records into the table in a couple of seconds, so I have no issues with that. However, when the method is called to insert the same number of records into the same table with a semi-extensive trigger attached to it, the process can take many minutes. The trigger does some table joins, inserts into other tables, and then deletes the newly inserted record.
I do not have much control of the table or the trigger, so I am looking for suggestions on how to improve performance. I made a suggestion to move the trigger to a stored procedure and have the trigger call the stored procedure, but I am uncertain if that will achieve any gains.
EDIT: Since my question was somewhat generic, I will post some of my code in case it helps. The SQL is not mine, so I will see what I can actually post.
Here is the part of my Web API method that does the call to SaveChanges():
string[] stringArray = results[0].Split(new[] { "\r\n", "\r", "\n" }, StringSplitOptions.None);
var profileObjs = db.Set<T_ProfileStaging>();
foreach (var s in stringArray)
{
    string[] columns = s.Split(new[] { ",", "\t" }, StringSplitOptions.None);
    if (columns.Length == 9) // nine columns are read below (indices 0-8)
    {
        T_ProfileStaging profileObj = new T_ProfileStaging();
        profileObj.CompanyCode = columns[0];
        profileObj.SubmittedBy = columns[1];
        profileObj.VersionName = columns[2];
        profileObj.DMName = columns[3];
        profileObj.Zone = columns[4];
        profileObj.DMCode = columns[5];
        profileObj.ProfileName = columns[6];
        profileObj.Advertiser = columns[7];
        profileObj.OriginalInsertDate = columns[8];
        profileObjs.Add(profileObj);
    }
}
try
{
    db.SaveChanges();
    return Ok();
}
catch (Exception)
{
    return Content(HttpStatusCode.BadRequest, "SQL Server Insert Exception");
}
When you load with SaveChanges(), EF sends each row in a separate INSERT statement, so a statement-level trigger runs once per row. To work around this you need to either:
- use a bulk load API from the client instead of EF's SaveChanges(), either SqlBulkCopy directly or one of the many EF extensions that wrap it, or
- configure EF to insert into a different (staging) table and then run a single INSERT ... SELECT into the target table, so the trigger fires once for the whole batch (see the sketch below).
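A minimal sketch of the staging-table approach, assuming System.Data.SqlClient and a trigger-free staging table named T_ProfileStaging_Load with matching columns (that table name is illustrative, not from the original post):
using System.Data;
using System.Data.SqlClient;

// 1) bulk load the parsed rows into the trigger-free staging table
using (var bulkCopy = new SqlBulkCopy(connectionString))
{
    bulkCopy.DestinationTableName = "dbo.T_ProfileStaging_Load";
    bulkCopy.WriteToServer(dataTable); // dataTable holds the parsed rows
}

// 2) one set-based statement moves everything to the real table,
//    so the statement trigger fires once instead of ~500 times
using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    var command = new SqlCommand(
        @"INSERT INTO dbo.T_ProfileStaging (CompanyCode, SubmittedBy, VersionName,
              DMName, Zone, DMCode, ProfileName, Advertiser, OriginalInsertDate)
          SELECT CompanyCode, SubmittedBy, VersionName,
              DMName, Zone, DMCode, ProfileName, Advertiser, OriginalInsertDate
          FROM dbo.T_ProfileStaging_Load", connection);
    command.ExecuteNonQuery();
}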

ASP.NET Core issue with updating a record in SQL Server

I am using ASP.NET Core and SQL Server. When trying to update a record in the database I get the following error message:
Database operation expected to affect 1 row(s) but actually affected 39 row(s).
I will paste the code below, but the part that is very confusing is that the "where" clause uses the table's primary key -- there is no way that more than one record has the same key, which I verified several times.
using (var transaction = _ctx.Database.BeginTransaction())
{
    var sql = @"UPDATE [Policies] SET DateInvalid = @DateTimeNow WHERE EntryNum = @EntryNum";
    _ctx.Database.ExecuteSqlCommand(sql,
        new SqlParameter("@DateTimeNow", DateTime.Now),
        new SqlParameter("@EntryNum", existing.EntryNum));
    _ctx.SaveChanges();
    transaction.Commit();
}

SQL Server: Get Latest Auto-Increment Value [duplicate]

I am creating a WinForms application in C# with a SQL Server database.
I have one table, employee_master, which has columns like Id, name, address and phone no. Id is auto-increment and all the other columns are varchar.
I am using this code to get the next auto-increment value:
string s = "select max(id) as Id from Employee_Master";
SqlCommand cmd = new SqlCommand(s, obj.con);
SqlDataReader dr = cmd.ExecuteReader();
dr.Read();
int i = Convert.ToInt16(dr["Id"].ToString());
txtId.Text = (i + 1).ToString();
I display it in a textBox.
But when the last row in the table is deleted, I still get the just-deleted value in the textbox.
How should I get the next auto-increment value?
To get the next auto-increment value from SQL Server:
This fetches the current auto-increment value:
SELECT IDENT_CURRENT('table_name');
And the next auto-increment value:
SELECT IDENT_CURRENT('table_name') + 1;
This works even if you add a row and then delete it, because IDENT_CURRENT returns the last identity value generated for a specific table in any session and any scope.
try this:
SELECT IDENT_CURRENT('tbl_name') + IDENT_INCR('tbl_name');
If you are using Microsoft SQL Server, use this statement to get the current identity value of a table; then add the increment value specified when the table was designed if you want the next id.
SELECT IDENT_CURRENT('<TableName>')
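Translated to the asker's ADO.NET setup, a sketch (reusing obj.con and Employee_Master from the question, and assuming the connection is open):
// IDENT_CURRENT + IDENT_INCR come from the table's identity metadata,
// so deleting the last row does not change the result the way MAX(Id) does
string s = "SELECT IDENT_CURRENT('Employee_Master') + IDENT_INCR('Employee_Master')";
using (SqlCommand cmd = new SqlCommand(s, obj.con))
{
    // ExecuteScalar returns the single numeric value; convert it to int
    int nextId = Convert.ToInt32(cmd.ExecuteScalar());
    txtId.Text = nextId.ToString();
}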
As for me, the best answer is:
DBCC CHECKIDENT('table_name', NORESEED)
(NORESEED reports without changing anything.) You will see two values (probably the same): the current identity value and the current column value.
When you delete a row from the table, the next number stays the same; it does not decrement in any way.
So if you have 100 rows and you delete row 100, you will have 99 rows but the next number is still going to be 101.
select isnull((max(AddressID)+1),1) from AddressDetails
max(id) returns the largest id currently in employee_master, e.g. for ids 10, 20, 100 it returns 100. But the record you deleted was presumably not the one with id 100, so you still get 100 back. One possible reason for the behavior you see is that you are not using ORDER BY id in your query.
For MS SQL 2005 and greater:
SELECT CAST(ISNULL(last_value, seed_value) AS int) + CAST(increment_value AS int) AS NextID
FROM sys.identity_columns
WHERE object_id = OBJECT_ID('<Table_Name>')
(Note: in sys.identity_columns, name is the column name, so filter on the table via OBJECT_ID.)
Just a thought: if what you want is the last auto-number that you inserted on an already open connection, try using
SELECT @@IDENTITY
on that connection. That is the best way to keep track of what has just happened on a given connection and avoids race conditions with other connections. Getting the maximum identity is not generally feasible.
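A hedged sketch of that pattern in ADO.NET (the column names and variables here are illustrative, not from the question; SCOPE_IDENTITY() is the narrower variant that ignores identity values generated by triggers fired by the insert):
// insert and read back the generated id in one round trip
string sql = "INSERT INTO Employee_Master (name, address, phoneno) " +
             "VALUES (@name, @address, @phoneno); " +
             "SELECT CAST(SCOPE_IDENTITY() AS int);";
using (SqlCommand cmd = new SqlCommand(sql, con))
{
    cmd.Parameters.AddWithValue("@name", name);
    cmd.Parameters.AddWithValue("@address", address);
    cmd.Parameters.AddWithValue("@phoneno", phone);
    int newId = (int)cmd.ExecuteScalar();
}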
SqlConnection con = new SqlConnection(@"Data Source=.\SQLEXPRESS;Initial Catalog=databasename;User ID=sa;Password=123");
con.Open();
SqlCommand cmd = new SqlCommand("SELECT TOP(1) UID FROM InvoiceDetails ORDER BY 1 DESC", con);
SqlDataReader reader = cmd.ExecuteReader();
// an if is enough since the query returns at most one row
if (reader.Read())
{
    string data = reader["UID"].ToString();
    int i = Int32.Parse(data);
    i++;
    txtuid.Text = i.ToString();
}

SQL Server stored procedure triggered when a value becomes 0

I have a column in one of my tables that acts as a countdown. So as the current datetime approaches the endtime, it works its way down to 0. The countdown's value becomes 0 once the endtime is reached. When this happens, I would like it to trigger a stored procedure. Does anyone know how I could create a trigger such as this?
Table:
CREATE TABLE t1 (
id int IDENTITY,
enddate datetime NOT NULL,
daysleft AS (DATEDIFF(dd, GETDATE(), enddate))
);
INSERT INTO t1 (enddate)
VALUES(DATEADD(dd, 1, GETDATE())),
(DATEADD(dd, 2, GETDATE())),
(DATEADD(dd, 15, GETDATE()))
I haven't created the procedure yet, only because it would be useless if I can't create my required trigger.
I don't think there is a construct in SQL Server that runs a stored procedure when a data value reaches a user-defined value. The reason is that if something like that existed, the performance of SQL Server would be tremendously impacted.
I would instead recommend you to consider writing a Windows Service that periodically wakes up and checks whether there are any rows in your table that have expired and then invoke your intended stored procedure on them. I can give you a sample of how to achieve this if you would like.
EDIT: A sample implementation of Windows Service
Okay, so as I said above, there is no way of being notified when a particular row has reached its expiry time. So instead of a push notification model, we work in a pull model where we periodically query the database for expired rows (e.g. something like "SELECT id FROM t1 WHERE enddate = @todayDate").
Now the key part of this solution is that we need a periodic service. This has been demonstrated in another user's answer here. The method where we do our SQL operations is private void timer_Elapsed(...).
private void timer_Elapsed(object sender, System.Timers.ElapsedEventArgs e)
{
    // since I am not sure whether you are accessing your database using ADO.NET
    // (i.e. SqlConnection, etc.) or Entity Framework (i.e. DbContext),
    // I will assume ADO.NET from here on
    // retrieve all rows that have expired
    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        var command = new SqlCommand("SELECT id FROM t1 WHERE enddate = @todayDate", connection);
        var paramDate = new SqlParameter("@todayDate", DateTime.Now.Date);
        command.Parameters.Add(paramDate);

        // collect the ids first so the reader is closed before issuing
        // further commands on the same connection
        var expiredIds = new List<int>();
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                expiredIds.Add(reader.GetInt32(0));
            }
        }

        // invoke the stored procedure for each expired row
        foreach (var id in expiredIds)
        {
            var storedProcCommand = new SqlCommand("EXEC CleanUpExpiredRow @id", connection);
            var paramId = new SqlParameter("@id", id);
            storedProcCommand.Parameters.Add(paramId);
            storedProcCommand.ExecuteNonQuery();
        }
    }
}
NOTE: None of this code is tested.

Correct method of deleting over 2100 rows (by ID) with Dapper

I am trying to use Dapper to support data access for my server app.
My server app has another application that drops records into my database at a rate of 400 per minute.
My app pulls them out in batches, processes them, and then deletes them from the database.
Since data continues to flow into the database while I am processing, I don't have a good way to say delete from myTable where allProcessed = true.
However, I do know the PK values of the rows to delete, so I want to do a delete from myTable where Id in @listToDelete.
The problem is that if my server goes down for even 6 minutes, I have over 2100 rows to delete.
Since Dapper takes my @listToDelete and turns each element into a parameter, my delete call fails (causing my data purging to fall even further behind), because SQL Server allows at most 2100 parameters per statement.
What is the best way to deal with this in Dapper?
NOTES:
I have looked at Table-Valued Parameters, but from what I can see they are not very performant. This piece of my architecture is the bottleneck of my system and I need it to be very, very fast.
One option is to create a temp table on the server and then use the bulk load facility to upload all the IDs into that table at once. Then use a join, EXISTS or IN clause to delete only the records that you uploaded into your temp table.
Bulk loads are a well-optimized path in SQL Server and it should be very fast.
For example:
Execute the statement CREATE TABLE #RowsToDelete(ID INT PRIMARY KEY)
Use a bulk load to insert keys into #RowsToDelete
Execute DELETE FROM myTable where Id IN (SELECT ID FROM #RowsToDelete)
Execute DROP TABLE #RowsToDelete (the table will also be automatically dropped when you close the session)
(Assuming Dapper) code example:
conn.Open();
var columnName = "ID";
conn.Execute(string.Format("CREATE TABLE #{0}s({0} INT PRIMARY KEY)", columnName));
using (var bulkCopy = new SqlBulkCopy(conn))
{
    bulkCopy.BatchSize = ids.Count;
    bulkCopy.DestinationTableName = string.Format("#{0}s", columnName);
    var table = new DataTable();
    table.Columns.Add(columnName, typeof(int));
    bulkCopy.ColumnMappings.Add(columnName, columnName);
    foreach (var id in ids)
    {
        table.Rows.Add(id);
    }
    bulkCopy.WriteToServer(table);
}
// or do other things with your table instead of deleting here
conn.Execute(string.Format(@"DELETE FROM myTable WHERE Id IN
    (SELECT {0} FROM #{0}s)", columnName));
conn.Execute(string.Format("DROP TABLE #{0}s", columnName));
To get this code working, I went dark side.
Since Dapper turns my list into parameters, and SQL Server can't handle that many parameters (I have never needed even double-digit parameter counts before), I had to go with dynamic SQL.
So here was my solution:
string listOfIdsJoined = "("+String.Join(",", listOfIds.ToArray())+")";
connection.Execute("delete from myTable where Id in " + listOfIdsJoined);
Before everyone grabs their torches and pitchforks, let me explain.
This code runs on a server whose only input is a data feed from a Mainframe system.
The list I am dynamically creating is a list of longs/bigints.
The longs/bigints are from an Identity column.
I know constructing dynamic SQL is bad juju, but in this case, I just can't see how it leads to a security risk.
Dapper expects a list of objects with the parameter as a property, so in the above case a list of objects with an Id property will work:
connection.Execute("delete from myTable where Id in (@Id)", listOfIds.AsEnumerable().Select(i => new { Id = i }).ToList());
This will work; note that Dapper executes the statement once per element of the list.
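If you would rather keep a set-based IN list but stay under the parameter cap, one more hedged option is to chunk the ids and let Dapper expand each chunk (a sketch; the 2000 batch size is just an arbitrary margin below SQL Server's 2100-parameter limit):
// requires System.Linq; deletes in batches so each statement
// stays below the 2100-parameter limit
const int batchSize = 2000;
for (int i = 0; i < listOfIds.Count; i += batchSize)
{
    var batch = listOfIds.Skip(i).Take(batchSize).ToList();
    connection.Execute("delete from myTable where Id in @Ids", new { Ids = batch });
}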
