I am creating an agent job using SQL Server. My database has two tables.
The columns in the first table are:
Idproduct, number
The columns in the second table are:
Idproduct, start (datetime), duration (time)
I need to increment the number field of a product when its (start + duration) <= GETDATE(), and then delete that record.
How can I do this?
Create table product(
Idproduct int primary key,
Number int default 0)
Create table production(
Idproduct int primary key,
Start datetime not null,
Times time not null)
One method is to use the OUTPUT clause of a DELETE statement to insert the produced products into a table variable, then use the table variable to increment the count. The example below uses SQL Server 2012+ features, but you can retrofit the error handling for earlier versions if needed.
SET XACT_ABORT ON;
DECLARE @ProducedProducts TABLE(
Idproduct int
);
BEGIN TRY
BEGIN TRAN;
--delete produced products
DELETE FROM dbo.production
OUTPUT deleted.Idproduct INTO @ProducedProducts
WHERE
DATEADD(millisecond, DATEDIFF(millisecond, '', Times), Start) <= GETDATE();
--increment produced products count
UPDATE dbo.product
SET Number += 1
WHERE Idproduct IN(
SELECT pp.Idproduct
FROM @ProducedProducts AS pp
);
COMMIT;
END TRY
BEGIN CATCH
THROW;
END CATCH;
Related
I am having much difficulty replicating the logging I had in Oracle using PRAGMA AUTONOMOUS_TRANSACTION, which lets you COMMIT records to a LOG table without committing any other DML. I've been banging my head over how more experienced SQL Server developers log the success or errors of their database programs/processes. Basically: how do experienced T-SQL developers log in the middle of an active T-SQL program? Another way to put it: I have a HUGE process that must NOT be committed until the entire process executes without critical errors, but I still need to log ALL errors, and if a critical error occurs I must roll back the entire process while still keeping the logs.
Using MERGE as an example demonstrating my inability to COMMIT some records while ROLLING BACK others...
I have two named transactions in the script below (1. sourceData, 2. mainProcess). This is my first attempt at accomplishing this goal. First the script INSERTs records into a table inside the first transaction, without committing. Then, in transaction 2, the MERGE block INSERTs records into the destination table and the log table, and I COMMIT transaction 2 (mainProcess).
I then ROLL BACK the first named transaction (sourceData).
The issue is that EVERYTHING gets rolled back, even though I explicitly committed the mainProcess transaction.
GO
/* temp table to simulate Log Table */
IF OBJECT_ID('tempdb..#LogTable') IS NULL
CREATE TABLE #LogTable
( Action VARCHAR(50),
primaryID INT,
secondaryID INT,
CustomID INT,
Note VARCHAR(200),
ConvDate DATE
) --DROP TABLE IF EXISTS #LogTable;
; /* SELECT * FROM #LogTable; TRUNCATE TABLE #LogTable; */
/* SELECT * FROM #ProductionSrcTable */
IF OBJECT_ID('tempdb..#ProductionSrcTable') IS NULL
CREATE TABLE #ProductionSrcTable( primaryKey INT, contactName VARCHAR(200), sourceKey INT )
; --DROP TABLE IF EXISTS #ProductionSrcTable; TRUNCATE TABLE #ProductionSrcTable;
/* SELECT * FROM #ProductionDestTable */
IF OBJECT_ID('tempdb..#ProductionDestTable') IS NULL
CREATE TABLE #ProductionDestTable( primaryKey INT, contactName VARCHAR(200), secondaryKey INT )
; --DROP TABLE IF EXISTS #ProductionDestTable; TRUNCATE TABLE #ProductionDestTable;
GO
/* Insert some fake data into Source Table */
BEGIN TRAN sourceData
BEGIN TRY
INSERT INTO #ProductionSrcTable
SELECT 1001 AS primaryKey, 'Jason' AS contactName, 789105 AS sourceKey UNION ALL
SELECT 1002 AS primaryKey, 'Jane' AS contactName, 789185 AS sourceKey UNION ALL
SELECT 1003 AS primaryKey, 'Sam' AS contactName, 788181 AS sourceKey UNION ALL
SELECT 1004 AS primaryKey, 'Susan' AS contactName, 681181 AS sourceKey
;
END TRY
BEGIN CATCH
ROLLBACK TRANSACTION sourceData
END CATCH
/* COMMIT below is purposely commented out in order to Test */
--COMMIT TRANSACTION sourceData
GO
BEGIN TRAN mainProcess
DECLARE @insertedRecords INT = 0,
@CustomID INT = 2,
@Note VARCHAR(200) = 'Test Temp DB Record Population via OUTPUT Clause',
@ConvDate DATE = getDate()
;
BEGIN TRY
MERGE INTO #ProductionDestTable AS dest
USING
(
SELECT src.primaryKey, src.contactName, src.sourceKey FROM #ProductionSrcTable src
) AS src ON src.primaryKey = dest.primaryKey AND src.sourceKey = dest.secondaryKey
WHEN NOT MATCHED BY TARGET
THEN
INSERT --INTO ProductionDestTable
( primaryKey, contactName, secondaryKey )
VALUES
( src.primaryKey, src.contactName, src.sourceKey )
/* Insert Output in Log Table */
OUTPUT $action, inserted.primaryKey, src.sourceKey, @CustomID, @Note, @ConvDate INTO #LogTable;
; /* END MERGE */
/* Store the number of inserted Records into the insertedRecords variable */
SET @insertedRecords = @@ROWCOUNT;
END TRY
BEGIN CATCH
ROLLBACK TRANSACTION mainProcess
END CATCH
;
--ROLLBACK TRANSACTION mainProcess
COMMIT TRANSACTION mainProcess
ROLLBACK TRANSACTION sourceData
PRINT 'Records Inserted:' + CAST(@insertedRecords AS VARCHAR);
/* END */
--SELECT @@TRANCOUNT
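For reference, a common SQL Server substitute for Oracle's autonomous transaction is to accumulate log rows in a table variable, because table variables are not affected by ROLLBACK. A minimal sketch (the permanent table dbo.ProcessLog and all column names here are made up for illustration):

```sql
DECLARE @Log TABLE (LogTime datetime2 DEFAULT SYSDATETIME(), Note varchar(200));

BEGIN TRY
    BEGIN TRAN;
    INSERT INTO @Log (Note) VALUES ('step 1 starting');
    -- ... main DML here ...
    INSERT INTO @Log (Note) VALUES ('step 1 done');
    COMMIT;
END TRY
BEGIN CATCH
    INSERT INTO @Log (Note) VALUES (ERROR_MESSAGE());
    IF @@TRANCOUNT > 0 ROLLBACK;  -- undoes the main work only
END CATCH;

-- @Log still holds every row, including those written before the rollback,
-- so it can now be copied into a permanent log table outside the transaction.
INSERT INTO dbo.ProcessLog (LogTime, Note)  -- hypothetical permanent table
SELECT LogTime, Note FROM @Log;
```

This pattern gets the "keep the logs, lose the work" behavior without nested named transactions, which in SQL Server do not commit independently anyway.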
I have a bank database with two tables, accountmaster and transactionmaster.
Accountmaster has columns:
accid(pk)
accname
bal
branch
Transactionmaster columns:
Tnumber(pk)
dot
txnAmt
transactiontype
accid(fk)
branch(fk)
I want to do the below without using INSTEAD OF triggers:
whenever a transaction is made by the account holder (transaction type like deposit, withdraw), it should be reflected in the balance of the accountmaster table;
whenever an account holder makes a transaction > 50000 (withdraw or deposit), that transaction's details are to be inserted into a new table, hightransaction, and that particular transaction removed from the transactionmaster table.
I tried something like this, but the result displays only column names and no values.
First I copied the structure of transactionmaster into a new table, hightransaction:
select *
into hightransaction
from transactionmaster
where 1 = 2
then I created a trigger
create trigger [dbo].[transaction2]
on transactionmaster
for insert
as
declare @transtype nvarchar(10);
select @transtype = [TXNTYPE]
from inserted
if (select txnamt from inserted) > 75000
begin
insert into [dbo].[hightransactionmaster3]
select
dot, txntype, chqnum, chqdate, txnamt,
acid, brid, userid
from
inserted
end
else
begin
insert into [dbo].[TRANSACTIONMASTER]
select
dot, txntype, chqnum, chqdate, txnamt,
acid, brid, userid
from
inserted
end
and I tried to execute
select * from hightransaction
The output is only column names and no values.
I am thinking of a stored procedure like this.
CREATE PROCEDURE [dbo].[transaction]
@transactionAmt int
/*
ADD OTHER PARAMETERS HERE
*/
AS
BEGIN
SET NOCOUNT ON;
IF (@transactionAmt > 50000)
BEGIN
/*INSERT STATEMENT FOR HIGHTRANSCTION TABLE*/
END
ELSE
BEGIN
/*INSERT STATEMENT FOR TRANSACTIONMASTER TABLE*/
END
END
GO
This is provided that you have control over the application and can have it pass the parameters into a stored procedure.
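A hedged sketch of how the skeleton might be completed, assuming the column names from the question's schema (dot, txnAmt, transactiontype, accid, branch), that Tnumber is an identity column, and that the balance update also belongs in the same procedure; adjust names to your actual tables:

```sql
CREATE PROCEDURE [dbo].[transaction]
    @transactionAmt int,
    @dot date,
    @transactiontype nvarchar(10),
    @accid int,
    @branch int
AS
BEGIN
    SET NOCOUNT ON;

    -- route the row to the high-value table or the regular table
    IF (@transactionAmt > 50000)
        INSERT INTO dbo.hightransaction (dot, txnAmt, transactiontype, accid, branch)
        VALUES (@dot, @transactionAmt, @transactiontype, @accid, @branch);
    ELSE
        INSERT INTO dbo.transactionmaster (dot, txnAmt, transactiontype, accid, branch)
        VALUES (@dot, @transactionAmt, @transactiontype, @accid, @branch);

    -- reflect the transaction in the account balance:
    -- deposits add, withdrawals subtract
    UPDATE dbo.accountmaster
    SET bal = bal + CASE WHEN @transactiontype = 'deposit'
                         THEN @transactionAmt ELSE -@transactionAmt END
    WHERE accid = @accid;
END
```

Wrapping the two statements in a single transaction would keep the balance and the transaction row consistent if either insert fails.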
The identity counter increments by one even though the insert is inside a TRY/CATCH and the transaction is rolled back (SSMS 2008). Is there any way I can stop the +1, or roll it back too?
In order to understand why this happens, let's execute the sample code below first:
USE tempdb
CREATE TABLE dbo.Sales
(ID INT IDENTITY(1,1), Address VARCHAR(200))
GO
BEGIN TRANSACTION
INSERT DBO.Sales
( Address )
VALUES ( 'Dwarka, Delhi' );
ROLLBACK TRANSACTION
Now look at the execution plan for the above query (screenshot not shown). The second operator from the right, a Compute Scalar, computes [Expr1003]=getidentity((629577281),(2),NULL), which is the IDENTITY value for the ID column. This clearly indicates that IDENTITY values are fetched and incremented prior to insertion (the INSERT operator), so by design the generated IDENTITY value persists even if the transaction is rolled back at a later stage.
Now, in order to reseed the IDENTITY value to the maximum identity value present in the table + 1, you need appropriate permissions (table owner, or membership in sysadmin, db_owner, or db_ddladmin) to execute the DBCC command below:
DBCC CHECKIDENT
(
table_name
[, { NORESEED | { RESEED [, new_reseed_value ] } } ]
)
[ WITH NO_INFOMSGS ]
So the final query should include the below piece of code prior to the rollback statement:
-- Check the max ID value and verify it against the IDENTITY seed
DECLARE @MaxValue INT = (SELECT ISNULL(MAX(ID),1) FROM dbo.Sales)
IF @MaxValue IS NOT NULL AND @MaxValue <> IDENT_CURRENT('dbo.Sales')
DBCC CHECKIDENT ( 'dbo.Sales', RESEED, @MaxValue )
--ROLLBACK TRANSACTION
That said, it is generally recommended to leave identity management to SQL Server.
You are right, and the following code inserts a record with [Col01] equal to 2:
CREATE TABLE [dbo].[DataSource]
(
[Col01] SMALLINT IDENTITY(1,1)
,[Col02] TINYINT
);
GO
BEGIN TRY
BEGIN TRANSACTION;
INSERT INTO [dbo].[DataSource] ([Col02])
VALUES (1);
SELECT 1/0
END TRY
BEGIN CATCH
IF @@TRANCOUNT > 0
BEGIN
ROLLBACK TRANSACTION
END;
END CATCH;
GO
INSERT INTO [dbo].[DataSource] ([Col02])
VALUES (1);
SELECT *
FROM [dbo].[DataSource]
This is by design, as you can see in the documentation:
Consecutive values after server restart or other failures –SQL Server
might cache identity values for performance reasons and some of the
assigned values can be lost during a database failure or server
restart. This can result in gaps in the identity value upon insert. If
gaps are not acceptable then the application should use its own
mechanism to generate key values. Using a sequence generator with the
NOCACHE option can limit the gaps to transactions that are never
committed.
I tried using a NO CACHE sequence, but it does not work on SQL Server 2012 either:
CREATE TABLE [dbo].[DataSource]
(
[Col01] SMALLINT
,[Col02] TINYINT
);
CREATE SEQUENCE [dbo].[MyIndentyty]
START WITH 1
INCREMENT BY 1
NO CACHE;
GO
BEGIN TRY
BEGIN TRANSACTION;
INSERT INTO [dbo].[DataSource] ([Col01], [Col02])
SELECT NEXT VALUE FOR [dbo].[MyIndentyty], 1
SELECT 1/0
END TRY
BEGIN CATCH
IF @@TRANCOUNT > 0
BEGIN
ROLLBACK TRANSACTION
END;
END CATCH;
GO
INSERT INTO [dbo].[DataSource] ([Col01], [Col02])
SELECT NEXT VALUE FOR [dbo].[MyIndentyty], 1
SELECT *
FROM [dbo].[DataSource]
DROP TABLE [dbo].[DataSource];
DROP SEQUENCE [dbo].[MyIndentyty];
You can use MAX to solve this:
CREATE TABLE [dbo].[DataSource]
(
[Col01] SMALLINT
,[Col02] TINYINT
);
BEGIN TRY
BEGIN TRANSACTION;
DECLARE @Value SMALLINT = (SELECT MAX([Col01]) FROM [dbo].[DataSource]);
INSERT INTO [dbo].[DataSource] ([Col01], [Col02])
SELECT @Value, 1
SELECT 1/0
END TRY
BEGIN CATCH
IF @@TRANCOUNT > 0
BEGIN
ROLLBACK TRANSACTION
END;
END CATCH;
GO
DECLARE @Value SMALLINT = ISNULL((SELECT MAX([Col01]) FROM [dbo].[DataSource]), 1);
INSERT INTO [dbo].[DataSource] ([Col01], [Col02])
SELECT @Value, 1
SELECT *
FROM [dbo].[DataSource]
DROP TABLE [dbo].[DataSource];
But you must pay attention to your isolation level for potential issues.
If you want to insert many rows at the same time, do the following:
- get the current max value
- create a table in which to store the rows that are going to be inserted, generating a ranking (you can use an identity column or a ranking function) and adding the max value to it
- insert the rows
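The steps above can be sketched as follows, assuming a target table dbo.Target(ID, Payload) and a staging temp table #NewRows(Payload), both invented for this example:

```sql
BEGIN TRAN;

-- 1. current max (UPDLOCK + HOLDLOCK keep another session from
--    reading the same max until this transaction commits)
DECLARE @Max int = (SELECT ISNULL(MAX(ID), 0)
                    FROM dbo.Target WITH (UPDLOCK, HOLDLOCK));

-- 2. rank the incoming rows and offset each rank by the max
;WITH ranked AS (
    SELECT Payload,
           ROW_NUMBER() OVER (ORDER BY Payload) AS rn
    FROM #NewRows
)
-- 3. insert the rows with their newly assigned ids
INSERT INTO dbo.Target (ID, Payload)
SELECT @Max + rn, Payload
FROM ranked;

COMMIT;
```

The locking hints are what make the max value safe to reuse across the whole batch; without them two concurrent callers could read the same max.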
I need to update a field depending on another record. I use SQL Server.
In my database there are two tables.
create table product(
IDproduct int primary key,
numberA int default 0,
numberB int default 0)
create table production(
IDproduct int primary key,
start datetime not null,
duration time(7) not null,
columnName varchar(32) not null)
I need to increment the field numberA or numberB of product when its (start + duration) <= GETDATE(). The columnName column of production holds the name of the column to update (numberA or numberB). Finally, I delete the record from production.
This is my current code, but I only update the column numberA:
SET XACT_ABORT ON;
DECLARE @ProducedProducts TABLE(
IDproduct int
);
BEGIN TRY
BEGIN TRAN;
DELETE FROM PRODUCTION
OUTPUT deleted.IDproduct INTO @ProducedProducts
WHERE DATEADD(second,
datepart(hour,duration) * 3600 +
datepart(minute,duration) * 60 +
datepart(second,duration),
start) <= GETDATE();
UPDATE PRODUCT
SET numberA += 1
WHERE IDproduct IN(
SELECT pp.IDproduct
FROM @ProducedProducts AS pp
);
COMMIT;
END TRY
BEGIN CATCH
THROW;
END CATCH;
You can have multiple agent jobs, one to update each column. That way you don't need a dynamic job.
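If a single job is preferred, one non-dynamic approach (a sketch based on the question's own schema) is to also capture columnName in the OUTPUT clause and run one guarded UPDATE per known column:

```sql
SET XACT_ABORT ON;

DECLARE @ProducedProducts TABLE(
    IDproduct int,
    columnName varchar(32)
);

BEGIN TRY
    BEGIN TRAN;

    -- capture which counter each deleted row should increment
    DELETE FROM production
    OUTPUT deleted.IDproduct, deleted.columnName INTO @ProducedProducts
    WHERE DATEADD(second,
          datepart(hour, duration) * 3600 +
          datepart(minute, duration) * 60 +
          datepart(second, duration),
          start) <= GETDATE();

    -- one static statement per known column; no dynamic SQL needed
    UPDATE product SET numberA += 1
    WHERE IDproduct IN (SELECT IDproduct FROM @ProducedProducts
                        WHERE columnName = 'numberA');

    UPDATE product SET numberB += 1
    WHERE IDproduct IN (SELECT IDproduct FROM @ProducedProducts
                        WHERE columnName = 'numberB');

    COMMIT;
END TRY
BEGIN CATCH
    THROW;
END CATCH;
```

This only scales to a fixed, known set of column names; a genuinely open-ended columnName would require dynamic SQL.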
When adding an item in my database, I need it to auto-determine the value for the field DisplayOrder. Identity (auto-increment) would be an ideal solution, but I need to be able to programmatically change (UPDATE) the values of the DisplayOrder column, and Identity doesn't seem to allow that. For the moment, I use this code:
CREATE PROCEDURE [dbo].[AddItem]
AS
DECLARE @DisplayOrder INT
SET @DisplayOrder = (SELECT MAX(DisplayOrder) FROM [dbo].[MyTable]) + 1
INSERT INTO [dbo].[MyTable] ( DisplayOrder ) VALUES ( @DisplayOrder )
Is it the good way to do it or is there a better/simpler way?
A solution to this issue from "Inside Microsoft SQL Server 2008: T-SQL Querying"
CREATE TABLE dbo.Sequence(
val int IDENTITY (10000, 1) /*Seed this at whatever your current max value is*/
)
GO
CREATE PROC dbo.GetSequence
@val AS int OUTPUT
AS
BEGIN TRAN
SAVE TRAN S1
INSERT INTO dbo.Sequence DEFAULT VALUES
SET @val=SCOPE_IDENTITY()
ROLLBACK TRAN S1 /*Rolls back just as far as the save point to prevent the
sequence table filling up. The id allocated won't be reused*/
COMMIT TRAN
Or another alternative from the same book that allocates ranges more easily. (You would need to consider whether to call this from inside or outside your transaction; inside would block other concurrent transactions until the first one commits.)
CREATE TABLE dbo.Sequence2(
val int
)
GO
INSERT INTO dbo.Sequence2 VALUES(10000);
GO
CREATE PROC dbo.GetSequence2
@val AS int OUTPUT,
@n as int = 1
AS
UPDATE dbo.Sequence2
SET @val = val = val + @n;
SET @val = @val - @n + 1;
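Usage sketch: the `SET @val = val = val + @n` idiom both advances the stored value and captures the new value in one statement, and the final subtraction returns the first value of the reserved range:

```sql
DECLARE @first int;

-- reserve a block of 5 sequence values in one round trip
EXEC dbo.GetSequence2 @val = @first OUTPUT, @n = 5;

-- @first through @first + 4 now belong exclusively to this caller
SELECT @first AS FirstValueOfRange;
```

Because the UPDATE takes an exclusive lock on the single row in dbo.Sequence2, concurrent callers are serialized and can never receive overlapping ranges.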
You can set your incrementing column to use the identity property. Then, in processes that need to insert explicit values into that column, you can use the SET IDENTITY_INSERT command in your batch.
For inserts where you want to use the identity property, you exclude the identity column from the list of columns in your insert statement:
INSERT INTO [dbo].[MyTable] ( MyData ) VALUES ( @MyData )
When you want to insert rows where you are providing the value for the identity column, use the following:
SET IDENTITY_INSERT MyTable ON
INSERT INTO [dbo].[MyTable] ( DisplayOrder, MyData )
VALUES ( @DisplayOrder, @MyData )
SET IDENTITY_INSERT MyTable OFF
You should be able to UPDATE the column without any other steps.
You may also want to look into the DBCC CHECKIDENT command. This command will set your next identity value. If you are inserting rows where the next identity value might not be appropriate, you can use the command to set a new value.
DECLARE @DisplayOrder INT
SET @DisplayOrder = (SELECT MAX(DisplayOrder) FROM [dbo].[MyTable]) + 1
DBCC CHECKIDENT (MyTable, RESEED, @DisplayOrder)
Here's the solution that I kept:
CREATE PROCEDURE [dbo].[AddItem]
AS
DECLARE @DisplayOrder INT
BEGIN TRANSACTION
SET @DisplayOrder = (SELECT ISNULL(MAX(DisplayOrder), 0) FROM [dbo].[MyTable]) + 1
INSERT INTO [dbo].[MyTable] ( DisplayOrder ) VALUES ( @DisplayOrder )
COMMIT TRANSACTION
One thing you should do is add commands so that your procedure runs as a transaction; otherwise two inserts running at the same time could produce two rows with the same value in DisplayOrder.
This is easy enough to achieve: add
begin transaction
at the start of your procedure, and
commit transaction
at the end.
Your way works fine (with a little modification) and is simple. I would wrap it in a transaction, as @David Knell said. This would result in code like:
CREATE PROCEDURE [dbo].[AddItem]
AS
DECLARE @DisplayOrder INT
BEGIN TRANSACTION
SET @DisplayOrder = (SELECT MAX(DisplayOrder) FROM [dbo].[MyTable]) + 1
INSERT INTO [dbo].[MyTable] ( DisplayOrder ) VALUES ( @DisplayOrder )
COMMIT TRANSACTION
Wrapping your SELECT and INSERT in a transaction reduces the chance that DisplayOrder values are duplicated by AddItem. If you are doing a lot of simultaneous adds (many per second), there may be contention on MyTable, but for occasional inserts it won't be a problem.
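One caveat: under the default READ COMMITTED isolation level, a transaction alone does not stop two concurrent callers from both reading the same MAX before either inserts. A sketch of a stricter variant using locking hints to close that window:

```sql
CREATE PROCEDURE [dbo].[AddItem]
AS
DECLARE @DisplayOrder INT

BEGIN TRANSACTION
    -- UPDLOCK + HOLDLOCK hold a range lock until COMMIT, so a second
    -- caller blocks here instead of reading the same MAX value
    SET @DisplayOrder = (SELECT ISNULL(MAX(DisplayOrder), 0)
                         FROM [dbo].[MyTable] WITH (UPDLOCK, HOLDLOCK)) + 1
    INSERT INTO [dbo].[MyTable] ( DisplayOrder ) VALUES ( @DisplayOrder )
COMMIT TRANSACTION
```

The trade-off is serialized inserts, which is exactly the contention mentioned above; for occasional inserts that cost is negligible.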