These three tables are connected with foreign keys; ID_KOEFICIJENT reaches table RADNIK through table RADNO_MESTO. I have a problem with triggers - I think two of them conflict. When I update the table RADNIK, for example:
update radnik set ID_KOEFICIJENT=3, prezime_ime='Perica Milisav', datum_rodjenja='2020-03-23',
zanimanje='astronaut', id_radno_mesto=3,stepen_strucne_spreme='7.3', identifikator_casova_rada=7
where ID_RADNIK=10;
It updates everything except ID_KOEFICIJENT, but if I change ID_KOEFICIJENT to 4, 5, 6, 7, ... I never get the error message from trigger UIK_RADNIK.
Table RADNIK: Primary key: ID_RADNIK, Foreign keys: ID_KOEFICIJENT and ID_RADNO_MESTO
CREATE TABLE "RADNIK"(
"ID_RADNIK" NUMBER(*,0),
"JMBG" NUMBER(*,0),
"PREZIME_IME" VARCHAR2(100 BYTE),
"DATUM_RODJENJA" DATE,
"ZANIMANJE" VARCHAR2(100 BYTE),
"STEPEN_STRUCNE_SPREME" VARCHAR2(100 BYTE),
"IDENTIFIKATOR_CASOVA_RADA" NUMBER(*,0),
"ID_KOEFICIJENT" NUMBER(*,0),
"ID_RADNO_MESTO" NUMBER
)
Table RADNO_MESTO: Primary key: ID_RADNO_MESTO, Foreign key: ID_KOEFICIJENT
CREATE TABLE "RADNO_MESTO"(
"ID_RADNO_MESTO" NUMBER(*,0),
"NAZIV" VARCHAR2(200 BYTE),
"BR_IZVRSILACA" NUMBER(*,0) DEFAULT 0,
"DATUM_OD" DATE,
"DATUM_DO" DATE,
"ID_KOEFICIJENT" NUMBER(*,0)
)
Table KOEFICIJENT: Primary key: ID_KOEFICIJENT
CREATE TABLE "KOEFICIJENT"(
"ID_KOEFICIJENT" NUMBER(*,0),
"BROJ" FLOAT(126),
"DATUM_OD" DATE,
"DATUM_DO" DATE
)
This trigger restricts directly updating column ID_KOEFICIJENT in table RADNIK.
Trigger: UIK_RADNIK:
create or replace TRIGGER UIK_RADNIK
BEFORE UPDATE OF ID_KOEFICIJENT ON RADNIK
FOR EACH ROW
BEGIN
if :new.ID_KOEFICIJENT <> :old.ID_KOEFICIJENT then
RAISE_APPLICATION_ERROR(-20000, 'Zabranjeno direktno ažuriranje koeficijenta radnika!');
END IF;
END;
This trigger updates column ID_KOEFICIJENT in table RADNIK by using column ID_RADNO_MESTO.
Trigger: UIK_RADNO_MESTO:
create or replace TRIGGER UIK_RADNO_MESTO
AFTER UPDATE OF ID_KOEFICIJENT ON RADNO_MESTO
FOR EACH ROW
DECLARE
PRAGMA AUTONOMOUS_TRANSACTION;
BEGIN
EXECUTE IMMEDIATE 'ALTER TRIGGER UIK_RADNIK DISABLE';
UPDATE RADNIK
SET ID_KOEFICIJENT = :NEW.ID_KOEFICIJENT
WHERE ID_RADNO_MESTO = :NEW.ID_RADNO_MESTO;
EXECUTE IMMEDIATE 'ALTER TRIGGER UIK_RADNIK ENABLE';
END;
Trigger: UIRM_RADNIK:
create or replace TRIGGER UIRM_RADNIK
BEFORE UPDATE OF ID_RADNO_MESTO ON RADNIK
FOR EACH ROW
DECLARE
pragma AUTONOMOUS_TRANSACTION;
v_id_koeficijent NUMBER;
BEGIN
SELECT ID_KOEFICIJENT INTO v_id_koeficijent
FROM RADNO_MESTO
WHERE ID_RADNO_MESTO = :NEW.ID_RADNO_MESTO;
:NEW.ID_KOEFICIJENT := v_id_koeficijent;
END;
You can set execution precedence in your triggers:
Order of execution of trigger and statements in Oracle stored procedure
And you can unify UIK_RADNIK and UIRM_RADNIK triggers too:
how to trigger when multiple columns are updated
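As a sketch of that idea: instead of disabling UIK_RADNIK with DDL from an autonomous transaction (which commits independently of the main transaction and is fragile), a package-level flag can tell UIK_RADNIK that the change comes from the cascade. This is an illustrative alternative, not code from the question; the package name is made up.

```sql
-- Sketch (names are illustrative): a session-scoped flag instead of
-- ALTER TRIGGER ... DISABLE from an autonomous transaction.
CREATE OR REPLACE PACKAGE radnik_sync AS
  allow_koef_update BOOLEAN := FALSE;
END radnik_sync;
/

CREATE OR REPLACE TRIGGER UIK_RADNO_MESTO
AFTER UPDATE OF ID_KOEFICIJENT ON RADNO_MESTO
FOR EACH ROW
BEGIN
  radnik_sync.allow_koef_update := TRUE;
  UPDATE RADNIK
     SET ID_KOEFICIJENT = :NEW.ID_KOEFICIJENT
   WHERE ID_RADNO_MESTO = :NEW.ID_RADNO_MESTO;
  radnik_sync.allow_koef_update := FALSE;
EXCEPTION
  WHEN OTHERS THEN
    -- always reset the flag, even if the cascade update fails
    radnik_sync.allow_koef_update := FALSE;
    RAISE;
END;
/

CREATE OR REPLACE TRIGGER UIK_RADNIK
BEFORE UPDATE OF ID_KOEFICIJENT ON RADNIK
FOR EACH ROW
BEGIN
  -- reject direct changes, but let the cascade from RADNO_MESTO through
  IF NOT radnik_sync.allow_koef_update
     AND :NEW.ID_KOEFICIJENT <> :OLD.ID_KOEFICIJENT THEN
    RAISE_APPLICATION_ERROR(-20000,
      'Zabranjeno direktno ažuriranje koeficijenta radnika!');
  END IF;
END;
/
```

Because the flag lives in the same session and transaction as the cascade, there is no window in which other sessions see the protection disabled, which is the main hazard of the DISABLE/ENABLE approach.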
I have one table called [FridgeTemperture]; whenever a record is inserted into it, a value should be added to the new table MpSensors. But no record is being inserted into the new table when a record is inserted.
Error
Explicit value must be specified for identity column in table
'MpSensors' either identity_insert is set to ON or when a replication
user is inserting into a not for replication identity column.
CREATE TRIGGER [dbo].[FridgeTemperature_INSERT]
ON [dbo].[FridgeTemperture]
AFTER INSERT
AS
BEGIN
SET IDENTITY_INSERT MpSensors ON;
SET NOCOUNT ON;
DECLARE #fridge_temp varchar(10)
INSERT INTO MpSensors(fridge_temp)
VALUES(#fridge_temp)
SET IDENTITY_INSERT MpSensors OFF;
END
GO
table schema
CREATE TABLE [dbo].[MpSensors](
[id] [int] IDENTITY(1,1) NOT NULL,
[fridge_temp] [varchar](10) NULL
) ON [PRIMARY]
CREATE TABLE [dbo].[FridgeTemperture](
[Id] [int] IDENTITY(1,1) NOT NULL,
[ShopId] [nvarchar](4) NULL,
[Fridgetemp] [decimal](4, 2) NOT NULL,
[UpdatedDate] [datetime2](7) NOT NULL
) ON [PRIMARY]
GO
You don't need SET IDENTITY_INSERT ON if you are not attempting to insert values into the identity column. Also, your current insert statement, if you lose the SET IDENTITY_INSERT, will simply insert a single NULL row for every insert statement completed successfully on the FridgeTemperture table.
When using triggers, you have access to the records affected by the statement that fired the trigger via the auto-generated pseudo-tables called inserted and deleted.
I think you are after something like this:
CREATE TRIGGER [dbo].[FridgeTemperature_INSERT]
ON [dbo].[FridgeTemperture]
AFTER INSERT
AS
BEGIN
INSERT INTO MpSensors(fridge_temp)
SELECT CAST(Fridgetemp as varchar(10))
FROM inserted
END
Though I can't really see any benefit of storing the same value in two different places, and in two different data types.
Update
Following our conversation in the comments, you can simply use an update statement in the trigger instead of an insert statement:
UPDATE MpSensors
SET fridge_temp = (
SELECT TOP 1 CAST(Fridgetemp as varchar(10))
FROM inserted
ORDER BY Id DESC
)
This should give you the latest record in case you have an insert statement that inserts more than a single record into the FridgeTemperture table in a single statement.
create TRIGGER [dbo].[FridgeTemperature_INSERT]
ON [dbo].[FridgeTemperture]
AFTER INSERT
AS
BEGIN
UPDATE MpSensors
SET fridge_temp = CAST(Fridgetemp as varchar(10))
FROM inserted
END
You need to use a SELECT statement with CAST in the trigger, since [fridge_temp] is varchar in the MpSensors table. Try it like this:
CREATE trigger <table_name>
ON <table_name>
AFTER Insert
AS
BEGIN
INSERT INTO <table_name>(column_name)
Select CAST(column_name as varchar(10))
FROM inserted
END
The inserted table stores copies of the affected rows during INSERT and UPDATE statements. During an insert or update transaction, new rows are added to both the inserted table and the trigger table. The rows in the inserted table are copies of the new rows in the trigger table.
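To make the inserted/deleted pair concrete, here is a hedged sketch of an UPDATE trigger that records old and new values side by side. The audit table and trigger names are illustrative, not part of the original schema.

```sql
-- Illustrative audit table (not from the original post)
CREATE TABLE dbo.FridgeTempAudit(
    [Id] INT NOT NULL,
    [OldTemp] DECIMAL(4, 2) NULL,
    [NewTemp] DECIMAL(4, 2) NULL,
    [ChangedAt] DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME()
);
GO
CREATE TRIGGER dbo.FridgeTemperture_AUDIT
ON dbo.FridgeTemperture
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    -- deleted holds the pre-update rows, inserted the post-update rows;
    -- joining on the key pairs each row's old and new values
    INSERT INTO dbo.FridgeTempAudit(Id, OldTemp, NewTemp)
    SELECT d.Id, d.Fridgetemp, i.Fridgetemp
    FROM deleted d
    JOIN inserted i ON i.Id = d.Id;
END
GO
```

Note that both pseudo-tables are set-based: a single UPDATE touching ten rows produces ten rows in each, which is why the trigger joins them instead of assuming one row.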
I'm using SQL Server and system-versioned (temporal) tables. In my main table, I have an INT column that's currently allowing NULLs. I want to update this to not allow nulls, but the system/history copy of the table allows nulls.
I run this statement:
ALTER TABLE dbo.MyTable
ALTER COLUMN MyInt INT NOT NULL;
And I get this error:
Cannot insert the value NULL into column 'MyInt', table 'mydb.dbo.MyTable_History'; column does not allow nulls. UPDATE fails.
I had created the system versioned table using this script:
ALTER TABLE dbo.MyTable
ADD
ValidFrom DATETIME2 (2) GENERATED ALWAYS AS ROW START HIDDEN CONSTRAINT DFMyTable_ValidFrom DEFAULT DATEADD(SECOND, -1, SYSUTCDATETIME()),
ValidTo DATETIME2 (2) GENERATED ALWAYS AS ROW END HIDDEN CONSTRAINT DFMyTable_ValidTo DEFAULT '9999.12.31 23:59:59.99',
PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo);
ALTER TABLE dbo.MyTable
SET (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.MyTable_History));
GO
Is there some other way I can make my main table's column non-nullable in this scenario? I suppose I could (maybe) manually update the existing system-versioned null values with an arbitrary garbage value, but it seems like this scenario should be supported with temporal tables.
I also looked at this and it seems you have to update the NULL values in the system version column to some value.
ALTER TABLE dbo.MyTable
SET (SYSTEM_VERSIONING = OFF)
GO
UPDATE dbo.MyTable_History
SET MyInt = 0 WHERE MyInt IS NULL --Update to default value
UPDATE dbo.MyTable
SET MyInt = 0 WHERE MyInt IS NULL --Update to default value
ALTER TABLE dbo.MyTable
ALTER COLUMN MyInt INT NOT NULL
ALTER TABLE dbo.MyTable_History
ALTER COLUMN MyInt INT NOT NULL
GO
ALTER TABLE dbo.MyTable
SET (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.MyTable_History));
GO
I got this issue when I was trying to add a new non-null column. I was originally trying to create the column as nullable, update all the values, and then set it to non-nullable:
ALTER TABLE dbo.MyTable
ADD MyInt INT NULL;
GO
UPDATE dbo.MyTable
SET MyInt = 0;
GO
ALTER TABLE dbo.MyTable
ALTER COLUMN MyInt INT NOT NULL;
But I managed to get around it by using a temporary default constraint instead:
ALTER TABLE dbo.MyTable
ADD MyInt INT NOT NULL CONSTRAINT DF_MyTable_MyInt DEFAULT 0;
ALTER TABLE dbo.MyTable
DROP CONSTRAINT DF_MyTable_MyInt;
Whilst you can change the schema of temporal tables, there are certain actions you cannot perform with a direct ALTER while a table is system-versioned. One of those is changing a nullable column to NOT NULL.
See Important Remarks - Changing the schema of a system-versioned temporal table
In this scenario the only thing you can do is to turn off system versioning using the following:
ALTER TABLE schema.TableName SET (SYSTEM_VERSIONING = OFF);
This leaves you with two separate tables - the table itself and its history table, both as separate objects. You can now make your schema updates to BOTH tables (they have to be schema-aligned) and then turn system versioning back on:
ALTER TABLE schema.TableName SET (SYSTEM_VERSIONING = ON (HISTORY_TABLE = schema.TableName_History));
Note that you have to name the existing history table again; without the HISTORY_TABLE clause, SQL Server creates a brand-new, empty history table.
I have a pretty big table (around 1 billion rows), and I need to update the id type from SERIAL to BIGSERIAL; guess why?:).
Basically this could be done with this command:
execute "ALTER TABLE my_table ALTER COLUMN id SET DATA TYPE bigint"
Nevertheless that would lock my table forever and put my web service down.
Is there a quite simple way of doing this operation concurrently (whatever the time it will take)?
If you don't have foreign keys pointing your id you could add new column, fill it, drop old one and rename new to old:
alter table my_table add column new_id bigint;
begin; update my_table set new_id = id where id between 0 and 100000; commit;
begin; update my_table set new_id = id where id between 100001 and 200000; commit;
begin; update my_table set new_id = id where id between 200001 and 300000; commit;
begin; update my_table set new_id = id where id between 300001 and 400000; commit;
...
create unique index my_table_pk_idx on my_table(new_id);
begin;
alter table my_table drop constraint my_table_pk;
alter table my_table alter column new_id set default nextval('my_table_id_seq'::regclass);
update my_table set new_id = id where new_id is null;
alter table my_table add constraint my_table_pk primary key using index my_table_pk_idx;
alter table my_table drop column id;
alter table my_table rename column new_id to id;
commit;
Radek's solution looks great. I would add a comment if I had the reputation for it, but I just want to mention that if you are doing this you'll likely want to widen the sequence for the primary key as well.
ALTER SEQUENCE my_table_id_seq AS bigint;
If you just widen the column type, you'll still end up with problems when you hit 2 billion records if the sequence is still integer sized.
I think the issue that James points out about adding the primary key requiring a table scan can be solved with the NOT VALID/VALIDATE dance. Instead of doing alter table my_table add constraint my_table_pk primary key using index my_table_pk_idx;, you can do
ALTER TABLE my_table ADD UNIQUE USING INDEX my_table_pk_idx;
ALTER TABLE my_table ADD CONSTRAINT my_table_id_not_null CHECK (id IS NOT NULL) NOT VALID;
ALTER TABLE my_table VALIDATE CONSTRAINT my_table_id_not_null;
I think it's also worth mentioning that
create unique index my_table_pk_idx on my_table(new_id);
will do a full table scan with an exclusive lock on my_table. It is better to do
CREATE UNIQUE INDEX CONCURRENTLY ON my_table(new_id);
Merging both #radek-postołowicz and #ethan-pailes answers for a full concurrent solution, with some tweaks we get:
alter table my_table add column new_id bigint;
-- new records filling
CREATE FUNCTION public.my_table_fill_newid() RETURNS trigger
LANGUAGE plpgsql AS $$
BEGIN
new.new_id := new.id;
return new;
END;
$$;
CREATE TRIGGER my_table_fill_newid BEFORE INSERT ON my_table
FOR EACH ROW EXECUTE FUNCTION public.my_table_fill_newid();
-- old records filling
update my_table set new_id = id where id between 0 and 100000;
update my_table set new_id = id where id between 100001 and 200000;
update my_table set new_id = id where id between 200001 and 300000;
...
-- slow but concurrent part
create unique index concurrently my_table_pk_idx on my_table(new_id);
ALTER TABLE my_table ADD CONSTRAINT my_table_new_id_not_null
CHECK (new_id IS NOT NULL) NOT VALID; -- delay validate for concurrency
ALTER TABLE my_table VALIDATE CONSTRAINT my_table_new_id_not_null;
-- locking
begin;
ALTER TABLE my_table alter column new_id set not null; -- needed for pkey
ALTER TABLE my_table drop constraint my_table_new_id_not_null;
ALTER SEQUENCE my_table_id_seq AS bigint;
alter table my_table drop constraint my_table_pk;
alter table my_table add constraint my_table_pk primary key using index my_table_pk_idx;
alter table my_table drop column id;
alter table my_table rename column new_id to id;
drop trigger my_table_fill_newid on my_table;
commit;
I tried #radek-postołowicz's solution, but it failed for me because I needed to set the new_id column to NOT NULL, and that locks the table for a long time.
My solution:
1. Select the records from the old table and insert them into a new table my_table_new with id as bigint. Run this as a standalone transaction.
2. In another transaction: repeat step 1 for the records created in the meantime, drop my_table, and rename my_table_new to my_table.
The downside of this solution is that it auto-scaled the storage of my AWS RDS, and it could not be scaled back.
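The copy-and-swap steps above can be sketched roughly as follows. Table and column names are illustrative, and a real run would need to block writes during step 2 (here with LOCK TABLE) so no rows slip in between the catch-up and the rename:

```sql
-- Step 1: bulk copy into a wider table (standalone transaction)
CREATE TABLE my_table_new (LIKE my_table INCLUDING ALL);
ALTER TABLE my_table_new ALTER COLUMN id TYPE bigint;
INSERT INTO my_table_new SELECT * FROM my_table;

-- Step 2: catch up on rows created in the meantime, then swap names
BEGIN;
LOCK TABLE my_table IN ACCESS EXCLUSIVE MODE;
INSERT INTO my_table_new
    SELECT * FROM my_table t
    WHERE NOT EXISTS (SELECT 1 FROM my_table_new n WHERE n.id = t.id);
ALTER TABLE my_table RENAME TO my_table_old;
ALTER TABLE my_table_new RENAME TO my_table;
COMMIT;
DROP TABLE my_table_old;
```

The exclusive lock is only held for the short catch-up and the renames, which is what keeps the downtime small compared to rewriting the column in place.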
I'm trying to add and update a column. This code is within a transaction.
ALTER TABLE [Foo] ADD SomeId INT NULL
UPDATE [Foo] SET SomeId = 1
ALTER TABLE [Foo] ALTER COLUMN SomeId INT NOT NULL
I get this error:
Msg 207, Level 16, State 1, Line 5 Invalid column name 'SomeId'.
I tried adding a GO statement after the first ALTER TABLE, but apparently that's invalid inside a transaction. How can I make this work inside a transaction?
Try this:
Begin Try
    Begin Tran
        Alter Table [Foo] Add SomeId INT NOT NULL Constraint TempConstraint Default (1)
        Alter Table [Foo] Drop Constraint TempConstraint
    Commit Tran
End Try
Begin Catch
    Rollback Tran
End Catch
Essentially, this adds the new column with a default value constraint of 1, so every existing row gets the value 1; the default constraint is then dropped, leaving the column with no default. Your original batch fails because SQL Server compiles the whole batch before running it, and at compile time the UPDATE references a column that does not exist yet; adding the column with a default sidesteps the separate UPDATE entirely.
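An alternative sketch, if you do want the explicit UPDATE: wrap it in dynamic SQL so it is parsed only when EXEC runs, by which point the column exists. This assumes nothing beyond the statements in the question:

```sql
BEGIN TRAN;
ALTER TABLE [Foo] ADD SomeId INT NULL;
-- Dynamic SQL is compiled at EXEC time, after the column has been added
EXEC('UPDATE [Foo] SET SomeId = 1');
ALTER TABLE [Foo] ALTER COLUMN SomeId INT NOT NULL;
COMMIT TRAN;
```

This keeps all three steps in one transaction at the cost of hiding the UPDATE from compile-time checking, so typos in the dynamic string only surface at run time.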
This question already has answers here:
How to add identity to the column in SQL Server?
(4 answers)
Closed 8 years ago.
I have a table whose primary key is already set, and now I want that column to be auto-increment. The table has many records. Is it possible, and what is the fastest way to do it?
This takes some effort, because you cannot add the identity property to an existing column. However, you can work around it: first add a new column with the identity property:
ALTER TABLE dbo.Table_name
ADD ID INT IDENTITY
and then make ID the primary key like this:
ALTER TABLE dbo.Table_name
ADD CONSTRAINT PK_YourTable
PRIMARY KEY(ID)
And yes, you have to remove the old primary key constraint before performing the above steps, like this:
ALTER TABLE Table_name
DROP CONSTRAINT PK_Table1_Col1
EDIT:
From the source:
We can use ALTER TABLE...SWITCH to work around this by only modifying metadata. See Books Online for restrictions on using the SWITCH method presented below. The process is practically instant even for the largest tables.
USE tempdb;
GO
-- A table with an identity column
CREATE TABLE dbo.Source (row_id INTEGER IDENTITY PRIMARY KEY NOT NULL, data SQL_VARIANT NULL);
GO
-- Some sample data
INSERT dbo.Source (data)
VALUES (CONVERT(SQL_VARIANT, 4)),
(CONVERT(SQL_VARIANT, 'X')),
(CONVERT(SQL_VARIANT, {d '2009-11-07'})),
(CONVERT(SQL_VARIANT, N'áéíóú'));
GO
-- Remove the identity property
BEGIN TRY;
-- All or nothing
BEGIN TRANSACTION;
-- A table with the same structure as the one with the identity column,
-- but without the identity property
CREATE TABLE dbo.Destination (row_id INTEGER PRIMARY KEY NOT NULL, data SQL_VARIANT NULL);
-- Metadata switch
ALTER TABLE dbo.Source SWITCH TO dbo.Destination;
-- Drop the old object, which now contains no data
DROP TABLE dbo.Source;
-- Rename the new object to make it look like the old one
EXECUTE sp_rename N'dbo.Destination', N'Source', 'OBJECT';
-- Success
COMMIT TRANSACTION;
END TRY
BEGIN CATCH
-- Bugger!
IF XACT_STATE() <> 0 ROLLBACK TRANSACTION;
PRINT ERROR_MESSAGE();
END CATCH;
GO
-- Test that the identity property has indeed gone
INSERT dbo.Source (row_id, data)
VALUES (5, CONVERT(SQL_VARIANT, N'This works!'))
SELECT row_id,
data
FROM dbo.Source;
GO
-- Tidy up
DROP TABLE dbo.Source;