Oracle (v18/19) Trigger on Materialized View does not know about old values - database

In our tool we use triggers on materialized views in order to create log entries (and do some other things) when a transaction is committed.
The code works fine in Oracle 12. In Oracle 19 the old values in that trigger (":old") seem to be lost.
Investigations:
This seems to happen only in the combination of materialized views and triggers. If we put the same trigger on a table, the logs are generated correctly (but then we do not get the transaction awareness which is required).
I have created an MWE and added comments to the DBMS_OUTPUT lines describing what we see in Oracle 12 and Oracle 18/19:
/*Create Test-Table*/
CREATE TABLE MAT_VIEW_TEST (
PK number(10,0) PRIMARY KEY ,
NAME NVARCHAR2(50)
);
/*insert some values*/
insert into MAT_VIEW_TEST values (1, 'Herbert');
insert into MAT_VIEW_TEST values (2, 'Hubert');
commit;
/*Create materialized view (log) in order to set a trigger on it*/
CREATE MATERIALIZED VIEW LOG ON MAT_VIEW_TEST WITH PRIMARY KEY, ROWID including new values;
CREATE MATERIALIZED VIEW MV_MAT_VIEW_TEST
refresh fast on commit
AS select * from MAT_VIEW_TEST;
/*Create trigger to log old and new value*/
CREATE OR REPLACE TRIGGER MAT_VIEW_TRIGGER
BEFORE INSERT OR UPDATE
ON MV_MAT_VIEW_TEST
FOR EACH ROW
DECLARE
old_pk number(10,0);
new_pk number(10,0);
old_name NVARCHAR2(50);
new_name NVARCHAR2(50);
BEGIN
old_pk := :old.pk;
old_name := :old.name;
new_pk := :new.pk;
new_name := :new.name;
DBMS_OUTPUT.PUT_LINE('TEST BEGIN');
DBMS_OUTPUT.PUT_LINE('old p ' || old_pk); /*old is set in Oracle 12, but not in Oracle 18/19*/
DBMS_OUTPUT.PUT_LINE('old n ' || old_name); /*old is set in Oracle 12, but not in Oracle 18/19*/
DBMS_OUTPUT.PUT_LINE('new p ' || new_pk); /*new is set correctly*/
DBMS_OUTPUT.PUT_LINE('new n ' || new_name); /*new is set correctly*/
DBMS_OUTPUT.PUT_LINE('TEST END');
END;
/
/*test the log*/
update MAT_VIEW_TEST set name = 'Test' where pk = 1;
commit;
Any ideas what changed in Oracle, or what we could do to get the old values in our trigger?

I don't have a 12c instance to rerun your tests, but I did run them on 21c, and with the trigger you show, the old values are never shown: neither on insert (as expected) nor on update (which is what you're complaining about). When I changed the trigger to fire on 'insert or update or delete' and reran an update, I could see the old values. So the refresh process is converting your UPDATE into a DELETE/INSERT pair, hence the old values appear when it deletes the old row.
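In other words, once DELETE is added to the trigger's firing events, the pre-update values become visible in the DELETE branch. A minimal sketch of the adjusted trigger (same MWE objects as above; the DELETING/INSERTING branches are my addition, not something the answer spells out):
CREATE OR REPLACE TRIGGER MAT_VIEW_TRIGGER
BEFORE INSERT OR UPDATE OR DELETE
ON MV_MAT_VIEW_TEST
FOR EACH ROW
BEGIN
  IF DELETING THEN
    /*the fast refresh turns the base-table UPDATE into DELETE + INSERT, so the pre-update values arrive here*/
    DBMS_OUTPUT.PUT_LINE('old p ' || :old.pk);
    DBMS_OUTPUT.PUT_LINE('old n ' || :old.name);
  END IF;
  IF INSERTING OR UPDATING THEN
    DBMS_OUTPUT.PUT_LINE('new p ' || :new.pk);
    DBMS_OUTPUT.PUT_LINE('new n ' || :new.name);
  END IF;
END;
/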

Related

SP2-0552: Bind variable "NEW" not declared and END Error report - Unknown Command

I have to write a trigger for the tables I made: on insert or update, it has to record the affected rows in a separate log table.
The columns in the log table will be:
Done_process (will contain 'Update' or 'Insert')
Person (student number of the person affected)
Before (previous value for an update, blank for an insert)
After (new value for an update, new value for an insert)
This is my student_info table,
CREATE TABLE student_info (
school_id NUMBER,
id_no NUMBER NOT NULL UNIQUE,
name VARCHAR2(50) NOT NULL,
surname VARCHAR2(50) NOT NULL,
city VARCHAR2(50) NOT NULL,
birth_date DATE NOT NULL,
CONSTRAINT student_info_pk PRIMARY KEY(school_id )
);
CREATE TABLE og_log(
done_process VARCHAR2(30),
person VARCHAR2(30),
before VARCHAR2(30),
after VARCHAR2(30)
);
CREATE OR REPLACE TRIGGER og_trigger
BEFORE INSERT OR UPDATE OR DELETE ON student_info
REFERENCING OLD AS OLD NEW AS NEW
FOR EACH ROW
ENABLE
DECLARE
BEGIN
IF INSERTING THEN
INSERT INTO og_log(done_process, person, before, after)
VALUES ('Insert',:new.school_id,:old.name,:new.name);
ELSIF UPDATING THEN
INSERT INTO og_log(done_process, person, before, after)
VALUES ('Update',:new.school_id,:old.name,:new.name);
END IF;
END;
/
When I try to run the code, it gives the following error:
> Trıgger OG_TRIGGER created.
>
>
> Error starting at line : 280 in command - ELSIF UPDATING THEN Error
> report - Unknown Command
>
> SP2-0552: Bind variable "NEW" not declared.
>
> 0 rows inserted.
>
>
> Error starting at line : 283 in command - END IF Error report -
> Unknown Command
>
> SP2-0044: For a list of known commands enter HELP and to leave enter
> EXIT.
>
> Error starting at line : 284 in command - END Error report - Unknown
> Command
I believe you are creating this trigger for learning purposes rather than for a real use case, because what the trigger does wouldn't make much sense otherwise.
The trigger you mention does not compile due to syntactical problems such as where v_id := 20201033.
A WHERE clause compares values, so you should use = instead of :=, which is an assignment operator.
Besides this problem, a few points still need to be taken care of:
Use an explicit convention for naming local variables. For example, you created a local variable v_id while a column of the same name exists in the student_info table. It is not a problem in this case, but it is good practice to make local variables distinctive, say l_v_id.
You have used a SELECT statement inside the trigger, which can raise NO_DATA_FOUND; handle it either in an exception section or by using an aggregate function such as MAX() (assuming v_id is the primary key). I am not sure why you need this SELECT at all (you could combine old and new with something like COALESCE(:old.school_id, :new.school_id) if I understood you correctly), but I will leave that for you to decide and act accordingly.
Considering the above points, the final code will be:
CREATE OR REPLACE TRIGGER og_trigger
BEFORE INSERT OR UPDATE OR DELETE ON student_info
REFERENCING OLD AS OLD NEW AS NEW
FOR EACH ROW
ENABLE
DECLARE
BEGIN
IF INSERTING THEN
INSERT INTO og_log(done_process, person, before, after)
VALUES ('Insert',:new.school_id,:old.city,:new.city);
ELSIF UPDATING THEN
INSERT INTO og_log(done_process, person, before, after)
VALUES ('Update',:new.school_id,:old.city,:new.city);
END IF;
END;
/
Find a demo at db<>fiddle.
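To see the trigger in action, statements along these lines (hypothetical student data) should leave one log row per operation:
/*insert a student, then update the city*/
INSERT INTO student_info VALUES (20201033, 12345678901, 'Ali', 'Veli', 'Ankara', DATE '2001-05-17');
UPDATE student_info SET city = 'Izmir' WHERE school_id = 20201033;
/*expected: ('Insert', 20201033, NULL, 'Ankara') and ('Update', 20201033, 'Ankara', 'Izmir')*/
SELECT done_process, person, before, after FROM og_log;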
EDITED: possibly a tool issue
I suspect the issue is with how the SQL Developer tool is being used; as one last attempt, please try the following.
Step 1:
Drop both tables by issuing DROP commands:
drop table STUDENT_INFO;
drop table og_log;
Step 2:
Open another SQL worksheet using Alt+F10 and do as shown in the following image. Please try it and let me know.

fatfreeframework with SQL Server database using mapper copyFrom method with partial insert

I am attempting to insert a record using the copyFrom('POST') and save() methods of fatfreeframework v3.5. The data from POST does not contain an id field, which for this table is an autoincrement column. The SQL from the logs is:
SET IDENTITY_INSERT [xrefs] ON;
INSERT INTO [xrefs] ([status], [supply_id], [description], [unit], [unitcost], [cap], [rev], [buq])
VALUES ('test', 'Htest', 'test', 'test', '1', '1', 1, 1)
As you can see, fatfree is adding SET IDENTITY_INSERT even though no id column is included in the insert. Is there a way to tell mapper not to set this flag? Or is there another workaround? I could get the current max ID and then insert +1, but that seems clunky.
I should add that this SQL fails because the id column is not included in the column list.
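For context, SQL Server requires an explicit value for the identity column whenever IDENTITY_INSERT is ON, which is why turning it on and then omitting the column fails. A minimal sketch of the two valid forms (same table as above; the id value 42 is made up):
-- either: let the identity generate itself, with no IDENTITY_INSERT toggle
INSERT INTO [xrefs] ([status], [supply_id], [description], [unit], [unitcost], [cap], [rev], [buq])
VALUES ('test', 'Htest', 'test', 'test', '1', '1', 1, 1);
-- or: supply the id explicitly, which is the only case where IDENTITY_INSERT ON is needed
SET IDENTITY_INSERT [xrefs] ON;
INSERT INTO [xrefs] ([id], [status], [supply_id], [description], [unit], [unitcost], [cap], [rev], [buq])
VALUES (42, 'test', 'Htest', 'test', 'test', '1', '1', 1, 1);
SET IDENTITY_INSERT [xrefs] OFF;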
$this->db->exec(
(preg_match('/mssql|dblib|sqlsrv/',$this->engine) &&
array_intersect(array_keys($pkeys),$ckeys)?
'SET IDENTITY_INSERT '.$this->table.' ON;':'').
'INSERT INTO '.$this->table.' ('.$fields.') '.
'VALUES ('.$values.')',$args
);
This is the code that sets IDENTITY_INSERT in mapper.php's insert function.
$this->logger->write( 'xrefs schema:'.
json_encode( $this->tongpodb->schema( 'xrefs' ) ) );
Calling schema() on the db object gives back this array:
{"id":{"type":"int","pdo_type":1,"default":null,"nullable":false,"pkey":true},"changed_date":{"type":"datetime","pdo_type":2,"default":null,"nullable":true,"pkey":false},"status":{"type":"varchar","pdo_type":2,"default":null,"nullable":false,"pkey":false},"supply_id":{"type":"varchar","pdo_type":2,"default":null,"nullable":false,"pkey":true},"description":{"type":"varchar","pdo_type":2,"default":null,"nullable":true,"pkey":false},"unit":{"type":"varchar","pdo_type":2,"default":null,"nullable":false,"pkey":false},"hcpcs":{"type":"char","pdo_type":2,"default":null,"nullable":true,"pkey":false},"unitcost":{"type":"decimal","pdo_type":2,"default":null,"nullable":false,"pkey":false},"cap":{"type":"decimal","pdo_type":2,"default":null,"nullable":false,"pkey":false},"rev":{"type":"smallint","pdo_type":1,"default":null,"nullable":false,"pkey":false},"buq":{"type":"smallint","pdo_type":1,"default":null,"nullable":true,"pkey":false},"create_ts":{"type":"datetime","pdo_type":2,"default":null,"nullable":true,"pkey":false},"log_ts":{"type":"int","pdo_type":1,"default":null,"nullable":true,"pkey":false},"filename":{"type":"varchar","pdo_type":2,"default":null,"nullable":true,"pkey":false},"line_no":{"type":"smallint","pdo_type":1,"default":null,"nullable":true,"pkey":false},"file_ts":{"type":"datetime","pdo_type":2,"default":null,"nullable":true,"pkey":false}}
As you can see, id has a "pkey":true entry, so one could look at the fields from POST, compare them against this schema, and determine whether IDENTITY_INSERT needs to be set. Perhaps I will implement this, though I worry it is above my pay grade.
Updating to the latest version of fatfree fixed this issue.

How to apply a cached update FDQuery using Delphi FireDAC with a UNIQUE constraint on the database

I have a problem resolving cached updates when the delta includes fields that have a UNIQUE constraint on the database. I have a database with the following DDL schema (SQLite in memory can be used to reproduce):
create table FOO
(
ID integer primary key,
DESC char(2) UNIQUE
);
The initial database table contains one record with ID = 1 and DESC = R1.
Accessing this table with a TFDQuery (select * from FOO), if the following steps are performed, the generated delta will be correctly applied with ApplyUpdates:
Update record ID = 1 to DESC = R2
Append a new record ID = 2 with DESC = R1
Delta includes the following:
R2
R1
No error will be generated on ApplyUpdates, because the first operation in the delta will be an update and the second an insert. As record 1 is now R2, the insert can be done because there is no violation of the unique constraint in this transaction.
Now, performing the following steps will generate exactly the same delta (look at the FDQuery.Delta property), but a UNIQUE constraint violation will be raised.
Append a new temporary record ID = 2 with DESC = TT
Update the first record ID = 1 to DESC = R2
Update the temporary record 2 - TT to DESC = R1
Delta includes the following:
R2
R1
Note that FireDAC generates the same delta in both scenarios; this can be verified through the FDQuery's Delta property.
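Reduced to raw SQL, the ordering sensitivity looks like this (my own illustration against the same FOO table, not FireDAC's actual generated statements; each block assumes the starting state):
-- starting state: FOO contains (1, 'R1')
-- order that succeeds: the update frees the value 'R1' before it is reused
UPDATE FOO SET "DESC" = 'R2' WHERE ID = 1;
INSERT INTO FOO (ID, "DESC") VALUES (2, 'R1');
-- order that fails: 'R1' is still held by ID = 1 when the insert runs
INSERT INTO FOO (ID, "DESC") VALUES (2, 'R1');  -- UNIQUE constraint violation
UPDATE FOO SET "DESC" = 'R2' WHERE ID = 1;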
These steps can be used to reproduce the error:
File > New VCL Forms Application; drop an FDConnection and an FDQuery on the form; set the FDConnection to use the SQLite driver (with an in-memory database); drop two buttons on the form, one to reproduce the correct behavior and another to reproduce the error, as follows:
Button OK:
procedure TFrmMain.btnOkClick(Sender: TObject);
begin
// create the default database with a FOO table
con.Open();
con.ExecSQL('create table FOO' + '(ID integer primary key, DESC char(2) UNIQUE)');
// insert a default record
con.ExecSQL('insert into FOO values (1,''R1'')');
qry.CachedUpdates := true;
qry.Open('select * from FOO');
// update the first record to T2
qry.First();
qry.Edit();
qry.Fields[1].AsString := 'R2';
qry.Post();
// append the second record to T1
qry.Append();
qry.Fields[0].AsInteger := 2;
qry.Fields[1].AsString := 'R1';
qry.Post();
// apply will not generate a unique constraint violation
qry.ApplyUpdates();
end;
Button Error:
// create the default database with a FOO table
con.Open();
con.ExecSQL('create table FOO' + '(ID integer primary key, DESC char(2) UNIQUE)');
// insert a default record
con.ExecSQL('insert into FOO values (1,''R1'')');
qry.CachedUpdates := true;
qry.Open('select * from FOO');
// append a temporary record (TT)
qry.Append();
qry.Fields[0].AsInteger := 2;
qry.Fields[1].AsString := 'TT';
qry.Post();
// update R1 to R2
qry.First();
qry.Edit();
qry.Fields[1].AsString := 'R2';
qry.Post();
qry.Next();
// update TT to R1
qry.Edit();
qry.Fields[1].AsString := 'R1';
qry.Post();
// apply will generate a unique constraint violation
qry.ApplyUpdates();
Update: Since writing the original version of this answer, I've done some more investigation and am beginning to think that either there is a problem with ApplyUpdates, etc., in FireDAC's support for Sqlite (in Seattle, at least), or we are not using the FD components correctly. It would need FireDAC's author (who is a contributor here) to say which it is.
Leaving aside the ApplyUpdates business for a moment, there are a number of other problems with your code, namely that your dataset navigation makes assumptions about the ordering of the rows in qry and the numbering of its Fields.
The test case I have used is to start (before execution of the application) with the Foo table containing the single row
(1, 'R1')
Then, I execute the following Delphi code while monitoring the contents of Foo with an external application (the Sqlite Manager plug-in for Firefox). The code executes without an error being reported in the application, but notice that it does not call ApplyUpdates.
Con.Open();
Con.StartTransaction;
qry.Open('select * from FOO');
qry.InsertRecord([2, 'TT']);
assert(qry.Locate('ID', 1, []));
qry.Edit;
qry.FieldByName('DESC').AsString := 'R2';
qry.Post;
assert(qry.Locate('ID', 2, []));
qry.Edit;
qry.FieldByName('DESC').AsString := 'R1';
qry.Post;
Con.Commit;
qry.Close;
Con.Close;
The added row (ID = 2) is not visible to the external application until after Con.Close has executed, which I find puzzling. Once Con.Close has been called, the external application shows Foo as containing
(1, 'R2')
(2, 'R1')
However, I have been unable to avoid the constraint violation error if I call ApplyUpdates, regardless of any other changes I make to the code, including adding a call to ApplyUpdates after the first Post.
So, it seems to me that either the operation of ApplyUpdates is flawed or it is not being used correctly.
I mentioned FireDAC's author. His name is Dmitry Arefiev and he has answered a lot of FireDAC questions on SO, though I haven't noticed him here in the past couple of months or so. You might try catching his attention by posting in EMBA's FireDAC NG forum, https://forums.embarcadero.com/forum.jspa?forumID=502.

Error update trigger after new row has inserted into same table

I want to update OrigOrderNbr and OrigOrderType on the QT order, because when the QT order is first created both columns are NULL. After the S2 order is created (the QT is converted to S2), the S2 order's OrigOrderType and OrigOrderNbr reference the QT order. In addition to that, I want the QT order to be updated with a reference to the S2 order as well.
http://i.stack.imgur.com/6ipFa.png
http://i.stack.imgur.com/E6qzT.png
CREATE TRIGGER tgg_SOOrder
ON dbo.SOOrder
FOR INSERT
AS
DECLARE @tOrigOrderType char(2),
@tOrigOrderNbr nvarchar(15)
SELECT @tOrigOrderType = i.OrderType,
@tOrigOrderNbr = i.OrderNbr
FROM inserted i
UPDATE dbo.SOOrder
SET OrigOrderType = @tOrigOrderType,
OrigOrderNbr = @tOrigOrderNbr
FROM inserted i
WHERE dbo.SOOrder.CompanyID='2'
and dbo.SOOrder.OrderType=i.OrigOrderType
and dbo.SOOrder.OrderNbr=i.OrigOrderNbr
GO
After I run that trigger, it shows the message 'Error #91: Another process has updated 'SOOrder' record. Your changes will be lost.'.
Per a long string of comments, including some excellent suggestions in regards to proper trigger writing techniques by @marc_s and @Damien_The_Unbeliever, as well as my better understanding of your issue at this point, here's the re-worked trigger:
CREATE TRIGGER tgg_SOOrder
ON dbo.SOOrder
FOR INSERT
AS
--Update QT record with S2 record's order info
UPDATE SOOrder
SET OrigOrderType = 'S2'
, OrigOrderNbr = i.OrderNbr
FROM SOOrder dest
JOIN inserted i
ON dest.OrderNbr = i.OrigOrderNbr
WHERE dest.OrderType = 'QT'
AND i.OrderType = 'S2'
AND dest.CompanyID = 2 --Business logic constraint
AND dest.OrigOrderNbr IS NULL
AND dest.OrigOrderType IS NULL
Basically, the idea is to update any record of type "QT" once a matching record of type "S2" is created. Matching here means that OrigOrderNbr of S2 record is the same as OrderNbr of QT record. I kept your business logic constraint in regards to CompanyID being set to 2. Additionally, we only care to modify QT records that have OrigOrderNbr and OrigOrderType set to NULL.
This trigger does not rely on a single-row insert; it will work regardless of the number of rows inserted - which is far less likely to break down the line.
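As a quick sanity check after converting a quote (hypothetical order numbers; SOOrder's full column set is Acumatica's, so this is only a sketch of the verification query):
-- the QT row should now carry the back-reference filled in by the trigger
SELECT OrderType, OrderNbr, OrigOrderType, OrigOrderNbr
FROM dbo.SOOrder
WHERE CompanyID = 2
AND OrderType = 'QT'
AND OrderNbr = 'Q000001';
-- expected: OrigOrderType = 'S2', OrigOrderNbr = the new S2 order's number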

Why do triggers try to insert a NULL value when using a field from the 'inserted' table?

I have to sync changes done in MSSQL to a remote MySQL database. The changes to be synced are invoices and users being added to the system. The remote server is not expected to always be reachable, so I'm trying to set up a kind of log table for storing the changes done in MSSQL.
Here is a fully working trigger for that:
CREATE TRIGGER [dbo].[dokument_insert]
ON [dbo].[dokument]
AFTER INSERT
AS
BEGIN
SET NOCOUNT ON;
INSERT INTO [bcg_ekodu].[dbo].[sync_stack] (event,sql, table_name, import_priority)
SELECT
'INSERT',
'INSERT INTO bills SET
date = "'+CONVERT(VARCHAR(19),dok_kuup,120)+'",
total = "'+CAST(kokkusum AS nvarchar)+'",
number = "'+RTRIM(dok_nr)+'",
created = "'+CONVERT(VARCHAR(19),savetime,120)+'",
rounded = "'+CAST(ymardus AS nvarchar)+'",
currency = "'+CAST(valuuta AS nvarchar)+'",
due_date = "'+CONVERT(VARCHAR(19),tasupaev,120)+'",
pk_joosep = "'+CAST(dok_kood AS nvarchar)+'",
joosep_hankija = "'+CAST(hankija AS nvarchar)+'";
UPDATE
bills, users, companies
SET
bills.user_id = users.id,
bills.imported = NOW()
WHERE
bills.imported IS NULL
AND companies.id = users.company_id
AND companies.pk_joosep = 10
AND bills.user_id = users.pk_joosep',
'bills',
'200'
FROM inserted
END
It inserts a row into the 'sync_stack' table every time a row is inserted into the 'dokument' table. The 'sql' column contains the SQL to create the same kind of row in another (MySQL) database.
But this trigger is not working:
CREATE TRIGGER [dbo].[klient_insert]
ON [dbo].[klient]
AFTER INSERT
AS
BEGIN
SET NOCOUNT ON;
INSERT INTO [bcg_ekodu].[dbo].[sync_stack] (event,sql, table_name, import_priority)
SELECT
'INSERT',
'INSERT INTO users SET
username =10'+CAST(kl_kood as nvarchar)+',
password = NULL,
name ="'+LTRIM(RTRIM(kl_nimi))+'",
email ="'+CAST(LTRIM(RTRIM(kl_email)) as nvarchar)+'",
reference_no ="'+CAST(LTRIM(RTRIM(kl_viide)) as nvarchar)+'",
phone ="'+CAST(LTRIM(RTRIM(kl_tel1)) as nvarchar)+'",
logins ="'+CAST(0 as nvarchar)+'",
last_login = NULL,
created ="'+CONVERT(VARCHAR(19),savetime,120)+'",
updated = NULL,
deleted ="0",
address ="'+CAST(LTRIM(RTRIM(kl_aadr1)) as nvarchar)+'",
pk_joosep ="'+CAST(kl_kood as nvarchar)+'"',
'users',
'210'
FROM inserted
END
While the execution of the above SQL to create that trigger completes just fine, when I try to insert some rows into the triggered table, I get the following error:
No row was updated.
The data in row 175 was not committed.
Error Source: .Net SqlClient Data Provider.
Error Message: Cannot insert the value NULL into column 'sql', table 'mydb.dbo.sync_stack'; column does not allow nulls. INSERT fails.
The statement has been terminated.
Correct the errors and retry or press ESC to cancel the change(s).
If I delete this trigger, this error does not occur.
If I insert just plain text for the 'sql' column, it works as expected.
If I use any field from the inserted row, even just a text field, it fails again.
If I allow NULL values in the 'sql' column, inserting rows succeeds but I get a NULL value in the 'sql' column.
How can I make the second trigger work as expected, too?
I suspect that at least one of the values from inserted that you are concatenating into your SQL statement is NULL. You can circumvent this by using COALESCE, e.g.
username =10'+COALESCE(CAST(kl_kood as nvarchar), '')+',
Of course you shouldn't be declaring nvarchar without specifying a length, right?
Bad habits to kick: declaring VARCHAR without (length)
Concatenating any value with NULL yields NULL:
select 'test' + NULL
This results in NULL. You should use something like this for your columns:
select isnull(column, '')
This results in an empty string.
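Applied to the failing trigger, the fragile parts of the concatenation could be wrapped accordingly (a sketch; the nvarchar lengths are assumptions, since the question declares them without one):
username =10'+ISNULL(CAST(kl_kood as nvarchar(30)), '')+',
email ="'+ISNULL(CAST(LTRIM(RTRIM(kl_email)) as nvarchar(100)), '')+'",
address ="'+ISNULL(CAST(LTRIM(RTRIM(kl_aadr1)) as nvarchar(200)), '')+'",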
