using :new and :old referencing different tables in SQL - database

Revising for a uni exam. A question states:
Write an SQL command to create a trigger in table Permission. The
trigger should add one to the numberOfPermissions in table File for a
file, after each time a new permission row is entered into table
Permission with that file’s name.
here's a list of the tables provided
I've got everything down except one line, the WHERE line. How do I reference the :new value from a different table? It needs to read the new fileName value coming from the Permission table, but I'm not sure how to do that. I've tried variations such as :Permission.new.fileName, but I always get an unspecified error around the "." point.
CREATE TRIGGER newTrig
AFTER INSERT ON Permission
BEGIN
UPDATE File
SET numberOfPermissions = numberOfPermissions+1
WHERE File.name = :new.fileName
END;

When I run your trigger creation code in this db fiddle, it gives me:
ORA-04082: NEW or OLD references not allowed in table level triggers
The problem is that you have omitted the FOR EACH ROW option in the declaration of the trigger. Because of that, Oracle assumes you want a table-level trigger, which is executed once per statement (whereas a row-level trigger is executed once for each row inserted).
Since a single statement can insert multiple rows (e.g. INSERT INTO ... SELECT ...), Oracle does not allow you to access the :NEW and :OLD references in table-level triggers.
Adding the FOR EACH ROW option to your trigger definition makes it a row-level trigger, which is allowed to access the :NEW and :OLD references.
CREATE TRIGGER newTrig
AFTER INSERT ON Permission
FOR EACH ROW
BEGIN
UPDATE File
SET numberOfPermissions = numberOfPermissions+1
WHERE File.name = :new.fileName;
END;
PS: you were also missing a semicolon at the end of the UPDATE statement.
Unrelated PS :
> create table USER ( x number);
ORA-00903: invalid table name
> create table FILE ( x number);
ORA-00903: invalid table name
=> It is usually not a good idea to create tables whose names are reserved words; it can lead to tricky errors.
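The corrected trigger can be exercised end to end. The sketch below uses SQLite through Python's sqlite3 rather than Oracle (an assumption made purely to get a runnable example: SQLite writes NEW without the colon and its triggers are row-level by default), with the table and column names taken from the question:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE File (name TEXT PRIMARY KEY,
                   numberOfPermissions INTEGER NOT NULL DEFAULT 0);
CREATE TABLE Permission (userName TEXT, fileName TEXT REFERENCES File(name));

-- Row-level trigger: bump the counter of the file named in each inserted row.
CREATE TRIGGER newTrig AFTER INSERT ON Permission
BEGIN
    UPDATE File
    SET numberOfPermissions = numberOfPermissions + 1
    WHERE File.name = NEW.fileName;
END;
""")

conn.execute("INSERT INTO File (name) VALUES ('report.txt')")
conn.executemany("INSERT INTO Permission VALUES (?, ?)",
                 [('alice', 'report.txt'), ('bob', 'report.txt')])
print(conn.execute("SELECT numberOfPermissions FROM File "
                   "WHERE name = 'report.txt'").fetchone()[0])  # 2
```

Because the trigger is row-level, the two-row `executemany` bumps the counter twice, which is exactly the behavior the FOR EACH ROW option buys you in Oracle.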

A row-level trigger knows which row fired it, so you can use the :NEW pointer to read the inserted values (in an INSERT trigger, :OLD is null, so :NEW is the one to use here).
Try something like this:
CREATE OR REPLACE TRIGGER newTrig
AFTER INSERT ON Permission
FOR EACH ROW
BEGIN
UPDATE File
SET numberOfPermissions = numberOfPermissions+1
WHERE name = :new.fileName;
END;


Temp table dropping immediately after query finishes

If I highlight and then execute lines 1-4, I get the output
Commands completed successfully.
Then, if I highlight and execute lines 6-14, I get the error message
Invalid object name '#TestThis'
If I highlight and execute lines 1-16, I can see the one row of data returned. Why would a temp table that was just created (in the same session) immediately be dropped/invalid right after the code was executed? We're running this on an Azure-based SQL Server.
If the session remains alive, the temporary table should still exist and be accessible. Make sure you are executing the create statement and the subsequent ones in the same session, and that you are not getting a disconnection message in between.
Make sure you have the "Disconnect after the query executes" check in SSMS OFF.
If it still fails, do this check:
Create your temporary table, and keep the session alive (don't close the tab or disconnect it):
CREATE TABLE #TestThis (oldvalue INT, newvalue INT)
On a different session, query tempdb like the following:
SELECT * FROM tempdb.sys.tables WHERE [name] LIKE N'#TestThis%'
You should be able to see the temporary table created on the other session, starting with the same name and getting a bunch of underscores and some numbers at the end. This means the table still exists and you should be able to access it from the original session.
If you open a 3rd session and create the same temporary table, 2 of these should be listed in the tempdb query.
@Matthew Walk, you can try the solution shown in the example below.
CREATE TABLE #TestThis(oldvalue INT, newvalue INT )
INSERT INTO #TestThis(oldvalue, newvalue) VALUES (1,3),(5,7)
select oldvalue, newvalue FROM #TestThis
IF OBJECT_ID('tempdb..#TestThis') IS NOT NULL DROP TABLE #TestThis
In this example you check whether the object ID for your temp table is null; if it is not null, you drop the temp table.
I hope this helps you.
A new session can be created every time you click the execute icon (depending on your connection settings), so if you don't also highlight the "create table" part, the table won't exist.
You can either remove the hash (#) in front of the table name if you want the table to stay in the db until you drop it, or highlight the create table part every time you click "execute", whichever fits your needs better.
"#Temp" table is session level. If the database connection or session closed, the object "#TestThis" will be deleted.
You must keep the session going.
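The session scoping itself is easy to demonstrate. The sketch below uses SQLite through Python's sqlite3 rather than SQL Server (an assumption made to keep the example runnable), but the rule is the same: a temp table is visible only to the session that created it.

```python
import sqlite3

# Two connections to the same database play the role of two sessions.
# (A shared-cache in-memory database stands in for the real server.)
session1 = sqlite3.connect("file::memory:?cache=shared", uri=True)
session2 = sqlite3.connect("file::memory:?cache=shared", uri=True)

session1.execute("CREATE TEMP TABLE TestThis (oldvalue INTEGER, newvalue INTEGER)")
session1.execute("INSERT INTO TestThis VALUES (1, 3)")

# The creating session still sees the table...
print(session1.execute("SELECT * FROM TestThis").fetchall())  # [(1, 3)]

# ...but the other session does not.
try:
    session2.execute("SELECT * FROM TestThis")
except sqlite3.OperationalError as e:
    print(e)  # no such table: TestThis
```

This is why a reconnect between executions (e.g. the MFA issue mentioned at the end) makes the table "vanish": the second execution is a different session.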
You should first highlight and execute lines 1-4 to create the temp table #TestThis if it does not exist.
CREATE TABLE #TestThis
(
oldvalue INTEGER,
newvalue INTEGER)
Then you can execute lines 6-14. If you don't create the temp table first, there is nothing to insert the data into.
Now you can execute:
INSERT INTO #TestThis
(
oldvalue,
newvalue
)
VALUES
( 1234,
7788
)
Or
SELECT * FROM #TestThis
Hope this helps.
It turns out the cause of this issue was using MFA instead of Active Directory password authentication. Once the connection was switched to Active Directory password, the temp tables were created, accessible, and persisted as expected.

Use Sybase triggers to write dynamic statement using all old and new values for creating your own replication transaction statement log?

PROBLEM SUMMARY
I have to write I/U/D-statement-generating triggers for a bucardo/SymmetricDS-inspired, homemade bidirectional replication system between groups of Sybase ADS and PostgreSQL 11 nodes. The plan is to use BEFORE triggers on any PostgreSQL or Sybase DB to capture the command entered against a replicating source table, e.g. INSERT INTO PERSON (first_name, last_name, gender, age, ethnicity) VALUES ('John', 'Doe', 'M', 42, 'C'), and turn it into a corresponding INSERT statement for the destination; for UPDATE, the OLD and NEW values are captured to build an UPDATE statement dynamically, and for DELETE the OLD values are captured to build a DELETE command. The saved commands are then run against a destination, one at a time, at some interval.
I know this is difficult and that nobody normally does this, but it is for a job, I have no other options, and I cannot push back or propose a different solution. I have no teammates or other human resources to help, outside of SO and something like Codementors, which was not so helpful. My idea/strategy is to copy parts of bucardo/SymmetricDS by recording the OLD and NEW values needed to generate a statement/command to run on the destination. Right now I am snapshotting the whole table to a CSV rather than working command by command, but capturing and saving commands per statement, and looping through the table that stores them, will make the job much easier.
One big issue is that the tables come from Sybase ADS with a mixed key/index structure (many tables have NO PK), and that structure is mirrored in PostgreSQL, so I am trying to write PK-less statements, i.e. all-column commands, to work around the no-PK tables. Also, only certain columns of certain tables are replicated, so there is a table with a column where the column names are stored delimited by ';'; I split that into an array and link the column names to the values for each statement in order to generate a full I/U/D command, hopefully. I am open to other strategies, but this is a big solo project and I have attempted it many ways with much difficulty.
I mostly come from a DBA background and have some programming experience with the fundamentals, so I am mostly pseudocoding each major sequence, googling for syntax piece by piece, and adjusting as I go or when I hit a language limitation. I am thankful for any help given, as I am getting a bit desperate and discouraged.
WHAT I HAVE TRIED
I have to do this for both Sybase ADS and PostgreSQL, but this question is initially about ADS, since it is more challenging and older.
The goal on both platforms is to have one "Log" table which tracks row changes for each of the replicating tables and ultimately generates a command dynamically. I am trying to write trigger statements like:
CREATE TRIGGER PERSON_INSERT
ON PERSON
BEFORE
INSERT
BEGIN
INSERT INTO Backlog (SourceTableID, TriggerType, Status, CreateTimeDate, NewValues) SELECT ID, 'INSERT', 'READY', NOW(), 'first_name;last_name;gender;age;ethnicity' FROM __new;
END;
CREATE TRIGGER PERSON_UPDATE
ON PERSON
BEFORE
UPDATE
BEGIN
INSERT INTO Backlog (SourceTableID, TriggerType, Status, CreateTimeDate, NewValues) SELECT ID, 'UPDATE', 'READY', NOW(), 'first_name;last_name;gender;age;ethnicity' FROM __new;
UPDATE Backlog SET OldValues = (SELECT 'first_name;last_name;gender;age;ethnicity' FROM __old) WHERE SourceTableID = (SELECT ID FROM __old);
END;
CREATE TRIGGER PERSON_DELETE
ON PERSON
BEFORE
DELETE
BEGIN
INSERT INTO Backlog (SourceTableID, TriggerType, Status, CreateTimeDate, OldValues) SELECT ID, 'DELETE', 'READY', NOW(), 'first_name;last_name;gender;age;ethnicity' FROM __old;
END;
but I would like the 'first_name;last_name;gender;age;ethnicity' part to come from another table as a value, to make it dynamic, since multiple tables will write their value and statement info to the single log table. It can then be loaded into a variable and split, so the column names can be linked to the corresponding values and the I/U/D statements can be built and executed on the destination one at a time.
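As a runnable sketch of the log-table idea, here it is in SQLite through Python's sqlite3 (an assumed stand-in for ADS/PostgreSQL). The Backlog columns follow the question; the column list is still hardcoded in each trigger, which is exactly the part the question wants to pull from a metadata table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE PERSON (ID INTEGER PRIMARY KEY, first_name TEXT, last_name TEXT,
                     gender TEXT, age INTEGER, ethnicity TEXT);
CREATE TABLE Backlog (SourceTableID INTEGER, TriggerType TEXT, Status TEXT,
                      CreateTimeDate TEXT, NewValues TEXT, OldValues TEXT);

-- Concatenate the NEW values with ';' into one log column.
CREATE TRIGGER PERSON_INSERT AFTER INSERT ON PERSON
BEGIN
    INSERT INTO Backlog (SourceTableID, TriggerType, Status, CreateTimeDate, NewValues)
    VALUES (NEW.ID, 'INSERT', 'READY', datetime('now'),
            NEW.first_name || ';' || NEW.last_name || ';' || NEW.gender
            || ';' || NEW.age || ';' || NEW.ethnicity);
END;

-- Same pattern with the OLD values for deletes.
CREATE TRIGGER PERSON_DELETE AFTER DELETE ON PERSON
BEGIN
    INSERT INTO Backlog (SourceTableID, TriggerType, Status, CreateTimeDate, OldValues)
    VALUES (OLD.ID, 'DELETE', 'READY', datetime('now'),
            OLD.first_name || ';' || OLD.last_name || ';' || OLD.gender
            || ';' || OLD.age || ';' || OLD.ethnicity);
END;
""")

conn.execute("INSERT INTO PERSON VALUES (1, 'John', 'Doe', 'M', 42, 'C')")
conn.execute("DELETE FROM PERSON WHERE ID = 1")
for row in conn.execute("SELECT TriggerType, NewValues, OldValues FROM Backlog"):
    print(row)
```

Making the `NEW.first_name || ';' || ...` expression come from a metadata table is the dynamic part, which a static trigger body cannot do by itself; that is where the EXECUTE IMMEDIATE approach in the answer below this question comes in.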
ATTEMPTED INCOMPLETE SAMPLE TRIGGER CODE
CREATE TRIGGER PERSON_INSERT
ON PERSON
BEFORE
INSERT
BEGIN
--Declare #Columns string
--#Columns=select Columns from metatable where tablename='PERSON'
--String Split(#Columns,';') into array to correspond to new and old VALUES
--#NewValues=#['#Columns='+NEW.#Columns+'']
INSERT INTO Backlog (SourceTableID, TriggerType, Status, CreateTimeDate, NewValues) SELECT ID, 'INSERT', 'READY', NOW(), 'first_name;last_name;gender;age;ethnicity' FROM __new;
END;
CREATE TRIGGER PERSON_UPDATE
ON PERSON
BEFORE
UPDATE
BEGIN
--Declare #Columns string
--#Columns=select Columns from metatable where tablename='PERSON'
--String Split(#Columns,';') into array to correspond to new and old VALUES
--#NewValues=#['#Columns='+NEW.#Columns+'']
--#OldValues=#['#Columns='+OLD.#Columns+'']
INSERT INTO Backlog (SourceTableID, TriggerType, Status, CreateTimeDate, NewValues) SELECT ID, 'UPDATE', 'READY', NOW(), 'first_name;last_name;gender;age;ethnicity' FROM __new;
UPDATE Backlog SET OldValues = (SELECT 'first_name;last_name;gender;age;ethnicity' FROM __old) WHERE SourceTableID = (SELECT ID FROM __old);
END;
CREATE TRIGGER PERSON_DELETE
ON PERSON
BEFORE
DELETE
BEGIN
--Declare #Columns string
--#Columns=select Columns from metatable where tablename='PERSON'
--String Split(#Columns,';') into array to correspond to new and old VALUES
--#OldValues=#['#Columns='+OLD.#Columns+'']
INSERT INTO Backlog (SourceTableID, TriggerType, Status, CreateTimeDate, OldValues) SELECT ID, 'DELETE', 'READY', NOW(), 'first_name;last_name;gender;age;ethnicity' FROM __old;
END;
CONCLUSION
For each row inserted, updated, or deleted, I am trying to generate, in a COMMAND column in the log table, a corresponding statement of the form 'INSERT INTO PERSON (' + #Columns + ') VALUES (' + #NewValues + ')', or an UPDATE or DELETE. A foreach service will then run each command value ordered by create time, acting as the main replication service.
To be clear, I am trying to make my sample trigger write all old and new values to a column dynamically, without hardcoding the columns in each trigger (since it will be used for multiple tables), writing the values into a single column delimited by a comma or semicolon.
An even bigger goal behind this is to find a way to save/script each I/U/D command and then run the commands on subscriber servers/DBs on both the PostgreSQL and Sybase platforms, thereby building my own replication from a log.
It is a complex but solvable problem that will take time and careful planning. I think what you are looking for is the EXECUTE IMMEDIATE command in ADS SQL syntax. With this command you can construct a statement dynamically and then execute it once construction of the SQL string is finished. Save each desired column value to a temp table by carefully constructing the statement as a string and then executing it with EXECUTE IMMEDIATE. For example:
DECLARE TableColumns Cursor ;
DECLARE FldName Char(100) ;
...
OPEN TableColumns AS SELECT *
FROM system.columns
WHERE parent = #cTableName
AND field_type < 21 //ADS_ROWVERSION
AND field_type <> 6 //ADS_BINARY
AND field_type <> 7; //ADS_IMAGE
WHILE FETCH TableColumns DO
FldName = Trim( TableColumns.Name ) ;
StrSql = 'SELECT n.[' + Trim( FldName ) + '] newVal ' +
'INTO #myTmpTable FROM __new n' ;
After constructing the statement as a string it can then be executed like this:
EXECUTE IMMEDIATE STRSQL ;
You can pick up old and new values from the __old and __new temp tables that are always available to triggers. Insert values into the temp table #myTmpTable and then use it to update the target. Remember to drop #myTmpTable at the end.
Furthermore, I would think you can create a function on the data dictionary that can be called from each trigger on the tables you want to track, instead of writing a long trigger for each table, with cTableName passed to the function as a parameter. That would make maintenance a little easier.
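The pattern generalizes: read the column names from the system catalog, assemble the statement as a string, and execute it dynamically. Here is a minimal sketch outside ADS, using SQLite's PRAGMA table_info in place of system.columns and Python's cursor execution in place of EXECUTE IMMEDIATE (the table name and data are made up for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE PERSON (ID INTEGER, first_name TEXT, last_name TEXT, age INTEGER)")
conn.execute("INSERT INTO PERSON VALUES (1, 'John', 'Doe', 42)")

table_name = "PERSON"
# Read the column list from the catalog, like the cursor over system.columns.
columns = [row[1] for row in conn.execute(f"PRAGMA table_info({table_name})")]

# Assemble the statement string at runtime, then execute it dynamically.
str_sql = "SELECT " + ", ".join(columns) + " FROM " + table_name
print(str_sql)                           # SELECT ID, first_name, last_name, age FROM PERSON
print(conn.execute(str_sql).fetchall())  # [(1, 'John', 'Doe', 42)]
```

Note that identifiers spliced into a SQL string this way cannot be bound as parameters, so they should only ever come from a trusted catalog or metadata table, never from user input.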

Creating a trigger to copy data from one column to another in the same table - PostgreSQL

I'm interested in figuring out how to copy data from an old column to a new column within the same table. This would be done row by row inside a trigger procedure, not with something like UPDATE table SET columnB = columnA.
To clarify: table1.column1.row3 -> table1.column2.row3 whenever an INSERT or UPDATE statement touches table1.column1.row3.
Have your trigger assign
NEW.column2 := NEW.column1;
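In PostgreSQL that assignment lives in a BEFORE INSERT OR UPDATE trigger function that ends with RETURN NEW. As a runnable sketch of the same copy-a-column idea, here it is in SQLite through Python's sqlite3 (an assumed substitute for PostgreSQL; SQLite cannot reassign NEW columns, so AFTER triggers rewrite the row instead):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE table1 (id INTEGER PRIMARY KEY, column1 TEXT, column2 TEXT);

-- Copy column1 into column2 whenever a row is inserted.
CREATE TRIGGER copy_on_insert AFTER INSERT ON table1
BEGIN
    UPDATE table1 SET column2 = NEW.column1 WHERE id = NEW.id;
END;

-- Keep column2 in sync whenever column1 changes.
CREATE TRIGGER copy_on_update AFTER UPDATE OF column1 ON table1
BEGIN
    UPDATE table1 SET column2 = NEW.column1 WHERE id = NEW.id;
END;
""")

conn.execute("INSERT INTO table1 (id, column1) VALUES (3, 'hello')")
conn.execute("UPDATE table1 SET column1 = 'world' WHERE id = 3")
print(conn.execute("SELECT column1, column2 FROM table1 WHERE id = 3").fetchone())
```

The PostgreSQL version is cleaner because the BEFORE trigger changes the row in flight, with no second UPDATE and no recursion concerns.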

Using a trigger to delete rows in a table in a separate database. Oracle

I have two databases: A and B. I am trying to build a trigger in database B that will perform update, insert, and delete operations on a table in database A. My trigger so far looks like this:
CREATE OR REPLACE TRIGGER salesFragmentLesserTrigger
AFTER INSERT OR UPDATE OR DELETE ON sales
FOR EACH ROW
BEGIN
IF INSERTING THEN
IF :new.sale_price < 5000000 THEN
IF :new.sale_type = 'auction' THEN
INSERT INTO LESSTHANFIVEMILLIONFRAGMENT@FIT5043A (sales_id,sales_date,sale_type,reserved_price,sale_price,deposit,balance,buyer_id,property_id)
VALUES(:new.sales_id,:new.sales_date,:new.sale_type,:new.reserved_price,:new.sale_price,:new.deposit,:new.balance,:new.buyer_id,:new.property_id);
END IF;
END IF;
ELSIF DELETING THEN
DELETE FROM LESSTHANFIVEMILLIONFRAGMENT@FIT5043A
WHERE LESSTHANFIVEMILLIONFRAGMENT@FIT5043A.sales_id = :old.sales_id;
END IF;
END;
/
If I create the trigger with only the IF INSERTING block, it runs fine and successfully updates the table in database A. When I add the ELSIF DELETING block, however, it compiles with the errors:
PL/SQL: ORA-04054: database link FIT5043A.SALES_ID does not exist
PL/SQL: SQL Statement ignored
Why does the insert statement succeed while the delete is asking for a database link? Is a database link absolutely necessary for what I am trying to do?
Apparently it fails on the WHERE clause: Oracle decides that the database link name there is not FIT5043A but FIT5043A.SALES_ID. That is because database link names can contain dots.
I think the easiest solution is to give the table an alias, or to omit the table name altogether in the WHERE clause.
Alias:
DELETE FROM LESSTHANFIVEMILLIONFRAGMENT@FIT5043A x
WHERE x.sales_id = :old.sales_id
Omitting:
DELETE FROM LESSTHANFIVEMILLIONFRAGMENT@FIT5043A
WHERE sales_id = :old.sales_id

How to make sure a row cannot be accidentally deleted in SQL Server?

In my database I have certain data that is important to the functioning of the app (constants, ...), and I have test data that is generated by testing the site. As the test data is expendable, I delete it regularly. Unfortunately the two types of data live in the same table, so I cannot do a DELETE FROM T; I have to do a DELETE FROM T WHERE IsDev = 0.
How can I make sure that I do not accidentally delete the non-dev data by forgetting to put the filter in? If that happens, I have to restore from a production backup, which wastes my time. I would like some sort of foreign-key-like behavior that fails a delete when a certain condition is met. This would also be useful to ensure that my code does not do anything harmful due to a bug.
Well, you could use a trigger that throws an exception if any of the records in the deleted pseudo-table have IsDev = 1.
CREATE TRIGGER TR_DEL_protect_constants ON MyTable FOR DELETE AS
BEGIN
IF EXISTS(SELECT 1 FROM deleted WHERE IsDev <> 0)
BEGIN
ROLLBACK
RAISERROR('Can''t delete constants', 16, 1)
RETURN
END
END
I'm guessing a bit on the syntax, but you get the idea.
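For a runnable sketch of the same guard, here it is in SQLite through Python's sqlite3 (an assumed stand-in for SQL Server; SQLite has no deleted pseudo-table, but a row-level BEFORE DELETE trigger with RAISE(ABORT) enforces the same condition). Table and column names follow the question:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE T (id INTEGER PRIMARY KEY, payload TEXT,
                IsDev INTEGER NOT NULL DEFAULT 0);

-- Abort any DELETE that would touch a protected (IsDev <> 0) row.
CREATE TRIGGER TR_DEL_protect_constants BEFORE DELETE ON T
WHEN OLD.IsDev <> 0
BEGIN
    SELECT RAISE(ABORT, 'Can''t delete constants');
END;
""")
conn.execute("INSERT INTO T (payload, IsDev) VALUES ('constant', 1), ('test data', 0)")

try:
    conn.execute("DELETE FROM T")          # forgot the WHERE filter
except sqlite3.DatabaseError as e:
    print(e)                               # Can't delete constants

conn.execute("DELETE FROM T WHERE IsDev = 0")  # the filtered delete still works
print(conn.execute("SELECT payload FROM T").fetchall())
```

The abort rolls back the whole offending statement, so the protected row and everything else that statement would have deleted both survive; only the properly filtered delete goes through.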
I would use a trigger.
keep a backup of the rows you want to retain in a separate admin table
Seems like you need a trigger on the delete operation that looks at the row and rolls back the transaction if it sees a row that should never be deleted.
Also, you might want to read this article: Prevent accidental update or delete commands of all rows in a SQL Server table
Depending on how transparent you want to make this, you could use an INSTEAD OF trigger that will always remember the WHERE for you.
CREATE TRIGGER TR_IODEL_DevOnly ON YourTable
INSTEAD OF DELETE
AS
BEGIN
DELETE FROM t
FROM Deleted d
INNER JOIN YourTable t
ON d.PrimaryKey = t.PrimaryKey
WHERE t.IsDev = 0
END
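A runnable analogue of the INSTEAD OF approach, with one caveat: SQLite only allows INSTEAD OF triggers on views, so this sketch (in Python's sqlite3, with an invented view name) routes deletes through a view over the table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE YourTable (PrimaryKey INTEGER PRIMARY KEY, payload TEXT,
                        IsDev INTEGER NOT NULL);
CREATE VIEW YourTableV AS SELECT * FROM YourTable;

-- Deletes against the view silently skip protected rows,
-- i.e. the trigger "remembers the WHERE" for you.
CREATE TRIGGER TR_IODEL_DevOnly INSTEAD OF DELETE ON YourTableV
BEGIN
    DELETE FROM YourTable
    WHERE PrimaryKey = OLD.PrimaryKey AND IsDev = 0;
END;
""")
conn.executemany("INSERT INTO YourTable VALUES (?, ?, ?)",
                 [(1, 'constant', 1), (2, 'test', 0)])

conn.execute("DELETE FROM YourTableV")  # unfiltered delete via the view
print(conn.execute("SELECT payload FROM YourTable").fetchall())
```

Unlike the RAISERROR trigger, this variant never fails: an unfiltered delete simply leaves the protected rows in place, which is more transparent but also hides the mistake from the caller.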
I suggest that instead of writing the delete statement from scratch every time, just create a stored procedure to do the deletions and execute that.
create procedure ResetT as delete from T where IsDev = 0
You could add an extra column IS_TEST to your tables, rename TABLE_NAME to TABLE_NAME_BAK, and create a view TABLE_NAME over TABLE_NAME_BAK so that only rows where IS_TEST is set are displayed in it. Setting IS_TEST to zero for the data you wish to keep, and adding a DEFAULT 1 to the IS_TEST column, should complete the job. It is similar to the procedure required for implementing soft deletes.
