Salesforce: trigger a workflow on record delete

I want my legacy system to be notified whenever there is any change to an SF object (add/update/delete), so I have created an outbound message and a workflow rule. But in the workflow rule I don't see any way to fire it when a record is deleted.
Is there any way I can trigger an outbound message on record delete? I have heard that it can be done with a trigger, but I don't want to write Apex code for this.

To the best of my knowledge it cannot be done. The workflow actions are decoupled from the workflow rule (you can even reuse them), so they probably do not receive the transaction scope; by the time they execute, the record is already gone and any reference inside the action would point to non-existent data. Thus the only way I know to do it is via a trigger.

Here is a workaround. However, it will only capture deletions made via the standard Salesforce UI.
1. Create a custom checkbox field, "Is Deleted".
2. Override the Del link with a custom Visualforce page that first sets "Is Deleted" on the record and then deletes it (a sketch of such a controller extension follows this list).
3. Write the workflow rule using the "Is Deleted" field.
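A minimal sketch of the controller extension behind that Visualforce page, assuming the checkbox's API name is Is_Deleted__c and using Account as the example object (both names are assumptions, not part of the original answer):

// Used from a Visualforce page that overrides the object's Delete button, e.g.
// <apex:page standardController="Account" extensions="DeleteOverrideController" action="{!doDelete}"/>
public with sharing class DeleteOverrideController {
    private final SObject record;

    public DeleteOverrideController(ApexPages.StandardController ctrl) {
        record = ctrl.getRecord();
    }

    public PageReference doDelete() {
        // flag the record first so the workflow rule (and its outbound message) fires
        // (Is_Deleted__c is the assumed API name of the checkbox from step 1)
        record.put('Is_Deleted__c', true);
        update record;
        // then actually delete it
        delete record;
        // finally, send the user back to the object's list view
        String prefix = record.getSObjectType().getDescribe().getKeyPrefix();
        return new PageReference('/' + prefix + '/o');
    }
}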

Perhaps a compromise architecture would be to write an extremely small and simple after delete trigger that simply copies the deleted records to a new custom object. That new custom object fires your workflow rule and thus sends the outbound message you're looking for. The only issue is that this "scratch" object would grow as you delete records from the source object, so it would need periodic cleaning, which could be done on a nightly schedule with batch Apex (see the sketch after the trigger below).
Here's a delete trigger that would do the trick, using Opportunity as an example:
trigger AfterDelete on Opportunity (after delete)
{
    // Copy each deleted Opportunity into the "scratch" custom object so a
    // workflow rule on CustObj__c can fire the outbound message.
    List<CustObj__c> co = new List<CustObj__c>();
    for (Opportunity o : Trigger.old)
    {
        CustObj__c c = new CustObj__c();
        c.Name = o.Name;
        c.Amount__c = o.Amount;
        c.CloseDate__c = o.CloseDate;
        c.Description__c = o.Description;
        // etc.
        co.add(c);
    }
    insert co;
}
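As for that periodic cleanup, a minimal sketch of a combined batch/schedulable class that purges old scratch records; the class name and the 7-day retention window are assumptions:

global class CustObjCleanupBatch implements Database.Batchable<sObject>, Schedulable {
    global Database.QueryLocator start(Database.BatchableContext bc) {
        // scratch records older than 7 days (an arbitrary window) are safe to purge
        return Database.getQueryLocator([
            SELECT Id FROM CustObj__c WHERE CreatedDate < LAST_N_DAYS:7
        ]);
    }
    global void execute(Database.BatchableContext bc, List<sObject> scope) {
        delete scope;
    }
    global void finish(Database.BatchableContext bc) {}
    // implementing Schedulable as well lets the same class be scheduled directly
    global void execute(SchedulableContext sc) {
        Database.executeBatch(new CustObjCleanupBatch());
    }
}
// Schedule it once, e.g. from Anonymous Apex, to run nightly at 1 AM:
// System.schedule('Nightly CustObj__c cleanup', '0 0 1 * * ?', new CustObjCleanupBatch());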
It's not ideal, but at least this scratch-object approach saves you from having to code your own trigger-based outbound messages. Those can only be done with @future (callout=true) methods, by the way, since callouts directly from triggers are forbidden. Hope that helps.
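For reference, a rough sketch of what that @future-based alternative could look like; the class name, endpoint, and JSON payload are placeholders, and the endpoint would also need a Remote Site Setting:

public class OpportunityDeleteNotifier {
    @future(callout=true)
    public static void notifyLegacySystem(Set<Id> deletedIds) {
        HttpRequest req = new HttpRequest();
        req.setEndpoint('https://legacy.example.com/salesforce/deleted'); // placeholder endpoint
        req.setMethod('POST');
        req.setHeader('Content-Type', 'application/json');
        req.setBody(JSON.serialize(deletedIds));
        HttpResponse res = new Http().send(req);
        // inspect res.getStatusCode() here and log or retry as needed
    }
}
// Called from the after delete trigger:
// OpportunityDeleteNotifier.notifyLegacySystem(Trigger.oldMap.keySet());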

Alternatively, write a simple email send in the trigger's after delete event. You could have it working in less than an hour.

Related

How to update QTableView when the database is updated?

I use QSqlRelationalTableModel to extract data from a database and a QTableView to show it. Now, when I update my database, how do I make the table view update automatically? I think I need to use the dataChanged() function to make this automatic, but I do not know how to use it. Any suggestion will be appreciated.
The main code is as follows:
QSqlRelationalTableModel *model = new QSqlRelationalTableModel(NULL, db);
model->setTable(tableName);
model->select();
tableView->setModel(model);
tableView->show();
No, there is no need to use dataChanged().
You just need to call QSqlRelationalTableModel::select() whenever the database gets updated. This will re-populate the model from the database, and update the views that are using it automatically.
If the database is updated from within your application, you can just call model->select() after the update queries are executed.
If the database is updated from another application, you'll have to use something like PostgreSQL's event notification system: subscribe to the notification from your application using QSqlDriver::subscribeToNotification() and call model->select() in a slot connected to the notification() signal.
You can use QSqlDriver::hasFeature(QSqlDriver::EventNotifications) to check if notifications from your database are supported.

Grails - Managing database transactions with rollbacks

GGTS 3.4, Grails 2.3.3 - When generating controllers, this version includes a number of @Transactional lines I haven't seen before, and I don't fully understand what they are doing.
At the top of the controller there is the line:
@Transactional(readOnly = true)
Then, just before certain db-changing actions ('save', 'update' and 'delete'), there is the line:
@Transactional
I presume this switches readOnly to false for each db-changing action. Does it also open a new transaction that can be committed or rolled back? Is there a simple way to force a rollback?
The 'create' action does not have a @Transactional annotation, despite carrying out a 'new' command to create a new instance of the specific domain class. What happens to this newly created but unsaved instance if the save is not completed, or if it is rolled back? By "not completed" I am thinking of introducing a 'cancel' button in the 'create' view to let users pull out of the creation if they choose to; a user could also simply navigate away from the create view without invoking the save.
-mike
The standard @Transactional annotation without any properties set uses the platform defaults. These depend upon your transaction manager and your data source. It does, however, create a transaction that can be committed or rolled back.
Controller methods without any annotation do not participate in any transactions (provided the entire class isn't annotated as well).
In the case of create there is no need for a transaction because you aren't interacting with the database/transaction manager. Simply creating a new instance of a domain class, e.g. new MyDomainClass(), doesn't interact with the database at all, which is what you are seeing in the create method.
So, in short, you don't need to worry about that instance if your users navigate away from the page or click cancel.
You can also use the withTransaction method on a domain class to manage the transaction manually, as follows:
Account.withTransaction { status ->
    try {
        // your code or business logic here
    } catch (Exception e) {
        status.setRollbackOnly()
    }
}
If an exception is thrown, the transaction will be rolled back.

Need advice on Entity Manager

I open up a Member form for adding/editing members. It has its own entity manager, and when the Save button is clicked I close the form and go back to a list form.
When the save is processed, I call a routine called CalculateOwing, which calculates the member's balance. This method is in a separate .cs file because it can be called from many areas of the application.
Should the CalculateOwing method be in a separate entity manager or in the same entity manager as the member record being processed?
A response to this question can be seen at http://www.ideablade.com/forum/forum_posts.asp?TID=4686&title=need-advice-on-entity-manager
EDIT:
Including the response here.
"If you want the results of CalculateOwing to be part of the same database transaction, you need to call it before the Save completes and on the same EntityManager that the Save is using."

Batch Apex conflicts and design?

I need help with a design issue and what happens in Batch Apex.
This is the scenario we have:
We have a Territory object, and when a single field on it is updated we need to update a field on up to hundreds of thousands of contacts. To do this, I am using Batch Apex and invoking it on the Territory record before it's updated.
Questions:
Say the user updates the territory from A to B and clicks save. This causes a big batch of contacts to get updated, which takes a while. Then he changes B to C. Are we guaranteed that the final update on all impacted records will be C, and why?
Or, is there a way to schedule your batch jobs? I'm looking into AsyncApexJob and using that as a framework…
Is there a better design?
Batch Apex doesn't work the same way a Trigger works. The only way the situation described in your Question 1 would occur is if you were to call/execute a batch from a Trigger, and I would highly recommend avoiding that, if it's even possible.
As for your second and third questions: batches are typically scheduled to run overnight, or during off hours, using the Apex Scheduler. This is the recommended solution.
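For example, a minimal Schedulable wrapper might look like the sketch below; the class names and cron expression are illustrative, and it assumes a batch class with a no-argument constructor whose start() method works out which contacts need updating:

global class NightlyContactTerritorySync implements Schedulable {
    global void execute(SchedulableContext sc) {
        // kick off the heavy contact update during off hours
        // (ContactTerritoryBatch is a hypothetical batch class)
        Database.executeBatch(new ContactTerritoryBatch(), 200);
    }
}
// Schedule it once, e.g. from Anonymous Apex, to run every night at 2 AM:
// System.schedule('Nightly contact territory sync', '0 0 2 * * ?', new NightlyContactTerritorySync());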
First, you will want to put the logic in the AFTER UPDATE trigger on the Territory object, not the BEFORE UPDATE section. As a general rule, if you need to update a field or value on the record the trigger fires for (i.e. the Territory object in your case), you use the BEFORE UPDATE or BEFORE INSERT section(s); if you want to create/update/delete other records/objects (i.e. Contacts in your case), you use the AFTER UPDATE or AFTER INSERT section(s).
Second, I do not think there is anything wrong with initiating the batch apex process from your trigger.
For example, let us say you have a batch class called "BatchUpdateContactsBasedOnTerritory." And this class has three (3) key features:
it implements "Database.Stateful" in addition to "Database.Batchable"
it has a constructor method that takes a list of territories as an argument/parameter
it has a member variable to hold the list of territories that are passed in
Part of your batch class:
global List<Territory> territoryList;

global BatchUpdateContactsBasedOnTerritory(List<Territory> updatedTerritories) {
    territoryList = updatedTerritories;
}
Your trigger:
trigger TerritoryTrigger on Territory (after delete, after insert, after undelete, after update, before delete, before insert, before update)
{
    if (Trigger.isInsert) {
        if (Trigger.isBefore) {
            // before insert event not implemented
        }
        else if (Trigger.isAfter) {
            // after insert event not implemented
        }
    } else if (Trigger.isUpdate) {
        if (Trigger.isBefore) {
            // before update event not implemented
        }
        else if (Trigger.isAfter) {
            // after update event - call the batch class to process 1000 records at a time
            Database.executeBatch(new BatchUpdateContactsBasedOnTerritory(Trigger.new), 1000);
        }
    } else if (Trigger.isDelete) {
        if (Trigger.isBefore) {
            // before delete event not implemented
        }
        else if (Trigger.isAfter) {
            // after delete event not implemented
        }
    }
    else if (Trigger.isUnDelete) {
        // undelete event not implemented
    }
}
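For completeness, the rest of the batch class might be filled in roughly as follows; the Territory__c lookup field on Contact and the commented-out field update are assumptions about your data model, not part of the original answer:

global class BatchUpdateContactsBasedOnTerritory implements Database.Batchable<sObject>, Database.Stateful {
    global List<Territory> territoryList;

    global BatchUpdateContactsBasedOnTerritory(List<Territory> updatedTerritories) {
        territoryList = updatedTerritories;
    }

    global Database.QueryLocator start(Database.BatchableContext bc) {
        // select every contact belonging to one of the updated territories
        // (Territory__c is an assumed custom lookup field on Contact;
        //  binding a list of sObjects in an IN clause compares by Id)
        return Database.getQueryLocator([
            SELECT Id, Territory__c FROM Contact WHERE Territory__c IN :territoryList
        ]);
    }

    global void execute(Database.BatchableContext bc, List<sObject> scope) {
        List<Contact> contacts = (List<Contact>) scope;
        for (Contact c : contacts) {
            // apply whatever change the territory update requires, e.g.
            // c.Region__c = ...;
        }
        update contacts;
    }

    global void finish(Database.BatchableContext bc) {
        // optional: log or email a completion notice
    }
}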

CakePHP afterSave Timing

I have a situation where, in a model's afterSave callback, I'm trying to access data from a distant association (it's a legacy data model with very wonky association linkage). What I'm finding is that within the callback I can execute a find call on the model, but if I exit right there, the record is never inserted into the database. The lack of a record means that I can't execute a find on the related model using data that was just inserted into the current one.
I haven't found any mention of when data is actually committed relative to when the afterSave callback fires. I'm working with legacy code, and I see no indication that we're explicitly using transactions, so I'm trying to figure out what my options might be.
Thanks.
UPDATE
The gist of the scenario is this: We're taking event registrations, but folks can be wait listed. A user can register (or be registered) for a given Date. After a registration is complete, I need to check the wait list for the existence of a record for the registering user (WaitList.user_id) on the date being registered for (WaitList.date_id). If such a record exists, it can be deleted because it's become an active registration.
The legacy schema puts me in a place where the registration isn't directly tied to a date, so I can't get the Date.id easily. Instead, it's Registration->Registrant->Ticket->Date. Unintuitive, I know, but it is what it is for now. Even better (sarcasm intended), we have a view named attendees that rolls all of this info up, and from which I would be able to use the newly created Registration->id to get Attendee.date_id. Since the record doesn't exist yet, it's not available in the view.
Hopefully that provides a little more context.
What's the purpose of the find query inside of your afterSave?
Update
Is it at all possible to properly associate the records? Or are we talking about way too much refactoring for it to be worth it? You could move the check to the controller if it's not possible to modify the associations between the records.
Something like this (in pseudo code):
if (save->isSuccessful) {
    if (onWaitList) {
        // delete the wait list record
    }
}
It's not best practice, but it will get you around your issue.
