Batch Apex conflicts and design?

I need help with a design issue and what happens in Batch Apex.
This is the scenario we have:
We have a territory object; when a single field on it is updated, we need to update a field on up to hundreds of thousands of contacts. To do this, I am using Batch Apex, invoking it from the territory record's before update trigger.
Question:
Say the user updates the territory from A to B and clicks save. This causes a big batch of contacts to get updated, which takes a while. Then, he changes B to C. Are we guaranteed that the final update on all impacted records will be C? Why or why not?
Or, is there a way to schedule batch jobs? I'm looking into AsyncApexJob and using that as a framework…
Is there a better design?

Batch Apex doesn't work the same way a trigger works. The only way the situation described in your first question would occur is if you were to call/execute a batch from a trigger, and I would highly recommend avoiding that, if it's even possible.
(Questions 2 and 3.) Batches are typically scheduled to run overnight, or during off hours, using the Apex Scheduler. This is the recommended solution.
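For reference, the Apex Scheduler side looks roughly like this (the class name, cron expression, and scope size are all illustrative, not something from your org):

```apex
// Schedulable wrapper that starts a batch job during off hours.
global class NightlyContactUpdateScheduler implements Schedulable {
    global void execute(SchedulableContext sc) {
        // NightlyContactUpdateBatch is a placeholder for your own batch class.
        Database.executeBatch(new NightlyContactUpdateBatch(), 200);
    }
}
```

It would then be scheduled once, e.g. from Anonymous Apex, with System.schedule('Nightly contact update', '0 0 2 * * ?', new NightlyContactUpdateScheduler()); where the cron expression runs the job at 2 AM every day.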

First, you will want to put the logic in the AFTER UPDATE trigger on the Territory object, not the BEFORE UPDATE section. As a general rule: if you need to update a field or value on the record that the trigger fires for (i.e. the Territory object in your case), use the BEFORE UPDATE or BEFORE INSERT section(s); if you want to create/update/delete other records/objects (i.e. Contacts in your case), use the AFTER UPDATE or AFTER INSERT section(s).
Second, I do not think there is anything wrong with initiating the batch apex process from your trigger.
For example, let us say you have a batch class called "BatchUpdateContactsBasedOnTerritory" with three key features:
it implements "Database.Stateful" in addition to "Database.Batchable"
it has a constructor method that takes a list of territories as an argument/parameter
it has a member variable to hold the list of territories that are passed in
Part of your batch class:
global List<Territory> TerritoryList;

global BatchUpdateContactsBasedOnTerritory(List<Territory> updatedTerritories) {
    TerritoryList = updatedTerritories;
}
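Fleshed out into a complete class, that might look as follows. This is only a sketch: the Contact field Territory__c, the query, and the update logic are assumptions about your schema, so substitute your real field names.

```apex
global class BatchUpdateContactsBasedOnTerritory
        implements Database.Batchable<sObject>, Database.Stateful {

    global List<Territory> TerritoryList;

    global BatchUpdateContactsBasedOnTerritory(List<Territory> updatedTerritories) {
        TerritoryList = updatedTerritories;
    }

    global Database.QueryLocator start(Database.BatchableContext bc) {
        // Collect the ids of the updated territories and select their contacts.
        // Territory__c is a hypothetical lookup field on Contact.
        Set<Id> territoryIds = new Set<Id>();
        for (Territory t : TerritoryList) {
            territoryIds.add(t.Id);
        }
        return Database.getQueryLocator([
            SELECT Id, Territory__c FROM Contact WHERE Territory__c IN :territoryIds
        ]);
    }

    global void execute(Database.BatchableContext bc, List<sObject> scope) {
        List<Contact> contacts = (List<Contact>) scope;
        for (Contact c : contacts) {
            // ... set whatever contact field depends on the territory here ...
        }
        update contacts;
    }

    global void finish(Database.BatchableContext bc) {
        // Optional: send a completion email or chain another job.
    }
}
```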
Your trigger:
trigger TerritoryTrigger on Territory (after delete, after insert, after undelete, after update, before delete, before insert, before update)
{
    if (Trigger.isInsert)
    {
        if (Trigger.isBefore) {
            // before insert event not implemented
        }
        else if (Trigger.isAfter) {
            // after insert event not implemented
        }
    } else if (Trigger.isUpdate) {
        if (Trigger.isBefore) {
            // before update event not implemented
        }
        else if (Trigger.isAfter) {
            // after update event - call batch class to process 1000 records at a time
            Database.executeBatch(new BatchUpdateContactsBasedOnTerritory(Trigger.new), 1000);
        }
    } else if (Trigger.isDelete) {
        if (Trigger.isBefore) {
            // before delete event not implemented
        }
        else if (Trigger.isAfter) {
            // after delete event not implemented
        }
    }
    else if (Trigger.isUndelete) {
        // undelete event not implemented
    }
}

Related

CakePHP 3.5: save other data in beforeDelete() which returns false

I am implementing a ReviewableBehavior to implement a four eyes principle. The behavior implements beforeDelete(), beforeSave() and afterSave() and uses a reviews table to store CUD requests.
for added records, a $review record is created and saved in afterSave() (because only then do we have the id of the newly added record, which we need to store in the $review)
for edited records, in beforeSave() the values that have been changed are saved in a $review record, in the edited record these field values are set back to their original values (so basically no changes are saved)
for deleted records, in beforeDelete() a $review is saved to store the delete request, and false is returned in order to cancel the delete
Number 3 is the challenge: although $review always had a correctly set primary key value, as if the save had actually been successful, and save($review) returned true as if everything went well, it was not actually saved in the database.
The reason, as far as I understand: deletions are by default done within a transaction. The transaction is started in delete() of the table, then beforeDelete() of the behavior is triggered. Within the event handler, I call ReviewTable->save($review). As a transaction has been started, this save() happens within the transaction. Then I return false, because I want the deletion to be stopped. This rolls back the transaction and, with it, the ReviewTable->save($review).
Solution attempts:
If I do not return false, the $review is saved in the database, but the "main" record is also deleted. Disadvantage: not a feasible approach, as the record is deleted, which we do not want.
If I call delete($entity, ['atomic' => false]); then no transaction is started, hence ReviewTable->save($review) is executed. Disadvantage: for any model that uses this behavior, we would need to amend every call to delete() to switch the atomic option off. This also switches off the use of transactions entirely, which does not seem like a good approach to me.
"Override" the delete method in ReviewableBehavior, so that for any table using this behavior, calling delete() actually runs the delete() of the ReviewableBehavior. Disadvantage: it is technically not possible to override table methods with a behavior.
Create a new table class, override the delete() method in it, and derive any table using the ReviewableBehavior from that table class. Disadvantage: an ugly approach, having to use both the behavior and a new table class.
Create a new method deleteRequest() in the ReviewableBehavior, and instead of calling Table->delete() we call Table->deleteRequest(). In it, we can save the delete request in a $review record; the deletion is not performed anyway, as we never actually called delete(). Disadvantage: for any model that uses this behavior, we would need to change every call to delete() into deleteRequest().
Currently I go with the last approach, but I would really like to hear some opinions about this, and also whether there is any better method to somehow keep the transaction, but saving something "in between".
Using a separate method is a sensible approach. Another option might be to circumvent the deletion process by stopping the Model.beforeDelete event; when doing so you can return true to indicate a successful delete operation, i.e. no rollback will happen.
It should be noted that stopping the event will prevent any other listeners in the queue from being notified! Also, halting the regular deletion process will prevent cascading deletes (i.e. deleting associated records) and the Model.afterDelete event. So if you need cascading deletes, you'd have to trigger them manually, and the afterDelete event should usually be triggered for successful deletes in any case.
Here's a quick and dirty example, see also \Cake\ORM\Table::_processDelete() for insight on how the regular deletion process works:
public function beforeDelete(
    \Cake\Event\Event $event,
    \Cake\Datasource\EntityInterface $entity,
    \ArrayObject $options
) {
    // this will prevent the regular deletion process
    $event->stopPropagation();

    $table = $event->getSubject();

    // this deletes associated records
    $table->associations()->cascadeDelete(
        $entity,
        ['_primary' => false] + $options->getArrayCopy()
    );

    $result = /* create review record */;
    if (!$result) {
        // this will cause a rollback
        return false;
    }

    $table->dispatchEvent('Model.afterDelete', [
        'entity' => $entity,
        'options' => $options,
    ]);

    return true;
}
See also
Cookbook > Database Access & ORM > Behaviors > Defining Event Listeners
Cookbook > Events > Stopping Events

Salesforce trigger sequence

I have two triggers on an object.
One is from a managed package, so I cannot see or amend the content of that trigger.
The other is designed by me.
I want my own trigger to run before the managed package trigger. Can I control the sequence in which the triggers execute?
Right now the managed package trigger always runs first, but I would like my trigger to run first. I have been thinking about this for a few days, and none of the colleagues in my company know how to achieve it. Without fixing this issue, I cannot continue my work. Please help me out.
There is no way in the system to control the sequence in which triggers are called; I think of this limitation as an invitation to unsafe programming. Do you have access to the package trigger? Having several triggers on one object is a really bad approach; a better solution is to have a single trigger that invokes various handlers. Then, at the handler level, you can manage the sequence of those handlers.
For example, this is simple trigger, which is invoked on different events and calls different methods with various logic:
trigger ContactTrigger on Contact (before insert, before update) {
    /* Before Update */
    if (Trigger.isUpdate && Trigger.isBefore) {
        /*
        here you can invoke different methods of different classes
        (trigger handlers) in different sequences
        */
    }
    /* Before Insert */
    if (Trigger.isInsert && Trigger.isBefore) {
        // on other events you can use it too
    }
}
In order to ensure that a handler is invoked only after the previous handler has finished executing, you can use state variables whose values are changed at the end of each handler and checked before calling the next one. I hope this will help you in the future :)
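As a sketch, that handler pattern with a state-variable ordering guard might look like this (the class and method names are made up for illustration):

```apex
public class ContactTriggerHandler {
    // State variable: set when the first handler finishes.
    public static Boolean firstHandlerDone = false;

    public static void handleFirstLogic(List<Contact> records) {
        // ... logic that must run first ...
        firstHandlerDone = true;
    }

    public static void handleSecondLogic(List<Contact> records) {
        if (!firstHandlerDone) {
            return; // enforce ordering: refuse to run before the first handler
        }
        // ... logic that must run second ...
    }
}
```

In the trigger body you would then call ContactTriggerHandler.handleFirstLogic(Trigger.new); followed by ContactTriggerHandler.handleSecondLogic(Trigger.new); in the order you need.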

Salesforce Trigger workflow on record delete

I want my legacy system to be notified whenever there is any change to an SF object (add/update/delete). So I have created an outbound message and a workflow. But in the workflow I don't see any way to fire it when the object is deleted.
Is there any way I can trigger an outbound message on record delete? I have heard that it can be done with a trigger, but I don't want to write Apex code for this.
To the best of my knowledge it cannot be done. The workflow actions are decoupled from the workflow rule (you can even reuse them), so they probably do not receive the transaction scope; by the time they execute, the record is already gone and any reference inside the action would point to non-existing data. Thus the only way I know to do it is via a trigger.
Here is a workaround. However, it will only capture deletions made via the standard Salesforce UI.
1. Create a custom checkbox field "Is Deleted".
2. Override the Del link with a custom VF page that first sets "Is Deleted" on the record, then deletes the record.
3. Write the workflow rule using the "Is Deleted" field.
Perhaps a compromise architecture would be to write an extremely small and simple after delete trigger that simply copies the deleted records in question to some new custom object. That new custom object fires your workflow rule and thus sends the outbound message you're looking for. The only issue with this would be to periodically clean up your custom object data that would grow in size as you deleted records from your other object. In other words, your "scratch" object would just need periodic cleaning - which could be done on a nightly schedule with batch Apex.
Here's a delete trigger that would do the trick using Opportunity as an example:
trigger AfterDelete on Opportunity (after delete)
{
    List<CustObj__c> co = new List<CustObj__c>();
    for (Opportunity o : Trigger.old)
    {
        CustObj__c c = new CustObj__c();
        c.Name = o.Name;
        c.Amount__c = o.Amount;
        c.CloseDate__c = o.CloseDate;
        c.Description__c = o.Description;
        // etc.
        co.add(c);
    }
    insert co;
}
It's not ideal, but at least this would save you from having to code your own trigger-based outbound messages. Those can only be done using the @future annotation, by the way, since callouts directly from triggers are forbidden. Hope that helps.
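For completeness, the nightly cleanup of the "scratch" object mentioned above could itself be a small batch. This is a sketch only: CustObj__c matches the trigger example, but the seven-day cutoff is an arbitrary choice.

```apex
global class ScratchObjectCleanupBatch implements Database.Batchable<sObject> {
    global Database.QueryLocator start(Database.BatchableContext bc) {
        // Select scratch records older than seven days for deletion.
        return Database.getQueryLocator([
            SELECT Id FROM CustObj__c WHERE CreatedDate < LAST_N_DAYS:7
        ]);
    }
    global void execute(Database.BatchableContext bc, List<sObject> scope) {
        delete scope;
    }
    global void finish(Database.BatchableContext bc) {
        // nothing to do
    }
}
```

Scheduled nightly (e.g. via the Apex Scheduler), this keeps the scratch object from growing without bound.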
Write a simple email send in the trigger's delete event. You can have it working in less than an hour.

Update triggers on 2 objects recursively in salesforce

I have a couple of objects (one custom object called Appointment, and the Event object) which I am trying to synchronize. So I have one trigger on each object, which searches for and updates the corresponding records. The issue is that these triggers keep running recursively: every time an appointment is updated the event is also updated, so the triggers keep firing, and of course Salesforce does not accept that.
Any idea how to overcome this?
Thanks
The easiest way is to have an Apex class containing a static boolean variable initialised to false. Then in each of your triggers you would check the state of this variable:
trigger MyTrigger on MyObject (after update)
{
    if (CStaticTracker.bHasTriggerFired == false)
    {
        CStaticTracker.bHasTriggerFired = true;
        // do your work and update the other object here
        // shouldn't need this but let's play safe!
        CStaticTracker.bHasTriggerFired = false;
    }
}
The upshot being, of course, that when one of the triggers runs, it sets this variable to true and prevents the recursive trigger from executing whatever logic is contained within the if statement. This can still cause some cascading, but it will stop as soon as you don't call another update from one of the triggers.
Good luck!

CakePHP afterSave Timing

I have a situation where, in a model's afterSave callback, I'm trying to access data from a distant association (it's a legacy data model with a very wonky association linkage). What I'm finding is that within the callback I can execute a find call on the model, but if I exit right then, the record is never inserted into the database. The lack of a record means that I can't execute a find on the related model using data that was just inserted into the current.
I haven't found any mention of when data is actually committed with respect to when the afterSave callback is engaged. I'm working with legacy code, but I see no indication that we're specifically engaging transactions, so I'm trying to figure out what my options might be.
Thanks.
UPDATE
The gist of the scenario is this: We're taking event registrations, but folks can be wait listed. A user can register (or be registered) for a given Date. After a registration is complete, I need to check the wait list for the existence of a record for the registering user (WaitList.user_id) on the date being registered for (WaitList.date_id). If such a record exists, it can be deleted because it's become an active registration.
The legacy schema puts me in a place where the registration isn't directly tied to a date so I can't get the Date.id easily. Instead, Registration->Registrant->Ticket->Date. Unintuitive, I know, but it is what it is for now. Even better (sarcasm included), we have a view named attendees that rolls all of this info up and from which I would be able to use the newly created Registration->id to return Attendee.date_id. Since the record doesn't exist, it's not available in the view.
Hopefully that provides a little more context.
What's the purpose of the find query inside of your afterSave?
Update
Is it at all possible to properly associate the records? Or are we talking about way too much refactoring for it to be worth it? You could move the check to the controller if it's not possible to modify the associations between the records.
Something like (a sketch in CakePHP 2.x controller style; the variables and model names are assumptions based on your description):
if ($this->Registration->save($this->request->data)) {
    $conditions = array('WaitList.user_id' => $userId, 'WaitList.date_id' => $dateId);
    $waitList = $this->WaitList->find('first', array('conditions' => $conditions));
    if ($waitList) {
        $this->WaitList->delete($waitList['WaitList']['id']);
    }
}
It's not best practice, but it will get you around your issue.