When applying a scenario, do we have to delete the scenario as well, to prevent applying the changes twice? - foundry-scenarios

Running through the full loop for a scenario-based workflow, I noticed that a scenario is NOT auto-deleted once applied. What is considered best practice to prevent users from accidentally applying a scenario twice? Is it best to delete scenarios afterwards? If so, why is auto-delete not enabled?

You could delete them, but usually I would add a flag called "applied" and filter my list of displayed scenarios by "applied" == false, or something like that.
Also, if you ever want to use your scenarios for metrics (e.g. how many scenarios have been applied, or stats written to the scenario object on apply), all of that data is lost if you delete on apply. I believe that's also why it's not part of the workflow by default: the idea is that you should make that decision for your use case.
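If you go the flag route, the display filter becomes a one-liner. A language-agnostic sketch of the pattern (shown here in C#; the scenario model and property names are illustrative, not a Foundry API):

// Hypothetical scenario model; "Applied" is the flag suggested above.
var visibleScenarios = allScenarios.Where(s => !s.Applied).ToList();

// On apply, keep the object and stamp it instead of deleting it,
// so metrics (counts, timings, who applied what) remain possible.
scenario.Applied = true;
scenario.AppliedAt = DateTime.UtcNow;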

Related

Is it possible to excise an attribute in datomic?

I am trying to modify a schema, and I'd like to get rid of some unused attributes. Is it possible to achieve this, perhaps through excision?
Excision could work; removing an attribute's data is an explicit use case for it.
However, you need to make sure you really want to remove it, because excision is a pretty dangerous process and heavy on resources. Unless you really have legal requirements to remove the data, an alternative approach could be more appropriate.
For example, you could rename your attribute (with a naming convention of your choice, like obsolete-*), set the :db/noHistory flag to true to reduce storage requirements, and disable indexing of the attribute if it was indexed (see the Schema alteration section of the docs).
In any case, make backups before any of these operations and thoroughly make sure that no other part of your code relies on the attribute.
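A minimal sketch of that rename-and-deprecate approach using the Datomic peer API (the attribute idents are illustrative, and conn is your existing connection):

(require '[datomic.api :as d])

;; Rename the attribute, stop retaining its history, and drop its index.
;; :db/id addresses the existing attribute entity by its current ident.
@(d/transact conn
   [{:db/id        :myapp/unused-attr      ; existing attribute (hypothetical)
     :db/ident     :obsolete/unused-attr   ; rename per the convention above
     :db/noHistory true                    ; reduce storage going forward
     :db/index     false}])                ; disable indexing if it was on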

How to temporarily disable sitecore indexing while editing items

I am developing a Sitecore project that has several data import jobs running on a daily basis. Every time a job is executed, it may update a large number of Sitecore items (thousands), and I've noticed that all these edits trigger Solr index updates.
My concern is that I'm not sure whether indexing as the items change is better, or whether updating everything at the end of the job is. So I would love to try both options. Could anyone tell me how I can use code to temporarily disable Lucene/Solr indexing and enable it later, when I finish editing all items?
This is a common requirement, and you're right to have such concerns. In general it's considered good practice to disable indexing during big import jobs, then rebuild afterwards.
Assuming you're using Sitecore 7 or above, this is pretty much what you need:
IndexCustodian.PauseIndexing();   // call before the import starts
IndexCustodian.ResumeIndexing();  // call after the import finishes
Here's a comprehensive article discussing this:
http://blog.krusen.dk/disable-indexing-temporarily-in-sitecore-7/
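A minimal sketch of how those calls might be wrapped so indexing is always resumed, even if the import fails (the import method name is hypothetical):

using Sitecore.ContentSearch.Maintenance;

IndexCustodian.PauseIndexing();
try
{
    RunDailyImport(); // your import job (illustrative name)
}
finally
{
    IndexCustodian.ResumeIndexing(); // resume even if the job throws
}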
In addition to Martin's answer, you can pass silent = true when you finish editing the item. Something like:
item.Editing.BeginEdit();
// Change field values here
item.Editing.EndEdit(true, true); // (updateStatistics: true, silent: true)
The second parameter of the EndEdit() method forces a silent update of the item, which means no events or indexing will be triggered on item save.
I feel this is safer than pausing indexing at the whole-application level during the import process; you just skip indexing for the items you are updating.
EDIT:
In case you need to rebuild the index for the updated items after the import process is done, you can use the following code. It will index the content tree starting from RootItemInTree and below:
var index = Sitecore.ContentSearch.ContentSearchManager.GetIndex("Your_Index_Name");
index.Refresh(new SitecoreIndexableItem(RootItemInTree));
To disable indexing during large import/update tasks, you should wrap your logic inside a BulkUpdateContext block. You can also use other wrappers, like the EventDisabler, to stop events from being fired if that is appropriate in your context. Alternatively, you could wrap your code in an EditContext and set it to silent. Your code could end up something like this:
using (new BulkUpdateContext())                   // skips index updates during bulk operations
using (new EditContext(targetItem, false, true))  // (updateStatistics: false, silent: true)
{
    // insert update logic here...
}
Here is an older question that discusses this topic: Optimisation tips when migrating data into Sitecore CMS

Keeping repository synced with multiple clients

I have a WPF application that uses Entity Framework. I am going to implement a repository pattern to make interactions with EF simple and more testable. Multiple clients can use this application, connect to the same database and do CRUD operations. I am trying to think of a way to synchronize clients' repositories when one makes a change to the database. Could anyone give me some direction on how one would solve this type of issue, and some possible patterns that would be beneficial for this type of problem?
I would be very open to any information/books on how to keep clients synchronized, and even be alerted to things other clients are doing (the only thing I could think of was having a server process running that passes messages around). Thank you.
The easiest way by far to keep every client UI up to date is simply to refresh the data every so often. If it's really that important, you can set a DispatcherTimer to tick every minute, at which point you fetch the latest version of the data being displayed.
Clearly, I'm not suggesting that you refresh an item that is being edited, but if you fetch the fresh data, you can certainly compare collections with what's currently displayed. Rather than just replacing the old collection items with the new, you can be more user-friendly and just add the new ones, remove the deleted ones and update the changed ones.
You could even detect whether an item currently being edited has been saved by another user since the current user opened it, and alert them to the fact. So rather than concentrating on some system to track all data changes, put your effort into being able to detect changes between two sets of data and then seamlessly integrating them into the current UI state.
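A minimal sketch of the polling part, with a hypothetical repository call and merge method (a sketch of the merge itself appears after the Equals discussion below):

using System;
using System.Windows.Threading;

// Poll for fresh data once a minute; names are illustrative.
var timer = new DispatcherTimer { Interval = TimeSpan.FromMinutes(1) };
timer.Tick += (s, e) =>
{
    var fresh = repository.GetReleases(); // hypothetical repository call
    Merge(displayedReleases, fresh);      // add/remove/update, as described above
};
timer.Start();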
UPDATE >>>
There is absolutely no benefit to holding a complete set of data in your application (or repository). In fact, you may well find that it has detrimental effects, due to the extra RAM requirements. If you are polling for data every few minutes, it will always be up to date anyway.
So rather than asking for all of the data all of the time, just ask for what the user wants to see (dependent on which view they are currently in) and update it every now and then. I do this by simply fetching the same data that the view requires when it is first opened. I wrote some methods that compare every property of every item with their older counterparts in the UI and switch old for new.
Think of the Equals method... You could do something like this:
public override bool Equals(Release otherRelease)
{
    return base.Equals(otherRelease) && Title == otherRelease.Title &&
        Artist.Equals(otherRelease.Artist) && Artists.Equals(otherRelease.Artists);
}
(Don't actually use the Equals method though, or you'll run into problems later). And then something like this:
if (!oldRelease.Equals(newRelease)) oldRelease.UpdatePropertyValues(newRelease);
And/Or this:
if (!oldReleases.Contains(newRelease)) oldReleases.Add(newRelease);
I'm guessing that you get the picture now.
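Putting those pieces together, a hedged sketch of the merge step (Release.Id, UpdatePropertyValues and the value comparison are assumed from the description above, not a given API):

using System.Collections.Generic;
using System.Linq;

// Reconcile the displayed collection with freshly fetched data.
void Merge(IList<Release> displayed, IList<Release> fresh)
{
    // Remove items that no longer exist in the new data.
    foreach (var gone in displayed.Where(d => fresh.All(f => f.Id != d.Id)).ToList())
        displayed.Remove(gone);

    foreach (var item in fresh)
    {
        var existing = displayed.FirstOrDefault(d => d.Id == item.Id);
        if (existing == null)
            displayed.Add(item);                 // added by another client
        else if (!existing.Equals(item))
            existing.UpdatePropertyValues(item); // changed by another client
    }
}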

Use Reactive Extensions to harmonize & simplify Control.Enabled = true/false conditions?

Is it possible, or more precisely, how is it possible to use Rx.Net to listen to a number and variety of (WinForms) controls' .TextChanged/.RowsChanged/.SelectionChanged events and, whenever a condition is fulfilled (ControlA.Text isn't empty, ControlB.RowsCount > 0, etc.), enable that one DoSomething button?
I am asking because we currently have a lengthy if/then statement in each of these events' handlers, and maintaining them when the condition changes is, due to the duplicated code, quite error-prone. That's why, if possible, I think it would be nice to take the stream of events and put the condition in one place.
Has anyone done that?
You can use Observable.FromEventPattern in combination with join patterns (Observable.When, Observable.And and Observable.Then) to create observables which fire depending on various conditions, such as combinations of events. For example, consider my answer here: Reactive Extensions for .NET (Rx): Take action once all events are completed
You should be able to use .FromEventPattern() to create observables for each of those events and then use CombineLatest() to do your logic on the current overall state and determine whether your button should be enabled, in one place.
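A minimal sketch of the CombineLatest approach (ControlB's RowsChanged event and RowsCount property mirror the question and may be named differently in your actual grid control):

using System;
using System.Reactive.Linq;

// One observable per control, each projected to a bool and seeded
// with the control's current state so the combination fires immediately.
var hasText = Observable
    .FromEventPattern(controlA, nameof(controlA.TextChanged))
    .Select(_ => !string.IsNullOrEmpty(controlA.Text))
    .StartWith(!string.IsNullOrEmpty(controlA.Text));

var hasRows = Observable
    .FromEventPattern(controlB, "RowsChanged") // event name taken from the question
    .Select(_ => controlB.RowsCount > 0)
    .StartWith(controlB.RowsCount > 0);

// The whole if/then cascade collapses into one expression:
hasText.CombineLatest(hasRows, (text, rows) => text && rows)
       .DistinctUntilChanged()
       .Subscribe(enabled => doSomethingButton.Enabled = enabled);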

How do you create a deconstructor (or similar) in APEX that runs right before an object is destroyed?

My application has many aggregate fields that need to be updated when any related record is changed, added or deleted. The relationships and calculations are somewhat involved, so I created a class that handles all of the calculations for all of the related tables. There is some SOQL and DML overhead involved in the calculations, so the class handles everything in bulk.
I would like to have the updateAll() method on this class run no more than once per request on all of the records that have been added to its queue. But there doesn't appear to be "deconstructor-like" functionality in APEX that would automatically get called right before this calculator object is destroyed.
What is the best way to implement this pattern in APEX?
Yes, there is no way to detect or predict object destruction, since it's essentially JSP in the background (shhh, they don't want you to know, it's the "no software" thing ;) ). It probably follows its lifetime mechanisms, but you can't rely on that.
We actually handle our aggregation in triggers or in the reporting (depending on whether the aggregation needs to be stored). Triggers also receive batches as a List rather than row by row, which allows for batch aggregation and lets us satisfy the pesky governor limits. Unfortunately, if you have multi-table aggregates, you'll need triggers on all of them and they'll rerun for every batch.
Here's what I did: I created a Calculator class that recalcs every related aggregate/calculated field in a ~10 table/object relationship. I used triggers on each of those objects to make the calculator class run on the set of object families related to the changed objects. I used a static variable on the calculator class so that each trigger could check whether the calculator was already running, and only call it if it wasn't. It works well enough. A bit inefficient, but it stays below governor limits and works in bulk very well. And I can grow with it...
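A hedged sketch of that static-flag guard in Apex (class, method and field names are illustrative):

public class AggregateCalculator {
    // Static state lives for the duration of the request/transaction,
    // so triggers can use it to avoid re-entrant recalculation.
    public static Boolean isRunning = false;

    public static void updateAll(Set<Id> recordIds) {
        if (isRunning) return;  // a recalculation is already in progress
        isRunning = true;
        try {
            // bulk SOQL + DML to recalc aggregates for the related families...
        } finally {
            isRunning = false;
        }
    }
}

Each object's trigger then only needs a call like AggregateCalculator.updateAll(Trigger.newMap.keySet());.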

Resources