I'm new to Salesforce. I have an object Training__c with a field End_Date, and when the End_Date arrives I need to create a Task, but I don't know how to track this End_Date, because the date arriving isn't something a trigger can fire on...
Thanks
Look into time-based workflows (a bit old school; we're encouraged to use flows now, so check out scheduled flows).
It could be a whole scheduled flow (kind of like a nightly batch job) or a scheduled path in your "normal" flow (if you already have one on this object). There are some Trailhead modules to get you started:
https://trailhead.salesforce.com/content/learn/modules/record-triggered-flows/get-started-with-triggered-flows
https://trailhead.salesforce.com/content/learn/modules/record-triggered-flows/add-a-scheduled-task-to-your-flow
Roughly speaking you'd set the action to fire "0 days after end date" and it becomes Salesforce's problem to modify the job if the end date changes. It's elegant, code free, fairly easy.
There are some problems with it, though. Scale: will there be tens of thousands of records? It also only works for records created/modified after you activate the flow - what about all the old data? And what if you need to modify the flow's definition, will it de-queue all pending actions? (That last one was a legit concern with time-based workflows: to edit the workflow you had to deactivate it, but doing so nuked all queued actions.)
So... you may decide to write some code for this after all. Have an Apex batch job run nightly, selecting records with End Date <= TODAY that don't have (open?) tasks yet, and create the tasks (maybe if a task is completed you'd want to create another one). It's a different solution, requiring you to write a unit test for it too (which isn't necessarily a bad thing), but a bit more resilient than flows.
This looks like a fairly similar problem solved with a batch: https://salesforce.stackexchange.com/q/118214/799
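For illustration, a rough sketch of what that nightly batch could look like. The names here are assumptions - End_Date__c as the field's API name, a made-up task subject - and it assumes Activities are enabled on Training__c; adjust to your real setup and add a unit test.

global class TrainingEndDateBatch implements Database.Batchable<SObject>, Schedulable {

    global Database.QueryLocator start(Database.BatchableContext bc) {
        // all trainings that have already ended
        return Database.getQueryLocator(
            'SELECT Id, OwnerId FROM Training__c WHERE End_Date__c <= TODAY');
    }

    global void execute(Database.BatchableContext bc, List<Training__c> scope) {
        // skip trainings that already have an open follow-up task
        Set<Id> alreadyHandled = new Set<Id>();
        for (Task t : [SELECT WhatId FROM Task WHERE WhatId IN :scope AND IsClosed = false]) {
            alreadyHandled.add(t.WhatId);
        }

        List<Task> toInsert = new List<Task>();
        for (Training__c tr : scope) {
            if (!alreadyHandled.contains(tr.Id)) {
                toInsert.add(new Task(
                    WhatId = tr.Id,
                    OwnerId = tr.OwnerId,   // assumes Training__c has its own owner (not a master-detail child)
                    Subject = 'Follow up on finished training',
                    ActivityDate = Date.today()));
            }
        }
        insert toInsert;
    }

    global void finish(Database.BatchableContext bc) {}

    // Schedulable hook so the same class can be scheduled nightly from Anonymous Apex:
    // System.schedule('Nightly training check', '0 0 1 * * ?', new TrainingEndDateBatch());
    global void execute(SchedulableContext sc) {
        Database.executeBatch(new TrainingEndDateBatch());
    }
}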
[Note: There is a Teacher object with fields such as Teacher Name, DateofJoining, and also a formula field called Experience.]
My task was to create a Public Group consisting of another user,
and this user should only see teachers whose experience is greater than 2 years.
But when I create a criteria-based sharing rule, the field called Experience doesn't show up, as it is a formula field.
So I had the idea of creating a new field (maybe a text or number data type) which would hold the value of Experience. (But I have no idea how to implement this.)
Is there a way to implement this?
Any other solution is also well appreciated!
Hard to say.
The normal trick would be to create a helper field (text, number, whatever) and have a piece of functionality that populates it. Ideally an "early flow" or a "before insert, before update" trigger; worst case a normal flow, Process Builder or an "after insert, after update" trigger. Something like "if Experience__c != 'your formula here' then Experience__c = 'your formula here'". Consult the normal SF help and Trailhead if you've never used early flows.
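For illustration, a minimal trigger sketch of that idea. Teacher__c, DateofJoining__c and the new plain number field Experience_Years__c are all assumed API names - adjust to yours (an "early" record-triggered flow would do the same without code).

trigger TeacherExperienceSync on Teacher__c (before insert, before update) {
    for (Teacher__c t : Trigger.new) {
        if (t.DateofJoining__c != null) {
            // same maths as the formula field, just persisted to a real, filterable field
            t.Experience_Years__c = t.DateofJoining__c.daysBetween(Date.today()) / 365.0;
        }
    }
}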
You'd make a one-off data fix to populate existing records and job done - the normal field should be selectable as sharing rule criteria.
=====
But I smell trouble with your formula. What exactly do you have there - something like Experience__c = (TODAY() - DateofJoining__c) / 365? That's a bit evil. Formulas with TODAY(), NOW() or anything with $ (roughly speaking, things about who's looking at the data - the user's name, profile, role... not what's actually on the record itself) are "nondeterministic". Unpredictable.
A "today()" changes just like that, without updating the record. Sure, when you watch the record a fresh value will be calculated but other than that LastModifiedDate doesn't change, there's no magical trigger running at midnight that rechecks sharing. (especially that there's no single midnight, you could have users in multiple timezones). SF just doesn't allow nondeterministic fields in many places, see https://salesforce.stackexchange.com/q/32122/799
So if you do rely on TODAY() in your formula, you might have to make a "scheduled flow" or read about schedulable, batchable Apex. Create a nightly job that runs and recalculates your helper field with the right experience. You'd probably even need both solutions: a "before save" flow for new data created today and a nightly job to advance the clock on existing old data...
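A rough sketch of that nightly "advance the clock" piece, with the same assumed names as above (for serious data volumes you'd make it Database.Batchable instead of updating everything in one go):

global class RecalcTeacherExperience implements Schedulable {
    global void execute(SchedulableContext sc) {
        List<Teacher__c> teachers =
            [SELECT Id, DateofJoining__c FROM Teacher__c WHERE DateofJoining__c != null];
        for (Teacher__c t : teachers) {
            t.Experience_Years__c = t.DateofJoining__c.daysBetween(Date.today()) / 365.0;
        }
        update teachers;  // a plain update; the before-save trigger/flow above would recalculate it anyway
    }
}
// schedule it once from Anonymous Apex, for example:
// System.schedule('Nightly experience recalc', '0 0 2 * * ?', new RecalcTeacherExperience());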
So I run my code on a weekly basis. Let's say it's 100 new tasks. Each task works on new data for that week.
I want to have a failsafe in case something happens, like my computer randomly shutting off or my internet connection dropping 30/100 tasks in.
So my idea was to have the database essentially load the 100 tasks, along with that week's data, into a table at the beginning, as a sort of temporary todo-list table, then remove them as I go, one by one. So if it fails at task 30 out of 100, then next week I'll have the other 70 still on my todo list plus the new 100.
Does this make sense as a design pattern? The table will essentially be empty 99% of the time. We also already use Postgres, so I was thinking of just using that, which I guess feels even worse since it offers so much and I'm using it for such a simple reason.
Do "state machines" fit anywhere here? They were suggested to me by someone, and after Googling I don't really see how they would help.
I'm having a problem with a batch job that has a simple SOQL query that returns a lot of records - more than a million.
The query, as it is, cannot be optimized much further according to SOQL best practices. (At least, as far as I know. I'm not an SF SOQL expert.)
The problem is that I'm getting -
Caused by: javax.ws.rs.ProcessingException: java.net.SocketTimeoutException: Read timed out
I tried bumping up the Jersey read timeout value from 30 seconds to 60 seconds, but it still times out.
Any recommendation on how to deal with this issue? Any recommended value for the read timeout parameter for a query that returns that much data?
The query is like this:
SELECT Id, field1, field2__c, field3__c, field4__c FROM Object__c
WHERE field2__c = true AND (NOT field3__c LIKE '\u0025Some string\u0025')
ORDER BY field4__c ASC
In no specific order...
Batches written in Apex time out after 2 minutes, so maybe set the same limit in your Java application.
Run your query in Developer Console using the query plan feature (you'll probably have to put a real % in there, not \u0025). Pay attention to which part has a "Cost" column value > 1.
What are the field types? Plain checkbox and text, or some complex formulas?
Is that text static, or does it change depending on what your app needs? Would you consider filtering out the string in your code rather than in SOQL? It's counter-intuitive to return more records than you really need, but it might be an option.
Would you consider making a formula field with either the whole logic or just the string search, and then asking SF to index the formula? Or maybe making another field (another checkbox?) with "yes, it contains that text" info, with the value set by a workflow maybe (essentially preparing your data a bit so you can query it efficiently later) - see the trigger sketch after this list.
Read up on skinny tables and see if they're something that could work for you (they need SF support).
Can you make an analytic snapshot of your data (make a report, have SF save the results to a helper object, query that object)? Even if it just contained lookups to your original source, so you'd always access fresh values, it could help. Might be a storage killer though.
Have you considered "big objects" and async SOQL?
I'm not proud of it, but in the past I had some success badgering the SF database. Not via the API, but when I had a nightly batch job that was timing out I kept resubmitting it and eventually, on the 3rd-5th attempt, it managed to start. Something in the query optimizer, the creation of the cursor in the underlying Oracle database, caching of partial results... I don't know.
What's in the ORDER BY? Some date field? If you need records updated since X first, then maybe the replication API could help by getting the ids first.
Does it make sense to use LIMIT 200, for example? Which API are you using, SOAP or REST? It might be that returning smaller chunks (SOAP: batch size, REST API: a special header) would help it finish faster.
When all else fails (but do contact SF support, make sure you've exhausted the options), maybe restructure the whole thing. Make SF push data to you whenever it changes rather than you pulling it. There's the "Streaming API" (a CometD implementation, Bayeux protocol, however these are called), "Change Data Capture" and "Platform Events" for nice event-bus-driven architecture decisions, with replay of old events up to 3 days back if the client was down and couldn't listen... But that's a totally different topic.
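To illustrate the "prepare your data" item from the list above: a sketch of a before-save trigger (a workflow or record-triggered flow would do the same job) keeping a made-up Contains_Some_String__c checkbox in sync, so the expensive LIKE '%...%' filter turns into a plain equality filter. Existing rows would still need a one-off data fix.

trigger ObjectContainsStringFlag on Object__c (before insert, before update) {
    for (Object__c o : Trigger.new) {
        // field3__c is the text field the original LIKE filter runs against
        o.Contains_Some_String__c = o.field3__c != null
            && o.field3__c.containsIgnoreCase('Some string');
    }
}
// the external query could then become something like:
// SELECT Id, field1, field2__c, field3__c FROM Object__c
// WHERE field2__c = true AND Contains_Some_String__c = false
// ORDER BY field4__c ASC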
I'm creating an application that allows me to create scheduled tasks. I have a threaded process that runs in the background every minute, and triggers the following class:
Namespace MyName.space
Public Class RunJob
...
End Class
End Namespace
A job can either be "Run Once" or "Recurring".
I am querying the job table in vb.net, storing the results in a dataset so that I can iterate over them one at a time. I then started thinking about functions to validate if a job should run or not based on its unique criteria.
For example, the simplest of all would be the "Run Once" jobs:
If Type = "Run Once"
JobShouldRun = Helpers.Validate_RunOnce(WhenToRun, RunCount)
ElseIf Type = "Recurring"
...
End If
The function would check whether the WhenToRun is within 5 minutes of the current date and time configured on the server. I chose a 5-minute window so that, in the event of a task failure, it'll try again a minute later and still have 3-4 tries left.
But then I started thinking that I could control more of this within the SQL query that gets the jobs themselves, and skip validating this "easy" one within VB.NET, but I'm unsure which method would be more optimal. So I could limit the result set to begin with in my initial query:
SELECT
     JobId
    ,JobType -- Run Once or Recurring
    ,WhenToRun
FROM JobTable a
WHERE Active = 1
AND (
        (
            JobType = 'Run Once'
            AND TotalRuns = 0
            AND DATEDIFF(minute, getdate(), a.WhenToRun) BETWEEN 1 AND 5
        )
        OR JobType = 'Recurring'
    )
Then I started thinking: should I perform all my logic with either UNIONs or JOINs, or complex WHERE/AND/OR conditions on the recurring jobs as well? They get quite a bit more complicated... such as weekly jobs with which days of the week and what time, or every "int" years in "Jan|March|Etc" on the last day of the month.
So I started thinking along these lines:
SELECT
JobID
,JobType
...
FROM JobTable
WHERE RecurringType = 'Weekly'
AND ... more conditions based on all the custom job settings
UNION ALL
SELECT
JobID
,...
FROM JobTable
WHERE RecurringType = 'Monthly'
AND ... more conditions based on all the custom job settings
This query would end up quite large and complex, but my question is: should I handle this in VB.NET or SQL, or the easy stuff in SQL and the more complicated conditions in VB.NET? I'm unsure of the performance impact of either direction.
You're basically asking "Where do I put the business logic code? In the app, or in the database?" It can be a hotly debated topic, but it's sensible to consider both options. There are tradeoffs with each approach, and it seems kinda cavalier to say one way is right and the other is wrong.
If your BL code is in the app, you get the primary benefit of the robustness of the .NET Framework. I love T-SQL, but you just can't do as much with it as you can with VB.NET. If I can stereotype developers, I suspect in general they'll be more comfortable with .NET code. For most, it's probably easier to debug than T-SQL. Since the .NET code is compiled to an assembly, it's not likely to be altered either.
If your BL code is in T-SQL, you may find that performance is a bit better. You also abstract away some of the complexity from the .NET code and make that code base a bit smaller (some might argue this is a bad thing). If there are bugs that need to be fixed, it's usually easier to redeploy a stored procedure (or user-defined function, view, etc.) than to redeploy an application (especially if it involves multiple workstations). On the downside, it's easy for others to see (steal!) your code, or make changes to it.
As a general guideline, the more complex the business logic is, the more likely I'd be to put it in the app. If it's pretty simple and not likely to change, I'll consider putting it in tsql. That being said, if I had to pick one or the other with no exceptions, I'd put the BL in the app.
This is a data design problem I am facing right now.
So I am currently designing a scheduling application for a farm. To deal with orders coming in, I've designed the following class called Forecast. Forecast has the following attributes:
Due_date
Entries - This is a list of {crop:X, quantity:Y} pairs which indicates how much of each crop is required for the order.
As you know, each crop has different durations for:
Propagation Duration (from seed to planting)
Spacing Duration (from planted to more spaced out for maximum sunlight)
Storage Duration (from harvest to expiry)
I wrote a "backward scheduler" which figures out when an order has to be met, then goes backwards and creates task for what needs to be done on what date.
Note that these calendar "entries" are not stored in the database - I generate them on the fly and am planning to allow my users to drag these events forward and backward depending on their preference.
Now, I am tasked with generating reports for each day on what needs to be done. My users would like to be able to generate a list of tasks for a particular date. (So if you look at my screenshot, May 1st would result in one task being created: 'transplanting 12 units of Minutina'.)
I am not sure what's the best way to go about accomplishing this. My initial design is:
Create a table for tasks. This will include: (forecast_id, forecast_updated_at)
Store the tasks generated by my scheduler into this table.
When a user modifies a task, the database entry will be updated correspondingly.
However, if the user decides to modify the original Forecast, then we will have to delete all the tasks associated with this forecast, regenerate a new set of tasks, and store them again in the table mentioned above.
The purpose of the (forecast_id, forecast_updated_at) is to allow me to quickly verify the correctness of the tasks currently in my database.
This design is a little messy, and I would like to hear from any experts out there about a better approach to this problem.