Update document in Processor based on a condition - Vespa

{
    "status": 1,
    "expire": 15349870000,
    "detail1": "test1",
    "detail2": "test2"
}
{
    "status": 0,
    "expire": 15349870000,
    "detail1": "test1",
    "detail2": "test2"
}
I have two documents of the same document type, and I want to update status, detail1 and detail2 based on these conditions:
if(status==0 and expire > now()) then status = 1 and detail1 = "good"
if(status==1 and expire > now()) then status = 2 and detail2 = "bad"
But I want to do all of this in a Processor. How can I apply such a check in the processor, given that I am unable to get the values of the fields there?
@Override
public Progress process(Processing processing) {
    for (DocumentOperation op : processing.getDocumentOperations()) {
        if (op instanceof DocumentUpdate) {
            DocumentUpdate documentUpdate = (DocumentUpdate) op;
            if (?) {
                documentUpdate.addFieldUpdate(FieldUpdate.createAssign(documentUpdate.getDocumentType().getField("detail1"), new StringFieldValue("good")));
            }
            else if (?) {
                documentUpdate.addFieldUpdate(FieldUpdate.createAssign(documentUpdate.getDocumentType().getField("detail2"), new StringFieldValue("bad")));
            }
        }
    }
    return Progress.DONE;
}
Please help!

You are operating only on UPDATE document operations (if op instanceof DocumentUpdate). In that case you don't have access to the original document fields stored in the index, only to the field updates that are part of the DocumentUpdate itself. See https://docs.vespa.ai/documentation/document-processing-overview.html
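If the condition has to be evaluated against field values, one option is to apply it where the whole document is available in the operation, i.e. on PUT operations rather than updates. A minimal sketch under that assumption (field names taken from the question; the exact field types in your schema may differ):

@Override
public Progress process(Processing processing) {
    for (DocumentOperation op : processing.getDocumentOperations()) {
        if (op instanceof DocumentPut) {
            Document doc = ((DocumentPut) op).getDocument();
            // The whole document is in the operation, so field values can be read here
            IntegerFieldValue status = (IntegerFieldValue) doc.getFieldValue("status");
            LongFieldValue expire = (LongFieldValue) doc.getFieldValue("expire");
            if (status == null || expire == null) continue;
            long now = System.currentTimeMillis() / 1000;
            if (status.getInteger() == 0 && expire.getLong() > now) {
                doc.setFieldValue("status", new IntegerFieldValue(1));
                doc.setFieldValue("detail1", new StringFieldValue("good"));
            } else if (status.getInteger() == 1 && expire.getLong() > now) {
                doc.setFieldValue("status", new IntegerFieldValue(2));
                doc.setFieldValue("detail2", new StringFieldValue("bad"));
            }
        }
    }
    return Progress.DONE;
}

For update operations that need the stored values, the processor would have to fetch the document itself (or the check would have to happen elsewhere), since the DocumentUpdate only carries the changes.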

Related

Why am I getting a DML error in a Salesforce update trigger?

I'm trying to update BillingCountry in Salesforce via the Bulk API from a CSV; there are 1025 entries in the CSV file. The Account IDs in the CSV have a BillingCountry different from what is in Salesforce, but I get the following error:
CANNOT_INSERT_UPDATE_ACTIVATE_ENTITY:test_tr_U: System.LimitException: Too many DML rows: 10001:--
Here is my Trigger:
Trigger test_tr_U on Account (after update)
{
    if (checkRecursion.runOnce())
    {
        set<ID> ids = Trigger.newMap.keyset();
        List<TestCust__c> list1 = new List<TestCust__c>();
        for (ID id : ids)
        {
            for (Account c : Trigger.new)
            {
                Account oldObject = Trigger.oldMap.get(c.ID);
                if (c.Billing_Country__c != oldObject.Billing_Country__c ||
                    c.BillingCountry != oldObject.BillingCountry)
                {
                    TestCust__c change = new TestCust__c();
                    change.Field1__c = 'update';
                    change.Field2__c = id;
                    change.Field3__c = false;
                    change.Field4__c = 'TESTCHANGE';
                    list1.add(change);
                }
            }
        }
        Database.DMLOptions dmo = new Database.DMLOptions();
        dmo.assignmentRuleHeader.useDefaultRule = true;
        Database.insert(list1, dmo);
    }
}
Looping over every ID and then over every updated account will result in the creation of #ids * #accounts records. Those are the same value, so you'll create #accounts^2 (1,050,625) TestCust__c records.
Salesforce splits the 1025 records into chunks of 200, and since a Bulk API request caused the trigger to fire multiple times, governor limits are reset between these trigger invocations within the same HTTP request.
Even so, each trigger invocation creates 200 * 200 = 40,000 TestCust__c records, which is well above the limit of 10,000 records per transaction, so the system raises a LimitException.
You should remove the outer loop: it's simply wrong.
Trigger test_tr_U on Account (after update)
{
    if (checkRecursion.runOnce())
    {
        List<TestCust__c> list1 = new List<TestCust__c>();
        for (Account c : Trigger.new)
        {
            Account oldObject = Trigger.oldMap.get(c.Id);
            if (c.Billing_Country__c != oldObject.Billing_Country__c ||
                c.BillingCountry != oldObject.BillingCountry)
            {
                TestCust__c change = new TestCust__c();
                change.Field1__c = 'update';
                change.Field2__c = c.Id;
                change.Field3__c = false;
                change.Field4__c = 'TESTCHANGE';
                list1.add(change);
            }
        }
        Database.DMLOptions dmo = new Database.DMLOptions();
        dmo.assignmentRuleHeader.useDefaultRule = true;
        Database.insert(list1, dmo);
    }
}

Flink session window not working as expected

The session window in Flink is not working as expected in the prod environment (the same logic works in the local environment). The idea is to emit the count of 'sample_event_two' events for a specific user id and record id if there is at least one event of type 'sample_event_one' for the same user id and record id. ProcessingTimeSessionWindows with a session gap of 30 minutes is used, and the ProcessWindowFunction has the logic below (I do a keyBy on the user id and record id fields before applying the window):
public void process(
        String s,
        Context context,
        Iterable<SampleEvent> sampleEvents,
        Collector<EnrichedSampleEvent> collector)
        throws Exception {
    EnrichedSampleEvent event = null;
    boolean isSampleEventOnePresent = false;
    int count = 0;
    for (SampleEvent sampleEvent : sampleEvents) {
        if (sampleEvent.getEventName().equals("sample_event_one_name")) {
            Logger.info("Received sample_event_one for userId: {}");
            isSampleEventOnePresent = true;
        } else {
            // Calculate the count for sample_event_two
            count++;
            if (Objects.isNull(event)) {
                event = new EnrichedSampleEvent();
                event.setUserId(sampleEvent.getUserId());
            }
        }
    }
    if (isSampleEventOnePresent && Objects.nonNull(event)) {
        Logger.info(
                "Created EnrichedSampleEvent for userId: {} with count: {}",
                event.getUserId(),
                event.getCount());
        collector.collect(event);
    } else if (Objects.nonNull(event)) {
        Logger.info(
                "No sampleOneEvent event found sampleTwoEvent with userId: {}, count: {}",
                event.getUserId(),
                count);
    }
}
Although sample_event_one is present in the collection (confirmed by checking that the log message "Received sample_event_one" appears) and the count is calculated correctly, I don't see any output event being created. Instead of an EnrichedSampleEvent being emitted, I see the log message "No sampleOneEvent event found sampleTwoEvent with userId: 123, count: 5". Can someone help me fix this?
Your ProcessWindowFunction will be called for each key individually. Since the key is a combination of user id and record id, it's not enough to know that "Received sample_event_one" appears in the logs for the same user. Even though it was the same user, it might have had a different record id.
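One way to check this on prod is to log the window key (the first parameter of process(), which Flink passes as the key the window belongs to) together with each event, and confirm that both event types really end up under the same key and in the same session. A minimal sketch, assuming the key is built from user id and record id in the keyBy (the exact keyBy and the class name below are not shown in the question and are only illustrative):

// Assumed pipeline shape:
// stream.keyBy(e -> e.getUserId() + "|" + e.getRecordId())
//       .window(ProcessingTimeSessionWindows.withGap(Time.minutes(30)))
//       .process(new EnrichmentWindowFunction());

@Override
public void process(
        String key,    // the userId|recordId combination this window belongs to
        Context context,
        Iterable<SampleEvent> sampleEvents,
        Collector<EnrichedSampleEvent> collector) {
    for (SampleEvent sampleEvent : sampleEvents) {
        // Logging the key shows whether sample_event_one and sample_event_two
        // were actually grouped together, or landed under different keys.
        Logger.info("key={} event={}", key, sampleEvent.getEventName());
    }
    // ... existing counting and emit logic ...
}

If the log shows sample_event_one under one key and the sample_event_two events under another (for example the same user id but a different record id), the window for the second key never sees the first event and takes the else branch.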

What does oldMap actually do here? Can someone explain?

This is code that creates a new Task when the stage is inserted as, or updated to, 'Closed Won':
trigger ClosedOpportunityTrigger on Opportunity (after insert, after update) {
    List<Task> tl = new List<Task>();
    for (Opportunity op : Trigger.new) {
        if (Trigger.isInsert) {
            if (op.StageName == 'Closed Won') {
                tl.add(new Task(Subject = 'Follow Up Test Task', WhatId = op.Id));
            }
        }
        if (Trigger.isUpdate) {
            if (op.StageName == 'Closed Won'
                && op.StageName != Trigger.oldMap.get(op.Id).StageName) {
                tl.add(new Task(Subject = 'Follow Up Test Task', WhatId = op.Id));
            }
        }
    }
    if (tl.size() > 0) {
        insert tl;
    }
}
Here, what does && op.StageName != Trigger.oldMap.get(op.Id).StageName do? Why do we use oldMap here?
Trigger.newMap is a map of record IDs to the new versions of the records. It is available in insert, update, and undelete triggers, and the 'new' records can only be modified in before triggers.
Trigger.oldMap is a map of record IDs to the old versions of the records. It is available in update and delete triggers only.
if (Trigger.isUpdate) {
    // Iterate updated opportunities
    for (Opportunity o : Trigger.new) {
        // Get the opportunity before the update
        Opportunity oldOpp = Trigger.oldMap.get(o.Id);
        // Check if a value changed
        if (o.Some_Value__c == oldOpp.Some_Value__c) {
            System.debug('Value did not change.');
        } else {
            System.debug('Value changed!');
        }
    }
}
Note: I could have used Trigger.newMap instead of Trigger.new but I'd be looping through Trigger.newMap.values() instead - with the same end result. newMap is just a convenient way of getting the bulkified data in map form instead of a list.
We use oldMap to compare the old value of the Some_Value__c custom field with the new one. If the two values differ, the field value has changed. Of course, if you read the code in the two if branches, this is obvious.

Salesforce APEX method not bulkified

I have written an Apex class that sends an email when a client is released. There is a method that I thought I had bulkified, but I was told it is not, because it calls another method which does the actual email creation, and that one is not bulkified. Can someone guide me on how the SOQL queries can be taken out of that method?
global class LM_ChangeAccountRT {
private static final Profile sysAdmin = [select id from profile where name='System Administrator'];
@AuraEnabled
public static String resendEmails(List<String> accountIdList) {
String response = null;
try {
//Only send emails if user is either an ARMS Administor or System Administrator
if (System.label.ARMS_Administrator_Profile_Id == userinfo.getProfileId() ||
sysAdmin.Id == userinfo.getProfileId()) {
List<Account> accList = [SELECT Id,Client_Released__c, RecordTypeId,Client_Number__c, Client_Released__c, Email_Sent__c FROM Account WHERE Id IN:accountIdList];
for(Account acc: accList){
if (acc.Client_Number__c != null && acc.Client_Released__c && acc.Email_Sent__c == true) {
sendpdfgenerationEmails(acc); //this is the method thats not bulkified.
acc.Email_Sent__c = false;
response = 'Email Sent';
}else {
response= 'Access Denied';
}
}
update accList;
}
}catch(Exception e) {
System.debug(e.getMessage());
response = 'Error sending emails';
}
return response;
}
public static void sendpdfgenerationEmails(Account acc){
system.debug('start of confirmation card and pdf generation');
//Logic to find which VF template is used to send an email.
list<EmailTemplate> templateId = new list<EmailTemplate>();
string temppartner;
String partner_opt_in_attachment;
boolean sendFCAmail;
List<Dealership_PDF_Generation__c> custsettingdata = Dealership_PDF_Generation__c.getall().values();
System.debug('custom setting size = ' + custsettingdata.size());
// Fetch State
if(acc.Dealership_State__c!=null && acc.Dealership_Partner__c!=null)
{
for(Dealership_PDF_Generation__c tempcustsetting :custsettingdata)
{
if(acc.Dealership_Partner__c == tempcustsetting.Dealership_Partner__c && acc.Dealership_State__c==tempcustsetting.State__c && tempcustsetting.State__c=='WA' && acc.Dealership_State__c=='WA'){
//For WA State
// temppartner= '%' + tempcustsetting.TEMPLATE_Unique_name__c + '%';
temppartner= tempcustsetting.TEMPLATE_Unique_name__c;
if(acc.Dealership_Spiff_Payment__c == '% premium'){
partner_opt_in_attachment=tempcustsetting.opt_in_form_premium__c;
}else{
partner_opt_in_attachment=tempcustsetting.opt_in_form_nonpremium__c;
}
}
else if(acc.Dealership_Partner__c == tempcustsetting.Dealership_Partner__c && acc.Dealership_State__c==tempcustsetting.State__c && tempcustsetting.State__c=='TX' && acc.Dealership_State__c=='TX'){
//For TX State
//temppartner= '%' + tempcustsetting.TEMPLATE_Unique_name__c + '%';
temppartner= tempcustsetting.TEMPLATE_Unique_name__c;
if(acc.Dealership_Spiff_Payment__c == '% premium'){
partner_opt_in_attachment=tempcustsetting.opt_in_form_premium__c;
}else{
partner_opt_in_attachment=tempcustsetting.opt_in_form_nonpremium__c;
}
}
else if(acc.Dealership_Partner__c == tempcustsetting.Dealership_Partner__c && acc.Dealership_State__c!=tempcustsetting.State__c && tempcustsetting.State__c!='TX' && acc.Dealership_State__c!='TX' && acc.Dealership_State__c!='WA' &&tempcustsetting.State__c!='WA' ){
//For Non TX State
//temppartner= '%' + tempcustsetting.TEMPLATE_Unique_name__c + '%';
temppartner= tempcustsetting.TEMPLATE_Unique_name__c;
if(acc.Dealership_Spiff_Payment__c == '% premium'){
partner_opt_in_attachment=tempcustsetting.opt_in_form_premium__c;
}else{
partner_opt_in_attachment=tempcustsetting.opt_in_form_nonpremium__c;
}
system.debug('grabbed template: ' + temppartner);
}
if(acc.Dealership_Partner__c != null && temppartner!=null ){
templateId.add([Select id,DeveloperName from EmailTemplate where DeveloperName = :temppartner]); //This will probably cause governor limit issues. First problem
}
if (partner_opt_in_attachment != null) {
StaticResource sr = [Select s.Name, s.Id, s.Body From StaticResource s where s.Name =: partner_opt_in_attachment]; //'static_resource' is the name of the static resource PDF. This is another SOQL query that will cause problems
Blob tempBlob = sr.Body;
Messaging.EmailFileAttachment efa = new Messaging.EmailFileAttachment();
efa.setBody(tempBlob);
efa.setFileName('Opt-in.pdf');
List<Messaging.EmailFileAttachment> attachments = new List<Messaging.EmailFileAttachment>();
attachments.add(efa);
// add attachment to each email
for (Messaging.SingleEmailMessage email : emails) {
email.setFileAttachments(attachments);
}
}
system.debug('email sent: ' + emails.size());
Messaging.sendEmail(emails);
}
}
    }
}
The reason I am trying to bulkify this is that I have written an Apex scheduler that calls the resendEmails method every day at 7am to check which records need an email sent. I am afraid that if there are more than 100 clients it will cause problems and not send the emails. Any suggestions on how I can optimize the sendpdfgenerationEmails() method?
Thank you
Yes, you are right - your resendEmails() method is not bulkified.
First, let me explain why that is:
SOQL to get Accounts
Loop 1 on List of Account records
    Call sendpdfgenerationEmails() method
        Retrieve list of Dealership_PDF_Generation__c records
        Loop 2 on List of Dealership_PDF_Generation__c records
            SOQL to get StaticResources - Very bad! It's inside a double loop!
            Call Messaging.sendEmail() method - Very bad! It's inside a double loop!
Update on List of Account records
You need to remember that:
1. You should never do SOQLs in loops! - Limit 100 SOQLs per transaction
2. You should never call Messaging.sendEmail() in loops! - Limit 10 calls per transaction
Now let me show you how to refactor this method:
@AuraEnabled
public static String resendEmails(List<String> accountIdList) {
    // 1. SOQL for the List of Account records
    // 2. Retrieve the list of Dealership_PDF_Generation__c records
    // 3. SOQL for the List of StaticResources for all Names from the Dealership_PDF_Generation__c records
    // 4. Declare a new List variable for Messaging.SingleEmailMessage objects
    // 5. Loop 1 over the List of Account records
    // 6. Call a new "prepareEmailsForAccount()" method, which prepares and returns a list of Messaging.SingleEmailMessage objects
    // 7. Add the returned Messaging.SingleEmailMessage objects to the list from point 4
    // 8. End of loop 1
    // 9. Call Messaging.sendEmail() with the list from point 4
    // 10. Update the List of Account records
}
With this you will avoid running SOQL queries and calling Messaging.sendEmail() in loops.

Audit of what records a given user can see in SalesForce.com

I am trying to determine a way to audit which records a given user can see, by:
Object Type
Record Type
Count of records
Ideally I would also be able to see which fields the user can see for each object/record type.
We will need to repeat this often, for different users and in different orgs, so I would like to avoid determining this manually.
My first thought was to create an app using the partner WSDL, but would like to ask if there are any easier approaches or perhaps existing solutions.
Thanks all
I think that you can follow the documentation to solve it, using a query similar to this one:
SELECT RecordId
FROM UserRecordAccess
WHERE UserId = [single ID]
AND RecordId = [single ID] // or RecordId IN [list of IDs]
AND HasReadAccess = true
This query returns the records to which the queried user has read access.
In addition, you could add LIMIT 1 and get the object type, record type, and so on from the record metadata.
I ended up using the code below (C#, using the Partner WSDL) to get an idea of what kinds of objects the user had visibility into.
Just a quick'n'dirty utility for my own use (read: not prod code):
var service = new SforceService();
var result = service.login("UserName", "Password");
service.Url = result.serverUrl;
service.SessionHeaderValue = new SessionHeader { sessionId = result.sessionId };

var queryResult = service.describeGlobal();
int total = queryResult.sobjects.Count();
int batchSize = 100;
var batches = Math.Ceiling(total / (double)batchSize);

using (var output = new StreamWriter(@"C:\test\sfdcAccess.txt", false))
{
    for (int batch = 0; batch < batches; batch++)
    {
        var toQuery =
            queryResult.sobjects.Skip(batch * batchSize).Take(batchSize).Select(x => x.name).ToArray();
        var batchResult = service.describeSObjects(toQuery);
        foreach (var x in batchResult)
        {
            if (!x.queryable)
            {
                Console.WriteLine("{0} is not queryable", x.name);
                continue;
            }
            var test = service.query(string.Format("SELECT Id FROM {0} LIMIT 100", x.name));
            if (test == null || test.records == null)
            {
                Console.WriteLine("{0}: null records", x.name);
                continue;
            }
            foreach (var record in test.records)
            {
                output.WriteLine("{0}\t{1}", x.name, record.Id);
            }
            Console.WriteLine("{0}:\t{1} records", x.name, test.size);
        }
    }
    output.Flush();
}
