Is there a way to retrieve Salesforce picklist values for record types using apex? - salesforce

I need to gather information about which picklist values are available for every record type. I know this can be achieved using either the describeLayout or readMetadata API call, but when I try to gather this info for a large custom object, trouble starts: the Salesforce API returns each record type together with all picklist values available for it.
<recordTypeMappings>
<name>Record1</name>
<picklistsForRecordType>
<picklistName>Picklist1</picklistName>
<picklistValues>
...
</picklistValues>
</picklistsForRecordType>
<picklistsForRecordType>
<picklistName>Picklist2</picklistName>
<picklistValues>
...
</picklistValues>
</picklistsForRecordType>
</recordTypeMappings>
<recordTypeMappings>
<name>Record2</name>
<picklistsForRecordType>
<picklistName>Picklist1</picklistName>
<picklistValues>
...
</picklistValues>
</picklistsForRecordType>
<picklistsForRecordType>
<picklistName>Picklist2</picklistName>
<picklistValues>
...
</picklistValues>
</picklistsForRecordType>
</recordTypeMappings>
This means that if I have a large object (say, 200 picklists and 100 record types), I will get 200*100 = 20,000 picklist entries, which makes the API response extremely large: up to 80 MB. It is also extremely inefficient: even if a picklist's values are the same for all record types, they are still repeated under every record type in the response.
The idea is to extract the unique sets of picklist values and attach the record type names to each set, so that the same picklist is not duplicated for every record type.
<recordTypeMappings>
<name>Record1, Record2</name>
<picklistsForRecordType>
<picklistName>Picklist1</picklistName>
<picklistValues>
...Values which are the same for Record1 and Record2...
</picklistValues>
</picklistsForRecordType>
<picklistsForRecordType>
<picklistName>Picklist2</picklistName>
<picklistValues>
...Values which are the same for Record1 and Record2...
</picklistValues>
</picklistsForRecordType>
</recordTypeMappings>
This would reduce the response size. Is there a way to do that in Apex? I searched the API documentation and was not able to find anything suitable. Apex seems like the better option, since all the processing would happen on the Salesforce side.
Thanks for the help.
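To illustrate, the grouping I have in mind could be sketched in Apex like this (the map shape, names, and sample data are hypothetical, not what describeLayout actually returns):

```apex
// Hypothetical sketch of the deduplication idea: record types that share an
// identical set of picklist values collapse into a single entry.
// valuesByRecordType maps a record type name to its (sorted) picklist values.
Map<String, List<String>> valuesByRecordType = new Map<String, List<String>>{
    'Record1' => new List<String>{'A', 'B'},
    'Record2' => new List<String>{'A', 'B'},
    'Record3' => new List<String>{'C'}
};
Map<String, List<String>> recordTypesByValueSet = new Map<String, List<String>>();
for (String recordType : valuesByRecordType.keySet()) {
    // Build a signature for the value set so identical sets map to one key
    String signature = String.join(valuesByRecordType.get(recordType), '|');
    if (!recordTypesByValueSet.containsKey(signature)) {
        recordTypesByValueSet.put(signature, new List<String>());
    }
    recordTypesByValueSet.get(signature).add(recordType);
}
// recordTypesByValueSet now holds two entries:
// 'A|B' -> (Record1, Record2) and 'C' -> (Record3)
```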

To filter out duplicates and keep only unique values, try capturing the picklist values in a Set. For example, here is a function that takes a list of picklist field describe results and returns a set of unique picklist values.
// Given a list of picklist field describes, return a set of unique picklist values
Set<Schema.PicklistEntry> getUniquePickListValues(List<Schema.DescribeFieldResult> pickListFields) {
    Set<Schema.PicklistEntry> uniquePicklistValues = new Set<Schema.PicklistEntry>();
    for (Schema.DescribeFieldResult pickList : pickListFields) {
        // A DescribeFieldResult exposes its entries directly via getPicklistValues()
        uniquePicklistValues.addAll(pickList.getPicklistValues());
    }
    return uniquePicklistValues;
}
Note that Set.addAll can merge a whole list of entries into the set in one call, which avoids the need for an explicit inner loop over the entries.
Hope this helps.

If you want to retrieve picklist values based on record type, please check my solution here:
https://salesforce.stackexchange.com/questions/103837/how-do-i-get-the-intersection-of-recordtype-and-picklist-values-inside-apex/202519#202519
It uses a REST API call, but the response will be similar to a getDescribe result plus record type info.

Here is how the performance and volume issues were solved.
The challenge was to collect the available picklist values for all record types of huge custom objects with many record types and picklists.
First of all, I did not find any way to do this directly in Apex, so I used API calls.
When we receive a custom object description via the describeSObject call (https://developer.salesforce.com/docs/atlas.en-us.api.meta/api/sforce_api_calls_describesobject.htm), we get all the picklist values and all the record types. What we do not get is which picklist values are available for each record type. For that we need to execute a describeLayout request (https://developer.salesforce.com/docs/atlas.en-us.api.meta/api/sforce_api_calls_describelayout.htm). Using the information from describeSObject, we can estimate how large the describeLayout response will be.
For example, if we have 500 picklist values and 20 record types, the describeLayout response will contain up to 500*20 = 10,000 picklist values (since describeLayout returns all picklist values available for each record type). We then need to estimate how large that XML response would be, since the Salesforce API has a response size limit of 5 MB. After inspecting the responses, I found that staying under the 5 MB limit requires fewer than roughly 30,000 picklist values per describeLayout request.
The solution was to break this one large call into several smaller ones by record type: retrieve all picklist values for a few record types at a time, then repeat for the rest.
It took up to 24 API requests to retrieve 70 MB of data from the Salesforce API, which would not have been possible in a single call because of the response size limit.
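As a rough sketch, the batch size per describeLayout call can be derived from the describeSObject numbers. All figures below are assumptions taken from the estimates above, not API constants:

```apex
// Sketch: how many record types fit into one describeLayout call if each
// response should stay under the observed ~30,000 picklist-value ceiling.
Integer totalPicklistValues = 500;    // counted from the describeSObject result
Integer recordTypeCount = 100;
Integer maxValuesPerResponse = 30000; // empirically derived ceiling, see above
Integer recordTypesPerCall = maxValuesPerResponse / totalPicklistValues; // 60
// Integer ceiling division: total number of describeLayout calls needed
Integer callsNeeded = (recordTypeCount + recordTypesPerCall - 1) / recordTypesPerCall; // 2
```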

Related

Number of fields

I am new to Salesforce and I have a question. I am creating a new report type using Opportunities, and in the field layout properties I noticed there were 40 fields in total for the Opportunity object. I don't have any other object in a relationship with Opportunity, just the plain Opportunity. But when I went to Opportunity in the Object Manager, there were only 29 items listed under Fields & Relationships. Why are there 11 more fields for Opportunity in the report type than in Fields & Relationships?
I was expecting the same number of fields in the Object Manager and in the report type, but the report type had more fields even though no other object was linked to it.
There are a number of fields that exist in the table but aren't visible in Setup because you can't modify them or change users' rights to see/edit them. It's a bit like https://help.salesforce.com/s/articleView?id=sf.dev_objectfields.htm&type=5, but that page doesn't mention CreatedDate, LastModifiedDate, SystemModstamp... and on Opportunity there are more.
https://developer.salesforce.com/docs/atlas.en-us.238.0.object_reference.meta/object_reference/sforce_api_objects_opportunity.htm may be a good start, but if you truly want everything, run a "describe" operation in Apex or maybe via the REST API (https://workbench.developerforce.com/ -> Utilities -> REST Explorer, for example).
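A quick describe sketch in anonymous Apex that lists every field the platform reports on Opportunity, including the system fields that Setup hides:

```apex
// Schema describe returns all fields, not just the ones shown in
// Fields & Relationships, so the count here is typically higher.
Map<String, Schema.SObjectField> fields = Schema.SObjectType.Opportunity.fields.getMap();
System.debug('Total fields: ' + fields.size());
for (String fieldName : fields.keySet()) {
    System.debug(fieldName);
}
```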

How can I insert a custom object into Salesforce using Apex with required fields that are non-writeable?

A client has created a custom object called CustObj__c in Salesforce. It has required fields; one is CustId__c, which is of type Formula (Text).
I am trying to create a Custom Object item using the following
List<CustObj__c> CustList = new List<CustObj__c>();
CustObj__c Item_0 = new CustObj__c(Name__c = 'TEST1', CustId__c = 'Cust: ' + 123);
CustList.add(Item_0);
CustObj__c Item_1 = new CustObj__c(Name__c = 'TEST2', CustId__c = 'Cust: ' + 456);
CustList.add(Item_1);
insert CustList;
But it gives the error that
Field is not writeable: CustObj__c.CustId__c
How can I insert the records if the field is non-writeable but required?
If it is a custom metadata object, do I need to do this differently?
The client hasn't provided any details.
Formula fields are calculated at runtime when you view the record and are read-only; they can't be marked as required. If your client has a business need for the value to be required, they probably created a validation rule that checks the field value.
You'll have to inspect the field's formula and insert data that satisfies it (some other fields populated, maybe certain dates or the right amounts). Maybe they assign the customer id(?) in some special way, only when an account reaches a certain status or moves from prospect to paying customer.
You could ask them whether they ever had to write unit tests that insert these records and get inspired by those code samples, to see what prerequisites there are or which fields impact that formula.

Cloud Firestore change field name

My database looks like this:
I want the announcement1 field to be renamed to announcement0 whenever the announcement0 field is deleted. Is there a way to do this?
There is no way to rename fields in Firestore, let alone to have that happen automatically.
It sounds like you have multiple announcements in your document, however. In that case, you could consider storing all announcements in a single array field, announcements. When you remove the first item (at index 0) from an array field, all items after it shift down to take its place, which seems to be precisely what you want.
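A minimal, framework-free sketch of that array behaviour (plain JavaScript, no Firestore SDK involved):

```javascript
// Deleting the first announcement shifts every later one down,
// so "announcement1" becomes the new first element without any renaming.
const announcements = ['first announcement', 'second announcement', 'third announcement'];
announcements.splice(0, 1); // remove index 0
console.log(announcements[0]); // 'second announcement'
console.log(announcements.length); // 2
```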
You cannot rename fields in a document; you'll have to delete and recreate them.
Now I'm assuming the number just defines the order of the announcements. If that's the case, you can use this workaround: instead of looking for 'announcement0' on the client side, store a number field in each announcement document, such as 0 for announcement0, and so on. Then, to get announcement1 after announcement0 is deleted, you can use this query:
const firstAnnouncement = await dbRef.orderBy('number').limit(1).get()
This will get the announcement with the lowest number (highest rank). You can change the limit as per your needs.
But if renaming fields is really required, you'll have to delete and recreate all trailing announcements.

Using Salesforce Apex to find a record and pull data from a field

In Salesforce we have two objects. The first is a component pricing object (Comp_Pricing__c), whose records hold component part numbers and pricing in their respective fields, Part_Number__c and Pkg_Price__c. We also have an object in which we put together proposals with quotes; it uses an Apex class and trigger to run our calculations and determine the quantities of parts needed. We would like the Apex class, based on a variable (apPart), to search through the records, find the corresponding part number, and pull back the price for use in further calculations. I believe I need to run a query on the records but have no idea how to do this. Can I get some help?
List<Component_Pricing__c> cpFTList = [SELECT Pkg_Price__c
                                       FROM Component_Pricing__c
                                       WHERE Part_Number__c = :PRT_Pr_Ft];
Pr_Ft = cpFTList[0].Pkg_Price__c;
This is the final query that works.
Thanks Chiz for the help.

How to ignore errors in datastore.Query.GetAll()?

I just started developing a GAE app with the Go runtime, so far it's been a pleasure. However, I have encountered the following setback:
I am taking advantage of the flexibility that the datastore provides by having several different structs with different properties being saved with the same entity name ("Item"). The Go language datastore reference states that "the actual types passed do not have to match between Get and Put calls or even across different App Engine requests", since entities are actually just a series of properties, and can therefore be stored in an appropriate container type that can support them.
I need to query all of the entities stored under the entity name "Item" and encode them as JSON all at once. Using that entity property flexibility to my advantage, it is possible to store queried entities into an arbitrary datastore.PropertyList, however, the Get and GetAll functions return ErrFieldMismatch as an error when a property of the queried entities cannot be properly represented (that is to say, incompatible types, or simply a missing value). All of these structs I'm saving are user generated and most values are optional, therefore saving empty values into the datastore. There are no problems while saving these structs with empty values (datastore flexibility again), but there are when retrieving them.
It is also stated in the datastore Go documentation, that it is up to the caller of the Get methods to decide if the errors returned due to empty values are ignorable, recoverable, or fatal. I would like to know how to properly do this, since just ignoring the errors won't suffice, as the destination structs (datastore.PropertyList) of my queries are not filled at all when a query results in this error.
Thank you in advance, and sorry for the lengthy question.
Update: Here is some code
query := datastore.NewQuery("Item") // here I use some Filter calls, as well as a Limit call and an Order call
items := make([]datastore.PropertyList, 0)
_, err := query.GetAll(context, &items) // context has been obviously defined before
if err != nil {
    // something to handle the error, which in my case is printing it and setting the server status as 500
}
Update 2: Here is some output
If I use make([]datastore.PropertyList, 0), I get this:
datastore: invalid entity type
And if I use make(datastore.PropertyList, 0), I get this:
datastore: cannot load field "Foo" into a "datastore.Property": no such struct field
And in both cases (the first one I assume can be discarded) in items I get this:
[]
According to the following post, the Go datastore package doesn't support PropertyList yet.
Use a pointer to a slice of datastore.Map instead.
