Maximum number of users returned per page when calling the transitive members API on a group - azure-active-directory

I have the following query to get the transitive members in a group:
await _graphServiceClient
.Groups[groupId]
.TransitiveMembers
.Request()
.Top(999)
.GetAsync();
For larger groups, the response takes longer because a number of pages are returned. Currently I have set the top value to 999. What is the maximum number of users that can be returned per page?

The maximum allowed value varies depending on the kind of collection you query. Most collections have a maximum of 999 when using Top(999), but a few have lower limits (if you pick a value that is too high, you get back an error message containing the maximum allowed value).
The retrieved page object has a NextPageRequest property, which is not null if further data is available; you can fetch the next page by calling it:
var moreMembers = await members.NextPageRequest.GetAsync();
You can do this in a while loop until NextPageRequest is null to collect all members, as sketched below.
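A minimal sketch of that loop, reusing _graphServiceClient and groupId from the question and assuming the same Graph SDK for .NET request model (CurrentPage holds the items of the page that was just fetched):

// Fetch the first page (up to 999 members), then follow NextPageRequest until it is null.
var page = await _graphServiceClient
    .Groups[groupId]
    .TransitiveMembers
    .Request()
    .Top(999)
    .GetAsync();

var allMembers = new List<DirectoryObject>(page.CurrentPage);

while (page.NextPageRequest != null)
{
    page = await page.NextPageRequest.GetAsync();
    allMembers.AddRange(page.CurrentPage);
}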
Similar issue - https://github.com/microsoftgraph/microsoft-graph-docs/issues/13233
Hope this helps.
Thanks

Related

How to limit the number of returned records using CloudKit framework?

Is there any way to limit the number of records returned from a query using CloudKit framework? E.g. only return the latest value by sorting it by date (already figured that out) but then limit the returned records to 1?
Yes, use a CKQueryOperation and its resultsLimit property.
resultsLimit
The maximum number of records to return at one time.
Discussion
For most queries, leave the value of this property as the default value, which is the maximumResults constant. When using that value, CloudKit returns as many records as possible while minimizing delays in receiving those records. If you want to process a fixed number of results, change the value of this property accordingly.
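A minimal Swift sketch of that setup; the record type "Entry" and the "createdAt" field are hypothetical, and the deprecated-but-still-available recordFetchedBlock/queryCompletionBlock callbacks are used for brevity:

import CloudKit

// Hypothetical record type "Entry" with a "createdAt" date field.
let query = CKQuery(recordType: "Entry", predicate: NSPredicate(value: true))
query.sortDescriptors = [NSSortDescriptor(key: "createdAt", ascending: false)]

let operation = CKQueryOperation(query: query)
operation.resultsLimit = 1  // only the latest record

operation.recordFetchedBlock = { record in
    print("Latest record: \(record)")
}
operation.queryCompletionBlock = { _, error in
    if let error = error {
        print("Query failed: \(error)")
    }
}

CKContainer.default().privateCloudDatabase.add(operation)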

Is there a way to retrieve Salesforce picklist values for record types using apex?

I need to gather information about which picklist values are available for every record type. I know this can be achieved using either the describeLayout or readMetadata API. But when I try to gather this info for a large custom object, I run into trouble. The Salesforce API returns each record type with all the picklist values available for it.
<recordTypeMappings>
<name>Record1</name>
<picklistsForRecordType>
<picklistName>Picklist1</picklistName>
<picklistValues>
...
</picklistValues>
</picklistsForRecordType>
<picklistsForRecordType>
<picklistName>Picklist2</picklistName>
<picklistValues>
...
</picklistValues>
</picklistsForRecordType>
</recordTypeMappings>
<recordTypeMappings>
<name>Record2</name>
<picklistsForRecordType>
<picklistName>Picklist1</picklistName>
<picklistValues>
...
</picklistValues>
</picklistsForRecordType>
<picklistsForRecordType>
<picklistName>Picklist2</picklistName>
<picklistValues>
...
</picklistValues>
</picklistsForRecordType>
</recordTypeMappings>
This means that if I have a large object (with 200 picklists and 100 record types), I get 200*100=20,000 picklist records, which makes the API response extremely large, up to 80 MB. It is also extremely inefficient: even if a picklist's values are the same for all record types, they are still repeated for each record type in the response.
The idea is to get the unique picklist value sets and then just include the record type ids with them, so the same picklist is not duplicated for every record type.
<recordTypeMappings>
<name>Record1, Record2</name>
<picklistsForRecordType>
<picklistName>Picklist1</picklistName>
<picklistValues>
...Values which are the same for Record1 and Record2...
</picklistValues>
</picklistsForRecordType>
<picklistsForRecordType>
<picklistName>Picklist2</picklistName>
<picklistValues>
...Values which are the same for Record1 and Record2...
</picklistValues>
</picklistsForRecordType>
</recordTypeMappings>
This will reduce the response size. Is there a way to do that in Apex? I searched the API and was not able to find anything suitable. Apex seems the better option, since all the processing would happen on the Salesforce side.
Thanks for the help.
To filter out duplicates and only get unique values, try capturing the picklist values in a Set. For example, here is a function that takes a List of picklist field describes and returns a Set of unique picklist values.
// Given a list of picklist field describes, return a set of unique picklist values
Set<Schema.PicklistEntry> getUniquePickListValues(List<Schema.DescribeFieldResult> pickListFields) {
    Set<Schema.PicklistEntry> uniquePicklistValues = new Set<Schema.PicklistEntry>();
    for (Schema.DescribeFieldResult pickList : pickListFields) {
        // DescribeFieldResult exposes the entries directly via getPicklistValues()
        List<Schema.PicklistEntry> pickListValues = pickList.getPicklistValues();
        for (Schema.PicklistEntry entry : pickListValues) {
            uniquePicklistValues.add(entry);
        }
    }
    return uniquePicklistValues;
}
I know that using nested loops is inefficient, but I don't know if there is a better way to merge a list of objects into a Set (Set.addAll would at least avoid the inner loop).
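For illustration, here is a hedged usage sketch of the function above; it assumes the function lives in the same class or anonymous Apex block, and the Account picklists are just stand-ins for your custom object's fields:

// Hypothetical example: collect unique picklist values from two standard Account picklists.
List<Schema.DescribeFieldResult> pickListFields = new List<Schema.DescribeFieldResult>{
    Account.Industry.getDescribe(),
    Account.Rating.getDescribe()
};
Set<Schema.PicklistEntry> uniqueValues = getUniquePickListValues(pickListFields);
System.debug('Unique picklist entries: ' + uniqueValues.size());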
Hope this helps.
If you want to retrieve picklist values based on record type, please check my solution here,
https://salesforce.stackexchange.com/questions/103837/how-do-i-get-the-intersection-of-recordtype-and-picklist-values-inside-apex/202519#202519
It uses a REST API call, but the response is similar to a getDescribe result plus record type info.
Here is how the performance and volume issues were solved.
The challenge was to collect the available picklist values for all record types of huge custom objects with many record types and picklists.
First of all, I did not find any way to do that directly in Apex, so I used API calls.
When we receive a custom object description via the describeSObject call (https://developer.salesforce.com/docs/atlas.en-us.api.meta/api/sforce_api_calls_describesobject.htm), we get all the picklist values and record types. What we do not get is the specific picklist values available for each record type. For that we need to execute a describeLayout request (https://developer.salesforce.com/docs/atlas.en-us.api.meta/api/sforce_api_calls_describelayout.htm). Using information from describeSObject we can estimate how large the describeLayout response will be.
For example, if we have 500 picklist values and 20 record types, the total describeLayout response will contain up to 500*20=10,000 picklist values (since describeLayout returns all picklist values available for each record type). Then we need to approximate how large that XML response would be, since the Salesforce API has a response limit of 5 MB. After inspecting the response I found that to stay within the 5 MB limit we need fewer than 30,000 picklist values per describeLayout request.
The solution was to break this large call into several smaller ones by record type, retrieving all picklist values for a few record types at a time and then repeating for the others; the sketch below illustrates the batching.
It took up to 24 API requests to retrieve 70 MB of data from the Salesforce API, which was not possible in one API call because of the response size limit.
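A rough sketch of that batching arithmetic in Apex, purely for illustration; the figures are hypothetical, and the actual describeLayout calls were made from an external API client, not from Apex:

// Hypothetical figures taken from the describeSObject result.
Integer picklistCount = 500;             // picklist fields on the object
List<Id> recordTypeIds = new List<Id>(); // filled from describeSObject recordTypeInfos

// Keep each describeLayout response under ~30,000 picklist values (~5 MB of XML).
Integer maxValuesPerRequest = 30000;
Integer recordTypesPerRequest = Math.max(1, maxValuesPerRequest / picklistCount);

// Partition the record type ids into batches of that size.
List<List<Id>> batches = new List<List<Id>>();
List<Id> batch = new List<Id>();
for (Id rtId : recordTypeIds) {
    batch.add(rtId);
    if (batch.size() == recordTypesPerRequest) {
        batches.add(batch);
        batch = new List<Id>();
    }
}
if (!batch.isEmpty()) {
    batches.add(batch);
}
// Each batch is then passed as the recordTypeIds argument of a separate
// describeLayout call made by the external client.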

Parse.com: query on an array field not working with a big number of values

I use Parse.com Core and Cloud Code to store data and perform some operations for a mobile app. I have an issue with a query on an array field that is sometimes not returning anything even if I am sure it should.
I store a large number of phone numbers in an array field to keep track of a user's matching contacts.
This field is called phoneContacts and looks like this (the real values are digits only; this is just an example):
["+33W30VXXX0V","+33W30VXX843","+33W30VZVZVZ","+33W34W3X0Y4","+33W34W386Y0", ...]
I have a function in Cloud Code that is supposed to get matching rows for a given phone number. Here is my query:
var phoneNumber = request.params.phoneNumber;
var queryPhone = new Parse.Query('UserData');
queryPhone.equalTo('phoneContacts', phoneNumber); // phoneNumber is passed as a string param, i.e. "+33W30VXX843"
queryPhone.include('user');
var usersToNotify = [];
return queryPhone.each(function(userData) {
var user = userData.get('user');
usersToNotify.push(user.get('username'));
})
.then(function() {
return usersToNotify;
});
I tested my query with an array of 2 or 3 phone numbers and it works well and returns the expected rows. But then I tried with a user having around 300 phone numbers in that phoneContacts field, and even when I query a value that is present (it appears with a filter in the Parse Data Browser), nothing is returned. To be sure, I even took a phone number that exists in 2 rows: one with few values and one with many, and only the row with few values was returned.
I've read the Parse documentation carefully, especially the sections about queries and field limits, but there doesn't seem to be a restriction on the number of values in an array field, and nothing says that queries might not work with a lot of values.
Can anybody point me in the right direction? Should I design my Parse classes differently to avoid having so many values in an array field? Or is there something wrong with the query?
You need to be using a PFRelation or some sort of intermediate table. You should not use an array to store 300 phone numbers; your queries will get really slow (see the sketch after the links below).
PFRelations:
https://parse.com/docs/osx/api/Classes/PFRelation.html
http://blog.parse.com/learn/engineering/new-many-to-many/
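A minimal Cloud Code sketch of the intermediate-table approach, assuming a hypothetical PhoneContact class with one row per contact (a phoneNumber string plus a user pointer) instead of one large array:

// Hypothetical "PhoneContact" class: { phoneNumber: string, user: pointer to _User }
var phoneNumber = request.params.phoneNumber;

var query = new Parse.Query('PhoneContact');
query.equalTo('phoneNumber', phoneNumber);
query.include('user');

var usersToNotify = [];
return query.each(function(contact) {
    var user = contact.get('user');
    usersToNotify.push(user.get('username'));
})
.then(function() {
    return usersToNotify;
});

Each user then has as many PhoneContact rows as they have contacts, which keeps individual objects small and should make the equality query cheaper than matching against large arrays.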

SalesForce limit on SOQL?

Using the PHP library for Salesforce I am running:
SELECT ... FROM Account LIMIT 100
But the LIMIT is always capped at 25 records. I am selecting many fields (60 fields). Is this a hard limit?
The skeleton code:
$client = new SforceEnterpriseClient();
$client->createConnection("EnterpriseSandboxWSDL.xml");
$client->login(USERNAME, PASSWORD.SECURITY_TOKEN);
$query = "SELECT ... FROM Account LIMIT 100";
$response = $client->query($query);
foreach ($response->records as $record) {
// ... there's only 25 records
}
Here is my checklist:
1) Make sure you have more than 25 records.
2) After your first loop, call queryMore to check whether there are more records.
3) Make sure batchSize is not set to 25.
I don't use the PHP library for Salesforce, but I can assume that before
SELECT ... FROM Account LIMIT 100
is executed, some other SELECT queries have been performed. If you didn't write them, maybe the PHP library does it for you ;-)
The Salesforce SOAP API query method will only return a finite number of rows. There are a couple of reasons why it may be returning fewer than your defined limit.
The QueryOptions header batchSize may have been set to 25. If this is the case, you could try adjusting it; if it hasn't been explicitly set, you could try setting it to a larger value.
When the SOQL statement selects a number of large fields (such as two or more custom fields of type long text), Salesforce may return fewer records than defined in the batchSize. The reduction in batch size also occurs when dealing with base64-encoded fields, such as Attachment.Body. If this is the case, you can just use queryMore with the QueryLocator from the first response.
In both cases, check the done and size properties of the QueryResult to determine whether you need to call queryMore and how many rows in total match the SOQL query; a sketch follows below.
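A hedged PHP sketch of that pattern with the toolkit from the question; QueryOptions, setQueryOptions, queryMore, and the done/queryLocator properties are assumed to be available as in the Force.com PHP Toolkit:

// Ask for a larger batch size (the server typically clamps it to 200-2000; assumed toolkit API).
$client->setQueryOptions(new QueryOptions(2000));

$response = $client->query("SELECT ... FROM Account LIMIT 100");
$records = $response->records;

// Keep calling queryMore until the server reports the result set is complete.
while (!$response->done) {
    $response = $client->queryMore($response->queryLocator);
    $records = array_merge($records, $response->records);
}

echo count($records); // total rows returned, up to the LIMIT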
To avoid governor limits it might be better to add all the records to a list, then do everything you need to the records in that list. Once you are done, just update your database with: update listName;

Retrieve last row from the Google DataStore Java

I want to retrieve the last row from the Datastore. How can I do that?
I know the long method, i.e.
int cnt = 0;
for (Table_name e : resultset) {
    cnt++;
}
resultset.get(cnt - 1).getValue();
I have a String (containing a number) as the primary key. Can I use it to get descending order?
Is there any method through which I can get the last row?
You should probably sort in the opposite order (if possible for your query; the Datastore has some restrictions here) and take the first result of that, as sketched below.
Also, if you store numbers in String fields the order may not be what you want it to be (you might need zero-padding here).
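A minimal sketch with the App Engine low-level Datastore API; the kind "Table_name" and the "created" ordering property are hypothetical stand-ins for your entity and sort field:

import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;
import com.google.appengine.api.datastore.FetchOptions;
import com.google.appengine.api.datastore.Query;
import java.util.List;

public class LastRowExample {
    public static Entity fetchLastRow() {
        DatastoreService datastore = DatastoreServiceFactory.getDatastoreService();

        // Sort descending on the ordering property and take a single result:
        // the "last" row is simply the first row of the reversed order.
        Query query = new Query("Table_name")
                .addSort("created", Query.SortDirection.DESCENDING);

        List<Entity> results = datastore.prepare(query)
                .asList(FetchOptions.Builder.withLimit(1));

        return results.isEmpty() ? null : results.get(0);
    }
}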
