How do I fetch more than 2000 records through SOQL?
Is there something like queryMore?
Call queryMore() with the queryLocator provided in the first set of results, then keep calling it with the next queryLocator until the done flag is true. See the Web Services API docs for more info.
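For illustration, here's a minimal sketch of that loop using the Java Partner API (WSC); the authenticated PartnerConnection named connection and the Account query are assumptions:
import com.sforce.soap.partner.PartnerConnection;
import com.sforce.soap.partner.QueryResult;
import com.sforce.soap.partner.sobject.SObject;
import com.sforce.ws.ConnectionException;

void fetchAll(PartnerConnection connection) throws ConnectionException {
    QueryResult qr = connection.query("SELECT Id, Name FROM Account");
    while (true) {
        for (SObject record : qr.getRecords()) {
            // process each record here
        }
        if (qr.isDone()) {
            break;
        }
        // ask the server for the next batch using the locator
        qr = connection.queryMore(qr.getQueryLocator());
    }
}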
You can actually do this manually as well if you order by Id and then query again with "where id > : idPrevious". If you try this exactly as I've typed it you'll hit a problem, however: you can't use > and < with Id fields. There is a simple workaround for this, though: just create a text-type formula field which takes its value from the Id field. Then you can use that field in the query with no problems.
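A minimal Apex sketch of that manual approach; Id_Text__c is a hypothetical text formula field whose formula is simply Id, and Account is just an example object:
String lastId = '';
List<Account> page;
do {
    page = [SELECT Id, Name, Id_Text__c
            FROM Account
            WHERE Id_Text__c > :lastId
            ORDER BY Id_Text__c
            LIMIT 2000];
    for (Account a : page) {
        // process each record here
    }
    if (!page.isEmpty()) {
        // remember the last key seen so the next query resumes after it
        lastId = page[page.size() - 1].Id_Text__c;
    }
} while (page.size() == 2000);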
Of course, if you're just looking to process loads of data, then what you might really be looking for is Batch Apex.
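For reference, a minimal Batch Apex skeleton; the class name ExampleBatch and the Account query are hypothetical placeholders:
global class ExampleBatch implements Database.Batchable<SObject> {
    global Database.QueryLocator start(Database.BatchableContext bc) {
        // a QueryLocator lets a batch job iterate up to 50 million records
        return Database.getQueryLocator('SELECT Id, Name FROM Account');
    }
    global void execute(Database.BatchableContext bc, List<SObject> scope) {
        // called once per chunk of up to 200 records
    }
    global void finish(Database.BatchableContext bc) {
        // runs once at the end, e.g. to send a notification
    }
}
// start it with: Database.executeBatch(new ExampleBatch());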
I am new to Salesforce and SOQL, so sorry in advance if this question has already been answered; if it has, please link it to me.
The aim of my SOQL query is to get all the contract information needed to generate a PDF.
There are three tables: Contract, Contact, and Account.
In the Contract table there are the fields: Maitre_d_apprentissage__c, MaitreApprentissageNom1__c, MaitreApprentissagePrenom1__c, Apprenti__c, ApprentiNom__c, ApprentiPrenom__c.
There are two relationships:
Apprenti__r, which links Apprenti__c to the Contact table
Maitre_d_apprentissage__r, which links Maitre_d_apprentissage__c to the Contact table
When I looked at the table, I saw that MaitreApprentissageNom1__c was equal to Maitre_d_apprentissage__r.LastName and ApprentiNom__c was equal to Apprenti__r.LastName. So I concluded I could get the other information about Apprenti__c and Maitre_d_apprentissage__c from the Contact table following the same principle, and I added Apprenti__r.Date_de_naissance__c and Maitre_d_apprentissage__r.Date_de_naissance__c to my query to get the Date_de_naissance__c field, which is in my Contact table.
I see in the results that the query succeeds in getting the information, but some values have moved to a different column (lines 6 and 7); you can see the difference between query 1 and query 2. In the first query I only return Apprenti__r.Date_de_naissance__c, and in the second query I return both Apprenti__r.Date_de_naissance__c and Maitre_d_apprentissage__r.Date_de_naissance__c.
Query 1:
SELECT ApprentiNom__c, ApprentiPrenom__c, Apprenti__r.Date_de_naissance__c, MaitreApprentissageNom1__c, MaitreApprentissagePrenom1__c
FROM Contract
Result 1:
Query 2:
SELECT ApprentiNom__c, ApprentiPrenom__c, Apprenti__r.Date_de_naissance__c, MaitreApprentissageNom1__c, MaitreApprentissagePrenom1__c, Maitre_d_apprentissage__r.Date_de_naissance__c
FROM Contract
Result 2:
I would like to understand where the problem comes from and how to correct it. Thank you in advance.
It's possible that it's just your query editor displaying things incorrectly. You can see it got confused by the two lookups to the Contact table: that's why there's even a column header "Contact.Date_de_naissance__c" (and why it's there twice), and why the columns aren't shown in the order you requested them.
Which editor are you using? You could try the built-in Developer Console or http://workbench.developerforce.com/
What do you need it for? In Apex the order of fields won't matter, and in a REST API query the values fetched via lookup will come back as JSON sub-objects, so there will always be a way to figure out exactly which value is coming from which relationship.
In the Developer Console, try running this and check whether it resolves your fears:
System.debug(JSON.serializePretty([
    SELECT ApprentiNom__c, ApprentiPrenom__c,
           Apprenti__r.Date_de_naissance__c,
           MaitreApprentissageNom1__c, MaitreApprentissagePrenom1__c,
           Maitre_d_apprentissage__r.Date_de_naissance__c
    FROM Contract
]));
Then add Maitre_d_apprentissage__r.LastName to the query and see what changed and what stayed the same.
I'm trying to use the DataImportHandler to keep my index in sync with a SQL database (what I would think is a pretty vanilla thing to do). Since my database will be pretty large, I want to use incremental imports using this method http://wiki.apache.org/solr/DataImportHandlerDeltaQueryViaFullImport so the calls are of the form http://localhost:8983/solr/Items/dataimport?command=full-import&clean=false. This works perfectly well for adding items.
I have a separate DeletedItems table in my database which contains the primary keys of the items that have been deleted from the Items table, along with when they were deleted. As part of the DataImport call, I had hoped to be able to delete the relevant items from my index based on a query along the lines of
SELECT Id FROM DeletedItems WHERE WasDeletedOn > '${dataimporter.last_index_time}'
but I can't figure out how to do this. The link above alludes to it with the cryptic
In this case it means obviously that in case you also want to use deletedPkQuery then when running the delta-import command is still necessary.
but setting deletedPkQuery to the above SQL query doesn't seem to work. I then read that deletedPkQuery only works with delta-imports, so am I forced to make two requests to my Solr server as part of the sync process? That doesn't seem right, as the operations are parameterized by the dataimporter.last_index_time property, which changes. Both steps would need to be done in one "atomic" action, surely? Any ideas?
You must use the import handler's special commands:
https://wiki.apache.org/solr/DataImportHandler#Special_Commands
With these commands you can alter the boost of, or delete, a document coming from the recordset of the full-import query. Be aware that you must use the $skipDoc field to prevent the document from being indexed again, and that you must repeat the id in the $deleteDocById field.
You can use a union query:
select
    id,
    text,
    'false' as [$deleteDocById],
    'false' as [$skipDoc]
from [rows to update or add]
union
select
    id,
    '' as text,
    id as [$deleteDocById],
    'true' as [$skipDoc]
from [rows to delete]
or a case-when:
select
    id,
    text,
    CASE
        when deleted = 1 then id
        else 'false'
    END as [$deleteDocById],
    CASE
        when deleted = 1 then 'true'
        else 'false'
    END as [$skipDoc]
from [your table]
where updated > '${dih.last_index_time}'
The deletedPkQuery is run as part of the regular call to delta-import, so you don't have to run anything twice (and when doing a full-import, there's no need to run deletedPkQuery, since the whole index is cleared before importing anyway).
The deletedPkQuery should be configured on the same entity element as your main query. Be sure to match the field names exactly as well, and make sure the id produced by your deletedPkQuery matches the one produced by the main query.
There's a minimal example on solr.pl for importing and deleting documents using the same deleted_entries table structure as you have here:
<entity
name="album"
query="SELECT * from albums"
deletedPkQuery="SELECT deleted_id as id FROM deletes WHERE deleted_at > '${dataimporter.last_index_time}'"
>
Also make sure that the format of the deleted_at field is comparable against the value produced by last_index_time. The default is yyyy-MM-dd HH:mm:ss.
And lastly, remember that the last_index_time property isn't available until the second time the task is run, since there's no "previous index time" the first time an index is being populated (but the deletedPkQuery shouldn't run before that anyway).
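Putting the pieces together, here's a sketch of what the entity could look like with the table and column names from the question (Items, DeletedItems, Id, WasDeletedOn); the LastModified column and the clean-flag condition from the DeltaQueryViaFullImport wiki page are assumptions:
<entity
    name="item"
    pk="Id"
    query="SELECT * FROM Items
           WHERE '${dataimporter.request.clean}' != 'false'
              OR LastModified > '${dataimporter.last_index_time}'"
    deletedPkQuery="SELECT Id FROM DeletedItems
                    WHERE WasDeletedOn > '${dataimporter.last_index_time}'"
>
Here command=delta-import picks up the deletions via deletedPkQuery, while the clean-flag condition in query keeps command=full-import&clean=false working for incremental adds.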
How do I create a query in SOQL like:
SELECT Id FROM User WHERE rownum > 101 AND rownum < 200
I am trying to do pagination where I load the first 100, then if you click "Load more" I can query for the next 100.
For the purposes of pagination, you would generally just make the whole query (which in your example would be just SELECT Id FROM User), setting the batch size to the size of your page, and then use queryMore() to get the next batch. This method does require you to store the QueryLocator from the QueryResult that you get from the initial query -- the sample code in the article I linked to above shows some good examples of it.
If you're doing this with a Visualforce page, you have another option -- SOQL also has an OFFSET clause that you can use to scroll to a specific row. Here's an article that describes it. Be careful though -- OFFSET is subject to some limitations that might hamstring you if users flip too many pages (see the SOSL and SOQL limits section of this page for specifics).
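A minimal Apex sketch of OFFSET-based paging for the example in the question; pageNumber is a hypothetical zero-based page index, and note that OFFSET is capped at 2000 rows:
Integer pageSize = 100;
Integer pageNumber = 1; // zero-based, so this fetches rows 101-200
Integer offsetRows = pageNumber * pageSize;
List<User> page = [SELECT Id
                   FROM User
                   ORDER BY Id
                   LIMIT :pageSize
                   OFFSET :offsetRows];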
I have some Django 1.3 code that looks up many model instances in a loop, i.e.
my_set = myinstance.subitem_set.all()
for value in values:
    existing = my_set.filter(attr_name=value)
    if len(existing) == 1:
        ...
This works, but profiling SQL queries shows that it hits the DB on each iteration. According to https://docs.djangoproject.com/en/1.3/ref/models/querysets/ iterating over the related items should eagerly load them, so I tried calling:
list(my_set)
However, this doesn't help. It does do a query to load all the sub-items, but then it still does an individual query for each sub-item inside the loop. How do I get it to use the cached set and not hit the DB each time? The DB is PostgreSQL 8.4.
The problem is in this line:
if len(existing) == 1:
From Django documentation:
len(). A QuerySet is evaluated when you call len() on it. This, as you might expect, returns the length of the result list.
Note: Don't use len() on QuerySets if all you want to do is determine the number of records in the set. It's much more efficient to handle a count at the database level, using SQL's SELECT COUNT(*), and Django provides a count() method for precisely this reason. See count() below.
So in your case it executes the query each time you call len(existing). A more efficient way is:
existing.count() == 1
This will also hit the database each time you call it, but it will execute SELECT COUNT(*), which is faster.
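For illustration, here is the loop from the question rewritten with count(); the names (values, my_set, attr_name) are taken from the question:
for value in values:
    # issues a cheap SELECT COUNT(*) instead of fetching every
    # matching row just to measure its length
    if my_set.filter(attr_name=value).count() == 1:
        ...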
I'm writing an NHibernate criteria that selects data supporting paging. I'm using the COUNT(*) OVER() expression from SQL Server 2005(+) to get hold of the total number of available rows, as suggested by Ayende Rahien. I need that number to be able to calculate how many pages there are in total. The beauty of this solution is that I don't need to execute a second query to get hold of the row count.
However, I can't seem to manage to write a working criteria (Ayende only provides an HQL query).
Here's an SQL query that shows what I want and it works just fine. Note that I intentionally left out the actual paging logic to focus on the problem:
SELECT Items.*, COUNT(*) OVER() AS rowcount
FROM Items
Here's the HQL:
select item, rowcount()
from Item item
Note that the rowcount() function is registered in a custom NHibernate dialect and resolves to COUNT(*) OVER() in SQL.
A requirement is that the query is expressed using a criteria. Unfortunately, I don't know how to get it right:
var query = Session
.CreateCriteria<Item>("item")
.SetProjection(
Projections.SqlFunction("rowcount", NHibernateUtil.Int32));
Whenever I add a projection, NHibernate doesn't select item (like it would without a projection), just the rowcount(), while I really need both. Also, I can't seem to project item as a whole, only its properties, and I really don't want to list all of them.
I hope someone has a solution to this. Thanks anyway.
I think it is not possible with Criteria; it has some limitations.
You could get the id and load the items in a subsequent query:
var query = Session
.CreateCriteria<Item>("item")
.SetProjection(Projections.ProjectionList()
.Add(Projections.SqlFunction("rowcount", NHibernateUtil.Int32))
.Add(Projections.Id()));
If you don't like that, use HQL; you can set the maximum number of results there too:
IList<Item> result = Session
    .CreateQuery("select item, rowcount() from Item item where ...")
    .SetMaxResults(100)
    .List<Item>();
Use CreateMultiCriteria.
You can execute 2 simple statements with only one hit to the DB that way.
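A minimal sketch of that approach, assuming the Item entity from the question; both criteria go to the database in a single round trip:
IList results = Session.CreateMultiCriteria()
    .Add(Session.CreateCriteria<Item>()
        .SetMaxResults(100))                    // the page of items
    .Add(Session.CreateCriteria<Item>()
        .SetProjection(Projections.RowCount())) // the total row count
    .List();
IList items = (IList) results[0];          // Item instances
int total = (int) ((IList) results[1])[0]; // count from the projection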
I am wondering why using Criteria is a requirement. Can't you use session.CreateSQLQuery? If you really must do it in one query, I would have suggested pulling back the Item objects and the count, like:
select {item.*}, count(*) over()
from Item {item}
...this way you can get back Item objects from your query, along with the count. If you experience a problem with NHibernate's caching, you can also configure the query spaces (entity/table caches) associated with a native query so that stale query cache entries will be cleared automatically.
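A sketch of that native-SQL route, assuming the Item entity maps to an Items table (the table name is an assumption):
// {item.*} expands to all mapped columns of the "item" alias
IList rows = Session
    .CreateSQLQuery("select {item.*}, count(*) over() as total from Items {item}")
    .AddEntity("item", typeof(Item))
    .AddScalar("total", NHibernateUtil.Int32)
    .List();
// each element is an object[] of { Item entity, int total }
object[] first = (object[]) rows[0];
Item item = (Item) first[0];
int total = (int) first[1];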
If I understand your question properly, I have a solution. I struggled quite a bit with this same problem.
Let me quickly describe the problem I had, to make sure we're on the same page. My problem came down to paging. I want to display 10 records in the UI, but I also want to know the total number of records that matched the filter criteria. I wanted to accomplish this using the NH criteria API, but when adding a projection for row count, my query no longer worked, and I wouldn't get any results (I don't remember the specific error, but it sounds like what you're getting).
Here's my solution (copy & paste from my current production code). Note that "SessionError" is the name of the business entity I'm retrieving paged data for, according to three filter criteria: IsDev, IsRead, and IsResolved.
ICriteria crit = CurrentSession.CreateCriteria(typeof (SessionError))
    .Add(Restrictions.Eq("WebApp", this));

if (isDev.HasValue)
    crit.Add(Restrictions.Eq("IsDev", isDev.Value));

if (isRead.HasValue)
    crit.Add(Restrictions.Eq("IsRead", isRead.Value));

if (isResolved.HasValue)
    crit.Add(Restrictions.Eq("IsResolved", isResolved.Value));

// Order by most recent
crit.AddOrder(Order.Desc("DateCreated"));

// Copy the ICriteria query to get a row count as well
ICriteria critCount = CriteriaTransformer.Clone(crit)
    .SetProjection(Projections.RowCountInt64());
critCount.Orders.Clear();

// NOW add the paging vars to the original query;
// one-based page N starts at row (N - 1) * pageSize
crit = crit
    .SetMaxResults(pageSize)
    .SetFirstResult((pageNum_oneBased - 1) * pageSize);

// Set up a multi criteria to get your data in a single trip to the database
IMultiCriteria multCrit = CurrentSession.CreateMultiCriteria()
    .Add(crit)
    .Add(critCount);

// Get the results
IList results = multCrit.List();

List<SessionError> sessionErrors = new List<SessionError>();
foreach (SessionError sessErr in ((IList) results[0]))
    sessionErrors.Add(sessErr);
numResults = (long) ((IList) results[1])[0];
So I create my base criteria, with optional restrictions. Then I CLONE it, and add a row count projection to the CLONED criteria. Note that I clone it before I add the paging restrictions. Then I set up an IMultiCriteria to contain the original and cloned ICriteria objects, and use the IMultiCriteria to execute both of them. Now I have my paged data from the original ICriteria (and I only dragged the data I need across the wire), and also a raw count of how many actual records matched my criteria (useful for display or creating paging links, or whatever). This strategy has worked well for me. I hope this is helpful.
I would suggest investigating a custom result transformer by calling SetResultTransformer() on your criteria.
Create a formula property in the class mapping:
<property name="TotalRecords" formula="count(*) over()" type="Int32" not-null="true"/>
IList<...> result = criteria.SetFirstResult(skip).SetMaxResults(take).List<...>();
totalRecords = (result != null && result.Count > 0) ? result[0].TotalRecords : 0;
return result;