The Wayback Machine - https://web.archive.org/web/20160702091430/http://salesforce.stackexchange.com:80/questions/129706/which-one-is-better-loop-with-soql-or-create-a-list-then-loop

Some of the trigger code I read looks like this:

for (Account a : [SELECT Id FROM Account WHERE Id IN :mapAccount.keySet()]) {
    // code
}

and some looks like this:

List<Account> listAccount = [SELECT Id FROM Account WHERE Id IN :mapAccount.keySet()];
for (Account a : listAccount) {
    // code
}

Are these the same from a performance standpoint? Which one is better?

It's going to be a detailed answer; better you go through this document. – highfive

I know they are the same grammatically; I'm just curious about the performance side. – unidha

In my opinion, the first one performs (a very tiny bit) better. The difference is that in the second one you also have to create a new List variable, which takes some memory. – Novarg

Remember that performance is rarely the key deciding factor; premature optimization is the root of all evil. – Keith C

Both forms are tools, with preferred use cases. You wouldn't prefer to use a hammer when a screwdriver is better, nor would you prefer a screwdriver when a hammer is better. Generally speaking, prefer the SOQL for loop when memory is at a premium, and the list assignment form when CPU time is at a premium.

Let me give you a concrete example. Let's say you want to encrypt a bunch of attachments. Your first attempt might look like this:

Attachment[] records = [SELECT Body FROM Attachment];
for (Attachment record : records) {
    record.Body = encrypt(record.Body);
}
update records;

This works if you have less than 6 MB of files to process at once. However, if a user uploads three 5 MB files, that's 15 MB on the heap at once, and you'll crash, even in a Batchable class, if you use a scope size larger than 2.

However, in this rare case, a DML-inside-loop approach exists that would theoretically let you handle up to Limits.getLimitDmlStatements() records:

for (Attachment record : [SELECT Body FROM Attachment LIMIT :Limits.getLimitDmlStatements()]) {
    record.Body = encrypt(record.Body); // Implementation not important
    update record;
}

This works because each record is loaded into memory only as needed by the iterator (lazy loading).
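The same lazy loading can also be combined with explicit chunking: a SOQL for loop may iterate over a list variable instead of a single record, in which case Apex hands you records in batches of up to 200. As a sketch only (encrypt is the same hypothetical helper as above):

```
// SOQL for loop iterating in chunks: each iteration receives a list of
// up to 200 records, so only one chunk is held on the heap at a time,
// and you pay one DML statement per chunk instead of one per record.
for (Attachment[] chunk : [SELECT Body FROM Attachment]) {
    for (Attachment record : chunk) {
        record.Body = encrypt(record.Body); // hypothetical helper
    }
    update chunk; // one DML statement per chunk of up to 200 records
}
```

Note that for very large files, 200 bodies per chunk may still be too much heap at once; the per-record form above remains the safer choice in that case.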

A second case I've found also exists: you can search through the results of a query and abort early, consuming less of the query-row governor limit than you normally would. I don't believe I've ever found a use case for this in business logic, but it may be worth knowing. For example:

for (Account record : [SELECT Name FROM Account ORDER BY Name ASC LIMIT 10000]) {
    if (record.Name == 'Breaker') {
        break;
    }
}

The debug logs read a bit confusingly: first you'll see that 10,000 rows were returned, but a smaller value was actually applied towards the cumulative limits (currently, a multiple of 200). Here's an example from one of my dev orgs:

Execute Anonymous: for(account record:[select name from account]) break;

SOQL_EXECUTE_END|[1]|Rows:614
LIMIT_USAGE|[1]|SOQL_ROWS|200|50000

As you can see, I returned 614 rows, but I aborted early and was only charged for 200 rows. This isn't something you can do with the list assignment query: you either get all the rows or you blow a governor limit.

Conversely, let's say you need to update 10,000 accounts quickly for some reason. Using the SOQL for loop runs a larger risk of blowing the CPU limit because of the overhead of adding records to a list:

// No memory savings, but more CPU time is used
Account[] updates = new Account[0];
for (Account record : [SELECT Name FROM Account LIMIT 10000]) {
    record.Name = record.Name.toUpperCase();
    updates.add(record);
}
update updates;

// No memory savings, but the code runs faster and is simpler
Account[] updates = [SELECT Name FROM Account LIMIT 10000];
for (Account record : updates) {
    record.Name = record.Name.toUpperCase();
}
update updates;

Keep in mind that this difference really only becomes visible in pages that update a ton of data, and in triggers when an organization is running a lot of API calls, imports, or batches.

So, in conclusion, I'll say this: the difference usually doesn't matter, except that one form or the other tends to be clearer to read, depending on what you're doing. Just use the following guideline: if you need to use the records from the query outside of the loop in which they're processed, use the list assignment query; otherwise, use the SOQL for loop. If you do proper unit testing, you'll find the occasions when there are exceptions to this guideline.

+1 from me. I think you should start with the headline "use the SOQL for loop unless you need to use the records from the query outside of the loop they're being processed in". – Keith C

The reason I asked this question is that I was hitting the Apex CPU limit in a test class while all the loop code seemed fine. When I assigned the query to a list (List<Account> listAccount = [SELECT Id FROM Account WHERE Id IN :mapAccount.keySet()];), the error disappeared, which led to this curiosity. I know I should check the process builders after this, but thank you for the explanation. – unidha
