Batch Apex that does DML and multiple callouts

I need to write a Batch Apex that does DML and multiple callouts at different points in the process, then chains to another batch that also does DML and multiple callouts. Here's how I plan to do it, but is there a step where I might run into "uncommitted work pending" or "Callout from scheduled Apex not supported" problems?

  1. The batch will be Stateful and AllowsCallouts.
  2. In the start method, check if an OAuth2 token refresh is needed. If so, make a callout to get the new token, then save the new token to custom settings.
  3. In the execute method, make a callout to create an object in the external system. I may need to create multiple objects of this type, so I will set the batch scope to 1. Place the object ID in a stateful collection. Don't do any DML.
  4. In the finish method, write the external system object IDs to a Salesforce custom object, update a timestamp in custom settings, and chain to another batch class. This second batch class will do the same steps for a different external object.
  5. In particular I'm wondering if a chained batch is considered "scheduled Apex" because that would break the second batch's functionality.

Do you think this should work?

Answer

You can combine the callout and the DML in the same method if you want to; the only restriction is that no callouts are allowed after DML in the same transaction. Each call to start, execute, and finish is a separate transaction. There's really no need to perform your DML in the finish method, as you can do so in the execute method. Notably, if you accumulate the results of more than 10,000 callouts before you finally write to the database, you'll exceed the 10,000 DML rows limit.
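As a minimal sketch of that ordering rule inside execute (the named credential, helper field, and SObject choice are assumptions for illustration, not from the question):

```apex
// Sketch only: all callouts first, DML last, within one execute() transaction.
// A callout attempted after the update below would throw a CalloutException.
public void execute(Database.BatchableContext bc, List<Account> scope) {
    List<Account> toUpdate = new List<Account>();
    for (Account a : scope) {
        HttpRequest req = new HttpRequest();
        req.setEndpoint('callout:External_System/objects'); // assumed named credential
        req.setMethod('POST');
        req.setBody(JSON.serialize(new Map<String, Object>{ 'name' => a.Name }));
        HttpResponse res = new Http().send(req);            // callout
        a.External_Id__c = res.getBody();                   // hypothetical field
        toUpdate.add(a);
    }
    update toUpdate; // DML after all callouts is fine
}
```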

Your current design should certainly work, assuming the OAuth token doesn't expire before the end of the batch. Personally, I'd recommend moving the OAuth check into the execute method so that if you lose your token halfway through (say, because it's revoked), your batch can recover. You may also want to increase your scope size from 1 to a larger number, depending on how much callout time you think you'll need per chunk.

To calculate how many callouts you can make, figure out how much time each callout needs, and divide it into the cumulative callout timeout of 120 seconds per transaction. For example, if your callouts average 5 seconds each, your practical limit would be about 24 callouts in a transaction. Of course, you can't go over the governor limit of 100 callouts either, so that's your hard maximum.
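A defensive sketch of that budgeting, using the real `Limits` methods but an assumed headroom threshold of 110 seconds:

```apex
// Sketch: stop before hitting either the 100-callout cap or a self-imposed
// time budget kept under the 120 s cumulative callout timeout.
Long started = System.currentTimeMillis();
for (Account a : scope) {
    Boolean calloutCapReached = Limits.getCallouts() >= Limits.getLimitCallouts();
    Boolean timeBudgetSpent = System.currentTimeMillis() - started > 110000; // headroom
    if (calloutCapReached || timeBudgetSpent) {
        break; // defer the remaining records to another chunk or a retry job
    }
    // ... make the callout for this record ...
}
```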

Finally, Batch Apex isn't scheduled Apex, even though both are forms of asynchronous Apex. You won't need to worry about that error, even if you're chaining, and even if you use System.scheduleBatch to insert a delay in between.

The batch will be Stateful and AllowsCallouts.

  • No problem!
    • There is no inherent problem with implementing either interface; just note that Stateful batches run slower and can hit limits during serialization/deserialization of the state between chunks.

In the start method, check if an OAuth2 token refresh is needed. If so, make a callout to get the new token, then save the new token to custom settings.

  • You might defer this step to your execute method instead.
    • You don't want to delay your start query behind a callout.
    • You can refresh the token mid-run if need be.
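A sketch of that deferred check (TokenService, Auth_Settings__c, and the field names are assumed placeholders, not real APIs):

```apex
// Sketch: check the token at the top of execute() so a mid-batch expiry
// or revocation can be recovered on the next chunk.
public void execute(Database.BatchableContext bc, List<SObject> scope) {
    Auth_Settings__c cfg = Auth_Settings__c.getOrgDefaults();
    if (cfg.Expires_At__c == null || cfg.Expires_At__c <= Datetime.now()) {
        cfg.Access_Token__c = TokenService.refresh(cfg.Refresh_Token__c); // callout
        // Caution: saving the setting here is DML, which would block the data
        // callouts below. Cache the token in a member variable and persist
        // it after the chunk's callouts (or in finish) instead.
    }
    // ... use cfg.Access_Token__c for the record callouts ...
}
```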

In the execute method, make a callout to create an object in the external system. I may need to create multiple objects of this type, so I will set the batch scope to 1. Place the object ID in a stateful collection. Don't do any DML.

  • If you have a very high volume, serialization could break this step because your batch is Stateful. The number of records you can process should be quite high, but maybe not in the millions.
    • It sounds like you don't have much data to store per record so I'd imagine more than 10K. Hence, I'm fairly certain you'll hit the row limit before this becomes an issue.
  • You are limited to 100 callouts per execute, so if you're really going crazy, you could go over that.
  • The cumulative callout timeout per transaction is 120 seconds, so if you used all 100 callouts, that would allow about 1.2 seconds each. Adjust your maximum number of callouts accordingly.
  • The limits are reset per transaction, so you just need to make sure all your callouts are done before you perform your DML. You can avoid the serialization concern altogether if you simply write each chunk's results at the end of its own execute.
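For instance, a sketch of writing each chunk's results immediately (ExternalApi and External_Sync__c are assumed names), which sidesteps the stateful collection entirely:

```apex
// Sketch: because limits reset per execute() transaction, each chunk can
// persist its own results instead of growing a Database.Stateful member.
public void execute(Database.BatchableContext bc, List<Account> scope) {
    List<External_Sync__c> results = new List<External_Sync__c>();
    for (Account a : scope) {
        String externalId = ExternalApi.create(a);  // callout via assumed helper
        results.add(new External_Sync__c(
            Account__c = a.Id, External_Id__c = externalId));
    }
    insert results; // all callouts for this chunk are done, so DML is safe
}
```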

In the finish method, write the external system object IDs to a Salesforce custom object, update a timestamp in custom settings, and chain to another batch class. This second batch class will do the same steps for a different external object.

  • You will be hard capped at 10,000 records (DML Rows), minus however many settings you want to update and jobs you want to execute.
  • You can execute one chained job from a batch job, so that part is fine.
  • Updating a Custom Setting should be no issue. Just note the operation counts toward DML rows.
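Putting those caveats together, a sketch of the finish method as you described it (`collectedIds` is an assumed `Database.Stateful` member of type `List<String>`; the object and setting names are placeholders):

```apex
// Sketch: write the accumulated IDs, stamp the custom setting, then chain.
public void finish(Database.BatchableContext bc) {
    List<External_Sync__c> rows = new List<External_Sync__c>();
    for (String extId : this.collectedIds) {            // stateful member
        rows.add(new External_Sync__c(External_Id__c = extId));
    }
    insert rows;                                        // counts toward 10,000 DML rows

    Sync_Settings__c cfg = Sync_Settings__c.getOrgDefaults();
    cfg.Last_Run__c = Datetime.now();
    upsert cfg;

    Database.executeBatch(new SecondSyncBatch(), 200);  // chained, not "scheduled Apex"
    // Or, to insert a delay of 5 minutes:
    // System.scheduleBatch(new SecondSyncBatch(), 'Second Sync', 5);
}
```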

I'm wondering if a chained batch is considered "scheduled Apex" because that would break the second batch's functionality.

  • No, if you use Database.executeBatch, the resulting job is not considered scheduled Apex. It's only when you schedule a job via the UI or System.schedule that it runs in a scheduled context.
Category: apex · Time: 2016-07-29 · Views: 5
