
Azure Table Storage Batch Insert Within Operations Limit

When working with Azure Table Storage and needing to bulk insert a number of records, it's more efficient and reliable to insert them as a batch in a single operation rather than one at a time.

On a recent project we've been using table storage to record the log for an import operation migrating content from one content management system to another. As each field was imported, a couple of log lines were recorded, which were useful for auditing and debugging.

Rather than inserting each log line as it was generated, we batched them up and then loaded them into table storage in the finally block of the try/catch that sits around the import logic. In this way, we store all the log information in one go.

For a few imports, though, we found this error recorded in the application logs:

Error processing message, message: An error has occurred. Exception message: The maximum number of operations allowed in one batch has been exceeded. Type: System.InvalidOperationException

What I hadn't realised was that the Azure ExecuteBatch and ExecuteBatchAsync methods have a limit of 100 operations in a single batch. Go over this, and you get the error above.

My solution was to create a new extension method that "batches up the batch", ensuring the updates are applied in several batches, each at or below the limit.

The extension method looks something like the following sketch, written against the classic WindowsAzure.Storage SDK (the ExecuteLimitedBatch name and the TableBatchExtensions class are illustrative):
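
using System.Collections.Generic;
using System.Linq;
using Microsoft.WindowsAzure.Storage.Table;

public static class TableBatchExtensions
{
    // Azure Table Storage allows at most 100 operations in a single batch.
    private const int MaxOperationsPerBatch = 100;

    // Executes a batch of any size by splitting it into chunks of at most
    // 100 operations and executing each chunk as its own batch.
    // ExecuteLimitedBatch is an illustrative name, not part of the SDK.
    public static IList<TableResult> ExecuteLimitedBatch(this CloudTable table, TableBatchOperation batch)
    {
        var results = new List<TableResult>();

        foreach (var chunk in batch.ChunkBy(MaxOperationsPerBatch))
        {
            var chunkBatch = new TableBatchOperation();
            foreach (var operation in chunk)
            {
                chunkBatch.Add(operation);
            }

            results.AddRange(table.ExecuteBatch(chunkBatch));
        }

        return results;
    }
}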

And a helper extension method to group the original batch into chunks of the appropriate size could look as follows (ChunkBy is an illustrative name, built on standard LINQ operators):
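
using System.Collections.Generic;
using System.Linq;

public static class EnumerableExtensions
{
    // Splits a sequence into consecutive chunks of at most chunkSize items,
    // preserving the original order.
    public static IEnumerable<List<T>> ChunkBy<T>(this IEnumerable<T> source, int chunkSize)
    {
        return source
            .Select((item, index) => new { item, index })
            .GroupBy(x => x.index / chunkSize)
            .Select(group => group.Select(x => x.item).ToList());
    }
}

With these in place, the import's finally block can call table.ExecuteLimitedBatch(batch) however many log lines have accumulated, and each underlying request stays within the 100-operation limit. (Note that table storage also requires every operation in a batch to target the same partition key; this sketch assumes the log lines share one, as chunking by count alone won't fix a mixed-partition batch.)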
