Recompilations, Part 1


This week and next week I want to talk about a very important performance tuning topic in SQL Server: recompilations. As you will see, they can happen even when you execute very simple stored procedures. Today I want to lay out the foundations of recompilations in SQL Server and why they have to happen, and next week I will show you how you can avoid them by rewriting your stored procedures in a different way.

Recompilations – what?

Before I talk about the specifics of recompilations in SQL Server, I want you to take a more detailed look at the following simple stored procedure.
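The original listing is not reproduced here, so the following is only a sketch of what such a procedure could look like, based on the description below (the procedure, table, and column names are assumptions):

```sql
-- Hypothetical sketch of the stored procedure discussed in this post:
-- create a temp table, insert some rows, and read them back.
CREATE PROCEDURE dbo.SimpleProcedure
AS
BEGIN
    -- Creating a temp table changes the schema in TempDb
    CREATE TABLE #TempTable
    (
        ID INT IDENTITY(1, 1) PRIMARY KEY,  -- PK constraint creates a Clustered Index
        SomeValue CHAR(100)
    );

    -- Inserting rows causes a statistics update on the Clustered Index
    INSERT INTO #TempTable (SomeValue)
    SELECT TOP 1000 'Some data'
    FROM sys.all_columns;

    -- Finally the inserted data is retrieved
    SELECT ID, SomeValue
    FROM #TempTable;
END;
GO
```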

As you can see from the code, there is nothing special about it. In the first step a temp table is created. Afterwards some records are inserted into the temp table, and finally the inserted data is retrieved with a simple SELECT statement. Code that you have already written hundreds, maybe thousands of times in SQL Server. Let's first create an Extended Events session that tracks recompilations. The following code shows the definition of this event session.
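The original session definition is likewise not shown here; a minimal sketch that captures recompilations (the session name is an assumption) could be based on the `sql_statement_recompile` event:

```sql
-- Hypothetical sketch: an Extended Events session that fires for
-- every statement-level recompilation, capturing the statement text.
CREATE EVENT SESSION TrackRecompilations ON SERVER
ADD EVENT sqlserver.sql_statement_recompile
(
    ACTION (sqlserver.sql_text)
)
ADD TARGET package0.ring_buffer;
GO

-- Start the event session before executing the stored procedure
ALTER EVENT SESSION TrackRecompilations ON SERVER
STATE = START;
GO
```

With the session running, you can watch the captured events live via "Watch Live Data" in SQL Server Management Studio, or query the ring buffer target afterwards.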

When you now execute the stored procedure, you will see that 2 recompilations have happened during the runtime.

2 recompilations for a simple stored procedure

A recompilation means that SQL Server recompiled the execution plan during the execution itself. This introduces additional CPU overhead that will ultimately decrease the throughput of your workload. But for now the real question is: why did these recompilations happen?

SQL Server mainly performs recompilations under the following two circumstances:

  • Schema Changes
  • Statistics Updates

And because of these two circumstances we triggered the recompilations in the previously executed stored procedure. When you look again at the stored procedure definition, you can see that in the first step we created a temp table. When you create a temp table in TempDb, you change the database schema, and for that reason the first recompilation happened.

And after the creation of the temp table, you inserted some rows into it. Therefore SQL Server had to update the underlying statistics object of the Clustered Index of the temp table – the Clustered Index was created by SQL Server through the PRIMARY KEY constraint. A simple stored procedure caused 2 recompilations in SQL Server – wow!


Temp tables can impose a huge performance overhead caused by recompilations. On the other hand, temp tables have accurate statistics that help the Query Optimizer produce better-performing execution plans. Therefore temp tables only make sense when you deal with a larger set of data. For a smaller set of data, the CPU overhead caused by the recompilations would be larger than the performance gain from the statistics.

In next week's blog post I will introduce Table Variables to you, and we will see how they can avoid the nasty side effects of recompilations – and what other performance problems they can introduce. Stay tuned.

Like or share this blog post to get the source code.

Thanks for your time,



