Table variables have been known to run slowly when they contain lots of rows. This is because when a batch is compiled the table variable has not yet been populated with any rows, and since table variables don’t have statistics, the optimizer uses an estimated row count of 1. An estimate of 1 row works well for operators suited to small row counts, such as a Nested Loops operator for a join operation, but operators that perform well for small numbers of rows do not always scale when large numbers of rows are involved. Therefore, to improve the performance of code that uses table variables containing lots of rows, you should consider adding the optimizer hint RECOMPILE, as shown below:
SELECT O.[Order Key], TV.[City Key]
FROM Fact.[Order] AS O
JOIN @MyCities AS TV
    ON O.[City Key] = TV.[City Key]
OPTION (RECOMPILE);
By adding the optimizer hint RECOMPILE, a statement that uses a table variable will be recompiled after the table variable has been populated. This allows the optimizer to see how many rows are actually in the table variable. Once the optimizer knows the actual number of rows, it has a much better chance of picking an operator that works well for larger row sets, such as a Hash Match for a join operation.
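To show the hint in context, here is a minimal end-to-end sketch. It assumes the WideWorldImportersDW sample database, where Fact.[Order] and Dimension.City both carry a [City Key] column; the table variable name @MyCities and its single-column definition are illustrative only.

-- Declare and populate the table variable; the batch is compiled before any rows exist.
DECLARE @MyCities TABLE ([City Key] INT NOT NULL);

INSERT INTO @MyCities ([City Key])
SELECT [City Key]
FROM Dimension.City;

-- RECOMPILE forces this statement to be recompiled after @MyCities is populated,
-- so the optimizer sees the real row count instead of an estimate of 1.
SELECT O.[Order Key], TV.[City Key]
FROM Fact.[Order] AS O
JOIN @MyCities AS TV
    ON O.[City Key] = TV.[City Key]
OPTION (RECOMPILE);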
If you find that your table variable queries are not performing well, you might consider adding the optimizer hint RECOMPILE to see if they start performing better. Keep in mind that with SQL Server 2019 (version 15.x), Microsoft introduced a feature called “Table Variable Deferred Compilation” that solves the problem described here without the use of OPTION (RECOMPILE), by deferring compilation of a statement that references a table variable until its first execution, when the actual row count is available.
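Table Variable Deferred Compilation is tied to database compatibility level 150. The following is a rough sketch of how you might enable it and, where needed, opt a single query back into the legacy behavior; the database name is a placeholder and the query reuses the earlier illustrative @MyCities example.

-- Deferred compilation of table variables requires compatibility level 150 or higher.
ALTER DATABASE WideWorldImportersDW SET COMPATIBILITY_LEVEL = 150;

-- To fall back to the legacy single-row estimate for one query without
-- lowering the compatibility level, a USE HINT can be applied:
SELECT O.[Order Key], TV.[City Key]
FROM Fact.[Order] AS O
JOIN @MyCities AS TV
    ON O.[City Key] = TV.[City Key]
OPTION (USE HINT('DISABLE_DEFERRED_COMPILATION_TV'));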