Hi, I have a special case. I'll explain it first and then get to the performance tuning issues.
I have a database with tables that are updated in real time.
I need to do two identical series of calculations, which are performed by a specific stored procedure I built.
What I need to do is this:
1. My first run of the calc proc is with input data from my real-time updating tables.
2. The second run of the calc proc is with the input data of (1) plus additional data sent as external input.
* The input data has to be synchronized between the two runs (any input data updated in real time between the two calculations has to be ignored).
*** This procedure has to run fast. It will always only run on a very small amount of data from my tables.
What I did is this:
1. For all my input tables I create temporary tables (using select top 0 * into #temp from inputTable, then loading the relevant rows). This helps me with: (A) synchronization between the two runs (I know I could use snapshot isolation, but see (B)); (B) since I have a series of calculations, I don't have to repeat joins and where clauses against the big tables in every calculation. (There is a sketch of the whole flow right after this list.)
2. I create indices for the temp tables.
3. I run the first run of the calc proc.
4. I re-adjust my input data by adding the external input data.
5. I run the second run of the calc proc.
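In case it helps to see it concretely, here is a rough sketch of the flow above, boiled down to one input table with just a key column and a value column. All names here (inputTable, KeyCol, Amount, calcProc, @keyValue, @externalInput) are placeholders, not my real objects.

-- Stand-ins for the filter value and the external input that arrives from outside.
declare @keyValue int;
set @keyValue = 42;
declare @externalInput table (KeyCol int, Amount decimal(18,2));

-- 1. Copy the structure of the input table, then load only the few rows the
--    calculation needs. This also freezes them so both runs see the same data.
select top 0 * into #inputData from inputTable;

insert into #inputData
select * from inputTable
where KeyCol = @keyValue;

-- 2. Index the temp table the way the calc proc will look rows up.
create clustered index IX_inputData_Key on #inputData (KeyCol);

-- 3. First run. The calc proc reads #inputData directly - temp tables created
--    here are visible inside procedures called from this scope.
exec calcProc;

-- 4. Re-adjust the input by adding the externally supplied rows.
insert into #inputData (KeyCol, Amount)
select KeyCol, Amount from @externalInput;

-- 5. Second run, on the adjusted data.
exec calcProc;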
So, now performance tuning:
1. Sometimes (not always) it takes a long time to create the temporary tables - why is that?
2. The second run is always exactly 3 times faster than the first run (is this compilation? is this caching? and how do I find out? See what I had in mind right after this list.)
3. I want to keep improving performance - my indices and plans all look OK - and I'm also looking for anything special about how to handle this kind of scenario. I need this to run as quickly as possible.
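Regarding (2), would something like the following be the right way to tell compilation apart from caching? (calcProc is the same placeholder as above, and this would only be on a test server, not production.)

set statistics time on;    -- prints parse/compile time and execution time per statement
set statistics io on;      -- prints logical vs. physical reads (data cache vs. disk)

exec calcProc;             -- first run
exec calcProc;             -- second run - compare the two sets of numbers

set statistics time off;
set statistics io off;

-- To start from a cold cache between tests (again, test server only):
dbcc freeproccache;        -- clears all cached query plans
dbcc dropcleanbuffers;     -- clears the data pages from memory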
Hope I explained things well; I'd be most happy for any kind of help/ideas.
Thanks in advance,
Dror