I needed to run a query against 42,000 records which checks each record against a number of conditions before flagging it as OK to process or not.
Before starting, I took a copy of the database to my local laptop and developed the query there; once complete, it ran in just over 15 minutes.
Happy that all was OK with the query, I then ran it on our server against the original data. The exact same query on the exact same database took 37 minutes.
Both are running SQL Server 2008 R2. The difference is that my laptop runs Developer Edition whilst the server runs Standard Edition. Could the edition alone make such a massive difference in run time?
What surprised me more is the hardware comparison:

- Laptop: Intel Core i7 @ 2.6 GHz, 16 GB RAM, Windows 7 64-bit, SQL Server Developer 10.50.4000
- Server: Intel Xeon E5-4603 @ 2.00 GHz, 16 cores, 64 GB RAM, Windows Server 2008 R2 64-bit, SQL Server Standard 10.50.4000
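If it helps, the edition and build on each instance can be confirmed with a standard SERVERPROPERTY check like the one below (nothing here is specific to my setup):

    -- Confirm edition, build, and patch level on each instance
    SELECT
        SERVERPROPERTY('Edition')        AS Edition,         -- Developer Edition vs Standard Edition
        SERVERPROPERTY('ProductVersion') AS ProductVersion,  -- 10.50.4000 = SQL Server 2008 R2 SP2
        SERVERPROPERTY('ProductLevel')   AS ProductLevel;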
Any suggestions on why such a difference should occur when the server is a much more powerful machine yet seems to run the same query roughly two and a half times slower?
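In case it is useful for anyone answering, this is the sort of diagnostic I could run on both machines to compare the two executions. The MAXDOP check at the end is only a guess at one setting that might differ between the instances:

    -- Run on both the laptop and the server against the same database copy
    SET STATISTICS TIME ON;   -- reports CPU time vs elapsed time per statement
    SET STATISTICS IO ON;     -- reports logical/physical reads per table

    -- <the validation query goes here>

    SET STATISTICS TIME OFF;
    SET STATISTICS IO OFF;

    -- Check max degree of parallelism in case the two instances are configured differently
    SELECT name, value_in_use
    FROM sys.configurations
    WHERE name = 'max degree of parallelism';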
Any advice would be much appreciated.