I've got a view that's based on a query that uses 5 tables from one database and 1 table from another database. It has worked fine in production for several years.
The SQL statement includes a CASE expression that, depending on which of two values it encounters, returns one of two varchar(50) fields. I want to replace that CASE expression with a lookup table that has two rows in it.
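Roughly, the current CASE looks like this (all table and column names here are made up; the real ones differ):

```sql
-- Current version: CASE picks one of two varchar(50) fields
-- based on which of two hard-coded values it sees
SELECT CASE t.TypeCode
           WHEN 'A' THEN t.ShortName   -- first varchar(50) field
           WHEN 'B' THEN t.LongName    -- second varchar(50) field
       END AS DisplayName
FROM dbo.MainTable AS t;
```

The idea is to move those two hard-coded values ('A' and 'B' above) into the new two-row table so they can be changed without editing the view.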
I made the new table with an ID column and a text column, added it to the database diagram, and enforced a primary/foreign key relationship; there are no problems with the data. But when I add the new table to the query, execution time goes from 2 seconds to over a minute. When I drop the table in the other database, the query goes back to 2-second execution.
I think the problem relates to the table in the other database having 100,000 rows in it; somehow adding a 7th table to the view makes SQL Server lose track of what to do. Any ideas why such a simple change would have such a profound effect on performance, and how to coax SQL Server back into the execution strategy that works well with 6 tables once the view has 7?
I copied the SQL into a query window, declared two variables, selected each value from the new table into them, and used the variables in the CASE expression. That version performs well and changes its results based on what's in the new table. But I can't put DECLARE into a view, and building a table-valued function seems like a lot of effort and a potential maintenance annoyance, so I'd rather not do that if there's an easier way around this problem.
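The fast variable-based version looks roughly like this (again, names are invented):

```sql
DECLARE @Val1 varchar(50), @Val2 varchar(50);

-- Pull the two values out of the new two-row table first
SELECT @Val1 = TypeValue FROM OtherDb.dbo.NewLookup WHERE Id = 1;
SELECT @Val2 = TypeValue FROM OtherDb.dbo.NewLookup WHERE Id = 2;

-- Then run the original query with the variables in the CASE,
-- instead of joining the new table into the main query
SELECT CASE t.TypeCode
           WHEN @Val1 THEN t.ShortName
           WHEN @Val2 THEN t.LongName
       END AS DisplayName
FROM dbo.MainTable AS t;
-- ... the other five joins omitted here ...
```

Run this way, the optimizer never sees the new table as part of the big join, which I assume is why the plan stays fast.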
Maybe hints to the query optimizer (which I have read about but never used)?
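Something like an inline join hint is what I had in mind, if that's viable. My understanding is that an OPTION (...) clause isn't allowed inside a view definition, but join hints in the FROM clause are (names below are made up, and I don't know whether LOOP, HASH, or MERGE would be the right choice):

```sql
ALTER VIEW dbo.MyView
AS
SELECT t.TypeCode,
       nl.TypeValue AS DisplayName
FROM dbo.MainTable AS t
-- Force a nested-loops join against the tiny two-row table
-- so it can't derail the rest of the plan
INNER LOOP JOIN OtherDb.dbo.NewLookup AS nl
    ON nl.Id = t.TypeId;
-- ... the other five joins omitted here ...
```

Is this the right direction, or is there a better way to get the optimizer to treat the two-row table as trivial?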
Thanks for any ideas anyone has.