On my Windows 8 box running SQL Server 2012 (11.0.3128), the query below returns an IncludeCount of 0 and an ExcludeCount of 1.
On my Windows 7 box running SQL Server 2008 R2 (10.50.2550), it returns an IncludeCount of 3 and an ExcludeCount of 1, which is correct.
In short, it runs correctly on these OS + SQL Server combinations:
Windows 2008 R2 + SQL Server 10.50.2550
Windows 2008 R2 + SQL Server 10.50.4000
Windows 2012 SP1 + SQL Server 11.0.3000
Windows 7 + SQL Server 11.0.2100
And it gives incorrect results on these combinations (tested so far):
Windows 8 Enterprise + SQL Server 11.0.3128
Windows 2008 R2 + SQL Server 10.50.2550
Can anyone else reproduce this? I can't pin down the combination of OS and SQL Server version on which it breaks.
In every scenario the resulting @filters table is populated correctly, and the [Include] column is properly set to 1 or 0, so why aren't the variables (@includecount, @excludecount) being set correctly?
If I change the [ID] column to a NONCLUSTERED primary key, it also works fine (see the variant below). It doesn't matter whether @filters is a TVP, a temp table, or a permanent table; I get the same (incorrect) results in each case.
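For reference, the working variant changes only the declaration; the rest of the script is identical:

DECLARE @filters TABLE ([ID] bigint PRIMARY KEY NONCLUSTERED, [Include] bit);

Here is the full repro script: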
DECLARE @filters TABLE ([ID] bigint PRIMARY KEY, [Include] bit);
DECLARE @excludecount int = 0;
DECLARE @includecount int = 0;
DECLARE @id bigint;

-- Three positive IDs to include and one negative ID to exclude
INSERT INTO @filters ([ID])
VALUES (1), (3), (4), (-7);

-- Single-pass "quirky update": assigns to variables and columns in one statement,
-- counting positive and negative IDs while normalizing [ID] to its absolute value
UPDATE @filters SET
    @id = [ID],
    @includecount = @includecount + (CASE WHEN @id > 0 THEN 1 ELSE 0 END),
    @excludecount = @excludecount + (CASE WHEN @id < 0 THEN 1 ELSE 0 END),
    [Include] = CASE WHEN @id > 0 THEN 1 ELSE 0 END,
    [ID] = ABS(@id);

-- Expected: IncludeCount = 3, ExcludeCount = 1
SELECT @includecount AS IncludeCount, @excludecount AS ExcludeCount;
SELECT * FROM @filters;
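For comparison, here is a minimal set-based sketch of the same logic. This is my own rewrite, not part of the original repro (@filters2 is just an illustrative name); it computes the counts before the UPDATE, so nothing depends on per-row variable assignment:

DECLARE @filters2 TABLE ([ID] bigint PRIMARY KEY, [Include] bit);

INSERT INTO @filters2 ([ID])
VALUES (1), (3), (4), (-7);

-- Counts computed from the unmodified data, independent of any UPDATE behavior
SELECT
    SUM(CASE WHEN [ID] > 0 THEN 1 ELSE 0 END) AS IncludeCount,
    SUM(CASE WHEN [ID] < 0 THEN 1 ELSE 0 END) AS ExcludeCount
FROM @filters2;

-- Plain UPDATE with no variable assignments: set the flag and normalize the key
UPDATE @filters2 SET
    [Include] = CASE WHEN [ID] > 0 THEN 1 ELSE 0 END,
    [ID] = ABS([ID]);

SELECT * FROM @filters2;

This form shouldn't be sensitive to the clustered key update, though it sidesteps rather than answers the question of why the variable-assignment form differs between versions.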