# Sql-server – Under what circumstances can a division expression reduce the number of rows in a query’s result

sql-server sql-server-2008

I'm experiencing a problem with a query that I've written.

Given a table with the schema:

create table somevals(division varchar(100),
                      subdivision varchar(100),
                      good float,
                      bad float);

I would like to find, for each unique (division, subdivision), the sum of good values, the sum of bad values, and the ratio of good to bad values. So, I wrote the query:

select division, subdivision, g, b, g / b as ratio
from (select division, subdivision, sum(good) as g, sum(bad) as b
      from somevals
      group by division, subdivision
     ) as temp


The problem is that this query returns far fewer rows than I expect. However, if I remove the division (g / b as ratio), it returns the correct number of rows (about three times as many). None of the columns contain NULL values, and none of the bad values are less than or equal to zero, so sum(bad) is always non-zero.

I am at a loss for how this could be happening. Under what circumstances can adding a division expression in the select statement reduce the number of rows in the result?

You simply cannot change the number of rows returned by changing the SELECT list. A scalar expression such as g / b affects only the column values in each row, not which rows are produced, so the problem must lie elsewhere.
If you're absolutely sure that's what's happening, I would run DBCC CHECKTABLE against the table to make sure it isn't corrupted.
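As a minimal sketch of that claim, the snippet below rebuilds the table from the question (using Python's built-in sqlite3 rather than SQL Server, and made-up sample data) and runs both the aggregation alone and the version wrapped with the g / b ratio. The two queries return exactly the same number of rows:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE somevals (
        division    TEXT,
        subdivision TEXT,
        good        REAL,
        bad         REAL
    );
    -- hypothetical sample data: three distinct (division, subdivision) groups
    INSERT INTO somevals VALUES
        ('A', 'x', 10.0, 2.0),
        ('A', 'x',  5.0, 3.0),
        ('A', 'y',  4.0, 1.0),
        ('B', 'x',  8.0, 4.0);
""")

# Aggregation without the ratio column
without_ratio = conn.execute("""
    SELECT division, subdivision, SUM(good) AS g, SUM(bad) AS b
    FROM somevals
    GROUP BY division, subdivision
""").fetchall()

# Same aggregation wrapped in a derived table, with g / b in the SELECT list
with_ratio = conn.execute("""
    SELECT division, subdivision, g, b, g / b AS ratio
    FROM (SELECT division, subdivision, SUM(good) AS g, SUM(bad) AS b
          FROM somevals
          GROUP BY division, subdivision) AS temp
""").fetchall()

# Both queries yield one row per (division, subdivision) group
print(len(without_ratio), len(with_ratio))  # → 3 3
```

Adding the division changes only the columns of each row; the row count is fixed by FROM, WHERE, and GROUP BY. (With a zero denominator SQL Server would raise a divide-by-zero error rather than silently drop rows, which is consistent with the answer: something other than the SELECT list must explain the discrepancy.)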