A common memory recommendation is to reduce MaxMemory by 4 GB or 10% of RAM (keeping the result equally divisible by the core count) so that the reserved memory is dedicated to the OS.
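To make the arithmetic concrete, here is a small sketch of that rule. The function name and the "whichever is larger" interpretation of "4 GB or 10%" are my assumptions, not an official formula:

```python
def max_server_memory_mb(total_ram_mb: int, cores: int) -> int:
    """Illustrative sketch of the memory rule described above.

    Reserve the larger of 4 GB or 10% of RAM for the OS (assumption:
    "or" means "whichever is larger"), then round the remainder down
    so it divides evenly by the core count.
    """
    os_reserve_mb = max(4 * 1024, total_ram_mb // 10)
    sql_mb = total_ram_mb - os_reserve_mb
    return sql_mb - (sql_mb % cores)  # keep it equally divisible by cores

# e.g. a 64 GB box with 8 cores:
print(max_server_memory_mb(64 * 1024, 8))  # -> 58976
```

The result would then be applied via `sp_configure 'max server memory (MB)'`.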
Most questions on CPU affinity deal with splitting cores between instances.
My question is: should one (or more) cores be deselected for OS-only use? One site did this on a Win2003/SQL 2005 box.
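For concreteness, "deselecting" a core on SQL 2005 means clearing its bit in the `affinity mask` setting, which has one bit per core. A small sketch of how that mask value would be computed (the helper function is hypothetical, purely for illustration):

```python
def affinity_mask_excluding(cores: int, reserved: set[int]) -> int:
    """Build a per-core affinity bitmask that leaves out reserved cores.

    Each core maps to one bit; clearing a bit reserves that core for
    the OS, so SQL Server schedules only on the remaining cores.
    """
    mask = 0
    for core in range(cores):
        if core not in reserved:
            mask |= 1 << core
    return mask

# On a 4-core box, reserving core 0 for the OS gives binary 1110 = 14,
# i.e. sp_configure 'affinity mask', 14.
print(affinity_mask_excluding(4, {0}))  # -> 14
```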
Q1: Was this a beneficial thing to do? As I understand it, older Windows versions put everything in NUMA node 0, so it sort of made sense. But I recently learned that SQL Server was aware of that and favored NUMA nodes 1+ to compensate. And, as I understand it, newer OS versions (Win2008+) are NUMA-aware and spread their CPU usage across the cores.
Q2: So, even if it was a good idea under older Windows/SQL versions, is it still a good idea under newer ones?