For completeness, here is my reasoning for the Canon 1-series and 5D (again, I have not tested the 5D2; it may behave like the 5D, or Canon may have decided to shave costs and make it behave like the xxD series).
The read noise in photon equivalents as a function of ISO for the 1D3 looks like this (ignore the curves, they are a fit to a mathematical model of the noise; the points are actual data):
There is essentially no difference in read noise within groups of three -- a "standard" ISO like 200 has essentially the same noise as ISO 250 and 320. There is a marginal improvement of 125 and 160 over 100, but it is virtually undetectable in images. Read noise for ISO 500 and 640 is the same as for 400, and so on.
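To make the grouping concrete, here is a small Python sketch. The ISO list and function name are my own illustration (not anything built into the camera or a real API); it simply maps each intermediate ISO down to the standard full-stop ISO that, per the data above, shares its read noise:

```python
# Each intermediate ISO shares its read noise with the standard
# (full-stop) ISO below it, per the 1D3 measurements described above.
STANDARD_ISOS = [100, 200, 400, 800, 1600, 3200]

def noise_equivalent_iso(iso: int) -> int:
    """Return the standard ISO with essentially the same read noise."""
    # Largest standard ISO not exceeding the requested ISO.
    return max(s for s in STANDARD_ISOS if s <= iso)

print(noise_equivalent_iso(640))  # 400
print(noise_equivalent_iso(250))  # 200
print(noise_equivalent_iso(160))  # 100 (the marginal exception noted above)
```

So from a noise standpoint, ISO 640 buys you nothing over ISO 400 on this body.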
What does this mean for practical photography? Suppose the exposure (Tv and Av) was correct for ISO 640. That means the sensor is collecting a given number of photons. If instead one uses ISO 400, the sensor still collects the same photons and the electronic read noise is the same, but you gain 2/3 stop of highlight headroom should you want it in post-processing. Had you used ISO 640, there would have been no benefit in S/N, and that extra 2/3 stop of highlights you might have wanted would be gone forever.
So again, I set the custom function that turns off the intermediate ISOs on my 1D3, and if I need a bit more shutter speed I simply "underexpose" at a standard ISO and compensate during RAW conversion.
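The "underexpose and push" arithmetic is simple; here is a quick sketch in Python (the function name is hypothetical, just to show the calculation):

```python
import math

def push_stops(shot_iso: int, intended_iso: int) -> float:
    """Stops of positive exposure compensation needed in RAW conversion
    when shooting at shot_iso with Tv/Av metered for intended_iso."""
    return math.log2(intended_iso / shot_iso)

# Metered for ISO 640 but shot at ISO 400:
print(round(push_stops(400, 640), 2))  # ~0.68, i.e. about a 2/3 stop push
```

The push restores the intended brightness, while the highlights captured in that extra 2/3 stop of headroom remain recoverable.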