Since eZ's data indicates >60 fps at 1080-tall resolutions for many titles, good frame rates at 1200-tall resolutions may be well within reach. That's why I mentioned my preference for three 16:10 monitors running 5760x1200.
Phantom, do you play many shooters? Do they handle x1200 well? I'd love to stick with 1920x1200, but it seems like 5760x1200 would be murdered with 1 GB of VRAM on newer games.
I tend to play slightly older, reduced-price shooters (Left 4 Dead, Fear 2, Shattered Horizon, Wolfenstein, etc.). I wait for the newer shooters to come down in price before playing, so I don't have performance figures for the latest, most demanding titles (Metro 2033, for example).
So long as GPU performance is the only limiting factor, i.e. no CPU speed or VRAM frame-buffer bottlenecks, I find that video performance scales close to linearly with resolution for small spreads such as 5040x1050 (5.3 megapixels) vs 5760x1080 (6.2 Mp) vs 5760x1200 (6.9 Mp).
For example, if a game plays at 100 fps on 5040x1050 (5.3 Mp), then simple math roughly predicts performance at the adjacent resolutions:
5760x1080 --> ~85 fps (100 fps x 5.3 Mp / 6.2 Mp = 85.5 fps)
5760x1200 --> ~77 fps (100 fps x 5.3 Mp / 6.9 Mp = 76.8 fps)
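If anyone wants to run the same back-of-the-envelope estimate for other setups, the scaling rule above is easy to sketch in a few lines of Python. Note this uses exact pixel counts instead of the rounded Mp figures, so it lands slightly below the 85.5/76.8 numbers; the function name and interface here are just my own illustration of the math, and the whole model only holds while you're purely GPU-bound.

```python
def predict_fps(known_fps, known_res, target_res):
    """Estimate fps at a new resolution, assuming fps scales inversely
    with total pixel count (GPU-bound; no CPU or VRAM bottleneck)."""
    known_pixels = known_res[0] * known_res[1]
    target_pixels = target_res[0] * target_res[1]
    return known_fps * known_pixels / target_pixels

# 100 fps at 5040x1050, predicted at the adjacent resolutions:
print(round(predict_fps(100, (5040, 1050), (5760, 1080)), 1))  # 85.1
print(round(predict_fps(100, (5040, 1050), (5760, 1200)), 1))  # 76.6
```
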
I agree, the jumps from 5040x1050 to 5760x1080 (~18% more pixels) and from 5760x1080 to 5760x1200 (~11%) are much smaller than going straight from 5040x1050 to 5760x1200 (~31% more pixels).