Hi Jkeefe,
I have had a chance to check out rFactor now. It rocks! It's also just what I'm looking for: a good-looking, good-playing game for trying out different display architectures. Funny, I wasn't a petrol head before and certainly had no passion for F1, but now... y'know, rFactor could be good for the planet :)
As an implementation it is a bit of a black box, and its hard-wired (well, to be fair, not exposed) variables are a bit of a shame. It does indeed look a lot like they have implemented three cameras, with the outer two given different headings. The artefacts in images others have posted, along with some below, are typical of this. I think it unlikely that they would go to all the work of image warping to simulate multiple cameras when they could just use real ones.
The minimum variables I would want control over are the horizontal field of view and the rotational transform of each camera (assuming an automatic vertical FOV). In my setup it is clear that the cameras are configured for a desktop monitor situation, with possibly quite a tight angle on the side ones (tighter than typical TH2Go setups). That would make sense, as ideally you'd build a mini-CAVE around yourself.
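To make the request concrete, here is a minimal sketch of what an exposed three-channel rig configuration might look like. This is purely illustrative: rFactor does not expose any such structure, and the names (`Channel`, `build_rig`) and numbers are my own assumptions, matching my setup below.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Channel:
    name: str
    hfov_deg: float   # horizontal field of view of this camera
    yaw_deg: float    # heading offset relative to the centre camera

def build_rig(hfov_deg: float, side_yaw_deg: float) -> List[Channel]:
    # Hypothetical three-camera rig: a centre camera plus two side
    # cameras rotated outward by the same yaw offset. These are the
    # two variables I'd like exposed (HFOV and per-camera rotation).
    return [
        Channel("left",   hfov_deg, -side_yaw_deg),
        Channel("centre", hfov_deg, 0.0),
        Channel("right",  hfov_deg, +side_yaw_deg),
    ]

rig = build_rig(hfov_deg=56.0, side_yaw_deg=50.0)
```

Nothing clever here; the point is just that two numbers per camera would be enough to match an arbitrary projector layout.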
The first image is the normal single camera opened up wide; the next has the multi-screen flag checked. I have highlighted the horizon artefacts on the right-hand side.
My opinion here would be to go for the perspective distortion you get from a single camera (if that were all you could choose from). In the heat of the action you are very focussed on the centre (head tracking is another thread, I think).
I have tried to buy this great, great game (did I mention how great I think it is?), but the third-party group handling sales places serious restrictions on online commerce and licensing (a potential grumble for another thread on another forum on another website). So I am stuck playing the demo... with all those big trucks yet to race :(
Now, what I am not showing is how to fix the artefacts that this wide-angle workaround itself introduces. I was hoping to set the variables mentioned above to match my system, which would be: 56 degree HFOV, 45 degree VFOV, 50 degree offsets for the side cameras, all on the same vertical axis. This gives me an effective HFOV of 150 degrees, as I use a 3 degree overlap on either side with the other channels. Once at that point, a spherical distortion applied to the finished frame, either in the GPU or in the projector, helps alleviate that mismatched horizon/vanishing point effect. If I could get the game to conform I would post the images, but turning on the distortion the way it is configured now is quite unpleasant.
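For anyone checking the arithmetic behind that 150 degree figure, here is my reading of it as a one-liner; the formula is my own interpretation of the numbers above (56 degree channels, 50 degree side offsets, 3 degree blended overlap at each seam), not anything the game documents.

```python
def effective_hfov(hfov_deg: float, side_yaw_deg: float, overlap_deg: float) -> float:
    # Total angular span of a three-channel rig: the side cameras
    # extend coverage by their yaw offset on each side of the centre
    # channel, and the blended overlap at each of the two seams is
    # only counted once.
    return hfov_deg + 2 * side_yaw_deg - 2 * overlap_deg

print(effective_hfov(56.0, 50.0, 3.0))  # → 150.0
```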
How this distortion works is a bit involved. I don't have any documents of my own that I can make public, but this link gives an introduction to it:
http://orihalcon.jp/projdesigner/
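To give a flavour of the idea without those documents, here is a toy sketch of the kind of remapping involved. A flat (planar) frustum renders equal angles as unequal pixel widths, compressed in the centre and stretched at the edges; an angle-proportional warp evens that out so adjacent channels meet cleanly. This is a generic cylindrical remap of my own, assuming a normalised coordinate in [-1, 1], not the actual maths used by rFactor or ProjDesigner.

```python
import math

def planar_to_angular_x(x_norm: float, hfov_deg: float) -> float:
    """Map a normalised planar-projection x in [-1, 1] to an
    angle-proportional x in [-1, 1] (simple cylindrical warp)."""
    half = math.radians(hfov_deg) / 2.0
    # Recover the true viewing angle of this pixel on the flat frustum...
    theta = math.atan(x_norm * math.tan(half))
    # ...then normalise the angle back to [-1, 1].
    return theta / half
```

In practice a warp like this runs per-pixel on the GPU (or inside the projector), and a full spherical version handles the vertical axis too, which is what pulls the horizons of the three channels back into line.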
Hopefully Image Space Inc. will expose the necessary variables through their API so we can give it a go, or better still, believe us that they need to be exposed.
God it's good though.