Though it may sound like hype to a lot of ATI owners, Nvidia has the money to do such things because they have a popular product with a loyal customer base. Again, a lot of that comes back to consistently good driver writing. Both have had their screw-ups on the software side, but far more so in ATI's camp from my experience.
That's the big problem. Camps. ATI and Nvidia are companies with shareholders who care only about profits. I never understood this camping. Many of us just buy whatever we think is the best offering at a given price point, regardless of ATI vs. Nvidia.
Camping creates a lot of FUD. For some reason, users like to spread lies and doubts about products to promote other products. Free of charge. They are useful idiots for companies whose only interest is to earn money.
Take the driver writing. Some claim that ATI has more unstable drivers, while others say Nvidia has more unstable drivers. And they never back it up with numbers or any kind of facts. They just throw it out as a "truth" and assume that this truth applies to everyone.
I've been doing support for years, and I don't claim that choosing one GFX vendor over the other automatically gives you more stability. If I were to make such a claim, I would at least try to back it up with some facts. Something like this:
NVIDIA drivers responsible for nearly 30% of Vista crashes in 2007
http://www.engadget.com/2008/03/27/nvidia-drivers-responsible-for-nearly-30-of-vista-crashes-in-20/
Total market share of GPUs in Q1 2008:
AMD: 18.1%
Intel: 47.3%
Nvidia: 31.4%
http://www.jonpeddie.com/about/press/2008/q2-2008-gpu-shipments.php
With only the data above, I can make a more valid statement that Nvidia drivers are much worse than any other GFX manufacturer's, even when adjusting for market share. That statement would be more valid than anyone online just saying that XXX's drivers are better than YYY's, because they feel so and one vendor hurt their feelings. :P
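The "adjusting for market share" step above can be sketched as a quick back-of-the-envelope calculation. This is only an illustration of the method, not a verdict: it assumes the 2007 crash figures and the Q1 2008 shipment shares are comparable snapshots, and it rounds "nearly 30%" to 0.30.

```python
# Rough sketch: compare a vendor's share of reported crashes to its
# share of GPU shipments. A ratio well above 1.0 would mean the vendor
# causes a disproportionate number of crashes for its market share.
# Figures are the approximate ones quoted above (2007 Vista crash data
# via Engadget; Q1 2008 shipments via Jon Peddie Research).

nvidia_crash_share = 0.30    # "nearly 30%" of Vista crashes in 2007
nvidia_market_share = 0.314  # 31.4% of GPU shipments in Q1 2008

crash_to_share_ratio = nvidia_crash_share / nvidia_market_share
print(f"Nvidia crash-to-share ratio: {crash_to_share_ratio:.2f}")
```

Note that the shipment numbers include Intel's 47.3% integrated-graphics share, so a discrete-card-only comparison would look different; the point is simply that a bare "vendor X's drivers are worse" claim can at least be sanity-checked against numbers like these.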
If you are to go on about driver stability, at least dignify it with some facts and statistics to back up your statements.
This is pure speculation on my part, but it could be that Nvidia decided to focus on DX11 instead of solely on DX10.1 when it was discovered MS was ahead of schedule on W7. I like what I'm hearing about W7, but despite it having Vista as its core foundation, I'm worried that the heavy reworking of the code may put us back to square one, with software compatibility taking a while to mature.
DX11 can use DX10.1 features, and by implementing DX11 on the newer cards, the userbase of DX10.1 will grow. Going DX11 would only mean more people able to run DX10.1 games. Nvidia should have given gamers DX10.1 in the meantime, since there's no downside to that.
@Tamlin,
Sure, DX10.1 offers more than enhanced AA, but so far the AA is all the support we've really seen in games, and in very few games at that. It's common for ATI customers to bemoan what could be. Frankly, you can't count on what could be in this industry. To a degree, you have to go with what is being supported the most so you don't end up with wasted features.
The thing is that it's not a wasted feature. The userbase of DX10.1 grows more and more, and when DX11 cards arrive, the DX10.1 base will be even bigger, including the DX11 owners. The reason we still see DX9 games is XP's large market share and developers going after the lowest common denominator (with a lot of console ports to blame as well). DX10.1 is a supported feature already, and everything indicates that it will grow.
I'm never going to buy my hardware with the mindset that the features and intended use sound good on paper. I want concrete evidence that I'll get widespread and ongoing support for it. With Nvidia, I feel it's much less of a gamble to have that.
I didn't quite get what you meant by that. It's not a gamble having DX10.1, it's a free bonus. You still have support for DX10 and below. It's just that you have support for more as well and can take advantage of those features natively.