Should I buy a 7950?

As is typical with heavily overclocked cards, overclocking quickly drives up power consumption, and these cards are no exception. After overclocking, power consumption is almost identical to the stock card, so while you can get the extra performance, you still pay the price in power consumption.

With the increase in power comes an increase in temperatures. Last but not least, we have load noise.

It's a nice chip, it really is. But the problem with its pricing just became even more obvious, as AMD is selling you yesterday's performance at next-gen prices.

In other words, if you wanted this level of performance, you could have gotten it a year ago with the GTX for almost the same price. And that's why AMD's pricing of these parts fails.

Instead, Nvidia recommends only a watt power supply to power a single GeForce GX2, a marked improvement. So it's neither the design of the card nor its requirements that hold us back, but rather the 3D chips themselves.

The GeForce series has been a solid performer for Nvidia. It helped usher in the dual-card SLI era, and even though ATI's Radeon X's can jump through a few more hoops, I think most gamers would argue that this current generation of 3D chips has served the gaming public well.

If that sounds like an epitaph, you're not far off. The problem is Windows Vista, or more specifically, Vista's updated multimedia programming interface, DirectX. DirectX is Microsoft's Rosetta stone for combining hardware and software. As long as software programmers and hardware developers design their products to cue into DirectX, compatibility should be guaranteed.

While neither Nvidia nor ATI has yet announced a DirectX 10 chip, you can bet that such cards will be out, or very near release, by the time Vista launches. We're not surprised by the clock speed reductions given the heat and power issues inherent to running two fast GPUs in a single-slot package, but we were surprised by the performance results.

Props to Sarju Shah, GameSpot's illustrious associate hardware editor, for sharing his benchmark scores with us. ATI still wins for 3D image quality, since on some games you can turn on more image quality features at the same time.

Gigabyte HD missing in action. Just thought you might like to know that the x-axis captions for the noise level graphs are labelled as they would be for temperature, rather than decibels.

That Nvidia turbo boost is a bit of cheating, and you should turn it off for the test. It's basically the same as overclocking, and your review sample cards are most likely binned to get a much better than average OC. Nice article. Glad to know I'm not missing much by sticking with brand loyalty.

Why was there no expanded information on the overclocking ability of the GPUs? Also, what kind of memory overclocks did you get? Did you have to change the voltage of the GPUs to reach those clock speeds? This article was missing too much information to give the full picture of these GPUs. The HD 6 card shootout is how it should have been done for the HDs.

XFX HD didn't make the list?
