
It's not hatred. I just don't like Nvidia's false advertising. Let me show you: http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-970/specifications

[Screenshot of the GeForce GTX 970 specification page]


That's false information, almost a full year after the 3.5 GB + 0.5 GB memory configuration was revealed.

Also, Nvidia's promises about future-proofing have turned out to be false as well, so anyone who buys a GTX 970 is supporting a company that sells products with false information. GPU power consumption has never been an issue in the past, so it's not one today either.



Your current CPU is fine for games, with a couple of exceptions that are unplayable anyway. In case you really need a new CPU, LGA1150 is already obsolete and has been replaced by LGA1151. Also, see the many points about the GTX 970 above.
Again, stop with the politics. The 970 DOES have 4GB of VRAM; it's just that performance suffers greatly when accessing the last 512MB. Again, stop looking at specs like the clock rate and at articles everyone has known about for months, and look at the actual real-world benchmarks. The price of the 970 reflects what the card is worth, which is very similar to the 390X in most games and over $100 cheaper.
 

I'm still talking about facts. See my picture: Nvidia claims the GTX 970 has 4GB of memory WITH a 256-bit bus. That is incorrect, because only 3.5GB is on the 256-bit bus. So that is false advertising, and Nvidia just doesn't seem to care. Very few gaming benchmarks are real-world benchmarks.

The price of the 970 reflects what the card is worth, which is very similar to the 390X in most games and over $100 cheaper.

- GTX 970 is not a 4GB card
- 390 is better for VR
- 390 is better on DX12
- 390 delivers what it promises (Nvidia has been caught not once but multiple times lying about GTX 970 specs)

So the price is too high compared to the R9 390X. I still recommend waiting for 14/16nm GPUs. If a new GPU is needed right now, then I recommend either a cheap card or something future proof. The GTX 970 is too expensive to be "cheap" and it's not future proof either. The 390X is expensive but quite future proof.
 
Okay, you lost all credibility right there.

If you think taking the average framerates at different resolutions, AA, and quality settings from several games is not "real world", I'd love for you to tell me what is.
 

That's not so hard:

Real world benchmarking:

- Game is actually played (I like to play games, not watch recorded runs)
- Windows installation is not fresh (who reinstalls Windows every day?)
- Computer is connected to the internet (who disconnects from the internet when they start to play?)
- Several programs are running in the background, including anti-virus software
- In case something goes wrong (game crashes, noticeable slowdowns etc.), it's reflected in the results
- User experience is also taken into account

Etc

Normal gaming benchmarking:

- Game is not actually played; only a pre-recorded demo is run
- Windows installation is fresh
- Computer is not connected to the internet
- No background programs running
- All failed runs (too-low result, crash etc.) are simply discarded and run again
- User experience is completely ignored; everything is based on numbers

Etc
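For what it's worth, the numbers-only reduction that standard benchmarking ends with usually boils down to a couple of summary statistics pulled from a frame-time log. A minimal sketch (the frame-time values here are invented for illustration, not from any real test):

```python
# Reduce a frame-time log (milliseconds per frame) to the summary
# numbers a typical review reports: average FPS and minimum FPS.
# The frame times below are hypothetical.
frame_times_ms = [16.7, 16.9, 17.1, 33.5, 16.8, 16.6, 17.0]

def avg_fps(times_ms):
    # Average FPS = total frames / total seconds, not a mean of per-frame FPS.
    return len(times_ms) / (sum(times_ms) / 1000.0)

def min_fps(times_ms):
    # The slowest single frame defines the minimum FPS.
    return 1000.0 / max(times_ms)

print(f"avg: {avg_fps(frame_times_ms):.1f} FPS")  # ~52.0
print(f"min: {min_fps(frame_times_ms):.1f} FPS")  # ~29.9
```

Note how the single 33.5 ms stutter frame barely moves the average but dominates the minimum, which is exactly the kind of detail an "average FPS" headline number hides.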

So "taking the average framerates at different resolution, AA, and quality settings from several games" is far from real world. Here's what this actually means in practice:

http://www.anandtech.com/show/2715/4 Core 2 Quad vs Phenom II 940

Benchmarks say the processors are nearly equal, but:

After playing through the several levels on each platform, we thought the Phenom II 940 offered a better overall gaming experience in this title than the Intel Q9550 based on smoother game play. It is difficult to quantify without a video capture, but player movement and weapon control just seemed to be more precise.

And http://www.anandtech.com/show/2715/9

Again, benchmarks say the processors are very close:

Now that we have discussed the numbers, what about the game play experience? As we alluded to earlier, the Intel platforms had problems with minimum frame rates throughout testing - not just in the benchmarks, but also during game play in various levels and online. We have not nailed it down yet, but we have noticed this problem consistently. In contrast, the Phenom II X4 940 had rock solid frame rates and offered the smoothest game play experience. The problem is very likely driver related in some manner (as the man who helped to start DirectX once put it, "the drivers are always broken"), but nevertheless this is an issue on the two Intel platforms.

But the real-world experience says AMD is best by a clear margin.

So when the reviewer actually played the game, he thought AMD was better. Looking at the benchmarks alone, I cannot see that. That's the main difference between real-world testing and benchmarks.
 
Your complaints would be valid if you thought the actual FPS the testers were achieving was what matters most. Your statement that a new OS install, no rogue apps running in the background, no internet connection, etc. will likely show a slight FPS improvement over identical hardware in an average user's OS and application state is correct; however, that is not what is being argued here. With a fresh install, no rogue apps running, etc., you have a more consistent testing platform. Remember, what we are comparing here are video cards only, not whether Crysis is playable while running Windows Update in the background.

A benchmark is exactly that: a benchmarking program like 3DMark. Running games and averaging your FPS IS real-world benchmarking, as it uses the same game engine, environment, rendering, etc. that will be present in that game at those resolution/quality settings. You obviously need a pre-programmed scenario that can be replicated; otherwise, comparing one card in a scene with 50 dead bodies flying around to another test where you only see 20 dead bodies will vastly skew the results.

Again, having a clean OS with no rogue apps ONLY means that the underlying OS will provide consistent results and allow a fairer comparison between video cards (remember, that's what we are talking about here). The end goal is to show the percentage difference and min/max between cards, not the actual average FPS number one achieves when running the game tests.
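To illustrate that "percentage difference and min/max between cards" framing, here is a small sketch. All FPS figures and the card labels are hypothetical, just to show the arithmetic:

```python
# Compare two cards on the same repeatable scene: report each card's
# avg/min/max across several runs and the relative difference in averages.
# The per-run FPS numbers are made up for this example.
card_a_runs = [61.2, 60.8, 61.5]   # per-run average FPS, card A
card_b_runs = [58.9, 59.3, 59.1]   # per-run average FPS, card B

def summarize(runs):
    return {"avg": sum(runs) / len(runs), "min": min(runs), "max": max(runs)}

a, b = summarize(card_a_runs), summarize(card_b_runs)
pct_diff = (a["avg"] - b["avg"]) / b["avg"] * 100.0
print(f"card A avg {a['avg']:.1f}, card B avg {b['avg']:.1f}")
print(f"A is {pct_diff:+.1f}% vs B on average")
```

The percentage and the spread between runs stay meaningful across different test rigs even when the absolute FPS numbers would not, which is the point of keeping the platform constant.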

By the way, your quotes only prove that benchmarking is the best way to show these issues, since those tests would show minimum FPS, which is exactly what the complaints you quoted are about. (Also, that article is nearly 7 years old.)
 

You are talking about a consistent platform when benchmarking. There is one big problem: real-world computer use is never consistent, so a consistent platform automatically means it cannot be real-world benchmarking. Yes, video cards are being compared, but in a real-world situation it really matters what happens if Windows Update is running in the background. In normal gaming benchmarking that can be ignored by just running the benchmark again. In a real-world gaming situation it matters, unless someone has a time machine.

It is not, for the reasons I already provided. Results will change, but as I already said, the real world is never consistent. In real-life testing, conditions will differ from run to run; that's one price of real-world testing.

Now you are talking about fair comparison. A video card comparison may be fairer with a clean OS, but once again, fair comparison, consistent platforms etc. do not make it real-world benchmarking. Real-life benchmarking is what happens under real-world conditions. To understand the difference, consider the classic car example: http://www.nytimes.com/interactive/...-diesel-emissions-scandal-explained.html?_r=0

Normal exhaust tests were like normal benchmarks: conditions were equal, with no differences between environments and such. What happened when the same cars were tested in real-world situations?

Minimum FPS shows the best platform? See the Company of Heroes benchmark. Without overclocking, the i7-920 has a higher minimum FPS than the Phenom II, so according to that, the i7-920 offers the better experience. In Crysis, the i7-920 was considered best despite the Phenom II having an equal minimum FPS. And the Core 2 Quad was not very far from the Phenom in the benchmarks, but the gaming experience showed something else. 7 years old, so what? That clearly shows that benchmarking on consistent platforms and gaming on consistent platforms are two different things, and benchmarking on a consistent platform versus gaming on an inconsistent platform is even more different.
 
Again, I'm not saying these tests equate to the performance one should expect on their own machine with an old OS and apps running. I'm saying that things such as a bloated OS, rogue apps, other apps running in the background, etc. do not make any difference between video cards, so there is no point in including them in these tests.

If you included these in the tests, then sure, if you only care about that one card it might give you a more accurate score/FPS, but then you are assuming everyone has X number of background apps. The point here is to compare video cards against each other. Doing these tests with some setups running more active background tasks than others serves no purpose. Having a clean baseline that doesn't sway the video card's results is what you want when comparing cards.
 

Background apps can make a difference to other parts, and because PC performance always depends on every part, background tasks can make a difference to gaming performance.

Beyond rogue apps, another big problem is that the game version and driver version have a very large impact on performance. Drivers are also usually optimized for the most popular games, so testing with less common games gives a much better idea of video card performance.

Anyway, this discussion went badly off topic and I feel there's not much left to discuss, so I'll quit here. Feel free to reply if you have something to add.
 
Again, you are missing my entire point. I'm NOT saying that gaming performance is not impacted by background tasks. What I'm saying is that when comparing multiple cards, you don't want such vast discrepancies between tests. Comparing one card to another when one test was done with some rogue task eating 20% of the CPU in the background has nothing to do with the video card, and only serves to make the results useless.

Driver versions and game versions do not matter, as you are comparing tests on a clean slate with the same versions of the software and drivers. The ONLY difference is the specific video card you swap out to test.
 
In modern computers that match the 970, where the CPU and other components are not bottlenecked, running other apps won't make a lick of difference to gaming.
 
You can run many apps in the background and it won't be an issue. Yes, running a CPU-intensive benchmark where the CPU becomes a bottleneck is an extreme situation.
 