SBLiveFordFE
After talking to my really close friend last night (also a computer nut), we both agreed on one thing: ATI and Nvidia have both taken it way too far.
Here's why. HL2 is a VERY anticipated game. It finally comes out, but you need a $400+ card to run it at a full 60 FPS. What about Doom III? Again, you're going to have to drop major bucks in order to play it.
It wasn't always like this. I'm not one to dwell on the past, but I'll use an example.
My first computer was a Pentium 133 MHz. I would play Thief on it all the time. I probably got 8 FPS, but it was still fun. I went to Staples and got a Voodoo 3 2000 for $60 on sale, and that 8 FPS turned into 30 FPS. It was much better for a while. Then I bought Half-Life and all those other games.
Then I decided it was time for a computer upgrade. I went to a K6-2 500 MHz CPU (my first built computer, at age 14) with an Asus motherboard. I was running at a full 40-50 FPS now. Hitman 1 and 2 were starting to come out, along with Max Payne, which demanded more from the CPU and video card.
Next I grabbed an 800 MHz Athlon. From there (for about a year), I had a full 60+ FPS in every game I played (the huge gain was due to the weak FPU in the K6-2). Over the next 2 years, I just did little upgrades on the side when I saw really good deals. Now I have an Athlon 1800+, an Asus board, a Radeon 8500, etc. If you follow the pattern here, you'll notice that this setup "should" be able to run the new stuff easily. But something has changed.
Now, even with this setup, HL2 and Doom III are going to run like trash. I bet I won't even get 30 FPS at 1024x768x32.
So, does the software need to become more efficient, or do graphics card prices need to fall WAY down?
So I've decided to grab an Athlon 64 3000+, upgrade the chipset from 333 MHz to 400 MHz, and get faster RAM at Christmas.
Guess what? It probably still won't hit 30 FPS consistently.
So then what? Drop $175 for a Radeon 9800 PRO after Christmas? Then what? 45 FPS? See how this doesn't make sense? That's how much a PS2 costs!
What I'm trying to say is, Nvidia and ATI have their customers clenched in their fists. They are bleeding us dry, and that's not hard to see. They have the technology to FAR exceed what they have now. Why don't they use it? To stretch our money out as far as they can before they come out with something "new".
They could EASILY drop their prices by $50 or more and make us ALL happy. Even minus a "measly" 50 bucks, it would still be very expensive. Until then, we have to bust out $400 on a video card (plus more on a CPU, etc.) to run a brand new game at 60 FPS. How much do you think it costs them to make a board/chipset? $3?
I still think I should be able to go out, buy a $150 card, and be able to slam a new game at 60 FPS, not 30 FPS. There's a problem here.
What I've been doing to battle the graphics card problem is just waiting. I've been waiting nearly 1.5 years, and the prices haven't seemed to fall tremendously. The Radeon 9700 and 9800 are still nearly $200. Plus, why even get the 9700 or 9800 if Half-Life 2 is still only going to hit 40 FPS?
What do you guys think?
Nash