8800 GT price drop?

I've been doing some reading online and everyone is having this problem, so it will probably be more than 5 days before you get one, but who knows.

Their "5 days after ordering" line seems like they're just throwing a number out there.
 
Hermes, do you know, when the G92 GTS is released, will it be better than the current GTX? And will it be like a domino effect, i.e. will they release a new 8800 GTX and then a new Ultra, etc.?

Cheers, just curious
salman
 
Nothing would make me angrier than that... If you ask me, they should have built the G92 before the crap that I bought; that was just an easy way for Nvidia to get something on the market...
 
That would be pretty annoying. The way nVidia labels their products confuses me: with the 8800 GT they seem to be going against the old rule that the GTS is always faster than the GT, and if the new GTS turns out to be faster than the GTX, that breaks the other rule that the GTX is faster than the GTS.

Confusion... hahah
 
If the new GTS is in fact better than the GT, there's a good chance it will outperform the GTX. It makes no sense to release the GTS and have it perform on par with the GT; common sense says it will be better. The price tag will obviously be higher, $100-150 or so...

Lol, yeah, confusion. Those Nvidiots should've named them differently; hell, they should've made them mid-level 9000-series cards, even. Better than this.
 
No, they shouldn't have just named them differently; they should've built the things right in the first place... You tell me why, over the course of maybe a year, they couldn't have devoted more effort to developing the G92 technology earlier.... Ugh, this is aggravating... I might swap for an 8800 GT and get another for SLI...


EDIT: Too late for me to Step-Up... Perfect timing by Nvidia to screw me over; I might go back to the ATI junk...
 
It isn't yet settled whether the new 8800 GTS is in fact based on the G92 core--not everyone is convinced it will be. If it goes back to the good ole 256-bit architecture, that will be the sign, since the old G80 uses that stupid 320/640MB memory b-s.

However, I consider it nearly a done deal that they'll release it with a G92. That would explain why the 8800 GTs aren't available--every available G92 is being tossed into a GTS redux.

I also predict that the release of the G92'd GTS cards will coincide with the release of Crysis! The Crysis team has been out working the media, talking about how the new G92 cores offer the perfect deal for the midrange customer to enjoy next-gen graphics. November 15 does seem to be a likely date. I am prepared to pounce with my Step-Up!

I predict that GTX owners will not be Stepping-Down, and not solely out of pride. I really doubt that the new GTS will hands-down beat the GTX or Ultra. But it might nip at the GTX's heels, and overclockers will probably topple the kings.
 
Considering the 8800 GT already nips at the GTX's heels, the GTS will naturally have to outdo that. It wouldn't do for the new GTS to be beaten by the GT, nor would it do for them to be comparable. The GTS has to be the stronger card, thus it will meet or beat the GTX.

I also can't see a new GTS not using the G92 core, or better. Rehashing "old" tech is always a bad move, and rarely, if ever, done.
 
No, they shouldn't have just named them differently; they should've built the things right in the first place... You tell me why, over the course of maybe a year, they couldn't have devoted more effort to developing the G92 technology earlier.... Ugh, this is aggravating... I might swap for an 8800 GT and get another for SLI...

You have no clue how tech works, I guess... :P
 
http://www.dailytech.com/article.aspx?newsid=9474
NVIDIA's GeForce 8800 GT might be remembered as one of the most successful NVIDIA graphics cards of our time, at least according to the flurry of reviews this week. Virtually every top tier e-tailer managed to sell out of the card in less than two days.

Yet NVIDIA isn't done yet. A G92-derivative will appear later this year with even more shader units. According to company guidance, the new G92 will launch in early December and feature 128 shader units as opposed to the 112 featured on GeForce 8800 GT.

This would mean the additional 16 shader units exist on all GeForce 8800 GT cards, but are disabled for yield or marketing purposes. In addition to the extra shaders, the new G92 will also feature higher core frequencies and support for up to 1GB of GDDR3.

The new 65nm G92 has a tentative SKU designation of GeForce 8800 GTS. This might sound confusing as NVIDIA already sports a GeForce 8800 GTS card based on the 90nm G80 silicon. However, since G92 sports a 256-bit memory interface, the new 8800 GTS cards will feature traditional memory blocks of 512MB or 1024MB. The older, G80-based GeForce 8800 GTS features 320-bit memory blocks of 320MB or 640MB.

As the new GeForce 8800 GT generally outpaces the existing GeForce 8800 GTS, the new GeForce 8800 GTS will likely surpass NVIDIA's high-end GeForce 8800 GTX and potentially GeForce 8800 Ultra.

Maybe there might be a G92 GTX and Ultra with even more shader units?
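
Side note on the bus-width point, since it explains the odd memory sizes: if you assume one 32-bit GDDR3 chip per channel (my assumption about the board layout, not anything NVIDIA has confirmed), a 320-bit bus naturally gives 320MB/640MB cards and a 256-bit bus gives 512MB/1GB cards. A quick back-of-the-envelope sketch in Python; the 8-clusters-of-16-shaders split is likewise my guess from the 128/112 figures above:

# Why bus width pins down the memory sizes quoted in the article.
# Assumes one 32-bit GDDR3 chip per channel -- an assumption, not
# anything confirmed by NVIDIA.
def memory_configs(bus_width_bits, chip_sizes_mb=(32, 64, 128)):
    """Total card memory for each per-chip density, given bus width."""
    chips = bus_width_bits // 32          # one 32-bit chip per channel
    return {size: chips * size for size in chip_sizes_mb}

print(memory_configs(256))  # {32: 256, 64: 512, 128: 1024} -> 512MB / 1GB cards
print(memory_configs(320))  # {32: 320, 64: 640, 128: 1280} -> 320MB / 640MB cards

# The shader counts line up the same way if G92 really is 8 clusters
# of 16 shaders (again, my inference from the article's numbers):
clusters, per_cluster = 8, 16
print(clusters * per_cluster)        # 128 -- full G92, the rumored new GTS
print((clusters - 1) * per_cluster)  # 112 -- 8800 GT, one cluster disabled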
 
Point taken, Hermes. That's why I figured the new GTS would have to be a G92. And why it seems likely that they'll be releasing updated GTX/Ultra units with G92 cores. Although frankly, rehashing old tech has happened before--any time the company really doesn't care about the customer. But it is unlikely in this case.

I do agree, however, that they made a mistake in rushing the 8-series to market. I believe nVidia desperately wanted to dominate the video card market. For the most part, they succeeded--almost everyone uses nVidia cards now, with only ATI fans, those who don't need gaming performance, and those who don't upgrade their computers often running ATI cards. I doubt they seriously thought they could knock ATI out, however. It will be rather interesting to see what happens when ATI's next-gen cards are released.

However, software developers are partly to blame as well. They were designing games for technology that didn't really exist yet, and for technology they knew the average person couldn't afford. Just look at DX10--it's a real stretch for even Ultras to run it smoothly at high framerates, but games still support it. At the same time, however, what should we expect them to do? Not improve their graphics quality?

This is why I like Valve's software philosophy: scalability and high framerates. Good performance and quality for everybody instead of superb quality for a few. HL2: Episode 2 was a real visual feast. I played it at max settings with an 8600 GTS 256MB, at 1680x1050 resolution. I got roughly 45-50 fps (not an average or max or anything, just the number I saw all the time, even when lots of stuff was happening). I just got Call of Duty 4, and I can't get anywhere near that resolution with anything even close to that level of detail. The Bioshock demo sucked on my rig.

And if you bought a GTX...look at it this way. They *made* you spend that kind of money to get what you wanted.

And if you built a new rig six months ago, installed Vista, and popped in an SLI'd pair of Ultras (who needs two kidneys? system runs fine with one), and then promptly announced to the world in your sig that you were "ready for Crysis"...why did you upgrade for a game that wasn't going to be out for another half a year? Thanks for making tech so expensive for the rest of us, ya fool.
 