When do you think SSDs will completely replace HDDs?

The title says it all. SSDs are getting cheaper and cheaper. Using a system with an HDD is absolutely intolerable (I have to do it at work and it SUCKS). I think SSDs will replace HDDs in 3-5 years, at least for home computing, and maybe 5-10 in industry (servers, archival systems, backups, etc.). What do you guys think?
 

strollin

Well-Known Member
As long as large-capacity spinner drives continue to offer more storage per dollar, I don't think they will ever be completely replaced by SSDs.
 
Sure. But what about in consumer devices? Do you think we'll ever see base-model computers coming with SSDs as standard and forgoing HDDs? I feel like we will at some point. HDDs are essentially obsolete.
 

beers

Moderator
Staff member
Depends on what you're trying to do. You can already buy all-flash SANs, desktops, laptops, etc. Most ultrabooks ship with M.2 or equivalent.

I think after this NAND price inflation dies out you could probably see small OS drives make it into low-end PCs, but as far as industry goes, it's already a lot more prevalent than you might think.
 

Darren

Moderator
Staff member
I'm starting to see SSDs in refurb machines at work that are a couple of years old, let alone new stuff. They're pretty common.
 

voyagerfan99

Master of Turning Things Off and Back On Again
Staff member
I work for an MSP. The new Latitudes we sell have SSDs, but we still get standard HDDs in the desktops (don't know why we do that - probably to keep cost down).
 

Darren

Moderator
Staff member
I work for an MSP. The new Latitudes we sell have SSDs, but we still get standard HDDs in the desktops (don't know why we do that - probably to keep cost down).
Desktops are probably restarted way less frequently and leave programs open longer, so an SSD makes less of a noticeable difference there than it does on a lappy.
 
Cost certainly is a big factor and always will be. Companies will almost always try to save money wherever and whenever they can.

https://www.computerworld.com/artic...and-hard-drive-prices-are-nearing-parity.html

According to this article, SSD prices per gig are roughly 3x those of HDDs. Pretty remarkable considering that in 2012 they were about 16x the price per gig. I definitely see a lot of SSD penetration in the computer industry. Almost all laptops come with at least a boot SSD these days, so that's good. Something else I thought about was Optane. Optane performance is quite impressive and more or less makes a system with an HDD run like it has an SSD. Perhaps that's the route the industry will take: throw in an Optane module and pair it with an HDD.
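Just to show the back-of-the-envelope math behind that multiple, here's a rough sketch; the drive prices below are hypothetical placeholders picked to land near the article's ~3x figure, not numbers from the article or real listings:

```python
# Rough back-of-the-envelope math for the SSD-vs-HDD price-per-GB multiple.
# The drive prices below are hypothetical placeholders, NOT figures from the article.

def price_per_gb(price_usd: float, capacity_gb: float) -> float:
    """Dollars per gigabyte for a single drive."""
    return price_usd / capacity_gb

# Hypothetical example drives
ssd_per_gb = price_per_gb(price_usd=100.0, capacity_gb=500)    # ~$0.20/GB
hdd_per_gb = price_per_gb(price_usd=130.0, capacity_gb=2000)   # ~$0.065/GB

multiple = ssd_per_gb / hdd_per_gb
print(f"SSD: ${ssd_per_gb:.3f}/GB, HDD: ${hdd_per_gb:.3f}/GB, "
      f"SSD costs about {multiple:.1f}x more per GB")
```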
 

Darren

Moderator
Staff member
Well, I imagine Optane will drop in price much like SSDs have. It's an alternative. Optane has the potential to be a big innovation.
Yeah, but if OEMs won't pay for it because Intel prices it too high, then it won't do them much good.

Also, SSDs have advantages apart from just speed. I got a noticeable battery life increase going from a 750GB HDD to a 120GB SSD since my computer didn't have to physically spin the drive. The battery life on my 3-year-old laptop is still about 5ish hours, whereas when it was new it was about 5 and dropped closer to 4 over time, until I switched the SSD into it.

Maybe even more importantly, SSDs don't really take damage from shock like mechanical HDDs do. I work in a repair shop and laptop HDDs fail a LOT because every little bump and tap wears down the drive. SSDs don't have that problem.
 

beers

Moderator
Staff member
I'd be interested in the straight Optane SSDs that are U.2, but being a new technology they're still a little on the prohibitively expensive side.
 
Yeah, SSD durability and shock resistance are fantastic. I'm also interested in straight Optane SSDs. The technology has to mature, and supporting protocols, drivers, etc. have to catch up, but it holds a lot of promise. An order of magnitude lower latency and an order of magnitude greater speeds (eventually, at least)? Sign me up.
 

Darren

Moderator
Staff member
We do get to the point, though, of asking where all of those theoretical performance gains are going to be noticed in the real world. CPUs have seen this happen a lot in the past 5ish years, as even day-to-day usage for a lot of users can still be handled by older dual cores like earlier i3s. Obviously on the enthusiast and performance end there is always a drive for innovation, but the stagnation of hardware advances these days is at least in part due to those gains not being applicable to a lot of use cases.

Even as a performance nut, I don't see myself dropping money on an M.2 drive for the faster speeds over my conventional SATA SSD, since it's fast as hell as is.
 
100 percent. The tech industry, in many respects, is stagnant. The cell phone market has almost no innovation (yeah, OLEDs, Face ID, machine learning, neural networks, under-screen fingerprint sensors), but by and large each year's flagship is basically the same thing it's been for years. As for computers, CPUs haven't advanced much, mobos are basically the same, and RAM is the same. GPUs continue to advance, but AMD's inability to compete has stunted that rate of advance.

The industry needs a paradigm shift, and I think it comes in two forms: graphene and quantum computing.

Graphene is a few years out. I invest heavily in graphene stocks, knowing that it will supplant silicon once we're unable to shrink die sizes any further (depends on how long Intel and AMD take to get down to 5nm or whatever the limit is). Quantum computing is gaining traction, with Intel's 49-qubit chip. It will take longer to develop, but once it does it will be like the invention of the computer all over again.

Anyway, I digress. I'm a performance nut as well, but I only upgrade if my computer isn't handling my workload (gaming). I ran with an i7 920 for 10 years (literally until it was starting to fail). Optane's performance may not provide a tangible difference in most use cases, and it probably won't. It's mainly targeted at enterprise anyway.

It's all about software, not hardware, these days.
 

Darren

Moderator
Staff member
@jarlmaster475 Watch it, pal. I've had AMD GPUs for 6 years. :p I bought my first Nvidia GPU in a decade this past September. They'd been competing decently (and even "winning" some generations) up until Vega vs. Pascal. Hence my 1080.
 
@Darren Lol, I used to roll with AMD GPUs back in the 4xxx and 5xxx days. Then I switched to Nvidia and never looked back. Objectively speaking, Nvidia has a much greater market share (70 to 30 percent). AMD is always playing catch-up, and Nvidia is sandbagging because their flagships are never really threatened. AMD releases a line of cards 6 months after Nvidia that trades blows with those Nvidia cards; meanwhile, Nvidia has something else cooked up that gets released unopposed. Sure, AMD has some wins here and there across the price gamut (performance per dollar), but for the most part Nvidia's cards outperform AMD's direct competitors. AMD had something with the RX 480, but Vega is an absolute disaster. It really is too bad, because competition is always good for the consumer. Nvidia is content to charge high prices, knowing that consumers will buy without batting an eyelash. AMD's niche is APUs and, if the trend continues with Ryzen's follow-ups, the budget-conscious CPU market. AMD needs a Ryzen-type success in the GPU market.
 