PS4 or Xbox One?

  • PS4: 25 votes (64.1%)
  • Xbox One: 14 votes (35.9%)

  Total voters: 39

Geoff

VIP Member
Lol omg.

Ok...

Well add the compulsory online fee, the initial capital outlay, the inability to upgrade and you have a lemon.

The DirectX thing is just silly, as it simply proves the point that the Xbox (which will be using DirectX) has the same limitations...
True, at $5 a month. Add in that $60/year and the price is still a LOT less than buying a $2k+ gaming machine every few years.

Why do you need to upgrade? Newer games do get slightly better graphics, but there is no need to upgrade a console. How is the DirectX thing "silly"? A new DirectX version comes out every couple of years, so eventually new games won't run on older versions of DX.
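The running-cost argument traded back and forth above can be sketched with the thread's own figures: a $400 console plus a $60/year online fee, versus a "$2k+ gaming machine every few years". The lifespans below (7 years for a console, 4 for a PC) are assumptions pulled from the posters' claims, not market data:

```python
# Amortized yearly cost of ownership, using only figures quoted in
# this thread (lifespans are the posters' claims, not market data).

def yearly_cost(upfront, lifespan_years, fee_per_year=0):
    """Purchase price spread over the machine's lifespan, plus yearly fees."""
    return upfront / lifespan_years + fee_per_year

console = yearly_cost(upfront=400, lifespan_years=7, fee_per_year=60)
gaming_pc = yearly_cost(upfront=2000, lifespan_years=4)

print(f"console:   ${console:.0f}/year")    # console:   $117/year
print(f"gaming PC: ${gaming_pc:.0f}/year")  # gaming PC: $500/year
```

Even with the subscription folded in, the console comes out far cheaper per year under these assumptions; the gap narrows if the PC also replaces a general-purpose machine you would have bought anyway.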
 

Geoff

VIP Member
Oh and dude, you need to know how visitor messages work :p

[Image: i-t29jfnD.png]
 

Okedokey

Well-Known Member
True, at $5 a month. Add in that $60/year and the price is still a LOT less than buying a $2k+ gaming machine every few years.

Why do you need to upgrade? Newer games do get slightly better graphics, but there is no need to upgrade a console. How is the DirectX thing "silly"? A new DirectX version comes out every couple of years, so eventually new games won't run on older versions of DX.

I don't even know where to start with this....

BTW... I just sold my 580s, motherboard, CPU, etc. and got $2k back. So all in all I spent the 500 bucks you want to spend on a PS4 or whatever. Sigh...

So my net spend was about the same as a PS4. Which machine do you think will be faster at BF4 at 5760x1080? Oh wait, the console can't do that...

I'll just leave this here: https://robertsspaceindustries.com/about-the-game
 

Troncoso

VIP Member
Okay. So you spent around $2,000 originally, and you spent $500 to upgrade your system, not to mention the one you sold wasn't even 5 years old. Whereas my PS3 lasted 7 years before I bought a PS4. And you have to wonder what engineer would be stupid enough to build a console with the capability of running at 5760x1080 when no TV can even support that resolution. They run the same architecture, but game consoles and gaming PCs are fundamentally different in their overall purpose and intended use.

You won't build a PC with better performance than a console at $400-$500. The $500 you spent was an upgrade, not a build from scratch. I could have sold my PS3 for around $200, and then my PS4 would only have been $200 net. See how that works?
 

SpringWater

Member
I prefer the PS4 due to its lower price, and I had bad experiences with the 360, so my choice is purely personal preference and has nothing to do with the functionality of the consoles.
 

Geoff

VIP Member
Okay. So you spent around $2,000 originally, and you spent $500 to upgrade your system, not to mention the one you sold wasn't even 5 years old. Whereas my PS3 lasted 7 years before I bought a PS4. And you have to wonder what engineer would be stupid enough to build a console with the capability of running at 5760x1080 when no TV can even support that resolution. They run the same architecture, but game consoles and gaming PCs are fundamentally different in their overall purpose and intended use.

You won't build a PC with better performance than a console at $400-$500. The $500 you spent was an upgrade, not a build from scratch. I could have sold my PS3 for around $200, and then my PS4 would only have been $200 net. See how that works?
Exactly.

Look man, I love computers too; I spent close to $2,000 on my rig earlier this year with an IPS 27" display. Do games look better on it than on a console? Of course they do. However, there is a LOT more to gameplay than just graphics. Consoles have a definite edge when it comes to gameplay, such as:

Party chat. Sure, you could Skype with your friends, but that means everyone has to have the same program, plus a lot of PC gamers don't have/use mics, which makes it a lot less fun and personal IMO.

Number of users. This differs for people, but most of my friends aren't computer guys/gals, so if they game, 99% of them are Xbox/PS gamers. If you meet someone new outside of computers, and if they game, there is a good chance they will have a console.

More level playing field. With consoles it's much more common to have a level playing field. The majority of console gamers have stock controllers, whereas in PC gaming you can spend upwards of hundreds of dollars on keyboards, mice, etc. that give you a gaming edge. In console gaming, just because you can't afford the best accessories doesn't mean you won't have a good experience.

Overall you need to understand that there is a definite need for consoles, and a need for computers. Just because a computer does what a console can doesn't mean a console is useless. A computer can do anything a phone can do; are phones stupid? What about tablets? Laptops? Why buy any of those if you have a computer that can do everything they can do, and better?
 

PCunicorn

Active Member
Really, you only need to spend $750, plus $500 to upgrade, to max out pretty much every game, but yes, it's still a lot more than a PS4.


Woow, yes, I can get $750 out of my account anytime... Don't worry, you'll grow up and have to pay rent, food, gas and bills. You'll see if you can afford it. Price is really important here; you've almost doubled the price of the PS4 and said anyone could afford it, just to say a PC can run this game at this res... I can't agree with you.

That's not really what I meant. The average person buys a new $500 PC every 3 years, and the average gamer buys a console every 6 years, like a PS4 for $400. Now let's say you need to buy a new PC this year, the last one you built was in 2010, and now the new consoles are out. That's $900 you could be spending on a gaming PC instead of buying two things, but you could actually spend $750 and save $150.
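The arithmetic in that reply can be sanity-checked with the post's own figures (a $500 general-purpose PC, a $400 PS4, and a $750 gaming PC covering both roles):

```python
# Buying two devices vs. one gaming PC that does both jobs,
# using the figures quoted in the post above.

basic_pc, console, gaming_pc = 500, 400, 750

separate_total = basic_pc + console      # PC plus PS4: $900
combined_total = gaming_pc               # one machine:  $750
print(separate_total - combined_total)   # savings: 150
```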
 

Geoff

VIP Member
Really, you only need to spend $750, plus $500 to upgrade, to max out pretty much every game, but yes, it's still a lot more than a PS4.
How long will that upgrade last you? 2 years? Maybe 3. Again, consoles are around for close to 10 years before you have to buy a new one.
 

PCunicorn

Active Member
Nope, a good 5 years on low, I guarantee you, with a $750 system and $500 on upgrades. That contains a 280X and an 8320. The GPU will last you 3 or 4 years, and the CPU 5 or 6. That $500 will cover failed parts and a new GPU when the time comes, in 3 years. And saying consoles last more than 10 years is a bit unfair. Consoles really only last about 6 years, with some support for a couple more. If I had built a PC with an 8800GTX and a Q6600 when the PS3 was released 6 years ago, I could still play some games for the next 4 years, just very unintensive ones. As I said before, if you're on a budget and have just bought a new PC, go with a console. Otherwise, I say PC all the way. Now I am sure other people's opinions are different, but that's mine.
 

G80FTW

Active Member
I don't care how many times you beat it through. I half beat it and then got my PC. If you really want to start comparing it to Crysis, we can. The Last of Us looks a little better than Crysis 1. Crysis 2 looks much better. And Crysis 3 absolutely destroys TLoU in terms of graphics quality. All of these are playable maxed out on a $750 machine. $350 more than a PS3, yes, but most people can afford it.

Not sure how that makes any sense, because in MANY areas Crysis 2 looked worse than Crysis 1, mostly because they chose to use lower-resolution textures than they did in Crysis 1 when they decided to make it for consoles.
 

G80FTW

Active Member
The PS3 was not out in 2005 either ;) Just like the GeForce 8 series, it was out in 2006/2007, depending on where you are in the world. You also seem to be mistaking a 7800 for the top card in the 7 series, forgetting completely about the 7950.

Nobody said that you said it was better than Crysis either, nor did anyone say that it looked bad for a 7-year-old console (not 8). The fact of the matter is that there is a huge difference between the shot you posted and the in-game footage. The hair is less detailed, the clothes less detailed, the faces less detailed, there are more jaggies; I don't know how you cannot see that.

A 7 year old system (Q6600, 8800GT, 4GB DDR2) would play games like that no problem at 30 FPS and more, just go and look at Crysis 1 as an example, an 8800GT, or if you want to be more pedantic about release dates, an 8800GTS, would run Crysis no problem at medium-high settings, which is as good or better than the graphics in The Last of Us.

You are not the only person on this planet that has played any of these games, so you can carry on repeating over and over that you have played them; it gives you no more authority on the matter than anyone else, because guess what, I have too, and so have a lot of others. And even if they had not, videos are readily available for people to see for themselves. When everyone else that has played the game says it is not all that impressive graphically and you are the only one saying it is, does that not make you question whether you are right or wrong?

The problem that we have is lazy developers. We have so much more grunt now on PCs compared to the relatively ancient hardware in the previous-gen consoles that developers do not need to optimize code when porting over; they can keep it sloppy and inefficient and we can still max it out. If they put any modicum of time and effort into porting it over, people could very easily sit on their systems from 7 years ago and wouldn't have to upgrade as often.


With regards to price, PC gaming is cheaper, just not with the initial investment. You start off out of pocket but once you factor in the price difference in games and the fact that you are offsetting some of that cost by only needing one PC to game and use as a PC, instead of two, a PC for a PC and a console for games, the prices start to swing in favour of computers rather than consoles.

As Punk said though, people cannot just fork out nearly double the price as and when; you have to plan large investments like that unless you are fortunate enough to be rolling in it, so that large initial sum could very easily be a deal breaker.

Really? You're going to bring in the 7950? A dual-GPU card? I excluded dual-GPU cards for one very simple reason: they have 2 GPUs, not one. Might as well run two 7800s in SLI. For a single-GPU card, the 7800 was at the top.

No one could sit on their system for 7 or 8 years. I'd imagine you were not building computers 7 or 8 years ago, but we have had some new DirectX versions released since then, as Geoff pointed out, that would leave an 8-year-old PC almost useless today for gaming unless you enjoy playing games that look worse than today's console games.

We can certainly leave The Last of Us as a subjective matter (I know people who also have the game and have seen it in person, and they agree with me; however, it really is a matter of opinion when it all comes down to it, without any numbers supporting its true graphical merits). However, you can't honestly advise people that a 7800, 7950, or even an 8800GTX Ultra (which can be found for $50 or so on eBay) paired with something like a C2D is really going to be a decent gaming machine capable of playing games that look just as good as Halo 4 or The Last of Us. I have an 8800; I know what the card is capable of. And the 7800/7950 is much, much less powerful and only supports DX9, so it's completely out of the question as a viable gaming card today.

Let me put it to you this way:
2 years ago my current PC was nearly top of the line. My graphics card was when I bought it. My graphics card is already showing its age, running at JUST over 30 FPS average in Crysis 3 and Tomb Raider on max settings at 1920x1080, so I would imagine that the next generation of consoles will bring even more intense graphics to PC, and maybe even better optimization for PC; however, I don't expect this graphics card to last more than 2 more years. 5 years is pushing it for a gaming PC.
 

Aastii

VIP Member
Really? You're going to bring in the 7950? A dual-GPU card? I excluded dual-GPU cards for one very simple reason: they have 2 GPUs, not one. Might as well run two 7800s in SLI. For a single-GPU card, the 7800 was at the top.

No one could sit on their system for 7 or 8 years. I'd imagine you were not building computers 7 or 8 years ago, but we have had some new DirectX versions released since then, as Geoff pointed out, that would leave an 8-year-old PC almost useless today for gaming unless you enjoy playing games that look worse than today's console games.

We can certainly leave The Last of Us as a subjective matter (I know people who also have the game and have seen it in person, and they agree with me; however, it really is a matter of opinion when it all comes down to it, without any numbers supporting its true graphical merits). However, you can't honestly advise people that a 7800, 7950, or even an 8800GTX Ultra (which can be found for $50 or so on eBay) paired with something like a C2D is really going to be a decent gaming machine capable of playing games that look just as good as Halo 4 or The Last of Us. I have an 8800; I know what the card is capable of. And the 7800/7950 is much, much less powerful and only supports DX9, so it's completely out of the question as a viable gaming card today.

Let me put it to you this way:
2 years ago my current PC was nearly top of the line. My graphics card was when I bought it. My graphics card is already showing its age, running at JUST over 30 FPS average in Crysis 3 and Tomb Raider on max settings at 1920x1080, so I would imagine that the next generation of consoles will bring even more intense graphics to PC, and maybe even better optimization for PC; however, I don't expect this graphics card to last more than 2 more years. 5 years is pushing it for a gaming PC.

Except that the 7950 was a single GPU ;). And I did not mention the 7950 to sit in our hypothetical system, because that was last-gen hardware when the PS3 came out; I was on about an 8800GTS/GT, which is more than viable.

Crysis and Tomb Raider on max settings are not the same as what a console has; almost no game is. Just like in a lot of games, graphics features are left out of the console versions, like higher-res textures (in some cases) and AA/AF.

I have a second system sat in the other room behind me with my old AMD 720BE and a 9800GT, which is just an 8800GT renamed. Does it perform as well as my 3570K and 7970? No, but it can still play 90% of the games out now, albeit not at max settings.

You do not need to push to 1920x1080 either; there are no 360 titles running at that res, and relatively few PS3 games run at it either. By using 1920x1080 as the benchmark here, you are saying "my hardware, when having to push more than double the pixels of the consoles in question, only just performs as well as the consoles do". Drop your res down to 1280x720, your settings to medium-high, turn off AA, and watch your FPS hit the roof, way higher than 30, because those are the sort of graphics settings you would be using on a console.
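The "more than double the pixels" point above is straightforward to check; a 1080p frame carries 2.25x the pixels of a 720p frame:

```python
# Pixel counts behind the resolution argument above.

def pixels(width, height):
    return width * height

ratio = pixels(1920, 1080) / pixels(1280, 720)
print(ratio)  # 2.25
```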


Just to make clear at this point, I do not believe the 8 series will be any use in 6-12 months' time unless your biggest buy over the next few years is something like The Sims 4, because with the new generation of consoles, demands on PC hardware will shoot up as the ports start coming from much newer hardware. However, with the hardware in the consoles being so close to that of a PC, maybe we will see far better code being produced, so the demands will not actually rise as high as would otherwise be expected... Who knows at this point how these consoles will affect PC gaming in a year's time. However, someone with a 290X or 780Ti in 6 or 7 years' time will still be chugging along, though, just like the 8 series now, probably on low-medium settings.
 

G80FTW

Active Member
Just to make clear at this point, I do not believe the 8 series will be any use in 6-12 months' time unless your biggest buy over the next few years is something like The Sims 4, because with the new generation of consoles, demands on PC hardware will shoot up as the ports start coming from much newer hardware. However, with the hardware in the consoles being so close to that of a PC, maybe we will see far better code being produced, so the demands will not actually rise as high as would otherwise be expected... Who knows at this point how these consoles will affect PC gaming in a year's time. However, someone with a 290X or 780Ti in 6 or 7 years' time will still be chugging along, though, just like the 8 series now, probably on low-medium settings.

This is all you had to say, as it's what I was trying to say... I don't think the 8 series is of much use now. It can only play the new games at low-medium settings at lower resolutions, missing out on DX11 features, which is a big reason I would say the 8 series isn't really a viable PC gaming card today. Yes, you can game with it, and yes, it's ultra cheap; however, for a current budget system today, something closer to a 660 Ti will blow any 8 series out of the water.

And I don't remember the 7950 being a single-GPU card; I only remember the GX2. But even a single-GPU 7950 probably wasn't significantly faster than the 7800GTX, due to the old-style pipeline architecture of the GPU itself. In today's games they would probably perform very similarly... which would be barely at all for a lot of games.

I'm not comparing how current PCs perform versus consoles; I am simply pointing out that if you want to keep playing your games with them looking good, you can't use the same hardware for 8 years like a console can. The console will always have a software advantage because the system is specifically designed for gaming; a PC is not. No matter what hardware you have in your PC, the operating system is not designed specifically for gaming.

But you know, it's personal preference. If you and Okey over there would like to believe that 8-year-old computers can play today's games just fine and look just as good as the console version, then be my guest. I will stick to the real-life scenario.

According to this, which was posted in 2008, the 7800GTX couldn't really handle the games back then, and it was only 3 years old!
http://hardforum.com/showthread.php?t=1309512
 

PCunicorn

Active Member
Not sure how that makes any sense, because in MANY areas Crysis 2 looked worse than Crysis 1, mostly because they chose to use lower-resolution textures than they did in Crysis 1 when they decided to make it for consoles.

[Image: Crysis_3.jpg]


Yeah, sure. Maybe some lower-res textures were used to fit on CONSOLES, the systems you have been arguing for, but everything else destroyed C1. And I know for a fact Crysis 2 whoops TLoU's little ass in graphics, because I have played them both.
 

G80FTW

Active Member
Yeah, sure. Maybe some lower-res textures were used to fit on CONSOLES, the systems you have been arguing for, but everything else destroyed C1. And I know for a fact Crysis 2 whoops TLoU's little ass in graphics, because I have played them both.

Some? Crysis 2 was LITTERED with low-resolution textures. Do you not see the screens you just posted? Even with the high-resolution texture pack, the buildings still looked similar to textures you would find in Duke Nukem from 1995. It was horrible. The road is the only thing that looks decent in those screens, and it doesn't even look that good; GTA5 has better road textures, and it's running on a game engine developed in like 2004.

And textures make up a good 80% of the game, considering everything aside from the people in the game is a texture.

They remind me of RAGE, which possibly had the lowest-resolution textures I had ever seen.
 

Darren

Moderator
Staff member
Some? Crysis 2 was LITTERED with low-resolution textures. Do you not see the screens you just posted? Even with the high-resolution texture pack, the buildings still looked similar to textures you would find in Duke Nukem from 1995. It was horrible. The road is the only thing that looks decent in those screens, and it doesn't even look that good; GTA5 has better road textures, and it's running on a game engine developed in like 2004.

And textures make up a good 80% of the game, considering everything aside from the people in the game is a texture.

They remind me of RAGE, which possibly had the lowest-resolution textures I had ever seen.

Not that I don't believe you, but GTA5 runs on an engine from 2004? Where'd you hear that?
 

G80FTW

Active Member
Not that I don't believe you, but GTA5 runs on an engine from 2004? Where'd you hear that?

Haha. Well, it was an exaggeration. But it's from 2006, and it's the same engine used in GTA4 and Red Dead Redemption. Either way, the RAGE engine is old, and Rockstar really need to scrap it, because it's really not that great of an engine, and I feel like they hit their wall with it with RDR.

And actually, I was hoping Rockstar would have developed a whole new engine for GTA5. But sadly, they didn't. So we may not see a new engine from Rockstar until a few years into the next-gen consoles :(
 

Aastii

VIP Member
OK, to clarify some things, because you are not saying what I am at all:

At this moment in time, anybody with an Nvidia 8800GTX is sat on a more than viable graphics card. However, in a few months it will not be.

Nobody notices the difference between a DX10 and DX11 game unless they look hard... very hard, so that is a moot point. Whilst you can get some performance gains going from 10 to 11, mainly because that is what the architecture is designed for, the extra features aren't something to buy an entire new system for, at least not at release, as will be the case with DX12.

You need to forget about the 7950 now. That is not the card that was out when both last-gen consoles hit the market; that would be the 8 series, with the GT and GTX coming out a few days before the PS3 launch. The performance of these was, at the time, mind-blowing. You had cards more than doubling the performance of the previous gen, and they were so good that they carried on for not one but two generations under other names (8800, 9800, GTS250).

A friend of mine was still using a 250 up until last year. The only reason he swapped was Arma 2, which is not a console title. He was still playing every game without a hitch except for that (which he could play on low-medium @ 1080, but he wanted more, so we built him a new system). Interestingly, he had that paired with dual single-core Xeons, so chips performing worse than the CPUs of the consoles' era: a C2D E6750 or C2Q Q6600.


With regards to the Crysis thing, you are completely and totally 100% correct. A lot was taken out of Crysis 2 graphically and with the physics compared with Crysis 1 so that the old hardware could handle it without blowing up.

http://www.gamespot.com/articles/crysis-2-graphics-comparison/1100-6307286/

However, if anything this proves what I have been saying all along. An 8800GTX, released at the time of the previous consoles, could push 35-40 FPS at 720p resolutions, so higher than the consoles manage at the same resolutions, on a game that is more graphically demanding than the console equivalent.

The idea that consoles last longer than PCs is plainly and simply wrong; it is only the case if you demand the highest settings with full AA/AF at higher resolutions and higher FPS. Yes, consoles have software designed specifically for the hardware, so they will perform better on that hardware than the PC equivalent, but we had already surpassed console performance with current PC hardware a couple of years back, before the One/PS4 were even close to release, so the performance hit seen in PC code is offset by the latest hardware at release because it is that much more powerful already.
 

G80FTW

Active Member
My original statement:
I have to point out The Last of Us. Pretty much the best-looking game for the previous-gen consoles, and its quality was near that of Crysis 3 on PC. And yet 8-year-old hardware was playing it just fine. It shows that hardware is really only as good as the software made for it. And that's where consoles have always had an advantage over PCs. But now that consoles are becoming more like a PC in terms of design, it will make it all that much better for PC gamers and console gamers.

The argument (if you can call it one):
Yeah, that's just nonsense.

And what you're saying now:
Yes, consoles have software designed specifically for the hardware so will perform better on that hardware than the PC equivalent

Which is all I was trying to say. And the reason the 7 series was brought in was that it was the top series upon the release of the Xbox 360, which started that console generation, and the PS3 also uses a 7-series GPU. Comparing them to newer and much more amazing GPU architectures, I felt, wasn't exactly a PC equivalent to the PS3 or the 360.

And yes, the original G80 core architecture is actually still in use. That is why my name is G80FTW :) Upon the release of the G80 we saw numbers we hadn't seen in a long time and a good leap forward in graphics.

And I should also point out that I had an 8800GTS shortly after its release. It was a great card. I held onto that card for as long as I could, until finally deciding it was time to upgrade in 2011, but waiting until 2012 for the GTX680. So my 8800GTS, as good as it was, only lasted me 4 years before I really could not bear to game with it anymore. I was not getting better-than-console graphics at the performance of a console; I was just getting by with much lower resolutions and no AA (which consoles do use), leaving all the games with good-looking textures but very jaggy and blocky everything else. So like I said, 5 years is really pushing it for a gaming PC. If you want to push it to 7 or 8 years, that's fine, but you won't be seeing console-equal graphics.
 

PCunicorn

Active Member
[Image: image_303307_620.jpg]

Random screenshot. Most of the Google Images results for "the last of us screenshots" turned up art or something else, not honest-to-God in-game screenshots.

[Image: SfuHFFx.jpg]

Here is one I took from a YouTube video. 1080p, mind you.


[Image: Crysis 2 screenshot]

A screenshot I took in C2. The textures may not look the best, but they look way better than in The Last of Us.

[Image: Crysis 2 screenshot, close up]

Close-up textures.

High-res textures were enabled, and the setting was Very High, which in Crysis 2 is actually Medium.
 