Forums

Topic: PS5/Series X is more powerful than an RTX 2080 Ti. Proof and evidence are all here!

Posts 1 to 20 of 85

heyoo

[Edited by LiamCroft]


nessisonett

@heyoo You don’t have any information about your processor or RAM. You could stick a graphics card in a toaster and it wouldn’t make your toast 60FPS.

Plumbing’s just Lego innit. Water Lego.

Trans rights are human rights.

heyoo

@nessisonett

About the processor and RAM? They're using the most powerful PCs and RAM. It's all from expert PC tech site benchmarks and YouTube videos with the most high-end rigs. They're mostly using an i9 9k.

Consoles don't use system RAM like a PC does; the whole console is a dedicated gaming machine. Remember, the PS4 has only 8GB of shared memory and it can easily play games at native 1080p 60fps, like Mortal Kombat 11, looking A+ beautiful. Most demanding modern games now require 16GB of system RAM to run properly. Consoles use very little system RAM, sometimes none at all.

The PS3 had just 512MB of memory and it played games like Crysis 3, MGS V, Destiny, Thief, Black Flag, etc. with no problem at all, all looking A+ beautiful in their time, while on PC those same games required something like 2GB of system RAM minimum and 2GB of VRAM. Console magic.

Also, an 8-core CPU has proven more efficient and better than quad- and dual-core CPUs in modern demanding games since 2017. The PS4 was the blueprint for using multiple cores in modern games. The PS4's slower 8 cores beat the faster dual- and quad-core CPUs of 2016 and earlier. PCs now use multiple cores too for better performance, with 8 and 16 cores running games easily. The old i3s and i5s are inferior to the PS4's 8 cores in modern games like Doom Eternal. The PS4 runs Doom Eternal at 45-60fps, mostly in the high 50s, and its 8-core CPU did not prevent it from hitting that performance, just like in CoD 2019 and Warzone, all looking A+ beautiful. The PS4's 8-core CPU has no problem running Red Dead 2 at native 1080p and a smooth 30fps, but pairing an old quad-core i3 or i5 with a 1050 Ti results in performance below 30fps at the same graphics settings as the PS4.

The Series X and PS5 will have a superior, next-level 8-core version of what the PS4 had, so they will have no problem on that end. And here's more evidence: if a GPU and CPU proven weaker than the PS5/Series X (below a Zen 2 CPU) can max out a game, then these consoles will easily do it too.

Also, it's the other way around: you can stick the most powerful CPU on a mere 2GB VRAM card like a GTX 750 Ti and it won't help at all in modern demanding games since 2017. It will still play with ugly graphics and inferior performance. Pair an i9 9k with a 750 Ti and Red Dead 2 will still be completely unplayable, even with everything on ugly low and off settings, while the PS4 and its old 8-core CPU plays it at an easy 30fps looking A+ beautiful.

[Edited by heyoo]


nessisonett

@heyoo ...you’re only comparing the graphics card with a PS5/Series X without using other equivalent hardware such as RAM and CPU. Your ‘research’ is entirely unscientific. Any experiment needs clearly labelled conditions and consistency, along with clear results instead of ‘these graphics are ugly’ or ‘less frames’ especially considering that neither console is even out yet. Share your entire system specs.


heyoo

@nessisonett You can search all over the internet for how these high-end GPUs perform in these games. Low and off settings are ugly. Search for how a 750 Ti plays Mortal Kombat 11 or Doom Eternal on ugly low and off graphics versus the PS4's graphics, which look closer to PC ultra than to anything below it.

It's all about the VRAM for playing at higher settings. If a GPU has only 2GB of VRAM, it plays with ugly graphics. By 2017, triple-A games required at least 4GB of VRAM recommended; anything below that plays on medium or low graphics. The PS4 hasn't played on ugly graphics since day 1 because it's a 5GB VRAM machine. All-medium settings on Red Dead 2 are ugly as well, and the console doesn't have those ugly graphics. I have a GTX 1060, a PS4 and a PS4 Pro, plus real-life evidence all over YouTube; it's all legit.

[Edited by heyoo]


SirAngry

@nessisonett I don't think they're getting the point you are making about the holistic synergy between computing components. Hell, I changed a motherboard in a build for a friend a few years back because he had a coolant leak issue... that got him a fairly impressive frame bump in some games. I've tried explaining how the large pool of cache in the PS5 is like the best thing ever, especially when coupled with the custom scrubbers... but people either don't want to hear the truth, or can't comprehend it. Hell, some on here don't even know that they don't know, especially where things like clock speed have a dramatic effect on initial path calculations within RT in a BVH. The OP's comment also fails to take into consideration supply-side differences; SSDs, and code optimised for SSDs, are always more likely to maximise computational resources easily. In the case of the 2080 Ti, many games just didn't tax its cores, for a variety of reasons.


Rudy_Manchego

I mean, I'm not a tech aficionado and performance isn't everything in my mind, but it is typically true that consoles punch above their weight, specifically at launch, because they have components that are carefully designed to work together and developers can target that one device. The base PS4, a 7-year-old piece of technology, can run something as impressive as TLOU2 for that very reason.

Very high-end PC builds will typically be able to match or exceed consoles at launch in raw power, but very few people have builds with components that allow this. If power, frames per second, resolution etc. mean a lot to you, then upgrade your PC with high-end parts regularly and you'll no doubt win eventually, particularly on third-party games.

Now I may be an idiot, but there's one thing I am not sir, and that sir, is an idiot

PSN: Rudy_Manchego | X:

BAMozzy

Considering you have no idea what clock speed the 2080 Ti is running at (at stock it's not running anywhere near as fast as a Series X, let alone the clock speeds of a PS5), you can't even begin to compare. You also have NO idea what optimisations the developer has done to get a stable frame rate. The only area where Gears 5 is stated to be 'above' the Ultra setting is the particle effects; all the other enhancements are over and above the XB1 version.

Things like higher-resolution volume fog and higher-quality depth of field could still be lower than PC's Ultra settings. 'Extreme' draw distance may indicate the maximum draw-distance setting but not 'ultra' LoDs. Some 'Ultra' settings are simply a waste of resources, and the difference in visual quality is negligible, if you can see any difference at all. You really do not need the highest-quality asset when an object is so far into the distance that the pixel density of 4K can't render its fine details anyway - it's just a 'waste' - so using Ultra is pointless.

Until you can actually line up both games side by side and tweak the visual settings on the PC game to actually match the console settings EXACTLY, then you cannot state that the hardware is better than a 2080 Ti at all.

I won't bother going through EVERY game you decided to discuss, but benchmarks for GPUs are often set to Ultra just to have consistency when comparing performance - that is a big difference from an optimised console game that will use a range of settings, some like-for-like with the PC counterpart and some 'unique' to the console. If you watch Digital Foundry videos, you will see breakdowns that say 'equivalent' to PC's 'Medium' setting, or maybe somewhere between Medium and High, for example. Also, if you watch DF, you may see some 'optimised' settings for PC based on 'cost' (as in frame rate) and 'impact' (as in whether it makes a significant difference visually or not), and rarely will they use an 'Ultra' setting, because the cost does not justify the impact, which in some cases is almost imperceptible.

Obviously, something like particles will make a difference visually, but a developer will optimise a game to run on the console, while on PC the 'optimisation' is done by the PC owner because of the vast array of PC specs on the market. That's essentially what the PC settings are for: to optimise the game for the hardware you have. In every example, a developer is going to optimise a game for the console with some settings parallel to the PC version (Medium, High...) and some that are bespoke to the console version (between Medium and High).

I have no doubt that RDNA 2 will be a big jump up from GCN and even RDNA 1 (the RX 5700 XT, for example) - a lot more performance per TF - but you can't really compare a game that has been optimised/tuned specifically for one platform to a game that is built to run on a wide range of PC hardware.

A pessimist is just an optimist with experience!

Why can't life be like gaming? Why can't I restart from an earlier checkpoint??

Feel free to add me but please send a message so I know where you know me from...

PSN: TaimeDowne

phil_j

I remember back in the day when a PS2 port of a PC game had to have so much cut out of it just to run. Mafia, I'm looking at you. Fun times.

Is this the first generation where the consoles can even come close to competing with high end pcs?

Though I don't think you can compare these specific instances directly. Mark Cerny said it all when he said a PS4 teraflop is not equivalent to a PS5 teraflop.

The thing I'm most excited for is instant loading. So I don't have to read and write crappy forum posts whilst waiting for something to boot up.


nessisonett

@SirAngry There’s absolutely zero consistency or detail. For all we know, they had mining software running in the background and were wondering why Gears 5 didn’t run. It’s nonsense as well because I have a laptop with a GTX 1060 and 8GB RAM and it runs Gears 5 at an almost stable 60FPS on Ultra. Their setup has got to be throttling somewhere.


JohnnyShoulder

@phil_j I'm gonna miss your crappy forum posts!


I mean the consoles aren't even out yet, so it is probably best to wait until somewhere like Digital Foundry get their hands on the consoles and do a proper comparison, if you are into that sort of thing. Personally I'm more interested in playing the games than how it compares to a rival console or PC, but each to their own and all that jazz.

Life is more fun when you help people succeed, instead of wishing them to fail.

Better to remain silent and be thought a fool than to speak and remove all doubt.

PSN: JohnnyShoulder

SirAngry

@nessisonett Indeed, early benchmarks with the 2080 Ti were also hampered by early drivers that for some reason left performance on the table. Truth is, I reckon there is loads more we can wring out of the next-gen consoles than out of a decent PC with a 2080 Ti. Not because the GPU itself is better, but because these consoles are far better balanced and thought out. With the PS5 I literally only need to have in RAM what I might use within the next 2 seconds, and even then, for a majority of assets, if there's an emergency the scrubbers and I/O can bail me out in milliseconds in many cases. No PC will have that possibility for a very long time, and it means maximising computational resources, specifically on the PS5 but also on the XSS|X, is relatively straightforward. Most of the resources in the PS5 will be going to service the current frame; that's not really the case with PC architecture, so the narrative of raw power isn't relevant. It's ultimately about what you see on screen, and the quality of the image. But what do I know? What I do know is that I wish people could just wait and see.


Ryall

@SirAngry We can finally move beyond our consoles rebooting in the background during loading screens 😂.


SirAngry

@Ryall I really have no idea what you mean...

But essentially yes, and the news that the PS5 I/O is adopting Oodle for compression of texture and map data is mind-boggling. We really could be seeing games load in under a second very shortly.


JohnnyShoulder

@Kidfried I'll always have time for this place. Whether that is a good or bad thing is another matter lol.


heyoo

New update on the opening post: Watch Dogs: Legion.

[Edited by heyoo]


nessisonett

This is absolute nonsense. You’re using games that aren’t out, comparing them with a console that isn’t out to a graphics card that’s seemingly the only component of your test machine. If your 2080ti can’t run Dark Souls 3 at a stable 60FPS then that’s a fault of the optimisation or other components in your setup, that’s a stone-cold fact. What’s hilarious is your espousing of this as some sort of grand reveal that will set the ‘PC bias channels on the internet’ in their place. There is zero understanding outside of watching 1 or 2 Digital Foundry videos, you’re comparing numbers that don’t convey the full story and relying on recommended specs for games that aren’t out. What’s especially funny is that this isn’t even an up to date card! Your entire argument is that PS5 will beat the RTX 3080 by year 4, which is moot because there’ll be another better card on the market by then, which devs will aim for as their target. Comparing the PC and console markets is just silly, console devs have to wring out all they can because customers wouldn’t accept having to pay for an upgrade every two years. PC devs are constantly keeping up with the latest advancements and shifting the goalposts and so hardware is obsolete quicker. It’s comparing apples and pears, both are green fruit but are also completely different.


heyoo

@JohnnyShoulder Digital Foundry? Remember when they said a GTX 750 Ti in a 2014 PC build was better than the PS4? The GTX 750 Ti is a mere 2GB VRAM card; it has played only ugly graphics since 2017. Please look it up. They downplayed the PS4 hard by comparing it to a mere 2GB VRAM card like the 750 Ti, when in fact the PS4 even beats a 780 Ti in the long run, starting around year 4. The GTX 1050 Ti 4GB performs like the PS4 now, and the 750 Ti build they had has been obsolete for a long time. Isn't this obvious enough?

[Edited by heyoo]


heyoo

@nessisonett

Nonsense? These are all facts. They all come from the devs and actual gameplay benchmarks.

That's the reason. They intentionally gimp the RTX 3080's VRAM so that they will profit on future upgrades from PC consumers. But it's still a fact that consoles are made more future-proof than any high-end GPU of the same year, except for a novelty card like the RTX 3090, which is basically a Titan.

So you watched their 750 Ti vs PS4 video? Now see how the 750 Ti performs and looks in triple-A games since 2017. See how a 750 Ti plays Odyssey, Red Dead 2, Doom Eternal, etc. Ugly, just ugly. Do you still believe their 750 Ti videos against the PS4? That's a budget PC they built to beat the PS4, and obviously they are not even on the same level in the long run, starting at year 3. Get back to me when a 750 Ti can play Detroit: Become Human. Don't even talk about optimization. The fact that it's a mere 2GB VRAM card means it's inferior. Period. It plays with graphics so ugly the PS3 sometimes looks better.

I'm not downplaying anything here, just stating facts. I like both PC and console.

This is the problem. Do your own research, or better, buy these consoles and high-end GPUs and compare them: in the long run, console wins! I had 2 high-end GPUs in 2008 and the PS3 beat them easily in the long run; they became obsolete fast while the PS3 kept playing for 9 years. I bought a $200 GTX 960 2GB in 2015 and it was dead after 2 years, playing only ugly graphics since 2017, while my 2013 PS4 still plays beautiful A+ graphics today. I got scammed hard. The PS4 can use up to 5GB as VRAM; no wonder it still plays beautifully today.

Now I'm using a GTX 1060 and I'm satisfied with it at 1080p. But a GTX 1060 is not on the level of the PS4 Pro and Xbox One X either. I have those consoles too. A GTX 1060 isn't playing above 1440p, up to 4K, on most modern demanding games the way the PS4 Pro and One X easily do. The One X easily beats the GTX 1070 too.

The RTX 2080 Ti cannot do 60fps in Dark Souls 3 or Red Dead 2 at 4K, right? So you think it's an optimization problem? I bet Red Dead 2 is coming to the PS5/Series X as a locked 4K 60fps port at ultra PC settings. And when that happens, it's over. PS5/Series X wins! I can't wait to add it to my OP. So you really think the RTX 2080 Ti is better and can play next-gen insane graphics 5-8 years from now like the PS5/Series X will? Good luck.

This is another problem: we don't look back at the past. We only compare the first couple of years. They don't compare the base PS4 against their 750 Ti build anymore, because since 2017 it's not even a competition; the PS4 is far superior. No one reviews the 2006 PS3 against the 8800 GT, Nvidia's most powerful GPU of 2007, in modern demanding PS3 games, because that was a long time ago and no one cares. But if you compared them in modern PS3 games, you would see how easily the PS3 beats the most powerful GPU of 2007 and most high-end GPUs of 2008. Fact is, a console beats the high-end GPUs of the same year easily in the long run, starting as early as year 4, and this goes all the way back to the PS2 era.

[Edited by heyoo]


JohnnyShoulder

@heyoo I mean, you're just throwing a load of numbers at me at this point, and all of that means nothing to me. As long as the games still look good and play well on the system I play on, that is all I'm interested in. Good for you if you are into all this kind of stuff, but it is not for me.


This topic has been archived, no further posts can be added.