Forums

Topic: PS5/Series X is more powerful than an RTX 2080 Ti. The proof and evidence are all here!

Posts 1 to 20 of 90

heyoo

GEARS OF WAR 5

Gears 5 on an RTX 2080 Ti at native 4K, PC Ultra settings: 62 fps average, and it still dips below 60 fps.

Now, Gears 5 on Series X runs at native 4K 60 fps and goes beyond PC Ultra settings with the new graphical enhancements listed below:

Contact Shadows
Screen Space Global Illumination
50% higher particle counts than the PC Ultra spec
Real time cinematics in 4K 60fps
Higher resolution textures
Improved Anisotropic Filtering
Higher Resolution Volume Fog
Higher quality Depth of Field
Extremely far Draw Distances with a high level of object detail
Shadow Resolution & Shadow Distance
High quality Screen Space Reflections
Post Processing improvements like Bloom, Lens Flare, Light Shafts, etc.

And the devs said these enhancements are not available in any Gears 5 PC settings.

Now, if you added all these improvements on a 2080 Ti, it would surely play well below 60 fps at 4K, maybe around 48 fps average. Series X easily wins!

Digital Foundry said the Series X performed like an RTX 2080, but that isn't even true, because Gears 5 at PC Ultra settings on an RTX 2080 averages only 45 fps at 4K, while the Series X runs at native 4K 60 fps with graphics beyond PC Ultra settings.

Now, Gears 5 on Series S is also said to use the same PC Ultra settings with the new graphical enhancements, but at 1440p 60 fps. Perhaps the 2080 Ti can run Gears 5 with the next-gen graphical enhancements at 1440p 60 fps? Series S will also play Gears 5 multiplayer at native 1440p 120 fps. Meanwhile, the RTX 2080 Ti at 1440p Ultra settings plays Gears 5 multiplayer at a variable 120-160 fps, so it can only do a locked 120 fps at 1440p, just like the Series S will. So Series S = RTX 2080 Ti at 1440p? Interesting.

Remember: RDNA 2 GPU architecture, features, and capabilities >>> RTX 2000 series / AMD RDNA 1 GPU architecture, features, and capabilities.

Teraflops don't scale the same across GPU generations. One teraflop on a new-generation GPU is worth more than one teraflop on an old-generation GPU. Yes, the Series S only has 4 TFLOPS versus the RTX 2080 Ti's 13.45 TFLOPS, but its teraflops are based on the newer, more advanced RDNA 2 architecture, and combined with RDNA 2's new capabilities and features, its actual performance would be far superior to 4 TFLOPS on RDNA 1 or Turing.
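
For reference, here is roughly where those headline teraflop numbers come from. "Paper" TFLOPS is just shader count x clock x 2 ops per fused multiply-add; it says nothing about how efficient the architecture is, which is exactly why teraflops across generations aren't directly comparable. A minimal sketch using the commonly published shader counts and clocks (illustrative only, not a performance claim):

# Rough sketch: "paper" FP32 TFLOPS = shader units x clock (GHz) x 2 (one FMA = 2 ops).
# Uses the commonly published shader counts and clocks; this ignores architecture
# (RDNA 2 vs Turing), so equal paper TFLOPS does not mean equal game performance.
def paper_tflops(shader_units: int, clock_ghz: float) -> float:
    return shader_units * clock_ghz * 2 / 1000.0

gpus = {
    "Xbox Series S (RDNA 2)": (1280, 1.565),
    "PS5 (RDNA 2)": (2304, 2.23),
    "Xbox Series X (RDNA 2)": (3328, 1.825),
    "RTX 2080 Ti (Turing)": (4352, 1.545),
}

for name, (units, clock) in gpus.items():
    print(f"{name}: {paper_tflops(units, clock):.2f} TFLOPS")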

The Series S has 10 GB of flexible memory, with 8 GB usable as VRAM by default, versus the RTX 2080 Ti's 11 GB of VRAM. That's not much of a difference and more than enough for 1440p gaming. The PS5/Series X have 13.5 GB available as VRAM by default and can use almost all, if not all, of their 16 GB of flexible memory as VRAM if needed. Also, the Series S's remaining 2 GB of flexible memory can mostly be used as VRAM as well if needed.

The PS3 was a 2 TFLOPS machine on paper while the PS4 is only a 1.8 TFLOPS machine, and we all know which one is on the next level.

The GTX 1050 Ti has only 2.1 TFLOPS but is superior to a GTX 770 with 3.3 TFLOPS or a GTX 780 Ti with around 5 TFLOPS in modern games since 2017. The GTX 1050 Ti now performs about the same as the base PS4 in recent games, with the PS4 actually doing better in titles like Marvel's Avengers: the PS4 plays at prettier graphics while the 1050 Ti plays at inferior settings at 1080p with rescaling on just to reach 30 fps. The minimum requirement for Crysis Remastered is a 1050 Ti, and that is still a PS4 game.

Also, the RTX 2080 Ti and all other RTX 2000 series cards have first-generation ray tracing only, while RDNA 2 has second-generation ray tracing. And keep in mind the next-gen consoles are custom RDNA 2 with unique features not available in standard RDNA 2, probably some RDNA 3 tech for future-proofing.

CALL OF DUTY: COLD WAR (RTX 2080 Ti / RTX 2080)

The Call of Duty: Cold War developers said the PS5 will run the game at native 4K 60 fps with ray tracing on, and at native 4K 120 fps in multiplayer too. The devs said the next-gen consoles' superior VRAM/memory allows them to max out both resolution (4K) and frame rate (120 fps). Watch the IGN PS5 multiplayer event.

Now, the RTX 2080 Ti plays COD 2019 multiplayer at 4K Ultra settings at only around 85-110 fps. That is far from what the PS5 can do with Cold War multiplayer, which is native 4K 120 fps, and Cold War is obviously the more GPU-demanding game too.

The RTX 2080 Ti plays COD 2019 at 4K Ultra with ray tracing on at 71 fps, while the PS5 does native 4K 60 fps with ray tracing on in Cold War. Keep in mind consoles don't do variable frame rates, and Cold War is obviously the more demanding game.
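
On that locked-versus-variable point: a locked frame rate means every single frame has to fit inside a fixed frame-time budget, while a PC average can hide individual frames that miss it. A quick bit of arithmetic to illustrate (the numbers are only for illustration):

# Convert a target frame rate into its per-frame time budget (milliseconds).
def frame_budget_ms(target_fps: float) -> float:
    return 1000.0 / target_fps

for fps in (30, 60, 120):
    print(f"Locked {fps} fps -> every frame must finish within {frame_budget_ms(fps):.2f} ms")

# A 71 fps *average* (~14.1 ms average frame time) can still contain individual frames
# slower than 16.67 ms, which is why an average above 60 does not guarantee a locked 60.
print(f"71 fps average -> {frame_budget_ms(71):.1f} ms average frame time")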

The RTX 2080 plays COD 2019 at 4K Ultra with ray tracing on at a 50 fps average. It's obvious the PS5 is superior to an RTX 2080: the PS5 does native 4K 60 fps with ray tracing on in Cold War, which is more demanding than COD 2019.

So don't believe anyone who says the PS5 is at RTX 2080 level or below. It's not. It's even beating the RTX 2080 Ti.

ASSASSIN'S CREED VALHALLA

The RTX 2080 Ti plays Assassin's Creed Odyssey at 4K Ultra settings below 60 fps; it averages around 54 fps. The RTX 2080 Ti cannot do a locked 60 fps in this game, and it's still a last-gen game.

Now for the PS5/Series X, guess what? Ubisoft themselves said:

"With the Xbox Series X | S, Assassin’s Creed Valhalla will take full advantage of the enhanced graphics, giving players the opportunity to experience the open world of Norway and England down to the very last detail. On Xbox Series X, Assassin’s Creed Valhalla will run at 60 FPS in full 4K resolution,"

Let me repeat that: it's FULL 4K resolution, aka native 4K! Also keep in mind they said “down to the very last detail.”

Which means it will easily be playing at maxed-out Ultra settings on next-gen consoles.

Valhalla is also native 4K 60 fps on PS5.

“will not only run in 4K on the PS5, but also with 60 frames. And full 4K resolution, as you emphasize, which you can expect,”

“The days of checkerboard 4K should be over for now”

Obviously, Valhalla on PS5/Series X will look far better and more beautiful than last-gen Assassin's Creed Odyssey at PC Ultra settings. The next-gen version of Valhalla is obviously more demanding than last-gen Odyssey at PC Ultra settings.
This means the Series X and PS5 could easily play Assassin's Creed Odyssey at 4K Ultra settings locked at 60 fps, while the 2080 Ti cannot.

Since the RTX 2080 Ti can't even hit 60 fps at 4K in Odyssey, it will surely struggle even worse with Valhalla at 4K. Series X and PS5 win!

Devs don't lie about this. When Rockstar said Red Dead Redemption 2 was native 4K on Xbox One X, it was native 4K and no one could deny it; random resolution-comparison channels on the internet said so too. Guerrilla Games said Horizon Zero Dawn is native 1080p on PS4, and it is; everyone agrees. If the devs hadn't said anything about the resolution, PC-biased channels would have made up their own numbers and claimed Red Dead 2 runs at dynamic 4K on Xbox One X and even drops below 1440p, just to downplay it. They would have said Horizon Zero Dawn plays at dynamic 1080p and drops below 720p too if Guerrilla Games hadn't spoken up. The good thing is these PC-biased sites can't do anything when the devs actually speak up.

If devs don't say what resolution a console game runs at, random people all over the internet make up their own resolutions for the consoles, mostly to downplay them. I wish all devs would just outright state what resolution a game runs at on console, so no PC-biased channels or people could cheat and lie to downplay consoles.

DARK SOULS 3 / DEMON'S SOULS (vs RTX 2070 Super)

The RTX 2070 Super plays Dark Souls 3 (still last-gen graphics) at 4K PC Ultra settings below 60 fps; it mostly sits in the mid-to-high 50s. This card cannot do a locked 60 fps at 4K in Dark Souls 3, a last-gen game.

Now, the PS5 Demon's Souls (true next-gen graphics) gameplay trailer, captured on PS5, was running at native 4K 60 fps.

The RTX 2070 Super cannot even do a locked 60 fps in Dark Souls 3, a last-gen game, so there is no question it would perform worse (probably around a 41 fps average) in Demon's Souls at 4K if there were a PC release. Also, the PS5/Series X would easily play Dark Souls 3 at 4K Ultra settings at 60 fps, unlike the RTX 2070 Super, since the PS5 plays an actual next-gen Souls game (the Demon's Souls remake) at native 4K 60 fps by default.

So don't believe anyone who says the PS5 = an RTX 2070 Super. It's obvious the PS5 is way better.

WITCHER 3 REMASTER (vs RTX 2080 / 2080 Ti)

"the next-gen edition of the game will feature a range of visual and technical improvements - including ray tracing and faster loading times"

Expect frame rate boosts from 30FPS to 60FPS minimum, incredible new lighting and global illumination effects including dynamic weather visuals, next-gen shadows, and of course ray traced visuals.

The RTX 2080 runs the original 2015 Witcher 3 at 4K at a 59 fps average (unable to lock 60 at max settings, as it constantly dips below 60 fps), and that is without all the graphical enhancements, including ray tracing, that the next-gen consoles will get.

Presumably it will be native 4K 60 fps with all the visual upgrades beyond PC Ultra on the next-gen consoles. These graphics upgrades, especially ray tracing, will make the RTX 2080 struggle hard at 4K, or even be borderline unplayable.

The RTX 2080 is clearly inferior to the next-gen consoles then. These RTX cards take a huge fps hit every time ray tracing is turned on, losing almost half their frame rate. As you can see, the next-gen consoles have no problem playing at native 4K 60 fps with ray tracing on, as in Gran Turismo 7 and Cold War. The RTX 2000 series' first-generation ray tracing is just inefficient and not as powerful as the second-generation ray-tracing hardware the next-gen consoles have.

The RTX 2080 Ti plays The Witcher 3 at 4K Ultra settings at 65-75 fps. Adding the next-gen edition's graphical upgrades would likely push the 2080 Ti below a playable 60 fps.

All benchmarks for the GPUs mentioned here are from legitimate PC gaming or tech benchmark sites using only the highest-end gaming PCs (CPU, RAM, etc.) and from YouTube gameplay videos using very high-end rigs.

Watch Dogs: Legion (RTX 2080 Ti / RTX 2070)

The official recommended PC requirements for Watch Dogs: Legion, from Ubisoft:

1080p High settings with ray tracing on
CPU - Ryzen 5 3600
GPU - RTX 2070
VRAM - 8 GB

4K Ultra settings with ray tracing on
CPU - Ryzen 7 3700X
GPU - RTX 2080 Ti
VRAM - 11 GB

The Series X and PS5's custom Zen 2 CPUs have 8 cores and 16 threads. The Ryzen 5 3600 is only a 6-core, 12-thread CPU. The next-gen console CPU matches the Ryzen 7 3700X's 8 cores and 16 threads. Also, a custom CPU means it has customized tech beyond the original version, just like the next-gen consoles are custom RDNA 2. It's custom RDNA 2 because it has other unique features not available in standard RDNA 2.

The PS5/Series X have 13.5 GB of VRAM at minimum, hence they can handle 4K Ultra settings with ray tracing on even more easily than the 11 GB of VRAM in the RTX 2080 Ti build here.

The RTX 2070 is only listed for 1080p High settings with ray tracing on. Conclusion: the PS5/Series X are way more powerful than an RTX 2070 and on a completely different level.

Also, don't believe anyone who says the next-gen consoles don't play this game at native 4K Ultra settings with ray tracing on, because based on specs alone the next-gen consoles are more than capable, with superior VRAM that is critical for 4K ray-traced gaming. Also, next-gen console RDNA 2 performance per teraflop > RDNA 1 / Turing performance per teraflop, so the PS5's 10.3 TFLOPS / Series X's 12 TFLOPS could perhaps even significantly exceed the RTX 2080 Ti's 13.45 TFLOPS on the old Turing architecture. RDNA 2's second-generation ray tracing > the RTX 2080 Ti's still-first-generation ray tracing too. PS5/Series X have this in the bag, easy.

Gears Tactics

Gears Tactics is coming to Series X and it will be native 4K 60 fps.

The RTX 2080 Ti plays Gears Tactics at 4K Ultra settings at a variable 52-65 fps, averaging around 58 fps.

The RTX 2080 plays Gears Tactics at 4K Ultra settings at a variable 41-54 fps, averaging around 47 fps.

The 5700 XT plays Gears Tactics at 4K Ultra settings at a variable 34-47 fps, averaging around 41 fps.

The RTX 2080 and 5700 XT are clearly not on the level of the Series X (and PS5).

Clear as day: the 5700 XT is RDNA 1 versus next-gen consoles with custom RDNA 2. RDNA 2 is said to be 50% more efficient (performance per watt) than RDNA 1. Sounds about right. And remember, the next-gen consoles are not just RDNA 2; they have other unique features not in standard RDNA 2, and combined with their advanced, purely gaming-dedicated I/O systems they will surely surpass the RTX 2080 Ti easily, especially down the road.

The fact that the RTX 2080 Ti constantly drops below 60 fps means the Series X is leading, especially if the game is locked at 60 fps on the console.

All these GPU benchmarks are taken from multiple notable PC benchmarking sites and YouTube videos using only the highest-end PC parts (CPU, RAM, etc.). These are the apparent fps averages for each GPU across multiple sources.

HOW TO KNOW IF THE PS5/SERIES X PLAYS A GAME AT MAXED-OUT ULTRA SETTINGS

We all know this for a fact right now:

PS5/Series X Assassin's Creed Valhalla graphics at native 4K 60 fps >>> RTX 2080 Ti Assassin's Creed Odyssey last-gen graphics at native 4K with only a 55 fps average.

Valhalla's graphics on next-gen consoles are obviously better and more demanding than the last-gen graphics of Odyssey at Ultra settings, hence Series X/PS5 > RTX 2080 Ti.

Now, if an RTX 2080 Ti can max out Valhalla at 4K (expect it to play below 60 fps), then the Series X/PS5 will have it maxed out as well. Why wouldn't they? Also, if a mere RTX 2060/2070/5700 XT can max out Valhalla at 4K (obviously playing way below 60 fps, maybe around 36-47 fps), then the PS5/Series X can easily max out this game as well. Why wouldn't they? They're more than capable, and the devs themselves said it.

Now, we should be prepared for PC-biased comparison videos to claim Valhalla plays at lower settings even on the next-gen consoles. They will use an RTX 3080 at maxed-out 4K Ultra settings for the comparison and claim it's not on max settings on the PS5/Series X, which would be absurd. To counter this, let's just look at GPUs that are obviously weaker than the consoles, like an RTX 2080, or heck, even an RTX 2060 or GTX 1080. If these cards, weaker than the next-gen consoles, can max out Valhalla at 4K at maybe around 34-44 fps and they say the next-gen consoles cannot, then something is fishy. That is just downright lying and cheating.

Just find a card that is obviously weaker than these next-gen consoles, like the 5700 XT/2070/2080, etc. If those weaker cards can max out a certain game, then the next-gen consoles easily will too.

You have to own these consoles and all these cards to know the real truth yourselves. Don't rely on PC-biased channels on the internet, as they can make up all kinds of stuff just to downplay these next-gen consoles again.

For example, they say Sekiro doesn't run at max settings on consoles like the Xbox One X, but a mere 2 GB VRAM card like the GTX 1050 can run it at max settings easily at a 34 fps average. The base PS4 is a 5 GB VRAM machine and the Xbox One X has 9 GB of VRAM. Consoles rarely, if ever, use variable frame rates; it's usually 30, 60, or 120 fps.

The GTX 1050 is a mere 2 GB card, and it is only playable with UGLY all-low-and-off graphics in modern demanding games since 2017, like Shadow of the Tomb Raider, Doom Eternal, Red Dead Redemption 2, Assassin's Creed Origins and Odyssey, Mortal Kombat 11, and so on. 2 GB VRAM cards are outright unplayable in other games such as Detroit: Become Human. All the consoles play with beautiful A+ graphics.

This is crucial, since the next-gen consoles will be playing at native 4K most if not all of the time now. The PC-biased channels' only saving grace now is to downplay next-gen console graphics settings. Keep in mind: if a GPU proven weaker than the next-gen consoles can max out a certain game at 4K, then the consoles should easily do so too. This will be the biggest evidence.

Also as a bonus, let's review history.

The RTX 2080 Ti is a two-year-old card by the time the next-gen consoles release.

There is no 1998 PC gaming rig that beat the 2000 PS2. There is no 2004-2005 high-end PC GPU that beat the 2006 PS3. There is no 2011 high-end GPU that beat the 2013 PS4. History will repeat itself: the two-year-old 2080 Ti isn't beating the PS5/Series X, especially in the long run. The PS5/Series X aren't competing against the 2080 Ti. The RTX 2080 Ti is a 4K card for last-gen gaming, while the new consoles are built for 4K next-gen gaming that will last for the next nine years. Based on history, the 2080 Ti will also be obsolete by year four even at 1440p, while the next-gen consoles will keep playing far more demanding graphics for nine years.

Also, for those who don't know: the 2006 PS3 easily beats the most powerful gaming PC of 2007 and 98% of high-end GPUs from 2008 in the long run. The most powerful gaming PC of 2007 and 98% of high-end GPUs from 2008 cannot play the last 20 or so most demanding PS3 multiplatform games, like MGS V, Dragon Age: Inquisition, Destiny, Thief, Far Cry 4, Crysis 3, Injustice, Black Flag, Black Ops 3, Shadow of Mordor, Watch Dogs, and so on. The most powerful GPU of 2007 is unplayable (20 fps) even at ugly all-low-and-off graphics in Assassin's Creed IV: Black Flag, while the PS3 easily played it at a smooth 30 fps, looking A+ beautiful, before the PS4's release. The most powerful PC GPU of 2007 was dead in most games by 2011, while the PS3 kept playing more demanding games until 2015. The PS3 even plays NBA 2K18.

And yes, the 2013 PS4 also beats the most powerful Nvidia GPU of 2013, the GTX 780 Ti, by year four or five, when games started recommending 4 GB of VRAM. In the long run it all comes down to VRAM for playing modern demanding games: the PS4 has around 5 GB usable as VRAM while the GTX 780 Ti has only 3 GB. The GTX 780 Ti plays at inferior graphics and performance compared to the PS4 in modern demanding triple-A games since 2018.

There is a reason why, in PC vs. console comparison videos, the PC side only uses the current highest-end GPU (the RTX 2080 Ti since around 2018): high-end GPUs from the same year as the PS4's release can't keep up in modern games since 2017. High-end GPUs from the same year as a console's release would get exposed hard in modern games if compared.

The RTX 3080 is only a 10 GB VRAM card, while the PS5/Series X are 13.5 GB VRAM machines and can use all 16 GB of memory as VRAM if needed. History always repeats itself. Yes, the RTX 3080 will be faster in the first two or three years, when games only require 10 GB of VRAM or less, but when games start requiring more than 10 GB of VRAM minimum, even at 1440p, this card will become obsolete. The PS5/Series X will no doubt be superior to an RTX 3080 in the long run, starting around year four.

History repeats itself: the GTX 770 has 3.3 TFLOPS but only 2 GB of VRAM and has only played at ugly low-and-off graphics since 2017, while the PS4, at 1.8 TFLOPS but with 5 GB usable as VRAM, keeps playing games whose graphics get better and better year after year, and it is at its very best near the end of its cycle in terms of graphics and performance. The PS4, just like the 4 GB GTX 1050 Ti, is superior to the GTX 770 and GTX 780 in modern games since 2017. The RTX 3070 is a mere 8 GB VRAM card with 20 TFLOPS, and it is to the PS5/Series X what the GTX 770 was to the base PS4. The PS5/Series X will easily beat the RTX 3070 by year four in graphics and performance. Later on it's all about the VRAM for better graphics and performance; teraflops won't save an RTX 3070/3080 when games require more VRAM than these cards have. The same thing happened to the GTX 770/780. Also, if a GPU plays at ugly or inferior graphics, then it's not on the level of the consoles.

****** I will keep updating this opening post as more evidence comes out. ******

All the legitimate evidence and facts above stack up nicely in favor of the next-gen consoles, making it obvious the PS5/Series X are superior to an RTX 2080 Ti. The only way this will be denied is if PC-biased channels start making up "fake" opinions and comparisons to downplay the next-gen consoles. Buy the PS5/Series X and these cards, from the RTX 2060 to the RTX 2080 Ti, yourself to compare properly, especially in the long run; three or more years from now you will see how superior the next-gen consoles are to any high-end RTX 2000 series GPU.

All benchmarks for the GPUs mentioned here are from legitimate PC gaming or tech benchmark sites using only the highest-end gaming PCs (CPU, RAM, etc.) and from YouTube gameplay videos using very high-end rigs.

Edited on by LiamCroft


nessisonett

@heyoo You don’t have any information about your processor or RAM. You could stick a graphics card in a toaster and it wouldn’t make your toast 60FPS.

Plumbing’s just Lego innit. Water Lego.

Trans rights are human rights.

heyoo

@nessisonett

About the processor and RAM? They're using the most powerful PCs and RAM. It's from expert PC tech site benchmarks and YouTube videos with the highest-end PC rigs. They're mostly using a 9th-gen i9.

Consoles don't use system RAM like a PC does; the whole console is a dedicated gaming machine. Remember, the PS4 only has 8 GB of shared memory and it can easily play 60 fps games at native 1080p, like Mortal Kombat 11, looking A+ beautiful. The most demanding modern games now require 16 GB of system RAM to run properly. Consoles use very little system RAM, if any at all.

The PS3 had only 512 MB of memory and it played games like Crysis 3, MGS V, Destiny, Thief, Black Flag, etc., no problem at all, all looking A+ beautiful for their time, while on PC those PS3-era games require around 2 GB of system RAM minimum and around 2 GB of VRAM. Console magic.

Also, 8-core CPUs have proven more efficient and better than quad-core and dual-core CPUs in modern demanding games since 2017. The PS4 was the blueprint for using multiple cores in modern games. The PS4's slower 8 cores > the faster dual- and quad-core CPUs of 2016 and earlier today. PCs now use multiple cores too for better performance, 8 and 16 cores, to run games easily. Old i3s and i5s are inferior to the PS4's 8 cores in modern games like Doom Eternal. The PS4 runs Doom Eternal at 45-60 fps, mostly in the high 50s, and its 8-core CPU didn't prevent it from reaching that performance, just like in COD 2019 and Warzone, all looking A+ beautiful. The PS4's 8-core CPU has no problem running Red Dead Redemption 2 at native 1080p at a smooth 30 fps, but pairing an old quad-core i3 or i5 with a 1050 Ti results in performance below 30 fps at the same graphics settings as the PS4.

The Series X and PS5 will have a far superior, next-level 8-core CPU compared to what the PS4 had, so they will have no problem on that end. And yes, another piece of evidence would be this: if a GPU and CPU proven weaker than the PS5/Series X (below a Zen 2 CPU) can max out a game, then these consoles will easily do so too.

Also, it's the other way around: you can pair the most powerful CPU with a mere 2 GB VRAM card like a GTX 750 Ti and it won't help at all in modern demanding games since 2017. It will still play at ugly graphics and inferior performance. Pair a 9th-gen i9 with a 750 Ti and Red Dead Redemption 2 will still be completely unplayable even at ugly all-low-and-off settings, while the PS4 and its old 8-core CPU plays it at an easy 30 fps, looking A+ beautiful.

Edited on by heyoo


nessisonett

@heyoo ...you’re only comparing the graphics card with a PS5/Series X without using other equivalent hardware such as RAM and CPU. Your ‘research’ is entirely unscientific. Any experiment needs clearly labelled conditions and consistency, along with clear results instead of ‘these graphics are ugly’ or ‘less frames’ especially considering that neither console is even out yet. Share your entire system specs.


heyoo

@nessisonett You can search all over the internet for how these high-end GPUs perform in these games. Low and off settings are ugly. Search for how a 750 Ti plays at ugly low-and-off graphics in Mortal Kombat 11 or Doom Eternal versus the PS4's graphics, which look closer to PC Ultra than to anything below it.

It's all about the VRAM for playing at higher settings. If a GPU only has 2 GB of VRAM, it plays at ugly graphics. By 2017, triple-A games recommended at least 4 GB of VRAM; anything below that plays at medium or low graphics. The PS4 hasn't been playing at ugly graphics since day one; it's a 5 GB VRAM machine. All-medium settings in Red Dead Redemption 2 are ugly as well, and the consoles don't have those ugly graphics. I have a GTX 1060, a PS4, and a PS4 Pro, plus the real-life evidence all over YouTube is all legitimate evidence.

Edited on by heyoo


SirAngry

@nessisonett I don't think they're getting the point you are making about the holistic synergy between computing components. Hell, I changed a motherboard in a build for a friend a few years back as he had a coolant leak issue... that got him a fairly impressive frame bump in some games. I've tried explaining how the large pool of cache in the PS5 is like the best thing ever, especially when coupled with the custom scrubbers... but people either don't want to hear the truth, or can't comprehend it. Hell, some on here don't even know that they don't know, especially where things like clock speed have a dramatic effect on initial path calculations within RT in a BVH. The OP's comment also fails to take into consideration supply-side differences; SSDs and code optimised for SSDs are always more likely to maximise computational resources more easily. In the case of the 2080 Ti, many games just didn't tax its cores, for a variety of reasons.


Rudy_Manchego

I mean, I'm not a tech aficionado and performance isn't everything in my mind, but it is typically true that consoles punch above their weight, specifically at launch, because they have components that are carefully designed to work together and developers can target that one device. The base PS4, a seven-year-old piece of technology, can run something as impressive as TLOU2 for that very reason.

Very high-end PC builds will typically be able to match or exceed consoles at launch in raw power, but very few people have consistent builds with components that allow this. If power, frames per second, resolution, etc. mean a lot to you, then upgrade your PC with high-end parts regularly and you'll no doubt win eventually, particularly on third-party games.

Now I may be an idiot, but there's one thing I am not sir, and that sir, is an idiot

PSN: Rudy_Manchego | Twitter:

BAMozzy

Considering you have no idea what clock speed the 2080 Ti is running at (at stock it's not running anywhere near as fast as a Series X, let alone the clock speeds of a PS5), you can't even begin to compare. You also have NO idea what optimisations the developer has done to get a stable frame rate. The only area where Gears 5 is stated to be 'above' the Ultra setting is the particle effects; all the other enhancements are over and above the XB1 version.

Things like Higher Resolution Volume Fog and Higher Quality Depth of Field could still be lower than the PC's Ultra settings. Extreme draw distance may indicate the 'maximum' draw distance setting but not using 'ultra' for LoDs. Some 'Ultra' settings are simply a waste of resources, and the difference in visual quality is negligible if you can actually see any difference at all. You really do not need the 'highest' quality asset to be used when an object is so far into the distance that the pixel density of 4K can't render the fine details of the asset anyway - it's just a 'waste' - so using Ultra is pointless.

Until you can actually line up both games side by side and tweak the visual settings of the PC game to match the console settings EXACTLY, you cannot state that the hardware is better than a 2080 Ti at all.

I won't bother going through EVERY game you decided to discuss, but benchmarks for GPUs are often set to Ultra just to have consistency when comparing performance - that is a big difference from a game optimised for console, which will use a range of settings - some like for like with the PC counterpart and even some that are 'unique' to the console. If you watch Digital Foundry videos, you will see some breakdowns that will say 'equivalent' to the PC's 'Medium' setting, or maybe even somewhere between Medium and High, for example. Also, if you watch DF, you may see some 'optimised' settings for PC based on 'cost' (as in frame rate) and 'impact' (as in whether it makes a significant difference visually or not), and rarely will they use an 'Ultra' setting, because the cost does not justify the impact, which in some cases is almost imperceptible.

Obviously, something like particles will make a difference visually, but a developer will optimise a game to run on the console, while on PC the 'optimisation' is done by the PC owner because of the vast array of PC specs on the market. That's what the PC settings are for, essentially: to optimise the game for the hardware you have. In every example, a developer is going to optimise a game for the console with some settings parallel to the PC version (Medium, High...) and some that are bespoke to the console version (between Medium and High).

I have no doubt that RDNA 2 will be a big jump up from GCN and even RDNA 1 (RX 5700 XT for example) - get a lot more performance per TF for example, but you can't really compare a game that has been optimised/tuned specifically for a Platform to a game that is built to run on a wide range of PC hardware.

A pessimist is just an optimist with experience!

Why can't life be like gaming? Why can't I restart from an earlier checkpoint??

Feel free to add me but please send a message so I know where you know me from...

PSN: TaimeDowne

phil_j

I remember back in the day when a PS2 port of a PC game had to have so much cut out of it just to run. Mafia, I'm looking at you. Fun times.

Is this the first generation where the consoles can even come close to competing with high-end PCs?

Though I don't think you can compare these specific instances directly. Mark Cerny said it all when he said a PS4 teraflop is not equivalent to a PS5 teraflop.

The thing I'm most excited for is instant loading. So I don't have to read and write crappy forum posts whilst waiting for something to boot up.


nessisonett

@SirAngry There’s absolutely zero consistency or details. For all we know, they stuck mining software on in the background and were wondering why Gears 5 didn’t run. It’s nonsense as well because I have a laptop with a GTX 1060 and 8GB RAM and it runs Gears 5 at an almost stable 60FPS on Ultra. It’s got to be throttling somewhere.


JohnnyShoulder

@phil_j I'm gonna miss your crappy forum posts!


I mean the consoles aren't even out yet, so it is probably best to wait until somewhere like Digital Foundry get their hands on the consoles and do a proper comparison, if you are into that sort of thing. Personally I'm more interested in playing the games than how it compares to a rival console or PC, but each to their own and all that jazz.

Life is more fun when you help people succeed, instead of wishing them to fail.

Better to remain silent and be thought a fool than to speak and remove all doubt.

PSN: JohnnyShoulder

SirAngry

@nessisonett Indeed, early benchmarks with the 2080 Ti were also hampered by early drivers that for some reason left performance on the table. Truth is, I reckon there is loads more we can wring out of next-gen consoles as opposed to a decent PC with a 2080 Ti. Not because the GPU itself is better, but because these consoles are far better balanced and thought out. With the PS5 I literally only need what I might possibly use in RAM within the next 2 seconds, and even then, for a majority of assets, if there's an emergency the scrubbers and I/O can bail me out in milliseconds in many cases. No PC will have that possibility for a very long time, and it means maximising computational resources, specifically on the PS5 but also on the XSS|X, is relatively straightforward. Most of the resources in the PS5 will be going to service the current frame; that's not really the case with PC architecture, so the narrative of power isn't relevant. It's ultimately about what you see on screen, and the quality of the image. But what do I know? What I do know, though, is that I wish people could just wait and see.


Kidfried

@JohnnyShoulder @phil_j With SSDs coming, does that mean we'll actually be playing games instead of writing stuff on the forums? It'll be lonely this Christmas...


Ryall

@SirAngry We can finally move beyond our consoles rebooting in the background during loading screens 😂.


SirAngry

@Ryall I really have no idea what you mean...

But essentially yes, and the news that the PS5 I/O is adopting Oodle for compressing texture and map data is mind-boggling. We really could be seeing games load in under a second very shortly.


JohnnyShoulder

@Kidfried I'll always have time for this place. Whether that is a good or bad thing is another matter lol.


heyoo

New update to the opening post: Watch Dogs: Legion.

Edited on by heyoo


nessisonett

This is absolute nonsense. You're using games that aren't out, comparing them on a console that isn't out, against a graphics card that's seemingly the only component of your test machine. If your 2080 Ti can't run Dark Souls 3 at a stable 60 FPS then that's a fault of the optimisation or of other components in your setup; that's a stone-cold fact. What's hilarious is your espousing of this as some sort of grand reveal that will put the 'PC bias channels on the internet' in their place. There is zero understanding outside of watching one or two Digital Foundry videos; you're comparing numbers that don't convey the full story and relying on recommended specs for games that aren't out. What's especially funny is that this isn't even an up-to-date card! Your entire argument is that the PS5 will beat the RTX 3080 by year 4, which is moot because there'll be another, better card on the market by then, which devs will aim for as their target. Comparing the PC and console markets is just silly; console devs have to wring out all they can because customers wouldn't accept having to pay for an upgrade every two years. PC devs are constantly keeping up with the latest advancements and shifting the goalposts, and so hardware is obsolete quicker. It's comparing apples and pears: both are green fruit but are also completely different.


heyoo

@JohnnyShoulder Digital Foundry? Remember, they said a GTX 750 Ti and its 2014 PC build were better than the PS4. The GTX 750 Ti is a mere 2 GB VRAM card; it has only played at ugly graphics since 2017. Please look it up. They downplayed the PS4 hard, comparing it to a mere 2 GB VRAM card like the GTX 750 Ti, when in fact the PS4 even beats a 780 Ti in the long run, starting around year four. The 4 GB GTX 1050 Ti performs like the PS4 now, and the 750 Ti build they had went obsolete a long time ago. Isn't this obvious enough?

Edited on by heyoo


heyoo

@nessisonett

Nonsense? These are all facts. They all came from the devs and actual gameplay benchmarks.

That's the reason. They intentionally gimp the RTX 3080's VRAM so they can profit from PC consumers' future upgrades. But it's still a fact that consoles are made more future-proof than any high-end GPU from the same year, except for a novelty card like the RTX 3090, which is basically a Titan.

So you watched their 750 Ti vs. PS4 video? Now see how the 750 Ti performs and looks in triple-A games since 2017. See how a 750 Ti plays Odyssey, Red Dead Redemption 2, Doom Eternal, etc. Ugly, just ugly. Do you still believe their 750 Ti videos against the PS4 are even true? That's a budget PC they built to try to beat the PS4, and obviously they are not even on the same level in the long run, starting at year three. Get back to me when a 750 Ti can play Detroit: Become Human. Don't even talk about optimization. The fact that it's a mere 2 GB VRAM card means it's inferior. Period. It plays at ugly graphics; even the PS3 looks better sometimes.

I'm not downplaying anything here, just stating facts. I like both PC and console.

This is the problem. Do your own research, or better yet, buy these consoles and high-end GPUs and compare them; in the long run the console wins! I had two high-end GPUs in 2008 and the PS3 beat them easily in the long run; they became obsolete fast while the PS3 kept playing for nine years. I bought a $200 2 GB GTX 960 in 2015 and it was dead after two years, playing only at ugly graphics since 2017, while my 2013 PS4 still plays beautiful A+ graphics today. I got scammed hard. The PS4 can use up to 5 GB as VRAM; no wonder it still plays beautifully today.

Now I'm using a GTX 1060 and I'm satisfied with it at 1080p, but a GTX 1060 is not on the level of the PS4 Pro and Xbox One X either. I have those consoles too. The GTX 1060 isn't playing above 1440p, let alone at 4K, in most modern demanding games the way the PS4 Pro and One X easily do. The One X easily beats the GTX 1070 too.

The RTX 2080 Ti cannot do 60 fps in Dark Souls 3 and Red Dead Redemption 2 at 4K, right? So you think it's an optimization problem? I bet Red Dead 2 is coming to the PS5/Series X as a 4K locked-60-fps, PC-Ultra-settings port, and when that happens it's over. PS5/Series X win! I can't wait to add it to my OP. So you really think the RTX 2080 Ti is better and can play insane next-gen graphics five to eight years from now like the PS5/Series X will? Good luck.

This is another problem: we don't look back at the past. We only compare the first couple of years. They haven't compared the base PS4 against their 750 Ti build since 2017 because it's not even a competition; the PS4 is far superior. No one reviews or compares the 2006 PS3 against the 8800 GT, Nvidia's most powerful GPU of 2007, in modern demanding PS3 games because it was a long time ago. No one cares. But if you compared them in modern PS3 games, you would see how easily the PS3 beats the most powerful GPU of 2007 and most high-end GPUs of 2008. The fact is, a console easily beats the high-end GPUs from the same year in the long run, starting as early as year four, and this goes all the way back to the PS2 era.

Edited on by heyoo


This topic has been archived, no further posts can be added.