
Topic: PS5/SERIESX is more powerful than a RTX 2080ti. Proofs and evidences are all here!

Posts 21 to 40 of 85

nessisonett

@heyoo It’s not the card that suddenly stops playing certain games, devs just aim for the newest tech as their benchmark. If consoles were updated every year then your PS5 wouldn’t be able to play the newest games in 3 or 4 years. You’re completely failing to grasp the basic concepts of PC game development. This is not a question of whether a certain card is ‘better’ than PS5. If they only released a new graphics card every 7 years then the exact games that play on PS5 would also play on an RTX 3080 for that same length of time. This is because developers would have that as their target build and would compromise or find workarounds, the exact thing that happens on console every single day. This is a pointless exercise that means absolutely nothing.

Plumbing’s just Lego innit. Water Lego.

Trans rights are human rights.

Anti-Matter

LOL
What is this thread talking about?

Rhythm gonna hit your head.

BAMozzy

@heyoo What you are missing is that games on consoles are OPTIMISED to run on that specific hardware configuration, and that the system is optimised specifically to play games too. There are things a developer can do on the console version to get it running at a level they are happy with, whereas on PC they have to make the game run in a conventional way across a wide range of builds. What I mean is that on console, they can push work onto the GPU that would normally be handled by the CPU because the CPU is bottlenecking, allocate CPU cores to specific tasks knowing they have 7 cores to play with, and tune the game to run on that system specifically. Even the RAM is 'fixed'.

The PC version is far more 'generic' because it has to run across such a wide range of specs and different manufacturers' parts - AMD, Intel, nVidia, multiple generations of architecture, multiple core counts, multiple frequencies, multiple RAM configurations. Devs offer a wide range of settings for people to optimise the game for their hardware - it's not because PC owners are special and get a LOT more options than console owners, it's because devs cannot optimise the game settings for EACH individual PC configuration. Some settings match between a PC and a console (medium reflections, medium particles etc) and others are specific to that console (lower-than-low shadows, somewhere between PC's low and medium LoDs) because the developer can tune these to the console's specific hardware - so you cannot compare one directly with the other, because you cannot match the same settings exactly.

Another thing devs can do is change the update rates of various things at various distances - enemies/characters animating at 50% frame rate beyond a certain distance, reflections rendered at 50% resolution, mirrors or rain on windows in car racing games updating at 50% frame rate to save on processing - whereas on PC these are ALWAYS at the same frame rate as the game. There are 'tricks' developers can use on console to get much better performance out of much weaker hardware than on PC, purely because the console is 'static' and the game can be tuned specifically for that hardware. If it doesn't run on your PC hardware at a level you want, you can upgrade.

Another difference is that the OS and any background operations on a console run on their own portion of the hardware. On a PC, you have background operations and system software running that can affect the way the game runs - and since you have a 'choice' of which AV, which anti-malware tool, and whether you are running an email client or other applications, those will be using up some of the resources available for gaming, which will affect the performance metrics.

PC GPU makers also build GPUs for that year, for games of that era, knowing that most people upgrade every 3 years or so. A console is built to last around 6+ years. It's not in their interest to make a GPU last 10 years, as they want you to upgrade, and they ship new architectures every couple of years too.

Red Dead 2 and Resident Evil on console run objects/enemies in the distance at a different frame rate - 50% of the game's frame rate. Hitman 2 renders reflections at 50% resolution, and Forza Motorsport runs mirrors and rain effects at 50% frame rate... It's NOT like-for-like with PC at all. It is impossible to compare like for like because certain aspects are different and the PC does not have the option to match the settings exactly. That's not including things like chequerboard rendering, which many games on PC do not offer but developers implement on console...
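The half-rate trick described above can be sketched in a few lines. This is purely a hypothetical illustration of the technique - the function names, the distance threshold, and the 60fps figure are all invented for the example, not taken from any of the games mentioned:

```python
# Hypothetical sketch of the console "half-rate" trick: entities beyond a
# distance threshold are only simulated every other frame, halving their
# effective animation rate to save CPU/GPU time. All values are invented.

FAR_DISTANCE = 50.0  # metres; an assumed tuning value, not from a real engine

def should_update(entity_distance: float, frame_index: int) -> bool:
    """Return True if this entity should be simulated on this frame."""
    if entity_distance <= FAR_DISTANCE:
        return True               # near entities update every frame (full rate)
    return frame_index % 2 == 0   # far entities update every other frame (half rate)

# Over 60 frames (one second at 60fps), a distant enemy is updated 30 times,
# i.e. it effectively animates at 30fps while the game still renders at 60fps.
updates = sum(should_update(80.0, f) for f in range(60))
```

The point of the sketch is that nothing about the rendered frame rate changes; only the simulation rate of far-away objects drops, which is why the console version cannot be matched setting-for-setting on PC.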

A pessimist is just an optimist with experience!

Why can't life be like gaming? Why can't I restart from an earlier checkpoint??

Feel free to add me but please send a message so I know where you know me from...

PSN: TaimeDowne

nessisonett

@Bakuhatsu ....did you just create a new account to make it seem like you have people agreeing with you? You have the exact same speech mannerisms as the person who created this thread, so yes you did. You’re also completely missing my point, if a specific PC build with an RTX 3080 was targeted the same way as a PS5 or Xbox was, say 6 years in the future, it would run the game just about as well. The complete disregard for about 4 different people explaining this pointless argument is not a good look. And then creating a sock puppet to agree with you is the final nail in your coffin. Good day, you deeply sad little man.


nessisonett

@Bakuhatsu There are a million other bloody factors than VRAM. You also don’t measure performance by how pretty the graphics are. This is utterly pointless; wheelie bins have a better grasp of basic computing principles.


nessisonett

@Bakuhatsu ....what are you even on about? Literally just cap your frame rate on PC at 30 or 60 FPS. That’s what they’re there for. Again, the performance of consoles is not the issue here, the specs are fixed and therefore devs spend more time optimising for that specific platform. I just don’t even get what you’re saying, ‘why would god of war even score a masterpiece’ lmao yes, we give awards to games because of how well they utilise VRAM on their console master race. I’m not saying that consoles are some sort of laggy mess, I literally have no idea where you’re getting that from because I’ve been saying the polar opposite.
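The frame cap mentioned above boils down to sleeping away the unused portion of each frame's time budget. A minimal sketch, with all names illustrative (a real game loop would pass the result to `time.sleep()` each frame):

```python
# Minimal sketch of a frame-rate cap: after doing a frame's work, sleep for
# whatever remains of the frame's time budget so the loop never runs faster
# than the target rate. Names and structure are illustrative, not from any
# real engine.

def frame_budget(target_fps: int) -> float:
    """Seconds available per frame at the target rate."""
    return 1.0 / target_fps

def time_to_sleep(frame_start: float, now: float, target_fps: int) -> float:
    """How long to sleep so the frame lasts at least the full budget."""
    elapsed = now - frame_start
    return max(0.0, frame_budget(target_fps) - elapsed)

# Example: a frame whose work took 5 ms under a 60fps cap sleeps for the
# remaining ~11.67 ms of its ~16.67 ms budget.
sleep_s = time_to_sleep(frame_start=0.0, now=0.005, target_fps=60)
```

If the frame's work already overran the budget, the function returns 0.0 and the loop simply runs late, which is exactly the behaviour a simple in-game cap gives you.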


Anti-Matter

Is this a PC vs PlayStation specs debate?


SirAngry

@Bakuhatsu I don't know whether English is your first language or not, and maybe that's part of the problem, but I can say your reading comprehension is p*ss poor, as you've failed to pick up on any of the points anyone else has raised. In fact, the only thing that appears worse than your reading comprehension is your technical understanding. You are way out of your depth here, talking about hypothetical situations that just don't exist. You are creating strawman arguments. You aren't smart enough to understand how little you understand, and that, my friend, is the Dunning–Kruger effect in full effect. Every single point I raised would have had a first-year Computer Science student understanding the utter pointlessness of what you've said. I give up. You don't know enough to even understand that you don't know enough...

But if the RTX 2080ti were in either console, it absolutely would run whatever game is coming out in 2026... because ignoring either console at that point would most likely be commercial suicide given the install base. That alone shows you just don't understand the topic.

[Edited by SirAngry]


nessisonett

@Bakuhatsu Deeeear lord, you’re completely glossing over the gist of what we’re saying.

A card released at the beginning of the PS4’s lifecycle does not play recent games well, because developers do not optimise for graphics cards that old when newer ones are on the market.

If PS4 got a yearly update in the same vein, a base PS4 would not play games as well as the newest model due to developers shifting focus with technological advancements.

If Nvidia only released one graphics card from now until the end of the generation, that card would run games just as well as the PS5, due to targeted optimisation and developers having clear constraints.

Let’s say I lost both the use of both my arms. One, I lost in a shark attack. The other was dissolved in acid. Which one was better? Well the shark attack was painful and terrifying but the pain was fleeting. The acid was slow and excruciating but I haven’t completely lost that limb, it’s just all burned up. Both got the job done in different ways but at the end of the day, I lost my f*cking arms.


nessisonett

@Bakuhatsu Also, just to clarify, you chose Assassin’s Creed Odyssey as your example. Ubisoft partnered with AMD, and as such the game is better optimised for AMD cards than for Nvidia ones, helped by continual driver updates. Yet another idiosyncrasy of the PC market, and another reason why comparing it to console is so much more complicated than you’ll ever know.

[Edited by nessisonett]


SirAngry

@nessisonett Give up, the guy is the stupidest person I've ever come across on the Internet, and that's saying something. He doesn't even know how stupid he is, or how utterly irrelevant all the things he's saying are. He doesn't even know how to define "power". Hell, in all my time in this industry I have never once met a programmer or system architect who uses the word "power", because it's something that just isn't a metric. We might talk about resources or hardware acceleration, but power? No. That's for people who don't understand the industry at all. I've tried steering clear of using the word on here and other places, because it's not a metric.

@Bakuhatsu One last thing to try and get it through your thick skull. Do you think the XSS will run next-gen games in 2026? Because it has far fewer computational resources than the RTX 2080ti on the GPU side. I can tell you the XSS will run those games, because Microsoft have stipulated that it has to run them for as long as the XSX is relevant. They've given both a 7-year life cycle target. Were I a GPU manufacturer I'd be pissed at Microsoft, because the XSS is going to keep fairly old GPUs in play for much longer... but you're probably too thick to understand why.

[Edited by SirAngry]


SirAngry

@Bakuhatsu You have just had every point sail way above your head in this thread. Part of me thinks it must be utterly amazing living in your world, where you are just so amazingly ignorant of, well... everything. The point you just raised about console 'secret sauce' is literally the point we've been trying to ram through your thick skull from the start. The hardware is useless without the software, so your argument about "power" or being more "powerful" is utterly irrelevant, you nimrod. You might have finally come round to what @nessisonett was trying to tell you from the very beginning, but I'm not going to let you claim it was your argument, because it wasn't. All you've just done is prove us right. In 7 years' time I'll probably still be getting tool updates for both consoles; I doubt the same will be true even of the RTX 3000 cards. I'm done trying to get you to see that your redundant argument about "power" is indeed redundant. I'm putting you on ignore, because I don't want to catch stupid.


heyoo

New Update: Gears Tactics. See Opening post.

I'm back. My other account just got banned as a duplicate. Thankfully I got this one back. My replies are all deleted though. Hopefully you read all my replies before they got deleted.

Another update: Digital Foundry's analysis of Demon's Souls shows 1440p at 60fps and 4K at 30fps. And here we go, just as expected. It has started as early as now.

Let's just wait and see. I believe it's native 4K 30fps with ray tracing on and native 4K 60fps without ray tracing. That's it. Hopefully Bluepoint will just say the "real" resolution Demon's Souls runs at on PS5.

I may be wrong here, but still, the only final legit proof will be in future games, 3-4 years and up from now, comparing the RTX 2080ti against the Series X/PS5; you can also include the RTX 3080 with 10GB VRAM too.

[Edited by heyoo]


nessisonett

@heyoo You literally said “there’s no such thing as targeted optimisation”. Please explain your credentials and reasoning as this is like somebody who watches DIY SOS explaining to a builder that there’s no such thing as cement.


heyoo

@nessisonett I guess you didn't read everything.

You were saying that games, especially future games, are just optimized better for the console, and that's why the console runs them better than an old high-end GPU, right? And I said it's not optimization, because the specs are all there from the very start.

The PS4 has been a 5GB VRAM machine since day one, and this is the only reason it still plays games like Marvel's Avengers, Valhalla, Cyberpunk, Cold War, and even games in 2022, all looking A+ beautiful.

So you think a 1.5GB VRAM GPU from the PS3 era, with an old GPU architecture, could play PS4-level graphics if devs optimized the games for that card? No. No matter what optimization you do, it will still play at super ugly graphics and 10fps, or it won't even run at all (crashes at startup), simply because its GPU architecture is too old to understand anything.

So how about the 2013 GTX Titan? Do you think it needs optimization to run PS4 games until 2022? No, it does not. The Titan's specs alone say it's more than enough to play any PS4 game from the very start until the end of the PS4 cycle. The Titan has 6GB of VRAM to begin with. It doesn't need optimization, just like the PS4. When games require 7GB of VRAM minimum even at 1080p, then the PS4 and the Titan will be obsolete too.

It's not because devs don't optimize games for old high-end cards; it's simply because those cards have inferior specs, like the architecture itself or the VRAM, which is the main major factor in the long run. Tflops are only the big factor if a game's VRAM requirements are right at, or less than, what a certain GPU has. When games become more demanding, VRAM affects fps more potently than tflops. The base PS4's mere 1.8 tflops did not stop it from playing MK11 at native 1080p, locked 60fps, looking A+ beautiful, while the GTX 770, with 3.2 tflops and 2GB VRAM, not only plays on ugly graphics but also runs at an inferior 40fps. Don't tell me this has something to do with optimization? No, the GTX 770 simply has inferior VRAM.
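heyoo's rule of thumb here can be written down explicitly. Note that this encodes *his claim* (VRAM as the single binding constraint), which the other posters dispute; the 3GB game requirement is a made-up number for illustration, and the other figures are taken from his own post:

```python
# Illustration of heyoo's claimed heuristic (his claim, not established fact):
# a GPU stays viable while a game's minimum VRAM requirement fits in its VRAM;
# once the requirement exceeds it, performance collapses regardless of tflops.

def is_obsolete(gpu_vram_gb: float, game_min_vram_gb: float) -> bool:
    """heyoo's claim: the card is 'obsolete' once the game needs more VRAM."""
    return game_min_vram_gb > gpu_vram_gb

# Figures from his post; the 3GB requirement is a hypothetical game.
gtx_770_vram = 2.0
titan_2013_vram = 6.0

obsolete_770 = is_obsolete(gtx_770_vram, game_min_vram_gb=3.0)      # True
obsolete_titan = is_obsolete(titan_2013_vram, game_min_vram_gb=3.0)  # False
```

Under this model the 770 fails and the Titan survives, which matches his examples; the rest of the thread argues that this single-variable model is exactly what's wrong with the argument.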

I'm talking about real-life performance here. What is this cement you are talking about? Give me evidence.

[Edited by heyoo]


nessisonett

@heyoo Mate, you have no idea what you’re talking about. You literally don’t. You’re persisting with nonsense arguments about nonsense figures generated by nonsense ‘research’. You’re like a child who learned the word VRAM and thinks it’s the second bloody coming of Christ. This is pointless.


TheFrenchiestFry

@heyoo Dude, devs optimize games to work with older high-end cards ALL THE TIME.

The GTX 10 series is pretty old compared to the 20 and especially the 30 series cards, yet current and next-gen games still list the 1060 and onwards even in their RECOMMENDED specs for a smoother experience during gameplay; those GPUs are still supported and the games are still optimized for them.


PSN: phantom_sees

This topic has been archived, no further posts can be added.