Forums

Topic: PS5/SERIESX is more powerful than a RTX 2080ti. Proofs and evidences are all here!

Posts 41 to 60 of 90

heyoo

@nessisonett I guess you didn't read everything.

You were saying that games, especially future games, are just optimized better for the console, and that's why the console runs them better than an old high-end GPU, right? And I said it's not optimization, because the specs are all there from the very start.

The PS4 has been a 5GB VRAM machine since day one, and this is the only reason it still plays games like Marvel's Avengers, Valhalla, Cyberpunk, Cold War, and even games in 2022, all looking A+ beautiful.

So you think a 1.5GB VRAM GPU from the PS3 era, with an old GPU architecture, can play PS4-level graphics if devs optimize the games for that card? No. No matter what optimization you do, it will still run with super ugly graphics at 10fps, or it won't run at all (it crashes at startup), simply because a GPU architecture that old can't handle any of it.

So how about the 2013 GTX Titan? Do you think it needs optimization to run PS4 games through 2022? No, it does not. The Titan's specs alone say it's more than enough to play any PS4 game from the very start until the end of the PS4 cycle. The Titan has 6GB of VRAM to begin with. It doesn't need optimization, just like the PS4. When games require a minimum of 7GB of VRAM even at 1080p, then the PS4 and the Titan will be obsolete too.

It's not that devs don't optimize games for old high-end cards; it's simply that those cards have inferior specs, like the architecture itself or the VRAM, which is the major factor in the long run. Tflops are only the big factor when a game's VRAM requirements are at or below what a certain GPU has. When games become more demanding, VRAM affects fps more potently than tflops. The base PS4's mere 1.8 tflops did not stop it from playing MK11 at native 1080p at a locked 60fps, looking A+ beautiful, while the GTX 770, with 3.2 tflops and 2GB of VRAM, not only plays on ugly graphics but also at an inferior 40fps. Don't tell me this has something to do with optimization? No, the GTX 770 simply has inferior VRAM.

I'm talking about real-life performance here. What is this cement you are talking about? Give me evidence.

Edited on by heyoo

heyoo

nessisonett

@heyoo Mate, you have no idea what you’re talking about. You literally don’t. You’re persisting with nonsense arguments about nonsense figures generated by nonsense ‘research’. You’re like a child who learned the word VRAM and thinks it’s the second bloody coming of Christ. This is pointless.

Plumbing’s just Lego innit. Water Lego.

Trans rights are human rights.

TheFrenchiestFry

@heyoo Dude, devs optimize games to work with older high-end cards ALL THE TIME.

The GTX 10 series is pretty old compared to the 20 and especially the 30 series cards, yet current and next-gen games still list the 1060 and onwards even as part of their RECOMMENDED specs for a smoother experience during gameplay. Those GPUs are still supported, and the games are optimized for them.

TheFrenchiestFry

PSN: phantom_sees

nessisonett

@TheFrenchiestFry Yep, Cyberpunk has the 1060 as its recommended spec. That came built into my laptop 3 years ago, and it plays games 'better' with more 'power' than a PS4.

heyoo

@nessisonett it's all facts though. You can literally see how VRAM affects everything in the game's own graphics options.

Here's a rough, quick example: say you have a 4GB VRAM card.

Using only 25% of the 4GB card on graphics settings, you get 66fps.
Using 50% of the 4GB card on graphics settings, you get 44fps.
Even just near the max level, you will start playing below 30fps: 80% VRAM used, 27-34fps.
Using more than the GPU's maximum VRAM results in unplayable fps: 120% VRAM, 18-24fps.

And if you don't even know this, then there's no point discussing. This is so basic.

Do you think a 1/2/3GB VRAM GPU can play Ghost of Tsushima or The Last of Us Part II graphics? No.

Don't even worry about whether I'm a kid or an alien. It doesn't matter. Let's talk facts.

nessisonett

@heyoo You could have 450GB VRAM on a system with a calculator’s CPU. Would the game run? No, no it wouldn’t.

heyoo

@nessisonett Cyberpunk is already a semi-next-gen game; the PS4 era is near its end. Yes, the GTX 1060 6GB is the recommended spec for Cyberpunk, and I believe that's for 1080p at mostly high graphics. The R9 290X 4GB being the minimum GPU means it will be playing on all low, and guess what, the PS4 is 5GB VRAM, so it will be playing at slightly higher settings. Cyberpunk is semi-next-gen already; even at low/medium settings it will still look good, even better than older (2017 and below) games on ultra settings.

heyoo

@nessisonett

i3 dual core + 450GB VRAM (of course with the latest PS10 GPU architecture) = PS10 graphics, 720p, 27fps.

1GB VRAM + i29 CPU with 128 cores = PS3 graphics, 720p, 2040fps.

Yeah, I'd rather play the PS10 graphics at 720p 27fps than ugly graphics at 720p with unnecessary, pointless fps.

nessisonett

@heyoo No. That is not how games work. It just isn't. VRAM doesn't directly equate to how good a game looks, and CPU doesn't directly equate to how well it runs. You have literally no understanding of basic computing. This is insane; you wouldn't do this with any other profession, and this is a subject somebody can study for 5-6 years and still be a total greenhorn in.

heyoo

@TheFrenchiestFry

The 1000 series, especially the GTX 1060 and up, is more than enough to play games at 1080p. It's not that they're optimized; it's that the GTX 1060 6GB has more than enough VRAM to play any PS4-era game easily.

TheFrenchiestFry

@heyoo VRAM does not equate to game performance. It's just an indicator of how much data a game can have on screen at any one time. Performance is a task left to the CPU and GPU. With enough dedicated memory, the VRAM can show you a game on screen, but whether it performs well is down to how well the game itself is processed by the CPU in conjunction with the GPU, even if the GPU does most of the heavy lifting in that regard.
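That division of labour can be sketched as a toy model (every number below is invented for illustration, not taken from any real game or engine): frame time is set by whichever of the CPU or GPU takes longer, and VRAM capacity only starts to hurt once a frame's working set spills past it and data has to stream over the bus.

```python
# Toy frame-time model (purely illustrative, invented numbers).
# Frame time is bounded by the slower of CPU and GPU work; VRAM capacity
# only matters once the working set no longer fits and must be streamed in.

def frame_time_ms(cpu_ms, gpu_ms, working_set_gb, vram_gb,
                  spill_penalty_ms_per_gb=8.0):
    """Estimated frame time in milliseconds."""
    base = max(cpu_ms, gpu_ms)                   # CPU and GPU work overlap
    spill = max(0.0, working_set_gb - vram_gb)   # data that doesn't fit in VRAM
    return base + spill * spill_penalty_ms_per_gb

def fps(t_ms):
    return 1000.0 / t_ms

# Plenty of VRAM: performance is decided by CPU/GPU speed, not capacity.
print(round(fps(frame_time_ms(8.0, 12.0, 3.5, 6.0)), 1))  # → 83.3 (GPU-bound)

# Same CPU/GPU speed, but the working set exceeds VRAM: fps drops.
print(round(fps(frame_time_ms(8.0, 12.0, 7.0, 6.0)), 1))  # → 50.0
```

Note how more VRAM than the working set needs buys nothing in this model, which is the point being made: capacity is a threshold, not a speed.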

TheFrenchiestFry

@heyoo Cyberpunk isn't just a PS4 game, however. Considering how intensive The Witcher 3 was even on PC, especially with features like HairWorks enabled, Cyberpunk running on a new engine, with a denser world populated by more people and environmental detail, means it'll probably push the current-gen systems to their absolute limits. It was clearly designed with the intent to future-proof it for higher-end PCs and next-gen systems.

Edited on by TheFrenchiestFry

nessisonett

@TheFrenchiestFry Yeah exactly, you could be playing a game at 4K 60FPS but with Superman 64 fog and pop-in. There are so many factors to take into account but it’s like talking to a brick wall.

TheFrenchiestFry

@nessisonett I think DOOM Eternal has the 1050 as its minimum GPU requirement and either the 1060 or 1660 as the recommended one. Death Stranding definitely recommends the 1060 as well, meaning early next-gen games will probably continue supporting the 10 and 16 series cards before gradually transitioning to taking full advantage of the 20 and 30 series GPUs.

HallowMoonshadow

How about everyone just ignores the troll before you guys pull your hair out in frustration?

This is clearly going nowhere 😅

Edited on by HallowMoonshadow

Previously known as Foxy-Goddess-Scotchy

"You don't have to save the world to find meaning in life. Sometimes all you need is something simple, like someone to take care of"

TheFrenchiestFry

@Foxy-Goddess-Scotchy I honestly can't tell if he's trolling or if he's just genuinely that uninformed about how PC/console specs actually work.

He kept saying that VRAM meant a console was more powerful, like he thought he was making a genuine point instead of coming off as an illiterate troll.

nessisonett

@Foxy-Goddess-Scotchy If they nuke the thread, fair enough, but it really grinds my gears that he's espousing literal nonsense and presenting it as fact. It would be like somebody claiming the best way to teach is locking kids in cupboards and snorting coke underneath your desk. You would feel rather inclined to completely refute what is complete nonsense about something you do every day and know a lot more about than somebody who watched a couple of videos on YouTube.

HallowMoonshadow

Oh, I understand the frustration and the need to correct them for sure, @nessisonett, and like @TheFrenchiestFry said, I'll admit there's a possibility the person genuinely doesn't understand what they're saying (I know diddly squat about this computer stuff and I'll happily admit that).

... But it's been four days of you guys trying to explain it to this guy... It might be a bit of a lost cause.

If you wanna keep trying go ahead of course... I ain't gonna stop ya 😅

BAMozzy

Of course VRAM makes ALL the difference... LMAO

And again, devs can optimise games for console by using lower-res textures, lower-quality assets, and even flat 2D boards for more distant objects, cutting down the frame's memory footprint to fit the console's tighter VRAM constraints. So again, you cannot compare a PC to a console, because devs are optimising the game specifically for the console.

Whilst you may not be able to tell, from a still shot or a side-by-side in motion, whether a tree in the far distance is a flat board rather than the 3D object it might be on PC, the amount of VRAM the same frame requires on the PC version could be much higher, especially on the Ultra settings most benchmarks use. You cannot tell whether the assets are exactly the same or were created specifically for the console. There are too many differences to compare like for like.

A dev can spend months optimising a console version to 'look' good enough and 'run' at a specific frame rate within the limitations of that hardware, including VRAM. The PC version may need more VRAM for that very reason. Think about it: a PC has RAM on the GPU AND RAM in the system, while a console has one pool of RAM for EVERYTHING. It's not going to dump everything from RAM to free up space, only to load it all back in to compose the next frame. The PS4's 8GB of RAM is SHARED between the system and games (of which 5-5.5GB is for games), and shared between the CPU and GPU; it doesn't have ANY dedicated VRAM at all.
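As a back-of-the-envelope sketch of that shared budget (the OS reservation roughly matches the 5-5.5GB figure quoted above for a base PS4; the CPU-side figure is an invented example, not any real game's numbers):

```python
# Back-of-the-envelope unified-memory budget for a base PS4
# (8GB GDDR5 shared; roughly 2.5-3GB reserved for the OS).
# The CPU/GPU split below is a hypothetical example.

TOTAL_RAM_GB = 8.0
OS_RESERVED_GB = 2.5          # leaves ~5.5GB for the game

game_budget = TOTAL_RAM_GB - OS_RESERVED_GB

# A game must fit BOTH its CPU-side data (logic, audio, AI) and its
# GPU-side data (textures, buffers) in that one pool.
cpu_side_gb = 1.5             # hypothetical CPU-side working set
gpu_side_gb = game_budget - cpu_side_gb

print(f"game budget: {game_budget}GB, of which ~{gpu_side_gb}GB acts as 'VRAM'")
# → game budget: 5.5GB, of which ~4.0GB acts as 'VRAM'
```

So "5GB of VRAM" for a console is really the whole game budget; how much of it acts as video memory depends on what the CPU side needs that frame, which is exactly why it can't be compared one-to-one with a card's dedicated VRAM.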

The fact that consoles don't have dedicated VRAM at all negates your entire argument, and it's incredibly pointless to keep going on about it...

A pessimist is just an optimist with experience!

Why can't life be like gaming? Why can't I restart from an earlier checkpoint??

Feel free to add me but please send a message so I know where you know me from...

PSN: TaimeDowne

Ralizah

nessisonett wrote:

Let's say I lost the use of both my arms. One, I lost in a shark attack. The other was dissolved in acid. Which one was better? Well, the shark attack was painful and terrifying, but the pain was fleeting. The acid was slow and excruciating, but I haven't completely lost that limb, it's just all burned up. Both got the job done in different ways, but at the end of the day, I lost my f*cking arms.

This thread is amazing for out-of-context quotes.

Edited on by Ralizah

Currently Playing: Yakuza Kiwami 2 (SD)

PSN: Ralizah

This topic has been archived, no further posts can be added.