
Topic: PS5/SERIESX is more powerful than an RTX 2080ti. Proof and evidence are all here!

Posts 61 to 80 of 90

JohnnyShoulder

I'm with @Foxy-Goddess-Scotchy, I think this has gone well past its sell-by date and it's time to move on. Just a suggestion, and no need for anyone to get upset thinking I'm telling them what to do.

Life is more fun when you help people succeed, instead of wishing them to fail.

Better to remain silent and be thought a fool than to speak and remove all doubt.

PSN: JohnnyShoulder

JohnnyBastos

This thread is REALLY weird... but has a kind of car-crash appeal that I can't look away from.

JohnnyBastos

Gremio108

I can't believe the guy set up an account in the name of @kyleforrester87 all those years ago, just to agree with himself in this thread. He was really playing the long game there.

Good job, Parappa. You can go on to the next stage now.

PSN: Hallodandy

JohnnyShoulder

@Gremio108 Always thought there was something dodgy about him!

Life is more fun when you help people succeed, instead of wishing them to fail.

Better to remain silent and be thought a fool than to speak and remove all doubt.

PSN: JohnnyShoulder

nessisonett

@SirAngry If I wanted to speak to a guy with an eyepatch and pervy grin, I’d be on Grindr.

Plumbing’s just Lego innit. Water Lego.

Trans rights are human rights.

heyoo

@nessisonett What? Have you seen what a GPU with 512MB of video RAM or less plays at? PS2 graphics. 1GB (the old standard) to 1.5GB of VRAM gets you PS3-era graphics at 720-1080p. Games with a 2GB minimum VRAM requirement can still be played by the PS3. PS3 magic.

The PS4 era started at a lowly 1.5GB to 2GB of VRAM (the previous standard), moved to 4GB (the standard recommended by most games since 2017), and recently reached 6GB with Cyberpunk.

VRAM is the main ingredient for graphics and resolution. Increasing the resolution requires more VRAM. With 512MB-1.5GB of VRAM you mostly play at 720p; with 4GB of VRAM you can mostly play at native 1080p easily. For native 4K you need at least an 8GB card for demanding games. That is why the GTX 1060 cannot do 4K on most demanding games the way other 8GB cards can.
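As a rough back-of-the-envelope sketch of how raw frame-buffer memory scales with resolution (the bytes-per-pixel and render-target counts below are assumed illustrative values, not figures from any real engine):

```python
# Rough frame-buffer memory estimate: resolution scaling only.
# Assumes 4 bytes per pixel per render target and a handful of
# targets (colour, depth, a few G-buffer layers). Illustrative
# numbers only -- in real games, textures dominate VRAM usage.

BYTES_PER_PIXEL = 4      # e.g. an RGBA8 colour target
RENDER_TARGETS = 6       # assumed: colour + depth + 4 G-buffer layers

resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

for name, (w, h) in resolutions.items():
    mib = w * h * BYTES_PER_PIXEL * RENDER_TARGETS / 2**20
    print(f"{name:>6}: {mib:6.1f} MiB of render targets")
```

Even then, 4K render targets come to only a few hundred MiB, roughly 9x the 720p figure, so most of a card's VRAM actually goes to textures, which is why higher resolutions also tend to ship with higher-resolution texture packs.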

Have you even owned GPUs with different VRAM amounts and tested them? I have a 2GB GTX 950; it plays fine in games like Fallout, The Witcher 3 and BioShock Infinite, because those games don't require more than 2GB of VRAM to play at at least high settings. But the GTX ran on ugly graphics the moment I installed FFXV, and the same thing happened later with Assassin's Creed Origins and every other game since 2017. Its limited 2GB of VRAM can't let it play on beautiful graphics like the PS4. So I played mostly on PS4/PS4 Pro, and then I bought a GTX 1060 and got the best of both worlds.

Edited on by heyoo

heyoo

nessisonett

[image: Untitled]

Plumbing’s just Lego innit. Water Lego.

Trans rights are human rights.

heyoo

@TheFrenchiestFry You mean optimised, as in running better? Of course, but I'm talking about graphics here.

Consoles have a standard for graphics. It's unlike PC, where lowering settings to an even uglier level just to get a performance boost is fine. That is not the way of the console: consoles have high standards for graphics, and a game should always look beautiful. Now, if a PC GPU doesn't play at the same graphics settings as the consoles, then it's not a competition. So in this case VRAM matters first: to make a game look prettier you need more VRAM.

heyoo

heyoo

@TheFrenchiestFry The 1000-series cards are on the level of the PS4/PS4 Pro/Xbox One X/Xbox One.

When they totally stop producing games for the PS4/Xbox One/Xbox One X/PS4 Pro, simply because games would look downright super ugly on those old consoles even at 1080p, the same thing will happen to the GTX 1080: it will play on super ugly graphics at 1080p. So the 1000 series is obsolete as well, unless you don't mind playing on super ugly PS3-looking graphics during the PS5 era. I guess console gaming has higher standards when it comes to graphics if you're still playing on PC with PS3-level graphics during the PS5 era.

Edited on by heyoo

heyoo

heyoo

@nessisonett The truth hurts, or what? I guess you cannot deal with the most basic truth: that VRAM (along with evolving GPU architecture, of course) allows you to play games at better graphics and higher resolutions.

Yes, there will be times when games look like real life, when even 1440p requires more than 10GB of VRAM minimum to look pretty. The 10GB RTX 3080 will struggle hard and will play only on ugly graphics at 1440p, especially at 4K. And it seems the RTX 3080 will end up performing like the Series S five years and more from now, mostly playing at 1080p. Remember, the Series S only has 10GB of shared flexible memory. That's it! Series S = RTX 3080 10GB: in the long run, both will play games mostly at native 1080p (with DLSS/console upscaling to 1440p/4K) only. I called it here first! Just wait and see, five years from now.

Red Dead Redemption 4 in 2028 will look four times more beautiful and lifelike than Red Dead 2. PS5/Series X (using all 15-16GB of memory as VRAM): native 4K at 30fps, looking like real life. Series S/RTX 3080: native 1080p at 30fps, looking like real life. The 10GB RTX 3080 at native 4K: 14fps on ugly all-low-and-off settings on Red Dead 4.

As you can see, VRAM matters most in graphics, and the PS5/Series X are future-proof for handling any graphics 4-9 years from now. They can do 13.5GB of VRAM at minimum and can increase it with the remaining 2.5GB of flexible memory if needed. And who knows if the PS5/Series X still have some secret memory inside, because the PS3 was only 512MB of XDR shared memory on "paper" specs, yet it played PC games that require 1.5GB-2GB of VRAM just fine. Unbelievable. Like I said, the PS3 could actually have had 1.5-2GB of VRAM that they just hid as "512MB XDR memory". XDR is said to be way more advanced than PC DDR3 memory too. There is no other way to explain this.

My tip: get an RTX 3000/4000-series card with at least 14GB of VRAM, and you're set to last until the end of the PS5/Series X console cycle without your graphics being compromised below console level.

So all these 10GB VRAM cards will be inferior to the next-gen consoles in the long run, while 8GB cards will become obsolete even faster. The RTX 2080 Ti has 11GB of VRAM, but its GPU architecture is older than Ampere and RDNA 2, and that makes it inferior too.

Edited on by heyoo

heyoo

JohnnyBastos

@heyoo I think you need to explain yourself in much more detail. Maybe these guys just can't keep up with your mega-brain. That is probably the reason why all of the experts on this forum think you're talking bollocks.

JohnnyBastos

BAMozzy

Consoles obviously can't play games at all - no dedicated VRAM...

Better cancel your PS5s and throw your PS4/Pros away, as not one of them has any dedicated VRAM at all. The PS4 Pro - a console that can do 4K/60 with its 5.5GB of gaming RAM shared between CPU and GPU and absolutely NO dedicated VRAM whatsoever...

The PS5 does NOT have 13.5GB of VRAM either - it has 16GB of RAM, of which some will be used for the system only - maybe 2.5GB, like the Series X (I don't think it's been stated yet how much is reserved for the system) - and the rest is shared between CPU and GPU for gaming. That's NOT VRAM: the PS5, like the PS4/Pro, the XB1 S/X and the Series S/X, does NOT have dedicated VRAM at all.

Whatever you are smoking @heyoo, you obviously have no idea about this at all. You have NOT factored in that console games are OPTIMISED by the developer to work within the limitations of the platform's hardware. They can reduce the cost of a frame by cutting distant objects down to nothing more than 2D, reducing polygon counts further, reducing LoDs etc., all to shrink the amount of RAM being used like VRAM. There is no way a PS4 Pro was able to use 5.5GB of RAM as 'VRAM', because that RAM was holding all the textures, assets etc. for the area, and it certainly wasn't going to 'dump' those so the full 5.5GB could be used as VRAM.

Even with the 'blistering' speed of the SSDs, the PS5 cannot stream data into the GPU anywhere near as fast as RAM. In fact, even with 'Oodle' and 'Kraken', it's slower than the RAM in the Switch, which in turn is less than 1/20th of the 10GB block in a Series X - that's 560GB/s. As impressive as the SSD is, with a 'theoretical' peak of around 17GB/s with Oodle and a raw 5.5GB/s, it's slower than the Switch's RAM at 25.6GB/s, quite a LOT slower than the PS4's 176GB/s, and vastly slower than the PS5's RAM at 448GB/s. So it's not going to be used in place of RAM when the GPU needs all that data instantly to render the frame - all those assets and textures used to compile a frame in 16.66ms (at 60fps). There is NO way the PS5 can use 13.5GB of its RAM as VRAM...
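To put those quoted bandwidth figures side by side, here is a quick sketch of how much data each link could theoretically move within a single 60fps frame budget (the numbers are the ones above; the per-frame arithmetic is just bandwidth x 16.66ms):

```python
# Bandwidth figures as quoted above (GB/s). "Per frame" shows how much
# data each link could theoretically move inside one 16.66ms frame at
# 60fps -- why an SSD can't stand in for RAM/VRAM during rendering.

bandwidth_gbs = {
    "PS5 SSD (raw)":               5.5,
    "PS5 SSD (Oodle/Kraken peak)": 17.0,
    "Switch RAM":                  25.6,
    "PS4 RAM":                     176.0,
    "PS5 RAM":                     448.0,
    "Series X RAM (10GB block)":   560.0,
}

FRAME_SECONDS = 1 / 60  # 16.66ms frame budget at 60fps

for name, gbs in bandwidth_gbs.items():
    mb_per_frame = gbs * FRAME_SECONDS * 1000
    print(f"{name:29} {gbs:6.1f} GB/s -> {mb_per_frame:7.1f} MB per frame")
```

Even at its compressed peak, the SSD moves roughly 283MB per 60fps frame, against roughly 7.5GB for the PS5's RAM, which is the whole point: the GPU needs its working set resident in RAM, not streaming off the drive.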

A pessimist is just an optimist with experience!

Why can't life be like gaming? Why can't I restart from an earlier checkpoint??

Feel free to add me but please send a message so I know where you know me from...

PSN: TaimeDowne

SirAngry

@BAMozzy While in principle I agree with what you're saying, I have to correct you: the SSD in the PS5 is actually plenty quick enough to draw specific data into cache to be used - not as standard for everything, but for certain assets? Yeah, sure. In fact, it's damn useful for emergency calls. The XSS|X isn't shabby in that department either, just not as useful. However, the fact that everyone is still going on about VRAM in relation to consoles is mind-bending to me. As you've pointed out, not since the PS3 have we seen a mainstream console with dedicated VRAM - with its whopping 256MB of GDDR3 RAM at 22.4GB/s... thanks Sony, I'm still suffering from PTSD. In terms of frames and usage of data, cache is way more important anyway... and in terms of getting said frame to a screen, ROPs are way, way more crucial. It's almost like there's a fundamental misunderstanding of how hardware and software work together to generate games... So why aren't we discussing all of that? Oh right, I forgot, the OP is a moron.

PS. @JohnnyBastos you're a bad, bad man. You do realise you are going straight to hell.

SirAngry

BAMozzy

@SirAngry I am sure that certain things can be pulled straight from the SSD if needed - audio files could play direct from the SSD, as there is more than adequate bandwidth for that, and of course texture files too, up to a limit. But my point was more that the RAM in a console is shared between system and games, shared between CPU and GPU, with no dedicated VRAM at all. The PS4 Pro, for example, can deliver native 4K/60 with just 5.5GB of RAM shared between CPU and GPU, and that has no dedicated VRAM at all. The PS5 isn't going to use the SSD in place of the RAM and use the 13.5GB (if that is what's allocated for games) as VRAM, as 'claimed' by the idiot...

Anyway, I know you know the rest but for anyone else - and especially the uneducated person spouting complete and utter nonsense...

The reason 'textures' were not as high-res on the PS4 Pro compared to 'some' games on Xbox One X was the more limited RAM - 5.5GB vs 9GB for games. 5.5GB of RAM was certainly good enough for 1440p and above, and games like Spider-Man, God of War, Horizon: Zero Dawn, The Last of Us Part II, Uncharted 4 and Lost Legacy all looked AMAZING at 1440p+. I know God of War and H:ZD used chequerboard rendering, but they were still rendering more pixels than 1440p - all with just 5.5GB of RAM shared between CPU and GPU and NO dedicated VRAM at all.

The Series S may have just 10GB of RAM, with probably about 8GB for gaming, and it targets 1440p. As such, it won't have the 'higher'-res textures - they're not 'necessary' at a lower resolution, and it's still more than the Pro has. If games in five years' time are '1080p' on it, it won't be because of RAM; it will be because the 4TF GPU cannot render at 1440p within the frame budget - but the 3080 will easily do 4K in five years' time. You may not be able to run the latest game at 4K 'Ultra' and get at least 60fps, even without RT, but VRAM will NOT limit that game to 1080p. Most PCs will have 16-32GB of RAM for gaming on top of the dedicated VRAM, and the PS5/Series X will still be offering games at a 'high' resolution that probably can't match what the 3080 does, like for like - not that you can get an exact match, because the console versions often have settings a PC gamer has no access to, like running some things at 50% frame rate, or dropping resolution on certain things.

Even if you don't understand the technicalities of the hardware and what role each part plays in how a game runs, just watch Digital Foundry's visual comparisons. Alex's PC settings videos are often a good watch, as he gives his recommendations and often uses the 'best' console version as a 'baseline', refusing to drop below that level, so you get an idea of which settings are 'similar' (equivalent to PC's 'medium' setting, for example) and which are completely bespoke and how they differ (somewhere between PC's medium and high settings, for example). There are sometimes settings used on consoles that are lower than PC's lowest setting.

There is a reason you see a line in a console game, a certain point where textures, shadows and even polygon counts change - a certain distance where lower-quality, lower-resolution, simpler objects switch to better versions, with the 'highest' quality the game offers being the closest. Dropping the quality of assets and textures at a distance saves on image file size and therefore the required VRAM. Devs will tweak these settings specifically for the console because they know exactly what budget they have. On PC they don't, because there is a vast array of hardware, with different combinations of architecture from several manufacturers in both CPUs and GPUs, so they offer much more 'generalised' settings and the gamer has to do their own optimisation.

A game and how it runs on PC does NOT equate to consoles in ANY way whatsoever, least of all in resolution and frame rates. There are MANY contributing factors that let a dev push the resolution up on a console and run at a 'reasonable' frame rate that a more capable (on paper, at least) PC may appear to struggle with. The key word is 'optimisation' - using many tricks/settings a PC gamer hasn't got access to, bespoke to the console. It may 'look' remarkably similar, but that's down to the talent of the developers, optimising their game in such a way that you don't notice that objects in the distance are just 2D on console but 3D on PC, that characters/objects move at 50% frame rate in the distance or in your car mirrors as you drive but at the full refresh rate on PC, or that textures are really low quality in the distance because they are quite small, compared to higher quality on PC.

I could go on and on, but it seems pointless, as @heyoo is too ignorant or stupid to grasp that a console build of a game isn't exactly the same as a PC build, that the settings a game exposes to optimise the PC version are 'different' while a console build is 'bespoke', and that modern consoles do NOT have any dedicated VRAM at all. There is a BIG difference between them. It's like saying a road car is bad because a Formula 1 car with just a 1.6-litre engine beats it - despite the fact that the Formula 1 engine is 'optimised' to do one job. A console is built and optimised to run games, and the games it's running are optimised specifically for that console.
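As a toy illustration of that distance-based quality switching (the LOD names, distance thresholds and memory costs below are invented for the example, not taken from any real engine):

```python
# Toy sketch of distance-based LOD selection, the kind of per-platform
# tuning described above. Thresholds and memory costs are invented,
# illustrative numbers only.

from dataclasses import dataclass

@dataclass
class LodLevel:
    name: str
    max_distance_m: float   # use this LOD out to this distance
    texture_mib: float      # VRAM cost of its texture set

# A console build might pull these thresholds in (and add a 2D
# "impostor" tier) to shrink the working set; a PC 'Ultra' preset
# would push them out and keep full 3D meshes to the horizon.
CONSOLE_LODS = [
    LodLevel("LOD0 full mesh", 30.0, 64.0),
    LodLevel("LOD1 reduced",   80.0, 16.0),
    LodLevel("LOD2 low poly", 150.0,  4.0),
    LodLevel("2D impostor", float("inf"), 0.5),
]

def pick_lod(distance_m: float, lods: list) -> LodLevel:
    """Return the first LOD whose range covers the object's distance."""
    for lod in lods:
        if distance_m <= lod.max_distance_m:
            return lod
    return lods[-1]

for d in (10, 60, 120, 500):
    lod = pick_lod(d, CONSOLE_LODS)
    print(f"{d:4}m -> {lod.name} ({lod.texture_mib} MiB textures)")
```

Pulling those thresholds in, or adding an impostor tier, is exactly the kind of per-platform lever a console build can use without exposing a PC-style settings menu.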

A pessimist is just an optimist with experience!

Why can't life be like gaming? Why can't I restart from an earlier checkpoint??

Feel free to add me but please send a message so I know where you know me from...

PSN: TaimeDowne

SirAngry

I can confirm the idea of comparing console builds to PC settings is dumb. I have never seen geometry settings or mesh/LoD settings in a PC game - that doesn't mean they don't exist - but the truth is you don't develop a game for PC settings and then pick the settings that work best on console and, voila, get your console game. No multiplatform game I have ever worked on has 'comparable' PC settings. They might look like they do, but that's not what happened. Late in this generation, the geometry models consoles use for physics have become increasingly basic, and you just don't see those normally - unless you've ever bounced a grenade off an invisible wall. The truth is that stuff eats up processor time, and PC games tend not to dial it down, so quite often these DF pixel counts and frame counts don't get close to telling the full story - which is why certain people shouldn't watch a few tech videos and think they understand what is going on.
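To make that render-versus-physics split concrete, a hypothetical sketch (the triangle counts and object names are invented for illustration):

```python
# Toy sketch of the render-mesh vs physics-proxy split mentioned above:
# the visual mesh keeps its full triangle count while collision uses a
# much simpler shape. Numbers and names are invented for illustration.

from dataclasses import dataclass

@dataclass
class Mesh:
    triangles: int

@dataclass
class GameObject:
    render_mesh: Mesh    # what the GPU draws
    physics_mesh: Mesh   # what the CPU tests grenades against

# Console build: the physics proxy is a crude hull, so a grenade can
# bounce off an "invisible wall" that doesn't match the visuals.
crate_console = GameObject(render_mesh=Mesh(triangles=12_000),
                           physics_mesh=Mesh(triangles=12))

# A PC build might keep a finer proxy at the cost of CPU time.
crate_pc = GameObject(render_mesh=Mesh(triangles=12_000),
                      physics_mesh=Mesh(triangles=600))

for name, obj in {"console": crate_console, "pc": crate_pc}.items():
    ratio = obj.render_mesh.triangles / obj.physics_mesh.triangles
    print(f"{name}: render {obj.render_mesh.triangles} tris, "
          f"physics {obj.physics_mesh.triangles} tris ({ratio:.0f}x simpler)")
```

None of that shows up in a pixel count or a frame-rate graph, which is the point: two builds can look alike while doing very different amounts of work.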

Edited on by SirAngry

SirAngry

SirAngry

@JohnnyBastos we all saw what you did... may God have mercy on your immortal soul.

SirAngry

This topic has been archived, no further posts can be added.