Topic: PlayStation 5 --OT--

Posts 881 to 900 of 4,712

JJ2

So I don't really like Era; I think there's a lot of crap on it, but I recently had a look and saw an interesting explanation of the PS5's variable frequency for anyone interested. Not sure it's an accurate description, but it's definitely an interesting read:

https://www.resetera.com/threads/playstation-5-system-archite...

Their debate about there being no sustained 2GHz in the PS5 is silly though. Cerny never said that.

Edited on by JJ2

The crowd, accepting this immediately, assumed the anti-Eurasian posters and banners everywhere were the result of acts of sabotage by agents of Goldstein and ripped them from the walls.

JohnnyShoulder

@JJ2 Reminds me a bit of Reddit, where you get one decent topic for about ten complete BS topics.

Life is more fun when you help people succeed, instead of wishing them to fail.

Better to remain silent and be thought a fool than to speak and remove all doubt.

PSN: JohnnyShoulder

BAMozzy

The thing is, at the moment you are going to get a LOT of people speculating about what may happen, how the 'boost' clock speeds work, what that potentially means, etc. Until the PS5 is actually out and in the hands of people who hack the system in some way to ascertain the exact running speeds of the GPU and CPU, you will never know.

At the end of the day, it's a more 'efficient' way of running a system, in theory. It also stops the system building up heat, so it can run faster as and when required. If you ever see a PC breakdown of its components when running a game, so you can see where 'bottlenecks', if any, are occurring, you will often see that the GPU is only at 60% utilisation: it's still running at full speed, but 40% of it is not being pushed/used. I don't know exactly what thresholds the PS5 could be using, but in areas where the game is only using 80% of the GPU at that 'fixed' speed, it could drop the GPU down to 2000MHz (for example) and be at 100% utilisation. Because it's now running slower, it's not generating as much heat, so it cools down; and because it's 'flexible' like this, it can boost the frequency back up if/when needed.

Getting hot can throttle performance and/or need a bigger cooling system, which potentially means more noise too. If they didn't employ this method, Sony wouldn't be able to run the GPU and CPU at those speeds constantly without a bigger/better cooling system and/or the risk of overheating. All we know right now is the capped maximum the CPU and GPU can run at, but neither a console nor a PC is often running at full capacity, and it seems Sony have implemented a way to use the power more efficiently: if a scene isn't CPU intensive but is GPU heavy, they can boost the GPU up to its capped level to power through, and drop the CPU down a bit so it isn't generating 'extra' heat, and vice versa. Whether it can run both at 'full' speed, and for how long, we don't know, so it's pure speculation. Whether games will come along that push both CPU and GPU for an extended time, we don't know either. It may be that this only happens for a second or two, and the PS5 boosts both to power through without bottlenecking before one or the other drops.
Some of the most intensive GPU effects don't last very long anyway: a big explosion with particles, transparent effects etc. that settle back quickly. If the game has been 'optimised' around the 'general' gameplay, it can use the boost as and when required for those momentary spikes that would otherwise lead to a dropped frame or two. That's better than dropping the resolution or graphical settings to cover a 'worst case' scenario when the bulk of the game runs perfectly well the vast majority of the time.
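The 'drop the clock to match utilisation' idea above can be sketched as a toy model (my own illustration with made-up numbers, not Sony's actual algorithm). Dynamic power scales roughly with frequency cubed (power ∝ frequency × voltage², with voltage scaling roughly with frequency), so matching the clock to the workload saves a disproportionate amount of power:

```python
# Toy model of "drop the clock to match utilisation" (illustration only;
# not Sony's actual algorithm). Dynamic power scales roughly as f * V^2,
# and voltage scales roughly with frequency, so power ~ f^3.

def relative_power(freq_ghz, base_ghz=2.23):
    """Power relative to running flat out at base_ghz (cube-law model)."""
    return (freq_ghz / base_ghz) ** 3

def clock_for_load(gpu_utilisation, max_ghz=2.23):
    """If the GPU is only X% utilised at max clock, a clock of roughly
    X% of max would still finish the same work in the same time."""
    return max_ghz * gpu_utilisation

# A frame that only keeps the GPU 80% busy at 2.23 GHz...
needed = clock_for_load(0.80)          # ~1.78 GHz would suffice
saving = 1 - relative_power(needed)    # ~49% less dynamic power

print(f"clock needed: {needed:.2f} GHz")
print(f"dynamic power saved: {saving:.0%}")
```

The striking part of the cube law is that a 20% clock drop cuts dynamic power nearly in half, which is why running slower-but-fully-utilised generates so much less heat.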

Again, it's still speculation, as there are numerous parts to the APU that have 'dedicated' roles. The PS5 has dedicated audio and decompression hardware, things the PS4's CPU handles, so that frees up CPU usage. No doubt those generate heat too, but no one really knows how the PS5 will run, or what speeds the CPU and GPU will sit at for the majority of the time; that could vary considerably from game to game depending on how the game is built and what speed it needs the various parts to run at. There's no point speculating either, because it just spreads confusion and doesn't really matter anyway...

A pessimist is just an optimist with experience!

Why can't life be like gaming? Why can't I restart from an earlier checkpoint??

Feel free to add me but please send a message so I know where you know me from...

PSN: TaimeDowne

JJ2

@BAMozzy
'its not a CPU intensive scene, but is GPU heavy, they can boost the GPU up'
I think you described AMD's SmartShift tech?
Variable frequencies are twofold, I think?
The main part is what Cerny explained: the equivalent of a continuous boost (not an occasional boost), which brings a much higher sustainable, useful power draw?

I mean, there isn't much speculation either, unless you question Cerny's words.
The link I posted above is a very nice description, just for better understanding, not for speculation.

Edited on by JJ2

nessisonett

@JohnnyShoulder Any site that annoys terrible people that much deserves to stay imo!

Plumbing’s just Lego innit. Water Lego.

Trans rights are human rights.

BAMozzy

@JJ2 So you can tell me exactly what 'general' frequency both the CPU and GPU will be running at, can you? What the lowest frequencies could be? How long the console could run at max capped frequencies on BOTH the CPU and GPU, what that all means for games, and how it will affect the way games perform and/or are optimised?

The only thing that is known is that the PS5 can vary the frequency of both the CPU and GPU, capped at a certain point. For all you know, the system may run 200-300MHz lower on average but, during intensive sections, boost either or both up to those capped levels.

All that explains is that if you somehow reduce the maximum power draw by whatever means necessary, reducing those 'maximum' peaks of intense power usage, you can then raise the overall power draw; something that makes sense. It's not 'just' opening up a map screen when the fans kick in on games, though, and if the 'whole' system is linked to power draw, then when the fans are needed on full and other areas of the system are drawing power, does that affect the maximum power, and therefore clock frequency, the CPU/GPU can use?

If you drop the power to one area during those 'spikes' by reducing its clock speed, then those power 'spikes' are lower, and you can build a system that is more efficient and send more power in general to the CPU and GPU. If you are running everything at a 'static' level, you will get bigger power spikes, because one area is still drawing power even if it's not being fully utilised. It's more efficient to drop the power to area(s) that don't 'need' it in order to boost areas that do, and in doing so reduce the maximum peak power draw. It's rare that everything needs to run flat out simultaneously, so you can cut power to areas that don't need it during those 'spikes'. But as no one knows what that means in relation to the PS5 and its clock frequencies, it's still total speculation.

Does that mean the GPU and CPU will be 'running' at a 'general' or average frequency but, when needed, one will drop to boost the other? For example, does the CPU run at 3.3GHz and the GPU at 2GHz, but when the game needs more GPU power, it drops power to the CPU so it falls to 3.1GHz and boosts the GPU to 2.23GHz? Or does the console run much closer to, if not at, the cap limits, but throttle either the CPU or GPU when there is a power spike, depending on which can be sacrificed? How much do the rest of the system's power demands affect the clock speeds: not just the fan but the SSD, and the dedicated audio and decompression parts of the APU?

Drawing more power on average, because it's used more efficiently, will still generate heat; more so if the average frequencies are higher, unless, like I said, they drop the frequency purely based on need, so the GPU could drop to 1.5GHz if that's ALL it needs to be at that moment. If a CPU/GPU is running, it's generating heat; the faster the clock speed, the more heat it generates, so running at max capacity for extended periods will generate a LOT of heat that needs to be dealt with. Heat affects the efficiency of RAM and SSDs, which generate heat of their own, which again would require the fans to kick in and thus increase the system's power draw. I assume an SSD may be more power efficient, as it isn't drawing power to spin a disc and move an arm to read it. And how does charging controllers, or powering any other USB device plugged in, affect the power draw and heat of the system?
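The kind of CPU-for-GPU trade described above can be sketched as a shared power budget (SmartShift-style). All the wattage figures and the cube-law power model below are my own illustrative assumptions, not Sony's real numbers:

```python
# Toy sketch of a fixed power budget shared between CPU and GPU
# (SmartShift-style). The budget and per-component wattages are
# invented for illustration; only the 3.5/2.23 GHz caps are public.

CPU_MAX_GHZ = 3.5
GPU_MAX_GHZ = 2.23

def power(freq, max_freq, max_watts):
    """Dynamic power at a given clock, cube-law, scaled to max_watts."""
    return max_watts * (freq / max_freq) ** 3

def gpu_clock_within_budget(cpu_ghz, budget_watts=200.0,
                            cpu_max_watts=60.0, gpu_max_watts=160.0):
    """Highest GPU clock that fits in whatever budget the CPU leaves."""
    remaining = budget_watts - power(cpu_ghz, CPU_MAX_GHZ, cpu_max_watts)
    fraction = min(1.0, remaining / gpu_max_watts)
    return GPU_MAX_GHZ * fraction ** (1 / 3)  # invert the cube law

# Dropping the CPU from 3.5 to 3.3 GHz frees budget for the GPU:
for cpu in (3.5, 3.3):
    print(f"CPU {cpu} GHz -> GPU up to {gpu_clock_within_budget(cpu):.2f} GHz")
```

The point of the sketch is the shape of the trade, not the numbers: because power rises with the cube of frequency, a small CPU clock drop frees enough budget for a meaningful GPU clock rise.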

Point is, that page only goes so far: it explains why Sony can raise the clock speed to such a degree by using power more efficiently, i.e. not running 'everything' at 'full' frequency all the time, so those 'spikes' in power can be reduced. If you drop the power going to the CPU, for example, when a power spike would normally occur, then the overall power going to the APU can be more consistent, and the system can be built around a higher average power draw. But it doesn't tell us how the system will run on average, or how long it could sustain maximum clock frequencies if the situation demands it. Therefore there is a LOT of speculation, because there are unknowns, some of which we may not know for years.

Whether it matters or not is again debatable. Does it matter if the PS5 doesn't run at max frequencies very often? Does it matter if they have to drop the frequency of one to boost the other? I am sure we all just want great gaming experiences, and if Sony have developed a more 'efficient' method to get 'more' performance in general from their hardware, that's all that matters. If you want to play Sony's games, it's going to be the only console to offer them.

We have also seen new developments in rendering technology that deliver a LOT more graphical fidelity than could be achieved if 'each' pixel were rendered equally. You would be hard pressed to see a difference between native 4K and checkerboard-rendered 4K in actual gameplay; only if you analyse a single frame up close, especially the first frame after a transition, does it become noticeable. DLSS 2.0 somehow makes 1080p look as good as, if not better than, native 4K. VRS saves rendering resources too, by focusing on the pixels that really matter; not all pixels are 'equal'. Even only rendering what the player can see makes a big difference; believe it or not, that hasn't always been the case with 3D models that are only partially visible, and if you only render the part of the object in view, that saves resources.

Point is, it may not matter if one console uses slightly more rays, has a slightly higher native resolution, uses less aggressive VRS, or whatever other differences may be found; seeing any difference in a side by side could be extremely difficult, especially as the pixels are so much smaller now. It can be very difficult to see the difference between 1800p and 4K, far more difficult than 900p vs 1080p, despite the same relative size difference. Those 'softer' edges and softer details are much less noticeable because the pixels are so much smaller, and with a much higher pixel count you have a lot more detail and definition anyway.
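The 900p-vs-1080p and 1800p-vs-4K comparison checks out with quick arithmetic: both jumps are the same 1.2x linear scale, but the higher-resolution step involves four times as many extra pixels, each far smaller on the same screen:

```python
# Pixel counts for the 16:9 resolutions compared above.
def pixels(height):
    """Total pixels in a 16:9 frame of the given height."""
    return (height * 16 // 9) * height

for h in (900, 1080, 1800, 2160):
    print(f"{h}p: {pixels(h):,} pixels")

# Both jumps are the same 1.2x linear scale...
assert 1080 / 900 == 2160 / 1800 == 1.2
# ...but the 1800p->4K step adds exactly 4x the pixels of 900p->1080p.
extra_hd = pixels(1080) - pixels(900)     # 633,600 extra pixels
extra_4k = pixels(2160) - pixels(1800)    # 2,534,400 extra pixels
print(extra_4k / extra_hd)
```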

The initial point I was making, though, is that it's still speculation as to what the PS5 will be running at under 'normal' conditions. Explaining how they have achieved a 'higher' frequency cap than anyone expected, as that article tries to, doesn't really answer those questions, or explain what 'may' happen in a worst case scenario, i.e. a game demanding a high sustained power draw from ALL components; how long it can sustain that and manage the heat is still unknown. It's that 'up to' that causes concern, rightly or wrongly, and is why there is a LOT of speculation. We have games that run 'up to' 1080p/4K and 'up to' (or target) 30/60fps, where that can mean anything from the occasional drop in resolution or the odd dropped frame here or there, to games that rarely (if ever) hit those caps.

Until someone comes along with the answer, people will speculate about what this actually means. Will the 'average' frequencies be quite a bit lower, boosting up when needed? Will they be close to the cap, only dropping when necessary, when it doesn't matter, to reduce those power spikes? Will the CPU and/or GPU constantly fluctuate up and down as the game demands, to keep the power and heat down? There's no point running at max frequencies, generating unnecessary heat, if the game isn't demanding it... Who knows? People will keep speculating until those things are known. Not that it matters: if the game doesn't 'need' to run both the CPU and GPU at max, then does it matter if they aren't? Of course not, because it isn't impacting the game, it's more efficient, and it stops the heat building up too... BUT we still don't know...

Jaz007

@BAMozzy They should honestly have you write an article for the main site. The depth of your technical breakdowns is just phenomenal. And out of curiosity what's your professional background? I would have to imagine it's in computer hardware.

Ryall

@BAMozzy The PS5 CPU and GPU frequencies aren't going to have a hard lower limit. If someone issued it with 'halt and catch fire' instructions, both the CPU and GPU clocks could potentially fall to very low levels. At this stage we don't really know what developers are going to ask of the PS5. The kind of instructions it's issued with is what will determine whether it remains at its peak frequency or is forced to scale back due to power constraints.

I would assume power supplied through the USB ports would be outside the system-on-chip's power budget. They can't have a game running differently because you're charging a controller, any more than they could if the console was in a warm room.

BAMozzy

@Ryall The point I was trying to make is that no one knows yet. It could very well be operating at very low frequencies, even off, if you are watching a Blu-ray or streaming from Netflix, so there is no/minimal fan noise at all with certain applications.

Of course it makes no sense at all to reduce the CPU/GPU frequencies just because power consumption is higher due to a USB-powered device being plugged in; I said that more 'tongue in cheek', because it is still drawing power. We don't know whether the decompression or audio components will have some impact on the power draw of the APU, and whether that in turn affects the max frequencies of the APU's CPU and GPU components; it will add to the heat, of course.

The main point I was trying to make, though, is that right now we can only speculate on what it will mean in the long run. If games start to push the hardware harder and harder, as they tend to across a generation, could the PS5 struggle to maintain its maximum frequencies as games demand more and more from both the CPU and GPU? Will more complex, CPU-driven games have to scale back visually because the GPU can't maintain its frequency as much? Especially if the 3D audio chip is working extremely hard, calculating every single drop of rain and the sound it makes within a complex environment, and the amount of data streaming in requires a lot of decompression and management, all of which draws power allocated to the APU.

If I were a developer, I would want to know those parameters: what the console runs at on average and comfortably, using any potential boost to power through those little blips where unexpected events cause bottlenecks that drop the resolution and/or frame rate for a moment before the dust and the game settle back. As a gamer, I don't really care; I just want the game to run as smoothly as possible and give the best, most consistent performance. You don't want to waste all that headroom through the majority of the game because occasional spikes mean that, to eliminate them, you have to drop settings for the worst case scenario, especially if that only happens when either the CPU or GPU bottlenecks. It's better to allocate the power to boost the area that needs it to power through that bottleneck, rather than waste power on the one that doesn't.

Sony's method is certainly an interesting way of using the power efficiently, but until we know exactly what it means for the system and how it will run, it's pointless speculating. There's no point saying the CPU and GPU will run 200MHz lower and boost one or the other whilst dropping its counterpart to offset the extra power needed, for example, because that is pure speculation. Saying it will run at max frequencies for the majority of the time, only dropping the frequency of one or the other to minimise power spikes, is just speculation too. If you are basing your purchase decision on this, then that is wrong. Wait to see if it 'matters' in games and what impact, if any, that decision has on performance. Even if it can't deliver as many rays, or maybe not as high a resolution, due to the smaller GPU, that may not make a 'noticeable' difference in practice, and Sony's system could punch above its specs because of its more efficient use of power; again, just speculation. It's really not worth worrying about right now, because we don't know, Sony hasn't told us, and it could vary on a game-by-game basis anyway...

JJ2

@BAMozzy
Oh well. I thought that link was useful and educational. A change from the constant speculative babbling we often see on the internet.

Edited on by JJ2

Ryall

@JJ2 It was a very interesting article and I’m glad you shared it. At this point it’s difficult to say what impact eliminating the peaks in power consumption will have but it’s certainly worth understanding why Sony took the approach they did.

BAMozzy

@JJ2 It's certainly interesting, and it does illustrate the approach Sony are taking quite well, even if it doesn't add anything more than we already knew from Mark Cerny's tech talk about the console. It's almost saying that the console will throttle performance when the power spikes instead of when the temperature spikes, for example.

To me that is still not a 'perfect' solution, as it would mean the console running at those speeds and generating a LOT of heat, and therefore needing a very good cooling system and fans. It makes much more sense to run the console at whatever frequency it happens to need, up to those limits; so if the CPU only needs to run at 3GHz and the GPU at 1.8GHz at a specific point, that's all it runs at, and thus it isn't generating as much heat, doesn't require as much cooling, and keeps the noise down. The Series X will run at fixed frequencies regardless, so when its CPU and GPU are not being pushed, it's still running exactly the same, still generating heat, and therefore still needs the fans to kick in. Sony will need cooling too, of course, but if the system isn't always running as fast as it can, it isn't generating as much heat all the time.

It still doesn't tell us whether the console can run at peak frequencies and, if so, for how long; and if not, whether we will see a drop in visual quality over time (again, I say 'see' as in metrics, rather than actually perceive a difference, with modern rendering techniques helping). For example, if you are in a big battle with lots of AI, physics, explosions etc. pushing the CPU really hard, necessitating the full 3.5GHz for an extended period, will the resolution drop because the GPU cannot run constantly at 2.23GHz through that battle?

Again, we don't really know. It's still speculation right now, and we will have to wait and see. I can't see the PS5 running at full speed and throttling performance 'just' when the power spikes; that would generate far too much heat and require cooling, something Cerny said was better managed and therefore quieter. If anything, I think the system will vary according to need: run at the frequency necessary to hit the target frame rates. Instead of only using 60% of the available CPU or GPU, it will drop the frequency so it's using 100% of the CPU and/or GPU, and thus save power and generate less heat. But that is my speculation, and it still doesn't really answer how long the console could run at peak frequencies for...

I wasn't criticising you or that article at all, merely trying to show that there is still a lot of speculation, and we still don't know any more than we did after Cerny's presentation. Some people are speculating that the PS5 will run at those frequencies the vast majority of the time, only dipping below when necessary; some are speculating that it will run at some lower 'fixed' frequency and 'boost' up when needed to power through. I think it will simply vary according to need, to reduce the overall power consumption and heat generated, but we don't really know at all.

icecube

Well that's a good looking console

Jessiex

Looks like it'll overheat easily.

Can't wait for PS5!

Genrou

So Xbox's new console looks like a speaker and Sony's looks like a router. Whatever happened to making a console look like a console? Are they that ashamed of making consoles or something?

nessisonett

I really like its design; it looks very space-age. Definitely a step above the phat PS3, which is a bulky monstrosity in comparison.

JohnnyShoulder

Yeah, I like the design. Thought the show was good too; it had a nice variety of games. Obviously I'm totally pumped for Demon's Souls, which looked fantastic, as did Horizon Forbidden West and Spider-Man: Miles Morales.

JJ2

The console looks fantastic.
Also I'm not fond of the digital edition but, well played Sony, well played.
