The PlayStation 5's touted Kraken compression tech is in the headlines again, as another cross-gen title takes up much less space on Sony's current-gen console. On PS4, Marvel's Guardians of the Galaxy weighs in at a hefty, but somewhat standard 60GB or so. But on PS5, the size is cut down to just over 31GB. Not bad!
Of course, saving storage space on your PS5 isn't quite as big a concern these days, since you can now install a second SSD (you can peruse our guide for the Best PS5 SSD in 2021 if you're serious about upgrading). But still, that 30GB difference is going to be noticeable, isn't it? Hopefully more and more games can take advantage of this as the years roll on.
Have you been left impressed by the PS5's compression capabilities? Downsize in the comments section below.
That is extremely impressive and adds credence to the fact that perhaps the size of the SSD is more than sufficient.
I've personally not had an issue with storage constraints yet on my PS5, although I haven't installed anything made by Activision thus far.
As the generation goes on, I expect we may see even more impressive file sizes, and it will be interesting to compare with the Series X, given Sony's promotion of the tech in the PS5 (I've read that the Series S version requires approx 42GB of space, so quite a difference!)
Okay, an almost 30GB reduction in size is insane. The SSD setup is definitely not a gimmick beyond just load speeds, since it's probably why the game takes up this much less space.
Okay, that's super impressive. I want to add at least a 1TB SSD to my PS5, but PCIe 4.0 SSDs are still way more expensive compared to the SATA SSD in my PC.
This has definitely been a consistent thing on PS5 and is nice to see.
Yet again, this proves that Sony's storage solution as a complete package has been well designed and superior to the competition.
There has been a negative media spin put on maybe the greatest tech decision made in the current generation and it makes me a bit afraid that it will get forgotten/pushed aside because of it.
I wonder how they manage to cut down the size on PS5 by 50%
compression on the PS5 is so impressive, it seems to have massively cut down the size of some of these articles as well!
@artemisthemp On games that are going to be loaded from HDD platters, the devs need to repeat the same assets multiple times in the game files to strike a balance with load times (e.g. if the NPCs in level 2 and level 5 are going to be the same, then it might be faster to load if those character assets are saved next to the level 2 environment assets AND also next to the level 5 environment assets, rather than making the drive arm go back to the level 2 location to load the characters from there). That introduces a lot of duplication of assets. On an SSD, the developers can pretty much ignore this, as the assets can be loaded just as fast wherever they are saved in the game's files.
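A toy sketch of the duplication trade-off described above (all asset names and sizes here are invented for illustration; real game data is obviously not laid out in Python dictionaries):

```python
# Toy illustration: duplicating shared assets per level (HDD-friendly layout)
# vs storing each asset once (SSD-friendly layout). Sizes in GB, made up.
shared_assets = {"npc_model": 1.5, "tree": 0.8, "rock": 0.3}
levels = {
    "level_2": ["npc_model", "tree"],
    "level_5": ["npc_model", "rock"],
}

# HDD layout: each level ships its own copy of every asset it uses,
# so shared assets are counted once per level that references them.
duplicated = sum(shared_assets[a] for assets in levels.values() for a in assets)

# SSD layout: each unique asset is stored exactly once.
unique_assets = {a for assets in levels.values() for a in assets}
deduplicated = sum(shared_assets[a] for a in unique_assets)

print(f"duplicated layout:   {duplicated:.1f} GB")    # 4.1 GB
print(f"deduplicated layout: {deduplicated:.1f} GB")  # 2.6 GB
```

With just one shared NPC model the duplicated package is already over 50% larger, which is roughly the scale of saving being reported for these cross-gen titles.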
If I remember rightly, a faster CPU and smarter OS can do a much better job of decompressing (more streamlined and compressed) data on the fly.
So I guess the PS5 HW/SW allows, on one hand, a more aggressive compression that packs more data into less space, and on the other hand a much faster system for restoring and then presenting the data as playable software.
There is no loss of quality at all; in fact, better compression algorithms can fit even more data into less space.
@StrickenBiged You don't need kraken to omit some duplicate files.
@nathanSF That's part of it, certainly; but don't underestimate the amount of space that is saved by eliminating the need to duplicate resources. As @StrickenBiged mentioned: modern games have a lot of duplicate resources that take up a considerable amount of space. Read times on spinning-platter disks are helped by clustering data to reduce seek times. This also means that games with more unique assets are less likely to benefit from those savings.
Makes me feel a bit better about adding this game to my PS5.
@thedevilsjester Indeed, though I was of the understanding that modular design of games did away with duplicate code.
So, for example, there is only ever one iteration of villain type 'x' code, which is then called whenever needed. It either has one or more fixed attribute variables added to 'level up' to fit a given game progression stage or difficulty level, or, if a 'commoner'/'resident', has a randomly generated set of physical attributes (from a global wardrobe with its own set of rules) or even personality traits/back story (from a global text library of character traits with its own grammatical rules) to create a diverse population. That's where the super-efficient compression creates more variables in less space and super-fast 'decompression' delivers seamless gameplay.
For example, maybe in Cyberpunk 2077 they did not write a rule that said "IF there is a person x with attributes y in a given zone, THEN do not repeat the generation rule." Wanting to populate a city zone, they got lazy with the code, which looped/called back to the same character code. They could have written a complex algorithm to randomly generate characters but were not up to it, so we have sparsely populated areas with people of some variation. Didn't Assassin's Creed do a better job of crowding out the streets?
People forget that the PS5 has a compression processor supporting "Kraken", which helps compress and read compressed data on PS5 and allows developers who utilize it to reduce file sizes.
Don't know if there's a day-one patch or anything, but I just pre-loaded on Steam and it's only 20.3GB.
@nathanSF Code shouldn't be duplicated, but the compiled code for a whole game isn't that big, almost certainly less than a single gigabyte. Most of the space is taken up by assets, that is, audio or visual data which the code will load when it is needed. Think textures and sound effects, but also animations, shaders, 3D shapes.
On a conventional hard drive you generally want to group that data by area so you're not seeking back and forth while trying to load it. If you have things which exist in multiple areas, you therefore end up with multiple copies.
Less duplication of assets. On PS4 and XBONE, developers copy assets across the game's files so that whatever needs to load is always "close" by. It decreases load times.
PS5 and Series X don't need to do that as much, if at all in most cases. In addition, the Zen 2 CPUs in the current-gen consoles are a huge leap over the old underpowered Jaguar. They can decompress large files on the fly while still maintaining high performance for other calculations. On PS4/XBONE, you were limited in your compression algorithms because you had to account for CPU wait time and processing speed on each core. Not much of a concern in the current chipsets.
The Jaguar was underpowered even in 2013; it used low-wattage cores and was based on the ultrabook-class processors AMD was working on at the time. In other words, they were designed for efficiency and heat reduction first and foremost, with performance secondary. The current Zen 2-based CPUs in the new consoles are desktop-class, high-wattage cores. That plays a role in why both consoles have such beefed-up cooling systems and are dramatically larger than their predecessors. Stick the PS5's CPU in a PS4 chassis with its cooling system and it would overheat almost immediately.
On games where developers utilize the tools and package their assets efficiently.
Unfortunately, Call of Duty is nearly 200gb on PS5. There’s no excuse for it. If Activision optimized it and used the PS5’s feature set to its potential it would probably clock in under 80 gigs.
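The trade-off described above, where stronger compression buys smaller files at the cost of more CPU work, can be sketched with a general-purpose codec. Kraken itself is proprietary (part of the Oodle toolset), so this uses Python's zlib purely as a stand-in; the exact ratios and timings depend on the data and machine:

```python
import time
import zlib

# Stand-in data: highly repetitive bytes, compressing well like the
# duplicated game assets discussed above (contents are arbitrary).
data = (b"texture-block-" * 4096) * 64  # roughly 3.5 MB

for level in (1, 9):  # zlib levels: 1 = fastest, 9 = smallest output
    start = time.perf_counter()
    packed = zlib.compress(data, level)
    elapsed_ms = (time.perf_counter() - start) * 1000
    ratio = len(data) / len(packed)
    print(f"level {level}: {ratio:.1f}x smaller, {elapsed_ms:.1f} ms to compress")

# Either setting is lossless: decompression restores the original bytes.
assert zlib.decompress(zlib.compress(data, 9)) == data
```

A weak last-gen CPU pushes developers toward the fast-but-loose end of this dial (or toward shipping data uncompressed); dedicated decompression hardware lets them pick the heavy setting without stealing CPU time from the game.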
@theheadofabroom Useful and interesting info. Thanks for that.
I pre-ordered the physical Deluxe edition for PS5 a week or so back after thinking about it for a while.
@TheRedComet This is all very interesting.
@nathanSF As others have mentioned, it's not the code that's duplicated, it's the assets. To be a bit clearer, when people are talking about proximity of the assets, they generally mean on the physical disk/disc.
There are two reasons for this:
It's easiest to explain with discs, but the concept applies to spinning-platter hard disks as well. Imagine Level 1 is located at the center of the disc, and Level 10 at the outer ring. They share the same Tree asset (which is located near Level 1). Level 1 loads fast because the head doesn't have to seek very far to get all of the assets for the level. Level 10, on the other hand, has to seek all the way back to the center of the disc to grab that Tree. (And the more objects you have scattered around the disc, the more compounded this issue becomes.)
If instead, you stored a copy of the tree near Level 10 and near Level 1, it would have to do a lot less seeking to find the assets for each level.
The second issue, and one that's a bit less known, is that when you read X bytes from a disc/disk, it doesn't read just X bytes. The hardware reads (and caches) in larger chunks and will automatically pull in nearby data, assuming your next read is going to be accessing that data (spatial locality of reference). This is true of all storage (including SSDs and RAM), but its impact is felt more keenly on hardware with slow read/seek times.
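The seek behaviour described in these two points can be modelled very crudely: count one "seek" every time a read jumps to a non-adjacent block. All block addresses below are invented, and real drives are far more complicated, but it shows why clustering (and hence duplicating) assets pays off on a platter:

```python
# Toy disk model: reading contiguous blocks rides along with the platter;
# every jump to a non-adjacent block costs an extra head seek.
def count_seeks(block_addresses):
    seeks = 1  # the first read always has to position the head
    for prev, curr in zip(block_addresses, block_addresses[1:]):
        if curr != prev + 1:  # non-contiguous read forces a head move
            seeks += 1
    return seeks

# Level data interleaved with shared assets stored far away (e.g. near Level 1):
scattered = [10, 500, 11, 501, 12, 502]
# Duplicated copies of those assets stored right next to the level's own data:
clustered = [10, 11, 12, 13, 14, 15]

print(count_seeks(scattered))  # 6 seeks
print(count_seeks(clustered))  # 1 seek
```

On an SSD the "seek" cost is close to zero either way, which is exactly why the duplicated copies can be dropped from the package.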
Thanks, that's very clear, especially the "second issue" you address.
I'm no programmer; I'm just going on what I understand about blocks of modular structures, notions of elements, attributes and relationships, and simple connection rules like IF-THEN-ELSE-AND-OR. I'm not sure if we are talking about different things, different functions of the same thing, or the same thing from different angles.
I can see where your analogy will apply when talking about environments, lights, shading, audio, dialogue, apparel etc, where a bulk of data resides in a physical location and needs to be called on.
I can see how a slow-moving environment (pedestrians, horseback, etc.) and a fast-paced, contemporary urban environment like Cyberpunk's might present different challenges.
I'm now assuming that (a) calls on physically located fixed assets, (b) rule-based random generation, and (c) other processes I'm unaware of all work hand in hand.
I assume there's a unique rule based routine that randomly generates NPC characters.
I imagine a library of physiology and clothing attributes linked to an adult/child, male/female base model, from which various character and gender types can be generated.
A similar function would go for Level 1-10 characters, so the physical location of the routines is irrelevant once they are pulled into system memory to run and generate the characteristics of the enemy/villain/character.
It would then make sense that if a player picks easy/normal/hard/crushing mode, the game constantly filters responses through that routine.
Very impressive and hopefully more companies start taking advantage of what the PS5 offers, such as this. I don't plan on upgrading the SSD in my console any time soon, if at all, so any bit of saved storage space helps.
@nathanSF You have the right idea; however, the mistake you are making is to consider (a) and (b) in your example differently, when they are not. They both require loading of various assets (static meshes, textures, etc...) regardless of whether or not they were randomly assembled. The library of components used to create a random set of NPCs you imagine isn't loaded all at once; that would be an incredible waste of resources, since not every "level" uses every resource that would be loaded.
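The point that randomly assembled NPCs still boil down to loading concrete component assets might be sketched like this (every name here is hypothetical; in a real engine each entry would be a mesh or texture package streamed from disk, not a string):

```python
import random

# Hypothetical component library for random NPC generation.
LIBRARY = {
    "body": ["adult_male", "adult_female", "child"],
    "outfit": ["jacket", "hoodie", "rain_coat"],
}

def generate_npc(rng):
    """Randomly assemble an NPC by picking one component per slot."""
    return {slot: rng.choice(options) for slot, options in LIBRARY.items()}

def assets_to_load(npcs):
    """Only components the crowd actually uses need loading from disk,
    never the whole library at once."""
    return {f"{slot}/{part}" for npc in npcs for slot, part in npc.items()}

rng = random.Random(0)
crowd = [generate_npc(rng) for _ in range(20)]
print(len(assets_to_load(crowd)))  # capped at 6: the library only has 6 parts
```

However the "random generation rule" combines them, what hits storage is always the same finite set of component assets, and that set is what clustering or deduplication applies to.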
@thedevilsjester D'oh! Thanks, that sounds obvious now you mention it.