
There's no denying that PlayStation 5 is on its way, and we now have a fresh rumour to add to the pile. A ResetEra user has unearthed a new platform within Unreal Engine 4 named 'Erebus', and there's some evidence that it could turn out to be Sony's next generation console.
In the two images below, we can see a list of known platforms before Erebus appears at the end, with the abbreviation 'TST2', which doesn't appear to mean much by itself. We also see the name Erebus being used alongside PS4 and Xbox One in relation to cross-platform play, which could mean that this new platform will be able to talk to current generation machines.
As GearNuke has pointed out, Erebus is also the name of the Greek god of darkness. When you consider that Sony's codenames for PS4 and PlayStation VR also have a Greek theme - Orbis and Morpheus, respectively - it's not a huge stretch that Erebus could be the codename for PS5.
What are your thoughts on this? Is Erebus the PS5, or could it be something else entirely? Enter the darkness in the comments below.
[source gearnuke.com, via resetera.com]
Comments 56
Knack 3 confirmed!
‘God of Darkness’? clearly Vita2- Sony’s Anti-Switch weapon.
And so it begins ... ✌🏾
to be fair it could be anything, not necessarily the next PlayStation. it could be the next Xbox for example, or hell, maybe a brand new system from a new manufacturer (or an old one like SEGA), or even a new PlayStation handheld
i think assuming it's the next PlayStation home console based on its name is a bit of a reach
I actually think this is probably accurate.
@MadAussieBloke Oh I love it
So we potentially have a codename. Sounds cooler than Project Scarlett, that's for sure.
Going to be a Vita 2/VR hybrid surely?
@Kidfried Touche.
i want to see dev kit specifications
either way i won't be purchasing day one this time around unless it supports backwards compatibility
@get2sammyb @Kidfried Scorpio is actually Latin, in Greek it would be skorpios. But so is Orbis
Morpheus and Erebus are both Greek gods though. So Sony has named one project after a Greek god before, Microsoft has never done that. So it's something.
@Octane Nintendo also never codenamed something after a place you buy drinks... or marine life... or a political movement... or two random letters.
And then they did. What's to say this is Sony?
Apart from the fact that we know the next Nintendo hardware is "Mariko" (at least on Nvidia's side) and the next Xbox family is collectively called "Scarlett". I suppose that only leaves Sony (or Google...) unless Xbox has unique codenames for each one of the Scarlett set.
Wait, it's listed with cross platform stuff? Well it ain't Sony then. Unless it's just PC and mobile.
@Octane Thanks for the correction.
The new Xbox is codenamed Scarlett anyway! That's been pretty widely reported already.
Selling my PS4 right now brb
@Fight_Teza_Fight if only this were true
inb4 Pachter adds his opinion to the matter and says that because of this "proof" the PS5 will for sure launch in 2019 and will be the last console.
it also won't sell well of course, and it will have streaming as a main focus because why not.
I think Sony are leading us off the scent. Knowing full well we scan for greek gods, they will codename their next project Dave to sow confusion amongst the masses.
It's obviously the codename for the atari vcs
Final Fantasy VII remake confirmed at launch!
So if it's the god of darkness PS5 won't support ray tracing : /
@FullbringIchigo The Next Xbox already has a well known code name.
@Tha_Likely_Lad I won't buy it at all without backwards compatibility. I will buy it day one with.
@NintendoFan4Lyf I doubt that boolean is used for anything like that. Sony may have switched places with Microsoft this generation on their cross platform play stance, but that's policy, not something built into UE4.
It does make sense that this could be the 'PS5' given the codename. The rumoured 'Google' console is Project Yeti for the more obscure option, and it's been well publicised that MS's next box is codenamed Project Scarlett, so that really narrows down the chances of this being something other than a PS5.
Of course it could be a 'Vita 2' or something else entirely - maybe not even Sony at all - but it's no secret that the PS5 is coming 'soon'. Very few console generations have lasted more than 5yrs, and the PS4 is already nearly 5yrs old. Since its release there have been a number of technical 'leaps' too (4k, VR, HDR, VRR etc) - more so than in the last generation, which was relatively stable.
@NintendoFan4Lyf AMD has its own ray tracing option, as well as working with MS to offer DXR (DirectX Raytracing) support in their GPUs. ProRender will support real-time ray tracing and will mix that with traditional rasterisation for greater speed too. AMD have their own OpenCL/Vulkan-based ray tracing option too, called Radeon Rays 2.0, so it's not like AMD are sitting back and allowing nVidia and MS to offer ray tracing whilst they don't.
That’s not true, consoles are dead and this is the last generation!!!
At least the intelligent folks from Gamespot and IGN say that.
@PS_Nation do the staff say that or just the readers?
if it's the staff then i have even more reasons to avoid IGN.
@jdv95 The readers say it even more explicitly.
@NintendoFan4Lyf @BAMozzy ray tracing seems pretty much out of the equation for PS5 (I think more because of cost/power than hardware availability), although it's one of the features that can truly mark a fidelity leap for the next generation, isn't it?
EDIT: https://www.pcgamesn.com/amd-battlefield-5-hardware - RT seems to demand a lot of CPU power (many cores), so that also seems to push RT out of next-gen capabilities.
@Kidfried Xenon? I thought that was a printer. 😜
its probably sega genesis 2👍😂😱.word up son
@Flaming_Kaiser almost... ; ) Xenon was (also) a vertical space shooter from the late 80s... damn I loved that one.
PS5 will be out in 2021 - Sony wants to make a big leap this time. It could be a Switch type of console, or a stream box with a VR headset included, with a better amount of RAM and GPU power. If it comes out in 2020 it will be a 2.5 next to the PS4 Pro, and with 100 million PS4s out there, games like Cyberpunk would want to cash in on that.
@NintendoFan4Lyf yes, that's pretty much my take also. You can go 24k resolution and 12GB textures, but if lighting sucks, it all sucks. And as you said, RT will need a couple of iterations to "get it right", but I can imagine it being as "vital" as perspective correction and bilinear filtering back in the day... like turning RT off will give you nausea ; )
@NintendoFan4Lyf That's not how it works at all. Policy is not built into 3rd party engines. It's not the job of a 3rd party engine to enforce a specific platform holder's policies (especially ones that change regularly). Not only are they not required to do so, they won't, as a general rule.
It's very unlikely that this boolean is used for this purpose.
As a general rule, a different platform would simply have separate servers. There would be no hard-coded prevention of cross play, just a soft prevention of everyone playing on different servers, matchmaking on different servers, etc... This is the easiest, best, and most flexible way to wall off your platforms, because it allows you to easily combine them by just sharing the same servers. (Source: I have been developing games and software for over a decade.)
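The soft-walling approach described above can be sketched in a few lines. This is purely illustrative Python, not Unreal Engine code, and every pool and platform name is invented:

```python
# Illustrative sketch only: platforms are walled off by routing them to
# separate server pools, not by a hard-coded engine flag. Names invented.
SERVER_POOLS = {
    "ps4_pool": ["ps4-match-01", "ps4-match-02"],
    "xbox_pool": ["xbox-match-01"],
    "open_pool": ["open-match-01"],  # e.g. PC and mobile share servers
}

PLATFORM_TO_POOL = {
    "ps4": "ps4_pool",
    "xbox": "xbox_pool",
    "pc": "open_pool",
    "ios": "open_pool",
    "android": "open_pool",
}

def servers_for(platform: str) -> list[str]:
    """Route a client to its platform's matchmaking server pool."""
    return SERVER_POOLS[PLATFORM_TO_POOL[platform]]
```

Enabling cross play between two platforms then just means pointing both at the same pool - no engine-level boolean required, which is the point being made above.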
Hello darkness my old friend
@MattSilverado @NintendoFan4Lyf Ray tracing has been around for years. In gaming, though, the 'cost' has been prohibitive. By cost, I am not referring to hardware costs at all, but the much more valuable cost of 'time'.
In CGI movies, it doesn't matter if a single frame takes weeks or even months to render, or whether it requires the processing power of one machine or many. In gaming, however, time is crucial to whether a game will run at all and how smoothly it runs. For a game to run at 30fps, a frame - along with all the AI, physics calculations etc - has to be rendered in 33ms or less; any longer and you drop frames.
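The frame budgets quoted above follow directly from the target frame rate; as a quick sketch:

```python
# Each frame's budget in milliseconds is simply 1000 / target fps:
# everything (AI, physics, rendering) must finish inside it.
def frame_budget_ms(fps: int) -> float:
    return 1000.0 / fps

# 30 fps -> ~33.3 ms per frame; 60 fps -> ~16.7 ms per frame.
```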
Ray tracing has been far too 'expensive' in the past, not only because you are having to calculate 'millions' of rays - how they hit the objects in the world you have created, what type of surface they hit, how they interact with other rays etc - but also because you have to consider objects outside of the 'camera' letterbox view. One way devs have made games look so impressive is that the GPU only renders and processes what's in the 'letterbox' window we see as the gamer - not the 'world' we don't see. Even if you see a car boot sticking out from behind a wall, the rest of the car that you can't see isn't rendered - at least not fully - to save time until it becomes 'necessary'. With ray tracing though, that car would need to be rendered so the rays react appropriately - maybe there's a reflection of it in a window you can see. It's things like that that have made ray tracing far too expensive for gaming, and why CGI movies can take weeks just to render 1 frame.
What we are likely to see in gaming is a 'scaled down' version of ray tracing anyway. Instead of millions of rays, maybe just a handful comparatively - enough to predict the 'outcome', fill in any 'blanks' and smooth it all out with more traditional image processing methods. In other words, use it in conjunction with existing tech rather than relying on pure ray tracing alone.
Believe it or not, a PS3 could probably render a frame of God of War at native 4k, but it may well take minutes to draw just 1 frame, so it would never be 'playable'. Just because it's 'weak' by modern standards doesn't mean it couldn't render modern games at modern resolutions - it could, just not at the speed necessary to be 'playable'.
The new APIs and GPUs could well use some ray tracing in the next generation of console. Maybe it won't be full ray tracing but 'partial', using that information to fill in the blanks, so to speak. For example, instead of firing 100 rays at 100 pixels and following every one to see how they react, they could use just 20 and calculate what the other 80 would have done instead of having to follow those 80 too - thus saving time. Maybe we will see different levels, much like we see different levels of anti-aliasing. It's not unrealistic to see ray tracing coming to console.
As I said, both MS (DXR) and AMD (Radeon Rays) have their own ray tracing options, as well as nVidia of course, so I can see some more 'basic' options making their way to consoles. Obviously MS will probably push DXR, whilst Sony may well go with the AMD option. Most gaming uses will mix traditional rasterisation with some ray tracing to help out in the areas rasterisation is weak at, rather than replace the traditional methods entirely, so it's not totally out of the question. nVidia look to be building a separate 'ray tracing' block into their GPUs too, and with the move to 7nm, it's possible that consoles could build something similar into their APUs without a significant increase in overall size.
It's interesting, though, and still slightly relevant to the topic, as top gaming engines like Unity, Unreal and Frostbite are planning, if they haven't already, to integrate ray tracing. Frostbite already has, as we have seen from the BF5 demo, and we have seen Tomb Raider and Metro Exodus ray tracing demos too. As such, I really do expect some degree of ray tracing to be integrated into the next gen Sony/MS consoles.
@BAMozzy yes, it's always a matter of balancing. I've done some 3D rendering on a Pentium 90, and I remember enabling ray tracing in the software - it was so nice but sooo slow. It comes down to cost and available hardware in the end. Take animation movies from almost 20 years ago (say, Monsters Inc): realtime graphics as available on consumer hardware today are not even remotely close to that. But accounting for the fact that a single frame of that movie took 29 minutes on a Pixar farm... well, you know. Anyway, graphics in games at 30/60fps are getting really nice (Detroit looks good) but still years away from movie quality without some kind of "rendering farm".
EDIT: 29 minutes per frame is for Monsters University, but give or take the hardware leap over all those years in between, it would have taken about the same back then ; )
@NintendoFan4Lyf You are absolutely right, it smells fishy - or it's being used for some obscure thing that we are not aware of.
Consider this: Microsoft only recently changed their policy for cross play. They were originally so adamant that the Xbox could not play with other consoles that Square Enix indefinitely canceled FFXIV on their platform. Sony used to be all for it, and now they are very resistant. Nintendo also used to be strongly against, and they have since (selectively) changed their minds. There is also iOS and Android and other platforms which do not play well with others.
What this means is that you would need dozens of flags, and their usefulness would change constantly, as these policies are very fluid. There are so many policies from each platform holder, and they are constantly changing. There is no way that a 3rd party could, or should, enforce these at any level. That's up to the review process.
Also a big, big, red flag is in the name: bPS4SeenOtherConsole. Two things stand out:
One, "Console", this would be "Platform" or some other generic term. A game developed for iOS, Android, and PS4, would have the same restrictions against cross play as one developed for the Xbox, Switch, and PS4.
Two, "PS4". When you compile for a target, you are that target. There is no need to specify yourself. If this flag were used for that purpose, it would be bSeenOtherConsole (or Platform). Not only because other platforms also have restrictions (yes, even Xbox) and would need to check them as well, but because you would actually have two flags: "SeenOtherPlatform" and "PlatformIsPS4", and combining them just doesn't make sense.
Finally: "SeenOtherConsole" implies that at one point, any point, another console was seen. How is this useful for a networked walled garden? For this purpose, each "console" (or platform) connected to a service would have a platform type flag and the game would filter each entry in the list by the source platform. The fact that the PS4 has seen another console at some point doesn't really matter here.
@NintendoFan4Lyf I am not defending Sony's policy - it's terrible, period. I just want to be clear that Sony was all for cross play when Microsoft was against it; now Microsoft is all for it and Sony is against it. We seem to get a LOT more coverage of "Sony hates cross play" than we did when Microsoft had the same policy.
The code may be authentic, but it does not suggest a walled garden use. In fact, it doesn't even suggest that the other console is not ALSO a PS4. bPS4SeenOtherConsole can easily be read as "PS4 Has Seen (Some) Other Console That's Not This One". It doesn't mean that it's a different platform. There are a dozen meanings I can think of that have nothing to do with cross play.
@PS_Nation and there you have it! Geez Louise! Lol
@BAMozzy good report man! But yeah, been wondering. Why not go with NVIDIA? But it’s too late now right? Ps5 will still get ray tracing? It needs to be around 8TFLOPS yes? But what if Microsoft doubled the flops on Xbox one x? Does Sony need to aim for 12TF’s? And a powerful cpu. And gpu? And memory? Good gosh it’s all too much! Lol
@NintendoFan4Lyf
If I were to guess, I would say that it's for cross-play between PlayStation family systems. For example, you can log in to your PS3 and PS4 (and other PS devices) on the same account simultaneously, without being kicked off of one or the other. If you were logged in to, say, the PS3 and the PS4, and playing a game that had cross play between them, how would you handle that situation?
@adf86 “he tells a tale that’s worth ignoooooring!” Lol
@LaNooch1978 By all accounts, the latest nVidia GPUs with ray tracing are not that 'expensive'. OK, so they are more expensive than a console, but they are 'nVidia' GPUs, and like a lot of other high-end GPUs, a lot more powerful than an SoC with its built-in GPU and CPU. Sony and MS will no doubt custom build their SoC, telling AMD what they want within a set budget - and maybe they could include a section on that chip dedicated to ray tracing, along with the CPU and GPU cores. It's no secret that the 'Jaguar' isn't exactly the most efficient and effective CPU - especially not clocked at 1.6GHz. A modern, multi-threaded 4-core CPU clocked at 1.6GHz is more capable, so if they built it with a 6-core CPU clocked at 3GHz, that's a massive boost to the CPU in a smaller footprint. Add in a smaller GPU (in size, not power) that's more efficient and lower latency, and you have a 'beast' of a console - with room to add in dedicated ray tracing. It's also a guaranteed, long-term income for AMD, as consoles last much longer than GPUs on the market, which are updated and improved annually. AMD also won't be charging MS/Sony much more than 'cost' for the chip. It's not like they have to manufacture a whole separate GPU, package it and distribute it to wholesalers, who then have a mark-up to shops, who then have a mark-up for their profits too. A £600 GPU isn't costing nVidia £600 to make - a lot of that cost is 'profit' for nVidia, middlemen and retailers.
@Kiloman74 The purpose of the GPU is primarily to render frames - the more powerful it is, the quicker it can render a frame, the higher the resolution, the higher the visual settings, or some combination of these. The CPU is ALSO important, and a 'big' reason we see games currently only hitting 30fps - especially those with heavy physics, lots of complex AI, particle effects etc.
Games that drop resolution to enable a 60fps mode significantly reduce the render time, but a lot of the CPU workload remains the same, as it still has to calculate all the same physics etc and tell the GPU what to draw. The point is, it's not just about the GPU, and even if MS were to offer a 12tflop XB2 compared to Sony's 8tflops, all that would mean is that the Xbox could render 4k images quicker and/or with higher visual settings. There are other factors that could make the PS5 better - such as the CPU and RAM, or maybe even a better ray tracing component. Maybe the PS5 wouldn't have the highest resolutions, or it could use creative rendering techniques - such as chequerboard rendering and temporal reconstruction - to match or even 'better' the Xbox. The PS4 Pro is better than the Xbox at chequerboard rendering because the Pro has that built into the GPU, as well as the ability to use half-floats. If a game was built to use 'half floats' entirely, the Pro has the equivalent of 8.4tflops - not all instructions can get away with only half floats, but a combination could match or even better the Xbox's 6tflop GPU, which can't use half floats.
Point is, there are other variables in how games look and run that could come into play. It really depends on how Sony/MS decide they want their SoCs to be built, what features they include and how it's all balanced. This gen, the focus was weighted much more towards the GPU, so we got a 'generational' leap visually but not so much in gameplay and frame rates, because the CPU was the limiting factor.
Next gen could well see a smaller boost in visuals compared to the Pro/XB1X, but a 'leap' in gameplay and performance thanks to a big CPU boost - handling much more complex physics, AI etc within a 16.6ms time frame to give us a lot more 60fps games.
Surely it's just fake? All seems a little too convenient if you ask me.
@JohnnyBastos So is edible toilet paper, but it exists too.
@thedevilsjester A new Xbox? They just had one! So value on the Xbox side is nothing.
Nice bit of digging👍 Looking forward to all the rumours to come. PS5 will be a beast, I reckon.
@BAMozzy @ShogunRok (Erebus was one of the first five beings in existence, born of Chaos.) You see that? Five! PS5 confirmed!?
Apparently Epic are now saying it's code for the Switch version of Fortnite.
I don't believe it myself, as the screenshot clearly shows another entry for Switch!
https://twitter.com/arjanbrussee/status/1035270583007756288
I'm hoping for a Vita 2 but will be holding my breath until we get more info.
@BAMozzy thanks man. So, a 12 TFLOP console with a CPU and memory big and powerful enough to run all games at full-on native 4K at 60FPS - no checkerboarding or any "cheating its way to 4K", no performance issues. If that's possible then it's gonna cost, isn't it? I say Sony should give us two models. I'll aim for the more advanced machine every time......
@Kiloman74 A console could easily get away with 8-9tflops (currently) and still deliver native 4k. The X, for example, has 6tflops and delivers quite a few native 4k games, and its 9GB of RAM (for gaming) is enough to buffer 4k and stream high quality assets. The 'main' issue with the X, though, is the CPU, which limits some games to 30fps and possibly the 'resolution' too.
'Generally', the GPU is there to handle the 'drawing' and processing of the image. The CPU is there to calculate where everything is, the AI, the physics etc, and to tell the GPU what to draw. Sometimes certain things may be offloaded to the GPU - like the physics, for example - because the CPU is too slow.
The more powerful a GPU, the quicker it can draw and process the image, draw a 'higher' resolution image, have higher quality visual settings, or some combination of these. In other words, with a 4x GPU boost, you could draw an image 4x the size in the same time, have much higher visual settings (like higher quality shadows, reflections, draw distances etc), or draw the 'same' smaller image in a quarter of the time.
One thing to remember, though, is that 'tflops' are relative and not indicative of performance when compared to much older or newer hardware. A 3tflop GPU from, say, 5yrs ago wouldn't be as effective as a 3tflop GPU is today - which is perhaps why the X, at only '6tflops', is able to give much higher results than you might expect. On average, most 'X' games offer over 2x the pixel count of the '4.2tflop' PS4 Pro. If you use the 'same' generation of hardware, then you basically need to multiply by 4 to go from 1080p to 4k. New advances, though, reduce latency and improve efficiency, which means you can achieve the same results with 'lower' tflops.
The point I am making is that if the PS4 is 1.84tflops, the PS5 wouldn't necessarily need to be 7.36 (4 x 1.84) to deliver 4x the resolution with current GPU technology. Of course, we are seeing numerous games fall below a rock-solid 1080p - especially at 60fps.
Another aspect of course is 'time'. This is very important with games, as each 'frame' has to be drawn in a certain time to hit certain refresh rates. For 30fps games, that's 33ms, so the CPU and GPU must complete their workloads for each frame within 33ms to hit 30fps. That drops to 16.6ms for 60fps. You basically need to double the GPU to halve the render time, so if it takes a 1.8tflop GPU to render 1080p at 30fps, you would need 3.6tflops to render at 60fps, and then 14.4tflops to render that at 4k, based on the same architecture. However, that's not taking into consideration the CPU, as part of that 33ms is taken up by the CPU's calculations and the instructions it sends to tell the GPU what to draw. It may be that the GPU is only taking 10ms of that time to draw, and the CPU is taking 20ms to do all its work. Maybe with multi-threading that could be halved, and with a boost in speed reduced down to 6ms, so you could go from 30fps to 60fps just by replacing the CPU in this situation - and therefore only need a better CPU and just a 7tflop GPU to hit 4k/60fps.
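The 30fps-to-60fps-via-CPU scenario above checks out with simple arithmetic. The 20ms/10ms split is the hypothetical one from the comment, and the model is deliberately naive - it treats CPU and GPU work as strictly sequential, whereas real engines overlap them:

```python
# Naive serial model: a frame is done when CPU work, then GPU work, finish.
# The millisecond figures below are the hypothetical ones from the comment.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / (cpu_ms + gpu_ms)

# 20ms CPU + 10ms GPU -> 30ms/frame: inside the 33ms budget for 30fps,
# but nowhere near the 16.6ms needed for 60fps.
# Cut the CPU to 6ms (multi-threading + clocks) with the SAME GPU:
# 6ms + 10ms = 16ms/frame, and 60fps becomes reachable.
```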
Unfortunately, there is no calculation that would determine the minimum specs needed for full 4k/60 across 'every' game. Some games are much more demanding than others - FIFA has no trouble running at native 4k/60 on the Pro, yet other games can't run at a full 1440/30. Visual settings affect how long it takes to render an image too. If you want higher quality visual settings - higher than we currently get, with no pop-in or low-resolution shadows, textures etc - then you need a more powerful GPU than one that just boosts resolution at current console settings. Not all games and game engines are created equal - some games have much lower polygon counts for their character models and as such are much less 'taxing' to draw, while other engines may be more 'time' expensive to increase the shadow quality (adding 3ms instead of 1ms to go from low to medium).
Like I said, the new hardware should be more efficient with better latency, so instructions get processed and sent much quicker than older hardware could manage. Going from 'single' thread to multi-thread also has a big effect - often it's just one CPU core that is 'struggling' to do its 'task' in time, so you get a CPU backlog whilst the others are under-utilised, sat around ticking over. With multi-threading, that single core's workload is shared, so instead of just one running at max capacity, they all work much more evenly. Coupled with better latency, they can process everything much faster and get the instructions to the GPU much quicker, so it can start drawing the frame much sooner.
Even a PS3 could render a 4k image, and could probably render a game like 'Pong' in 4k (original look) in time to make it playable - it can't output at 4k though, because of the video out (the HDMI capabilities, for example). The point is, the game, the polygons, the world and draw distances, the amount of detail (whether it's using 2D illustrated backdrops or actual 3D models) etc all factor in to whether a GPU can render the image at a certain resolution within a certain time frame. A 12tflop GPU may not be enough for some games at 4k/60fps with 'ultra' visual settings, but could be more than enough for many others. Again, look at the Pro, which can deliver 4k/60 with FIFA but can't deliver Battlefront at even 1440/60. Or look at Rise of the Tomb Raider, which can't deliver 'enhanced' 1080p at 60fps - despite the Pro's 2.2x GPU boost over the PS4, it can't deliver the PS4 visuals at twice the frame rate consistently. It's all because different settings, including frame rate, depend on more than just the GPU.
Based on older hardware, a 12tflop GPU with a multi-threaded, faster CPU should be more than adequate for native 4k/60fps gaming, but maybe even 8 or 9tflops would be up to that task with adequate console visual settings - so don't be 'disappointed' if the PS5 isn't 12tflops or more. This isn't even factoring in half floats (FP16), which effectively double the number of instructions a GPU can process. The Pro, using half floats, would be an 8.4tflop GPU, but not all instructions can be FP16, so you get anywhere between 4.2 and 8.4tflops with a Pro if the devs can utilise some FP16 instructions.
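The FP16 maths in that last paragraph works out like this. A sketch only: it simply assumes that whatever fraction of GPU time runs double-rate FP16 counts double, which glosses over real scheduling details:

```python
# Effective throughput if `fp16_time_fraction` of GPU time is spent on
# double-rate FP16 (half float) work; the rest runs at the base FP32 rate.
def effective_tflops(base_fp32: float, fp16_time_fraction: float) -> float:
    return base_fp32 * (1.0 + fp16_time_fraction)

# PS4 Pro at 4.2tflops base: no FP16 -> 4.2, all FP16 -> 8.4,
# and anything in between lands in the 4.2-8.4 range quoted above.
```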
I know this is quite long winded, but the crux of it is that games are incredibly variable and the hardware is improving, so it's impossible to state a 'minimum' spec needed to deliver all games at a minimum of native 4k/60 consistently over a 5-7yr period. One of the reasons we see games improve both visually and in performance is game engines, more than just a 'better understanding' of the available hardware. That does come into play too - knowing better what to push onto the GPU, how much time certain processes take, and building games with that in mind.
Another factor we haven't mentioned is ray tracing, which could be very expensive (in time) and, depending on the developers and their use of it, could determine the resolution and frame rate - in other words, you may find that some games run at 1800p and/or 30fps because they are using ray tracing, compared to others using traditional rasterisation and easily hitting 4k/60. There are just too many variables to categorically state what specs will be necessary for a minimum of 4k/60 in every game over the course of the next gen hardware's life cycle.
As for cost, people look at nVidia's GPUs and their relative tflops and think that their 'price' would be indicative of the price of a console. Companies like nVidia are making the GPU, then selling it to wholesalers for profit, who sell it to retailers for profit, who sell it to customers for profit. The same goes for CPUs, RAM etc too. Sony and MS will go direct to AMD, who probably charge 'cost' with maybe a tiny profit on each (pence per 10) that's guaranteed for at least 8yrs - and let's not forget, they make 'millions' of these every year with bulk buying, whereas individual GPUs sell in much lower numbers with a shorter shelf life and a LOT of R&D costs too. Both Sony and MS also have more cores in their GPU than we actually get, to reduce scrappage. If one or two cores aren't 'good', which does happen in mass manufacturing, the chips are still usable because they still have 'enough' working cores to deliver the performance metric. I am sure nVidia and AMD use 'failed' core GPU chips in their lower end products and keep the flawless ones for the top end - hence you get more 'cores' in some GPUs of the same family (look at the GTX specs - they just turn off some cores). The PS4 Pro GPU has 40 cores but only 36 are active, so you get a much higher yield.
I am not saying it's going to be 'cheap', but there are numerous factors that make individual PC parts expensive, and why you can't compare a 12tflop PC GPU's price today with what a possible 12tflop console could cost in a year's time. Consider the Xbox One X: OK, so it's £100 more than the Pro, but you are getting the equivalent of a whole extra PS4's GPU (an extra 43% more tflops), 50% more RAM that's also 50% faster, a much more efficient and better quality cooling system, a 4k HDR Blu-ray player, Variable Refresh Rate capability, Dolby Atmos etc. If you were to ask any PC gamer whether they would spend £100 to get a 43% better GPU with 50% more RAM, boost all the RAM's speed by 50%, upgrade the cooling system and get the other benefits too, I think they would be insane not to take that chance - yet that's 'expensive', despite being cheaper (especially if you factor in inflation) than a lot of consoles were. A PlayStation cost $300 at launch and a Saturn cost $400 - both in 1995 - and factoring in inflation, those become much more expensive than the Pro, and even the X in the Saturn's case.
Anyway, there's no point speculating on the ideal spec and whether or not they could build it at a reasonable price point. We have seen consoles sold at a loss because the manufacturers know they will make money on every game sold for it - especially digital - as well as on peripherals, and they know the manufacturing costs will drop while the price point holds for year(s) afterwards, so it becomes profitable. Again, though, you do get distribution, wholesaler and retailer profit margins on top, so whilst Sony/MS may not make much, if anything, by pricing it to sell, they also factor in profit margins for middlemen and retailers in their RRP. No retailer would want to sell the consoles at a 'loss', so if any losses are to be found, it's with the manufacturer - but like I said, they have avenues to offset these, especially now with annual subscriptions...
Sorry I can't be more definitive but hope you found this informative and/or useful.
I’ll take my time reading all this cos it’s fascinating! Make me a console huh! Lol Mr Cerny did say that 8TFLOPS is needed for true 4K. Then there is the CPU and RAM side of things, right? I’ve always said Sony should let Microsoft reveal their machine first, even release it before the PS5. And if Sony could make two versions of the PS5, the second being more powerful but slightly more expensive. If Microsoft could, then will their next machine definitely double the amount of FLOPS? Then there is AMD. And NVIDIA. They say NVIDIA has the better ray tracing tech, and haven’t people been cursing AMD, the mobile tech in today’s consoles and the Jaguar cores? All this baffles me by the way so please do your thing and proper explain! Lol 😂👍
they!
@Kiloman74 8TF is enough for 4K, at least in the majority of games and with 'reasonable' visual settings. If you want 'ultra' everything, then you may need more, but some settings aren't necessary at 'ultra' - it's more a status symbol, having a rig that can run 'X' game on ultra at 'X' frame rate...
CPU and RAM are important too - it's much more important to have a 'balanced' system. Both the PS4 and Xbox are heavily weighted towards the GPU rather than 'balanced', and that's why games this gen have great visuals but aren't so 'revolutionary' in terms of game design/development. AC: Unity and Just Cause 3 tried to push games in new directions - more complex AI with high NPC counts, and physics-based destruction - both very taxing on the CPU, which is why those games struggled on console. I know a game like Days Gone has a lot of enemies, but not very complex AI or much variation in their design. Still impressive, but Unity's individual character models, each with more individual AI, took too much CPU time, so we got less than optimal performance. Just Cause's destruction was very physics-based, with lots of particles the CPU had to calculate and track, along with how they all interacted with each other - hence it was too much for the CPUs in consoles. Both games run well on PCs.
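A back-of-the-envelope sketch of why that particle interaction work punishes a weak CPU (hypothetical function, not from any actual engine): naive pairwise interaction testing grows quadratically, so doubling the debris count roughly quadruples the work per frame.

```python
def pairwise_checks(n):
    """Number of particle pairs a naive physics step must test.

    Brute-force interaction/collision testing compares every particle
    against every other one, so the cost grows ~n^2 with particle count.
    """
    return n * (n - 1) // 2

# Doubling the debris count roughly quadruples the CPU work:
pairwise_checks(500)   # 124,750 pair tests
pairwise_checks(1000)  # 499,500 pair tests
```

Real engines use spatial partitioning to cut this down, but the underlying scaling is why heavy physics and dense AI crowds hit the Jaguar cores so hard.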
If you look back to 2012/13, people were saying consoles were 'dead' and we were in a financial crisis. Normally, during a crisis, people stop spending on leisure, so it wasn't really a good time to release a new console - especially not one built for 'next-gen' games. MS and Sony both built new consoles and put most of the money into the GPU, keeping costs down on the CPU - making games 'look' next gen even if they don't really push next-gen gaming. Of course we had some games that wouldn't run on last gen - not in the way they do on current gen - thanks to a lot of RAM and better streaming. That's why big open-world games don't have loading screens or small narrow areas that 'hide' the next area loading in.
It doesn't make sense for Sony (or MS) to make two different-spec consoles at launch. It's more likely they would make one console built for digital content - no disc drive - and one with a drive. It makes more sense to follow this gen's model, with a 'second' console coming out 3 to 4 years later to offer enhanced performance, than to release a £400 model and a £600+ higher-spec model at launch. There were numerous factors that drove the 'need' for a Pro/X - the growth of 4K and HDR, the release of VR etc. Last gen, they released HD consoles at the birth of HD, so they didn't necessarily 'need' to release a Pro - they could have targeted 1080p, full HD, mid-way, but that was the target for next gen.
As for ray tracing, NVIDIA has certainly made the most publicity with theirs, but AMD and MS both have their own too. NVIDIA also seems to be targeting the high-end PC market, whereas AMD and MS look to be targeting everyone. AMD is known for giving the best performance for the money, even if it isn't the best overall performance. We'll have to see how these work on console - I expect Sony to use AMD's Radeon Rays and Xbox to use DirectX Raytracing (DXR). Rays are 'expensive' in compute time, but the cost can be cut by casting fewer rays and using denoising to predict the result that more rays would give. Combining traditional rasterisation with rays for only some aspects also cuts the time while still giving 'great' results.
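A toy Monte Carlo sketch of the "fewer rays, then denoise" idea above (all names hypothetical, and the noise model is a deliberate simplification): each ray returns a noisy estimate of a pixel's true radiance, and averaging n rays shrinks the error roughly by 1/sqrt(n) - which is why real renderers can afford few rays per pixel and let a denoiser fill the gap.

```python
import random
import statistics

def shade_pixel(n_rays, true_radiance=0.5, ray_noise=0.2, seed=42):
    """Estimate a pixel's radiance by averaging noisy ray samples.

    Toy model: each ray returns the true radiance plus Gaussian noise.
    Averaging n rays reduces the error roughly by 1/sqrt(n).
    """
    rng = random.Random(seed)
    samples = [true_radiance + rng.gauss(0.0, ray_noise)
               for _ in range(n_rays)]
    return statistics.mean(samples)

few_rays = shade_pixel(4)      # coarse, high-variance estimate
many_rays = shade_pixel(1024)  # variance shrinks ~1/sqrt(n)
```

In a real pipeline the 'denoiser' is a spatial/temporal filter (or a neural network) rather than simply casting more rays, but the budget trade-off is the same.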
The Jaguar was built for 'mobile' tech - not necessarily just mobile phones but netbooks, tablets etc.: low-power-consumption mobile devices. As I said above, the PS4/Xbox were built at a time of crisis, and it was more important to make games 'look' like a step up from the previous gen than to build expensive consoles that might not sell, given the global financial situation and the prediction that consoles were 'dead'. The analysts were obviously wrong: consoles had stagnated because they had been around too long, not because they were dying. People were craving something new, and waiting, so when the consoles did release, people jumped on them despite the economic crisis.
Y’know, when it goes “all digital” and there’s no slot for physical discs? That’s when I’ll give up! Or go back to Nintendo or something! Yeah, the next-gen reveals are going to be very interesting indeed! Oh, one more thing...
A next-gen mid-gen refresh? 8K ready? Are the Pro and the “X” basically the 4K-ready models, if 8TF or more is what’s needed?