
Orbis: The Next-Gen PlayStation Takes Shape

Digital Foundry assesses the extraordinary rumours surrounding the next-gen Sony console.

It's shaping up to be one of the most extraordinary tech stories of the year. If the rumours are true, Sony is leaving its Cell architecture behind, instead working closely with AMD on its next-generation PlayStation.

Last week, the story took on a new dimension with Kotaku posting a major scoop: the machine is codenamed Orbis and, furthering the previous rumour, does indeed sport both a main processor and graphics core from AMD. Since Kotaku's exclusive, information has started to leak from other sources - according to VG247, the new console is set for a Christmas 2013 release, putting it directly head-to-head with the debut of the next-gen Xbox.

"The race is on. With PlayStation Orbis, Sony is looking to meet Microsoft head-on with a Q4 2013 release."

Meanwhile, leaked PlayStation Orbis sketches suggest that a touchscreen/camera combination is part of the proposition, while the codename appears to resolve successfully on Sony's developer portal in the same way that Vita codenames also link through to the site - which is IP white-listed, by the way, in case you're wondering why you can't get any further.

The evidence, such as it is, all seems to suggest that the next-generation PlayStation will be arriving more quickly than we imagined, and that its hardware make-up signifies a radical departure from the spirit of the designs conjured up during the Ken Kutaragi era.

It's the rumoured choice of CPU from AMD that has surprised many. With PlayStation 3, Sony continued its tradition of bespoke architecture by helping to create the Cell - a central processing core surrounded by eight satellite co-processors (one of which is deactivated to improve chip yields, while another is blocked off from developers and used for security). Processing performance on the SPUs remains absolutely out of this world, and many argue that it's the defining characteristic of the hardware - the reason its first-party exclusives enjoy technological advantages over their Xbox 360 counterparts. Unfortunately, SPU coding has proved challenging for many third-party games developers, despite the best efforts of Sony's engineering teams in providing robust libraries that allow advanced features to be almost literally dropped into existing code.

There's also the inconvenient truth that a lot of SPU power was spent on tasks typically associated with the graphics chip - in effect, the power of the Cell was often being used to make up for a shortfall in the capabilities of the PS3's RSX. The notion of moving GPU tasks onto the main processor worked for PlayStation 3, but it's somewhat at odds with the way technology is developing generally - the rise of DirectCompute and OpenCL is all about offloading work uniquely suited to the massively parallel architecture found in GPUs, freeing up more CPU time: a complete reversal of Sony's strategy with PS3.
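To illustrate the model those APIs are built around, here's a minimal OpenCL sketch in C: a trivially data-parallel job is packaged up as a kernel and handed to whatever GPU the runtime exposes, leaving the CPU free to do other work. It's purely illustrative - standard OpenCL, nothing to do with Sony's or AMD's actual console toolchains, and the buffer size and kernel are arbitrary.

```c
/* Minimal OpenCL host sketch: offload a data-parallel job (scaling an array)
 * to the GPU. Illustrative of the DirectCompute/OpenCL offload model only.
 * Build against an OpenCL SDK, e.g.: gcc offload.c -lOpenCL */
#include <stdio.h>
#include <stdlib.h>
#include <CL/cl.h>

static const char *kernel_src =
    "__kernel void scale(__global float *data, float factor) {\n"
    "    size_t i = get_global_id(0);\n"
    "    data[i] = data[i] * factor;\n"
    "}\n";

int main(void)
{
    enum { N = 1 << 20 };
    float *buf = malloc(sizeof(float) * N);
    for (size_t i = 0; i < N; i++)
        buf[i] = (float)i;

    cl_platform_id platform;
    cl_device_id device;
    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, NULL);
    cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, NULL);

    /* Copy the work into GPU-visible memory, run one work-item per element,
     * then read the results back - the CPU is free while the GPU crunches. */
    cl_mem mem = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof(float) * N, buf, NULL);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &kernel_src, NULL, NULL);
    clBuildProgram(prog, 1, &device, NULL, NULL, NULL);
    cl_kernel kernel = clCreateKernel(prog, "scale", NULL);

    float factor = 0.5f;
    clSetKernelArg(kernel, 0, sizeof(cl_mem), &mem);
    clSetKernelArg(kernel, 1, sizeof(float), &factor);

    size_t global = N;
    clEnqueueNDRangeKernel(queue, kernel, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(queue, mem, CL_TRUE, 0, sizeof(float) * N, buf, 0, NULL, NULL);

    printf("buf[10] = %f\n", buf[10]);   /* expect 5.0 */

    clReleaseMemObject(mem);
    clReleaseKernel(kernel);
    clReleaseProgram(prog);
    clReleaseCommandQueue(queue);
    clReleaseContext(ctx);
    free(buf);
    return 0;
}
```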

"The exotic, vision-driven hardware that characterised the Kuturagi era is over. Sony's strategy is to buy in the best tech and to concentrate its resources on superior game creation tools."

There is interesting talk in VG247's story about abandoned work on a next-generation Cell processor running with 24 SPU co-processors - a route that would potentially have made backwards compatibility with PS3 an option for the new hardware. Unfortunately, it would also have compounded the issues developers have had with SPU programming, particularly when it comes to scheduling and coordinating tasks. Running six SPUs effectively in parallel took time even for first-party Sony developers (check out all the unused SPU time in Killzone 2), so the notion of handling 24 simultaneously would be even more problematic for those less au fait with the architecture.
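As a rough analogy for the coordination problem - and nothing more than that, since real SPU code also has to DMA its data in and out of each core's 256KB local store - here's a generic fan-out/fan-in job split written with plain POSIX threads. The naive static slicing shown is exactly the kind of thing that leaves workers idle, and keeping every core fed with useful work only gets harder as the count climbs from six towards 24.

```c
/* Generic fan-out/fan-in split across a fixed pool of workers, written with
 * plain POSIX threads purely as an analogy - this is not SPU code.
 * Build with: gcc -pthread jobs.c */
#include <pthread.h>
#include <stdio.h>

#define WORKERS 6      /* imagine scaling this to 24 */
#define JOBS    240    /* one frame's worth of work items */

static int results[JOBS];

static void *worker(void *arg)
{
    long id = (long)arg;
    int per_worker = JOBS / WORKERS;
    /* Naive static slicing: each worker gets an equal chunk. If one chunk
     * turns out heavier than the rest, the other workers sit idle at the
     * end of the frame - the "unused SPU time" problem. */
    for (int j = (int)id * per_worker; j < ((int)id + 1) * per_worker; j++)
        results[j] = j * j;              /* stand-in for real per-job work */
    return NULL;
}

int main(void)
{
    pthread_t threads[WORKERS];
    for (long i = 0; i < WORKERS; i++)
        pthread_create(&threads[i], NULL, worker, (void *)i);
    for (int i = 0; i < WORKERS; i++)
        pthread_join(threads[i], NULL);  /* fan-in: the frame waits for the slowest worker */
    printf("results[12] = %d\n", results[12]);   /* expect 144 */
    return 0;
}
```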

Backing away from Cell may well surprise or even upset some, but it's entirely in keeping with a firm looking to extract maximum performance from its overall silicon budget - and it's exactly the same strategy the firm employed for PlayStation Vita: licensing established technology from the best in the business and concentrating its own efforts on superior development tools. Some may lament that the era of exotic, vision-driven Sony hardware that characterised the Kutaragi years is over, but it's safe to say that a combination of non-standard, challenging architecture and lacklustre devtools contributed significantly to the PS3's uncertain early start.

The PS4 approach gives Sony the ability to hit the ground running from day one: AMD has proven that it can deliver in terms of both CPU and GPU design, and the company has unrivalled experience in combining both into a single chip. While that's unlikely to happen at launch, the ability to consolidate hardware in this way is immensely valuable in reducing costs later in the console's lifecycle - something Microsoft has already achieved with its slimline 360S, and a strategy Sony itself employed in the past with PlayStation 2.

This strategy has already yielded dividends with PlayStation Vita, where Sony's focus on creating the tools to get the best out of the licensed hardware resulted in an extremely strong launch line-up, and it's an area where the platform holder will need to be at the top of its game in its upcoming battle with Microsoft. Having devised and implemented DirectX 11 in direct collaboration with the GPU vendors, the Xbox maker is at the forefront of rendering technology - a crucial advantage the firm has over Sony when it comes to architecting next-gen rendering.

"Memory and bandwidth haven't seen an enormous, generational leap since the launch of the current-gen consoles. Exceeding 2GB of RAM will be a tough challenge for both Sony and Microsoft."

Curiously, VG247's story talks about the new PlayStation also using DX11. While the core hardware available to Sony with Orbis will almost certainly be a DX11-capable part, the notion of Sony actually utilising its competitor's API doesn't hold much water. Far more likely would be the use of OpenGL - something not unknown to Sony, with its existing "PSGL" support for PlayStation 3 - with OpenCL able to carry out the DirectCompute-style functions that are at the heart of DX11.

For its current-generation console, Sony even went so far as to develop its own low-level interface with the graphics hardware - libgcm - and we may well see something similar for the new console. It's believed that something very similar indeed is implemented on Vita, and it's highly likely that Sony would want to preserve as much commonality as possible with the workflows it established for its current machines.

Previously it had been our understanding that the platform holders were keen on extending the current console lifecycle until at least 2014, only unleashing replacement hardware at that point. Bringing the release forward to 2013 creates a fundamental challenge for both the PlayStation and Xbox replacements. The focus on specs is all about the CPU and graphics core, but the idea of a 2013 launch for both consoles has clear ramifications in terms of memory and bandwidth - these important areas have not seen anything like the same level of performance gains since the current-gen console era kicked off in 2005.

Reports on next-generation Xbox target specs have been discounted by many because they mention a paltry 2GB of system RAM - a seemingly conservative amount bearing in mind that PC graphics cards now routinely ship with the same amount of memory. In an age where smartphones pack 1GB of RAM and where even the $25 Raspberry Pi offers 256MB of onboard memory, the notion of such a small amount of memory in both next-gen consoles may seem unlikely - especially as RAM is often cited as the most important element for developers in a fixed hardware design. Unfortunately, with the technologies available today, exceeding 2GB is going to be very challenging for both Sony and Microsoft.

However, RAM comes in many different forms, and the need for high-performance memory limits the options available to the platform holders. The technology used in PC graphics cards - GDDR5 - appears to be the best bet for a console aiming for mass production by mid-2013. The problem here is one of memory density: at the moment, 256MB is the highest amount of memory that can be packed onto a single GDDR5 chip. Multiple chips can be ganged together and accessed in parallel, but then the memory bus that connects them to the rest of the system becomes a lot more complex - and more expensive - to make. At 2GB we're already looking at eight memory modules crammed onto the mainboard, and 4GB would see that doubled to what could be an unmanageable 16, bringing with it the necessity for an expensive 256-bit or even a 512-bit memory bus.
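The arithmetic is simple enough to sketch out. The figures below assume 2Gbit (256MB) GDDR5 devices with one 32-bit channel per chip - the first is the density limit mentioned above, the second is a typical GDDR5 configuration rather than anything confirmed for either console (chips can also share a channel in "clamshell" mode, which keeps the bus narrower at the cost of per-chip bandwidth).

```c
/* Back-of-envelope GDDR5 packaging maths. Assumes 256MB (2Gbit) per chip and
 * a dedicated 32-bit channel per chip - illustrative figures only, not
 * leaked console specifications. */
#include <stdio.h>

int main(void)
{
    const int mb_per_chip   = 256;   /* current top GDDR5 density */
    const int bits_per_chip = 32;    /* one x32 channel per device */
    const int targets_mb[]  = { 2048, 4096 };

    for (int i = 0; i < 2; i++) {
        int chips = targets_mb[i] / mb_per_chip;
        printf("%4dMB -> %2d chips -> %3d-bit bus\n",
               targets_mb[i], chips, chips * bits_per_chip);
    }
    return 0;
}

/* Output:
 * 2048MB ->  8 chips -> 256-bit bus
 * 4096MB -> 16 chips -> 512-bit bus
 */
```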

"Efficiency in design and carefully managing costs will be essential in making launch prices for next-gen consoles affordable and sustainable for the platform holders."

Ten years ago the platform holders may have considered throwing money at the problem to make it go away (360 and PS3 were extraordinarily cutting-edge for their time) but a lot has changed over the last few years, not least due to the near-collapse of the world economy - efficiency in design and carefully managing costs will be essential in making launch prices for next-gen consoles affordable and sustainable for the platform holders.

Generally speaking, console design is a fine art - typically, it's about cramming as much processing potential as possible into a 200W power envelope. While there's clearly a lot of leeway from a design perspective, the materials and production processes available are very much known quantities - supplied as they are by third parties with technological roadmaps that are a matter of public record.

A successor to GDDR5 is some way off, and a solution based on DDR4 - supposedly set for mainstream production later this year - may well be viable, but it would present performance issues, perhaps requiring the addition of expensive eDRAM (it's thought that Wii U uses this solution: ultra-fast "work" RAM backed up by fairly slow memory for the main system RAM). Unless memory densities on GDDR5 improve or an innovative alternative comes to light, 2GB may well be all we get - a far cry from the 8GB many developers crave and which is fast becoming the standard in gaming PCs.
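To put some rough numbers on that performance gap: peak bandwidth is just bus width multiplied by the per-pin data rate. The figures below are illustrative of 2012-era parts - a 5.5Gbps GDDR5 device on a 256-bit bus versus DDR4-2133 on a 128-bit bus - and are assumptions for the sake of comparison, not anything sourced from the console leaks.

```c
/* Rough peak-bandwidth comparison between a GDDR5 setup and a DDR4 one.
 * Data rates are illustrative assumptions for 2012-era parts, not console
 * specs. Peak GB/s = bus width in bits * Gbit/s per pin / 8. */
#include <stdio.h>

static double peak_gbs(int bus_bits, double gbps_per_pin)
{
    return bus_bits * gbps_per_pin / 8.0;   /* bits -> bytes */
}

int main(void)
{
    printf("GDDR5, 256-bit @ 5.5Gbps/pin  : %5.1f GB/s\n", peak_gbs(256, 5.5));
    printf("DDR4,  128-bit @ 2.133Gbps/pin: %5.1f GB/s\n", peak_gbs(128, 2.133));
    /* ~176 GB/s vs ~34 GB/s - hence the talk of pairing slower main memory
     * with a small pool of very fast eDRAM, as Wii U is believed to do. */
    return 0;
}
```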

Could the rush to bring next-gen consoles forward by a year have compromised a key element of the design? Will memory density increase in time for a 2013 mainstream launch, or is there some other application of existing technology we're not yet aware of? How Sony and Microsoft attempt to overcome the 2GB ceiling may well be a crucial factor in the overall power and longevity of their next-generation systems.

Author

Richard Leadbetter

Technology Editor, Digital Foundry

Rich has been a games journalist since the days of 16-bit and specialises in technical analysis. He's commonly known around Eurogamer as the Blacksmith of the Future.
