May 16, 2012, 8:30 AM — The cloud-based gaming scene (what I used to awkwardly refer to as streaming-gaming) got a little more interesting yesterday when Nvidia CEO Jen-Hsun Huang demonstrated a new 'virtual GPU' at (where else?) the GPU Technology Conference in San Jose, CA.
So what the heck does this mean? Basically it sounds similar to existing cloud-gaming initiatives from companies like OnLive and Gaikai, except the GPU sitting in the data-center hardware is designed specifically for cloud computing.
The GPU in question is based on Nvidia's new Kepler architecture, but this model designed for data centers can reportedly handle four times as many games while using half as much power as current 'non-virtual' Kepler-based GPUs. At least that's how I'm interpreting the report, as relayed by VentureBeat. With existing services, a dedicated GPU serves each customer connected to the service. The new virtual GPUs can multi-task and serve four users at a time, which obviously makes for big cost savings for the cloud gaming provider.
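To see why providers would care, here's a quick back-of-the-envelope sketch using the two figures from the announcement (four users per virtual GPU, half the power). Note the absolute wattage below is a made-up placeholder for illustration, not an Nvidia spec:

```python
# Hypothetical baseline: power draw of one dedicated GPU (placeholder value).
DEDICATED_POWER_W = 200.0
# "Half as much power" for the virtual model, per the announcement.
VIRTUAL_POWER_W = DEDICATED_POWER_W / 2
# "Serve four users at a time."
USERS_PER_VIRTUAL_GPU = 4

def watts_per_user(gpu_power_w: float, users: int) -> float:
    """Power a provider spends per concurrent player on one GPU."""
    return gpu_power_w / users

dedicated = watts_per_user(DEDICATED_POWER_W, 1)  # one GPU per customer today
virtual = watts_per_user(VIRTUAL_POWER_W, USERS_PER_VIRTUAL_GPU)

print(dedicated / virtual)  # -> 8.0
```

Whatever the real wattage turns out to be, the two claimed multipliers compound to roughly an 8x improvement in power spent per concurrent player, which is where the cost savings come from.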
The big drawback with cloud gaming so far has been lag. Generally speaking, services like OnLive and Gaikai work fine for casual gamers, but the slight latency between your input and when the game responds is enough to drive hardcore players back to their expensive gaming rigs. Nvidia claims to have solved that. Nvidia senior vice president Dan Vivoli told VentureBeat: "You might play a game in the cloud and have a better response time than the console. Our architecture is so fast we can hide the latency of the network."
That's a bold claim!
Nvidia is partnering with Gaikai, using Gaikai's 24 data centers to build out the Nvidia GeForce Grid and help deliver on the promise of the new virtual GPU. It sounds like this isn't an exclusive arrangement, since the complete list of software partners also includes OnLive, G-Cluster, Ubitus, and Playcast. On the hardware side of the business, Nvidia's partners include HP, IBM, Amazon, Dell, Cisco, and Supermicro.
The promise for the future is that as Nvidia continues to improve its GPU designs, cloud gaming providers could simply swap in the latest generation of chip and seamlessly provide users with an updated gaming experience.
These are the high points of the announcement but VentureBeat has a lot more details, including info on a new multi-tasking technology Nvidia is calling dynamic parallelism. The post is worth a read.
What we don't have yet is any dates or a timeline, so for now we'll have to be content with the slightly laggy but basically sound services on offer from OnLive and Gaikai.