As teased at the start of this week, Sony AI and Gran Turismo 7 developer Polyphony Digital have announced their joint "breakthrough project": an AI agent that beats the lap times of professional drivers. Named Gran Turismo Sophy, the AI will not be part of Gran Turismo 7, at least at launch, but the team is working out how to implement it into future entries in the GT series, and it's hoped the research will aid the development of self-driving cars as well as self-learning AI in games.

Gran Turismo Sophy played Gran Turismo Sport for hours until it had learnt enough to beat pro drivers in races, including Gran Turismo world champion Takuma Miyazono. Speaking to Wired through a translator, he said: "Sophy is very fast, with lap times better than expected for the best drivers. But watching Sophy, there were certain moves that I only believed were possible afterwards." To achieve this, a neural network was trained to operate the controls and was given feedback on the outcome of each action, gradually refining its driving.

"The approach, known as reinforcement learning, is inspired by the way animals respond to success and failure in the real world. Although it is decades old, the method has come to the fore in recent years, thanks to more sophisticated algorithms, more powerful computers, and more copious training data."

Kazunori Yamauchi, the creator of the Gran Turismo series, says the most impressive thing about Sophy is its ability to avoid incurring penalties. The technology is planned for "future versions of the game" and is likely to be used to help novice drivers get up to speed and veteran racers further improve their craft. As Yamauchi puts it: "Sophy takes some racing lines that a human driver would never think of. I think a lot of the textbooks regarding driving skills will be rewritten."

You can watch clips of Gran Turismo Sophy in action against professional drivers via the source links below.

[source ai.sony, via wired.com]