Coolest thing I saw at GDC: software that animates anything

The Star Trek holodeck is one of the most compelling sci-fi technologies: you give a computer some verbal instructions and, boom, you’re on a street in 1940s San Francisco, or wherever you want to be. We may never have holograms that you can touch, but the part where a computer can generate any requested 3D scene is being worked on right now by a small studio in London.

At the Game Developers Conference in San Francisco on Wednesday, Anything World CEO Gordon Midwood asked me what I wanted to see. I said I wanted to see a donkey, and a few seconds later a donkey was walking on the screen in front of us. Sure, it kind of walked like a horse, and yes, all it did was wander around a field, but those are just details. The software delivered on its basic promise: I asked for a donkey and a donkey appeared.

For the next demonstration, Midwood took his hands off the keyboard. “Let’s make an underwater world and add 100 sharks and a dolphin,” he said into the microphone. A few seconds later, I was looking at a dolphin that had shown up at the wrong party: 100 sharks swimming around it.

Developers looking to use Anything World as a game development or prototyping tool will build it into an engine like Unity, but as Midwood demonstrated, it can also produce scenes, objects, and creatures in real time. It was the coolest thing I saw on the GDC show floor, and others have already realized its potential. Roblox is exploring a deal with the company, and Ubisoft is already using the software for prototyping, as well as for a collaborative project called Rabbids Playground.

How it works

With so much blockchain stuff haunting GDC, the sight of an older tech buzzword was comforting. Anything World uses machine learning algorithms developed in part during a University of London research project that lasted over a year. In short, they built automated methods to teach a system to analyze 3D models from sources such as Sketchfab and try to classify, segment, organize, and animate them (or not) in a way that makes sense to human beings. At the moment, it can draw on over 500,000 models.
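The core idea, classify a model first and then attach an animation that fits that category, can be sketched in toy form. Everything below (the tag-based classifier, the rig table, the function names) is purely illustrative and assumed for the example; it is not Anything World's actual system or API:

```python
# Toy sketch of a classify-then-animate pipeline (illustrative only).
# A real system would analyze mesh geometry with learned models;
# here we just match metadata tags to a category.

RIGS = {
    "quadruped": "walk_cycle_4legs",  # donkeys, horses, dogs...
    "fish": "swim_cycle",             # sharks, dolphins...
    "static": None,                   # furniture etc. gets no animation
}

def classify(tags):
    """Guess a rig category from a model's metadata tags."""
    tags = set(tags)
    if tags & {"donkey", "horse", "dog"}:
        return "quadruped"
    if tags & {"shark", "dolphin"}:
        return "fish"
    return "static"

def pick_animation(tags):
    """Return the animation rig to apply, or None for static objects."""
    return RIGS[classify(tags)]

print(pick_animation(["donkey", "animal"]))  # walk_cycle_4legs
print(pick_animation(["table", "wood"]))     # None
```

This also hints at why the system sometimes misfires: anything that slips into the wrong category inherits that category's animation wholesale.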

Of course, sometimes Anything World gets it wrong: the software once thought a table was a quadruped, and another time it believed the top of a pineapple was a spider’s legs, which was “creepy,” says Midwood.

It’s early days (at least compared to Star Trek: The Next Generation, which takes place in the 2360s), but even at this stage it’s fun to see how a machine learning system combines the 3D models it’s been given with what it ‘knows’ about animal locomotion and the rest of the physical world. I felt strangely proud of my trotting donkey, as if I were somehow responsible for bringing it to life just by asking.

For non-developers, Midwood thinks Anything World has potential in super-accessible game creation tools, or just as a fun and useful thing to have on hand. For example, you can use it to create green screen sets in real-time while streaming, or really treat it like a holodeck computer, putting on a VR headset and requesting a scene to relax.

Meta (the company formerly known as Facebook) demonstrated something similar last month, albeit without animated creatures. In response, Anything World released a parody demo. Interpreting what people want at the natural language level is perhaps one of the ultimate goals of all software, so it’s no surprise that there’s competition. Anything World’s technology seems stronger than Meta’s now, however. It’s also a fairly small company, with six machine learning experts and nine other technical roles working on the tool.

In the future, Anything World plans to release versions with higher-fidelity models and animations (an Unreal Engine version is coming, and it plans to use Epic's Quixel models), as well as its own consumer app. Right now, it’s available for use with Unity.

Anything World is a far cry from a Star Trek computer’s understanding of the physical world (I doubt it knows anything about 1940s San Francisco), but just because donkeys walk a little like horses today doesn’t mean they will tomorrow. Midwood still doesn’t promise me a holodeck, but he is confident that the system’s ability to label and animate 3D models will only get more granular and nuanced.

The shark-infested waters that were spawned for me. (Image credit: Anything World)
