Unity Game Engine Makes ‘Digital Twins’ for Industrial Tests
January 20, 2022
Game giant Unity is using its game engine technology to help businesses make “digital twins” of real-world objects, environments and even people. These virtual entities take the brunt of testing products, machines and environments. Dozens of companies reportedly use Unity’s game engine to model digital doubles that sub in for robots, manufacturing lines and buildings, among other things, virtually operating and monitoring them even as they are optimized and trained. The simulated twins rust when exposed to water and respond to conditions like temperature; they can learn to avoid a ditch or call attention to a broken part.
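To make the idea concrete: a digital twin is, at its core, a simulated state machine stepped forward in time, with environmental inputs driving wear and monitoring logic raising alerts. The following is a minimal, engine-agnostic sketch in Python, not Unity’s actual API; the PumpTwin class and its rates and thresholds are hypothetical illustrations of the rust-and-temperature behavior described above.

```python
from dataclasses import dataclass

@dataclass
class PumpTwin:
    """Hypothetical digital twin of a pump; all rates and thresholds are invented."""
    corrosion: float = 0.0       # 0.0 = pristine, 1.0 = fully failed
    temperature_c: float = 20.0  # internal temperature, Celsius
    broken: bool = False

    def step(self, wet: bool, ambient_c: float, dt_hours: float) -> None:
        # Exposure to (virtual) water accumulates corrosion -- the "rust" behavior.
        if wet:
            self.corrosion = min(1.0, self.corrosion + 0.001 * dt_hours)
        # The twin tracks ambient temperature with a simple first-order lag.
        self.temperature_c += 0.1 * (ambient_c - self.temperature_c) * dt_hours
        # Monitoring logic "calls attention to a broken part."
        if self.corrosion > 0.8 and not self.broken:
            self.broken = True
            print("ALERT: corrosion threshold exceeded; inspect pump housing")

twin = PumpTwin()
for hour in range(2000):  # simulate 2,000 hours of wet operation
    twin.step(wet=True, ambient_c=35.0, dt_hours=1.0)
```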
“With an accurate enough digital twin, Unity’s game engine can even collect ‘synthetic data’ off the simulation to better understand and advance its IRL double,” writes Wired, in an article whose headline proclaims “Unity Wants to Digitally Clone the World.”
“We’re actually a massive data company,” Unity senior vice president of artificial intelligence Danny Lange told Wired. “We early on realized that at the end of the day, real-time 3D is about data and nothing but data.”
Unity’s main clients for digital twins are in the often dangerous, and usually expensive, world of industrial machinery, an industry that has learned to substitute virtual simulations for costlier physical models (think crash-test dummies).
“Unity executives believe that the company’s real-time 3D technology and AI capabilities position them to compete with the many other firms entering the $3.2 billion space, including IBM, Oracle, and Microsoft,” Wired writes. The real world is quite limited, Lange said, but “in a synthetic world, you can basically recreate a world that is better than the real world for training systems. And I can create many more scenarios with that data in Unity.”
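Lange’s “many more scenarios” point amounts to scenario randomization: sample scene parameters for each training episode, including rare, safety-critical conditions the real world supplies only occasionally. A rough sketch under that assumption follows; the parameter names and distributions are invented for illustration and do not reflect Unity’s actual tooling.

```python
import random

# Hypothetical scenario randomizer: each draw is a training scene the real
# world might never conveniently supply (rare weather, odd layouts, etc.).
def random_scenario(rng: random.Random) -> dict:
    return {
        "time_of_day_h": rng.uniform(0, 24),
        "rain_intensity": rng.choice([0.0, 0.2, 0.8]),  # clear, drizzle, downpour
        "pedestrian_count": rng.randint(0, 12),
        "sun_glare": rng.random() < 0.1,                # rare but safety-critical
    }

rng = random.Random(42)
dataset = [random_scenario(rng) for _ in range(100_000)]  # cheap to produce at scale
```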
Formerly the head of machine learning for Amazon, then Uber, Lange came to understand that a game engine could be the solution to some of the toughest challenges he’d encountered. “At Uber, Lange would watch as engineers tossed dummies in front of self-driving cars again and again to test vehicles’ ability to brake for humans,” Wired writes, explaining that “The car would have to identify that the object was human-shaped.”
“You can do that a few times, but how many times does it take [to train the AI]? A thousand?” asked Lange. “In a game engine, you can have an NPC actually trying to get killed in front of a car, and you can see if the car can actually prevent it.” (NPC is game language for “non-player character.”) Using Unity, an engineer could generate data equivalent to “driving an Uber vehicle 500 million miles every 24 hours,” according to Wired.
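A toy version of that test loop, purely illustrative: spawn a pedestrian NPC at a random distance, check whether the simulated car can stop in time under simple constant-deceleration braking, and tally failures over a huge number of episodes. The detection range, reaction latency and speed ranges below are assumed values, not anything from Uber’s or Unity’s systems.

```python
import random

def braking_distance_m(speed_mps: float, decel_mps2: float = 7.0) -> float:
    # Kinematics: d = v^2 / (2a); ~7 m/s^2 is roughly hard braking on dry asphalt.
    return speed_mps ** 2 / (2 * decel_mps2)

def episode(rng: random.Random) -> bool:
    """One simulated NPC stepping out in front of a car; True if the car stops in time."""
    speed = rng.uniform(5.0, 20.0)         # vehicle speed, m/s (~18-72 km/h)
    npc_distance = rng.uniform(5.0, 60.0)  # where the NPC appears, meters ahead
    detection_range = 40.0                 # hypothetical perception limit, meters
    reaction_s = 0.2                       # hypothetical system latency, seconds
    seen_at = min(npc_distance, detection_range)
    stopping = speed * reaction_s + braking_distance_m(speed)
    return stopping <= seen_at

rng = random.Random(0)
n = 1_000_000  # millions of trials per day is the whole point of simulating
misses = sum(not episode(rng) for _ in range(n))
print(f"failed to stop in {misses / n:.2%} of {n:,} episodes")
```

The point of the exercise is the episode count: in a simulator, a loop like this runs millions of times a day at negligible cost, which is what Wired’s “500 million miles every 24 hours” figure is getting at.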
Of course, human data, biometric and otherwise, plays a vital role in the process. Ultimately, the article asks the question: “Is it okay to collect lots of data on people if, in the end, those people materialize as NPCs?”