
Konvoy’s Weekly Newsletter:

Your go-to for the latest industry insights and trends.

Newsletter | Apr 24, 2026

Gaming x Heavy Industry

Heavy industry is leveraging simulation technology, which was perfected in gaming


Heavy industry uses game engines

Today, there are more than 3 billion gamers, and a large swath of the working population grew up playing video games. They used controllers to navigate simulated worlds; now they sit at the helm of some of the world’s largest industries. It turns out that gaming software, 3D simulation, and a controller-native generation have a lot to offer the world’s heaviest (and often most dangerous) industries.

Game engines are being leveraged far beyond entertainment. They are now permeating almost every industry, especially those that are capital-intensive and high-stakes, where simulating in advance matters most.

For example, adoption of the Godot engine in automotive and manufacturing reached 21% (Perforce) in 2025, up sharply from just 9% in 2024. Even Tesla uses the React Native Godot engine in its app for 3D vehicle visualizations, letting users interact with detailed models of their cars. Keep in mind that heavy industry often operates on very tight margins, so the fact that Godot is free and open source (MIT-licensed) is likely a key driver of its adoption across these sectors.

The simulation infrastructure that the gaming industry has mastered over the last three decades has quietly (and quickly) become the operating backbone of physical AI. This has been especially impactful for autonomous mining, military robotics, and industrial forestry. The winners in these spaces will not be the ones who solely build the best machinery; they will be the ones with the best software stack, led by a generation of talent that grew up with a controller in their hands.

The Simulation Stack: How We Got Here

On March 13, 2004, 15 autonomous vehicles (DARPA) left a starting gate in the Mojave Desert to attempt a 142-mile course from Barstow, California, to Primm, Nevada. The $1 million prize went unclaimed; the farthest any vehicle traveled was just 7.5 miles before getting stuck on a rock (DARPA). The next year, in 2005, DARPA doubled the prize, and 5 of 195 teams (DARPA) completed the desert course.

By 2007, teams in this competition were navigating autonomous vehicles through a mock urban environment, obeying traffic laws and making real-time intelligent decisions. The teams that won were not defense primes; they were university robotics labs, AI researchers, and software engineers whose tools were probabilistic models, machine learning, and sensor fusion: the same underlying disciplines that power modern simulation engines.

That lineage of talent continued to compound. Sebastian Thrun, who led Stanford’s winning team in 2005, went on to build Google’s self-driving project (originally Google Chauffeur, now Waymo). The innovation carried into heavy industry as well: Komatsu has hauled over 10 billion tonnes of material with autonomous mining trucks, and Caterpillar’s driverless fleet has moved approximately 9.3 billion tonnes, with both companies tracing their autonomy software back to DARPA-era simulation research.

Fast forward to today: NVIDIA has meaningfully expanded the infrastructure around simulation. Its Isaac and Jetson robotics platforms now support over 1.2 million developers and 10,000 enterprise customers (NVIDIA FY2024 Annual Report) who are building and deploying AI-driven robots worldwide. Jensen Huang has called Omniverse (NVIDIA’s simulation platform) the operating system for building digital twins. Simulation of autonomous systems lets regulators, operators, and enterprises conduct exhaustive virtual testing before deployment in the physical world (where the stakes are far higher).

The game engine was never just entertainment infrastructure. It was a decades-long proof of concept for how to train machines in virtual worlds before releasing them into the real world.

Simulation Saves Lives

The reason simulation-native autonomy is important is not just because it makes our lives easier or expands margins. It saves lives.

Mining, quarrying, and oil and gas extraction posted a fatality rate of 13.8 deaths per 100,000 workers in 2024, more than 4x the U.S. national average of 3.3 deaths per 100,000. Agriculture and forestry is even worse at 20.9 deaths per 100,000 workers, the single most lethal industry category in America by fatality rate. This is a key reason why we invested in Kodama, a remote forestry and robotics company that was recently featured in the Wall Street Journal.

When the environment itself is this dangerous, removing humans from the machine is not just about increasing technological efficiency. It is about getting humans out of the machines and saving lives.

What is a Simulation-Native Stack?

Simulation-native means that a company builds the virtual environment first, trains its autonomy model in that risk-free sandbox, and validates as many edge cases as possible before deploying hardware in the real world. This behavior in heavy industry is not dissimilar from how the FAA now credits flight simulator hours toward pilot certification (10-30% of hours can be on a simulator, depending on the license). The fidelity of these virtual simulations is now so good that they produce real-world competency.
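As a toy illustration of that sim-first workflow, here is a minimal Python sketch. Every class, function, and threshold here is hypothetical, not drawn from any real vendor's API; the point is only the shape of the loop: randomize a virtual environment, run the policy against it, and validate across many episodes before any hardware is involved.

```python
import random

class SimEnvironment:
    """A toy stand-in for a physics-accurate digital twin (illustrative only)."""
    def __init__(self, seed):
        self.rng = random.Random(seed)

    def reset(self):
        # Randomize conditions so the policy sees varied scenarios each episode.
        return {"obstacle_distance": self.rng.uniform(0.0, 100.0)}

    def step(self, observation, action):
        # Episode succeeds if the policy brakes whenever an obstacle is close.
        should_brake = observation["obstacle_distance"] < 20.0
        return action == "brake" if should_brake else action == "drive"

def policy(observation):
    # A trained autonomy model would go here; this hand-written rule is a placeholder.
    return "brake" if observation["obstacle_distance"] < 20.0 else "drive"

def validate(episodes=1000):
    """Run many randomized virtual episodes before any real-world deployment."""
    env = SimEnvironment(seed=42)
    successes = 0
    for _ in range(episodes):
        obs = env.reset()
        if env.step(obs, policy(obs)):
            successes += 1
    return successes / episodes

print(validate())  # -> 1.0 for this toy policy
```

In a real simulation-native stack, the hand-written rule would be a learned model, the environment would be a high-fidelity digital twin, and the validation suite would deliberately oversample rare edge cases rather than drawing uniformly.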

The same logic is now directly impacting autonomous mining drills, unmanned ground vehicles for the military, and forestry robots; all of which often operate in remote and GPS-challenged terrain. At a high level, the three layers of the autonomous tech stack are:

  1. Perception (sensors, cameras, LiDAR)
  2. Simulation (the virtual twin where the model trains)
  3. Control (the operator interface)
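The three layers above can be sketched as a single tick of a control loop. This is an illustrative Python skeleton, not any vendor's SDK; all class names, method names, and thresholds are assumptions made up for the example.

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    lidar_points: list   # raw LiDAR range returns, in meters
    camera_pixels: list  # raw camera data (unused in this stub)

class PerceptionLayer:
    def detect_obstacles(self, frame: SensorFrame) -> list:
        # Fuse sensor data into obstacle positions; here a simple range filter.
        return [p for p in frame.lidar_points if p < 50.0]

class SimulationLayer:
    def plan_path(self, obstacles: list) -> str:
        # In production this would consult the digital twin; here it's one rule.
        return "reroute" if obstacles else "continue"

class ControlLayer:
    def issue_command(self, plan: str) -> str:
        # The remote operator interface would surface and confirm this command.
        return f"command: {plan}"

# Wire the three layers together for one tick of the loop.
frame = SensorFrame(lidar_points=[12.0, 80.0], camera_pixels=[])
obstacles = PerceptionLayer().detect_obstacles(frame)
plan = SimulationLayer().plan_path(obstacles)
print(ControlLayer().issue_command(plan))  # -> command: reroute
```

Keeping the layers behind narrow interfaces like this is what lets a company swap a simulated sensor feed for a real one without touching the planning or control code.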

Many companies are trying to own each of these layers internally, as that gives them the highest alpha over their competition and improved economics (lower cost, margin expansion) for each of their respective business models.

In mining, Rio Tinto’s haulage fleet in Western Australia’s Pilbara has completed 8.9 million operating hours and moved over 4.8 billion tonnes of material across 10 mine sites, contributing to roughly 80% of the company’s daily production capacity. Since starting trial operations in 2008, Rio Tinto has recorded zero injuries (Mining.com) attributed to its autonomous haul truck fleet. With 200 locomotives, the company also operates the world’s first heavy-haul, long-distance autonomous rail system.

In agriculture, John Deere’s autonomous 8R tractor classifies every camera pixel in approximately 100 milliseconds. It is trained on hundreds of thousands of images simulated across varied farm conditions, weather, and lighting with remote operators supervising multiple machines via tablet from the field edge rather than inside the machine. In forestry, Kodama’s semi-autonomous skidders build a live 3D map of surrounding terrain, automate the repetitive trail navigation, and pass dexterous cutting operations to a remote operator who may be hours away from the machine.

In defense, Overland AI’s ULTRA UGVs navigate contested, GPS-denied terrain using an autonomy stack built and validated through three years of DARPA RACER simulation and field testing, now deployed with U.S. Army and Marine Corps formations.

The simulation stack was perfected by gaming and is now being quickly applied to real-world industries. It is having an immense impact on heavy industries, many of which consumers are not aware of (yet).

Picks & Shovels of the Physical AI Stack

As simulation spreads across these markets, a tech stack is quickly emerging to support, proliferate, and protect the industries adopting it. Here are a few categories that stood out to us:

  • Simulation as a Service - companies that do not build this expertise in-house can work with a third party that specializes in the craft.
  • Controller interface companies - live-operating or monitoring heavy industry equipment could become an outsourced service that companies pay for (not dissimilar to call centers, but certainly more complicated and expensive).
  • Applied Data - generating data via simulations is one thing; making sure that the millions or billions of simulated runs are applicable to a specific, consequential use case is another thing entirely. Firms that specialize here will give industry players an advantage.
  • Domain-specific hardware - a lot of heavy industry happens in remote, intense, and dangerous environments (e.g., space, the deep sea, underground mines, the Arctic Circle, forest fires). Companies that build specialized hardware for these machines will see great success, as the industry is likely to buy from third parties rather than build the entire hardware stack internally (especially given how rapidly hardware is advancing in the private sector).
  • Opaque Data - companies can now leverage large amounts of data that have historically been stored in documentation or in employees' heads. Leveraging this data for simulations could be a large unlock and act as a wedge in a specific vertical.

Takeaway: the gaming industry perfected the virtual world and simulation technology stack. That talent and technology are now bringing game engines and simulation to heavy industry. The benefits go far beyond efficiency and margin expansion (which are immense); simulation is also saving lives in dangerous working conditions (let the robots do it). The Godot engine is seeing broad adoption because it is free and open source, and NVIDIA’s robotics platforms already support 1.2m developers and >10k enterprises. We are in the earliest stages of watching simulation technology transform heavy industries, and the world is going to love the results.
