When testing advanced driver assistance systems (ADAS), autonomous vehicle prototypes, or even ordinary cars, it makes sense to perform the most dangerous activities in simulation. Early autonomous systems can be extremely unpredictable, and Tesla warns that its own software “could make the wrong decision at the worst possible time.” So even before installing a system in a car, automakers do extensive simulator work to prepare the software for reality. At the extreme end, Tesla is building a supercomputer with capabilities far beyond what today’s testing strictly requires.
Sooner or later, however, you need to validate your work on the road. Research on driver monitoring has found that a system that routinely makes mistakes keeps the test driver alert, while a system that works flawlessly most of the time but fails occasionally can lull the human test driver into a false sense of security, resulting in wrecks and road deaths. As a result, you need testing that falls somewhere between pure simulation and the open road.
One way companies are bridging this gap in the automotive space is to build simulators that take over the entire car, not just its computer. This approach helps ensure the car is ready for street testing, because the whole vehicle, hardware and software together, has been thoroughly exercised and won’t do anything crazy.
It turns out that autonomous aircraft, including unmanned drones, face similar challenges. Flying a drone to see how it performs can create dangerous situations for the public, but computer simulations of the flight software alone do not make the drone fully ready for real-world testing in real airspace.
So it’s exciting to see Microsoft taking the same approach as the car simulators described above, but with drones.
Microsoft’s Drone Simulator Lab
They begin by telling us the story of Airtonomy CEO Josh Riedy visiting Microsoft’s lab. Wearing VR goggles, he watched from behind a drone in a virtual wind farm. Across the Midwest, Riedy is currently using hyper-realistic simulations to train autonomous aerial vehicles that inspect wind farms, monitor wildlife, and detect leaks in oil tanks.
Looking around, it seemed to him that he really was in the air with the drone, and the drone’s software had similarly been fooled into thinking it was actually flying.
“You don’t want to fly drones into wind turbines or power lines or anything,” Riedy said. “Combined with the fact that winter in North Dakota can literally last 7 months, we realized we needed something beyond the physical world to design our solutions for customers.”
The data generated by Project AirSim is used to train artificial intelligence models on how to act during each phase of flight, from takeoff to cruise to landing. The platform will also provide libraries of simulated 3D environments representing a variety of urban and rural settings, as well as a suite of pre-built, sophisticated AI models to help accelerate autonomous aerial infrastructure inspection, last-mile delivery, and urban air mobility.
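To make the phased training-data idea concrete, here is a minimal, self-contained sketch of how a simulator can emit labeled samples for each flight phase. The phase names, altitude profile, and record fields are illustrative assumptions for this toy example, not the Project AirSim API.

```python
import random

# Illustrative flight phases, matching the takeoff-cruise-landing
# progression described in the article.
PHASES = ["takeoff", "cruise", "landing"]

def simulate_flight(cruise_alt=50.0, steps_per_phase=100, seed=0):
    """Generate (state, phase_label) samples from a simple altitude profile."""
    rng = random.Random(seed)
    samples = []
    alt = 0.0
    for phase in PHASES:
        for _ in range(steps_per_phase):
            if phase == "takeoff":
                alt = min(cruise_alt, alt + 1.0)   # climb toward cruise
            elif phase == "landing":
                alt = max(0.0, alt - 1.0)          # descend to the ground
            noisy_alt = alt + rng.gauss(0.0, 0.2)  # simulated sensor noise
            samples.append({"altitude": noisy_alt, "phase": phase})
    return samples

data = simulate_flight()
print(len(data), data[0]["phase"], data[-1]["phase"])  # → 300 takeoff landing
```

A real pipeline would record much richer state (pose, velocity, camera and lidar frames) at each step, but the shape is the same: the simulator is the labeling oracle, because it always knows which phase the aircraft is in.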
Advances in artificial intelligence, computing, and sensor technology are beginning to change how we transport people and goods, said Gurdeep Pall, corporate vice president of Business Incubation in Technology and Research at Microsoft. And it’s not just rural areas with wind farms; as urban density increases, congested roads and highways are no longer the fastest way to get from one place to another. Instead, companies will turn to drones for transportation.
“Autonomous systems will transform many industries and enable many aerial scenarios, from last-mile delivery of cargo in congested cities to checking power lines 1,000 miles away,” Pall said. “But first we need to safely train these systems in a realistic, virtualized world. Project AirSim is a critical tool that allows us to bridge the gap between the world of bits and the world of atoms, and it demonstrates the power of the industrial metaverse—virtual worlds where businesses can build, test, and refine solutions and then bring them to the real world.”
AirSim, an earlier open-source project from Microsoft Research that is now retired but inspired today’s announcement, was intended to provide high-fidelity simulation. AirSim was a popular research tool, but it required significant programming and machine learning expertise. Microsoft has now turned the open-source technology into an end-to-end platform that allows advanced aerial mobility (AAM) customers to more easily test and train AI-powered aircraft in 3D environments.
“Everyone is talking about AI, but very few companies are able to build it at scale,” said Balinder Malhi, head of engineering for Project AirSim. “We built Project AirSim with the core capabilities we believe will help democratize and accelerate aerial autonomy—the ability to accurately simulate the real world, capture and process large amounts of data, and encode autonomy without the need for deep AI expertise.”
Developers can use Project AirSim to access pre-trained AI building blocks, such as high-quality models for locating and avoiding obstacles and performing precision landings. These out-of-the-box features eliminate the need for deep machine learning expertise, enabling more people to train autonomous aircraft.
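As a rough illustration of what an obstacle-avoidance building block does for a developer, here is a toy geometric version: veto any waypoint that comes within a safety radius of a known obstacle. The function name, the safety radius, and the simple distance check are assumptions for this sketch; the actual Project AirSim models are learned, sensor-driven components, not hand-written geometry.

```python
import math

def is_waypoint_safe(waypoint, obstacles, safety_radius=5.0):
    """Return True if the waypoint keeps at least safety_radius (meters)
    from every obstacle; a planner would discard unsafe waypoints."""
    for obstacle in obstacles:
        if math.dist(waypoint, obstacle) < safety_radius:
            return False
    return True

# Hypothetical wind-turbine tower positions (x, y, z) in meters.
turbines = [(0.0, 0.0, 30.0), (100.0, 0.0, 30.0)]
print(is_waypoint_safe((50.0, 0.0, 30.0), turbines))  # midway, clear → True
print(is_waypoint_safe((2.0, 0.0, 30.0), turbines))   # 2 m from a tower → False
```

The value of shipping such blocks pre-trained is exactly what the article describes: an inspection operator can reuse a vetted avoidance model instead of building and validating one from scratch.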
Project AirSim is now available in limited preview. Microsoft is also working with industry partners to bring in accurate simulations of weather, physics and, most importantly, the sensors an autonomous aircraft uses to “see” the world. Through a partnership with Ansys, customers can use highly accurate physics-based sensor simulations to generate rich ground-truth data for autonomous aircraft. Meanwhile, Microsoft and MathWorks are collaborating on ways for customers to import physics models built in Simulink into Project AirSim.
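The sensor-simulation idea can be shown with a toy model: take a ground-truth value the simulator knows exactly, then degrade it with noise and occasional dropouts the way a real rangefinder or lidar return would degrade in rain or glare. The function, its parameters, and the noise model are assumptions for illustration, not Ansys’ actual physics-based sensor models.

```python
import random

def simulate_range_sensor(true_range_m, rng, noise_std=0.05, dropout_prob=0.02):
    """Return a noisy range measurement in meters, or None when the
    return is lost entirely (e.g. rain or glare)."""
    if rng.random() < dropout_prob:
        return None
    return true_range_m + rng.gauss(0.0, noise_std)

rng = random.Random(1)
readings = [simulate_range_sensor(20.0, rng) for _ in range(1000)]
valid = [r for r in readings if r is not None]
mean = sum(valid) / len(valid)
print(len(valid) < 1000, abs(mean - 20.0) < 0.1)  # some dropouts; mean near truth
```

The payoff for training is the pairing: every degraded measurement comes with the exact ground truth it was derived from, which is precisely the “rich ground-truth data” the partnership is meant to generate.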
Technology alone will not be enough to usher in the era of autonomous flight. The industry must also navigate existing aviation oversight systems and regulatory requirements. To accelerate this work, the Project AirSim team is currently collaborating with standards bodies, civil aviation authorities, and regulators to develop the required standards and compliance tools.
Microsoft’s Pall said the company wants to work with global civil aviation authorities on how Project AirSim can help certify safe autonomous systems, potentially providing scenarios within AirSim that an autonomous drone would need to complete successfully. In one scenario, there is heavy rain, strong wind, and a dropped GPS connection. If the drone can get from A to B every time under those conditions, Pall says, that would be a significant step toward certification.
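Pall’s certification idea is essentially a repeatable test harness: run the same A-to-B scenario many times under adverse conditions and require success on every run. Here is a hedged, self-contained sketch of that shape; the drift model standing in for the flight is invented for illustration, and a real harness would call into the simulator instead.

```python
import random

def fly_a_to_b(rng, wind_mps, gps_dropout):
    """Toy flight model: cross-track drift accumulates with wind, and
    grows further during simulated GPS outages."""
    drift_m = 0.0
    for _ in range(60):                         # 60 simulated control steps
        drift_m += rng.uniform(0.0, wind_mps) * 0.01
        if gps_dropout and rng.random() < 0.1:
            drift_m += 0.05                     # uncorrected drift without GPS
    return drift_m < 10.0                       # success = within 10 m of B

def certify(runs=100, seed=42):
    """Pass only if every run succeeds under rain-storm conditions."""
    rng = random.Random(seed)
    return all(fly_a_to_b(rng, wind_mps=15.0, gps_dropout=True)
               for _ in range(runs))

print(certify())
```

The point of the all-runs-must-pass criterion is that certification cares about the worst case, not the average case: one failed run out of a hundred is a failed certification scenario.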
AirSim pioneer Ashish Kapoor helped transform the simulation engine from a code-based research tool into a more robust platform that can be used without deep technical expertise. A pilot himself, Kapoor can’t wait to see what this development means for the aviation industry.
“In Project AirSim, a ton of data is generated when an aircraft flies through space,” said Kapoor, now senior manager of Microsoft’s autonomous systems research group. “Our ability to capture and transform this data into autonomy will significantly change the landscape of aviation. And that’s why we’ll see more vehicles in the sky helping to monitor farms, inspect critical infrastructure, and transport goods and people to the most remote locations.”
Featured image courtesy of Microsoft and Airtonomy.