Near misses like the one at New York’s John F. Kennedy International Airport inspired a group from the AirLab in Carnegie Mellon University’s Robotics Institute (RI) to create World2Rules, an AI system that learns interpretable safety rules from data to analyze, verify and explain potential collision scenarios.
By learning from both everyday airport activity and documented safety violations, the system builds a clear picture of what “normal” and “unsafe” behaviors look like. When it detects a potential violation, the system does more than just raise an alert. It identifies which safety rule is being broken and explains why the situation is risky, showing how the scenario matches known patterns of danger rather than issuing a vague warning.

“The overall idea we’ve been working on with this project is to see how we can improve safety in the aviation domain or other safety-critical domains,” said Jack Wang, an RI master’s student. “As shown on the news, runway incursions have been happening. Sometimes they’re minor, but sometimes they can be quite catastrophic.”
Wang is passionate about aviation safety and flight. He joined the CMU Flying Club as a first-year student and later taught a Student College course to help students get the ground instruction they need to pursue a pilot’s license.
The World2Rules team wanted to design an AI system that could not only recognize when aircraft were on a dangerous path, but could also predict potential collisions early enough to give pilots and controllers critical extra moments to react.
To do so, the AirLab and the Bot Intelligence Group jointly developed the Amelia-42 dataset. The set contains two years of Federal Aviation Administration airport surface movement data from 42 U.S. airports, tracking aircraft and ground-vehicle movement across runways and taxiways. To process this large amount of information, they used the Bridges-2 supercomputer at the Pittsburgh Supercomputing Center.
“The data we collected includes both normal airport operations and crash and incident reports,” said Jay Patrikar, a recent RI graduate who worked on World2Rules and was also a founder of the CMU Flying Club. “That data helps our system distinguish between normal and unsafe situations. We not only want to understand that a crash is happening, but also want to predict if a crash will happen in the future.”
World2Rules is designed to plug into a broader collision-prediction pipeline. It learns explicit safety rules from the Amelia dataset, recognizing patterns that lead to unsafe situations, such as aircraft occupying the same runway at the same time. It then applies those rules to aircraft trajectories, flagging when a future scenario would violate them. Instead of simply signaling risk, the system can then point to the specific rule being broken and explain why the behavior is dangerous in terms humans can understand.
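The rule-checking step described above can be sketched in a few lines. The example below is purely illustrative: the class and function names, the runway-naming convention, and the "single occupancy" rule itself are assumptions for the sketch, not the actual World2Rules implementation.

```python
from dataclasses import dataclass

@dataclass
class TrackPoint:
    """One sample of a predicted surface trajectory (hypothetical shape)."""
    aircraft_id: str
    time_s: float   # timestamp within the predicted horizon
    zone: str       # runway or taxiway segment occupied at that time

def violates_single_occupancy(points: list[TrackPoint]) -> list[tuple]:
    """Flag (time, zone, aircraft pair) where two aircraft share a runway."""
    occupied: dict[tuple, str] = {}
    violations = []
    for p in sorted(points, key=lambda p: p.time_s):
        key = (p.time_s, p.zone)
        # Only runway zones (here assumed to start with "RW") fall under this rule.
        if p.zone.startswith("RW") and key in occupied and occupied[key] != p.aircraft_id:
            violations.append((p.time_s, p.zone, occupied[key], p.aircraft_id))
        occupied[key] = p.aircraft_id
    return violations

track = [
    TrackPoint("N123", 10.0, "RW28L"),
    TrackPoint("N456", 10.0, "RW28L"),  # same runway, same time: violation
    TrackPoint("N789", 10.0, "TWY-A"),  # taxiway, outside this rule's scope
]
print(violates_single_occupancy(track))
```

Because each flagged item names the rule's zone and the aircraft involved, the output can be turned directly into the kind of specific, human-readable warning the article describes, rather than a bare risk score.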
“In practice, this ideally would mean air traffic controllers or automated systems could get earlier, clearer warnings of potential dangers,” Wang said.

To make sense of all that data, World2Rules combines two types of AI approaches, neural and symbolic. The neural side picks up on patterns buried in the airport data. The symbolic side turns those patterns into clear, logical rules that humans can read. By pairing pattern recognition with rule-based reasoning, the system can both identify risky situations and explain them in a structured way.
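The neural/symbolic pairing might be wired together as below. Both pieces are stand-ins under stated assumptions: the "neural" scorer is a placeholder function rather than a trained model, and the rule names, texts, and threshold are invented for illustration.

```python
# Hypothetical rule base on the symbolic side: readable statements a
# controller could act on. Not the actual World2Rules rule set.
RULES = {
    "runway_incursion": "Two aircraft may not occupy the same runway at once.",
    "hold_short": "An aircraft must hold short of an active runway until cleared.",
}

def neural_pattern_score(features: dict) -> dict:
    """Stand-in for the learned model: a risk score per known pattern."""
    scores = {"runway_incursion": 0.0, "hold_short": 0.0}
    if features.get("shared_runway"):
        scores["runway_incursion"] = 0.95
    if features.get("crossed_hold_line"):
        scores["hold_short"] = 0.8
    return scores

def explain(features: dict, threshold: float = 0.5) -> list[str]:
    """Symbolic side: map high-scoring patterns to rule-level explanations."""
    scores = neural_pattern_score(features)
    return [
        f"Rule violated: {name} -- {RULES[name]} (confidence {s:.2f})"
        for name, s in scores.items() if s >= threshold
    ]

print(explain({"shared_runway": True}))
```

The design point is the interface between the halves: the pattern detector can be retrained or swapped out, while the explanations stay anchored to a fixed, auditable set of rules.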
“Beyond aviation, World2Rules could also be used in other areas where safety is critical,” said Sebastian Scherer, an associate research professor in the RI and head of the AirLab. “The system can be adapted to different environments by teaching it the relevant rules and behaviors for that domain. Once that information is defined, the same core technology can learn and monitor safety risks without needing to be redesigned.”
The team reported their results at the NASA Formal Methods Symposium in Los Angeles earlier this month.

