Model-Checking of Safety-Critical Software for Avionics
by Darren Cofer, Michael Whalen and Steven Miller
The adoption of model-based development tools is changing the cost-benefit equation for the industrial use of formal methods. The integration of formal methods such as model checking into software development environments makes it possible to fight increasing cost and complexity with automation and rigour.
By any measure, the size and the complexity of the safety-critical software deployed in commercial and military aircraft are rising exponentially. Current verification methods will not be able to cope effectively with the software being developed for next-generation aircraft. New verification processes are being developed that augment testing with analysis techniques such as formal methods. These processes will help ensure that the advanced functionality needed in modern aircraft can be delivered at a reasonable cost and with the required level of safety.
In the past, formal methods have not been widely used in industry due to a number of barriers:
- the cost of building separate analysis models
- the difficulty of keeping these models consistent with the software design
- the use of unfamiliar notations for modelling and analysis
- the inadequacy of tools for industrial-sized problems.
The widespread use of model-based development (MBD) tools is eliminating the first three barriers. MBD refers to the use of domain-specific (usually graphical) modelling languages that can be executed in simulation before the actual system is built. The use of such modelling languages allows engineers to create a model of the system, execute it on their desktop, and automatically generate code and test cases. Furthermore, tools are now being developed to translate these design models into analysis models that can be verified by formal methods tools, with the results translated back into the original modelling notation. This process leverages the original modelling effort and allows engineers to work in familiar notations for their domain.
The fourth barrier is being removed through dramatic improvements in analysis algorithms and the steady increase in computing power readily available to engineers due to Moore's Law. The combined forces of faster algorithms and cheap hardware mean that systems that were out of reach a decade ago can now be analyzed in a matter of seconds.
Model checking is a category of formal methods that is particularly well suited to integration in MBD environments. A model checker will consider every possible combination of system input and state, and determine whether or not a specified set of properties is true. If a property is not true, the model checker will produce a counterexample showing how the property can be falsified. Model checkers are highly automated, requiring little to no user interaction, and provide the verification equivalent of exhaustive testing of the model (see Figure 1).
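The exhaustive exploration that distinguishes model checking from testing can be illustrated with a minimal sketch (the names and the toy system here are illustrative, not part of the Rockwell Collins tools): a breadth-first search that visits every reachable state of a small state machine and, when an invariant fails, returns the counterexample trace leading to the violation.

```python
from collections import deque

def check_invariant(initial, step, inputs, invariant):
    """Explore every reachable state exhaustively; return None if the
    invariant holds everywhere, else a counterexample trace of states."""
    frontier = deque([(initial, [initial])])
    seen = {initial}
    while frontier:
        state, trace = frontier.popleft()
        if not invariant(state):
            return trace  # counterexample: a path from the initial state
        for inp in inputs:
            nxt = step(state, inp)
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, trace + [nxt]))
    return None

# Toy system: a counter that is required never to exceed 3, but a bug
# lets it saturate at 4 instead of 3.
def step(state, inp):
    if inp == "inc":
        return min(state + 1, 4)  # bug: should saturate at 3
    return 0                      # "reset"

print(check_invariant(0, step, ["inc", "reset"], lambda s: s <= 3))
# → [0, 1, 2, 3, 4]
```

Real model checkers use far more sophisticated state-space representations (symbolic encodings, SAT-based search), but the contract is the same: either the property holds in every reachable state, or a concrete counterexample is produced.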
Automated Translation Framework
In collaboration with the University of Minnesota under NASA's Aviation Safety Program, Rockwell Collins has developed a translation framework that bridges the gap between some of the most popular industrial MBD languages and several model checkers. These translators work primarily with the Lustre formal specification language, developed by the synchronous language research group at Verimag. Models developed in Simulink, Stateflow or SCADE are transformed into Lustre, and then successively refined and optimized through a series of configurable translation steps. Each step produces a new Lustre model that is syntactically closer to the target model checker specification language and preserves the semantics of the original model. This customized translation approach allows us to select the model checker whose capabilities are best suited to the model being analyzed, and to generate an analysis model that has been optimized to maximize the performance of the selected model checker.
Our translation framework is currently able to target eight different formal analysis tools. Most of our work has focused on the NuSMV model checker, developed jointly by ITC-IRST (Center for Scientific and Technological Research, Trento, Italy) and Carnegie Mellon University, and on the Prover model checker developed by Prover Technologies.
One of the largest and most successful applications of our tools was on the ADGS-2100, a Rockwell Collins product that provides the cockpit displays and display management logic for large commercial aircraft. The software requirements were captured as 593 properties to be checked. Verification was performed early in the design process while the design was still changing. By the end of the project, the product engineers were using the analysis tools to check properties after every design change. We were able to find and correct 98 errors in the early design models, thus improving the quality of the generated code and reducing the downstream testing costs.
In the Certification Technologies for Flight Critical Systems (CerTA FCS) project funded by the US Air Force, we have analyzed several software components of an adaptive flight control system for an unmanned aircraft. One system we analyzed was the redundancy manager which implements a triplex voting scheme for fault-tolerant sensor inputs. We performed a head-to-head comparison of verification technologies with two separate teams, one using testing and one using model checking. In evaluating the same set of system requirements, the model-checking team discovered twelve errors while the testing team discovered none. Furthermore, the model-checking evaluation required one-third less time.
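The flavour of property one checks on such a redundancy manager can be conveyed with a small sketch (a generic mid-value-select voter over a small discrete domain, not the actual CerTA FCS implementation): exhaustively enumerate every combination of two good readings and one arbitrary faulty reading, and confirm that the voted output always stays within the range of the good readings.

```python
from itertools import product

def triplex_vote(a, b, c):
    """Mid-value select: the median of three redundant sensor readings,
    which masks any single faulty channel (illustrative sketch)."""
    return sorted([a, b, c])[1]

# Exhaustive check, in the spirit of model checking: over a small
# discrete domain, no single faulty reading can drive the voted
# output outside the range spanned by the two good readings.
values = range(-5, 6)
masks_single_fault = all(
    min(g1, g2) <= triplex_vote(g1, g2, bad) <= max(g1, g2)
    for g1, g2, bad in product(values, repeat=3)
)
print(masks_single_fault)  # → True
```

A model checker establishes the same kind of property for the real design, but over its full input and state space rather than a hand-picked domain, which is why it can expose corner-case errors that a finite test suite misses.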
Current research is focused on further expanding the range of models where model checking can be effectively applied. Our framework can deal with integers and fixed-point data types, but new analysis methods are needed to handle larger data types, floating point numbers and nonlinear functions. Work on the class of analysis algorithms known as SMT model checkers appears promising.
Several commercial MBD environments have begun to incorporate model checkers. This should make formal analysis more widely accessible to software engineers. However, it is not yet clear whether the same power and flexibility can be provided in off-the-shelf development environments as is available in custom approaches.
A major open question is the impact that formal verification technology will have on the software aspects of aircraft certification. The international committee (SC-205/WG-71) responsible for the next generation of certification guidance (DO-178C) is addressing this question.
Rockwell Collins formal methods: http://shemesh.larc.nasa.gov/fm/fm-collins-intro.html
Advanced Technology Center