
Success of Formal Methods Implemented in the LDRA tool suite

Veena BN

© 2013 LDRA Ltd. Copyright © 2005 Liverpool Data Research Associates Limited

Agenda
Introduction
Formal Methods: Why? Where? How?
Mathematical models & algorithms by stealth
Industrial strength formal methods
Techniques & methods implemented in the LDRA tool suite
Conclusion
Summary

What are Formal Methods?


A Formal Method is defined as a mathematically based analysis technique which has a defined semantics. This is the definition adopted by the avionics community in DO-178C [DO1]. A Formal Method is also required to have the property of soundness,
intended to demonstrate that the technique has been subjected to peer review or can be shown to be valid.

LDRA Ltd

Liverpool Data Research Associates
Founded 1975
Provider of Test Tools & Solutions
Metrics Pioneer
Consultancy, Support, Training
Active participation in standards such as DO-178B/C, MISRA C/C++
4

The LDRA tool suite


Widely known for its extremely powerful Dynamic Analysis, Unit Test and Object Code Verification (Level A) capabilities, where it has long been a world leader. Also well known for its Static Analysis capability.

However, it has not been acknowledged in the formal methods arena.

Part of the reason is that LDRA have deliberately avoided the association, because so many software engineers are fearful of the mathematical overtones.

Formal Methods: Where? How?


Users are perfectly happy to discuss issues such as data flow anomalies without the faintest idea as to how one might go about finding such anomalies.
The underlying graph theory and sophisticated mathematics are a total mystery.

This paper documents some of the most commonly used Formal Methods which have been implemented in the LDRA tool suite for many years

Industrial Strength Formal Methods


The application areas where LDRA tools are used are extremely diverse. Frequently the users are pushing the extreme corners of the programming languages, and the LDRA tool suite is always expected to be able to perform its analyses. Tool suite users rarely confine themselves to carefully selected subsets of the languages; rather, they use the full language spectrum.

LDRA tool suite Formal Methods


The LDRA tool suite makes extensive use of Formal Methods techniques in order to detect defects in software source and object code. The Formal Methods implemented in the LDRA tool suite are primarily variants of modelling methods. The algorithms which implement these models have been refined over as many as 40 years and cope with multifarious programming constructs.

Formal Methods Techniques


The LDRA tool suite produces two underlying mathematical models of the programs being analysed:
Control Flow Model
Data Flow Model

Mathematical Models
9

Control Flow Model


The control flow model is based on the syntax and semantics of each specific programming language. It handles diverse constructs such as:
Recursion (single procedure, multi-procedural, multi-file)
Procedural parameters
Pointers to procedures
Multi-threading, tasking, concurrent processes
Exception handling

The control flow model is system wide: multi-procedural, multi-file.
10

Control Flow Model: Example Graph

11

Control Flow Model

Flow Graph

Annotated Flow Graph

12

The Data Flow Model


Powerful graph-theoretic algebras are applied to the system-wide control flow model to yield a number of different types of analysis. Defects detected include:
references to uninitialized variables
wasted computations on variables
variables which do not contribute to outputs
parameter mismatches of various types

The model is system wide and includes variable aliasing through procedure interfaces.
13

Data Flow Analysis


Analysis is based on the declaration and scope of program variables. The operations performed on a variable are classified as:
reference (R: use in a computation)
definition (D: use on the left-hand side of an assignment operation)
The values of variables at declaration and after end of scope are treated as undefined (U).

UR, DD and DU anomalies are reported.
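A minimal C sketch of the three anomaly classes (identifiers invented for illustration; not taken from the tool's documentation):

```c
#include <stdio.h>

/* Deliberately anomalous example: the comments mark the U/R/D events
   that give rise to each reported anomaly. */
int anomalies(int input)
{
    int a;                  /* value of a is Undefined (U) at this point */
    int b;
    int result = 0;

    if (input < 0)
        result = a;         /* UR anomaly: a Referenced while still Undefined */

    b = input * 2;          /* Definition of b */
    b = input * 3;          /* DD anomaly: Defined again with no intervening Reference */
    result += b;

    int c = input + 1;      /* Definition of c */
    return result;          /* DU anomaly: c leaves scope without being Referenced */
}

int main(void)
{
    printf("%d\n", anomalies(5));
    return 0;
}
```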

14

Data Flow Analysis: from the tool suite


Data flow + Violations

Procedure information
15

Data Coupling Analysis


This technique investigates the way in which procedures interact with data items which are not local to that procedure. Procedures acquire external data items in two ways:
parameters and global variables

For example, a global variable, when passed as a parameter in a call, then has two access mechanisms inside the procedure.
The danger arises:
firstly, from the programmer failing to appreciate this fact and thinking they are distinct;
secondly, from a compiler treating them as distinct when the programmer thinks they are the same.
The use of pointers makes it worse.

The tool has algorithms to detect problems of this type.
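A minimal C sketch of the aliasing danger described above (names invented): a global reached both directly and through a parameter.

```c
#include <stdio.h>

int total = 0;                   /* global data item */

/* 'counter' can alias the global 'total' if the caller passes &total:
   the same storage is then reachable by two names inside the procedure. */
void accumulate(int *counter, int amount)
{
    *counter += amount;          /* updates the object behind the pointer... */
    total += amount;             /* ...and the global: a double update if they alias */
}

int main(void)
{
    accumulate(&total, 10);      /* global passed as the parameter: aliasing */
    printf("total = %d\n", total);   /* 20, not the 10 the programmer may expect */
    return 0;
}
```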


16

File Handler Analysis


This technique looks at the use of file handlers (I/O streams, files, etc.). The Control Flow Model is annotated with the operations performed on the file handlers:
open, close, assignment, aliasing, etc.

The objective is to search system wide to find instances (on any path) of:
files written to before being opened
files written to after being closed
files written to but never closed
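An invented, deliberately defective C sketch containing the three patterns (the worst paths are guarded so they are not exercised by a normal run, but they exist for the system-wide path search to find):

```c
#include <stdio.h>

int main(int argc, char **argv)
{
    (void)argv;
    FILE *log = NULL;

    if (argc == 99)
        fputs("early\n", log);      /* written to before being opened */

    log = fopen("run.log", "w");
    if (log == NULL)
        return 1;

    fputs("running\n", log);
    fclose(log);

    if (argc == 99)
        fputs("late\n", log);       /* written to after being closed */

    FILE *trace = fopen("trace.log", "w");
    if (trace != NULL)
        fputs("trace\n", trace);    /* written to but never closed */

    return 0;
}
```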

17

Storage Analysis
This model is, at present, exclusive to C. The problem is to identify the careless use of storage:
storage allocated and then not de-allocated correctly
releasing memory that was never allocated (also reported)
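An invented, deliberately defective C sketch of both storage problems; both are found statically even though only the harmless path runs here:

```c
#include <stdlib.h>
#include <string.h>

static void leak_example(void)
{
    char *buffer = malloc(64);
    if (buffer == NULL)
        return;
    strcpy(buffer, "hello");
    /* leak: 'buffer' allocated but never freed on this path */
}

static void bad_release_example(int use_heap)
{
    char local[16];
    char *p = use_heap ? malloc(16) : local;
    free(p);            /* if use_heap is 0, this releases memory never allocated */
}

int main(void)
{
    leak_example();
    bad_release_example(1);   /* the defective path exists even though not taken */
    return 0;
}
```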

18

Pointer Analysis
The data flow model is enhanced with the pointer variables and the operations performed upon them. The operations include aliasing over procedure boundaries and dereference operations.

Caveat: since this is a static model and pointer operations are a dynamic issue, the model has certain limitations.

19

Null Pointer Checking


The problem being addressed by this model is to trap the possibility of using a null pointer, i.e., a pointer which has no valid value.

This is accomplished by searching an annotated data flow model which is enhanced with all of the branching conditions. Any use of a pointer, in any context, on a path which does not contain a successful test of the pointer's value is then flagged.
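An invented C sketch of the pattern: one path tests the pointer, the other does not.

```c
#include <stdlib.h>

int main(int argc, char **argv)
{
    (void)argv;
    int *p = malloc(sizeof *p);

    if (argc > 1) {
        if (p != NULL)        /* successful test: this path is safe */
            *p = 1;
    } else {
        *p = 2;               /* flagged: no test of p on this path */
    }

    free(p);
    return 0;
}
```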

20

Divide-by-Zero Analysis
This model is similar in concept to the previous model and uses an enhanced data flow model. The enhancements include the specific arithmetic operations on the program variables.

The aim is to detect constructs which can lead to a divide-by-zero event. Any input value which is not checked before being used as a divisor is reported.
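An invented C sketch of the reported pattern: an input value used as a divisor with no zero check.

```c
#include <stdio.h>

int main(void)
{
    int total = 100;
    int count = 0;

    if (scanf("%d", &count) != 1)     /* count comes straight from input */
        return 1;

    printf("average = %d\n", total / count);  /* flagged: count may be zero */
    return 0;
}
```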

21

Array Bound Checking


The tool suite has two modes to address the problem of array bound overflow: the checks can be performed statically or dynamically.

The static checks are again performed by enhancing the data flow model. The model has limitations due to the dynamic characteristics of the problem; additionally, the unhelpful nature of languages such as C and C++ makes a precise algorithm difficult. The checks can also be performed dynamically. The use of an unchecked input value as an array index is reported.
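An invented C sketch of the reported pattern: an input value used directly as an array index with no bounds check.

```c
#include <stdio.h>

#define TABLE_SIZE 8

int main(void)
{
    int table[TABLE_SIZE] = {0};
    int index = 0;

    if (scanf("%d", &index) != 1)     /* index comes straight from input */
        return 1;

    table[index] = 42;    /* flagged: index not checked against 0..TABLE_SIZE-1 */
    printf("%d\n", table[index]);
    return 0;
}
```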

22

Dead Code Analysis


In any programming language it is possible to include code which never contributes to any outputs.

Except in specific circumstances, the removal of this code contributes to most quality characteristics of a program.

Such code is flagged up by a comprehensive model which relates the program outputs to the program inputs, both directly and indirectly.
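An invented C sketch: 'unused_sum' is computed but never contributes to any output, so the statements that build it are dead.

```c
#include <stdio.h>

int main(void)
{
    int values[4] = {1, 2, 3, 4};
    int printed_sum = 0;
    int unused_sum = 0;

    for (int i = 0; i < 4; i++) {
        printed_sum += values[i];
        unused_sum += values[i] * values[i];   /* dead: never reaches an output */
    }

    printf("sum = %d\n", printed_sum);
    return 0;
}
```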

23

Information Flow Analysis


This model uses the same annotated data flow model as the dead code analysis. It combines the relationships discovered between the input/output variables with annotations supplied by customers; differences between the customer-supplied forecasts and the actual results are reported. This is another aspect of the tool which utilises the results of other Formal Methods.

24

Information Flow Analysis


This analysis aims to discover the relationships between input variables and output variables. The dependencies are classified as:
direct or indirect
sub-categories of both (strong and weak)

This is performed in the LDRA Testbed tool suite by scanning the system-wide control and data flow graphs with a grammar to discover such relationships.
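An invented C sketch of the two dependency classes: 'direct_out' depends directly on input 'a' through data flow, while 'indirect_out' depends only indirectly on input 'b', which merely steers the branch that selects the value.

```c
#include <stdio.h>

int main(void)
{
    int a = 0, b = 0;
    if (scanf("%d %d", &a, &b) != 2)
        return 1;

    int direct_out = a * 2;    /* direct dependency on input a */

    int indirect_out;
    if (b > 0)                 /* b influences the result only via this branch */
        indirect_out = 1;
    else
        indirect_out = -1;

    printf("%d %d\n", direct_out, indirect_out);
    return 0;
}
```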

25

Information Flow Analysis Report

26

Exact Semantic Analysis


The tool compares user-supplied annotations with the exact semantics of the program. This includes the use of:
invariants
pre-conditions
post-conditions

This provides a direct link with a number of other flavours of Formal Methods and notations.
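An illustrative C sketch using ordinary assertions to stand in for the kind of pre-condition and post-condition annotations that can be checked against a program's exact semantics (this is not LDRA's own annotation syntax):

```c
#include <assert.h>

int clamped_increment(int value, int limit)
{
    assert(limit > 0);                 /* pre-condition */

    int result = value < limit ? value + 1 : limit;

    assert(result <= limit);           /* post-condition */
    return result;
}

int main(void)
{
    return clamped_increment(3, 10) == 4 ? 0 : 1;
}
```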

27

LCSAJ Analysis
The set of linear code sequence and jump (LCSAJ) sub paths forms a basis set for the generation of program paths. As such, LCSAJs are a powerful vehicle for analyzing path structure and generating targeted test data. The tool generates a test case plan.
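A small hand-worked C illustration (not output from the Test Case Planner): the comments sketch the LCSAJ sub-paths of the function, each a linear code sequence plus the jump that ends it.

```c
#include <stdio.h>

/* LCSAJs of sum_to(), informally, e.g.:
     entry .. "if" with the branch over "return -1" taken
     entry .. "return -1", jumping to exit
     "total = 0" .. "while" test failing, jumping to "return total"
     "total = 0" .. end of loop body, jumping back to the "while" test
     "while" test .. end of loop body, jumping back to the "while" test
     "return total", jumping to exit */
int sum_to(int n)
{
    if (n < 0)
        return -1;

    int total = 0;
    int i = 1;
    while (i <= n) {
        total += i;
        i++;
    }
    return total;
}

int main(void)
{
    printf("sum_to(4) = %d\n", sum_to(4));
    return 0;
}
```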

28

LCSAJ Example Test Case Planner

29

Side Effect Analysis


The use of functions in complex expressions can be a source of error if the functions concerned have side effects:
parameter side effects
global variable side effects
I/O side effects, both file and volatile-location based
class member side effects
In particular, the result can be affected by the compiler's order of evaluation.
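An invented C sketch of the order-of-evaluation hazard: both operands of '+' have global side effects, so the value of 'result' depends on which call the compiler evaluates first.

```c
#include <stdio.h>

int counter = 0;   /* global touched by both functions */

int bump_then_get(void)   { counter += 1; return counter; }
int double_then_get(void) { counter *= 2; return counter; }

int main(void)
{
    /* Unspecified order: 3 if bump_then_get() runs first, 1 otherwise. */
    int result = bump_then_get() + double_then_get();
    printf("result = %d, counter = %d\n", result, counter);
    return 0;
}
```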

30

MC/DC Test Case Planning


Modified condition/decision coverage (MC/DC) requires testing of decisions in a program such that changing the truth value of each individual condition within the decision forces a change in the overall decision's outcome. A mathematical approach is implemented to generate a test case planner which:
lists the minimal number of test conditions for maximum coverage
e.g. for a decision with N conditions, instead of 2^N test cases, N+1 cases are sufficient
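A hand-worked sketch (not planner output) for the invented decision A && (B || C): with N = 3 conditions, the following N + 1 = 4 tests achieve MC/DC.

```c
#include <stdio.h>
#include <stdbool.h>

/* Tests 1/2 toggle only A, 1/3 toggle only B, 3/4 toggle only C,
   and each such pair flips the decision's outcome. */
static bool decision(bool a, bool b, bool c)
{
    return a && (b || c);
}

int main(void)
{
    struct { bool a, b, c; } tests[4] = {
        { true,  true,  false },   /* test 1: outcome true  */
        { false, true,  false },   /* test 2: outcome false (A toggled) */
        { true,  false, false },   /* test 3: outcome false (B toggled) */
        { true,  false, true  },   /* test 4: outcome true  (C toggled) */
    };

    for (int i = 0; i < 4; i++)
        printf("test %d: %d\n", i + 1,
               decision(tests[i].a, tests[i].b, tests[i].c));
    return 0;
}
```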
31

MC/DC planner: Example From The LDRA tool suite

32

Conclusion
The LDRA tool suite, comprising lexical analyzers, parsers and modelling tools, has been in continuous production since 1975.
The use of the Formal Methods components was first described in 1983.
The algorithms have been applied to some 14 different computer languages with numerous dialect variations.
The LDRA tool suite has been used in a huge number of safety- and mission-critical applications.
33

Summary
In 40 years the LDRA tool suite has progressed considerably, but there is still much to be done. The aim is to implement any technique which can reduce the occurrence of defects and faults, provided only that the technique is reasonably applicable to significant numbers of software systems.

As more Formal Methods mature, they are likely to be prime candidates for implementation.

34

References
M. A. Hennell and M. R. Woodward. Formal Methods by Stealth: Formal Methods Implemented in the LDRA Tool Suite.
Ira Forman. An Algebra for Data Flow Anomaly Detection.
RTCA. Software Considerations in Airborne Systems and Equipment Certification. Report DO-178B, Radio Technical Commission for Aeronautics (RTCA) Inc., Suite 1020, 1140 Connecticut Avenue NW, Washington DC 20036, U.S.A. (1992).
LDRA tool suite manual.

35

Summary and Questions & Answers

36

For further information:


www.ldra.com info@ldra.com

@ldra_technology

LDRA Software Technology

LDRA Limited

37
