Requirement completeness using data- and control flow analysis
Carrying out a data- and control flow analysis is required by almost all functional safety standards (ISO 26262-6 Table 7, Measures 1f/g; DO-178C Table A-7, Measure 8; EN 50128 / EN 50657 Table A19, Measures 3/4). Compared to other measures, data- and control flow analysis raises many questions when it comes to actual execution. This is mainly because the technological capabilities to carry out such an analysis have so far been lacking.
The new non-intrusive system observation technology (www.accemic.de) makes it possible to prove data- and control flows by test. Building on this, a systematic check of the functional completeness of the requirements can be performed in the future.
Definition of data- and control flow
Wikipedia defines data-flow analysis as follows:
Data-flow analysis is a technique for gathering information about the possible set of values calculated at various points in a computer program. A program’s control flow graph (CFG) is used to determine those parts of a program to which a particular value assigned to a variable might propagate.
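As a minimal illustration of what such an analysis tracks, consider the following hypothetical C snippet (all names are invented for this example). A data-flow analysis would determine which of the two assignments to `limit` can reach its use in the return statement, depending on the path taken through the control flow graph.

```c
#include <assert.h>

/* Hypothetical example: two definitions of `limit` exist; a data-flow
   analysis determines which one can reach the use at the return. */
static int clip_speed(int requested, int degraded_mode)
{
    int limit = 130;            /* definition 1 of `limit` */
    if (degraded_mode)
        limit = 80;             /* definition 2, on one path only */
    /* use of `limit`: the reaching definition depends on the path taken */
    return (requested > limit) ? limit : requested;
}
```

In data-flow terms this is a classic reaching-definitions question: at the `return`, definition 1 reaches only if `degraded_mode` is false, definition 2 only if it is true.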
Control flow according to Wikipedia:
In computer science, control flow (or flow of control) is the order in which individual statements, instructions or function calls of an imperative program are executed or evaluated.
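The following hypothetical C function (invented for this article, not taken from any project) makes the definition concrete: the loop, the branch and the early return define the order in which the statements execute, i.e. the control flow, independent of the particular data values involved.

```c
/* Hypothetical example: the CFG of this function contains a loop
   (back edge), a branch (two successors) and two exit paths. */
static int first_index_of(const int *buf, int len, int key)
{
    for (int i = 0; i < len; i++) {   /* loop: back edge in the CFG */
        if (buf[i] == key)            /* branch: two successor nodes */
            return i;                 /* early-exit path */
    }
    return -1;                        /* fall-through path */
}
```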
Challenges of data- and control flow analysis
First of all, in practical applications it is very difficult to clearly distinguish between data flow and control flow. At many points in a program there is a close coupling between the two.
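This coupling can be sketched with a short hypothetical C function (all identifiers are invented for illustration): the data value `m` selects the control path taken, and the path taken in turn determines which data is written, so neither flow can be analysed in isolation.

```c
enum mode { NORMAL = 0, LIMP_HOME = 1 };

/* Hypothetical example of data/control flow coupling: the data value
   `m` drives the branch, and the branch decides which value flows
   into `*torque`. */
static void set_torque(enum mode m, int request, int *torque)
{
    switch (m) {                 /* control flow driven by a data value */
    case LIMP_HOME:
        *torque = request / 2;   /* data written depends on the path */
        break;
    default:
        *torque = request;
        break;
    }
}
```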
In addition, software tends to have a practically unbounded number of data- and control flows. A complete proof of the correctness of all of them is impossible in practice.
Furthermore, dynamic verification of data- and control flows has so far not been possible for technological reasons. In aerospace, unit tests and software integration tests have mostly been used to prove important data- and control flows. However, since these tests are not performed on the real hardware, data- and control flows could not be verified under stress conditions for the entire embedded system. The same applies to time-critical applications, which represent the majority of embedded systems: for them, these proofs could not be provided until now.
Non-intrusive system observation
Non-intrusive system observation allows software parameters to be monitored without influencing the behavior of the system. Unlike debugger-based tests, such tests can be fully automated. This supports the systematic measurement of data- and control flows in integration and system tests.
The integration and system tests also prove the functional requirements. Important data- and control flows are always reflected in the functionality of the system. Therefore, non-intrusive system observation supports checking requirement completeness.
Activity diagrams specify data and control flows
However, the new possibilities that result from non-intrusive system observation also draw attention to a further weakness of the existing development approach. Hardly any project that I have seen in recent years has performed a systematic specification of the data- and control flows. In the future, this weakness will move closer to the center of attention. Both the systematic proof of the data- and control flows by test and the closing of existing gaps in the functional specification can only succeed if at least the most important data- and control flows are defined in the software architecture. If you describe the software architecture in UML, activity diagrams are a good way to do this.
Summary
ISO 26262 (Part 6, Table 7, points 1f/g) in the automotive industry, DO-178C (Table A-7, point 8) in the aerospace industry and EN 50128 / EN 50657 (Table A19, points 3/4) in the railway industry require a data- and control flow analysis.
Previously, only static analyses or unit and software integration tests could be used to prove the data- and control flows.
In the future, non-intrusive system observation will make it possible to systematically verify data- and control flows in integration and overall system tests. This new technology not only offers the possibility to test complex error scenarios; it also allows the user to check the completeness of the functional requirements. It becomes a particularly powerful quality instrument when the software architecture systematically specifies the data- and control flows. UML offers activity diagrams as a good means to specify them.
Are you ready for a status workshop to analyse improvement potential in your requirements engineering? Then send a mail to info[at]heicon-ulm.de or call +49 (0) 7353 981 781.