Software Validation Tools Give an Edge to Satellite and Spacecraft Development



The space industry can realize increased productivity, reliability and cost savings by employing the software certification process used by military and commercial avionics.

As the space industry continues to mature, software validation tools represent a best practice and an excellent starting point for process improvement. In fact, the use of tools may be mandated, particularly in circumstances where satellites interact with commercial airspace, private space vehicles carry human crews, or space vehicles operate in and re-enter public airspace. In many of these circumstances, private vehicles will require commercial licenses from the FAA, which may force some or all of the processes in DO-178C to be applied. But any company building components in these contexts should consider the use of certified software validation tools as a way to establish an edge for the future.

The space industry shares many avionics industry characteristics, including safety-critical system requirements as well as cost and schedule challenges. But while the avionics industry is governed by software development process methodologies such as DO-178C, the space industry has typically employed manual methods, using tools as divergent as Microsoft PowerPoint and IBM Rational DOORS, which do not provide a direct connection to source code and test results. By introducing the same techniques used by commercial and military avionics, the space industry could significantly improve the likelihood of mission success and reduce costs. These tools, already proven in the military and avionics communities, provide a direct link between system engineering tasks, software development, and test artifacts. The end result is improved organization and increased reliability to protect lives and missions. The following examples illustrate how these tools apply specifically to space vehicle and satellite system design processes.


Space Vehicle Design

Consider a space vehicle with hundreds of subsystems designed by dozens of suppliers. The complexity of certifying this vehicle for flight can be alleviated by using a consistent set of tools. For example, in major space programs the prime contractor as well as all subcontractors use LDRA tools for verification reports. This forces a consistent process across suppliers and allows for uniform evaluation of project milestones.

Figure 1 offers a typical Requirements Traceability Matrix (RTM) for this scenario. First, high-level program requirements connect to lower-level requirements. These lower-level requirements dictate the specifics addressed by the spacecraft flight software and have system test and code review verification tasks associated with them. The specific system test requirements correspond to Modified Condition/Decision Coverage (MC/DC), which ensures that input parameters do not mask decisions made by the software.

Figure 1: RTM conceptual view, showing connections between requirements and verification.

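To make the MC/DC idea concrete, the sketch below uses an invented two-condition decision; the function name and values are illustrative only and are not drawn from any real flight program. MC/DC requires a test set in which each condition is shown to independently change the outcome of the decision; for a two-condition AND, three cases suffice.

```c
#include <assert.h>
#include <stdbool.h>

/* Hypothetical thruster-enable decision, used only to illustrate MC/DC. */
static bool thruster_enable(bool armed, bool attitude_ok)
{
    return armed && attitude_ok;
}

int main(void)
{
    /* Minimal MC/DC set for (armed && attitude_ok): each condition is
     * shown to independently change the decision outcome.              */
    assert(thruster_enable(true,  true)  == true);   /* baseline                 */
    assert(thruster_enable(false, true)  == false);  /* 'armed' flips the result */
    assert(thruster_enable(true,  false) == false);  /* 'attitude_ok' flips it   */
    return 0;
}
```

Three cases demonstrate the independent effect of each condition, whereas exhaustive testing of the two inputs would require four.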

This documentation tree provides a high degree of confidence that the software performs as expected. In cases where application failure can cause a loss of life, documentation review proves process reliability, illustrating whether subcontractors rigorously used the RTM to track their requirements and testing.

In the following example, two high-level requirements are shown, along with two lower-level requirements. These lower-level requirements lead to unit test and system test verification plans. The third section gives the unit test and system test verification results. Connections between the levels link artifacts such as requirements, code, and tests.


Satellite Design

In real life, the connections between the documents are more complicated, although the concepts of covering link and percentage coverage remain the same. Consider the case of the typical satellite system, with a system requirements specification that covers an interface control document and a software requirements specification. From right to left, you can trace the connections represented in the requirement traceability matrix.

The requirements document on the left links individual requirements (column 2) to the source code (column 3), so it is easy to see how each part of the code maps to specific requirements. A software verification plan must cover both the source code and any requirements-based test cases (column 4). Again, this whole matrix of relationships, all the way down to the software verification report, can be represented via an automated tool chain that offers integrated artifacts from code review, code coverage, unit testing, and system testing.

Figure 2: A conceptual diagram illustrating connections between documents in a typical satellite software development project. The left column lists requirements, and clicking on a requirement zooms the developer in to the connections between individual requirements, source, and verification.

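One common way such column-to-column links are made machine-readable, shown here only as a simplified, hand-written sketch rather than any specific tool's mechanism, is to tag source code and test cases with requirement identifiers that a traceability tool can parse. The identifiers, function, and values below are invented for illustration.

```c
#include <assert.h>

/* Satisfies: SRS-ADCS-012 -- clamp commanded wheel torque to +/-0.25 Nm
 * (requirement ID is hypothetical)                                       */
static double clamp_wheel_torque(double torque_nm)
{
    const double limit_nm = 0.25;
    if (torque_nm >  limit_nm) return  limit_nm;
    if (torque_nm < -limit_nm) return -limit_nm;
    return torque_nm;
}

/* Verifies: SRS-ADCS-012 (test case SVP-UT-104, also hypothetical).
 * A traceability tool can parse these tags to populate the matrix columns:
 * requirement -> source code -> test case -> test result.                  */
static void test_clamp_wheel_torque(void)
{
    assert(clamp_wheel_torque( 1.0) ==  0.25);
    assert(clamp_wheel_torque(-1.0) == -0.25);
    assert(clamp_wheel_torque( 0.1) ==  0.10);
}

int main(void)
{
    test_clamp_wheel_torque();
    return 0;
}
```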

The use of software certification validation tools bridges the gap between the software engineer, the tester, and the project manager. Even in cases where there are thousands of requirements, these tools scale to represent all of the requirements and verification results. At any point, the project manager can take a look at overall status or use filters to isolate subsystems and look at one subsystem at a time. A test engineer can look solely at the connections between software and test cases, while the project manager can validate test cases in the context of code coverage. The end result of using tools is increased productivity and reliability.

The space industry can realize increased productivity, reliability, and cost savings by employing the software certification process used by military and commercial avionics. Rigorous links are established between the various parts of a program, establishing a better-quality process in which automated tools replace error-prone human efforts. Most importantly, the software validation tools ultimately decrease the margin of error and save lives.


SOFTWARE CERTIFICATION TOOLS SUPPORT, ENFORCE, AND AUTOMATE QUALITY PROCESSES

CODE REVIEW

This form of analysis connects data to code, verifying that the data modified by a procedure is the data you expect the procedure to modify. In many process standards, such as DO-178B/C, use of a coding standard is a key part of enforcement. Used in these environments, static data flow analysis makes sure that when the code executes, it reads and modifies the expected data, and it connects requirements and code. Armed with requirements traceability, developers can verify that the software does what it was specified to do, and these connections become proof points for project management and process auditing purposes.
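As a contrived illustration of the kinds of anomalies static data flow analysis reports, the fragment below (with invented names and values, not drawn from flight code) contains a variable that can be read before it is assigned, and a module-level variable whose modification the analysis would check against what the procedure is specified to modify.

```c
/* Contrived example for illustration only. */

static int last_valid_reading;            /* module-level data shared across the unit */

int read_pressure_sensor(int raw_count)
{
    int calibrated;                       /* flagged: may be read before it is assigned */

    if (raw_count >= 0) {
        calibrated = raw_count * 2;       /* the only path that defines 'calibrated' */
    }

    last_valid_reading = calibrated;      /* data flow analysis also confirms that this
                                             global is the only data the procedure
                                             modifies, as the design specifies          */
    return calibrated;
}
```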

DYNAMIC ANALYSIS (CODE COVERAGE)

Dynamic analysis evaluates the effectiveness of test plans, providing a high level of assurance that a subsystem on a spacecraft or satellite has been adequately tested before integration. Dynamic analysis tools automate these processes and document which modules, lines, or conditions in the code have executed. More importantly, the tools trace which part of the code is executed by each part of the test plan, helping developers judge how thoroughly the tests exercise the code. When portions of the code are not executed, the tools identify "unreachable code," which in most certified systems should be eliminated, as well as "infeasible code," which generally requires further examination.
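The contrived fragments below, with invented names, illustrate the distinction: a statement that no control-flow path can ever reach, versus a defensive branch that is legal code but cannot execute for any valid input.

```c
/* Illustrative fragments only; not taken from real flight software. */

int classify(int x)
{
    if (x >= 0) {
        return 1;
    } else {
        return -1;
    }
    return 0;   /* Unreachable code: no control-flow path reaches this line.
                   In most certified systems it would simply be removed.     */
}

typedef enum { MODE_SAFE, MODE_NOMINAL, MODE_SCIENCE } op_mode_t;

int mode_to_rate_hz(op_mode_t mode)
{
    switch (mode) {
    case MODE_SAFE:    return 1;
    case MODE_NOMINAL: return 10;
    case MODE_SCIENCE: return 50;
    default:           return 1;   /* Infeasible for any legal enum value, so
                                      coverage reports it as never executed;
                                      defensive code like this is usually kept
                                      but must be justified in review.         */
    }
}
```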

UNIT TESTING

Unit test tools automate the creation of test case drivers. Tools used in certification environments automatically create stubs and stub out global variables, which allows developers to examine code at the module and function level. Developers can then test code before hardware is fully available and run regression tests, which verify that the results are the same every time. To achieve better-quality testing, unit test tools are used in conjunction with code coverage tools. When combined, the developer chooses a representative sample of input cases to fulfill code coverage requirements, rather than randomly choosing a set of cases that happens to be convenient. These tools map between portions of code and the expected and actual inputs and outputs to clearly document that inputs and outputs map as expected, prove that higher-level requirements are adequate, and confirm that test cases ran as designed.
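A minimal hand-written sketch of the idea follows; real certification tools generate this scaffolding automatically, and the sensor name and calibration values here are invented for illustration.

```c
#include <assert.h>

/* Code under test: converts a raw sensor count to degrees Celsius.
 * The raw read normally comes from a hardware driver.               */
int read_raw_temperature(void);                   /* hardware-dependent call  */

int temperature_celsius(void)
{
    return (read_raw_temperature() - 500) / 10;   /* hypothetical calibration */
}

/* Stub replaces the hardware-dependent call so the unit can be tested
 * before the hardware is available.                                    */
static int stub_raw_value;
int read_raw_temperature(void) { return stub_raw_value; }

/* Test driver: fixes the stub's output and checks the expected conversion. */
int main(void)
{
    stub_raw_value = 750;
    assert(temperature_celsius() == 25);

    stub_raw_value = 500;
    assert(temperature_celsius() == 0);
    return 0;
}
```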

REQUIREMENT TRACEABILITY

Requirement traceability is the critical component that ties this all together, but achieving requirement traceability manually is an error-prone and inadequate process. In fact, research has shown that 85% of software errors stem from requirement traceability failures. Tools that directly parse requirements in a requirement management system and automatically connect them to test plans and test results save time in system engineering processes and support rigorous enforcement.


Jay Thomas, Director of Field Engineering for LDRA Technology, has worked on embedded controls simulation, processor simulation, mission- and safety-critical flight software, and communications applications in the aerospace industry. His focus on embedded verification implementation ensures that LDRA clients in aerospace, medical, and industrial sectors are well grounded in safety-, mission-, and security-critical processes.

