Runtime Checking and Test Case Generation for Python

General Background

Unlike testing, which can only check a program's behavior for a finite number of inputs, deductive program verification aims to prove that a program fulfills a given specification for all possible inputs. Thanks to significant technical advances in the past decades, verification can now be performed automatically for large classes of programs and specifications; however, writing the specifications (such as preconditions, postconditions, and loop invariants) usually remains a manual and often complex task. With automated tools, specifications are typically created iteratively: the verifier is given the program with a version of the specification, which is then repeatedly updated based on the verification results.
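To make the terminology concrete, here is a minimal sketch of a specified function. The contracts are emulated with plain Python asserts for the sake of a self-contained example; a real verifier would instead consume declarative annotations and reason about them statically, for all inputs.

```python
def int_sqrt(n: int) -> int:
    """Return the largest r with r*r <= n."""
    assert n >= 0                            # precondition
    r = 0
    while (r + 1) * (r + 1) <= n:
        assert r * r <= n                    # loop invariant
        r += 1
    assert r * r <= n < (r + 1) * (r + 1)    # postcondition
    return r
```

A static verifier would prove the postcondition from the precondition and the loop invariant once and for all, rather than checking them on individual runs as the asserts above do.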

A frequent problem is that a verification error, i.e., a report by the verifier that some assertion might not always hold, can mean several things:

  1. The program is actually incorrect.
  2. Intermediate specifications like loop invariants or postconditions of helper methods are too weak to prove the correctness of the main method.
  3. The assertion cannot be proven because of limitations of the tools (e.g., because the SMT solver cannot prove a fact involving non-linear arithmetic).

For programmers, it is critical to distinguish between these three cases, but there is typically no tool support for doing so.
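As an illustration of the second case, the following function is correct at runtime, yet a verifier would report an error on the postcondition because the stated loop invariant is too weak to imply it. The example and its annotations (again emulated as asserts) are invented for illustration.

```python
def sum_to(n: int) -> int:
    """Return 0 + 1 + ... + n."""
    assert n >= 0                          # precondition
    total, i = 0, 0
    while i < n:
        # This invariant holds, but it does not relate `total` to `i`,
        # so a verifier cannot derive the postcondition from it: the
        # reported error is spurious (case 2, not case 1).
        assert total >= 0
        i += 1
        total += i
    # Provable only with the stronger invariant total == i * (i + 1) // 2.
    assert total == n * (n + 1) // 2       # postcondition
    return total
```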

Project Background

To solve this problem, we want to use the counterexample we get from the verifier to check if the detected error is real. This counterexample represents a situation where the verifier deems a specification to be violated. While the counterexample is typically not immediately understandable for the user, it can be used to generate concrete test inputs for which the specification is supposedly violated. One can then run the method in question with these inputs and check, at runtime, if specifications are actually violated.

The intended workflow for the programmer is therefore as follows:

  1. Run the static verifier on the program, which results in a verification error.
  2. From the counterexample provided by the verifier, generate test inputs for the possibly incorrect method.
  3. Run the method with the generated test inputs, and check all relevant assertions and contracts at runtime.
  • If the error also occurs at runtime, the problem actually exists: Fix the program.
  • Otherwise, the error is spurious: Fix intermediate specifications.
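The steps above can be sketched as follows. The counterexample format, the helper names, and the demo function are invented for illustration; the actual model produced by a verifier is tool-specific and would need to be translated into concrete argument values first.

```python
def classify(method, counterexample, postcondition):
    """Run `method` on the counterexample input and check its contract.

    `counterexample` is a hypothetical verifier model, here simply a
    dict of argument values; `postcondition` is a runtime-checkable
    predicate over the result and the inputs.
    """
    args = counterexample["args"]
    try:
        result = method(**args)            # internal asserts may fire here
    except AssertionError:
        return "real error: fix the program"
    if not postcondition(result, **args):
        return "real error: fix the program"
    return "spurious error: fix intermediate specifications"


def abs_val(n: int) -> int:
    return n if n >= 0 else -n

# Suppose the verifier reported that `abs_val(-3)` might violate the
# postcondition result >= 0; re-checking at runtime shows it does not.
verdict = classify(abs_val, {"args": {"n": -3}},
                   lambda result, n: result >= 0)
```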

Our group is working on a project whose goal is to verify the Python implementation of the SCION internet architecture using a newly developed static verifier. The verification process will require writing specifications for most of the SCION code in the specification language developed for the verifier; during this process, the problems described above need to be addressed.

Goals

The goal of this project is to develop the necessary infrastructure to enable the workflow described above for the verification of the SCION implementation. In particular, this requires the following steps:

  • Develop a technique for instrumenting Python code.
  • Develop a runtime checking technique for concurrent programs (partially) annotated with contracts.
  • Implement instrumentation and checking for the specification language developed for the Viper Python frontend.
  • Develop and implement a methodology for generating test inputs from the counterexample provided by the static verifier.
  • Evaluate the implementation by writing and checking contracts for parts of the SCION implementation.
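As a rough illustration of the instrumentation step, Python's standard ast module supports this kind of source rewriting. The sketch below replaces a hypothetical Requires(...) specification marker with an executable assert and then compiles the instrumented code; a real implementation would handle the full specification language rather than this single construct.

```python
import ast

class ContractRewriter(ast.NodeTransformer):
    """Rewrite `Requires(expr)` statements into `assert expr`."""

    def visit_Expr(self, node):
        call = node.value
        if (isinstance(call, ast.Call)
                and isinstance(call.func, ast.Name)
                and call.func.id == "Requires"):
            return ast.Assert(test=call.args[0], msg=None)
        return node

source = """
def inc(x):
    Requires(x >= 0)
    return x + 1
"""

tree = ContractRewriter().visit(ast.parse(source))
ast.fix_missing_locations(tree)
namespace = {}
exec(compile(tree, "<instrumented>", "exec"), namespace)
# namespace["inc"] now checks its precondition at runtime.
```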

Useful prerequisites

The following skills would be helpful, but can also be learned during the project:

  • Familiarity with deductive verification techniques (e.g., Hoare logic)
  • Programming experience with Python

Opportunities

The project offers the following opportunities:

  • Understand the advantages and challenges of checking program properties at runtime as opposed to checking them statically.
  • Delve deep into advanced Python features like AST rewriting.
  • Work with and contribute to state-of-the-art, open-source verification tools.
  • Contribute to the verification of real, security-critical code.

Contact


