Evaluating the Architectural Coverage of Runtime Traces

Presentation Description

Evaluating the Architectural Coverage of Runtime Traces. Colloquium, Bachelor Thesis of Marc Giombetti. Technical University Kaiserslautern, Chair of Prof. Dr. Rombach.

Presentation Transcript

Evaluating the Architectural Coverage of Runtime Traces : 

Evaluating the Architectural Coverage of Runtime Traces. Colloquium, Bachelor Thesis. Marc Giombetti, marc@giombetti.com

Outline : 

Motivation & Goals
Definitions
Visualization of runtime traces
Component classification and coverage metrics
Conclusion

Motivation : 

Software systems are becoming more and more complex, and their quality requirements increase. Quality is a key factor in the successful deployment of a software system and has to be taken care of, so it is important to understand system behavior. Testing is a good, and most often the only, way to guarantee that a system behaves according to its specification. The problem is analyzing the quality of the test cases themselves, since test cases focus mainly on the code level. Runtime traces collect data about test case execution.

Problem : 

Runtime traces are a recorded form of system behavior during runtime. They contain, among other things, detailed information on which components and routines are invoked in a test scenario. They can be generated for basically any type of software, using, for example, aspect-oriented logging techniques.

Runtime trace example (excerpt):

    …
    Enter app/imexport/src/imexdatalh.c/IMEXDnewDataFieldLH_p 1166298115
    Enter util/src/util.c/MEMallocWithPos_p 1166298115
    Exit util/src/util.c/MEMallocWithPos_p 1166298115
    Exit app/imexport/src/imexdatalh.c/IMEXDnewDataFieldLH_p 1166298115
    Enter app/imexport/src/imparserlh.c/IMgetHierarchyLH_pc 1166298115
    Exit app/imexport/src/imparserlh.c/IMgetHierarchyLH_pc 1166298115
    Enter app/imexport/src/imparserlh.c/IMgetIdLH_pc 1166298115
    Exit app/imexport/src/imparserlh.c/IMgetIdLH_pc 1166298115
    Enter app/imexport/src/imparserlh.c/IMgetControlFlagLH 1166298115
    Exit app/imexport/src/imparserlh.c/IMgetControlFlagLH 1166298115
    Enter app/imexport/src/imparserlh.c/IMgetSpecialFieldLH_pc 1166298115
    Exit app/imexport/src/imparserlh.c/IMgetSpecialFieldLH_pc 1166298115
    Enter app/util/src/util.c/UTILnewString_pc 1166298115
    Enter util/src/util.c/MEMallocWithPos_p 1166298115
    Exit util/src/util.c/MEMallocWithPos_p 1166298115
    Enter app/util/src/util.c/UTILstrcpy 1166298115
    Exit app/util/src/util.c/UTILstrcpy 1166298115
    Exit app/util/src/util.c/UTILnewString_pc 1166298115
    Enter app/imexport/src/imexdatalh.c/IMEXDnewDataFieldLH_p 1166298115
    Enter util/src/util.c/MEMallocWithPos_p 1166298115
    …

The problem: too much data to interpret (11 MB of text output for this example), which makes it difficult to make a statement about the test cases. How should the data be interpreted?
The solution: abstraction, visualization, and measurement.
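To make the abstraction step concrete, the following is a minimal sketch (it is not part of the original slides and not the thesis' actual prototype) of how such Enter/Exit trace lines could be reduced to the set of invoked components. The line format and the assumption that a component corresponds to a source file are taken from the excerpt above; the trace file names in the usage comments are placeholders:

    import re

    # Assumed line format, based on the excerpt above:
    #   "Enter app/imexport/src/imexdatalh.c/IMEXDnewDataFieldLH_p 1166298115"
    # i.e. event, source-file path, routine name, timestamp.
    LINE = re.compile(r"^(Enter|Exit)\s+(\S+)/([^/\s]+)\s+(\d+)$")

    def components_in_trace(path):
        """Abstract a raw trace file to the set of invoked components.

        A 'component' is assumed here to be the source file that contains
        the invoked routine (e.g. app/imexport/src/imexdatalh.c).
        """
        components = set()
        with open(path) as trace:
            for line in trace:
                match = LINE.match(line.strip())
                if match:
                    _event, component, _routine, _timestamp = match.groups()
                    components.add(component)
        return components

    # Example usage with hypothetical trace files:
    # t1, t2, t3 = (components_in_trace(p) for p in ("t1.log", "t2.log", "t3.log"))
    # unique_to_t1 = t1 - (t2 | t3)   # components occurring only in trace 1
    # common = t1 & t2 & t3           # components occurring in every trace

Set operations over these per-trace component sets then yield the common and unique components that the coverage metrics later in the talk are based on.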

Goals : 

Development of an approach to evaluate architectural coverage based on runtime traces, thus making a statement about the quality of test cases.
Conceptual classification of components and relations.
Definition of architectural coverage metrics.
Visualization of runtime traces to improve the understandability of system behavior.

Definitions : 

Code coverage determines which parts of the software (functions, branches, lines) have been executed. Architectural coverage is a metric that measures the degree to which elements in recorded runtime behavior (e.g., in a trace) capture elements of the static structure (i.e., components) of the software system's architecture. Coverage results allow statements about test case quality on an abstract level.
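The metric itself appears on the slides only as an image; a plausible reconstruction from the definition above (an assumption, not necessarily the thesis' exact formula) relates the components observed in the traces to the complete component set of the architecture:

    \text{ArchCoverage}(T) = \frac{\left|\bigcup_{t \in T} \text{Components}(t)\right|}{|C|}

where C is the set of components in the static architecture and Components(t) is the set of components invoked in trace t.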

The Big Picture : 

The Big Picture

Runtime Traces: Import & Visualization : 

Runtime Traces: Import & Visualization (figure: Abstraction)

Classification – Different Types of Components : 

Classification – Different Types of Components

Example: Trace Visualization : 

Example: Trace Visualization (figure)

Example: Determination of Unique Components for Trace 1 : 

Components B and J occur only in the first test case and therefore also occur only in trace 1. Trace 1 contains 16.67% unique components. Further test cases should be developed to exercise the unique components in other scenarios. Coverage metric:
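The formula behind this metric is shown on the slide only as an image; a plausible form, consistent with the 16.67% figure (for example, the 2 unique components B and J out of 12 components occurring in trace 1; the total of 12 is an assumption), would be:

    \text{UniqueCoverage}(t_1) = \frac{|\text{Unique}(t_1)|}{|\text{Components}(t_1)|} = \frac{2}{12} \approx 16.67\%

where Unique(t_1) contains the components that occur in trace 1 and in no other trace.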

Summary : 

The prototype supports:
Visualization of runtime system behavior (using runtime traces).
Determination of the classification, coverage evaluation, and coverage metrics based on trace data.
Computation of the architectural coverage using dedicated metrics.
Determination of the different types of components.
The prototype is integrated into SAVE.

Conclusion & Future Work : 

Architectural coverage cannot replace code coverage, but it lifts the idea of code coverage to a higher level of abstraction. Trace visualization is useful for understanding how the system actually behaves at runtime. The results form the basis of guidelines for test case optimization: for unique components, extend the test cases; for missing components, test cases have to be defined. Future work: adaptation of the coverage metrics to take more parameters into consideration (e.g., component hierarchy depth).

Thank you for your attention : 

Questions? Thank you for your attention

Classification – Different Types of Relations : 

Classification – Different Types of Relations

Example: Determination of Common Components : 

Components A, D, F, and L play a central role in the system, because they occur in every trace. All the traces (and all the underlying test cases) together cover 33.3% of the components in the entire system. Coverage metric, common components:
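Again, the slide shows the formulas only as images; reconstructed from the surrounding text (an assumption), the common components are the intersection of the traces' component sets, while the 33.3% coverage relates all traced components to the complete component set C of the system:

    \text{Common}(T) = \bigcap_{t \in T} \text{Components}(t), \qquad \text{ArchCoverage}(T) = \frac{\left|\bigcup_{t \in T} \text{Components}(t)\right|}{|C|} \approx 33.3\%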

Comparison: Static and Dynamic Elements of Software Systems : 

Comparison: Static and Dynamic Elements of Software Systems

Coverage Evaluation : 

Coverage Evaluation (figure: common components)
