CRV-2016 is the 3rd International Competition on Runtime Verification and will be held as part of the 16th International Conference on Runtime Verification (RV 2016), in September 2016 in Madrid, Spain. CRV-2016 will draw attention to the invaluable effort of software developers and researchers who contribute to this field by providing the community with new or updated tools, libraries and frameworks for the instrumentation and runtime verification of software. The competition is a product of the COST Action “Runtime Verification beyond Monitoring”.
Runtime Verification is a technique for analysing software at execution time: information is extracted from a running system and checked to determine whether the observed behaviors satisfy or violate the properties of interest. Over the last decade, many important tools and techniques have been developed and successfully employed. However, there is a pressing need to compare such tools and techniques, since the community currently lacks both a common benchmark suite and scientific evaluation methods for validating and testing new prototype runtime verification tools.
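As a toy illustration of the idea (not part of the competition material), the following Python sketch shows the essence of online monitoring: a monitor consumes events emitted by hypothetical instrumentation of a running program and flags violations of a simple safety property as soon as they occur. The event names and the property ("a file must be opened before use and never closed twice") are invented for illustration.

```python
# Toy online runtime monitor (illustrative only): checks the safety
# property "a file is opened before use and never closed twice" over a
# stream of events observed from a (hypothetical) instrumented program.

class FileMonitor:
    """Finite-state monitor for one file handle: closed -> open -> closed."""

    def __init__(self):
        self.is_open = False
        self.violations = []

    def observe(self, event):
        # Each event is a string emitted by the instrumentation layer.
        if event == "open":
            if self.is_open:
                self.violations.append("double open")
            self.is_open = True
        elif event == "write":
            if not self.is_open:
                self.violations.append("write before open")
        elif event == "close":
            if not self.is_open:
                self.violations.append("close without open")
            self.is_open = False

# A correct trace produces no violations...
good = FileMonitor()
for e in ["open", "write", "close"]:
    good.observe(e)

# ...while a faulty one is flagged at the first offending event.
bad = FileMonitor()
for e in ["write", "open", "close", "close"]:
    bad.observe(e)
```

Real participating tools differ mainly in how such monitors are generated from a specification language and how the instrumentation is attached to the program.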
The main aims of CRV-2016 are to:
- Stimulate the development of new efficient and practical runtime verification tools and the maintenance and improvement of the already developed ones.
- Produce a benchmark suite for runtime verification tools, by sharing case studies and programs that researchers and developers can use in the future to test and to validate their prototypes.
- Discuss the metrics employed for comparing the tools.
- Provide a comparison of the tools on different benchmarks and evaluate them using different criteria.
- Enhance the visibility of presented tools among the different communities (verification, software engineering, cloud computing and security) involved in software monitoring.
Please direct any enquiries to the competition co-organizers (email@example.com):
- Yliès Falcone (Université Joseph Fourier, France)
- Giles Reger (University of Manchester, UK)
- Sylvain Hallé (Université du Québec à Chicoutimi, Canada)
The CRV Jury will include a representative of each participating team and the competition chairs. The Jury will be consulted at each stage of the competition to ensure that the rules set by the competition chairs are fair and reasonable.
Call for Participation
The main goal of CRV 2016 is to compare tools for runtime verification. We invite and encourage participation with both benchmarks and tools. The competition consists of three main tracks, based on what is being monitored:
- Track on monitoring Java programs (online monitoring)
- Track on monitoring C programs (online monitoring)
  - Subtrack on Generic Specifications (e.g. in LTL)
  - Subtrack on Implicit Specifications (e.g. memory safety)
- Track on monitoring of traces (offline monitoring)
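To make the offline setting concrete, here is a small Python sketch (again purely illustrative, not a competition benchmark) that checks a recorded finite trace against the classic LTL response property G(request -> F grant), i.e. "every request is eventually granted". The event names are invented.

```python
# Toy offline monitor (illustrative only): evaluates the LTL response
# property G(request -> F grant) on a finite recorded trace.

def check_response(trace, request="request", grant="grant"):
    """Return True iff every `request` event in the trace is eventually
    followed by a `grant` event."""
    pending = False           # is there an undischarged request?
    for event in trace:
        if event == request:
            pending = True
        elif event == grant:
            pending = False   # one grant discharges all pending requests
    return not pending        # the property holds iff nothing is pending

# Satisfied: the request is eventually granted.
check_response(["request", "work", "grant"])       # True

# Violated: the trace ends with a request still pending.
check_response(["request", "grant", "request"])    # False
```

Offline tools in this track apply the same principle to large log files and far richer specification languages.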
The general organisation of the competition is described in the rules document (http://crv.liflab.ca/CRV2016.pdf).
To register please fill in the form (http://goo.gl/forms/kWxFFfFCvZ).
Expected Important Dates
- May 1st: Registration opens
- May 29th: Benchmark submission deadline
- June 5th: Registration closes
- June 5th–12th: Clarification phase
- June 19th: Benchmarks announced
- July 10th: Monitor submission deadline
- August 1st: Notifications
- At RV 2016: Presentation of results