The SAMATE Project
Department of Homeland Security

Static Analysis Tool Exposition VI (SATE VI)

Last update: 10/22/2018


Note: The SATE V report is now available.


Introduction

SATE is a non-competitive study of static analysis tool effectiveness, aimed at improving tools and increasing public awareness and adoption. Briefly, participating tool makers run their static analyzers on a set of programs; researchers led by NIST then analyze the tool reports. Everyone shares results and experiences at a workshop, and the analysis report is made publicly available later.

SATE's purpose is NOT to evaluate or choose the "best" tools. Rather, it aims to explore the following characteristics of tools: the relevance of warnings to security, their correctness, and their prioritization. Its goals are:

  • To enable empirical research based on large test sets,
  • To encourage improvement of tools,
  • To boost public awareness of tools by objectively demonstrating their use on real software.

SATE VI is the sixth edition of SATE. Information about and results from SATE V, SATE IV, SATE 2010, SATE 2009, and SATE 2008 is available online.


Changes Since SATE V

  • The C and Java Tracks have been merged into the new Classic Track.
  • We have a new Mobile Track, focusing on mobile applications.
  • SATE VI will use rolling releases for the test cases, so the tracks may not all run at the same time.
  • On the Classic Track, we injected realistic vulnerabilities into large test cases to assess the ability of tools to find bugs that matter. The types of these vulnerabilities will be shared along with the test cases.

Tracks

Preparation work is ongoing for SATE VI. We will add information about the tracks and test cases as it becomes available. Note that we will be using rolling releases for the test cases, so the different tracks may not all run at the same time.


Classic Track

Contact: aure 'at' nist.gov

Main page: https://samate.nist.gov/SATE6ClassicTrack.html

Registration is open as of October 16, 2018. To register for the Classic Track, simply send an email to aure 'at' nist.gov with the tool name and the author/organization, letting us know you would like to participate.

Release target: October 29, 2018

The Classic Track combines the C and Java Tracks from past SATEs. In SATE VI, we injected realistic vulnerabilities into large test cases to assess the ability of tools to find bugs that matter. The SAMATE team will analyze only the warnings related to these injected bugs.


Ockham Sound Analysis Criteria Track

Contact: paul.black 'at' nist.gov

Release target: October 2017

Main page: https://samate.nist.gov/SATE6OckhamCriteria.html

The Ockham Criteria Track focuses on sound static analysis tools. We will analyze all findings reported by participating tools to assess their soundness.

After consulting with potential participants, we decided to use a new version of Juliet, 1.3, as the test cases. Although the 1.3 cases are complete, the suite needs a little more work before release.


Mobile Track

Contact: michael.ogata 'at' nist.gov

Release target: October 2017

The Mobile Track includes several mobile application test cases, most of which are ready for analysis by tools. For this first foray into the mobile space, we are focusing on the Android operating system. For each test case, both the source code and a deployable APK file are provided. While SATE focuses primarily on static analysis, we invite Mobile Track participants to submit analyses of all kinds, including dynamic and behavioral analysis.


Organizing Meeting

The organizing meeting was held on May 31, 2017. A recording of the event is available.

Notes:

The SATE V report is in the final stage of the publication process and should be released imminently.

Additional information will be available on this website after the organizing meeting. Should you have any inquiries, please contact us:

  • Classic track: aure 'at' nist.gov
  • Ockham track: paul.black 'at' nist.gov
  • Mobile track: michael.ogata 'at' nist.gov
  • General inquiries: aure 'at' nist.gov