Static Analysis Tool Exposition (SATE 2009) Workshop
A Few Programs Under Many Static Analysis Microscopes
Photo: Jefferson Memorial by Christopher Hollis, Hollis Innovations
Note: due date extended to Saturday, 17 October since the CFP was delayed.
Software must be developed to have high quality: quality cannot be "tested in". However, auditors, certifiers, and others must assess the quality of software they receive. "Black-box" software testing cannot realistically find maliciously implanted Trojan horses or subtle errors that have many preconditions. For maximum reliability and assurance, static analysis must be used in addition to good development practices and testing. Static analyzers are quite capable and are improving quickly, yet developers, auditors, and examiners could use far greater capabilities.
The goals of the Static Analysis Tool Exposition (SATE) 2009 are:
- To enable empirical research based on large test sets
- To encourage improvement of tools
- To speed adoption of tools by objectively demonstrating their use on real software
This workshop has two goals. The first is to gather participants and organizers of SATE to share experiences, report interesting observations, and discuss lessons learned; the workshop is also an opportunity for attendees to help shape the next SATE in 2010.
As noted in the Call for Papers, the second goal is to convene researchers, tool developers, and government and industrial users of software assurance tools to define obstacles to urgently needed software assurance capabilities and to identify engineering or research approaches to overcome them. We solicit contributions describing basic research, applications, experience, or proposals relevant to software assurance tools, techniques, and their evaluation. Questions and topics of interest include, but are not limited to:
- Contribution of static analysis to software security assurance
- Issues in applying static analysis to binaries
- System assurance at the design or requirements level
- Integration of, or tradeoffs between, static and dynamic analysis
- Issues in scaling static analysis to deal with large systems
- Flaw catching vs. sound analysis
- Benchmarks or reference datasets
- Formal descriptions of weaknesses and vulnerabilities in the CWE
- User experience drawing useful lessons or comparisons
- Synergies of pre- and post-production assurance
- Case studies on real applications
- Temporal and inter-tool information sharing
Who Should Attend?
Those who develop, use, purchase, or review software assurance tools should attend. Academics working on semi- or fully automated tools for reviewing or assessing the security properties of software are especially welcome. We encourage participation from researchers, students, developers, and assurance tool users in industry, government, and universities.
We will reserve workshop time for presentations from SATE participants. In addition to SATE presentations we solicit contributions describing basic research, applications, experience, or proposals relevant to software assurance tools, techniques, and their evaluation. See the Call for Papers for more details, such as topics of interest.
Open submission papers should be 2 to 8 pages long; papers over 8 pages will not be reviewed. Papers should clearly identify their novel contributions.
Submit papers electronically in PDF to Wendy Havens (email@example.com) and Paul E. Black (firstname.lastname@example.org). Your submission constitutes permission for us to publish it in workshop proceedings.
We will notify submitters of acceptance by 23 October 2009.
Presentations by SATE participants will be handled separately.
Accepted papers will be published in the workshop proceedings as a NIST Special Publication.
- 17 October: Submissions due
- 23 October: Author notification
- 6 November: Workshop
The program consists of presentations by participants in and organizers of the 2009 Static Analysis Tool Exposition, invited presentations, and presentation of an accepted paper.
This is the final program.
8:30 AM Welcome to SATE - Paul E. Black, NIST, SATE organizer
8:40 C Secure Coding Guidelines, Robert C. Seacord, SEI, & Thomas Plum, Plum Hall, Inc. - accepted paper
9:00 SATE 2009 background - Vadim Okun, NIST, SATE organizer
9:30 Automated Security Acceptance Testing with Binary Static Analysis, Chris Wysopal, Veracode, SATE participant
9:55 Paul Anderson, GrammaTech, SATE participant
10:55 Benchmarking Static C Bug-Checking Tools, Cristina Cifuentes, Sun - invited presentation
11:20 Addressing Software Security through Automated Source Code Analysis, Ming-Wei (Benson) Wu, Armorize, SATE participant
11:45 Ensuring Software Integrity with Static Analysis: NIST Code Base Evaluation and Results, Peter Henriksen, Coverity, SATE participant
1:10 PM SATE 2009 observations from analysis - Vadim Okun, NIST, SATE organizer
1:30 Threat Modeling and Manual Assessment, David Lindsay, Cigital, SATE analyst
1:50 Increasing Software Security, Reliability, and Maintainability through Integrated Static and Dynamic Analysis, John Greenland, LDRA, SATE participant
2:15 Experiences and Observations, Todd Landry, Klocwork, SATE participant
2:40 Recommendations based on analysis of SATE tool warnings - Aurelien Delaitre, SATE organizer
3:20 Discussion session: planning for SATE 2010 - Paul E. Black, NIST
4:00 Software Assurance Findings Expression Schema (SAFES) Framework, Sean Barnum, Cigital Federal - invited presentation
4:25 Null Dereference Analysis in Practice, William Pugh, UMD - invited presentation
4:55 Closing remarks - Paul E. Black, NIST, SATE organizer
Paul E. Black (NIST) email@example.com
Tom Ball (Microsoft Research)
Redge Bartholomew (Rockwell Collins)
Chandrasekhar Boyapati (Univ. of Michigan)
Mary Ann Davidson (Oracle)
Klaus Havelund (Jet Propulsion Laboratory)
W. Bradley Martin (NSA)
Jaime Merced (DoD)
James W. Moore (MITRE)
William Pugh (Univ. of Maryland)
Mark Saaltink (Communications Security Establishment Canada)
Henny Sipma (Kestrel Technology)
Andy White (NSA)