| February 1, 2017 | Abstract and optional full paper submission begins |
| May 26, 2017 | Exhibit & Supporter registration opens |
| June 15, 2017 | Abstract and optional extended abstract submission ends |
| June 29, 2017 | Acceptance notifications sent |
| July 24, 2017 | Submit final abstracts and presenter biographies |
| August 28, 2017 | Submit final presentations and optional full papers |
September 27, 2017 | 4:00 PM - 4:45 PM | Track 5 - Test and Verification
Efficient, Effective & Innovative Automated Software Test for the Acquisition Process
The concept of automating the testing of software-intensive systems has been around for decades, yet the practice remains rare in many industries, especially in the government defense sector. The Scientific Test and Analysis Techniques Center of Excellence is conducting a multi-year study of the application of automated software testing (AST). This presentation captures the study's first year and addresses the role of AST within the Department of Defense acquisition process; it is intended for those interested in applying automation to software testing.

The presentation applies a systems engineering process, grounded in the scientific method, to the steps required to build an automation capability, along with the need to perform a return on investment (ROI) analysis to make the business case for automation. The approach stresses planning for future automation success by accounting for the level of effort required to maintain the automation capability, and by recognizing that the true power of automation lies in repeated application of the same or similar tests: the library of working scripts should need only minimal modification. Updates to scripts should be minor and limited to either of two purposes: (a) re-executing the same test later, after the system under test (SUT) or test environment changes, or (b) adapting scripts to similar but different automation purposes. Note that maintenance loads can vary significantly depending on the automation tools selected.

Additionally, the presentation assesses the AST state of practice and art in industry, academia, and the DoD; develops an AST implementation guide for program managers and practitioners; and discusses how to incorporate Scientific Test and Analysis Techniques (STAT) into AST.
This presentation provides information and insight into the planning, architectures and implementations, and test design strategies for AST, and describes how AST fits into the larger issue of software economics.
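The business case described above, where automation pays off only through repeated execution against setup and maintenance costs, can be sketched as a simple cost model. This is an illustrative sketch only: the function name, cost parameters, and figures below are hypothetical assumptions for demonstration, not numbers from the study.

```python
def ast_roi(runs, manual_cost_per_run, setup_cost,
            automated_cost_per_run, maintenance_per_run):
    """Return the ROI of test automation over `runs` test cycles.

    ROI = (cost avoided by not testing manually - cost of automation)
          / cost of automation
    """
    # Total cost if every run were executed manually.
    manual_total = runs * manual_cost_per_run
    # One-time setup plus per-run execution and script maintenance.
    automated_total = (setup_cost
                       + runs * (automated_cost_per_run + maintenance_per_run))
    return (manual_total - automated_total) / automated_total

# Hypothetical figures (person-hours): automation only pays off
# once the same or similar tests are repeated enough times.
few = ast_roi(runs=2, manual_cost_per_run=40, setup_cost=400,
              automated_cost_per_run=2, maintenance_per_run=8)
many = ast_roi(runs=50, manual_cost_per_run=40, setup_cost=400,
               automated_cost_per_run=2, maintenance_per_run=8)
print(f"ROI after 2 runs:  {few:.2f}")   # negative: investment not recouped
print(f"ROI after 50 runs: {many:.2f}")  # positive: repetition pays
```

The `maintenance_per_run` term reflects the abstract's point that maintenance load varies with the automation tools selected; a tool requiring heavy script rework after each SUT change can keep the ROI negative even over many runs.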
Darryl Ahner (Primary Presenter), Air Force Institute of Technology, Darryl.Ahner@afit.edu;