Page 1

Software Acceptance: Direct Artifact Assurance

William L. Scherlis
Carnegie Mellon University

Professor, School of Computer Science
Director, CMU/NASA High Dependability Computing Program
Director, CMU PhD Program in Software Engineering

[email protected]
412-268-8741

Page 2

Outline

Problems
• Software test and inspection inadequate to assure dependability and security
• Barriers in IT supply chain: off-the-shelf, outsourcing, etc.
• Example: the real story of Ariane-5
• Example: Windows device drivers and blue-screens

Software acceptance
• Direct assurance of software
• Contrast: CMM and NIAP-CC
• Examples: MSR SLAM, CMU Fluid
• What’s new:
  • Deep technical results informed by engineering pragmatism
  • Focus on scalability, decomposition, usability

Page 3

Assurance of critical properties—today’s best practice

Interface barriers exist between producers* and consumers at all stages of an IT supply chain

*Producers: Internal development groups; subcontractors/outsourcers/offshore; off-the-shelf; open source; etc.

Problem: Testing, inspection, and design analysis are inadequate to assure security and dependability

Symptom: Software failures and security defects

Challenges: Subsystem decomposition, critical properties with non-locality in code, concurrency and non-determinism

Some examples…

Four barriers
• Contractor qualification
• Requirements definition
• Engineering acceptance
• “Second” sourcing

Mitigation (today’s best practice)
• CMM / CMMI
• Close relationships
• Testing, inspection, design analysis
• API conventionalization

Page 4

Examples: The inadequacy of test and inspection

Ariane 5—mission critical
• Ariane 5 veered off course and exploded 40 seconds into its maiden flight due to software failure
• The failure was due to a known unhandled exception in the software—cost $1 billion (see the sketch below)
• Why? “Heritage Ariane 4 code”
  • Trust in the legacy…it worked for Ariane 4
  • Distrust in defined criteria…too risky to modify “working” software even when it is known to be broken
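To make the failure pattern concrete, the following is a minimal sketch, in Java rather than the original Ada, of the kind of unprotected 64-bit-to-16-bit conversion behind the Ariane 5 exception; the class name, method, and numeric value are illustrative assumptions, not taken from the actual inertial reference software.

    // Minimal Java sketch of the Ariane-5 failure pattern: a 64-bit value that
    // fit comfortably in 16 bits on Ariane 4 overflows on Ariane 5, and the
    // resulting exception is left unhandled. (The flight code was Ada; names
    // and values here are illustrative, not from the actual SRI software.)
    public class HorizontalBiasConversion {

        // Mirrors Ada's range-checked conversion: Java's (short) cast would
        // silently wrap, so the range check is made explicit here.
        static short toSigned16(double value) {
            if (value > Short.MAX_VALUE || value < Short.MIN_VALUE) {
                throw new ArithmeticException("value out of 16-bit range: " + value);
            }
            return (short) value;
        }

        public static void main(String[] args) {
            double horizontalBias = 40_000.0;  // illustrative value exceeding the 16-bit range
            // No try/catch: the "heritage" caller assumed overflow was impossible,
            // so the exception propagates uncaught, mirroring the unit shutdown.
            short telemetryValue = toSigned16(horizontalBias);
            System.out.println(telemetryValue);
        }
    }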

“Blue screen”—desktop
• Most occur due to faulty 3rd party device driver code—but Microsoft “blamed” by users
• Reputational cost to Microsoft

[Diagram: Windows OS and a 3rd-party device driver interacting across an OS API with associated integrity constraints]

Page 5

Problem: Software acceptance in today’s practice

Sources of software
• Internally developed
  • Mission- or security-critical
  • Differentiating capability
  • Business logic
• Outsourced custom
  • Whole solutions
  • Separable subsystems
• Off-the-shelf components
  • Windows, OS X, Office, Oracle
• Open source components
  • Apache, Linux, Tomcat, etc.
• Mobile code
  • JavaScript in a web page
  • MS Word document
  • Free players, plug-ins
  • “Cool screensaver” virus mail
  • Spoofed executable enclosure

Basis for accepting software
• Trust the source
  • “Always trust content from __”
  • Chain of trust – certificates
• Explicit test and inspection
  • Custom and outsourced
  • May be more costly than code development itself
• Limited privilege – containment
  • Sandboxing for Java, scripts
• Verification of safety attributes
  • Ada/Java type integrity
  • Assert (often based on testing)
• Lack of awareness
  • Spyware and adware
  • Configuration mgt failures

Little focus on direct assurance of software code and design artifacts …

Page 6

Focus—direct assurance and evaluation best practice

What’s needed:
• Direct assurance (focused tools and ongoing research) (technology-dependent; attribute focused)
  • Assure the software itself: Quality, dependability, security (objective analysis)

Contrast with accepted best practices for evaluation:

• CMM/CMMI (ISO 9001x) (timeless; comprehensive)
  • Evaluate the team: Cost and schedule predictability
  • Evaluate the process (correlates with bug reduction)
• NIAP/CC (ISO 15408) (including potential EAL 7) (timeless; comprehensive)
  • Evaluate the process: Security policy definition
  • Evaluate the design: Design compliance
  • Sample the product* (*sampled – no direct assurance)

Page 7

Direct assurance—industry example: Microsoft SLAM for Windows XP


Blue Screens and Device Drivers: SLAM

[Diagram: Windows OS and a 3rd-party device driver interacting across an OS API with associated integrity constraints, annotated with small finite-state automata (states 0, 1, 2; events a, b) representing the API usage rules checked by SLAM]

Based on model checking: a deeply technical approach to assurance, originated in university labs (DARPA and NSF funding)

Direct analysis of Windows device driver code for protocol compliance

Compelling business case for direct assurance of software artifacts

http://research.microsoft.com/slam/
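For intuition about what SLAM checks, the following is a minimal sketch of a two-state API usage rule of the kind enforced on driver code: acquire and release of a lock must strictly alternate. SLAM itself analyzes C driver code statically against interface rules rather than monitoring at run time; the Java class and method names below are illustrative assumptions only.

    // Minimal sketch of a two-state API usage rule of the kind SLAM checks:
    // lock acquire and release must strictly alternate. SLAM verifies such
    // rules over C driver code without running it; this runtime monitor is
    // only an illustration, and all names are hypothetical.
    public class LockProtocolRule {

        private enum State { UNLOCKED, LOCKED }

        private State state = State.UNLOCKED;

        // Event: driver calls the "acquire" API.
        void onAcquire() {
            if (state == State.LOCKED) {
                throw new IllegalStateException("protocol violation: double acquire");
            }
            state = State.LOCKED;
        }

        // Event: driver calls the "release" API.
        void onRelease() {
            if (state == State.UNLOCKED) {
                throw new IllegalStateException("protocol violation: release without acquire");
            }
            state = State.UNLOCKED;
        }

        public static void main(String[] args) {
            LockProtocolRule rule = new LockProtocolRule();
            rule.onAcquire();
            rule.onRelease();          // compliant sequence
            try {
                rule.onRelease();      // violation: release without a matching acquire
            } catch (IllegalStateException e) {
                System.out.println("Defect found: " + e.getMessage());
            }
        }
    }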

Page 8

Direct assurance—research example: CMU Fluid

• Assure diverse “mechanical” program properties for dependability and security

• E.g., race conditions, locking policy, unaliased references

• Detected numerous race conditions (and developed assured fixes) in widely used production software code (see the sketch below)

• E.g., Sun Java 1.4 library BufferedInputStream

http://www.fluid.cs.cmu.edu

• Provide Java programmers with direct positive assurance of design intent

• Focused on incrementality of work and early programmer gratification
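To illustrate the class of defect and fix mentioned above, here is a minimal Java sketch of a check-then-act race between a reader and a concurrent close(); it is a simplified stand-in, not the actual java.io.BufferedInputStream source, and the lock-protects-field design intent noted in the comments stands in for Fluid's annotations.

    // Minimal sketch of the kind of race condition the slide describes: one
    // thread closes a buffered stream (nulling its buffer) while another
    // thread is mid-read. Illustrative only; not the actual java.io code.
    public class RacyBufferedReaderSketch {

        private byte[] buf = new byte[8192];  // design intent: protected by the lock on "this"
        private int pos = 0;

        // Unsynchronized read: checks buf, then uses it; another thread may
        // run close() in between, so buf can become null after the check.
        int readUnsafe() {
            if (buf == null) {
                return -1;
            }
            // interleaving point: a concurrent close() may null out buf here
            return buf[pos];               // possible NullPointerException
        }

        // The assured fix: read and close hold the same lock, so the check
        // and the use can no longer be separated by a concurrent close().
        synchronized int readSafe() {
            if (buf == null) {
                return -1;
            }
            return buf[pos];
        }

        synchronized void close() {
            buf = null;
        }

        public static void main(String[] args) {
            RacyBufferedReaderSketch s = new RacyBufferedReaderSketch();
            System.out.println(s.readUnsafe());  // 0 from the empty buffer
            s.close();
            System.out.println(s.readSafe());    // -1 once closed
        }
    }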

Page 9

Direct assurance—conclusion

Today’s software evaluation practices (testing, inspection, design analysis) are necessary but not sufficient to guarantee critical dependability and security attributes.

Industry and laboratory evidence suggests that an S&T focus on defect avoidance can yield useful results

New analytical techniques for assurance are starting to emerge in research and industry

The most recently developed industry tools are based on technical results from earlier 6.1 and 6.2 (basic and applied research) lab projects

Focus on specific engineering attributes related to dependability and security

A priori emphasis on scalability, component decomposition, and harmony with engineering best practices.