1
Integrating the Science of Learning
Janet Weisenford
Executive Director
Human Performance Center
2
Outline
• Background
  - Executive Review of Navy Training
  - Human Performance Center
• The Science of Learning
  - “What” and “When”
• Benefits from the Science of Learning: Why Bother?
• Implementing the Science of Learning Navy-Wide
  - Challenges & Strategies
• Some Examples & Suggestions
• Summary
3
The Origin of the Navy’s HPC
Executive Review of Navy Training (Oct 00)
– Explicitly managing human performance is not a function the Navy performs today.
– The Navy needs an organization fully dedicated to Human Performance.
– Research tells us a great deal about the science of learning; that science should be applied to Navy training.
– The Revolution in Training will be as successful as the soundness of the scientific foundation upon which it is built. The science of learning is paramount….
4
HPC Mission
“The mission of the Human Performance Center is to optimize Naval warfighting performance by applying the Human Performance Systems Model and the Science of Learning to all facets of Naval operations.”
Human Performance Systems Model

[Diagram: a four-phase cycle]
I. Define Requirements – Establish performance standards & requirements; translate job requirements into competencies (K, S, A, O, T).
II. Define Solutions – Design human performance solutions; generate solution options and metrics; conduct effectiveness & cost analysis.
III. Develop Components – Develop, build, & integrate tools.
IV. Execute & Measure – Implement & test intervention; evaluate “Product of Plan.”
Throughout the cycle, performance consultants make recommendations and apply the Science of Learning & Human Performance.
5
HPC Background
The HPC … A corporate Navy organization supporting the Sailor, the Fleet and the Acquisition community

• FY02 (TFE HP Cell)
  – 1 site
  – 10-20 personnel
• FY03 (Provisional HPC) – “Year of the Sailor”
  – 16 Det sites
  – 75-85 temporary personnel
  – NPDC & NSTC
• FY04 (HPC) – “Year of the Fleet”
  – 23 Det sites
  – 125 permanent personnel
• FY05 (HPC) – “Year of Implementation”
  – Continued expansion
  – 171+ personnel
[Map: HPC headquarters and detachment sites]
• Norfolk: HPC HQ, FFC, OPTEVFOR, NNWC, CNE, CENNAVINTEL, CPD, CENATNSF, CNL, ATGLANT, MARFPCOM, NAVSAFCEN
• Pax River: NAVAIR
• Bethesda: NMETC
• Washington: NAVSEA, CNI
• Dahlgren: CSCS
• Groton: SLC/CSL
• Newport: NWC, SWOS
• Panama City: CEOD/DIVE
• Pensacola: CNATT, CID
• Orlando: HPC N7
• San Diego: ATGPAC, CNAF, CNSF, SPAWAR, FASWC, CSEAL/SWCC
• Corpus Christi: CNATRA
• Monterey: NPS
• Great Lakes: NSTC
• Millington: NPC
• Athens: CSS
• Honolulu: CPF
• Port Hueneme: CSFE
6
Human Performance Center FY 05 Structure

[Chart: the Human Performance Center serves Training & Education, the Fleet, the Acquisition community, and the U.S. Navy-wide enterprise. Functional areas:]
• HP Technology Support
• Standards & Methodology
• Measurement & Evaluation
• Return on Investment
• Modeling & Simulation
• Science of Learning
• Team Performance Integration
• Quality Based Evaluation
7
Outline
• Background
  - Executive Review of Navy Training
  - Human Performance Center
• The Science of Learning
  - “What” and “When”
• Benefits from the Science of Learning: Why Bother?
• Implementing the Science of Learning Navy-Wide
  - Challenges & Strategies
• Some Examples & Suggestions
• Summary
8
Science of Learning

“The Science of Learning is the foundation for understanding what learning is, how people learn, and how it transfers to measurable performance.”

• Changing the way the Navy focuses training is needed to achieve the greatest benefits for Sailors and the Fleet, and will allow the Navy to meet the readiness challenges of the 21st Century.
• Increasing the discipline & rigor in the selection of training solutions requires a better understanding of the cost/performance trade-offs involved in imparting the knowledge, skills & abilities needed to develop warfighting proficiency.
• Other definitions: ERNT, NSF, APA
9
The “S” in the SL: Definition of Science

Science is defined as:
1. Possession of knowledge as distinguished from ignorance or misunderstanding
2. Knowledge covering general truths or the operation of general laws, especially obtained and tested through the scientific method

(Webster’s New Collegiate Dictionary, 1977)
10
Learning Defined
• Learning is defined as “a change in the internal state of the individual that is inferred from a relatively permanent improvement in performance as a result of instruction and experience.”
• Internal States that are inferred. Learning involves the storage of memories or skills—no consensus on how this storage works, but there is agreement that the internal states can only be inferred from learner behavior.
• Permanent Improvement. Learning is inferred when the learner demonstrates new or improved knowledge or skills that are long-term.
• Resulting from experience. Learning is dependent on what the learner has experienced through instruction. The real key is how the learner interprets what he or she experienced.
Hays, The Science of Learning: Theories, Data, and Applications, draft, 20 May 05
11
An Information Processing Model of Learning

[Diagram: input of new information enters through the senses into a short-term store; a selective filter passes attended information through a limited-capacity channel into long-term memory storage; a system for locating & retrieving information drives output/behavior]

Adapted from: Broadbent, D.E. (1958). Perception and communication. Elmsford, NY: Pergamon Press.
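The flow above can be sketched as a toy pipeline. This is purely illustrative: every name and parameter below (InformationProcessor, the capacity of 7, the channel labels) is a hypothetical choice for the sketch, not part of Broadbent's model or these slides.

```python
from collections import deque

class InformationProcessor:
    """Toy version of the Broadbent-style pipeline in the diagram above."""
    def __init__(self, attended_channel, short_term_capacity=7):
        self.attended_channel = attended_channel             # selective filter setting
        self.short_term = deque(maxlen=short_term_capacity)  # limited-capacity short-term store
        self.long_term = {}                                  # long-term memory storage

    def perceive(self, channel, item):
        # Senses take in new information; the selective filter passes only
        # the attended channel into the short-term store (old items decay).
        if channel == self.attended_channel:
            self.short_term.append(item)

    def rehearse(self, topic):
        # Rehearsed items pass through the limited-capacity channel into
        # long-term memory, indexed for later retrieval.
        self.long_term.setdefault(topic, []).extend(self.short_term)

    def retrieve(self, topic):
        # The locating-and-retrieving system drives output/behavior.
        return self.long_term.get(topic, [])

p = InformationProcessor(attended_channel="instructor")
p.perceive("instructor", "Ohm's law: V = IR")
p.perceive("hallway noise", "unrelated chatter")  # filtered out, never stored
p.rehearse("basic electricity")
print(p.retrieve("basic electricity"))  # ["Ohm's law: V = IR"]
```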
12
When do we use Science of Learning?

SL is applied after HPI analysis has determined that a root cause of a performance deficiency is Skills and Knowledge.
– HP analyses: the identification & removal of barriers that prevent people from achieving top performance.
– Training alone historically solves less than 20% of performance concerns.

[Chart: share of performance deficiencies attributable to each factor – Expectations & Feedback 35%, Tools & Resources 30%, Consequences & Incentives 10%, Skills & Knowledge 12%, Capacity (Selection & Assignment) 10%, Motives & Preferences 3%]
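A quick arithmetic check on the chart above (the factor-to-percentage mapping is as reconstructed from the slide layout): the three environmental factors account for three quarters of performance deficiencies, which is why training alone addresses so few of them.

```python
# Percentages as read from the Gilbert Model chart above.
factors = {
    "Expectations & Feedback": 35,
    "Tools & Resources": 30,
    "Consequences & Incentives": 10,
    "Skills & Knowledge": 12,
    "Capacity (Selection & Assignment)": 10,
    "Motives & Preferences": 3,
}
environmental = ("Expectations & Feedback", "Tools & Resources",
                 "Consequences & Incentives")
env_share = sum(factors[f] for f in environmental)
print(f"Environmental factors: {env_share}%")        # 75%
print(f"Individual factors:    {100 - env_share}%")  # 25%
print(f"Skills & Knowledge (training): {factors['Skills & Knowledge']}%")  # 12%, i.e. < 20%
```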
13
Outline
• Background
  - Executive Review of Navy Training
  - Human Performance Center
• The Science of Learning
  - “What” and “When”
• Benefits from the Science of Learning: Why Bother?
• Implementing the Science of Learning Navy-Wide
  - Challenges & Strategies
• Some Examples & Suggestions
• Summary
14
The “S” in the SL: Some Guiding Principles

• The goal of science is to objectively predict and “control” phenomena (e.g., instructional approaches to optimize learning and improve performance)
• Demonstrate “differences” to discern treatment effects (e.g., instructional approaches)
• Minimize “error” by controlling sources of variance
• No theory (or hypothesis) is ever proven – we attempt to rule out the alternative explanations of phenomena
• We must do things in such a way that they are repeatable

[Winer (1962). Statistical principles in experimental design; Rubenstein, et al. (1984). Science as a cognitive process.]
15
Why Bother?
Why? It Works!!
It increases the likelihood of achieving quality:
- Zero defects
- Does what it was designed to do
- Facilitates positive changes in learners
- Value for money
The alternative places the learner and learning at risk.
16
Outline
• Background
  - Executive Review of Navy Training
  - Human Performance Center
• The Science of Learning
  - “What” and “When”
• Benefits from the Science of Learning: Why Bother?
• Implementing the Science of Learning Navy-Wide
  - Challenges & Strategies
• Some Examples & Suggestions
• Summary
17
Institutionalizing the Science of Learning: Gilbert Model

[Diagram: the six Gilbert Model factors – Expectations & Feedback; Tools & Resources; Consequences & Incentives; Skills & Knowledge; Capacity (Selection & Assignment); Motives & Preferences]
18
Root Cause Matrix - 1

POTENTIAL ROOT CAUSE                                  PRIORITY  DIFFICULTY
Expectations and Feedback
  A - No clear expectations                           High      Medium
  B - Performance metrics not developed               High      High
  C - Feedback is inconsistent                        Medium    Low
  D - Misconceptions re: implementation of SL         Medium    Low

Tools and Resources
  E - Few SL tools                                    High      High
  F - Current processes do not include SL             High      Medium
  G - Few in-house SL experts                         Medium    Low
  H - Heavy reliance on contractors                   Low       Low
  I - No reference documentation                      Medium    Medium
  J - No DIDs, standards for contract vehicles        Medium    Low
  K - No SL SOW/product support                       Low       Low
  L - Little validation for SL for Navy applications  Medium    High
19
Root Cause Matrix – 2

POTENTIAL ROOT CAUSE                                  PRIORITY  DIFFICULTY
Consequences and Incentives
  M - No consequences for no SL                       High      High
  N - No rewards for applying SL                      High      High
  O - No performance measures for applying SL         Medium    Medium
  P - Applying SL is not easy                         Medium    Medium

Skills and Knowledge
  Q - SL knowledge/skills undefined                   Medium    Low
  R - SL knowledge/skills not widespread              Medium    Low

Capacity
  S - No SL practitioner selection/assignment         Medium    Medium
  T - SL inherent capabilities not identified         Medium    Medium

Motives and Preferences
  U - Motives/preferences not identified              Medium    Low
20
Root Cause Matrix Ratings

[Grid: root causes plotted by Priority (rows) against Difficulty (columns)]

                    Difficulty: Low        Difficulty: Medium    Difficulty: High
Priority: High      –                      A, F                  B, E, M, N
Priority: Medium    C, D, G, J, Q, R, U    I, O, P, S, T         L
Priority: Low       H, K                   –                     –

High/High cell (top priority, highest difficulty):
• No consequences for no SL (M)
• No rewards for applying SL (N)
• Performance metrics not developed (B)
• Few SL tools (E)
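The grid can be rebuilt mechanically from the two matrices. A minimal Python sketch using the letter codes above:

```python
# Rebuild the Priority x Difficulty grid from the root-cause matrices above.
from collections import defaultdict

# (code, priority, difficulty) triples taken from Root Cause Matrix 1 and 2.
causes = [
    ("A", "High", "Medium"),   ("B", "High", "High"),
    ("C", "Medium", "Low"),    ("D", "Medium", "Low"),
    ("E", "High", "High"),     ("F", "High", "Medium"),
    ("G", "Medium", "Low"),    ("H", "Low", "Low"),
    ("I", "Medium", "Medium"), ("J", "Medium", "Low"),
    ("K", "Low", "Low"),       ("L", "Medium", "High"),
    ("M", "High", "High"),     ("N", "High", "High"),
    ("O", "Medium", "Medium"), ("P", "Medium", "Medium"),
    ("Q", "Medium", "Low"),    ("R", "Medium", "Low"),
    ("S", "Medium", "Medium"), ("T", "Medium", "Medium"),
    ("U", "Medium", "Low"),
]

grid = defaultdict(list)
for code, priority, difficulty in causes:
    grid[(priority, difficulty)].append(code)

print(grid[("High", "High")])  # ['B', 'E', 'M', 'N'] -- tackle these first
```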
21
Institutionalize SL throughout the Navy
What SL Institutionalized “looks like”:
• Communicating expectations Navy-wide
• Navy’s SL workforce defined and functioning at peak performance
• Practitioners performing according to guidance
• Increasing decision-making based on prediction vs. speculation
• Providing uniform access to job performance tools
• Gathering fleet impact data
• Using a uniform rating scale for quality
• Maintaining a “Heartsmart” dashboard
• Fully integrated with ILE
22
Institutionalizing the Science of Learning: Implementation Strategy

• Organizations must incorporate SL into existing training development processes
  - Acquisition Community
  - Fleet
  - NETC

[Diagram: Human Performance / HSI – the elements of HSI (Requirements, Plans, Execution, Assessments) within the Navy Warfare Training System]

[Screenshot: a sample “Digital Theory” course page for the Combat Direction System Operator, USS DEYO (NAVSEA Lm03-123-C). Topics include: ones and zeros; dits and dots; flip flops; ring counters; semiconductors; IC chips; digital data; Boolean algebra; octal, hexadecimal, and decimal mathematics; and analog-to-digital conversion. Digital data allows rapid, accurate communications between devices, systems, platforms, and communities; submarines, aircraft, ships, and trucks use digital systems to speed up information exchange and enable rapid responses to situational requirements.]

NKO is the Delivery System
23
Science of Learning Notional Recipe

[Flow: Requirements (NMETL, JMETL, MNS, JTA) → Requirements Defined → Analysis (HP/HSI/SA) → Knowledge/Skill Gap Defined → Learning Objectives Developed and Categorized → Learning Strategies/Techniques/Technologies: Alternatives Identified and Business Case Analysis Performed → Develop Training Product (e.g., RLO) → Implement and Measure → Fleet Outcomes]

Supporting elements feeding the flow: Assessment, ISD, the SL Tool, ILE, and SL Knowledge.
24
Status
• Currently defining the Science of Learning Practitioner Community and conducting a job task analysis.
• Drafting policy and guidance.
• Developing evaluation criteria (metrics).
• Designing a tool to help with the application of the science of learning: access to references; assistance in applying the research.
• Using the Learning Strategies Consortium as a method to work with industry and academia.
• Providing technical assistance.
25
Outline
• Background
  - Executive Review of Navy Training
  - Human Performance Center
• The Science of Learning
  - “What” and “When”
• Benefits from the Science of Learning: Why Bother?
• Implementing the Science of Learning Navy-Wide
  - Challenges & Strategies
• Some Examples & Suggestions
• Summary
26
An Example: Quality Evaluation Tool for Asynchronous Instruction

• The tool was developed to:
  - Assist instructional developers/designers in improving the quality of their products
  - Assist those acquiring web-based training in raising their standards for what they purchase
• The tool evolved from a “checklist” to a Likert scale
• The tool is based on numerous research efforts
• Soon to be published as a Technical Report by Dr. Bob Hays, Naval Air Warfare Center Training Systems Division; Dr. Renée Stout, R.J. Stout, Inc.; and Dr. David Ryan-Jones
27
Quality Evaluation Tool
Instructional Features Evaluation: Rating Criteria (Score 1-5)

1. Instructional Content
  1.a. The content is presented in a logical manner.
  1.b. The purpose of the content is clearly stated.
  1.c. The instructional objectives are clearly stated.
  1.d. The content supports each & every instructional objective.
  1.e. The content is free of errors.
  1.f. The content is job relevant.
  1.g. The “authority” for the content is clearly stated.
  1.h. There are clear indications of prerequisites.
  1.i. There are clear indications of completed topics.
  1.j. Sources for additional information are available.
Content Subtotal
28
Quality Evaluation Tool
Instructional Features Evaluation: Rating Criteria (Score 1-5)

2. Instructional Activities
  2.a. Activities are relevant.
  2.b. The learner is required to interact with the content.
  2.c. Instruction is engaging.
  2.d. Instructional media directly supports learning activities.
Activities Subtotal

3. Performance Assessment
  3.a. Assessments are relevant.
  3.b. Assessments are logical.
  3.c. Assessments are varied.
Assessment Subtotal
29
Quality Evaluation Tool
Instructional Features Evaluation: Rating Criteria (Score 1-5)

4. Performance Feedback
  4.a. Feedback is timely (immediately or soon after assessment).
  4.b. Feedback is meaningful (related to objectives).
  4.c. Positive reinforcement is provided for correct responses.
  4.d. Instructional media directly supports learning activities.
Feedback Subtotal
Instructional Features Subtotal
30
Quality Evaluation Tool
User Interface Evaluation: Evaluation Criteria (Score 1-5)

5. Navigation and Operation
  5.a. User interface makes course structure explicit.
  5.b. Tutorial &/or help available to explain navigation & operation features.
  5.c. Help function is available to explain navigation & operation features.
  5.d. Includes all necessary navigation and operation controls.
  5.e. Navigation & operation controls are clearly and consistently labeled.
  5.f. Navigation & operation controls are located in a consistent place.
  5.g. Navigation & operation controls operate consistently.
  5.h. Learner always knows location in course.
  5.i. Learner always knows how he/she arrived at location.
  5.j. Learner knows estimated time required for each module.
Navigation & Operation Subtotal
31
Quality Evaluation Tool
User Interface Evaluation: Evaluation Criteria (Score 1-5)

6. Content Presentation
  6.a. There are no sensory conflicts.
  6.b. All media are sharp and clear.
  6.c. Presentations are aesthetically pleasing.
  6.d. Multi-modal presentation of content is used.
  6.e. Multi-media presentation of content is used.
  6.f. Media are easy to use.
  6.g. External hyperlinks are kept to a minimum.
Presentation Subtotal
32
Quality Evaluation Tool
User Interface Evaluation: Evaluation Criteria (Score 1-5)

7. Installation and Registration
  7.a. Course does not require installation, or learners can install without assistance.
  7.b. Minimal “plug-ins” are required.
  7.c. “Optimization” test is available.
  7.d. Technical support is available.
  7.e. Registration is simple & straightforward.
Installation and Registration Subtotal

Instructional Features Subtotal
User Interface Subtotal
Total Quality Score (sum of subtotals)
No matter what the overall score, a score of one on any criterion should be considered a major problem and requires redesign of the instructional product. A score of two on any criterion should be considered a problem and may require redesign.
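That rule is easy to automate. A minimal sketch (the criterion IDs and scores are illustrative, not from a real evaluation):

```python
# Flag criteria per the rule above: a score of 1 is a major problem that
# requires redesign; a score of 2 is a problem that may require redesign.
scores = {"1.a": 4, "1.b": 5, "2.c": 1, "5.h": 2}  # illustrative scores

major = [c for c, s in scores.items() if s == 1]
minor = [c for c, s in scores.items() if s == 2]
print("Redesign required:", major)       # ['2.c']
print("Redesign may be needed:", minor)  # ['5.h']
```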
33
Instructional Features Evaluation Criteria

1.a. The content is presented in a logical manner.

Score 1:
• No course structure is visible.
• No learning objectives support each other, and many conflict.
• No concepts are presented clearly & precisely.
• Does not use techniques to build on prior learning.

Score 2:
• Very little course structure is visible.
• Few learning objectives support each other, and many conflict.
• Few concepts are presented clearly & precisely.
• Uses few techniques to build on prior learning.

Score 3:
• Course structure is somewhat visible.
• Some learning objectives support each other, but some conflict.
• Some concepts are presented clearly & precisely.
• Uses some techniques to build on prior learning.

Score 4:
• Course structure is mostly visible.
• Most learning objectives support each other, and few conflict.
• Most concepts are presented clearly & precisely.
• Uses many techniques to build on prior learning.

Score 5:
• Course structure is very visible.
• All learning objectives support each other, and none conflict.
• All concepts are presented clearly & precisely.
• Uses very many techniques to build on prior learning.
34
1.a. The content is presented in a logical manner.
Multiple references are provided that further explain why logical content presentation is important and how to structure content.

Excerpt: A problem arises when an individual is not able to construct a model that provides consistency among pieces of information. Because they strive for such consistency, they may “succumb to illusions of consistency and of inconsistency” (Johnson-Laird, et al., 2004, p. 44). The result of improperly formed models of instructional information is likely to be poorer performance. Thus, the purpose of high-quality instructional content should be to guide the student to form models, patterns, and associations of information that will be more easily retrieved and used when needed.
35
Overall Scoring
Range of Scores Interpretation
172 — 215 Extremely well designed instructional product. Scored well on all criteria.
151 — 171 Has potential. Scored well on all "essential" criteria, but still has some "loose ends" that could be improved.
129 — 150 Has some strengths, but large deficits. Should focus on improving weaknesses.
108 — 128 Not enough effort invested in instructional design. Confusing and could lead to frustration. Should go back to the "drawing board."
43 — 107 Inadequate. Little or no consideration of instructional design. Not suitable for most learners.
36
Scoring For Instruction
Subsection & Subtotal Score Interpretation
Instructional Features: 105 is the highest possible subtotal score for all instructional features criteria (the sum of the four subsection maxima below). A subtotal score below 42 is a problem.
Instructional Content: 50 is the highest possible subtotal score for instructional content criteria. A subtotal score below 20 is a problem.
Instructional Activities: 20 is the highest possible subtotal score for instructional activities criteria. A subtotal score below 8 is a problem.
Performance Assessment: 15 is the highest possible subtotal score for performance assessment criteria. A subtotal score below 6 is a problem.
Performance Feedback: 20 is the highest possible subtotal score for performance feedback criteria. A subtotal score below 8 is a problem.
37
Scoring for User Interface
User Interface 110 is the highest possible subtotal score for all user interface criteria. A subtotal score below 44 is a problem.
Navigation and Operation: 50 is the highest possible subtotal score for navigation criteria. A subtotal score below 20 is a problem.
Presentation: 35 is the highest possible subtotal score for presentation criteria. A subtotal score below 14 is a problem.
Installation & Registration: 25 is the highest possible subtotal score for installation and registration criteria. A subtotal score below 10 is a problem.
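Putting the subtotal thresholds and the overall interpretation bands together, here is a minimal scoring sketch. The data structures and function are hypothetical, but the maxima, thresholds, and bands are taken from the tables above (every "problem" threshold is 40% of the subsection maximum, i.e., an average criterion score below 2).

```python
# Subsection maxima and 'problem' thresholds from the scoring tables above.
SECTIONS = {
    "Instructional Content":       (50, 20),
    "Instructional Activities":    (20, 8),
    "Performance Assessment":      (15, 6),
    "Performance Feedback":        (20, 8),
    "Navigation and Operation":    (50, 20),
    "Content Presentation":        (35, 14),
    "Installation & Registration": (25, 10),
}

# (lower bound, interpretation) from the Overall Scoring table; the minimum
# possible total is 43 (43 criteria, all scored 1), so the bands are exhaustive.
BANDS = [
    (172, "Extremely well designed instructional product."),
    (151, "Has potential; some loose ends to improve."),
    (129, "Has some strengths, but large deficits."),
    (108, "Not enough effort invested in instructional design."),
    (43,  "Inadequate; little or no consideration of instructional design."),
]

def evaluate(subtotals):
    problems = [s for s, score in subtotals.items() if score < SECTIONS[s][1]]
    total = sum(subtotals.values())
    interpretation = next(text for bound, text in BANDS if total >= bound)
    return total, interpretation, problems

# Example: a product scoring 80% of the maximum in every subsection.
subtotals = {s: round(maximum * 0.8) for s, (maximum, _) in SECTIONS.items()}
print(evaluate(subtotals))  # (172, 'Extremely well designed...', [])
```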
38
Proposed Training Method: Conceptual Training for IT System
Conceptual Training:
• The big picture of what the system is intended to do.
• Understanding the workflow of the whole process and the organizational impact.
• Results in more accurate mental models over time and improves learning and retention. (Coulson et al., 2003)

Recommendation:
• Introduce a conceptual training component prior to the “hands-on training” by including job aids and workflows that explain how the overall system works and impacts different parts of the organization.
39
Evaluation Strategy
Recommendation: Plan for a 4-Level Evaluation to determine training effectiveness.

Level 1 – Collect trainee reaction data
• Evaluate trainee satisfaction with regard to the training.

Level 2 – Assess knowledge gains
• Ideally, conduct pre- and post-test assessment of trainee performance. If a pretest is not feasible, then administer a post-training assessment at the end of each training module, with a pass/fail criterion.
• Provide remediation links from the assessment to the training content.
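One way to operationalize Level 2 is sketched below; the 80% pass mark and the normalized-gain formula are illustrative assumptions, not prescribed by the slides.

```python
# Level 2 sketch: score a pre/post assessment pair for one trainee.
def level2_result(pre_pct, post_pct, pass_mark=80.0):
    passed = post_pct >= pass_mark  # pass/fail criterion (assumed 80%)
    # Normalized gain (Hake): fraction of the possible improvement achieved.
    gain = (post_pct - pre_pct) / (100.0 - pre_pct) if pre_pct < 100 else 0.0
    return passed, gain

passed, gain = level2_result(pre_pct=55.0, post_pct=85.0)
print(f"Pass: {passed}, normalized gain: {gain:.2f}")  # Pass: True, 0.67
```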
40
Evaluation Strategy
Level 3 – Assess transfer of training to the job
• By direct observation from supervisors and site management.
• Surveying users.
• Tracking the number of help-desk calls that are training related.

Level 4 – Assess organizational impact
• Did the training positively impact the community of users?
• What was the estimated ROI of the training program?
  - Time savings
  - Increased productivity
  - Improved quality
  - Better performance
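For Level 4, the standard training-ROI formula (net monetized benefit divided by program cost) can be applied to estimates for the four benefit categories above. All dollar figures in this sketch are hypothetical.

```python
# Level 4 sketch: training ROI from monetized benefit estimates.
benefits = {                      # hypothetical annualized estimates
    "time savings": 120_000.0,
    "increased productivity": 90_000.0,
    "improved quality": 40_000.0,
    "better performance": 50_000.0,
}
program_cost = 150_000.0          # hypothetical total training cost

net_benefit = sum(benefits.values()) - program_cost
roi_pct = 100.0 * net_benefit / program_cost
print(f"Estimated ROI: {roi_pct:.0f}%")  # (300,000 - 150,000) / 150,000 = 100%
```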
41
The “S” in the SL: Basic Rules for Thinking Scientifically
• Maintain a healthy level of skepticism
• Consult original sources (when possible - e.g., publications)
• Don’t jump to conclusions (based on opinions or insufficient data)
[Beveridge (1975). The art of scientific investigation.]
Applies to any of the Human Performance Sub-sciences, not just SL
42
Summary
• Navy is committed to institutionalizing the Science of Learning.
• We are issuing policy, working to develop tools, providing the incentives and consequences, and establishing metrics.
• We are integrating the Science of Learning into an overall strategy of performance improvement.
43
Linked Slides
44
Human Performance Improvement – Science of Learning

[Diagram: the HPI process]
1. Mission Analysis: determine customer goals; identify performer groups; assess cost.
2. Performance Analysis: compare desired performance with actual performance to define the performance gap.
3. Root Cause Analysis: select model; gather data; analyze data.
4. Intervention Selection: classify root cause; identify interventions; recommend interventions.
5. Intervention Planning: develop strategy; develop plan of action & milestones; secure stakeholders’ approval.
6. Intervention Implementation: develop interventions; implement; monitor; revise.
7. Evaluation: evaluate against desired goals; provide feedback to customer and other stakeholders.