1 © 2020 MIT AGELAB
Bruce Mehler – Research Scientist | Massachusetts Institute of Technology ([email protected])
UNECE Global Forum for Road Traffic Safety Pre-event, Global Ministerial Conference on Road Safety
Panel II: Emerging automotive technologies – how our life will change
Stockholm, Sweden
February 18, 2020
Is Supportive Driver Monitoring Needed to Maximize Trust, Use, and the Safety-Benefits of Collaborative Automation?
This version of the presentation has been modified in the following ways:
• Videos and images of research participants have been removed or obscured to respect participant confidentiality.
• Additional text, slides, and citations have been added to increase utility of the slides for individuals who were not present to hear the speaker’s verbal commentary.
The MIT-based research presented reflects insights developed from two MIT-led multi-partner consortia (AHEAD & AVT), several multi-year research projects sponsored by industrial partners, the Insurance Institute for Highway Safety, the Santos Family Foundation, and the United States Department of Transportation through the New England University Transportation Center at MIT.
The opinions expressed in this talk are those of the speaker and do not necessarily represent those of organizations that have sponsored the cited research activities.
For further information on this presentation, contact: Bruce Mehler ([email protected])
For information on the AHEAD & AVT consortia, lead contact: Bryan Reimer ([email protected])
Note on Archived Version
What Is the Role of the Human in the Success of Automation?
• Automated technologies have delivered improved safety
  o Electronic stability control (ESC)
o Automatic emergency braking (AEB)
• Increased automation promises:
  o Increased safety
o Increased comfort
o Increased mobility
• But, is the driver expected to play a role?
  o SAE Level 1: Collaborative control
o SAE Level 2: Active monitoring of technology
o SAE Level 3: Readiness to take-over if needed
• Is this reasonable to ask of the human?
  o Particularly as the reliability of L2 and L3 systems improves?
What Is the Role of the Human in the Success of Automation?
HOW LONG?
Is it reasonable to expect a human to maintain sustained attention on their own in this model?
… Particularly as technology becomes more reliable?
Is Two-Way Collaborative Monitoring a More Realistic Model for Success?
MUTUAL SUPPORT
Will the driver be more comfortable, more trusting, if they have support in fulfilling their role?
Just as automation needs human back-up, should automation back-up the human?
Conceptual schematic – Bryan Reimer
Calls for Monitoring Driver State & Distraction are Not New to Automation
• The basic concerns & concepts are not new…
Detecting Driver State
[Diagram – sensing channels for detecting driver state:]
• Bio-metrics: Heart Rate • Brain Waves • Blood Pressure • Respiration • Skin Conductance
• Visual Attention: Gaze Concentration • Gaze Direction • PERCLOS • Pupillometry
• Emotion: Face Recognition • Voice
• Environment: Weather • Traffic • Road Geometry
• Vehicle Performance: Acceleration • Braking • Wheel Movement • Lane Discipline
• Driving Style: Aggressive • Defensive • Cautious
• Driving Behavior: Following Distance • Lane Changes • Speed
Diagram from: Coughlin, J.F., Reimer, B., & Mehler, B. (2011). Monitoring, managing, and motivating driver safety and well-being. IEEE Pervasive Computing, 10(3), 14-21.
• But, both the needs and our capabilities are growing …
o Fatigue
o Drowsiness
o Alcohol impairment
o Distraction
o Workload management
o Health & wellness
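One way to make the diagram's taxonomy concrete is as a structured driver-state record that a monitoring system might populate each sampling interval. A minimal sketch in Python; the class and field names are illustrative assumptions (not a published schema), and only a few channels per category are shown:

```python
from dataclasses import dataclass
from typing import List

# Illustrative grouping of sensing channels from the Coughlin, Reimer &
# Mehler (2011) diagram. Field names are assumptions for illustration only.

@dataclass
class DriverStateSample:
    # Bio-metrics
    heart_rate_bpm: float = 0.0
    skin_conductance_uS: float = 0.0
    # Visual attention
    gaze_direction_deg: float = 0.0
    perclos: float = 0.0              # fraction of time the eyes are closed
    # Vehicle performance
    lane_offset_m: float = 0.0
    braking: bool = False

    def channels(self) -> List[str]:
        """Names of the monitored channels in this simplified record."""
        return [f.name for f in self.__dataclass_fields__.values()]
```

A fuller system would add the environment, driving-style, and driving-behavior categories and fuse them into state estimates such as fatigue, drowsiness, or distraction.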
Driver State Monitoring: Something new, something old, something new…
“The driver should be seen as an active component in a state detection feedback system. In addition to priming vehicle safety systems, information on driver state can cue drivers to modify their behavior and arousal level as well as triggering support systems to increase alertness…”
Conceptual summary of AgeLab vision & work between 2004 & 2009 for Ford Motor Company / Volvo Cars
The AHEAD Consortium
Formal project start: June 2013
Focus on broadening scientifically valid perspectives and methodologies for the objective measurement of demand placed on drivers by in-vehicle systems and technologies
Early emphasis on:
• Developing a framework in which HMI designers could evaluate demand across multiple dimensions (e.g., visual, auditory, haptic, vocal, manual), taking into consideration the relative cost / benefit interactions of various input, output, and processing modalities to find an optimal balance that minimizes impact on the primary driving task
• Understanding the role of spatial and temporal characteristics of a task
• Considering interactions between non-driving tasks and the broader operating environment
An evolving aim of AHEAD has been to move the language of assessment from one focused on distraction, to one that emphasizes driver attention management and safe operation, such that demands on the driver, active safety systems, and other higher order forms of automation can be considered as a whole.
Advanced Human Factors Evaluator for Attentional Demand
Aptiv Veoneer
The Advanced Vehicle Technology Consortium
Focus: To collect and analyze objective data that characterizes the behavioral and safety benefit of advanced driver assistance systems, higher levels of automation, and other production in-vehicle technologies under real-use conditions
A collaborative undertaking by OEMs, suppliers, the insurance industry, and consumer advocacy entities
Membership: Affectiva, Agero, Aptiv, Audi / AID, Consumer Reports, Google, Insurance Institute for Highway Safety (IIHS), Jaguar Land Rover, J.D. Power, Lear, Liberty Mutual Insurance, Progressive, Toyota, Travel Centers of America, Veoneer, Volvo, Zenuity & TBD
Looking Beyond the Technology Towards Consumer Understanding
To develop: An understanding of system performance and how drivers adapt to, use (or do not use), and behave with advanced vehicle technologies
[Diagram: overlapping domains – Environment, Technology, Human]
Founded: 2016 by MIT AgeLab, Touchstone Evaluations & Agero
• Two-pronged study:
  - Users in their own cars (1 year+)
  - MIT-owned vehicles (1 month)
• Current vehicles:
  - Tesla Models S & X
  - Range Rover Evoque
  - Volvo S90 (Pilot Assist)
  - Cadillac CT6 (Super Cruise)
  - Tesla Model 3 (added Dec 2019)
• Total miles in dataset: 511,638 (March 2019)
• Approximately 1000 miles of multi-camera HD video, audio, GPS, accelerometer and CAN data is being added per day.
Investigating Advanced Technology Use in the Wild
What if there is over-trust, and the eyes are not on the road?
Driver Intervention Needed
Critical Glance Behavior
• Did over-trust lead to deficient monitoring by the driver and an almost-too-close event?
• While long single off-road glances are safety concerns, is the overall pattern also important?
DO NOT RECORD – Please Respect Participant Privacy
Video Removed for Participant Privacy
Video shows participant operating vehicle under Autopilot. Eyes off-road for extended period of time, looking at electronic device in lap that he is engaged in adjusting / attempting to repair. Defeats steering wheel monitoring feature by bracing steering wheel with his legs. Comes upon construction site with small road cones gradually closing off lane of travel – apparently not detected by automated system. Participant happens to look up at seemingly the last possible moment to take over control and steer into adjacent lane.
Awareness Can Be Dependent Upon More than the Last Glance
[Figure: The Attention Buffer Concept – a timeline (seconds -10 to 0) plotting SA level / decay (0–2) against glance location (FRD = forward roadway), with glances to the FRD, glances away from the FRD, and glances to the speedometer/mirror marked; off-road glances produce loss of SA, on-road glances rebuild SA.]
Depth of buffer reflects the sampling rate required to keep the vehicle in lane (every 1.8–2 s; cf. Senders, 1967). Note: this potentially changes with ADAS support & various levels of automation.
While the driver's focus is away from the road, awareness of the road situation declines. The driver needs to look back at the road and attend to it in order to refresh their awareness of the road situation.
Figure adapted & extended from Kircher & Ahlstrom (2009). AHEAD consortium efforts have extended the base rules shown here, drawing on re-analysis of existing external datasets, MIT- and AHEAD-collected datasets, and experimental studies. See Seppelt et al. (2017) and other papers in the Appendix.
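The buffer dynamics in the figure can be sketched in code. This is a simplified illustration only: the 2 s capacity echoes the 1.8–2 s sampling interval noted above, and the glance categories and the rule that mirror/speedometer glances hold the buffer constant are loose assumptions in the spirit of AttenD (Kircher & Ahlstrom, 2009), not the extended AHEAD rule set.

```python
# Simplified sketch of an attention-buffer update in the spirit of AttenD
# (Kircher & Ahlstrom, 2009). Parameter values and the handling of
# mirror/speedometer glances are illustrative assumptions.

MAX_BUFFER = 2.0  # seconds of situation awareness "in the bank"

def update_buffer(buffer: float, glance: str, dt: float) -> float:
    """Advance the buffer by dt seconds given the current glance target."""
    if glance == "forward":
        # On-road glance: awareness is refreshed, buffer refills
        return min(MAX_BUFFER, buffer + dt)
    if glance in ("mirror", "speedometer"):
        # Driving-related glance: buffer held constant (simplification)
        return buffer
    # Off-road glance: awareness decays, buffer drains
    return max(0.0, buffer - dt)

def run(glances):
    """Trace the buffer over a sequence of (glance_target, duration_s) pairs."""
    buffer = MAX_BUFFER
    trace = []
    for glance, dt in glances:
        buffer = update_buffer(buffer, glance, dt)
        trace.append(round(buffer, 2))
    return trace
```

For the glance sequence forward → phone → mirror → forward, `run([("forward", 2.0), ("phone", 1.5), ("mirror", 0.5), ("forward", 1.0)])` traces the buffer staying full, draining during the off-road glance, holding, and rebuilding: `[2.0, 0.5, 0.5, 1.5]`. A buffer at 0 marks the kind of depleted-awareness pattern the slide argues matters beyond any single long off-road glance.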
What if the eyes are on the road ahead, but the mind is not?
Detecting Driver State
[Diagram repeated from the earlier "Detecting Driver State" slide: bio-metrics, visual attention, emotion, environment, vehicle performance, driving style, driving behavior]
• Cognitive workload?
• Mind wandering? Lost in thought?
• Look but do not see?
Diagram from: Coughlin, J.F., Reimer, B., & Mehler, B. (2011). Monitoring, managing, and motivating driver safety and well-being. IEEE Pervasive Computing, 10(3), 14-21.
What if the eyes are on the road, but the mind is not?
AN AEB EVENT
Looking but not seeing?
• What would have happened if this behavior occurred at a time when the automation was unable to detect a safety-critical situation?
DO NOT RECORD – Please Respect Participant Privacy
Video Removed for Participant Privacy
Video shows participant mostly oriented / staring in direction of forward roadway, interspersed with a glance or two toward the rearview mirror. Appears absorbed in thought, possibly ruminating about something. Comes upon construction site; vehicles in front begin gradually slowing, brake lights come on. Even though participant continues to stare at roadway, no apparent awareness of slowing traffic. Car immediately ahead comes to a full stop. AEB engages in participant vehicle, preventing crash. Participant clearly startled and surprised at what has just happened.
The Need for True Collaborative Automation: Key Takeaways
• Distraction is the new normal - we have every reason to expect it to increase under automation (L2-L3)
  o If driver monitoring is useful under manual driving, it has at least as much potential under automation
  o Just as automation needs human back-up, automation should back-up the human in managing their attention
• Distraction is more than the classic eyes-off-road time
  o Taking into account how glance behavior is threaded together both off and on the road over time is likely to provide a more accurate measure of risk than the off-road glance metrics used in current (U.S.) guidelines
  o Simply looking on-road is not enough - we need to detect cognitive workload, mind-wandering, and look-but-do-not-see states
• Automated systems offer the promise of increased safety, but…
  o Increased display of system state information and an increasing number of controls to support multiple modes of operation can increase supervisory demand, increasing eyes-off-road time (good HMI design is essential)
  o Lack of understanding of the human's role and system limitations leads to error - training & responsible marketing are needed
  o Active monitoring of driver attention may be the solution / cost for the "right" to operate a vehicle
• Automation is here, offers potential benefits, and human-centered design & support need to be encouraged
  o Guidelines / regulation should be flexible (EuroNCAP) to encourage innovation
  o A "level playing field" is needed - companies that invest in appropriate technology should not be at a disadvantage
Is Collaboration the Best Course on the Road to Collaborative Automation?
“Following efforts in Europe, and using a framework similar to the NHTSA-IIHS collaborative industry agreement on AEB, could manufacturers and the government come together to take another step forward in safety by collaboratively agreeing to install camera-based driver monitoring systems to work alongside collaborative driving features?”
Discussion / Questions
For Further Follow-up:
Bruce Mehler [email protected]
Bryan Reimer [email protected]
Appendix: Selected References
Abdic, I., Fridman, L., McDuff, D., Marchi, E., Reimer, B., & Schuller, B. (2016). Driver frustration detection from audio and video in the wild. 2016 International Joint Conference on Artificial Intelligence (IJCAI). New York.
Coughlin, J.F., Reimer, B., & Mehler, B. (2011). Monitoring, managing, and motivating driver safety and well-being. IEEE Pervasive Computing, 10(3), 14-21.
Fridman, L., Reimer, B., Mehler, B. & Freeman, B. (2018, April). Cognitive load estimation in the wild. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. ACM.
Lee, J.B., Sawyer, B.D., Mehler, B., Angell, L., Seppelt, B.D., Fridman, L., & Reimer, B. (2017, January). Linking the detection response task and the AttenD algorithm through the assessment of human-machine interface workload. Proceedings of the Transportation Research Board 96th Annual Meeting, Washington D.C., January 8-12, 2017. (Updated version published in TRR as doi: 10.3141/2663-11)
Mehler, B., & Reimer, B. (2019, June). How demanding is “just driving”? A cognitive workload – psychophysiological reference evaluation. Proceedings of the 10th International Driving Symposium on Human Factors in Driver Assessment, Training, and Vehicle Design, Santa Fe, New Mexico, June 24-27, 2019, pp. 363-369.
Mehler, B., Reimer, B. & Coughlin, J.F. (2012). Sensitivity of physiological measures for detecting systematic variations in cognitive demand from a working memory task: an on-road study across three age groups. Human Factors, 54(3), 396-412.
Mehler, B., Reimer, B., Coughlin, J.F., & Dusek, J.A. (2009). The impact of incremental increases in cognitive workload on physiological arousal and performance in young adult drivers. Transportation Research Record: Journal of the Transportation Research Board, 2138, 6-12. doi:10.3141/2138-02
Mehler, B., Kidd, D., Reimer, B., Reagan, I., Dobres, J. & McCartt, A. (2016). Multi-modal assessment of on-road demand of voice and manual phone calling and voice navigation entry across two embedded vehicle systems. Ergonomics, 59(3), 344-367. doi:10.1080/00140139.2015.1081412
Muñoz, M., Reimer, B., Lee, J., Mehler, B., & Fridman, L. (2016). Distinguishing patterns in drivers’ visual attention allocation using Hidden Markov Models. Transportation Research Part F: Traffic Psychology and Behaviour, 43, 90-103. doi:10.1016/j.trf.2016.09.015
Reimer, B., Mehler, B., Wang, Y., & Coughlin, J.F. (2012). A field study on the impact of variations in short term memory demands on drivers’ visual attention and driving performance across three age groups. Human Factors, 54(3), 454-468.
Seppelt, B., Seaman, S., Lee, J., Angell, L.S., Mehler, B., & Reimer, B. (2017). Glass half-full: use of on-road glance metrics to differentiate crashes from near-crashes in the 100-car data. Accident Analysis & Prevention, 107, 48-62. doi:10.1016/j.aap.2017.07.021
Seaman, S., Lee, J., Seppelt, B., Angell, L., Mehler, B. & Reimer, B. (2017). It’s all in the timing: using the AttenD algorithm to assess texting in the NEST naturalistic driving database. Proceedings of the 9th International Driving Symposium on Human Factors in Driver Assessment, Training, and Vehicle Design.
Appendix: Additional Papers Cited
Kircher, K., & Ahlström, C. (2009). Issues related to the driver distraction detection algorithm AttenD. First International Conference on Driver Distraction and Inattention, Gothenburg, Sweden.
Senders, J. W. (1967). On the distribution of attention in a dynamic environment. Acta Psychologica, 27, 349-354.