Ethics and Autonomous Vehicles - TECHLAV (techlav.ncat.edu/seminars/2017/2017-03-31 Schuelke-Leech...)
Ethics and Autonomous Vehicles
Beth-Anne Schuelke-Leech
March 31, 2017
Agenda
• Ethics in Technology Development
• Trolley Problem
• Empirical Research
• Implications
We are all Ethical, Aren’t We?
Trolley Problem
Trolley Problem and Autonomous Systems
• Engineers and Computer Scientists are
programming vehicles to respond to different
situations
• How do we know that they are programming the
vehicles to respond the way that we would in the
same situation?
• What does “safety” mean in a vehicle?
– Safety to the occupant?
– Minimizing injuries? To whom?
Trolley Problem and Autonomous Systems
• Who is really making the decisions in an
autonomous system?
• What about an “intelligent”, learning system?
• Can we teach robots and autonomous systems to
be moral? To make ethical decisions about life
and death?
• We better figure out how!
Autonomous/Self-driving/Driverless Vehicles
Human-Centric to Techno-Centric System
• Until now, the operation of an automobile was
taken to be the responsibility of the
driver/operator.
• Liability was limited for the OEMs
• With an autonomous system, who holds the
liability and responsibility for problems?
• What about with semi-autonomous systems?
When does the trade-off occur between human
responsibility and machine responsibility?
Research
• Looking into answering these questions through
text data analytics
Text Data Analytics
• Analysis of Unstructured Data
• Mixed quantitative and qualitative analysis, looking at
content and context to understand how different
stakeholders are discussing relevant issues
Federal AV Policy Guidance
Section 11: Ethical Considerations
• Various decisions made by an HAV’s computer
“driver” will have ethical dimensions or implications. Different outcomes for different road users may flow from the same real-world circumstances depending on the choice made by an HAV computer, which, in turn, is determined by the programmed decision rules or machine learning procedures.
• (This discussion is intended only to introduce the relevance and importance of ethical considerations to the development and deployment of HAVs. It is not intended to be exhaustive or definitive, or to answer ethical questions, but rather only to raise the general topic of ethics as worthy of discussion and consideration by manufacturers, consumers, government, and other stakeholders.)
Federal AV Policy Guidance
• Even in instances in which no explicit ethical
rule or preference is intended, the programming
of an HAV may establish an implicit or inherent
decision rule with significant ethical
consequences. Manufacturers and other entities, working cooperatively with regulators and other stakeholders (e.g.,
drivers, passengers and vulnerable road users),
should address these situations to ensure that such ethical judgments and decisions are made consciously and intentionally.
• Three reasonable objectives of most vehicle operators are safety,
mobility, and legality. In most instances, those three objectives
can be achieved simultaneously and without conflict. In some
cases, achievement of those objectives may come into conflict.
For example, most States have a law prohibiting motor vehicles
from crossing a double-yellow line in the center of a roadway.
• When another vehicle on a two-lane road is double-parked or
otherwise blocking a vehicle’s travel lane, the mobility objective
(to move forward toward an intended destination) may come
into conflict with safety and legality objectives (e.g., avoiding risk
of crash with oncoming car and obeying a law).
• An HAV confronted with this conflict could resolve it in a few
different ways, depending on the decision rules it has been
programmed to apply, or even settings applied by a human driver
or occupant.
Federal AV Policy Guidance
• Similarly, a conflict within the safety objective
can be created when addressing the safety of
one car’s occupants versus the safety of another
car’s occupants. In such situations, it may be that
the safety of one person may be protected only
at the cost of the safety of another person [the Trolley Problem].
• In such a dilemma situation, the programming
of the HAV will have a significant influence
over the outcome for each individual involved.
Federal AV Policy Guidance
• Since these decisions potentially impact not only the automated vehicle and its occupants but also surrounding road users, the resolution to these conflicts should be broadly acceptable. Thus, it is important to consider whether HAVs are required to apply particular decision rules in instances of conflicts between safety, mobility, and legality objectives. Algorithms for resolving these conflict situations should be developed transparently using input from Federal and State regulators, drivers, passengers and vulnerable road users, and taking into account the consequences of an HAV’s actions on others.
Federal AV Policy Guidance
• Highway and road maintenance (visual cues)
• Technology standards
• NHTSA
• Research funding
• Elected representatives accountable to citizens,
overseeing public administration
• Set guidelines and frameworks for federal rules and
regulations
Congress and Transportation
Session | Year 1 | Year 2 | Number of Files | Number of Tokens (words)
114 | 2015 | 2016 | 5,483 | 167,500,507
113 | 2013 | 2014 | 10,566 | 258,312,326
112 | 2011 | 2012 | 11,591 | 305,037,514
111 | 2009 | 2010 | 13,510 | 325,823,903
110 | 2007 | 2008 | 15,630 | 341,006,347
109 | 2005 | 2006 | 14,979 | 327,804,803
108 | 2003 | 2004 | 12,754 | 328,706,743
107 | 2001 | 2002 | 11,701 | 474,711,027
Average | | | 12,027 | 316,112,896
Standard Deviation | | | 3,515 | 85,950,046
Total | | | 96,214 | 2,528,903,170
U.S. Congress
Administration | Years | Number of Files | Number of Tokens (words)
Clinton | 1995-2000 | 52,889 | 949,667,255
Bush 1 | 2001-2004 | 26,725 | 761,923,475
Bush 2 | 2005-2008 | 50,034 | 755,254,671
Obama 1 | 2009-2012 | 77,918 | 979,189,724
Obama 2 | 2013-2015 | 44,445 | 705,398,144
U.S. PA
Subcorpus | Years | Number of Files | Number of Tokens (words)
Engineers | 2010-2015 | 15,841 | 14,097,589
Entrepreneurs | 2010-2015 | 10,545 | 4,525,476
Manufacturing | 2010-2015 | 55,516 | 23,692,746
Gen. Business | 2010-2015 | 88,814 | 67,644,217
Silicon Valley | 2010-2015 | 102,298 | 56,746,343
Social | 2010-2015 | 76,597 | 82,802,467
Industry CC3 | 2011-2015 | 66,692 | 477,037,318
Industry
Are ethics part of the development process?
• We have to narrow down the research question
• Main research questions:
1. Are ethics incorporated into the discussion of
autonomous vehicles?
2. Are ethics incorporated into discussions of
technologies more broadly?
3. What do these conversations look like?
4. What do they tell us?
Ontology of Ethics
1. Legal compliance
2. Moral conduct (i.e., behavior, duty)
3. Religious dictates
4. Management requirement
What is the relationship of ethics to technology?
• Ethics with respect to technology is a very small
conversation
• Innovators talk about it more than other groups.
However, it is mostly related to IT and privacy,
not vehicles.
How do they talk about ethics?
• Prepared a subcorpus of just the ethics-related
discussions: texts had to include at least 3 marker
occurrences within a 250-word chunk.
• Ethics to Innovators is a theoretical concept
• To Public Administrators, it is tangible,
compliance, rule-based, specific
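The subcorpus construction described above can be sketched in a few lines. This is a minimal illustration: the 250-word chunk size and 3-occurrence threshold come from the slide, while the function name and tokenizer are assumptions, not the actual pipeline used.

```python
import re

ETHICS_MARKERS = {"ethics", "ethical", "unethical"}  # the ethics marker set

def ethics_chunks(text, chunk_size=250, min_hits=3):
    """Split a document into fixed-size word chunks and keep those
    containing at least `min_hits` ethics-marker occurrences."""
    words = re.findall(r"[a-z']+", text.lower())
    kept = []
    for i in range(0, len(words), chunk_size):
        chunk = words[i:i + chunk_size]
        hits = sum(1 for w in chunk if w in ETHICS_MARKERS)
        if hits >= min_hits:
            kept.append(" ".join(chunk))
    return kept
```

A chunk with only one or two marker hits is discarded, so the subcorpus contains only passages where ethics is a sustained topic rather than a passing mention.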
How do they talk about ethics?
Innovators (top 60 phrases):
1. rate of giving; 2. percent of consumers; 3. chief supply chain; 4. the innovation process; 5. the critical question; 6. open innovation activities; 7. the creative industries; 8. clients should use; 9. the practice challenge; 10. the leadership challenge; 11. supply chain officer; 12. best practices in; 13. permalink trackback 0; 14. research and innovation; 15. in engineering education; 16. ethical supply chains; 17. open innovation is; 18. of non millennials; 19. the compliance and; 20. self driving cars; 21. of compliance and; 22. ceb compliance ethics; 23. of the innovation; 24. must embrace aspirational; 25. from the ceo; 26. engineers must embrace; 27. embrace aspirational ethics; 28. business ethics and; 29. a supply chain; 30. the most ethical; 31. supply chain transparency; 32. open innovation and; 33. of the organisation; 34. of a product; 35. in supply chains; 36. foreign corrupt practices; 37. ethical behavior and; 38. anti money laundering; 39. supply chains are; 40. social networking sites; 41. new year's resolutions; 42. how people use; 43. for engineering ethics; 44. corporate social responsibility; 45. by giving them; 46. and compliance risks; 47. when faced with; 48. the securities and; 49. the impact on; 50. self driving car; 51. of operational risk; 52. management supply chain; 53. is becoming more; 54. in real time; 55. fraud and corruption; 56. corrupt practices act; 57. and best practices; 58. a product or; 59. ethics and compliance; 60. unwritten laws of
Public Administration (top 60 phrases):
1. proposed rule change; 2. the contracting officer; 3. the postal service; 4. administrative law judge; 5. united states postal; 6. the general counsel; 7. states postal service; 8. this final rule; 9. the final rule; 10. national institutes of; 11. institutes of health; 12. the assistant secretary; 13. under this section; 14. this proposed rule; 15. of homeland security; 16. state or local; 17. described in paragraph; 18. under this part; 19. department of homeland; 20. in paragraph c; 21. that the proposed; 22. terms and conditions; 23. have a significant; 24. of this notice; 25. a notice of; 26. act of 1995; 27. code of federal; 28. of federal regulations; 29. in paragraph a; 30. paragraph b of; 31. to the commission; 32. and drug administration; 33. that the commission; 34. requirements of this; 35. regulatory flexibility act; 36. by the board; 37. federal acquisition regulation; 38. has determined that; 39. paragraph c of; 40. will be considered; 41. in compliance with; 42. and urban development; 43. housing and urban; 44. of the date; 45. this section and; 46. 30 days after; 47. as specified in; 48. in paragraph b; 49. under executive order; 50. with the commission; 51. with the provisions; 52. policies and procedures; 53. the paperwork reduction; 54. securities exchange act; 55. notice of proposed; 56. of this subpart; 57. of the securities; 58. collection of information; 59. assistant secretary for; 60. authority citation for
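Ranked phrase lists like those above are typically produced by counting word trigrams across each subcorpus. A minimal sketch follows; the function name and tokenization are my own, not the actual text-analytics pipeline used in the study.

```python
from collections import Counter
import re

def top_trigrams(texts, n=10):
    """Count word trigrams across a collection of texts and return
    the n most frequent, mirroring the ranked phrase lists shown
    for each subcorpus."""
    counts = Counter()
    for text in texts:
        words = re.findall(r"[a-z']+", text.lower())
        # Each overlapping window of three consecutive words is one trigram.
        counts.update(zip(words, words[1:], words[2:]))
    return [" ".join(t) for t, _ in counts.most_common(n)]
```

Contrasting the top-ranked trigrams of two subcorpora, as the slide does, then shows at a glance how differently each group frames the same topic.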
Ethics Marker Set
• Ethics
• Ethical
• Unethical
Session | Number of Files | Number of Tokens | Files with Ethics Occurrences | Number of Occurrences | % of Files | Per Million Tokens
114 | 5,483 | 167,500,507 | 871 | 5,921 | 15.9 | 35.3
113 | 10,566 | 258,312,326 | 1,325 | 8,079 | 12.45 | 31.1
112 | 11,591 | 305,037,514 | 1,535 | 10,593 | 13.24 | 34.7
111 | 13,510 | 325,823,903 | 1,893 | 13,258 | 14.02 | 40.6
110 | 15,630 | 341,006,347 | 1,864 | 13,786 | 12.05 | 40.8
109 | 14,979 | 327,804,803 | 1,669 | 14,380 | 11.11 | 43.7
108 | 12,754 | 328,706,743 | 1,537 | 11,269 | 12.04 | 34.2
107 | 11,701 | 474,711,027 | 2,224 | 17,458 | 19.07 | 36.7
Ethics Discussions in Congress
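The "% of Files" and "Per million tokens" columns in these tables are simple normalizations. A sketch, using the session 114 row as a check (the function names are assumptions; the numbers come from the table):

```python
def file_share(files_with_hits, total_files):
    """Percentage of files containing at least one marker hit."""
    return round(100 * files_with_hits / total_files, 1)

def rate_per_million(occurrences, total_tokens):
    """Marker occurrences normalized per million corpus tokens."""
    return round(1_000_000 * occurrences / total_tokens, 1)

# Session 114: 871 of 5,483 files, 5,921 hits in 167,500,507 tokens
print(file_share(871, 5_483))                # → 15.9
print(rate_per_million(5_921, 167_500_507))  # → 35.3
```

Normalizing per million tokens is what makes corpora of very different sizes (e.g., Congress vs. the Entrepreneurs subcorpus) comparable in the tables that follow.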
Corpus | Number of Files | Number of Tokens | Files with Ethics Occurrences | Number of Occurrences | % of Files | Per Million Tokens
Clinton | 52,889 | 949,667,255 | 2,794 | 26,383 | 5.3 | 27.8
Bush 1 | 26,725 | 761,923,475 | 2,264 | 21,272 | 8.5 | 27.9
Bush 2 | 50,034 | 755,254,671 | 2,778 | 19,832 | 5.6 | 26.3
Obama 1 | 77,918 | 979,189,724 | 3,833 | 26,318 | 4.9 | 26.9
Obama 2 | 44,445 | 705,398,144 | 2,090 | 18,689 | 4.7 | 26.5
Industry 3 | 66,692 | 477,037,318 | 752 | 1,072 | 1.1 | 2.2
Engineers | 15,841 | 14,097,589 | 397 | 844 | 2.5 | 59.9
Entrepreneurs | 10,545 | 4,525,476 | 93 | 186 | 0.9 | 41.1
Manufacturing | 55,516 | 23,692,746 | 609 | 1,084 | 1.1 | 45.8
Gen. Business | 88,814 | 67,644,217 | 1,498 | 3,264 | 1.7 | 48.3
Silicon Valley | 102,298 | 56,746,343 | 1,045 | 1,875 | 1.0 | 33.0
Social | 76,597 | 82,802,467 | 3,533 | 6,943 | 4.6 | 78.4
Ethics
Driver Responsibility Marker Set
• distracted driver
• drive* /5 negligence
• drive* /5 responsibility
• drive* /5 responsible
• drive* /5 text*
• driving /3 drunk
• driving /3 impaired
• driving under the influence
• drunk drive*
• drive* /5 at fault
• driving /5 at fault
• human error
• individual /3 responsibility
• negligence /5 drive*
• owner /3 responsibility
• operator /3 responsibility
• reasonable driver
• reckless driv*
• text* /5 driv*
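Queries such as "drive* /5 negligence" combine a prefix wildcard with a within-N-words proximity operator. The following is a minimal interpreter for this query style, a sketch only and not the actual concordancing software used for the study:

```python
import re

def matches(text, left, right, window):
    """True if a word matching `left` occurs within `window` words of a
    word matching `right` (in either order). A trailing * is a prefix
    wildcard, as in the query "drive* /5 negligence"."""
    def pattern(term):
        # A trailing * matches any word continuation; otherwise exact word.
        if term.endswith("*"):
            return re.compile(re.escape(term[:-1]) + r"\w*$")
        return re.compile(re.escape(term) + r"$")
    lp, rp = pattern(left), pattern(right)
    words = re.findall(r"[\w']+", text.lower())
    li = [i for i, w in enumerate(words) if lp.match(w)]
    ri = [i for i, w in enumerate(words) if rp.match(w)]
    return any(abs(i - j) <= window for i in li for j in ri)
```

For example, matches("the driver showed clear negligence", "drive*", "negligence", 5) returns True, since "driver" matches the drive* wildcard and sits three words from "negligence". Note that "driving" does not match drive*, which is why the marker set lists driving-based patterns and driv* separately.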
Session | Number of Files | Number of Tokens | Files with Occurrences | Number of Occurrences | % of Files | Per Million Tokens
114 | 5,483 | 167,500,507 | 240 | 625 | 4.4 | 3.7
113 | 10,566 | 258,312,326 | 395 | 719 | 3.7 | 2.8
112 | 11,591 | 305,037,514 | 453 | 963 | 3.9 | 3.2
111 | 13,510 | 325,823,903 | 552 | 1,564 | 4.1 | 4.8
110 | 15,630 | 341,006,347 | 434 | 732 | 2.8 | 2.1
109 | 14,979 | 327,804,803 | 421 | 866 | 2.8 | 2.6
108 | 12,754 | 328,706,743 | 413 | 888 | 3.2 | 2.7
107 | 11,701 | 474,711,027 | 667 | 1,620 | 5.7 | 3.4
Driver Responsibility
Corpus | Number of Files | Number of Tokens | Files with Occurrences | Number of Occurrences | % of Files | Per Million Tokens
Clinton | 52,889 | 949,667,255 | 1,013 | 2,446 | 2.6 | 1.9
Bush 1 | 26,725 | 761,923,475 | 718 | 1,794 | 2.7 | 2.4
Bush 2 | 50,034 | 755,254,671 | 826 | 1,974 | 1.7 | 2.6
Obama 1 | 77,918 | 979,189,724 | 1,144 | 3,277 | 1.5 | 3.3
Obama 2 | 44,445 | 705,398,144 | 762 | 2,196 | 1.7 | 3.1
Industry 3 | 66,692 | 477,037,318 | 67 | 76 | 0.2 | 0.3
Engineers | 15,841 | 14,097,589 | 109 | 148 | 0.7 | 10.5
Entrepreneurs | 10,545 | 4,525,476 | 10 | 13 | 0.1 | 2.9
Manufacturing | 55,516 | 23,692,746 | 215 | 304 | 0.4 | 12.8
Gen. Business | 88,814 | 67,644,217 | 209 | 243 | 0.2 | 3.6
Silicon Valley | 102,298 | 56,746,343 | 329 | 465 | 0.3 | 8.2
Social | 76,597 | 82,802,467 | 223 | 245 | 0.3 | 3.0
Driver Responsibility
Driverless Marker Set
• autonomous car
• autonomous cars
• autonomous vehicle*
• car /5 autonomous
• cars /5 autonomous
• car /5 driverless
• cars /5 driverless
• car /5 self-driving
• cars /5 self-driving
• driverless car
• driverless cars
• driverless vehicl*
• google car
• google cars
• intelligent vehicl*
• self-driving car
• self-driving cars
• self-driving vehicle*
• smart car
• smart cars
• smart vehicle*
• unmanned car
• unmanned cars
• vehicle* /5 autonomous
• vehicle* /5 driverless
• vehicle* /5 self-driving
Session | Number of Files | Number of Tokens | Files with Occurrences | Number of Occurrences | % of Files | Per Million Tokens
114 | 5,483 | 167,500,507 | 107 | 894 | 2.0 | 5.3
113 | 10,566 | 258,312,326 | 71 | 300 | 0.7 | 1.2
112 | 11,591 | 305,037,514 | 39 | 68 | 0.3 | 0.2
111 | 13,510 | 325,823,903 | 31 | 56 | 0.2 | 0.2
110 | 15,630 | 341,006,347 | 34 | 53 | 0.2 | 0.2
109 | 14,979 | 327,804,803 | 53 | 93 | 0.4 | 0.3
108 | 12,754 | 328,706,743 | 50 | 126 | 0.4 | 0.4
107 | 11,701 | 474,711,027 | 76 | 358 | 0.6 | 0.8
Driverless
Corpus | Number of Files | Number of Tokens | Files with Occurrences | Number of Occurrences | % of Files | Per Million Tokens
Clinton | 52,889 | 949,667,255 | 108 | 261 | 0.2 | 0.3
Bush 1 | 26,725 | 761,923,475 | 79 | 169 | 0.3 | 0.2
Bush 2 | 50,034 | 755,254,671 | 76 | 178 | 0.2 | 0.2
Obama 1 | 77,918 | 979,189,724 | 79 | 169 | 0.1 | 0.2
Obama 2 | 44,445 | 705,398,144 | 108 | 261 | 0.2 | 0.4
Industry 3 | 66,692 | 477,037,318 | 143 | 246 | 0.2 | 0.5
Engineers | 15,841 | 14,097,589 | 2,095 | 7,279 | 13.23 | 516.3
Entrepreneurs | 10,545 | 4,525,476 | 17 | 46 | 0.2 | 10.2
Manufacturing | 55,516 | 23,692,746 | 404 | 1,413 | 0.7 | 59.6
Gen. Business | 88,814 | 67,644,217 | 708 | 1,938 | 0.8 | 28.7
Silicon Valley | 102,298 | 56,746,343 | 2,144 | 6,317 | 2.1 | 111.3
Social | 76,597 | 82,802,467 | 170 | 368 | 0.2 | 4.4
Driverless
Prox Ethics and Driverless within 10 words
Session | Number of Files | Number of Tokens | Files with Occurrences | Number of Occurrences | % of Files | Per Million Tokens
114 | 5,483 | 167,500,507 | 0 | 0 | 0.0 | 0.0
113 | 10,566 | 258,312,326 | 0 | 0 | 0.0 | 0.0
112 | 11,591 | 305,037,514 | 0 | 0 | 0.0 | 0.0
111 | 13,510 | 325,823,903 | 0 | 0 | 0.0 | 0.0
110 | 15,630 | 341,006,347 | 0 | 0 | 0.0 | 0.0
109 | 14,979 | 327,804,803 | 0 | 0 | 0.0 | 0.0
108 | 12,754 | 328,706,743 | 0 | 0 | 0.0 | 0.0
107 | 11,701 | 474,711,027 | 0 | 0 | 0.0 | 0.0
Prox Ethics and Driverless
Corpus | Number of Files | Number of Tokens | Files with Occurrences | Number of Occurrences | % of Files | Per Million Tokens
Clinton | 52,889 | 949,667,255 | 0 | 0 | 0.0 | 0.0
Bush 1 | 26,725 | 761,923,475 | 0 | 0 | 0.0 | 0.0
Bush 2 | 50,034 | 755,254,671 | 0 | 0 | 0.0 | 0.0
Obama 1 | 77,918 | 979,189,724 | 0 | 0 | 0.0 | 0.0
Obama 2 | 44,445 | 705,398,144 | 0 | 0 | 0.0 | 0.0
Industry 3 | 66,692 | 477,037,318 | 0 | 0 | 0.0 | 0.0
Engineers | 15,841 | 14,097,589 | 17 | 63 | 0.1 | 4.5
Entrepreneurs | 10,545 | 4,525,476 | 0 | 0 | 0.0 | 0.0
Manufacturing | 55,516 | 23,692,746 | 2 | 2 | 0.0 | 0.1
Gen. Business | 88,814 | 67,644,217 | 6 | 15 | 0.01 | 0.2
Silicon Valley | 102,298 | 56,746,343 | 10 | 13 | 0.01 | 0.2
Social | 76,597 | 82,802,467 | 0 | 0 | 0.0 | 0.0
Prox Ethics and Driverless
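The proximity analysis behind these tables asks whether any ethics marker occurs within 10 words of any driverless marker. A simplified sketch follows; it handles single-word markers only, whereas the real marker sets also include multiword phrases and wildcards.

```python
import re

ETHICS = {"ethics", "ethical", "unethical"}
DRIVERLESS = {"driverless", "self-driving", "autonomous"}  # simplified subset

def cooccur(text, window=10):
    """True if any ethics marker appears within `window` words of any
    driverless marker -- the proximity test behind the tables above."""
    words = re.findall(r"[\w'-]+", text.lower())
    e = [i for i, w in enumerate(words) if w in ETHICS]
    d = [i for i, w in enumerate(words) if w in DRIVERLESS]
    return any(abs(i - j) <= window for i in e for j in d)
```

The near-total absence of such co-occurrences in the tables is itself the finding: the two vocabularies almost never meet in the same breath.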
• Simply talking about “ethics” does not mean that
we agree on what it means
• Innovators and public administrators (regulators)
do not have the same concept of ethics and they
do not talk about it in the same way
How do they talk about ethics?
• To ensure that the ethical implications of
technology and product development are
considered, we need more than just a code of ethics.
• Federal guidelines essentially say that ethics
should be considered where
“resolution(s)…should be broadly acceptable”
(i.e., a reasonable man’s standard?)
• We need to ensure that ethics are an explicit part of the
conversation (“…consciously and intentionally”)
Autonomous/Driverless Vehicles
• Previous research showed that Engineers are
talking a lot about technologies and
opportunities
• But only a little about who is responsible if there
is a problem or failure (they do want a receptive
regulatory environment)
• In fact, technology is assumed to be infallible /
flawless
Oversight implications
• US Congress talks some about technology, but
little about company responsibility, particularly
with respect to cars
• Policy in this area is going to be reactive
• So, who is really thinking about the ethical and
moral responsibilities of product development
and deployment?
• The engineers and product developers
Oversight implications
• Smart vehicles and smart technologies have not yet come into the policy conversation in a substantive way
• Ethical and moral implications do not seem to be a major consideration (in fact, bias towards believing that technologies do not fail)
• Engineers and companies are going to determine the technology trajectory and policy-makers and regulators are going to try to react to it
• Policy-making likely to be led by litigation, rather than collaboration
Implications
• Companies and policymakers need to establish
guidelines for ethical behavior and decisions
• THEN, there needs to be a mechanism for
ensuring that ethics and implications of
decisions are explicitly considered and
incorporated into the design and development
process
• (…unless ethics don’t matter)
Implications
Questions?