Law and Regulation of Machine Intelligence
Prof. Nicolas Petit ©; Twitter: @CompetitionProf
Fall 2016, Bar Ilan University
www.lcii.eu
Prospective issues
 A robot surgeon misdiagnoses your illness: who's liable?
 You lease a robot to a large corporation to milk your herd of cows. The robot kills one cow: who is liable?
 A robot program is hired to select a company's next salesforce: it returns only male profiles and refuses to shortlist female candidates. Is this discrimination?
 A robot spies on your spouse, who is cheating: shall it report? A data protection issue
 Trolley problem: does Google's car run into a kid or an old woman? A kid with a 5% risk of serious injury v a tree with a 99% risk of serious injury for the 4 passengers? A dog v another car?
 A robot creates a new song: who owns it? What if the song sounds similar to a copyrighted work? Who's liable for the infringement?
 Bots: http://motherboard.vice.com/read/the-best-things-a-random-bot-bought-on-the-darknet
 You don't want to drive an autonomous car, but the insurance company refuses to provide a contract: is this acceptable?
 A robot in human form is beaten in the street by police officers
 A right to dignity? But the robot has killed someone: a right to shoot it down?
 The death penalty for bots?
 Rights to procreate, to dignity, to decent funerals?
Asimov’s three laws of robotics (1950)
 Device that is well-suited for work that is too dull, dirty or dangerous for
real humans
 Safety feature introduced in all bots
 LAW 1: A robot may not injure a human being or, through inaction, allow a
human being to come to harm;
 LAW 2: A robot must obey the orders given it by human beings except where
such orders would conflict with the First Law;
 LAW 3: A robot must protect its own existence as long as such protection does
not conflict with the First or Second Laws
 In later fiction where robots had taken responsibility for government of whole
planets and human civilizations, Asimov also added a fourth, or zeroth law, to
precede the others:
 LAW 0: A robot may not harm humanity, or, by inaction, allow humanity to
come to harm
Not science fiction – NHTSA, 4 February 2016
 National Highway Traffic Safety Administration (NHTSA) is US Highway Safety
Agency in charge of enforcing the Federal Motor Vehicle Safety Standards
(FMVSSs)
 Provides certification for new vehicles produced by automotive manufacturers
 A number of FMVSS requirements
 All built around the notions of "driver", "driver's position" or "driver's seating position"
 “FMVSS No. 101 contains requirements for location, identification, color, and
illumination of motor vehicle controls, telltales, and indicators. S5.1.1 requires the
controls listed in Tables 1 and 2 of the standard to be located so that they are operable
by the [belted] driver”.
 “S5.3.1, which states that service brakes shall be activated by means of a foot control”
 49 CFR 571.3 defines “driver” as the occupant of a motor vehicle seated
immediately behind the steering control system.
Google’s “SDS” (Self-Driving System)
 “Google seeks to produce a vehicle that contains L4 automated driving capabilities, and removes
conventional driver controls and interfaces (like a steering wheel, throttle pedal, and brake pedal,
among many other things)”.
 “Expresses concern that providing human occupants of the vehicle with mechanisms to control
things like steering, acceleration, braking, or turn signals, or providing human occupants with
information about vehicle operation controlled entirely by the SDS, could be detrimental to safety
because the human occupants could attempt to override the SDS’s decisions”
 “Google’s design choices in its proposed approach to the SDV raise a number of novel issues in
applying the FMVSSs. Those standards were drafted at a time when it was reasonable to
assume that all motor vehicles would have a steering wheel, accelerator pedal, and brake pedal,
almost always located at the front left seating position, and that all vehicles would be operated
by a human driver. Accordingly, many of the FMVSSs require that a vehicle device or basic
feature be located at or near the driver or the driver’s seating position. For vehicles with an AI
driver that also preclude any occupant from assuming the driving task, these assumptions about
a human driver and vehicle controls do not hold”.
 “Google has asked who or what is to be considered the driver and which seating position is
considered to be the driver’s seating position in its SDV.”
A car has a driver, but the driver need not be human
Options
 1) “NHTSA could interpret the
term “driver” as meaningless for
purposes of Google’s SDV, since
there is no human driver, and
consider FMVSS provisions that refer
to a driver as simply inapplicable to
Google’s vehicle design”;
 2) “NHTSA could interpret
“driver” and “operator” as referring
to the SDS”
NHTSA
 As a foundational starting point for
the interpretations below, NHTSA
will interpret driver in the context of
Google’s described motor vehicle
design as referring to the SDS, and
not to any of the vehicle
occupants.
 If no human occupant of the
vehicle can actually drive the
vehicle, it is more reasonable to
identify the driver as whatever (as
opposed to whoever) is doing the
driving. In this instance, an item of
motor vehicle equipment, the SDS,
is actually driving the vehicle.
Consequences
 “The controls listed in Tables 1 and 2 may simply be operable by
the SDS and need not be located so that they are available to any
of the human occupants of the motor vehicle”.
 For more, see
 http://isearch.nhtsa.gov/files/Google%20--%20compiled%20response%20to%2012%20Nov%20%2015%20interp%20request%20--%204%20Feb%2016%20final.htm
 http://spectrum.ieee.org/cars-that-think/transportation/self-driving/an-ai-can-legally-be-defined-as-a-cars-driver
Schedule
Topic                              Date
Technology and Society             22-11-16
Theory of Regulation; Liability    23-11-16
Robotic Warfare; Market Conduct    25-11-16
Aims of the course
Basic questions
 Should we regulate AIs and
robots?
 If yes, how should we regulate
AIs and robots?
Goals
 Identify problems more than
solutions
 Think of frameworks/methods
to mindmap those issues
 Learn from you
Class I – Technology and Society
1. State of the Art
Examples
 Robotic cars
 Drones
 Hive and swarm robots
An old field, AI and robotics
 1950s: « Game AI », Arthur Samuel's checkers-playing program
 1950: Turing test
 1955: Newell and Simon's « Logic Theorist » proves 38 of the first 52 theorems of Principia Mathematica
 Dartmouth Summer Research Project of 1956
Initial ideas
 Any physical process, including the mind's processes, can be modelled as a computable algorithm (Church-Turing thesis)
 Software is a set of computable algorithms. No reason why it could not reach outcomes similar to those generated by the mind (an artefact)
 Machines can learn: “Learning is any process by which a system improves performance from experience”, Herbert Simon
 The ultimate ambition is to have computers do what humans do well (they already know how to do things humans cannot do): heuristics, seeing, learning
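Simon's definition of learning can be made concrete with a toy learner. The sketch below is plain Python with invented payoff numbers (nothing from the course materials): a system repeatedly chooses between two actions and, by tracking average payoffs, improves its performance from experience.

```python
import random

random.seed(0)

# Toy "environment": two actions with fixed payoffs (invented numbers)
PAYOFF = {"A": 0.3, "B": 0.8}

totals = {"A": 0.0, "B": 0.0}
counts = {"A": 0, "B": 0}

def average(action):
    # the system's performance estimate, built from experience so far
    return totals[action] / counts[action] if counts[action] else 0.0

for step in range(200):
    if step < 2:
        action = ("A", "B")[step]               # try each action once
    elif random.random() < 0.1:
        action = random.choice(("A", "B"))      # keep exploring occasionally
    else:
        action = max(("A", "B"), key=average)   # exploit experience
    counts[action] += 1
    totals[action] += PAYOFF[action]

# With experience, the learner concentrates on the better action "B"
print(counts)
```

After one trial of each action, the running averages steer the learner toward the better option "B": "improving performance from experience" in miniature.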
Milestones
 In 1997, Deep Blue beats Garry Kasparov at chess
 In 2005, Stanford robot wins
DARPA Grand Challenge by driving
autonomously for 131 miles along
unrehearsed desert trail
 In February 2011, in a Jeopardy!
quiz show exhibition match, IBM's
question answering system, Watson,
defeats the two greatest Jeopardy!
champions, Brad Rutter and Ken
Jennings
 In January 2016, Researchers from
Google DeepMind have developed
the first computer able to defeat a
human champion at the board game
Go
Kurzweil, « The Singularity is Near », 2005
 Law of accelerating returns: technology progressing toward the Singularity at an exponential rate (each transition occurs more rapidly than the last)
 Functionality of the brain is quantifiable in terms of technology
 Baby boomers will live long enough to see the Singularity. Nanobots will eventually be able to repair and replace any part of the body that wears out
 Strong Artificial Intelligences and cybernetically augmented humans will become the dominant forms of sentient life on Earth
 Predicted timeline:
 2020s: Personal computers will have the same processing power as human brains
 2030s: Mind uploading becomes possible
 2040s: Human body 3.0 (as Kurzweil calls it) comes into existence; people spend most of their time in full-immersion virtual reality
 2045: The Singularity occurs as artificial intelligences surpass human beings as the smartest and most capable life forms on Earth; the extermination of humanity by violent machines is unlikely
Source: Wikipedia
Why now?
Moravec paradox solved?
 High level reasoning (playing
chess, using mathematics, etc.)
easier than low-level
sensorimotor skills (moving
across space, recognizing speech,
etc.)
 This is because high level
reasoning demands less
computational power
 See Moravec, H. (1998). When
will computer hardware match
the human brain. Journal of
Evolution and Technology, 1.
Technological evolution
 Brute force computational
power now available (due to
Moore’s law)
 Vast troves of data now
available, and distributed
(cloud)
Various subfields of AI, and examples
 Deep learning (a branch of machine learning)
 Algorithms that enable robots to learn tasks through trial and error, using a process that more closely approximates the way humans learn: e.g. spam filtering
 Neural networks
 Emulate the ability of living organisms to integrate perceptual inputs smoothly with motor responses
 Speech recognition
 Uses sound metrics along with domain- and context-specific language to respond to voice commands
 Natural language processing
 Robots interacting and responding through interpretations of natural language instructions
 Artificial vision
 Object recognition, which allows robots to interact with and measure their environment. Can include different features: visual object recognition and tracking, image stabilization, visual servoing, human-to-machine interaction, etc. => recognizing x-rays and MRI scans, battlefield robots recognizing kids, automated cars
 Knowledge representation
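The spam-filtering example above can be illustrated with a minimal supervised learner. The sketch below is a naive Bayes classifier in plain Python over a tiny invented corpus (the messages and labels are hypothetical); production filters apply the same idea to vastly more data and features.

```python
from collections import Counter
import math

# Tiny labelled corpus (invented): 1 = spam, 0 = legitimate mail
train = [
    ("win free money now", 1),
    ("free prize claim now", 1),
    ("meeting agenda for monday", 0),
    ("lunch on monday with the team", 0),
]

def fit(corpus):
    # count word frequencies per label, and how often each label occurs
    word_counts = {0: Counter(), 1: Counter()}
    label_counts = Counter()
    for text, label in corpus:
        label_counts[label] += 1
        word_counts[label].update(text.split())
    return word_counts, label_counts

def predict(text, word_counts, label_counts):
    vocab = set(word_counts[0]) | set(word_counts[1])
    scores = {}
    for label in (0, 1):
        # log prior + Laplace-smoothed log likelihood of each word
        score = math.log(label_counts[label] / sum(label_counts.values()))
        total = sum(word_counts[label].values())
        for word in text.split():
            score += math.log((word_counts[label][word] + 1) / (total + len(vocab)))
        scores[label] = score
    return max(scores, key=scores.get)

word_counts, label_counts = fit(train)
print(predict("claim your free money", word_counts, label_counts))  # -> 1 (spam)
```

The model simply counts word frequencies per label and scores a new message under each label; the higher-scoring label wins. This is the statistical input-to-label rule (with no causal explanation) discussed under supervised learning later in the class.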
Philosophical debate
Turing
 Dialogue with a human and a
machine through teletype
 If a machine could carry on a
conversation (over a teleprinter)
that was indistinguishable from a
conversation with a human being,
then it was reasonable to say that
the machine was "thinking”
 Machine's ability to exhibit
intelligent behavior equivalent to,
or indistinguishable from, that of a
human
 A. M. Turing (1950), “Computing Machinery and Intelligence”, Mind 59: 433-460
Searle
 « Chinese room » argument
 You do not speak Chinese
 You are in a room with two slits, a book, and some scratch paper
 Someone slides Chinese characters through the first slit
 You follow the instructions in the book, correlate characters as instructed, and slide the resulting sheet through the second slit
 It appears you speak Chinese, yet you do not understand a word of Chinese
 No need to understand Chinese to produce the answers. The fact that a computer is able to do it does not establish strong AI
 Weak AI that simulates thought
 http://www.iep.utm.edu/chineser/
Scientific debate
 Is this a computational problem only (Kurzweil)?
 Or is there an algorithm that we do not know?
Challenges (1)
 Supervised learning v unsupervised learning
 A human feeds a neural network with inputs (images) and outputs/labels (face, dog, lamp, etc.), and the AI comes up with a statistical rule that correlates inputs with the correct outputs => no causal explanation + error or reward signal
 The AI does identify patterns
 Pornography example: Justice Potter Stewart, SCOTUS: « I know it when I see it »
 But all this is supervised learning, and labelled data is a finite resource => but Internet providers have pre-labelled data
 Natural language
 « Recognize speech » v « wreck a nice beach »
 Questions to Siri, Wolfram Alpha, etc. generate different responses if different syntax is used
 Disambiguation: « I ate spaghetti with meatballs » v « I ate spaghetti with chopsticks » (see R. Mooney podcast)
 Artificial vision and the towel problem: https://www.youtube.com/watch?v=gy5g33S0Gzo
 Common sense?
 Magellan problem
 Optimizing transportation algorithms
Robotics
 Robotics is a meta-technology
 Robotics is about motion in space
 A robot is an « agencified » AI
 Mechanical engineering issues are not covered here, though they constitute significant limits:
 Power
 Haptic technology
 Motricity
Survey of most relevant applications, RoboLaw,
2014
Related technologies
 Human enhancement (and transhuman sciences)
 Bionics (see J. Fischman, National Geographic, 2010) and prosthetics
 Exoskeletons
 Emulations
 Mind uploading
 Care robots, social robots, etc.
 Augmented reality
Definitional problem
1921, Karel Capek, R.U.R.
 Rossum’s Universal Robots:
Capek’s play makes first use of
the word “robot” to describe an
artificial person
 Capek invented the term, basing it on the Czech word for “forced labor” or “servitude”: http://www.wired.com/2011/01/0125robot-cometh-capek-rur-debut/
2014, Robolaw
 “In fact, the term “robot” can mean
different things to different people,
since there is no agreement on its
meaning neither among professional
users (i.e. roboticists) nor among
laypeople”
Definitions
 “Actuated mechanism programmable in two or
more axes (4.3) with a degree of autonomy (2.2), moving within its
environment, to perform intended tasks”, ISO 8373:2012(en)
 “A robot is a constructed system that displays both physical and
mental agency, but is not alive in the biological sense ”, Neil M.
Richards and William D. Smart, 2016
 “Robots are mechanical objects that take the world in, process
what they sense, and in turn act upon the world”, Ryan Calo,
2015.
ISO 8373:2012(en)
 “a degree of autonomy? ”
 Full autonomy means unexpected decisions in unexpected situations
 “Moving within its environment”?
 Space motion is critical
 Softbots do not move? Do we need a “hard” bot seated behind a computer?
 A human being who makes online transactions does not move, yet may create harm
 “To perform intended tasks”
 Whose intention? Very confusing
 A robot with sufficient autonomy may intentionally resist a harmful third-party order:
 A drone will not drop a bomb on a hospital
 A Google car will not crash into a school
 An industrial bot refuses to operate if there is a safety risk
 Localizing intention: the initial intention or a subsequent command?
« Constructed system that displays both physical and
mental agency », Richards and Smart
 But what about robots created by robots? Are they “constructed”?
 How much mental agency?
 Semi-unmanned drones
 Robots w/o physical agency: softbots (Hartzog, 2015)
 News-generating bots
 Robot advertisers
 Computer composers: http://www.gizmag.com/creative-artificial-intelligence-computer-algorithmic-music/35764/
« Sense – Think – Act » spectrum?

        | Sense (acquire and process information) | Think (process what was sensed) | Act (locomotion and kinematics)
Low     | Industrial robots that paint or weld car parts | Exoskeleton, DaVinci Robot (teleoperated) | Augmented reality devices (Hololens), medical diagnosis robot
Medium  | Mars Rover, Drones | |
High    | Vacuum cleaner | Social Robots | Driverless car; Hoops robotic basketball-shooting arm; airport security check systems
Robolaw typology
 No definition, but 5 categories of relevant items:
 Use or task: service or industrial
 Environment: physical (road, air, sea, etc.) v cyberspace
 Nature: embodied or disembodied (bionic systems)
 Human-Robot Interaction (HRI)
 Autonomy
Calo, 2015
 Embodiment
 Robot as « machine » or « hardware »
 Softbots (e.g., robot traders)?
 Mooney thinks this is irrelevant; true from a scientist's perspective, but not necessarily true from a society perspective
 Emergence
 New forms of conduct, including welfare-enhancing behaviour => problem-solving robots
 « Social valence »
 Robots stimulate reactions from society: « social actors »
 Soldiers jeopardize themselves to preserve robots in the military field
 People write love letters to Philae
 Often correlated with anthropomorphic embodiment (Honda Asimo)
2. Economics
Pew Research Survey
48%
 Tech pessimists
 “a massive detrimental impact on
society, where digital agents displace
both blue- and white-collar workers,
leading to income inequality and
breakdowns in social order”
52%
 Tech optimists
 “anticipated that human ingenuity
would overcome and create new jobs
and industries”
Source: http://www.futureofwork.com/article/details/rise-of-intelligent-robots-will-widen-the-social-inequality-gap
A. Smith, An Inquiry into the Nature and Causes of the
Wealth of Nations, 1776
 “A great part of the machines made use of in those manufactures
in which labour is most subdivided, were originally the inventions
of common workmen, who, being each of them employed in some
very simple operation, naturally turned their thoughts towards
finding out easier and readier methods of performing it. Whoever
has been much accustomed to visit such manufactures, must
frequently have been shewn very pretty machines, which were the
inventions of such workmen, in order to facilitate and quicken
their own particular part of the work”.
D. Ricardo, On Machinery, 1821
J. M. Keynes, “Economic Possibilities for our Grandchildren”, 1930
 “We are being afflicted with a new disease of which some readers may not yet have
heard the name, but of which they will hear a great deal in the years to come--
namely, technological unemployment. This means unemployment due to our
discovery of means of economising the use of labour outrunning the pace at which
we can find new uses for labour. [...] But this is only a temporary phase of
maladjustment. All this means in the long run that mankind is solving its economic
problem”.
 “Yet there is no country and no people, I think, who can look forward to the age of
leisure and of abundance without a dread”
 “ Three-hour shifts or a fifteen-hour week may put off the problem for a great while.
For three hours a day is quite enough to satisfy the old Adam in most of us!”
 Concludes by predicting the disappearance of economics as a science
Empirical studies
 Bank of America/Merrill Lynch, 2015
 “Robots are likely to be performing 45% of manufacturing tasks by 2025E
(vs. 10% today)”
 McKinsey Global Institute, Disruptive technologies Advances that will
transform life, business, and the global economy, 2013
 By 2025, “knowledge work automation tools and systems could take on tasks
that would be equal to the output of 110 million to 140 million full-time
equivalents (FTEs)” (knowledge work is use of computers to perform tasks
that rely on complex analyses, subtle judgments, and creative problem
solving).
 By 2025, “[w]e estimate that the use of advanced robots for industrial and
service tasks could take on work in 2025 that could be equivalent to the
output of 40 million to 75 million full-time equivalents (FTEs)”.
Frey and Osborne, 2013
 “47 percent of total US employment is
in the high risk category, meaning that
associated occupations are potentially
automatable over some unspecified
number of years, perhaps a decade or
two”
 “most workers in transportation and
logistics occupations, together with the
bulk of office and administrative support
workers, and labour in production
occupations, are at risk” + “a
substantial share of employment in
service occupations”
(Figure: Wave I, Wave II, Plateau)
Substitution effect, consequences
Job polarization
 Shift in the occupational
structure
 Displaced workers relocate their
labor supply to low skill service
occupations
 Other humans resist by
investing in skills through
education (Frey and Osborne,
2014; Cowen, 2013)
 This leads to « labour market
polarization » (Autor, 2014;
Cowen, 2013)
Discussion
 Frey and Osborne, 2014 believe this
model still holds true
 “Our model predicts …
computerisation being principally
confined to low-skill and low-wage
occupations. Our findings thus imply
that as technology races ahead, low-
skill workers will reallocate to tasks
that are non-susceptible to
computerisation – i.e., tasks requiring
creative and social intelligence”
 Brynjolfsson and McAfee, 2011
disagree: when technology becomes
cognitive, substitution can also
occur for non routine tasks
Substitution pace
 “Technological advances are contributing to declining costs in
robotics. Over the past decades, robot prices have fallen about 10
percent annually and are expected to decline at an even faster pace
in the near future (MGI, 2013). Industrial robots, with features
enabled by machine vision and high-precision dexterity, which
typically cost 100,000 to 150,000 USD, will be available for 50,000
to 75,000 USD in the next decade, with higher levels of
intelligence and additional capabilities (IFR, 2012b). Declining
robot prices will inevitably place them within reach of more users”
 Hanson on copies
Effect on Developing Economies?
 McKinsey Global Institute, 2013
 “Effects of these technologies on developing economies could be mixed. Some countries could lose opportunities to provide outsourced services if companies in advanced economies choose automation instead. But access to knowledge work automation technologies could also help level the playing field, enabling companies in developing countries to compete even more effectively in global markets”.
 Philips brings electric shaver production home? https://blogs.cfainstitute.org/investor/2014/06/16/the-robot-revolution-innovation-begets-innovation/
Substitution (Engineering) Bottlenecks: Frey &
Osborne, 2013
Social intelligence tasks | Creative intelligence tasks | Perception and manipulation tasks
Negotiation, persuasion and care | Ability to make jokes; recipes; concepts | Disorganized environments or manipulation of non-calibrated, shifting shapes (the towel problem)
Autor, 2014
 Routine tasks: “Human tasks that have proved most amenable to computerization are
those that follow explicit, codifiable procedures”
 Non routine tasks: “Tasks that have proved most vexing to automate are those that
demand flexibility, judgment, and common sense”
 Engineers “cannot program a computer to simulate a process that they (or the scientific
community at large) do not explicitly understand”
 Non-routine tasks are less exposed to substitution
 Tasks that are not exposed may even benefit from it, through a complementarity effect
 In construction, mechanization has not entirely devalued construction workers, but augmented their productivity; yet this is not true for all (the worker who knows how to use a shovel v an excavator)
Typology of D. Autor et al (2003), Autor (2014)
Task                         | Description | Substitution risk
Routine (incl. skilled work) | Clerical work, bookkeeping, back and middle office, factory work | High
Non-routine « Abstract »     | Problem solving, intuition, creativity and persuasion; high education, high wage (doctors, CEOs, managers, artists, academics) | Low
Non-routine « Manual »       | Situational adaptability, in-person interaction, visual and language recognition; low education, low wage (housecleaning, flight attendants, food preparation, security jobs) | Low
Findings of D. Autor et al (2003), Autor (2014)
 Computers are more substitutable for human labour in routine
relative to non-routine tasks (substitution effect);
 And a greater intensity of routine inputs increases the marginal
productivity of non-routine inputs (complementarity effect)
 “Job polarization” effect
 Increase of high education, high wage jobs
 Increase of non-routine, low-education, low-wage jobs
 No increase in wages for this latter category, given the abundance of supply
 Autor, D., Levy, F. and Murnane, R.J. (2003), “The skill content
of recent technological change: An empirical exploration”, The
Quarterly Journal of Economics, vol. 118, no. 4, pp. 1279–1333
Job Polarization, some evidence (Autor, 2014)
Generational effect, Sachs and Kotlikoff, 2012
 “Obtaining skills takes time studying in school and learning on the job. Thus skilled workers are disproportionately older workers”
 “Machine-biased productivity improvements effect a redistribution from younger, relatively unskilled workers to older, relatively skilled workers as well as retirees”
 “When today's machines get smarter, today's young workers get poorer and save less”
 “The fall in today's saving rate means that the next generation will have even lower wages than today”
 “In short, better machines can spell universal and permanent misery for our progeny”
 Long-term misery?
T. Cowen, 2013
Average is over
 The rich will get richer, the
poor will get poorer
 Substitution effect stronger in
work w/o consciousness/ability
to train
Freestyle chess metaphor
 Random player-machine teams
outperform chessmaster-
machine teams
 Not necessarily teams of grandmasters!
 “In the language of economics,
we can say that the productive
worker and the smart machine
are, in today’s labor markets,
stronger complements than
before”
Substitution-Complement Framework
The model explained
Multi-causal substitution
 Exponential decrease in costs of
technology
 « Deskilling »
 Replacement by semi-skilled
technologies, through
fragmentation and simplification
of tasks (fordism)
 « The copy economy » (Hanson,
2014): « the most important
features of these artificial brains
is easy to copy »
Two types of complements
 Complements arising from substitution (upward sloping curve)
 AI and Robots-related jobs (those of
Autor and Cowen)
 Enabling technologies and new jobs
 Punch cards, typewriters, printers,
calculators, etc.
 Complements with indifference (L
curve)
 Indifference on human labour of an
increase in machine labour (horizontal
line)
 « Emerging jobs » new sectors without
human labour
 Protected sectors, superstars like chefs,
footballers and singers?
 Indifference on machines of increase in
human labour (vertical line)
 Bank tellers (ATMs?) => J. Bessen book
Takeaways
 MGI, 2013:
 “In some cases there may be regulatory hurdles to overcome. To protect citizens, many
knowledge work professions (including legal, medical, and auditing professions) are
governed by strict regulatory requirements regarding who may perform certain types of
work and the processes they use”
 “Policies discouraging adoption of advanced robots—for example, by protecting manual
worker jobs or levying taxes on robots—could limit their potential economic impact”.
 Frey and Osborne, 2013:
 “The extent and pace of legislatory implementation can furthermore be related to the public
acceptance of technological progress”
 Brynjolfsson and McAfee, 2014 citing Voltaire : “Work saves a man from three great
evils: boredom, vice, and need.”
3. Legal services and education
LegalZoom: https://www.legalzoom.com/country/be
 “LegalZoom provides the legal solutions you need to start a business, run a business, file a
trademark application, make a will, create a living trust, file bankruptcy, change your name, and handle a variety of other common legal matters for small businesses and families.
Since the process involved in starting a business can be complicated, we provide services
to help start an LLC, form a corporation, file a DBA, and take care of many of the legal
requirements related to starting and running a business. If you are interested in protecting
your intellectual property, LegalZoom offers trademark and copyright registration services, as
well as patent application assistance. It's essential for everyone to have a last will and
testament and a living will, and you can get yours done efficiently and affordably
through LegalZoom. For those who have more advanced planning needs, our living
trust service is available. With all our services, you have the option to speak to a
lawyer and tax professional. Let LegalZoom take care of the details so you can focus on
what matters most – your business and family”
Pricing Legal Zoom
Source: http://www.law-valley.com/blog/2014/03/03/legalzoom-le-leader-americain/
Neota Logic Inc.
http://www.neotalogic.com/solutions/
Concept
 Software company that helps
companies make routine legal
decisions without consulting a
lawyer.
 Let an employee take family leave
(source of employment
discrimination claims)?
 Input questions, and get results
 Customer is business or law firms
 Software has been used to answer
queries on the European Union’s
regulation of financial derivatives
Example: compliance
 “Regulations are constantly
changing. With a Neota Logic app,
you can instantly incorporate
changes to regulations and policies
ensuring timely compliance.
Incorporate apps into your regulatory
processes and see how easy it is to
ensure consistent methodologies are
followed and provide your business
with auditable results”
Drivers of change (Susskind, 2014)
Contextual
 More for less challenge
 Clients of lawyers (in-house
counsels): less staff, less external
counselling, more compliance
and conformity costs
 https://www.lexoo.co.uk/
 Liberalization
 http://thejurists.eu/
Structural
 Information technology (Katz
2013)
 Large data power
 Immense computing power
 Automate and innovate
Katz, 2013
Wind of change
 “Like many industries before it,
the march of automation,
process engineering,
informatics, and supply chain
management will continue to
operate and transform our
industry. Informatics,
computing, and technology are
going to change both what it
means to practice law and to
“think like a lawyer.””
Substitution+complement
Outlook
First generation
 E-discovery
 Automated document assembly
 Online dispute resolution
New generation
 Quantitative legal prediction
 Contract analysis
 Online drafting wizard
(Weagree.com)
 Legal risk management
Quantitative Legal Prediction
 Everyday, lawyers make predictions
 Do I have a case?
 What is our likely exposure?
 How much is this going to cost?
 What will happen if we leave this particular provision
out of this contract?
 How can we best staff this particular legal matter?
 How high is the probability of settling?
Predicting case outcomes
LexMachina
 Lunch between Professor M.
Lemley and Bruce Sewell
 Create electronic set of patent
litigation events and outcomes
 https://lexmachina.com/about/
 Funded by Apple, Cisco,
Genentech, Intel, Microsoft, and
Oracle, etc.
 More at https://goo.gl/UyB0wU
QLP and Machine Learning
 Predicting outcomes
 “An algorithm learning that in workplace discrimination cases in which there is a racial
epithet expressed in writing in an email, there is an early defendant settlement
probability of 98 percent versus a 60 percent baseline. An attorney, upon encountering
these same facts, might have a similar professional intuition that early settlement is
likely given these powerful facts”
 No gaming (moral hazard)
 Discover hidden data
 “Imagine, for instance, that the algorithm detects that the probability of an early
settlement is meaningfully higher when the defendant sued in a personal injury case is a
hospital as compared to other types of defendants”
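The quoted settlement example is, at bottom, a comparison of conditional frequencies: an early-settlement rate within a subgroup against the overall baseline. A minimal sketch in plain Python, with entirely hypothetical case records and numbers:

```python
# Hypothetical case records: (epithet_in_writing, settled_early) -- invented data
cases = [
    (True, True), (True, True), (True, True), (True, False),
    (False, True), (False, False), (False, True), (False, False),
]

def settlement_rate(records):
    # share of the records that settled early
    return sum(1 for _, settled in records if settled) / len(records)

baseline = settlement_rate(cases)
subgroup = settlement_rate([case for case in cases if case[0]])

print(f"baseline: {baseline:.1%}, with written epithet: {subgroup:.1%}")
```

On this toy data the subgroup rate (75.0%) exceeds the baseline (62.5%): the same shape of finding as the quoted 98 percent versus 60 percent, just estimated from eight invented records instead of a large litigation corpus.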
TyMetrix
LawyerMetrics
 “Lawyer Metrics makes it possible to replace lower-performing “C
players” in your organization with higher performing “B” and “A”
attorneys”.
 http://lawyermetrics.org/services/human/
4 disruptive and robotic legal technologies
1. Embedded legal knowledge
2. Intelligent legal search
3. Big data
4. AI-based problem-solving (with natural language processing input)
Pros and cons
Technique
 Use big data and computational
power
 “inverse” or inductive reasoning
 Use observables to build the model (as opposed to building a model first, then trying to infer results)
 Concept of similarity that is
implemented and refined using
large bodies of data
 Facebook recommending
friends, Netflix recommending
movies and Amazon
recommending books
Pros and Cons
 Pros
 Overcomes anecdotal or
unindicative information
 Overcomes human cognitive
limitations: heuristic, biases,
preferences, etc.
 Cons
 Lack of relatedness between new cases and past cases
 Overgeneralization: most rape cases occurred in poor areas => inferring that rape does not occur in wealthier areas
 Contextual information is hard to capture in data, e.g. a change of member on a regulator’s board
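The “concept of similarity … refined using large bodies of data” can be made concrete with a nearest-neighbour sketch, the same mechanism behind the recommender examples above. Cases, features and outcomes are invented for illustration:

```python
# Hedged sketch of similarity-based (inductive) prediction: a new
# case inherits the majority outcome of its k most similar past cases.

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def predict(new_case, past_cases, k=3):
    """Majority outcome among the k nearest past cases."""
    nearest = sorted(past_cases, key=lambda c: distance(new_case, c[0]))[:k]
    votes = sum(outcome for _, outcome in nearest)
    return 1 if 2 * votes > k else 0

# Hypothetical features: (claim strength, evidence quality);
# outcome 1 = plaintiff prevails, 0 = defendant prevails.
past = [((1.0, 0.9), 1), ((0.9, 0.8), 1), ((0.8, 0.7), 1),
        ((0.3, 0.2), 0), ((0.2, 0.1), 0), ((0.1, 0.3), 0)]

print(predict((0.85, 0.75), past))  # nearest cases are plaintiff wins -> 1
print(predict((0.15, 0.20), past))  # nearest cases are defendant wins -> 0
```

The cons listed above show up directly here: the prediction is only as good as the relatedness of the new case to the stored ones.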
 Stasia Kelly, U.S. co-managing partner of DLA Piper: “I really want to know the person giving advice”
 R. Susskind, Tomorrow’s
Lawyer, 2014
 Often, lawyers regard legal
work as « bespoke »:
customized, made to
measure, personal
 « romantic » vision
Impact on legal professions, and the denial
problem
Really?
 Take an employment contract: you don’t use a blank canvas every time you draft one
 French lawyers all use the same standardized structure in their work: I-II; A-B
 Anglo-Saxon contracts always have definitions first
 Many of those tasks can be
subject to
 Outsourcing
 Insourcing
 Delawyering
 Computerizing
 Leasing
 ...
Decomposition
TABLE 4.1. Litigation, decomposed (Susskind, 2014)
Document review
Legal research
Project management
Litigation support
(Electronic) disclosure
Strategy
Tactics
Negotiation
Advocacy
Effect on legal market (Susskind, 2013)
Firms
 Elite group of say 20 firms to
Big4;
 Opportunity for middle size
firms;
 Small firms will disappear, due
to liberalization and competition
from banks, accountants and
other retailers (“end of lawyers who
practice in the manner of a cottage
industry”)
 Contrast with Kobayashi and
Ribstein?
Lawyers
 Barristers will remain:
 “oral advocacy at its finest is
probably the quintessential bespoke
legal service”
 But not for “lower value” disputes
and note that “courtroom
appearances themselves will diminish
in number with greater uptake of
virtual hearings, while online dispute
resolution (ODR) will no doubt
displace many conventional litigators”
Disciplines likely to be affected?
 Corporate and M&A work
 Global Merger Analysis Platform (GMAP):
http://www.ft.com/intl/cms/s/2/a1271834-5ac2-11e5-
9846-de406ccb37f2.html#axzz41hY8v2x3
 Trademark and copyrights filing
 Patent applications
 Private international law?
 Your take?
Disciplines likely to be affected
 Data intensive
 Public information
 Searchable
 Scalable (global law)
 Standardized (labelled input)
Users perspective
 Provides « unmet legal needs » (Wilkins): brings law in
consumer markets
 Growth of LegalZoom is indicative
 For more see
http://www.slate.com/articles/technology/robot_invasio
n/2011/09/will_robots_steal_your_job_5.html
LegalZoom
Rocketlawyer
Complementarity effect (Susskind, 2013)
New jobs
 Legal knowledge engineer
 Legal technologist
 Legal hybrid
 Legal process analyst
 Legal project manager
 ODR practitioner
 Legal management consultant
 Legal risk manager
New employers
 Global accounting firms
 Major legal publishers
 Legal know-how providers
 Legal process outsourcers
 High street retail businesses
 Legal leasing agencies
 New-look law firms
 Online legal service providers
 Legal management consultancies
Challenges – Education
Forget substitutable work
 Routine tasks
 Memorization
 Research and other repetitive
data-driven tasks
 Non-routine manual tasks?
 Filing briefs
 Taking minutes of meetings
Invest in complements
 Train in science, computation,
data analytics and technology
 Invest in soft skills, incl.
leadership, executive training,
management: « social
bottlenecks »
 “Lawyers who expect to operate in
this new environment must
understand how technology is
reshaping the markets in which
their clients compete, as well as the
practice of law itself, including the
use of “big data,” artificial
intelligence, and process
management to analyze, structure,
and produce legal outcomes”
(Heineman, Lee and Wilkins,
2014)
Retraining need
 “Clients will not be inclined to pay
expensive legal advisers for work
that can be undertaken by less
expert people … This prediction
does not signal the end of lawyers
entirely, but it does point to a need
for fewer traditional lawyers. At the
same time when systems and
processes play a more central role
in law, this opens up the possibility
of important new forms of legal
service, and of exciting new jobs for
those lawyers who are sufficiently
flexible, open-minded, and
entrepreneurial to adapt to
changing market conditions”
(Susskind, Chapter 11)
 “Automated delivery of online consultations is authorised only to respond to the request of a specific client and to meet specific needs” Article 4.12 (M.B. 17.01.2013);
 “A lawyer shall provide no service, nor give personalised consultation or advice, on an electronic discussion forum or any other public virtual group” Article 4.13 (M.B. 17.01.2013).
 “As professional ethics currently stand, such a practice is not permitted. A lawyer puts his credit and his liability at stake if he does not adapt the documents he drafts to an examination of the particular situation of a client […]” (LB 01-02, n°3, 226)
 Not OK?
Challenges – Professional Regulation
 Attorney-client confidentiality
 Limit to data aggregation by
law firms?; Limit to data
portability by lawyers?
 Unauthorized exercise of
profession
 Partnership rules
 Independence of lawyer?
 Fee regulation
Challenges – Intellectual property?
 More IP protection could promote production of new legal
technology
 Too much IP protection prevents production of new legal technology
Legal preserves
 Objections to machines taking over certain types of
work?
 Passing life sentence?
 Other?
Take away points
 Think of tasks, not jobs
 Users’ perspective also matters; Not only perspective of suppliers
 Where machines are better than humans, expect substitution
and/or complementarity
 Shall the law prevent machines in legal services? No, substitution
(and/or complementarity) not necessarily bad
 But need to think about specific tasks for which society wants to
preserve humans
 Social contract imperative: man-made law necessary for trust?
Class II – Regulation and Liability
1. Regulation
“Noel Sharkey, a computer scientist at the University of Sheffield,
observes that overly rigid regulations might stifle innovation. But a
lack of legal clarity leaves device-makers, doctors, patients and
insurers in the dark”
The Economist, 01 September 2012
Definitions
 Roboethics and robolaw
 Regulation
 “State intervention into the economy by making and applying legal
rules”? (Morgan and Yeung)
 “Exceptionalism” (Calo, 2015): “a technology is exceptional if it invites
a systemic change to laws or legal institutions in order to preserve or
rebalance established values”.
Use existing basic legal infrastructure, and deal with issues on a
case-by-case basis, through litigation and precedent (common law
approach)
v
Adopt sui generis rules and updates
Goals of the lecture
 Where do we need regulation?
 Put differently where do we need to (i) adapt existing law;
(ii) introduce new law?
1.1 Overview of Existing Approaches
Two trajectories
Disciplinary (legal)
 In each branch, specialists
identify frictional HYPOs
 Top down
 Suggestions for a green paper,
2012: “top down approach that
studies for each legal domain
the consequences on robotics”
Technology (applications)
 For each class of applications,
speculation on legal problems
 Bottom up
 Robolaw, 2012
 Stanford, Artificial Intelligence and
Life in 2030
1. Disciplinary approach
 “fitting” exercise: jacket factory allegory
 8 areas of the law:
 health and safety of machinery;
 product liability rules and sale of consumer goods;
 intellectual property;
 labour law;
 data privacy;
 criminal law;
 contractual and non-contractual liability;
 e-personhood.
Intellectual property
 Are machine intelligence generated works protectable under intellectual
property regimes, and as the case may be, who is their owner?
 Copyright law
 “Creative” or “original” work requirement
 Means work that reflect the subjective personality of the author
 Subjective, not objective: two similar songs can be ©
 Can a machine that computes data be capable of creation?
 Patent law needs “inventive step” and “non-obviousness”
 Inventive means unexpected, not following the path of technological progress
 Non-obviousness to the person skilled in the art?
 Is anything non obvious to a super intelligence?
 Who is the owner?
 PETA/Naruto case, before US Courts
 Not hypothetical given that some IPRs held by abstract legal persons like
corporations?
Pros and cons
 Avoid inconsistencies
 Grant strong IP protection in
AI field, yet create strict
programmer liability in same
field
 Speculation on problems
created by technology under
imperfect information
 Risk of mistakes, that create
new legal problems
 Example: the social planner believes AI research into biological treatment of Alzheimer’s is next and creates strong IP protection there; but the technological frontier shifts and a mechanical route (mind upload) works better; unwanted problems follow
2. Functional approach
 Determine classes of MI applications, and then assess the legal
needs from there.
 Bottom up approach geared to technological evolution
 Savile Row tailors allegory
Stanford, 2016
8 fields
1. Transport
2. Home/services robots
3. Healthcare
4. Education
5. Low-resources communities
6. Public safety and security
7. Employment and workplace
8. Entertainment
9 legal-policy topics
1. Privacy (biases in predictive
algorithm + right to intimacy)
2. Innovation (open v patent thickets)
3. Liability (civil)
1. Locus: efficiency/fairness
2. Foreseeability condition
4. Liability (criminal)
1. Intent condition (mens rea)
5. Agency (legal personhood)
6. Certification and licensing
requirements
7. Taxation (budgets dependent on payroll and income taxes; speeding and parking tickets)
8. Labor (working contracts
requirements)
9. Politics
Robolaw, 2012
 Distinct legal issues for various class of applications
 Self driving cars: primary question relates to impact of liability
rules on manufacturers’ incentives for innovation
 For prostheses, the focus is also on public law issues, for instance whether a person can identify with the prostheses they wear, and whether they may refuse to be depicted with them in an official document
 For personal care robots, some basic human rights considerations
come also into play, such as the need to protect the right to
independent living protected by Article 19 of the UN Convention
on the Rights of Persons with Disabilities: a right to refuse robot assistance against insurance companies?
Pros and cons
 More open to ex ante
robo-ethics
 Pro-innovation
 Obsessive focus on not
hindering technological
evolution
 Too much trust in
technology success?
 Technological
convergence will dissipate
differences btw
technologies
1.2. Regulatory trade-offs
Disabling regulation (Pelkmans and Renda)
REACH
 Problem with the “imposition of
fairly heavy testing requirements for
all existing and new substances
alike”
 “The other feature of REACH,
owing to its ambitious precautionary
approach of ‘no data, no market’
(access), is that this entire process of
testing before being allowed on the
market takes no less than 11 years”
GMO regulation
 In the EU, only two new GMO
products have been allowed to be
cultivated: NK603 GM maize
and the Amflora potato
 This despite reported benefits to
farmers and decrease in poverty
Drones: Disabling regulation?
US FAA
 Rules for small unmanned aircraft
https://www.faa.gov/uas/media
/Part_107_Summary.pdf
 Standardize visual line of sight
(VLOS) flights of unmanned
aircraft that weigh less than 55
lbs. (25 kg)
 Aeronautical knowledge test
Finland
 “Allows BVLOS flights under
certain conditions, and it does not
require drone operators to possess an
aerial work certificate”
 https://techcrunch.com/2016/0
6/28/heres-whats-missing-from-
the-new-drone-regulations/
 Civil purpose nuclear
energy
 Reproductive cloning and
nanotechnologies?
« Knee-jerk » regulation?
 “tendency to overreact to
risks, accidents and
incidents” (Van Tol, 2011)
 Taxi v Uber
 Airbnb v hotel chains
 E-cigarette
Rent seeking
 Bastiat: candle manufacturers
request chamber of deputies to:
“pass a law requiring the closing of
all windows, dormers, skylights,
inside and outside shutters, curtains,
casements, bull's-eyes, deadlights, and
blinds—in short, all openings, holes,
chinks, and fissures through which
the light of the sun is wont to enter
houses, to the detriment of the fair
industries with which, we are proud
to say, we have endowed the country
[...]”.
Rent seeking in AI
 Taxi, truck and bus drivers
 Delivery industry
 Insurance companies
 Carmakers
Regulatory timing
 Collingridge dilemma: Too early to act, not enough
information; too late to act, all information but no longer able
to change things
 “Regulatory connection” quandary: the risks and opportunities
created by emerging technologies cannot be “suitably understood
until the technology further develops” … what if it is harmful?
 “They're talking about spending 5-10 years to regulate technologies
that are already 5-10 years old“ (Garreau)
 Amazon, Intel and Google have been very vocal in relation to
drones delivery regulation, which is outdated
 Bostrom’s treacherous turn
Enabling regulation (Pelkmans and Renda)
End-of-life vehicles
 “beyond what a market-based approach
might be expected to achieve”
 Quantitative targets: “reuse and
recycling of 80 % of the car weight in
2006, up to 85 % by 2015; reuse and
recovery at least 85 % in 2006 and 95
% in 2015”
 “Innovation takes place at the very
beginning of the life cycle of cars, namely
at the design & planning stage”
 October 2016: Germany’s Bundesrat
just passed a resolution to ban the
internal combustion engine starting
in 2030
Porter Hypothesis
 In environment, safety and
health, “tough standards trigger
innovation and upgrading”
 And market opportunities to
race for first mover advantage
 Counter-example: 2015 Volkswagen NOx (nitrogen oxides) emissions scandal
Regulatory trade offs
 Complex relation between technological innovation and
government regulation
 “The economic literature (starting from the seminal work of
Ashford and later with the so-called “Porter hypothesis”) has
long recognised that regulation can be a powerful stimulus to
innovation and entrepreneurship”(Pelkmans and Renda,
2014)
 At the same time, regulation “can and does disable
innovation” (Pelkmans and Renda, 2014)
2. Liability Issues (Civil)
Goals of the lecture
 Who should pay for robot generated harm?
 How is the issue dealt with under basic legal structure?
 Should regulation be adopted?
 Friends pay you a visit
 Robot kills the friends’ dog, which it confuses with an insect
 Deems it a threat to the plant environment
Hypothetical scenario
 You have a garden
 Buy a robot gardener
 Unmanned system with very
high autonomy
 Can « sense » maturation of
fruits, veggies, plants
 Robot has « actuators »
 Turn irrigating devices « on »
 Prune the grass
 Kill mosquitos
 Carry and spill water,
pesticides and other
liquid/aerial products
Social goals of liability law
 Solution to be found so as to fulfill goals of liability law
 Disputed
 G. Williams, « The Aims of the Law of Tort », 4 Current Legal
Problems, 1951
 Corrective/protective
 Provide solvent target
 Deterrence
 No gain from harmful conduct
 Encourage precaution
S. Lehman-Wilzig, « Frankenstein Unbound », 1981
 Product liability
 Robot is piece of hardware
 Liability on producer, plus possible limited liability on importers, wholesalers and
retailers
 2 manufacturers problem: « hardware » + « software »
 « Inherent risk »
 Dangerous animals
 Strict liability only for dangerous species; no liability for « usually harmless species »
 Slavery
 Several regimes: master is liable v slave is liable
 Roman law: master liable for civil claims, not criminal acts; possibility to eschew if
total absence of complicity
 What punishment against the bot?
 Diminished capacity: independent persons, but not entirely intelligent
 Children: fully intelligent, but with low moral responsibility
 Agent
 Person
Landscaping of default legal structure
 Basic rules (not exclusive)
 Default liability/tort regimes: IL Torts Ordinance
 Litigation in court
 Additional rules (not exclusive)
 Strict liability
 Defective products liability (Directive 85/374/EEC on liability for defective products, adopted in 1985)
 IL: Defective Products Liability Law, 1980 (Defective Products Law).
 Consumer rights
 Directive 2011/83/EU on Consumer Rights
 IL: Consumer Protection Law, 1981 (Consumer Protection Law)
 Latent defects
 Only for sales
 Duty of guidance
 Only for contracts
 Liability of
owner/keeper/user?
 Liability of perpetrator?
 Liability of
manufacturer?
Framing the options
 Who may/should pay for
robot generated harm?
 Classic imputability issue
Default Legal Structure
Basic rules
Body of rules
 Tort law
 Vicarious liability law
Imputability
 Perpetrator (one is liable for
damages caused by his own acts)
 Owner, holder, master (one is liable for damages caused by others’ acts)
 Negligence
 Duty of care
 Omission that constitutes negligence
 Breach of statutory duty
 Employer, corporation, State
 But also: parents, masters, owners
 And vicarious liability for animals, etc.
 Cumulative
 Employer personally liable for employee harm (negligence) and
vicariously liable (supervisor)
 Employee personally liable and employer vicariously liable
 Both are fault-based (+ or – negligence)
Assessment
Owner/keeper/user
imputability (vicarious)
 Protective of victim => solvent
target
 Not necessarily apt to achieve
deterrence purpose of liability law
 Perverse effects on innovation
incentives? => kills market for the
purchase of robots?
Robot imputability (tort)
 Could achieve deterrence, for
robots and AIs can
 Make cost-benefit analysis
 Be taught some legal and moral
principles: bonus pater familias
 But no solvent target
 Remedy problem
 Solvency issue: robots need
registration and to have property
(capital endowment)
 Transfer the robot to victim (but
moral harm?)
 Forced sale of the robot (but no
market)
 Insurance?
 Legal personhood threshold
Assessment
                         Vicarious (owner)   Tort (robot)
Provide solvent target           Y                N
Incentives                       N                Y
Conclusion
 Early cases likely to seek liability of
owner/keeper/user
 On basis of vicarious liability
 Other liability routes would need to pass
personhood threshold, for liability is contingent
on third person’s fault
 Not fully protective of victims, because there is only one solvent target in liability for things!
Additional rules: Product Liability Law
 IL: Defective Products Liability Law, 1980 (Defective
Products Law)
 Strict liability on manufacturers
 product was defective (deficiency or warnings/safety instructions insufficient)
=> undesired injury by normal use
 “only to bodily injuries and does not extend to indirect, consequential, or pure economic
damages”
 Limited defences
 defect created when the product was no longer under the manufacturer’s
control
 “state-of-the-art’ defence”: manufacturer could not have known that the design of
the product did not comply with reasonable safety standards
 Product was released beyond the control of the manufacturer contrary to its
desire
 Damage: does “not take into account a level of earnings higher than three times the
average earnings in Israel. The damages for pain and suffering pursuant to this law are
limited. The remaining bases of claim generally do not provide for a maximum amount of
liability”
Assessment into context
 Manufacturer liability
 Liability without fault
 Strict liability
 Primary goal is corrective
What the basic legal structure achieves
 A multiplicity of potential regimes is applicable
 No legal desert!
 Most likely to take place under vicarious liability+defective
products
 Less likely to occur under tort liability
 Liability both on supply and demand side!
3. Selected liability issues
Ongoing questions
 Immunity
 Insurance
 Standardization
1. Immunity
 Appeals to limit liability on manufacturers, as a way to boost innovation in the robotics industry by reducing fears of liability-related costs (Calo, 2011)
 Strict liability of owner, with cap: owner benefits from introduction
of technology, and victim faces tough causality problem (Decker,
2014)
Ryan Calo, « Open Robotics », 2011
 Distinction btw closed and open robots
 Closed robots are designed to perform a set task: // dishwasher
 Open robots invite third party contribution: multifunction, open to all
software, hardware modularity
 According to Calo, « open robots » are more « generative » in terms of
innovation
 But, “open robotics may expose robotics platform manufacturers and distributors to
legal liability for accidents in a far wider set of scenarios than closed robotics”
 HYPO: Roomba vacuums up and kills an animal
 Manufacturer liability if Roomba causes injury in its normal use
 If product misuse – attempt to use Roomba as pet locomotion device –
no manufacturer liability
 With open robots: more applications + no product misuse defense for the
robot is not designed to perform predetermined tasks
 Disincentive to investments
in (open) robotics markets
 “Early adopters of robotics are
likely to be populations such as
the elderly or disabled that
need in-home assistance. Other
early applications have
involved helping autistic
children. These populations
would make understandably
sympathetic plaintiffs in the
event of litigation”
(http://cyberlaw.stanford.e
du/blog/2009/11/robotics-
law-liability-personal-robots)
The problem
Selective immunity
 Immunizing manufacturers of open robotic platforms from damages
lawsuits arising out of users’ implementation of robots, at least
temporarily;
 Precedent in aviation industry, crippled by litigation under PLL =>
General Aviation Revitalization Act (“GARA”) of 1994
 // immunity enjoyed by firearms manufacturers and website
operators
 Websites not liable for users’ postings (re. defamation, for instance)
 Selective immunity: “presumption against suit unless the plaintiff can
show the problem was clearly related to the platform’s design”
Insurance
 Today, most car accidents are caused by human error
 Progress in crash avoidance technology
 Risks of accidents unlikely to be completely removed, since events are not totally predictable, yet expected to decrease (by ~90%)
 Disruption of the « crash economy » (RAND, 2014)
 Today, variety of approaches, but imputability on driver: strict, no fault or
negligence based liability
 Today, compulsory insurance (EU directive)
 Two issues:
 Who’s liable?
 Shift in liability from user/owner/driver to manufacturer
 EP: “The greater a robot's learning capability or autonomy is, the lower other parties'
responsibility should be”
 Compulsory v non compulsory insurance?
 Transfer?
 European Parliament, Proposal,
JURI: “Points out that a possible
solution to the complexity of allocating
responsibility for damage caused by
increasingly autonomous robots could be
an obligatory insurance scheme, as is
already the case, for instance, with cars;
notes, nevertheless, that unlike the
insurance system for road traffic, where
the insurance covers human acts and
failures, an insurance system for robotics
could be based on the obligation of the
producer to take out an insurance for the
autonomous robots it produces”
Compulsory insurance?
 Still needed? Maybe, but coverage of
losses caused by crashes is likely to
be less expensive
 Coverage of losses not caused by
crashes but by wind, floods and
other natural elements and by theft
(comprehensive coverage) is less
likely to change, yet the price will decrease because the cost of repair is offset by fewer accidents (see
http://www.iii.org/issue-
update/self-driving-cars-and-
insurance)
Insurance
 Nothing changes: no-fault form, in which neither party is at fault, and
each car owner’s insurance covers their own vehicle
 Prices should go down!
 http://www.iii.org/issue-update/self-driving-cars-and-insurance
 Changes:
 Manufacturer insurance (Volvo)
 Shared insurance?
 Utility cost with a premium cost based on mileage or usage
 Leasing – ridesharing model
 Risks involving driverless-car hacks and cybersecurity
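The pricing point can be sketched with back-of-the-envelope arithmetic: a pure premium is expected crash loss plus a comprehensive component, so removing roughly 90% of crashes shrinks only the first term. All figures below are invented for illustration:

```python
# Hypothetical pure premium: expected annual crash loss per vehicle
# plus a flat comprehensive component (theft, weather). Every number
# here is made up to illustrate the slide's point.

def pure_premium(p_crash, avg_crash_cost, comprehensive):
    """Expected annual loss per insured vehicle."""
    return p_crash * avg_crash_cost + comprehensive

today = pure_premium(p_crash=0.05, avg_crash_cost=10_000, comprehensive=150)
# ~90% fewer crashes: 0.05 -> 0.005
autonomous = pure_premium(p_crash=0.005, avg_crash_cost=10_000,
                          comprehensive=150)

print(today, autonomous)  # 650.0 200.0
```

The crash component collapses from 500 to 50 while the comprehensive component is untouched, which is why total premiums fall but do not vanish.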
Standardization
 Private ordering mechanisms
 International Standardization organs (ISO, IEC)
 ISO/TS 15066:2016(en), Robots and robotic devices — Collaborative
robots
 US and EU organizations (IEEE, CEN, CENELEC)
 Many EU rules on machinery and product safety
 Example: Directive 2006/42/EC of the European
Parliament and the Council of 17 May 2006 on machinery
4. Should Someone else Pay?
Coase Theorem
 HYPO
 A uses robot gardener at night, when water and electricity
cost less
 Neighbour B moves in, creates a boutique hotel
 With noise at night, B’s clients flee in droves
 Who should be liable for injury?
 Standard legal solution: A liable to compensate externality
 But at the same time, it is B that creates the harm if A is forbidden to use its robot gardener simply because B moved in
 Coase argues that who is liable is to some extent irrelevant:
as long as no transaction costs and property rights well
defined, parties will bargain the efficient solution
 This is only possible absent transaction
costs, ie when it is easy to negotiate
 Often not the case in the real world
 A is a cooperative of multiple owners,
who are never present, for their
agricultural exploitation is fully
automated: search costs and negotiation
costs
 If B is liable, bargaining over the €5,000 retooling is not possible; B would install double glazing instead: a €10,000 extra loss for society
 Not the cheapest cost solution
 When there are transaction costs, law
should assign liability so as to achieve the
cheapest cost solution that would have
been found in negotiation
 A is liable
 The negative externality inflicted by the regulatory system should harm efficiency as little as possible
Cheapest cost solution principle
 Options
 A sends robot for mechanical
update so it makes less noise:
5,000€
 B installs double glazing: an extra €15,000
 Solution
 Efficient social solution is that
robot gardener is retooled
 This happens regardless of liability assignment
 If town hall assigns B right to
silence, A will pay 5,000€ to retool
bot
 If town hall assigns A right to
noise, B will pay to A 5,000€ to
retool bot
 In both cases, the efficient solution
is followed, regardless of who is
liable
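The bargaining logic above can be checked mechanically. A hedged sketch using the slide's own €5,000 retooling and €15,000 glazing figures (the function is illustrative, not a model of real negotiation):

```python
# Coasean sketch: with zero transaction costs the parties always pick
# the cheaper fix; the entitlement only decides who pays for it.

RETOOL_COST = 5_000    # A retools the robot gardener to run quietly
GLAZING_COST = 15_000  # B installs double glazing

def bargain(right_holder):
    """Return (chosen fix, payer) under costless Coasean bargaining."""
    fix = "retool" if RETOOL_COST <= GLAZING_COST else "glazing"
    # Whoever lacks the entitlement pays the other side to fix the problem.
    payer = "A" if right_holder == "B" else "B"
    return fix, payer

print(bargain("B"))  # B holds the right to silence -> ('retool', 'A')
print(bargain("A"))  # A holds the right to noise   -> ('retool', 'B')
```

Either way the robot is retooled; only the €5,000 changes hands in a different direction, which is the slide's point about the irrelevance of the liability assignment absent transaction costs.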
Application to robotics
 Invites research into the cheapest cost solution
 Not one obvious culprit, avoid moral bias
 A, B or someone else?
Liability on the creator/programmer?
 Robot manufacturer to encode ex ante robot prohibition to
operate at night?
 One line of code: 1€?
 If all the Bs of this world could negotiate freely, they would
contact all robot gardener producers and ask them to encode this
prohibition
 Not possible
 Legal system to hold robot producers liable if noise harm at night
 But bots would be less valuable for buyers, and price system would
correct this?
References
 R. Coase, “The Problem of Social Cost”, 3 J. Law & Econ. [1960]
 R. Coase, “The Federal Communications Commission”, 2 J. Law
& Econ. [1959]
Class III: Robotic Warfare; Market Regulation;
Regulatory Framework
1. Robotic Warfare
Goals of the lecture
 Can we delegate human killing decision to an
autonomous machine?
 Yes, no, maybe?
 What, if any, conditions should the law set?
Technology
Technology
Today
 Robots (with limited autonomy)
are already deployed on the
battlefield in areas such as bomb
disposal, mine clearance and
antimissile systems
 No full autonomy, but increasing
at rapid pace
 30 nations with defensive
human-supervised autonomous
weapons to defend against
surprise attacks from incoming
missiles and rockets: Iron Dome, Phalanx CIWS, etc.
Tomorrow?
 Lethal Autonomous Weapons
Systems (“LAWS”) aka “robot killers”
 Robots that can “select and engage
targets without further intervention by a
human operator” (US Directive)
 “Kalashnikovs of tomorrow”: Unlike
nuclear weapons, mass production,
easy proliferation and swift
circulation => SGR-A1 costs approx. €200,000
 Like nuclear weapons, risk of “a
military AI arms race”
UCAVs | SWORDS
Sentinels | Swarms
Typology
Man                                        | Machine
Humans in the loop (essential operator)    | Non-autonomous
Humans on the loop (fail safe)             | Partially autonomous
Humans out of the loop (seek and destroy)  | Fully autonomous
 Psychological disconnect and
self justice (« Good Kill » movie)
 Mistakes in visual recognition:
identifying someone as a
combatant
 1960, U.S. missile attack warning
system at NORAD, where an alert
was received saying that the United
States was under massive attack by 99
Soviet missiles, 99.9% certainty;
amateur astronomer: “It’s a beautiful
night! There’s a big full moon right in
sector three. And I can even see icebergs
down in the fjord.”
 Hate by design?
Prospects for warfare
 Clean war
 No casualties: explosive detection
bots
 No war crimes on the battlefield and
outside
 Fast war
 Economic
 Cuts in budget of armed
forces, including retirement
and public health
 R&D
Legal issues
Legal issues
Standard
 Since 2004, US program to
search and kill al Qaeda and
Taliban commanders
 Used in Libya, Pakistan,
Afghanistan, Syria
 117 Drone killings in 2010, see
http://www.longwarjournal.org/
pakistan-strikes/
 In 2013, China flew a drone into
contested airspace in the East
China Sea. Japan reciprocated by
sending a manned fighter
aircraft
 Act of war? IAC, NIAC?
Ethical
 Shall a machine be granted a license
to kill without human input?
 Are there decisions computers shall
not make without human input?
 Need a kill switch?
Ban on LAWs
 Human Rights’ Watch:
https://www.hrw.org/report/2012/11/19/losing-
humanity/case-against-killer-robots
 Report of the Special Rapporteur on Extrajudicial, Summary or
Arbitrary Executions, 20–21, Human Rights Council, U.N.
Doc. A/HRC/23/47 (Apr. 9, 2013)
 United Nations held a further round of talks in Geneva
between 94 military powers aiming to draw up an
international agreement restricting their use
Today
 https://www.stopkillerrobots.org/2016/04/thirdmtg/
 Algeria, Bolivia, Chile, Costa Rica, Cuba, Ecuador, Egypt, Ghana,
Holy See, Mexico, Nicaragua, Pakistan, State of Palestine, and
Zimbabwe
 Convention on Conventional Weapons (CCW) Review Conference
in December will decide if they will hold a Group of Governmental
Experts (GGE) meeting on autonomous weapons systems in 2017
Outright ban proponents
 Moral argument: giving robots the agency to kill humans would cross a red line
 Reduce killing decision to cost-benefit
 Deontological self limitation in human killing
decision: empathy
 Duty of human judgment in killing decision, because
justice hinges on human reasoning (Asaro, 2012)
 Slippery slope
 Decrease the threshold of war
 Prospect of a generalization of warfare
 Existential risk
Ban skeptics
 If humanity persists in entering into warfare, which is a reasonable assumption,
the need is to better protect non-combatant lives
 Automation increases human control: “In situations where the machine can perform
the task with greater reliability or precision than a person, this can actually increase human
control over the final outcome. For example in a household thermostat, by delegating the
task of turning on and off heat and air conditioning, humans improve their control over the
outcome: the temperature in the home”.
 Robots are conservative warriors: do not try to protect themselves, in particular in
case of low certainty of identification + judgment not clouded by anger or fear
 Robots will not replace humans: organic assets like dogs, etc.
 “Law as code”: design ex ante constraints into LAWs, much as Watt’s governor
imposed an upper bound on engine RPM (an upper bound on laser wattage, for instance)
 « Against status quo », pro « moratorium » and pro « regulate instead of prohibiting them
entirely » (Arkin, 2016)
 Rent seeking?
 UK position driven by
development of Taranis drone
 Scientists driven by vested research
interest? General Leslie Groves (cited
in Levy, 2006): “What happened is
what I expected, that after they had
this extreme freedom for about six
months their feet began to itch, and
as you know, almost every one of
them has come back into
government research, because it was
just too exciting”
Critical review
 Knee-jerk regulation?
 We already outsource, to specialist
killers that we do not know and over
whom we have little control
 We are faced with a possibly
transitional question, which should not
obscure the possibility of machine-to-
machine war in which zero human
casualties become possible
 Lethal weaponry already exists. LAW
simply makes it accurate: weapons
with 100% success rate (consider
Human Rights Watch’s position on
the use of precision-guided munitions
in urban settings: « a moral
imperative »)
 Counterfactual issue: existing world
is not clean war, but dozens of
hidden war crimes
Outstanding issues (Anderson and Waxman,
2012)
 Empirical skepticism: can we trust technology to design
safeguards?
 Deontological imperative: do we want to take the human « out of
the firing loop »?
 Accountability: who takes the blame (incl. costs) for war crimes?
 Not a yes/no question, but how to regulate?
Default Legal Structure
Laws of war
 Hague Conventions (and regulations) of 1899 and
1907
 Convention (II) with Respect to the Laws and Customs
of War on Land and its annex: Regulations concerning
the Laws and Customs of War on Land
 Mostly about combatants
 Provisions on warfare deemed to “contain rules of
customary international law”
 Article 51 of the UN Charter provides right of
self-defence in case of armed attack
Humanitarian law
 In particular, Convention (IV) relative to the
Protection of Civilian Persons in Time of War.
Geneva, 12 August 1949
 Mostly related to civilians protection
 Protocol Additional (Protocol I), and relating to
the Protection of Victims of International Armed
Conflicts, 8 June 1977
Disarmament law (1)
 Convention on Certain Conventional Weapons
(CCW) and protocols
 Under UN Aegis
 Compliance mechanism
 Since 2013, expert meeting on LAWs
 All prohibitions or restrictions on
the use of specific weapons or
weapon systems
 Protocol I on Non-Detectable
Fragments
 Protocol II on Prohibitions or
Restrictions on the Use of Mines,
Booby Traps and Other Devices
 Protocol III on Prohibitions or
Restrictions on the Use of Incendiary
Weapons, etc.
 Protocol IV on Blinding Laser
Weapons
 Protocol V on Explosive Remnants of
War
Disarmament law (2)
 CCW: “chapeau” convention
with general provisions (1980
with 2001 amendment),
including scope
 Article 1 common to the Geneva
Conventions of 12 August 1949.
 Refers to Article 2 of Geneva
Conventions of 12 August 1949 for the
Protection of War Victims: “cases of
declared war or of any other armed conflict
which may arise between two or more of the
High Contracting Parties, even if the state
of war is not recognized by one of them.
The Convention shall also apply to all
cases of partial or total occupation of the
territory of a High Contracting Party, even
if the said occupation meets with no armed
resistance”
 Amended in 2001 to cover also “armed
conflicts not of an international character
occurring in the territory of one of the High
Contracting Parties”
State of discussion
Ban skeptics
 Those are process, R&D
questions, which ought to be
addressed at the design level
 Not a trial and error question:
“significant national investment into
R&D already undertaken that will
be hard to write off on ethical or
legal grounds; and national prestige
might be in play” (Anderson and
Waxman, 2012)
Ban proponents
 LAWs violate all provisions of
Geneva conventions designed to
protect civilians (HRW
allegation: “robots with complete
autonomy would be incapable of
meeting international humanitarian
law standards”)
 This justifies a new protocol
under CCW to ban all LAWs
#1. Duty of review
 Protocol I, Article 36 – “New Weapons”:
 “In the study, development, acquisition or adoption of a new weapon, means or method
of warfare, a High Contracting Party is under an obligation to determine whether its
employment would, in some or all circumstances, be prohibited by this Protocol or by any
other rule of international law applicable to the High Contracting Party”
 Protocol I, Article 84 – “Rules of Application”
 “The High Contracting Parties shall communicate to one another, as soon as possible,
through the depositary and, as appropriate, through the Protecting Powers, their official
translations of this Protocol, as well as the laws and regulations which they may adopt to
ensure its application”
 See also Protocol I, Article 35 – “Basic rules”:
 “1. In any armed conflict, the right of the Parties to the conflict to choose methods or
means of warfare is not unlimited. 2. It is prohibited to employ weapons, projectiles and
material and methods of warfare of a nature to cause superfluous injury or unnecessary
suffering. 3. It is prohibited to employ methods or means of warfare which are intended,
or may be expected, to cause widespread, long-term and severe damage to the natural
environment”
Discussion
 On producer and customer States
 Conflict of interest?
 Home industry
 Public subsidies to defense R&D
 All signatory States shall apply
 Some States have set up a formal review (e.g., Belgium), others have not
 But the US is not a party to Protocol I;
 Some contend that Article 36 is customary international law
 Components and final products?
HYPO: weaponization scenario
 HRW report, p.31:“a frightened
mother may run after her two children
and yell at them to stop playing with toy
guns near a soldier. A human soldier
could identify with the mother’s fear and
the children’s game and thus recognize
their intentions as harmless, while a
fully autonomous weapon might see only
a person running toward it and two
armed individuals” => Visual
recognition requires a subjective
understanding of intention
 “Legal threshold has always depended in
part upon technology as well as intended
use” (A&W, 2012)
#2. Distinction requirement
 Article 51(4) Protocol n°1:
“Indiscriminate attacks are prohibited.
Indiscriminate attacks are: (a) those which
are not directed at a specific military
objective; (b) those which employ a method
or means of combat which cannot be
directed at a specific military objective; or
(c) those which employ a method or means
of combat the effects of which cannot be
limited as required by this Protocol; and
consequently, in each such case, are of a
nature to strike military objectives and
civilians or civilian objects without
distinction”
 HRW report, p.33: “A fully
autonomous aircraft identifies an
emerging leadership target”
 Problem 1: “if the target were in a city, the
situation would be constantly changing and
thus potentially overwhelming”
 Problem 2: “weigh the anticipated advantages of
attacking the leader”, which may depend
on the political context
 Rules out systems that “aim at other
weapons” + “ethical issue of attaching
weights to the variables at stake”
(A&W, 2012)
#3. Proportionality principle
 Article 51(5) b):
“an attack which may be expected to cause
incidental loss of civilian life, injury to
civilians, damage to civilian objects, or a
combination thereof, which would be
excessive in relation to the concrete and
direct military advantage anticipated”
 Civilian harm shall not outweigh
military benefits
 Ex ante balancing of civilian and
military harm is required
 Krishnan, 2009
 Development of “[t]echnology
can largely affect the calculation
of military necessity”; and
 “Once [autonomous weapons] are
widely introduced, it becomes a
matter of military necessity to use
them, as they could prove far
superior to any other type of
weapon”
 Who decides if political or
military necessity (persuading
the enemy to surrender)
#4. “Military necessity” rule (or defense)
 Customary principle of
humanitarian law
 Lethal force only for the
explicit purpose of defeating
an enemy
 Only to the extent of
winning the war
 Respect other rules of IHL:
No attack on wounded or
surrendering troops
#5. Martens clause
 Article 1(2) of Protocol 1
“In cases not covered by this Protocol or by other international
agreements, civilians and combatants remain under the
protection and authority of the principles of international law
derived from established custom, from the principles of humanity
and from dictates of public conscience”
Reality check?
 Robots may not comply with the default legal structure, but do humans?
 Pentagon Intelligence, Surveillance, and Reconnaissance (ISR) Task
Force: standard for drone strikes is not “no civilian casualties,” only
that it must be a “low” collateral damage estimate
 More at https://theintercept.com/drone-papers/the-assassination-
complex/
Upgrading of Default Legal Structure?
CCW discussions
 Wide attendance
 Discussion is whether autonomous systems are acceptable
 But “neither side has managed to construct a coherent definition for autonomous weapon
systems for the purpose of a weapons ban” (Crootof, 2014)
 Most States believe that autonomous is ok as long as there is “meaningful human
control” (GER: “LAW system without any human control is not in line with our command
and control requirements”)
 Ban supporters:
 Cuba and Ecuador
 Ban opponents:
 British say existing international humanitarian law (IHL) is “the appropriate
paradigm for discussion”, supported by Czechs
 Programming is enough
 Options
 Stationarity requirements
 Only for defensive purposes
 Only non human targets
 Only non lethal measures
 Only in certain areas: high seas v
urban areas
Crootof, 2014
 Supports intentional, proactive
regulation
 “An independent treaty might take
one of three forms: it might attempt
comprehensive regulation (like the
Chemical Weapons Convention—
which, in addition to banning the
development, production,
acquisition, stockpiling, retention,
transfer, and use of certain defined
chemical weapons, also outlines
enforcement mechanisms), provide
piecemeal regulations of specific
activities (like the Nuclear Test Ban
or nonproliferation treaties), or serve
as a framework treaty intended to be
augmented by later protocols (like
the CCW itself). All of these have
associated benefits and drawbacks”
Arkin, 2009
 Ron Arkin has proposed an ethical code, designed to ensure compliance
 « Ethical governor »
 First step: LAW must evaluate the information it senses and determine whether
an attack is prohibited under international humanitarian law and the rules of
engagement
 Second step: LAW must assess the attack under the proportionality test.
According to Arkin, “the robot can fire only if it finds the attack ‘satisfies all ethical
constraints and minimizes collateral damage in relation to the military necessity of the
target’”
 A report by California Polytechnic State University, San Luis Obispo, considers
that robot moral reasoning is insufficient in complex environments
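In schematic form, Arkin's two steps reduce to a hard constraint check (distinction, protected status) followed by a proportionality balance. A deliberately simplistic sketch, not the real architecture; every field name and numeric scale below is invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Target:
    is_military_objective: bool     # distinction: valid military target?
    protected_status: bool          # wounded, surrendering, medical, etc.
    expected_civilian_harm: float   # estimated incidental civilian harm
    military_advantage: float       # anticipated concrete, direct advantage

def ethical_governor(t: Target) -> bool:
    """Return True only if the attack clears both of Arkin's steps."""
    # Step 1: hard IHL / rules-of-engagement constraints.
    if not t.is_military_objective or t.protected_status:
        return False
    # Step 2: proportionality balance: expected civilian harm must not
    # be excessive relative to the anticipated military advantage.
    return t.expected_civilian_harm <= t.military_advantage

print(ethical_governor(Target(True, False, 1.0, 5.0)))   # True: proportionate
print(ethical_governor(Target(True, False, 9.0, 5.0)))   # False: excessive harm
print(ethical_governor(Target(False, False, 0.0, 9.0)))  # False: not a military objective
```

Note how the whole ethical question is pushed into how the two float fields are estimated and weighted, which is precisely the "ethical issue of attaching weights to the variables at stake" flagged earlier.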
 Other approaches?
 Slavery ethics
 Self learning and strong AI (McGinnis): highly desirable, but unattainable
Conclusions
Liability v warfare
Robotic liability
 Social desirability of technology
(almost) unquestioned
 Debate is immunity or not
 Focus on default legal structure,
and possible ex post adjustment
to the law
 National discussion
 Possibly because essentially
discrete harm issues
Robotic warfare
 Social desirability of technology
challenged
 Debate is ban or not
 On all sides, voices calling for
new rules, and ex ante
regulation
 Robo-ethics driven, « law as
code »
 International discussion
 Possibly because of stronger
systemic and existential risk
References
 See generally:
http://www.unog.ch/80256EE600585943/(httpPages)/8FA3C2562A60FF81C12
57CE600393DF6?OpenDocument
 Ronald Arkin, The Case for Banning Killer Robots: Counterpoint,
Communications of the ACM, Vol. 58, No. 12 (2015), pp. 46–47
 Kenneth Anderson and Matthew Waxman, Law and Ethics for Robot Soldiers,
2012
 David Levy, Robots Unlimited, A K Peters, Ltd., 2006
 Michael C. Horowitz & Paul Scharre, Meaningful Human Control in Weapon
Systems: A Primer (Mar. 2015)
 Peter Asaro, On banning autonomous weapon systems, International Review of
the Red Cross, 2012
 Rebecca Crootof, The Killer Robots are Here, Cardozo Law Review, 2015
 Markus Wagner, “Taking Humans Out of the Loop: Implications for
International Humanitarian Law,” 21 Journal of Law, Information and Science
(2011)
2. Market conduct
Competition policy
Goals
 Allocative efficiency
 Productive efficiency
 Dynamic efficiency
Tools
 Prohibition of collusion
 Prohibition of abuse of
dominance
 Prohibition of mergers to
monopoly, and others
 Israel Antitrust Authority (IAA)
Perfect competition, 3.0?
 Increased transparency
 Lower search costs: PCWs and aggregators
 Entry and Expansion
 Platforms as enablers, and the midget disruptors
 Demotion of brick and mortar behemoths
 AMZN v Walmart
 AMZN v GAP
 AMZN v Publishers
 Matching supply and demand
 Sharing economy, and underutilized assets
 The long tail
 Predominance of data-
hungry business models
 Search for data advantage
 Offline players join the
fray, and search for smart
pricing algorithms
 Use of personal assistants
to make decisions for us
Emergence
 Dynamic pricing
 Use of pricing algorithms
(Lawrence’s book The Making of
a Fly, priced at $23,698,655.93)
 Personalized pricing
 Octo’s insurance quotes
based on drivers’ behavior
 Data explosion
 Cloud computing
 IoT
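The Making of a Fly incident came from a two-line dynamic: two booksellers' bots each repriced against the other with a fixed multiplier and no sanity cap. A minimal simulation; the multipliers are those reported in accounts of the 2011 incident, and the rest is an illustrative sketch:

```python
# Toy re-creation of the "Making of a Fly" pricing spiral: seller A
# prices just below seller B, then B applies a fixed markup over A.
def price_spiral(p_a, p_b, days):
    """Run the two repricing rules for the given number of cycles."""
    for _ in range(days):
        p_a = 0.9983 * p_b       # A undercuts B slightly
        p_b = 1.270589 * p_a     # B applies a fixed markup over A
    return p_a, p_b

a, b = price_spiral(10.0, 10.0, 45)
print(f"A: ${a:,.2f}  B: ${b:,.2f}")  # both prices explode upward
```

Because the product of the two multipliers exceeds 1, prices grow without bound until a human notices: no collusion, no intent, just two unbounded feedback rules.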
Ezrachi and Stucke, 2016
 « Façade of competition »
 Cost of free: « Data as
Currency »
 From invisible hand, to
« digitized hand »
Collusion
Easy cases
 « Messenger scenario »: rival
executives collude, and defer to
their algorithms to calculate,
implement and police the cartel
 Evidence of horizontal agreement
+ liability: easy
 « Hub and Spoke »: rivals do not
interact, but outsource the pricing
decision to an upstream supplier’s
algorithm
 Boomerang Commerce
 Uber
 Evidence of vertical agreements,
parallel conduct, and cumulative
effect + liability: quite easy
Tough cases
 « Predictable agent »
 All firms in the industry use the same
pricing algorithm
 Used to monitor each other’s list
prices, and increase when
sustainable
 Instant detection => conscious
parallelism
 « God view and the Digital Eye »
 Each firm can see entire economy
on giant screen
 Algorithm not programmed to
increase prices, just a profit
maximizer
 Tacit collusion on steroids
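The « predictable agent » dynamic can be sketched as a toy repeated game: every firm runs the same public pricing rule, and undercutting is detected instantly and punished by reversion to competitive pricing, so prices ratchet up in lockstep without any agreement. The probe increment and the competitive price level are illustrative assumptions:

```python
# Hypothetical "predictable agent" sketch: identical public rules plus
# instant detection of undercutting yield tacit collusion.
def step(prices, probe=0.5, competitive=10.0):
    if min(prices) < max(prices):            # someone undercut last period
        return [competitive] * len(prices)   # instant punishment: price war
    return [prices[0] + probe] * len(prices) # all aligned: ratchet upward

prices = [10.0, 10.0, 10.0]
for _ in range(20):
    prices = step(prices)
print(prices)  # supra-competitive prices, with no agreement ever made
```

Because any defection is punished before it can pay off, the individually rational move is to follow the ratchet: conscious parallelism with nothing for an agency to prosecute.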
 Almost perfect behavioral
discrimination
 Groups of customers
 Decoys: AAPL Watch, $349 to
$17,000
 Price steering
 Drip pricing
 Complexity
 Imperfect willpower
Behavioral discrimination
 Perfect price discrimination
 Geographic, demographic,
and other situational
information
 Prices, coupons and
vouchers,
 Target scandal
 Cherry pickers avoidance
Frenemies
 Superplatforms-superplatforms: friends and foes
 GOOG v AMZN v FB v AAPL v MSFT
 GOOG Android supports 90% of AAPL’s APIs
 Superplatforms v Independent Apps
 Uber v GOOG and AAPL?
 Superplatforms with Independent Apps
 Extraction => cooperation during cookie and data identification tech
placement
 Capture => uneven cut, GOOG 32%
 Superplatforms with and v Independent Apps
 Brightest Flashlight Android app
 Disconnect
 Personal assistants
Remedies
 UK style market investigations
 Putting a price on free
 Privacy by default remedies
 Possible regulation, beyond antitrust
Bottom lines for competition law
 Some strategies don’t raise market failures in the antitrust sense
 Personal assistants
 Some generate classic problems for antitrust, nothing new under the sun
 Frenemies
 Some invite thinking on goals of competition law
 Behavioral discrimination?
 Some invite thinking on gaps in competition law
 Predictable Agent and God View
 Behavioral discrimination
 Some may create enforcement difficulties
 Tacit collusion: no liability on algorithm
 Detection problem
 Competition engineers
 Antitrust Hacker
 Antitrust Standardizer
 Antitrust « Digital Half »
 Antitrust Shamer
Reinventing enforcement agencies?
 Competition doctors
 Standard mission of agencies
is to remove antitrust
infringements from markets
 Deterrence, specific and
general: carried out ex post
with fines
 Remediation for the future
 Behavioral and structural
remedies
#1: Antitrust Hacker
Scenario
 Agencies to build programs and
give away software that counteracts
virtual competition
 Agency to cooperate with
computer scientists that build
software so as to technologically
undermine effectiveness of
abovementioned strategies
 Software then made widely
available to customers and rivals
willing to avail themselves of
competitive options
 Prospects for business and tech
communities
 Interface with consumer agencies
Applications
 Anti-decoy filters that eliminate
false options
 Additive data perturbation
software => runs in the background of
users’ sessions and visits
random websites => noise
 Anti-steering filters
 Policy checking privacy
enhancing tools
 Anticomplexity software
 Clearware.org refines consent
content and presents it in a more
human-readable format
 Same with pricing?
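The additive-data-perturbation idea fits in a few lines: interleave a user's real browsing signal with random decoy visits so that behavioral profiling degrades. The topic names and noise ratio below are invented for illustration:

```python
# Minimal sketch of "additive data perturbation": bury real visits
# among random decoy visits so a tracker's inferences get noisy.
import random

DECOY_POOL = ["news", "sports", "gardening", "finance", "travel", "cooking"]

def perturb(real_visits, noise_ratio=1.0, rng=random):
    """Return the real visit stream mixed with random decoy visits."""
    n_decoys = int(len(real_visits) * noise_ratio)
    mixed = list(real_visits) + [rng.choice(DECOY_POOL) for _ in range(n_decoys)]
    rng.shuffle(mixed)
    return mixed

profile = perturb(["baby-products"] * 5, noise_ratio=2.0, rng=random.Random(0))
print(profile)  # 5 real visits hidden among 10 random ones
```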
 De facto standards are most
likely dominant platform
operators
 Impose on gateway players to
make it possible for users to
define their own level of
acceptance for all new software
 “Users’ browsers only accept
interaction with Web servers
whose privacy-preferences
correspond to their own local
preferences” (Boldt, 2007)
 Problem of disconnect between
dominance (platform) and
abuse (spyware firm)
#2: Antitrust Standardizer
 Agencies to promote ex ante
specification of
 Privacy non-intensive pricing
algorithms
 Privacy Enhancing Technologies
(identity verification with
minimum identity disclosure,
etc.)
 Antitrust compliance in AI:
individually rational v socially
harmful (Dolmans) + Article 22
GDPR
 Advocate introduction of
antitrust « standards » with
Standard Setting Organizations
and/or de facto standards
 IETF, IEEE-SA, ISO, etc.
 « Dominant » platforms: OS,
handsets, browsers and search
engines
#3: Antitrust Shamer
 Instant antitrust popup that warns of systematic
behavioral discrimination on website
 Instant antitrust popup that suggests user to disconnect
or use alternative browser (Tor)
 Permanent and updated antitrust list of privacy-intensive
websites
#4: Antitrust « Digital Half »
Scenario
 « Digital half » of the competition
agency (P. Domingos)
 Hidden, anonymous or
pseudonymous
 Tacit collusion: stealth
remediation, agency acts as a
maverick, posting low prices to trigger
a price war
 Behavioral discrimination: agency
monitors customers on platforms
and instantly informs high price
customers that other low price
customers pay less
Discussion
 Pros: possibility to catch
infringements « red-handed »
 Cons: due process? Not possible
to remedy without
infringement, simply monitor
 But interim measures? Article 8
R1/2003
 Yet, interim measures on
infringing firms
Challenges
Conceptual
 Privacy as an antitrust problem:
quality competition?
 Privacy as a market failure:
mistrust causes deadweight loss?
 Government as spy in market?
Instrumental
 Change the law on behavioral
discrimination (US v EU)?
 Change the law on tacit
collusion (US&EU)?
 Remedy without a cause (no
infringement)?
 Cat-and-mouse game where the
market is always ahead of the agency?
3. Alternative Consequentialist Framework
Proposed framework
Public interest, beyond market failures
 Utilitarian: externality
 Negative: discrete or systemic
 Positive: discrete or systemic
 Existential: « existernality »
 Negative: Terminator
 Positive: La formule de dieu
Framework
 Discrete externality (personal; random; rare; endurable)
 Negative: harm
 Positive: benefits
 Systemic externality (local; predictable; frequent; unsustainable)
 Negative
 Substitution effect
 Privacy
 Positive
 Complementarity effect
Generative or General Purpose technologies (Zittrain, Bresnahan)
 « Existernality » (global; fat tail; terminal)
 Negative: existential risk
 Positive: pure human enhancement
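The three categories can be read as a decision rule over the attribute triplets listed above (scope, frequency, severity). A schematic classifier; the string encoding and precedence order are assumptions made for illustration:

```python
# Schematic decision rule over the externality taxonomy's attributes.
def classify_externality(scope, frequency, severity):
    """scope: 'personal' | 'local' | 'global'
    frequency: 'rare' | 'frequent' | 'fat-tail'
    severity: 'endurable' | 'unsustainable' | 'terminal'"""
    if scope == "global" or severity == "terminal":
        return "existernality"          # e.g. existential risk from LAWs
    if scope == "local" and (frequency == "frequent" or severity == "unsustainable"):
        return "systemic externality"   # e.g. substitution effect, privacy
    return "discrete externality"       # e.g. a single drone crash

print(classify_externality("personal", "rare", "endurable"))       # discrete externality
print(classify_externality("local", "frequent", "unsustainable"))  # systemic externality
print(classify_externality("global", "fat-tail", "terminal"))      # existernality
```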
Illustration of the framework (Drones)
 Discrete externality
 A drone crashes on the roof of a house while making a delivery
 Transports an explosive product
 Burns the house
 Systemic externality
 A drone operated delivery system puts employment in the mail industry at risk
 Existential threat
 Drone designed for war
Discrete externalities
Litigation
 Basic legal infrastructure and
case-by-case resolution
 Decisional experimentation,
with fitting exercise
Regulation
 Experimentation
 “Tokku” Special Zone for
Robotics Empirical Testing and
Development (RT special zone)
in open environments => test of
human-robot interfaces in limited
areas => companies entitled to a
less strict legal standard
http://www.economist.com/blog
s/banyan/2014/03/economic-
zones-japan
 Regulatory emulation as States
liberalize driverless cars
Systemic externalities
 If discrete externalities become widespread or harmful
 Scope for new regulation?
 Negative externalities
• Tax on robotic-intensive industries
• Private entitlement of rights: laws on privacy
• Safety standards to solve collective action problem
• Mandatory insurance or electronic personhood for robots (Leroux et al.)
 Positive externalities
• Subsidies for public goods issues: building of controlled environment
infrastructures for driverless cars
• Proactive IPR policy for innovation into robotics technologies
• Immunity from liability for research on certain systemic applications?
GARA precedent
Existernalities
 Calls for legal bans on specific applications
 UN Campaign to stop killer robots:
https://www.stopkillerrobots.org/category/un/
 Technical resolution of issues
 Philosophers: « Creating Friendly AI 1.0 » (Yudkowsky, 2001)
 Technologists: Keeping open and competitive technology,
https://openai.com/blog/introducing-openai/
Liege Competition and Innovation Institute (LCII)
University of Liege (ULg)
Quartier Agora | Place des Orateurs, 1, Bât. B 33, 4000 Liege, BELGIUM
Thank you
Estrategias competitivas básicasEstrategias competitivas básicas
Estrategias competitivas básicasLarryJimenez
 
1721 mercadeo -_ventas_y_servicio_al_cliente
1721 mercadeo -_ventas_y_servicio_al_cliente1721 mercadeo -_ventas_y_servicio_al_cliente
1721 mercadeo -_ventas_y_servicio_al_clienteYerika Marcela Rendon
 
Unidad Didáctica: Los sectores ecónomicos
Unidad Didáctica: Los sectores ecónomicosUnidad Didáctica: Los sectores ecónomicos
Unidad Didáctica: Los sectores ecónomicosmarina valverde
 
Informe mantenimiento mecanico
Informe mantenimiento mecanicoInforme mantenimiento mecanico
Informe mantenimiento mecanicoJDPVasquez
 
Segundo Paquete Económico 2017 Zacatecas - Egresos (3-9)
Segundo Paquete Económico 2017 Zacatecas - Egresos (3-9)Segundo Paquete Económico 2017 Zacatecas - Egresos (3-9)
Segundo Paquete Económico 2017 Zacatecas - Egresos (3-9)Zacatecas TresPuntoCero
 
Proyectos_de_innovacion
Proyectos_de_innovacionProyectos_de_innovacion
Proyectos_de_innovacionWebMD
 
Maltrato infantil ong_paicabi
Maltrato infantil ong_paicabiMaltrato infantil ong_paicabi
Maltrato infantil ong_paicabikarely de la o
 

En vedette (20)

Marco del buen desempeño docente
Marco del buen desempeño docenteMarco del buen desempeño docente
Marco del buen desempeño docente
 
Primer Paquete Económico 2017 Zacatecas (2/9)
Primer Paquete Económico 2017 Zacatecas (2/9)Primer Paquete Económico 2017 Zacatecas (2/9)
Primer Paquete Económico 2017 Zacatecas (2/9)
 
JULIOPARI - Elaborando un Plan de Negocios
JULIOPARI - Elaborando un Plan de NegociosJULIOPARI - Elaborando un Plan de Negocios
JULIOPARI - Elaborando un Plan de Negocios
 
PMP Sonora Saludable 2010 2015
PMP Sonora Saludable 2010   2015  PMP Sonora Saludable 2010   2015
PMP Sonora Saludable 2010 2015
 
El emprendedor y el empresario profesional cert
El emprendedor y el empresario profesional certEl emprendedor y el empresario profesional cert
El emprendedor y el empresario profesional cert
 
INVESTIGACIÓN DEPRESION EN ADOLESCENTES
INVESTIGACIÓN DEPRESION EN ADOLESCENTESINVESTIGACIÓN DEPRESION EN ADOLESCENTES
INVESTIGACIÓN DEPRESION EN ADOLESCENTES
 
Como hacer un plan de negocios
Como hacer un plan de negociosComo hacer un plan de negocios
Como hacer un plan de negocios
 
Estrategias competitivas básicas
Estrategias competitivas básicasEstrategias competitivas básicas
Estrategias competitivas básicas
 
Cápsula 1. estudios de mercado
Cápsula 1. estudios de mercadoCápsula 1. estudios de mercado
Cápsula 1. estudios de mercado
 
Rodriguez alvarez
Rodriguez alvarezRodriguez alvarez
Rodriguez alvarez
 
Fichero de actividades
Fichero de actividadesFichero de actividades
Fichero de actividades
 
1721 mercadeo -_ventas_y_servicio_al_cliente
1721 mercadeo -_ventas_y_servicio_al_cliente1721 mercadeo -_ventas_y_servicio_al_cliente
1721 mercadeo -_ventas_y_servicio_al_cliente
 
Libro de-mantenimiento-industrial
Libro de-mantenimiento-industrialLibro de-mantenimiento-industrial
Libro de-mantenimiento-industrial
 
Modulo7gestion
Modulo7gestionModulo7gestion
Modulo7gestion
 
Unidad Didáctica: Los sectores ecónomicos
Unidad Didáctica: Los sectores ecónomicosUnidad Didáctica: Los sectores ecónomicos
Unidad Didáctica: Los sectores ecónomicos
 
Informe mantenimiento mecanico
Informe mantenimiento mecanicoInforme mantenimiento mecanico
Informe mantenimiento mecanico
 
Segundo Paquete Económico 2017 Zacatecas - Egresos (3-9)
Segundo Paquete Económico 2017 Zacatecas - Egresos (3-9)Segundo Paquete Económico 2017 Zacatecas - Egresos (3-9)
Segundo Paquete Económico 2017 Zacatecas - Egresos (3-9)
 
Proyectos_de_innovacion
Proyectos_de_innovacionProyectos_de_innovacion
Proyectos_de_innovacion
 
Maltrato infantil ong_paicabi
Maltrato infantil ong_paicabiMaltrato infantil ong_paicabi
Maltrato infantil ong_paicabi
 
Tears In The Rain
Tears In The RainTears In The Rain
Tears In The Rain
 

Similaire à Full Course - Law and Regulation of Machine Intelligence - Bar Ilan University 2016

9694 thinking skills ai
9694 thinking skills ai9694 thinking skills ai
9694 thinking skills aimayorgam
 
Robotic car seminar report
Robotic car seminar reportRobotic car seminar report
Robotic car seminar reportVvs Pradeep
 
9694 thinking skills ai rev qr
9694 thinking skills ai rev qr9694 thinking skills ai rev qr
9694 thinking skills ai rev qrmayorgam
 
Ai and robotics: Past, Present and Future
Ai and robotics: Past, Present and FutureAi and robotics: Past, Present and Future
Ai and robotics: Past, Present and FutureHongmei He
 
AI & robotics: Past, Present and Future
AI & robotics: Past, Present and FutureAI & robotics: Past, Present and Future
AI & robotics: Past, Present and FutureHongmei He
 
Artificial Intelligence Today (22 June 2017)
Artificial Intelligence Today (22 June 2017)Artificial Intelligence Today (22 June 2017)
Artificial Intelligence Today (22 June 2017)Sabri Sansoy
 
Training and Simulation in a More Autonomous and Robotic Future
Training and Simulation in a More Autonomous and Robotic FutureTraining and Simulation in a More Autonomous and Robotic Future
Training and Simulation in a More Autonomous and Robotic FutureAndy Fawkes
 
Presentation on Artificial Intelligence
Presentation on Artificial IntelligencePresentation on Artificial Intelligence
Presentation on Artificial IntelligenceIshwar Bulbule
 
AI PROJECT .... (1).pptx
AI PROJECT .... (1).pptxAI PROJECT .... (1).pptx
AI PROJECT .... (1).pptxVishnuDubey14
 
artificialintelligence-ppt-160506100840.pdf
artificialintelligence-ppt-160506100840.pdfartificialintelligence-ppt-160506100840.pdf
artificialintelligence-ppt-160506100840.pdfTejasRH
 
Ethics for the machines altitude software
Ethics for the machines   altitude softwareEthics for the machines   altitude software
Ethics for the machines altitude softwareAltitude Software
 
Artificial intelligence 13cs041
Artificial intelligence 13cs041Artificial intelligence 13cs041
Artificial intelligence 13cs041Manik Behl
 
Ivan Horodyskyy “Legal limits of the AI software development: what should be ...
Ivan Horodyskyy “Legal limits of the AI software development: what should be ...Ivan Horodyskyy “Legal limits of the AI software development: what should be ...
Ivan Horodyskyy “Legal limits of the AI software development: what should be ...Lviv Startup Club
 
Smart Machines -presentation, Dec 2014
Smart Machines -presentation, Dec 2014Smart Machines -presentation, Dec 2014
Smart Machines -presentation, Dec 2014Immo Salo
 
ARTIFICIAL intelligence ppt presentation
ARTIFICIAL intelligence ppt presentationARTIFICIAL intelligence ppt presentation
ARTIFICIAL intelligence ppt presentationGuruPrakash763066
 
Creating an Effective Human-Machine Partnership
Creating an Effective Human-Machine PartnershipCreating an Effective Human-Machine Partnership
Creating an Effective Human-Machine PartnershipAndy Fawkes
 
Selected topics in Computer Science
Selected topics in Computer Science Selected topics in Computer Science
Selected topics in Computer Science Melaku Bayih Demessie
 
Social Impacts of Artificial intelligence
Social Impacts of Artificial intelligenceSocial Impacts of Artificial intelligence
Social Impacts of Artificial intelligenceSaqib Raza
 

Similaire à Full Course - Law and Regulation of Machine Intelligence - Bar Ilan University 2016 (20)

9694 thinking skills ai
9694 thinking skills ai9694 thinking skills ai
9694 thinking skills ai
 
Robotic car seminar report
Robotic car seminar reportRobotic car seminar report
Robotic car seminar report
 
9694 thinking skills ai rev qr
9694 thinking skills ai rev qr9694 thinking skills ai rev qr
9694 thinking skills ai rev qr
 
Ai and robotics: Past, Present and Future
Ai and robotics: Past, Present and FutureAi and robotics: Past, Present and Future
Ai and robotics: Past, Present and Future
 
AI & robotics: Past, Present and Future
AI & robotics: Past, Present and FutureAI & robotics: Past, Present and Future
AI & robotics: Past, Present and Future
 
Artificial Intelligence Today (22 June 2017)
Artificial Intelligence Today (22 June 2017)Artificial Intelligence Today (22 June 2017)
Artificial Intelligence Today (22 June 2017)
 
EtikaUI.pptx
EtikaUI.pptxEtikaUI.pptx
EtikaUI.pptx
 
Training and Simulation in a More Autonomous and Robotic Future
Training and Simulation in a More Autonomous and Robotic FutureTraining and Simulation in a More Autonomous and Robotic Future
Training and Simulation in a More Autonomous and Robotic Future
 
Presentation on Artificial Intelligence
Presentation on Artificial IntelligencePresentation on Artificial Intelligence
Presentation on Artificial Intelligence
 
AI PROJECT .... (1).pptx
AI PROJECT .... (1).pptxAI PROJECT .... (1).pptx
AI PROJECT .... (1).pptx
 
artificialintelligence-ppt-160506100840.pdf
artificialintelligence-ppt-160506100840.pdfartificialintelligence-ppt-160506100840.pdf
artificialintelligence-ppt-160506100840.pdf
 
Ethics for the machines altitude software
Ethics for the machines   altitude softwareEthics for the machines   altitude software
Ethics for the machines altitude software
 
Artificial intelligence ppt
Artificial intelligence   pptArtificial intelligence   ppt
Artificial intelligence ppt
 
Artificial intelligence 13cs041
Artificial intelligence 13cs041Artificial intelligence 13cs041
Artificial intelligence 13cs041
 
Ivan Horodyskyy “Legal limits of the AI software development: what should be ...
Ivan Horodyskyy “Legal limits of the AI software development: what should be ...Ivan Horodyskyy “Legal limits of the AI software development: what should be ...
Ivan Horodyskyy “Legal limits of the AI software development: what should be ...
 
Smart Machines -presentation, Dec 2014
Smart Machines -presentation, Dec 2014Smart Machines -presentation, Dec 2014
Smart Machines -presentation, Dec 2014
 
ARTIFICIAL intelligence ppt presentation
ARTIFICIAL intelligence ppt presentationARTIFICIAL intelligence ppt presentation
ARTIFICIAL intelligence ppt presentation
 
Creating an Effective Human-Machine Partnership
Creating an Effective Human-Machine PartnershipCreating an Effective Human-Machine Partnership
Creating an Effective Human-Machine Partnership
 
Selected topics in Computer Science
Selected topics in Computer Science Selected topics in Computer Science
Selected topics in Computer Science
 
Social Impacts of Artificial intelligence
Social Impacts of Artificial intelligenceSocial Impacts of Artificial intelligence
Social Impacts of Artificial intelligence
 

Plus de Nicolas Petit

A Review of Competition Policy for the Digital Era (Cremer et al Report)
A Review of Competition Policy for the Digital Era  (Cremer et al Report)A Review of Competition Policy for the Digital Era  (Cremer et al Report)
A Review of Competition Policy for the Digital Era (Cremer et al Report)Nicolas Petit
 
Nicolas Petit 27 september 18 - Hard Questions of Law and AI
Nicolas Petit 27 september 18 - Hard Questions of Law and AINicolas Petit 27 september 18 - Hard Questions of Law and AI
Nicolas Petit 27 september 18 - Hard Questions of Law and AINicolas Petit
 
On Binding Effect of Guidance Paper and AEC test
On Binding Effect of Guidance Paper and AEC test On Binding Effect of Guidance Paper and AEC test
On Binding Effect of Guidance Paper and AEC test Nicolas Petit
 
Antitrust claims in a standards context - ASPI APEB LES - Paris 2016
Antitrust claims in a standards context - ASPI APEB LES - Paris 2016Antitrust claims in a standards context - ASPI APEB LES - Paris 2016
Antitrust claims in a standards context - ASPI APEB LES - Paris 2016Nicolas Petit
 
Law of robots and AIs - Future of Lawyers - Lecture 3
Law of robots and AIs - Future of Lawyers - Lecture 3Law of robots and AIs - Future of Lawyers - Lecture 3
Law of robots and AIs - Future of Lawyers - Lecture 3Nicolas Petit
 
Introduction to the law of robots and AIs - lecture 3 - The Future of Lawyers?
Introduction to the law of robots and AIs - lecture 3 - The Future of Lawyers?Introduction to the law of robots and AIs - lecture 3 - The Future of Lawyers?
Introduction to the law of robots and AIs - lecture 3 - The Future of Lawyers?Nicolas Petit
 
IBC Slides Petit - Abuse of Dominance - Huawei and Post Danmark
IBC Slides Petit  - Abuse of Dominance - Huawei and Post DanmarkIBC Slides Petit  - Abuse of Dominance - Huawei and Post Danmark
IBC Slides Petit - Abuse of Dominance - Huawei and Post DanmarkNicolas Petit
 
Competition law and the Unified Patent Court - Slides 25 09 15
Competition law and the Unified Patent Court - Slides 25 09 15Competition law and the Unified Patent Court - Slides 25 09 15
Competition law and the Unified Patent Court - Slides 25 09 15Nicolas Petit
 
Loi macron - Frontières de l'autorité de la concurrence - N Petit
Loi macron - Frontières de l'autorité de la concurrence - N PetitLoi macron - Frontières de l'autorité de la concurrence - N Petit
Loi macron - Frontières de l'autorité de la concurrence - N PetitNicolas Petit
 
Problem practices in Competition Law - Presentation to CMA Academy
Problem practices in Competition Law -  Presentation to CMA AcademyProblem practices in Competition Law -  Presentation to CMA Academy
Problem practices in Competition Law - Presentation to CMA AcademyNicolas Petit
 
Patent hold up and the antitrustization of frand - A multi-sided reappraisal ...
Patent hold up and the antitrustization of frand - A multi-sided reappraisal ...Patent hold up and the antitrustization of frand - A multi-sided reappraisal ...
Patent hold up and the antitrustization of frand - A multi-sided reappraisal ...Nicolas Petit
 

Plus de Nicolas Petit (11)

A Review of Competition Policy for the Digital Era (Cremer et al Report)
A Review of Competition Policy for the Digital Era  (Cremer et al Report)A Review of Competition Policy for the Digital Era  (Cremer et al Report)
A Review of Competition Policy for the Digital Era (Cremer et al Report)
 
Nicolas Petit 27 september 18 - Hard Questions of Law and AI
Nicolas Petit 27 september 18 - Hard Questions of Law and AINicolas Petit 27 september 18 - Hard Questions of Law and AI
Nicolas Petit 27 september 18 - Hard Questions of Law and AI
 
On Binding Effect of Guidance Paper and AEC test
On Binding Effect of Guidance Paper and AEC test On Binding Effect of Guidance Paper and AEC test
On Binding Effect of Guidance Paper and AEC test
 
Antitrust claims in a standards context - ASPI APEB LES - Paris 2016
Antitrust claims in a standards context - ASPI APEB LES - Paris 2016Antitrust claims in a standards context - ASPI APEB LES - Paris 2016
Antitrust claims in a standards context - ASPI APEB LES - Paris 2016
 
Law of robots and AIs - Future of Lawyers - Lecture 3
Law of robots and AIs - Future of Lawyers - Lecture 3Law of robots and AIs - Future of Lawyers - Lecture 3
Law of robots and AIs - Future of Lawyers - Lecture 3
 
Introduction to the law of robots and AIs - lecture 3 - The Future of Lawyers?
Introduction to the law of robots and AIs - lecture 3 - The Future of Lawyers?Introduction to the law of robots and AIs - lecture 3 - The Future of Lawyers?
Introduction to the law of robots and AIs - lecture 3 - The Future of Lawyers?
 
IBC Slides Petit - Abuse of Dominance - Huawei and Post Danmark
IBC Slides Petit  - Abuse of Dominance - Huawei and Post DanmarkIBC Slides Petit  - Abuse of Dominance - Huawei and Post Danmark
IBC Slides Petit - Abuse of Dominance - Huawei and Post Danmark
 
Competition law and the Unified Patent Court - Slides 25 09 15
Competition law and the Unified Patent Court - Slides 25 09 15Competition law and the Unified Patent Court - Slides 25 09 15
Competition law and the Unified Patent Court - Slides 25 09 15
 
Loi macron - Frontières de l'autorité de la concurrence - N Petit
Loi macron - Frontières de l'autorité de la concurrence - N PetitLoi macron - Frontières de l'autorité de la concurrence - N Petit
Loi macron - Frontières de l'autorité de la concurrence - N Petit
 
Problem practices in Competition Law - Presentation to CMA Academy
Problem practices in Competition Law -  Presentation to CMA AcademyProblem practices in Competition Law -  Presentation to CMA Academy
Problem practices in Competition Law - Presentation to CMA Academy
 
Patent hold up and the antitrustization of frand - A multi-sided reappraisal ...
Patent hold up and the antitrustization of frand - A multi-sided reappraisal ...Patent hold up and the antitrustization of frand - A multi-sided reappraisal ...
Patent hold up and the antitrustization of frand - A multi-sided reappraisal ...
 

Dernier

Alexis OConnell mugshot Lexileeyogi 512-840-8791
Alexis OConnell mugshot Lexileeyogi 512-840-8791Alexis OConnell mugshot Lexileeyogi 512-840-8791
Alexis OConnell mugshot Lexileeyogi 512-840-8791BlayneRush1
 
citizenship in the Philippines as to the laws applicable
citizenship in the Philippines as to the laws applicablecitizenship in the Philippines as to the laws applicable
citizenship in the Philippines as to the laws applicableSaraSantiago44
 
昆士兰科技大学毕业证学位证成绩单-补办步骤澳洲毕业证书
昆士兰科技大学毕业证学位证成绩单-补办步骤澳洲毕业证书昆士兰科技大学毕业证学位证成绩单-补办步骤澳洲毕业证书
昆士兰科技大学毕业证学位证成绩单-补办步骤澳洲毕业证书1k98h0e1
 
Are There Any Alternatives To Jail Time For Sex Crime Convictions in Los Angeles
Are There Any Alternatives To Jail Time For Sex Crime Convictions in Los AngelesAre There Any Alternatives To Jail Time For Sex Crime Convictions in Los Angeles
Are There Any Alternatives To Jail Time For Sex Crime Convictions in Los AngelesChesley Lawyer
 
Alexis O'Connell Arrest Records Houston Texas lexileeyogi
Alexis O'Connell Arrest Records Houston Texas lexileeyogiAlexis O'Connell Arrest Records Houston Texas lexileeyogi
Alexis O'Connell Arrest Records Houston Texas lexileeyogiBlayneRush1
 
PPT Template - Federal Law Enforcement Training Center
PPT Template - Federal Law Enforcement Training CenterPPT Template - Federal Law Enforcement Training Center
PPT Template - Federal Law Enforcement Training Centerejlfernandez22
 
Illinois Department Of Corrections reentry guide
Illinois Department Of Corrections reentry guideIllinois Department Of Corrections reentry guide
Illinois Department Of Corrections reentry guideillinoisworknet11
 
Guide for Drug Education and Vice Control.docx
Guide for Drug Education and Vice Control.docxGuide for Drug Education and Vice Control.docx
Guide for Drug Education and Vice Control.docxjennysansano2
 
Conditions Restricting Transfer Under TPA,1882
Conditions Restricting Transfer Under TPA,1882Conditions Restricting Transfer Under TPA,1882
Conditions Restricting Transfer Under TPA,18822020000445musaib
 
Good Governance Practices for protection of Human Rights (Discuss Transparen...
Good Governance Practices for protection  of Human Rights (Discuss Transparen...Good Governance Practices for protection  of Human Rights (Discuss Transparen...
Good Governance Practices for protection of Human Rights (Discuss Transparen...shubhuc963
 
Analysis on Law of Domicile under Private International laws.
Analysis on Law of Domicile under Private International laws.Analysis on Law of Domicile under Private International laws.
Analysis on Law of Domicile under Private International laws.2020000445musaib
 
Comparison of GenAI benchmarking models for legal use cases
Comparison of GenAI benchmarking models for legal use casesComparison of GenAI benchmarking models for legal use cases
Comparison of GenAI benchmarking models for legal use casesritwikv20
 
Law360 - How Duty Of Candor Figures In USPTO AI Ethics Guidance
Law360 - How Duty Of Candor Figures In USPTO AI Ethics GuidanceLaw360 - How Duty Of Candor Figures In USPTO AI Ethics Guidance
Law360 - How Duty Of Candor Figures In USPTO AI Ethics GuidanceMichael Cicero
 
Special Accounting Areas - Hire purchase agreement
Special Accounting Areas - Hire purchase agreementSpecial Accounting Areas - Hire purchase agreement
Special Accounting Areas - Hire purchase agreementShubhiSharma858417
 
Alexis O'Connell lexileeyogi Bond revocation for drug arrest Alexis Lee
Alexis O'Connell lexileeyogi Bond revocation for drug arrest Alexis LeeAlexis O'Connell lexileeyogi Bond revocation for drug arrest Alexis Lee
Alexis O'Connell lexileeyogi Bond revocation for drug arrest Alexis LeeBlayneRush1
 
Presentation1.pptx on sedition is a good legal point
Presentation1.pptx on sedition is a good legal pointPresentation1.pptx on sedition is a good legal point
Presentation1.pptx on sedition is a good legal pointMohdYousuf40
 
Understanding Cyber Crime Litigation: Key Concepts and Legal Frameworks
Understanding Cyber Crime Litigation: Key Concepts and Legal FrameworksUnderstanding Cyber Crime Litigation: Key Concepts and Legal Frameworks
Understanding Cyber Crime Litigation: Key Concepts and Legal FrameworksFinlaw Associates
 
Alexis O'Connell Lexileeyogi 512-840-8791
Alexis O'Connell Lexileeyogi 512-840-8791Alexis O'Connell Lexileeyogi 512-840-8791
Alexis O'Connell Lexileeyogi 512-840-8791BlayneRush1
 
Wurz Financial - Wealth Counsel to Law Firm Owners Services Guide.pdf
Wurz Financial - Wealth Counsel to Law Firm Owners Services Guide.pdfWurz Financial - Wealth Counsel to Law Firm Owners Services Guide.pdf
Wurz Financial - Wealth Counsel to Law Firm Owners Services Guide.pdfssuser3e15612
 
Grey Area of the Information Technology Act, 2000.pptx
Grey Area of the Information Technology Act, 2000.pptxGrey Area of the Information Technology Act, 2000.pptx
Grey Area of the Information Technology Act, 2000.pptxBharatMunjal4
 

Dernier (20)

Alexis OConnell mugshot Lexileeyogi 512-840-8791
Alexis OConnell mugshot Lexileeyogi 512-840-8791Alexis OConnell mugshot Lexileeyogi 512-840-8791
Alexis OConnell mugshot Lexileeyogi 512-840-8791
 
citizenship in the Philippines as to the laws applicable
citizenship in the Philippines as to the laws applicablecitizenship in the Philippines as to the laws applicable
citizenship in the Philippines as to the laws applicable
 
昆士兰科技大学毕业证学位证成绩单-补办步骤澳洲毕业证书
昆士兰科技大学毕业证学位证成绩单-补办步骤澳洲毕业证书昆士兰科技大学毕业证学位证成绩单-补办步骤澳洲毕业证书
昆士兰科技大学毕业证学位证成绩单-补办步骤澳洲毕业证书
 
Are There Any Alternatives To Jail Time For Sex Crime Convictions in Los Angeles
Are There Any Alternatives To Jail Time For Sex Crime Convictions in Los AngelesAre There Any Alternatives To Jail Time For Sex Crime Convictions in Los Angeles
Are There Any Alternatives To Jail Time For Sex Crime Convictions in Los Angeles
 
Alexis O'Connell Arrest Records Houston Texas lexileeyogi
Alexis O'Connell Arrest Records Houston Texas lexileeyogiAlexis O'Connell Arrest Records Houston Texas lexileeyogi
Alexis O'Connell Arrest Records Houston Texas lexileeyogi
 
PPT Template - Federal Law Enforcement Training Center
PPT Template - Federal Law Enforcement Training CenterPPT Template - Federal Law Enforcement Training Center
PPT Template - Federal Law Enforcement Training Center
 
Illinois Department Of Corrections reentry guide
Illinois Department Of Corrections reentry guideIllinois Department Of Corrections reentry guide
Illinois Department Of Corrections reentry guide
 
Guide for Drug Education and Vice Control.docx
Guide for Drug Education and Vice Control.docxGuide for Drug Education and Vice Control.docx
Guide for Drug Education and Vice Control.docx
 
Conditions Restricting Transfer Under TPA,1882
Conditions Restricting Transfer Under TPA,1882Conditions Restricting Transfer Under TPA,1882
Conditions Restricting Transfer Under TPA,1882
 
Good Governance Practices for protection of Human Rights (Discuss Transparen...
Good Governance Practices for protection  of Human Rights (Discuss Transparen...Good Governance Practices for protection  of Human Rights (Discuss Transparen...
Good Governance Practices for protection of Human Rights (Discuss Transparen...
 
Analysis on Law of Domicile under Private International laws.
Analysis on Law of Domicile under Private International laws.Analysis on Law of Domicile under Private International laws.
Analysis on Law of Domicile under Private International laws.
 
Comparison of GenAI benchmarking models for legal use cases
Comparison of GenAI benchmarking models for legal use casesComparison of GenAI benchmarking models for legal use cases
Comparison of GenAI benchmarking models for legal use cases
 
Law360 - How Duty Of Candor Figures In USPTO AI Ethics Guidance
Law360 - How Duty Of Candor Figures In USPTO AI Ethics GuidanceLaw360 - How Duty Of Candor Figures In USPTO AI Ethics Guidance
Law360 - How Duty Of Candor Figures In USPTO AI Ethics Guidance
 
Special Accounting Areas - Hire purchase agreement
Special Accounting Areas - Hire purchase agreementSpecial Accounting Areas - Hire purchase agreement
Special Accounting Areas - Hire purchase agreement
 
Alexis O'Connell lexileeyogi Bond revocation for drug arrest Alexis Lee
Alexis O'Connell lexileeyogi Bond revocation for drug arrest Alexis LeeAlexis O'Connell lexileeyogi Bond revocation for drug arrest Alexis Lee
Alexis O'Connell lexileeyogi Bond revocation for drug arrest Alexis Lee
 
Presentation1.pptx on sedition is a good legal point
Presentation1.pptx on sedition is a good legal pointPresentation1.pptx on sedition is a good legal point
Presentation1.pptx on sedition is a good legal point
 
Understanding Cyber Crime Litigation: Key Concepts and Legal Frameworks
Understanding Cyber Crime Litigation: Key Concepts and Legal FrameworksUnderstanding Cyber Crime Litigation: Key Concepts and Legal Frameworks
Understanding Cyber Crime Litigation: Key Concepts and Legal Frameworks
 
Alexis O'Connell Lexileeyogi 512-840-8791
Alexis O'Connell Lexileeyogi 512-840-8791Alexis O'Connell Lexileeyogi 512-840-8791
Alexis O'Connell Lexileeyogi 512-840-8791
 
Wurz Financial - Wealth Counsel to Law Firm Owners Services Guide.pdf
Wurz Financial - Wealth Counsel to Law Firm Owners Services Guide.pdfWurz Financial - Wealth Counsel to Law Firm Owners Services Guide.pdf
Wurz Financial - Wealth Counsel to Law Firm Owners Services Guide.pdf
 
Grey Area of the Information Technology Act, 2000.pptx
Grey Area of the Information Technology Act, 2000.pptxGrey Area of the Information Technology Act, 2000.pptx
Grey Area of the Information Technology Act, 2000.pptx
 

Full Course - Law and Regulation of Machine Intelligence - Bar Ilan University 2016

precede the others:
 LAW 0: A robot may not harm humanity, or, by inaction, allow humanity to come to harm
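Asimov's ordering can be read as a strict priority scheme: a lower-numbered law always overrides a higher-numbered one. A purely illustrative sketch of that precedence follows; the boolean flags (harms_humanity, harms_human, etc.) are hypothetical placeholders, since deciding whether an action in fact harms a human is precisely the hard, unsolved problem the laws gloss over.

```python
# Illustrative sketch only: Asimov's laws as a strict priority ordering.
# The flags below are hypothetical placeholders, not anything a real
# robotic system can currently compute.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Action:
    name: str
    harms_humanity: bool = False    # would violate Law 0
    harms_human: bool = False       # would violate Law 1
    disobeys_order: bool = False    # would violate Law 2
    self_destructive: bool = False  # would violate Law 3

def first_violated_law(a: Action) -> Optional[int]:
    """Return the number of the highest-priority law the action violates,
    or None if the action is permissible under all four laws."""
    checks = [
        (0, a.harms_humanity),
        (1, a.harms_human),
        (2, a.disobeys_order),
        (3, a.self_destructive),
    ]
    for law, violated in checks:
        if violated:
            return law
    return None

# An order to injure a human trips Law 1 before Law 2 is even considered:
print(first_violated_law(Action("obey order to strike", harms_human=True)))  # 1
```

The point of the sketch is the ordering itself: because the checks run from Law 0 downward, disobeying an order (Law 2) is never even reached when obedience would harm a human (Law 1), which mirrors the "except where such orders would conflict" clauses in the laws.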
www.lcii.eu
Not science fiction – NHTSA, 4 February 2016
 The National Highway Traffic Safety Administration (NHTSA) is the US highway safety agency in charge of enforcing the Federal Motor Vehicle Safety Standards (FMVSSs)
 Provides certification for new vehicles by automotive producers
 A number of FMVSS requirements are built around the notions of “driver”, “driver’s position” and “driver’s seating position”
 “FMVSS No. 101 contains requirements for location, identification, color, and illumination of motor vehicle controls, telltales, and indicators. S5.1.1 requires the controls listed in Tables 1 and 2 of the standard to be located so that they are operable by the [belted] driver”
 “S5.3.1, which states that service brakes shall be activated by means of a foot control”
 49 CFR 571.3 defines “driver” as the occupant of a motor vehicle seated immediately behind the steering control system
www.lcii.eu
Google’s “SDS”
 “Google seeks to produce a vehicle that contains L4 automated driving capabilities, and removes conventional driver controls and interfaces (like a steering wheel, throttle pedal, and brake pedal, among many other things)”
 “Expresses concern that providing human occupants of the vehicle with mechanisms to control things like steering, acceleration, braking, or turn signals, or providing human occupants with information about vehicle operation controlled entirely by the SDS, could be detrimental to safety because the human occupants could attempt to override the SDS’s decisions”
 “Google’s design choices in its proposed approach to the SDV raise a number of novel issues in applying the FMVSSs. Those standards were drafted at a time when it was reasonable to assume that all motor vehicles would have a steering wheel, accelerator pedal, and brake pedal, almost always located at the front left seating position, and that all vehicles would be operated by a human driver. Accordingly, many of the FMVSSs require that a vehicle device or basic feature be located at or near the driver or the driver’s seating position. For vehicles with an AI driver that also preclude any occupant from assuming the driving task, these assumptions about a human driver and vehicle controls do not hold”
 “Google has asked who or what is to be considered the driver and which seating position is considered to be the driver’s seating position in its SDV”
  • 6. www.lcii.eu A Car has a Driver, a Driver need not be human Options  1) “NHTSA could interpret the term “driver” as meaningless for purposes of Google’s SDV, since there is no human driver, and consider FMVSS provisions that refer to a driver as simply inapplicable to Google’s vehicle design”;  2) “NHTSA could interpret “driver” and “operator” as referring to the SDS” NHTSA  As a foundational starting point for the interpretations below, NHTSA will interpret driver in the context of Google’s described motor vehicle design as referring to the SDS, and not to any of the vehicle occupants.  If no human occupant of the vehicle can actually drive the vehicle, it is more reasonable to identify the driver as whatever (as opposed to whoever) is doing the driving. In this instance, an item of motor vehicle equipment, the SDS, is actually driving the vehicle.
  • 7. www.lcii.eu Consequences  “The controls listed in Tables 1 and 2 may simply be operable by the SDS and need not be located so that they are available to any of the human occupants of the motor vehicle”.  For more, see  http://isearch.nhtsa.gov/files/Google%20--%20compiled%20response%20to%2012%20Nov%20%2015%20interp%20request%20--%204%20Feb%2016%20final.htm  http://spectrum.ieee.org/cars-that-think/transportation/self-driving/an-ai-can-legally-be-defined-as-a-cars-driver
  • 8. www.lcii.eu Schedule Topics Date Technology and Society 22-11-16 Theory of Regulation; Liability 23-11-16 Robotic Warfare; Market Conduct 25-11-16
  • 9. www.lcii.eu Aims of the course Basic questions  Should we regulate AIs and robots?  If yes, how should we regulate AIs and robots? Goals  Identify problems more than solutions  Think of frameworks/methods to mindmap those issues  Learn from you
  • 10. Class I – Technology and Society
  • 11. 1. State of the Art
  • 12. www.lcii.eu Examples  Robotic cars  Drones  Hive and swarm robots
  • 13. www.lcii.eu An old field, AI and robotics  1950s: « Game AI », Arthur Samuel’s checker-playing program  1950: Turing test (“Computing Machinery and Intelligence”)  1955: Newell and Simon, « Logic Theorist », proves 38 of the first 52 theorems of Principia Mathematica  Dartmouth Summer Research Project of 1956
  • 14. www.lcii.eu Initial ideas  Any physical process, including the mind, can be modelled as a computable algorithm (Church-Turing thesis)  Software is a set of computable algorithms; no reason why it could not reach outcomes similar to those generated by the mind (artefact)  Machines can learn: “Learning is any process by which a system improves performance from experience”, Herbert Simon  Ultimate ambition is to have computers do what humans do well (they already do things humans cannot do): heuristics, seeing, learning
  • 15. www.lcii.eu Milestones  In 1997, Deep Blue beats Garry Kasparov at chess  In 2005, Stanford robot wins DARPA Grand Challenge by driving autonomously for 131 miles along an unrehearsed desert trail  In February 2011, in a Jeopardy! quiz show exhibition match, IBM's question answering system, Watson, defeats the two greatest Jeopardy! champions, Brad Rutter and Ken Jennings  In January 2016, researchers from Google DeepMind unveil the first computer able to defeat a human champion at the board game Go
  • 16. www.lcii.eu  2020s, Personal computers will have the same processing power as human brains.  2030s, Mind uploading becomes possible.  2040s, Human body 3.0 (as Kurzweil calls it) comes into existence; people spend most of their time in full-immersion virtual reality  2045, The Singularity occurs as artificial intelligences surpass human beings as the smartest and most capable life forms on the Earth; The extermination of humanity by violent machines is unlikely Kurzweil, « The Singularity is Near », 2005  Law of accelerating returns: technology progressing toward Singularity at an exponential rate (each transition occurs more rapidly than the last)  Functionality of the brain is quantifiable in terms of technology  Baby boomers will live long enough to see the Singularity. Nanobots will eventually be able to repair and replace any part of the body that wears out  Strong Artificial Intelligences and cybernetically augmented humans will become the dominant forms of sentient life on the Earth Source: Wikipedia
  • 17. www.lcii.eu Why now? Moravec paradox solved?  High level reasoning (playing chess, using mathematics, etc.) easier than low-level sensorimotor skills (moving across space, recognizing speech, etc.)  This is because high level reasoning demands less computational power  See Moravec, H. (1998). When will computer hardware match the human brain. Journal of Evolution and Technology, 1. Technological evolution  Brute force computational power now available (due to Moore’s law)  Vast troves of data now available, and distributed (cloud)
  • 18. www.lcii.eu Various subfields of AI, and examples  Machine learning (incl. deep learning)  Algorithms that enable robots to learn tasks through trial and error using a process that more closely approximates the way humans learn: e.g. spam filtering  Neural networks  Emulate the ability of living organisms to integrate perceptual inputs smoothly with motor responses  Speech recognition  Uses sound metrics along with domain- and context-specific language to respond to voice commands  Natural language processing  Robots interacting and responding through interpretations of natural language instructions  Artificial vision  Object recognition, which allows robots to interact with and measure their environment. Can include different features: visual object recognition and tracking, image stabilization, visual servoing, human-to-machine interaction, etc. => recognize x-rays and MRI scans, battlefield robots recognize kids, automated cars  Knowledge representation
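The spam-filtering example above is the canonical machine-learning application: a program learns to separate spam from legitimate mail by counting word evidence in labelled messages. A minimal sketch, assuming invented toy messages and a simple word-count scoring rule (not any real filter's algorithm):

```python
from collections import Counter

# Toy supervised spam filter in the spirit of the slide's "spam filtering"
# example. All messages and the scoring rule are invented for illustration.
SPAM = ["win money now", "free money offer"]
HAM = ["meeting agenda attached", "lunch tomorrow"]

def word_counts(messages):
    counts = Counter()
    for message in messages:
        counts.update(message.split())
    return counts

SPAM_COUNTS = word_counts(SPAM)
HAM_COUNTS = word_counts(HAM)

def is_spam(message):
    # Positive score => more spam-word hits than ham-word hits.
    words = message.split()
    score = sum(SPAM_COUNTS[w] for w in words) - sum(HAM_COUNTS[w] for w in words)
    return score > 0
```

Real filters refine the same idea with probabilities (naive Bayes) rather than raw counts, but the learning-from-labelled-examples structure is identical.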
  • 19. www.lcii.eu Philosophical debate Turing  Dialogue with a human and a machine through teletype  If a machine could carry on a conversation (over a teleprinter) that was indistinguishable from a conversation with a human being, then it was reasonable to say that the machine was "thinking”  Machine's ability to exhibit intelligent behavior equivalent to, or indistinguishable from, that of a human  A. M. Turing (1950) Computing Machinery and Intelligence. Mind 59: 433-460 Searle  « Chinese room » argument  You do not speak Chinese  You’re in a room with two slits, a book, and some scratch paper  Someone slides Chinese characters through the first slit  You follow the instructions in the book, correlate characters as instructed by the book, and slide the resulting sheet through the second slit.  It appears you speak Chinese, yet you do not understand a word of Chinese  No need to understand Chinese to translate it. The fact that a computer is able to do it does not create strong AI  Weak AI that simulates thought  http://www.iep.utm.edu/chineser/
  • 20. www.lcii.eu Scientific debate  Is this a computational problem only (Kurzweil)?  Or is there an algorithm that we do not know?
  • 21. www.lcii.eu Challenges (1)  Supervised learning v unsupervised learning  Human feeds neural network with inputs (images) and outputs/labels (face, dog, lamp, etc.), and the AI comes up with a statistical rule that correlates inputs with the correct outputs => no causal explanation + error or reward signal  The AI identifies patterns  Pornography example: Justice Potter Stewart, SCOTUS: « I know it when I see it »  But all this is supervised learning, and labelled data is a finite resource => though Internet providers hold troves of pre-labelled data  Natural language  « Recognize speech » v « Wreck a nice beach »  Questions on Siri, Wolfram Alpha, etc. generate different responses if a different syntax is used  Disambiguation: « I ate spaghetti with meatballs » v « I ate spaghetti with chopsticks » (see R. Mooney podcast)  Artificial vision and the towel problem: https://www.youtube.com/watch?v=gy5g33S0Gzo  Common sense?  Magellan problem  Optimizing transportation algorithm
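The supervised-learning loop described above (labelled examples in, a statistical rule out, no causal explanation) can be sketched in a few lines. The feature vectors and labels below are invented toy data; the "rule" learned is pure proximity to past examples, which is exactly why it correlates without explaining:

```python
import math

# Minimal supervised learner: 1-nearest-neighbour over labelled examples.
# Toy data invented for illustration; each example is ((features), label).
TRAINING = [
    ((0.9, 0.1), "dog"),
    ((0.8, 0.2), "dog"),
    ((0.1, 0.9), "lamp"),
    ((0.2, 0.8), "lamp"),
]

def predict(features):
    # Return the label of the closest training example. The model has no
    # notion of what a dog or a lamp IS, only of statistical proximity.
    _, label = min(TRAINING, key=lambda ex: math.dist(ex[0], features))
    return label
```

Note how the finite-labelled-data problem appears immediately: the classifier can only echo labels that a human supplied in the first place.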
  • 23. www.lcii.eu Robotics  Robotics is a meta-technology  Robotics is about motion in space  A robot is an « agencified » AI  Mechanical engineering issues are not covered here, though they constitute significant limits  Power  Haptic technology  Motor control
  • 24. www.lcii.eu Survey of most relevant applications, RoboLaw, 2014
  • 25. www.lcii.eu Related technologies  Human enhancement (and transhuman sciences)  Bionics (see J. Fischman, National Geographic, 2010) and prosthetics  Exoskeletons  Emulations  Mind uploading  Care robots, social robots, etc.  Augmented reality
  • 26. www.lcii.eu Definitional problem 1921, Karel Capek, R.U.R.  Rossum’s Universal Robots: Capek’s play makes first use of the word “robot” to describe an artificial person  Capek invented the term, basing it on the Czech word for “forced labor” or “servitude”: http://www.wired.com/2011/01/0125robot-cometh-capek-rur-debut/ 2014, Robolaw  “In fact, the term “robot” can mean different things to different people, since there is no agreement on its meaning neither among professional users (i.e. roboticists) nor among laypeople”
  • 27. www.lcii.eu Definitions  “Actuated mechanism programmable in two or more axes (4.3) with a degree of autonomy (2.2), moving within its environment, to perform intended tasks”, ISO 8373:2012(en)  “A robot is a constructed system that displays both physical and mental agency, but is not alive in the biological sense ”, Neil M. Richards and William D. Smart, 2016  “Robots are mechanical objects that take the world in, process what they sense, and in turn act upon the world”, Ryan Calo, 2015.
  • 28. www.lcii.eu ISO 8373:2012(en)  “a degree of autonomy”?  Full autonomy means unexpected decisions in unexpected situations  “Moving within its environment”?  Space motion is critical  Softbots do not move? Need a “hard” bot seated behind a computer?  A human being who makes online transactions does not move, yet may create harm  “to perform intended tasks”  Intention of whom? Very confusing  A robot with sufficient autonomy may intentionally resist a harmful third-party order  A drone will not launch a bomb on a hospital  A Google car will not crash into a school  An industrial bot refuses to operate if there is a safety risk  Localizing intention: initial intention or subsequent command?
  • 29. www.lcii.eu « Constructed system that displays both physical and mental agency », Richards and Smart  But robots created by robots? Are they constructed?  How much mental agency?  Semi-unmanned drones  Robots w/o physical agency: softbots (Hartzog, 2015)  News-generating bot  Robot advertisers  Computer composers: http://www.gizmag.com/creative-artificial-intelligence-computer-algorithmic-music/35764/
  • 31. www.lcii.eu « Sense – Think – Act » spectrum? Sense (acquire and process information) Think (process what was sensed) Act (locomotion and kinematics) Low Industrial robots that paint or weld car parts Exoskeleton, DaVinci Robot (teleoperated) Augmented reality devices (Hololens), Medical diagnosis robot Medium Mars Rover, Drones High Vacuum cleaner Social Robots Driverless car; Hoops Robotic basketball-shooting arm; Airport security check systems
  • 32. www.lcii.eu Robolaw typology  No definition, but 5 categories of relevant items:  Use or task: service or industrial  Environment: physical (road, air, sea, etc.) v cyberspace  Nature: embodied or disembodied (bionic systems)  Human-Robot Interaction (HRI)  Autonomy
  • 33. www.lcii.eu Calo, 2015  Embodiment  Robot as « machine » or « hardware »  Softbots (e.g., robot traders)?  Mooney thinks this is irrelevant; true from a scientist's perspective, but not necessarily true from a society perspective  Emergence  New forms of conduct, including welfare-enhancing behaviour => problem-solving robots  « Social valence »  Robots stimulate reactions from society: « social actors »  Soldiers jeopardize themselves to preserve robots in the military field  People write love letters to Philae  Often correlated to anthropomorphic embodiment (Honda Asimo)
  • 35. www.lcii.eu Pew Research Survey 48%  Tech pessimists  “a massive detrimental impact on society, where digital agents displace both blue- and white-collar workers, leading to income inequality and breakdowns in social order” 52%  Tech optimists  “anticipated that human ingenuity would overcome and create new jobs and industries” Source: http://www.futureofwork.com/article/details/rise-of-intelligent-robots-will-widen-the-social-inequality-gap
  • 36. www.lcii.eu A. Smith, An Inquiry into the Nature and Causes of the Wealth of Nations, 1776  “A great part of the machines made use of in those manufactures in which labour is most subdivided, were originally the inventions of common workmen, who, being each of them employed in some very simple operation, naturally turned their thoughts towards finding out easier and readier methods of performing it. Whoever has been much accustomed to visit such manufactures, must frequently have been shewn very pretty machines, which were the inventions of such workmen, in order to facilitate and quicken their own particular part of the work”.
  • 37. www.lcii.eu D. Ricardo, On Machinery, 1821
  • 38. www.lcii.eu J.M. Keynes, “Economic Possibilities for our Grandchildren”, 1930  “We are being afflicted with a new disease of which some readers may not yet have heard the name, but of which they will hear a great deal in the years to come-- namely, technological unemployment. This means unemployment due to our discovery of means of economising the use of labour outrunning the pace at which we can find new uses for labour. [...] But this is only a temporary phase of maladjustment. All this means in the long run that mankind is solving its economic problem”.  “Yet there is no country and no people, I think, who can look forward to the age of leisure and of abundance without a dread”  “Three-hour shifts or a fifteen-hour week may put off the problem for a great while. For three hours a day is quite enough to satisfy the old Adam in most of us!”  Concludes by predicting the disappearance of economics as a science
  • 39. www.lcii.eu Empirical studies  Bank of America/Merrill Lynch, 2015  “Robots are likely to be performing 45% of manufacturing tasks by 2025E (vs. 10% today)”  McKinsey Global Institute, Disruptive technologies Advances that will transform life, business, and the global economy, 2013  By 2025, “knowledge work automation tools and systems could take on tasks that would be equal to the output of 110 million to 140 million full-time equivalents (FTEs)” (knowledge work is use of computers to perform tasks that rely on complex analyses, subtle judgments, and creative problem solving).  By 2025, “[w]e estimate that the use of advanced robots for industrial and service tasks could take on work in 2025 that could be equivalent to the output of 40 million to 75 million full-time equivalents (FTEs)”.
  • 40. www.lcii.eu Frey and Osborne, 2013  “47 percent of total US employment is in the high risk category, meaning that associated occupations are potentially automatable over some unspecified number of years, perhaps a decade or two”  “most workers in transportation and logistics occupations, together with the bulk of office and administrative support workers, and labour in production occupations, are at risk” + “a substantial share of employment in service occupations” (Chart: computerisation arriving in two waves, separated by a plateau)
  • 41. www.lcii.eu Substitution effect, consequences Job polarization  Shift in the occupational structure  Displaced workers relocate their labor supply to low skill service occupations  Other humans resist by investing in skills through education (Frey and Osborne, 2014; Cowen, 2013)  This leads to « labour market polarization » (Autor, 2014; Cowen, 2013) Discussion  Frey and Osborne, 2014 believe this model still holds true  “Our model predicts … computerisation being principally confined to low-skill and low-wage occupations. Our findings thus imply that as technology races ahead, low- skill workers will reallocate to tasks that are non-susceptible to computerisation – i.e., tasks requiring creative and social intelligence”  Brynjolfsson and McAfee, 2011 disagree: when technology becomes cognitive, substitution can also occur for non routine tasks
  • 42. www.lcii.eu Substitution pace  “Technological advances are contributing to declining costs in robotics. Over the past decades, robot prices have fallen about 10 percent annually and are expected to decline at an even faster pace in the near future (MGI, 2013). Industrial robots, with features enabled by machine vision and high-precision dexterity, which typically cost 100,000 to 150,000 USD, will be available for 50,000 to 75,000 USD in the next decade, with higher levels of intelligence and additional capabilities (IFR, 2012b). Declining robot prices will inevitably place them within reach of more users”  Hanson on copies
  • 43. www.lcii.eu  Philips brings electric shaver production home? https://blogs.cfainstitute.org/investor/2014/06/16/the-robot-revolution-innovation-begets-innovation/ Effect on Developing Economies?  McKinsey Global Institute, 2013  “Effects of these technologies on developing economies could be mixed. Some countries could lose opportunities to provide outsourced services if companies in advanced economies choose automation instead. But access to knowledge work automation technologies could also help level the playing field, enabling companies in developing countries to compete even more effectively in global markets”.
  • 44. www.lcii.eu Substitution (Engineering) Bottlenecks: Frey & Osborne, 2013 Social intelligence tasks Creative intelligence tasks Perception and manipulation tasks Negotiation, persuasion and care Ability to make jokes; recipes ; concepts Disorganized environment or manipulation of non- calibrated, shifting shapes (towel problem)
  • 45. www.lcii.eu Autor, 2014  Routine tasks: “Human tasks that have proved most amenable to computerization are those that follow explicit, codifiable procedures”  Non routine tasks: “Tasks that have proved most vexing to automate are those that demand flexibility, judgment, and common sense”  Engineers “cannot program a computer to simulate a process that they (or the scientific community at large) do not explicitly understand”  Non routine tasks less exposed to substitution  Tasks that are not exposed may benefit from it, though (complementarity effect)  In construction, mechanization has not entirely devalued construction workers, but augmented their productivity; but not true for all (a worker who only knows how to use a shovel v an excavator)
  • 46. www.lcii.eu Typology of D. Autor et al (2003), Autor (2014)  Routine tasks (incl. skilled work): clerical work, bookkeeping, back and middle office, factory work => substitution risk: high  Non routine « abstract » tasks: problem solving, intuition, creativity and persuasion; high education, high wage (doctors, CEOs, managers, artists, academics) => substitution risk: low  Non routine « manual » tasks: situational adaptability, in-person interaction, visual and language recognition; low education, low wage (housecleaning, flight attendants, food preparation, security jobs) => substitution risk: low
  • 47. www.lcii.eu Findings of D. Autor et al (2003), Autor (2014)  Computers are more substitutable for human labour in routine relative to non-routine tasks (substitution effect);  And a greater intensity of routine inputs increases the marginal productivity of non-routine inputs (complementarity effect)  “Job polarization” effect  Increase of high education, high wage jobs  Increase of non routine low education, low wage jobs  No increase in wages for this latter category, given abundance of supply  Autor, D., Levy, F. and Murnane, R.J. (2003), “The skill content of recent technological change: An empirical exploration”, The Quarterly Journal of Economics, vol. 118, no. 4, pp. 1279–1333
  • 48. www.lcii.eu Job Polarization, some evidence (Autor, 2014)
  • 49. www.lcii.eu  “Obtaining skills takes time studying in school and learning on the job. Thus skilled workers are disproportionately older workers”  “machine-biased productivity improvements effects a redistribution from younger, relatively unskilled workers to older relatively skilled workers as well as retirees”  “When today’s machines get smarter, today’s young workers get poorer and save less”  “The fall in today’s saving rate means that the next generation will have even lower wages than today”  “In short, better machines can spell universal and permanent misery for our progeny” Generational effect, Sachs and Kotlikoff, 2012  Long term misery?
  • 50. www.lcii.eu T. Cowen, 2013 Average is over  The rich will get richer, the poor will get poorer  Substitution effect stronger in work w/o consciousness/ability to train Freestyle chess metaphor  Random player-machine teams outperform chessmaster-machine teams  Not necessarily teams of grand masters!  “In the language of economics, we can say that the productive worker and the smart machine are, in today’s labor markets, stronger complements than before”
  • 52. www.lcii.eu The model explained Multi-causal substitution  Exponential decrease in costs of technology  « Deskilling »  Replacement by semi-skilled technologies, through fragmentation and simplification of tasks (fordism)  « The copy economy » (Hanson, 2014): « the most important features of these artificial brains is easy to copy » Two types of complements  Complements arising from substitution (upward sloping curve)  AI and Robots-related jobs (those of Autor and Cowen)  Enabling technologies and new jobs  Punch cards, typewriters, printers, calculators, etc.  Complements with indifference (L curve)  Indifference on human labour of an increase in machine labour (horizontal line)  « Emerging jobs » new sectors without human labour  Protected sectors, superstars like chefs, footballers and singers?  Indifference on machines of an increase in human labour (vertical line)  Bank tellers (ATMs?) => J. Bessen book
  • 53. www.lcii.eu Take aways  MGI, 2013:  “In some cases there may be regulatory hurdles to overcome. To protect citizens, many knowledge work professions (including legal, medical, and auditing professions) are governed by strict regulatory requirements regarding who may perform certain types of work and the processes they use”  “Policies discouraging adoption of advanced robots—for example, by protecting manual worker jobs or levying taxes on robots—could limit their potential economic impact”.  Frey and Osborne, 2013:  “The extent and pace of legislatory implementation can furthermore be related to the public acceptance of technological progress”  Brynjolfsson and McAfee, 2014 citing Voltaire : “Work saves a man from three great evils: boredom, vice, and need.”
  • 54. 3. Legal services and education
  • 56. www.lcii.eu LegalZoom: https://www.legalzoom.com/country/be  “LegalZoom provides the legal solutions you need to start a business, run a business, file a trademark application, make a will, create a living trust, file bankruptcy, change your name, and handle a variety of other common legal matters for small businesses and families. Since the process involved in starting a business can be complicated, we provide services to help start an LLC, form a corporation, file a DBA, and take care of many of the legal requirements related to starting and running a business. If you are interested in protecting your intellectual property, LegalZoom offers trademark and copyright registration services, as well as patent application assistance. It's essential for everyone to have a last will and testament and a living will, and you can get yours done efficiently and affordably through LegalZoom. For those who have more advanced planning needs, our living trust service is available. With all our services, you have the option to speak to a lawyer and tax professional. Let LegalZoom take care of the details so you can focus on what matters most – your business and family”
  • 57. www.lcii.eu Pricing Legal Zoom Source: http://www.law-valley.com/blog/2014/03/03/legalzoom-le- leader-americain/
  • 57. www.lcii.eu Pricing LegalZoom Source: http://www.law-valley.com/blog/2014/03/03/legalzoom-le-leader-americain/
  • 61. www.lcii.eu Drivers of change (Susskind, 2014) Contextual  More for less challenge  Clients of lawyers (in-house counsels): less staff, less external counselling, more compliance and conformity costs  https://www.lexoo.co.uk/  Liberalization  http://thejurists.eu/ Structural  Information technology (Katz 2013)  Large data power  Immense computing power  Automate and innovate
  • 62. www.lcii.eu Katz, 2013 Wind of change  “Like many industries before it, the march of automation, process engineering, informatics, and supply chain management will continue to operate and transform our industry. Informatics, computing, and technology are going to change both what it means to practice law and to “think like a lawyer.”” Substitution + complement
  • 63. www.lcii.eu Outlook First generation  E-discovery  Automated document assembly  Online dispute resolution New generation  Quantitative legal prediction  Contract analysis  Online drafting wizard (Weagree.com)  Legal risk management
  • 64. www.lcii.eu Quantitative Legal Prediction  Everyday, lawyers make predictions  Do I have a case?  What is our likely exposure?  How much is this going to cost?  What will happen if we leave this particular provision out of this contract?  How can we best staff this particular legal matter?  How high is the probability of settlement?
  • 65. www.lcii.eu Predicting case outcomes LexMachina  Lunch between Professor M. Lemley and Bruce Sewell  Create electronic set of patent litigation events and outcomes  https://lexmachina.com/about/  Funded by Apple, Cisco, Genentech, Intel, Microsoft, and Oracle, etc.  More at https://goo.gl/UyB0wU
  • 66. www.lcii.eu QLP and Machine Learning  Predicting outcomes  “An algorithm learning that in workplace discrimination cases in which there is a racial epithet expressed in writing in an email, there is an early defendant settlement probability of 98 percent versus a 60 percent baseline. An attorney, upon encountering these same facts, might have a similar professional intuition that early settlement is likely given these powerful facts”  No gaming (moral hazard)  Discover hidden data  “Imagine, for instance, that the algorithm detects that the probability of an early settlement is meaningfully higher when the defendant sued in a personal injury case is a hospital as compared to other types of defendants”
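The settlement-probability example on this slide is, mechanically, a conditional frequency computed over a corpus of past cases. A minimal sketch of that computation, using a fabricated four-case corpus (the real systems train on thousands of records and richer features):

```python
# Sketch of quantitative legal prediction: estimate the early-settlement
# rate conditional on a fact pattern, from labelled past cases.
# The case records below are fabricated purely to illustrate the method.
CASES = [
    {"epithet_in_email": True, "early_settlement": True},
    {"epithet_in_email": True, "early_settlement": True},
    {"epithet_in_email": False, "early_settlement": True},
    {"epithet_in_email": False, "early_settlement": False},
]

def settlement_rate(cases):
    # Fraction of cases that settled early.
    return sum(c["early_settlement"] for c in cases) / len(cases)

baseline = settlement_rate(CASES)  # rate over all cases
conditional = settlement_rate([c for c in CASES if c["epithet_in_email"]])
```

The lawyer's intuition ("powerful facts make early settlement likely") becomes the gap between `conditional` and `baseline`; the "hidden data" point is simply that the same computation can be run over features no attorney thought to condition on.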
  • 68. www.lcii.eu LawyerMetrics  “Lawyer Metrics makes it possible to replace lower-performing “C players” in your organization with higher performing “B” and “A” attorneys”.  http://lawyermetrics.org/services/human/
  • 69. www.lcii.eu 4 disruptive and robotic legal technologies 1. Embedded legal knowledge 2. Intelligent legal search 3. Big data 4. AI-based problem-solving (with natural language processing input)
  • 70. www.lcii.eu Pros and cons Technique  Use big data and computational power  “Inverse” or inductive reasoning  Use observables to build the model (>< build a model, and then try to infer results)  Concept of similarity that is implemented and refined using large bodies of data  Facebook recommending friends, Netflix recommending movies and Amazon recommending books Pros and Cons  Pros  Overcomes anecdotal or unindicative information  Overcomes human cognitive limitations: heuristics, biases, preferences, etc.  Cons  Lack of relatedness between new cases and past cases  Overgeneralization: most rape cases in the data occurred in poor areas => the model wrongly infers that rape does not exist in wealthier areas  Stale information captured in the data: e.g. a change of member on the board of a regulator
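The "concept of similarity" driving the Facebook/Netflix/Amazon examples can be sketched as a set-overlap (Jaccard) measure between users' item sets. The user libraries below are invented toy data, not any platform's actual algorithm:

```python
# Similarity-based recommendation sketch: recommend what the most similar
# other user has that you lack. Toy data invented for illustration.
LIBRARIES = {
    "ann": {"contracts", "torts", "ip"},
    "bob": {"contracts", "torts", "tax"},
    "eve": {"cooking", "travel"},
}

def jaccard(a, b):
    # Overlap of two sets, from 0 (disjoint) to 1 (identical).
    return len(a & b) / len(a | b)

def recommend(user):
    mine = LIBRARIES[user]
    _, nearest = max(
        (jaccard(mine, items), name)
        for name, items in LIBRARIES.items()
        if name != user
    )
    return sorted(LIBRARIES[nearest] - mine)
```

The cons on the slide show up directly in this sketch: a genuinely novel case has low similarity to everything (lack of relatedness), and whatever biases sit in the item sets are reproduced in the recommendations (overgeneralization).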
  • 71. www.lcii.eu  Stasia Kelly, U.S. comanaging partner of DLA Piper: “I really want to know the person giving advice”  R. Susskind, Tomorrow’s Lawyers, 2013  Often, lawyers regard legal work as « bespoke »: customized, made to measure, personal  « Romantic » vision Impact on legal professions, and the denial problem
  • 72. www.lcii.eu Really?  Take an employment contract: you don’t start from a blank canvas every time you draft one  French lawyers all use the same structure in their work (standardized): I-II; A-B  Anglo-saxon contracts always have definitions first
  • 73. www.lcii.eu  Many of those tasks can be subject to  Outsourcing  Insourcing  Delawyering  Computerizing  Leasing  ... Decomposition TABLE 4.1. Litigation, decomposed (Susskind, 2013)  Document review  Legal research  Project management  Litigation support  (Electronic) disclosure  Strategy  Tactics  Negotiation  Advocacy
  • 74. www.lcii.eu Effect on legal market (Susskind, 2013) Firms  Elite group of say 20 firms to Big4;  Opportunity for middle size firms;  Small firms will disappear, due to liberalization and competition from banks, accountants and other retailers (“end of lawyers who practice in the manner of a cottage industry”)  Contrast with Kobayashi and Ribstein? Lawyers  Barristers will remain:  “oral advocacy at its finest is probably the quintessential bespoke legal service”  But not for “lower value” disputes and note that “courtroom appearances themselves will diminish in number with greater uptake of virtual hearings, while online dispute resolution (ODR) will no doubt displace many conventional litigators ”
  • 75. www.lcii.eu Disciplines likely to be affected?  Corporate and M&A work  Global Merger Analysis Platform (GMAP): http://www.ft.com/intl/cms/s/2/a1271834-5ac2-11e5-9846-de406ccb37f2.html#axzz41hY8v2x3  Trademark and copyrights filing  Patent applications  Private international law?  Your take?
  • 76. www.lcii.eu Disciplines likely to be affected  Data intensive  Public information  Searchable  Scalable (global law)  Standardized (labelled input)
  • 77. www.lcii.eu Users perspective  Provides « unmet legal needs » (Wilkins): brings law into consumer markets  Growth of LegalZoom is indicative  For more see http://www.slate.com/articles/technology/robot_invasion/2011/09/will_robots_steal_your_job_5.html
  • 80. www.lcii.eu Complementarity effect (Susskind, 2013) New jobs  Legal knowledge engineer  Legal technologist  Legal hybrid  Legal process analyst  Legal project manager  ODR practitioner  Legal management consultant  Legal risk manager New employers  Global accounting firms  Major legal publishers  Legal know-how providers  Legal process outsourcers  High street retail businesses  Legal leasing agencies  New-look law firms  Online legal service providers  Legal management consultancies
  • 81. www.lcii.eu Challenges – Education Forget substitutable work  Routine tasks  Memorization  Research and other repetitive data-driven tasks  Non routine manual-tasks?  Filing briefs  Taking minutes of meetings Invest in complements  Train in science, computation, data analytics and technology  Invest in soft skills, incl. leadership, executive training, management: « social bottlenecks »
  • 82. www.lcii.eu  “Lawyers who expect to operate in this new environment must understand how technology is reshaping the markets in which their clients compete, as well as the practice of law itself, including the use of “big data,” artificial intelligence, and process management to analyze, structure, and produce legal outcomes” (Heineman, Lee and Wilkins, 2014) Retraining need  “Clients will not be inclined to pay expensive legal advisers for work that can be undertaken by less expert people … This prediction does not signal the end of lawyers entirely, but it does point to a need for fewer traditional lawyers. At the same time when systems and processes play a more central role in law, this opens up the possibility of important new forms of legal service, and of exciting new jobs for those lawyers who are sufficiently flexible, open-minded, and entrepreneurial to adapt to changing market conditions” (Susskind, Chapter 11)
• 83. www.lcii.eu  « Automated delivery of online legal consultations is authorized only to respond to the request of a specific client and to meet specific needs » Article 4.12 (M.B. 17.01.2013);  « A lawyer shall deliver no service nor give personalized consultation or advice on an electronic discussion forum or any other public virtual group » Article 4.13 (M.B. 17.01.2013).  « Under the current state of professional ethics, such a practice is not permitted. A lawyer puts his credit and his liability at stake if he does not adapt the documents he drafts to an examination of the client's particular situation […] » (LB 01-02, n°3, 226)  Not OK? Challenges – Professional Regulation  Attorney-client confidentiality  Limit to data aggregation by law firms?; Limit to data portability by lawyers?  Unauthorized exercise of profession  Partnership rules  Independence of lawyer?  Fee regulation
• 84. www.lcii.eu Challenges – Intellectual property?  More IP protection could promote production of new legal technology  Too much IP protection prevents production of new legal technology
  • 85. www.lcii.eu Legal preserves  Objections to machines taking over certain types of work?  Passing life sentence?  Other?
  • 86. www.lcii.eu Take away points  Think of tasks, not jobs  Users’ perspective also matters; Not only perspective of suppliers  Where machines are better than humans, expect substitution and/or complementarity  Shall the law prevent machines in legal services? No, substitution (and/or complementarity) not necessarily bad  But need to think about specific tasks for which society wants to preserve humans  Social contract imperative: man-made law necessary for trust?
  • 87. Class II – Regulation and Liability
  • 89. www.lcii.eu “Noel Sharkey, a computer scientist at the University of Sheffield, observes that overly rigid regulations might stifle innovation. But a lack of legal clarity leaves device-makers, doctors, patients and insurers in the dark” The Economist, 01 September 2012
  • 90. www.lcii.eu Definitions  Roboethics and robolaw  Regulation  “State intervention into the economy by making and applying legal rules”? (Morgan and Yeung)  “Exceptionalism” (Calo, 2015): “a technology is exceptional if it invites a systemic change to laws or legal institutions in order to preserve or rebalance established values”. Use existing basic legal infrastructure, and deal with issues on a case-by-case basis, through litigation and precedent (common law approach) v Adopt sui generis rules and updates
  • 91. www.lcii.eu Goals of the lecture  Where do we need regulation?  Put differently where do we need to (i) adapt existing law; (ii) introduce new law?
  • 92. 1.1 Overview of Existing Approaches
  • 93. www.lcii.eu Two trajectories Disciplinary (legal)  In each branch, specialists identify frictional HYPOs  Top down  Suggestions for a green paper, 2012: “top down approach that studies for each legal domain the consequences on robotics” Technology (applications)  For each class of applications, speculation on legal problems  Bottom up  Robolaw, 2012  Stanford, Artificial Intelligence and Life in 2030
• 94. www.lcii.eu 1. Disciplinary approach  “fitting” exercise: jacket factory allegory  8 areas of the law:  health and safety of machinery;  product liability rules and sale of consumer goods;  intellectual property;  labour law;  data privacy;  criminal law;  contractual and non-contractual liability;  e-personhood.
• 95. www.lcii.eu Intellectual property  Are machine intelligence generated works protectable under intellectual property regimes, and as the case may be, who is their owner?  Copyright law  “Creative” or “original” work requirement  Means work that reflects the subjective personality of the author  Subjective, not objective: two similar songs can both be ©  Can a machine that computes data be capable of creation?  Patent law needs an “inventive step” and “non-obviousness”  Inventive means unexpected, something that does not follow the path of technological progress  Non-obviousness requirement to the skilled person?  Is anything non-obvious to a superintelligence?  Who is the owner?  PETA/Naruto case, before US Courts  Not hypothetical given that some IPRs are held by abstract legal persons like corporations?
• 96. www.lcii.eu Pros and cons  Avoid inconsistencies  Grant strong IP protection in AI field, yet create strict programmer liability in same field  Speculation on problems created by technology under imperfect information  Risk of mistakes that create new legal problems  Example: a social planner believes that AI research into biological treatment of Alzheimer's is next, and creates strong IP protection there; but the technological frontier shifts and a mechanical approach (mind upload) works better: unwanted problems
  • 97. www.lcii.eu 2. Functional approach  Determine classes of MI applications, and then assess the legal needs from there.  Bottom up approach geared to technological evolution  Savile row tailors allegory
• 98. www.lcii.eu Stanford, 2016 8 fields 1. Transport 2. Home/services robots 3. Healthcare 4. Education 5. Low-resources communities 6. Public safety and security 7. Employment and workplace 8. Entertainment 9 legal and policy topics 1. Privacy (biases in predictive algorithms + right to intimacy) 2. Innovation (open v patent thickets) 3. Liability (civil) 1. Locus: efficiency/fairness 2. Foreseeability condition 4. Liability (criminal) 1. Intent condition (mens rea) 5. Agency (legal personhood) 6. Certification and licensing requirements 7. Taxation (budgets dependent on payroll and income tax; speeding and parking tickets) 8. Labor (working contracts requirements) 9. Politics
• 99. www.lcii.eu Robolaw, 2012  Distinct legal issues for various classes of applications  Self-driving cars: primary question relates to impact of liability rules on manufacturers' incentives for innovation  For prostheses, the focus is also placed on public law issues, for instance whether a person can identify with the prostheses he or she wears, and whether he or she can resist being depicted in an official document with them, or not  For personal care robots, some basic human rights considerations also come into play, such as the need to protect the right to independent living protected by Article 19 of the UN Convention on the Rights of Persons with Disabilities: right to refuse robot assistance against insurance companies?
• 100. www.lcii.eu Pros and cons  More open to ex ante robo-ethics  Pro-innovation  Obsessive focus on not hindering technological evolution  Too much trust in technology success?  Technological convergence will dissipate differences between technologies
  • 102. www.lcii.eu Disabling regulation (Pelkmans and Renda) REACH  Problem with the “imposition of fairly heavy testing requirements for all existing and new substances alike”  “The other feature of REACH, owing to its ambitious precautionary approach of ‘no data, no market’ (access), is that this entire process of testing before being allowed on the market takes no less than 11 years” GMO regulation  In the EU, only two new GMO products have been allowed to be cultivated: NK603 GM maize and the Amflora potato  This despite reported benefits to farmers and decrease in poverty
• 103. www.lcii.eu Drones: Disabling regulation? US FAA  Rules for small unmanned aircraft https://www.faa.gov/uas/media/Part_107_Summary.pdf  Standardize visual line of sight (VLOS) flights of unmanned aircraft that weigh less than 55 lbs. (25 kg)  Aeronautical knowledge test Finland  “Allows BVLOS flights under certain conditions, and it does not require drone operators to possess an aerial work certificate”  https://techcrunch.com/2016/06/28/heres-whats-missing-from-the-new-drone-regulations/
• 104. www.lcii.eu « Knee-jerk » regulation?  “tendency to overreact to risks, accidents and incidents” (Van Tol, 2011)  Civil purpose nuclear energy  Reproductive cloning and nanotechnologies?
• 105. www.lcii.eu Rent seeking  Taxi v Uber  Airbnb v hotel chains  E-cigarettes  Bastiat: candle manufacturers request the chamber of deputies to: “pass a law requiring the closing of all windows, dormers, skylights, inside and outside shutters, curtains, casements, bull's-eyes, deadlights, and blinds—in short, all openings, holes, chinks, and fissures through which the light of the sun is wont to enter houses, to the detriment of the fair industries with which, we are proud to say, we have endowed the country [...]”.
  • 106. www.lcii.eu Rent seeking in AI  Taxi, truck and bus drivers  Delivery industry  Insurance companies  Carmakers
• 107. www.lcii.eu Regulatory timing  Collingridge dilemma: too early to act, not enough information; too late to act, all information but no longer able to change things  “Regulatory connection” quandary: the risks and opportunities created by emerging technologies cannot be “suitably understood until the technology further develops” … what if it is harmful?  “They're talking about spending 5-10 years to regulate technologies that are already 5-10 years old“ (Garreau)  Amazon, Intel and Google have been very vocal in relation to drone delivery regulation, which is outdated  Bostrom's treacherous turn
• 108. www.lcii.eu Enabling regulation (Pelkmans and Renda) End-of-life vehicles  “beyond what a market-based approach might be expected to achieve”  Quantitative targets: “reuse and recycling of 80 % of the car weight in 2006, up to 85 % by 2015; reuse and recovery at least 85 % in 2006 and 95 % in 2015”  “Innovation takes place at the very beginning of the life cycle of cars, namely at the design & planning stage”  October 2016: Germany's Bundesrat passed a resolution to ban the internal combustion engine starting in 2030 Porter Hypothesis  In environment, safety and health, “tough standards trigger innovation and upgrading”  And market opportunities to race for first mover advantage  Counter-example: 2015 Volkswagen NOx (nitrogen oxides) emissions scandal
  • 109. www.lcii.eu Regulatory trade offs  Complex relation between technological innovation and government regulation  “The economic literature (starting from the seminal work of Ashford and later with the so-called “Porter hypothesis”) has long recognised that regulation can be a powerful stimulus to innovation and entrepreneurship”(Pelkmans and Renda, 2014)  At the same time, regulation “can and does disable innovation” (Pelkmans and Renda, 2014)
  • 111. www.lcii.eu Goals of the lecture  Who should pay for robot generated harm?  How is the issue dealt with under basic legal structure?  Should regulation be adopted?
• 112. www.lcii.eu Hypothetical scenario  You have a garden  Buy a robot gardener  Unmanned system with very high autonomy  Can « sense » maturation of fruits, veggies, plants  Robot has « actuators »  Turn irrigating devices « on »  Prune the grass  Kill mosquitoes  Carry and spill water, pesticides and other liquid/aerial products  Friends pay you a visit  Robot kills friends' dog, which it confuses with an insect  Deems it a threat to the plant environment
  • 113. www.lcii.eu Social goals of liability law  Solution to be found so as to fulfill goals of liability law  Disputed  G. Williams, « The Aims of the Law of Tort », 4 Current Legal Problems, 1951  Corrective/protective  Provide solvent target  Deterrence  No gain from harmful conduct  Encourage precaution
• 114. www.lcii.eu S. Lehman Wilzig, « Frankenstein Unbound », 1981  Product liability  Robot is a piece of hardware  Liability on producer, plus possible limited liability on importers, wholesalers and retailers  2 manufacturers problem: « hardware » + « software »  « Inherent risk »  Dangerous animals  Strict liability only for dangerous species; no liability for « usually harmless species »  Slavery  Several regimes: master is liable v slave is liable  Roman law: master liable for civil claims, not criminal acts; possibility to escape liability in case of total absence of complicity  What punishment against the bot?  Diminished capacity: independent persons, but not entirely intelligent  Children: fully intelligent, but with low moral responsibility  Agent  Person
• 115. www.lcii.eu Landscaping of default legal structure  Basic rules (not exclusive)  Default liability/tort regimes: IL torts ordinance  Litigation in court  Additional rules (not exclusive)  Strict liability  Defective products liability (Directive 85/374/EEC on liability for defective products, adopted in 1985)  IL: Defective Products Liability Law, 1980 (Defective Products Law)  Consumer rights  Directive 2011/83/EC on Consumer Rights  IL: Consumer Protection Law, 1981 (Consumer Protection Law)  Latent defects  Only for sales  Duty of guidance  Only for contracts
• 116. www.lcii.eu Framing the options  Who may/should pay for robot generated harm?  Classic imputability issue  Liability of owner/keeper/user?  Liability of perpetrator?  Liability of manufacturer?
• 118. www.lcii.eu Basic rules Body of rules  Tort law  Vicarious liability law Imputability  Perpetrator (one is liable for damages caused by his own acts)  Owner, holder, master (one is liable for damages caused by others' acts)
  • 119. www.lcii.eu  Negligence  Duty of care  Omission that constitutes negligence  Breach of statutory duty
  • 120. www.lcii.eu  Employer, corporation, State  But also: parents, masters, owners  And vicarious liability for animals, etc.
  • 121. www.lcii.eu  Cumulative  Employer personally liable for employee harm (negligence) and vicariously liable (supervisor)  Employee personally liable and employer vicariously liable  Both are fault-based (+ or – negligence)
  • 122. www.lcii.eu Assessment Owner/keeper/user imputability (vicarious)  Protective of victim => solvent target  Not necessarily apt to achieve deterrence purpose of liability law  Perverse effects on innovation incentives? => kills market for the purchase of robots? Robot imputability (tort)  Could achieve deterrence, for robots and AIs can  Make cost-benefit analysis  Be taught some legal and moral principles: bonus pater familias  But no solvent target  Remedy problem  Solvency issue: robots need registration and to have property (capital endowment)  Transfer the robot to victim (but moral harm?)  Forced sale of the robot (but no market)  Insurance?  Legal personhood threshold
• 123. www.lcii.eu Assessment  Provide solvent target: vicarious liability (owner) Y / tort liability (robot) N  Incentives: vicarious liability (owner) N / tort liability (robot) Y
• 124. www.lcii.eu Conclusion  Early cases likely to seek liability of owner/keeper/user  On the basis of vicarious liability  Other liability routes would need to pass the personhood threshold, for liability is contingent on a third person's fault  Not fully protective of victims, because there is only one solvent target in liability for things!
  • 125. www.lcii.eu Additional rules: Product Liability Law  IL: Defective Products Liability Law, 1980 (Defective Products Law)
  • 126. www.lcii.eu  Strict liability on manufacturers  product was defective (deficiency or warnings/safety instructions insufficient) => undesired injury by normal use  “only to bodily injuries and does not extend to indirect, consequential, or pure economic damages”  Limited defences  defect created when the product was no longer under the manufacturer’s control  “state-of-the-art’ defence”: manufacturer could not have known that the design of the product did not comply with reasonable safety standards  Product was released beyond the control of the manufacturer contrary to its desire  Damage: does “not take into account a level of earnings higher than three times the average earnings in Israel. The damages for pain and suffering pursuant to this law are limited. The remaining bases of claim generally do not provide for a maximum amount of liability”
  • 127. www.lcii.eu Assessment into context  Manufacturer liability  Liability without fault  Strict liability  Primary goal is corrective
  • 128. www.lcii.eu What the basic legal structure achieves  Multiplicity of potential regimes are applicable  No legal desert!  Most likely to take place under vicarious liability+defective products  Less likely to occur under tort liability  Liability both on supply and demand side!
  • 130. www.lcii.eu Ongoing questions  Immunity  Insurance  Standardization
• 131. www.lcii.eu 1. Immunity  Appeals to limit liability on manufacturers, as a way to boost innovation in the robotics industry by reducing the fears of liability-related costs (Calo, 2011)  Strict liability of owner, with cap: owner benefits from introduction of technology, and victim faces tough causality problem (Decker, 2014)
• 132. www.lcii.eu Ryan Calo, « Open Robotics », 2011  Distinction between closed and open robots  Closed robots are designed to perform a set task: // dishwasher  Open robots invite third party contribution: multifunction, open to all software, hardware modularity  According to Calo, « open robots » are more « generative » in terms of innovation  But, “open robotics may expose robotics platform manufacturers and distributors to legal liability for accidents in a far wider set of scenarios than closed robotics”  HYPO: Roomba vacuums up and kills an animal  Manufacturer liability if Roomba causes injury in its normal use  If product misuse – attempt to use Roomba as pet locomotion device – no manufacturer liability  With open robots: more applications + no product misuse defense, for the robot is not designed to perform predetermined tasks
• 133. www.lcii.eu The problem  Disincentive to investments in (open) robotics markets  “Early adopters of robotics are likely to be populations such as the elderly or disabled that need in-home assistance. Other early applications have involved helping autistic children. These populations would make understandably sympathetic plaintiffs in the event of litigation” (http://cyberlaw.stanford.edu/blog/2009/11/robotics-law-liability-personal-robots)
  • 134. www.lcii.eu Selective immunity  Immunizing manufacturers of open robotic platforms from damages lawsuits arising out of users’ implementation of robots, at least temporarily;  Precedent in aviation industry, crippled by litigation under PLL => General Aviation Revitalization Act (“GARA”) of 1994  // immunity enjoyed by firearms manufacturers and website operators  Websites not liable for users’ postings (re. defamation, for instance)  Selective immunity: “presumption against suit unless the plaintiff can show the problem was clearly related to the platform’s design”
• 135. www.lcii.eu Insurance  Today, most car accidents are caused by human error  Progress in crash avoidance technology  Risks of accidents unlikely to be completely removed since events are not totally predictable, yet expected to decrease sharply (90%)  Disruption of the « crash economy » (RAND, 2014)  Today, variety of approaches, but imputability on driver: strict, no fault or negligence based liability  Today, compulsory insurance (EU directive)  Two issues:  Who's liable?  Shift in liability from user/owner/driver to manufacturer  EP: “The greater a robot's learning capability or autonomy is, the lower other parties' responsibility should be”  Compulsory v non compulsory insurance?
• 136. www.lcii.eu  Transfer?  European Parliament, Proposal, JURI: “Points out that a possible solution to the complexity of allocating responsibility for damage caused by increasingly autonomous robots could be an obligatory insurance scheme, as is already the case, for instance, with cars; notes, nevertheless, that unlike the insurance system for road traffic, where the insurance covers human acts and failures, an insurance system for robotics could be based on the obligation of the producer to take out an insurance for the autonomous robots it produces” Compulsory insurance?  Still needed? Maybe, but coverage of losses caused by crashes is likely to be less expensive  Coverage of losses not caused by crashes but by wind, floods and other natural elements and by theft (comprehensive coverage) is less likely to change, yet prices will decrease because the cost of repair is offset by fewer accidents (see http://www.iii.org/issue-update/self-driving-cars-and-insurance)
• 137. www.lcii.eu Insurance  Nothing changes: no-fault form, in which neither party is at fault, and each car owner's insurance covers their own vehicle  Prices should go down!  http://www.iii.org/issue-update/self-driving-cars-and-insurance  Changes:  Manufacturer insurance (Volvo)  Shared insurance?  Utility model with a premium based on mileage or usage  Leasing – ridesharing model  Risks involving driverless-car hacks and cybersecurity
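The "premium based on mileage or usage" model above can be sketched numerically. The base fee, per-kilometre rate and risk multiplier below are invented for illustration, not industry figures:

```python
# Hypothetical usage-based premium for an autonomous vehicle: a fixed
# base fee covers non-crash risks (theft, wind, flood), plus a per-km
# charge for crash risk. All figures are invented for illustration.
BASE_FEE_EUR = 20.0
RATE_PER_KM_EUR = 0.01

def monthly_premium(km_driven, crash_risk_multiplier=1.0):
    """Premium = base fee + usage charge scaled by the vehicle's risk profile."""
    return BASE_FEE_EUR + km_driven * RATE_PER_KM_EUR * crash_risk_multiplier

# If autonomy cuts crash risk by ~90%, the usage component shrinks
# while the comprehensive base fee stays put:
print(monthly_premium(1000, crash_risk_multiplier=1.0))  # 30.0
print(monthly_premium(1000, crash_risk_multiplier=0.1))  # 21.0
```

This captures the slide's point that crash coverage gets cheaper with autonomy while comprehensive coverage is less likely to change.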
  • 138. www.lcii.eu Standardization  Private ordering mechanisms  International Standardization organs (ISO, IEC)  ISO/TS 15066:2016(en), Robots and robotic devices — Collaborative robots  US and EU organizations (IEEE, CEN, CENELEC)  Many EU rules on machinery and product safety  Example: Directive 2006/42/EC of the European Parliament and the Council of 17 May 2006 on machinery
  • 139. 4. Should Someone else Pay?
• 140. www.lcii.eu Coase Theorem  HYPO  A uses robot gardener at night, when water and electricity cost less  Neighbour B moves in, creates a boutique hotel  With noise at night, B's clients flee in droves  Who should be liable for the injury?  Standard legal solution: A liable to compensate the externality  But at the same time, it is B that creates the harm if A is forbidden to use its robot gardener simply because B moved in  Coase argues that who is liable is to some extent irrelevant: as long as there are no transaction costs and property rights are well defined, parties will bargain to the efficient solution
• 141. www.lcii.eu Cheapest cost solution principle  Options  A sends robot for a mechanical update so it makes less noise: €5,000  B installs double glazing: €15,000  Solution  The efficient social solution is that the robot gardener is retooled  This happens regardless of liability assignment  If the town hall assigns B a right to silence, A will pay €5,000 to retool the bot  If the town hall assigns A a right to noise, B will pay A €5,000 to retool the bot  In both cases, the efficient solution is followed, regardless of who is liable  But this is only possible absent transaction costs, ie when it is easy to negotiate  Often not the case in the real world  Example: A is a cooperative of multiple owners who are never present, for their agricultural exploitation is fully automated: search costs and negotiation costs  If B is liable, bargaining for the €5,000 retool is not possible. B would install double glazing: a €10,000 extra loss for society (€15,000 instead of €5,000)  Not the cheapest cost solution  When there are transaction costs, the law should assign liability so as to achieve the cheapest cost solution that would have been found in negotiation  A is liable  The negative externality inflicted by the regulatory system should be as little efficiency-harmful as possible
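The slide's numbers can be worked through in a short sketch (a hedged illustration of the Coase result, using the assumed costs of €5,000 for retooling and €15,000 for double glazing):

```python
# Illustrative sketch of the Coase theorem hypothetical (assumed costs
# from the slide: retooling the robot EUR 5,000; double glazing EUR 15,000).
RETOOL = 5_000
GLAZING = 15_000

def outcome(liable_party, bargaining_possible):
    """Return (chosen remedy, total social cost) for a liability assignment."""
    if bargaining_possible:
        # Zero transaction costs: whoever is liable, the parties bargain
        # to the cheapest remedy, so society gets the efficient fix.
        if RETOOL < GLAZING:
            return ("retool robot", RETOOL)
        return ("double glazing", GLAZING)
    # Prohibitive transaction costs: the liable party self-protects
    # unilaterally instead of paying the other side.
    if liable_party == "A":
        return ("retool robot", RETOOL)
    return ("double glazing", GLAZING)

# With free bargaining, the efficient remedy wins under either assignment:
print(outcome("A", True))   # ('retool robot', 5000)
print(outcome("B", True))   # ('retool robot', 5000)
# With transaction costs, assigning liability to B wastes EUR 10,000:
print(outcome("B", False))  # ('double glazing', 15000)
```

The last line is the slide's point: with transaction costs, the liability assignment matters, and the law should pick the cheapest cost avoider (here, A).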
  • 142. www.lcii.eu Application to robotics  Invites to research what is the cheapest cost solution  Not one obvious culprit, avoid moral bias  A, B or someone else?
  • 143. www.lcii.eu Liability on the creator/programmer?  Robot manufacturer to encode ex ante robot prohibition to operate at night?  One line of code: 1€?  If all the Bs of this world could negotiate freely, they would contact all robot gardener producers and ask them to encode this prohibition  Not possible  Legal system to hold robot producers liable if noise harm at night  But bots would be less valuable for buyers, and price system would correct this?
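The "one line of code" point above can be illustrated as a hypothetical ex ante constraint in the robot's control software; the function name and the quiet-hours window are invented for illustration, not any real robotics API:

```python
from datetime import time

# Hypothetical "law as code" constraint: the manufacturer hard-codes a
# quiet-hours window so the robot gardener may not run noisy actuators
# at night. The window (10 pm - 7 am) is an assumed value for illustration.
QUIET_START = time(22, 0)
QUIET_END = time(7, 0)

def noisy_operation_allowed(now):
    """Refuse noisy work during the quiet-hours window (which spans midnight)."""
    in_quiet_hours = now >= QUIET_START or now < QUIET_END
    return not in_quiet_hours

print(noisy_operation_allowed(time(14, 0)))   # True: afternoon work allowed
print(noisy_operation_allowed(time(23, 30)))  # False: night work blocked
```

The design question the slide raises is whether liability rules should push producers to ship such constraints by default, given that buyers value an unconstrained robot more.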
  • 144. www.lcii.eu References  R. Coase, “The Problem of Social Cost”, 3 J. Law & Econ. [1960]  R. Coase, “The Federal Communications Commission”, 2 J. Law & Econ. [1959]
  • 145. Class III: Robotic Warfare; Market Regulation; Regulatory Framework
  • 147. www.lcii.eu Goals of the lecture  Can we delegate human killing decision to an autonomous machine?  Yes, no, maybe?  What, if any, conditions should the law set?
• 149. www.lcii.eu Technology Today  Robots (with limited autonomy) are already deployed on the battlefield in areas such as bomb disposal, mine clearance and antimissile systems  No full autonomy, but increasing at rapid pace  30 nations with defensive human-supervised autonomous weapons to defend against surprise attacks from incoming missiles and rockets: Iron Dome, Phalanx CIWS, etc. Tomorrow?  Lethal Autonomous Weapons Systems (“LAWS”) aka “killer robots”  Robots that can “select and engage targets without further intervention by a human operator” (US Directive)  “Kalashnikovs of tomorrow”: unlike nuclear weapons, mass production, easy proliferation and swift circulation => the SGR-A1 costs approx. €200,000  Like nuclear weapons, risk of “a military AI arms race”
• 153. www.lcii.eu Typology (Man / Machine)  Humans in the loop (essential operator) / Non-autonomous  Humans on the loop (fail-safe) / Partially autonomous  Humans out of the loop (seek and destroy) / Fully autonomous
• 154. www.lcii.eu Prospects for warfare  Clean war  No casualties: explosive detection bots  No war crimes on the battlefield and outside  Fast war  Economic  Cuts in budgets of armed forces, including retirement and public health  R&D  But:  Psychological disconnect and self-justice (« Good Kill » movie)  Mistakes in visual recognition: identifying someone as a combatant  1960, U.S. missile attack warning system at NORAD, where an alert was received saying that the United States was under massive attack by 99 Soviet missiles, with 99.9% certainty; an amateur astronomer: “It's a beautiful night! There's a big full moon right in sector three. And I can even see icebergs down in the fjord.”  Hate by design?
• 156. www.lcii.eu Legal issues Standard  Since 2004, US program to search and kill al Qaeda and Taliban commanders  Used in Libya, Pakistan, Afghanistan, Syria  117 drone killings in 2010, see http://www.longwarjournal.org/pakistan-strikes/  In 2013, China flew a drone into contested airspace in the East China Sea. Japan reciprocated by sending a manned fighter aircraft  Act of war? IAC, NIAC? Ethical  Shall a machine be granted a license to kill without human input?  Are there decisions computers shall not make without human input?  Need a killswitch?
• 157. www.lcii.eu Ban on LAWs  Human Rights Watch: https://www.hrw.org/report/2012/11/19/losing-humanity/case-against-killer-robots  Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, 20–21, Human Rights Council, U.N. Doc. A/HRC/23/47 (Apr. 9, 2013)  United Nations held a further round of talks in Geneva between 94 military powers aiming to draw up an international agreement restricting their use
  • 158. www.lcii.eu Today  https://www.stopkillerrobots.org/2016/04/thirdmtg/  Algeria, Bolivia, Chile, Costa Rica, Cuba, Ecuador, Egypt, Ghana, Holy See, Mexico, Nicaragua, Pakistan, State of Palestine, and Zimbabwe  Convention on Conventional Weapons (CCW) Review Conference in December will decide if they will hold a Group of Governmental Experts (GGE) meeting on autonomous weapons systems in 2017
  • 159. www.lcii.eu Outright ban proponents  Moral argument, giving robots the agency to kill humans would cross red line  Reduce killing decision to cost-benefit  Deontological self limitation in human killing decision: empathy  Duty of human judgment in killing decision, bc justice hinges on human reasoning (Asaro, 2012)  Slippery slope  Decrease the threshold of war  Prospect of a generalization of warfare  Existential risk
• 160. www.lcii.eu Ban skeptics  If humanity persists in entering into warfare, which is a reasonable assumption, need to better protect non-fighter lives  Automation increases human control: “In situations where the machine can perform the task with greater reliability or precision than a person, this can actually increase human control over the final outcome. For example in a household thermostat, by delegating the task of turning on and off heat and air conditioning, humans improve their control over the outcome: the temperature in the home”.  Robots are conservative warriors: do not try to protect themselves, in particular in case of low certainty of identification + judgment not clouded by anger or fear  Robots will not replace humans: organic assets like dogs, etc.  “Law as code”: design ex ante constraints into LAWs, like Watt's governor imposing an upper bound on a steam engine's RPM (an upper bound on laser wattage, for instance)  « Against the status quo », pro « moratorium » and « regulate instead of prohibiting them entirely » (Arkin, 2016)
• 161. www.lcii.eu Critical review  Knee-jerk regulation?  We already outsource, to specialist killers that we do not know and over whom we have little control  We are faced with a possibly transitional question, which shall not obscure the possibility of machine-to-machine war where 0 human casualties become possible  Lethal weaponry already exists. LAWs simply make it accurate: weapons with 100% success rate (consider Human Rights Watch's position on the use of precision-guided munitions in urban settings: a moral imperative)  Counterfactual issue: the existing world is not clean war, but dozens of hidden war crimes  Rent seeking?  UK position driven by development of the Taranis drone  Scientists driven by vested research interest? General Leslie Groves (cited in Levy, 2006): “What happened is what I expected, that after they had this extreme freedom for about six months their feet began to itch, and as you know, almost every one of them has come back into government research, because it was just too exciting”
  • 162. www.lcii.eu Outstanding issues (Anderson and Waxman, 2012)  Empirical skepticism: can we trust technology to design safeguards?  Deontological imperative: do we want to take the human « out of the firing loop »?  Accountability: who takes the blame (incl. costs) for war crimes?  Not a yes/no question, but how to regulate?
  • 164. www.lcii.eu Laws of war  Hague Conventions (and regulations) of 1899 and 1907  Convention (II) with Respect to the Laws and Customs of War on Land and its annex: Regulations concerning the Laws and Customs of War on Land  Mostly about combatants  Provisions on warfare deemed to “contain rules of customary international law”  Article 51 of the UN Charter provides right of self-defence in case of armed attack
  • 165. www.lcii.eu Humanitarian law  In particular, Convention (IV) relative to the Protection of Civilian Persons in Time of War. Geneva, 12 August 1949  Mostly related to civilians protection  Protocol Additional (Protocol I), and relating to the Protection of Victims of International Armed Conflicts, 8 June 1977
  • 166. www.lcii.eu Disarmament law (1)  Convention on Certain Conventional Weapons (CCW) and protocols  Under UN Aegis  Compliance mechanism  Since 2013, expert meeting on LAWs
  • 167. www.lcii.eu  All prohibitions or restrictions on the use of specific weapons or weapon systems  Protocol I on Non-Detectable Fragments  Protocol II on Prohibitions or Restrictions on the Use of Mines, Booby Traps and Other Devices  Protocol III on Prohibitions or Restrictions on the Use of Incendiary Weapons, etc.  Protocol IV on Blinding Laser Weapons  Protocol V on Explosive Remnants of War Disarmament law (2)  CCW: “chapeau” convention with general provisions (1980 with 2001 amendment), including scope  Article 1 common to the Geneva Conventions of 12 August 1949.  Refers to Article 2 of Geneva Conventions of 12 August 1949 for the Protection of War Victims: “cases of declared war or of any other armed conflict which may arise between two or more of the High Contracting Parties, even if the state of war is not recognized by one of them. The Convention shall also apply to all cases of partial or total occupation of the territory of a High Contracting Party, even if the said occupation meets with no armed resistance”  Amended in 2001 to cover also “armed conflicts not of an international character occurring in the territory of one of the High Contracting Parties”
  • 168. www.lcii.eu State of discussion
Ban skeptics:
 These are process and R&D questions, which ought to be addressed at design level
 Not a trial-and-error question: “significant national investment into R&D already undertaken that will be hard to write off on ethical or legal grounds; and national prestige might be in play” (Anderson and Waxman, 2012)
Ban proponents:
 LAWs violate the provisions of the Geneva Conventions designed to protect civilians (HRW allegation: “robots with complete autonomy would be incapable of meeting international humanitarian law standards”)
 This justifies a new protocol under the CCW to ban all LAWs
  • 169. www.lcii.eu #1. Duty of review  Protocol I, Article 36 – “New Weapons”:  “In the study, development, acquisition or adoption of a new weapon, means or method of warfare, a High Contracting Party is under an obligation to determine whether its employment would, in some or all circumstances, be prohibited by this Protocol or by any other rule of international law applicable to the High Contracting Party”  Protocol I, Article 84 – “Rules of Application”  “The High Contracting Parties shall communicate to one another, as soon as possible, through the depositary and, as appropriate, through the Protecting Powers, their official translations of this Protocol, as well as the laws and regulations which they may adopt to ensure its application”  See also Protocol I, Article 35 – “Basic rules”:  “1. In any armed conflict, the right of the Parties to the conflict to choose methods or means of warfare is not unlimited. 2. It is prohibited to employ weapons, projectiles and material and methods of warfare of a nature to cause superfluous injury or unnecessary suffering. 3. It is prohibited to employ methods or means of warfare which are intended, or may be expected, to cause widespread, long-term and severe damage to the natural environment”
  • 170. www.lcii.eu Discussion  On producer and customer States  Conflict of interest?  Home industry  Public subsidies to defense R&D  All signatory States must apply it  Some States have set up formal reviews (BE), others not  But the US is not a party to Protocol I  Some contend that Article 36 is customary international law  Components and final products?
  • 172. www.lcii.eu #2. Distinction requirement
 Article 51(4) Protocol n°1: “Indiscriminate attacks are prohibited. Indiscriminate attacks are: (a) those which are not directed at a specific military objective; (b) those which employ a method or means of combat which cannot be directed at a specific military objective; or (c) those which employ a method or means of combat the effects of which cannot be limited as required by this Protocol; and consequently, in each such case, are of a nature to strike military objectives and civilians or civilian objects without distinction”
 HRW report, p.31: “a frightened mother may run after her two children and yell at them to stop playing with toy guns near a soldier. A human soldier could identify with the mother’s fear and the children’s game and thus recognize their intentions as harmless, while a fully autonomous weapon might see only a person running toward it and two armed individuals” => Visual recognition requires a subjective understanding of intention
 “Legal threshold has always depended in part upon technology as well as intended use” (A&W, 2012)
  • 173. www.lcii.eu #3. Proportionality principle
 Article 51(5)(b) prohibits “an attack which may be expected to cause incidental loss of civilian life, injury to civilians, damage to civilian objects, or a combination thereof, which would be excessive in relation to the concrete and direct military advantage anticipated”
 Civilian harm shall not outweigh military benefits
 Ex ante balancing of civilian and military harm is required
 HRW report, p.33: “A fully autonomous aircraft identifies an emerging leadership target”
 Pb 1: “if the target were in a city, the situation would be constantly changing and thus potentially overwhelming”
 Pb 2: “weigh the anticipated advantages of attacking the leader”, which may depend on the political context
 Rules out systems that “aim at other weapons” + “ethical issue of attaching weights to the variables at stake” (A&W, 2012)
  • 174. www.lcii.eu #4. “Military necessity” rule (or defense)
 Customary principle of humanitarian law
 Lethal force only for the explicit purpose of defeating an enemy
 Only to the extent of winning the war
 Respect other rules of IHL: no attack on wounded or surrendering troops
 Krishnan, 2009:
 Development of “[t]echnology can largely affect the calculation of military necessity”; and
 “Once [autonomous weapons] are widely introduced, it becomes a matter of military necessity to use them, as they could prove far superior to any other type of weapon”
 Who decides whether the necessity is political or military (persuading the enemy to surrender)?
  • 175. www.lcii.eu #5. Martens clause  Article 1(2) of Protocol 1 “In cases not covered by this Protocol or by other international agreements, civilians and combatants remain under the protection and authority of the principles of international law derived from established custom, from the principles of humanity and from dictates of public conscience”
  • 176. www.lcii.eu Reality check?  Robots may not comply with the default legal structure, but do humans?  Pentagon Intelligence, Surveillance, and Reconnaissance (ISR) Task Force: the standard for drone strikes is not “no civilian casualties”, only a “low” collateral damage estimate  More at https://theintercept.com/drone-papers/the-assassination-complex/
  • 177. Upgrading of Default Legal Structure?
  • 178. www.lcii.eu CCW discussions
 Wide attendance
 Discussion is whether autonomous systems are acceptable
 But “neither side has managed to construct a coherent definition for autonomous weapon systems for the purpose of a weapons ban” (Crootof, 2014)
 Most States believe that autonomy is acceptable as long as there is “meaningful human control” (GER: “LAW system without any human control is not in line with our command and control requirements”)
 Ban supporters: Cuba and Ecuador
 Ban opponents: the British say existing international humanitarian law (IHL) is “the appropriate paradigm for discussion”, supported by the Czechs; programming is enough
  • 179. www.lcii.eu Crootof, 2014
 Supports intentional, proactive regulation
 “An independent treaty might take one of three forms: it might attempt comprehensive regulation (like the Chemical Weapons Convention—which, in addition to banning the development, production, acquisition, stockpiling, retention, transfer, and use of certain defined chemical weapons, also outlines enforcement mechanisms), provide piecemeal regulations of specific activities (like the Nuclear Test Ban or nonproliferation treaties), or serve as a framework treaty intended to be augmented by later protocols (like the CCW itself). All of these have associated benefits and drawbacks”
 Options:
 Stationarity requirements
 Only for defensive purposes
 Only non-human targets
 Only non-lethal measures
 Only in certain areas: high sea v urban areas
  • 180. www.lcii.eu Arkin, 2009
 Ron Arkin has proposed an ethical code, designed to ensure compliance: the « ethical governor »
 First step: the LAW must evaluate the information it senses and determine whether an attack is prohibited under international humanitarian law and the rules of engagement
 Second step: the LAW must assess the attack under the proportionality test. According to Arkin, “the robot can fire only if it finds the attack ‘satisfies all ethical constraints and minimizes collateral damage in relation to the military necessity of the target’”
 A report from California Polytechnic State University, San Luis Obispo considers that robot ethical reasoning is insufficient in complex environments
 Other approaches?
 Slavery ethics
 Self-learning and strong AI (McGinnis): highly desirable, but unattainable
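Arkin's two-step governor can be caricatured in a few lines. A minimal sketch, where the predicate names (`prohibited_by_ihl`, `collateral_estimate`, `military_necessity`) are hypothetical stand-ins for components that a real system would have to ground in perception, IHL and the rules of engagement:

```python
# Caricature of Arkin's "ethical governor". Predicate names and the numeric
# proportionality comparison are illustrative assumptions, not Arkin's code.

def ethical_governor(target, prohibited_by_ihl, collateral_estimate, military_necessity):
    # Step 1: constraint check -- is the attack prohibited under IHL / ROE?
    if prohibited_by_ihl(target):
        return False  # weapon release suppressed outright
    # Step 2: proportionality -- collateral damage must not be excessive
    # relative to the concrete military advantage anticipated
    return collateral_estimate(target) <= military_necessity(target)
```

The point of the Cal Poly critique is visible here: the control flow is trivial, but making the two predicates reliable in complex environments is the unsolved part.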
  • 182. www.lcii.eu Liability v warfare
Robotic liability:
 Social desirability of technology (almost) unquestioned
 Debate is immunity or not
 Focus on default legal structure, and possible ex post adjustment of the law
 National discussion
 Possibly because harms are essentially discrete
Robotic warfare:
 Social desirability of technology challenged
 Debate is ban or not
 On all sides, voices calling for new rules and ex ante regulation
 Robo-ethics driven, « law as code »
 International discussion
 Possibly because of stronger systemic and existential risk
  • 183. www.lcii.eu References  See generally: http://www.unog.ch/80256EE600585943/(httpPages)/8FA3C2562A60FF81C12 57CE600393DF6?OpenDocument  Ronald Arkin, The Case for Banning Killer Robots: Counterpoint, Communications of the ACM, Vol. 58 No. 12, Pages 46-47  Kenneth Anderson and Matthew Waxman, Law and Ethics for Robot Soldiers, 2012  David Levy, Robots Unlimited, A K Peters, Ltd., 2006  Michael C. Horowitz & Paul Scharre, Meaningful Human Control in Weapon Systems: A Primer (Mar. 2015)  Peter Asaro, On banning autonomous weapon systems, International Review of the Red Cross, 2012  Rebecca Crootof, The Killer Robots are Here, Cardozo Law Review, 2015  Markus Wagner, “Taking Humans Out of the Loop: Implications for International Humanitarian Law,” 21 Journal of Law, Information and Science (2011)
  • 186. www.lcii.eu Competition policy
Goals:
 Allocative efficiency
 Productive efficiency
 Dynamic efficiency
Tools:
 Prohibition of collusion
 Prohibition of abuse of dominance
 Prohibition of mergers to monopoly, and others
 Israel Antitrust Authority (IAA)
  • 187. www.lcii.eu Perfect competition, 3.0?
 Increased transparency
 Lower search costs: price comparison websites (PCWs) and aggregators
 Entry and expansion
 Platforms as enablers, and the midget disruptors
 Demotion of brick-and-mortar behemoths
 AMZN v Walmart
 AMZN v GAP
 AMZN v Publishers
 Matching supply and demand
 Sharing economy, and underutilized assets
 The long tail
  • 188. www.lcii.eu Emergence
 Predominance of data-hungry business models
 Search for data advantage
 Offline players join the fray, and search for smart pricing algorithms
 Use of personal assistants to make decisions for us
 Dynamic pricing
 Use of pricing algorithms (Peter Lawrence’s book The Making of a Fly, priced at $23,698,655.93 on Amazon)
 Personalized pricing
 Octo’s insurance quotes based on drivers’ behavior
 Data explosion
 Cloud computing
 IoT
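The Making of a Fly episode is easy to reproduce: one seller priced just below its rival, the rival priced well above the first, and the two rules compounded every repricing cycle. A sketch using the multipliers reported by observers of the 2011 incident (treat them, and the starting price, as illustrative):

```python
# Two algorithmic sellers repricing against each other once per cycle.
# Multipliers are those reported for the "Making of a Fly" incident;
# neither rule is programmed to raise prices -- the spiral is emergent.

def price_spiral(start=100.0, cycles=45):
    a = b = start
    for _ in range(cycles):
        a = round(0.9983 * b, 2)      # seller A slightly undercuts B
        b = round(1.270589 * a, 2)    # B (holding no stock) prices above A
    return a, b
```

Each cycle multiplies both prices by roughly 1.27, so a $100 book passes $23 million in roughly fifty cycles.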
  • 189. www.lcii.eu Ezrachi and Stucke, 2016  « Façade of competition »  Cost of free: « Data as Currency »  From invisible hand, to « digitized hand »
  • 190. www.lcii.eu Collusion
Easy cases:
 « Messenger scenario »: rival executives collude, and defer to their algorithms to calculate, implement and police the cartel
 Evidence of horizontal agreement + liability: easy
 « Hub and spoke »: rivals do not interact, but outsource the pricing decision to an upstream supplier’s algorithm
 Boomerang Commerce
 Uber
 Evidence of vertical agreements, parallel conduct and cumulative effect + liability: quite easy
Tough cases:
 « Predictable agent »
 All firms in the industry use the same pricing algorithm
 Used to monitor each other’s list prices, and increase when sustainable
 Instant detection => conscious parallelism
 « God view and the Digital Eye »
 Each firm can see the entire economy on a giant screen
 Algorithm not programmed to increase prices, just a profit maximizer
 Tacit collusion on steroids
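The "predictable agent" worry is that a trivially simple repricing rule, if run by every firm in the industry, sustains supra-competitive prices without any agreement. A hedged sketch (the rule and the floor value are illustrative, not drawn from any real system):

```python
# Each firm's rule: match the highest observed price, but revert to a floor
# price the instant any rival undercuts. With every firm running this rule,
# deviation is detected and punished immediately -- so no one deviates, and
# the high price sticks: conscious parallelism with no agreement to prove.

def next_price(my_price, rival_prices, floor=10.0):
    if min(rival_prices) < my_price:
        return floor                       # instant retaliation: price war
    return max(rival_prices + [my_price])  # otherwise sustain the high price
```

This is why the slide flags it as a tough case: there is no messenger and no hub, only unilateral adoption of a transparent strategy.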
  • 191. www.lcii.eu Behavioral discrimination
 Perfect price discrimination
 Geographic, demographic, and other situational information
 Prices, coupons and vouchers
 Target scandal
 Cherry-picker avoidance
 Almost perfect, behavioral discrimination
 Groups of customers
 Decoys: AAPL watch, $349 to $17,000
 Price steering
 Drip pricing
 Complexity
 Imperfect willpower
  • 192. www.lcii.eu Frenemies  Superplatforms-superplatforms: friends and foes  GOOG v AMZN v FB v AAPL v MSFT  GOOG Android supports 90% of AAPL’s APIs  Superplatforms v Independent Apps  Uber v GOOG and AAPL?  Superplatforms with Independent Apps  Extraction => cooperation during cookie and data identification tech placement  Capture => uneven cut, GOOG 32%  Superplatforms with and v Independent Apps  Brightest flashlight android app  Disconnect  Personal assistants
  • 193. www.lcii.eu Remedies  UK style market investigations  Putting a price on free  Privacy by default remedies  Possible regulation, beyond antitrust
  • 194. www.lcii.eu Bottom lines for competition law  Some strategies don’t raise market failures in antitrust sense  Personal assistants  Some generate classic problems for antitrust, nothing new under the sun  Frenemies  Some invite thinking on goals of competition law  Behavioral discrimination?  Some invite thinking on gaps in competition law  Predictable Agent and God View  Behavioral discrimination  Some may create enforcement difficulties  Tacit collusion: no liability on algorithm  Detection problem
  • 195. www.lcii.eu Reinventing enforcement agencies?
Competition doctors:
 Standard mission of agencies is to remove antitrust infringements from markets
 Deterrence, specific and general: carried out ex post with fines
 Remediation for the future
 Behavioral and structural remedies
Competition engineers:
 Antitrust Hacker
 Antitrust Standardizer
 Antitrust « Digital Half »
 Antitrust Shamer
  • 196. www.lcii.eu #1: Antitrust Hacker
Scenario:
 Agencies build programs and give away software that counteracts virtual competition
 Agency cooperates with computer scientists who build software so as to technologically undermine the effectiveness of the abovementioned strategies
 Software then made widely available to customers and rivals willing to avail themselves of competitive options
 Prospects for business and tech communities
 Interface with consumer agencies
Applications:
 Anti-decoy filters that eliminate false options
 Additive data perturbation software => runs in the background of users’ sessions and visits random websites => noise
 Anti-steering filters
 Policy-checking privacy enhancing tools
 Anti-complexity software
 Clearware.org refines consent content and presents it in a more human-readable format
 Same with pricing?
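What "additive data perturbation" amounts to in practice: add bounded random noise to whatever signal the tracker observes, so individual values become unreliable while aggregates stay roughly usable. A minimal sketch (the numeric-profile framing and the scale parameter are illustrative assumptions, not a description of any deployed tool):

```python
# Additive perturbation: each observed value is shifted by bounded random
# noise drawn uniformly from [-scale, scale]. The seed parameter is only
# for reproducibility in testing.
import random

def perturb(values, scale=5.0, seed=None):
    rng = random.Random(seed)
    return [v + rng.uniform(-scale, scale) for v in values]
```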
  • 197. www.lcii.eu #2: Antitrust Standardizer
 Agencies to promote ex ante specification of
 Privacy non-intensive pricing algorithms
 Privacy Enhancing Technologies (identity verification with minimum identity disclosure, etc.)
 Antitrust compliance in AI: individually rational v socially harmful (Dolmans) + Article 22 GDPR
 Advocate introduction of antitrust « standards » with Standard Setting Organizations and/or de facto standards
 IETF, IEEE-SA, ISO, etc.
 « Dominant » platforms: OS, handsets, browsers and search engines
 De facto standards are most likely dominant platform operators
 Impose on gateway players to make it possible for users to define their own level of acceptance for all new software
 “Users’ browsers only accept interaction with Web servers whose privacy-preferences correspond to their own local preferences” (Boldt, 2007)
 Problem of disconnect between dominance (platform) and abuse (spyware firm)
  • 198. www.lcii.eu #3: Antitrust Shamer  Instant antitrust popup that warns of systematic behavioral discrimination on a website  Instant antitrust popup that suggests the user disconnect or use an alternative browser (Tor)  Permanent and updated antitrust list of privacy-intensive websites
  • 199. www.lcii.eu #4: Antitrust « Digital Half »
Scenario:
 « Digital half » of the competition agency (P. Domingos)
 Hidden, anonymous or pseudonymous
 Tacit collusion: stealth remediation; the agency acts as a maverick, posting low prices to trigger a price war
 Behavioral discrimination: the agency monitors customers on platforms and instantly informs high-price customers that other customers pay less
Discussion:
 Pros: possibility to catch infringements « red-handed »
 Cons: due process? Not possible to remedy without an infringement, simply monitor
 But interim measures? Article 8 of Regulation 1/2003
 Yet, interim measures apply to infringing firms
  • 200. www.lcii.eu Challenges Conceptual  Privacy as an antitrust problem: quality competition?  Privacy as a market failure: mistrust causes deadweight loss?  Government as spy in market? Instrumental  Change the law on behavioral discrimination (US v EU)?  Change the law on tacit collusion (US&EU)?  Remedy without a cause (no infringement)?  Cat and mouse game where market always ahead of agency?
  • 202. www.lcii.eu Proposed framework
Public interest, beyond market failures:
 Utilitarian externality
 Negative: discrete; systemic
 Positive: discrete; systemic
 Existential « existernality »
 Negative: Terminator
 Positive: La formule de Dieu
  • 203. www.lcii.eu Framework
 Discrete externality (personal; random; rare; endurable)
 Negative: harm
 Positive: benefits
 Systemic externality (local; predictable; frequent; unsustainable)
 Negative
 Substitution effect
 Privacy
 Positive
 Complementarity effect
 Generative or General Purpose technologies (Zittrain, Bresnahan)
 « Existernality » (global; fat tail; terminal)
 Negative: existential risk
 Positive: pure human enhancement
  • 204. www.lcii.eu Illustration of the framework (Drones)
 Discrete externality
 A drone crashes on the roof of a house while making a delivery
 Transports an explosive product
 Burns the house
 Systemic externality
 A drone-operated delivery system puts employment in the mail industry at risk
 Existential threat (« existernality »)
 Drones designed for war
  • 205. www.lcii.eu Discrete externalities
Litigation:
 Basic legal infrastructure and case-by-case resolution
 Decisional experimentation, with fitting exercise
Regulation:
 Experimentation
 “Tokku” Special Zone for Robotics Empirical Testing and Development (RT special zone) in open environments => test human-robot interfaces in limited areas => companies entitled to a less strict legal standard http://www.economist.com/blogs/banyan/2014/03/economic-zones-japan
 Regulatory emulation as States liberalize driverless cars
  • 207. www.lcii.eu Systemic externalities  If discrete externalities become widespread or harmful  Scope for new regulation?  Negative externalities • Tax on robotic-intensive industries • Private entitlement of rights: laws on privacy • Safety standards to solve collective action problem • Mandatory insurance or electronic personhood for robots (Leroux et al.)  Positive externalities • Subsidies for public goods issues: building of controlled environment infrastructures for driverless cars • Proactive IPR policy for innovation into robotics technologies • Immunity from liability for research on certain systemic applications? GARA precedent
  • 208. www.lcii.eu Existernalities  Calls for legal bans on specific applications  UN Campaign to Stop Killer Robots: https://www.stopkillerrobots.org/category/un/  Technical resolution of issues  Philosophers: « Creating Friendly AI 1.0 » (Yudkowsky, 2001)  Technologists: keeping technology open and competitive, https://openai.com/blog/introducing-openai/
  • 209. Liege Competition and Innovation Institute (LCII) University of Liege (ULg) Quartier Agora | Place des Orateurs, 1, Bât. B 33, 4000 Liege, BELGIUM Thank you