Beautiful day out there. Thank you for joining us here today. It gives me great pleasure to introduce you to two thought leaders who actually inform, inspire, and shape my own thinking about the relationships between technology and society on at least a weekly basis. And I'm not kidding. It's really fantastic to have them for an hour and a bit to talk about a big topic: AI and society.
Iyad is an associate professor at the MIT Media Lab, where he leads the Scalable Cooperation group, among other things. He has done really amazing work over the last couple of years looking at the interplay between autonomous systems and society, and at how these systems should interact with each other. He recently published a study in Science that got a lot of press coverage, addressing the question of whether we can program moral principles into autonomous vehicles, and maybe he will talk a bit more about that.
And then, of course, Joi, Director of the MIT Media Lab and professor of practice, a person who doesn't really need an intro, so I'll keep it extremely brief, just by highlighting two of the must-reads from recent months. One is an interview he had, a conversation actually, with President Obama in Wired magazine on the future of the world, addressing AI issues among other topics. And the other is his book, Whiplash, which is somehow a survival guide for the faster future that we're all struggling with. I highly recommend it as a reading; I greatly benefited from it.
So, these are not only two amazing thought leaders. They're also wonderful collaborators and colleagues, and I have the great privilege, through the Berkman Klein team, to work with both of them as part of our recently launched joint venture, the AI Ethics and Governance Initiative. It's just wonderful to have you here, to spend some time with all of us and share your thoughts. So thank you very much, and welcome.

(applause)
Thank you. First of all, some of you may be here thinking, "Wait, this isn't the talk that I signed up for." So, to just give you some of the provenance of this: originally, I think, there was a book talk that I was going to do with Merckman, and then I said, "Oh, well, why don't we bring somebody else interesting in?" and Josh joined. We were going to have a dialogue about his book and my book. And then he had a family emergency and couldn't make it. I grabbed Iyad, and also realized, just as Urs was saying, that we're doing a lot of work with the Berkman Center on AI and society, and I thought this would be a sufficiently relevant topic to what we were going to talk about anyway, so it wouldn't be that much false advertising. And it was sort of an idea that I think relates to my book as well.
One thing: I can't remember who it was, but a well-known author told me, when you give book talks, don't explain your whole book, because then no one will have to buy it. So this book actually started about four years ago, and we were just wrapping it up as we saw a lot of this AI-and-society controversy and interest start. So the book actually sort of ends where our exploration of AI and society begins. In a way, it overlaps with what the book is about, but is sufficiently different that you have to read the book in order to understand the whole story.
But let me... I'll just start with a few remarks, we'll have ... present some of his work, and then we'll have a conversation with all of you. Feel free to interrupt and ask questions, or disagree.

I co-taught a class with Jonathan in January, in the winter semester. The traditional course he teaches is called Internet and Society: The Politics and Technology of Control. ... was there, others were there; it was a fun class.
But one of the sort of framing pieces of how we talked about this was this Lessigian picture that many of you may have seen in his book, where you have law at the top, and then you have markets on one side, and you have norms on the other, and you have technology underneath, and you have you in the middle. And somehow what you are able to do is sort of determined by this relationship between law and technology (I think technology is on top and law is down here).
But anyway, somehow these all affect each other. So you can create technologies that affect the law, you can create laws that affect norms, you can create norms that affect technology. So some relationship between norms, markets, law, and technology is how we need to be thinking in order to design all of these systems so they work well in the future.

I think one of the key reasons why the collaboration between MIT and Harvard Law School, the Media Lab and Berkman, is so important is that you kind of have to get all of the pieces and the people in the same room. Because the problem is, once everyone has a solution and they're trying to convince each other of the solution, it's... I call them people selling dollhouses rather than Legos. What you want is a whole pile of Legos, with lawyers and business people and technologists and policy makers playing with the Legos, rather than trying to sell each other their own dollhouses.
That's what was sort of fun with the class: I think a lot of lawyers realized that actually, in fact, whether you're talking about Bitcoin or differential privacy or AI, we still have a lot of choices to make on the technology side, and in fact those can be informed by policy and law. And conversely, I think a lot of the technologists thought that law was something like the laws of physics, that just are. But in fact laws are the result of lawyers and policy makers talking to technologists, imagining what society wants.
So we're sort of in the process right now of struggling through how we think about this. But importantly, it's already happening, so it's not like we have that much time. I think it was Pedro Domingos who says in his book, The Master Algorithm, and this isn't exact, I'm paraphrasing the quote, but it's something like: I'm less afraid of a superintelligence coming to take over the world, and more worried about a stupid intelligence that's taken over already. You know? I think that's very close to where we are.
I think if you saw Julia Angwin's article in ProPublica, I guess it was a little over a year ago, she happens to find a district where they're forced to disclose court records. So she was specifically going after the fact that machine learning, AI, is now used by the judiciary to set bail, to do parole, and even sentencing. And they have this thing called the risk score, which the machine sort of pops up after it does an assessment of a person's history and looks at their interviews. And she found, and this is great, because she's a mathematician in a data sense, she crunched all these numbers, and it shows that for many cases, for white people, it's sort of nearly random. So it's a number, but it's still almost random. And then for black people, it's biased against them.
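The kind of check described here, comparing how a risk score errs across groups, can be sketched roughly as follows. The records and numbers are invented for illustration; this is not Angwin's data or method, just the shape of the computation: among people who did not reoffend, what fraction in each group was still flagged high risk?

```python
# Hypothetical records, one per person: (group, scored_high_risk, actually_reoffended).
# These values are made up purely to illustrate the computation.
records = [
    ("A", True,  False), ("A", False, False), ("A", True,  True), ("A", False, True),
    ("B", True,  False), ("B", True,  False), ("B", True,  True), ("B", False, True),
]

def false_positive_rate(group):
    """Among people in `group` who did NOT reoffend, the share scored high risk."""
    non_reoffenders = [r for r in records if r[0] == group and not r[2]]
    flagged = [r for r in non_reoffenders if r[1]]
    return len(flagged) / len(non_reoffenders)

print(false_positive_rate("A"))  # 0.5
print(false_positive_rate("B"))  # 1.0
```

With these toy numbers, group B's non-reoffenders are flagged twice as often as group A's, which is the kind of disparity such an audit is looking for.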
And what's interesting is, when I talked to ..., a prosecutor, the other day, he said, well, they love these numbers, because you get a risk score that says, okay, this person has a risk rating of 8, and so then the court can say, "Okay, we'll give you this bail." Because the last thing that they want is to give somebody bail and then the person goes out and murders somebody; it's sort of their fault. If they've taken the risk score, they can say, "I just looked at the risk score," and it absolves them of this responsibility. And so there's this really interesting question: even at random, there's this weird moral hazard, where even though you have agency, you're able to push off this responsibility to the machine, right? And then you can sort of say, well, it was math.
And the problem right now is that these algorithms are running on data sets and rating systems that are closed. We see this happening in a variety of fields. I think we see this happening in the judiciary, which is a scary place for it to be happening. And so as part of this initiative, with the AI fund that we're doing, we're going to try and look at whether we can create more transparency and auditability.
We're also seeing it in medicine. There's a study that I heard of where, when a doctor overruled the machine in diagnostics, the doctor was wrong 70% of the time. So what does that mean? If you're a doctor and you know for a fact that you're 70% likely, on average, to be wrong, are you ever going to overrule the machine? And what about the 30% of cases where the doctors are right? So it creates a very difficult situation.
You look at... imagine war. We talk about autonomous weapons, and there's this whole fight about it. But what if all of the data... and not what if: in fact, all of the data that's driving intelligence, the way that you get onto the termination list as a target, a lot of it involves statistical analysis of your activity, your emotions, your calls. And there's this great interview, I think it was in the Independent. There was this guy who, I think, was in Pakistan. I'm gonna get this wrong, but it's close.
He had been attacked a number of times, where the collateral damage was family members being dead, so he knew he was on the kill list, but he didn't know how to get off. So he goes to London to kind of fight: "Wait, look at me, talk to me. I'm on this kill list, but I'm not a bad guy. Somehow you got the wrong person." But there's no interface through which he can sort of lobby and petition to get off this kill list.
So even though the person controlling the drone strike and pushing the button may be a human being, if all of the data, or a substantial amount of the data, feeding into the decision to put the person on the kill list is from a machine, I don't know how that's that different from the machine actually being in charge. So we talk about these future autonomous systems and robots running around and killing people as a sort of scary thing. But if we are just pushing a button that the robot tells us to push, the robot says A, B, C, or D, and if the robot says it's C, you're going to push C. Apparently that was how Kissinger controlled Nixon, through his elbow; the answer was always C.
But anyway, the point is that, when we think about practice, we may already be in autonomous mode in many things. And then I'm going to tee it up to Iyad, because one of the first places where the rubber meets the road is with autonomous vehicles. And a lot of the people that I talk to say that the real soul-searching around this is going to happen when the next big autonomous vehicle accident happens, where it's clearly the machine's fault. How is that going to play out?
So that may be one of the things. But the last thing that I'll say is that I think this is where the Media Lab is excited: I think it's kind of an interface design problem. Because part of the problem is that you may think that by pushing the button, the right to overrule the computer, the right to launch the missile, may be in your finger, but if you have no choice, morally or statistically, other than to push the button, you're not in charge anymore, right? So what I think we need to think about is how do we bring society and humans into the decision-making process, so that the answer that we derive involves human beings. And how does that interface happen? What is the right way to do it? Because I think what we are going to end up with is collective decision making with machines, and what we want to not be in is human agency with no real decision-making ability. And then we can talk more about some of the ideas, but I'll hand it over to Iyad.
Thank you. So I'll just give a short overview of the research we've been doing on autonomous vehicles. I'm not a driverless car expert; I don't build driverless cars. But I'm interested in them as kind of a social phenomenon, and the reason has to do with this dilemma that Steve will keep discussing.
You know, what if an autonomous car is going to, for some reason, harm a bunch of pedestrians crossing the street, because the brakes are broken, or because they jumped in front of it, or whatever, but the car can swerve and kill one bystander on the other side in order to minimize harm, in order to save five or ten people. Should the car do this? And who should decide? And more interestingly, what if the car could swerve and hit a wall, harming the passenger or killing the passenger, in order to save these people? Should the car do this as well? What does the car have a duty towards? Minimizing harm, a utilitarian principle? Protection of the owner or passengers in the car, a duty toward them? Or something else, a sort of negotiation in between? Do we ignore this problem?
Do we just say, well, let the car deal with this problem? And it seems to be a very controversial topic, because there are lots of people who love this, and lots of people who hate this. And the people who hate it say, "Well, this is never going to happen; it's just so statistically unlikely." I think that kind of misses the point, because this is an in vitro exploration of a principle: you strip away all of the things that don't matter in the real world so you can isolate the factor. You know, does drug X cause this particular reaction in a cell, for example? You don't do this in the forest; you do it in a petri dish. And this is the petri dish for studying human perception of machine ethics, and what other factors people seem to be ticked off by.
I think when we started studying this, we used techniques from social psychology: we framed these problems to people, and we varied things, the number of people being sacrificed or otherwise, whether there's an act of omission versus an act of commission, and things like this. And we were sort of interested in how people want to resolve this dilemma.
What's fascinating is that there was something so obvious that we missed it initially, and that was: it's not really an ethical question, it's more of a social dilemma. So it's a question about how you negotiate the interests of different people. And this was the sort of strongest finding that we found: no one wants to be in a self-sacrificing car, but they want the whole world to drive one. And it's really fascinating how strong that effect is.
You know, if you look at the morality of sacrifice, and this is if you kill a pedestrian to save ten, kill a passenger to save ten, and so forth, you can see that people think it's moral and desirable, in both my car and other cars, to sacrifice other people for the greater good. So I'm happy to kill pedestrians to save ten; that's great. But as soon as you tell me, "Well, would you sacrifice yourself? Would you sacrifice your passenger?" it becomes: well, I think it's moral, I think it's great, but I would never want this in my car. Not in other cars, and definitely not in my car. This is where you see these things split.
Now, this is the tragedy of the commons, right? I want public safety to be maximized; I would like the world to be a safer place, where the cars might make the decisions that minimize harm. But I don't want to contribute to this public good; I wouldn't want to pay the personal cost needed to do this. So we thought: maybe regulation? You know, that's how public goods problems are solved. Let's set a quota on the number of sheep that can graze so we don't overrun the pasture. Or let's set a quota on the number of fish you can catch, so you don't overfish and kill all the fish and basically everybody loses out.
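The commons structure being described can be put into a toy payoff model. All numbers below are made up to illustrate the incentive, not taken from the study: each driver picks a "utilitarian" car (one that would sacrifice its own passenger to minimize total harm) or a "self-protective" one.

```python
def driver_payoff(my_car_utilitarian: bool, fraction_others_utilitarian: float) -> float:
    """Toy payoff for one driver; constants are illustrative, not empirical."""
    # Everyone benefits from the other utilitarian cars on the road...
    public_safety = 10.0 * fraction_others_utilitarian
    # ...but a utilitarian car imposes a personal risk on its own passenger.
    personal_cost = 3.0 if my_car_utilitarian else 0.0
    return public_safety - personal_cost

# Whatever everyone else does, choosing the self-protective car pays more for me:
print(driver_payoff(False, 0.9) > driver_payoff(True, 0.9))  # True
# ...yet a world of all-utilitarian cars beats a world with none:
print(driver_payoff(True, 1.0) > driver_payoff(False, 0.0))  # True
```

That pair of inequalities is the dilemma in miniature: individual incentives point one way, the collectively best outcome the other, which is why regulation comes up next.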
And we asked people whether they would support this, and we found that people think it's moral, but they don't want it to be legally enforced. At least for now, right? This is the PR problem, and maybe we need to develop the law so that people can feel comfortable with what this means.
(audience) I'd just like to ask a question, because we talk a lot about the evolution of cultural things, and I assume all of these are people, or I guess you don't know, but most of these are people who have never been in a self-driving car, right? And I think one of the things we found, and again, this is not my work but some of our colleagues': they do this self-driving-car, Uber-type thing that you can map, and it was actually for the normal public. And their impression of the safety of self-driving cars changed substantially after they had experienced it for a little while, and they sort of anecdotally felt safer than with Dad.
So I think once you're in a self-driving car and see how much control it has, your view on the safety changes, as well as its... And the other thing that happens, and this may happen more in Japan than in the US: in Japanese culture you sort of identify with machines and tools like that, and people start to feel trust with the machine, which I think, unless you experience it, you can't imagine. Anyway.
I agree. I think there are all sorts of things. We're now interested in studying, for example, agency perception. You know, do people see these things as having minds, and if not, why not? What's the missing component? Which becomes really interesting with drones, for example.
So the other thing is, when we ask people... well, again, people think it's moral to sacrifice, but they don't want it to be regulated, and they would definitely not buy the car if it's regulated; they're much less likely to purchase those cars if they were regulated. I think this is a really important question. If people don't purchase those cars, you will not save lives.
I mean, scientists estimate that 90% of accidents today are due to human error. So, assuming the technology gets there and assuming we have wide adoption, the sooner that happens, the sooner we save more lives. But if people are so worried about edge cases, or that their own safety is not paramount, they may not purchase the cars, and we may not therefore have wide adoption, and as a result...

- We can map this onto the quadrants: this is clearly one that you can't just leave up to the market, if people aren't buying the thing that they believe has a common good.
- Exactly. And if you regulate it, there's a backfire effect, which is: well, fine, that's great, that's a good social contract for other people, but I will continue to drive my own car, and probably be more likely to kill myself as a result. So people are not rational in the way they assess risks like getting on a plane, or "will I be eaten by a shark?" You know, people overestimate those risks, and there's a good chance that if we don't trust those systems, then we will overestimate those risks too and prefer to drive ourselves.
So we have the ethical dilemma we started from; then we realized it's a social dilemma; but now we're realizing there's a meta-ethical dilemma, which is: if you solve the social dilemma by using regulations, you may actually create a bigger dilemma, a bigger trolley problem, which is, do we continue to drive cars ourselves, or do we lead to wide adoption of autonomous vehicles?
So we want to collect more data; we want to understand this issue in more nuanced ways. And we started, I'm gonna move fast on this, we started collecting data. These things have made it into transportation regulations now, or guidelines, which is good. And we've created a website called Moral Machine, in which we randomly generate scenarios. So in this case it's not just one versus ten, or one versus five; there's a dog in there, and we've varied the ages: sometimes they're children, sometimes they're pregnant women, sometimes people are crossing at red lights. And so, do they deserve the same level of protection? Isn't that interesting? This group here: what if they're children? Do they, should they, are they expected to know not to cross at the red light? And so it gets really hairy really quickly.
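A randomized generator of this kind can be sketched roughly as follows. This is a hypothetical illustration of the idea, not the actual Moral Machine code; the character list and field names are invented:

```python
import random

# Factors mentioned in the talk: who is in each group, how many, and whether
# the pedestrians are crossing legally. All names here are illustrative.
CHARACTERS = ["adult", "child", "pregnant woman", "dog"]

def random_group(max_size=5):
    """One group of characters standing in a possible path of the car."""
    return [random.choice(CHARACTERS) for _ in range(random.randint(1, max_size))]

def random_scenario():
    """One binary dilemma: stay on course (omission) vs. swerve (commission)."""
    return {
        "stay_course_victims": random_group(),  # harmed if the car does nothing
        "swerve_victims": random_group(),       # harmed if the car swerves
        "crossing_on_red": random.choice([True, False]),
    }

scenario = random_scenario()
# e.g. {'stay_course_victims': ['child', 'dog'], 'swerve_victims': ['adult'],
#       'crossing_on_red': True}
```

Randomizing the factors independently like this is what lets the analysis later separate, say, sensitivity to age from sensitivity to legality.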
You know, these are some cartoons, very simplified scenarios, but I think they still bring out lots of interesting questions, and we show people the results.

(audience groaning at result)

We show people results. This is a former student of mine who has a cat; he's happy to kill babies to save cats. We also show people how much they care about different factors, and how that compares with others. So people love this, because it kind of holds a mirror up to their own morality.
You know, do I care about the law a lot, and how do I compare with other people on this matter? Do I protect passengers more than other people, or less? And so on. We also have this design mode where people can create their own scenarios and get a link to them, and a lot of people have been using them to teach ethics in high schools and universities.
And we have all sorts of, you know, species preferences: should social value be taken into account, should age be taken into account, and so forth. And we also evaluate whether there is an omission/commission distinction: whether the action that minimizes harm is an omission or a commission, and there is definitely a bias. We're now running the analyses.
So far we've translated this into ten languages, and we've received three million users who have completed more than 28 million decisions, binary choices, and we have 300,000 full surveys, and this is still growing fast. These full surveys allow us to tease out whether these people have cars themselves, and which age bracket and which income bracket they come from, and so on.
-
Not Synced
This is really interesting because then you can
-
Not Synced
start
-
Not Synced
saying, well, people who
-
Not Synced
have cars may be
-
Not Synced
more or less likely
-
Not Synced
to support this particular
-
Not Synced
ethical framework.
-
Not Synced
We have a lot of global coverage and so far we've
-
Not Synced
been looking at
-
Not Synced
cross cultural differences
-
Not Synced
and because this is recorded
-
Not Synced
I don't want to talk about it
-
Not Synced
yet but basically
-
Not Synced
we're observing some very
-
Not Synced
interesting cross cultural differences
-
Not Synced
in terms of the degree to
-
Not Synced
which people are utilitarian
-
Not Synced
or to which they would prioritize
-
Not Synced
the passengers
-
Not Synced
to which they're willing to take
-
Not Synced
an action so omission vs
-
Not Synced
commission, and so forth.
-
Not Synced
I think it's really fascinating
-
Not Synced
and it would be a very important
-
Not Synced
precondition to any sort of
-
Not Synced
public relations effort to make the
-
Not Synced
cars more acceptable
-
Not Synced
but also potentially to the differences in the
-
Not Synced
legal frameworks
-
Not Synced
as well.
-
Not Synced
We're also beginning to look at partial autonomy,
-
Not Synced
so whether it's autonomous cars or
-
Not Synced
drones
-
Not Synced
or judges making bail decisions,
-
Not Synced
again, you can have a
-
Not Synced
machine do everything or
-
Not Synced
you can have a human do everything.
-
Not Synced
And in the car you
-
Not Synced
have things like driver
-
Not Synced
assistance, so the person is
-
Not Synced
in control and the machine
-
Not Synced
sort of watches over them
-
Not Synced
so Toyota has been promoting
-
Not Synced
this model and other car makers as well
-
Not Synced
but also there's
-
Not Synced
autopilot, where the machine does
-
Not Synced
the things and the human
-
Not Synced
kind of has to keep an eye on
-
Not Synced
whether it's a car or anything else and
-
Not Synced
then you have full autonomy.
-
Not Synced
The question here we're interested in
-
Not Synced
is we're comparing these
-
Not Synced
models and we're investigating
-
Not Synced
empirically whether
-
Not Synced
people assign
-
Not Synced
different degrees of blame
-
Not Synced
and responsibility depending on the
-
Not Synced
control architecture, as we can call it.
-
Not Synced
Whether a person overriding a decision made by
-
Not Synced
a machine is different from a machine
-
Not Synced
overriding a decision made by a human
-
Not Synced
and it happens
-
Not Synced
again this is now in submission,
-
Not Synced
But it happens to really matter.
-
Not Synced
It really matters who you think is
-
Not Synced
ultimately responsible and who is liable
-
Not Synced
and I think this is a sort of psychological
-
Not Synced
input to potential legislation that
-
Not Synced
could come up.
-
Not Synced
to deal with these scenarios
-
Not Synced
so this is a broader picture
-
Not Synced
that I like, which I think Joey
-
Not Synced
alluded to initially,
-
Not Synced
which is that there is a gap in
-
Not Synced
between
-
Not Synced
and on one side we have engineers
-
Not Synced
who think everything is an engineering
-
Not Synced
problem, you know,
-
Not Synced
everything can be engineered away,
-
Not Synced
and you have people from
-
Not Synced
the humanities and social sciences
-
Not Synced
who study the nuances of human
-
Not Synced
behavior
-
Not Synced
but also who know how rules
-
Not Synced
can get sort of abused
-
Not Synced
and have a good sort of knack
-
Not Synced
for this. You know, how do you
-
Not Synced
ensure you have a coherent system of
-
Not Synced
ethics and values and checks and
-
Not Synced
balances and so on
-
Not Synced
I think that these sides often don't
-
Not Synced
talk to each other so I think
-
Not Synced
there is a sizable community of
-
Not Synced
people who complain, who
-
Not Synced
are very good at identifying problems
-
Not Synced
and violations of fairness and rights.
-
Not Synced
and so on
-
Not Synced
but we don't have the tools
-
Not Synced
to express these objections
-
Not Synced
in a way that computer scientists
-
Not Synced
can operationalize.
-
Not Synced
Likewise, we have machine
-
Not Synced
learning
-
Not Synced
and there are scientists who feel
-
Not Synced
that this is problematic,
-
Not Synced
who can see that this can cause
-
Not Synced
problems, this can
-
Not Synced
violate
-
Not Synced
people's rights but again
-
Not Synced
they don't have the intellectual
-
Not Synced
framework to raise these issues in
-
Not Synced
a way that humans as a society
-
Not Synced
can evaluate. So what we're hoping
-
Not Synced
to do, and this is part of the partnership
-
Not Synced
between the Media Lab and the Berkman Center.
-
Not Synced
The Berkman Center is from this side and they
-
Not Synced
understand us,
-
Not Synced
and we come from this side from technology and we
-
Not Synced
work on interfaces,
-
Not Synced
and we hope through this
-
Not Synced
we will make some interesting frameworks.
-
Not Synced
This is, I think, where
-
Not Synced
many of the interesting questions are.
-
Not Synced
So I think we're ready for a discussion and taking some
-
Not Synced
questions.
-
Not Synced
I guess the one other part I would add
-
Not Synced
to this is that
-
Not Synced
just one other axis, is going back
-
Not Synced
to the judiciary, but we can have
-
Not Synced
this
-
Not Synced
in cars as well. On the one hand,
-
Not Synced
I don't think anybody
-
Not Synced
thinks that speeding tickets issued
-
Not Synced
by speed cameras on the highway
-
Not Synced
are, I mean some people may
-
Not Synced
not like them, but
-
Not Synced
an inappropriate use of a machine,
-
Not Synced
because it's really a fact. There's a speed
-
Not Synced
that you're allowed to go, and the
-
Not Synced
machine is more likely to measure your
-
Not Synced
speed than
-
Not Synced
a human eyeballing it,
-
Not Synced
and probably more fair.
-
Not Synced
On the other hand, I don't think anyone believes
-
Not Synced
that Supreme Court
-
Not Synced
decisions, at least for now, should
-
Not Synced
give really that much of a substantial role
-
Not Synced
to the machines, at least in the deliberation
-
Not Synced
part. So there's a spectrum, there's this
-
Not Synced
thing where on one end
-
Not Synced
you're just establishing a fact, which
-
Not Synced
is sort of implementation of the law,
-
Not Synced
which we're not even disputing
-
Not Synced
the justice of it, to
-
Not Synced
the Supreme Court, which
-
Not Synced
is supposed to try and reflect the norms
-
Not Synced
of the day making
-
Not Synced
determinations about laws.
-
Not Synced
but then there's a continuum in between;
-
Not Synced
somewhere in the middle there
-
Not Synced
you have this uncanny place where
-
Not Synced
it feels like the machines have some
-
Not Synced
influence and I think what's
-
Not Synced
kind of interesting is just about all
-
Not Synced
of these hypotheticals we have
-
Not Synced
there's one extreme where you do want
-
Not Synced
the machines in charge,
-
Not Synced
there's another extreme where you
-
Not Synced
do want humans in charge.
-
Not Synced
Those are actually not that difficult.
-
Not Synced
There's a space in between them
-
Not Synced
and I think that's why it's kind of
-
Not Synced
an interface problem
-
Not Synced
is that it's very unclear how the human
-
Not Synced
and machine pieces, whether it's a societal thing
-
Not Synced
or an individual get together so that's
-
Not Synced
again, it's related to the autonomy question
-
Not Synced
but I think it's... and I think it's
-
Not Synced
sort of technology, and is it ethics or
-
Not Synced
morality, there's some sort of stack as well.
-
Not Synced
And maybe everything to an internet person
-
Not Synced
looks like a stack.
-
Not Synced
Maybe that's my problem.
-
Not Synced
I think there's sort of an interesting thought
-
Not Synced
experiment which was
-
Not Synced
you know, suppose that we
-
Not Synced
I think we need tools too, it's not
-
Not Synced
just a legal question. New kinds of tools
-
Not Synced
and new kinds of data can make a
-
Not Synced
big difference. So let's assume
-
Not Synced
that we invented the cars
-
Not Synced
and they started going at high speed
-
Not Synced
but we didn't invent radar
-
Not Synced
that can accurately measure speeds.
-
Not Synced
so we relied on human guesstimation
-
Not Synced
of your driving speed. So there's
-
Not Synced
a policeman standing, sort of
-
Not Synced
eyeballing cars and (thinks)
-
Not Synced
that sort of looks like 120, right?
-
Not Synced
You can very well imagine that
-
Not Synced
under this scenario, if policemen
-
Not Synced
were discriminating against one particular group,
-
Not Synced
they might overestimate the speed of
-
Not Synced
people driving cars from
-
Not Synced
that particular ethnic group,
-
Not Synced
and underestimate the speed of
-
Not Synced
other people.
-
Not Synced
But somehow the tool solves this
-
Not Synced
question because it makes the
-
Not Synced
final, you know, it's recorded
-
Not Synced
and somehow it becomes objective.
-
Not Synced
It becomes a fact.
-
Not Synced
And we haven't, you know,
-
Not Synced
It's not disputable, so can we do
-
Not Synced
something
-
Not Synced
similar here and we say
-
Not Synced
-But I think that's where it hits a slippery slope.
-
Not Synced
So if you're measuring the speed of a car,
-
Not Synced
it's a very small number of data points the
-
Not Synced
machine is getting to guess
-
Not Synced
your speed but the risk rating
-
Not Synced
to some people may seem very
-
Not Synced
scientific especially if they don't
-
Not Synced
understand math and statistics
-
Not Synced
and so they may say the machine
-
Not Synced
rating said they have a risk of this
-
Not Synced
and actually in the forms, they never
-
Not Synced
ask you your race.
-
Not Synced
It just turns out that when you
-
Not Synced
collect the data and you collect
-
Not Synced
the questions the result is biased
-
Not Synced
against race and so
-
Not Synced
one of the questions of what's difficult
-
Not Synced
is if you don't understand how these
-
Not Synced
algorithms convert data into results,
-
Not Synced
and this is the problem with the
-
Not Synced
black box thing. A lot of
-
Not Synced
the machines and again there's progress
-
Not Synced
on making machines that can explain how
-
Not Synced
they got to the decision
-
Not Synced
but a lot of the machines we
-
Not Synced
currently use are unable to
-
Not Synced
describe how they got the number,
-
Not Synced
they just give you the number.
-
Not Synced
So if I may pick up on that and
-
Not Synced
ask a first question?
-
Not Synced
So, this question of the normativity
-
Not Synced
of the autonomous system and who makes
-
Not Synced
where's the source of the norm, that
-
Not Synced
seems to be a key question. And I'm wondering
-
Not Synced
picking up on your earlier description whether
-
Not Synced
we're on a particular trajectory
-
Not Synced
by what you described. I think
-
Not Synced
there are roughly three phases I've heard.
-
Not Synced
One is, okay, we have these autonomous vehicles
-
Not Synced
and now it's a question for
-
Not Synced
lawmakers and regulators:
-
Not Synced
Do we apply existing norms to
-
Not Synced
these new technologies?
-
Not Synced
Sometimes you need to update
-
Not Synced
the regulations, which we
-
Not Synced
see happening
-
Not Synced
you made reference to that. But
-
Not Synced
there is also a second phase
-
Not Synced
it seems, where it is: can we
-
Not Synced
somehow program some of the values and laws and
-
Not Synced
rules into the systems themselves so the behavior
-
Not Synced
is closer to what we have normative consensus around
-
Not Synced
in society, and as lawmakers and
-
Not Synced
policymakers?
-
Not Synced
And then there's a potential third phase
-
Not Synced
I'm particularly interested in your views
-
Not Synced
whether that is indeed a trajectory
-
Not Synced
in the area you study or
-
Not Synced
more broadly potentially
-
Not Synced
that as you could envision a future
-
Not Synced
where more and more data
-
Not Synced
accumulates in systems like
-
Not Synced
autonomous vehicles based on
-
Not Synced
the rules we program them
-
Not Synced
how to behave and how they learn
-
Not Synced
how these rules are obeyed or not
-
Not Synced
what the compliance rate is and the like
-
Not Synced
where suddenly the norm itself becomes
-
Not Synced
computer- or machine-generated, and how do
-
Not Synced
we feel about that because that
-
Not Synced
may inadvertently get us to the other end of the
-
Not Synced
spectrum that you're describing, that the
-
Not Synced
norms are no longer developed here
-
Not Synced
and then somehow programmed into the
-
Not Synced
system
-
Not Synced
but at least evolution of the norm
-
Not Synced
happens in the automated system.
-
Not Synced
I think you would have to tease apart the
-
Not Synced
norms and the laws
-
Not Synced
One of my - says the engineer to the law student -
-
Not Synced
(laughing)
-
Not Synced
My favorite one is a chart that Jot gave me
-
Not Synced
but I did this in Japan so I can
-
Not Synced
do this here.
-
Not Synced
Imagine you have a car
-
Not Synced
and there are two motorcycles,
-
Not Synced
one on the left and one on the right.
-
Not Synced
the one on the left has no
-
Not Synced
helmet, the one on the right
-
Not Synced
is wearing a helmet. And there's
-
Not Synced
a helmet law. The guy on the left is clearly
-
Not Synced
breaking the law, completely disrespecting the law.
-
Not Synced
Someone jumps in
-
Not Synced
front of your car. You have to swerve.
-
Not Synced
Do you hit the guy without the helmet
-
Not Synced
or do you hit the guy with the helmet?
-
Not Synced
The guy with the helmet is more likely to survive,
-
Not Synced
but he's following the law.
-
Not Synced
So who hits the guy without the helmet?
-
Not Synced
I did this at a Japanese car company
-
Not Synced
and half of them in the room
-
Not Synced
raised their hands and said, well,
-
Not Synced
of course you go after the guy who
-
Not Synced
broke the law, right?
-
Not Synced
But this is a very interesting normative
-
Not Synced
question
-
Not Synced
so there are all these versions of it.