Beautiful day out there. Thank you for joining us here today. It gives me great pleasure to introduce you to two thought leaders who actually inform, inspire, and shape my own thinking about the relationships between technology and society on at least a weekly basis. And I'm not kidding. It's really fantastic to have them here for an hour and a bit to talk about a big topic: AI and society.
Iyad is an associate professor at the MIT Media Lab, where he leads the Scalable Cooperation group, among other things. He has done really amazing work over the last couple of years looking at the interplay between autonomous systems and society, and how these systems should interact with each other. He recently published a study in Science that got a lot of press coverage, addressing the question of whether we can program moral principles into autonomous vehicles, and maybe he will talk a bit more about that.
And then, of course, Joi, Director of the MIT Media Lab and Professor of the Practice, a person who doesn't really need an intro, so I'll keep it extremely brief, just by highlighting two of the must-reads from recent months. One is an interview he had, a conversation actually, with President Obama in Wired magazine on the future of the world, addressing AI issues among other topics. And then there's his book, Whiplash, which is somehow a survival guide for the faster future that we're all struggling with. I highly recommend it as a reading; I greatly benefited from it.
So, these are not only two amazing thought leaders. They're also wonderful collaborators and colleagues, and I have the great privilege, through the Berkman Klein team, to work with both of them as part of our recently launched joint venture, the AI Ethics and Governance Initiative. It's just wonderful to have you here and spend some time with all of us and share your thoughts, so thank you very much, and welcome.

(applause)
Thank you. First of all, some of you may be here thinking, "Wait, this isn't the talk that I signed up for." So just to give you some of the provenance of this: originally, I think, there was a book talk that I was going to do with Merckman, and then I said, "Oh, well, why don't we bring somebody else interesting in," and Josh joined. We were going to have a dialogue about his book and my book. And he had a family emergency and couldn't make it. So I grabbed Iyad, and also realized, just as Urs was saying, we're doing a lot of work with the Berkman Center on AI and society, and I thought this would be a sufficiently relevant topic to what we were going to talk about anyway, so it wouldn't be that much false advertising. And it was sort of an idea that I think relates to my book as well.
One thing, and I can't remember who it was, but a well-known author told me: when you give book talks, don't explain your whole book, because then no one will have to buy it.

So this book actually started about four years ago, and we were just wrapping it up as we saw a lot of this AI and society controversy and interest start. So the book actually sort of ends where our exploration of AI and society begins. So in a way, it overlaps with what the book is about, but is sufficiently different that you have to read the book in order to understand the whole story.
But let me... I'll just start with a few remarks, we'll have .... present some of his work, and then we'll have a conversation with all of you, and feel free to interrupt and ask questions or disagree.
I co-taught a class with Jonathan in January, in the winter semester. His traditional course is called Internet and Society: The Politics and Technology of Control. .... was there, others were there; it was a fun class.
But one of the sort of framing pieces of how we talked about this was this sort of Lessigian picture that many of you may have seen in his book, where you have law at the top, and then you have markets on one side, and you have norms on the other, and you have technology underneath, and you have you in the middle. And somehow what you are able to do is sort of determined by this relationship between law and technology (I think technology is on top and law is down here).
But anyway, somehow these all affect each other. So you can create technologies that affect the law, you can create laws that affect norms, you can create norms that affect technology. So some relationship between norms, markets, law, and technology is how we need to be thinking in order to design all of these systems so they work well in the future.

I think one of the key reasons why the collaboration between MIT and Harvard Law School, the Media Lab and Berkman, is so important is that you kind of have to get all of the pieces and the people in the same room. Because the problem is, once everyone has a solution and they're trying to convince each other of the solution, it's, I call them, people selling dollhouses rather than Legos. What you want is a whole pile of Legos, with lawyers and business people and technologists and policy makers playing with the Legos, rather than trying to sell each other their own dollhouses.
That's what was sort of fun with the class: I think a lot of lawyers realized that actually, in fact, whether you're talking about Bitcoin or differential privacy or AI, we still have a lot of choices to make on the technology side, and in fact those can be informed by policy and law. And conversely, I think a lot of the technologists thought that law was something like the laws of physics that just are. But in fact laws are the result of lawyers and policy makers talking to technologists, imagining what society wants.
So we're sort of in the process right now of struggling through how we think about this. But importantly, it's already happening, so it's not like we have that much time. I think it was Pedro Domingos who, in his book The Master Algorithm, says, and this isn't exact, I'm paraphrasing the quote, it's something like: I'm less afraid of a super intelligence coming to take over the world and more worried about a stupid intelligence that's taken over already. You know? I think that's very close to where we are.
I think if you see Julia Angwin's article in ProPublica, I guess it was a little over a year ago, where she happens to find a district where they're forced to disclose court records. So she was specifically going after the fact that machine learning, AI, is now used by the judiciary to set bail, to do parole, and even sentencing. And they have this thing called the risk score, where the machine sort of pops up after it does an assessment of a person's history and looks at their interviews. And she found, and this is great, because she's a mathematician in a data sense, she crunched all these numbers, and it shows that for many cases, for white people, it's sort of nearly random in some cases. So it's a number, but it's still almost random. And then for black people, it's biased against them.
And what's interesting is, when I talked to ... a prosecutor the other day, he said, well, I love, they love these numbers, because you get a risk score that says, okay, this person has a risk rating of 8, and so then the court can say, "Okay, we'll give you this bail." Because the last thing that they want is to give them some bail and then the person goes out and murders somebody; then it's sort of their fault. If they've taken the risk score, they can say, "I just looked at the risk score." It absolves them of this responsibility. And so there's this really interesting question: even at random, there's this weird moral hazard that even though you have agency, you're able to push off this responsibility to the machine, right? And then you can sort of say, well, it was math.
And the problem right now is these algorithms are running on data sets and rating systems that are closed. We see this happening in a variety of fields. I think we see this happening in the judiciary, which is a scary place for it to be happening. And so part of this initiative, with the AI fund that we're doing, is that we're going to try and look at whether we can create more transparency and auditability.
We're also seeing it in medicine. There's a study that I heard of that when a doctor overruled the machine in diagnostics, the doctor was wrong 70% of the time. So what does that mean? If you're a doctor and you know for a fact that you're 70% likely on average to be wrong, are you ever going to overrule the machine? And what about that 30% where the doctors are right? So it creates a very difficult situation.
You look at... imagine war. We talk about autonomous weapons and there's this whole fight about it, but what if all of the data, and not what if, in fact, all of the data that's driving intelligence, the way that you get onto the termination list as a target, a lot of it involves statistical analysis of your activity, your emotions, your calls. And there's this great interview, I think it was in the Independent, or it was the Independent. There was this guy who, I think he was in Pakistan, I'm gonna get this wrong, but it's close. He had been attacked a number of times where the collateral damage was family members being dead, so he knew he was on the kill list, but he didn't know how to get off. So he goes to London to kind of fight for it: "Wait, look at me, talk to me. I'm on this kill list, but I'm not a bad guy. Somehow you got the wrong person." But there's no interface in which he can sort of lobby and petition for getting off this kill list.
So even though the person controlling the drone strike and pushing the button may be a human being, if all of the data that's feeding into, or a substantial amount of the data that's feeding into, the decision to put the person on the kill list is from a machine, I don't know how that's that different from the machine actually being in charge. So we talk about these sort of future autonomous systems and robots running around and killing people as a sort of scary thing. But if we are just pushing a button that the robot tells us to push, A, B, C, or D, and the robot says it's C, you're going to push C. Apparently that was how Kissinger controlled Nixon, was through his elbow: the answer was always C.
But anyway, the .... that actually is when we think about practice: we may already be in autonomous mode in many things. And then I'm going to tee it up to Iyad, because one of the first places where the rubber meets the road is with autonomous vehicles. And a lot of the people that I talk to say that the real soul-searching around this is going to happen when the next big autonomous vehicle accident happens where it's clearly the machine's fault. How is that going to play out? So that may be one of the things.
But the last thing that I'll say is that I think this is where the Media Lab is excited. I think it's kind of an interface design problem, because part of what the problem is, is that you may think that by pushing the button, the right to overrule the computer, the right to launch the missile, may be your finger, but if you have no choice, morally or statistically, other than to push the button, you're not in charge anymore, right? So what I think we need to think about is how do we bring society and humans into the decision-making process, so that the answer that we derive involves human beings, and how does that interface happen? What is the right way to do it? Because I think what we are going to end up with is collective decision making with machines, and what we want to not be in is human agency with no real decision-making ability. And then we can talk more about some of the ideas, but I'll hand it over to Iyad.
Thank you. So I'll just give a short overview of the research we've been doing on autonomous vehicles. I'm not a driverless car expert. I don't build driverless cars. But I'm interested in them as kind of a social phenomenon, and the reason has to do with this dilemma that Steve will keep discussing.
You know, what if it's an autonomous car that is going to, for some reason, harm a bunch of pedestrians crossing the street, because the brakes are broken or because they jumped in front of it or whatever, but the car can swerve and kill one bystander on the other side, in order to minimize harm, in order to save 5 or 10 people. Should the car do this? And who should decide? And more interestingly, what if the car could swerve and hit a wall, harming the passenger or killing the passenger, in order to save these people? Should the car do this as well? What does the car have a duty towards? Minimizing harm, the utilitarian principle? Protection of the owner or passengers in the car, a duty toward them? Or something else, a sort of negotiation in between? Do we ignore this problem?
Do we just say, well, let the car deal with this problem? It seems to be a very controversial topic, because there are lots of people who love this, and lots of people who hate this. And people who hate this say, well, this is never going to happen, it's just so statistically unlikely. And I think that kind of misses the point, because this is an in vitro exploration of a principle: you strip away all of the things that don't matter in the real world so you can isolate the factor. You know, does drug X cause this particular reaction in a cell, for example? You don't do this in the forest, you do it in a petri dish. And this is the petri dish for studying human perception of machine ethics, and what other factors people seem to be ticked off by.