Thank you very much,
ladies and gentlemen.
Well, what I'd like to do today
in this very brief talk,
is to start a small social revolution,
I hope.
As you can see behind me,
my talk is about celebrating failure.
And when you say failure, very often,
people just switch off.
But what I'd like to do is --
I'd like to suggest today
that there are useful failures
and not so useful failures.
And I'd like to try
and help you understand
how we can manage our failures.
So I'll begin with
a very simple question.
The simple question is:
who makes mistakes?
Everybody, absolutely.
Human beings make mistakes.
All human beings
make mistakes.
There are no exceptions.
So if it's inevitable for human beings
to make mistakes --
Why is it that every single error
that we make,
we tend to exaggerate
and see as a disaster?
We can't get there,
we'll never get there.
It's as if an error, a mistake,
a failure means no success.
It's as if in our brains,
we've got this idea of a spectrum,
on one side you have success,
and on the other side,
it's failure.
But what I'd like to suggest is
that's not the case,
and if we're able to understand failure,
and failing well,
and create a failing well culture,
well then we can create
the best path to success.
But to do that,
unfortunately,
we have to make a very big shift
in our perspectives.
It's quite difficult.
It's not easy to start
thinking positively about failures.
We need to embrace failures.
What we tend to do
is we tend to ignore them,
or hide them,
just pretend they're not happening.
If we do that, we're unable
to take the information
that comes from mistakes
and use it well
on our path to success.
Basically, I'm gonna give you
this very simple quotation
from Aldous Huxley;
I like it very much.
"Experience is not
what happens to you.
It is what you do with
what happens to you."
In other words, we need to use what happens,
and apply what happens,
rather than pretending something
that goes wrong hasn't happened.
And I'd like to begin by showing you
that this already exists
by taking a particular industry
that is failure-obsessed.
Everything it does
is linked to failure,
and so it's part of its culture.
Why? Well, because if it doesn't
manage the small mistakes,
then unfortunately,
the consequences are severe.
It's the nuclear industry.
They have the best safety record
of any industry, and just as well,
because if they make a big mistake,
a disaster
then many people are killed across continents,
and also it's a long term disaster.
So what have they done?
They got obsessive about mistakes.
They've started to understand
that using mistakes in order
to stop disasters is the only way
to keep safety an absolute premium.
I'll just show you,
just very briefly,
four main summaries
that they've come up with.
First one: everyone
is responsible for the mistakes.
Everyone is responsible,
as they're part of the process.
Now, once you've understood
that everyone's responsible,
then it gets easier to create
the second idea: an open environment,
where people are communicating openly
about what's gone right,
and what's gone wrong as well.
That takes us on to the third idea,
and the third idea is questioning.
It doesn't matter who tells you
to do something.
It doesn't matter how often
the process has been followed.
You can always
question that person,
if the idea is to try and understand
and keep safety high.
And finally,
everything that comes out,
especially the mistakes, is shared.
You can notice that this is from
the Institute of Nuclear Power Operations.
What does that mean?
It's a sector-wide idea.
In other words, we don't keep
our information to ourselves.
We share it amongst everyone,
all the competitors.
Now that's pretty interesting.
But where's the link between
mistakes and disasters?
Well, to do that,
I'm going to show you this.
This is the accident pyramid,
it comes from a very different sector,
the insurance sector.
The insurance sector is basically
based on failure.
It gets its business from failure.
This pyramid behind me
is about 100 years old.
They've updated it. What you see
on the screen behind me
is 2003 figures,
something like that.
What's important
is not the exact figure,
what's important
is the relationship.
We'll begin with
the yellow pieces, okay?
We'll start with first aid.
What does first aid mean?
It means that,
according to this,
there are 300 cases of people going
to get bandages or medicine
from the company, okay?
In other words,
they've cut their finger,
or they're not feeling well.
So the company administers medicines,
and so on to help people.
Next one up,
30 - severe accidents.
How do companies
see severe accidents?
Well, usually it means
that you've spent
at least 1 day away from work.
But, you can imagine,
that's a wide range of accidents.
So it could be breaking your leg,
it could be a serious illness,
and so on.
And unfortunately,
1 at the top, is fatality,
and fatality means fatality.
Now why all of this,
and why in yellow?
Well, yellow represents
the reported incidents.
In other words, companies looking
at the number of accidents.
They're trying to understand,
how can we bring that down?
How's it possible
to bring that down?
Especially, the severe accidents,
and fatalities.
We can have processes,
we can have rules.
But how can we really
bring it down?
How can we make a difference?
And the answer is:
the big numbers at the bottom.
So let's have a look.
We'll begin with
the biggest number, 300,000.
Those are risky behaviors,
in other words,
doing something --
when you shouldn't really
be doing it.
So you're tired,
you're stressed out,
you're not concentrating.
And the next one up, 3,000.
Well, that's near misses,
in other words,
a chain of risky behaviors: 1, 2, 3, 4.
That would have been a disaster
if it weren't for luck.
So we were lucky
not to have a disaster.
Okay, so that's the idea!
And the basic idea is if we can manage
the 3,000 and the 300,000,
we can reduce
the really tough things in the yellow.
That's the idea,
but, and here's the big but.
In order to be able to manage
the big numbers at the bottom,
we need people to tell us.
Because the white part
is the not-reported.
It's people keeping
mistakes to themselves.
Now I thought, okay,
that's interesting!
That's about industries!
But can we apply this
to everything, to life?
And it turns out, we can!
We can apply this
accident pyramid to anything.
So I decided to apply it to
a disaster I brought for you,
which is my marriage, okay?
Now, before we have
a look at my marriage,
in terms of the accident pyramid,
I'm just gonna ask,
has anyone else in the audience
ever finished
a long-term relationship?
If you have, just put your hand in the air,
just so I understand.
Fantastic! Oh, just people in the front
apparently, not in the back.
Okay, good!
So if you have ever finished
a long term relationship,
you can play along with me, okay?
The others, you're just
gonna have to imagine.
So, let's begin.
My personal accident pyramid.
Risky behaviors, 300,000.
Wow, that's a lot.
Examples, well, it could be drinking
the beer directly out of the bottle.
It could be forgetting
my mother-in-law's birthday.
It could be paying the electricity bill late,
so I have to pay extra.
Something like that, okay?
Fine, next one up.
Near misses, well that's a chain
of risky behaviors,
where basically,
there would be a big argument
if it were not for the situation
so maybe my wife and myself,
we find ourselves in a public place,
in a theater, so we can't argue.
We're in front of my parents.
That's the basic idea.
Let's go up to 300.
Oh well, that's when
there are screaming arguments, okay?
So that's when there's
"a difference of opinion,"
I think, is what I'd say.
We go up to 30.
30 I put down
as walking out of the house,
but you can also include, if you wish,
slamming doors, and so on.
And finally, number 1.
That's a letter from her lawyer
saying, "I want a separation," okay?
So, I thought, well --
How did that happen there?
I mean, we didn't get married
in order to get divorced, right?
So how did that happen?
And it turns out
that the best way for me
to have managed this disaster
is exactly the same ideas
that the nuclear industry had.
That's amazing!
Everyone in the relationship
is responsible for the process.
Openness means open communication,
and so giving feedback,
questioning, whatever it is, is useful,
and finally constant learning
in order to help yourself get to,
in this case, a successful situation.
So, if it's so obvious,
why don't we do it?
Well, Charles Leadbeater spoke about
"you are what you share."
And in an interconnected world,
there is so much potential for sharing.
But we're afraid
of sharing information.
We're afraid either in relationships,
or in a much bigger situation,
in organizations as well.
Why is that happening?
Well, I'm going to suggest
that it's a question of timing.
There's a basic idea that short term
is ever more important.
In other words, you're looking just
at the next quarter.
What's happening in the next quarter?
We're not looking far enough ahead.
And short-termism
has permeated society.
But this is a problem,
because basically, at this point,
where's the innovation coming from?
Where's the learning coming from?
Where's showing initiative coming from?
Basically, ladies and gentlemen,
very often it seems
that we're paid to not make mistakes.
But you know, if you are paid to
not make mistakes
then how are we gonna go forward?
How's that possible?
And a corollary of not making mistakes,
ladies and gentlemen,
unfortunately, is the blame culture.
The blame culture, when there
is a mistake, is this finger-pointing:
It's your fault.
And so what we can do is we can
put the responsibility on to a person.
We can attribute the problems
to a person.
But does that mean
the problems have gone away?
Probably not, probably not.
What all of this has caused,
ladies and gentlemen,
is if I make a mistake,
it's probably best for me
to shut up, to keep quiet,
and this creates cover ups.
And what do the cover-ups do?
Well, cover ups create the potential
for systemic disasters.
That's the fatality at the top.
I'll give you some examples.
1986, this is a technological disaster:
the Space Shuttle Challenger disaster.
Logistics? How about
the Denver International Airport
automated baggage handling system?
It went more than half
a billion dollars over budget.
It failed to deliver for 10 years,
and finally, it was stopped.
A little bit more recent,
the environmental disaster
that came out of the Deepwater Horizon.
All of these are systemic disasters
but maybe the biggest one,
is something we're living through right now,
which is the financial crash.
What you can see behind me,
is The Economist's front cover.
But my question to you is:
which year do you think this is?
2008? 2010?
No, actually, unfortunately,
it's November 1997.
You see, we're not very good
at learning from failures.
We tend to sort of forget them,
ignore them.
Let me try and put everything together,
in synthesis.
Well, in a perfect world,
everything would be right,
and we'd understand,
and that would be wonderful.
But we don't live in a perfect world!
Let's take the opposite.
I do something, it doesn't go right,
and I don't understand why.
And probably,
I don't want to understand why,
because failing is bad.
So I'll just forget it!
Short term pressure makes us feel
that what's come up
now is the answer.
In other words, you're successful,
but it's not important
how or why you were successful.
And very often, short-termism
means you were lucky.
Well, that's good for
the immediate result.
But there's a problem.
And the problem is --
you don't know how to repeat it,
and probably what we're doing is,
we're sowing the seeds
of an eventual disaster.
So what I'm going to
suggest this afternoon
is why not think about
short term mistakes
being an important part of learning
to get to the overall goal?
In other words, shifting,
moving from this idea
of successful - unsuccessful,
to why we are
successful or unsuccessful.
That's the basic idea.
So, let me put it all together.
What have I learned from failure?
I've learned these things here.
First, failing well means learning,
especially the difficult feedback.
In fact if you can take something
from the difficult feedback,
that's usually the most
important information.
Second one,
if you're too short term
about how you see things,
you're probably creating,
without knowing it,
the basis for a long term disaster.
Third, accountability doesn't mean
punishing people,
it means understanding
you're part of the process.
And finally, if we can create
a no blame culture,
basically, we automatically
have more openness.
So there's much more
information that comes out.
Now, I can't do this by myself.
I can't create a failing well
culture by myself.
So, I need some activists.
And I thought I'd start with
600 people today, okay?
So I know you're busy today.
So, as of tomorrow,
ladies and gentlemen,
let's all start failing well.
Thank you very much!
(Applause)
Host: Now, Tim, I have one question.
We're in a European, Western society,
and most of your examples,
nuclear and so on,
were from this society,
this culture.
Do these messages
apply across cultures?
TB: Yes, it's a very good question!
In terms of short-termism,
this idea of just looking
at the next quarter,
that's very much an idea
of the western world.
But the idea of cover ups,
is world-wide.
It links to whether information
is power when you retain it
or when you share it.
But let's look at what's
underlying everything.
So in other words,
what goes across cultures,
is the idea of embarrassment and shame,
which changes from culture to culture.
So, in eastern cultures,
shame would be
looking bad in front of the group,
whilst generally in the west,
it's the individual feeling shame:
"I'm unable to do something."
So yes, the basic idea applies.
But how we would change that,
that's gonna change
from culture to culture.
Host: Okay, thank you very much!
Tim Baxter.
(Applause)