0:00:08.224,0:00:11.253 Thank you very much,[br]ladies and gentlemen. 0:00:11.253,0:00:14.762 Well, what I'd like to do today [br]in this very brief talk, 0:00:14.762,0:00:18.642 is a small social revolution,[br]I hope. 0:00:18.642,0:00:22.730 As you can see behind me,[br]my talk is about celebrating failure. 0:00:22.730,0:00:27.033 And when you say failure, very often,[br]people just switch off. 0:00:27.033,0:00:29.173 But what I'd like to do is --[br]I'd like to suggest today 0:00:29.173,0:00:33.153 that there are useful failures [br]and not so useful failures. 0:00:33.153,0:00:35.280 And I'd like to try [br]and help you understand 0:00:35.280,0:00:38.381 how we can manage our failures. 0:00:38.381,0:00:40.589 So I'll begin with [br]a very simple question. 0:00:40.589,0:00:45.234 The simple question is: [br]who makes mistakes? 0:00:45.234,0:00:48.296 Everybody, absolutely: [br]human beings make mistakes. 0:00:48.296,0:00:51.255 All human beings [br]make mistakes. 0:00:51.255,0:00:52.951 There are no exceptions. 0:00:52.951,0:00:56.978 So if it's inevitable for human beings[br]to make mistakes -- 0:00:56.978,0:01:01.004 why is it that every single error[br]that we make, 0:01:01.004,0:01:05.391 we tend to exaggerate [br]and see it as a disaster? 0:01:05.391,0:01:07.640 We can't get there, [br]we'll never get there. 0:01:07.640,0:01:12.371 It's as if an error, a mistake,[br]a failure means no success. 0:01:12.371,0:01:16.438 It's as if in our brains,[br]we've got this idea of a spectrum: 0:01:16.438,0:01:18.660 on one side you have success, 0:01:18.660,0:01:21.090 and on the other side,[br]it's failure. 0:01:21.090,0:01:23.673 But what I'd like to suggest is[br]that's not the case, 0:01:23.673,0:01:26.744 and if we're able to understand failure,[br] 0:01:26.744,0:01:30.404 and failing well,[br]and create a failing-well culture, 0:01:30.404,0:01:34.178 well then we can create [br]the best path to success. 
0:01:34.178,0:01:35.681 But to do that, [br]unfortunately, 0:01:35.681,0:01:39.758 we have to make a very big shift [br]in our perspectives. 0:01:39.758,0:01:41.662 It's quite difficult.[br]It's not easy to start 0:01:41.662,0:01:44.060 thinking positively about failures. 0:01:44.060,0:01:45.972 We need to embrace failures. 0:01:45.972,0:01:48.842 What we tend to do [br]is we tend to ignore them, 0:01:48.842,0:01:53.088 or hide them, [br]just pretend they're not happening. 0:01:53.088,0:01:56.072 If we do that, we're unable [br]to take the information 0:01:56.072,0:01:58.173 that comes from mistakes, 0:01:58.173,0:01:59.759 and we're unable to take this information 0:01:59.759,0:02:03.294 and use it well [br]on our path to success. 0:02:03.294,0:02:06.454 Basically, I'm gonna give you [br]this very simple quotation 0:02:06.454,0:02:08.590 from Aldous Huxley; [br]I like it very much. 0:02:08.590,0:02:11.106 "Experience is not [br]what happens to you. 0:02:11.106,0:02:14.350 It is what you do with [br]what happens to you." 0:02:14.350,0:02:19.001 In other words, we need to use what happens,[br]and apply what happens, 0:02:19.001,0:02:21.219 rather than pretending something 0:02:21.219,0:02:23.891 that goes wrong hasn't happened. 0:02:23.891,0:02:27.246 And I'd like to begin by showing you [br]that this already exists, 0:02:27.246,0:02:32.440 by taking a particular industry [br]that is failure-obsessed. 0:02:32.452,0:02:35.100 Everything it does [br]is linked to failure, 0:02:35.100,0:02:37.250 and so it's part of its culture. 0:02:37.250,0:02:43.000 Why? Well, because if it doesn't [br]manage the small mistakes, 0:02:43.000,0:02:48.595 then unfortunately, [br]the consequences are severe. 0:02:48.595,0:02:50.393 It's the nuclear industry. 
0:02:50.393,0:02:52.531 They have the best safety record 0:02:52.531,0:02:54.724 of any industry, and just as well, 0:02:54.724,0:02:57.973 because if they make a big mistake, [br]a disaster, 0:02:57.973,0:03:00.844 then many people are killed across continents, 0:03:00.844,0:03:03.951 and also it's a long-term disaster. 0:03:03.951,0:03:05.493 So what have they done? 0:03:05.493,0:03:08.395 They got obsessive about mistakes. 0:03:08.395,0:03:11.445 They've started to understand[br]that using mistakes in order 0:03:11.445,0:03:17.449 to stop disasters is the only way[br]to keep safety at an absolute premium. 0:03:17.449,0:03:19.567 I'll just show you,[br]just very briefly, 0:03:19.567,0:03:24.382 four main principles[br]that they've come up with. 0:03:24.382,0:03:30.160 First one: everyone[br]is responsible for the mistakes. 0:03:30.160,0:03:33.650 Everyone is responsible, [br]as they're part of the process. 0:03:33.650,0:03:36.606 Now, once you've understood[br]that everyone's responsible, 0:03:36.606,0:03:39.807 then it gets easier to create[br]an open environment, 0:03:39.807,0:03:41.766 where people are communicating openly 0:03:41.766,0:03:46.555 about what's gone right,[br]and what's gone wrong as well. 0:03:46.555,0:03:50.352 That takes us on to the third idea,[br]and the third idea is questioning. 0:03:50.352,0:03:53.683 It doesn't matter who tells you[br]to do something. 0:03:53.683,0:03:58.314 It doesn't matter how often[br]the process has been followed. 0:03:58.324,0:04:01.553 You can always [br]question that person, 0:04:01.553,0:04:05.498 if the idea is to try and understand[br]and keep safety high. 0:04:05.498,0:04:08.555 And finally, [br]everything that comes out, 0:04:08.555,0:04:12.042 especially the mistakes, is shared. 0:04:12.042,0:04:13.688 You can notice that this is from 0:04:13.688,0:04:16.265 the Institute of Nuclear Power Operations. 0:04:16.265,0:04:17.135 What does that mean? 0:04:17.135,0:04:20.767 It's a sector-wide idea. 
0:04:20.767,0:04:24.356 In other words, we don't keep[br]our information to ourselves. 0:04:24.356,0:04:28.121 We share it amongst everyone,[br]all the competitors. 0:04:28.121,0:04:29.794 Now that's pretty interesting. 0:04:29.794,0:04:32.876 But where's the link between[br]mistakes and disasters? 0:04:32.876,0:04:35.539 Well, to do that,[br]I'm going to show you this. 0:04:35.553,0:04:37.128 This is the accident pyramid; 0:04:37.128,0:04:41.251 it comes from a very different sector,[br]the insurance sector. 0:04:41.251,0:04:45.834 The insurance sector is basically[br]based on failure. 0:04:45.834,0:04:48.589 Its business comes from failure. 0:04:48.589,0:04:52.033 This pyramid behind me[br]is about 100 years old. 0:04:52.033,0:04:55.644 They've updated [it]. What you see [br]on the screen behind me 0:04:55.644,0:04:58.616 is 2003 figures,[br]something like that. 0:04:58.616,0:05:01.188 What's important [br]is not the exact figure; 0:05:01.188,0:05:03.237 what's important [br]is the relationship. 0:05:03.237,0:05:05.501 We'll begin with [br]the yellow pieces, okay? 0:05:05.501,0:05:07.126 We'll start with first aid. 0:05:07.126,0:05:08.547 What does first aid mean? 0:05:08.547,0:05:10.480 It means that, [br]according to this, 0:05:10.480,0:05:16.787 300 examples of people going [br]to get bandages, medicine -- 0:05:16.787,0:05:18.297 In other words, [br]they've cut their finger, 0:05:18.297,0:05:20.770 they're not feeling well, [br]and they get help from the company, okay? 0:05:20.770,0:05:27.368 So the company administers medicines[br]and so on to help people. 0:05:27.381,0:05:30.461 Next one up, [br]30 - severe accidents. 0:05:30.461,0:05:33.015 How do companies [br]see severe accidents? 0:05:33.015,0:05:35.251 Well, usually it means[br]that you've spent 0:05:35.251,0:05:38.354 at least 1 day away from work. 0:05:38.354,0:05:41.686 But, you can imagine,[br]that's a wide range of accidents. 
0:05:41.686,0:05:43.239 So it could be breaking your leg, 0:05:43.239,0:05:45.835 it could be a serious illness, [br]and so on. 0:05:45.835,0:05:48.943 And unfortunately,[br]1, at the top, is fatality, 0:05:48.956,0:05:51.739 and fatality means fatality. 0:05:51.739,0:05:54.751 Now why all of this, [br]and why in yellow? 0:05:54.751,0:05:58.083 Well, yellow represents[br]the reported incidents. 0:05:58.083,0:06:01.657 In other words, companies looking[br]at the number of accidents. 0:06:01.657,0:06:04.291 They're trying to understand,[br]how can we bring that down? 0:06:04.291,0:06:06.006 How's it possible [br]to bring that down? 0:06:06.006,0:06:10.004 Especially the severe accidents[br]and fatalities. 0:06:10.004,0:06:13.640 We can have processes,[br]we can have rules. 0:06:13.640,0:06:15.568 But how can we really [br]bring it down? 0:06:15.568,0:06:17.100 How can we make a difference? 0:06:17.100,0:06:19.823 And the answer is:[br]the big numbers at the bottom. 0:06:19.823,0:06:21.340 So let's have a look. 0:06:21.340,0:06:24.713 We'll begin with [br]the biggest number, 300,000. 0:06:24.713,0:06:26.580 Those are risky behaviors, 0:06:26.580,0:06:29.049 in other words,[br]doing something -- 0:06:29.049,0:06:31.256 when you shouldn't really [br]be doing it. 0:06:31.256,0:06:33.455 So you're tired, [br]you're stressed out, 0:06:33.455,0:06:35.929 you're not concentrating. 0:06:35.929,0:06:37.420 And the next one up, 3,000. 0:06:37.420,0:06:39.911 Well, that's near misses,[br]in other words, 0:06:39.911,0:06:43.645 a chain of risky behaviors, 1, 2, 3, 4, 0:06:43.645,0:06:46.618 that would have been a disaster[br]if it were not for luck. 0:06:46.618,0:06:49.570 So we were lucky [br]not to have a disaster. 0:06:49.570,0:06:51.494 Okay, so that's the idea! 0:06:51.494,0:06:55.926 And the basic idea is if we can manage[br]the 3,000 and the 300,000, [br] 0:06:55.926,0:07:01.047 we can reduce [br]the really tough things in the yellow. 
0:07:01.047,0:07:04.287 That's the idea,[br]but, and here's the big but. 0:07:04.287,0:07:06.429 In order to be able to manage 0:07:06.429,0:07:10.134 the big numbers at the bottom,[br]we need people to tell us. 0:07:10.134,0:07:12.737 Because in the white [br]is the not-reported. 0:07:12.753,0:07:16.013 It's people keeping [br]mistakes to themselves. 0:07:16.013,0:07:17.955 Now I thought, okay,[br]that's interesting! 0:07:17.955,0:07:19.032 That's about industries! 0:07:19.032,0:07:22.786 But can we apply this [br]to everything, to life? 0:07:23.694,0:07:25.588 And it turns out, we can! 0:07:25.588,0:07:28.595 We can apply this [br]accident pyramid to anything. 0:07:28.595,0:07:33.179 So I decided to apply it to something --[br]a disaster I brought for you, 0:07:33.179,0:07:35.080 which is my marriage, okay? 0:07:35.080,0:07:38.616 Now, before we have [br]a look at my marriage 0:07:38.616,0:07:40.069 in terms of the accident pyramid, 0:07:40.069,0:07:43.088 I'm just gonna ask: [br]anyone else in the audience, 0:07:43.088,0:07:46.007 have you ever finished [br]a long-term relationship? 0:07:46.007,0:07:48.708 If you have, just put your hand in the air,[br]just so I understand. 0:07:48.708,0:07:51.474 Fantastic! Oh, just people in the front [br]apparently, not in the back. 0:07:51.474,0:07:53.034 Okay, good! 0:07:53.034,0:07:56.922 So if you have ever finished [br]a long-term relationship, 0:07:56.922,0:07:59.003 you can play along with me, okay? 0:07:59.003,0:08:02.005 The others, you're just [br]gonna have to imagine. 0:08:02.005,0:08:03.212 So, let's begin. 0:08:03.212,0:08:05.227 My personal accident pyramid. 0:08:05.227,0:08:07.948 Risky behaviors, 300,000. 0:08:07.948,0:08:09.220 Wow, that's a lot. 0:08:09.220,0:08:14.585 Examples, well, it could be drinking [br]the beer directly out of the bottle. 0:08:14.585,0:08:19.299 It could be forgetting [br]my mother-in-law's birthday. 
0:08:19.299,0:08:23.249 It could be paying the electricity bill late,[br]so I have to pay extra. 0:08:23.249,0:08:24.936 Something like that, okay? 0:08:24.936,0:08:26.484 Fine, next one up. 0:08:26.484,0:08:29.944 Near misses, well, that's a chain [br]of risky behaviors 0:08:29.944,0:08:32.775 where basically [br]there would be a big argument 0:08:32.775,0:08:36.934 if it were not for the situation,[br]so maybe my wife and myself, 0:08:36.956,0:08:40.740 we find ourselves in a public place,[br]in a theater, so we can't argue. 0:08:40.740,0:08:43.770 We're in front of my parents.[br]That's the basic idea. 0:08:43.770,0:08:44.626 Let's go up to 300. 0:08:44.626,0:08:47.315 Oh well, that's when [br]there are screaming arguments, okay? 0:08:47.315,0:08:50.805 So that's when there's [br]a "difference of opinion," 0:08:50.805,0:08:53.088 I think, is what I'd say. 0:08:53.088,0:08:54.418 We go up to 30. 0:08:54.418,0:08:56.858 30 I put down [br]as walking out of the house, 0:08:56.858,0:09:01.585 but you can also include, if you wish,[br]slamming doors, and so on. 0:09:01.585,0:09:04.178 And finally, number 1.[br]That's a letter from her lawyer 0:09:04.178,0:09:06.911 saying, "I want a separation," okay? 0:09:06.911,0:09:10.702 So, I thought, well -- 0:09:10.702,0:09:12.702 how did that happen? 0:09:12.702,0:09:16.002 I mean, we didn't get married[br]in order to get divorced, right? 0:09:16.002,0:09:17.709 So how did that happen? 0:09:17.709,0:09:20.498 And it turns out [br]that the best way for me 0:09:20.498,0:09:26.345 to have managed this disaster[br]would have been exactly the same ideas 0:09:26.345,0:09:28.671 that the nuclear industry had. 0:09:28.671,0:09:30.163 That's amazing! 0:09:30.163,0:09:34.095 Everyone in the relationship [br]is responsible for the process. 
0:09:34.095,0:09:37.297 Openness means open communication, 0:09:37.297,0:09:39.611 and so giving feedback, 0:09:39.611,0:09:42.800 questioning, whatever it is, is useful, 0:09:42.800,0:09:44.480 and finally, constant learning 0:09:44.480,0:09:46.695 in order to help yourself get to, 0:09:46.695,0:09:49.512 in this case, a successful situation. 0:09:49.512,0:09:53.965 So, if it's so obvious, [br]why don't we do it? 0:09:53.965,0:09:58.063 Well, Charles Leadbeater spoke about[br]"you are what you share." 0:09:58.063,0:10:00.317 And in an interconnected world, 0:10:00.317,0:10:02.587 there is so much potential for sharing. 0:10:02.587,0:10:06.527 But we're afraid [br]of sharing information. 0:10:06.532,0:10:08.871 We're afraid either in relationships, 0:10:08.871,0:10:11.389 or in a much bigger situation, 0:10:11.389,0:10:13.495 in organizations as well. 0:10:13.495,0:10:14.358 Why is that happening? 0:10:14.358,0:10:19.110 Well, I'm going to suggest [br]that it's a question of timing. 0:10:19.130,0:10:24.961 There's a basic idea that the short term[br]is ever more important. 0:10:25.531,0:10:28.194 In other words, you're looking just [br]at the next quarter. 0:10:28.194,0:10:29.551 What's happening in the next quarter? 0:10:29.551,0:10:31.727 We're not looking ahead enough. 0:10:31.727,0:10:36.510 And short-termism [br]has permeated society. 0:10:36.510,0:10:37.852 But this is a problem, 0:10:37.852,0:10:39.569 because basically, at this point, 0:10:39.569,0:10:42.116 where's the innovation coming from? 0:10:42.116,0:10:44.771 Where's the learning coming from? 0:10:44.771,0:10:47.873 Where's the initiative coming from? 0:10:47.873,0:10:50.773 Basically, ladies and gentlemen,[br]very often it seems 0:10:50.773,0:10:54.622 that we're paid to not make mistakes. 0:10:54.622,0:10:57.714 But you know, if you are paid to[br]not make mistakes, 0:10:57.714,0:10:59.963 then how are we gonna go forward? 0:10:59.963,0:11:02.134 How's that possible? 
0:11:02.134,0:11:04.537 And a corollary of not making mistakes, 0:11:04.537,0:11:08.891 ladies and gentlemen,[br]unfortunately, is the blame culture. 0:11:08.891,0:11:13.028 The blame culture, when there [br]is a mistake, is this finger-pointing. 0:11:13.028,0:11:14.744 It's your fault. 0:11:14.744,0:11:16.974 And so what we can do is we can 0:11:16.974,0:11:19.058 put the responsibility onto a person. 0:11:19.058,0:11:22.870 We can attribute the problems[br]to a person. 0:11:22.870,0:11:25.471 But does that mean [br]the problems have gone away? 0:11:25.471,0:11:28.086 Probably not, probably not. 0:11:30.410,0:11:32.732 What all of this has caused, [br]ladies and gentlemen, 0:11:32.732,0:11:35.850 is, if I make a mistake, [br]it's probably best for me 0:11:35.850,0:11:39.129 to shut up, to keep quiet, 0:11:39.129,0:11:42.088 and this creates cover-ups. 0:11:42.088,0:11:43.316 And what do the cover-ups do? 0:11:43.316,0:11:48.161 Well, cover-ups create the potential[br]for systemic disasters. 0:11:48.161,0:11:50.682 That's the fatality at the top. 0:11:50.682,0:11:52.862 I'll give you some examples. 0:11:52.862,0:11:57.236 1986 -- this is a technological disaster: 0:11:57.236,0:12:01.197 the space shuttle Challenger disaster. 0:12:01.197,0:12:05.693 Logistics: how about [br]the Denver International Airport 0:12:05.693,0:12:09.735 automated baggage handling system? 0:12:09.735,0:12:14.268 It went more than half [br]a billion dollars over budget. 0:12:14.268,0:12:18.238 It failed to deliver for 10 years,[br]and finally, it was stopped. 0:12:18.238,0:12:21.112 A little bit more recent,[br]the environmental disaster 0:12:21.112,0:12:24.620 that came out of the Deepwater Horizon. 0:12:24.620,0:12:26.612 All of these are systemic disasters, 0:12:26.612,0:12:29.228 but maybe the biggest one 0:12:29.228,0:12:30.685 is something we're living through right now, 0:12:30.685,0:12:32.728 which is the financial crash. 
0:12:32.728,0:12:34.147 What you can see behind me 0:12:34.147,0:12:38.126 is The Economist front cover. 0:12:38.126,0:12:42.082 But my question to you is: [br]which year do you think this is? 0:12:42.082,0:12:44.290 2008? 2010? 0:12:45.721,0:12:50.072 No, actually, unfortunately, [br]it's November 1997. 0:12:50.072,0:12:53.356 You see, we're not very good [br]at learning from failures. 0:12:53.362,0:12:57.387 We tend to sort of forget them,[br]ignore them. 0:12:57.387,0:13:00.427 Let me try and put everything together, [br]in synthesis. 0:13:00.441,0:13:04.938 Well, in a perfect world, [br]everything would be right, 0:13:04.938,0:13:07.597 and we'd understand, [br]and that would be wonderful. 0:13:07.607,0:13:10.229 But we don't live in a perfect world! 0:13:10.229,0:13:12.378 Let's take the opposite. 0:13:12.378,0:13:16.579 I do something, it doesn't go right,[br]and I don't understand why. 0:13:16.579,0:13:19.358 And probably, [br]I don't want to understand why, 0:13:19.358,0:13:23.698 because failing is bad. [br]So I'll just forget it! 0:13:23.698,0:13:29.692 Short-term pressure makes us feel 0:13:29.692,0:13:33.002 that what's come up [br]now is the answer. 0:13:33.002,0:13:37.104 In other words, you're successful,[br]but it's not important 0:13:37.104,0:13:39.892 how or why you were successful. 0:13:39.892,0:13:43.755 And very often, short-termism [br]means you were lucky. 0:13:45.017,0:13:47.341 Well, that's good for [br]the immediate result. 0:13:47.341,0:13:49.061 But there's a problem. [br]And the problem is -- 0:13:49.061,0:13:52.624 you don't know how to repeat it,[br]and probably what we're doing is 0:13:52.628,0:13:55.630 we're sowing the seeds [br]of an eventual disaster. 0:13:55.630,0:13:57.844 So what I'm going to [br]suggest this afternoon 0:13:57.844,0:14:02.321 is: why not think about [br]short-term mistakes 0:14:02.330,0:14:04.384 being an important part of learning 0:14:04.384,0:14:06.385 to get to the overall goal? 
0:14:06.385,0:14:09.679 In other words, shifting, [br]moving from this idea 0:14:09.679,0:14:12.161 of successful - unsuccessful, 0:14:12.161,0:14:17.004 to why we are [br]successful or unsuccessful. 0:14:17.004,0:14:18.354 That's the basic idea. 0:14:18.354,0:14:19.918 So, let me put it all together. 0:14:19.918,0:14:22.887 What have I learned from failure? 0:14:22.887,0:14:24.071 I've learned these things here. 0:14:24.071,0:14:27.586 First, failing well means learning, 0:14:27.586,0:14:30.638 especially from the difficult feedback. 0:14:30.638,0:14:33.439 In fact, if you can take something [br]from the difficult feedback, 0:14:33.458,0:14:36.240 that's usually the most[br]important information. 0:14:36.240,0:14:38.477 Second one: [br]if you're too short-term 0:14:38.477,0:14:40.069 about how you see things, 0:14:40.069,0:14:43.467 you're probably creating, [br]without knowing it, 0:14:43.467,0:14:46.547 the basis for a long-term disaster. 0:14:46.547,0:14:49.670 Third, accountability doesn't mean[br]punishing people; 0:14:49.670,0:14:53.747 it means understanding [br]you're part of the process. 0:14:53.747,0:14:57.846 And finally, if we can create[br]a no-blame culture, 0:14:57.846,0:15:02.284 basically, we automatically [br]have more openness. 0:15:02.284,0:15:04.705 So there's much more [br]information that comes out. 0:15:04.705,0:15:07.083 Now, I can't do this by myself. 0:15:07.083,0:15:11.799 I can't create a failing-well [br]culture by myself. 0:15:11.814,0:15:13.362 So, I need some activists. 0:15:13.362,0:15:16.361 And I thought I'd start with [br]600 people today, okay? 0:15:16.363,0:15:18.088 So I know you're busy today. 0:15:18.088,0:15:21.102 So, as of tomorrow, [br]ladies and gentlemen, 0:15:21.102,0:15:23.791 let's all start failing well. 0:15:23.791,0:15:24.677 Thank you very much! 0:15:24.677,0:15:27.482 (Applause) 0:15:31.144,0:15:32.990 Host: Now, Tim, I have one question. 
0:15:32.990,0:15:35.776 We're in a European, Western society, 0:15:35.776,0:15:38.594 and most of your examples, [br]nuclear and so on, 0:15:38.594,0:15:41.714 were from this society,[br]this culture. 0:15:41.714,0:15:44.201 Do these messages [br]apply across cultures? 0:15:44.201,0:15:46.482 TB: Yes, it's a very good question! 0:15:46.482,0:15:48.248 In terms of short-termism -- 0:15:48.248,0:15:50.431 so this idea of just looking [br]at the next quarter -- 0:15:50.431,0:15:53.688 that's very much an idea[br]of the Western world. 0:15:53.688,0:15:57.308 But the idea of cover-ups[br]is worldwide. 0:15:57.314,0:16:00.659 It links to how much information 0:16:00.659,0:16:03.840 is power, whether you retain it [br]or you share it. 0:16:03.840,0:16:06.318 But let's look at what's [br]underlying everything. 0:16:06.318,0:16:07.716 In other words, [br]what goes across cultures 0:16:07.716,0:16:10.089 is the idea of embarrassment and shame, 0:16:10.089,0:16:12.007 which changes from culture to culture. 0:16:12.007,0:16:13.812 So, in the Eastern cultures, 0:16:13.812,0:16:17.675 shame would be[br]looking bad in front of the group, 0:16:17.675,0:16:19.938 whilst generally in the West, 0:16:19.938,0:16:21.470 it's the feeling of shame: 0:16:21.470,0:16:23.722 "I'm unable to do something" - the individual. 0:16:23.722,0:16:25.580 So yes, that's the basic idea. 0:16:25.580,0:16:27.721 But how we would change that -- 0:16:27.721,0:16:29.925 that's gonna change[br]from culture to culture. 0:16:29.925,0:16:31.981 Host: Okay, thank you very much![br]Tim Baxter. 0:16:31.981,0:16:34.700 (Applause)