0:00:09.480,0:00:10.945 Wow! That was great! 0:00:10.945,0:00:13.896 My name is Audrey [br]and I'm a neuroscientist at Caltech. 0:00:13.896,0:00:17.752 And I want to tell you[br]about what I work on here at Caltech. 0:00:17.752,0:00:19.755 So, what I work on in my lab 0:00:19.755,0:00:21.505 is, we actually study sleep. 0:00:21.505,0:00:22.759 Let me ask you a question: 0:00:22.759,0:00:24.501 If you want to study the brain, 0:00:24.501,0:00:25.843 do you want to look at it 0:00:25.843,0:00:28.323 where it's kind of enclosed 0:00:28.323,0:00:30.033 in a skull you can't see through? 0:00:30.033,0:00:33.015 Or do you want to see it [br]in a transparent box? 0:00:33.015,0:00:34.767 I think it's a transparent box! 0:00:34.767,0:00:37.239 So, one thing that's really good 0:00:37.239,0:00:40.759 is that we actually have this [br]already in nature. 0:00:40.759,0:00:42.765 We have a brain in a glass box. 0:00:42.765,0:00:44.409 We have something where we can see 0:00:44.409,0:00:48.008 straight through an organism[br]and see their brain. 0:00:48.008,0:00:50.286 What we can do [br]is zoom in on their brain 0:00:50.286,0:00:51.717 and look at these neurons. 0:00:51.717,0:00:53.986 What I'm showing you here [br]are neurons, 0:00:53.986,0:00:56.513 and each one of them[br]is a different color. 0:00:56.513,0:00:58.756 They're neurons in Technicolor. 0:00:58.996,0:01:02.499 These neurons are hypocretin neurons. 0:01:02.499,0:01:04.840 Does anyone know anybody with narcolepsy? 0:01:04.840,0:01:07.497 Have you guys heard of narcolepsy? 0:01:07.497,0:01:09.496 Narcolepsy is a sleep disorder 0:01:09.496,0:01:12.653 where you're always kind of sleepy, 0:01:12.653,0:01:14.996 and when you get stimuli that excite you, 0:01:14.996,0:01:17.755 or something that is,[br]I guess, your brain food, 0:01:17.755,0:01:19.460 instead of being excited, 0:01:19.460,0:01:21.550 which is kind of the normal thing to do, 0:01:21.550,0:01:23.484 you actually fall asleep. 
0:01:23.484,0:01:25.532 These neurons I'm showing you here 0:01:25.532,0:01:28.505 are the neurons[br]that are involved in that. 0:01:28.505,0:01:32.610 So, when we have all these neurons [br]in different colors in the brain, 0:01:32.610,0:01:33.973 the important thing is: 0:01:33.973,0:01:35.805 How do they connect with one another? 0:01:35.805,0:01:37.663 You have all these tracks-- 0:01:37.663,0:01:41.185 how many of you guys here [br]have taken public transportation? 0:01:42.751,0:01:44.888 When you have public transportation, 0:01:44.888,0:01:47.644 what's important is [br]where those points of contact are, 0:01:47.644,0:01:49.936 and where all those [br]different tracks lead. 0:01:49.936,0:01:51.481 And so, similar to that, 0:01:51.481,0:01:53.745 we are constructing a system map. 0:01:53.745,0:01:55.741 One of the ways [br]we want to do this, 0:01:55.741,0:01:59.504 is to take cues from how[br]we interact with other humans. 0:01:59.504,0:02:01.012 When we have another human, 0:02:01.012,0:02:03.517 one thing that we do[br]is give a handshake. 0:02:03.517,0:02:06.021 So what professor Cori Bargmann did 0:02:06.021,0:02:09.682 is that she designed a way[br]to do a molecular grasp. 0:02:09.682,0:02:11.609 So for example, I'm neuron Audrey, 0:02:11.609,0:02:13.361 this is neuron Ella, 0:02:13.361,0:02:17.509 and we each have a glove[br]that's not lighting up. 0:02:17.509,0:02:19.502 But when we make contact, 0:02:19.502,0:02:23.255 we actually have our gloves lighting up. 0:02:23.255,0:02:25.044 And when we are further away, 0:02:25.044,0:02:27.510 the lights go off. 0:02:27.510,0:02:29.523 So, this is a way to figure out 0:02:29.523,0:02:31.759 how those neurons[br]are connecting to one another, 0:02:31.759,0:02:33.000 and we can actually see... 
0:02:33.000,0:02:35.279 (Audience laughs at off-camera event) 0:02:35.279,0:02:37.248 [Cori and her friends[br]Molecular Grasp] 0:02:37.248,0:02:39.988 ...we see when the neurons[br]are talking to each other, 0:02:39.988,0:02:41.875 when they're close enough[br]to one another, 0:02:41.875,0:02:44.223 and when they're further apart [br]from each other. 0:02:44.231,0:02:47.906 I want to talk about[br]my motivation to study science. 0:02:47.906,0:02:49.369 I want to study science 0:02:49.369,0:02:52.867 because I like to test ideas[br]and observe what happens. 0:02:53.487,0:02:57.301 What we want to do is make sure [br]we're making a fair comparison. 0:02:57.301,0:03:01.517 When we have an apple,[br]we want to compare it to an apple. 0:03:01.527,0:03:03.917 When you make a comparison,[br]you want to make sure 0:03:03.917,0:03:06.004 you're comparing[br]an orange to an orange. 0:03:06.004,0:03:10.434 So in controlled conditions,[br]we have lots of different controls. 0:03:10.434,0:03:12.776 We have positive controls [br]and negative controls; 0:03:12.776,0:03:14.768 you have positive controls[br]to make sure 0:03:14.768,0:03:18.022 when you're actually testing something,[br]you can see an effect. 0:03:18.026,0:03:21.499 You have negative controls[br]to make sure you don't have auto-activation, 0:03:21.499,0:03:23.833 so that when you don't have introduction 0:03:23.833,0:03:27.176 of the experimental condition[br]or stimuli you're testing, 0:03:27.176,0:03:29.247 it's not going to just go off. 0:03:29.247,0:03:32.256 And then you have[br]your experimental condition. 0:03:32.256,0:03:35.505 And when you design[br]an experiment, 0:03:35.505,0:03:37.517 you have lots of positive controls, 0:03:37.517,0:03:39.644 you have lots of negative controls, 0:03:39.644,0:03:42.234 and you have your experimental condition. 0:03:42.234,0:03:44.760 One trend that we're seeing [br]in neuroscience 0:03:44.760,0:03:49.497 is that we can observe[br]and test our idea. 
0:03:49.497,0:03:50.983 And what we scientists are, 0:03:50.983,0:03:53.997 is we're kind of like little ninjas, 0:03:53.997,0:03:57.505 just quietly looking at the brain. 0:03:57.505,0:04:00.260 And that's what[br]Alex and his friends have done. 0:04:00.260,0:04:03.745 What they do is take these brains[br]that are in glass boxes, 0:04:03.745,0:04:05.777 and the brains are[br]swimming, or sleeping, 0:04:05.777,0:04:07.417 or doing whatever they want, 0:04:07.417,0:04:10.093 and the team, meanwhile,[br]captures the neural activity. 0:04:10.093,0:04:11.443 The way they do this 0:04:11.443,0:04:13.490 is that they've[br]genetically engineered 0:04:13.490,0:04:17.523 each of these neurons[br]to give off a photon of light, 0:04:17.523,0:04:19.476 and the more active[br]these brains are, 0:04:19.476,0:04:22.008 the more photons come out, 0:04:22.008,0:04:24.798 so you see a greater luminescence. 0:04:24.808,0:04:28.249 And so, while these fish[br]are doing their regular business, 0:04:28.249,0:04:31.789 while these brains in glass boxes[br]are doing their regular business, 0:04:31.789,0:04:35.787 we can observe the neural activity. 0:04:35.787,0:04:38.518 Let's say we want to take this[br]a little bit further, 0:04:38.518,0:04:41.284 to the entire human population. 0:04:41.284,0:04:45.759 What if we look at[br]the entire human population, 0:04:45.759,0:04:47.516 in their native environment, 0:04:47.516,0:04:51.756 and let's say we take away[br]those positive and negative controls 0:04:51.756,0:04:54.536 and just have experimental conditions? 
0:04:54.536,0:04:57.331 What if we have something [br]like a wiki log, 0:04:57.331,0:04:59.131 a wiki lab notebook, 0:04:59.131,0:05:02.149 where everyone contributes their ideas, 0:05:02.149,0:05:06.307 and as people are contributing,[br]everyone writes over one another, 0:05:06.307,0:05:08.510 kind of like Wikipedia.[br]Who would have thought 0:05:08.510,0:05:10.957 that Wikipedia would have[br]taken off the way it did, 0:05:10.957,0:05:13.904 with all its success?[br]But it did. 0:05:13.904,0:05:16.008 And what if we could[br]apply it to science? 0:05:16.008,0:05:18.852 It's slightly different than[br]how we think about things now. 0:05:18.852,0:05:21.030 We have these apples and oranges, 0:05:21.030,0:05:23.765 and usually we can't compare[br]them to one another. 0:05:23.765,0:05:26.064 But now we have [br]all these different conditions, 0:05:26.064,0:05:28.007 and things like maybe weird fruits, 0:05:28.007,0:05:29.990 like dragon fruit, 0:05:29.990,0:05:31.999 and we want to compare those. 0:05:31.999,0:05:34.250 So, I want to give you a challenge: 0:05:34.250,0:05:36.886 to be the ones,[br]to be the mathematicians 0:05:36.886,0:05:39.085 and the statisticians to design[br] 0:05:39.085,0:05:40.733 these types of algorithms; 0:05:40.733,0:05:42.517 and to decide whether or not 0:05:42.517,0:05:43.785 we can extrapolate trends 0:05:43.785,0:05:47.623 and figure out the observations[br]without all those controls. 0:05:47.623,0:05:50.152 So, change the way we do science.[br]Go change the world. 0:05:50.152,0:05:51.436 Thanks! 0:05:51.436,0:05:54.104 (Applause)