1
00:00:09,480 --> 00:00:10,945
Wow! That was great!

2
00:00:10,945 --> 00:00:13,896
My name is Audrey and I'm a neuroscientist at Caltech.

3
00:00:13,896 --> 00:00:17,752
And I want to tell you about what I work on here at Caltech.

4
00:00:17,752 --> 00:00:19,755
So, what I work on in my lab

5
00:00:19,755 --> 00:00:21,505
is, we actually study sleep.

6
00:00:21,505 --> 00:00:22,759
Let me ask you a question:

7
00:00:22,759 --> 00:00:24,501
If you want to study the brain,

8
00:00:24,501 --> 00:00:25,843
do you want to look at it

9
00:00:25,843 --> 00:00:28,323
where it's kind of enclosed

10
00:00:28,323 --> 00:00:30,033
in a skull you can't see through?

11
00:00:30,033 --> 00:00:33,015
Or do you want to see it in a transparent box?

12
00:00:33,015 --> 00:00:34,767
I think it's a transparent box!

13
00:00:34,767 --> 00:00:37,239
So, one thing that's really good

14
00:00:37,239 --> 00:00:40,759
is that we actually have this already in nature.

15
00:00:40,759 --> 00:00:42,765
We have a brain in a glass box.

16
00:00:42,765 --> 00:00:44,409
We have something where we can see

17
00:00:44,409 --> 00:00:48,008
straight through an organism and see their brain.

18
00:00:48,008 --> 00:00:50,286
What we can do is zoom in on their brain

19
00:00:50,286 --> 00:00:51,717
and look at these neurons.

20
00:00:51,717 --> 00:00:53,986
What I'm showing you here are neurons,

21
00:00:53,986 --> 00:00:56,513
and each one of them is a different color.

22
00:00:56,513 --> 00:00:58,756
They're neurons in Technicolor.

23
00:00:58,996 --> 00:01:02,499
These neurons are hypocretin neurons.

24
00:01:02,499 --> 00:01:04,840
Does anyone know anybody with narcolepsy?

25
00:01:04,840 --> 00:01:07,497
Have you guys heard of narcolepsy?

26
00:01:07,497 --> 00:01:09,496
Narcolepsy is a sleep disorder

27
00:01:09,496 --> 00:01:12,653
where you're always kind of sleepy,

28
00:01:12,653 --> 00:01:14,996
and when you get stimuli that excite you,

29
00:01:14,996 --> 00:01:17,755
or something that is, I guess, your brain food,

30
00:01:17,755 --> 00:01:19,460
instead of being excited,

31
00:01:19,460 --> 00:01:21,550
which is kind of the normal thing to do,

32
00:01:21,550 --> 00:01:23,484
you actually fall asleep.

33
00:01:23,484 --> 00:01:25,532
These neurons I'm showing you here

34
00:01:25,532 --> 00:01:28,505
are the neurons that are involved in that.

35
00:01:28,505 --> 00:01:32,610
So, when we have all these neurons in different colors in the brain,

36
00:01:32,610 --> 00:01:33,973
the important thing is:

37
00:01:33,973 --> 00:01:35,805
How do they connect with one another?

38
00:01:35,805 --> 00:01:37,663
You have all these tracks--

39
00:01:37,663 --> 00:01:41,185
how many of you guys here have taken public transportation?

40
00:01:42,751 --> 00:01:44,888
When you have public transportation,

41
00:01:44,888 --> 00:01:47,644
what's important is where those points of contact are,

42
00:01:47,644 --> 00:01:49,936
and where all those different tracks lead.

43
00:01:49,936 --> 00:01:51,481
And so, similar to that,

44
00:01:51,481 --> 00:01:53,745
we are constructing a system map.

45
00:01:53,745 --> 00:01:55,741
One of the ways we want to do this

46
00:01:55,741 --> 00:01:59,504
is to take cues from how we interact with other humans.

47
00:01:59,504 --> 00:02:01,012
When we have another human,

48
00:02:01,012 --> 00:02:03,517
one thing that we do is give a handshake.

49
00:02:03,517 --> 00:02:06,021
So what Professor Cori Bargmann did

50
00:02:06,021 --> 00:02:09,682
is that she designed a way to do a molecular grasp.
51
00:02:09,682 --> 00:02:11,609
So for example, I'm neuron Audrey,

52
00:02:11,609 --> 00:02:13,361
this is neuron Ella,

53
00:02:13,361 --> 00:02:17,509
and we each have a glove that's not lighting up.

54
00:02:17,509 --> 00:02:19,502
But when we make contact,

55
00:02:19,502 --> 00:02:23,255
we actually have our gloves lighting up.

56
00:02:23,255 --> 00:02:25,044
And when we are further away,

57
00:02:25,044 --> 00:02:27,510
the lights go off.

58
00:02:27,510 --> 00:02:29,523
So, this is a way to figure out

59
00:02:29,523 --> 00:02:31,759
how those neurons are connecting to one another,

60
00:02:31,759 --> 00:02:33,000
and we can actually see...

61
00:02:33,000 --> 00:02:35,279
(Audience laughs at off-camera event)

62
00:02:35,279 --> 00:02:37,248
[Cori and her friends Molecular Grasp]

63
00:02:37,248 --> 00:02:39,988
...we see when the neurons are talking to each other,

64
00:02:39,988 --> 00:02:41,875
when they're close enough to one another,

65
00:02:41,875 --> 00:02:44,223
and when they're further apart from each other.

66
00:02:44,231 --> 00:02:47,906
I want to talk about my motivation to study science.

67
00:02:47,906 --> 00:02:49,369
I want to study science

68
00:02:49,369 --> 00:02:52,867
because I like to test ideas and observe what happens.

69
00:02:53,487 --> 00:02:57,301
What we want to do is make sure we're making a fair comparison.

70
00:02:57,301 --> 00:03:01,517
When we have an apple, we want to compare it to an apple.

71
00:03:01,527 --> 00:03:03,917
When you make a comparison, you want to make sure

72
00:03:03,917 --> 00:03:06,004
you're comparing an orange to an orange.

73
00:03:06,004 --> 00:03:10,434
So in controlled conditions, we have lots of different controls.

74
00:03:10,434 --> 00:03:12,776
We have positive controls and negative controls;

75
00:03:12,776 --> 00:03:14,768
you have positive controls to make sure

76
00:03:14,768 --> 00:03:18,022
when you're actually testing something, you can see an effect.

77
00:03:18,026 --> 00:03:21,499
You have negative controls to make sure you don't have auto-activation,

78
00:03:21,499 --> 00:03:23,833
so that when you don't have introduction

79
00:03:23,833 --> 00:03:27,176
of the experimental condition or stimuli you're testing,

80
00:03:27,176 --> 00:03:29,247
it's not going to just go off.

81
00:03:29,247 --> 00:03:32,256
And then you have your experimental condition.

82
00:03:32,256 --> 00:03:35,505
And when you design an experiment,

83
00:03:35,505 --> 00:03:37,517
you have lots of positive controls,

84
00:03:37,517 --> 00:03:39,644
you have lots of negative controls,

85
00:03:39,644 --> 00:03:42,234
and you have your experimental condition.

86
00:03:42,234 --> 00:03:44,760
One trend that we're seeing in neuroscience

87
00:03:44,760 --> 00:03:49,497
is that we can observe and test our ideas.

88
00:03:49,497 --> 00:03:50,983
And what we scientists are,

89
00:03:50,983 --> 00:03:53,997
is we're kind of like little ninjas,

90
00:03:53,997 --> 00:03:57,505
just quietly looking at the brain.

91
00:03:57,505 --> 00:04:00,260
And that's what Alex and his friends have done.

92
00:04:00,260 --> 00:04:03,745
What they do is take these brains that are in glass boxes,

93
00:04:03,745 --> 00:04:05,777
and the brains are swimming, or sleeping,

94
00:04:05,777 --> 00:04:07,417
or doing whatever they want,

95
00:04:07,417 --> 00:04:10,093
and the team, meanwhile, captures the neural activity.
96
00:04:10,093 --> 00:04:11,443
The way they do this

97
00:04:11,443 --> 00:04:13,490
is that they've genetically engineered

98
00:04:13,490 --> 00:04:17,523
each of these neurons to give off a photon of light,

99
00:04:17,523 --> 00:04:19,476
and the more active these brains are,

100
00:04:19,476 --> 00:04:22,008
the more photons come out,

101
00:04:22,008 --> 00:04:24,798
so you see a greater luminescence.

102
00:04:24,808 --> 00:04:28,249
And so, while these fish are doing their regular business,

103
00:04:28,249 --> 00:04:31,789
while these brains in glass boxes are doing their regular business,

104
00:04:31,789 --> 00:04:35,787
we can observe the neural activity.

105
00:04:35,787 --> 00:04:38,518
Let's say we want to take this a little bit further,

106
00:04:38,518 --> 00:04:41,284
to the entire human population.

107
00:04:41,284 --> 00:04:45,759
What if we look at the entire human population,

108
00:04:45,759 --> 00:04:47,516
in its native environment,

109
00:04:47,516 --> 00:04:51,756
and let's say we take away those positive and negative controls

110
00:04:51,756 --> 00:04:54,536
and just have experimental conditions?

111
00:04:54,536 --> 00:04:57,331
What if we have something like a wiki log,

112
00:04:57,331 --> 00:04:59,131
a wiki lab notebook,

113
00:04:59,131 --> 00:05:02,149
where everyone contributes their ideas,

114
00:05:02,149 --> 00:05:06,307
and as people are contributing, everyone writes over one another,

115
00:05:06,307 --> 00:05:08,510
kind of like Wikipedia. Who would have thought

116
00:05:08,510 --> 00:05:10,957
that Wikipedia would have taken off the way it did,

117
00:05:10,957 --> 00:05:13,904
with all its success? But it did.

118
00:05:13,904 --> 00:05:16,008
And what if we could apply it to science?

119
00:05:16,008 --> 00:05:18,852
It's slightly different than how we think about things now.

120
00:05:18,852 --> 00:05:21,030
We have these apples and oranges,

121
00:05:21,030 --> 00:05:23,765
and usually we can't compare them to one another.

122
00:05:23,765 --> 00:05:26,064
But now we have all these different conditions,

123
00:05:26,064 --> 00:05:28,007
and things like maybe weird fruits,

124
00:05:28,007 --> 00:05:29,990
like dragon fruit,

125
00:05:29,990 --> 00:05:31,999
and we want to compare those.

126
00:05:31,999 --> 00:05:34,250
So, I want to give you a challenge:

127
00:05:34,250 --> 00:05:36,886
to be the ones, to be the mathematicians

128
00:05:36,886 --> 00:05:39,085
and the statisticians to design

129
00:05:39,085 --> 00:05:40,733
these types of algorithms;

130
00:05:40,733 --> 00:05:42,517
and to decide whether or not

131
00:05:42,517 --> 00:05:43,785
we can extrapolate trends

132
00:05:43,785 --> 00:05:47,623
and figure out the observations without all those controls.

133
00:05:47,623 --> 00:05:50,152
So, change the way we do science. Go change the world.

134
00:05:50,152 --> 00:05:51,436
Thanks!

135
00:05:51,436 --> 00:05:54,104
(Applause)