Hi, my name is Scott E. Page. I'm a professor of complex systems, political science, and economics at the University of Michigan in Ann Arbor, and I'd like to welcome you to this free online course called Model Thinking. In this opening lecture, I just want to do four things. First, I want to explain why I personally think it's so important, and so much fun, to take a course in models. Second, I want to give you a sense of the outline of the course: what we're gonna cover and how it's gonna be structured. Third, I'll talk a little bit about the online format. I've never taught an online course before, and you probably haven't taken one, so let's talk about how it's structured, how it's set up, what's out there on the web, that sort of thing. And the last thing I'll do is talk about how I'm gonna structure particular units. Each unit will focus on a single model or a class of models, and I want to give you some sense of exactly how we're gonna unpack those models, analyze them, and think about them. Alright? Okay, so let's get started.

Why model? The first reason, I think, is this: in order to be an intelligent citizen of the world, you have to understand models. Why do I say that? Well, when you think about a liberal arts education, some of us classically think of the great books, this long shelf of books that everyone should know. But when the great books curriculum was formed (and I'll talk about this some in the next lecture), most of human knowledge didn't have models in it. Models are a relatively new phenomenon. So now, whether you go from anthropology to zoology or anywhere in between, when you get to college you'll find, oh my gosh, I'm learning models in this course. We'll talk in a minute about some of the reasons why we're using models, but models are everywhere. And so, just to be involved in the conversation, it's important these days that you can use and understand models.

Alright, reason number two. The reason models are everywhere, from anthropology to zoology, is that they're better: they make us clearer, better thinkers. Anytime anybody's ever run a horse race between people using models to make decisions and people not using models, the people with models do better. So models just make you a better thinker.
The reason why is that they weed out logical inconsistencies. They're a crutch, right? We humans can be a little crazy, we think silly things, we can't think through all the logical consequences. Models tie us to the mast a little bit, and in doing so, we think better. We get better at what we do.

All right, reason number three: to use and understand data. There's just so much data out there now. When I first became a social scientist, it would be a real effort for somebody to go grab a data set. Now there's just a ton of it. I like to think of it as a fire hose of data, but my friends who are computer scientists call it a hairball of data, because it's just all mangled and messed up. Models take that data and structure it into information, and then turn that information into knowledge. Without models, all we've got is a whole bunch of numbers out there. With models, we actually get information and knowledge, and eventually maybe even some wisdom. At least we can hope, right?

Okay, reason number four, the last of the main reasons. And by the way, in the next four lectures I'm gonna work through and unpack each of these four reasons in more depth; I just want to lay them out in this first lecture. So, reason number four: to decide, strategize, and design. When you've gotta make a decision, whether you're the President of the United States or whether you're running your local PTO, it's helpful to structure your information in a way that leads to better decisions. So we'll learn about things like decision trees and game theory models to help us make better decisions and strategize better. And at the very end of the class, we'll talk about design issues: you can use models to design things like institutions and policies. So models just make us better at making choices, better at taking actions.

Okay, so those are the big four. Now let's talk a little bit about the outline of the course, what it's like. This isn't gonna be a typical course, not just because it's online, but because the structure is very different. Most courses, like a math course, start here and move along, with each thing building on the thing before it. The difficulty with a course like that is, if you ever fall off the train, fall behind, that's it. You're just lost.
Because everything I'm doing in lecture six, you need lecture five for, and for lecture five you need lecture four. Well, this course is going to be very different. This course is going to be a little bit more like a trip to the zoo. [laugh] Right? We're gonna learn about giraffes, then we're gonna learn about rhinos, and then we'll go over to the lion cage. So if you didn't quite understand the rhinos, it's not gonna hurt you too much when we get to the lions. It's more like moving from one topic to the next. They're somewhat related, in that they're all animals, but you don't need to fully know the giraffe to move on to the rhino. Obviously, though, we're not gonna study giraffes and rhinos; we're gonna study models and [inaudible].

So what kind of models? We're gonna study models like collective action models. These are models where individuals have to decide how much to contribute to something that's for the public good. We'll study things like the wisdom of crowds: how is it that groups of people can be really smart? We'll study models that have fancy names, like [inaudible] models and Markov models. These are models of processes, right? They sound scary, but they're actually always sort of fun and interesting. We'll study game theory models. We'll study something called the Colonel Blotto game, which is a game where you have to decide how many resources to allocate across different fronts (there's a short illustrative sketch of it below). It can be thought of as a really interesting model of war, and it can also be an interesting model of firm competition, or sports, or legal defenses, all sorts of stuff. So we're going to play with a whole bunch of models, everything from economic growth to tipping points and a whole bunch in between. It should be lots and lots of fun.

Okay, what's the format for this? How's this going to work? What does an online course even look like? Well, let's think about it. First thing, there are these videos; you're watching one right now. I'm going to try to keep them between eight and fifteen minutes in length. Sometimes I may sneak up to sixteen, but mostly I'll keep them to fifteen minutes or under. And inside the videos there'll be questions. All of a sudden, the video may stop and ask, what's the capital of Delaware? Well, actually, it won't ask that, but something germane, hopefully, to the lecture. So there'll be these eight- to fifteen-minute lectures, and each module, each section, will have somewhere between three and six of those. Right? Okay.
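[Editor's note: a quick aside on that Colonel Blotto game. Here is a minimal sketch in Python of how a single matchup could be scored. This is just an illustration, not anything from the course materials, and it assumes the simplest textbook rules, where both players split the same fixed budget of troops across the same fronts and whichever side sends more troops to a front wins it; the function name and the specific numbers are made up for the example.]

    # Score one Colonel Blotto matchup, assuming the simplest rules:
    # a front goes to whichever side commits more troops to it,
    # and a tie at a front counts for neither player.
    def blotto_score(alloc_a, alloc_b):
        """Return (fronts won by A, fronts won by B)."""
        wins_a = sum(a > b for a, b in zip(alloc_a, alloc_b))
        wins_b = sum(b > a for a, b in zip(alloc_a, alloc_b))
        return wins_a, wins_b

    # Example: 100 troops spread across three fronts.
    # Spreading evenly wins two fronts against piling up on one.
    print(blotto_score([34, 33, 33], [60, 20, 20]))   # -> (2, 1)

[Even this toy version hints at why the game is interesting: no single allocation beats all others, because any split can be outflanked by an opponent who concentrates troops on the fronts you've left thin.]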
In addition, there'll be readings. On the wiki you'll find links to the readings. Now, a bunch of these readings will come out of some books I've written, and one that I'm about to write, actually, about this course, and it'll all be free. Princeton University Press has been very generous in letting a lot of the content of my books be out there, so you'll be free to download whatever you need to look at. All right?

There are also assignments. There's an assignment section on the web page, you'll see a little assignment thing there, so just make sure you're following what's going on with the course. And then there will be some quizzes, just so you can ask yourself, hey, am I really getting this? You'll watch me and you'll think, yeah, Scott gets these models, but that's not what this is about; this is about you understanding the models. So there'll be some quizzes, but all in good fun.

Okay, and finally, there's the discussion forum. There are 40 to 50,000 people in this class, so office hours could get sort of crowded. So we're gonna have a discussion forum where people can ask questions. I'll answer some, I've got some graduate students who'll answer some, and other students can answer things too. It'll be a place for people to share ideas, share thoughts, and give feedback, and hopefully it'll be really useful and structured in a way that works for everybody.

Okay, so how does it work? What's one of these sections gonna look like? Well, each section (and there are gonna be 21 of them) is gonna be focused on a particular model. We'll talk about the model and say, okay, what is the model? What are the assumptions? What are the parts? How does it work? What are the main results? What are the applications? We'll just talk through how the thing plays out. Then we'll go into some technical details, sometimes in the same lecture where I present the model, sometimes in later lectures. This will be more technical stuff, a little bit more mathematics. Now, I'll try to be very clear about whether the math is easy, medium, or hard. I'll let you know upfront: okay, this may require a lot of algebra, or this is just simple logical thinking. So I'll be pretty clear about how much effort it's gonna take to trudge through some of the examples.
And there will be practice problems you can work on as well. The other thing I'm gonna do in every one of these sections is talk about the fertility of the models. Remember how the Colonel Blotto model could be used to model war or sports or legal defenses? Most of these models were developed for one purpose, but we can apply them to other purposes. So we're going to talk a lot about, okay, now that we've just learned this model, where can we apply it? Where else does it work? Right?

Okay, so that's it. That's sort of how it's gonna work. Learning models is really important. It makes you a more intelligent citizen, and probably just a more engaged person out there in the world, 'cause so much of how people think and what people do is now based on models. It makes you a clearer thinker; that's why so many people are using models. It helps you use and understand data, and it's gonna help you make better decisions, strategize better, and even design things better. So the course should be really, really useful. We're gonna cover a lot of topics, and they don't necessarily build on the one before. There'll be some quizzes and videos and that sort of stuff. And this should be just a great time. Alright. Welcome, and let's get started. Thank you.