Welcome to this Cloud Computing Concepts course. In this lecture we'll be covering several concepts that are assumed knowledge for the topics we'll be discussing in the Cloud Computing Concepts course.

We'll be covering several of the basics, things that are considered to be basics in computer science. Typically, these are concepts that computer science students learn in their first couple of years of an undergraduate computer science degree program. For those of you who are already familiar with this material, this can be a refresher, or you can simply skip it if you already know this material. In any case, you can always use this orientation lecture as a reference to keep coming back to, if you encounter a term during the regular lectures that doesn't make sense to you, or that you'd like to be reminded about.

We'll discuss a few topics: some basic data structures, what processes are, a little bit about computer architecture, and then a little bit of math with Big O notation, basic probability, and a few other miscellaneous topics. So, let's get started.

Basic data structures: we'll discuss two different data structures. The first one is a queue. A queue is essentially a First-In First-Out data structure. The elements of a queue can be of any data type; in this example, the elements are integers. In this example, the queue has 5 elements: 3, 5, 8, 0, 7. This is an ordered list of elements. When you remove an element from the queue, you remove it from the head of the queue; in this case, the head is the element 3. When you insert a new element, you insert it at the tail of the queue, which means that if I insert a new element, say 1, it's going to get inserted right after 7 in this queue. So removal always comes from the head, while insertion always happens at the tail.
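To make these operations concrete, here is a minimal sketch in C++ (one of the languages this lecture mentions for writing programs) using the standard library's std::queue. The values match the example above; the snippet itself is an illustration, not part of the original slides.

```cpp
#include <iostream>
#include <queue>

int main() {
    std::queue<int> q;
    // Insert the example elements at the tail, in order.
    for (int v : {3, 5, 8, 0, 7}) q.push(v);

    q.push(1);                        // 1 is inserted at the tail, right after 7
    std::cout << q.front() << "\n";   // prints 3: removal happens at the head
    q.pop();                          // removes 3; the head now points to 5
    std::cout << q.front() << "\n";   // prints 5
    return 0;
}
```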
For instance, given this particular queue consisting of five elements, if you dequeue, or remove, an item, that item will be 3. At that point the queue will consist of only 5, 8, 0, and 7, with the head pointing to 5. The next item that you dequeue will be 5; after that, the queue will consist of only 8, 0, and 7. Then the next item that you dequeue will be 8, and so on and so forth. Dequeues, or removals, as well as insertions, can occur concurrently: you can remove an element and at the same time insert an element at the tail.

So that's a queue, which is a First-In First-Out data structure. A different data structure is the stack, which is a First-In Last-Out data structure. Imagine a stack of dishes that you place on your table: the dish that you put on the top, the one you added last, will be the first one that you can remove. That's essentially a stack. A removal, also known as a pop, happens from the top; an insertion, or a push, also happens at the top. So in this case, if you push a new element onto this particular stack, say 9, the 9 will go above 3. After that, if you pop, or remove, an element, it's going to be the last element that you added, which is 9 in this case. The next pop after that will get 3, because 9 was just removed and 3 is now the top of the stack. After you've removed 3, the next pop will get you 5, and so on and so forth; after that you'll get 8, and then 0 and 7. So, once again, a removal always returns the last item that was added, and an insertion, or push, adds an item right at the top of the stack.
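Here is the matching stack sketch, again a minimal C++ illustration using the standard std::stack, with the same elements and the push of 9 described above:

```cpp
#include <iostream>
#include <stack>

int main() {
    std::stack<int> s;
    // Push the example elements; 7 ends up at the bottom, 3 on top.
    for (int v : {7, 0, 8, 5, 3}) s.push(v);

    s.push(9);                      // 9 now sits above 3
    std::cout << s.top() << "\n";   // prints 9: the last element added
    s.pop();                        // removes 9
    std::cout << s.top() << "\n";   // prints 3; later pops return 5, 8, 0, 7
    return 0;
}
```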
These two data structures, the queue and the stack, are used very widely in computer science, and in fact we'll be using the notion of a stack right away, when we discuss processes shortly in this orientation lecture itself.

So, speaking of processes, let's discuss a process next. A process is essentially a program in action. You may write a program in a programming language such as C or C++; an example is shown here. This example code consists of a main function, which calls a function called f1. f1 itself has a local integer variable x, and f1 in turn calls a different function, f2. This is shown pictorially here in the circle, where main calls f1 and f1 calls f2. This is the code that you would write; you would then compile the code and execute it. And when you're executing it, when your program is in action, that is a process.

So, a process consists of your code, but also of several other elements that help your code execute on the computer on which you're running it. We're going to discuss some elements of a process which are critical. Here are some of these elements.

First, let's look at the code. The code itself is static: once you write it, nothing changes it. (We are not considering the variable values to be a part of the code; I'll talk about that in a little bit.) But there is a program counter, typically maintained by the computer on which you're running your process, which points to the line number inside the code where the process is currently executing. This is called the Program Counter, or the PC, and it is an important part of the process state. So if you were to take a core dump, or stop the process right now and take a snapshot of it, that snapshot must include the program counter, because it says where in the code the execution currently is.

Next, when functions call each other, or, in an object-oriented program, when methods call each other, they use a stack. So every process contains a stack.
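Since the discussion that follows walks through how main, f1, and f2 use this stack, here is a minimal reconstruction of the example code described above. The captions don't reproduce the slide's actual code, so the argument values and the arithmetic inside the functions are assumptions, made purely for illustration:

```cpp
#include <iostream>

// f2 is called by f1; its argument and return value travel via the stack.
int f2(int a) {
    return a * 2;       // some computation, assumed for illustration
}

// f1 has a local integer variable x, and calls f2.
int f1(int arg) {
    int x = arg + 1;    // the local variable x from the lecture's example
    return f2(x);       // f1 calls f2
}

int main() {
    int result = f1(10);  // main calls f1
    std::cout << result << "\n";
    return 0;
}
```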
More specifically, a process might contain multiple threads, and each thread will contain its own stack. But let's just say there is one thread in this process, so the process contains one stack. This stack is used by the functions or methods to pass arguments and return result values. For instance, when main calls f1, main will push the arguments to f1 on top of the stack. When f1 starts executing, it will pop, or remove, those elements from the top of the stack and use them to execute. Similarly, when f1 calls f2, f1 will push the arguments for f2 on top of the stack; f2 will pop them, execute, and then push its result values on top of the stack. The result values are then popped by f1; this is the way f1 gets back the result, the final int return value, from f2. And finally, when f1 needs to return a result value to main, it pushes it on top of the stack; when the execution returns to main, main pops it, or removes it, from the top of the stack, and main has the final return value from f1. So, the stack is an important part of the process state as well, because it tells you where in the program execution you are, with respect to functions calling each other.

Finally, functions have local variables, like x. There might also be global variables, and of course in object-oriented programs you have objects, which store a lot of fields in them. A lot of these pieces of data are stored in what is known as a heap. A heap is essentially a lot of data that was created by functions or methods, or by objects, and it holds all of that data. For example, while a function, say f1, is executing, an object x might be created inside the heap; when f1 is done executing, the object x will be removed from the heap.
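As a small sketch of this stack-versus-heap distinction, again assuming C++: a local variable lives in the function's stack frame, while an explicitly allocated object lives in the heap until it is freed (here, automatically, when f1 returns):

```cpp
#include <iostream>
#include <memory>

void f1() {
    int x = 42;  // x lives in f1's stack frame
    // An object created on the heap while f1 executes; a smart pointer
    // is used so the heap object is removed when f1 returns.
    auto obj = std::make_unique<int>(x);
    std::cout << *obj << "\n";
}   // here both x (on the stack) and *obj (on the heap) go away

int main() {
    f1();
    return 0;
}
```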
Similarly, there are also registers. I'll talk more about registers when we discuss computer architecture, but registers hold some of the recent values that have been accessed by the process, p1 in this case.

Okay, speaking of computer architecture, let's discuss a simplified version of computer architecture. Computer architectures today can be very complicated, but typically, when we think of how a process executes on a computer architecture, we can think of a simplified view. There is a CPU, which executes the actual instructions present in your code. There are also a bunch of registers that are co-located with the CPU; these are small pieces of memory that can be accessed very quickly by the CPU. Typically there's only a small number of registers, not more than a few tens. There's also a cache, which is a slightly larger memory than the collection of registers. The larger a piece of memory is, the slower it is to access, so accessing the cache is slower than accessing registers, but accessing a cache is still pretty fast. Beyond the cache there is the main memory, or Random Access Memory, the RAM, which is even larger than the cache; the memory, therefore, is slower than the cache, but it is still pretty fast. And then finally there's the hard disk, or, if you like, a solid-state drive, a flash drive, or whatever, which offers a lot more storage than the main memory but is, of course, much slower. So, as you go up from disk, to memory, to cache, to registers, the speed increases, and the total amount of storage space that you have decreases.
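One rough way to see this hierarchy in action is a toy timing experiment, sketched below under the assumption of a machine where the array is larger than the cache. The exact numbers are entirely machine-dependent; only the trend matters, namely that sequential access, which reuses each cache line, beats cache-unfriendly strided access even though both do the same amount of work:

```cpp
#include <chrono>
#include <iostream>
#include <vector>

int main() {
    const std::size_t n = 1 << 24;       // 16M ints, ~64 MB: bigger than most caches
    std::vector<int> data(n, 1);

    auto time_sum = [&](std::size_t stride) {
        long long sum = 0;
        auto start = std::chrono::steady_clock::now();
        // Touch every element exactly once, in stride-sized jumps.
        for (std::size_t offset = 0; offset < stride; ++offset)
            for (std::size_t i = offset; i < n; i += stride)
                sum += data[i];
        auto stop = std::chrono::steady_clock::now();
        std::chrono::duration<double, std::milli> ms = stop - start;
        std::cout << "stride " << stride << ": " << ms.count()
                  << " ms (sum=" << sum << ")\n";
    };

    time_sum(1);    // sequential: each cache line is fully reused
    time_sum(16);   // 64-byte jumps: roughly one cache miss per element
    return 0;
}
```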
So, when you write a program in a language such as C++, Java, or C and compile it, it gets compiled down to low-level machine instructions. These machine instructions might be specific to the architecture of the machine on which you're running, or they might be virtual machine code of some kind, such as JVM bytecode. In any case, these low-level machine instructions are the executable version of your program, and they are stored in the file system on your disk.

When your program starts executing, when it becomes a process, the CPU loads instructions in batches or, in a more simplified view, one by one. It loads them into memory, and then into the cache and into the registers. Typically, the cache and the registers contain the last few accessed instructions; they follow what is known as the least-recently-used principle (a small software sketch of this eviction idea appears at the end of this section). Then, as it executes each instruction, the CPU, which is executing the process, loads the data necessary for that instruction into memory and then, if necessary, into the cache and the registers. And if there are any changes that happen to a variable like x, these are stored first into the cache and then into memory.

Once in a while, you may need to store something on disk so that you have permanent storage; in case the process goes offline, you still have some data. For instance, the program might open a file and write something into it; in this case, you may want to write those updates to the file onto the disk.

Now, of course, this is a very highly simplified picture. Computer architectures can be far more complex than this today, but for our practical purposes in this course, and as far as explaining how processes work, this will suffice.
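One aside before moving on: the least-recently-used principle mentioned above is easy to sketch in software. Below is a minimal, assumed illustration of an LRU cache built from a doubly linked list plus a hash map; it demonstrates the eviction policy itself, not how hardware caches are actually built:

```cpp
#include <cstddef>
#include <iostream>
#include <list>
#include <unordered_map>

// A minimal LRU cache: the list keeps (key, value) pairs ordered from
// most- to least-recently used; the map finds a key's node in O(1).
class LruCache {
    std::size_t capacity_;
    std::list<std::pair<int, int>> items_;  // front = most recently used
    std::unordered_map<int, std::list<std::pair<int, int>>::iterator> index_;

public:
    explicit LruCache(std::size_t capacity) : capacity_(capacity) {}

    void put(int key, int value) {
        if (auto it = index_.find(key); it != index_.end())
            items_.erase(it->second);        // drop the stale entry
        items_.push_front({key, value});
        index_[key] = items_.begin();
        if (items_.size() > capacity_) {     // evict the least recently used
            index_.erase(items_.back().first);
            items_.pop_back();
        }
    }

    bool get(int key, int& value) {
        auto it = index_.find(key);
        if (it == index_.end()) return false;               // cache miss
        items_.splice(items_.begin(), items_, it->second);  // mark as recent
        value = it->second->second;
        return true;
    }
};

int main() {
    LruCache cache(2);
    cache.put(1, 10);
    cache.put(2, 20);
    cache.put(3, 30);   // capacity 2: key 1 is evicted as least recently used
    int v = 0;
    std::cout << (cache.get(1, v) ? "key 1: hit\n" : "key 1: miss (evicted)\n");
    std::cout << (cache.get(3, v) ? "key 3: hit\n" : "key 3: miss\n");
    return 0;
}
```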
So next, we move on to a few mathematical topics. The first mathematical topic is Big O notation, one of the most basic ways of analyzing algorithms. When I give you an algorithm and I say "analyze the algorithm mathematically," essentially I expect you to come up with at least a Big O analysis of that algorithm. Big O notation describes an upper bound on the behavior of an algorithm as some variable is scaled, or increased in value, all the way to infinity. Typically, this variable is the size of the input, maybe the number of input elements; you'll see a few examples of this.

So, essentially, a Big O analysis tells us: how does the algorithm itself scale as the size of the input is increased? This analyzes the run time or, sometimes, another performance metric as well. Most of the time we analyze run time; sometimes, as you'll see in the course, we'll analyze things like the number of messages, or even the bandwidth that is exchanged in the total network itself. The important thing about Big O notation is that it captures worst-case performance. This is not expected performance or best-case performance; Big O is worst-case performance. Given this algorithm, what is the worst that it can do in terms of the performance metric that we're measuring?

So, essentially, when I say an algorithm A is order of foo, this means that algorithm A takes less than c times foo time to complete, assuming time is the metric we are measuring, for some fixed constant c, beyond some input size N. Typically, this foo in the definition is some function of the input size N. For instance, I could say that an algorithm is order N, which means that algorithm A takes less than c times N time to complete, for some constant c, beyond some input size N. Similarly, the algorithm might be order N^2, which means that it is quadratic in time: it takes less than c times N^2 time, for some constant c. Once again, when we say order N^2 or order N, notice that we do not include the constant c in the definition itself.
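Written out formally, the standard definition behind this phrasing is the following (stated a bit more precisely than in the spoken version):

```latex
% f(N) is the cost of algorithm A on inputs of size N.
% A is O(g(N)) if, beyond some input size N_0, a fixed constant c bounds it:
\[
  f(N) = O\big(g(N)\big)
  \iff
  \exists\, c > 0,\ \exists\, N_0
  \quad\text{such that}\quad
  f(N) \le c \cdot g(N) \ \ \text{for all } N \ge N_0 .
\]
```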
This is because the constant is irrelevant, as long as there is some fixed constant; that is good enough for us as far as Big O notation is concerned. So we do not state the constants in Big O notation.

So let's see a couple of examples. First, I give you an unsorted list of, say, integers, and I ask you to search for a specific element in the list. This essentially means that you have to iterate through the list, one item at a time, and search for the element by trying to match it against each element present in the list. What is the worst-case performance? That's what we need for Big O notation, right? The worst case happens when either the element is not in the list at all, so you return false, or the element is the very last one in the list, the very last one that you match against. Therefore, the worst case involves N operations, and the number of operations involved is less than c times N for, say, c = 2, because the number of operations is N, where N is the size of the list. So, assuming each operation takes one unit of time, both the time taken by this algorithm, which iterates through the list, and the number of operations are order N. That's why we say order N here.
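A minimal C++ sketch of this linear search:

```cpp
#include <iostream>
#include <vector>

// Linear search through an unsorted list: compare against one element
// at a time. The worst case (element absent, or matched last) touches
// all N elements, so both time and operations are order N.
bool contains(const std::vector<int>& list, int target) {
    for (int element : list)
        if (element == target) return true;
    return false;  // worst case: every element was compared
}

int main() {
    std::vector<int> list = {3, 5, 8, 0, 7};
    std::cout << contains(list, 8) << " " << contains(list, 4) << "\n"; // 1 0
    return 0;
}
```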
Let's take a different example to see how Big O notation works; this one deals with insertion sorting of an unsorted list. Suppose I give you a list of elements, say integers, that is unsorted, and I ask you to sort them, say in increasing order of the elements. The insertion sort algorithm works as follows. First, you create a new list that is empty. Then you iterate through the elements in the original unsorted list, you remove one element at a time, and you insert that element into the new list at the appropriate position. Essentially, you search through the new list linearly, looking for the right position where this element should be.

So, if you do this, the first element, of course, takes one operation to insert: that's one insert operation. The second element, in the worst case, takes two operations to insert: one operation for the comparison with the one element that's already present in the new sorted list, and one more operation to insert. Imagine the worst case, where this second element is inserted right at the very end of the list. Similarly, the third element will take three operations to insert: the first two operations compare it with the two already existing elements, and the third operation inserts it at the very end of the list. If you go on like this, the i-th element takes i operations to insert, and you can calculate that the total time required to insert all the N elements is 1 + 2 + 3 + ... + N. This is a well-known sum, which comes to N(N+1)/2, and that is less than 1 times N^2. So, with a value of c = 1, this insertion sort example is an algorithm that takes order N^2 in time, and also order N^2 in the number of operations.
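And a sketch of the insertion sort just analyzed, building the new sorted list by scanning linearly for each element's position, exactly as described:

```cpp
#include <iostream>
#include <vector>

// Insertion sort as described: build a new, sorted list by scanning
// linearly for each element's position. The i-th insert can take up to
// i operations, giving 1 + 2 + ... + N = N(N+1)/2, which is order N^2.
std::vector<int> insertion_sort(const std::vector<int>& unsorted) {
    std::vector<int> sorted;                       // new, empty list
    for (int element : unsorted) {
        auto pos = sorted.begin();
        while (pos != sorted.end() && *pos < element) ++pos;  // linear scan
        sorted.insert(pos, element);               // insert at its position
    }
    return sorted;
}

int main() {
    for (int v : insertion_sort({3, 5, 8, 0, 7})) std::cout << v << " ";
    std::cout << "\n";  // prints: 0 3 5 7 8
    return 0;
}
```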
Next, we look at a little bit of basic probability. Before probability, we need to discuss a few definitions. The first definition is a set. A set is essentially a collection of things. For instance, a set S might be the set of all humans who live in this world at this current moment. A subset is another set, a collection of things that is a part of the larger set. So I might describe a set S2, the set of all humans who currently live in Europe today. S2 is a subset of S, because every element that is present in S2 is also present in S, but the reverse is not necessarily true.

Okay, so now, probability: any event has a probability of happening. Let me give you an example. Say you wake up at a random hour of the day, you just wake up at some random hour, say because you're jet-lagged. And I ask you: what is the probability that the time at that moment is between 10 a.m. and 11 a.m.? Let's try to calculate this. Well, there are 24 hours in a day, so you have a set that consists of 24 elements, one for each hour: 12 a.m., 1 a.m., 2 a.m., all the way through 10 a.m., 11 a.m., and so on to 11 p.m. When I say an element like 1 a.m., I mean the hour between 1 a.m. and 1:59 a.m.; similarly, 10 a.m. means 10 a.m. to 10:59 a.m. Out of this set of 24 elements, you pick one at random, and the probability that you're going to pick 10 a.m. is essentially 1/24, because you're picking one element at random, there are 24 elements in the set, and you need to be lucky enough to pick that one special element, 10 a.m. So, if you wake up at a random hour of the day, the probability that the time is between 10 a.m. and 11 a.m. is 1/24.

Now, there might be multiple events, and you might need to calculate the probability of conjunctions or disjunctions of these events; I'll describe what these are soon. Say E1 is one event and E2 is another event, and E1 and E2 are independent of each other. This means that E1 does not influence E2, and E2 does not influence E1. Then the probability that E1 and E2 both happen is essentially the multiplication of their probabilities: you take the probability of E1, multiply it by the probability of E2, and that gives you the probability that both E1 and E2 are true.

Let's discuss an example of this. Suppose you have three shirts, colored blue, green, and red, and when you wake up at a random hour you blindly pick a shirt: you put your hand into your closet and blindly pull out a shirt. What is the probability that you woke up between 10 a.m. and 11 a.m., and that you picked the green shirt to wear? Well, the probability of the first event, that you woke up between 10 and 11, is essentially 1/24, as we just calculated. The probability that you pick the green shirt is essentially 1/3, because you have three shirts and you need to be lucky enough to pick the green one. And so the probability that both these events are true, that you woke up between 10 and 11 and that you wore the green shirt, is the multiplication of the probabilities of those events: (1/24) * (1/3) = 1/72.
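As a quick sanity check of this multiplication rule, here is a small simulation sketch: sampling the hour and the shirt independently should produce a fraction of trials close to 1/72, about 0.0139. The seed, the trial count, and which index means "green" are arbitrary choices for illustration:

```cpp
#include <iostream>
#include <random>

// Sample the wake-up hour and the shirt independently and count how
// often both events happen; the fraction approaches 1/72 ~ 0.0139.
int main() {
    std::mt19937 rng(42);
    std::uniform_int_distribution<int> hour(0, 23);   // 24 hours in a day
    std::uniform_int_distribution<int> shirt(0, 2);   // blue, green, red

    const int trials = 1'000'000;
    int both = 0;
    for (int i = 0; i < trials; ++i) {
        bool woke_at_10   = (hour(rng) == 10);  // the 10 a.m. hour
        bool picked_green = (shirt(rng) == 1);  // say index 1 is green
        if (woke_at_10 && picked_green) ++both;
    }
    std::cout << double(both) / trials << "\n";  // close to 1.0 / 72
    return 0;
}
```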
One of the things that you have to be careful about here is that if E1 and E2 are not independent, meaning that they influence each other in some way, then you cannot multiply the probabilities with each other. So, for instance, if the shirts that you have in your closet change from one hour to another, because, say, someone in your house moves them around, then you can't necessarily multiply these probabilities as we did here.

A different thing that we need to do with two event probabilities is to OR them. So, if I give you two events, E1 and E2, and I ask you the probability of either E1 or E2 happening, then you can say that Prob(E1 OR E2) = Prob(E1) + Prob(E2) - Prob(E1 AND E2). This last term is the intersection probability. If you do not know Prob(E1 AND E2), this last term over here, then you can write the inequality Prob(E1 OR E2) <= Prob(E1) + Prob(E2), the sum of the constituent probabilities. Sometimes we use this inequality to upper-bound the probability of a disjunction, as we see here.

Okay, let's discuss a few miscellaneous topics. The first miscellaneous topic is the Domain Name System, also known as DNS. This is a collection of servers that are spread throughout the world, and the DNS is very critical to the operation of the web. Typically, the input to the DNS system is a URL, say coursera.org. The URL is a name: a human-readable string that uniquely identifies the object. Typically, you go to your browser, whether it's Safari, Chrome, Internet Explorer, Netscape, Firefox, or whatever your favorite browser is, and you enter a URL like coursera.org. The moment you hit Enter, your browser goes off and contacts the DNS system, and gives the DNS system the name in this URL, which is coursera.org. What does the DNS give back to your browser? It gives back the IP address of a web server that hosts that content, so that your machine, your browser, can then go and send a request to that IP address and fetch the actual content of that web page. So, the IP address is an ID: a unique string that points to that particular object, for instance the coursera.org web page. And the ID itself may not be human-readable. So, essentially, DNS is a system that translates a human-readable name, such as a URL, into a unique ID, which may or may not be human-readable, like the IP address. In this case, the IP address may refer either to the actual web server that is storing the content, or to an indirection server, such as a content distribution network server, for example an Akamai server, that is hosting or caching the content on your behalf.
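Programmatically, this first translation step looks roughly like the sketch below, which assumes a POSIX system and uses the standard getaddrinfo call. The addresses printed for coursera.org will vary over time and by location, and may well point at a CDN rather than an origin server:

```cpp
#include <arpa/inet.h>
#include <netdb.h>
#include <netinet/in.h>
#include <iostream>

// Hand the DNS system a human-readable name, get back IP addresses:
// the browser's first step before it can fetch any content.
int main() {
    addrinfo hints{};
    hints.ai_family   = AF_INET;       // ask for IPv4 addresses
    hints.ai_socktype = SOCK_STREAM;

    addrinfo* results = nullptr;
    if (getaddrinfo("coursera.org", "80", &hints, &results) != 0) {
        std::cerr << "DNS lookup failed\n";
        return 1;
    }
    for (addrinfo* r = results; r != nullptr; r = r->ai_next) {
        char ip[INET_ADDRSTRLEN];
        auto* addr = reinterpret_cast<sockaddr_in*>(r->ai_addr);
        inet_ntop(AF_INET, &addr->sin_addr, ip, sizeof ip);
        std::cout << ip << "\n";       // an IP address the browser could contact
    }
    freeaddrinfo(results);
    return 0;
}
```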
00:20:19.184 --> 00:20:20.185 This is a collection of servers, 00:20:20.185 --> 00:20:22.220 that are spread throughout the world and, uh, 00:20:22.220 --> 00:20:23.221 the DNS is very critical 00:20:23.221 --> 00:20:25.591 to the operation of the, uh, web. 00:20:25.591 --> 00:20:26.692 Uh, typically the input 00:20:26.692 --> 00:20:30.095 to the DU-to the DNS system is a URL, say coursera.org. 00:20:30.095 --> 00:20:31.563 Uh, the URL is a name, 00:20:31.563 --> 00:20:32.764 it's a human readable string, 00:20:32.764 --> 00:20:35.000 that uniquely identifies the object. 00:20:35.000 --> 00:20:36.468 Uh, typically, when you go to your browser, 00:20:36.468 --> 00:20:39.504 whether it's Safari, Chrome, Internet Explorer, Netscape, 00:20:39.504 --> 00:20:41.840 Firefox, or whatever your favorite browser is. 00:20:41.840 --> 00:20:44.576 Uh, you might enter a URL like coursera.org, 00:20:44.576 --> 00:20:46.178 the moment you hit Enter on your browser, 00:20:46.178 --> 00:20:49.081 your browser goes off and contacts a DNS, the DNS, 00:20:49.081 --> 00:20:52.250 the DNS system, and, uhm, uh, gives the DNS system 00:20:52.250 --> 00:20:55.721 the name of this URL, which is coursera.org. 00:20:55.721 --> 00:20:57.289 What does the DNS give back to your browser? 00:20:57.289 --> 00:20:59.224 It gives back the IP address of a web server, 00:20:59.224 --> 00:21:01.793 that hosts that content, so that your machine, 00:21:01.793 --> 00:21:04.296 your browser can tho- then go and send a request, 00:21:04.296 --> 00:21:05.631 to that IP address, 00:21:05.631 --> 00:21:08.634 and fetch the actual content of that web page. 00:21:08.634 --> 00:21:09.935 So, the IP address is an ID, 00:21:09.935 --> 00:21:12.371 it's a unique string that points to that particular object. 00:21:12.371 --> 00:21:14.706 For instance, the coursera.org, uh, web page. 00:21:14.706 --> 00:21:18.577 Uh, and the, um, the ID itself may not be human readable. 00:21:18.577 --> 00:21:20.912 So, essentially, a DNS is a system that translates 00:21:20.912 --> 00:21:22.914 between a human readable name, 00:21:22.914 --> 00:21:25.884 such as, a URL, to, uh, to, uh, unique ID, 00:21:25.884 --> 00:21:27.986 um, which may or may not be human readable, 00:21:27.986 --> 00:21:29.454 like the IP address. 00:21:30.656 --> 00:21:32.224 Uh, in this case, IP address may refer to either, 00:21:32.224 --> 00:21:33.959 the actual web server that is storing the content, 00:21:33.959 --> 00:21:35.527 or maybe an indirection server 00:21:35.527 --> 00:21:37.162 such as, a content distribution network server, 00:21:37.162 --> 00:21:40.132 or, uh, such as, an Akamai server that is, 00:21:40.132 --> 00:21:42.834 uh, hosting or, uh, caching the content on your behalf. 00:21:45.470 --> 00:21:46.605 Uh, graphs, 00:21:46.605 --> 00:21:48.840 um, so when we say graphs, in computer science, 00:21:48.840 --> 00:21:51.943 we don't necessarily mean plots, which have x and y axes. 00:21:51.943 --> 00:21:54.079 Uh, a graph, is essentially, a network. 00:21:54.079 --> 00:21:57.149 So here I am showing you a graph of some cities, 00:21:57.149 --> 00:22:00.318 um, Amsterdam, Boston, Chicago, Delhi, and Edmonton, 00:22:00.318 --> 00:22:03.689 and the travel times, uh, between those cities by air, 00:22:03.689 --> 00:22:06.291 um, rounded up to the nearest hour. 00:22:06.291 --> 00:22:08.894 Okay, so, uh, traveling from Amsterdam to Boston 00:22:08.894 --> 00:22:09.961 takes about six hours. 
So, in this lecture we have covered basic data structures, such as the queue and the stack. We have seen that processes are programs in action, and that they consist of multiple different pieces. Processes, of course, run on a computer architecture, and it's important for you to know at least a simplified version of the computer architecture, so that you know what's operating where when you run a process. We've seen some mathematical topics, such as Big O notation, which analyzes the worst-case performance of any given algorithm, and then some basic probability. And finally, we have seen a few miscellaneous topics. I hope you keep coming back to this lecture as a reference, whenever you need it, throughout the course.