Interviewer: Today we have Elon Musk. Elon, thank you for joining us.

Elon: Thanks for having me.

Interviewer: So, we want to spend the time today talking about your view of the future and what people should work on. To start off: you famously said that when you were younger, there were five problems you thought were most important for you to work on. If you were 22 today, what would the five problems you would think about working on be?

Elon: Well, first of all, I think if somebody is doing something that is useful to the rest of society, I think that's a good thing. It doesn't have to change the world. If you make something that has high value to people... And frankly, even if it's just a little game or some improvement in photo sharing, if it has a small amount of good for a large number of people, I think that's fine. Stuff doesn't need to change the world just to be good.

But in terms of the things I think are most likely to affect the future of humanity, I think AI is probably the single biggest item in the near term. So it's very important that we have the advent of AI in a good way. It's something that, if you could look into a crystal ball and see the future, you would like that outcome, because it is something that could go wrong, as we've talked about many times. And so we really need to make sure it goes right. That's AI: working on AI and making sure it's a great future. That's the most important thing, I think, right now, the most pressing item.

Then I would say anything to do with genetics. If you can actually solve genetic diseases, if you can prevent dementia or Alzheimer's or something like that with genetic reprogramming, that would be wonderful. So I think genetics might be the second most important item.
And then, I think, having a high-bandwidth interface to the brain. We're currently bandwidth-limited. We have a digital tertiary self in the form of our email capabilities, our computers, phones, applications. We're practically superhuman. But we're extremely bandwidth-constrained in that interface between the cortex and that tertiary, digital form of yourself. Helping solve that bandwidth constraint would be, I think, very important for the future as well.

Interviewer: One of the most common questions I hear ambitious young people ask is, "I want to be the next Elon Musk. How do I do that?" Obviously the next Elon Musk will work on very different things than you did. But what have you done, or what did you do when you were younger, that you think set you up to have a big impact?

Elon: Well, first of all, I should say that I did not expect to be involved in all these things. The five things I thought about at the time in college, so quite a long time ago, 25 years ago, were: making life multi-planetary, accelerating the transition to sustainable energy, the Internet broadly speaking, and then genetics and AI. I didn't expect to be involved in all of those things. Actually, at the time in college, I sort of thought helping with the electrification of cars was how it would start out. That's what I worked on as an intern: advanced ultracapacitors, to see if there would be a breakthrough relative to batteries for energy storage in cars. And then, when I came out to go to Stanford, that's what I was going to do my grad studies on: advanced energy storage technologies for electric cars. I put that on hold to start an Internet company in '95, because there does seem to be a time for particular technologies when they're at a steep point in the inflection curve. I didn't want to do a PhD at Stanford and watch it all happen. And I wasn't entirely certain that the technology I'd be working on would succeed.
You can get a doctorate on many things that ultimately do not have a practical bearing on the world. And I really was just trying to be useful. That's the optimization: "What can I do that would actually be useful?"

Interviewer: Do you think people who want to be useful today should get PhDs?

Elon: Mostly not.

Interviewer: What is the way to be useful?

Elon: Some yes, but mostly not.

Interviewer: How should someone figure out how they can be most useful?

Elon: Well, I think you make some estimate of, whatever this thing is that you're trying to create, what the utility delta would be compared to the current state of the art, times how many people it would affect. That's why I think having something that makes a big difference but affects a small to moderate number of people is great, as is something that makes even a small difference but affects a vast number of people. Like, the area under the curve.

Interviewer: Yeah, the area under the curve.

Elon: Yeah, exactly. The area under the curve would actually be roughly similar for those two things, so it's really about just trying to be useful and matter.

Interviewer: When you're trying to estimate the probability of success, so this thing will be really useful, good area under the curve... I guess to use the example of SpaceX: when you made the go decision that you were actually going to do that, this was kind of a very crazy thing at the time.

Elon: Very crazy, for sure. People were not shy about saying that, and I agreed with them that it was quite crazy. Crazy, that is, if the objective was to achieve the best risk-adjusted return: starting a company is insane. But that was not my objective. I had come to the conclusion that if something didn't happen to improve rocket technology, we'd be stuck on Earth forever. And the big aerospace companies just had no interest in radical innovation. All they wanted to do was try to make their old technology slightly better every year.
And in fact, sometimes it would actually get worse. Particularly in rockets, it was pretty bad. In '69 we were able to go to the moon with the Saturn V. Then the space shuttle could only take people to low-Earth orbit. Then the space shuttle retired. That trend basically goes to zero. People sometimes think technology just automatically gets better every year, but actually it doesn't. It only gets better if smart people work like crazy to make it better. That's how any technology actually gets better. By itself, if people don't work at it, technology will actually decline. Look at the history of civilizations, many civilizations. Look at, say, ancient Egypt, where they were able to build these incredible pyramids, and then they basically forgot how to build pyramids. Even hieroglyphics: they forgot how to read hieroglyphics. Or look at Rome, how they were able to build these incredible roadways and aqueducts and indoor plumbing, and then they forgot how to do all of those things. There are many such examples in history. So I think we should always bear in mind that entropy is not on your side.

Interviewer: One thing I really like about you is that you are unusually fearless and willing to go in the face of other people telling you something is crazy. And I know a lot of pretty crazy people; you still stand out. Where does that come from, or how do you think about making a decision when everyone tells you, "This is a crazy idea"? Where do you get the internal strength to do that?

Elon: Well, first of all, I'd say I actually think I feel fear quite strongly. So it's not as though I just have the absence of fear; I feel it quite strongly. There are just times when something is important enough, and you believe in it enough, that you do it in spite of fear.

Interviewer: So, speaking of important things...

Elon: It's like, people shouldn't think, "I feel fear about this and therefore I shouldn't do it." It's normal to feel fear.
You'd have to have something mentally wrong with you if you don't feel fear.

Interviewer: So you just feel it and let the importance of it drive you to do it anyway?

Elon: Yeah. You know, actually something that can be helpful is fatalism, to some degree. If you just accept the probabilities, then that diminishes fear. When starting SpaceX, I thought the odds of success were less than 10%, and I just accepted that I would probably lose everything, but that maybe we would make some progress. If we could just move the ball forward, even if we died, maybe some other company could pick up the baton and keep moving it forward, so we'd still do some good. Same with Tesla: I thought the odds of a car company succeeding were extremely low.

Interviewer: What do you think the odds of the Mars colony are at this point, today?

Elon: Well, oddly enough, I actually think they're pretty good.

Interviewer: So when can I go? If I can come back, I want to come back.

Elon: I hope I'm not in some realm of self-delusion here. But look at it this way: at this point, I'm certain there is a way. I'm certain that success is one of the possible outcomes for establishing a self-sustaining Mars colony, a growing Mars colony. I'm certain that is possible. Whereas until maybe a few years ago, I was not sure that success was even one of the possible outcomes. In terms of having some meaningful number of people going to Mars, I think this is potentially something that can be accomplished in about 10 years, maybe sooner, maybe nine years. I need to make sure that SpaceX doesn't die between now and then, and that I don't die, or if I do die, that someone takes over who will continue that.

Interviewer: You shouldn't go on the first launch.

Elon: Yeah, exactly. The first launch will be robotic anyway.

Interviewer: I want to go, except for the Internet latency.
Elon: Yeah, the Internet latency would be pretty significant. Mars is roughly 12 light-minutes from the sun, and Earth is 8 light-minutes. So the closest approach to Mars is four light-minutes away, and the furthest is 20, a little more, because you can't talk directly through the sun.

Interviewer: Speaking of really important problems: AI. You have been outspoken about AI. Could you talk about what you think the positive future for AI looks like and how we get there?

Elon: Okay. I do want to emphasize that this is not really something that I advocate; it's not prescriptive. This is simply, hopefully, predictive. Because some will say this is something that I want to occur, when instead it's something that I think is probably the best of the available alternatives. The best of the available alternatives that I can come up with, and maybe someone else can come up with a better approach or a better outcome, is that we achieve democratization of AI technology, meaning that no one company or small set of individuals has control over advanced AI technology. I think that's very dangerous. It could also get stolen by somebody bad: some evil dictator or country could send their intelligence agency to go steal it and gain control. It just becomes a very unstable situation, I think, if you've got any incredibly powerful AI. You just don't know who's going to control that. So it's not that I think the risk is that the AI would develop a will of its own right off the bat. The concern is that someone may use it in a way that is bad. Or even if they weren't going to use it in a way that's bad, somebody could take it from them and use it in a way that's bad. That, I think, is quite a big danger. So I think we must have democratization of AI technology to make it widely available. And that's the reason that, obviously, you, me, and the rest of the team created OpenAI: to help spread out AI technology so it doesn't get concentrated in the hands of a few.
But then, of course, that needs to be combined with solving the high-bandwidth interface to the cortex.

Interviewer: Humans are so slow.

Elon: Humans are so slow, yes, exactly. But we already have a situation in our brain where we've got the cortex and the limbic system. The limbic system is kind of... I mean, that's the primitive brain, that's kind of like your instincts and whatnot. And the cortex is the thinking upper part of the brain. Those two seem to work together quite well. Occasionally your cortex and limbic system will disagree, but they...

Interviewer: It generally works pretty well.

Elon: It generally works pretty well, and it's rare to find someone who... I've not found anyone who wishes to get rid of either their cortex or their limbic system.

Interviewer: Very true.

Elon: Yeah, that's unusual. So I think if we can effectively merge with AI by improving the neural link between your cortex and your digital extension of yourself, which, like I said, already exists but has a bandwidth issue, then effectively you become an AI-human symbiote. And if that is then widespread, and anyone who wants it can have it, then we solve the control problem as well. We don't have to worry about some evil dictator AI, because we are the AI collectively. That seems like the best outcome I can think of.

Interviewer: So, you've seen other companies in their early days that start small and get really successful. I hope I never get asked this on camera, but how do you think OpenAI is going as a six-month-old company?

Elon: I think it's going pretty well. I think we've got a really talented group at OpenAI.

Interviewer: Seems like it.

Elon: Yeah, a really talented team, and they're working hard. OpenAI is structured as a 501(c)(3) non-profit. But many non-profits do not have a sense of urgency.
That's fine; they don't have to have a sense of urgency. But OpenAI does, because I think people really believe in the mission. I think it's important. It's about minimizing the risk of existential harm in the future. And so I think it's going well. I'm pretty impressed with what people are doing and with the talent level. And obviously we're always looking for great people to join in the mission.

Interviewer: It looks like close to 40 people now. It's quite a lot. All right, just a few more questions before we wrap up. How do you spend your days now? What do you allocate most of your time to?

Elon: My time is mostly split between SpaceX and Tesla. And of course I try to spend a part of every week at OpenAI, so I spend basically half a day at OpenAI most weeks, and then I have some OpenAI stuff that happens during the week. But other than that, it's really SpaceX and Tesla.

Interviewer: What do you do when you're at SpaceX and Tesla? What does your time look like there?

Elon: That's a good question. I think a lot of people think I must spend a lot of time with media or on businessy things. But actually almost all my time, like 80% of it, is spent on engineering and design, developing next-generation product. That's 80% of it.

Interviewer: You probably don't remember this. A very long time ago, many, many years ago, you took me on a tour of SpaceX. And the most impressive thing was that you knew every detail of the rocket and every piece of engineering that went into it. I don't think many people get that about you.

Elon: Yeah. I think a lot of people think I'm kind of a business person or something, which is fine. Business is fine. But really, at SpaceX, Gwynne Shotwell is Chief Operating Officer.
She manages legal, finance, sales, and general business activity. And then my time is almost entirely with the engineering team, working on improving the Falcon 9 and our Dragon spacecraft and developing the Mars colonial architecture. At Tesla, it's working on the Model 3. I'm in the design studio, taking up half a day a week, dealing with aesthetics and look-and-feel things. And then most of the rest of the week is just going through the engineering of the car itself, as well as the engineering of the factory. Because the biggest epiphany I've had this year is that what really matters is the machine that builds the machine, the factory. And that is at least two orders of magnitude harder than the vehicle itself.

Interviewer: It's amazing to watch the robots go here and these cars just happen.

Elon: Yeah. Now, this actually has a relatively low level of automation compared to what the Gigafactory will have and what Model 3 will have.

Interviewer: What's the speed on the line of these cars?

Elon: Actually, our speed on the line is incredibly slow. In terms of the velocity of vehicles on the line, including both X and S, it's probably about five centimeters per second. That is very slow.

Interviewer: And what would you like to get to?

Elon: I'm confident we can get to at least one meter per second. So, a 20-fold increase.

Interviewer: That would be very fast.

Elon: Yeah, at least. One meter per second, just to put it into perspective, is a slow walk, or a medium-speed walk. A fast walk would be one and a half meters per second. And the fastest humans can run at over 10 meters per second.
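As a quick illustration of the arithmetic in that last exchange, here is a minimal sketch in Python. The variable names are ours, chosen only for illustration, and the speeds are simply the rough figures quoted in the conversation: about 5 cm/s today, a target of at least 1 m/s, a fast walk at roughly 1.5 m/s, and a sprinter at over 10 m/s.

```python
# Minimal sketch of the line-speed arithmetic quoted above.
# Speeds are the rough figures stated in the interview; variable names are illustrative.

current_line_speed = 0.05   # m/s, roughly 5 cm/s on the S/X line today
target_line_speed = 1.0     # m/s, the stated goal of at least one meter per second
fast_walk = 1.5             # m/s, a fast human walk
sprint = 10.0               # m/s, roughly the fastest human running speed

speedup = target_line_speed / current_line_speed
print(f"Target vs. current line speed: {speedup:.0f}x")  # 20x, the "20-fold increase"
print(f"Target vs. a fast walk: {target_line_speed / fast_walk:.2f}x")
print(f"Sprinter vs. target line speed: {sprint / target_line_speed:.0f}x")
```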