00:01
Nine one one. What's the state of your emergency?
00:04
Hey, the tech industry got flipped upside down over the weekend. I was staring at my phone the whole time. I haven't been so invigorated by any story in, like, five years.
00:14
My wife might divorce me because I haven't looked up from my phone. I've just been on Twitter all weekend. Sam, how are you doing right now with this OpenAI story? It's been awesome. You know, the real winner through all this is Twitter. Twitter's been great for this. Right? Twitter just thrives through all types of controversy.
00:29
Twitter, aka, the Silicon Valley group chat, has been awesome.
00:34
Awesome all weekend. It's been awesome.
00:44
I thought there was a chance that you were just, like, out duck hunting, and I would come on today, and you'd be like, what happened? I'm a Sam Altman fan. I've loved this guy. We've talked about him for years. I'm a big fan of his. And I love drama. I love gossip. I read TMZ every day, and so now I've got TMZ on Twitter. It's awesome. Of course, we're talking about OpenAI and Sam Altman getting fired. We're recording this Monday. He was fired, like, Friday, but there's, like, news happening in real time, like, constantly. Right? Exactly. Like, just as of an hour ago, there's news. So we don't know exactly where this is going to all land. We're gonna try to get this out as soon as we can. We had a whole other episode planned, and we canceled it. We just said, let's do this right now. What's our reaction?
01:25
What is our reaction to the OpenAI stuff that's going on? Alright. Sam, where do you wanna start? I think we should take this
01:32
character by character,
01:34
go through all the players that were involved, and I think we should give them kind of a grade.
01:39
And they can get, you know... if they get an A or an A plus, that means they came out a big winner here. You could get a C, you could get an F. I think we should grade all the players on how this has turned out for them in the, you know, forty-eight hours since this happened. Oh, do you wanna first give a little timeline summary recap? Yeah. So
01:58
I think you should do it because you've been paying attention to it more closely, but it all started Friday at, like, what, three o'clock? Exactly. Friday afternoon.
02:06
I get a text message from Ben Levy. He just says, wow, Sam Altman. What the fuck?
02:12
Holy shit.
02:13
What?
02:14
But he doesn't say the news yet, and I'm like, what? What? And then he's like, Sam Altman, out. But he, like, misspells out, and then he, like, does the stars, not out. Out of what? Where's the outcome? It was unfathomable that Sam Altman would get fired. Right? Like,
02:30
In the same way that, you know, I don't expect to wake up and see that Elon Musk has been fired from Tesla.
02:35
You don't wake up and think that Sam Altman's gonna get fired from OpenAI, which is
02:40
the hottest private company, probably the most important startup in the world, that has gone from zero to a ninety billion valuation in just a couple of years here. Yeah. It was shocking. It started because OpenAI,
02:53
I don't even know. Our guy put this in military time. So what's this, two forty-eight? At two forty-eight, OpenAI announced a leadership transition. They said that Sam Altman is departing as CEO because he was not consistently candid in his communications with the board. And their CTO,
03:07
a lady named Mira, was to become the interim
03:11
CEO. A little over an hour later, Sam Altman said, I loved my time at OpenAI, it was transformative for me personally, and that he's excited for what's next. Well, let's pause there, because the first phase here is wild speculation.
03:23
So the speculation is, oh my god, shock and speculation. I can't believe this happened. It's like an April Fools' joke. It must have been something bad, dude. You know, maybe this was a fraud.
03:36
Did they unleash AGI? Was there a huge privacy,
03:39
like, leak
03:40
that is causing him to have to step down because this is a security issue that he overlooked?
03:45
There's, like, this accusation from his sister about sexual assault when they were kids. Is it because of that? And everybody
03:52
assumes it's gotta be the worst.
03:55
Until
03:56
one hour later,
03:57
or when is it? One hour later? Yeah. I think
04:01
a couple hours later, Greg, who was the chairman and was essentially his co-founder
04:07
of OpenAI, the lead technical guy from
04:10
the early days,
04:12
sends this message to the team. Hi, everyone. Super proud of what we built starting in my apartment eight years ago.
04:18
We've been through tough times and great times together, accomplishing so much, doing what should have been impossible. Based on today's news, comma, I quit.
04:25
Wishing you all... But the i, lowercase,
04:28
the lowercase i.
04:30
And,
04:32
and so now we're in phase two. Phase two is,
04:36
wait. Greg's on the board.
04:38
Wait. Greg is a stand-up guy. If it was something really bad, Greg would not have just quit with him. This must not be as bad. Is there some kind of power play, jealousy,
04:50
a coup that's happening here? Is Sam being wronged?
04:52
And the
04:53
instantly, I see the court of public opinion
04:56
shift.
04:57
And the court of public opinion says, look, if Sam did something really effed up, Greg would not have just followed him out the door and quit. Unless Greg was in on it, so that's still a possibility.
05:06
But more likely, it seems like Greg doesn't stand for this reasoning.
05:11
So maybe we don't stand for this reasoning. And all of a sudden,
05:14
you see the Sam army.
05:16
Come forward.
05:17
People whose startups went through YC when he was the president of YC saying, look,
05:23
I don't know what happened here, but Sam is like he went to bat for me.
05:28
So many stories came out that were all the same variety. I don't know if you saw these. It was startups saying,
05:34
we had a time when we were screwed.
05:37
I emailed Sam to be like, hey, here's what's going on. Blah blah blah. Sam quickly replied with just something like, hey, make sure you, you know, you're communicating or like,
05:46
you know, just do your best. There's only so much you can control. But behind the scenes, he went to war for us. We found out later that he called all of our investors and threatened to dangle them over the balcony if they screwed us, and he saved our ass and didn't even mention that he was doing that, just in the background.
06:03
He went through and, like, saved us. And we found out later that this was true. So I don't know what happened here, but, you know, I ride with Sam. Well, an equally big deal
06:13
is one time I had a customer service question and I DMed him, and he replied, and he solved it for me. So... Exactly. And a lot of it was, he replied fast. He replied when he didn't have to. He vouched for us when he didn't have to. Man, he did, by the way. That wasn't a joke. He did reply to me. Like, I had, like, an error. I couldn't figure something out. I was, like, I'm just gonna DM Sam. And he replied and, like, he solved my problem. But, yeah, the guy seems like a great guy. And his brother comes out and says, to all the people that are gleefully hating today, please know, you're betting against the wrong guy.
06:45
Okay. So that kind of foreshadows.
06:47
Act two. Act two is the weekend. So Friday, you got the crazy news dump. Now the weekend, you have
06:55
news that
06:56
people inside the company start to stand with Sam, because they say, if there was something bad,
07:03
Tell us what it is. You don't have to say it publicly, say it internally.
07:06
What does it mean he wasn't
07:08
candidly
07:10
consistent, or consistently candid, with the board? About what? Did he lie? What did he lie about? How bad was it? You're taking our leader out. Our two leaders out now.
07:20
And wait, who is the board? Oh, the board is, like, this combination of people who have no...
07:27
They didn't build this. They don't even own equity in this, but they're deciding our future because the board is basically
07:34
one legit technologist.
07:35
This guy, Adam D'Angelo, who was the CTO of Facebook back in the day, and then he co-founded Quora after that. So you have Adam,
07:44
legitimate character in Silicon Valley. And then you have a bunch of other people that nobody's ever heard of. One of them is, like, that actor's wife. She's, like, you know, an academic somewhere, never had a job. Joseph Gordon-Levitt. Yeah. Exactly. Yeah. You know, the guy from Inception.
08:00
You know, like, great.
08:03
The guy from Inception, not Leonardo DiCaprio.
08:07
That's the thing. Not even him. He's not the one on the board. And not even him, it's his wife.
08:14
So you got, like, you know, you got her. You got a couple of the characters here.
08:18
But again, nobody with equity, no skin in the game, and no track record building or operating, you know, complex companies.
08:26
So that seems a bit weird. They're giving no data, and
08:31
rumors come out. Sam's gonna have a new company by Monday. Sam and Greg are gonna create an OpenAI competitor by Monday,
08:38
which is just like
08:40
a pretty bad ass threat to throw down.
08:42
And so once the team starts to say,
08:46
they start to tweet out these heart emojis saying, basically, I stand with Sam. I don't know all the details, but if you don't come out with any details explaining why you did this, I'm probably gonna leave with Sam.
08:58
And so the board starts scrambling. And they start trying to renegotiate with Sam to maybe bring him back. And his conditions are like, cool.
09:06
I'll come back.
09:07
Y'all all gotta resign
09:09
and clear my name.
09:10
You gotta bring back Greg.
09:12
And we're gonna put a new board together that's, like, the board of my choosing.
09:17
The board doesn't wanna do all that. And, so they're kind of stalling. They're going back and forth. He goes back to the office with a guest pass on his badge, and he says, this is the last time I'll ever enter this office with a guest pass.
09:31
And then at the last minute,
09:33
We hear the news come out. It looks like Sam's coming back. And then Sunday night, it hits:
09:39
there's a new CEO in town. Emmett Shear,
09:42
my former boss, the former CEO of Twitch, is now suddenly the guy for OpenAI. Sam is out. He's in. Shock. That was, you know, what I thought was the end. And before I went to sleep,
09:56
thinking, okay, we're gonna record the pod, I thought that was the end.
09:59
And he releases an announcement on Twitter saying, basically, like, hey,
10:04
hey, I'm here. Here's my thirty day plan.
10:08
I wake up this morning, and there's more news.
10:12
Six hundred of the seven hundred employees
10:15
of OpenAI have signed a petition saying, if you don't bring Sam and Greg back and all resign, we're all leaving.
10:21
Six hundred of seven hundred.
10:24
It was actually five hundred as of three AM, and then it got to six fifty
10:28
as of,
10:30
as of right before recording this. That's see a hundred people woke up and also signed the pledge. In clinic, by the way,
10:36
the guy who was supposedly behind
10:39
the entire coup, Ilya, the main technologist,
10:43
the scientist who was on the board, who was the guy who sent Sam a Google Meet link and was like, hey, can you join this real quick? And Sam's like, sure, let me just pop on. Let me just get my microphone. And he pops on and then gets fired. The whole board is sitting there, and they execute it. A disaster.
10:58
Ilya comes out and signs the pledge and says, I'm sorry for what I did.
11:03
What? And he even tweets out, and he goes, like, I wanna get the band back together and make it right. I'm sorry. I regret what I did, and, you know, we'll try to right the wrong. And then on top of that, before that happened,
11:16
Satya Nadella, the CEO of Microsoft, comes out and says, hey,
11:21
we still support OpenAI.
11:24
Sure.
11:26
Look forward to meeting the new guy, which is just a really funny way of putting it. Because apparently, they were not consulted and only told one minute before the press release went out that Sam was fired.
11:39
So they were blindsided, even though they've agreed to commit ten billion dollars into this company and own forty-nine percent of it.
11:47
And then he says, oh, and good news. Sam and Greg now work for Microsoft.
11:53
And this is just like
11:54
the ultimate
11:56
checkmate move,
11:57
because, Sam, I don't know if you know this, but Microsoft, not only do they own forty-nine percent.
12:03
Not only do they provide all of the funding, which OpenAI requires to exist, and they could pull that at any minute, because a lot of that funding is in compute credits. So, basically, server credits.
12:16
But not only that. They also have the license to all the technology.
12:21
They have the model, the weights, the inference, they have everything that they need to basically hand Sam and Greg, like, all of the code and be like, you don't have to start from scratch. Here you go. And so,
12:32
in the end,
12:33
they got everything.
12:35
They got everything they wanted. They got the team, the technology, the money,
12:40
got everything.
12:41
The most interesting part of all this, though, is the characters, because everyone keeps saying this is Succession in real life, and it is. It's one of the greatest things to happen in Silicon Valley in the last, you know, fifteen years. It's a beautiful storyline. We thought Sam Bankman-Fried was a gift from the gods. In terms of podcasting, this is actually, I think, more interesting. It's not played out yet, so the story is still gonna, like, keep going. But
13:06
let's talk about the characters. Let's talk about Sam Altman first. We've talked about him a ton, but this guy, basically, his background is, he's from Saint Louis, my hometown, give a shout-out to Saint Louis. He then moved to Silicon Valley to join Y Combinator at the age of, like, nineteen or twenty. He started a company that had, like, mild success. I think it raised a fair amount of money and sold for fifty million dollars, of which he walked away with five million dollars at the age of, like, twenty-one. And using that five million dollars, he parlayed that into hundreds of millions by investing in things like Airbnb, Dropbox, things like that. Because he worked at Y Combinator, he eventually becomes president of Y Combinator at the age of twenty-seven. Two quotes
13:44
from Paul Graham about Sam Altman. And remember, Paul Graham is somebody who
13:48
invested in Dropbox, invested in Airbnb, invested in, you know... he has seen the founders. He knows everybody in Silicon Valley. He's met Mark Zuckerberg. He's met all these guys. And so
13:58
for him to be most impressed by Sam Altman means something. And he says,
14:03
Sam Altman has it.
14:05
You could parachute him into an island full of cannibals,
14:08
and come back five years later, and he'd be the king.
14:13
That's the first one. And then the second one, he goes,
14:17
I met Sam Altman many years ago when he was nineteen years old. He was a Stanford dropout, which, by the way, Harvard or Stanford dropout is the number one resume stamp. That is the number one,
14:28
like, item. If you go to Harvard or Stanford and you don't drop out, you have intentionally chosen to not have the number one credential you can have, which is Harvard dropout or Stanford dropout.
14:38
So he says,
14:40
within three minutes of meeting him, I remember thinking,
14:44
so this is what Bill Gates must have been like when he was nineteen.
14:48
In three minutes.
14:50
I don't know what Sam Altman did in three minutes,
14:53
but I would love to know. I am fascinated by people who are special. They're kind of the freaks. I love it in sports, where you could see it. You could see LeBron James at age seventeen was, like, a freak of nature. Yeah. He had, like, the equivalent of a forty-two-inch vert, you know, whatever he did. It was, like, an eighth grader dunking. There's another story that's sort of remarkable. I thought this kinda tells you about this guy's character. So, Sam Altman's gay. He came out when he was, like, fifteen or sixteen, I think. You said something about him being from your hometown. There's some story which is that he came out, and not only did he just come out as a teenager, which alone takes some bravery. He then, like,
15:29
hosted, like, a pep rally or something like that. He went to John Burroughs. So in Saint Louis, where I'm from, almost all high schools are either Catholic or some type of Christian whatever. And he came out as being gay, and I think he created, like, a club, some type of, like, gay club, at a school. They're like, no, we're not about this. And so I think he led, like, a walkout, like, a school walkout. That takes a lot of, like, leadership. He's a very courageous guy. And then for, like, the fifteen years pre-OpenAI, even though he was president of Y Combinator, he was still a little bit behind the scenes. And even though he was running this huge incubator and he's a powerful guy, it still felt like, well, why would Paul Graham say this about this kid who still has unfulfilled potential? A couple of years ago, he starts working on OpenAI, and then all of a sudden we're like, oh, alright, the guy was plotting. He's played the long game. Now we understand why everyone, like, looks up to this guy. And OpenAI has basically taken over the tech industry and went from zero to one of the biggest companies in the world in, what, five years or something like that? Yeah. Exactly. So he leaves the most prestigious job in Silicon Valley, which is, you know, president of YC. It's like Dean of Harvard. Dean of our Harvard.
16:42
He quits suddenly because the
16:45
nonprofit research organization he co-founded, you know, he felt like that was the right opportunity. He did that.
16:50
It was sudden, it was surprising. I remember at the time, I think I've said this before here, I had the thought, like,
16:58
if Sam Altman just quit being the president of YC to go join OpenAI,
17:03
how can I justify coming into work tomorrow?
17:06
Like, why would I not also go to OpenAI? Whatever OpenAI is. I knew nothing at the time. But all I know is, if one of the smartest guys in the world went and did that,
17:16
maybe I should go do that too. And honestly, that would have been the greatest career move I ever made if I had done that because,
17:22
you know, OpenAI quickly, you know, changed its structure from nonprofit to becoming, you know, like, a capped for-profit thing and became worth, you know, nearly a hundred billion dollars in a short period of time, creating ChatGPT, the fastest-growing product of all time, getting to a hundred million users faster than anything. And let's go to his partners, because when he started it, there's a bunch of weird stories that aren't entirely out in the open, but it seems like there was something where Elon Musk funded the business, or helped fund the business, at first. He invested forty million dollars, and he also helped to recruit
17:56
and, convince
17:57
a few of Sam's co-founders to join. Is that right? That's right. So, basically, when people say, okay, great, Sam is this... you drop him on an island of cannibals, he comes out king. Okay. Cool. But what's his skill? What does he do? And basically, it was, like, anything about strategy or ambition.
18:13
This is what Paul Graham said. He said, if there was ever a question on strategy or ambition,
18:17
I default to what would Sam do?
18:20
And, you know, Paul Graham was, like, twenty years older than Sam saying that. So that's kind of impressive. So what did he do? He basically has a conversation with Elon Musk, and the two of them shared the same belief, which is that artificial intelligence was gonna be a big deal, that we would be able to one day realize AGI, which is artificial general intelligence.
18:39
Think about it as,
18:41
you know, superintelligence, when,
18:45
you know,
18:46
AI is generally smarter than humans at kind of everything, you know, most of the things that we try to do.
18:53
So they both said, that's also kinda scary, because we don't know what's gonna happen on the other side of AGI.
18:58
So we should fund research
19:00
in a nonprofit that will,
19:03
study, you know, fund the research around this so that we could safely create AGI, instead of having AGI that kills us all like in the movies.
19:10
So what Sam did was, he hosts this dinner in Palo Alto. And, you know, he gets Elon Musk there, and he gets a few other smart, key people. One of them was this guy Greg, who was at the time the CTO of Stripe, which, before OpenAI, was the
19:24
most impactful big startup in the world. So he gets the CTO of Stripe. He gets this guy, Ilya, who's,
19:31
like the lead research,
19:33
scientist in AI at Google. Essentially, he's kinda like one of the founding fathers of AI and deep learning.
19:40
And so they get a bunch of these people around the table. Reid Hoffman is there. And they're basically like, we need to do this. Elon basically just agrees to fund forty million dollars into the nonprofit.
19:51
So Elon puts his money where his mouth is. Sam,
19:54
you know, helps recruit, puts the whole deal together, brings the people together. On the way home, Sam's like, hey, Greg, let me give you a ride back to San Francisco. So they drive back to San Francisco, and what Greg says and what Sam says is, it's an hour-long drive from Palo Alto to San Francisco, and for the first thirty minutes, Greg just asked me rapid fire, like, a hundred fifty questions.
20:14
Then at the thirty-minute point, he goes, okay, I've decided I'm gonna do this. I'll quit Stripe and I'll join OpenAI.
20:21
Whatever this thing is that you're creating, I will join it.
20:24
So the next thirty minutes, they started making their plan. Like, okay, what are the first things we need to do? Which tells you something about Greg, to make, again, a crazy career move like that, to, you know, just have independent-mindedness and conviction in something that quickly.
20:38
And tells you something about Sam's persuasiveness and ability to, like, put this thing together.
20:43
Then, what Elon says is that he recruited Ilya out of Google, which was very, very hard to do. And he says it's the reason that him and Larry Page, founder of Google, don't talk anymore.
20:52
You know, he used to be friends with Larry. Famously, Elon didn't own a house for a period of time, and he would just crash at friends' houses. And so he would crash at Larry Page's house all the time. He says that they got into arguments about AI,
21:04
and that he was really worried about AI and Larry was like tech forward and was like, no. It's gonna be amazing.
21:10
And,
21:12
and then he says that, you know, he got pissed when Larry called him a speciesist,
21:16
basically, like, you're just all about humans, and Elon was like, what the fuck else am I supposed to be about? Are you not? That scares me even more. I'm gonna create OpenAI now to, like, defend against Google, which owned DeepMind, the number one player at the time. And so
21:33
Elon and Sam basically poach Ilya out of Google, which took a lot of money and a lot of persuasion, I believe, to get him out. And when he joins OpenAI, the research community, who worships this guy, basically
21:47
followed, and they got all the top researchers to join. And that is the founding story of OpenAI.
21:51
That's amazing. Right? I mean, this is, like, movie drama shit, just that dinner, that drive home. I mean, this is perfect. And we're gonna have to have Justin Timberlake star as one of these guys. And so this brings us to, I think, the second character that we should talk about. This guy Greg, who I believe is the unsung hero in this whole thing, because
22:14
Sam Altman
22:15
is kind of like the hero here. Right? He's the CEO.
22:21
He gets backstabbed. He gets wronged. He's the martyr. He had to suffer.
22:26
People come out and they're all supporting Sam and sharing. Like, it's like a funeral for Sam on Twitter the other day. It's like, oh, I remember that once Sam responded to my customer service request. Like, oh, I was on a bike once and Sam, you know, he moved over in the lane so I could go by. Right? Like, everybody was just sharing their favorite Sam memory. And then nobody's really talking about Greg. Now Greg, in my opinion,
22:47
shifted this whole fucking thing. Because
22:50
at first, it was like, damn, I don't know what Sam did. Must have been horrible, blah, blah, blah. As soon as Greg came out and was like, yo, based on this, I quit. And they're like, Greg's the number two. He's on the board. He was the chairman of the board, I think. They're like, if Greg doesn't think this was right,
23:07
I don't think it's right. And so
23:09
Greg single-handedly shifted the perception, the narrative, around this whole thing. And the board didn't counteract that. And so I think Greg was the real hero of this. And then what I did was,
23:20
there's some amazing blog posts that anybody who's a real nerd about this stuff, if you just really love kinda like the lore, the canon of Silicon Valley, you gotta go read. Greg has blog posts up on his blog about
23:34
the early days of OpenAI.
23:37
And,
23:38
And he blogged the early days of OpenAI, what, last week?
23:43
No. It was eight years ago. This thing started a long time ago. And he talks about, like, they didn't have an office. He's like, alright, just come to my apartment. And he's like, here's the four of us in our apartment.
23:52
He's like, I didn't know anything about machine learning.
23:55
So I did two things. Number one, I asked Ilya, what is the number one textbook about machine learning? I bought it and I studied it religiously. He's like, I read everything there was to know about deep learning and machine learning. He goes... and this is the former CTO of Stripe. He's not like a nontechnical dude. He's known as, like, this
24:13
incredible one of these ten x or hundred x engineers.
24:17
And,
24:18
he says, I didn't know anything about deep learning. So I just dedicated myself to go read, you know, page one, chapter one of the book, and I'm gonna take notes and I'm gonna ask a ton of questions. He goes, two, while these guys were setting up their research stuff, I did everything else. Like, oh, your back looks a little uncomfortable, you need a pillow? Oh, you guys thirsty? I'll go get smoothies. And he's doing literally all the, like, intern grunt work those first few months to get them set up. He's like, and it basically was alternating: the researchers would do something, then the researchers would get blocked. And I would say, oh, what's blocking you? They're like, it's taking forever to run this training set, or, I have to, like, organize this data. And then Greg would stay up all night and basically, like, do the engineering, because people don't realize engineering and coding is very different than AI research.
25:02
And so he would do the coding that would make the researchers go faster.
25:05
He built basically all of the infrastructure
25:08
so that researchers could do great work.
25:10
And,
25:12
and even with GPT, like, there was a story, like, you know, they ran the model and the output was kinda, like, so-so.
25:19
And then Greg, like, locked himself in a room for two weeks, and he came out and suddenly GPT worked. And we were like, what the hell just happened? He's like, check this out. And, you know, I did that. And so he has these great blog posts and photos of those early days that I thought was kind of amazing. So Greg was born in eighty nine, which puts him around thirty four years old now.
25:39
He joined Stripe in two thousand ten. He became CTO, so he was CTO of Stripe when he was about twenty-four, twenty-five years old. And then he left Stripe to co-found OpenAI. So he was only twenty-six, twenty-seven when he started
25:54
working on OpenAI. It's pretty amazing. These guys are freaks.
25:58
Unreal. It's
25:59
unreal. And by the way, you forgot to give a grade. So, Sam Altman, he's gonna win no matter what in all of this. Alright. So I think he was gonna get, like, let's say,
26:10
a B plus. Right? He was gonna get a B plus. Why? Because
26:13
he got fired. He got backstabbed by his own people. That's not great. Not a great thing to happen. Two,
26:20
they haven't said what you did, but you did something. Right? They weren't just gonna fire you for nothing. So, like, you did something. We don't know what it is. It's unknown. Okay. I'll give you an A for that.
26:31
It looked like he was gonna get the job back, which would have been great, but then he didn't. It didn't happen in the end. So even though you got some support from your team, it didn't happen,
26:41
until the Microsoft thing happened,
26:43
which was literally like
26:45
Imagine playing a game of chess.
26:48
And you're like, oh, I think I got this guy. You know, I have a five piece to three advantage here. His queen is out.
26:55
And then you look up from the board, and he's holding, you know, a Glock to your head. And that's what happened to Ilya here. Basically, it's like,
27:03
now he's at Microsoft.
27:05
They have the license to all of the technology.
27:08
They have all the training data in the world because guess what? Ninety percent of computers are PCs.
27:12
that run, you know, Microsoft Windows.
27:15
He has all the funding in the world. He got paid. I'm sure there's, you know, like,
27:20
Satya had to bring
27:22
Sam and Greg in before the stock market opened on Monday, just to kind of not have this be an issue for Microsoft.
27:29
That they got screwed here, that they put a bunch of money into a lame duck. And so he got everything. He got the tech. Six hundred of seven hundred people are saying they're gonna follow him there. He got the team. He got the money.
27:42
He got the funding.
27:43
And he got it all in forty-eight hours. And they were basically like, yeah, they're gonna lead a new area of Microsoft that's gonna do research on AI. It's like, oh, you mean OpenAI part two? So now he has to walk away with an A plus. Right? Like, you can't knock that. Well, and he's got the good guy label, unless they come out with some new information that shows that he did something effed up, which, something has to come out. There's only three options, in my opinion, of how the story for Sam ends. Option A is it's an unforgivable sin. I think that is
28:16
not likely, but it is possible that what he did is just, well, you're done.
28:20
Option B is he goes back to OpenAI, becomes CEO. I think that was more likely to not happen, I think. But option C is he stays at Microsoft, and I think in a decade, or five years, he becomes the CEO of Microsoft,
28:35
Which is now worth five trillion dollars or something at that point. Yeah. Right now today.
28:41
Yeah. So he's gonna turn out okay. So now this other guy, Ilya.
28:45
He looks like a schmuck here. So he's a loser here. He gets an F. There's a little bit of likability with him, where it's like, oh, you're just a brilliant scientist, artist,
28:58
who was,
29:00
you were manipulated here, because you're just a scientist and you don't understand human interaction. So if he plays... So we should explain one thing, which is, what are the possibilities of what Sam did that might put Sam in the wrong, or Ilya in the right? So there is what you're calling the unforgivable sin, which I would say, by the way,
29:17
there's not a great track record of anything being unforgivable. But let's say the really bad personal stuff could be one reason why. In which case, I would say,
29:27
that's a completely fair reason to remove him. If they did an investigation and found out that that was true,
29:32
then they're in the right for removing him.
29:35
That's defensible, but they just haven't said anything.
29:38
Reason number two is
29:42
a safety concern. So Ilya, famously, like, the reason he's doing this is literally
29:47
he's worried about AGI. He doesn't want the world to end.
29:50
And,
29:52
what initially people thought was.
29:54
Sam says all the right things about safety, but also Sam is like, you know,
29:59
Capitalist.
30:00
He's an entrepreneur. He's a hard-charging guy, and he's running forward here.
30:08
And maybe they're saying slow down, and he's saying, no, we don't need to slow down. And they're saying, Sam, this is too risky, and he's saying, no, no, no, this is not risky, we gotta move forward. That's what it looked like the situation was. And it was made crazier by the fact that, I don't know if you saw this, he did an interview the day before,
30:24
and he said something like this. He said
30:26
I've been lucky to be in the room four times now. The last one was just a couple weeks ago,
30:32
where I saw something.
30:34
That
30:35
pushed the veil of ignorance back and the frontier of technology forward. Like, it pushed the veil of ignorance away,
30:43
the frontier of technology forward. So basically, he's saying, like, there's been, like, three or four mind-blowing moments,
30:49
you know, about AI. I've been lucky enough to be in the room. The last one happened a couple weeks ago.
30:54
Basically, you'll see this. It's gonna stun people.
30:57
And so some people are like, yo, did they like
30:59
create the monster, superintelligence? Like, did they create AGI? You know, has AGI been achieved internally?
31:05
And he, like, randomly posted on Reddit, not long ago. AGI has been achieved internally as like a joke, I guess.
31:12
So some people said, maybe they've stumbled into something, and this guy's going too fast. And it's, you know, like the scientists, you know, around the nuclear bomb saying, no. No. No. No. We gotta, like, destroy this thing or we gotta, like, slow down. This is getting too crazy. That would have been the other reason, which would have been, again, defensible,
31:28
especially given that OpenAI's
31:30
charter specifically says, we're not about increasing shareholder value. We are about,
31:36
you know, safely building AGI to benefit all of humanity. So that is their charter. And if they thought this guy's moving too fast and putting that at risk, I would say they are completely in the right to do this, if they believe that to be true. The bad part is, again, they've come out with no evidence that that is the case. They haven't explained themselves. So nobody believes that that's true.
31:54
The third reason. So, you know, you have the really bad thing. You have the safety argument.
31:59
And the third thing is, essentially, it's just a power play. They're annoyed by him. They don't feel like he listens. They feel like he's getting all the credit or he wants fame and fortune and
32:10
power play. Let's get this guy out of here, and we get to run and own this thing. And that's, like, a jealousy-driven thing. And I don't know which of those is the most true. I don't know what you think, but, you know, it seems like
32:23
You know, none of those are perfect explanations.
32:26
I think
32:27
I'll say what I think maybe later about it. It's number three, I think. I think it's number three.
32:33
Let's come back to the board, but let's talk about Emmett here, because you are one of the handful of people, or, you know, however many people, who have had a relationship with him and understand him. And so last night, at one AM,
32:48
Emmett basically said, I'm the guy that's been picked. He put out this really good tweet where he explained what he was doing. The funniest part, I thought, was he's had the job now for twelve minutes,
32:59
and he did a wonderful job of saying the word "our." He goes, our products are going to be this. Our team is going to do this. We are going... and it reminded me of that meme where there's a guy who goes, hey, I made this. And he hands it to a stick figure. And the stick figure goes, you made this? And then the first guy leaves, and the stick figure looks at the thing and he goes, I made this.
33:21
And it's the whole point about the internet, where people steal stuff. I actually think Emmett's a great guy. So I'm just messing around. But I thought it was amazing how he's, like, twelve minutes later, our company is going to do this. What do you think about Emmett in this role?
33:35
He's been getting a lot of criticism because he previously shared online that he's like, I actually think we need to slow things down, and a bunch of people are like, this is ridiculous, don't slow down, whatever. But he seems like a really thoughtful, great CEO.
33:49
Am I wrong? So Emmett, you're right, I worked with Emmett for about two years. Got to sit with him, you know,
33:56
in hundreds of meetings and see, you know, how he thinks, and, you know, go out with him, have beers, and just kind of talk to him. He came on the podcast. I talked to him for an hour there about
34:05
AI stuff. You always said he was really thoughtful.
34:08
He's extremely intelligent
34:10
in a broad way. So,
34:12
most people
34:13
are not that intelligent. You meet some people that are, like, you know, above average. Then you meet some people,
34:20
as you say, where the oven burns hotter. And it's just very clear, like, he's just got more horsepower in his head than most people. I think that's great. I think, you know, as much as we like to say that, like, hard work is what matters... like, yeah, you know what else matters?
34:34
Intelligence is also a really strong attribute. It's like, you know, Steph Curry's got great skills, but, like, LeBron is six-eight, two-fifty. Right? Like, size and speed matter.
34:45
I hate when people downplay intelligence. And I'm like, guys, let's just admit some people are born smarter, just like some people are seven feet tall. There's a reason why twenty percent of Americans who are seven feet tall and below the age of thirty are in the NBA. You're just better. You're better at that particular thing.
35:00
Same with intelligence. Height helps you with basketball. Intelligence helps you with running companies. Right? Like, that's a thing.
35:07
So
35:08
Emmett is extremely intelligent. He is amongst the most intelligent people I've ever met.
35:12
So that's the that's the first thing.
35:15
Second thing is, he's got depth and breadth. So a lot of people are really, really intelligent in one specific domain. It seems like this guy Ilya might be one of those, where he's just extremely, extremely technically intelligent,
35:26
but it seems like in the game of, you know,
35:29
soft power and politics and, you know, people and all that, he, you know, had a pretty boneheaded
35:37
way of doing this that didn't work out. Per his own tweet: I deeply regret what I did. It was handled very poorly. So, you know, some people get really, really smart in one difficult domain,
35:48
but they don't have a breadth of intelligence.
35:51
I would say that Emmett
35:53
is wide and deep. So he's technically very strong. He was a great engineer. They built Twitch, one of the, you know, best live streaming products. And it was him and this guy Kyle. Kyle went on to build self-driving cars. And Emmett ran Twitch for, like, whatever, eighteen years or some shit like that.
36:07
So, you know, between Justin.tv and Twitch. And so
36:11
he's very good technically, which is important, because this is a deeply technical product. He, you know, he said one thing to me when I was talking about AI. He's like, yeah, like, once you understand how transformers work... and I was like, well, let's stop there, because I have no idea how transformers work. So, like, the rest of your sentence... Because he said it like, yeah, it's easy, I mean, just once you get how transformers work, it's like, you know, blah, blah, blah, anybody could go watch a video. We've had a couple smart people on this pod, and they've tried to use analogies, because I'd be like, I don't understand what you're saying, and they've used an analogy. And I'm like, can you please use an analogy for your analogy? Like, I think we had a biologist on here, and he was, like, referring to the battle of the river Thames in World War Two in order to, like, describe crypto. And I'm, like,
36:54
guy, I don't know what the battle of the Thames was. Like, this doesn't make sense to me. Right. Right. Exactly.
37:00
You know, I'm asking for a lunch bowl over here, and you're making gourmet.
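Since the hosts tap out at "I have no idea how transformers work," here's a minimal sketch of the core idea, scaled dot-product self-attention, in Python with NumPy. The shapes and numbers are illustrative only, nothing here is from OpenAI's actual models:

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the max for numerical stability before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Each token's query is compared against every token's key;
    # the resulting weights mix the value vectors together. This
    # mixing step is the heart of a transformer layer.
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # (seq, seq) similarity matrix
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V                   # weighted blend of value vectors

# Toy example: a 4-token sequence with 8-dimensional embeddings.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = attention(x, x, x)                 # self-attention: Q = K = V = x
print(out.shape)                         # (4, 8)
```

The one-line takeaway: every token scores how relevant every other token is to it and blends in their information accordingly. Stacking that mixing step with feed-forward layers is, at a high level, what a transformer is.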
37:04
Let's do this right. What's his soft skills like?
37:08
Well, his soft skills, I would say, when I experienced him, which was, like, you know, he'd been running the company for over a decade
37:15
as CEO.
37:17
I thought he was good, not great. So for example, I think he always knew what to do. Like if you caught him
37:24
when he was calm, he would give great advice on the people side of how to handle something.
37:29
However, in the heat of the moment,
37:31
he was prone to falling into, like, debates about semantics, about words,
37:37
about data that would make people feel like shit. And he would like win the battle and lose the war type of thing with people. And so I don't think he was great with that.
37:45
But I also think
37:47
Twitch was, like, kind of a community product with a bunch of very, like, sort of woke people at the company that were, like, very touchy-feely about stuff.
37:56
You know, I think OpenAI is gonna be his type of people. You know, like, Emmett was better in meetings where all the people were engineers and product people. He was worse the more that the people were, you know,
38:07
legal, PR, comms,
38:10
you know,
38:12
marketing,
38:13
you know, those meetings went worse for him. I thought.
38:16
And so I think he'll have less of that here. But he also is, like, analogy guy. Like, he'll reference the Battle of the Bulge. Like, he'll reference, you know,
38:25
well, the judicial system in Ireland back in the fifteen hundreds, what they did was really interesting. It's like, dude, why do you know this? He's like, well, I've read about all the judicial systems. I was very interested in how governing has changed over time. It's like, that's your casual reading? He's like, yeah, like, I'll just... I don't know. Is that weird? And it's like, you know somebody's really good when they don't even realize that what they do is, like, really fucking weird. And so he has a very, very wide set of knowledge. Now
38:53
he's getting kind of labeled as a decel. I don't know if you've seen this, like, a decelerationist, meaning, like, somebody who wants to slow things down. Because as soon as his name got out... Basically, a suit.
39:03
Yeah. So he's kinda like a suit or, like, a doomer
39:06
you know, like, oh, this guy's anti-technology, which is pretty silly. It's, like, you know, if he could be technology, he would. You know, like, he's a technologist through and through. They're calling the guy who invented Twitch a boomer, basically.
39:20
Yeah. It's stupid. And so,
39:23
you know, a mutual friend described it well. They go, because people dug up all his old tweets, and he said, if OpenAI is at a ten right now in terms of speed, you know, it should probably be at, like, a one or two, because, you know, you just don't want this cat to get out of the bag. And so,
39:39
I wanna read you what he,
39:42
I have the transcript of what he said on our podcast,
39:45
which
39:46
when he came on about AI. So I asked him very simply. I said, is AI gonna kill us all? And he goes,
39:53
maybe?
39:54
I go, it's interesting you say maybe it's gonna kill us all, because, like, you're pretty, like, pro-technology and everything. You're an optimist.
40:01
And he goes, well, it's because I'm an optimist that I'm scared. If I was a pessimist, if I was like, oh, this AI thing's overhyped
40:08
The technology's not that impressive. It's just a nice little magic trick. There's nothing real underneath it. If I was pessimistic, I would be less worried. I'm optimistic. I am extremely impressed with how these work, and I see a pathway where this is gonna get better very fast. And at some point, it's gonna be able to self-improve. That's very scary, because we don't know what happens on the other side of that. He's like, it's because I'm optimistic. And the second thing he said was, he goes, here's the analogy.
40:30
You know, in biology, there's now, like, ways that you can synthetically
40:35
do things. We know we can edit genes, for example. And we can, like, add an extra leg to a sheep. We can, like, modify a virus. This was, like, you know, COVID. Right? Like, you could modify a virus and make it more
40:47
spreadable. And we all saw with COVID how dangerous that capability is, that, like, this thing, you know, if it leaked from a lab, that's really, really bad. Like, you know, that was a big mistake
40:57
that technology made that, like,
40:59
you know, fucked up the world for a few years. And so he's like,
41:03
you know, we have regulation. I think everybody kind of agrees. We should maybe
41:07
not go super fast and make it easy for somebody to, like, print smallpox in their home. Right? Like, we should maybe not give that capability to everybody. And maybe we should, like, slow down, make sure there's some oversight, so that we don't do something really bad that, like, creates a super virus.
41:21
He's like,
41:22
people get that when it comes to biology. They don't really get that when it comes to AI. He's like, with AI,
41:27
here's the thing. AI is really good at, like, the one thing you need in order to be really good at developing AI. He's like, AI
41:35
can code.
41:37
It can write code. It can read everything there is to know about a subject in an afternoon
41:41
become an absolute world expert at it, and then it could translate that into action. It could translate that into writing code. It could also design chips. Yeah. It doesn't do either of those two things perfectly right now. It's not amazing at writing code.
41:53
And it's not amazing at designing chips.
41:56
But it can do them. It's within the zone of things that AI can be good at, the type of things AI can be good at, which means that over time, it's gonna be able to write code that designs AI and, you know, design chips that make AI better.
42:10
And once that thing can improve itself, it's gonna get really amazing really fast, faster than anyone can expect.
42:17
And maybe that's a good thing. Maybe it all works out, but also maybe it doesn't. And that's, like, even if I think the probability of a really bad thing happening is low,
42:27
the consequences are so bad. We're not just talking about, like, a virus that kind of, like, screws up your computer, or, like, you know, even, like, a pandemic that gets people sick. We're talking about, he says, like, we could destroy all of the value in the light cone, which, I don't even know what the light cone is, but
42:46
I don't want that.
42:47
Right? So he's describing, like, even if it's low probability, we should just be really, really careful about that and, like, make sure we don't screw the whole thing up.
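Emmett's "once it can improve itself" worry is basically a compounding argument. A toy loop, with completely invented numbers, shows why a system whose improvement rate grows with its own capability outruns a fixed exponential:

```python
# Toy model of recursive self-improvement (all numbers invented).
# Each "generation", the system improves by a rate that itself grows
# with capability -- that feedback loop is what bends the curve upward
# faster than ordinary compound growth.
capability = 1.0
rate = 0.05                      # initial 5% improvement per generation
for gen in range(1, 21):
    capability *= (1 + rate)     # apply this generation's improvement
    rate = 0.05 * capability     # better systems improve themselves faster
    print(f"gen {gen:2d}: capability {capability:8.2f}, rate {rate:6.2%}")
```

Nothing here is a forecast; the five percent figure and the feedback rule are made up. The point is only that feedback on the growth rate itself is what makes things get "really amazing really fast, faster than anyone can expect."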
42:54
And a friend
42:57
reached out with one thing. They go, everyone's saying, like, Emmett's gonna wanna, like, slow this down and stop it. He's like, no, no. Emmett's smart. Emmett knows the game theory of this, the prisoner's dilemma. If you go too slow,
43:07
somebody else will create AGI before you, because you slowed down to zero. Somebody will get there before you who may not share your
43:16
moral beliefs around safety.
43:18
So you can't go so slow that other people get there first who don't share your safety views, but you can't go so fast that you don't actually build it safely. There's a middle ground, you see. And so I think that's the best description of this, where people are trying to label him as one thing, and, knowing what I know about him, that's just not true.
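The "game theory of this" the friend is describing is the textbook prisoner's dilemma. A tiny payoff table, with invented numbers, makes the point concrete: slowing down unilaterally is the dominated move, which is why the argument lands on a middle speed rather than a full stop:

```python
# Toy two-lab AI race framed as a prisoner's dilemma (payoffs invented).
# Each lab picks "slow" (careful) or "fast" (race ahead).
# Values are (my_payoff, their_payoff) from the row player's view.
payoffs = {
    ("slow", "slow"): (3, 3),   # everyone careful: best shared outcome
    ("slow", "fast"): (0, 4),   # I slow down, rival gets AGI first: worst for me
    ("fast", "slow"): (4, 0),   # I race ahead while the rival is careful
    ("fast", "fast"): (1, 1),   # full-speed race: risky for everyone
}

for my_move in ("slow", "fast"):
    for their_move in ("slow", "fast"):
        mine, _ = payoffs[(my_move, their_move)]
        print(f"I go {my_move}, they go {their_move}: my payoff {mine}")

# "fast" beats "slow" for each lab individually (4 > 3 and 1 > 0),
# even though (slow, slow) beats (fast, fast) for both -- hence the
# middle ground Emmett is described as aiming for.
```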
43:34
Well, let's go, like, through rapid fire of, like, some other
43:38
story points. We'll make that quick. And then also what we think is gonna happen. One quick story point for you. If you'd like, let's do final grades before we do stories. Final grades. So the board, what do you grade the board? Oh, they're fucking losers. The worst. They look horrible. They're gonna get an F. Like, if F stands for fucking loser.
43:58
Yeah.
43:59
Yeah. Is there an FL? They're the worst. I mean, it doesn't matter what the truth is. It's all about perception, and they look like the worst. Including
44:07
Adam D'Angelo, the founder of Quora. Right now, he looks horrible.
44:11
So F, big F. Do you agree? Yeah. Agreed. Emmett, what do you give Emmett? He's twelve minutes into the job here. TBC.
44:19
He's actually, like, a D at the moment. He's getting dragged, I think,
44:25
which isn't fair. But, no, he looks like he's a loser here because of perception.
44:31
Right. I mean, he's taking the hero's job. Like, it's hard to look good regardless of how amazing he is. Right?
44:38
Right. And what about Satya, Microsoft?
44:42
He gets a B. He looks pretty good. I mean, he's the man. I mean, he's kinda made a resurgence for Microsoft. I think there's a real world where Bing actually catches up to Google in terms of search, and
44:57
Microsoft's gonna become just an even bigger company than they already are. So, yeah, I mean, him and Sam are the masterminds. I gotta give him an A plus. The dude,
45:07
you know, mobilized over the weekend.
45:09
And before the markets opened and could hit Microsoft stock,
45:14
he basically acquired OpenAI
45:17
for negative one hundred billion dollars.
45:21
He acqui-hired OpenAI,
45:24
which is incredible.
45:26
And so now, man, he got OpenAI out of the nonprofit, capped-profit structure. It's just super for-profit inside Microsoft. He got the key talent, you know, Sam, Greg, all of the key people, six hundred of the key people.
45:42
And now they just own the whole thing instead of forty nine percent.
45:45
Like, that's
45:47
that's an A plus. He crushed it on this one. He's the man.
45:51
So let's do the storylines.
45:53
Yeah. Here's one quick storyline. Have you heard of Fairchild Semiconductor? Do you know what that is? So I've heard of it, but I'm not a history guy. So what's the story? Isn't there something like the traitorous eight or something? What is that? Yeah. So, basically, there was a big company called, I think it was called Shockley Semiconductor,
46:10
and they made semiconductors,
46:12
computer chips, in the sixties in Silicon Valley.
46:16
People were pissed off at the guy who owned it, William Shockley, I think his name is. He was brilliant, but just a pain in the ass. And so these eight guys said, hey, you gotta bail or we're bailing. And the guy goes, see you. And so these eight people went and started Fairchild, which was basically
46:30
the equivalent of, like, Nvidia,
46:32
but back then, in nineteen sixty. The first Silicon Valley. Right? That was the seed round of Silicon Valley. Yeah. And their culture was famous because they were casual. So it was, like, where nerds could come and be casual, and where, like, the best idea actually wins. And that was where the whole, like, culture of Silicon Valley arguably started. And then the offshoot
46:55
of Fairchild Semiconductor was basically
46:57
Intel.
46:58
So Intel was founded by a couple of the founders of Fairchild.
47:02
Andy Grove, the famous manager, I think he was the CEO of Intel. He came from there.
47:08
The founders of Sequoia Capital,
47:10
Kleiner Perkins, I think, might have spun out. I mean, like, basically, there's just all these people. But the story of the traitorous eight, as they're called, is very similar to this story. So if you're interested in this type of stuff, you could actually go and read that book, and you could see a lot of similarities with what is going on now. So this is, like, our version of that. And so it's gonna end well, I think, for Greg and Sam, because it ended really nicely for the traitorous eight. So that's an interesting way... My prediction is that they unwind this whole thing and they revert back to OpenAI, by the way. That's my, like... And so it hasn't happened yet, but I think the final step, now that Ilya signed the pledge being like, I quit, I'm also gonna go work there, it's like, what is the point of even having the OpenAI entity,
47:53
If six hundred of the people, including all of the leaders
47:56
are gone.
47:58
I don't think they want to be inside Microsoft necessarily. So I think what's No. No. No.
48:04
I think this was the final straw, and I bet by the end of the day today,
48:09
they get a whoopsie-daisy undo
48:12
on the whole thing. And Sam is back as CEO of OpenAI, and the board quits, and they get a new board. That's my prediction, by the way. Anything else doesn't make sense. I believe it. That prediction, I think, is the same for me. I think there's long-tail predictions here, which is, new startups aren't gonna want to have boards of directors.
48:31
I actually think that's the wrong takeaway. Brian Halligan from HubSpot, he wrote, he's like, no, the takeaway shouldn't be don't have a board. The takeaway should be have a good board.
48:39
And I think so. But I think that potentially, like, the governance structure of companies might actually change significantly
48:46
because people are gonna have PTSD from just hearing this story. And that actually is gonna have downstream effects that are quite large. And in a weird way, it could even impact, like, the venture capital industry. I think that is
49:01
a real possibility.
49:02
I totally agree, and I think it's so stupid. So right now, there's so many startups
49:07
that are reading their governing documents about the board and thinking through, do I need to change my board?
49:14
It is my public service announcement.
49:17
Don't worry, bro. You're not OpenAI. You're not Sam Altman.
49:21
And your board is also not these schmucks either.
49:24
And nobody cares. And, like, you know, for a startup, like, don't worry about your governance structure right now. Like,
49:31
you should be doing the standard vanilla things.
49:34
of, you know, put competent people there and then don't worry about it too much. Like, you need product-market fit. You don't have this problem. In the same way that, like, when you hear what's going on with, like, Uber or Facebook, it's like,
49:46
you're not Uber or Facebook. You don't have the same problems that they have. Don't
49:51
suddenly start spending your energy over there and take your eye off the actual main problem: you need product-market fit. You need growth.
49:57
I think the takeaway here is OpenAI's
50:00
board was stupid. I'd say that was actually a strategic mistake, it seems like, on Sam Altman's part. Also, I think the whole nonprofit shtick,
50:08
that was stupid too.
50:10
TBD if I'm gonna be right on that, but as of now, look, it would have been a lot simpler, maybe, if this was just a board, a normal board, where they have actual equity.
50:20
They started with Elon Musk, you know, Reid Hoffman, and all these, like, legit players on the board. And then, one by one, they all quit. So basically, like, Elon quit because they weren't listening to him, and he took his ball and tried to run home. Reid Hoffman quit because he started a competitor. Another guy quit because he wants to run for president. Another person quit. Yeah. Basically, five of the original board members quit for different reasons, either starting competitors or conflict of interest or running for president or whatever. And,
50:49
this is what they got left with. Yeah. So hopefully that gets changed.
50:53
Any other takeaways? What were you saying about how this was the best and the worst of us? Yeah, that was kind of my big-picture takeaway. Like, this was the best of Silicon Valley and the worst of Silicon Valley, all in the same weekend.
51:05
You know, the best I thought was things like, you know, what Greg did
51:09
When he was like, yo,
51:11
I'm out too. I quit, even though
51:15
they wanted to retain him. They weren't firing him.
51:17
It's like, yeah, that's a real co-founder. Right? That's
51:20
a ride or die, ducks fly together sort of moment. Totally. I thought that was
51:25
That was badass. I thought that was the best of us. And then I thought the worst of us was, like, this, like,
51:31
stupid board that has no skin in the game, that doesn't know how to communicate, that didn't, like, handle this well. Just fumbling and bumbling and removing a founder. Removing basically the greatest founder alive, you know, outside of Elon Musk, from the company that they created
51:44
for what seems like no reason. And if there is a reason, you gotta, like,
51:49
say it. Even if you say it in a way that doesn't get you sued. Like, you know, maybe that's why they're not saying anything, but you have to at least tell your team. You have to give more than what they did. They should have handled it better than they did. And if it was so bad, then don't negotiate with him to come back. Right? Like,
52:05
if it was that bad, then he shouldn't have twenty four hours later been in discussions to come back.
52:10
You know, so the board, I thought, was the worst of us. I thought, you know, Twitter was alive and popping, and it really did feel like it was the Silicon Valley group chat. That was kind of the best of us. And the worst of it was, like, you know, it felt like Gossip Girl all weekend. And,
52:25
that's kinda lame. And there was a bunch of, you know, like, speculation, wasted energy.
52:29
You know, like, I saw these Twitter Spaces going on for hours and hours of people just, like, discussing, like, you know, like it was a Real Housewives episode and they can't believe that Jenna said that. Right? Like, it just seemed like gossip. And I thought that was a waste of time. You know, I thought there were a whole bunch of things where kind of the best and the worst showed themselves in the same weekend. You know, maybe,
52:50
you know, it could be that there actually is a legitimate,
52:53
you know, crazy safety concern, and, like, this guy was going too fast. And, you know, this Ilya guy was maybe the hero, pumping the brakes even though it
53:01
screwed up his own economics.
53:04
If he really was standing for something, right? Like, it could be that it was that, or it could be that it was a jealousy power play to take him out. Right? We don't know, but I just thought there was a whole bunch of best and worst. Another one was, like, just this instant labeling into these, like, tribes. Like, I don't know if you've seen this, like, e/acc thing. Is that the e/acc and decel thing? Yeah. Basically, there's a whole bunch of people with e/acc in their Twitter bio. I'd seen this for months and had no idea. I didn't even know what the hell this was. But it's e/acc, as in you're an effective
53:34
accelerationist.
53:35
Right? So it's basically, like, kind of a play on the effective
53:40
altruism
53:41
or whatever, that Sam Bankman-Fried was doing? That's pretty stupid, I think. This one is basically, like, pedal to the metal, you know, tech forward, progress forward, technology optimist.
53:51
And it's like, that's the cool club to be in. And then anybody who's, like, saying, hey, slow down? You're a decel. Which is, like, you know,
53:59
it's the d-word, basically. It's a slur. It's an insult.
54:03
And to me, this was, again, the worst of us. It's like, people in Silicon Valley make fun of politics. It's like, oh, I'm not left or right, it's not that simple, it's not Republican, Democrat. I don't fall under these stupid party, you know, tribes that are just blindly, you know, following the herd. And it's like, we're doing the same thing.
54:21
This e/acc, decel shit
54:23
is the same thing as Democrat, Republican, red, blue, left, right.
54:27
It's an oversimplification,
54:29
of,
54:30
a massive oversimplification
54:32
of what's going on. And so I thought that was kind of the worst of us too.
54:36
Final prediction.
54:37
My prediction is in the next twenty four hours, all this is gonna be
54:41
undone,
54:42
and things will be back to normal. That's my prediction. Is that yours too? Maybe we'll give it forty eight hours, but I think at this rate, twenty four hours.
54:49
When I was in second grade, my class had a field trip,
54:53
And,
54:54
they told the parents, hey, you need twenty bucks to come on this field trip. So my mom gives me twenty dollars in the morning and I go, and I'm excited about the field trip. I don't think I'd ever been on a field trip. It was my first.
55:03
And they're like, hey, okay, put your stuff over here. Give me your signed waiver and put your twenty bucks here.
55:09
And I'm, like, looking around. I'm, like, where did my twenty bucks go? It was right here. I'm, like, this is my desk. It's not here. I'm looking under the table, on the desk. I checked my backpack.
55:17
I'm like, what the hell? Somebody took my twenty bucks.
55:20
I can't go on the field trip now. And so I go to the teacher.
55:24
And I'm like,
55:26
Miss, like,
55:27
somebody stole my twenty dollars. And she's like, oh my god,
55:31
stealing. Like, this is stealing money from a kid. Like, that's not okay.
55:35
And she's like, okay, you're sure? I'm like, yeah, it was right there. And so,
55:39
she
55:40
turns the world upside down. She, like, blows the whistle. She's like, everybody sit down.
55:46
Who has the money?
55:48
And it's just silence. Nobody raises their hand. And she's like, okay.
55:52
We'll try this a different way. I'm gonna put a jar outside, and we're gonna walk out one by one. And I want the twenty dollars to be in the jar by the time this is done. No harm, no foul.
56:01
Everybody walks out. Everybody walks back in. She comes back. Jar's empty. And she's pissed now. She's like, okay, I gave you a chance to do this clean.
56:10
I gave you a chance to do this publicly.
56:12
Now we're not going on the field trip, guys. Guess what?
56:15
And,
56:17
And I'm like, oh my god. This is getting the field trip canceled. This is crazy. Not just for our class. She tells the teacher next door. She's like,
56:24
We didn't earn this, because, you know, one of our core values has been betrayed here. And where was the twenty dollars?
56:32
And then I'm like, I gotta go to the bathroom. I get up, and I'm like, miss, can I go to the bathroom? She said, yes, sure. I get up.
56:37
And I feel this, like, scratchy feeling in my sock. And I'm like, I reach down, and I'm like, oh, I put the twenty bucks in my sock this morning? That's right.
56:51
But
56:52
here I am. Like a drug dealer, dude. That's what he did. Here I am. And now,
56:56
I'm like, what do I do?
56:58
I'm like, the class is gonna be so mad at me if they realize this was in my sock the whole time. The teacher's gonna be so mad at me if she realizes that I accidentally did this. I didn't mean for all this to happen.
57:10
And so I went to the restroom.
57:12
I took the twenty bucks.
57:13
put it in the toilet, and I flushed the toilet, and I went back to my seat.
57:18
Did you really? Did you really do that?
57:21
Is that really how that ended? No. That made for a better story, but I did hand it back to her, and then we went on the field trip. I thought it'd be a better story, but anyway. That thing did happen with the sock, and it wasted half of the field trip day. And that's how I think Ilya feels right now, where he's like,
57:38
I did not mean for all of this stuff. And, like,
57:42
this got
57:44
way bigger than I thought this was gonna get. And,
57:48
Whoops. Can I just, like, give you this twenty bucks and we pretend this never happened? And that's what he's trying to do right now. And, you know, more power to him. I've been there. I feel you.
57:57
Alright. Well, we're gonna see what happens.
58:00
That's the pod.
58:02