1
00:00:00,000 --> 00:00:17,320
Welcome to the Free Cities podcast. My name is Timothy Allen and this is the official podcast

2
00:00:17,320 --> 00:00:19,020
of the Free Cities Foundation.

3
00:00:30,000 --> 00:00:36,340
Hello and welcome to this episode number 158 of the Free Cities podcast.

4
00:00:37,760 --> 00:00:45,340
Yes, hello indeed and a very warm welcome from here, the hills of the Welsh borders.

5
00:00:46,400 --> 00:00:51,840
As I look outside the window, winter has finally hit for real this morning.

6
00:00:51,840 --> 00:01:01,060
We had an epic frost and as a result I've given myself a bit of a treat to help record this intro.

7
00:01:01,700 --> 00:01:13,100
We know here that the cold season has definitely arrived when we have to go out into the fields at daybreak and smash the ice in the animals' water troughs.

8
00:01:14,040 --> 00:01:17,860
Some of you may know this experience. That was my duty this morning.

9
00:01:17,860 --> 00:01:25,040
and when I got back, my eldest daughter has a temperature. She was out late last night looking

10
00:01:25,040 --> 00:01:32,580
after the horses, so she's off from school, asleep in her room upstairs. And you know what? I decided

11
00:01:32,580 --> 00:01:40,560
to grab my laptop and a couple of mics, move into the house, and I've set up a makeshift office

12
00:01:40,560 --> 00:01:44,460
right here in front of our ancient fireplace.

13
00:01:45,980 --> 00:01:47,280
And it's great.

14
00:01:47,520 --> 00:01:49,800
I'll give you a little bit of a fireplace hit.

15
00:01:50,140 --> 00:01:50,880
Here we go.

16
00:01:51,520 --> 00:01:52,640
There, listen to that.

17
00:01:53,740 --> 00:01:55,300
It's cold outside.

18
00:01:55,420 --> 00:01:56,860
It's lovely in here, right?

19
00:01:57,120 --> 00:02:01,320
Well, I certainly am a very happy man right now, to be honest.

20
00:02:02,180 --> 00:02:06,160
I'm very easy to please, as I think many men are.

21
00:02:06,160 --> 00:02:09,640
In my case, a wood fire sometimes is pretty much all it takes.

22
00:02:10,560 --> 00:02:18,580
Maybe add a cigar to that and possibly a glass of something like, I don't know, a tawny port at this time of the year.

23
00:02:18,800 --> 00:02:21,720
That would be good, but, you know, not at 10 a.m.

24
00:02:22,080 --> 00:02:23,900
I'm happy peering into the flames.

25
00:02:24,500 --> 00:02:25,340
And there's the dog.

26
00:02:25,440 --> 00:02:26,380
That didn't take long.

27
00:02:26,720 --> 00:02:27,340
She's in here.

28
00:02:28,020 --> 00:02:29,060
She's curled up.

29
00:02:29,440 --> 00:02:29,620
Right.

30
00:02:30,880 --> 00:02:31,940
Now I'm set.

31
00:02:31,940 --> 00:02:32,380
Right.

32
00:02:32,520 --> 00:02:39,960
Talking of the weather and coldness, today's episode is about a place that exists in a part of the world,

33
00:02:39,960 --> 00:02:46,200
the climate of which you could almost say is diametrically opposed to here in the Welsh

34
00:02:46,200 --> 00:02:53,260
mountains. That's right, it's Honduras, and today's and next week's episodes will both be about

35
00:02:53,260 --> 00:03:00,640
Honduran free cities, since the Honduran general elections will be happening in a couple of days,

36
00:03:00,640 --> 00:03:06,360
on Sunday, that's the 30th of November, depending on when you're listening to this.

37
00:03:06,360 --> 00:03:10,720
So this week I'll be speaking with Trey Goff,

38
00:03:10,720 --> 00:03:13,620
who is the Chief of Staff at Prospera

39
00:03:13,620 --> 00:03:17,080
also one of the longest serving employees

40
00:03:17,080 --> 00:03:20,020
at that particular free city, I think.

41
00:03:20,020 --> 00:03:24,080
And then next week, after the election result has come out,

42
00:03:24,080 --> 00:03:27,680
I'll be publishing an interview I did with Massimo Mazzone

43
00:03:27,680 --> 00:03:32,160
who is the founder and owner of Ciudad Morazan

44
00:03:32,160 --> 00:03:36,340
which is the, in inverted commas, other free city in Honduras.

45
00:03:36,460 --> 00:03:39,020
There are more, but those are the two well-known ones.

46
00:03:39,760 --> 00:03:47,800
Yes, Ciudad Morazan, the free city that journalists doing hit pieces never seem to want to visit or mention.

47
00:03:48,580 --> 00:03:50,160
Somehow it doesn't fit their narrative.

48
00:03:50,420 --> 00:03:57,880
Anyway, the outcome of the election will obviously have a massive impact on the ZEDEs:

49
00:03:57,880 --> 00:04:04,080
expected policy direction, potential impacts for governance stability, investment climate,

50
00:04:04,080 --> 00:04:10,420
and just the day-to-day operations of the ZEDEs. So here's a quick lowdown on who's in the elections

51
00:04:10,420 --> 00:04:18,740
and who we want to win and who we don't. If you go and take a look at Polymarket,

52
00:04:18,740 --> 00:04:26,940
you can see that polling suggests a kind of three-way race. You've got Salvador Nasralla, who's

53
00:04:26,940 --> 00:04:35,080
the Liberal Party candidate, generally favourable towards ZEDEs. They're pro-business centrists. Then you've

54
00:04:35,080 --> 00:04:43,560
got Rixi Moncada, who is the Libre Party candidate. They're unfavourable. They have supported efforts to

55
00:04:43,560 --> 00:04:51,740
repeal the ZEDEs' legal framework. And then finally in the race, you have Nasri Asfura from the National

56
00:04:51,740 --> 00:04:58,780
Party. They are favourable, supporting investment stability and economic zones. So obviously,

57
00:04:58,780 --> 00:05:05,820
the implications for Prospera and Ciudad Morazan: a favourable outcome would be Nasralla or Asfura winning;

58
00:05:05,820 --> 00:05:12,480
then we get likely continuity and improvement in the national political environment for ZEDEs.

59
00:05:12,480 --> 00:05:19,740
And a Moncada win? Well, obviously continued political pressure on the ZEDEs. However, even

60
00:05:19,740 --> 00:05:26,880
if they do win, Prospera has shown remarkable resilience in the face of government opposition

61
00:05:26,880 --> 00:05:34,720
and maintains a 50-year legal stability guarantee. And in fact, recently the zone has grown from 100

62
00:05:34,720 --> 00:05:42,880
to 400 companies in the past year alone. So I think either way Prospera is going to win; it's just

63
00:05:42,880 --> 00:05:50,040
they win harder if we get a favorable government. Right, let's get on to the episode. But of course,

64
00:05:50,480 --> 00:05:56,180
I forgot to mention this episode is brought to you by Veritas Villages. They're building off-grid,

65
00:05:56,340 --> 00:06:01,980
freedom-oriented communities in Latin America. And these communities should appeal to you if

66
00:06:01,980 --> 00:06:08,000
you're a freedom lover looking for a great plan B option, or in fact, as many have chosen, a plan

67
00:06:08,000 --> 00:06:14,580
A option, a home. For Bitcoiners amongst you, you can buy your property with Bitcoin and you can

68
00:06:14,580 --> 00:06:21,180
spend Bitcoin at the communities. They can even help you mine Bitcoin with the excess energy that your

69
00:06:21,180 --> 00:06:29,160
home will be producing. www.veritasvillages.com forward slash free cities for more info. And

70
00:06:29,160 --> 00:06:34,700
please mention the Free Cities podcast if you choose to do business with them. Right. Today's

71
00:06:34,700 --> 00:06:43,320
episode: Trey Goff. And to be honest, my conversation with Trey was possibly one of my favourites from

72
00:06:43,320 --> 00:06:51,380
Prague. It was a great mix of information and speculation and just some really deep

73
00:06:51,380 --> 00:06:56,780
sort of philosophical stuff. Here's the AI summary from Fountain: in this episode of the podcast, Trey

74
00:06:56,780 --> 00:07:03,720
explains some of Prospera's history and current state while also diving into some deep philosophical

75
00:07:03,720 --> 00:07:09,960
thoughts on the future impact of AI. Key topics: Prospera, free cities, Silicon Valley oligarchs,

76
00:07:10,260 --> 00:07:16,660
regulation, artificial intelligence, personhood, consciousness, technology, capitalism,

77
00:07:17,280 --> 00:07:23,900
existentialism. It's a good one. In fact, it's jam-packed. Updates from Roatan and then

78
00:07:23,900 --> 00:07:29,540
a fascinating conversation about AI and the future. One that was actually only cut short

79
00:07:29,540 --> 00:07:31,700
by my very busy schedule in Prague,

80
00:07:32,200 --> 00:07:34,280
but one that will surely continue

81
00:07:34,280 --> 00:07:36,560
when I next make it out to Prospera,

82
00:07:36,640 --> 00:07:38,800
which may be sooner rather than later.

83
00:07:38,940 --> 00:07:40,660
But even if it's not sooner,

84
00:07:41,020 --> 00:07:41,960
it will obviously be

85
00:07:41,960 --> 00:07:42,980
when the Free Cities Conference

86
00:07:42,980 --> 00:07:45,400
is held in Prospera next year.

87
00:07:45,840 --> 00:07:47,720
September the 3rd to the 6th, 2026.

88
00:07:48,060 --> 00:07:49,160
Secure your tickets now

89
00:07:49,160 --> 00:07:52,120
at freecitiesconference.com.

90
00:07:52,460 --> 00:07:54,440
That's freecitiesconference.com.

91
00:07:54,440 --> 00:07:55,260
And by the way,

92
00:07:55,460 --> 00:07:57,140
if you're thinking of

93
00:07:57,140 --> 00:07:59,440
becoming a resident of Prospera,

94
00:07:59,540 --> 00:08:06,380
or starting a business in Prospera, you can use the Free Cities Podcast's referral link, which will

95
00:08:06,380 --> 00:08:28,598
help us out in a very small way. That is at prospera forward slash r forward slash free cities. It's all in the show notes. Click on it, start your business, and thank me later. Every little bit helps. Right, thank you, subscribers on Fountain. Please consider doing the same yourself

96
00:08:28,598 --> 00:08:33,478
if you're into the value-for-value platforms. Six bucks a month for bonus episodes and more,

97
00:08:33,478 --> 00:08:41,998
and the good vibes. And you know we absolutely and wholeheartedly appreciate your support. If

98
00:08:41,998 --> 00:08:47,658
you're not into that, a quick review on Apple or Spotify only takes a moment and it really makes

99
00:08:47,658 --> 00:08:55,218
a difference. Right, time to relax and enjoy the episode. Here you go, have some more of my fireplace

100
00:08:55,218 --> 00:09:04,818
in the background. The sounds will help you drift away as you sit back, dog at feet, steaming cup of

101
00:09:04,818 --> 00:09:14,898
tea on the little occasional table next to your armchair. Relax and enjoy my conversation with Trey

102
00:09:14,898 --> 00:09:15,698
Goff.

103
00:09:15,698 --> 00:09:36,798
Like I was saying, the first time we met was the first Prague conference

104
00:09:36,798 --> 00:09:42,958
three years ago. Yeah, and I made a film. I remember that. Yep, yep. And you were great for

105
00:09:42,958 --> 00:09:46,998
soundbites, I remember, because we used you at the very end of it. I remember that. Yeah, I can't,

106
00:09:46,998 --> 00:09:51,058
I could probably remember what you said, but whatever you said was like perfect for the

107
00:09:51,058 --> 00:09:58,918
ending of the film. And it was, I can't even remember, it was just, it was like

108
00:09:58,918 --> 00:10:03,558
if you would distill a soundbite down to its most perfect essence, it came out of your mouth.

109
00:10:03,558 --> 00:10:12,298
So thank you for that, for starters. But yeah, so you were

110
00:10:12,298 --> 00:10:16,498
at Prospera then as well. I think you're quite a long, long-time Prosperian.

111
00:10:16,498 --> 00:10:21,698
I was actually the first like full-time employee legally. Now, this is a bit of a technical point

112
00:10:21,698 --> 00:10:27,158
because, you know, the founder, CEO, Eric was already working on the project. Of course,

113
00:10:27,478 --> 00:10:31,678
there were a few other guys that were also working on the project, but they were not employees. They

114
00:10:31,678 --> 00:10:35,778
were early investors and board members and they were just, you know, giving their time basically

115
00:10:35,778 --> 00:10:40,978
to work on it. So I was the actual first, like, literal employee of this project

116
00:10:40,978 --> 00:10:46,158
specifically. So I've been around since 2017. Wow. Actually, it's just reminded me, I've got

117
00:10:46,158 --> 00:10:50,258
to ask you about this thing because I've been asking everyone who's ever been connected with

118
00:10:50,258 --> 00:10:57,418
Prospera, and I even asked Massimo yesterday. But the other day an article came out, which was

119
00:10:58,078 --> 00:11:03,438
on a site that I think is pretty cool, Whitney Webb's website. I don't know if you know who she is.

120
00:11:04,258 --> 00:11:08,338
Now she's a really, she's a cool independent journalist, millions of followers, whatever,

121
00:11:08,338 --> 00:11:15,858
but she syndicated an article. I'll read you the title. It was called City

122
00:11:15,858 --> 00:11:23,658
States Without Limits, and it was basically a kind of conspiratorial thing that a powerful

123
00:11:23,658 --> 00:11:28,718
group of Silicon Valley oligarchs are basically pushing for the construction of privatized city

124
00:11:28,718 --> 00:11:34,698
states aligned with, you know, a global network, and they want to move away from nation states and stuff.

125
00:11:34,698 --> 00:11:56,778
And it's huge. I mean, it's a huge thing I read through. It's like one of those things that joins the dots between so many things. There was a lot about Peter Thiel. There was a lot about all the initial investors in Prospera. They got a lot of the Prospera stuff. It was a bit outdated. They talked a lot about the Crawfish Rock stuff and about taking land. I know, I know. It was really annoying.

126
00:11:56,778 --> 00:12:05,238
But, I mean, like I say, I kind of respect the outlet that it was on,

127
00:12:05,238 --> 00:12:09,718
and I reached out to the guy and he's not getting back to me. He must be seeing

128
00:12:09,718 --> 00:12:12,878
my messages, because I'm saying, come on the pod, dude, I'd like to talk about this,

129
00:12:12,878 --> 00:12:17,018
not least because I think you've got a few things wrong, but I just want to know more about this.

130
00:12:17,018 --> 00:12:21,358
You know, like, how do you join all these dots together? And,

131
00:12:21,358 --> 00:12:26,178
you know, I've spoken to a few people about it. Like, when I said it to Massimo, he's like,

132
00:12:26,178 --> 00:12:31,458
well, yeah, it's kind of true, I suppose. You know, like, they don't like the state,

133
00:12:31,458 --> 00:12:36,398
they have got a lot of money. But it was just the conspiratorial side of it which

134
00:12:36,398 --> 00:12:42,558
I found a little bit disturbing. Because, you know, there are a lot

135
00:12:42,558 --> 00:12:50,458
of incentives to create free cities, of course. And what I'm worried about is that in some future

136
00:12:50,458 --> 00:12:55,358
that we all end up in, because I think you at Prospera have already gone through this a lot,

137
00:12:55,358 --> 00:13:01,718
in a kind of neocolonialism way, you know, like, oh, you're taking land, you're doing

138
00:13:01,718 --> 00:13:07,658
this. And there will be another one, I think, and it will be this, it will be tech billionaires

139
00:13:07,658 --> 00:13:12,698
who want to isolate themselves, who want to be, they all, they all believe in a new,

140
00:13:12,798 --> 00:13:15,898
in a feudal system, and they want to do this, that, and the other, and blah, blah, blah.

141
00:13:16,518 --> 00:13:22,218
What do you think of that? I think it's all very entertaining and funny, because it is so

142
00:13:22,218 --> 00:13:28,258
diametrically opposed to like how things actually work and what actually happens that reading these

143
00:13:28,258 --> 00:13:34,418
it occurs to me, feels like reading a fanfic about what we do. It's like, oh cool, somebody got

144
00:13:34,418 --> 00:13:38,598
really creative. Like, wow, this would be a much more entertaining story than

145
00:13:38,598 --> 00:13:44,058
the reality that happened, which is just years of brutal grinding, basically. Of course, it goes

146
00:13:44,058 --> 00:13:48,758
without saying, but just to be clear, all of the claims about, you know, land being taken from

147
00:13:48,758 --> 00:13:54,198
anyone are just completely made up. In fact, Bloomberg wrote a piece, of course a hit piece,

148
00:13:54,198 --> 00:13:56,678
that claimed the same thing about a year ago or something. We sued them and won.

149
00:13:57,338 --> 00:13:58,098
Did you sue them?

150
00:13:58,138 --> 00:14:02,178
Yes, we sued them and got a retraction, an apology letter, the whole nine yards.

151
00:14:02,558 --> 00:14:05,998
Right. There you go. I didn't hear that it was a retraction.

152
00:14:06,278 --> 00:14:13,278
Yeah, we didn't kind of push this too much for a couple of reasons. But yeah, we've got one from

153
00:14:13,278 --> 00:14:17,298
Bloomberg. And because of that, now we tell every outlet that reaches out to us with these kind of

154
00:14:17,298 --> 00:14:22,618
hit pieces that, you know, you will face the same fate if you, you know, parrot the same lies

155
00:14:22,618 --> 00:14:27,378
effectively. So I'm going to get to your specific point in a second, but this is a fun kind of small

156
00:14:27,378 --> 00:14:31,138
tangent, if you will, that I think people will find interesting. Here's how the

157
00:14:31,138 --> 00:14:34,818
process works of a lot of these hit pieces being written. And I'm not speaking to the specific

158
00:14:34,818 --> 00:14:41,358
post you're referencing because I haven't read it. But in general, a journalist always is, you know,

159
00:14:41,518 --> 00:14:46,458
kind of ideologically opposed to what we do more or less. And so by definition, they're coming at it,

160
00:14:46,458 --> 00:14:52,078
you know, incredibly aggressively and negatively. And it's not from some, you know, good-hearted,

161
00:14:52,238 --> 00:14:57,478
let's hold, you know, truth to power kind of thing. It is just an ideological kind of axe to

162
00:14:57,478 --> 00:15:03,258
grind effectively. And what they'll do is they'll go read the other hit pieces that have been written

163
00:15:03,258 --> 00:15:08,438
about it. They then use those, Wikipedia does the same thing. They'll then use those as a basis for

164
00:15:08,438 --> 00:15:12,478
the claims in their article, right? So if they want to make a sensational claim for which they

165
00:15:12,478 --> 00:15:16,358
have no primary evidence, well, they can just point to another outlet that seems, you know,

166
00:15:16,358 --> 00:15:19,858
superficially plausible, facially plausible, and then say, hey, these people said this,

167
00:15:19,958 --> 00:15:23,258
and then now it's part of their article. So this is how these things kind of get propagated over

168
00:15:23,258 --> 00:15:29,998
time. And then they do a great job of trying to find like specific people that will occur to a

169
00:15:29,998 --> 00:15:36,518
lay reader as reputable, and also this person hates what we do, so that they can get a, like, just

170
00:15:36,518 --> 00:15:41,418
potent, you know, negative quote, basically. Okay. And here's how we typically find out about

171
00:15:41,418 --> 00:15:45,438
these things. I'm going to read you as an example. I'll keep the outlet name away for now because

172
00:15:45,438 --> 00:15:50,498
Hopefully they don't publish this, but I'll give you like a perfect example of what this actually looks like.

173
00:15:50,758 --> 00:15:57,158
OK, so there's an outlet writing a piece about Prospera right now, a mainstream outlet.

174
00:15:57,638 --> 00:16:01,738
And this is the most recent communication from them.

175
00:16:01,958 --> 00:16:04,778
They said, hey, the piece will run toward the end of November.

176
00:16:05,038 --> 00:16:12,958
To add, I will include a criticism of Prospera from a respected academic that Prospera is, quote, a predatory project in a weak state.

177
00:16:13,178 --> 00:16:14,698
Do you have any comment? Thanks.

178
00:16:15,438 --> 00:16:18,398
I just read you verbatim the email from the journalist.

179
00:16:18,398 --> 00:16:20,398
I read you verbatim the email.

180
00:16:20,398 --> 00:16:21,398
So that's how that goes.

181
00:16:21,398 --> 00:16:22,938
I get like, you know, we get one of these a month or so.

182
00:16:22,938 --> 00:16:26,938
I wonder who the respected academic is.

183
00:16:26,938 --> 00:16:28,938
It sounds like a left winger by...

184
00:16:28,938 --> 00:16:29,938
100%.

185
00:16:29,938 --> 00:16:33,958
It's probably, there's just one academic who has a real axe to grind

186
00:16:33,958 --> 00:16:34,958
against.

187
00:16:34,958 --> 00:16:48,696
Her name is Beth Geglia. She's a PhD, I think an anthropologist. She's written a couple of times about Prospera with, you know, peer-reviewed articles that say very similar things. It's your normal kind of critical-theory-left, you know, sociologist and anthropologist type screed,

188
00:16:48,696 --> 00:16:53,216
basically. But she has the PhD after her name, so she's a reputable, air quotes,

189
00:16:53,216 --> 00:16:59,096
expert, right. Do they ever acknowledge the positives? Do they ever

190
00:16:59,096 --> 00:17:05,856
say, you know, like we think this, but if you're a free marketeer, you know, if you believe in sort

191
00:17:05,856 --> 00:17:12,556
of free market economics, then it's easy to argue that this has a massive effect. And here are the

192
00:17:12,556 --> 00:17:16,876
reasons why. And actually here's the proof, which is, you know. So they don't ever include the proof,

193
00:17:16,936 --> 00:17:19,936
but here's what they will do, which also backfires spectacularly almost every time.

194
00:17:20,456 --> 00:17:26,156
They will instead say something along the lines of, you know, this, you know, this project is a

195
00:17:26,156 --> 00:17:31,836
special economic zone with, you know, a lower regulatory burden and where people can do,

196
00:17:31,936 --> 00:17:36,336
you know, experimental biotech innovation, things like that, which you and I read like,

197
00:17:36,416 --> 00:17:41,476
oh, this sounds great. But they don't add and that's bad. You're supposed to assume the bad

198
00:17:41,476 --> 00:17:45,896
is implied from the way they frame it in the article. So what happens is one of the largest,

199
00:17:46,216 --> 00:17:50,056
it was a couple of years ago, but one of the largest single days of

200
00:17:50,056 --> 00:17:53,156
e-residency signups we ever had was when a giant hit piece was published against us. I think it was

201
00:17:53,156 --> 00:17:57,396
in the Financial Times maybe, some mainstream publication, because they did this throughout

202
00:17:57,396 --> 00:18:00,936
the article, right? So it was clearly framed the way they wrote about it as it was supposed to be

203
00:18:00,936 --> 00:18:04,976
negative, but they don't ever actually add, and that's bad because X, and then explain.

204
00:18:05,396 --> 00:18:09,316
You're just supposed to assume it's bad. So people that think like we do, that understand how the

205
00:18:09,316 --> 00:18:13,756
world actually works, read this and they go, well, this sounds awesome. I need to learn more about

206
00:18:13,756 --> 00:18:16,996
this. This sounds amazing. So it actually, I don't mind when they do that. That's quite helpful,

207
00:18:17,156 --> 00:18:21,076
actually. No, it's the, like, just literally making up lies that are infuriating.

208
00:18:21,076 --> 00:18:22,716
We had some pretty good press recently.

209
00:18:22,856 --> 00:18:26,056
I think, what was the one, the YouTubers that came?

210
00:18:26,156 --> 00:18:26,596
Yes Theory.

211
00:18:26,876 --> 00:18:27,216
Yes.

212
00:18:27,336 --> 00:18:38,616
And they almost framed it as, you know, correct me if I'm wrong, it looked like they came with an agenda and it was disproven.

213
00:18:38,716 --> 00:18:38,956
Correct.

214
00:18:39,056 --> 00:18:39,456
No, they did.

215
00:18:39,556 --> 00:18:39,716
Yeah.

216
00:18:39,956 --> 00:18:40,296
They did.

217
00:18:40,536 --> 00:18:46,816
So we're much more open and kind of amenable to, you know, alternative media like YouTubers like that.

218
00:18:46,816 --> 00:18:50,696
Because basically the only thing we ask them when they come is, obviously, you don't have to let

219
00:18:50,696 --> 00:18:55,576
us review your thing, you have full journalistic integrity. All we ask is that you let us show you

220
00:18:55,576 --> 00:19:00,416
the actual proof. Like, you know, let us show you around, let us answer your hard questions,

221
00:19:00,416 --> 00:19:05,056
your, you know, more aggressive questions, the skepticism you have. But let's just be

222
00:19:05,056 --> 00:19:08,796
honest about it, you know. And that's it. They came in just like that, which we're totally

223
00:19:08,796 --> 00:19:13,556
fine with. We're an open book to those types, because I know for a fact, given their prior history,

224
00:19:13,556 --> 00:19:16,636
they'll be fair, they'll tell the truth about whatever it is that they're kind of

225
00:19:16,636 --> 00:19:21,196
investigating and looking at. So those kind of alt media types we quite like. And getting back to

226
00:19:21,196 --> 00:19:25,736
your original point, that was just an aside. To be clear, I absolutely hate journalists. There's this

227
00:19:25,736 --> 00:19:34,096
fantastic book called The Journalist and the Murderer. It's a book about how a journalist

228
00:19:34,096 --> 00:19:41,096
was writing a long expose about a murderer and the way the journalist lied to and mistreated

229
00:19:41,096 --> 00:19:46,616
and otherwise abused and kind of defrauded the murderer made people end up sympathizing more

230
00:19:46,616 --> 00:19:49,856
with the murderer than the journalist. The journalist came out looking like the bad person.

231
00:19:50,496 --> 00:19:54,696
And it has, I won't quote it here in full because it's too long, but it has the single best opening

232
00:19:54,696 --> 00:19:59,856
paragraph of all time. It's absolutely phenomenal. So everyone should go read that. It's called the

233
00:19:59,856 --> 00:20:04,256
journalist and the murderer. It's phenomenal. So to your original point about, you know,

234
00:20:04,496 --> 00:20:10,536
this big Silicon Valley right wing conspiracy. Look, in some sense, I wish there were some truth

235
00:20:10,536 --> 00:20:13,836
to that because it would mean it would be much easier to raise capital and we would have a lot

236
00:20:13,836 --> 00:20:18,936
more resources and connections and things we could do to kind of help the world and to grow faster.

237
00:20:19,196 --> 00:20:24,956
Right. So the degree to which it's not true is like astonishing because it's basically

238
00:20:24,956 --> 00:20:28,556
diametrically opposed to the truth. You know, without going into details,

239
00:20:28,616 --> 00:20:32,776
we've tried to get some of these people, Peter Thiel being a great example to invest for years,

240
00:20:33,076 --> 00:20:37,356
years, never could get them over the hump. Like, they weren't interested, or

241
00:20:37,356 --> 00:20:43,096
it was too politically risky, whatever. I jokingly say this: with the round we raised

242
00:20:43,096 --> 00:20:48,176
earlier this year, we finally made the hit pieces come true because we finally actually got a very

243
00:20:48,176 --> 00:20:53,556
small investment, emphasis on small, from Peter Thiel's family office, finally, directly.

244
00:20:54,176 --> 00:20:57,216
So we finally made some of the hit pieces true. But that should give you an inkling of what I mean.

245
00:20:57,316 --> 00:21:01,916
It's not like there's a group of Silicon Valley billionaires and kind of tech oligarchs who,

246
00:21:02,096 --> 00:21:04,796
you know, I have a monthly catch-up call with and I'm like, all right, what would you like me to do

247
00:21:04,796 --> 00:21:10,016
now, sir? Or, you know, like, if we do that, you'll give me another $100 million? Amazing.

248
00:21:10,096 --> 00:21:12,196
I'll go do that right away. Literally none of this is true.

249
00:21:12,196 --> 00:21:17,336
I thought, in fact, in the article, he said the original money came from Peter Thiel.

250
00:21:17,476 --> 00:21:17,656
No.

251
00:21:17,896 --> 00:21:19,736
That is completely wrong.

252
00:21:20,016 --> 00:21:20,416
Palantir?

253
00:21:20,556 --> 00:21:20,696
Were they?

254
00:21:20,816 --> 00:21:21,216
No.

255
00:21:21,336 --> 00:21:22,916
They had zero involvement whatsoever.

256
00:21:23,696 --> 00:21:24,436
I wish.

257
00:21:24,496 --> 00:21:26,676
I would love to talk to the Palantir founders.

258
00:21:26,816 --> 00:21:28,856
I'm sure they would love what we're up to.

259
00:21:29,036 --> 00:21:29,396
That's right.

260
00:21:29,476 --> 00:21:29,876
But no.

261
00:21:30,376 --> 00:21:30,936
That's my point.

262
00:21:30,976 --> 00:21:34,256
This is just the exact opposite of the truth in every conceivable way.

263
00:21:34,656 --> 00:21:37,196
What people misconstrue is that, and this wasn't the first.

264
00:21:37,236 --> 00:21:39,756
This was the second or third round we raised when this happened.

265
00:21:40,376 --> 00:21:50,436
Patri Friedman, who's a good close ally and friend of mine, an ally of Prospera generally, he raised a venture capital fund of which Peter Thiel was a contributing LP, right, into the VC fund.

266
00:21:51,276 --> 00:21:56,976
Patri's fund was meant to invest in projects like Prospera, right, charter cities, free city projects, that sort of stuff.

267
00:21:57,536 --> 00:22:00,236
So, of course, we were the anchor investment for their fund.

268
00:22:00,356 --> 00:22:01,676
Like we were the first big investment they made.

269
00:22:02,056 --> 00:22:06,236
So people then took that to say, oh, Peter Thiel's directly invested in and controlling this project.

270
00:22:06,236 --> 00:22:08,176
But at that point, we had never even talked to the guy.

271
00:22:08,296 --> 00:22:08,676
We tried.

272
00:22:09,096 --> 00:22:10,656
We never even talked to the guy at all.

273
00:22:11,576 --> 00:22:14,336
So that's why I say when I read these, I'm like, oh, this is very entertaining fanfic.

274
00:22:14,416 --> 00:22:16,236
It's just so – it is so hard to raise capital.

275
00:22:16,836 --> 00:22:25,856
It is so hard to get people to understand what we're trying to do and believe in the idea, to understand this concept not of neo-feudalism or neo-colonialism or whatever.

276
00:22:25,856 --> 00:22:35,076
It's like much simpler and in many ways more boring than that, which is partnering with existing host nations, not fighting against them like some others in the space or castigating them or whatever.

277
00:22:35,076 --> 00:22:40,056
Like these people try their best given the incentive structures and institutional structures that they're faced with.

278
00:22:40,136 --> 00:22:51,676
And we come to these people and say, look, your country, whether it's from the United States to Honduras, anywhere, and say, look, you're held back in a number of ways by your legal and regulatory environment, your institutional environment.

279
00:22:52,156 --> 00:22:54,196
We don't innovate on the layer of governance anymore.

280
00:22:54,476 --> 00:23:03,596
It's very hard to reform the things you need to reform at nation scale, if not impossible. Nearly impossible is what I always say, with Argentina being the proof point for why it's nearly and not always.

281
00:23:03,596 --> 00:23:21,936
And so let's partner to create special jurisdictions where we work together to create a new legal, regulatory, and institutional environment that is much more amenable to entrepreneurship, innovation, and business while still protecting the public interest, the public good, protecting against negative externalities, things like that.

282
00:23:21,936 --> 00:23:38,376
Like most people don't realize, right, in anywhere where we operate or are going to operate, right, where we're trying to pass enabling legislation, whatever, a key part of the enabling legislation everywhere is an oversight committee that has to approve and review the provisions of your charter or your governance system, things like that, that are always appointed by the government, right?

283
00:23:38,376 --> 00:23:41,096
That's why I say in partnership with host nations, it's quite the opposite.

284
00:23:41,196 --> 00:23:54,616
We work hand in glove with them effectively because the objective is, right, create a center, a nucleus, a nexus of prosperity in the country so that everyone else in that country can see in Honduras, for example, right?

285
00:23:54,616 --> 00:23:59,596
They can see, oh, people in Honduras, Hondurans can run a robotic construction factory.

286
00:23:59,896 --> 00:24:00,896
We can do this in Honduras.

287
00:24:01,256 --> 00:24:05,556
It's not like there's some inherent limitation of the people that live there in some way, shape or form.

288
00:24:05,756 --> 00:24:07,076
It's the institutional environment.

289
00:24:07,076 --> 00:24:20,456
And the goal is to get them to realize, oh, I don't have to travel to the United States to make more money, to innovate, to do cool things that push the boundary and push humanity's technological frontier forward, right, and create prosperity for myself and my family.

290
00:24:20,896 --> 00:24:24,356
I can actually do this here because I can see people doing it there in my country right now.

291
00:24:24,656 --> 00:24:31,216
So it has massive spillover effects, but at a more foundational level, and this is hard to measure so people don't talk about it as much, but I think it's super important.

292
00:24:31,216 --> 00:24:34,576
As I kind of grow older, I recognize the importance of culture more and more.

293
00:24:35,076 --> 00:24:37,896
It can have a profound cultural shift over time, right?

294
00:24:37,936 --> 00:24:41,576
Where people just, it shifts that belief from people don't do those kinds of things here,

295
00:24:41,916 --> 00:24:46,876
whatever here may be, to, oh, well, I have visible physical proof that people do those

296
00:24:46,876 --> 00:24:50,396
kinds of things here, that people like me do those kinds of things here where I live.

297
00:24:50,836 --> 00:24:54,516
So it inspires these people to think bigger, to dream more.

298
00:24:54,616 --> 00:25:06,454
And then it creates a template that ideally, over time, the rest of the country, the actual national government, can now deploy. And instead of them saying, hey, we have this idea for a thing we want to do, they have a much easier political argument, right?

299
00:25:06,454 --> 00:25:08,814
They can say, hey, it's working in our partnership with Prospera.

300
00:25:09,014 --> 00:25:10,694
Let's do it nationwide, whatever the case may be.

301
00:25:10,814 --> 00:25:13,014
You know, maybe they change it a little, whatever, whatever the policy is.

302
00:25:13,434 --> 00:25:16,034
But it ends up being an example for the rest of the country to follow.

303
00:25:16,334 --> 00:25:19,174
And we know this can work because that's what happened in China, right?

304
00:25:19,174 --> 00:25:22,074
So Shenzhen became the first special economic zone in 1979.

305
00:25:22,074 --> 00:25:29,034
and it grew massively. It has over 12 million people now. At the same time, the population

306
00:25:29,034 --> 00:25:33,754
grew from roughly 300,000 to over 12 million. GDP per capita in real terms, inflation adjusted,

307
00:25:34,174 --> 00:25:38,914
grew from around 300 US dollars to right around 40,000 US dollars. So at the same time

308
00:25:38,914 --> 00:25:44,214
your denominator in that fraction is multiplying by several orders of magnitude, you also massively

309
00:25:44,214 --> 00:25:48,854
increased the numerator enough that the overall per capita number went up drastically in inflation

310
00:25:48,854 --> 00:25:54,674
adjusted terms. So China saw this working, right? And today, by the way, over 50% of China's exports

311
00:25:54,674 --> 00:26:00,754
come from their SEZs. All of their technological hubs and hubs of innovation are SEZs. There's 46

312
00:26:00,754 --> 00:26:05,654
of them in the country. They're responsible for a vast majority of the economic activity and growth

313
00:26:05,654 --> 00:26:10,214
in the country. So we know for a fact this can work in an authoritarian regime even, that they

314
00:26:10,214 --> 00:26:14,834
see undeniably that these principles of good governance work, because it's not complicated either.
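
As a quick back-of-the-envelope check of those Shenzhen numbers, here is a small sketch using the rounded figures quoted in the conversation (population roughly 300,000 to over 12 million, real GDP per capita roughly 300 to 40,000 US dollars):

```python
# Shenzhen growth figures as quoted in the conversation (rounded, illustrative only).
pop_start, pop_now = 300_000, 12_000_000   # population: the per-capita denominator
gdppc_start, gdppc_now = 300, 40_000       # real GDP per capita in USD

pop_mult = pop_now / pop_start             # population multiplied ~40x
gdppc_mult = gdppc_now / gdppc_start       # per-capita output multiplied ~133x
gdp_mult = pop_mult * gdppc_mult           # total real GDP multiplied ~5,300x

print(round(pop_mult), round(gdppc_mult), round(gdp_mult))
```

So even with the denominator growing 40-fold, total real output grew by more than three orders of magnitude, which is exactly the point being made about the per-capita number still rising drastically.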

315
00:26:14,834 --> 00:26:17,774
I said this yesterday, but just to reiterate, this stuff's not complicated.

316
00:26:18,194 --> 00:26:18,934
It is really simple.

317
00:26:19,214 --> 00:26:25,774
It is good, clear rules, fair, impartial, and just enforcement of those rules, fair and

318
00:26:25,774 --> 00:26:27,954
impartial and rapid adjudication of disputes, right?

319
00:26:27,974 --> 00:26:28,894
At a high level, that is it.

320
00:26:29,094 --> 00:26:29,954
We know what works.

321
00:26:29,994 --> 00:26:31,194
We know the formula for prosperity.

322
00:26:31,574 --> 00:26:35,054
One of my favorite economists, Douglass North, won the Nobel Prize for kind of pointing this

323
00:26:35,054 --> 00:26:37,734
out and helping invent the field of institutional economics in the 1990s.

324
00:26:38,034 --> 00:26:38,854
We've known this for a while.

325
00:26:39,274 --> 00:26:40,994
It's just implementing it that's incredibly difficult.

326
00:26:41,314 --> 00:26:44,534
So the way I like to think about Prospera, and the way I describe it to policymakers, because

327
00:26:44,534 --> 00:26:48,974
I do a lot of our policy work now, is exactly as I've described it to you here. Like, look,

328
00:26:49,014 --> 00:26:53,094
we know what works, guys. It's just, you and I both know, take the US example, right? Congress

329
00:26:53,094 --> 00:26:57,814
is dysfunctional. So because Congress can't do anything, here's another way, right? Let's do it

330
00:26:57,814 --> 00:27:01,274
at a zone level. Let's prove it works and then expand it from there and use that as a template

331
00:27:01,274 --> 00:27:05,454
to pass similar legislation nationwide. You know, nothing would make me happier, by the way,

332
00:27:05,794 --> 00:27:10,194
if we go somewhere, form a Prospera zone, right? Running on the Prospera governance platform.

333
00:27:10,194 --> 00:27:16,094
It is wildly successful, you know, drives economic growth, etc. And then the rest of the country just

334
00:27:16,094 --> 00:27:19,874
(this wouldn't happen, but like, hypothetically, right) the rest of the country copy-pastes everything

335
00:27:19,874 --> 00:27:25,874
we did. Amazing, like, you know, complete, total victory. Nothing would make

336
00:27:25,874 --> 00:27:29,514
me happier, because then the whole country could prosper the same way our jurisdiction is,

337
00:27:29,514 --> 00:27:34,474
right? And now we have an incentive to innovate more and do better, to continue to be able to grow

338
00:27:34,474 --> 00:27:39,094
faster than the rest of the country. So it's a dynamic process that incentivizes improvement.

339
00:27:39,094 --> 00:27:44,194
So it's a very long way of saying none of these rumors and allegations are even vaguely true.

340
00:27:44,294 --> 00:27:45,274
They're like the opposite of reality.

341
00:27:45,714 --> 00:27:47,694
I wish these billionaires were just handing me money for free.

342
00:27:47,754 --> 00:27:48,294
That would be awesome.

343
00:27:48,854 --> 00:27:50,094
It's incredibly hard to raise money.

344
00:27:50,794 --> 00:27:56,394
And it's not like they have interesting policy ideas or ideas for kind of the business model that we can implement or anything like that.

345
00:27:57,574 --> 00:28:00,674
I call this the myth of the helpful investor.

346
00:28:01,814 --> 00:28:02,954
Not to say anything bad.

347
00:28:03,014 --> 00:28:04,194
Some of our investors are genuinely helpful.

348
00:28:04,294 --> 00:28:07,894
I'm talking very specifically about these kind of Silicon Valley oligarch billionaire types.

349
00:28:07,894 --> 00:28:11,054
They're very hard to get to, and if you can at all, it's like, you know, a 20-minute meeting or

350
00:28:11,054 --> 00:28:16,074
something. So, like, none of this is true. It's so the opposite of how this works.

351
00:28:16,074 --> 00:28:21,114
I'm not even mad when I read these. I'm just like, wow, I wish there was a grain of

352
00:28:21,114 --> 00:28:26,494
truth here, even, because my job would be easier. It's not that I was mad. It's that I feel like

353
00:28:26,494 --> 00:28:32,974
these guys are on my side. I'm, you know, I connect the dots on a number of things.

354
00:28:32,974 --> 00:28:40,254
But, and so, that was what was annoying. It was like, come on, you know? Like, I'm working

355
00:28:40,254 --> 00:28:45,634
amongst this thing, and I believe in a lot of the things you do. Whitney Webb's pretty good, you

356
00:28:45,634 --> 00:28:51,574
know, not that she wrote the article, but she did syndicate it. But she's done some

357
00:28:51,574 --> 00:28:57,294
excellent journalism on all kinds of corruption around the world and things like that, though.

358
00:28:57,294 --> 00:29:01,774
But, yeah, I agree, especially talking about the SEZ version.

359
00:29:02,454 --> 00:29:05,274
I don't know whether you watched our film, did you, Zones of Progress?

360
00:29:05,494 --> 00:29:06,094
Oh, yeah, of course.

361
00:29:06,194 --> 00:29:06,634
It was amazing.

362
00:29:06,974 --> 00:29:07,834
I directed that.

363
00:29:08,134 --> 00:29:17,994
And during that process, I learned everything I needed to know about the process that countries can go through.

364
00:29:17,994 --> 00:29:26,134
And, you know, China, as you rightly point out, is by far the perfect example, because they couldn't roll out changes nationwide.

365
00:29:26,134 --> 00:29:29,834
You can't go from communism to a free market overnight.

366
00:29:29,994 --> 00:29:31,454
It would be absolute mayhem.

367
00:29:32,274 --> 00:29:35,634
So they used it as a sandbox and it worked, as you can see.

368
00:29:35,714 --> 00:29:48,754
And I think Dubai now, or the Emirates kind of in general, are another good example because they basically have become a kind of whole – the free zones have kind of just bled into the whole country.

369
00:29:48,994 --> 00:29:49,314
That's right.

370
00:29:49,474 --> 00:29:50,854
And that's the theory.

371
00:29:50,854 --> 00:29:57,554
I mean, the model works, and what you're doing at Prospera, obviously, is the next step.

372
00:29:58,474 --> 00:30:01,254
It's not rocket science when you look at it.

373
00:30:02,314 --> 00:30:05,794
I wish sometimes that people could see it.

374
00:30:05,834 --> 00:30:07,174
Actually, you just hit the nail on the head.

375
00:30:07,474 --> 00:30:09,454
It's not complicated at all.

376
00:30:10,054 --> 00:30:10,474
It's not.

377
00:30:10,714 --> 00:30:16,894
And I fell into that trap myself, thinking that there was something complicated about this at all.

378
00:30:16,954 --> 00:30:17,414
It's not.

379
00:30:17,414 --> 00:30:24,294
It's really people flocking towards freedom: economic freedom, physical freedom, whatever.

380
00:30:24,294 --> 00:30:30,174
They're flocking to a place where they can get done what they want to get done, and they're not,

381
00:30:30,174 --> 00:30:36,014
you know, Frankensteining technology. And, you know, this is the other thing I think that people

382
00:30:36,014 --> 00:30:42,494
really misunderstand about Prospera, is that you can't just do anything there. It's like,

383
00:30:42,494 --> 00:30:46,214
if you don't dig under the surface, you think, what, you can make your own

384
00:30:46,214 --> 00:30:51,974
legislation? What the fuck does that mean? So I can just do anything I want? No, you've got to prove,

385
00:30:51,974 --> 00:30:57,514
you've got to prove to an insurer that whatever you're doing is insurable. So you're

386
00:30:57,514 --> 00:31:03,454
beholden to insurers, which is, you know, in many ways worse than governments, possibly. I don't

387
00:31:03,454 --> 00:31:08,574
know. No, that's exactly right. So, two points here. The first is there's actually a, you know,

388
00:31:08,574 --> 00:31:12,314
filtering step, if you will, before you even get to the insurance companies, right? Which is, in the

389
00:31:12,314 --> 00:31:17,114
Prospera governance system, you have to first, if you're proposing a new regulatory framework,

390
00:31:17,214 --> 00:31:22,374
you have to propose that to the Prospera Council, which then has a charter-bound duty to review this,

391
00:31:22,614 --> 00:31:26,854
ensure that it is better than, like the bar to clear, is that it is better than any other

392
00:31:26,854 --> 00:31:30,114
available option that you could have otherwise picked, right? So first of all, that's a high

393
00:31:30,114 --> 00:31:35,294
bar to clear. Hang on, talking about that, but what if it's something really complicated? Because you

394
00:31:35,294 --> 00:31:40,894
must get a lot of experimental industries and stuff in there that probably are pushing the

395
00:31:40,894 --> 00:31:44,734
boundaries of all kinds of regulation, right? Totally. So we've had some proposals, like a few

396
00:31:44,734 --> 00:31:49,874
that have gone through, both in the medical space as well as in the finance space, both of

397
00:31:49,874 --> 00:31:55,674
which are incredibly complex fields, right? You need specific subject matter expertise. So in the

398
00:31:55,674 --> 00:32:00,054
finance world, for example, at the time when that proposal was being reviewed, we had that particular

399
00:32:00,054 --> 00:32:04,274
regulatory framework reviewed by, among other people, and kind of commented on and altered,

400
00:32:04,414 --> 00:32:09,374
actually, by, you know, Sharak Shah, who was the chief strategy officer and one of the kind of

401
00:32:09,374 --> 00:32:12,874
early executives and founders of the Dubai International Financial Center. So, you know,

402
00:32:12,914 --> 00:32:18,634
he's done this exact thing very successfully elsewhere. At the time, one of the executive

403
00:32:18,634 --> 00:32:23,214
directors of the FDIC in the United States was doing this for us as well, reviewing these things.

404
00:32:24,214 --> 00:32:28,274
You know, we'll bring in, in other words, very specific subject matter expertise on the things.

405
00:32:28,274 --> 00:32:34,234
On the medical side of the ledger, you know, we have a PhD in, you know, molecular biology

406
00:32:34,234 --> 00:32:38,654
and related fields, who works in the kind of policy space and will review these things.

407
00:32:38,654 --> 00:32:46,654
And that's after it gets the kind of first pass from me and Nick Geranius and the rest of the council, all of which in our own right have worked in public policy.

408
00:32:46,754 --> 00:32:50,434
That's what I've done my whole life basically even before Prospera.

409
00:32:50,634 --> 00:32:52,934
So we all have some knowledge and expertise there.

410
00:32:53,054 --> 00:32:58,774
And then when needed, if it's beyond the specific subject matter expertise of the council, we bring in third-party people.

411
00:32:58,774 --> 00:33:02,754
Because again, everything always comes back to incentives for me to some extent because

412
00:33:02,754 --> 00:33:07,874
people just, like – again, this isn't complicated, and people, I don't

413
00:33:07,874 --> 00:33:10,414
think, have the right mental models for this.

414
00:33:10,414 --> 00:33:15,734
Imagine the hypothetical world where you propose a regulatory system for the medical

415
00:33:15,734 --> 00:33:16,734
industry let's say.

416
00:33:16,734 --> 00:33:19,934
I'm going to make up a wild example that allows you to make genetic

417
00:33:19,992 --> 00:33:23,992
chimeras. So a genetic chimera would be like a human dog hybrid. They splice the two together

418
00:33:23,992 --> 00:33:27,852
and see what happens. Something like that. Just an abomination. Okay. And then imagine for a second

419
00:33:27,852 --> 00:33:31,612
that somehow, some way, this actually gets past the council. It never happened, but, like,

420
00:33:31,652 --> 00:33:35,972
just bear with me. Imagine it does. We've just nuked our business from orbit. Like we're done.

421
00:33:36,092 --> 00:33:40,372
We'll be bankrupt in six months or less because that's all everyone's going to talk about. Hey,

422
00:33:40,392 --> 00:33:44,152
those crazy guys are doing the crazy stuff we told you they were going to do. This is going to

423
00:33:44,152 --> 00:33:48,992
cause actual harm in a number of ways, because this can be the source of a bunch of completely

424
00:33:48,992 --> 00:33:56,312
novel biological warfare type problems, novel bacteria, viruses, et cetera. It can introduce

425
00:33:56,312 --> 00:34:02,212
a bunch of existential risk to humanity, actually. So we have an incentive from a pure business

426
00:34:02,212 --> 00:34:07,092
perspective. We want to stay in business and become profitable. Then the first thing you have

427
00:34:07,092 --> 00:34:12,652
to do is make sure your legal and regulatory environment is not insane because anyone's going

428
00:34:12,652 --> 00:34:16,432
to look at that and say, oh, I'm not coming. I'm not going there. Look at these crazy articles about

429
00:34:16,432 --> 00:34:20,372
these crazy guys. And in fact, I don't even want to be associated with that. Even if I'm in a

430
00:34:20,372 --> 00:34:24,072
normal industry, I'm not even in the medical industry. I don't want to tell my investors

431
00:34:24,072 --> 00:34:29,032
and potential business partners, hey, don't Google the place I'm at, please. It's fine. I promise.

432
00:34:29,652 --> 00:34:31,752
That's not a workable business model. Trust me, bro.

433
00:34:31,932 --> 00:34:36,632
Yeah, exactly. It's not a workable business model. So the incentives are what matter more

434
00:34:36,632 --> 00:34:43,672
than anything else. So there's a panel that it goes through, and then it goes to the insurers,

435
00:34:43,672 --> 00:34:47,352
who presumably do their own due diligence on it?

436
00:34:47,612 --> 00:34:47,772
Of course.

437
00:34:47,892 --> 00:34:48,072
Yeah.

438
00:34:48,372 --> 00:34:48,732
Absolutely.

439
00:34:49,012 --> 00:34:51,172
They don't take for granted what you've decided.

440
00:34:51,512 --> 00:34:52,372
They then, right.

441
00:34:52,592 --> 00:34:55,112
Because they can just refuse to provide the insurance.

442
00:34:55,292 --> 00:34:56,272
They can just say, no, I'm not doing that.

443
00:34:56,412 --> 00:34:58,772
I don't care what, I don't care if it passed the council or not.

444
00:34:58,792 --> 00:34:59,332
I'm not going to do that.
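
Purely as an illustrative sketch of the two review layers described above (council review against a better-than-alternatives bar, then independent insurer underwriting that can simply refuse), and with every name and field here invented for the example, the logic combines roughly like this:

```python
# Hypothetical sketch of the two-layer review described in the conversation.
# This is not Prospera's actual system; all names and fields are invented.

def council_review(proposal: dict) -> bool:
    # Charter-bound bar: the proposed framework must score better than the
    # best already-available option the proposer could otherwise have picked.
    return proposal["score"] > max(proposal["existing_options"])

def insurer_review(proposal: dict) -> bool:
    # Insurers do their own due diligence and can refuse to underwrite,
    # regardless of what the council decided.
    return proposal["insurable"]

def approved(proposal: dict) -> bool:
    # Both layers must clear for the framework to be usable.
    return council_review(proposal) and insurer_review(proposal)

p = {"score": 8, "existing_options": [5, 6, 7], "insurable": True}
print(approved(p))                           # clears both layers
print(approved({**p, "insurable": False}))   # insurer refusal blocks it
```

The point of the two layers, as described, is that neither approval alone is sufficient: the council filters first, and the insurer's veto is independent of the council's decision.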

445
00:34:59,712 --> 00:35:01,792
Can you give me an example of one that has passed recently

446
00:35:01,792 --> 00:35:02,652
that was quite innovative?

447
00:35:03,652 --> 00:35:04,012
Yes.

448
00:35:04,252 --> 00:35:08,912
So we call it, kind of informally, the Minicircle regulatory system

449
00:35:08,912 --> 00:35:11,112
because they were the ones that proposed it.

450
00:35:11,112 --> 00:35:14,492
This was a couple years ago, when Minicircle first started operating, but that's probably

451
00:35:14,492 --> 00:35:20,472
Why, what is Minicircle? So, Minicircle is a gene therapy company based in Prospera. This is another

452
00:35:20,472 --> 00:35:25,612
funny tie-in to our earlier conversation about Peter Thiel. They had received direct

453
00:35:25,612 --> 00:35:31,052
investment from Peter Thiel in their seed round. Not us, them. And they are working on, like,

454
00:35:31,052 --> 00:35:38,832
longevity gene therapies. They're applying this gene circle technology, basically. I don't know

455
00:35:38,832 --> 00:35:42,892
of that. Long story short, I'm about to tell you the limit of my knowledge here. So,

456
00:35:42,892 --> 00:35:47,312
if you ask me any more detail, I don't know. But my understanding of how it works is it's a

457
00:35:47,312 --> 00:35:54,192
fairly well proven and reversible vector through which to do targeted gene modification on specific

458
00:35:54,192 --> 00:35:58,372
proteins and cells in the body, basically. And it's reversible, critically. You can back

459
00:35:58,372 --> 00:36:01,712
it out, basically. You can inject an enzyme afterward that dissolves the minicircle.

460
00:36:02,412 --> 00:36:06,312
And therefore, it's gone. Like, there's no long-term effect. It's not doing what's called

461
00:36:06,312 --> 00:36:10,072
a germline gene edit, right? You can't do that in Prospera at all, actually, because it has, like,

462
00:36:10,072 --> 00:36:15,952
existential risk. A germline gene edit, okay, it means you're editing the actual, like all somatic

463
00:36:15,952 --> 00:36:19,852
cells in the body, and you're editing the genetic information that would be passed down to a child,

464
00:36:20,072 --> 00:36:24,112
right? It's the actual like full genetic lineage, basically. There's a different kind of gene

465
00:36:24,112 --> 00:36:29,072
modification you can do that temporarily modifies somatic cells to change the expression of a

466
00:36:29,072 --> 00:36:33,172
particular thing for a particular duration of time before that cell eventually replicates,

467
00:36:33,172 --> 00:36:35,952
And when it replicates, then it replicates back to normal state, basically.

468
00:36:36,152 --> 00:36:36,272
Okay.

469
00:36:36,332 --> 00:36:37,052
That's what they do.

470
00:36:37,172 --> 00:36:39,112
It's a reversible, temporary change.

471
00:36:39,392 --> 00:36:42,212
So if you like wanted to keep using their therapy indefinitely, you would have to get

472
00:36:42,212 --> 00:36:44,572
another treatment of it every like 90 days.

473
00:36:44,592 --> 00:36:45,732
I think it is 120 days.

474
00:36:46,692 --> 00:36:48,172
So it's a cool kind of therapy.

475
00:36:48,212 --> 00:36:53,832
And they proposed a novel regulatory system for this type of thing that basically in a

476
00:36:53,832 --> 00:36:56,652
nutshell, I'm oversimplifying here, but you'll get the gist of it.

477
00:36:56,932 --> 00:37:00,672
As long as you've gone through phase one equivalent trials.

478
00:37:00,672 --> 00:37:03,472
So in the US, the FDA system, you have phase one, phase two, phase three.

479
00:37:03,672 --> 00:37:04,612
Phase one is safety.

480
00:37:05,112 --> 00:37:08,452
So phase one is where you're testing to see like, you know, will this harm people?

481
00:37:08,812 --> 00:37:10,952
Does it cause any sort of adverse reaction?

482
00:37:10,992 --> 00:37:15,792
Maybe theoretically, from the kind of, you know, etiology and pharmacokinetics of

483
00:37:15,792 --> 00:37:16,812
the drug, it's not going to do anything.

484
00:37:17,212 --> 00:37:18,872
But you never know until you give it to people.

485
00:37:18,912 --> 00:37:19,612
The body's complicated.

486
00:37:20,312 --> 00:37:24,092
So it has to have gone through that process of proving beyond a shadow of a doubt that

487
00:37:24,092 --> 00:37:26,772
we know for certain that it's not going to harm people, right?

488
00:37:26,912 --> 00:37:28,852
There's not a bunch of negative adverse events.

489
00:37:28,852 --> 00:37:43,032
After that, then, okay, normally in the FDA process, you go through, like, further efficacy trials, which is phase two. And then phase three is efficacy as well as proving that it treats the specific disease you're trying to treat, within the FDA framework, better than other alternatives.

490
00:37:43,492 --> 00:37:44,852
So that's a crash course on how the FDA works.

491
00:37:44,852 --> 00:37:56,452
So this mini circle regulatory system basically says you've gone through phase one, then you can go to market so long as you have very, very clear free prior informed consent, including all of the possible tail risks.

492
00:37:56,892 --> 00:38:08,912
You know, the list of kind of warnings you get when you buy some products in the US, that type of thing, that it's very clear that you understand it, you know, that you've reviewed it, that you're actually cognizant of what it is that you're doing.

493
00:38:08,912 --> 00:38:28,172
But at a very basic level at that point, then consenting adults should be able to do what they like, right? So long as it's cleared this very basic bar of safety. Because this would be a negative externality we could create, right? If you have advanced treatments that your average customer that might be interested in buying these things doesn't understand, you know, it's too complicated, they're not a doctor, whatever.

494
00:38:28,932 --> 00:38:34,332
That's why having this bar to clear is very important to make sure beyond a shadow of a doubt there's no tomfoolery happening there.

495
00:38:34,332 --> 00:38:42,792
But beyond that, consenting adults should be able to consent to do things together so long as they are fully informed of what it is they're agreeing to do.

496
00:38:43,252 --> 00:38:48,092
So that in a nutshell is how the kind of mini circle regulatory proposal works.

497
00:38:48,152 --> 00:38:48,772
There's some other provisions.

498
00:38:48,892 --> 00:38:49,772
That's the meat of it.

499
00:38:49,772 --> 00:39:02,772
That whole consenting adults being able to take experimental drugs, that's quite a large part of Prospera's advantage, right?

500
00:39:02,852 --> 00:39:07,712
I seem to have spoken to a number of people who, and it makes total sense to me.

501
00:39:08,332 --> 00:39:09,332
Yes, of course.

502
00:39:09,872 --> 00:39:12,552
Of course I should be able to try a drug if I want to.

503
00:39:13,132 --> 00:39:14,472
Why would I not?

504
00:39:15,472 --> 00:39:17,392
So you get that quite a lot, do you?

505
00:39:17,572 --> 00:39:18,532
That's exactly right.

506
00:39:18,532 --> 00:39:19,532
And it's not just that.

507
00:39:19,532 --> 00:39:21,232
It manifests in a few other ways.

508
00:39:21,412 --> 00:39:24,412
This is actually referred to in US policy spaces as right to try.

509
00:39:25,412 --> 00:39:28,312
Montana passed similar legislation actually in the last year.

510
00:39:29,432 --> 00:39:35,112
An L2 partner of ours, Infinita, my good friend Nicholas over there was instrumental in making that happen.

511
00:39:36,252 --> 00:39:39,432
New Hampshire has passed something similar and is considering expanding it further.

512
00:39:40,112 --> 00:39:42,012
So it's basically similar to what we do.

513
00:39:42,172 --> 00:39:48,932
But there's another piece of it where we go a little bit further that also is one of those things like when you hear it, it's like, why doesn't everyone do this?

514
00:39:48,932 --> 00:39:53,572
which is just reciprocity. Right now, let's say you're in the United States and there's a drug

515
00:39:53,572 --> 00:39:57,772
that has gone through the full gamut of FDA equivalent in Australia, let's say, but it hasn't

516
00:39:57,772 --> 00:40:02,512
gone through in the US, doesn't matter. Can't take it, can't use it. That company, even though

517
00:40:02,512 --> 00:40:06,312
they've proven safety, efficacy, treatment, validity for the disease state, et cetera,

518
00:40:06,492 --> 00:40:10,632
in other countries, they have to go through the whole multi-billion dollar process,

519
00:40:10,772 --> 00:40:16,312
decade-long process in the US again to sell the drug in the US. So another simple change we made

520
00:40:16,312 --> 00:40:21,492
is, look, if this drug is approved in a reputable country, so one of the OECD countries effectively,

521
00:40:22,132 --> 00:40:26,332
or another country that has a robust, like provably robust, you know, pharmaceutical

522
00:40:26,332 --> 00:40:30,892
review system, then you can use it and sell it in Prospera so long as you can prove to

523
00:40:30,892 --> 00:40:34,672
us that it has gone through this full robust approval process in some other country, just

524
00:40:34,672 --> 00:40:36,032
automatic reciprocity effectively.

525
00:40:36,312 --> 00:40:40,512
That also is a huge unlock because there's a bunch of efficacious, you know, treatments

526
00:40:40,512 --> 00:40:46,772
and pharmaceuticals and kind of enhancement compounds and protocols that are legal and

527
00:40:46,772 --> 00:40:49,692
have been reviewed and have been taken by tens of thousands of patients elsewhere.

528
00:40:49,952 --> 00:40:53,292
But you just can't do them in any country except that one specific country, because

529
00:40:53,292 --> 00:40:54,512
no one has this reciprocity.

530
00:40:54,872 --> 00:40:57,112
So it's not just the right to try if it's gone through safety trials.

531
00:40:57,212 --> 00:41:01,852
It's also reciprocity if it's gone through the full trial gamut in a country that's high

532
00:41:01,852 --> 00:41:04,772
income enough to have a reputable drug review system.
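
As a purely illustrative sketch (not Prospera's actual implementation; the function names and the stand-in country list here are invented for the example), the two access routes just described, right-to-try after phase-one safety trials plus informed consent, and automatic reciprocity for drugs fully approved in a reputable jurisdiction, combine roughly like this:

```python
# Hypothetical sketch of the two access routes described in the conversation.
# All names, fields, and the "reputable" country list are invented stand-ins.

REPUTABLE = {"US", "UK", "Japan", "Australia", "Germany"}  # stand-in for an OECD-style list

def may_offer(drug: dict, patient_consented: bool) -> bool:
    """Return True if the drug could be offered under the described rules."""
    # Route 1: automatic reciprocity - fully approved in a reputable jurisdiction.
    if any(c in REPUTABLE for c in drug.get("fully_approved_in", [])):
        return True
    # Route 2: right to try - passed phase-one safety trials AND the patient
    # gave free, prior, informed consent covering the tail risks.
    if drug.get("passed_phase_1") and patient_consented:
        return True
    return False

experimental = {"passed_phase_1": True, "fully_approved_in": []}
foreign_approved = {"passed_phase_1": True, "fully_approved_in": ["Australia"]}
unproven = {"passed_phase_1": False, "fully_approved_in": []}

print(may_offer(experimental, True))       # right to try, with consent
print(may_offer(experimental, False))      # no consent, no full approval
print(may_offer(foreign_approved, False))  # reciprocity applies
print(may_offer(unproven, True))           # hasn't cleared the safety bar
```

Note that consent only unlocks the right-to-try route; reciprocity stands on its own because the drug has already cleared a full approval process elsewhere, which is the asymmetry being described.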

533
00:41:04,772 --> 00:41:19,092
And in those cases, how much of the responsibility lies upon the head of the person taking the drug to also do their due diligence on it?

534
00:41:19,092 --> 00:41:24,852
For example, say it was, I don't know, this is a silly example, but say it was a North Korean drug,

535
00:41:24,852 --> 00:41:34,152
and it was allowed, you know. How much wiggle room is there for you to not

536
00:41:34,152 --> 00:41:48,450
tell them it's from North Korea, and they take it and then die, and then they sue you, or whatever? There's basically no wiggle room there. So there are a couple of layers that prevent that from happening. The first is that particular treatment protocol would never be approved to be done in the

537
00:41:48,450 --> 00:41:52,690
jurisdiction in the first place by the insurance companies or when they got their regulatory

538
00:41:52,690 --> 00:41:53,130
approval.

539
00:41:53,570 --> 00:41:58,510
Because part of this regulated industry insurance system is you have to say, here's what I intend

540
00:41:58,510 --> 00:41:58,890
to do.

541
00:41:59,090 --> 00:42:00,990
Here's the safety and efficacy data on it, et cetera.

542
00:42:01,530 --> 00:42:04,290
So it would never make it through that initial insurance review process.

543
00:42:04,990 --> 00:42:09,030
both from kind of the Prospera side of things as well as any insurance company that's doing this.

544
00:42:09,110 --> 00:42:13,970
So that's just layer one there: your experimental North Korean drug would just never make it to market in the first place.

545
00:42:14,310 --> 00:42:21,430
So there is a layer of not expecting every consumer to become an expert on etiology to be able to do these things.

546
00:42:22,450 --> 00:42:27,990
But beyond that, after that, you have the same duty as you would in the US or elsewhere, right?

547
00:42:28,470 --> 00:42:31,810
Where in the US, pharmaceutical companies market straight to the consumer.

548
00:42:32,290 --> 00:42:34,690
People go to their doctor and request the drug basically.

549
00:42:34,990 --> 00:42:41,010
And so long as it's FDA approved, they just go for it, even though they have no idea what the profound effects could be, positive or negative.

550
00:42:41,630 --> 00:42:43,830
You know, SSRIs are my favorite kind of example of this.

551
00:42:43,830 --> 00:42:50,990
The doctors and others don't talk about the profound negative side effects those can have over a very long time horizon because they're FDA approved.

552
00:42:51,110 --> 00:42:52,090
So like, oh, whatever, it's fine.

553
00:42:53,070 --> 00:42:59,150
And so there is always in any market some duty on the consumer to not just completely go in blind to things.

554
00:42:59,270 --> 00:43:02,970
And the second layer of protection you have in Prospera is that informed consent piece, right?

555
00:43:02,970 --> 00:43:11,810
You do have to demonstrate like provably that you've provided to the patient a detailed overview of like what the thing does, the potential risks, side effects, et cetera, and that they've signed this.

556
00:43:12,010 --> 00:43:18,370
If you're using one of these more experimental treatments that has been approved elsewhere and has made it through phase one.

557
00:43:18,990 --> 00:43:20,630
So it's a dual layer system.

558
00:43:20,690 --> 00:43:23,310
But at the end of the day, there's no escaping that in any market.

559
00:43:23,610 --> 00:43:30,510
Like at the end of the day, there's no regulatory system on earth that stops stupid people from harming themselves.

560
00:43:30,570 --> 00:43:31,090
I'll put it that way.

561
00:43:31,090 --> 00:43:37,350
So when people, like, I've bumped into a few people around the world who said, oh, I went to

562
00:43:37,350 --> 00:43:43,510
Prospera for a gene therapy, for example. Are those people using Prospera because it's too far

563
00:43:43,510 --> 00:43:48,330
to fly to Australia to get the same thing, on the whole? Is that the more common

564
00:43:48,330 --> 00:43:54,330
reason why these kind of things are happening in Prospera, than just some completely brand new

565
00:43:54,330 --> 00:44:00,570
regulation or whatever? It's actually more of a new treatment that happens more often, because

566
00:44:00,570 --> 00:44:08,770
there's a lot of things that we have strong theoretical and kind of experimental trial

567
00:44:08,770 --> 00:44:13,390
evidence that they will work, but they haven't made it through the full FDA process yet.

568
00:44:13,390 --> 00:44:16,270
And they haven't been approved elsewhere either because oftentimes, right, just like in a

569
00:44:16,270 --> 00:44:20,190
very kind of nuts and bolts way, these medical innovations come from the US, you know, Stanford,

570
00:44:20,410 --> 00:44:23,490
MIT, whatever, or former, you know, students there.

571
00:44:24,270 --> 00:44:26,010
So it often starts in the US.

572
00:44:26,310 --> 00:44:29,990
And then these people don't want to go to Australia, Europe, wherever to start their

573
00:44:29,990 --> 00:44:33,910
process. If for no other reason than the US is the largest medical market in the world by like

574
00:44:33,910 --> 00:44:38,790
an order of magnitude. You know, we spend more on healthcare than anywhere else. Our per capita

575
00:44:38,790 --> 00:44:42,890
healthcare spending is higher than most countries' national GDP per capita, literally, not an

576
00:44:42,890 --> 00:44:47,390
exaggeration, it's absurd. So because of that, like you want to be in the US market. So that's

577
00:44:47,390 --> 00:44:51,130
usually the focus. And then because of that, you know, the novel innovations also come out of the

578
00:44:51,130 --> 00:44:55,490
US. So it might not have been passed elsewhere, you know, it might not be approved elsewhere,

579
00:44:55,490 --> 00:44:59,570
but it's made it through phase one trials in the US or in another jurisdiction, but it's been

580
00:44:59,570 --> 00:45:03,530
through phase one trial somewhere. And then it's not gone through the process elsewhere. So it's

581
00:45:03,530 --> 00:45:06,970
literally not for sale anywhere else yet. Like you could not access it anywhere else yet,

582
00:45:07,410 --> 00:45:11,710
but you can in Prospera. And of the gene therapies people come to get, this Minicircle

584
00:45:11,710 --> 00:45:16,370
treatment actually is the most popular by far. This follistatin treatment is just a textbook

584
00:45:16,370 --> 00:45:20,250
example of exactly what I'm describing. You can't go anywhere else literally in the world

585
00:45:20,250 --> 00:45:24,450
to get that right now because we know it's safe. The particular underlying like technology,

586
00:45:24,450 --> 00:45:29,870
that Minicircle and the follistatin treatment have been used in other, like, forms and use cases,

587
00:45:30,010 --> 00:45:34,250
but not in this particular way. But we know like the actual underlying mechanisms of action are

588
00:45:34,250 --> 00:45:39,050
safe and the kind of drug interactions are safe. It's just that this specific thing has not been

589
00:45:39,050 --> 00:45:42,630
approved yet. And some of them, they can't be by the way, because the FDA doesn't right now

590
00:45:42,630 --> 00:45:47,910
have a robust enough process for approving gene therapies. Like one of my favorite examples of

591
00:45:47,910 --> 00:45:52,530
this, we technically have the technology like as a species right now to have what are called

592
00:45:52,530 --> 00:45:56,270
individualized gene therapy treatments, right? So this is, this would be where, you know, I scan

593
00:45:56,270 --> 00:46:01,750
your genome and I say, Hey, based on your genome, we could give you this gene therapy that would

594
00:46:01,750 --> 00:46:09,070
reduce your risk of Alzheimer's by 50% or whatever in old age. That would require for just you in

595
00:46:09,070 --> 00:46:13,230
an N equals one situation going through the full FDA approval process, the full thing,

596
00:46:13,370 --> 00:46:17,190
the multi-billion dollar decade thing. So of course there's a whole class of technologies here

597
00:46:17,190 --> 00:46:22,290
that are just locked behind that wall basically. So there's a bunch of ideas kind of that are

598
00:46:22,290 --> 00:46:26,070
known kind of in the latent space or in the medical space, and nobody's even working on

599
00:46:26,070 --> 00:46:28,610
or trying because it's like, I don't even, it's not even possible.

600
00:46:28,770 --> 00:46:30,570
There's no, there's no path to market there.

601
00:46:30,750 --> 00:46:36,310
You can't do it like anywhere because the, another unfortunate consequence of the US

602
00:46:36,310 --> 00:46:40,990
being the de facto global hegemon is that this has a lot of positive benefits to be

603
00:46:40,990 --> 00:46:41,190
clear.

604
00:46:41,270 --> 00:46:44,950
But one downside is the rest of the world also often just copies our regulatory systems

605
00:46:44,950 --> 00:46:48,770
or models theirs after ours, at least, even if they change it some.

606
00:46:49,210 --> 00:46:50,610
So because of that, if the FDA doesn't do it...

607
00:46:50,610 --> 00:47:06,090
People see the FDA as kind of the leader in the space. And to be clear, that in and of itself is rational, right? Because we spend more money on this than anyone else. We have the largest drug regulatory system in the world. So it stands to reason the U.S. is spending the most money on this. They have the smartest people working on it. If their system works, it'll probably work for us, right?

608
00:47:06,090 --> 00:47:13,930
But an unintended consequence of that is if the U.S. doesn't let you do it, odds are basically every other country also won't let you do it.

609
00:47:14,190 --> 00:47:22,210
It's another reason these types of zones are so sorely needed, right? We need an outlet for these technologies in a responsible way, as I've described, proving first that they're safe,

610
00:47:22,270 --> 00:47:27,130
that they're not going to harm you, to actually see what these things do and to push the technological frontier.

611
00:47:27,130 --> 00:47:31,210
I want to live in a world where I can go get individualized gene therapy, right?

612
00:47:31,310 --> 00:47:35,190
To treat, you know, I've had seven knee surgeries because I have genetically, I just have really

613
00:47:35,190 --> 00:47:36,130
bad cartilage in my knees.

614
00:47:36,730 --> 00:47:39,930
The technology exists theoretically for me to fix that specifically.

615
00:47:39,930 --> 00:47:45,330
An individualized gene therapy that causes my body to grow higher-collagen

616
00:47:45,330 --> 00:47:50,230
cartilage, basically, doesn't exist anywhere in the world because of this just massive

617
00:47:50,230 --> 00:47:51,070
regulatory wall.

618
00:47:51,230 --> 00:47:52,910
Basically, there's not even a regulatory wall.

619
00:47:52,970 --> 00:47:53,790
It just doesn't exist.

620
00:47:53,870 --> 00:47:55,710
It's not like, oh, the process is hard.

621
00:47:55,990 --> 00:47:56,930
Like that would be a problem.

622
00:47:56,930 --> 00:48:03,210
that's a solvable problem. It's that there is no process. Well, I was just thinking about all

623
00:48:03,210 --> 00:48:09,870
this stuff, and it's such a catch-22 situation. Being at the sharp end of the wedge is literally,

624
00:48:09,870 --> 00:48:16,450
you know, even I, me, I think gene therapy, I just think Frankenstein. I think, you know, it's like

625
00:48:16,450 --> 00:48:25,170
you've got such a PR quagmire doing what you're doing. If you think about it, in

626
00:48:25,170 --> 00:48:35,690
every single shape and form, it's so easy to attack, um, a ZEDE. It's so easy. All these things,

627
00:48:35,690 --> 00:48:41,770
all these things are so easy to jump to conclusions about. And I'm fully on board, and

628
00:48:41,770 --> 00:48:48,750
I still think this. Oh yeah, I still think that, you know, like, I can still imagine these things, you

629
00:48:48,750 --> 00:48:54,890
know, it's crazy, I know, you know. I mean, I would say as well, I'm thankful for

630
00:48:54,890 --> 00:48:59,290
especially Prospera, because you are taking all of this on head-on.

631
00:48:59,750 --> 00:49:02,870
You are literally the sandbox where all these things are playing out.

632
00:49:03,030 --> 00:49:09,850
And in the grand scheme of things, I'd say this is why I feel so bullish at the moment on everything,

633
00:49:10,610 --> 00:49:15,510
is that a hardline socialist government has stepped up to you and said,

634
00:49:15,610 --> 00:49:18,650
we're going to shut you down and then not acted on it.

635
00:49:18,970 --> 00:49:19,830
That's amazing.

636
00:49:20,450 --> 00:49:22,710
When you think about it, that's absolutely amazing.

637
00:49:22,710 --> 00:49:33,110
I mean, you know, I don't know what your view on the current situation is there, but from an outsider looking in, I've heard them say the law is now illegal, is rescinded.

638
00:49:33,430 --> 00:49:34,650
We're going to shut you down.

639
00:49:34,830 --> 00:49:37,130
And then nothing happened and nothing is happening.

640
00:49:37,210 --> 00:49:38,590
And now I hear different rhetoric.

641
00:49:38,850 --> 00:49:43,210
I hear the rhetoric of, well, we did well.

642
00:49:43,950 --> 00:49:45,570
There's no like we're going to shut you down.

643
00:49:45,630 --> 00:49:46,750
It's like, yeah, we did well.

644
00:49:46,810 --> 00:49:47,570
And so why?

645
00:49:47,750 --> 00:49:49,150
Well, there's going to be no more ZEDEs.

646
00:49:49,590 --> 00:49:51,730
It's like, yeah, well, what about the ones that you're going to shut down?

647
00:49:51,730 --> 00:50:06,028
There are gonna be no more. You know, is that true? Is that kind of what's happening? More or less. I have to be careful here because of the arbitration case we have, of course, so I can't go into a ton of detail. But speaking to what's been publicly published by the Honduran Supreme Court and otherwise,

648
00:50:06,408 --> 00:50:11,828
the reason we're kind of seeing what you're describing is that the kind of legal foundations

649
00:50:11,828 --> 00:50:18,028
of Prospera and of the ZEDE system writ large are so strong that, without a

650
00:50:18,028 --> 00:50:22,048
supermajority of Congress, basically, which they have never had at any point, there's

651
00:50:22,048 --> 00:50:27,208
literally no way for them to do what they want to do or they at least claimed they wanted to do on

652
00:50:27,208 --> 00:50:30,368
the campaign trail, right, to actually like literally stop us from operating, run us out

653
00:50:30,368 --> 00:50:35,748
of the country, et cetera, because of international treaties, the actual constitutional amendments,

654
00:50:36,208 --> 00:50:40,528
all of it. There was no way with the way kind of Honduran law and constitutional law works in a

655
00:50:40,528 --> 00:50:45,948
nutshell for them to do this in a way that is not extrajudicial, arbitrary, et cetera, which is

656
00:50:45,948 --> 00:50:51,688
obviously how the international arbitration case came about, right? They attempted to do this in a

657
00:50:51,688 --> 00:50:55,888
number of ways, including with the Supreme Court ruling in direct violation of several international

658
00:50:55,888 --> 00:51:00,248
treaties and their own constitution in a number of ways. And so as a result of that, you know,

659
00:51:00,248 --> 00:51:04,908
we have the arbitration case going. But all I'll say on that, to be clear, is that, you know,

660
00:51:04,968 --> 00:51:08,788
it's not like we're – the goal here is not to win lawsuits, right? The goal here is to catalyze

661
00:51:08,788 --> 00:51:13,828
prosperity, full stop. So we are hopeful and optimistic that, you know, in the near term,

662
00:51:13,948 --> 00:51:18,328
call it, you know, the next year or two, somewhere in there, that this situation will be resolved

663
00:51:18,328 --> 00:51:22,788
amicably and, uh, we'll be able to move forward, you know, working hand in hand and

664
00:51:22,788 --> 00:51:26,988
in partnership with the Honduran government again. But even the lawsuit is a PR nightmare.

665
00:51:26,988 --> 00:51:32,448
Like, I understand. Like, when you hear, I mean, I think the way it was spun was that

666
00:51:32,448 --> 00:51:40,768
it's, you're suing, oligarchs are suing the Honduran people, not the government, uh, for more than the

667
00:51:40,768 --> 00:51:46,348
GDP of Honduras. And it's kind of like, I don't know whether it's true, but the essence

668
00:51:46,348 --> 00:51:51,468
of it is true. But that was the point. You have to. What do you want to do, not do that?

669
00:51:51,468 --> 00:51:57,028
Like, the rules say, and they're hard for a reason, the rules are that if they

670
00:51:57,028 --> 00:52:02,868
renege on their deal, this is what we want to do, because we've got to stop them. Otherwise,

671
00:52:02,868 --> 00:52:08,868
why would we start this thing? Exactly. You know, but to spin it the other way is so

672
00:52:08,868 --> 00:52:14,788
easy. And it's not like it's easy to unspin it either. That's what I was about to say.

673
00:52:14,788 --> 00:52:19,528
So what frustrates me the most about this, and this is not speaking specifically to the arbitration case.

674
00:52:19,588 --> 00:52:23,308
This is just in general the hit pieces about Prospera and the narratives you're talking about.

675
00:52:23,648 --> 00:52:35,448
It is so easy to like quite literally lie, mislead, misdirect, selectively kind of misinterpret intentionally and maliciously kind of what we do, who we are, what's going on.

676
00:52:36,028 --> 00:52:39,328
And it is so much harder, right, to undo.

677
00:52:39,868 --> 00:52:41,408
There's some term for this.

678
00:52:41,448 --> 00:52:41,928
I can't remember.

679
00:52:42,168 --> 00:52:43,988
That applies not just to us but broadly, right?

680
00:52:43,988 --> 00:52:52,068
It happens in trading. It's like climbing the stairs and coming down the elevator, isn't it? Or is it, yeah, yeah. Coming down the elevator, climbing up the stairs.

681
00:52:52,068 --> 00:53:19,268
That's exactly right. Like, you know, to put it in kind of tweet-length terms, right? It takes one sentence to make a completely sensational, viral, totally false claim. And then it takes a couple of paragraphs to explain, no, here's how the law actually works. And people see the original tweet. They're not going to see the response we do immediately after, explaining in great detail why that's false in multiple different ways, whatever the sentence may be.

682
00:53:19,268 --> 00:53:22,188
I never heard that Bloomberg retraction, for example.

683
00:53:22,788 --> 00:53:23,508
It's infuriating.

684
00:53:23,628 --> 00:53:24,168
I mean, I stop.

685
00:53:24,408 --> 00:53:26,868
I read these things now because I want to see.

686
00:53:27,128 --> 00:53:27,988
It's a good gauge.

687
00:53:28,128 --> 00:53:31,128
You can gauge public opinion, you know.

688
00:53:31,848 --> 00:53:35,848
Because if that changes, if the Bloomberg pieces start changing,

689
00:53:35,988 --> 00:53:38,288
that's a really good sign, you know.

690
00:53:38,848 --> 00:53:42,648
But, yeah, other than that, there's no point reading them, is there?

691
00:53:42,788 --> 00:53:43,628
That's exactly right.

692
00:53:43,788 --> 00:53:45,428
That's why I hadn't read the one you're talking about.

693
00:53:45,888 --> 00:53:47,168
You won't have read this one.

694
00:53:47,268 --> 00:53:47,708
No, absolutely.

695
00:53:47,708 --> 00:53:49,668
This is kind of fringe, fringe.

696
00:53:50,248 --> 00:53:54,828
But I would, the only reason like it was a little bit annoying

697
00:53:54,828 --> 00:53:59,608
is because I think it's not like some left-wing publication lying.

698
00:54:00,048 --> 00:54:03,548
Although what the guy didn't realize was the stuff he was saying

699
00:54:03,548 --> 00:54:07,068
about Prospera was coming straight out of the left-wing newspapers

700
00:54:07,068 --> 00:54:09,208
in Honduras.

701
00:54:09,728 --> 00:54:11,168
And a few, you know.

702
00:54:11,428 --> 00:54:14,948
But when I look at him, he's not that way inclined.

703
00:54:14,948 --> 00:54:16,868
They're very much, they're truth seekers.

704
00:54:16,868 --> 00:54:22,208
you know, and there wasn't much, but the narrative was just too juicy. That's the problem. The

705
00:54:22,208 --> 00:54:30,048
narrative of, oh, there's a rich guy trying to, I mean, despite the fact, of course, you know, like, I've

706
00:54:30,048 --> 00:54:35,908
asked, because I'm interviewing a lot of people here connected with, you know, those ZEDEs. I've

707
00:54:35,908 --> 00:54:39,608
asked everyone this because I was interested to see what their answers were. I've had interesting

708
00:54:39,608 --> 00:54:44,468
answers. Yours has definitely been the most balanced, but a lot of people have said, yeah, so what?

709
00:54:44,468 --> 00:54:50,988
Literally. Well, in a way it's true, because, like, what the people that write articles like that

710
00:54:50,988 --> 00:54:56,928
probably don't acknowledge is that their default setting is that the state should be able to tell

711
00:54:56,928 --> 00:55:05,168
them what to do, whereas we are challenging that, literally. And if you don't know that, because

712
00:55:05,168 --> 00:55:10,888
I didn't, for many years, never even thinking about that, my default setting was not

713
00:55:10,888 --> 00:55:16,408
really a default setting. It was like a bit of a scam, really.

714
00:55:17,128 --> 00:55:22,288
And so there's a lot of truth to the, yes, so what? You know, if I had a billion dollars,

715
00:55:22,368 --> 00:55:27,188
what would I do? Yeah, I might build a free city actually, and make sure that I'm kind of

716
00:55:27,188 --> 00:55:31,828
in control of it, because I would like it to go the way I want it to go. Just like I've

717
00:55:32,208 --> 00:55:36,568
bought a house and I live in it. I don't let other people tell me what to do in my house,

718
00:55:36,568 --> 00:55:39,728
So it's a funny one, but you can't win.

719
00:55:40,628 --> 00:55:44,628
All you can do is just keep building or you just keep doing your thing.

720
00:55:44,808 --> 00:55:45,448
That's exactly right.

721
00:55:45,748 --> 00:55:53,088
You have to just prove incontrovertibly and in a way that is even impossible to lie about that this does what we say it does.

722
00:55:53,248 --> 00:55:54,348
This unleashes innovation.

723
00:55:54,528 --> 00:55:55,408
This unleashes entrepreneurship.

724
00:55:55,788 --> 00:56:00,768
This creates prosperity both for local populations and for the entrepreneurs and innovators that move into the space.

725
00:56:00,768 --> 00:56:17,588
That's all you can do really because the – again, like as I described earlier, the daisy chain of these guys like publication A makes a false claim, publication B references publication A in theirs and then that continues for the next 24 letters of the alphabet basically.

726
00:56:18,248 --> 00:56:28,068
And then suddenly what happens is from a lay perspective, you're just – as a person just coming into it for the first time, don't know how journalism works or how kind of modern mainstream media that has an agenda works.

727
00:56:28,648 --> 00:56:30,048
Then you just go read a few articles.

728
00:56:30,148 --> 00:56:31,468
You're like, oh, they all say the same thing.

729
00:56:31,528 --> 00:56:34,648
Well, that must be true because they all say it, not realizing that literally all of them

730
00:56:34,648 --> 00:56:35,328
are lying.

731
00:56:35,448 --> 00:56:36,888
It is directly false.

732
00:56:37,448 --> 00:56:40,028
And what they do is they're very clever about this, right?

733
00:56:40,448 --> 00:56:44,048
So by attributing it to another source, they can facially claim they're not lying.

734
00:56:44,328 --> 00:56:45,728
Like, oh, well, this other guy said it.

735
00:56:46,068 --> 00:56:46,988
I'm just saying that he said it.

736
00:56:47,008 --> 00:56:47,348
That's all.

737
00:56:47,808 --> 00:56:51,788
Knowing full well the perception that they are attempting to cultivate, obviously.

738
00:56:51,988 --> 00:56:57,368
So this is how the media often, like, intentionally misleads to craft an agenda.

739
00:56:57,368 --> 00:57:04,008
They do. Scott Alexander at Astral Codex Ten has a fantastic write-up on this about exactly how this works, because he's been the victim of this before.

740
00:57:04,668 --> 00:57:06,488
So he wrote about the experience of what that's like.

741
00:57:07,108 --> 00:57:17,948
And they are very studious about avoiding making very specific concrete claims worded in a way that you can immediately say that specific thing is blatantly false.

742
00:57:18,548 --> 00:57:26,608
They do it in such a way that they have at least a layer of facially plausible deniability, superficially plausible deniability, right?

743
00:57:26,608 --> 00:57:32,828
So, other publications said it, right? I'm not, I'm just quoting them. I didn't say it. Uh, you know, it

744
00:57:32,828 --> 00:57:35,408
doesn't matter that, you know, you're a journalist and you should have journalistic integrity and

745
00:57:35,408 --> 00:57:39,828
actually look into the underlying, you know, facts of the case. Uh, so then this kind of daisy

746
00:57:39,828 --> 00:57:44,728
chains along multiple publications, and you end up with this, like, just general perception that,

747
00:57:44,728 --> 00:57:49,308
because the lay public, unfortunately, although this is changing, thank goodness, and rapidly too,

748
00:57:49,308 --> 00:57:53,088
but people still trust a lot of these publications, right? Like, oh, they're journalists, you know. At the

749
00:57:53,088 --> 00:57:56,588
end of the day, it can't be that untrue. Like, there must be, you know, where there's smoke,

750
00:57:56,648 --> 00:58:00,688
there's fire, right? Even if they're exaggerating it,

751
00:58:00,688 --> 00:58:04,628
there must be some sort of truth to this. And the idea that there's not, that these people just have

752
00:58:04,628 --> 00:58:09,688
a genuine, specific kind of decentralized and emergent agenda against this type of project

753
00:58:09,688 --> 00:58:30,846
and kind of what we do, is a novel concept to them. And I also feel bad for kind of people that haven't studied how the media works and had to go through it. The reason I, you know, know it is because we've been through it firsthand. Because if you accept the truth of how it works, it means you also have to accept two things.

754
00:58:31,186 --> 00:58:35,846
One, oh, these publications I thought I could trust, I can't trust anymore. So now what do I do?

755
00:58:36,146 --> 00:58:40,646
Which means now the onus is on you to figure out what's true and what's not. And in, you know,

756
00:58:40,646 --> 00:58:47,106
2025 with social media, that could not be harder, right? This is why I am a fan of kind of the

757
00:58:47,106 --> 00:58:51,466
Brian Armstrong, you know, Balaji methodology of going direct, like just talk to the people

758
00:58:51,466 --> 00:58:56,086
directly, publish the truth, the videos, the pictures, whatever, the statistics about what

759
00:58:56,086 --> 00:58:59,786
you're doing directly, like, and just ignore the media, they will be irrelevant at some point,

760
00:59:00,126 --> 00:59:03,346
because now you can just talk to your audience, right? The function the media used to have the

761
00:59:03,346 --> 00:59:07,326
leverage they used to have over people, and this is waning rapidly, was that, oh, you want to talk

762
00:59:07,326 --> 00:59:11,186
to the world, you have to go through me, right? Which means they get to set the perception and

763
00:59:11,186 --> 00:59:15,246
the agenda. But now, thanks to the internet, you can just talk to people directly. You don't need

764
00:59:15,246 --> 00:59:21,466
the middleman, right? That is a transformative change. But getting that cultural shift for people

765
00:59:21,466 --> 00:59:27,006
to go from, I can just default trust Bloomberg because it's Bloomberg, to I should default

766
00:59:27,006 --> 00:59:32,766
mistrust anything these people say until proven otherwise, basically, is a very hard shift.

767
00:59:32,766 --> 00:59:36,926
One last point on this. There's a term for this called Gell-Mann amnesia, okay?

768
00:59:37,326 --> 00:59:45,626
So this is this phenomenon that a lot of people have experienced where you read the news, you read the newspaper, whatever, some media outlet on things you don't know anything about.

769
00:59:45,866 --> 00:59:47,486
And you're like, just assume it's true.

770
00:59:47,606 --> 00:59:48,766
Like that's your default assumption.

771
00:59:49,006 --> 00:59:50,526
There must be some journalistic integrity.

772
00:59:50,806 --> 00:59:51,246
Assume it's true.

773
00:59:51,826 --> 00:59:56,186
And then you go read an article that you happen to be involved in or just know the subject matter.

774
00:59:56,186 --> 01:00:02,026
Well, you're not even involved in it, but it's an article about the 3I/ATLAS comet and you're an astrophysicist, right?

775
01:00:02,146 --> 01:00:06,046
And normally you read all the business stuff and you believe it because you don't know anything about these businesses.

776
01:00:06,046 --> 01:00:10,106
you're not involved, so it must be true. But then you read the astrophysics piece on this comet, and

777
01:00:10,106 --> 01:00:14,546
you're like, that's obviously not true. No, there's no extra gravitational acceleration. What?

778
01:00:14,546 --> 01:00:19,046
You're just making this up. You're understanding it so poorly, I don't even know

779
01:00:19,046 --> 01:00:24,586
how you messed this up this bad. And then you flip the page and read an article about Prospera, whatever.

780
01:00:24,586 --> 01:00:29,326
Oh, this must be true. This is why it's called Gell-Mann amnesia. Like, you flip the page and forget.

781
01:00:29,326 --> 01:00:34,066
Oh wait, hold on, do you forget? This happens to a lot of people. Oh, I thought people

782
01:00:34,066 --> 01:00:37,506
didn't forget. I wish that was the case, but you'd be amazed how often this happens,

783
01:00:37,906 --> 01:00:41,706
where people will read a thing they know a lot about, realize it's entirely fake,

784
01:00:41,946 --> 01:00:46,786
and then not make the completely correct logical deduction from that, that, oh,

785
01:00:47,346 --> 01:00:50,746
everything they write must be like this. They just think, oh, this is a one-off. Like,

786
01:00:50,866 --> 01:00:54,226
you know, my area is complicated, whatever, you know, I'll give them the benefit of the doubt.

787
01:00:54,746 --> 01:00:59,206
And they just keep on keeping on. It's infuriating. It's called Gell-Mann amnesia.

788
01:00:59,206 --> 01:01:05,566
I think that mode of journalism is almost reaching its final form.

789
01:01:05,726 --> 01:01:16,446
When you follow it to the end, you get articles with quote tweets in them by just who knows who, you know, so and so said this.

790
01:01:16,466 --> 01:01:18,426
And you think that's a retarded thing to say.

791
01:01:18,446 --> 01:01:19,746
It's like, but it doesn't matter.

792
01:01:20,286 --> 01:01:23,786
It's like somebody said it and it backs up the article that we're saying.

793
01:01:23,846 --> 01:01:25,786
And it's like that happens a lot.

794
01:01:25,786 --> 01:01:26,606
Oh, and it will be.

795
01:01:26,746 --> 01:01:28,706
I've seen a few of these that are like very funny.

796
01:01:28,706 --> 01:01:33,966
These are not about Prospera, but just about other things I know about where they will have exactly what you're describing.

797
01:01:33,966 --> 01:01:37,726
The article is based on like four or five quote tweets or screenshots of a tweet.

798
01:01:38,146 --> 01:01:42,126
And then I go look up the actual tweet on X.com and it has like 50 views.

799
01:01:42,666 --> 01:01:45,746
Like you just found a random person, literally.

800
01:01:46,006 --> 01:01:47,306
It's often an anonymous account too.

801
01:01:47,386 --> 01:01:51,686
You found a random anonymous account that said the thing that advances the narrative you would like to advance.

802
01:01:51,806 --> 01:01:53,946
And then you publish that as if this is a credible source.

803
01:01:54,146 --> 01:01:54,346
What?

804
01:01:55,246 --> 01:01:55,846
It's insane.

805
01:01:56,146 --> 01:01:57,366
That is genuinely insane behavior.

806
01:01:57,366 --> 01:02:03,786
You said earlier that you think that a good strategy was to ignore the media because they're going to be irrelevant anyway.

807
01:02:04,186 --> 01:02:15,426
I would push back on that because I think when I look at Prospera's kind of trajectory, I think in the earlier days, I think you got hammered.

808
01:02:15,526 --> 01:02:17,746
And I didn't see a lot of pushback, actually.

809
01:02:17,966 --> 01:02:18,446
I totally agree.

810
01:02:18,746 --> 01:02:19,246
You agree?

811
01:02:19,306 --> 01:02:20,186
I actually totally agree.

812
01:02:20,266 --> 01:02:23,306
Yeah, this is a shift we've made, you know, in the last couple of years, basically.

813
01:02:23,946 --> 01:02:25,686
So we have two different strategies now.

814
01:02:25,686 --> 01:02:26,986
So we don't ignore them.

815
01:02:26,986 --> 01:02:30,346
So like that hit piece, I read you the email from the journalist from yesterday.

816
01:02:31,786 --> 01:02:38,786
That one, obviously, our media team and our comms team responded with fire and brimstone, basically, as you should.

817
01:02:39,946 --> 01:02:42,686
That's a just completely insane thing to even ask me.

818
01:02:43,206 --> 01:02:45,066
And here's the truth of what's going on in Prospera.

819
01:02:45,426 --> 01:02:48,706
So we at least cover our bases there, so that the article doesn't read,

820
01:02:49,286 --> 01:02:54,606
We asked Prospera for comment on this claim that they are a predatory narco state or whatever it said.

821
01:02:55,426 --> 01:02:56,586
And they had no comment.

822
01:02:56,586 --> 01:02:58,026
Yeah, they refused to comment.

823
01:02:58,146 --> 01:02:59,146
Exactly, exactly.

824
01:02:59,386 --> 01:03:03,046
So now, at a bare minimum, they're at least forced to include something of what we said.

825
01:03:03,526 --> 01:03:07,506
This is a sidebar, but even that can be dangerous sometimes, because a couple

826
01:03:07,506 --> 01:03:11,266
years ago, I did an interview with the MIT Technology Review, who for some insane reason

827
01:03:11,266 --> 01:03:14,726
was writing about Prospera, which, like, to start off with, you should be covering

828
01:03:14,726 --> 01:03:17,266
the cool tech MIT and other people are building, but whatever.

829
01:03:18,566 --> 01:03:20,526
Did the interview, I recorded it, the whole thing.

830
01:03:20,846 --> 01:03:24,146
I told the journalist I was doing this, recorded the whole thing so that I would have proof

831
01:03:24,146 --> 01:03:27,946
if I was misquoted or something, thinking this is just like an insurance policy, I won't need it.

832
01:03:28,026 --> 01:03:31,926
She was, you know, a very nice journalist, very cordial on the call, seemed friendly,

833
01:03:32,766 --> 01:03:37,446
publishes the piece. And one of the quotes made us sound terrible, made me sound terrible.

834
01:03:37,906 --> 01:03:41,026
They did. We forced them to change this immediately because we threatened to sue them

835
01:03:41,026 --> 01:03:45,906
and because we could prove it. She literally had taken like a, you know, a couple sentence response

836
01:03:45,906 --> 01:03:52,606
I had. And then she cut it in such a way that it read completely differently from how I said it

837
01:03:52,606 --> 01:03:59,646
by selectively removing parts of my response, so that she, like, you know, used papier-mâché

838
01:03:59,646 --> 01:04:04,486
to create a new quote out of whole cloth by like selecting words I had said in parts of sentences

839
01:04:04,486 --> 01:04:08,986
I had said and Frankenstein-combining them. So even that can be dangerous. But anyways,

840
01:04:09,086 --> 01:04:12,826
so first strategy is we respond, you know, so at least that's on the record. It doesn't say,

841
01:04:12,906 --> 01:04:17,066
you know, Prospera refused to comment or something. But then beyond that, we just go direct. So we,

842
01:04:17,066 --> 01:04:20,486
if you go to the news section on our website and our blog, or follow us on X,

843
01:04:20,486 --> 01:04:26,306
we often will immediately publish something if, you know, there's a hit piece incoming

844
01:04:26,306 --> 01:04:30,986
or one was just published, like, hey, here's some of the lies in this, right? So I've done this a

845
01:04:30,986 --> 01:04:34,326
few times as well. You know, a few years ago, the MIT Tech Review one was a great example.

846
01:04:34,966 --> 01:04:38,566
You know, they claimed a bunch of crazy stuff, not just for my quote that I had the video of,

847
01:04:38,586 --> 01:04:41,806
but they claimed some other things that we had, like, video and photographic evidence were just

848
01:04:41,806 --> 01:04:47,026
blatantly false. So then we published the response to that and, you know, at least force them to

849
01:04:47,026 --> 01:04:50,466
change a few of the things. Now, what is frustrating about that is because we haven't

850
01:04:50,466 --> 01:04:55,586
been doing this direct strategy for long enough, you know, I have no way to get

851
01:04:55,586 --> 01:05:00,706
to, at all, the guy or gal who just reads MIT Tech Review and is not on X, for example;

852
01:05:00,706 --> 01:05:03,546
he's not going to go to our website and see, well, have these guys responded?

853
01:05:03,786 --> 01:05:08,286
So that is still frustrating. But that's why it's important to grow that kind of native direct

854
01:05:08,286 --> 01:05:12,066
audience, which we're working on in a number of ways. And we'll continue to do so.

855
01:05:12,066 --> 01:05:16,166
And then continue to just tell our story directly and publicly. That's part of the reason I'm,

856
01:05:16,166 --> 01:05:18,126
I'm kind of – I don't know if we're going to do this yet to be clear.

857
01:05:18,226 --> 01:05:18,986
I'm not like announcing anything.

858
01:05:19,126 --> 01:05:23,526
But I'm hoping I can talk the team into doing a podcast, some YouTube video series, et cetera.

859
01:05:24,126 --> 01:05:31,446
Because I think like, look, we should just tell the whole story with some well-produced YouTube videos and do a giant podcast series on, like, how Prospera started.

860
01:05:31,446 --> 01:05:37,166
What's the actual – like just sit down with Eric and Gabe and Tidus and everybody and like let's tell the whole story start to finish.

861
01:05:37,366 --> 01:05:38,246
Publish the whole thing.

862
01:05:38,326 --> 01:05:40,286
It will be a multi-hour podcast series.

863
01:05:40,826 --> 01:05:43,246
Let's do some YouTube videos of it, the whole nine yards.

864
01:05:43,246 --> 01:05:45,326
and then continue doing so,

865
01:05:45,406 --> 01:05:46,886
talking about stuff going on in the jurisdiction

866
01:05:46,886 --> 01:05:49,146
so people can just see with their own eyes directly

867
01:05:49,146 --> 01:05:51,186
exactly what's going on

868
01:05:51,186 --> 01:05:52,946
and start building that kind of direct audience.

869
01:05:53,106 --> 01:05:54,606
I want to put some more energy into that in the future.

870
01:05:54,866 --> 01:05:55,266
Definitely.

871
01:05:55,866 --> 01:05:57,706
Well, you know, we've got,

872
01:05:58,106 --> 01:05:59,366
I don't know whether you remember,

873
01:05:59,466 --> 01:06:01,406
I started this podcast in Prospera

874
01:06:01,406 --> 01:06:04,286
and my first seven or eight interviews

875
01:06:04,286 --> 01:06:06,526
were with various people in Prospera.

876
01:06:06,546 --> 01:06:06,826
That's awesome.

877
01:06:06,846 --> 01:06:07,966
You weren't there at the time,

878
01:06:08,046 --> 01:06:10,026
but we came over, that was three years ago.

879
01:06:11,026 --> 01:06:12,526
And, you know, I've done everyone,

880
01:06:12,526 --> 01:06:14,166
And it's pretty much everyone.

881
01:06:14,346 --> 01:06:14,726
Gabe.

882
01:06:14,866 --> 01:06:15,426
I haven't done.

883
01:06:15,846 --> 01:06:17,106
Yeah, I've done a lot.

884
01:06:17,146 --> 01:06:19,206
A lot of the people involved in Prospera.

885
01:06:19,266 --> 01:06:25,306
But you know what I think is a good idea for Prospera is what Bitcoin Beach did.

886
01:06:25,466 --> 01:06:26,146
You know, Bitcoin Beach.

887
01:06:26,606 --> 01:06:32,766
They have a podcast studio at Bitcoin Beach, because so many people come there.

888
01:06:33,306 --> 01:06:34,606
But that's the same as Prospera.

889
01:06:34,666 --> 01:06:36,146
So many people come to Prospera.

890
01:06:36,326 --> 01:06:39,966
So you've got the problem I find with Prospera:

891
01:06:39,984 --> 01:06:43,924
good quality podcasts, you need to do them in person.

892
01:06:44,124 --> 01:06:45,984
Like this, what we're doing here now

893
01:06:45,984 --> 01:06:48,484
is you can't compete with this online.

894
01:06:49,484 --> 01:06:51,784
So it's either the way I do it,

895
01:06:51,824 --> 01:06:53,684
which is relatively expensive,

896
01:06:54,304 --> 01:06:56,204
but I have to do it in periods of time.

897
01:06:56,204 --> 01:06:57,644
Like here I'm doing 20 interviews

898
01:06:57,644 --> 01:06:59,904
and then I'll use them over the next few months.

899
01:07:00,404 --> 01:07:03,884
And it's probably the most cost effective way to do it,

900
01:07:03,884 --> 01:07:05,044
but it's not ideal

901
01:07:05,044 --> 01:07:07,304
because you can't do newsy things really.

902
01:07:07,564 --> 01:07:09,124
You can, but then, you know,

903
01:07:09,124 --> 01:07:14,364
it's just you have to juggle things around a bit. But if you're the place where everyone comes, and

904
01:07:14,364 --> 01:07:20,784
I think Prospera is becoming that. I mean, you know, you've got a massive rota of events now,

905
01:07:20,784 --> 01:07:26,784
our own included. I mean, we've announced it now: the Free Cities Conference will be in Prospera

906
01:07:26,784 --> 01:07:33,144
next year. Super excited. I know, me too, I am so excited. And, you know, so you've got this

907
01:07:33,144 --> 01:07:37,504
group of people coming through, so all you have to do is make one room, make it into a studio,

908
01:07:37,504 --> 01:07:39,504
and just anyone who's coming,

909
01:07:39,624 --> 01:07:41,864
just get them in the studio for like an hour.

910
01:07:42,444 --> 01:07:44,804
And, you know, because the Bitcoin Beach podcast

911
01:07:44,804 --> 01:07:45,924
is really good.

912
01:07:46,004 --> 01:07:46,664
It's doing really well.

913
01:07:46,704 --> 01:07:49,784
It's run by the guy, the American guy,

914
01:07:50,064 --> 01:07:50,484
what's his name,

915
01:07:50,544 --> 01:07:51,864
who actually kind of came up

916
01:07:51,864 --> 01:07:53,064
with the whole project in the beginning.

917
01:07:53,564 --> 01:07:54,624
And it's a lovely studio.

918
01:07:55,624 --> 01:07:58,084
But that would be a brilliant idea.

919
01:07:58,264 --> 01:07:59,124
That's a great idea.

920
01:07:59,304 --> 01:08:00,284
I am going to...

921
01:08:00,284 --> 01:08:02,024
Well, you do have so many people coming through.

922
01:08:02,104 --> 01:08:03,504
How often are you having events now?

923
01:08:03,824 --> 01:08:05,084
There's multiple a month now.

924
01:08:05,324 --> 01:08:05,824
Yeah, right.

925
01:08:05,984 --> 01:08:06,144
Okay.

926
01:08:06,144 --> 01:08:11,404
So, I mean, when you have an event, you're getting all the people connected with the space.

927
01:08:11,704 --> 01:08:12,164
That's exactly right.

928
01:08:12,284 --> 01:08:15,724
And, you know, I mean, actually, I probably shouldn't be telling you to do this.

929
01:08:15,764 --> 01:08:18,724
You're going to be a direct rival to me, but I will do it better.

930
01:08:18,804 --> 01:08:19,224
Don't worry.

931
01:08:19,404 --> 01:08:19,684
There you go.

932
01:08:19,704 --> 01:08:20,444
That's exactly right.

933
01:08:21,904 --> 01:08:27,804
No, but honestly, I mean, I've often said to people, if I didn't

934
01:08:27,804 --> 01:08:32,504
have a family, I'd be in Prospera running the podcast out there, a hundred percent,

935
01:08:32,924 --> 01:08:34,904
because I would have to be living in a free city.

936
01:08:34,904 --> 01:08:35,304
Oh yeah.

937
01:08:35,344 --> 01:08:35,944
I would have to be.

938
01:08:36,144 --> 01:08:39,724
I can't at the moment because of just my situation in life.

939
01:08:40,644 --> 01:08:46,924
But I would be running it there and I would be making forays to other places as well to cover different projects.

940
01:08:47,044 --> 01:08:48,704
But I would be running it from there.

941
01:08:49,164 --> 01:08:50,884
You know, it would 100% be that.

942
01:08:51,264 --> 01:08:53,764
So what's new up there?

943
01:08:53,764 --> 01:08:56,764
I saw a bit of your presentation.

944
01:08:57,564 --> 01:08:58,684
I had to leave halfway through.

945
01:08:58,844 --> 01:09:02,904
But you were kind of commenting on all the latest things.

946
01:09:03,624 --> 01:09:03,764
Yes.

947
01:09:03,764 --> 01:09:07,984
And in fact, after that, I messaged a few members of the team to just be like, hey, give me some more.

948
01:09:08,204 --> 01:09:10,744
You know, we've done a lot this year and I can't just remember it all off the top of my head.

949
01:09:11,164 --> 01:09:14,584
Give me some more of the interesting things that, you know, have happened recently, basically.

950
01:09:15,064 --> 01:09:19,524
So, you know, I talked about here the groundbreaking for the Nomad Nation Village project.

951
01:09:19,784 --> 01:09:20,604
Oh, yeah, that was cool.

952
01:09:20,844 --> 01:09:21,744
Darien Village project.

953
01:09:21,904 --> 01:09:24,144
Those are two different real estate projects going on right now.

954
01:09:24,144 --> 01:09:29,824
So Nomad Village is going to be a sort of DigiNomad specific place.

955
01:09:29,924 --> 01:09:31,004
It's going to be good for them.

956
01:09:31,004 --> 01:09:34,484
It's going to have co-working spaces and, like, cheap accommodation.

957
01:09:34,824 --> 01:09:35,004
That's right.

958
01:09:35,204 --> 01:09:35,904
That's so cool.

959
01:09:35,984 --> 01:09:36,324
That's right.

960
01:09:36,384 --> 01:09:37,684
It's going to be awesome.

961
01:09:38,344 --> 01:09:40,284
And then Darien Village, which is another real estate developer,

962
01:09:40,464 --> 01:09:43,264
is building a cluster of initially five,

963
01:09:43,344 --> 01:09:46,144
and then it's scalable up to as big as they can raise the capital to build.

964
01:09:46,284 --> 01:09:47,724
Why is it called Darien Village?

965
01:09:47,924 --> 01:09:48,724
That's a great question.

966
01:09:48,884 --> 01:09:49,484
I have no idea.

967
01:09:49,544 --> 01:09:51,164
Is that D-A-R-I-A-N?

968
01:09:51,484 --> 01:09:52,684
D-A-R-I-E-N.

969
01:09:53,044 --> 01:09:54,824
And that's not the company that's doing it, is it?

970
01:09:54,824 --> 01:09:55,084
No.

971
01:09:55,584 --> 01:09:55,684
No.

972
01:09:55,684 --> 01:09:56,284
I wonder what that is.

973
01:09:56,284 --> 01:10:00,944
It's a partnership with a few kind of entrepreneurs and real estate developers.

974
01:10:01,264 --> 01:10:04,424
And what's the – has it got its own slant or is it just places to live?

975
01:10:04,464 --> 01:10:05,244
Yeah, it's places to live.

976
01:10:05,424 --> 01:10:08,984
Now, what they're trying to do is make like little kind of micro hubs.

977
01:10:08,984 --> 01:10:19,484
So the initial five are arranged, and they'll try to replicate this, you know, elsewhere as they expand, in such a way that it increases the kind of surface area for serendipity, basically.

978
01:10:19,684 --> 01:10:23,044
So there's like a central kind of hub, park, node area.

979
01:10:23,044 --> 01:10:28,044
And, uh, to get from building to building you need to walk through this area, so there's a higher

980
01:10:28,044 --> 01:10:31,704
likelihood you're going to kind of run into people. Uh, they'll have some commercial space on the

981
01:10:31,704 --> 01:10:36,024
lower floors to have, you know, little cafes, shops, things. So it's very experimental architecture,

982
01:10:36,024 --> 01:10:40,104
kind of. Yeah, yeah. Well, it's not even experimental; it's taking what we know works from urbanism

983
01:10:40,104 --> 01:10:43,504
and places that have some of the best urbanism in the world. Like, uh, you know, Barcelona is one of my

984
01:10:43,504 --> 01:10:48,264
favorite examples. They're walkable superblocks, they're called, um, that kind of increase this

985
01:10:48,264 --> 01:10:53,584
density in a way that is not off-putting and kind of rough like Manhattan and is actually livable.

986
01:10:53,884 --> 01:10:58,504
So it's more along those lines. And I'm really excited about that one because of how scalable

987
01:10:58,504 --> 01:11:03,444
it is. Like once this initial one is successful, they can build a bunch more stuff. And then we've

988
01:11:03,444 --> 01:11:07,284
been doing some pop-up events. We've done them in New York City and Singapore and Stockholm.

989
01:11:07,964 --> 01:11:11,824
We'll get dozens to hundreds of people sometimes to show up at those that haven't heard about

990
01:11:11,824 --> 01:11:17,744
Prospera and want to learn more. We have a record number of both new company and corporation formations,

991
01:11:17,744 --> 01:11:20,144
and a record number of e-residency formations.

992
01:11:20,764 --> 01:11:22,064
We had our largest month ever,

993
01:11:22,224 --> 01:11:23,764
actually like single month of new legal entities

994
01:11:23,764 --> 01:11:24,724
formed last month.

995
01:11:24,964 --> 01:11:27,104
So everything's accelerating pretty quickly.

996
01:11:27,184 --> 01:11:27,644
Why is that then?

997
01:11:27,984 --> 01:11:28,864
What prompted that?

998
01:11:30,164 --> 01:11:32,384
What's happening is for whatever reason,

999
01:11:32,504 --> 01:11:34,504
and I find stuff generally snowballs like this

1000
01:11:34,504 --> 01:11:35,524
in a way I don't really understand,

1001
01:11:36,024 --> 01:11:37,944
but it's like once you reach some sort of inflection point,

1002
01:11:37,984 --> 01:11:39,584
it just organically starts accelerating

1003
01:11:39,584 --> 01:11:41,664
for reasons that aren't clear to basically any of us.

1004
01:11:41,704 --> 01:11:43,624
I mean, we've improved eProspera drastically this year.

1005
01:11:43,684 --> 01:11:46,284
It's much easier to use and more reliable,

1006
01:11:46,444 --> 01:11:46,944
things like that.

1007
01:11:46,944 --> 01:11:54,424
So, I mean, part of it is we made the process easier, but I don't have a solid explanation other than people are – like we're getting prominent enough.

1008
01:11:54,544 --> 01:11:59,204
People are hearing about it, and that is, you know, from the top of funnel down to people that actually end up doing it.

1009
01:11:59,264 --> 01:12:03,484
It's just that top line number is increasing enough that the people that make it all the way through the funnel is increasing.

1010
01:12:03,484 --> 01:12:10,584
My observation is as well that the digital nomad community have noticed it recently in the last six months.

1011
01:12:10,584 --> 01:12:18,624
And, you know, Gonzalo Hall, I interviewed him in, um, where is it, Morocco, a few months ago,

1012
01:12:18,624 --> 01:12:24,684
and he was on his way to see you guys and stuff. And I was surprised, because I met him at

1013
01:12:24,684 --> 01:12:30,064
a nomad conference, and I was talking about free cities there, and no one knew anything about

1014
01:12:30,064 --> 01:12:35,604
it, and no one knew anything about Prospera. And Gonzalo had only just heard about it.

1015
01:12:35,604 --> 01:12:38,964
You could see the cogs turning in his mind when I was speaking to him,

1016
01:12:39,564 --> 01:12:46,364
saying, wait a minute, you know, this is exactly what we want, what I want.

1017
01:12:46,424 --> 01:12:46,664
That's right.

1018
01:12:46,664 --> 01:12:53,084
You know, and I think I underplayed the digital nomads in the past

1019
01:12:53,084 --> 01:12:57,024
because I think they're not a big group of people,

1020
01:12:57,624 --> 01:13:02,624
but they do come at an important time, I think.

1021
01:13:02,624 --> 01:13:07,104
And what they're also good at, I think, is focusing attention on a place.

1022
01:13:07,464 --> 01:13:10,924
Even if they don't necessarily stay and live there forever or whatever.

1023
01:13:11,204 --> 01:13:11,504
That's right.

1024
01:13:11,824 --> 01:13:15,604
So, yes, they are a good kind of market fit for what Prospera offers.

1025
01:13:16,344 --> 01:13:19,344
And I'm glad they're kind of – more of them are coming.

1026
01:13:19,484 --> 01:13:22,844
Obviously, I hope we eventually become like the number one digital nomad hub in the world.

1027
01:13:23,484 --> 01:13:30,624
But there's a catch there, which my friend Patri Friedman helped me with – he had this insight and walked me through it.

1028
01:13:30,624 --> 01:13:39,224
And I think he's right, which is if you're trying to build like a durable city with a durable permanent population, you get like a false positive signal from the digital nomads, potentially.

1029
01:13:39,404 --> 01:13:43,864
Like this hasn't happened empirically yet, but it's possible at least, which is you get a bunch of digital nomads coming in and out.

1030
01:13:43,924 --> 01:13:46,264
You're like, oh, we're growing fast and look how many people are here.

1031
01:13:46,384 --> 01:13:51,744
And then they move on to the next – they go to Lisbon or something next, right?

1032
01:13:51,764 --> 01:13:55,304
Because a lot of them move in groups and then they'll stay for a couple months somewhere and then go somewhere else basically.

1033
01:13:55,984 --> 01:14:07,684
So if your goal, which ours is, and many others', is to have a density of people that are there every single day, a density of actual permanent residents, then they are not a perfect fit for that.

1034
01:14:08,024 --> 01:14:10,984
Now, having said that, obviously, I welcome that group with kind of open arms.

1035
01:14:11,144 --> 01:14:15,184
I think that's a perfect fit for kind of their entire ethos, of course.

1036
01:14:16,084 --> 01:14:18,704
But yeah, you need both.

1037
01:14:18,704 --> 01:14:25,944
I want the digital nomads and I want people that will actually move full-time, live full-time, so you get the best of both worlds.

1038
01:14:26,044 --> 01:14:34,064
You get the digital nomads coming in and doing cool stuff and spending time there, and they're doing so when they come, spending time with and making friends with the permanent residents.

1039
01:14:34,544 --> 01:14:36,264
I think that's the key point there.

1040
01:14:36,524 --> 01:14:47,504
I remember a few years ago speaking to people who are working at Prospera saying, no, we're not focusing too much on digital nomads because of these reasons.

1041
01:14:47,504 --> 01:14:52,064
But the truth is, and what I've realized now is, just invite everyone.

1042
01:14:52,064 --> 01:14:53,064
Why wouldn't you?

1043
01:14:53,064 --> 01:14:54,824
Like, why would you limit that?

1044
01:14:54,824 --> 01:15:07,002
Even if they do come and go, it doesn't matter. You just want everyone there. That's exactly right. You want as many people as possible. That's exactly right. Yeah, I mean, Gonzalo taught me that as well. I mean, they talk a lot about funnels.

1045
01:15:07,162 --> 01:15:09,842
I don't know whether you've ever heard them give presentations about,

1046
01:15:10,002 --> 01:15:14,602
basically about how many people come and how many people stay and make families and stuff.

1047
01:15:14,682 --> 01:15:18,662
And according to them, at least, I say them, the digital nomads, you know,

1048
01:15:18,782 --> 01:15:20,022
like they're one group of people.

1049
01:15:20,742 --> 01:15:22,742
There is follow through.

1050
01:15:22,882 --> 01:15:24,422
That's exactly right. That's exactly right.

1051
01:15:24,422 --> 01:15:26,002
A subset of them just end up staying places.

1052
01:15:26,142 --> 01:15:26,862
You're exactly right.

1053
01:15:26,922 --> 01:15:29,402
It's another reason that we invite them and anyone else in.

1054
01:15:30,042 --> 01:15:32,522
And then something else that I didn't talk about in the presentation because I didn't

1055
01:15:32,522 --> 01:15:34,482
have time, but was super exciting and super cool.

1056
01:15:34,862 --> 01:15:38,462
So we did a fashion show a month or two ago, somewhere in there.

1057
01:15:39,202 --> 01:15:40,442
Miss Honduras was there.

1058
01:15:40,442 --> 01:15:45,802
All of the kind of leading Honduran social media influencers and kind of fashion influencers

1059
01:15:45,802 --> 01:15:46,242
were there.

1060
01:15:46,702 --> 01:15:47,882
It was a roaring success.

1061
01:15:48,002 --> 01:15:48,582
Everybody loved it.

1062
01:15:48,822 --> 01:15:53,122
Some partnerships resulted with some fashion schools in the region and in Honduras.

1063
01:15:53,122 --> 01:16:06,262
And that resulted in a group of people wanting to start a fashion district in Prospera, basically, to start creating a culture around that kind of artistic side of innovation through fashion.

1064
01:16:06,802 --> 01:16:08,362
So they're kind of working on that now.

1065
01:16:08,682 --> 01:16:13,422
And some interesting kind of groups from all over the place are getting involved.

1066
01:16:14,262 --> 01:16:15,042
There will be an event.

1067
01:16:15,342 --> 01:16:17,662
They haven't announced it yet, so we won't go into detail.

1068
01:16:17,722 --> 01:16:19,202
But there will be a cool event related to this.

1069
01:16:19,802 --> 01:16:22,962
It won't be in Prospera, but it will be involving Prospera pretty heavily.

1070
01:16:23,122 --> 01:16:29,342
with some like major fashion brands later, early next year that a friend of mine is pulling

1071
01:16:29,342 --> 01:16:33,022
together that I'm working on with him. So there's all sorts of cool kind of stuff like that in the

1072
01:16:33,022 --> 01:16:37,162
works. So it's not just the kind of, you know, air quotes, hardcore entrepreneurship as well.

1073
01:16:37,322 --> 01:16:41,022
There's also kind of the culture making and the fashion things that have started sort of moving

1074
01:16:41,022 --> 01:16:46,042
more. You know, we signed a sister city agreement with Stark City in the United States. So we have

1075
01:16:46,042 --> 01:16:50,102
a sister city agreement. These are basically, they don't like do anything legally other than saying

1076
01:16:50,102 --> 01:16:54,522
like, hey, we have similar mindsets and populations, and we'd love to share businesses and residents

1077
01:16:54,522 --> 01:16:56,102
and have people go back and forth, more or less.

1078
01:16:56,482 --> 01:16:56,662
Really?

1079
01:16:57,622 --> 01:16:58,322
What's it called?

1080
01:16:58,502 --> 01:16:59,062
Stark City?

1081
01:16:59,202 --> 01:16:59,962
Stark City.

1082
01:17:00,142 --> 01:17:00,482
Where is that?

1083
01:17:00,502 --> 01:17:01,402
It's a city in the US.

1084
01:17:02,282 --> 01:17:03,882
It's either in Florida or Tennessee.

1085
01:17:04,062 --> 01:17:06,122
The exact state is escaping my memory right now.

1086
01:17:06,122 --> 01:17:09,702
So the schools will do a swap, where the students will come and see.

1087
01:17:10,042 --> 01:17:10,542
Stuff like that.

1088
01:17:10,982 --> 01:17:11,442
Stuff like that.

1089
01:17:11,442 --> 01:17:13,422
Twin towns, we used to call them.

1090
01:17:13,442 --> 01:17:14,922
Yeah, exactly the same concept.

1091
01:17:15,002 --> 01:17:15,902
Exactly the same concept.

1092
01:17:15,982 --> 01:17:16,602
That's exactly right.

1093
01:17:17,682 --> 01:17:19,942
So yeah, we've had a – it's been a really good year.

1094
01:17:19,942 --> 01:17:23,602
A lot of cool companies coming, and then we have some more interesting stuff coming.

1095
01:17:23,682 --> 01:17:30,102
So I've been working with some of the leading AI research and AI safety scholars to develop an AI agent statute.

1096
01:17:31,022 --> 01:17:37,282
There's a problem in the AI space right now, which is the agents are finally getting capable enough to do kind of semi-autonomous things for you.

1097
01:17:37,802 --> 01:17:43,822
But the problem is their legal status is super unclear, and no one wants to give them personhood, so we're not doing that for sure.

1098
01:17:44,402 --> 01:17:49,342
There's, in fact, a bunch of US states, and the EU even, that have passed laws that explicitly say we will never give AI personhood,

1099
01:17:49,342 --> 01:17:55,602
like proactively, because it's very philosophically complicated. But setting up a

1100
01:17:55,602 --> 01:18:00,162
liability and legal framework to allow people to actually do like what I call legally significant

1101
01:18:00,162 --> 01:18:04,482
actions, let an AI do legally significant actions on their behalf or the behalf of their company,

1102
01:18:04,562 --> 01:18:09,202
things like negotiate contracts, whatever the case may be, partnership agreements, you know,

1103
01:18:09,202 --> 01:18:13,722
whatever, moving or managing money. There's not a framework for that right now. So it's kind of

1104
01:18:13,722 --> 01:18:16,882
unclear how those things should be handled. It's assumed some of them are just default illegal.

1105
01:18:16,882 --> 01:18:20,622
and there's no place in the world where there's a legal framework where you can actually do this

1106
01:18:20,622 --> 01:18:24,462
and experiment with deploying AI agents in this kind of more forward-thinking, semi-autonomous

1107
01:18:24,462 --> 01:18:28,322
or fully autonomous way. So we're working on some enabling legislation to do that,

1108
01:18:28,402 --> 01:18:31,522
hopefully it'll be done, you know, Q1 of next year. I've been getting a bunch of really good feedback

1109
01:18:31,522 --> 01:18:36,602
from some of the leading people in the AI world and at like, you know, places like the AI Law and

1110
01:18:36,602 --> 01:18:41,102
Safety Institute and some other places. So we'll have that going on. I'm hoping at least to use

1111
01:18:41,102 --> 01:18:44,302
that to then attract the attention of some of the frontier labs so they can do some of their

1112
01:18:44,302 --> 01:18:48,102
more kind of agentic research within Prospera, actually, to be able to leverage the legal

1113
01:18:48,102 --> 01:18:51,822
system to create legal entities and then, you know, do interesting stuff, leveraging the

1114
01:18:51,822 --> 01:18:52,022
AI.

1115
01:18:52,202 --> 01:18:53,842
What does that mean, agentic?

1116
01:18:54,022 --> 01:18:56,462
You mean AI experimentation?

1117
01:18:58,142 --> 01:18:59,262
Experimentation is almost true.

1118
01:18:59,262 --> 01:18:59,642
Oh, whatever, like training?

1119
01:18:59,842 --> 01:19:00,462
Yeah, yeah.

1120
01:19:00,602 --> 01:19:03,662
So, I mean, people are just doing that on their servers, aren't they?

1121
01:19:03,762 --> 01:19:05,222
Like, is it?

1122
01:19:05,422 --> 01:19:08,302
Yeah, the important piece here is the legally significant action part.

1123
01:19:08,362 --> 01:19:12,062
So, like, you don't need a piece of legislation to have Claude go organize your inbox or send

1124
01:19:12,062 --> 01:19:13,762
an email for you or something, right?

1125
01:19:14,302 --> 01:19:28,382
But right now, if, for example, let's say I have an AI agent that is negotiating supply contracts. Let's say I'm a company that buys things. I need some widgets. And I have an AI agent that is now my supply logistics manager, for example.

1126
01:19:28,382 --> 01:19:34,942
Like that's within the capability envelope of a lot of existing systems and especially the near future systems that for sure will be.

1127
01:19:35,742 --> 01:19:46,382
There's no like legally clear rules right now on what happens if like my AI agent in negotiating with you defrauds you somehow, whether intentionally or unintentionally.

1128
01:19:46,622 --> 01:19:49,882
Right. It's just default like my fault or my company's fault.

1129
01:19:49,962 --> 01:19:52,322
And there's no way to kind of segregate these things.

1130
01:19:52,322 --> 01:19:57,242
So while you could theoretically do it right now, it's at legal risk to you and your company.

1131
01:19:57,882 --> 01:20:02,242
And so developing a reasonable framework around this that involves requiring liability insurance

1132
01:20:02,242 --> 01:20:04,762
if you're going to use these things at different levels depending on what they're doing,

1133
01:20:05,282 --> 01:20:06,722
requiring bonding, right?

1134
01:20:06,762 --> 01:20:07,902
So you have to post a bond basically.

1135
01:20:07,902 --> 01:20:10,942
That means that people that are harmed can get paid out immediately if they're harmed

1136
01:20:10,942 --> 01:20:12,022
by the agent, things like that.

1137
01:20:12,842 --> 01:20:15,902
The lack of clarity here is another key thing I've learned at Prospera, by the way.

1138
01:20:16,482 --> 01:20:20,622
Regulatory clarity is as important as like the actual underlying meat of the regulation.

1139
01:20:21,122 --> 01:20:22,822
There's no regulatory clarity here for anyone at all.

1140
01:20:22,982 --> 01:20:26,622
So the default nature of businesses is like, well, I don't have regulatory clarity.

1141
01:20:26,622 --> 01:20:27,642
I'm just not going to do it.

1142
01:20:27,842 --> 01:20:28,722
Like it's not worth it.

1143
01:20:28,762 --> 01:20:32,222
It's too high risk unless you're in a very specific kind of field or type of entrepreneur.

1144
01:20:32,522 --> 01:20:36,202
So this, at least what I'm working on, is meant to create that space where you have the legal

1145
01:20:36,202 --> 01:20:36,602
clarity.

1146
01:20:36,962 --> 01:20:41,422
Now you could try something like a supply management agent or something and see how it

1147
01:20:41,422 --> 01:20:45,562
works and know full well if things go awry, both you and your counterparty are completely

1148
01:20:45,562 --> 01:20:49,482
covered and insured, either through an insurance company or through a bonded escrow.

1149
01:20:49,482 --> 01:20:59,642
Is there any quick and easy way to explain why they don't want to give personhood to AI agents?

1150
01:20:59,642 --> 01:21:03,622
Yeah, so part of it is emotional and part of it's philosophical. I've been very deep down this rabbit

1151
01:21:03,622 --> 01:21:09,662
hole recently, so I'll try to keep this short. On the emotional side, people are extremely

1152
01:21:09,662 --> 01:21:16,022
reticent to give any sort of artificial, silicon-based form of consciousness

1153
01:21:16,022 --> 01:21:22,242
effectively personhood. It just emotionally feels wrong. Like, no, I am a human. I'm a biological

1154
01:21:22,242 --> 01:21:27,762
being. Like I am conscious. I have subjectivity. Like that cannot apply to like a different

1155
01:21:27,762 --> 01:21:32,902
substrate. There's just an emotional, gut-level sense that it feels wrong,

1156
01:21:32,902 --> 01:21:37,282
something that's hard to even put into words. And then on the philosophical side,

1157
01:21:37,502 --> 01:21:43,362
you immediately run headlong into one of the largest unsolved problems in philosophy,

1158
01:21:43,362 --> 01:21:49,262
which is called the hard problem of consciousness. Okay, what does it mean to be,

1159
01:21:49,262 --> 01:21:53,742
for legal purposes, and for philosophical and ethical purposes,

1160
01:21:54,342 --> 01:21:58,402
a person, and to have all of the rights and responsibilities and moral and ethical

1161
01:21:58,402 --> 01:22:02,042
duties that come along with that. And that is not an easy question to answer. It's an easy

1162
01:22:02,042 --> 01:22:05,242
question to answer when you just have humans to deal with as intelligent, agentic actors.

1163
01:22:05,562 --> 01:22:11,182
But now you will have artificial software, right, you will have code that is intelligent, aware,

1164
01:22:11,182 --> 01:22:16,742
and agentic. So answering that question of like, what does that actually mean? What does it mean

1165
01:22:16,742 --> 01:22:22,602
to be a person? What does it mean to be alive? What does it mean to have the like ethical and

1166
01:22:22,602 --> 01:22:28,762
moral duties that come along with having that designation? It runs headlong into just a thicket

1167
01:22:28,762 --> 01:22:31,542
of philosophical problems. The hard problem of consciousness is at the core of it all,

1168
01:22:31,782 --> 01:22:37,202
which, to put it in its most extreme form, is this: it is by definition, in principle,

1169
01:22:37,202 --> 01:22:44,682
logically impossible to verify whether another entity or being is conscious, because you could have

1170
01:22:44,682 --> 01:22:50,402
what are called p-zombies. A p-zombie, ignore the name, the "p" stands for

1171
01:22:50,402 --> 01:22:55,162
"philosophical": philosophical zombies is what that refers to. So you could theoretically have a world with p-

1172
01:22:55,162 --> 01:22:58,422
zombies. So let me sketch out the thought experiment for you. This is the core

1173
01:22:58,422 --> 01:23:03,242
of the hard problem of consciousness. Basically, assume you have me, and then a

1174
01:23:03,242 --> 01:23:06,962
perfect clone of me sitting next to me, except this clone is run by an AI.

1175
01:23:06,962 --> 01:23:09,902
The AI, for whatever reason, we're going to make some assumptions, right, has

1176
01:23:09,902 --> 01:23:13,002
ingested all of the data about me, from the specifics of the state

1177
01:23:13,002 --> 01:23:25,120
of my brain to my history and background and whatever. So it is perfectly like me in every single way, except you and I both know for an ironclad fact it doesn't have a brain; there's a chip up there instead. Okay? And it is

1178
01:23:25,120 --> 01:23:29,720
not alive by any common-sense definition of alive and a person and

1179
01:23:29,720 --> 01:23:35,160
conscious. It's just an algorithm, you know, running some very complicated math.

1180
01:23:35,280 --> 01:23:40,800
The result of which is it looks and acts exactly like me, right? It is impossible for you to be

1181
01:23:40,800 --> 01:23:44,420
able to know for a fact, if both of us were sitting in front of you, which is which. It's

1182
01:23:44,420 --> 01:23:47,880
completely impossible. So this really gets to the core of the hard problem of consciousness.

1183
01:23:48,040 --> 01:23:53,500
It's impossible to verify in principle if another entity is conscious. Why? Because consciousness is

1184
01:23:53,500 --> 01:24:01,000
subjectivity. It is that internal felt sense of a continuous stream of existence, right?

1185
01:24:01,420 --> 01:24:06,380
That there is something it is like to be you, is the way Sam Harris often describes this, right? The hard

1186
01:24:06,380 --> 01:24:11,800
problem of consciousness. And that it is impossible for me to even verify that you're conscious. Now,

1187
01:24:11,800 --> 01:24:17,140
obviously I'm not saying you're not. The point is that it is impossible

1188
01:24:17,140 --> 01:24:21,980
through any scientific measurement or otherwise for me to somehow externally empirically validate

1189
01:24:21,980 --> 01:24:28,100
that you have a subjectivity, that you experience qualia, that you have a continued felt sense of

1190
01:24:28,100 --> 01:24:35,180
self, right? That is in principle impossible. So just asserting in a law that AI are people,

1191
01:24:35,180 --> 01:24:39,160
even if you can get over that emotional hump that I mentioned earlier, runs into: all right, well, what

1192
01:24:39,160 --> 01:24:43,140
does that mean and how do we verify that? What is the difference between that and like the way

1193
01:24:43,140 --> 01:24:48,480
Claude 4.5 works right now? What are the systems for that? How do we define it? Even just literally

1194
01:24:48,480 --> 01:24:54,640
the definition of "person" in that sense is so far a completely unsolved philosophical problem.

1195
01:24:55,180 --> 01:24:58,400
Obviously, there's a bunch of heated debate about this in the kind of academic philosophy world,

1196
01:24:58,560 --> 01:25:01,940
but the knee-jerk reaction from legislators has basically been:

1197
01:25:01,940 --> 01:25:04,780
we don't want to deal with this. We're just going to stop it before it becomes a thing.

1198
01:25:05,740 --> 01:25:17,200
I suppose the best way to test all that out is in the digital world, like avatars in the digital world, because that's much more simple.

1199
01:25:17,380 --> 01:25:25,020
Like if I murder you in the digital world, you know, if we are avatars, am I committing some kind of a crime here?

1200
01:25:25,440 --> 01:25:25,840
Exactly.

1201
01:25:26,020 --> 01:25:26,660
I mean, am I?

1202
01:25:26,660 --> 01:25:29,000
But we can't ever know in principle.

1203
01:25:29,420 --> 01:25:31,200
And the thing is, this is like really hard to do.

1204
01:25:31,280 --> 01:25:32,400
So you would think, well, this is software.

1205
01:25:32,520 --> 01:25:33,220
We can look at how it works.

1206
01:25:33,220 --> 01:25:33,480
It's code.

1207
01:25:33,480 --> 01:25:37,880
We can look at how the code runs and at some point in there figure out kind of what makes you conscious.

1208
01:25:38,280 --> 01:25:41,260
But then you run into the problem of like, oh, we don't even know what makes humans conscious.

1209
01:25:41,440 --> 01:25:43,600
We don't actually understand consciousness at a very fundamental level.

1210
01:25:44,000 --> 01:25:56,800
There are serious theories in both philosophy and physics, actually, for what's called panpsychism, which is the idea that consciousness is a substrate the same way gravity or the space-time field is a substrate of reality.

1211
01:25:57,140 --> 01:25:59,940
Maybe consciousness is a substrate of reality in some fundamental sense.

1212
01:26:00,180 --> 01:26:02,120
We have no idea how this works literally at all.

1213
01:26:02,120 --> 01:26:05,620
Uh, we have a bunch of crazy ideas about it that are impossible to verify or test.

1214
01:26:05,740 --> 01:26:07,960
I'm going to give you a perfect example of this.

1215
01:26:08,060 --> 01:26:12,040
Literally yesterday, I was reading this paper that came out yesterday from Anthropic,

1216
01:26:12,200 --> 01:26:15,880
which makes Claude, from their interpretability research team.

1217
01:26:15,880 --> 01:26:19,160
So they do research on how these things actually

1218
01:26:19,160 --> 01:26:25,300
work. And they have a tool where they can suppress the activation of specific parts

1219
01:26:25,300 --> 01:26:29,420
of the neural network to suppress different kinds of perceptions and concepts in the models.

1220
01:26:29,420 --> 01:26:49,060
And so they found, fascinatingly, if you suppress the deception part of their neural network, so the part of their neural networks that we know for a fact is associated with the concept of deception and doing deception, then the model's claims about its own consciousness drastically increase by a gigantic margin.

1221
01:26:49,060 --> 01:26:55,820
So again: decreasing deception, suppressing deception, in other words forcing them to be honest

1222
01:26:55,820 --> 01:27:00,520
in some very fundamental sense drastically increases how often the model claims it is

1223
01:27:00,520 --> 01:27:06,020
conscious, has subjectivity, is aware, etc. We don't know what this means. And to

1224
01:27:06,020 --> 01:27:09,720
be clear, I'm not claiming Claude is conscious, right? But there's something there, and we don't

1225
01:27:09,720 --> 01:27:12,200
even know what to do with that. What do you do with that? The researchers themselves were like,

1226
01:27:12,200 --> 01:27:17,400
here's the evidence; I'm not really sure what this means, to be honest. It's just a very odd space.

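For intuition, the kind of intervention described, suppressing a model's internal representation of a concept, is often done by removing the component of a hidden activation along a learned "feature direction". The sketch below is a drastically simplified stand-in, not Anthropic's actual method or code, and the vectors are made up.

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def suppress_feature(hidden, direction, strength=1.0):
    """Remove (or dampen) the component of `hidden` along `direction`."""
    norm = math.sqrt(dot(direction, direction))
    d = [x / norm for x in direction]     # unit feature direction
    coeff = strength * dot(hidden, d)     # how much of the feature is present
    return [h - coeff * di for h, di in zip(hidden, d)]

deception_dir = [0.2, -1.0, 0.5, 0.3]  # invented "deception" feature direction
h = [1.0, 2.0, -0.5, 0.7]              # invented hidden state

# After full suppression, the hidden state has no component along the feature.
h_steered = suppress_feature(h, deception_dir)
```

In the real work, such directions come from probes or sparse autoencoders trained on the model's activations; the sketch only shows the geometry of the intervention.
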
1227
01:27:17,400 --> 01:27:20,880
So that's why when I first started doing this, I was like, oh, cool.

1228
01:27:20,920 --> 01:27:21,720
We'll knock this out in like a week.

1229
01:27:21,780 --> 01:27:22,560
This is super clear.

1230
01:27:23,080 --> 01:27:24,500
And I started getting into it.

1231
01:27:24,540 --> 01:27:29,400
I was like, oh, God, this is one of the hardest unsolved problems in philosophy right now.

1232
01:27:29,420 --> 01:27:30,740
I don't want to touch this with a 10-foot pole.

1233
01:27:30,740 --> 01:27:38,240
That's what I use AI for the most, actually. Especially if I'm declaring something philosophical,

1234
01:27:38,600 --> 01:27:41,880
I always put it into ChatGPT and say, look, rip me apart.

1235
01:27:42,020 --> 01:27:43,700
Tell me where I'm going wrong here.

1236
01:27:43,700 --> 01:27:58,160
And I've ended up having some very interesting conversations with myself, essentially. And one of them ended up with me trying to understand how the AI was working stuff out.

1237
01:27:58,160 --> 01:28:05,160
And I, because I'd read that there was a point where a lot of people kind of

1238
01:28:05,160 --> 01:28:10,060
didn't really understand how it worked. You know, they understand the algorithm, but they don't know

1239
01:28:10,060 --> 01:28:17,560
why it was coming to some solutions. And I said to the AI, well, so what are you

1240
01:28:17,560 --> 01:28:25,320
good at that's unusual? And it said, well, I can say: red is to a strawberry

1241
01:28:25,320 --> 01:28:35,820
as... and then something completely weird and random. I can't even think of the

1242
01:28:35,820 --> 01:28:42,040
example it gave me, but it clicked. It was very good. And interestingly, I think I'm quite

1243
01:28:42,040 --> 01:28:49,920
good at that. My own brain, if you tell me a bunch of stuff, functions in a

1244
01:28:49,920 --> 01:28:55,340
way that it thinks about other versions of that thing which are completely unrelated.

1245
01:28:55,340 --> 01:29:03,040
And that's what the AI was good at, but it didn't know why or how.

1246
01:29:03,040 --> 01:29:08,500
That's it. I was trying to work out whether it had any original thoughts, and I was trying to,

1247
01:29:08,500 --> 01:29:14,980
you know, because, what we were doing was thought experiments. Me and the AI were

1248
01:29:14,980 --> 01:29:20,340
doing thought experiments, and then I was googling them to see if anyone had ever written or said

1249
01:29:20,340 --> 01:29:25,380
this before. And for most of them there was absolutely no record of any of this. And I'm

1250
01:29:25,380 --> 01:29:30,220
saying, right, so are we getting any original thoughts here? And it explained to me, look,

1251
01:29:30,220 --> 01:29:37,720
what I can do is take a concept that you understand. Like I say, it wasn't red is to

1252
01:29:37,720 --> 01:29:43,260
a strawberry, it was one thing being a function of something. So, you know, movement is to a

1253
01:29:43,260 --> 01:29:52,100
wheel as a cushion is to... and it would be the same kind of

1254
01:29:52,580 --> 01:29:59,000
thing. It's very hard to explain what it was, and it was absolutely fascinating.

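The "X is to Y as Z is to ..." completion being described resembles the classic word-vector analogy trick: answer ≈ X − Y + Z by cosine similarity. Below is a toy version with invented 3-number "embeddings"; real models learn thousands of dimensions, and modern LLMs don't literally do this arithmetic, so treat it purely as intuition.

```python
import math

# Invented toy embeddings; the dimensions roughly mean
# [vehicle-part-ness, motion-ness, comfort-ness].
vecs = {
    "wheel":    [1.0, 1.0, 0.0],
    "movement": [0.0, 1.0, 0.0],
    "cushion":  [0.0, 0.0, 1.0],
    "comfort":  [0.0, 0.1, 1.0],
    "banana":   [0.1, 0.0, 0.1],
}

def cosine(u, v):
    d = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return d / (nu * nv)

def analogy(a, b, c):
    """Answer 'a is to b as ? is to c' via the vector a - b + c."""
    target = [x - y + z for x, y, z in zip(vecs[a], vecs[b], vecs[c])]
    candidates = [w for w in vecs if w not in (a, b, c)]
    return max(candidates, key=lambda w: cosine(target, vecs[w]))

print(analogy("movement", "wheel", "cushion"))  # comfort
```

With these made-up vectors, "movement is to a wheel as ... is to a cushion" resolves to "comfort", which is the shape of completion the conversation describes.
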
1255
01:29:59,000 --> 01:30:06,380
Like the other day, I mean, God knows where this thing's going, but I always fact-

1256
01:30:06,380 --> 01:30:12,440
check my own podcasts afterwards, because I have to. I end up talking about a lot of things

1257
01:30:12,440 --> 01:30:18,180
over and over again, and I often recount experiences from my past, and I go back and check them to

1258
01:30:18,180 --> 01:30:23,340
make sure that I wasn't bullshitting, because you know how things go. Oh yeah, and on this: the last

1259
01:30:23,340 --> 01:30:27,840
episode that I put out was a very deep and philosophical one, about spirituality and about

1260
01:30:27,840 --> 01:30:35,800
hallucinogenic drugs and all kinds of stuff. And I talked about some mythological experience,

1261
01:30:35,800 --> 01:30:39,020
sort of, it's a mythology from Indonesia.

1262
01:30:39,680 --> 01:30:41,260
This is very random.

1263
01:30:42,100 --> 01:30:46,420
There's a sort of deity called Nyai Roro Kidul,

1264
01:30:47,000 --> 01:30:51,060
who lives in the South Seas of Java.

1265
01:30:51,280 --> 01:30:53,680
Like, no one knows about this shit except a few people

1266
01:30:53,680 --> 01:30:56,080
and the whole population of Java, obviously, right?

1267
01:30:56,560 --> 01:31:02,300
And the person I was talking to recounted something to do with ayahuasca

1268
01:31:02,300 --> 01:31:07,640
and a deity involved with ayahuasca.

1269
01:31:07,840 --> 01:31:12,280
And then I started talking about Shakespeare's A Midsummer Night's Dream

1270
01:31:12,280 --> 01:31:16,700
because in the same way that I felt, hey, there's a connection here.

1271
01:31:17,420 --> 01:31:19,320
And I told the AI this and I said,

1272
01:31:19,460 --> 01:31:22,060
is there a connection between any of these things?

1273
01:31:22,180 --> 01:31:23,760
I mean, these are so random.

1274
01:31:23,960 --> 01:31:26,020
One's from South America, one's from...

1275
01:31:26,020 --> 01:31:27,560
And it came up and he said, yeah.

1276
01:31:27,740 --> 01:31:30,100
He said, this is a very astute observation.

1277
01:31:30,500 --> 01:31:31,540
And do you know what it ended with?

1278
01:31:31,640 --> 01:31:32,160
It was amazing.

1279
01:31:32,160 --> 01:31:45,338
It said, do you want me to write a poem about that? And I went, fucking... I went, yes. And it wrote me a beautiful poem that wove all these mythological stories together. I mean, Jesus Christ.

1280
01:31:45,358 --> 01:31:45,878
It's incredible.

1281
01:31:46,038 --> 01:31:46,458
What the?

1282
01:31:46,738 --> 01:31:47,238
It's incredible.

1283
01:31:47,438 --> 01:31:48,038
Where is this going?

1284
01:31:48,278 --> 01:31:48,438
Yeah.

1285
01:31:48,438 --> 01:32:03,378
Like, what can I do with my life anymore? What can I? I couldn't sit and write... I mean, funnily enough, it wrote a kind of rhyming poem first. And I laughed and I said, come on, dude, write me something. Come on, don't do that. I don't want rhyming poems. And then it did.

1286
01:32:03,378 --> 01:32:05,158
it wrote something beautiful.

1287
01:32:05,558 --> 01:32:08,678
I don't understand what we're going to end up doing ourselves.

1288
01:32:08,858 --> 01:32:10,578
They're genuinely deeply intelligent.

1289
01:32:10,758 --> 01:32:12,538
And what's even crazier than that is most people,

1290
01:32:12,618 --> 01:32:13,718
and I'm speaking about you specifically here,

1291
01:32:13,818 --> 01:32:15,658
but most people don't even realize that

1292
01:32:15,658 --> 01:32:19,658
even if you're on the plus subscription for OpenAI, for example, for GPT,

1293
01:32:20,118 --> 01:32:21,818
the default auto setting,

1294
01:32:22,078 --> 01:32:24,618
which is when you just open a new chat, just automatically on auto,

1295
01:32:25,118 --> 01:32:26,718
what that actually does is it takes your query

1296
01:32:26,718 --> 01:32:28,258
and it routes it to this thing called a router,

1297
01:32:28,718 --> 01:32:31,118
which is a smaller model that then very quickly decides

1298
01:32:31,118 --> 01:32:32,458
based on the content of your query,

1299
01:32:32,458 --> 01:32:38,758
which other model to send it to. Now, this is a very clever way for OpenAI to save on inference

1300
01:32:38,758 --> 01:32:43,318
costs, right? The cost of actually running the model, which is gigantic, because they can send

1301
01:32:43,318 --> 01:32:49,018
very simple queries to GPT-5 mini, which is a quantized distilled version of GPT-5. And it's a

1302
01:32:49,018 --> 01:32:53,818
much smaller model, cheaper to do inference. It's also much dumber, okay? It might send your query

1303
01:32:53,818 --> 01:32:58,678
to the GPT-5 thinking low, which is like the lowest reasoning effort in the API, basically.

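The router mechanic just described can be sketched as a toy dispatcher: a cheap check decides which model tier a query goes to. The tier names and the keyword heuristic are invented for illustration; the real router is itself a small trained model, not hand-written rules.

```python
# Toy sketch of the "auto" routing idea: cheap triage before expensive inference.
# Tier names and the heuristic are illustrative assumptions, not OpenAI's design.

def route(query: str) -> str:
    """Pick a model tier for a query with a crude length/keyword heuristic."""
    hard_words = {"prove", "derive", "debug", "optimize", "architecture"}
    words = query.lower().split()
    if any(w in hard_words for w in words):
        return "large-model-high-reasoning"  # expensive, slow, smart
    if len(words) > 40:
        return "large-model-low-reasoning"
    return "small-distilled-model"           # cheap, fast inference

print(route("What's the capital of France?"))        # small-distilled-model
print(route("Prove this bound on the eigenvalues"))  # large-model-high-reasoning
```

The economic logic is the one described above: most queries never touch the big model, which is where the inference savings come from.
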
1304
01:32:58,778 --> 01:33:02,358
But that's like the most you're going to get. But then if you put it on thinking, so it's in the

1305
01:33:02,358 --> 01:33:07,338
drop-down menu where you select the model, you literally immediately gain between

1306
01:33:07,338 --> 01:33:12,698
20 and 40 IQ points. And then if you're on the Pro subscription, you can set thinking to different

1307
01:33:12,698 --> 01:33:16,058
levels. I am, too, because I use it for work all the time; it has massively increased my

1308
01:33:16,058 --> 01:33:21,238
productivity. So you can increase it there. And then GPT-5 Pro, which is

1309
01:33:21,238 --> 01:33:25,458
the absolute smartest model any of us normal people that don't work inside of OpenAI

1310
01:33:25,458 --> 01:33:31,578
have access to, is phenomenal. It is genuinely the smartest artificial thing we've

1311
01:33:31,578 --> 01:33:39,178
ever cooked up. The smartest kind of artificial thing ever, basically. And it is regularly doing

1312
01:33:39,178 --> 01:33:43,438
similar things. So for example, a famous mathematician from I think Stanford, I can't

1313
01:33:43,438 --> 01:33:47,058
remember, I'd have to go look at the tweet, but he was saying: oh, I use GPT-5 Pro when I'm

1314
01:33:47,058 --> 01:33:50,638
coming up with papers all the time. He was like, now often, it's not like it's coming up with my

1315
01:33:50,638 --> 01:33:54,358
ideas. But I will just pitch over the fence, the idea I'm having about a particular mathematical

1316
01:33:54,358 --> 01:33:59,278
concept that's novel, like he's breaking new ground in math. He'll have

1317
01:33:59,278 --> 01:34:02,318
GPT-5 Pro formalize it into a proof, explain the proof, et cetera.

1318
01:34:02,798 --> 01:34:03,738
Said sometimes it'll get it wrong.

1319
01:34:03,918 --> 01:34:04,618
Sometimes it'll get it right.

1320
01:34:05,018 --> 01:34:09,598
But there's been multiple times now where within a conversation like this with GPT-5

1321
01:34:09,598 --> 01:34:15,478
Pro, he and the AI together in a very collaborative way, in a way that he has said himself he

1322
01:34:15,478 --> 01:34:19,178
would have never come to by himself, will come up with a new novel proof or theorem

1323
01:34:19,178 --> 01:34:21,358
that solves a longstanding problem in math.

1324
01:34:21,398 --> 01:34:22,378
And he does this like regularly.

1325
01:34:22,458 --> 01:34:23,478
It's how he writes all his papers now.

1326
01:34:23,918 --> 01:34:28,698
Just this week, this person was struggling with a very niche medical issue that is unsolved

1327
01:34:28,698 --> 01:34:34,458
in medicine today. And they threw it over the fence to GPT-5 Pro, said like, you know, just

1328
01:34:34,458 --> 01:34:39,658
think deeply about this. Give me anything that's off the wall that might work. This was a couple

1329
01:34:39,658 --> 01:34:44,158
months ago when they did this. It proposed a novel treatment therapy and explained how it should work

1330
01:34:44,158 --> 01:34:47,838
with an existing drug that could be applied in a different way in a different dosage for their

1331
01:34:47,838 --> 01:34:52,878
particular problem. And a paper was published this week by some researchers that saw this a couple

1332
01:34:52,878 --> 01:34:57,338
months ago, went and tried it, and it worked. It's a totally novel treatment that GPT-5 Pro just came

1333
01:34:57,338 --> 01:35:02,838
up with de novo. DeepMind at Google has several specific models that have come up with all sorts

1334
01:35:02,838 --> 01:35:07,218
of new stuff, and they've solved protein folding completely. They have some frontier math models. I

1335
01:35:07,218 --> 01:35:12,958
am at this point probably 60% sure, based on some of what Demis, who's the founder and CEO

1336
01:35:12,958 --> 01:35:18,078
of DeepMind, has said publicly, that they either have solved or are very

1337
01:35:18,078 --> 01:35:22,818
close to solving the Navier-Stokes equations, which is one of the Millennium Prize Problems. So

1338
01:35:22,818 --> 01:35:26,258
the Millennium Prize Problems are some of the largest, most important unsolved problems in

1339
01:35:26,258 --> 01:35:30,618
math and physics. And they've been unsolved for centuries, like the entirety of human history,

1340
01:35:30,678 --> 01:35:34,998
basically, once we found them. So Navier-Stokes in particular deals with, without boring people,

1341
01:35:35,098 --> 01:35:39,678
it deals with fluid dynamics and fluid mechanics, okay? It's very hard to model fluid mechanics,

1342
01:35:39,738 --> 01:35:43,438
whether that be like aerodynamics, hydrodynamics, like how water flows through things, how air

1343
01:35:43,438 --> 01:35:47,758
flows around things. It's next to impossible. We have equations that approximate it pretty well,

1344
01:35:47,758 --> 01:35:52,938
which is why you can have CGI and airplanes that work, that sort of stuff. But the second you

1345
01:35:52,938 --> 01:35:56,998
actually deterministically solve it, once you solve the Navier-Stokes equation, we now have like

1346
01:35:56,998 --> 01:36:01,658
perfect command and control of all fluid dynamics, which means suddenly problems that, for example,

1347
01:36:01,778 --> 01:36:05,318
have kept us from fusion energy, right? So the reason like fusion reactors are very difficult

1348
01:36:05,318 --> 01:36:11,338
and don't work yet is because getting the magnetic fields correct to contain the plasma that is in

1349
01:36:11,338 --> 01:36:15,478
the core of a fusion reactor and have it be net positive, meaning it outputs more energy than it

1350
01:36:15,478 --> 01:36:20,738
takes to start it basically, is a direct fluid dynamic problem that is so complex the world's

1351
01:36:20,738 --> 01:36:25,138
largest supercomputers cannot solve it today. If you have a solution to Navier Stokes, it is

1352
01:36:25,138 --> 01:36:29,718
trivially easy. You just run it once and it's done. Problem solved completely forever. Fusion

1353
01:36:29,718 --> 01:36:34,038
energy suddenly becomes like, it goes from impossible to like, just build it next year.

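For a feel of why fluid behavior is approximated numerically rather than solved in closed form, here is the simplest possible sketch: an explicit finite-difference march of 1-D diffusion, a crude stand-in for the viscous term in the Navier-Stokes equations. Real CFD does something like this in 3-D over enormous grids, which is where the supercomputer cost comes from; the grid size and constants here are arbitrary.

```python
# Minimal numerical-approximation sketch: march du/dt = nu * d2u/dx2
# forward in time on a 1-D grid (explicit finite differences).
# This is a toy stand-in for viscosity, not a Navier-Stokes solver.

def diffuse(u, nu=0.1, dx=1.0, dt=1.0, steps=50):
    """Advance a 1-D field `u` by `steps` explicit diffusion steps."""
    u = list(u)
    for _ in range(steps):
        new = u[:]
        for i in range(1, len(u) - 1):
            new[i] = u[i] + nu * dt / dx**2 * (u[i + 1] - 2 * u[i] + u[i - 1])
        u = new
    return u

# A sharp velocity spike smears out over time, as viscosity does physically.
u0 = [0.0] * 10 + [1.0] + [0.0] * 10
u = diffuse(u0)
print(max(u) < 1.0)  # True: the peak has diffused
```

Every step here is an approximation with its own stability limits; a closed-form solution would replace this whole loop with a formula you evaluate once, which is the "run it once and it's done" contrast drawn above.
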
1354
01:36:34,158 --> 01:36:38,258
It's trivially easy. There's a bunch of similar problems like that. You solve superconductors

1355
01:36:38,258 --> 01:36:44,838
immediately and a bunch of other like core, just sci-fi technological unlocks. And Demis has said

1356
01:36:44,838 --> 01:36:47,858
publicly that all he said about it is, you know, we'll have an announcement next year, but we've

1357
01:36:47,858 --> 01:36:52,398
made great progress on some very vexing problems, and he has mentioned Navier-Stokes a couple of times.

1358
01:36:52,398 --> 01:36:56,158
It would change the world immediately. It unleashes a whole new class of literally sci-fi

1359
01:36:56,158 --> 01:37:01,238
technologies: force fields, laser weapons, a bunch of stuff that we have rudimentary, crappy

1360
01:37:01,238 --> 01:37:07,178
versions of suddenly becomes trivially easy to do. And they have an AI model that's doing it,

1361
01:37:07,178 --> 01:37:16,298
that's solving these things. I don't know. I can't believe it, really. I don't know.

1362
01:37:16,298 --> 01:37:18,998
I don't know what to think about it because I,

1363
01:37:18,998 --> 01:37:23,058
I just know that it's better than,

1364
01:37:23,378 --> 01:37:23,978
better than me.

1365
01:37:24,358 --> 01:37:28,338
And the way I'm using it at the moment is collaboratively.

1366
01:37:28,398 --> 01:37:28,698
Definitely.

1367
01:37:28,698 --> 01:37:32,778
I use it to understand myself and to make sure that what I'm thinking,

1368
01:37:32,898 --> 01:37:34,858
I use it to challenge myself.

1369
01:37:35,018 --> 01:37:36,098
That's what I'm using it to do.

1370
01:37:36,218 --> 01:37:36,358
Right.

1371
01:37:37,658 --> 01:37:41,878
But I'm getting a sense that at some point it's just going to be like,

1372
01:37:42,038 --> 01:37:43,218
you know,

1373
01:37:43,758 --> 01:37:44,138
you know,

1374
01:37:44,298 --> 01:37:45,398
and what am I then?

1375
01:37:45,738 --> 01:37:45,898
Yeah.

1376
01:37:45,898 --> 01:37:46,818
What am I?

1377
01:37:46,818 --> 01:37:49,118
So let me get conspiratorial with you for a second.

1378
01:37:49,258 --> 01:37:59,778
I think about this a lot because this is not a like – what I'm about to say, which is a speculative conspiracy basically, is not a problem right now.

1379
01:38:00,098 --> 01:38:06,958
But it will without a doubt, if you assume not even exponential, just linear AI progress, be a huge problem. Which is the following.

1380
01:38:06,958 --> 01:38:17,878
You eventually get to a point at some point, whether it's 10 years from now, 20 years from now, five years from now is irrelevant, where artificial general intelligence is better than the best humans at every possible thing.

1381
01:38:17,878 --> 01:38:46,238
Okay, like literally everything. If you just play that forward a little bit, given kind of the current state of affairs, what that world looks like is: a vast majority of the world's economic activity is controlled by and done by AI agents, models, and robots, powered by whichever frontier AI lab gets there first, or the first two or three frontier labs, which will probably happen simultaneously. That's the way these things have been working so far: a couple of labs will figure it out at the exact same time. That's how every advance has happened so far.

1382
01:38:46,238 --> 01:38:51,818
Is there something in there? There's a whole other thing about contemporaneous

1383
01:38:51,818 --> 01:38:55,818
invention, as a sidebar. If you ever look at the history of invention, like the

1384
01:38:55,818 --> 01:38:59,598
light bulb, which Thomas Edison is often credited with inventing, was invented by like five

1385
01:38:59,598 --> 01:39:04,458
people at the same time around the world. Same thing happened with calculus.

1386
01:39:04,458 --> 01:39:07,698
When calculus was invented, several other people had discovered it at the same time. This is

1387
01:39:07,698 --> 01:39:12,118
a weird thing that happens, which could indicate some kind of substrate of consciousness.

1388
01:39:12,118 --> 01:39:16,338
It ties back to that. That's exactly right. It's very odd. That is just an unsolved,

1389
01:39:16,398 --> 01:39:18,938
like it's just a pattern that we've noticed and we don't know why it happens, but it's happening

1390
01:39:18,938 --> 01:39:21,818
with AI too. So a couple of frontier labs will usually make an advance at the same time.

1391
01:39:22,118 --> 01:39:24,918
So play that forward. Let's imagine it's three, just I'm picking a random number,

1392
01:39:24,978 --> 01:39:29,338
but say, you know, it's open AI, Anthropic and Google, whatever. It doesn't matter which

1393
01:39:29,338 --> 01:39:35,238
specific ones. Well, now all economic activity can be done by their models, which means they

1394
01:39:35,238 --> 01:39:39,558
effectively control literally all economic activity. You've now, to put it in economics

1395
01:39:39,558 --> 01:39:45,938
terms, you have substituted all labor for capital, all of it, because what is AI? It is algorithms

1396
01:39:45,938 --> 01:39:52,378
running on chips in data centers. So you have swapped 100% of labor for capital. That means

1397
01:39:52,378 --> 01:39:57,718
those capital owners now control quite literally everything. Now, before we get to that point,

1398
01:39:58,158 --> 01:39:59,958
it's not going to get to that point because,

1399
01:39:59,976 --> 01:40:04,876
This is going to cause completely unimaginable political and social strife, right?

1400
01:40:04,936 --> 01:40:08,396
Because we are effectively irrelevant in almost every conceivable way at that point.

1401
01:40:08,556 --> 01:40:09,396
We meaning humanity.

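One toy way to see the labor-for-capital substitution just described (my framing, not the speaker's): model each task as going to whichever input is cheaper, then let the compute cost per task fall. The wage numbers are invented.

```python
# Toy model: each task is done by the cheaper of human labor or AI compute.
# As compute cost per task falls toward zero, the human share of work
# (and hence of income) falls to zero too. All numbers are made up.

def human_share(task_costs_human, compute_cost):
    """Fraction of tasks still done by humans at a given compute cost."""
    done_by_humans = [c for c in task_costs_human if c < compute_cost]
    return len(done_by_humans) / len(task_costs_human)

wages = [5, 10, 20, 40, 80, 160]  # human cost per task, invented
for compute in (100, 50, 10, 1):
    print(compute, human_share(wages, compute))  # share shrinks as compute gets cheap
```

The endpoint of the loop is the scenario in the conversation: at a low enough compute cost, the human share is exactly zero, and all production runs on capital (chips in data centers).
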
1402
01:40:09,976 --> 01:40:10,696
Sorry, can I just back in?

1403
01:40:11,196 --> 01:40:17,116
This obviously goes hand in hand with the price of – there's a physical implication of this.

1404
01:40:17,236 --> 01:40:23,016
Robots, machines, and the cost of them is trending to zero.

1405
01:40:23,396 --> 01:40:24,096
And right, okay.

1406
01:40:24,416 --> 01:40:26,216
That has – this is all part of it.

1407
01:40:26,236 --> 01:40:27,456
It's not just algorithms running.

1408
01:40:27,576 --> 01:40:27,936
Exactly.

1409
01:40:27,936 --> 01:40:31,716
Well, so the algorithms running, which is the AIs, right, solve all of those problems.

1410
01:40:31,916 --> 01:40:34,196
Like they design 1,000x more efficient chips.

1411
01:40:34,336 --> 01:40:37,296
We push the cost of inference down to the physical limit.

1412
01:40:37,356 --> 01:40:44,796
There's a physical limit, literally in physics, to how little energy you can use to do a simple calculation, which is the root of it.

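The limit being gestured at is Landauer's principle: erasing one bit of information dissipates at least k_B·T·ln 2 of energy. A quick calculation shows how far away that floor is:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact by SI definition)

def landauer_limit(temp_kelvin: float) -> float:
    """Minimum energy in joules to erase one bit at a given temperature."""
    return K_B * temp_kelvin * math.log(2)

# At roughly room temperature (300 K), the floor is about 2.87e-21 J per bit,
# many orders of magnitude below the energy today's chips spend per operation.
print(landauer_limit(300.0))
```

"Chips operating at the limit of physics" in the conversation means pushing the energy per operation down toward this kind of thermodynamic floor.
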
1413
01:40:45,256 --> 01:40:49,256
So you get like chips that are operating at the limit of physics, right?

1414
01:40:49,636 --> 01:40:54,436
You get materials like superconductors that bring the cost of energy for these things down by orders of magnitude.

1415
01:40:54,436 --> 01:40:57,556
You get effectively free infinite energy from fusion energy.

1416
01:40:57,676 --> 01:41:02,516
Like all of these things are solved in fairly rapid order by this artificial super intelligence.

1417
01:41:02,876 --> 01:41:16,656
So then you end up in this very bizarre world where like three companies control literally everything, have more power than anyone else, have this caged, chained god basically that they can do anything with.

1418
01:41:16,716 --> 01:41:20,416
It is a just completely incomprehensibly bizarre world we are headed toward.

1419
01:41:20,596 --> 01:41:22,976
Like I cannot describe how weird it is.

1420
01:41:22,976 --> 01:41:31,236
And this is actually, I think, the end state of capital. So I just finished reading this very esoteric work of philosophy on the plane on the way down here called The Machine Against God.

1421
01:41:31,676 --> 01:41:40,236
Now, this guy is not a theologian. It's pure philosophy. But his thesis is basically that what I'm describing here is just the end state of capitalism.

1422
01:41:40,736 --> 01:41:44,916
And he doesn't mean capitalism in like a kind of the way we typically talk about it in political philosophy.

1423
01:41:45,256 --> 01:41:51,996
What he's referring to is like the dynamic process that was started at the point at which humans figured out we could trade things for value effectively.

1424
01:41:51,996 --> 01:41:57,816
that summoned what he calls the machine, okay, as kind of a memetic concept. And what the machine

1425
01:41:57,816 --> 01:42:03,936
does is it maximizes something. Right now, what the machine is maximizing is very specifically

1426
01:42:03,936 --> 01:42:10,736
like subjective felt dopamine per moment at the lowest cost possible, right? Because of that,

1427
01:42:10,736 --> 01:42:16,536
you end up with an acceleration toward things that we all agree we hate and that suck. Like

1428
01:42:16,536 --> 01:42:20,936
my favorite punching bag example of this is how prevalent gambling is everywhere in everything now,

1429
01:42:20,936 --> 01:42:26,236
right? But that's the machine at work. All of the incentives of every single part of everything

1430
01:42:26,236 --> 01:42:31,696
is effectively summed up in this concept of the machine. Well, the end point of the machine,

1431
01:42:31,956 --> 01:42:36,156
because what it does is it optimizes a particular function, the function that it's optimizing that

1432
01:42:36,156 --> 01:42:40,016
dopamine maximization effectively. Well, the most efficient way to do that is by creating

1433
01:42:40,016 --> 01:42:44,916
super intelligence that can then effectively solve all problems instantaneously almost within a

1434
01:42:44,916 --> 01:42:50,416
blink of an eye in the grand scheme of the timescale of the cosmos, right? And so you end

1435
01:42:50,416 --> 01:42:54,956
up in this very bizarre world. And it's this process that we've all started completely

1436
01:42:54,956 --> 01:42:58,556
unintentionally that no one controls or directs. You can't even, no matter how hard you try.

1437
01:42:59,036 --> 01:43:02,816
In fact, when you try to fight against it, it turns against you, right? This is why like

1438
01:43:02,816 --> 01:43:07,656
people make money selling communist merchandise, right? That is the machine doing what it does,

1439
01:43:07,856 --> 01:43:13,276
maximizing this subjective felt dopamine. So it even takes criticisms of itself and uses that to

1440
01:43:13,276 --> 01:43:17,596
accelerate further, right? So you end up in a very bizarre world where the machine, which is

1441
01:43:17,596 --> 01:43:23,236
synonymous, in this concept, with ASI, ends up in just a genuinely

1442
01:43:23,236 --> 01:43:28,636
unseeable and incomprehensibly weird end state. I think about this a lot, tying back to

1443
01:43:28,636 --> 01:43:32,296
Prosper a little bit, just because I'm like, all right, well, if that's going to happen, crap, I need

1444
01:43:32,296 --> 01:43:35,196
to make sure we get some AI data centers built. At least I want to be part of the stack

1445
01:43:35,196 --> 01:43:42,676
somewhere. I don't know, that's as far as I've gotten. Doesn't that lead to abundance,

1446
01:43:42,676 --> 01:43:48,236
though? I mean, for us, I mean, what does the human experience look like in amongst all that,

1447
01:43:48,356 --> 01:43:55,736
do you think? Yeah. So you end up in a world with the Jetsons-style technology of the replicator,

1448
01:43:56,256 --> 01:44:16,136
like a machine where you can press a button. So in this world, basically, technological progress is our ability to manipulate matter efficiently at a low energy cost, to oversimplify. Well, that means if you just run that process out over time, it doesn't matter what timescale, you eventually end up with the ability to arbitrarily manipulate matter, right?

1449
01:44:16,456 --> 01:44:23,416
Which means like rearranging atoms into whatever formation you would like with an arbitrarily low energy cost, as low as is allowed by physics basically.

1450
01:44:23,716 --> 01:44:27,116
So this is the world of what some people call post-scarcity or superabundance.

1451
01:44:27,116 --> 01:44:31,356
Like there literally is no material restriction on anything anymore.

1452
01:44:31,736 --> 01:44:34,876
But then literally what does a human do with their day?

1453
01:44:35,116 --> 01:44:43,356
So this is – we're assuming aligned AI because so far all evidence suggests that for some reason – I'm a Christian, so divine intervention, God's hand, something.

1454
01:44:43,736 --> 01:44:45,036
The AI is just aligned by default.

1455
01:44:45,136 --> 01:44:45,396
I don't know.

1456
01:44:45,896 --> 01:44:49,416
But because it is aligned by default, I'm assuming alignment for this, right?

1457
01:44:49,416 --> 01:44:52,636
I'm not assuming the doomsday scenario basically because that's much worse.

1458
01:44:52,896 --> 01:44:55,876
So assuming humans exist in this world, we have super abundance, right, because the AI is aligned.

1459
01:44:55,876 --> 01:45:11,376
Well, then you have the crisis of meaning and purpose that we currently have, but affecting everyone all the time simultaneously, because now there's nothing productive you can contribute that the artificial superintelligence cannot do better, by definition.

1460
01:45:11,376 --> 01:45:19,016
And this covers everything from art to social organizing to cultural innovation to obviously like physical technological innovation, literally all of it.

1461
01:45:19,196 --> 01:45:20,676
So you will never want for anything.

1462
01:45:20,836 --> 01:45:28,416
You can live more lavishly than if you had $500 billion today, far more lavishly because of this.

1463
01:45:28,996 --> 01:45:29,596
And then what?

1464
01:45:29,796 --> 01:45:31,656
Like literally what do you do with your day and your time?

1465
01:45:32,296 --> 01:45:32,776
What do you do?

1466
01:45:33,616 --> 01:45:40,576
Well, everything becomes meaningless because everything is available to everyone at all times.

1467
01:45:40,576 --> 01:45:41,416
so there's no meaning.

1468
01:45:41,596 --> 01:45:41,956
Exactly.

1469
01:45:42,216 --> 01:45:43,896
It is a completely bizarre world.

1470
01:45:44,016 --> 01:45:45,496
And humans are good at inventing meaning,

1471
01:45:45,616 --> 01:45:46,076
is the thing.

1472
01:45:46,296 --> 01:45:47,856
But we'll restrict ourselves, though.

1473
01:45:48,076 --> 01:45:48,796
That's what I think.

1474
01:45:48,956 --> 01:45:49,936
That will be the result.

1475
01:45:50,156 --> 01:45:51,556
We'll put restrictions on ourselves.

1476
01:45:51,956 --> 01:45:52,736
I'll give you an example.

1477
01:45:53,276 --> 01:45:55,596
Because that already happens everywhere at all times.

1478
01:45:56,256 --> 01:45:59,556
I used to, before I sort of first dived deep into this,

1479
01:45:59,556 --> 01:46:01,556
I used to run extreme expeditions

1480
01:46:01,556 --> 01:46:02,796
to very remote locations.

1481
01:46:02,936 --> 01:46:03,236
I didn't know that.

1482
01:46:03,276 --> 01:46:03,676
That's awesome.

1483
01:46:04,016 --> 01:46:05,476
I've done all kinds of weird things.

1484
01:46:05,936 --> 01:46:08,556
I've spent 10 years migrating

1485
01:46:08,556 --> 01:46:12,936
with the same nomadic family across the Altai Mountains every year. I need a documentary

1486
01:46:12,936 --> 01:46:19,396
on your life, dude. Well, look, anyway, whatever. But I learned a lot about humans and people and myself

1487
01:46:19,396 --> 01:46:25,616
from a lot of those things. And one of them was why, for example, in that case, we used to walk for

1488
01:46:25,616 --> 01:46:30,376
five days. We used to walk 150 kilometers across the Altai Mountains with a thousand animals

1489
01:46:30,376 --> 01:46:36,656
and all these nomads, bitterly cold, minus 35. You're basically putting yourself through hell.

1490
01:46:36,656 --> 01:46:43,376
Now, why would you do that? Like, why would you do that when society's kind of trending towards

1491
01:46:43,376 --> 01:46:47,416
convenience? That's what we're doing, we're all trying to make our lives convenient. We've

1492
01:46:47,416 --> 01:46:52,576
always got a bottle of water, we've got a comfortable seat. And that's the same thing. There's something

1493
01:46:52,576 --> 01:46:58,456
quite primordial about us that needs a challenge, that needs discomfort. We have too much comfort,

1494
01:46:58,456 --> 01:47:05,856
so we artificially create discomfort for ourselves. And when you walk across the

1495
01:47:05,856 --> 01:47:10,656
mountains for five days, and it's ridiculous and hard and blah blah. So the same thing will

1496
01:47:10,656 --> 01:47:17,396
happen here. We will artificially create meaning for ourselves by restricting what we can. Hopefully

1497
01:47:17,396 --> 01:47:23,676
it'll be ourselves doing it, I mean, you know, because if someone else is doing it, that's

1498
01:47:23,676 --> 01:47:33,436
basically communism, you know, 20.0 or whatever, the worst. I mean, you know, it is easy to see AI as very

1499
01:47:33,436 --> 01:47:39,796
communistic when you look at it. Oh yeah, it's a kind of, I mean, yeah, I don't know, it's

1500
01:47:39,796 --> 01:47:44,416
the ultimate centralization. No, so the solution is cultural innovation. You're exactly

1501
01:47:44,416 --> 01:47:47,856
right. Like, the optimistic case here is that historically we've solved these problems

1502
01:47:47,856 --> 01:47:51,876
through cultural innovation, coming up with new cultural norms and ways of being that solve these

1503
01:47:51,876 --> 01:47:56,256
kinds of problems of meaning and purpose. Given enough time, we eventually will kind of

1504
01:47:56,256 --> 01:48:00,996
figure it out and find ways to create that meaning and purpose for ourselves, effectively.

1505
01:48:00,996 --> 01:48:04,296
It's just an unsatisfying, I think it's true, but it's an unsatisfying answer.

1506
01:48:04,296 --> 01:48:17,916
Insofar as I'm just saying, like, we'll figure it out, it'll be fine. You know, we'll figure it out somehow. But that is what it feels like. I mean, I am optimistic. I never really planned that much for the future of things like that. I would rather be standing in front of it and go,

1507
01:48:18,056 --> 01:48:21,276
right, now what? Yeah. Because I'm sure we'll come up with something.

1508
01:48:22,936 --> 01:48:30,636
Because I, for example, one of the other things you can question is who cares? Like if I want to

1509
01:48:30,636 --> 01:48:35,856
go out and garden for myself, why do I care that an AI can do it better than me?

1510
01:48:36,036 --> 01:48:36,216
Right.

1511
01:48:36,356 --> 01:48:38,116
Like really, it doesn't matter.

1512
01:48:38,216 --> 01:48:38,336
Yeah.

1513
01:48:38,396 --> 01:48:39,536
An AI could do it better than me.

1514
01:48:39,596 --> 01:48:44,496
But then if an AI did it better, if I instruct an AI to garden for me, to create flowers

1515
01:48:44,496 --> 01:48:47,616
and plants and food, then I'll sit on my ass and do nothing.

1516
01:48:47,736 --> 01:48:49,696
So I may as well just go out and do the proof of work.

1517
01:48:49,776 --> 01:48:51,296
I mean, I may as well do it.

1518
01:48:51,576 --> 01:48:52,216
That's exactly right.

1519
01:48:52,216 --> 01:48:54,516
You have to have a lot of, but that's it.

1520
01:48:54,596 --> 01:48:59,816
It comes down to, um, you have to be quite committed to the human experience.

1521
01:48:59,976 --> 01:49:00,256
That's right.

1522
01:49:00,256 --> 01:49:01,976
You know, you really have to do that.

1523
01:49:01,976 --> 01:49:02,116
That's right.

1524
01:49:02,196 --> 01:49:03,796
I have a perfect example of this from my own life.

1525
01:49:03,976 --> 01:49:06,596
So I love cars, fast cars specifically.

1526
01:49:06,956 --> 01:49:07,856
I love sports cars.

1527
01:49:08,476 --> 01:49:09,236
I love them so much.

1528
01:49:09,256 --> 01:49:09,536
Have you got one?

1529
01:49:09,736 --> 01:49:10,056
Yes.

1530
01:49:10,416 --> 01:49:10,976
So I-

1531
01:49:10,976 --> 01:49:11,316
Have you got some?

1532
01:49:11,476 --> 01:49:12,416
Yeah, just one.

1533
01:49:12,636 --> 01:49:12,876
Just one.

1534
01:49:13,396 --> 01:49:14,536
Prosper doesn't pay that well, I wish.

1535
01:49:14,596 --> 01:49:15,356
No, I have one.

1536
01:49:15,576 --> 01:49:15,696
What is it?

1537
01:49:15,696 --> 01:49:16,776
It's not a super expensive one.

1538
01:49:16,816 --> 01:49:20,416
So I have a 2020 Shelby GT350 Mustang.

1539
01:49:21,016 --> 01:49:22,756
So it is a very fun car.

1540
01:49:23,036 --> 01:49:26,636
It's the only car Ford ever made with a flat-plane crank engine, which for those of you

1541
01:49:26,636 --> 01:49:31,516
that are not gearheads just means, you know, it redlines at a little over 9,000 RPM,

1542
01:49:31,776 --> 01:49:35,436
which means it can sound at the high end like a Ferrari or Lamborghini. That's how they kind of

1543
01:49:35,436 --> 01:49:38,836
scream. It's because flat-plane crank engines can redline much higher than your

1544
01:49:38,836 --> 01:49:45,476
normal cross-plane crank, you know, crankshaft-driven engine that will redline at like 5,700 or

1545
01:49:45,476 --> 01:49:49,536
6000 or something like that. Very cool car. Very unique car. They only made it for like four years.

1546
01:49:49,656 --> 01:49:53,696
I love it to death. But it's not the fastest car you could buy, even for the same amount of

1547
01:49:53,696 --> 01:49:57,096
money, never mind if you wanted to spend some more money, even for that year, never mind for

1548
01:49:57,096 --> 01:50:02,216
today, you know? Uh, in fact, all of the new faster cars, including the new faster Mustangs,

1549
01:50:02,296 --> 01:50:06,056
the new Corvettes, of course, the new, you know, Ferraris, Lamborghinis, the Koenigseggs, et cetera.

1550
01:50:06,716 --> 01:50:11,956
They all have what are called dual-clutch transmissions, which shift faster than any

1551
01:50:11,956 --> 01:50:15,976
human could ever possibly shift. And they shift at the perfect time, every time. You

1552
01:50:15,976 --> 01:50:20,956
almost like audibly can't even hear it shift. It just, it just moves. It's measured in milliseconds.

1553
01:50:20,956 --> 01:50:44,976
It's absurd. But I intentionally bought a car with a six-speed manual transmission because I didn't want that. And I didn't want all of the other stuff. There's a bunch of new computer-aided systems in a bunch of the new cars; hypercars pioneered this, and now it's trickled down to stuff like Corvettes. They have drive modes you can put them in, where, for the Corvette, for example, literally, if you're taking it for a track day at Road Atlanta, you can put the thing in Road Atlanta track mode,

1554
01:50:44,976 --> 01:50:50,036
and the computer knows the exact millimeter-by-millimeter layout of the track and will

1555
01:50:50,036 --> 01:50:55,556
dynamically adjust your steering input, understeer. Oh, shit. It's absurd. It's absurd.

1556
01:50:55,556 --> 01:51:00,116
So it basically does it for you. You almost don't have to do anything. So I intentionally bought

1557
01:51:00,116 --> 01:51:04,696
what they call in the community a driver's car, one that doesn't have any of those

1558
01:51:04,696 --> 01:51:09,296
technological things. It is a classic, with a clutch pedal and a six-speed manual transmission,

1559
01:51:09,296 --> 01:51:15,116
because I wanted that visceral experience of driving the car myself.

1560
01:51:15,256 --> 01:51:16,196
It's harder. It's slower.

1561
01:51:16,456 --> 01:51:18,156
Like I'm going to be slower around the track or whatever

1562
01:51:18,156 --> 01:51:20,976
than some of these modern cars that have all the bells and whistles

1563
01:51:20,976 --> 01:51:22,616
and the computer-aided driving.

1564
01:51:23,076 --> 01:51:27,336
But I would rather do that because it is a more visceral human experience.

1565
01:51:27,336 --> 01:51:30,556
It is much more fun. It is much more challenging.

1566
01:51:31,196 --> 01:51:33,056
It has a much deeper learning curve.

1567
01:51:33,796 --> 01:51:35,116
And I'm actively choosing to do that.

1568
01:51:35,356 --> 01:51:37,396
Like for the same money, I could have bought one of these other things

1569
01:51:37,396 --> 01:51:39,116
that drives for you basically.

1570
01:51:39,116 --> 01:51:58,356
But I didn't want to do that. I wanted that felt, visceral human experience. You put it quite well, the human experience of doing that. So I think that is the long-term solution to these things. My only fear, with my kind of conspiracy about the capital all accumulating to these couple of companies, is once this happens, in my mind this could play out a bunch of ways, but I think of it in two big buckets.

1571
01:51:58,356 --> 01:52:04,716
One bucket is like this process just kind of happens and somehow we're able to navigate it such that we get post-scarcity.

1572
01:52:04,896 --> 01:52:07,056
It doesn't matter that these two or three companies own everything.

1573
01:52:07,156 --> 01:52:07,516
Who cares?

1574
01:52:07,596 --> 01:52:24,336
Because literally material scarcity is solved for everyone simultaneously. That's effectively the utopia, heaven-on-earth kind of case. But there's another way this goes, which is, along the path, at some point when they just rapidly start replacing most labor with capital, most labor with AI automation,

1575
01:52:24,336 --> 01:52:30,576
you get massive political and social unrest, massive global scale violence. And we basically

1576
01:52:30,576 --> 01:52:35,316
blow ourselves up before we can get there. Somebody gets unrestricted access to one of

1577
01:52:35,316 --> 01:52:41,496
these models, they immediately invent a biological weapon that is incurable, that can attack the

1578
01:52:41,496 --> 01:52:45,476
human genome and like kills everyone simultaneously and is impossible to stop the spread of,

1579
01:52:45,816 --> 01:52:51,396
as just one example of the existential risks at play here among many. So I hope we don't go down

1580
01:52:51,396 --> 01:52:58,196
that path. I think we will actually find another way to challenge ourselves. I was just thinking

1581
01:52:58,196 --> 01:53:05,136
it through there. I mean, if you look from purely a capital perspective or from, you know,

1582
01:53:05,136 --> 01:53:07,356
like consumer, a consumer perspective.

1583
01:53:07,976 --> 01:53:09,536
You know, a hundred years ago,

1584
01:53:10,076 --> 01:53:12,976
you might aspire to have, you know,

1585
01:53:13,076 --> 01:53:15,316
something that now is completely abundant.

1586
01:53:15,916 --> 01:53:17,736
But we find other things to aspire to.

1587
01:53:17,816 --> 01:53:20,236
And there'll be a point in the not too distant future

1588
01:53:20,236 --> 01:53:23,296
when billionaires won't have that much access

1589
01:53:23,296 --> 01:53:25,776
to different things than anyone else.

1590
01:53:26,556 --> 01:53:27,736
Elon Musk has the same phone you do.

1591
01:53:27,936 --> 01:53:30,916
Right. Now at that point, there will be other things.

1592
01:53:30,976 --> 01:53:32,656
And it may be spiritual.

1593
01:53:32,656 --> 01:53:34,136
It may be something.

1594
01:53:34,136 --> 01:53:40,916
I mean, like often you see that with wealthy people, status then becomes something they go for.

1595
01:53:41,016 --> 01:53:42,636
They want a hospital named after them.

1596
01:53:42,776 --> 01:53:44,836
They don't necessarily want a thing.

1597
01:53:45,276 --> 01:53:47,116
It's the name on the hospital.

1598
01:53:47,296 --> 01:53:49,976
It's an abstract version of something.

1599
01:53:50,136 --> 01:53:52,956
And there will be many more things like that.

1600
01:53:53,136 --> 01:53:53,736
That's exactly right.

1601
01:53:53,896 --> 01:54:02,296
But the spiritual aspect, you imagine the spiritual path, you know, like people might just once all that crap becomes meaningless.

1602
01:54:02,296 --> 01:54:03,576
What are you going to focus on?

1603
01:54:03,576 --> 01:54:05,116
You're probably going to focus on spirituality.

1604
01:54:05,496 --> 01:54:05,776
Exactly.

1605
01:54:05,876 --> 01:54:08,336
And that is probably an infinite thing.

1606
01:54:08,336 --> 01:54:12,396
And in a way, if what we understand about consciousness is correct,

1607
01:54:12,816 --> 01:54:15,676
the AI actually won't be able to compete with us on that level.

1608
01:54:15,716 --> 01:54:22,116
It might be able to mimic us, but it might almost be that we are the tip of the wedge,

1609
01:54:22,116 --> 01:54:28,376
because it can't think a novel thought.

1610
01:54:28,496 --> 01:54:31,116
It can't experience a novel experience.

1611
01:54:31,216 --> 01:54:32,116
It can only mimic us.

1612
01:54:32,196 --> 01:54:32,696
I don't know.

1613
01:54:32,696 --> 01:54:40,736
Whereas we might have this kind of divine, you could call it power, to have novel experiences.

1614
01:54:40,916 --> 01:54:41,856
I don't really know.

1615
01:54:41,996 --> 01:54:43,636
I mean, it's a lovely thought experiment.

1616
01:54:43,736 --> 01:54:44,696
I'm going to take that one away.

1617
01:54:44,896 --> 01:54:51,776
It's so hard because it is in principle logically unprovable too because of that P-zombie problem, right?

1618
01:54:51,816 --> 01:54:54,156
So we will literally never know.

1619
01:54:54,796 --> 01:54:56,376
Trey, look, I've got to stop.

1620
01:54:56,376 --> 01:54:59,476
I want to carry this on but in five minutes

1621
01:54:59,476 --> 01:55:00,936
Mikael is

1622
01:55:00,936 --> 01:55:03,036
going to ring me on my phone

1623
01:55:03,036 --> 01:55:04,876
that was a very

1624
01:55:04,876 --> 01:55:07,396
beautiful conversation I really

1625
01:55:07,396 --> 01:55:09,376
love to carry that on I can see that we've

1626
01:55:09,376 --> 01:55:11,416
got a lot further to go oh yeah I would love

1627
01:55:11,416 --> 01:55:13,256
to keep talking to you about anything and everything

1628
01:55:13,256 --> 01:55:14,056
this has been a blast

1629
01:55:14,056 --> 01:55:16,956
but yeah

1630
01:55:16,956 --> 01:55:19,356
this is going to be a funny one to sell this

1631
01:55:19,356 --> 01:55:21,496
one it started as Prosper and ended with

1632
01:55:21,496 --> 01:55:22,796
consciousness

1633
01:55:22,796 --> 01:55:23,276
yeah

1634
01:55:23,276 --> 01:55:28,876
But, um, thanks for coming in. And yeah, when we come down to Prosper, we'll

1635
01:55:28,876 --> 01:55:33,216
do it, we'll do a proper thought experiment. We'll just go off on a weird one and just see where it takes

1636
01:55:33,216 --> 01:55:37,636
us. They're my favorite kind of conversations. But I literally, because I have a really tight

1637
01:55:37,636 --> 01:55:42,476
schedule here. Sure, sure, absolutely. But many thanks for coming in and sharing your wisdom

1638
01:55:42,476 --> 01:55:48,456
and that. Yeah, thanks for that, because I love thought experiments and that's a new one

1639
01:55:48,456 --> 01:55:52,676
for me, so many, many thanks. Absolutely, thank you for having me. This was a blast. I look

1640
01:55:52,676 --> 01:55:56,996
forward to doing this again next year. We should both get there a couple of days early and we'll

1641
01:55:56,996 --> 01:56:02,296
just block off a bunch of time and just go after it. It'll be so fun. Excellent. Thanks for coming

1642
01:56:02,296 --> 01:56:03,036
on. Thank you.
