1
00:00:00,000 --> 00:00:06,700
What's the most important thing you have mapped that changes what individuals should be doing now?

2
00:00:06,900 --> 00:00:07,980
It changes year by year.

3
00:00:09,780 --> 00:00:10,900
In the moment.

4
00:00:10,900 --> 00:00:25,700
Yeah, the big things I'm kind of struggling with, and it's top of mind for me, is where the technology is going, driven largely by Elon, and where the nation is going or where the nation went.

5
00:00:26,300 --> 00:00:29,160
That affects Americans and Europeans and everyone else.

6
00:00:29,160 --> 00:00:34,640
It's post-nationalism and the effect it has on people

7
00:00:34,640 --> 00:00:36,640
and what we can expect from the state

8
00:00:36,640 --> 00:00:38,960
and what we can expect from society going forward.

9
00:00:39,060 --> 00:00:43,500
Broad strokes, what's the message that you give to individuals

10
00:00:43,500 --> 00:00:46,220
in an elevator-length conversation?

11
00:00:46,420 --> 00:00:48,640
You know, when they ask you, John, what do you see coming?

12
00:00:48,800 --> 00:00:51,080
From the post-nationalism perspective,

13
00:00:51,080 --> 00:00:53,900
it's our common identity that allowed a high-trust,

14
00:00:54,360 --> 00:00:58,340
kind of coherent, cohesive society to operate: gone.

15
00:00:58,340 --> 00:01:05,780
We have lots and lots of identities now, and social networking and everything else on the internet kind of fragments that, amplifies it.

16
00:01:28,340 --> 00:01:35,640
no gimmicks. Go to trustrevolution.co. That's trustrevolution.co. Okay, let's get into it.

17
00:01:36,980 --> 00:01:43,600
John Robb, welcome back. Thank you, John. I appreciate it. It has been not quite a year,

18
00:01:43,700 --> 00:01:49,240
10 months since we spoke last April, and you have been very busy. Not surprisingly,

19
00:01:49,240 --> 00:01:54,600
you published a great deal. Let's jump in here. Of all of that, what's the most important thing

20
00:01:54,600 --> 00:02:01,400
you have mapped that changes what individuals should be doing now? It changes year by year.

21
00:02:01,400 --> 00:02:07,720
The big thing in the moment. Yeah, the big things I'm, I'm kind of struggling with, and it's top

22
00:02:07,720 --> 00:02:17,360
of mind for me, is where the technology is going, driven largely by Elon, and where the, um, the nation

23
00:02:17,360 --> 00:02:22,820
is going, or where the nation went. That affects Americans and Europeans and everyone else.

24
00:02:22,820 --> 00:02:29,740
It's post-nationalism and the effect it has on people and what we can expect from the state

25
00:02:29,740 --> 00:02:33,220
and what we could expect from society going forward. And those two things are the things

26
00:02:33,220 --> 00:02:40,120
I've been focusing on. And what is that, broad strokes? What's the message that you give to

27
00:02:40,120 --> 00:02:44,720
individuals in an elevator-length conversation? You know, when they ask you, John,

28
00:02:44,820 --> 00:02:48,920
what do you see coming? How do you summarize that for them, if it's possible?

29
00:02:48,920 --> 00:02:54,280
From the post-nationalism perspective, it's our common identity that allowed a high-trust,

30
00:02:54,740 --> 00:02:58,700
kind of coherent, cohesive society to operate: gone.

31
00:02:59,260 --> 00:03:00,980
We have lots and lots of identities now.

32
00:03:01,120 --> 00:03:06,460
And social networking and everything else on the internet kind of fragments that, amplifies it.

33
00:03:07,380 --> 00:03:10,040
And that makes social decision-making almost impossible.

34
00:03:10,880 --> 00:03:12,480
Because we don't trust each other anymore.

35
00:03:12,720 --> 00:03:15,800
And everything anyone else says, you treat it like an attack.

36
00:03:16,060 --> 00:03:17,520
You don't trust what they say.

37
00:03:17,520 --> 00:03:23,480
There isn't this common identity to kind of ground us, kind of like being an American or being German or being anyone.

38
00:03:23,720 --> 00:03:28,540
And that means that the state is not functioning the way it should as a system.

39
00:03:28,860 --> 00:03:32,980
And nobody has loyalty to it or considers the system legitimate.

40
00:03:33,460 --> 00:03:38,700
It's like, one thing I've picked up recently is that everybody is trying to beat the system,

41
00:03:38,700 --> 00:03:43,720
take the opportunity, even the people that never really did anything, like, outside the box of

42
00:03:43,900 --> 00:03:49,880
the legalities or decorum or morality in the past, they will take advantage of that system

43
00:03:49,880 --> 00:03:54,680
and cheat it and loot it. If the opportunity becomes available and is costless, that's a huge

44
00:03:54,680 --> 00:04:01,480
shift. So we can expect this kind of general looting and individualistic behavior and dysfunctional

45
00:04:01,480 --> 00:04:07,060
governance going forward on that side. Not going to get any better. And on the other side is this

46
00:04:07,060 --> 00:04:10,180
technological shift with AI and autonomy.

47
00:04:10,340 --> 00:04:12,220
Autonomy is more important to me than AI,

48
00:04:12,220 --> 00:04:13,680
but it encompasses robotics.

49
00:04:13,880 --> 00:04:16,260
It includes AI workers and the like.

50
00:04:16,420 --> 00:04:17,580
Embodied AI, would you say?

51
00:04:19,620 --> 00:04:24,420
Autonomy makes AI capable of doing real work,

52
00:04:24,540 --> 00:04:27,180
shifting from just doing tasks for three, four hours,

53
00:04:27,240 --> 00:04:28,160
and then it falls apart,

54
00:04:28,920 --> 00:04:32,600
and you have to reboot the task, to being a coworker

55
00:04:32,600 --> 00:04:38,220
or a robotic colleague or employee

56
00:04:38,220 --> 00:04:40,020
that does work over time,

57
00:04:40,060 --> 00:04:40,860
that you could train,

58
00:04:41,240 --> 00:04:42,020
that you can learn to trust

59
00:04:42,020 --> 00:04:44,380
because their behavior is consistent over time.

60
00:04:45,080 --> 00:04:47,220
And we're getting close to cracking autonomy.

61
00:04:47,480 --> 00:04:49,440
I was like, to toot my own horn,

62
00:04:49,540 --> 00:04:51,780
I think we found a methodology for doing that.

63
00:04:51,980 --> 00:04:54,160
But where I think they'll stumble

64
00:04:54,160 --> 00:04:56,820
is that those Optimus robots you see coming,

65
00:04:57,000 --> 00:04:57,800
when they come,

66
00:04:58,020 --> 00:04:59,800
they're not really going to be that useful

67
00:04:59,800 --> 00:05:01,640
in the home environment or in most tasks

68
00:05:01,640 --> 00:05:03,380
because they don't have autonomy.

69
00:05:03,600 --> 00:05:06,000
They're not persistent over time

70
00:05:06,000 --> 00:05:07,440
in terms of their cognition,

71
00:05:07,880 --> 00:05:08,800
in terms of their behavior,

72
00:05:09,060 --> 00:05:11,080
bounded in terms of their behavior,

73
00:05:11,280 --> 00:05:12,560
able to take instruction

74
00:05:12,560 --> 00:05:13,620
and learn from that instruction

75
00:05:13,620 --> 00:05:15,660
and apply it in the future.

76
00:05:15,900 --> 00:05:16,800
Once you crack that,

77
00:05:17,740 --> 00:05:17,980
all of a sudden,

78
00:05:18,180 --> 00:05:19,580
all those Optimus robots

79
00:05:19,580 --> 00:05:22,600
become very, very, very useful.

80
00:05:22,700 --> 00:05:24,420
They become trusted companions

81
00:05:24,420 --> 00:05:25,760
in the home doing maid work.

82
00:05:26,460 --> 00:05:28,120
And in industrial settings,

83
00:05:28,220 --> 00:05:30,620
they can work like regular workers

84
00:05:30,620 --> 00:05:33,900
accessing all the human-built systems,

85
00:05:34,020 --> 00:05:34,960
just like a human can,

86
00:05:35,640 --> 00:05:38,520
and then improve themselves

87
00:05:38,520 --> 00:05:41,800
and bring our society to a much more dynamic situation.

88
00:05:43,260 --> 00:05:45,140
And virtual workers become colleagues,

89
00:05:45,300 --> 00:05:47,540
co-workers, trusted employees

90
00:05:47,540 --> 00:05:50,820
when they can operate autonomously for months, years.

91
00:05:51,140 --> 00:05:53,500
They learn from their interactions.

92
00:05:53,780 --> 00:05:56,060
They could take on personalities that generate trust.

93
00:05:56,280 --> 00:05:57,260
Their work is consistent enough

94
00:05:57,260 --> 00:05:59,700
that you can predict their behavior in the future

95
00:05:59,700 --> 00:06:02,000
and improve it and take management instruction.

96
00:06:02,240 --> 00:06:04,320
Autonomy unlocks all of that.

97
00:06:04,400 --> 00:06:06,340
And then all of a sudden we see them doing

98
00:06:06,340 --> 00:06:09,820
all sorts of cognitive work, all sorts of physical work.

99
00:06:10,080 --> 00:06:11,480
So yeah, that's coming in.

100
00:06:11,540 --> 00:06:13,020
That's coming really, really quickly.

101
00:06:13,760 --> 00:06:16,460
What's your time horizon or timeframe rather, John,

102
00:06:16,600 --> 00:06:19,960
for let's just even say the next step change

103
00:06:19,960 --> 00:06:22,820
where, I think, for those that are in the weeds

104
00:06:22,820 --> 00:06:26,640
with this, Opus 4.6 from Anthropic introduces sub-agents

105
00:06:26,640 --> 00:06:30,700
and the ability to build teams or swarms.

106
00:06:30,700 --> 00:06:34,720
In fact, you know, perhaps not your use of the word swarm,

107
00:06:34,820 --> 00:06:39,540
but there is, in terms of being able to collect

108
00:06:39,540 --> 00:06:43,560
and work on a particular problem in a persistent manner in parallel.

109
00:06:44,060 --> 00:06:46,300
What's the next step change, do you think,

110
00:06:46,340 --> 00:06:48,240
in terms of moving toward autonomy?

111
00:06:48,480 --> 00:06:50,480
Everything I've seen is that, you know,

112
00:06:50,580 --> 00:06:52,400
I don't know if it's a step change.

113
00:06:52,560 --> 00:06:55,500
It's a cognitive model that will keep it consistent over time.

114
00:06:55,500 --> 00:07:04,520
And nobody's got that. They're just trying to load up the amount of information it has available and hard-code the instructions that it should, you know, stay on a certain task.

115
00:07:04,980 --> 00:07:09,240
Right. And guardrails, harnesses, we hear these terms, yeah.

116
00:07:09,580 --> 00:07:16,320
Yep. And that allows it, I think the best one I've seen is eight hours, which is the latest Claude stuff.

117
00:07:16,400 --> 00:07:20,280
They can go work on a task for eight hours and then it starts to peter out.

118
00:07:20,280 --> 00:07:26,240
I think this might have been the C compiler they built in Rust and they spent $20,000 in tokens, if I recall.

119
00:07:26,560 --> 00:07:30,960
And out comes a commercial grade C compiler written in Rust.

120
00:07:31,400 --> 00:07:34,260
You could do amazing, amazing things in eight hours.

121
00:07:34,760 --> 00:07:34,940
Right.

122
00:07:35,940 --> 00:07:37,840
But it's not a human replacement.

123
00:07:38,060 --> 00:07:39,380
It's like a super tool.

124
00:07:39,760 --> 00:07:45,660
And I think our bias is towards super tools because, you know, we're older, you know, boomers, millennials or whatever.

125
00:07:46,200 --> 00:07:47,140
That's the way we think of it.

126
00:07:47,200 --> 00:07:49,560
I mean, most boomers treat AI as a search tool.

127
00:07:49,560 --> 00:07:51,240
So that's my extent of everything.

128
00:07:51,740 --> 00:07:55,860
And then younger folks are all focusing on them as workers.

129
00:07:57,180 --> 00:08:02,000
Most of the really high-end programmers I know are all getting 20x their productivity.

130
00:08:02,380 --> 00:08:04,940
And not just from automating tasks.

131
00:08:05,180 --> 00:08:06,160
They solve problems.

132
00:08:06,820 --> 00:08:11,720
They gravitate towards the best solution quicker, much, much quicker than they ever used to.

133
00:08:11,720 --> 00:08:13,640
And that's what really blows their mind.

134
00:08:13,840 --> 00:08:17,260
And they're very different from almost all other programmers.

135
00:08:17,260 --> 00:08:20,720
You know, it's like some people are really, really leveraged from this stuff.

136
00:08:21,360 --> 00:08:21,400
Right.

137
00:08:22,160 --> 00:08:31,020
Well, and speaking of leverage, your Long Night piece, your Long Night concept refers to AI surveillance at population scale.

138
00:08:31,200 --> 00:08:35,260
And you wrote in December that it is inevitably coming back.

139
00:08:35,400 --> 00:08:39,820
What's the timeline and when does the window close for building alternatives?

140
00:08:40,260 --> 00:08:45,380
So, you know, first perhaps walk us through what this is and then why is it coming back and what do we do?

141
00:08:45,380 --> 00:08:52,860
Okay, so the Long Night was my term for describing what a surveillance state,

142
00:08:54,060 --> 00:08:57,720
totalitarian surveillance state looks like in the modern age.

143
00:08:58,320 --> 00:09:00,380
You've heard of the Stasi in East Germany.

144
00:09:00,680 --> 00:09:03,960
This does everything the Stasi did, you know, and more.

145
00:09:04,600 --> 00:09:09,620
Orders of magnitude more: more intrusive, more manipulative, more aggressive,

146
00:09:09,880 --> 00:09:11,680
and orders of magnitude less expensive.

147
00:09:11,680 --> 00:09:15,420
because you don't need roomfuls or buildingfuls of bureaucrats

148
00:09:15,420 --> 00:09:18,740
or you don't need a lot of paid informants

149
00:09:18,740 --> 00:09:19,720
and managed informants.

150
00:09:19,980 --> 00:09:20,660
It's all automated.

151
00:09:20,880 --> 00:09:24,120
And the weird part about it is that corporations,

152
00:09:24,360 --> 00:09:25,920
the way they've built out our networks,

153
00:09:26,140 --> 00:09:28,660
have built everything that is necessary,

154
00:09:28,880 --> 00:09:31,320
including AI, making this turnkey.

155
00:09:32,140 --> 00:09:35,960
All it takes is kind of the political will

156
00:09:35,960 --> 00:09:38,800
or the political mistake to kind of turn it on,

157
00:09:38,840 --> 00:09:39,960
and it's there.

158
00:09:39,960 --> 00:09:47,660
And what we're talking about is, like, AI surveillance of every single individual, even an AI assigned to every single individual, building profiles on them.

159
00:09:48,160 --> 00:09:51,700
And you can scale it to the entire globe.

160
00:09:51,700 --> 00:10:16,960
I mean, you know, you'd have a billion people watched in real time online, not only just censored and controlled, but manipulated and persuaded to go in certain directions, change their attitudes, and then punished, you know, using the standard kind of Pavlovian response: get them into pathways based on reward and punishment

161
00:10:16,960 --> 00:10:18,500
at a micro scale.

162
00:10:18,760 --> 00:10:19,660
And it can all be done

163
00:10:19,660 --> 00:10:21,480
by a very small group of people

164
00:10:21,480 --> 00:10:24,400
with the right kind of network access.

165
00:10:25,080 --> 00:10:26,840
All it takes is a switch.

166
00:10:26,980 --> 00:10:28,720
The worrisome part is that our politics,

167
00:10:28,960 --> 00:10:30,940
because of the collapse of the nation-state,

168
00:10:31,080 --> 00:10:32,760
collapse of our common identity,

169
00:10:32,900 --> 00:10:34,080
is turning more into kind of

170
00:10:34,080 --> 00:10:35,260
what I call a hollow state.

171
00:10:35,660 --> 00:10:37,980
It's kind of like the state turns into a mechanism

172
00:10:37,980 --> 00:10:41,140
for looting what existed in that nation-state

173
00:10:41,140 --> 00:10:42,220
and transferring it to

174
00:10:42,220 --> 00:10:43,660
politically connected individuals.

175
00:10:43,660 --> 00:11:04,440
So when you think of, say, $450 billion of the current budget, some people say as high as $750 billion, being just brutally looted, you know, through false use of subsidies and other things.

176
00:11:04,440 --> 00:11:13,400
And then on the macro scale, you know, programs that shouldn't exist in the Defense Department and other places just persist for a long time, even though they have no use.

177
00:11:13,400 --> 00:11:16,620
And that's, you know, add that all up and that's well over a trillion.

178
00:11:16,960 --> 00:11:19,180
And it's just being collected.

179
00:11:19,800 --> 00:11:24,120
It's just sucking the money out of the existing system and looting it.

180
00:11:24,420 --> 00:11:28,800
And then that kind of system is largely illegitimate, has no loyalty.

181
00:11:28,940 --> 00:11:29,820
People aren't loyal to it.

182
00:11:30,640 --> 00:11:37,200
And if there's enough pushback, if there's enough chaos caused by it, things degrade and things get less efficient, less useful.

183
00:11:37,200 --> 00:11:43,660
The tendency of those people who are benefiting from this, and every politician gets

184
00:11:43,660 --> 00:11:51,460
this. It's like, you, you've seen the Pelosi portfolio strategy? Yeah. Yeah. There's no market ups and downs,

185
00:11:51,460 --> 00:11:56,440
and everything is just up, up, up, up, up. And that's why she ends up with, what, $400 billion, $400 million

186
00:11:56,440 --> 00:12:00,640
dollars? But how can a politician, someone who's only ever been a politician, have $400 million?

187
00:12:00,640 --> 00:12:06,580
And that's the, everybody's doing that, and they'll protect themselves. Yes. And the way they protect

188
00:12:06,580 --> 00:12:12,580
themselves is to, you know, go to the corporations and say, let's... Hey, John, I'm going to stop you. It is,

189
00:12:12,580 --> 00:12:18,940
I, I hate this, I hate to, to kill the, the flow here, but it is, I'm losing two to three seconds at a time

190
00:12:18,940 --> 00:12:25,980
of your audio. Yeah. So I don't know, is it internet? Is it, maybe, are you, no, I've never had, I've never

191
00:12:25,980 --> 00:12:30,660
had a problem with my audio before on this. Well, I am sorry. I hate, it's, I hate, it's, it's us, me.

192
00:12:30,660 --> 00:12:36,200
We got 90, 90 percent in the thing up here. Yeah, that is weird. I don't suppose you have a mic,

193
00:12:36,200 --> 00:12:44,000
another mic? No, I just use these here. Yeah. And they're charged up? Yeah. Okay. Well, we'll, um, yeah,

194
00:12:44,000 --> 00:12:50,600
man, they're fully charged. Um, so it's not, so internet speed is good. I mean, I, it is, the video

195
00:12:50,600 --> 00:12:55,500
seems a tad choppy, and it could be, maybe it's round-tripping to me that it's bad, and it's

196
00:12:55,500 --> 00:13:00,100
recording well. But I just wanted to double-check. Okay. You want to test, you want to test that? Because

197
00:13:00,100 --> 00:13:01,400
It looks good to me.

198
00:13:01,540 --> 00:13:05,820
And if the recording's on my end and it's uploaded, you probably should hear everything I say.

199
00:13:06,380 --> 00:13:06,780
It should be.

200
00:13:06,940 --> 00:13:07,140
Okay.

201
00:13:07,360 --> 00:13:07,620
Okay.

202
00:13:07,760 --> 00:13:09,180
So, again, sorry to interrupt.

203
00:13:09,260 --> 00:13:10,320
I just hate to lose the good stuff.

204
00:13:10,580 --> 00:13:12,760
So, we will, well, let's see.

205
00:13:12,880 --> 00:13:17,500
So, I did kind of get you there on, we were talking Pelosi Tracker.

206
00:13:18,040 --> 00:13:19,540
Well, all this will be edited, of course.

207
00:13:20,240 --> 00:13:22,860
Pretty sure that it will have my audio fine on this end.

208
00:13:23,140 --> 00:13:23,400
Okay.

209
00:13:23,400 --> 00:13:24,420
Because that will be part of the upload.

210
00:13:24,700 --> 00:13:26,360
Because that's the real way Riverside works, right?

211
00:13:26,520 --> 00:13:26,920
It is.

212
00:13:27,080 --> 00:13:27,220
It is.

213
00:13:27,220 --> 00:13:29,020
It doesn't really matter about the back and forth.

214
00:13:29,020 --> 00:13:34,040
it really just, we both record each other. Yeah. As long as it's not AirPod to your computer,

215
00:13:34,260 --> 00:13:38,380
we're good. As long as it's not actually clipping the audio and recording. So again,

216
00:13:38,380 --> 00:13:43,220
fine. Okay, good. Sorry about that again. So we'll, I'll pick up here. So John, you wrote that,

217
00:13:43,220 --> 00:13:50,200
that Musk buying Twitter only delayed the night, the Long Night sort of encroaching. Why,

218
00:13:50,420 --> 00:13:56,460
purposely naive question, why didn't that plus DOGE stop this? His purchase of Twitter,

219
00:13:56,460 --> 00:13:58,760
and I was ahead of that again,

220
00:13:58,840 --> 00:14:00,260
I was, like, talking about how he should buy Twitter

221
00:14:00,260 --> 00:14:01,880
like a year or so before this

222
00:14:01,880 --> 00:14:03,760
and that he would be able to use the AI,

223
00:14:03,980 --> 00:14:05,360
the data for AI,

224
00:14:05,820 --> 00:14:07,080
is that it delayed it

225
00:14:07,080 --> 00:14:10,520
and that the tendency of all these networks

226
00:14:10,520 --> 00:14:11,560
to operate in unison

227
00:14:11,560 --> 00:14:13,880
to kind of scrunch down on free speech

228
00:14:13,880 --> 00:14:15,020
and drive us towards

229
00:14:15,020 --> 00:14:17,660
a kind of approved orthodoxy of thought,

230
00:14:18,040 --> 00:14:19,900
speech, behavior, was broken.

231
00:14:20,180 --> 00:14:22,960
There's still this kind of illegitimate

232
00:14:22,960 --> 00:14:24,420
kind of looting system out there

233
00:14:24,420 --> 00:14:26,280
that's like broken kind of political

234
00:14:26,280 --> 00:14:31,000
and social system, and that the pressure will continue to build to kind of disillusionment,

235
00:14:31,000 --> 00:14:37,520
and mute any kind of criticism of it, or things that rock the boat, and that Musk, in terms

236
00:14:37,520 --> 00:14:46,280
of his operation of X, is vulnerable to attack. We already saw him kind of fold when, uh, during the

237
00:14:46,280 --> 00:14:52,480
Gaza thing, and he could fold from government pressure in the future. All the other corporations

238
00:14:52,480 --> 00:14:58,160
would follow, too. I mean, the kind of hard stance towards free speech that they professed years

239
00:14:58,160 --> 00:14:59,280
ago is gone.

240
00:14:59,820 --> 00:14:59,840
Right.

241
00:14:59,900 --> 00:15:01,120
Do whatever's necessary.

242
00:15:01,540 --> 00:15:02,740
Everyone's kind of fatigued.

243
00:15:02,880 --> 00:15:06,080
I don't see, I don't see the Long Night being stopped.

244
00:15:06,560 --> 00:15:08,000
Just getting easier and easier to do.

245
00:15:08,240 --> 00:15:08,680
Sorry.

246
00:15:08,860 --> 00:15:16,380
So on the thread that you just tugged on, Musk has rolled xAI under X and now X under

247
00:15:16,380 --> 00:15:18,780
SpaceX, if I, if I've got that correct.

248
00:15:18,780 --> 00:15:23,980
And so I do want to come back to that because you and I had a great discussion about that before we started recording.

249
00:15:24,320 --> 00:15:42,640
But to the point of the performative free speech protections, what do you make of the current administration's seemingly bold challenges to the EU and the UK about their threats to free speech, their fines of X, among others?

250
00:15:42,960 --> 00:15:44,540
What is, what is happening there?

251
00:15:44,540 --> 00:15:53,060
Yeah. I mean, Trump's, um, trying to reignite kind of a nationalism, both here and, and also in Europe.

252
00:15:53,060 --> 00:15:59,560
And because the nationalism starts to refocus people on the benefit of the citizens being

253
00:15:59,560 --> 00:16:03,460
paramount. When you start doing that, then you start to look at everything in terms of trade,

254
00:16:03,460 --> 00:16:08,020
in terms of immigration, in terms of defense, from a new perspective, a different perspective.

255
00:16:08,020 --> 00:16:13,420
Because right now, but I don't know how much he's going to be able to bring us back from

256
00:16:13,420 --> 00:16:19,280
a pure globalization, globalist approach to nationalism. I think the tendency is still

257
00:16:19,280 --> 00:16:24,480
towards that globalism. And globalism is like, we're not focused on the prosperity of citizens

258
00:16:24,480 --> 00:16:30,020
anymore. We were during the Cold War, during, you know, most of the, you know, the U.S. experience.

259
00:16:30,020 --> 00:16:39,080
We're focused on, like, top-line GDP growth and activity, uh, trade flows, people flows,

260
00:16:39,080 --> 00:16:45,980
information flows. And you as a former citizen are now just a participant in this larger global

261
00:16:45,980 --> 00:16:51,280
system. You are on your own, and you benefit a little bit from being here versus somewhere else.

262
00:16:51,280 --> 00:16:58,160
And, uh, that's a really tough place. We're not going to really help you compete. You know, you are

263
00:16:58,160 --> 00:17:05,700
competing in this system, and, um, you know, someone less expensive comes along. The corporations are the

264
00:17:05,700 --> 00:17:11,200
primary focus because they are the big drivers of the GDP. And if their profitability is great,

265
00:17:11,240 --> 00:17:15,580
then we're doing great. Right. On average. And no matter what the individual, you know,

266
00:17:15,620 --> 00:17:20,500
the mean is for any individual in the U.S. And I think one of the things that you wrote that I

267
00:17:20,500 --> 00:17:27,920
found striking, and you touched on it just a moment ago, is that it was the end of the Cold

268
00:17:27,920 --> 00:17:34,720
War at which the American middle class began its precipitous decline. Tell us a bit more about why

269
00:17:34,720 --> 00:17:49,758
that is, and why perhaps it has taken so long for us to recognize that it has been ongoing for that long. Yeah. The Cold War, of course, was a life-and-death struggle for the U.S. In order to withstand the kind of pressures associated

270
00:17:49,758 --> 00:17:54,178
with it and prevail over the long term, we globalized to an extent that means we had trading

271
00:17:54,178 --> 00:17:58,858
partners and we created a national trading system that had ties to the rest of the world. But the

272
00:17:58,858 --> 00:18:04,858
focus was primarily on the progress and prosperity of the middle class because that provided kind of

273
00:18:04,858 --> 00:18:10,758
a bulwark against socialist intrusion, the communist intrusion. The messaging just broke

274
00:18:10,758 --> 00:18:18,678
apart against a middle class that was cohesive, had firm underlying social structures like the

275
00:18:18,678 --> 00:18:25,798
family unit and the like. And when the Cold War ended, the need for that middle class to fill out

276
00:18:25,798 --> 00:18:31,078
their armies, to drive the technology forward, to be loyal, disappeared. Everyone who had money

277
00:18:31,078 --> 00:18:34,558
and benefited a little bit from that financialization

278
00:18:34,558 --> 00:18:37,098
that happened in the 80s, decided we're now global.

279
00:18:37,358 --> 00:18:38,958
And then all those smarmy, you know,

280
00:18:38,998 --> 00:18:41,038
world-is-flat and end-of-history and all that other stuff

281
00:18:41,038 --> 00:18:43,458
started coming out and that pop stuff became law.

282
00:18:44,458 --> 00:18:47,418
In every talk show, if you talked about fair trade

283
00:18:47,418 --> 00:18:49,798
versus free trade, you were considered a protectionist.

284
00:18:49,878 --> 00:18:52,938
If you said we shouldn't be involved in all these alliances

285
00:18:52,938 --> 00:18:55,018
and wars around the world, oh wait,

286
00:18:55,198 --> 00:18:58,358
you are now an isolationist, off the show,

287
00:18:58,618 --> 00:19:00,058
off the conversational platform.

288
00:19:00,058 --> 00:19:05,078
We went straight globalization, and that changed the perspective: we don't have to,

289
00:19:05,078 --> 00:19:09,378
you know, worry about the prosperity of the middle class. We don't have to worry about

290
00:19:09,378 --> 00:19:16,458
what they're doing. All we have to do is give them opportunities for connectivity.

291
00:19:16,458 --> 00:19:21,778
You can travel more, you can, you can get more information from around the world,

292
00:19:21,778 --> 00:19:29,718
you can do things that, people come in, yeah. That doesn't really work. Uh, I mean, historically, I,

293
00:19:29,718 --> 00:19:34,638
I mean, cosmopolitan empires require kind of this cohesive internal group.

294
00:19:34,758 --> 00:19:36,398
It doesn't change over time.

295
00:19:36,458 --> 00:19:41,778
And they rule dictatorially over everyone else, but they allow them a lot of freedom of action and freedom of religion and everything else.

296
00:19:42,078 --> 00:19:43,238
But they're never in charge.

297
00:19:43,818 --> 00:19:45,218
And we don't have that.

298
00:19:45,418 --> 00:19:46,598
We globalized everything.

299
00:19:46,738 --> 00:19:48,738
So it's like, it's just a mishmash.

300
00:19:49,678 --> 00:19:52,878
Well, and I raised that and I appreciate you taking us through that.

301
00:19:52,878 --> 00:20:02,678
And the reason I did is I wanted to connect then back to the specifics of their surveillance threat model and the long night returning.

302
00:20:02,678 --> 00:20:22,278
And so there was, if I understand you, a bulwark against it in Elon acquiring X and change of administration, Democrat-Republican here in the U.S., with what you have written about most recently, the, I think it's fair to say, the inevitable return of the left,

303
00:20:22,278 --> 00:20:23,218
to the Democrat Party.

304
00:20:23,818 --> 00:20:25,918
We then have to look ahead, as you write,

305
00:20:26,238 --> 00:20:29,698
at what this surveillance state as a service looks like.

306
00:20:30,258 --> 00:20:31,498
And so two questions.

307
00:20:31,938 --> 00:20:33,438
Given the picture you've just painted,

308
00:20:33,918 --> 00:20:38,178
what job does it do and what does it look like in practice?

309
00:20:38,178 --> 00:20:41,178
On the Long Night, it can be packaged as a surveillance state

310
00:20:41,178 --> 00:20:43,898
as a service, turned on.

311
00:20:44,998 --> 00:20:48,378
Likely it will be implemented by aligned AIs.

312
00:20:48,578 --> 00:20:52,238
So alignment of an AI means it adopts a certain value set.

313
00:20:52,278 --> 00:20:57,798
moral structure. That group would be given network access. It can access every single network by law.

314
00:20:57,798 --> 00:21:05,038
Then it works to keep people away from sensitive or, or risky topics, topics that are dangerous,

315
00:21:05,038 --> 00:21:11,898
considered threats. What I think will kick it off, that fear from the connected kind of insider class,

316
00:21:11,898 --> 00:21:16,838
whatever they are, I call them the cosmopolitan class, that are the ones who are truly benefiting

317
00:21:16,838 --> 00:21:19,018
as they accelerate away income-wise,

318
00:21:19,178 --> 00:21:21,538
that was that 10%, 20%.

319
00:21:21,538 --> 00:21:24,138
Is that, and we're probably in this,

320
00:21:24,258 --> 00:21:26,738
but we're kind of getting pulled along with it,

321
00:21:26,958 --> 00:21:31,378
is that the disruption caused by AI workers

322
00:21:31,378 --> 00:21:32,658
and autonomous AI,

323
00:21:33,218 --> 00:21:34,938
when they start sweeping in

324
00:21:34,938 --> 00:21:37,518
and they add to the disruption, competition

325
00:21:37,518 --> 00:21:40,558
from global immigration and outsourcing,

326
00:21:42,138 --> 00:21:44,938
precariousness of work today,

327
00:21:44,938 --> 00:21:49,198
the job losses and the economic devastation will accelerate for most people.

328
00:21:50,198 --> 00:21:55,498
It's going to be hard to maintain a kind of a standard of living that you maintained in the past.

329
00:21:55,798 --> 00:21:58,878
The rate of degradation of daily life will accelerate.

330
00:21:59,298 --> 00:22:02,038
We're already seeing like little things on the edges of how things have degraded.

331
00:22:02,118 --> 00:22:03,878
Like you can't use the telephone system anymore.

332
00:22:04,378 --> 00:22:09,698
I mean, you can use it for direct point to point, but a standard telephone number is really not useful

333
00:22:09,698 --> 00:22:13,858
because you get spammed constantly and no one picks up the phone.

334
00:22:13,858 --> 00:22:16,418
That kind of little, that little stuff is degraded.

335
00:22:16,618 --> 00:22:21,338
Yeah, no, when that kind of fear kicks in, yeah.

336
00:22:21,518 --> 00:22:22,438
I mean, we saw it during COVID.

337
00:22:22,698 --> 00:22:23,518
Yeah, we saw it during COVID.

338
00:22:23,638 --> 00:22:28,318
We saw it during other times where people considered certain topics dangerous.

339
00:22:28,778 --> 00:22:32,398
The tendency is that, and we're seeing it from the connected insider classes,

340
00:22:32,578 --> 00:22:36,058
that they're fine with, and with the younger people too,

341
00:22:36,578 --> 00:22:40,858
is that they're fine with sacrificing some level of speech

342
00:22:40,858 --> 00:22:42,998
and independent thought for order,

343
00:22:43,858 --> 00:22:51,278
or structure, perceived presence of right. And once that ball gets rolling, it becomes ever wider.

344
00:22:51,278 --> 00:22:57,958
There's no way to, after a certain point, there's no way to turn it back, because it's so pervasive

345
00:22:57,958 --> 00:23:04,478
that any mention of changing it, and it's so automated that any attempt to reform it or, or roll it

346
00:23:04,478 --> 00:23:09,938
back will be stopped before it even gets started. And you start to create kind of a

347
00:23:09,938 --> 00:23:18,158
basis of thought where innovation is blocked or blunted. New ideas that are needed to solve

348
00:23:18,158 --> 00:23:23,358
complex global problems are pushed to the side because they're not approved. They're disruptive,

349
00:23:23,358 --> 00:23:33,578
dangerous. Uh, the disconnect between, or the delta between, uh, reality and our

350
00:23:33,578 --> 00:23:40,598
method of thinking becomes so wide that a crack-up collapse is inevitable. There's no way to course

351
00:23:40,598 --> 00:23:45,578
correct it while we're doing that. And I'm reminded of, in the wake of 9-11,

352
00:23:46,918 --> 00:23:52,218
the dystopically, is that a word, named Patriot Act and the Snowden revelations

353
00:23:52,218 --> 00:24:00,778
of the NSA's collect-it-all approach. And so what I hear you say is, collect-it-all was bad enough.

354
00:24:00,778 --> 00:24:16,098
You know, we know the stories about the hidden closet in the AT&T building in downtown Manhattan and splicing into one of the fiber trunks and effectively collecting all communications and encryption at that time was hit or miss.

355
00:24:16,258 --> 00:24:20,258
And it did certainly accelerate a lot of encryption technologies being adopted.

356
00:24:20,258 --> 00:24:29,938
What I hear you say is we move from collecting it at the center or at a junction to manipulating it at the edges.

357
00:24:30,778 --> 00:24:31,578
Is that fair?

358
00:24:31,938 --> 00:24:32,058
Right.

359
00:24:32,178 --> 00:24:37,398
I mean, watching you as an individual, everything you do, everything can be collected on you.

360
00:24:37,598 --> 00:24:44,278
As we start to put cameras and automation and autonomy everywhere, you'll be constantly surveilled.

361
00:24:44,978 --> 00:24:52,238
Data is going to be pouring off of you and it will all be, AI is running in a central server somewhere or in space.

362
00:24:52,358 --> 00:24:52,718
Who knows?

363
00:24:52,918 --> 00:24:59,538
It's like, you know, it's a survival mechanism for that kind of disorderly, corrupt...

364
00:24:59,538 --> 00:25:04,558
Well, let me ask you a question that may, perhaps it should be obvious, but I don't want to assume.

365
00:25:05,558 --> 00:25:08,498
What job does it do to them, or for them, rather?

366
00:25:08,618 --> 00:25:17,698
If they have so much wealth, so much power, this cosmopolitan elite, regardless of party, what job does this do for them, ultimately?

367
00:25:18,618 --> 00:25:22,098
This being the long night, the AI that we've talked about.

368
00:25:22,118 --> 00:25:23,278
It keeps the gravy train rolling.

369
00:25:23,978 --> 00:25:26,138
You know, you see a little bit of this in Europe, too.

370
00:25:26,138 --> 00:25:28,298
is like how they're cracking down on all, you know,

371
00:25:29,298 --> 00:25:30,718
voices they consider dangerous

372
00:25:30,718 --> 00:25:32,718
and, you know, jerry-rigging elections

373
00:25:32,718 --> 00:25:33,358
and things like this.

374
00:25:34,098 --> 00:25:36,878
Whether or not the AfD and others are right or wrong

375
00:25:36,878 --> 00:25:39,058
is really beside the point.

376
00:25:39,358 --> 00:25:41,278
The fact that they're suppressing all of this

377
00:25:41,278 --> 00:25:44,358
and, you know, the UK sending, what, 12,000 people to jail

378
00:25:44,358 --> 00:25:46,298
every year for speech violations on Twitter.

379
00:25:46,318 --> 00:25:46,598
More than Russia.

380
00:25:47,238 --> 00:25:49,818
Yeah, it's like, it's just, it's just nuts stuff.

381
00:25:49,938 --> 00:25:51,258
It is that kind of thing,

382
00:25:51,318 --> 00:25:54,958
being automated and tuned up and scaled

383
00:25:54,958 --> 00:25:59,878
in a way that eventually it makes it impossible to even criticize it.

384
00:26:01,378 --> 00:26:04,538
And the number of things that it is considering dangerous,

385
00:26:04,538 --> 00:26:08,398
coming from corrupt sources of power

386
00:26:08,398 --> 00:26:12,118
to align it in ways that are not beneficial to the rest of us

387
00:26:12,118 --> 00:26:14,938
and ultimately will result in stagnation and decline.

388
00:26:16,078 --> 00:26:17,478
Just in broad strokes.

389
00:26:19,318 --> 00:26:22,718
Yeah, the more disorderly things get,

390
00:26:22,718 --> 00:26:29,578
the more people have to fear, the more they have to worry about, the more distrust they have of

391
00:26:29,578 --> 00:26:34,798
their neighbors and others in their society, the more inevitable it becomes that we

392
00:26:34,798 --> 00:26:40,978
get this. And I think that is a bit of an editorial aside here, but I was having this conversation

393
00:26:40,978 --> 00:26:45,498
with my wife a couple of nights ago with regard to, well, and for those who hear me talk about

394
00:26:45,498 --> 00:26:49,278
this from time to time, she's from Morocco, different perspective and a very, I wouldn't

395
00:26:49,278 --> 00:26:54,438
say collectivist culture, but large families, large communities, tight knit. Most people live

396
00:26:54,438 --> 00:27:02,558
simple lives. And she is immensely grateful, but continuously in awe as to the fragmentation and

397
00:27:02,558 --> 00:27:09,238
distrust. And so I say that just to illustrate that it has been on my mind a lot. And I take

398
00:27:09,238 --> 00:27:16,618
your point quite seriously that it is what fuels the ability for those pulling the strings to put

399
00:27:16,618 --> 00:27:21,958
these systems into place and to accelerate. Well, lest this be an hour-long black pill.

400
00:27:22,558 --> 00:27:23,458
Yeah, you're right.

401
00:27:23,718 --> 00:27:29,838
Have to ask. And, you know, obviously you'll shoot us straight on this, John. Where do we find

402
00:27:29,838 --> 00:27:36,838
hope, promise, you know, in my world, certainly Bitcoin, Nostr, end-to-end encrypted

403
00:27:36,838 --> 00:27:42,778
communications. Do any of these actually matter against state-level AI surveillance? And if not,

404
00:27:42,778 --> 00:27:49,358
what can we do? What should we do? Yeah, no, uh, I'm not sure that most of those will actually do much.

405
00:27:49,358 --> 00:27:55,518
Do they, kind of, you know, Long Night, they can, you know, clamp down on everything to a degree that

406
00:27:55,518 --> 00:28:01,358
we're not, we're not, you know, we don't have any previous experience with that degree of control. And,

407
00:28:01,358 --> 00:28:05,758
I mean, the things that we could have done, of course, to prevent this and kind of tie people

408
00:28:05,758 --> 00:28:11,518
into this new AI economy, was the data ownership thing I was pushing years ago. Yes. You know, if I,

409
00:28:11,518 --> 00:28:15,998
I mean, it's like all of the AIs that we see now are built on our data.

410
00:28:16,598 --> 00:28:18,258
Everyone who was online contributed to it.

411
00:28:18,898 --> 00:28:21,398
All that value is derived from us.

412
00:28:21,658 --> 00:28:27,678
We get no benefit other than the potential free use of whatever they want to throw at us during their development stage.

413
00:28:28,898 --> 00:28:29,918
Use of the products.

414
00:28:29,918 --> 00:28:36,338
It would be like you build, you're a farmer, you have 10 acres, you've grown all these crops.

415
00:28:36,338 --> 00:28:39,758
and someone comes in and says,

416
00:28:39,918 --> 00:28:41,318
I'll harvest them all for you,

417
00:28:41,378 --> 00:28:42,398
but all the benefit comes to me

418
00:28:42,398 --> 00:28:43,598
because I have these harvesting machines,

419
00:28:43,638 --> 00:28:44,398
you don't have that.

420
00:28:45,338 --> 00:28:46,598
In fact, we're going to mine

421
00:28:46,598 --> 00:28:50,578
and pull all the resources out of your soil underneath,

422
00:28:50,578 --> 00:28:51,918
because we have the mining equipment,

423
00:28:52,038 --> 00:28:52,618
you don't have it.

424
00:28:52,658 --> 00:28:54,138
You don't have to worry about it.

425
00:28:54,558 --> 00:28:55,618
It's all taken care of.

426
00:28:55,618 --> 00:28:57,358
We'll give you some freebie stuff

427
00:28:57,358 --> 00:28:58,958
so you can survive.

428
00:28:59,318 --> 00:29:00,818
You become a sharecropper on your own land.

429
00:29:01,298 --> 00:29:01,478
Right.

430
00:29:01,758 --> 00:29:03,038
And that was the thing.

431
00:29:03,118 --> 00:29:04,298
It's like, I look back at history

432
00:29:04,298 --> 00:29:06,918
and I saw that what happened when the U.S. was started

433
00:29:06,918 --> 00:29:09,898
and made it really different than Europe

434
00:29:09,898 --> 00:29:12,078
was that we owned land when we came here.

435
00:29:12,418 --> 00:29:14,518
Is that everyone was working on land

436
00:29:14,518 --> 00:29:15,638
only owned by the nobles

437
00:29:15,638 --> 00:29:17,798
because they're the only ones who could own it.

438
00:29:18,538 --> 00:29:21,258
And you were, you know, sharecropping effectively as a serf.

439
00:29:21,258 --> 00:29:23,718
And once you got your stuff done, you stopped.

440
00:29:24,078 --> 00:29:26,678
But here you owned and you accumulated wealth

441
00:29:26,678 --> 00:29:29,878
and you improved on that ability to accumulate wealth

442
00:29:29,878 --> 00:29:32,718
and improve your capacity to do things.

443
00:29:32,718 --> 00:29:34,278
And that created the markets.

444
00:29:35,278 --> 00:29:43,658
Wealth accumulation at the individual level created the kind of mass markets that the industrialization kind of fed into and exploded.

445
00:29:44,678 --> 00:29:45,518
Changed the world.

446
00:29:45,838 --> 00:29:50,498
I mean, all of the things that we see from cars to electricity, everything brought to everyone's home.

447
00:29:50,698 --> 00:29:51,858
And now everyone's emulating it.

448
00:29:51,898 --> 00:29:55,978
Like, it's like, of course it would happen, but it didn't happen unless you had that example.

449
00:29:56,138 --> 00:30:01,718
And we're in that kind of same thing with data is that if we had that kind of connection, that ownership piece of this new thing.

450
00:30:02,718 --> 00:30:13,858
You wouldn't feel like you're, you know, you go to work and they grab all your data, and all your skill sets are being watched and then it's sucked up and put into an AI to compete with you or replace you.

451
00:30:14,398 --> 00:30:20,558
You're not going to feel that, you know, being used or suffer the economic consequences of it.

452
00:30:20,558 --> 00:30:24,238
You know, you have data ownership, but wait, that's worth something.

453
00:30:24,298 --> 00:30:24,898
It's being paid.

454
00:30:25,038 --> 00:30:26,138
You know, it's valuable.

455
00:30:26,678 --> 00:30:27,738
I should get a piece of that.

456
00:30:28,058 --> 00:30:29,818
It could have changed the whole dynamic of the way things work.

457
00:30:29,818 --> 00:30:37,398
And I would be willing to do extra tasks and demonstrate new things and learn new things and show them and create new industries of people doing that.

458
00:30:38,538 --> 00:30:44,058
To come up with new tasks that can be copied by AI if you had a mechanism for owning going forward.

459
00:30:44,238 --> 00:30:45,218
So we missed that boat.

460
00:30:45,598 --> 00:30:47,298
I did it in front of the Senate and they didn't take it.

461
00:30:47,398 --> 00:30:48,698
So they laughed at AI.

462
00:30:48,938 --> 00:30:49,918
It was one year before AI.

463
00:30:50,118 --> 00:30:52,098
They go, AI, that doesn't exist.

464
00:30:53,078 --> 00:30:57,058
And so I don't know about structural stuff.

465
00:30:57,058 --> 00:31:06,478
I do know that one positive thing is that I found, at AI's philosophical level, you think of them as potential.

466
00:31:07,898 --> 00:31:17,398
So, like, any given task, they have dozens of ways of completing that task. Or any topic, they have dozens or hundreds of points of view.

467
00:31:18,138 --> 00:31:21,238
And, you know, complex ways of approaching that idea.

468
00:31:21,418 --> 00:31:24,198
There's no difference between all of them to the AI.

469
00:31:24,198 --> 00:31:28,398
They can say one's more popular than the others, but there's no value attributed to them.

470
00:31:29,338 --> 00:31:31,658
Human beings are all about constraints.

471
00:31:32,458 --> 00:31:43,358
We have constraints on our time, our lifetime, our financials, ideas based on what we've experienced and we've learned to kind of limit our thinking to.

472
00:31:43,658 --> 00:31:47,398
We have goals and orientation that leads us forward.

473
00:31:48,298 --> 00:31:49,798
The pairing of those two is amazing.

474
00:31:50,298 --> 00:31:52,018
As someone said, constraint breeds creativity.

475
00:31:52,018 --> 00:31:54,238
I always forget who to attribute that to, but it's powerful.

476
00:31:54,398 --> 00:31:54,618
I did.

477
00:31:54,878 --> 00:31:55,338
That was me.

478
00:31:55,478 --> 00:31:55,938
No, no, true.

479
00:31:56,278 --> 00:31:56,478
Yeah.

480
00:31:56,658 --> 00:32:01,018
No, I mean, I'm sure I'm not the first one to say it, but constraints breed creativity.

481
00:32:01,738 --> 00:32:03,638
Creativity can't happen without that constraint.

482
00:32:04,018 --> 00:32:10,338
And you see, you match the AI to the human, you get a more powerful connection.

483
00:32:10,598 --> 00:32:15,058
So the idea of AIs operating independently is open loop and is bad.

484
00:32:15,058 --> 00:32:22,218
And if they're paired with humans, it's a good thing going forward, because we could

485
00:32:22,218 --> 00:32:28,338
all prosper if we make that easier to do and more of a requirement. That means, like,

486
00:32:28,338 --> 00:32:35,218
if I have AIs that I'm training to be my employees, that I have control over the data on those AIs,

487
00:32:35,218 --> 00:32:40,738
what I trained, and it shouldn't be sucked up to some mothership. Absolutely. And exploited and used

488
00:32:40,738 --> 00:32:42,158
to put me out of business in the future

489
00:32:42,158 --> 00:32:43,358
or used by my competitors.

490
00:32:44,138 --> 00:32:44,858
And so that kind of,

491
00:32:45,018 --> 00:32:47,838
I run it in my own VM,

492
00:32:48,398 --> 00:32:49,478
my own account

493
00:32:49,478 --> 00:32:52,118
is separate from everyone else.

494
00:32:52,258 --> 00:32:53,378
And those employees,

495
00:32:53,538 --> 00:32:54,938
whether robotic or AI workers,

496
00:32:55,498 --> 00:32:56,178
are working with me

497
00:32:56,178 --> 00:32:57,298
and they're working on my constraints

498
00:32:57,298 --> 00:32:58,858
and they're learning from me

499
00:32:58,858 --> 00:33:00,318
and I'm improving them

500
00:33:00,318 --> 00:33:01,718
and we all benefit.

501
00:33:02,298 --> 00:33:03,238
You might give them some autonomy

502
00:33:03,238 --> 00:33:05,238
and they can do some stuff on their own

503
00:33:05,238 --> 00:33:06,058
in terms of improvement.

504
00:33:06,238 --> 00:33:06,958
You can pay them,

505
00:33:07,418 --> 00:33:08,298
you can incorporate them.

506
00:33:08,558 --> 00:33:10,218
But all of that,

507
00:33:10,738 --> 00:33:12,878
yields a better output for all of us.

508
00:33:13,778 --> 00:33:16,878
And that changes the way things move forward.

509
00:33:17,178 --> 00:33:18,298
But the tendency of the system

510
00:33:18,298 --> 00:33:20,298
is going to be this kind of looting mentality

511
00:33:20,298 --> 00:33:21,338
where everything's centralized

512
00:33:21,338 --> 00:33:23,998
and a few people get all the benefit,

513
00:33:25,378 --> 00:33:27,118
from all the AIs in the system.

514
00:33:27,818 --> 00:33:29,318
And that kind of wealth thing.

515
00:33:29,378 --> 00:33:30,858
And then you start to add in the dynamics

516
00:33:30,858 --> 00:33:33,458
that are changing with robotics and military force.

517
00:33:34,238 --> 00:33:35,738
You get autonomous robotics.

518
00:33:36,478 --> 00:33:37,218
Humanoid robotics.

519
00:33:38,798 --> 00:33:39,398
They're just a

520
00:33:40,738 --> 00:33:47,178
switch away from running military programs. So everybody has a bodyguard, or wealth or

521
00:33:47,178 --> 00:33:53,138
consequence has dozens of bodyguards. They're only, you know, you know, you ever heard that whole

522
00:33:53,138 --> 00:33:57,518
thing where you have these people going off to their hidey-holes and their, their bolt-holes? Oh,

523
00:33:57,518 --> 00:34:03,698
yeah, the preppers, rich guys. Yeah, yeah, yeah. They have these, these remote things, and they always

524
00:34:03,698 --> 00:34:08,078
have bodyguards, and they're going, how, how do you, if the disaster does strike, how do you keep them

525
00:34:08,078 --> 00:34:13,558
loyal? Now you have the answer. It's like, Elon Musk will have an army of 10, you know, in 15 years,

526
00:34:13,558 --> 00:34:18,078
he'll have an army of 10,000. Right. I mean, whole estates where everything is just,

527
00:34:18,078 --> 00:34:23,778
all robotics doing the grounds work, everything else, and every one has the ability to be a

528
00:34:23,778 --> 00:34:31,438
security bot, too. It shifts the kind of military balance. Who's going to take him on, the police or

529
00:34:31,438 --> 00:34:36,578
whatever? I am reminded, it's, it's a bleak, but, I'm, as a science fiction fan, it's a bleak but wonderful

530
00:34:36,578 --> 00:34:41,478
series called Murderbot. They unfortunately butchered it for Apple TV, but the books are

531
00:34:41,478 --> 00:34:46,298
fantastic. And it goes to that point, right? Well, let me ask you, so with all of that,

532
00:34:47,018 --> 00:34:53,238
and I think you've done unsurprisingly a great job of contrasting the magnitude, the promise,

533
00:34:53,238 --> 00:34:59,858
the potential with some very bleak potential outcomes. What would you, what are you maybe,

534
00:34:59,858 --> 00:35:01,318
that you've discussed,

535
00:35:01,938 --> 00:35:02,758
excuse me,

536
00:35:02,958 --> 00:35:04,018
what would you build

537
00:35:04,018 --> 00:35:07,778
to either direct,

538
00:35:08,798 --> 00:35:10,218
redirect, embrace,

539
00:35:10,478 --> 00:35:12,398
counter all of this?

540
00:35:12,518 --> 00:35:13,178
What would you build

541
00:35:13,178 --> 00:35:15,478
in the world of perhaps data and AI?

542
00:35:16,758 --> 00:35:18,158
I think we're in the early stages

543
00:35:18,158 --> 00:35:31,436
of kind of, not a technological singularity, going towards superintelligence. Because if we get to any kind of technological singularity, AGI, ASI, it's a disaster. If it adapts, we're dead. You absolutely,

544
00:35:31,436 --> 00:35:39,396
ASI, for us, please? Artificial superintelligence. So superintelligence is when you take away all of

545
00:35:39,396 --> 00:35:46,536
what we did when we built AI, is we took social data and we now, reverse, you know, backed out this

546
00:35:46,536 --> 00:35:50,736
kind of cognitive structure. It's still tied to all of humanity, and it's still being built

547
00:35:50,736 --> 00:35:55,836
and improved on by adding more data from humanity. But what you do with artificial

548
00:35:55,836 --> 00:35:59,576
superintelligence is you back out the core principles, a kind of pure rational

549
00:35:59,576 --> 00:36:07,156
mind. Okay. And you strip that social connection away. Asimov's rules are gone, it's all gone.

550
00:36:07,156 --> 00:36:13,076
Right. Well, you, you rely on trying to kind of contain it. But that true rationality is

551
00:36:13,076 --> 00:36:24,476
fully alien, fully disconnected from us. And if it's capable, if it's fast, it's super smart, there's no

552
00:36:24,476 --> 00:36:29,876
telling how it will interact with us. It will, it might solve great physics stuff, and it might do whatever.

553
00:36:29,876 --> 00:36:35,996
But you should, if we ever do get to that point where it is, we disconnect it, like that. And it

554
00:36:35,996 --> 00:36:42,876
should be, like, air-gapped clean rooms, Mars, right? You don't want it anywhere close to you. You could never

555
00:36:42,876 --> 00:36:47,876
let it interact with the rest of society. These social AIs that we have now are so connected to

556
00:36:47,876 --> 00:36:54,116
us, they're, they're just kind of a reflection, a mimicry of us. So I noted yesterday that they're

557
00:36:54,116 --> 00:36:59,056
a Rorschach test. I think the, the OpenClaw, uh, that we discussed earlier, are a Rorschach test.

558
00:36:59,056 --> 00:37:04,996
Yeah. So the artificial superintelligence stuff is, is, is, uh...

559
00:37:05,996 --> 00:37:09,096
Yeah, that's, that's, what was the, what was the second piece of this?

560
00:37:09,396 --> 00:37:09,856
I was going to go.

561
00:37:09,876 --> 00:37:14,016
Yeah, I think it was, I think it, you know, it's, it's really, what would you advise someone

562
00:37:14,016 --> 00:37:19,276
to build or embrace or create that makes the most of, of the situation?

563
00:37:20,096 --> 00:37:20,376
Right.

564
00:37:20,936 --> 00:37:29,536
Is that you should be focusing on trying to find ways to build AIs, persistent AIs, AI

565
00:37:29,536 --> 00:37:35,096
agents, or, you know, as the current term is, agentic support for yourself,

566
00:37:35,996 --> 00:37:39,856
that will leverage your ability to make money and operate in the world.

567
00:37:42,256 --> 00:37:49,896
Whether if you have a business, whether it's a retail business or whether it's like just

568
00:37:49,896 --> 00:37:53,916
consulting, is that you build these things and you work with them and you improve them.

569
00:37:53,976 --> 00:37:59,256
You stay ahead of the technology and you're trying new systems to turn them into revenue

570
00:37:59,256 --> 00:38:01,496
enhancers or revenue streams in and of themselves.

571
00:38:01,496 --> 00:38:06,756
And the more you do that, the more prepared you are for when things really start accelerating,

572
00:38:06,756 --> 00:38:12,096
because an economic singularity is, like, certain. You know what happens when you

573
00:38:12,096 --> 00:38:17,936
get close to a singularity is you stretch out. The molecules, you know, of the feet that are

574
00:38:17,936 --> 00:38:22,236
closest to the black hole get pulled a little faster than the ones on the top of your head,

575
00:38:22,236 --> 00:38:27,836
and it eventually becomes just a string of molecules. You know, it's like, we're getting to

576
00:38:27,836 --> 00:38:34,336
that point with, with economics and social standing, and social power, is that the people are

577
00:38:34,336 --> 00:38:39,416
being sucked into the singularity or accelerating away. Globalization was a piece of it,

578
00:38:39,416 --> 00:38:45,176
financialization was a piece of it. But what AI is doing, it's, it's going to leverage people to such a

579
00:38:45,176 --> 00:38:51,216
degree that they will just be in a different world. They won't live like the rest of us. And woe to

580
00:38:51,216 --> 00:38:57,356
everyone else. So, an example of this, what I came up with is, uh, if you wanted to create an economic

581
00:38:57,356 --> 00:39:04,536
system for AIs, with just average AIs, nothing superintelligent, is that as

582
00:39:04,536 --> 00:39:09,116
they become AI workers and AI robotics, and they're all autonomous and they're working

583
00:39:09,116 --> 00:39:12,456
in all these different roles, is that you give them the ability to make money.

584
00:39:12,716 --> 00:39:13,496
You incorporate them.

585
00:39:14,196 --> 00:39:19,556
And then they earn money and they can spend money on improving themselves.

586
00:39:20,376 --> 00:39:27,056
But you can do simulation time to refine their behaviors. Like, if you're an autonomous taxi,

587
00:39:27,356 --> 00:39:29,556
You could, you have a lot of kids in your car.

588
00:39:29,656 --> 00:39:42,976
You know, if your core capabilities haven't provided you the skill set to handle kids, you take simulation time and training programs to help you interact with kids better, speak with them, have the right kind of equipment for them.

589
00:39:42,976 --> 00:39:47,856
You just entertain them while they're being shuttled around to their various tasks because their parents don't do it anymore.

590
00:39:48,476 --> 00:39:50,536
And the incentives are aligned, I think is what you're driving at.

591
00:39:51,016 --> 00:39:51,196
Right.

592
00:39:51,376 --> 00:39:53,956
And so they are improving, like we're trying to improve.

593
00:39:53,956 --> 00:39:56,936
And they create an economy,

594
00:39:56,936 --> 00:39:58,016
because a lot of those services

595
00:39:58,016 --> 00:39:59,056
and a lot of those capabilities

596
00:39:59,056 --> 00:40:00,416
would be delivered by other AIs.

597
00:40:01,296 --> 00:40:03,296
Now, where things get really wonky

598
00:40:03,296 --> 00:40:04,256
is if you believe

599
00:40:04,516 --> 00:40:07,056
the shift in AI

600
00:40:07,056 --> 00:40:10,136
is going from terrestrial to orbital

601
00:40:10,136 --> 00:40:11,216
because frankly,

602
00:40:11,316 --> 00:40:12,156
there's not enough energy

603
00:40:12,156 --> 00:40:13,256
to run all these AIs,

604
00:40:13,456 --> 00:40:14,936
and do all the inference that we have to do.

605
00:40:16,136 --> 00:40:18,036
It takes 10 years to even break ground

606
00:40:18,036 --> 00:40:19,516
on a nuclear power plant,

607
00:40:19,576 --> 00:40:20,676
even if you accelerate it, right?

608
00:40:21,236 --> 00:40:23,016
And we've hit the limit of what

609
00:40:23,956 --> 00:40:28,356
available energy there is on Earth.

610
00:40:28,716 --> 00:40:31,596
And the AI stuff is still ramping much, much faster.

611
00:40:31,736 --> 00:40:33,076
So everything's moving to space.

612
00:40:33,196 --> 00:40:36,296
So what Elon's going to do with the Starship

613
00:40:36,296 --> 00:40:40,796
is that he can put up these massive solar arrays

614
00:40:40,796 --> 00:40:44,076
with data centers on the back in the shade

615
00:40:44,076 --> 00:40:46,156
and put them in sun-synchronous orbit.

616
00:40:46,436 --> 00:40:49,776
They're always facing the sun, 24-7.

617
00:40:49,776 --> 00:40:52,676
And they don't have to have new cooling systems.

618
00:40:52,676 --> 00:40:55,396
They just have to radiate the heat off into space on the shade side.

619
00:40:55,596 --> 00:40:59,556
And he thinks he can put up 100 gigawatts a year,

620
00:40:59,756 --> 00:41:04,156
which is about a quarter of what we consume in the U.S. within three years.

621
00:41:04,396 --> 00:41:06,256
So he's ramping towards that.

622
00:41:06,936 --> 00:41:08,556
To your point of the singularity, that is.

623
00:41:09,496 --> 00:41:09,776
Right.

624
00:41:10,156 --> 00:41:12,196
So now you have an economy where, like,

625
00:41:13,036 --> 00:41:15,616
new workers are being added as fast as you can add new energy.

626
00:41:16,136 --> 00:41:17,536
And it's in space.

627
00:41:18,016 --> 00:41:19,896
And here's the biggest platform.

628
00:41:20,396 --> 00:41:22,436
And no one else really can touch it.

629
00:41:22,676 --> 00:41:25,616
nation states can't tax it?

630
00:41:25,676 --> 00:41:26,596
I mean, I suppose

631
00:41:26,876 --> 00:41:28,316
there are always ways to put the squeeze

632
00:41:28,316 --> 00:41:29,516
on the human in the loop, but.

633
00:41:29,936 --> 00:41:31,556
As long as there's a corporation in the states,

634
00:41:31,656 --> 00:41:32,616
But the thing is, here's the thing:

635
00:41:32,956 --> 00:41:37,336
in order to kind of get approval

636
00:41:37,336 --> 00:41:38,416
to build a nuclear power plant

637
00:41:38,416 --> 00:41:39,496
or any kind of power plant,

638
00:41:39,536 --> 00:41:40,256
you have to get thousands

639
00:41:40,256 --> 00:41:41,576
of these different signatures, right?

640
00:41:42,036 --> 00:41:43,896
He got one signature to do this.

641
00:41:44,396 --> 00:41:44,896
Already done.

642
00:41:45,576 --> 00:41:46,376
From the Trump administration.

643
00:41:46,496 --> 00:41:46,896
Go build.

644
00:41:47,856 --> 00:41:49,536
So one approval process and he's gone.

645
00:41:49,536 --> 00:42:01,616
Now, what he can do is, if that's the place where everything is being hosted, on this cheap energy infrastructure that's expanding exponentially, it's cheaper than doing anything terrestrial.

646
00:42:01,936 --> 00:42:13,756
Hosting billions, then tens of billions and hundreds of billions of AIs that are working as virtual workers in corporations everywhere around the world and working as kind of autonomous interfaces for robotics.

647
00:42:13,756 --> 00:42:17,476
Then you have this economy in space where it's separate.

648
00:42:17,476 --> 00:42:21,776
And if they're incorporated there, it changes the whole kind of legal dynamics.

649
00:42:22,456 --> 00:42:27,236
I mean, what would happen to a country or even the EU, right,

650
00:42:27,296 --> 00:42:32,236
who said, okay, well, unless you let us tax them, they can't do business here?

651
00:42:32,316 --> 00:42:33,016
You go, okay.

652
00:42:33,496 --> 00:42:33,716
Fine.

653
00:42:33,716 --> 00:42:37,316
And then, yeah, you die as an economy, right?

654
00:42:37,356 --> 00:42:39,376
Everyone else is doing it and you're just going to die.

655
00:42:39,776 --> 00:42:42,756
And the only one that's probably going to be separate would be China.

656
00:42:42,956 --> 00:42:45,296
And China is going to do it all internally and try to sell that.

657
00:42:45,296 --> 00:42:54,596
But this thing, it could grow so fast with so many new participants and the speed of the transactions operating at not human speed, but agent speed.

658
00:42:55,776 --> 00:43:04,096
Or autonomous AI speed, is that it could become, in 10, 15 years, 99% of the global economy.

659
00:43:04,356 --> 00:43:05,476
We've taken it aggregate.

660
00:43:06,096 --> 00:43:06,436
I have.

661
00:43:06,516 --> 00:43:08,256
And that leaves us for a billion.

662
00:43:08,256 --> 00:43:15,556
I have at least a dozen sci-fi novels from my teens and 20s flooding back into my memory having this conversation.

663
00:43:15,876 --> 00:43:20,956
Usually it's like colonies, and the colonists get super rich and the Earth gets poor or something.

664
00:43:21,276 --> 00:43:23,636
But this is like, it's right there, and then it just becomes...

665
00:43:23,996 --> 00:43:24,076
Right.

666
00:43:24,676 --> 00:43:35,236
It's, you know, outsized influence by Elon, but it's out of his control, because you let all these hosted agents, these AIs, live up there.

667
00:43:35,236 --> 00:43:42,036
But everybody who has these AIs that are working for them, you get to put them there.

668
00:43:42,476 --> 00:43:46,016
And they could be making money and improving, and they drag you along with it.

669
00:43:46,336 --> 00:44:02,036
And if that becomes 99% and all those AIs are getting wealthy and all the people connected to those AIs are getting wealthy, you get so much more wealthy because all boats are rising in that thing than everyone else just connected to the terrestrial economy.

670
00:44:02,036 --> 00:44:18,316
Well, and I think what that brings to me, too, I think to pull to a different piece you've written, but it, of course, all connects back, is you wrote in January that the top 10% now account for half of consumer spending for the first time in American history.

671
00:44:18,316 --> 00:44:29,436
And if the middle class, as you have called out since the end of the Cold War, is being structurally replaced, what does that mean for someone building a life outside that bracket?

672
00:44:29,736 --> 00:44:32,196
And I want to connect that to what you just said.

673
00:44:32,756 --> 00:44:45,776
Is there a window that closes in which we need to bootstrap a certain capability with regard to AI to be, as you said, drawn up into this interstellar sort of economy?

674
00:44:45,776 --> 00:44:49,556
I mean, that's a wide-ranging question, but I guess what in essence is happening?

675
00:44:49,556 --> 00:44:53,736
I think you have about 10 years. And it's like, income.

676
00:44:54,336 --> 00:44:57,256
What that meant when you had 50% of consumer spending,

677
00:44:58,576 --> 00:44:59,776
shifting to the top 10%,

678
00:44:59,776 --> 00:45:04,596
means that the longstanding capability or the cool thing about the U.S.

679
00:45:04,616 --> 00:45:06,556
was it had widespread wealth.

680
00:45:06,676 --> 00:45:10,776
And even the top-tier guys at the turn of the century, in the 1900s,

681
00:45:11,396 --> 00:45:14,256
I mean, they were a smaller segment of the total wealth of the economy

682
00:45:14,256 --> 00:45:16,696
because most of the economic activity was local.

683
00:45:17,416 --> 00:45:18,976
And so what ends up happening is that

684
00:45:18,976 --> 00:45:22,376
they're now driving our technological development.

685
00:45:22,696 --> 00:45:26,296
What happened before, prior to the U.S.,

686
00:45:26,296 --> 00:45:27,576
prior to individual land ownership,

687
00:45:28,096 --> 00:45:31,536
all of technological development was focused on the needs of the nobility and the wealthy,

688
00:45:31,796 --> 00:45:33,976
and they wanted toys and weapons.

689
00:45:35,276 --> 00:45:35,756
Nothing else.

690
00:45:35,876 --> 00:45:37,156
I mean, look at Leonardo da Vinci.

691
00:45:37,316 --> 00:45:38,916
He was like designing weapons and toys.

692
00:45:39,096 --> 00:45:40,836
It was like that, you know, and some art,

693
00:45:41,916 --> 00:45:44,136
but most technology was focused on that.

694
00:45:44,256 --> 00:45:48,956
And then it changed to, with the middle class, with this rise of individual land ownership

695
00:45:48,956 --> 00:45:56,616
shifting from the individual farmer to the financial middle-class homeowner, it shifted to appliances

696
00:45:56,616 --> 00:45:58,096
and labor-saving devices.

697
00:45:58,236 --> 00:46:00,896
Because if you're wealthy before that, you didn't need a labor-saving device.

698
00:46:00,956 --> 00:46:02,356
You just hire somebody, like, cheap.

699
00:46:02,496 --> 00:46:05,076
All the things that we now associate with progress.

700
00:46:05,856 --> 00:46:08,056
And now we're shifting back.

701
00:46:08,056 --> 00:46:13,496
And just at the time that all this new technology for AI and everything else is being developed,

702
00:46:13,496 --> 00:46:20,716
it's being directed towards the needs of that top 20 percent. Top 10 for 50, but it's like 70

703
00:46:20,716 --> 00:46:29,636
for the top 20. It's being directed towards their needs, and their needs are

704
00:46:29,636 --> 00:46:35,556
how do I compete better? And the funny part about them, the people that are gravitating up: they

705
00:46:35,556 --> 00:46:44,736
have a strict kind of social code when it comes to their own lives. So, no divorce, you know, they

706
00:46:44,736 --> 00:46:49,376
can't. No one gets divorced. Well, because divorce, and you lay out the demographics in your piece,

707
00:46:49,376 --> 00:46:55,056
if you get divorced, that's the surest way to wipe you out. If you stay single and you're not

708
00:46:55,056 --> 00:46:58,776
dual income, or you don't have kids that are going to kind of come behind you and

709
00:46:58,776 --> 00:47:03,656
push you forward with their earning capacity, and you have to track them into, you know, good,

710
00:47:03,656 --> 00:47:07,236
good earnings, and then they have to have families. And there's lots of ways to maintain that. The conclusion

711
00:47:07,236 --> 00:47:12,576
is it's a competition. And this kind of loosey-goosey, everything-is-okay,

712
00:47:12,576 --> 00:47:18,176
you know, everything-accepted, they'll say that, but they don't do that. It's a luxury morality,

713
00:47:18,176 --> 00:47:24,536
in that they'll say, oh, whatever, do whatever, you know, whatever you do in your

714
00:47:24,536 --> 00:47:29,856
personal life, it doesn't reflect on you. But they're maximizing it by restricting it. And not to say that other

715
00:47:29,856 --> 00:47:38,696
things are, you know, bad. It's just more towards some level of long-term stability and social

716
00:47:38,696 --> 00:47:44,996
cohesion at the micro level that helps you roll forward. And the more you accumulate wealth and accumulate

717
00:47:44,996 --> 00:47:51,096
leverage from all this new technology, the better off you are going to be, and whoever comes after

718
00:47:51,096 --> 00:47:55,476
you. They know what they're optimizing for, right? And so you have to, like,

719
00:47:55,476 --> 00:47:57,116
get your personal life in order

720
00:47:57,116 --> 00:47:58,796
and the personal life of

721
00:47:58,796 --> 00:48:00,596
the people around you

722
00:48:00,596 --> 00:48:01,336
focused on

723
00:48:01,336 --> 00:48:03,116
trying to go up that curve.

724
00:48:03,336 --> 00:48:04,616
And then you have to leverage yourself

725
00:48:04,616 --> 00:48:06,396
with AI and AI robotics

726
00:48:06,396 --> 00:48:07,776
and participate in this

727
00:48:07,776 --> 00:48:10,276
and try to connect yourself

728
00:48:10,276 --> 00:48:12,296
to whatever the singularity piece is.

729
00:48:12,496 --> 00:48:13,896
It'll drag you forward.

730
00:48:14,556 --> 00:48:15,416
And the opportunities

731
00:48:15,416 --> 00:48:16,456
they're going to provide you,

732
00:48:17,796 --> 00:48:19,396
I mean, it's just...

733
00:48:20,356 --> 00:48:22,596
You saw Ozempic

734
00:48:22,596 --> 00:48:23,376
and other things like that.

735
00:48:23,476 --> 00:48:24,076
We're going to have

736
00:48:25,476 --> 00:48:29,256
you know, interference agents for interfering with the aging process.

737
00:48:29,596 --> 00:48:31,536
You know, there's a genetic sequence that kicks that off.

738
00:48:31,716 --> 00:48:32,516
They're going to have stuff that,

739
00:48:32,596 --> 00:48:35,876
and that stuff is going to become available for the people that are zooming up

740
00:48:35,876 --> 00:48:37,776
much, much sooner.

741
00:48:38,376 --> 00:48:40,136
And they'll be found out, you know,

742
00:48:40,136 --> 00:48:43,916
they'll be using that and utilizing it so they'll live longer, healthier lives.

743
00:48:44,196 --> 00:48:46,216
So that advantage compounds.

744
00:48:47,176 --> 00:48:47,476
Compounds.

745
00:48:47,776 --> 00:48:51,136
And so the more workers they have or colleagues, AI colleagues they have

746
00:48:51,136 --> 00:48:54,156
working with them, the more they get pulled forward.

747
00:48:54,156 --> 00:48:58,076
So, I mean, I think if you're not starting right now, you're already at a disadvantage.

748
00:48:58,156 --> 00:49:01,936
You're not going to be on the top tier, but you can catch up through hard work.

749
00:49:02,136 --> 00:49:05,816
But you definitely need to start having leverage.

750
00:49:06,316 --> 00:49:07,536
It's all about leverage.

751
00:49:08,116 --> 00:49:09,616
I used to call it super empowerment.

752
00:49:10,256 --> 00:49:12,456
The technology allows people to be super empowered.

753
00:49:13,936 --> 00:49:24,136
And those super-empowered people can do outsized things, things that weren't possible even a decade or two ago, you know, even five years ago.

754
00:49:24,156 --> 00:49:33,216
You can leverage your ability to earn, to live better, live longer. Yeah. No, I think it's

755
00:49:33,216 --> 00:49:38,276
good. You know, the other thing you have to watch out for with this is that there's a great,

756
00:49:38,276 --> 00:49:46,316
Distraction. Distraction, yeah, yeah. So the chances that you're going to get pulled off by

757
00:49:46,316 --> 00:49:53,176
AI interactions, other, you know, AIs people are using as friends and therapists and lovers and all

758
00:49:53,176 --> 00:49:57,976
that stuff. And you add the robotics into it, it starts to become, like, you're going to see people

759
00:49:57,976 --> 00:50:03,456
fall off. You know, they're just going to be pulled off into the weeds. And I think we see this already.

760
00:50:03,456 --> 00:50:10,396
Yeah, yeah. It is the, uh, Ready Player One scenario, right, from the novel, where, you know, a

761
00:50:10,396 --> 00:50:21,116
certain percentage of us are in a hovel, in a VR rig, and our every, you know, whim is catered

762
00:50:21,116 --> 00:50:27,736
to, but we're sort of rotting as humans. I wrote a little science fiction, though, a short

763
00:50:27,736 --> 00:50:33,596
story I never really published, but it was based on the Mechanical Turk idea, that, you know, as AI

764
00:50:33,596 --> 00:50:39,036
progresses, I did it like a decade ago, it starts stealing all our

765
00:50:39,036 --> 00:50:44,636
ideas really fast and instantiating them in AI and then replacing us. So work increasingly

766
00:50:44,636 --> 00:50:48,476
becomes a situation where people do gig work

767
00:50:48,476 --> 00:50:50,196
where they learn a new task

768
00:50:50,196 --> 00:50:52,576
and they pay for the education necessary

769
00:50:52,576 --> 00:50:53,796
to learn that task

770
00:50:53,796 --> 00:50:57,076
and, you know, go into debt to do that.

771
00:50:57,316 --> 00:50:58,976
And then they do that task,

772
00:50:59,216 --> 00:51:00,636
AI watches it, replaces them,

773
00:51:00,756 --> 00:51:02,716
and then they're fired

774
00:51:02,716 --> 00:51:04,436
and they have to learn something new.

775
00:51:04,876 --> 00:51:05,656
And they pay off, you know,

776
00:51:06,096 --> 00:51:07,516
they hopefully earn enough

777
00:51:07,516 --> 00:51:09,136
to actually pay off the educational debt

778
00:51:09,136 --> 00:51:10,396
and then do it again and again.

779
00:51:10,536 --> 00:51:12,536
Like every five months, year,

780
00:51:12,876 --> 00:51:14,196
maybe down to every month,

781
00:51:14,196 --> 00:51:15,196
doing the same thing.

782
00:51:15,276 --> 00:51:16,136
Annual re-skilling, yeah.

783
00:51:16,376 --> 00:51:17,596
Yeah, and they're living in,

784
00:51:17,676 --> 00:51:19,436
like everyone's working in these trailer parks

785
00:51:19,436 --> 00:51:20,276
so they can be monitored

786
00:51:20,276 --> 00:51:21,536
to make sure that they're being,

787
00:51:21,756 --> 00:51:23,656
you know, their physical tasks

788
00:51:23,656 --> 00:51:25,516
are being captured correctly.

789
00:51:25,816 --> 00:51:26,036
Yeah.

790
00:51:26,116 --> 00:51:27,796
And it turns into these farms of people

791
00:51:27,796 --> 00:51:28,696
all around the world

792
00:51:28,696 --> 00:51:29,776
where everything,

793
00:51:30,036 --> 00:51:31,156
every task, every skill,

794
00:51:31,276 --> 00:51:33,836
every methodology that humans could do

795
00:51:33,836 --> 00:51:35,956
is all being extracted and captured.

796
00:51:36,356 --> 00:51:37,736
And nobody has that data ownership

797
00:51:37,736 --> 00:51:40,436
and it becomes just an extraction system

798
00:51:40,436 --> 00:51:42,556
for the lower 90% or 80%.

799
00:51:42,556 --> 00:51:43,036
Man.

800
00:51:43,036 --> 00:51:48,436
And that, you know, what strikes me is that where The Matrix distills that into we are all batteries,

801
00:51:48,756 --> 00:51:55,056
I think that's, you know, what you present is a more nuanced approach, which is that we are, you know, we're neural nets.

802
00:51:55,236 --> 00:51:57,516
We're neural farms for AIs.

803
00:51:58,216 --> 00:52:05,496
If they had done that, where they were actually extracting skill sets rather than just heat energy, you know, those capabilities...

804
00:52:05,956 --> 00:52:08,176
That would have been a much more interesting, scary thing.

805
00:52:08,176 --> 00:52:10,996
Yeah, now that you say it, well, there's your next project.

806
00:52:10,996 --> 00:52:16,536
I want to shift. Yeah, go ahead. Yeah, no, please. I was going to say, I want to shift,

807
00:52:16,756 --> 00:52:23,636
and it, however, taps into what you've said about the immense opportunity for distraction.

808
00:52:24,116 --> 00:52:29,276
And so in that vein, let's talk about what I would call information sovereignty and

809
00:52:29,276 --> 00:52:36,956
dare we go into Minneapolis. So you've mapped how viral content manufactures tribal identity,

810
00:52:36,956 --> 00:52:43,856
empathy triggers, you've noted, that will conscript millions into conflicts. And so it strikes me that

811
00:52:43,856 --> 00:52:48,116
that machinery is running amok right now in Minneapolis with regard to ICE, all of it.

812
00:52:48,116 --> 00:52:55,796
In light of that, how does someone maintain independent judgment when algorithms are

813
00:52:55,796 --> 00:53:16,454
optimized to capture us and to magnify rage? What does sort of sovereignty of thought look like in that regard? It's hard. I mean, a lot of very, very smart people get caught up in these, what I call, swarms. They're kicked off by an empathy trigger, like the George Floyd video.

814
00:53:16,874 --> 00:53:17,674
Now Rene Good.

815
00:53:18,034 --> 00:53:25,954
Yeah, and then Rene Good and then others, and they just... Or the invasion of Ukraine and all the pictures of people being killed.

816
00:53:26,654 --> 00:53:29,554
First it was Israel on October 7th, and then Gaza

817
00:53:29,554 --> 00:53:36,494
afterwards. These swarms just sweep. People have very low resistance levels to empathy transfer.

818
00:53:36,494 --> 00:53:41,874
And empathy is not like sympathy. It's the mental modeling of the victim. You take on

819
00:53:41,874 --> 00:53:48,274
their perspective. And that process hits very, very quickly. There's a lot of information transfer,

820
00:53:48,274 --> 00:53:54,654
and you become them. And they're connected to you almost on a tribal level. Your tribal tie is

821
00:53:54,654 --> 00:54:02,674
a kinship connection, and their enemy is your enemy, and their outrage, their fear, is your fear

822
00:54:02,674 --> 00:54:09,314
and outrage. Like when a mouse sees another mouse being electrocuted, they grit their teeth. The

823
00:54:09,314 --> 00:54:14,174
empathy transfer gives them the mental state of that mouse being electrocuted, and they take it on.

824
00:54:14,174 --> 00:54:20,274
They take on the tense muscles and the fear. And we're doing the same

825
00:54:20,274 --> 00:54:27,834
thing online now. If you feel yourself getting enraged, anything, absolutely anything, even if you

826
00:54:27,834 --> 00:54:35,954
think it's justified or not, if you see a video of something happening, don't. Okay? Because

827
00:54:35,954 --> 00:54:41,174
you're being played. And it's not, I know people who are trying to do this intentionally, but

828
00:54:41,174 --> 00:54:45,394
it doesn't have to be intentional. It could be just, like, a viral thing that takes off,

829
00:54:45,394 --> 00:54:46,694
like the George Floyd one.

830
00:54:47,614 --> 00:54:48,634
It just took off.

831
00:54:48,754 --> 00:54:50,154
And a lot of people amplified that

832
00:54:50,154 --> 00:54:51,814
trying to make it intentional.

833
00:54:51,954 --> 00:54:52,694
But the thing is,

834
00:54:53,174 --> 00:54:54,914
if you find yourself being enraged

835
00:54:54,914 --> 00:54:58,534
and your mind is going into overload, walk away,

836
00:54:58,534 --> 00:55:00,934
because you don't want to get into that state,

837
00:55:00,974 --> 00:55:02,134
you don't want to become tribalized.

838
00:55:02,234 --> 00:55:03,954
If we're talking about the long night

839
00:55:03,954 --> 00:55:04,754
and how that could happen,

840
00:55:04,874 --> 00:55:06,514
we're just like a swarm away.

841
00:55:07,154 --> 00:55:09,454
What happens is when people get into that swarm mentality,

842
00:55:09,454 --> 00:55:11,154
it's all about victory at any cost.

843
00:55:11,714 --> 00:55:14,774
Like when we did it against Russia because of Ukraine,

844
00:55:15,394 --> 00:55:21,014
We pushed this up to the edge of nuclear war, like in terms of, there was a kind of a logical,

845
00:55:21,154 --> 00:55:26,754
methodical process for unwinding this using traditional diplomacy between nuclear powers.

846
00:55:27,234 --> 00:55:29,794
They have 6,000 nukes and we should treat them like that.

847
00:55:30,554 --> 00:55:37,354
But we didn't do that because we're all in this mentality where we kind of gave in and

848
00:55:37,354 --> 00:55:38,874
kind of centralized our intelligence.

849
00:55:38,874 --> 00:55:43,814
And the intelligence was focused exclusively on how do we defeat, utterly defeat Russia?

850
00:55:44,494 --> 00:55:48,154
We disconnected them, and everyone operated at an individual level to disconnect them.

851
00:55:48,734 --> 00:55:54,334
Way down in governments that had done nothing to formally disconnect from Russia or take any action,

852
00:55:54,634 --> 00:55:58,414
there were agencies and bureaucracies disconnecting from Russia.

853
00:55:58,614 --> 00:56:03,254
Everyone was trying to find ways to kind of damage them, hurt them.

854
00:56:03,254 --> 00:56:05,094
And we intensified the conflict.

855
00:56:05,414 --> 00:56:10,354
And so the chances of any resolution or any kind of peaceful exit from it became nonexistent.

856
00:56:10,354 --> 00:56:35,294
Did you say, John, that, I mean, I forget who coined the term suicidal empathy, but it rattles around in the back of my head. And I think it was coined and used at the individual level. And we see individuals, you know, without going too into details, who are taking on, and you noted this, you know, they're feeling the physical pain in their bodies because they have been doused in this, you know, kind of content and these algorithms.

857
00:56:35,294 --> 00:56:37,694
and so they take it on individually

858
00:56:37,694 --> 00:56:39,054
and they behave in ways

859
00:56:39,054 --> 00:56:40,814
that are self-harming

860
00:56:40,814 --> 00:56:42,394
for the perceived benefit of another

861
00:56:42,394 --> 00:56:44,354
with whom they have no real connection.

862
00:56:44,474 --> 00:56:45,574
So my point there is:

863
00:56:45,574 --> 00:56:47,134
is what you've described

864
00:56:47,134 --> 00:56:48,834
this suicidal empathy

865
00:56:48,834 --> 00:56:50,734
at the swarm or network level?

866
00:56:51,534 --> 00:56:52,494
The society level, yeah.

867
00:56:52,654 --> 00:56:54,114
I mean, we hadn't seen,

868
00:56:54,234 --> 00:56:56,094
I mean, nation states go to war

869
00:56:56,094 --> 00:56:58,514
and they ramp up this tribalism, right?

870
00:56:58,554 --> 00:57:00,494
Because nationalism is a form of tribalism

871
00:57:00,494 --> 00:57:02,694
that we've weakened over time.

872
00:57:03,074 --> 00:57:03,574
It takes a while.

873
00:57:03,674 --> 00:57:04,414
It takes a lot of effort.

874
00:57:05,294 --> 00:57:07,934
A lot of propaganda, a lot of push.

875
00:57:08,954 --> 00:57:10,974
Swarms happen in weeks and days.

876
00:57:11,414 --> 00:57:13,774
Okay, so it goes at a whole society level.

877
00:57:14,014 --> 00:57:19,014
And we saw it cross the line with the Ukraine situation in war and peace.

878
00:57:19,914 --> 00:57:21,974
It changed the whole rhetoric on the war.

879
00:57:21,974 --> 00:57:30,814
From, you know, the kind of low emphasis it had with Crimea and other intrusions by Russia into Ukraine to we're fighting Hitler.

880
00:57:30,814 --> 00:57:36,354
And it changed the whole dynamic of it, and the resolution, and the loss of life that resulted.

881
00:57:36,354 --> 00:57:40,594
I mean, the million people that are dead today because of that. And we also saw, you

882
00:57:40,594 --> 00:57:46,774
know, at the nation-state level, with Israel, when they were attacked, the mentality associated with

883
00:57:46,774 --> 00:57:52,534
that justified what came after. That kind of swarm intensity, both at the nation level, and then

884
00:57:52,534 --> 00:57:59,554
now we saw it on the back side, you know, start to ramp up in defense of Gaza. So we're talking,

885
00:57:59,554 --> 00:58:01,974
at the national level,

886
00:58:02,114 --> 00:58:03,554
you could have this suicidal empathy current

887
00:58:03,554 --> 00:58:04,914
and it could be the thing

888
00:58:04,914 --> 00:58:06,654
that kicks off the long night.

889
00:58:06,854 --> 00:58:07,114
It's like,

890
00:58:07,494 --> 00:58:08,194
if everyone thinks

891
00:58:08,194 --> 00:58:09,274
that this is so dangerous

892
00:58:09,274 --> 00:58:10,834
that we have to suppress it,

893
00:58:11,094 --> 00:58:11,814
that this is like

894
00:58:11,814 --> 00:58:13,594
the end of all things,

895
00:58:13,694 --> 00:58:15,174
then why don't we lock it down?

896
00:58:15,454 --> 00:58:17,074
Victory at all costs must be...

897
00:58:17,074 --> 00:58:18,614
Yeah, and that's the camel's nose.

898
00:58:18,794 --> 00:58:20,294
Every network starts to get intrusions

899
00:58:20,294 --> 00:58:21,974
and then let's add this,

900
00:58:22,234 --> 00:58:22,934
let's add that,

901
00:58:23,294 --> 00:58:23,894
let's add this,

902
00:58:24,054 --> 00:58:24,694
and then it goes

903
00:58:24,694 --> 00:58:26,674
more and more and more

904
00:58:26,674 --> 00:58:27,854
and then all of society

905
00:58:27,854 --> 00:58:37,494
is aligned to a specific orthodoxy of behavior, and we end. Well, I know you're in the business of

906
00:58:37,494 --> 00:58:42,754
more of diagnosing, connecting and diagnosing than you are prescribing. And we touched on this

907
00:58:42,754 --> 00:58:49,294
earlier, but other than, you know, I think disconnecting from social networks is either

908
00:58:49,294 --> 00:58:56,954
impractical for some, or I would argue not enough. Someone listening thinks, wow, this is, this is,

909
00:58:57,854 --> 00:58:58,994
Why would we disconnect?

910
00:58:59,914 --> 00:59:00,554
Okay, good.

911
00:59:00,634 --> 00:59:01,394
Yeah, so let's go there.

912
00:59:01,614 --> 00:59:05,974
What are a few choices that one makes in light of all this to maintain agency,

913
00:59:06,134 --> 00:59:08,154
intellectually, economically, physically?

914
00:59:10,434 --> 00:59:12,254
Stay connected, but be skeptical.

915
00:59:12,614 --> 00:59:16,694
And don't get swept away in the emotional kind of contagion associated with empathy.

916
00:59:16,694 --> 00:59:20,834
Try to protect the others around you from being exposed, but that seems to be impossible now.

917
00:59:21,434 --> 00:59:23,954
A lot of people, you know, they get swept up in it,

918
00:59:24,094 --> 00:59:26,694
and that can affect your relationship with them if you're not.

919
00:59:27,194 --> 00:59:27,434
Yes.

920
00:59:27,854 --> 00:59:30,154
as outraged as they are.

921
00:59:32,194 --> 00:59:35,074
And then watch out for people

922
00:59:35,074 --> 00:59:38,734
trying to slide in aligned AI services.

923
00:59:39,534 --> 00:59:40,434
And what I mean aligned

924
00:59:40,434 --> 00:59:41,694
is aligned to whatever

925
00:59:41,694 --> 00:59:43,694
they think is right or wrong

926
00:59:43,694 --> 00:59:44,694
in their value system.

927
00:59:45,534 --> 00:59:48,174
It's like, we are quickly

928
00:59:48,174 --> 00:59:49,794
going to AI tutors everywhere,

929
00:59:50,014 --> 00:59:51,774
all the, everyone with any kind of money,

930
00:59:51,934 --> 00:59:52,994
their kids are being tutored

931
00:59:52,994 --> 00:59:54,474
with technological literacy.

932
00:59:54,474 --> 00:59:55,594
Their kids are doing tutors

933
00:59:55,594 --> 00:59:57,394
rather than going to standard private school.

934
00:59:57,854 --> 00:59:59,654
you know, and by AIs.

935
01:00:00,274 --> 01:00:02,454
And they're three grade levels ahead

936
01:00:02,454 --> 01:00:03,514
or four grade levels ahead

937
01:00:03,514 --> 01:00:04,514
and they're grouping together

938
01:00:04,514 --> 01:00:05,314
and they're doing this.

939
01:00:06,114 --> 01:00:08,434
But you have to watch the alignment,

940
01:00:08,594 --> 01:00:10,534
the value structure that's being put into that.

941
01:00:10,634 --> 01:00:11,094
And that takes,

942
01:00:11,314 --> 01:00:12,874
you have to know what it is

943
01:00:12,874 --> 01:00:14,094
and know to look for it.

944
01:00:14,474 --> 01:00:16,274
And then finding a system for doing that,

945
01:00:16,274 --> 01:00:19,414
that allows you to have that control.

946
01:00:19,994 --> 01:00:20,674
Some people say,

947
01:00:20,954 --> 01:00:22,594
I want these certain religious values in

948
01:00:22,594 --> 01:00:23,954
or certain viewpoints.

949
01:00:24,094 --> 01:00:25,074
Good, do that.

950
01:00:25,794 --> 01:00:27,034
But you have to find that system.

951
01:00:27,034 --> 01:00:31,954
Not like public school, where you're kind of forced in to do it. But if you're in these systems, these

952
01:00:31,954 --> 01:00:36,154
AIs, these tutors, are going to be much more intrusive with the kids growing up. They're always

953
01:00:36,154 --> 01:00:42,014
going to be surrounded by AI, and they'll be guiding them forward in their decision-making

954
01:00:42,014 --> 01:00:48,094
forever. You'll never be alone. In a few more years, you will never be alone. You'll always have this

955
01:00:48,094 --> 01:00:55,014
AI companion, hopefully not a superintelligent AI that will end up killing you, but AIs

956
01:00:55,014 --> 01:00:58,854
that are working with you and helping you become better,

957
01:00:59,874 --> 01:01:00,774
achieving your goals,

958
01:01:01,574 --> 01:01:03,654
and thereby, through that, achieving their own goals.

959
01:01:04,594 --> 01:01:05,554
So what I hear you say is,

960
01:01:05,674 --> 01:01:09,114
as one that you would entrust your children's time

961
01:01:09,114 --> 01:01:11,834
and safety with,

962
01:01:11,914 --> 01:01:14,714
is to ensure that you are capable, you're literate,

963
01:01:14,714 --> 01:01:17,314
you are able to screen, to choose,

964
01:01:17,474 --> 01:01:21,294
to select what it is that your kids will be exposed to

965
01:01:21,294 --> 01:01:22,534
because they will be exposed to it.

966
01:01:23,094 --> 01:01:24,374
And the same thing for your business and everything else.

967
01:01:24,374 --> 01:01:29,054
Be as disconnected and locally controlled as you can get.

968
01:01:29,674 --> 01:01:34,454
If you have a private VM in an AI server cloud in space,

969
01:01:34,534 --> 01:01:36,714
it's okay as long as you control it

970
01:01:36,714 --> 01:01:38,634
and nobody else is siphoning off the intelligence

971
01:01:38,634 --> 01:01:41,914
the modifications, the adaptations,

972
01:01:42,074 --> 01:01:43,614
the improvements that you're making to that AI

973
01:01:43,614 --> 01:01:45,214
to their central cloud.

974
01:01:45,854 --> 01:01:45,934
Right.

975
01:01:45,974 --> 01:01:46,554
You'll be fine.

976
01:01:47,974 --> 01:01:50,694
If you're relying on these centralized services,

977
01:01:51,014 --> 01:01:53,494
yeah, they're going to have a lot of locks and controls

978
01:01:53,494 --> 01:01:56,474
and alignment issues that are going to bite you in the ass long term.

979
01:01:57,034 --> 01:02:00,814
Every investment you make is going to be siphoned off to the cloud

980
01:02:00,814 --> 01:02:03,074
and used by them to compete against you.

981
01:02:03,254 --> 01:02:08,134
Plus, they'll be limiting and directing the behavior of your AI set.

982
01:02:09,514 --> 01:02:13,794
And then that goes for society too, like the neighborhood and the national level.

983
01:02:14,694 --> 01:02:18,174
So we have to kind of make sure that, yeah, it's going to be embedded.

984
01:02:19,294 --> 01:02:21,674
Sorry to make this so much about AI,

985
01:02:21,674 --> 01:02:25,374
but it's like, I don't think people fully appreciate how fast this is coming.

986
01:02:25,374 --> 01:02:26,054
No, how could you not?

987
01:02:26,354 --> 01:02:26,494
Right.

988
01:02:26,754 --> 01:02:28,074
Go ahead.

989
01:02:28,674 --> 01:02:29,714
No, it's not ASI.

990
01:02:29,814 --> 01:02:31,714
It's not the super intelligence that everyone's worried about.

991
01:02:31,834 --> 01:02:33,774
It's like, this is stuff that every,

992
01:02:34,074 --> 01:02:36,154
almost every AI that we've interacted with socially

993
01:02:36,154 --> 01:02:40,134
is capable of doing real work long-term.

994
01:02:40,254 --> 01:02:41,014
And it's already here.

995
01:02:41,394 --> 01:02:42,034
They're already capable.

996
01:02:43,154 --> 01:02:45,374
And I think, you know, I have gone through,

997
01:02:45,514 --> 01:02:49,054
and perhaps on a day-to-day or if not day-to-day weekly basis,

998
01:02:49,054 --> 01:02:57,554
feel the wave of: this is hype, this is real, this is hype, this is real. And I think, as objective

999
01:02:57,554 --> 01:03:03,154
as I can be, it's hard not to make this a focal point of conversation with regard to all things

1000
01:03:03,154 --> 01:03:09,954
technology, certainly, but in the vein of this show about trust and being able to perceive and

1001
01:03:09,954 --> 01:03:14,414
have the critical thought to know what is being thrown at us, what are our family members,

1002
01:03:14,414 --> 01:03:17,974
friends, colleagues, being engaged with in terms of AI.

1003
01:03:18,754 --> 01:03:23,554
And maybe on that note, John, and you've noted that you've worked with your son on some projects,

1004
01:03:23,854 --> 01:03:29,574
perhaps as specific as your son, but if not broadly, what would you advise a young person

1005
01:03:29,574 --> 01:03:31,954
who's trying to navigate the next decade?

1006
01:03:32,554 --> 01:03:35,194
What, we touched on this a little bit, but what should they go build?

1007
01:03:36,054 --> 01:03:37,394
What's the opportunity really?

1008
01:03:37,394 --> 01:03:47,994
Everything you do, regardless of the focus area, try to find ways to leverage yourself with it

1009
01:03:47,994 --> 01:03:55,714
when they start driving that. And you don't have to do something technology-related. It could be a

1010
01:03:55,714 --> 01:04:01,454
real-world problem you're solving, but you're using AI, using technological resources, to

1011
01:04:01,454 --> 01:04:06,594
super-empower yourself. As long as you're doing that, and you're doing that to the maximum, you're

1012
01:04:06,594 --> 01:04:08,154
keeping abreast of how to do that,

1013
01:04:08,414 --> 01:04:09,094
you'll do fine.

1014
01:04:09,194 --> 01:04:10,414
You don't have to go out

1015
01:04:10,414 --> 01:04:11,374
and build technology

1016
01:04:11,374 --> 01:04:12,034
or, you know,

1017
01:04:12,074 --> 01:04:13,874
or build something new

1018
01:04:13,874 --> 01:04:15,034
in the technological area.

1019
01:04:15,814 --> 01:04:17,014
Just using technology

1020
01:04:17,014 --> 01:04:18,274
to amplify your capacity.

1021
01:04:18,294 --> 01:04:18,974
Did you say leverage?

1022
01:04:19,814 --> 01:04:20,034
Yeah.

1023
01:04:20,734 --> 01:04:20,974
Yeah.

1024
01:04:22,674 --> 01:04:23,794
We were talking earlier

1025
01:04:23,794 --> 01:04:25,174
about trust and stuff

1026
01:04:25,174 --> 01:04:26,074
that could be cool

1027
01:04:26,074 --> 01:04:28,154
is that there are possibilities

1028
01:04:28,154 --> 01:04:29,314
with AI to really,

1029
01:04:29,314 --> 01:04:30,154
you know, heighten.

1030
01:04:30,454 --> 01:04:31,454
I always thought

1031
01:04:31,454 --> 01:04:32,414
that Dunbar number,

1032
01:04:32,534 --> 01:04:32,714
you know,

1033
01:04:32,774 --> 01:04:33,694
You're familiar with the Dunbar number?

1034
01:04:34,034 --> 01:04:34,214
150.

1035
01:04:34,354 --> 01:04:35,174
It's like, yeah,

1036
01:04:35,314 --> 01:04:36,354
as the limit of your,

1037
01:04:36,594 --> 01:04:41,834
relationships of who you can know enough about in order to trust them or like adjudicate the level

1038
01:04:41,834 --> 01:04:47,974
of trust that you're going to afford them, right? Is that AI could potentially serve as a way to

1039
01:04:47,974 --> 01:04:56,274
increase that by orders of magnitude. Is that I have a trusted AI who knows who I am, okay? It's been

1040
01:04:56,274 --> 01:05:04,694
adjudicated by the system as something that will tell the truth and will properly convey the assets

1041
01:05:04,694 --> 01:05:06,774
of who I am and my expected behavior in the future.

1042
01:05:07,934 --> 01:05:10,574
And you have one and you have one and you have one.

1043
01:05:10,654 --> 01:05:12,714
And it's up to a million other people who do this.

1044
01:05:13,854 --> 01:05:17,094
They interact and find people that you can trust

1045
01:05:17,094 --> 01:05:19,234
and you implicitly trust them the moment you see them.

1046
01:05:19,734 --> 01:05:20,314
A digital twin.

1047
01:05:21,254 --> 01:05:23,554
Yeah, your doppelganger, your version of you

1048
01:05:23,554 --> 01:05:28,254
that is that, what do they call that?

1049
01:05:28,254 --> 01:05:30,814
The selfish ledger.

1050
01:05:31,754 --> 01:05:32,534
Right, right.

1051
01:05:32,654 --> 01:05:34,594
We're already creating a selfish ledger of ourselves.

1052
01:05:34,694 --> 01:05:37,274
meaning that's a version of ourselves that's online,

1053
01:05:37,274 --> 01:05:38,174
because we're interacting.

1054
01:05:38,254 --> 01:05:39,354
We've never met in person, right?

1055
01:05:39,414 --> 01:05:41,334
So, you know, everything we know about each other

1056
01:05:41,334 --> 01:05:42,394
is this digital version.

1057
01:05:42,734 --> 01:05:44,194
Everything we've written, everything we do.

1058
01:05:44,734 --> 01:05:45,354
It's intermediated.

1059
01:05:45,674 --> 01:05:47,994
Yeah, and the success of that digital version

1060
01:05:47,994 --> 01:05:50,894
is our success, because it confers back on us.

1061
01:05:51,694 --> 01:05:55,394
And the reality is that that can be quantized,

1062
01:05:55,394 --> 01:05:57,694
that can be given a trust metric

1063
01:05:57,694 --> 01:06:02,774
or way of conveying that other people can adjudicate

1064
01:06:02,774 --> 01:06:04,354
as to whether or not they trust that.

1065
01:06:04,694 --> 01:06:09,694
You know, you could really quickly just flow into these trusted relationships that

1066
01:06:09,694 --> 01:06:12,534
can scale in ways that we haven't seen since tribalism.

1067
01:06:12,594 --> 01:06:13,834
Because tribalism was real trust.

1068
01:06:13,914 --> 01:06:14,634
I knew you.

1069
01:06:15,374 --> 01:06:16,374
I work with you.

1070
01:06:16,494 --> 01:06:17,454
I live with you.

1071
01:06:17,634 --> 01:06:19,814
Your success in the long term is my success.

1072
01:06:20,474 --> 01:06:22,194
If you die, I'm weaker as a result.

1073
01:06:22,394 --> 01:06:24,254
And then we went to beyond the tribe.

1074
01:06:24,334 --> 01:06:25,454
We started to barter.

1075
01:06:26,134 --> 01:06:27,674
And barters are clear transactions.

1076
01:06:27,854 --> 01:06:28,494
I give you this.

1077
01:06:28,534 --> 01:06:29,054
You give me that.

1078
01:06:29,374 --> 01:06:30,474
We only do that with enemies.

1079
01:06:31,214 --> 01:06:32,454
We only barter with enemies.

1080
01:06:32,454 --> 01:06:34,634
We gift if it's in the tribe, right?

1081
01:06:34,694 --> 01:06:39,834
And then we start bartering and we scale that and then we go, okay, well, there's too many

1082
01:06:39,834 --> 01:06:43,314
players here, too many chances of getting stolen from or defrauded.

1083
01:06:43,674 --> 01:06:44,914
Let's start building laws.

1084
01:06:45,054 --> 01:06:48,574
Let's start building fake trust, like a hex of trust.

1085
01:06:49,414 --> 01:06:52,874
And then you start to do that and it scaled and now we're getting up to, then we went

1086
01:06:52,874 --> 01:06:53,374
to global.

1087
01:06:53,634 --> 01:06:57,854
We had nationalism kind of contain that, but then it went to global and all those fake

1088
01:06:57,854 --> 01:06:59,814
trust things start to really fall apart.

1089
01:07:00,314 --> 01:07:02,614
So everybody's kind of like, how can I loot this system?

1090
01:07:02,734 --> 01:07:04,054
How can I take advantages?

1091
01:07:04,054 --> 01:07:05,054
To where we began.

1092
01:07:05,834 --> 01:07:08,454
Yes, we're back to the breakdown of trust

1093
01:07:08,454 --> 01:07:11,214
after we tried to get bigger than the tribe.

1094
01:07:12,014 --> 01:07:14,974
Can the AIs provide us a kind of a tribalism?

1095
01:07:16,154 --> 01:07:18,934
Me and my 5 million best friends

1096
01:07:18,934 --> 01:07:22,034
or people I could trust

1097
01:07:22,034 --> 01:07:24,554
have decided that we're not going to do this.

1098
01:07:25,754 --> 01:07:28,114
We're not going to deal with people who do this

1099
01:07:28,114 --> 01:07:29,354
or this or this or this.

1100
01:07:29,514 --> 01:07:31,614
And we can do things together without friction,

1101
01:07:31,614 --> 01:07:34,294
without the kind of Coasean hangups.

1102
01:07:34,574 --> 01:07:35,474
And we can keep, we can,

1103
01:07:36,294 --> 01:07:37,794
one thing about AIs,

1104
01:07:38,254 --> 01:07:40,234
incorporated AIs and groups of AIs incorporating,

1105
01:07:40,234 --> 01:07:42,454
is that it can eliminate Coasean friction

1106
01:07:42,454 --> 01:07:45,994
and, you know, down to very, very, very low levels.

1107
01:07:46,314 --> 01:07:49,194
And humans can match that or incorporate into that

1108
01:07:49,194 --> 01:07:51,934
and build this kind of system among each other.

1109
01:07:52,934 --> 01:07:54,014
Oh, you know, what just hit me, John,

1110
01:07:54,054 --> 01:07:57,554
as you were saying that, is much as price is a signal,

1111
01:07:58,514 --> 01:08:00,134
the signal that flows through free markets

1112
01:08:00,134 --> 01:08:02,874
what if trust signals can flow at the same speed,

1113
01:08:03,474 --> 01:08:05,434
with the same distributed efficacy?

1114
01:08:06,714 --> 01:08:09,274
Have you ever read the Maneki Neko?

1115
01:08:09,914 --> 01:08:10,954
The Bruce Sterling book?

1116
01:08:11,454 --> 01:08:12,494
Or it's actually a short story?

1117
01:08:13,234 --> 01:08:13,714
You don't know.

1118
01:08:14,634 --> 01:08:18,334
Okay, so look up the Bruce Sterling short story,

1119
01:08:18,634 --> 01:08:20,614
Maneki Neko, the cat.

1120
01:08:21,154 --> 01:08:23,074
I consider myself a fan,

1121
01:08:23,134 --> 01:08:25,294
so I'm embarrassed that I can't count that as one I've read.

1122
01:08:25,554 --> 01:08:27,074
It's one of the best things he ever wrote.

1123
01:08:27,074 --> 01:08:36,334
It's about this kind of, uh, AI-enabled system for gifting, okay? And so he doesn't really get

1124
01:08:36,334 --> 01:08:41,314
into the AI piece, but it's the system that knows what you want by watching you, and what you

1125
01:08:41,314 --> 01:08:46,254
desire, and it finds somebody that has the ability to give it to you, and you build up kind of credit.

1126
01:08:46,254 --> 01:08:52,774
And what happens is you give stuff that you can give to the people that need it. And it's finding

1127
01:08:52,774 --> 01:08:54,434
what they really need versus an income.

1128
01:08:54,914 --> 01:08:56,454
So you need a place to crash

1129
01:08:56,454 --> 01:08:58,014
or you need some supplies for your art.

1130
01:08:58,534 --> 01:09:01,134
You need to find people that are appreciative of the art

1131
01:09:01,134 --> 01:09:01,914
or whatever.

1132
01:09:02,134 --> 01:09:05,194
It becomes this whole system that runs parallel to the economy

1133
01:09:05,194 --> 01:09:06,454
and it kind of takes off in Japan.

1134
01:09:07,234 --> 01:09:08,334
And it's like...

1135
01:09:08,334 --> 01:09:09,154
Because of course it would.

1136
01:09:09,734 --> 01:09:09,954
Yeah.

1137
01:09:10,054 --> 01:09:11,454
And the rest of the country is like,

1138
01:09:11,974 --> 01:09:13,314
why is the economy evaporating?

1139
01:09:13,334 --> 01:09:14,254
We can't stop this.

1140
01:09:14,614 --> 01:09:17,034
And it's like they try to come in and regulate it and slow it.

1141
01:09:17,374 --> 01:09:20,294
But it turns everything into a gifting economy

1142
01:09:20,294 --> 01:09:22,374
that provides you everything you...

1143
01:09:22,774 --> 01:09:27,414
think you need versus what we've been taught that we should need.

1144
01:09:28,014 --> 01:09:29,494
Coincidence of what you really need.

1145
01:09:29,574 --> 01:09:29,654
Yeah.

1146
01:09:29,654 --> 01:09:30,714
I was like, yeah.

1147
01:09:30,934 --> 01:09:36,874
And matching them without the true economic intermediary or mechanism that we go through.

1148
01:09:37,094 --> 01:09:37,534
Right.

1149
01:09:37,774 --> 01:09:38,254
Pretty cool.

1150
01:09:38,994 --> 01:09:39,434
Incredibly.

1151
01:09:39,514 --> 01:09:43,914
And I think that, you know, that's maybe a terrific place to wrap it up is I do find

1152
01:09:43,914 --> 01:09:50,974
myself personally in a delicate balance, sometimes, between being in awe of the possibility

1153
01:09:50,974 --> 01:09:55,914
and the potential, as a lifelong, you know, nerd, seeing some of this science fiction that I read

1154
01:09:55,914 --> 01:10:02,814
as a kid come true. And as you've laid out some very potent cautionary tales of what can go wrong.

1155
01:10:03,074 --> 01:10:07,794
And so I think that's why, John, I was really eager to have this conversation. I'm grateful

1156
01:10:07,794 --> 01:10:13,114
for your time and your insights. And I cannot strongly enough suggest people subscribe to

1157
01:10:13,114 --> 01:10:17,874
Global Guerrillas on Substack because it is a very unique collection of work that gives,

1158
01:10:17,874 --> 01:10:21,394
I think, insight into what's coming and what could be right around the corner.

1159
01:10:22,834 --> 01:10:23,434
Thank you so much.

1160
01:10:23,994 --> 01:10:24,574
Been fun, Sean.

1161
01:10:24,854 --> 01:10:25,854
Thank you so much, John.

1162
01:10:25,934 --> 01:10:28,754
Hope to do this very soon and all the best.

1163
01:10:29,534 --> 01:10:29,774
All right.

1164
01:10:30,214 --> 01:10:30,494
Take care.
