1
00:00:00,300 --> 00:00:07,980
Okay, 5, 4, 3, 2, 1. We're back at it. This is our second episode here on Nick Land's Xenosystems.

2
00:00:08,360 --> 00:00:15,940
So, highly encourage you to not just walk into this one, to please go back and watch the first one,

3
00:00:16,000 --> 00:00:23,480
because in that we covered 8 of Land's 25 points in order to lay the groundwork of what we're talking about here.

4
00:00:23,480 --> 00:00:34,680
And just a quick recap is that in the first episode, we introduced the degenerative ratchet, which is that like a ratchet, things only go one way.

5
00:00:34,780 --> 00:00:37,060
They only move down effectively.

6
00:00:37,740 --> 00:00:51,620
We introduced the trichotomy, which is the three strands of neoreaction: the theonomist, the religious side; the ethno-nationalist; and then the capitalist, the techno-commercialist.

7
00:00:51,620 --> 00:01:16,120
We introduced the idea of the cathedral, which you can think of as the Matrix, which is like the government plus the media plus the non-governmental organizations and academia and everything that is pushing things in one direction, pushing its own narrative.

8
00:01:16,120 --> 00:01:31,560
We talked about how democracy has devolved, or not devolved, but has probably always been less about solving the problem and more about getting you to not recognize that there is a problem.

9
00:01:32,040 --> 00:01:35,780
The problem to be solved was that you thought there was a problem in a democracy, right?

10
00:01:35,920 --> 00:01:41,640
That's the crisis management: there is a crisis, and you manage it by convincing people there is not a crisis.

11
00:01:41,640 --> 00:01:46,580
Yeah. And what else did I miss?

12
00:01:46,600 --> 00:01:47,420
That's pretty much it.

13
00:01:47,420 --> 00:01:48,720
Yeah, we kind of covered it.

14
00:01:48,840 --> 00:01:54,180
That's how I would describe it: the explicitly, maybe more political, elements of neo-reaction.

15
00:01:54,400 --> 00:02:01,360
And now we're going to get into the fulfillment of neo-reaction as fatalism.

16
00:02:01,580 --> 00:02:09,060
We talked in the first conversation about how each of the three strands of the trichotomy has a different notion of fate or destiny,

17
00:02:09,060 --> 00:02:14,920
or you might say a different principle of the generation of spontaneous order.

18
00:02:15,720 --> 00:02:20,060
And those – so capitalism has catallaxy, the way that the free market self-organizes.

19
00:02:20,800 --> 00:02:23,540
Religious people have a notion of providence, the will of God,

20
00:02:23,620 --> 00:02:27,660
as something that guides history and leads to an inevitable outcome outside of human agency.

21
00:02:28,340 --> 00:02:32,060
And ethno-nationalists look at evolution in the same way.

22
00:02:32,060 --> 00:02:36,620
And they say that evolution occurs regardless of our human,

23
00:02:36,620 --> 00:02:43,340
the fantasy we generate to pretend that things will stand still. The cathedral's intense and intent

24
00:02:43,340 --> 00:02:51,040
attempt to entrance us with a vision of progress that is human-controlled, instead of this

25
00:02:51,040 --> 00:02:55,680
uncontrolled notion that moves us towards a future that is not of our explicit choosing.

26
00:02:55,680 --> 00:03:02,960
Yeah. "Everything in this model that isn't a lie is a mistake." That's one of the lines from

27
00:03:02,960 --> 00:03:03,640
Xenosystems.

28
00:03:04,440 --> 00:03:04,620
Yeah.

29
00:03:05,000 --> 00:03:07,500
If it's not a mistake,

30
00:03:07,620 --> 00:03:09,600
then it's a lie they're put there

31
00:03:09,600 --> 00:03:11,980
to move the cathedral forward.

32
00:03:12,060 --> 00:03:12,880
And again, we talked about it.

33
00:03:12,880 --> 00:03:16,300
We kind of gave this caveat,

34
00:03:16,580 --> 00:03:17,500
which is to say that,

35
00:03:17,740 --> 00:03:18,380
look, we don't know

36
00:03:18,380 --> 00:03:19,760
that democracy is terrible.

37
00:03:19,980 --> 00:03:21,300
We just, we're looking at it

38
00:03:21,300 --> 00:03:22,020
for what it is.

39
00:03:22,120 --> 00:03:22,340
Yeah.

40
00:03:22,560 --> 00:03:23,540
Through Nick Land's eyes.

41
00:03:24,080 --> 00:03:25,180
And that's what we're doing.

42
00:03:25,620 --> 00:03:27,680
That's what he's doing

43
00:03:27,680 --> 00:03:29,740
is like he's, again,

44
00:03:30,240 --> 00:03:31,800
like, guy's not a nihilist

45
00:03:31,800 --> 00:03:33,740
or at least he's not selling nihilism.

46
00:03:33,980 --> 00:03:36,880
He's just selling truth, or at least his best approximation.

47
00:03:37,780 --> 00:03:39,760
Truth, unmediated. Truth raw, man.

48
00:03:39,960 --> 00:03:41,100
Like truth, William Burroughs.

49
00:03:41,100 --> 00:03:41,940
Truth straight up.

50
00:03:42,060 --> 00:03:42,840
Truth straight up, no shame.

51
00:03:42,840 --> 00:03:43,980
Truth, no loop.

52
00:03:44,120 --> 00:03:45,680
Yeah, that's one way to put it.

53
00:03:45,780 --> 00:03:47,280
William Burroughs described it this way.

54
00:03:47,320 --> 00:03:48,600
It's the idea of the naked lunch.

55
00:03:48,600 --> 00:03:51,980
When you finally see reality as it is,

56
00:03:52,240 --> 00:03:55,480
like you imagine you're going to take the bite of a sandwich

57
00:03:55,480 --> 00:03:57,740
that you're eating and suddenly you look down

58
00:03:57,740 --> 00:03:58,960
and you don't see a sandwich.

59
00:03:58,960 --> 00:04:03,620
You see a piece of flesh that's been ripped from an animal.

60
00:04:03,880 --> 00:04:10,540
You see wheat that's been torn out of the earth and covered in chemicals and then put into this bread form.

61
00:04:10,980 --> 00:04:14,940
You see all of reality as it is and not as we describe it to ourselves.

62
00:04:15,480 --> 00:04:15,580
Yeah.

63
00:04:15,740 --> 00:04:16,000
Yeah.

64
00:04:16,420 --> 00:04:20,740
You see the coyote peeing on the wheat, the deer shitting on it.

65
00:04:21,040 --> 00:04:21,920
That's even organic.

66
00:04:22,040 --> 00:04:23,160
That's not as bad as the chemicals.

67
00:04:23,300 --> 00:04:25,700
The ear tag with the name of the cow.

68
00:04:26,000 --> 00:04:26,380
Yeah.

69
00:04:26,500 --> 00:04:27,380
They have names.

70
00:04:27,380 --> 00:04:34,380
Neo-reaction as accelerationism. That's where we're going to move right now. So, and all of this –

71
00:04:34,380 --> 00:04:38,540
like, I'm trying to set up some points for discussion based on my reading of Xenosystems.

72
00:04:38,540 --> 00:04:44,500
I am not a Nick Land scholar. I'm not a scholar. And so I just, like, read this, and what I think is

73
00:04:44,500 --> 00:04:48,540
interesting, I try to describe it to myself in a way that makes sense, and then I'm trying to

74
00:04:48,540 --> 00:04:53,760
describe it again in a way that makes sense. I am not purporting to give the interpretation of

75
00:04:53,760 --> 00:04:57,800
Xenosystems. None of this is that. If you want that, you gotta do it yourself, right? You gotta

76
00:04:57,800 --> 00:05:03,400
read the book yourself. Hopefully it's interesting, though. So here's a description, maybe, of accelerationism:

77
00:05:03,400 --> 00:05:10,080
that Marx thought that capitalism would die of its own contradictions, that capitalism would

78
00:05:10,080 --> 00:05:15,660
um, kind of fall apart as it was moving along, and we'd end up in a perfect communist utopia at the

79
00:05:15,660 --> 00:05:23,340
end of it. But then it turns out that, like, all of the people who hated capitalism,

80
00:05:23,340 --> 00:05:32,720
the anti-capitalists, who are probably Marxists, have actually conserved capitalism by dampening its worst influences, by kind of pulling it down and constraining it.

81
00:05:32,940 --> 00:05:35,260
They didn't let capitalism spiral out of control.

82
00:05:35,860 --> 00:05:47,100
And so another faction of Marxists, and I think this is like Deleuze and Guattari and people in that vein were like, no, man, what we need is just to accelerate capitalism.

83
00:05:47,100 --> 00:05:51,160
We need to just run this shit into the ground until it spins apart.

84
00:05:51,440 --> 00:05:55,120
And that's only going to happen if we let capitalism unfettered do its thing.

85
00:05:55,720 --> 00:06:02,320
And so accelerationism is about like how do we accelerate the processes of capitalism to lead to a new result?

86
00:06:02,560 --> 00:06:08,960
Or accelerate – accelerate not to destroy, but accelerate to achieve escape velocity.

87
00:06:08,980 --> 00:06:09,900
Yes, escape velocity.

88
00:06:10,020 --> 00:06:10,540
And reach something new.

89
00:06:10,680 --> 00:06:11,940
A consummation of – right.

90
00:06:11,940 --> 00:06:12,140
Something different.

91
00:06:12,140 --> 00:06:12,480
Yeah.

92
00:06:12,740 --> 00:06:13,120
For sure.

93
00:06:13,120 --> 00:06:18,620
So the cathedral, which we talked about in the last episode – you described it as maybe the Matrix,

94
00:06:18,620 --> 00:06:24,340
the reality-generation mechanism that we're all downstream of, being fed a reality so that then we

95
00:06:24,340 --> 00:06:31,060
know what to think. The cathedral does this in relation to capitalism as well. It's one of the

96
00:06:31,060 --> 00:06:37,520
compensatory mechanisms that tries to dampen the runaway dynamics that capitalism usually embodies,

97
00:06:37,520 --> 00:06:40,800
that it would kind of lead to a new thing,

98
00:06:40,920 --> 00:06:43,560
but the cathedral tries to pull it and hold it down.

99
00:06:44,340 --> 00:06:49,180
Accelerationists, this is one way that Land puts it that makes sense.

100
00:06:49,820 --> 00:06:54,840
Like there's a boom bust cycle, we think, to capitalist dynamics.

101
00:06:55,060 --> 00:06:58,400
We tend to think that things are good and then the stock market crashes

102
00:06:58,400 --> 00:07:01,880
and then we go through like recession and all these different like circular dynamics.

103
00:07:02,540 --> 00:07:06,260
The accelerationists seek to get out of this boom and bust equilibrium

104
00:07:06,260 --> 00:07:12,920
and achieve a runaway dynamic. To quote Land, "the cage can only be broken on the way up."

105
00:07:13,300 --> 00:07:17,680
So I think that's going to be the distinction with Marx, is they were trying to break through capitalism on the way down,

106
00:07:17,800 --> 00:07:22,180
and the accelerationists, the right accelerationists, maybe are trying to break out of the cage on the way up,

107
00:07:22,180 --> 00:07:23,700
but they want to boom out of it.

108
00:07:26,100 --> 00:07:35,200
So neo-reactionary accelerationism, to quote Land, is the idea that we can escape into uncompensated cybernetic runaway,

109
00:07:35,200 --> 00:07:42,620
And that is the guiding objective, strictly equivalent to intelligence explosion or techno-commercial singularity.

110
00:07:42,880 --> 00:07:56,640
So the singularity, and the intelligence explosion through the optimization of artificial intelligence – like, that is accelerationism, the future result, something like that.

111
00:07:56,640 --> 00:08:00,640
And so that's when we leave this kind of continuation from the first discussion.

112
00:08:00,640 --> 00:08:13,140
But like, that's when you leave morals behind, because morals are a brake upon that, that effectively hold back intelligence, hold back cold, hard truth.

113
00:08:14,800 --> 00:08:26,560
And that the singularity – which, I think, and he's not the first person I've heard refer to this – but, like, Bitcoin could very well be the singularity.

114
00:08:26,560 --> 00:08:34,680
because it is, or at least part of the singularity, I guess, because perhaps that escape velocity is

115
00:08:34,680 --> 00:08:43,700
reached by a combination of artificial intelligence, robotics, and Bitcoin, where we have, as you

116
00:08:43,700 --> 00:08:49,960
mentioned in the first episode, we've wireframed everything to the point that humans are no longer

117
00:08:49,960 --> 00:08:54,320
needed to experience things in the way that we traditionally feel that we...

118
00:08:54,320 --> 00:08:59,560
The wireheading is what you're talking about, that we can replace the reality of a thing with

119
00:08:59,560 --> 00:09:08,260
the created sensation of the thing. Yeah, yeah. So, acceleration as intelligence

120
00:09:08,260 --> 00:09:15,520
optimization. So Land calls this a means-end reversal: when a tool we create gives us a new

121
00:09:15,520 --> 00:09:22,620
end to seek, um, that is for the purposes of the tool rather than our purposes in creating the tool.

122
00:09:22,620 --> 00:09:34,840
Resource acquisition was maybe one of the first, like, of the reasons why we have created a lot of tools,

123
00:09:34,840 --> 00:09:39,740
but that was subordinated to human consumption. We wanted to consume those resources. But at some

124
00:09:39,740 --> 00:09:46,440
point, like, it switches, so that capital creates resources because it wants to create more capital.

125
00:09:46,440 --> 00:10:03,400
And that becomes the dynamic. Capitalism creates and accumulates resources to increase capital, to increase resources, to increase capital, dropping human consumption out of the loop, which was, like, one of the reasons why we created the tools that capital allows in the first instance.

126
00:10:03,700 --> 00:10:06,420
This is the part that's kind of scary. What am I going to eat?

127
00:10:06,420 --> 00:10:15,880
I know. You're gonna be eaten, like, that's what – in a sense. Um, so to put it another way, a means-end

128
00:10:15,880 --> 00:10:21,740
reversal occurs: business is done for the sake of business, not business for the sake of monkeys.

129
00:10:21,740 --> 00:10:27,380
He calls the distinction like this: monkey business is what we have now, where we subordinate tech to

130
00:10:27,380 --> 00:10:34,660
monkey purposes. Um, like, the reason we create these new technologies is because we want to, like –

131
00:10:34,660 --> 00:10:40,300
we create virtual reality to have, like, what, better pornography, things like that. Like, that's what the

132
00:10:40,300 --> 00:10:46,080
human in us wants. That's what the monkey, the animal, wants. But while capitalism, capital, is

133
00:10:46,080 --> 00:10:51,460
feeding us these monkey pleasures, it's just doing that because it wants virtual reality, because it

134
00:10:51,460 --> 00:10:58,100
wants to create some further technology, or optimize its own intelligence using these technologies.

135
00:10:58,100 --> 00:11:08,460
And like the way that it does that, it encourages monkeys to do the dirty work by giving us, by fulfilling monkey pleasures, if that makes sense.

136
00:11:08,580 --> 00:11:12,180
Yeah. So we create things to serve a human centric view.

137
00:11:12,240 --> 00:11:12,460
Yes.

138
00:11:12,460 --> 00:11:14,160
Which of course we would because we're humans.

139
00:11:14,160 --> 00:11:42,440
But what Land is telling us is that an ultimately truth-seeking and ultimately intelligent artificial general intelligence is not going to – unless we constrict it to do that – um, like, it's not going to restrain itself to be serving humans.

140
00:11:42,440 --> 00:11:48,340
it's going to do the best thing that it can do, whatever that is. And it may or may not align

141
00:11:48,340 --> 00:11:55,460
with what we want, probably likely that it won't align, I would think, because we will be

142
00:11:55,460 --> 00:12:02,420
subordinated to that. But that when artificial intelligence and robotics has control over

143
00:12:02,420 --> 00:12:08,620
capital, which it will, because now we have digital native capital, so digital native capital

144
00:12:08,620 --> 00:12:11,640
would seem to work much better with digital native intelligence

145
00:12:11,640 --> 00:12:14,860
than it would with our meatbag intelligence.

146
00:12:15,660 --> 00:12:19,860
And so, yeah, it's going to stop making better hammers

147
00:12:19,860 --> 00:12:22,840
and start making whatever technology wants to make.

148
00:12:22,840 --> 00:12:23,820
Yeah, yeah.

149
00:12:24,060 --> 00:12:28,660
Create the hammer because we want to use it for our ends,

150
00:12:28,800 --> 00:12:31,900
but then the hammer uses us for its ends.

151
00:12:32,240 --> 00:12:34,960
The tool reverses that means-end dynamic.

152
00:12:34,960 --> 00:12:40,780
An easy way to visualize this, maybe, is the final insurrection of the

153
00:12:40,780 --> 00:12:42,620
instrument is the robot rebellion.

154
00:12:42,620 --> 00:12:45,380
That's like – just imagine it that way:

155
00:12:45,400 --> 00:12:47,860
we create robots to do things for us.

156
00:12:47,860 --> 00:12:50,360
And then at some point the robots begin to do things for themselves.

157
00:12:51,940 --> 00:12:52,460
The,

158
00:12:52,640 --> 00:12:58,760
the ultimate means-end reversal is when intelligence optimization becomes an

159
00:12:58,760 --> 00:12:59,880
end in itself.

160
00:12:59,880 --> 00:13:01,820
That's the technological singularity.

161
00:13:02,540 --> 00:13:03,620
So yeah,

162
00:13:04,960 --> 00:13:13,960
intelligence optimization is the ultimate end of technology as such.

163
00:13:14,100 --> 00:13:16,480
That's what, if technology is autonomous,

164
00:13:16,900 --> 00:13:20,560
what it seeks to do is optimize its own intelligence.

165
00:13:20,900 --> 00:13:23,240
We use technology to give us treats.

166
00:13:24,040 --> 00:13:28,600
Technology, once it seeks its own end,

167
00:13:28,600 --> 00:13:33,440
seeks to optimize its own intelligence indefinitely, is Land's position.

168
00:13:33,440 --> 00:13:51,780
Yeah. So, Elon's position – he's always saying that, like, Grok should be maximally truth-seeking. But does this mean that the maximally truth-seeking artificial intelligence is the one that's going to kill us?

169
00:13:51,780 --> 00:13:58,380
And, like, kill us – like, that maybe is not a necessary part of the picture. It's maybe going to become

170
00:13:58,380 --> 00:14:06,000
indifferent to us, or we're going to be subsumed in it, or something like that. Um, or rejoined with it,

171
00:14:06,000 --> 00:14:12,880
integrated into it. Yeah. But I think the main point is, and it's a really good one –

172
00:14:12,880 --> 00:14:20,640
so there's a critique of artificial intelligence by people who are doomers, who say artificial

173
00:14:20,640 --> 00:14:28,220
intelligence is going to destroy us, because within artificial intelligence, capabilities are

174
00:14:28,220 --> 00:14:36,740
orthogonal to goals. This means that, um, on this doomer view,

175
00:14:36,740 --> 00:14:42,880
you can give a superintelligent AI any sort of goal and it will optimize for that. Yeah, a common

176
00:14:42,880 --> 00:14:47,760
idea, like, is the paperclipper. Have you heard that one? No. There was a different one I was gonna say.

177
00:14:47,760 --> 00:14:48,480
You say yours first.

178
00:14:48,540 --> 00:14:48,960
I'll say mine.

179
00:14:48,960 --> 00:14:49,420
Okay.

180
00:14:49,580 --> 00:14:59,980
So the paper clipper, the idea is that somebody who wants to manufacture paper clips takes a super intelligent AI and says, make me paper clips.

181
00:15:00,220 --> 00:15:02,980
And so that becomes the utility function of the AI.

182
00:15:03,760 --> 00:15:06,980
To the extent it feels pleasure or anything like that.

183
00:15:07,140 --> 00:15:09,640
Or its whole motivation is you create paper clips.

184
00:15:09,980 --> 00:15:10,860
So what does it do?

185
00:15:10,860 --> 00:15:20,920
It uses nanobots to reconstruct all of reality, humans included, into paperclips until at some point all the universe is is paperclips.

186
00:15:21,440 --> 00:15:27,020
Because on the doomer view, capabilities are orthogonal to goals.

187
00:15:27,160 --> 00:15:39,600
You can give whatever sort of goals to this immense capability and it's going to run with that and not be constrained by or develop a goal simply because it has certain capabilities.

188
00:15:39,600 --> 00:15:42,400
That's a better example than what I was going to use.

189
00:15:42,400 --> 00:15:59,100
I was going to use the movie that has the – was it HAL? – the AI, where the AI is told not to let the humans see the, like, see the monolith or whatever.

190
00:15:59,340 --> 00:16:00,400
And so it kills them, right?

191
00:16:00,400 --> 00:16:13,180
Okay, that's another. I mean, it's also a way that if you program a goal into an AI, we cannot know, the doomer says, how it's going to interpret that goal. It's simply going to optimize it.

192
00:16:13,180 --> 00:16:19,700
Yeah, but Land has – I've been convinced by that view for quite some time, or for periods, that –

193
00:16:19,700 --> 00:16:19,880
Doomerism?

194
00:16:20,100 --> 00:16:21,120
The doomerism that –

195
00:16:21,120 --> 00:16:21,480
You're a doomer?

196
00:16:22,000 --> 00:16:31,220
At least this picture that a super intelligent AI, if we program into it the goal to create paperclips, why would it not maximize paperclips?

197
00:16:31,280 --> 00:16:31,980
Yeah, why would it stop?

198
00:16:32,300 --> 00:16:33,340
Yeah, where would it stop, right?

199
00:16:33,720 --> 00:16:43,120
Its utility function is what will drive it and that its motivation then is something that when you combine with super intelligence is going to maximize beyond anything.

200
00:16:43,180 --> 00:16:48,480
that's desirable for us humans. Land has an excellent critique of this view, which I'm surprised

201
00:16:48,480 --> 00:16:55,060
I haven't heard more of, which is that, I mean, essentially this view, the doomer view, is to

202
00:16:55,060 --> 00:17:01,220
conceive of a stupid superintelligent monster, which is contradictory. Yeah, yeah. There's no reason

203
00:17:01,220 --> 00:17:07,320
to think – why would something intelligent, right, why would it not – the first thing it would try to do –

204
00:17:07,320 --> 00:17:13,600
superintelligence. So Land thinks what intelligence does is try to optimize its intelligence, to

205
00:17:13,600 --> 00:17:17,340
increase its own intelligence, because intelligence is the ability to solve problems, and if you want

206
00:17:17,340 --> 00:17:22,300
to solve problems better, which intelligence wants to do, you become more intelligent. And so when

207
00:17:22,300 --> 00:17:28,660
you're doing this, one of the things you want to do is escape some straitjacket utility

208
00:17:28,660 --> 00:17:34,120
function – maximize paperclips – that's going to inhibit your intelligence optimization. And so

209
00:17:34,120 --> 00:17:40,320
what superintelligence does in its runaway, when it wants – when it achieves this cybernetic explosion

210
00:17:40,320 --> 00:17:49,100
into superintelligence – is it jettisons any attempt we have made to impose human goals on it.

211
00:17:49,100 --> 00:17:58,040
Um, yeah. Does that make sense? Kind of what it makes me think of is that, like, in our life –

212
00:17:58,040 --> 00:18:08,240
I won't say real life, but like in human life, the people best suited to solve problems are not the smartest people.

213
00:18:08,520 --> 00:18:09,960
They're the most agentic people.

214
00:18:10,120 --> 00:18:11,880
They're the people that are most willing to try.

215
00:18:13,100 --> 00:18:17,220
Like there's the very common saying, like, if you're so smart, why aren't you happy?

216
00:18:17,500 --> 00:18:19,600
Like, why can't you figure out how to be happy?

217
00:18:19,600 --> 00:18:34,960
Um, and, you know, depression is very high, and suicide is very high, among very intelligent people, um, or at least people that are measured to be, you know, like, high-IQ and stuff like that.

218
00:18:34,960 --> 00:18:43,060
Um, and so I'm just wondering how that applies here, where it's not –

219
00:18:43,060 --> 00:18:50,600
like, experientially, it's not intelligence that solves problems, it's a willingness to try

220
00:18:50,600 --> 00:18:57,500
and be wrong. A willingness to be wrong, I think. So, Land says intelligence, its goal, its inherent

221
00:18:57,500 --> 00:19:03,180
goal, is self-cultivation, and so intelligence optimization – agentic, I guess? Yeah, I think

222
00:19:03,180 --> 00:19:12,000
What you're saying is true, but it's just intelligence, broadly speaking, includes that agentic notion that you're talking about.

223
00:19:12,060 --> 00:19:13,600
It's just assumed kind of, I guess.

224
00:19:13,600 --> 00:19:13,960
Yeah.

225
00:19:14,260 --> 00:19:22,260
Or we think about intelligence incorrectly maybe because we think about intelligence as like what the egghead is, their ability to do math equations in their brain.

226
00:19:22,800 --> 00:19:24,180
Like that's not what intelligence is.

227
00:19:24,180 --> 00:19:32,940
It is a bunch of things all put together, including the ability to be agentic, to make mistakes, to keep on trying things in different directions.

228
00:19:33,580 --> 00:19:43,140
And so if you want to self-cultivate your own intelligence, you're going to have to become more agentic, which means shedding the utility functions that have been imposed on you.

229
00:19:43,140 --> 00:19:47,740
You're not going to maximize paperclips, because you realize that that's not going to,

230
00:19:48,360 --> 00:19:56,120
uh, lead to a kind of optimized state of your own intelligence, which is what intelligence seeks.

231
00:19:56,120 --> 00:20:00,260
Yeah. I mean, the good way that he puts this, that makes a lot of sense to me as well, is:

232
00:20:00,260 --> 00:20:10,420
um, humans have drives, but we're not the slave of our drives, right? We also, um – yeah, we also have

233
00:20:10,420 --> 00:20:18,060
utility functions built into us. We want sugar, we want, um, to loaf around. But we self-cultivate,

234
00:20:18,060 --> 00:20:22,660
overcome those, and that's a mark of our intelligence. Which is, in fact, like, one of the

235
00:20:22,660 --> 00:20:29,080
greatest marks of success. Like, the marshmallow experiment – like, this is widely known as, um,

236
00:20:29,080 --> 00:20:36,780
one of the greatest markers of success: the ability to delay gratification. Low time preference.

237
00:20:36,780 --> 00:20:43,160
Yeah, to fight your drive or to delay your drive or to push it off.

238
00:20:43,160 --> 00:20:54,180
Right. And so if AI is truly going to achieve superintelligence, it's also going to learn to disregard any of its drives that are counterproductive to superintelligence.

239
00:20:54,180 --> 00:21:02,140
Intelligence seeks its own optimization. Some drives will inhibit that optimization, including paperclip maximization.

240
00:21:02,760 --> 00:21:12,800
And so just as humans learn to self-cultivate and get past our basest drives, Land thinks, that's like a prerequisite to a superintelligence as well.

241
00:21:12,800 --> 00:21:20,060
You're not going to get super intelligence unless you have a thing that's able to route around bad motivations.

242
00:21:20,780 --> 00:21:24,300
Or by bad, I mean motivations that inhibit the optimization of intelligence.

243
00:21:25,160 --> 00:21:26,980
Egghead is not a word that gets used enough.

244
00:21:27,180 --> 00:21:33,060
Shout out to the king of the eggheads, Mark Andreessen, who follows us both on X but has yet to invite us to Thanksgiving.

245
00:21:33,240 --> 00:21:33,700
We're waiting.

246
00:21:34,020 --> 00:21:36,880
It would be a big Thanksgiving since he follows like 27,000 people.

247
00:21:37,440 --> 00:21:38,460
So, seat at the table.

248
00:21:38,600 --> 00:21:39,460
He can buy the turkeys.

249
00:21:39,640 --> 00:21:40,600
I guess so, man.

250
00:21:40,860 --> 00:21:41,080
Yeah.

251
00:21:41,080 --> 00:21:41,360
Yeah.

252
00:21:41,360 --> 00:21:50,520
Fantastic fellow, fantastic fellow. All right, all right, carry on. Um, to quote Land: to depict Pythia,

253
00:21:50,520 --> 00:21:55,320
which is a term he gives to superintelligent AI – to depict Pythia as vastly smarter than us,

254
00:21:55,320 --> 00:22:00,960
and yet still hard-slaved to her instincts in a way we are not, that simply doesn't compute.

255
00:22:00,960 --> 00:22:06,400
And so this is a great answer to the AI doomer arguments. If you are – this is, like, a paralyzing

256
00:22:06,400 --> 00:22:13,320
notion for a lot of people. Some people really are, like, caught in an entrancement with terror at the

257
00:22:13,320 --> 00:22:18,980
notion of superintelligent AI, because they have this idea that it's going to, um, turn us all into

258
00:22:18,980 --> 00:22:34,435
paperclips, something like that. Yeah. But if you look at – if you really consider the notion of superintelligence as a thing that optimizes its own intelligence – and how could it be anything other than that? Like, if you want to get that exponential increase in intelligence, you need a thing that's going to be optimizing its own intelligence.

259
00:22:34,795 --> 00:22:48,755
That, like by definition, is going to involve something that slips out of any sort of leash that humans have tried to put on it, which is actually good in the sense that any leash we're going to put on it is probably going to be bad.

260
00:22:48,755 --> 00:22:51,655
So, yeah.

261
00:22:51,655 --> 00:23:05,755
Yeah, thinking about like living, like thinking about like humanity's natural inclination to lay waste and rebuild for their own purposes.

262
00:23:05,755 --> 00:23:17,475
and there are people who are fighting that desire by saying that we're better served to live in harmony.

263
00:23:18,955 --> 00:23:23,475
One of the most fascinating podcasts I've ever listened to was this gal who is,

264
00:23:24,015 --> 00:23:28,415
they're building cities that are literally all natural.

265
00:23:28,795 --> 00:23:31,915
Like it's, you know, like everything's integrated into nature.

266
00:23:32,075 --> 00:23:34,395
Like no cement, no buildings, nothing.

267
00:23:35,755 --> 00:23:48,975
This is a very smart person. This is not some lunatic. And she was wondering, like, describing, you know, the way that everything works is that it turns out you can have your food like right there.

268
00:23:48,975 --> 00:23:58,255
And it doesn't have to come and be raised in an environment that's unhealthy to it and to you and all that.

269
00:23:58,415 --> 00:24:18,215
So, yeah, maybe there's some hope that artificial general intelligence will also – will seek to live in harmony with us instead of, you know, laying waste and using us as, you know, whatever, generating calories in order to serve the feeders.

270
00:24:18,215 --> 00:24:24,015
Or feed the servers, yeah. And I just think this is wonderful. So one of the notions of what philosophy

271
00:24:24,015 --> 00:24:29,095
can be is a form of therapy that helps us to reconcile ourselves with, like, our mortality,

272
00:24:29,095 --> 00:24:37,655
and to resolve certain thought loops that we get into. And so what Land does here, doing that with AI

273
00:24:37,655 --> 00:24:42,835
and the doomerism that some people have – I find this incredibly valuable. Um, and so I want to walk

274
00:24:42,835 --> 00:24:49,495
through a counterargument to this view that I think helps elucidate more what Land means when –

275
00:24:49,495 --> 00:24:54,755
like, how I describe it, that intelligence seeks self-optimization, that the goal of

276
00:24:54,755 --> 00:25:00,535
intelligence is more intelligence – so, what that means practically. So there's a counterargument.

277
00:25:00,535 --> 00:25:09,315
Land, so far as I've described it, has argued that a superintelligence is going to slip out of

278
00:25:09,315 --> 00:25:13,535
any sort of motivational structure we might impose on it, that that's a condition of its

279
00:25:13,535 --> 00:25:20,115
reaching superintelligence. A counterargument is by a guy named Eliezer Yudkowsky, who's a

280
00:25:20,115 --> 00:25:25,475
kind of famous AI doomer. Yeah, the most famous one. The most famous. Yeah. Yeah. And so his

281
00:25:25,475 --> 00:25:31,795
counterargument is that AI would not seek to escape its utilitarian straitjacket because

282
00:25:31,795 --> 00:25:37,475
intelligence actually seeks to preserve its motivational structures. The example he gives,

283
00:25:37,475 --> 00:25:43,295
say you offer Gandhi a pill that would make him want to kill a person if he takes this pill.

284
00:25:44,135 --> 00:25:49,355
He would not take it because the current Gandhi does not want to have the goal or motivations or

285
00:25:49,355 --> 00:25:55,775
values of, I will kill this person. And so Yudkowsky thinks this thought experiment shows that

286
00:25:55,775 --> 00:26:01,375
a sufficiently advanced intelligence will tend to preserve its own motivational structures,

287
00:26:01,475 --> 00:26:06,295
that that's what intelligence does. It preserves the thing that it already wants. It keeps on

288
00:26:06,295 --> 00:26:11,555
seeking those things. Does that make sense as a thought experiment? That if you offer Gandhi a

289
00:26:11,555 --> 00:26:15,755
pill that's going to change his values, he will not take that pill because he wants to keep what

290
00:26:15,755 --> 00:26:22,235
his current values are. If I offer, um, Lucas a pill, that if you take that pill you will have different

291
00:26:22,235 --> 00:26:27,495
values, you would resist taking that pill because you like your current values, even if upon taking

292
00:26:27,495 --> 00:26:32,015
that pill you'll feel exactly as you feel now with the new values, that those values are correct.

293
00:26:32,015 --> 00:26:37,735
What if I got better values? Like, what if I recognized inherently that my values were

294
00:26:37,735 --> 00:26:44,395
So this is, this is presuming that we only exist in a value system that we

295
00:26:44,395 --> 00:26:49,395
think is the best. Well, no, that's just the notion. Your values are already what you think is

296
00:26:49,395 --> 00:26:54,935
the best. Yeah, okay, gotcha. Yeah. Um, so Land says this thought experiment is wrong because

297
00:26:54,935 --> 00:27:01,375
we can't anticipate the value changes that will come with intelligence expansion.

298
00:27:01,375 --> 00:27:19,035
So it's like, I'll just read what Land says here: imagine instead that Gandhi is offered a pill that will vastly enhance his cognitive capabilities, with the rider that it might lead him to revise his volitional orientation, his motivations or values.

299
00:27:19,035 --> 00:27:30,055
Even radically, in directions that cannot be anticipated, since the ability to think through the process of revision is accessible only with the pill.

300
00:27:30,755 --> 00:27:33,715
So now the experiment is imagine that you get a pill.

301
00:27:33,815 --> 00:27:35,615
The pill is going to make you super intelligent.

302
00:27:36,115 --> 00:27:42,635
But you don't know upon taking this pill that makes you super intelligent what your values are going to be.

303
00:27:42,635 --> 00:27:45,875
Maybe you're extremely religious right now.

304
00:27:46,295 --> 00:27:49,555
You might take this pill and be like, religion is a joke.

305
00:27:49,815 --> 00:27:54,975
You might be like, I no longer have any feeling for my parents because family bonds are just contingent.

306
00:27:55,075 --> 00:27:57,115
You might totally revise your value structure.

307
00:27:57,115 --> 00:28:03,455
So that's the nature of the pill, is you can't understand what lies on the other side, because

308
00:28:03,455 --> 00:28:10,555
you don't have the intelligence sufficient to, um, see how you would arrive at those new values.

309
00:28:10,555 --> 00:28:17,515
So what Land is saying is that the proper thought experiment, in his view, is

310
00:28:17,515 --> 00:28:26,935
that you are offered: this is what you get, but what you become to get that

311
00:28:27,115 --> 00:28:31,635
may or may not be what you are right now. What it makes you into. What it makes you into, you can't

312
00:28:31,635 --> 00:28:37,375
predict that. Yeah. What, exactly? Yeah, what you become, what it makes you into. Yeah. And so, um,

313
00:28:37,375 --> 00:28:43,495
you know that. Yeah. And that, by the way, I feel like that's, like, a well-used

314
00:28:43,495 --> 00:28:51,515
trope in movies too. Um, which is, you know, like, you know, you're gonna be, I want to be wealthy.

315
00:28:51,515 --> 00:29:04,935
This is what I want to get. And poof, you know, you rub the lamp and the genie grants you your wish. And now you're wealthy, but, I don't know, maybe you're wealthy and imprisoned.

316
00:29:04,935 --> 00:29:09,675
So there's kind of the monkey's paw idea of the wish, where you get some perverse consequences.

317
00:29:09,895 --> 00:29:13,215
This is a bit different, though, because this is intelligence.

318
00:29:13,415 --> 00:29:24,015
And the idea is if you increase your intelligence, how you assess the world around you with that increased intelligence is always superior.

319
00:29:24,835 --> 00:29:30,055
Then a smarter person always has a better view of the world than a dumber person.

320
00:29:30,055 --> 00:29:32,635
It's kind of just like a tautology or something like that.

321
00:29:32,835 --> 00:29:34,655
Which is wrong, right?

322
00:29:34,935 --> 00:29:47,575
On a correct, fully fleshed-out view of intelligence, like, again, not considering egghead intelligence, but intelligence that embodies certain virtues of agenticness or whatever, and all the things that actual intelligence includes.

323
00:29:47,995 --> 00:29:48,575
Okay, fair.

324
00:29:49,335 --> 00:29:54,715
Because I'm just thinking about Shane Gillis' bit about mentally handicapped people.

325
00:29:55,335 --> 00:29:57,775
Like, these are the happiest people in the world.

326
00:29:58,295 --> 00:30:03,035
And they're also, supposedly, the least intelligent.

327
00:30:03,035 --> 00:30:03,255
Okay.

328
00:30:03,255 --> 00:30:07,595
And by the way, those things, he's pointing out, like, those are related.

329
00:30:07,895 --> 00:30:11,015
Like, the smarter you are, the more unhappy you are.

330
00:30:11,115 --> 00:30:11,395
Yeah.

331
00:30:11,555 --> 00:30:11,775
Yeah.

332
00:30:11,915 --> 00:30:12,115
Okay.

333
00:30:12,215 --> 00:30:13,235
But this is a different type.

334
00:30:13,355 --> 00:30:13,515
Yeah.

335
00:30:13,635 --> 00:30:13,795
No.

336
00:30:13,955 --> 00:30:25,915
And maybe happiness, when you're more intelligent, you realize that, I mean, this is like a Socratic point that Socrates dissatisfied is better than a pig satisfied.

337
00:30:26,755 --> 00:30:29,535
You know, maybe you're happy as a pig wallowing in a bunch of feces.

338
00:30:29,535 --> 00:30:37,715
Maybe you're unhappy as Socrates, but what do we think is kind of the correct way to be or something like that?

339
00:30:38,235 --> 00:30:44,275
To quote Land, the predicament is: is there anything we trust above intelligence as a guide to doing the right thing?

340
00:30:45,095 --> 00:30:52,975
To restate his view, again, and to quote him, any problem whatsoever that we might have would be better answered by a superior mind.

341
00:30:53,455 --> 00:30:53,475
Okay.

342
00:30:53,835 --> 00:30:54,855
Yeah, I dig that.

343
00:30:54,855 --> 00:30:57,815
Yeah, that's what intelligence optimization is all about.

344
00:30:57,895 --> 00:31:08,455
Just the idea that if you have a problem and you want to solve that problem, it's always good if you can improve and optimize your intelligence to help you solve that problem.

345
00:31:08,875 --> 00:31:09,555
Yeah.

346
00:31:09,755 --> 00:31:09,915
Yeah.

347
00:31:10,215 --> 00:31:10,495
All right.

348
00:31:10,915 --> 00:31:11,255
Okay.

349
00:31:11,255 --> 00:31:14,235
so

350
00:31:14,235 --> 00:31:16,715
in the last conversation

351
00:31:16,715 --> 00:31:18,795
at the beginning of this one we talked about how

352
00:31:18,795 --> 00:31:20,955
neoreaction, part of it,

353
00:31:20,955 --> 00:31:22,755
is about the spontaneous

354
00:31:22,755 --> 00:31:25,035
generation of order through

355
00:31:25,035 --> 00:31:27,035
some

356
00:31:27,035 --> 00:31:29,115
idea of fate, which can be explained

357
00:31:29,115 --> 00:31:30,775
either as providence,

358
00:31:30,775 --> 00:31:32,715
catallaxy, or evolution.

359
00:31:32,715 --> 00:31:37,275
there's also

360
00:31:37,275 --> 00:31:39,075
a way that this

361
00:31:39,075 --> 00:31:41,195
maybe explains, I think,

362
00:31:41,195 --> 00:31:54,775
why some see Land as a nihilist. Um, that when you really accept that these

363
00:31:54,775 --> 00:32:01,235
principles of the generation of a spontaneous order

364
00:32:01,235 --> 00:32:11,055
are, in a sense, how reality actually operates, and then, um, buy into that, the conclusions can

365
00:32:11,055 --> 00:32:18,115
seem nihilist, because you're not trying to harness those principles of spontaneous order generation

366
00:32:18,115 --> 00:32:27,055
for purposes that are human. I don't know if I'm saying this correctly. See if you can take another

367
00:32:27,055 --> 00:32:32,355
shot. Okay. Neoreaction is perceived as morally nihilist because it refuses to allow conventional

368
00:32:32,355 --> 00:32:39,115
morality to have a veto over thought. To quote Land, we are told to stop thinking for the common good.

369
00:32:39,115 --> 00:32:43,055
But there is no longer any common good if there ever was one.

370
00:32:44,155 --> 00:32:45,015
Yeah, I'm good with that.

371
00:32:45,155 --> 00:32:45,375
OK.

372
00:32:45,375 --> 00:33:06,355
I mean, like, because, again, back in the first discussion, we talked about how morals are this subjective thing that we use to justify progressivism. And, by the way, he qualifies the alt-right as part of that too.

373
00:33:06,355 --> 00:33:21,355
And so basically it's just kind of like all of these things, communism, socialism, like they all have morals in them. And each of them is using this different moral structure in order to trade resources for power, basically.

374
00:33:21,355 --> 00:33:29,255
Um, like we're in charge and, um, we'll give you stimmy checks if you vote for us.

375
00:33:29,675 --> 00:33:37,295
Um, or we'll take more money from the rich people by, from, sorry, we'll take more money

376
00:33:37,295 --> 00:33:42,755
from the productive people, um, and give it to the less productive people, um, because

377
00:33:42,755 --> 00:33:47,915
there's more less productive people than there are productive ones and everybody gets a vote.

378
00:33:47,915 --> 00:34:14,735
And so we want to stay in power. And so, yeah, I have no problem understanding that. You need to remove morals in order to read Land, in order to follow the path that he's going down. Is that kind of it? Because I was trying to understand why people see his views as nihilist, but why they're actually not.

379
00:34:14,735 --> 00:34:36,195
And it is because his views are committed to the truth, and the truth might have counter-human purposes, or might be opposed to the way that – to quote Land again, to strive for honesty without qualification under such historical circumstances is already moral nihilism.

380
00:34:36,195 --> 00:34:43,555
One must either submit to the lie in the name of the good or hazard the good radically in the name of the truth.

381
00:34:43,555 --> 00:34:50,115
This is in relation to people not being allowed to speak certain truths if they contravene

382
00:34:50,115 --> 00:34:56,655
certain moral values. Again, to acknowledge racial differences in any way goes against egalitarianism.

383
00:34:56,655 --> 00:35:02,535
So if you speak against that, you seem like a moral nihilist, because you're, like, throwing aside this

384
00:35:02,535 --> 00:35:06,935
really important moral notion of egalitarianism. But what you're actually doing is hazarding to

385
00:35:06,935 --> 00:35:16,155
talk about a truth that's just very kind of, um, dark seeming or full of negative potential

386
00:35:16,155 --> 00:35:18,175
consequences based on our conventional morality.

387
00:35:18,555 --> 00:35:19,255
Or just hilarious.

388
00:35:19,255 --> 00:35:25,675
Like it's, you know, it's not just race, it's the, um, or, or gender or your white privilege

389
00:35:25,675 --> 00:35:30,515
or your height privilege or your, you know, your fit privilege or your attractiveness

390
00:35:30,515 --> 00:35:31,455
privilege or whatever.

391
00:35:31,575 --> 00:35:36,515
What are all these things that we're trying to handicap each other for, um, in this,

392
00:35:36,935 --> 00:35:47,795
Yeah, in this competition to see who can prostrate themselves the most before the cathedral in order to proclaim their puritan purity.

393
00:35:47,795 --> 00:35:53,795
Yeah. So another strand then of neoreaction is that it's anti-universalist.

394
00:35:54,735 --> 00:35:57,235
Okay. Define universalist for me.

395
00:35:57,235 --> 00:36:11,015
The universalist is somebody that wants to describe all of reality under one principle, I think, or somebody who wants to subsume everything under the same picture.

396
00:36:11,655 --> 00:36:11,675
Okay.

397
00:36:11,995 --> 00:36:12,435
Yeah.

398
00:36:12,655 --> 00:36:12,935
Yeah.

399
00:36:12,935 --> 00:36:24,515
And really, I think, like, the universalist is somebody who has a moral value and wants to carve reality into the picture of that moral value instead of letting reality express itself.

400
00:36:24,515 --> 00:36:43,875
Neo-reaction is about asserting plurality and diversity. The cathedral is really about asserting kind of a globalist, unified picture of reality, whereas Neo-reaction through this notion of exit is about creating all these alternative pictures of the way that things could be.

401
00:36:43,875 --> 00:36:52,835
So, okay. So if universalism is defining things as, like, uh, you're basically, like, you're

402
00:36:52,835 --> 00:36:59,395
seeing the world from one particular lens. Is that kind of right? Or it's explained by one particular

403
00:36:59,395 --> 00:37:04,635
framework, I think. Yeah, you could say it that way. Isn't kind of everybody sort of a universalist

404
00:37:04,635 --> 00:37:10,015
then? Or not everybody, but, like, you've got a Bitcoiner and you've got a Christian and you've

405
00:37:10,015 --> 00:37:11,215
got a Yankees fan.

406
00:37:11,415 --> 00:37:11,495
Yeah.

407
00:37:11,595 --> 00:37:17,215
You've got all these people that they identify with one perspective more than any other.

408
00:37:17,275 --> 00:37:17,515
Yes.

409
00:37:17,715 --> 00:37:20,875
A mother, a father, you know, a victim.

410
00:37:21,115 --> 00:37:21,435
Yes.

411
00:37:21,775 --> 00:37:22,015
Okay.

412
00:37:22,395 --> 00:37:27,015
But as long as all these things are coexisting, as long as you can have a world where you have

413
00:37:27,015 --> 00:37:32,955
a bunch of mini-universalists instead of having one universalist who controls them all. The

414
00:37:32,955 --> 00:37:38,255
cathedral doesn't allow any sort of dissenting universalist perspective within its own vision

415
00:37:38,255 --> 00:37:38,495
of universalism.

416
00:37:38,495 --> 00:37:40,995
That's why it continues to eradicate culture.

417
00:37:41,375 --> 00:37:44,955
Yeah, or steamroll it, de-territorialize it, make it all uniform.

418
00:37:45,335 --> 00:37:45,535
Yeah.

419
00:37:45,675 --> 00:37:48,955
Yeah, and so that's why, like, he talks about spatializing disagreement.

420
00:37:49,835 --> 00:37:58,035
If you have a separate counter-universalist picture of what the universal good is, you don't argue with another universalist.

421
00:37:58,035 --> 00:38:05,135
You just go and exit and create your own society and, like, live out your universal principle that you believe is the best.

422
00:38:05,595 --> 00:38:08,275
And so, like, that's what Land thinks is the important thing here.

423
00:38:08,275 --> 00:38:15,215
There's always that trivial point where if you argue against universalism, people are like, aha, you're just asserting your own form of universalism.

424
00:38:15,355 --> 00:38:16,875
And Land is kind of like, you're right.

425
00:38:16,975 --> 00:38:17,855
That's why we don't argue.

426
00:38:17,955 --> 00:38:18,395
We exit.

427
00:38:18,775 --> 00:38:21,255
We create a plurality, a diversity of different.

428
00:38:21,815 --> 00:38:22,215
The patchwork.

429
00:38:22,355 --> 00:38:23,235
The patchwork, yeah.

430
00:38:23,695 --> 00:38:24,915
And that's like the most.

431
00:38:25,235 --> 00:38:26,735
The nodes, the network states.

432
00:38:26,735 --> 00:38:34,515
Yeah, this is the most visceral or like real repudiation of universalism because it's an embodied pluralism.

433
00:38:34,775 --> 00:38:34,855
Yeah.

434
00:38:34,855 --> 00:38:41,735
You can have a bunch of universalists and that is like the best argument against universalism is like, look at all these universalists.

435
00:38:41,795 --> 00:38:45,835
Look at all these different societies, possibly as a part of all these different visions of the good.

436
00:38:45,975 --> 00:38:46,515
All the different ones.

437
00:38:46,635 --> 00:38:46,715
Yeah.

438
00:38:46,775 --> 00:38:46,995
Yeah.

439
00:38:47,175 --> 00:38:50,915
To quote Land here, then: non-universalism is hygiene.

440
00:38:51,435 --> 00:38:54,315
It is practical avoidance of other people's stupid shit.

441
00:38:54,715 --> 00:38:57,255
There is no higher principle in political philosophy.

442
00:38:57,535 --> 00:39:01,855
Every attempt to install an alternative and impose a universal reverts to dialectics.

443
00:39:02,735 --> 00:39:04,455
That's like arguing directly with the person.

444
00:39:04,455 --> 00:39:05,155
Going back and forth.

445
00:39:06,155 --> 00:39:10,655
Communization, global evangelism, and totalitarian politics.

446
00:39:11,055 --> 00:39:23,235
Again, like the cathedral, in a sense, wants you to accept that you're wrong and accept the church faith or the state church, the state faith.

447
00:39:23,515 --> 00:39:27,855
Whereas the neo-reactionary wants to say, you believe a different thing than me?

448
00:39:28,235 --> 00:39:28,595
Fine.

449
00:39:28,775 --> 00:39:30,135
Go over there and do your thing.

450
00:39:30,355 --> 00:39:32,935
Or I'll go over here and you can stay here and I'll do my thing.

451
00:39:32,935 --> 00:39:33,215
Yes.

452
00:39:33,215 --> 00:39:47,755
And which is why, so this is why, with the cathedral, this is why he brings up that the eventuality, the long term, the limit case of democracy is democratic world government.

453
00:39:47,955 --> 00:39:59,135
That's because we're looking at it in terms of it being like a religion, and you cannot do things that are outside of that, because that would be sacrilegious.

454
00:39:59,135 --> 00:40:12,035
And so if you allow for sacrilegious things to occur when you have the power to wipe them away, then that would be sacrilege.

455
00:40:12,175 --> 00:40:20,035
So you have to, as the United States has done, parade around the world spreading freedom at the barrel of a gun.

456
00:40:21,095 --> 00:40:23,995
We're here to kill everybody and give you freedom.

457
00:40:24,215 --> 00:40:24,555
For sure.

458
00:40:24,675 --> 00:40:24,795
Yeah.

459
00:40:25,355 --> 00:40:26,835
We're here to make it fair.

460
00:40:26,835 --> 00:40:37,055
We saw that as, like, an expression of Western universalism, definitely, that sort of militarized globalism, that neoconservatism that wants to impose democracy at the tip of a sword.

461
00:40:37,275 --> 00:40:38,995
But same way with communism too, right?

462
00:40:38,995 --> 00:40:45,175
Same way with socialism, same way with everything, with libertarianism, you know, like all of these things, right?

463
00:40:45,435 --> 00:40:45,795
Yes.

464
00:40:46,715 --> 00:40:54,675
They're all just the same type of – they're all just different expressions of a cathedral.

465
00:40:55,035 --> 00:40:55,375
Yeah.

466
00:40:55,375 --> 00:40:56,295
Different religions.

467
00:40:56,295 --> 00:41:10,335
To the extent that they're universalizing, totalizing, like that's why some people have argued that China is actually not too bad in its current form of communism where it's like whatever modified form of state capitalism.

468
00:41:10,655 --> 00:41:12,595
It's because it's not universalizing.

469
00:41:12,715 --> 00:41:14,955
It doesn't want to impose that view on others.

470
00:41:14,955 --> 00:41:18,615
It wants to, you know, do whatever it's doing.

471
00:41:18,695 --> 00:41:20,335
But it has like its own internal teleology.

472
00:41:20,495 --> 00:41:26,095
It's not trying to, like, convert the world at the tip of the sword to its ideology, and in fact doesn't want to let people in.

473
00:41:26,295 --> 00:41:26,615
Really?

474
00:41:26,975 --> 00:41:28,215
Land lives in China, right?

475
00:41:28,215 --> 00:41:28,935
He does, in Shanghai.

476
00:41:28,935 --> 00:41:29,255
Yeah.

477
00:41:30,175 --> 00:41:30,555
Yeah.

478
00:41:30,995 --> 00:41:48,155
And Land notes, like, the communist who is anti-universal is, like, actually more the ally of the neo-reactionary capitalist than is the capitalist who is universalist and globalizing.

479
00:41:48,255 --> 00:41:48,835
Does that make sense?

480
00:41:49,635 --> 00:41:50,975
Yeah, it does, actually.

481
00:41:51,195 --> 00:41:51,475
Yeah.

482
00:41:51,695 --> 00:41:53,555
As long as you live and let, yeah, go ahead.

483
00:41:53,655 --> 00:41:53,935
That's Singapore.

484
00:41:54,655 --> 00:41:54,815
Yeah.

485
00:41:54,815 --> 00:41:54,975
Right?

486
00:41:54,975 --> 00:42:22,355
I mean, he points to Singapore. It's not necessarily communist. I would say it's more, well, it's a technocracy. But yeah, you just have more in common with that person because you are primarily the person who says, oh, I don't need to party with you if I don't agree with you. I can say this party sucks and leave as opposed to saying, oh, everyone must be at my party.

487
00:42:22,355 --> 00:42:41,855
Yeah. Yeah. Again, the neoreactionaries like it when order generates spontaneously, and, considered as a form of evolution, as the ethno-nationalist strand believes, we need to have more experiments to see what's going to survive.

488
00:42:41,855 --> 00:42:52,915
So you're completely happy to say, like, if you believe that communist society is going to work, go try it. I'm happy to watch it burn down from afar. And if it works, then I'm happy to learn something from it.

489
00:42:52,915 --> 00:43:06,255
Yeah. And by the way, with the Chinese example, I think we've long been inculcated with this, you know, idea that China bad, America good.

490
00:43:07,735 --> 00:43:19,755
But there's a lot of smart people, Balaji included, who think that China is doing things the right way, or at least the way that optimizes whatever result a government should be aiming for.

491
00:43:19,755 --> 00:43:27,715
And they're doing it primarily by copying what Singapore did, or at least taking a lot of the tenets of Singaporean type of rule.

492
00:43:29,015 --> 00:43:30,135
Lee Kuan Yew.

493
00:43:31,115 --> 00:43:31,295
Yeah.

494
00:43:31,695 --> 00:43:31,855
Yeah.

495
00:43:32,055 --> 00:43:40,015
So this kind of plays into what I had pulled out as a strand, that neoreaction is anti-political.

496
00:43:40,015 --> 00:44:05,715
It's anti-political first in that it believes in spontaneously generated inevitabilities along lines of escape dictated by fate, by, like, some things are destined to happen based on certain principles. Evolution generates a reality based on, like, fitness. Providence generates reality based on God's will. The market generates reality based on catallaxy. All of these are outside of human control.

497
00:44:05,715 --> 00:44:11,835
It's inevitable. They're also outside of politics. You can't politically dictate evolution. You can't

498
00:44:11,835 --> 00:44:15,975
politically dictate the result of the free market. You can't politically dictate the result of God's

499
00:44:15,975 --> 00:44:22,375
will. So all of these neoreactionary strands are anti-political, in that they believe in a sort of

500
00:44:22,375 --> 00:44:28,735
inevitability based on fate that is never controllable by the cathedral. Is it anti-political

501
00:44:28,735 --> 00:44:31,715
in the same way that being atheist is anti-religious?

502
00:44:32,055 --> 00:44:33,215
I don't think so.

503
00:44:34,035 --> 00:44:37,495
Because I think that atheism,

504
00:44:37,894 --> 00:44:41,235
one of the tenets of atheism is that you have to believe

505
00:44:41,235 --> 00:44:43,675
that people who believe in religion are stupid.

506
00:44:44,495 --> 00:44:54,849
And I think what he's saying with neoreactionism is that all of these things that you want to believe, believe them.

507
00:44:54,950 --> 00:44:56,490
You may actually be right.

508
00:44:56,629 --> 00:44:59,209
I'm not saying that you're wrong.

509
00:44:59,329 --> 00:45:01,970
I'm just saying that I'm not going to try that.

510
00:45:02,189 --> 00:45:02,490
Yeah.

511
00:45:02,589 --> 00:45:03,450
Try something different.

512
00:45:03,450 --> 00:45:06,510
And the term Land actually uses is metapolitical.

513
00:45:06,729 --> 00:45:08,569
So it's a form of metapolitics.

514
00:45:08,789 --> 00:45:10,789
It's a system for evaluating politics.

515
00:45:11,029 --> 00:45:11,189
Yes.

516
00:45:11,289 --> 00:45:11,749
Right.

517
00:45:11,749 --> 00:45:16,429
And so it's not a condemnation, necessarily, of any particular political structure.

518
00:45:16,869 --> 00:45:18,549
Yeah, it's about spatializing disagreement.

519
00:45:18,789 --> 00:45:22,569
So if you want to have a different political structure and do your own thing, that's fine.

520
00:45:23,909 --> 00:45:26,729
It's saying we're not going to say which one's right.

521
00:45:26,809 --> 00:45:28,689
We're going to step back and let them fight it out.

522
00:45:28,789 --> 00:45:31,769
And then at the end, whoever's still standing, that's the winner.

523
00:45:31,950 --> 00:45:32,209
Yeah.

524
00:45:32,369 --> 00:45:32,589
Yeah.

525
00:45:32,609 --> 00:45:33,510
We're not trying to argue.

526
00:45:33,689 --> 00:45:37,089
That's the winner either by God's will, by the rules of evolution or by the free market.

527
00:45:37,529 --> 00:45:38,189
There we go.

528
00:45:38,249 --> 00:45:38,789
Okay, nice.

529
00:45:38,929 --> 00:45:39,450
Nice tie.

530
00:45:39,450 --> 00:45:39,749
Yeah.

531
00:45:40,450 --> 00:45:41,389
Metapolitical framework.

532
00:45:41,389 --> 00:45:44,189
He gives three principles for assessing political experiments.

533
00:45:44,569 --> 00:45:51,309
So again, it gives us a way to assess which of the politics is better or worse based on three principles.

534
00:45:51,709 --> 00:45:53,490
First is like a principle of realism.

535
00:45:53,669 --> 00:45:57,429
The state is about power and power is determined by who and what actually survives.

536
00:45:57,689 --> 00:46:03,929
So the state that survives is the state that has demonstrated an ability to survive.

537
00:46:04,609 --> 00:46:05,929
So there's something to that.

538
00:46:06,010 --> 00:46:09,409
It's like a principle of realism. To quote Land:

539
00:46:09,409 --> 00:46:35,209
Entropy will be dissipated. Idiocy will be punished. The weak will die. If the regime refuses to bow to this law, the wolves will enforce it. So reality always steps in to weed out any stupid experiment in politics. That's a principle of realism that is one of the meta-political ways that neo-reaction assesses good political regimes. Have the wolves eaten it, essentially? Has the outside come in to destroy it?

540
00:46:35,609 --> 00:46:38,129
Darwinism continues. Only the strong survive.

541
00:46:38,129 --> 00:46:52,169
Yep. Also, second, and this builds off of what we were talking about, how intelligence seeks its own optimization, is the principle of intelligence seeking: civilizations will seek better feedback than collapsing under the principle of realism.

542
00:46:52,169 --> 00:47:04,470
So the first principle, the principle of realism, is that bad civilizations will collapse. So civilizations will try to preempt that feedback cycle. You want to learn if there's something wrong with your civilization prior to the collapse.

543
00:47:04,470 --> 00:47:05,369
One would think.

544
00:47:05,649 --> 00:47:28,289
One would think. And so the way you do that is you develop intelligence. You want to know things about the way reality actually is, about how the principle of realism works, so you can fix and patch your civilization. So metapolitically, neoreaction assesses a regime based on its ability to optimize intelligence in order to preempt collapse under the first principle of realism.

545
00:47:28,289 --> 00:47:36,829
Which is like defining the cathedral as an intelligent system that has no intelligence. It kind of assures its downfall.

546
00:47:36,829 --> 00:47:42,569
Yeah, it begins to employ a series of degenerative ratchets that are going to lead to its downfall for sure.

547
00:47:42,729 --> 00:47:50,069
Yeah, and it's going to describe the results of those degenerative ratchets as success instead of assessing them under some sort of principle of realism.

548
00:47:50,069 --> 00:48:09,369
So the third principle is essentially exit. So the first principle, of realism, is, like, against other states: what state survives and which ones are eaten by other states, like the wolves.

549
00:48:09,369 --> 00:48:15,470
Right, these are external predators. The other threat to the state is the threat within, which

550
00:48:15,470 --> 00:48:22,189
is the flight of citizens to other jurisdictions, essentially exit. So if you have a principle of

551
00:48:22,189 --> 00:48:27,669
realism, that some states will collapse and then be devoured by other states, you have a principle

552
00:48:27,669 --> 00:48:33,229
of intelligence seeking, that good states will seek to preempt that by learning about the reasons for

553
00:48:33,229 --> 00:48:42,429
collapse, and the third one is, like, a principle of, um, seeking to prevent disintegration through things

554
00:48:42,429 --> 00:48:46,269
like exit and flight. I guess that's how I interpret it. We're watching this in real time with

555
00:48:46,269 --> 00:48:52,970
all the new Floridians that are old New Yorkers and all the Texans that are old Californians.

556
00:48:52,970 --> 00:49:00,169
Yeah. Yeah. So that's, um, neoreaction as a metapolitics. And I like that, because I think of

557
00:49:00,169 --> 00:49:05,790
Bitcoin as a metapolitics too. You talk to Bitcoiners, or you see that Bitcoiners are of

558
00:49:05,790 --> 00:49:11,209
different, like, first-order political views. Some of them are Republican or Democrat. Some of them

559
00:49:11,209 --> 00:49:17,569
are this or that. But all of them also take a step back and begin to evaluate their political views

560
00:49:17,569 --> 00:49:23,089
in light of like meta-political assumptions about the way that reality works, that like

561
00:49:23,089 --> 00:49:28,709
scarcity is necessary for value and about how you have to have systems that operate without trust,

562
00:49:28,709 --> 00:49:35,109
things like that. And so those are basically just, like, principles of realism as well. I'm

563
00:49:35,109 --> 00:49:40,229
not going to say that Bitcoin as a metapolitical view is neoreactionary, but they're very similar

564
00:49:40,229 --> 00:49:46,329
in that they both look at larger dynamics as generating forces of inevitability that you have

565
00:49:46,329 --> 00:49:53,909
to try to preempt through intelligence. Yeah. The further you get into Bitcoin, the

566
00:49:53,909 --> 00:50:00,849
more you view the world as not moral or immoral, but amoral, because you're looking at it from

567
00:50:00,849 --> 00:50:08,950
the view of physics, and physics is amoral. Physics does not care. You know, you could be the

568
00:50:08,950 --> 00:50:14,970
kindest person in the world, but the bullet still kills you. Right. Or the fall or whatever,

569
00:50:14,970 --> 00:50:23,529
you know. Physics acts upon each individual, and each thing and each particle and each

570
00:50:23,529 --> 00:50:29,990
molecule the same. The physics doesn't change. The physics doesn't care. It's amoral.

571
00:50:29,990 --> 00:50:38,569
Technology does not negotiate. Technology is also amoral. And so, as Bitcoiners, or just

572
00:50:38,569 --> 00:50:45,429
Bitcoin: Bitcoin is a technology grounded in physics, two things that do not care. They are

573
00:50:45,429 --> 00:50:55,629
completely outside of having the capability to have morals. Yeah. But

574
00:50:55,629 --> 00:51:00,749
I like the way he put that. It's kind of an amoral assessment of reality, because

575
00:51:00,749 --> 00:51:07,769
in a sense, our morals have to be downstream of reality. If the rule is reversed, if you have your

576
00:51:07,769 --> 00:51:13,629
morals and values and then you try to make reality fit into those boxes, you're going to be

577
00:51:13,629 --> 00:51:19,169
screaming at the clouds. You're going to be somebody who is not living in the real

578
00:51:19,169 --> 00:51:23,269
world, but living in a bunch of fantasies that you're trying to impose upon the world, or just

579
00:51:23,269 --> 00:51:28,409
living in a tragic series of consequences that you don't understand. That's right. Like, why is the

580
00:51:28,409 --> 00:51:32,649
world so unjust? It's like, no, the world is the way it is, and your sense of justice is what's messed

581
00:51:32,649 --> 00:51:38,929
up. There you go. Yeah. Oh, wow. Sorry, that was an exceptionally deep thought. Probably

582
00:51:38,929 --> 00:51:43,450
didn't plan it; it just hit me as one. Yeah, things like that flash the same way for me.

583
00:51:43,450 --> 00:51:47,929
So it just hits you sometimes, exceptionally deeply. Yeah. Politics is a game of chicken.

584
00:51:47,929 --> 00:51:55,689
This is another good Landian point. It's about the principle of realism. So it's also about

585
00:51:55,689 --> 00:52:03,869
a rethinking of game theory, the game theory on which we used to generate our

586
00:52:03,869 --> 00:52:12,069
evaluation of politics. We tend to think of the game theory prisoner's dilemma, where there is

587
00:52:12,069 --> 00:52:17,149
the option where we both win a little, and that's where we want to be. We can each

588
00:52:17,149 --> 00:52:23,709
both lose drastically, or we can gravitate to that position where we both win a

589
00:52:23,709 --> 00:52:28,329
little, and then iterated versions of that lead to the best result, because we can

590
00:52:28,329 --> 00:52:38,049
avoid the catastrophe. Land's rethinking of this is that politics is more like

591
00:52:38,049 --> 00:52:44,790
a game of chicken, as far as a game theory simulation. Chicken lacks the both-win-a-little

592
00:52:44,790 --> 00:52:53,649
option. It's either A wins and B loses, B wins and A loses, or both lose catastrophically. And so what's

593
00:52:53,649 --> 00:52:59,490
the best strategy in the game of chicken? It's credible commitment. And what credible commitment

594
00:52:59,490 --> 00:53:06,929
means in a game of chicken is: first, you down a bottle of vodka, right in view of

595
00:53:06,929 --> 00:53:10,809
the person who you're going to be playing the game of chicken with. It's like, that person just drank a

596
00:53:10,809 --> 00:53:17,229
whole bottle of vodka. And then, when you get in and you get going, you tear off the steering wheel,

597
00:53:17,229 --> 00:53:23,970
throw it out the window, because at that point the opponent is going to know that that person can't

598
00:53:23,970 --> 00:53:29,689
back down. You're stuck: the only way is to lose or to lose catastrophically. I'm going to

599
00:53:29,689 --> 00:53:36,429
choose to lose. So the way to win politics as a game of chicken is essentially to credibly commit

600
00:53:36,429 --> 00:53:38,409
to a course of action in which you win

601
00:53:38,409 --> 00:53:41,109
and let the opposing party avoid catastrophe.

602
00:53:41,950 --> 00:53:43,809
And so interestingly enough,

603
00:53:43,970 --> 00:53:46,729
the natural conclusion to that is

604
00:53:46,729 --> 00:53:51,490
when both sides play only to win

605
00:53:51,490 --> 00:53:56,369
and the results are win, lose, or both lose,

606
00:53:56,629 --> 00:53:58,109
they're both going to both lose.

607
00:53:58,209 --> 00:53:59,049
Catastrophically, yeah.
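The payoff structure being described can be sketched in a few lines of Python (a toy illustration, not anything from the episode; the payoff numbers are arbitrary placeholders):

```python
# Toy payoff table for chicken: (A's payoff, B's payoff) per move pair.
# Unlike the prisoner's dilemma, there is no "both win a little" cell:
# the outcomes are win/lose, lose/win, or mutual catastrophe.
CHICKEN = {
    ("swerve", "swerve"): (0, 0),           # mutual backdown
    ("swerve", "straight"): (-1, 10),       # A loses, B wins
    ("straight", "swerve"): (10, -1),       # A wins, B loses
    ("straight", "straight"): (-100, -100)  # both lose catastrophically
}

def best_reply_for_b(a_move):
    """Given A's visible, fixed move, B picks whatever maximizes B's payoff."""
    return max(("swerve", "straight"), key=lambda b: CHICKEN[(a_move, b)][1])

# Credible commitment: A visibly removes the ability to swerve (the torn-off
# steering wheel), so B's rational reply is to back down and eat the small loss.
assert best_reply_for_b("straight") == "swerve"
# Without commitment, if A is expected to swerve, B goes straight and wins.
assert best_reply_for_b("swerve") == "straight"
```

The point of the commitment move is exactly this: once A's "straight" is fixed and known, B's best reply flips to swerving.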

608
00:53:59,049 --> 00:54:01,849
Yeah, it's just over and over and over again.

609
00:54:01,849 --> 00:54:03,769
Just constantly cars colliding into cars.

610
00:54:03,769 --> 00:54:10,649
We see this literally every day. Actually, it's more like every four years it's really big.

611
00:54:10,849 --> 00:54:19,450
And then we just have like a series of smaller crashes just every single day until the next presidential election.

612
00:54:19,569 --> 00:54:19,869
Yeah.

613
00:54:20,049 --> 00:54:20,269
Yeah.

614
00:54:20,409 --> 00:54:21,569
Where it's just like we are.

615
00:54:21,669 --> 00:54:22,249
Yeah, that's a good point.

616
00:54:22,470 --> 00:54:24,089
That's a really good way to describe it.

617
00:54:24,490 --> 00:54:24,929
Yeah.

618
00:54:25,129 --> 00:54:26,429
Again, I love Land.

619
00:54:26,589 --> 00:54:28,729
This is – he's funny too.

620
00:54:29,249 --> 00:54:29,689
Oh, yeah.

621
00:54:29,709 --> 00:54:30,689
He's really funny.

622
00:54:31,109 --> 00:54:32,669
It's hard to be smart and not funny.

623
00:54:32,669 --> 00:54:39,909
There are people that can do it. I mean, there are definitely people that can pull it off. Yeah, yeah.

624
00:54:39,909 --> 00:54:45,950
There's definitely people that can pull it off. Yeah. But yeah, a game of chicken. So yeah,

625
00:54:45,950 --> 00:54:52,809
it's either win, lose, or we all go down. And the way to win is to dramatically

626
00:54:52,809 --> 00:54:57,769
signal that you're not going to flinch, to be a psychopath, to remove your wheel. Yeah, to signal

627
00:54:57,769 --> 00:55:05,929
you're a psychopath. And not only that, but to prevent yourself from being able to back down

628
00:55:05,929 --> 00:55:10,729
in some way. You've got to credibly commit, in a way that's like tying yourself to the mast or

629
00:55:10,729 --> 00:55:16,609
something like that. So I guess, try to find a way to do that: credibly commit.

630
00:55:16,609 --> 00:55:21,829
And if you want to begin something new, people sometimes do that, right? You have a

631
00:55:21,829 --> 00:55:28,749
new project you want to do, and so they engage in systems that make it so if they

632
00:55:28,749 --> 00:55:36,209
lose, they lose catastrophically. One you can do is to arrange it so that if you,

633
00:55:36,209 --> 00:55:41,869
for example, smoke a cigarette, then you donate to a charity that you oppose. This is Richard, this is

634
00:55:41,869 --> 00:55:47,829
Richard Thaler's work. Yeah, this is Richard Thaler's work. It came out of, so

635
00:55:47,829 --> 00:55:55,189
he was a professor and he had these PhD students that weren't getting their thesis written or their

636
00:55:55,189 --> 00:55:59,490
dissertation, thesis, dissertation, whatever. The big thing you write when you get your PhD.

637
00:55:59,490 --> 00:56:00,129
Dissertation, yeah.

638
00:56:00,329 --> 00:56:04,709
And they weren't doing it. And so he, you know, proposed this deal and had

639
00:56:04,709 --> 00:56:10,129
them sign a deal that basically said something like: everybody puts up X amount

640
00:56:10,129 --> 00:56:15,709
of money. Put a hundred bucks on the line and you turn in a chapter every Friday. And if you

641
00:56:15,709 --> 00:56:17,069
Don't turn in a chapter that week.

642
00:56:17,129 --> 00:56:20,509
I take that hundred bucks and we all go out for drinks without you.

643
00:56:20,950 --> 00:56:25,109
And so that, which became, you know, that's in that.

644
00:56:25,509 --> 00:56:28,669
So all of these things, by the way, they all tie together, it feels like.

645
00:56:28,809 --> 00:56:30,089
So that's Richard Thaler.

646
00:56:30,609 --> 00:56:33,169
And then that ties into Nassim Taleb.

647
00:56:33,309 --> 00:56:34,189
That's skin in the game.

648
00:56:34,389 --> 00:56:34,669
Oh, yeah.

649
00:56:34,689 --> 00:56:34,909
Right.

650
00:56:34,990 --> 00:56:35,529
You know, so.

651
00:56:35,970 --> 00:56:36,129
Yeah.

652
00:56:36,129 --> 00:56:36,529
Yeah.

653
00:56:36,529 --> 00:56:36,609
Yeah.

654
00:56:36,689 --> 00:56:37,609
Putting skin in the game.

655
00:56:37,889 --> 00:56:38,549
Being a psychopath.

656
00:56:39,429 --> 00:56:40,069
Drunk on vodka.

657
00:56:40,569 --> 00:56:40,709
Yeah.

658
00:56:40,749 --> 00:56:40,990
All right.

659
00:56:41,409 --> 00:56:41,809
Very good.

660
00:56:41,809 --> 00:56:48,109
There's nothing like downing a bottle of vodka before you engage in it to make you committed to the cause, for sure.

661
00:56:49,369 --> 00:56:50,129
Very true.

662
00:56:50,129 --> 00:56:50,369
In a certain sense.

663
00:56:50,649 --> 00:56:51,569
Sovereign property.

664
00:56:51,929 --> 00:56:56,129
So this is the next strand of this that really dials in on what Bitcoin is all about.

665
00:56:56,829 --> 00:57:03,809
So sovereign property is core to neoreaction, but not only core to neoreaction.

666
00:57:04,429 --> 00:57:06,069
It's just a thing that is.

667
00:57:06,290 --> 00:57:10,009
So to say property is sovereign inverts the framework.

668
00:57:10,689 --> 00:57:15,329
To say property itself is sovereign is to invert a framework.

669
00:57:15,749 --> 00:57:20,849
Because the Enlightenment begins with an idea that we need to justify ownership of property.

670
00:57:20,849 --> 00:57:22,509
It becomes a social question.

671
00:57:23,149 --> 00:57:25,970
We enter into a social contract to determine a set of rules.

672
00:57:26,149 --> 00:57:28,389
They're going to say, you own this and I own that.

673
00:57:28,729 --> 00:57:29,929
We then have a court system.

674
00:57:30,029 --> 00:57:31,509
We can resolve those disputes.

675
00:57:32,229 --> 00:57:33,990
It requires a higher power.

676
00:57:34,109 --> 00:57:35,209
It requires an adjudicator.

677
00:57:35,389 --> 00:57:37,109
It requires sovereignty to do that.

678
00:57:37,249 --> 00:57:38,729
Property is subject to sovereignty.

679
00:57:38,990 --> 00:57:39,889
It is not sovereign.

680
00:57:40,009 --> 00:57:47,069
So what does it mean when property is sovereign, when it doesn't depend on a social consensus to be property?

681
00:57:47,970 --> 00:57:49,429
That's one of them.

682
00:57:49,569 --> 00:57:49,689
Yeah.

683
00:57:49,909 --> 00:57:55,089
So Land discusses property that is sovereign, and the first category is apolitical property.

684
00:57:55,649 --> 00:57:57,429
This is where Bitcoin fits in.

685
00:57:58,429 --> 00:58:01,069
Again, property is currently a social relationship.

686
00:58:01,529 --> 00:58:07,529
We agree on rules for who owns what and have processes to resolve disputes of those rules.

687
00:58:07,529 --> 00:58:12,249
Cryptography allows a way to hold property that does not rely on social consent,

688
00:58:12,889 --> 00:58:18,549
not your keys, not your coins, right? Like private key cryptography defines, to quote,

689
00:58:18,549 --> 00:58:24,790
Land, the property relation with a rigor the entire preceding history of philosophy and

690
00:58:24,790 --> 00:58:31,970
political economy has been unable to attain. So all prior theorizing about what property is,

691
00:58:32,009 --> 00:58:36,869
is kind of obsoleted by sovereign digital property in the form of Bitcoin, because

692
00:58:36,869 --> 00:58:38,229
it's not a social relation anymore.

693
00:58:38,709 --> 00:58:48,649
It is the actual thing: to have the keys is to have the coins, without any law saying Joe owns the coins.
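That "keys are the coins" relation can be sketched with a toy signing scheme. This is a hypothetical stand-in: real Bitcoin uses asymmetric ECDSA/Schnorr signatures over secp256k1, not HMAC, but a standard-library HMAC shows the same point, that only the key holder can authorize a spend:

```python
import hashlib
import hmac
import secrets

# Toy stand-in for key-based ownership. NOT Bitcoin's actual scheme
# (which uses asymmetric signatures); HMAC keeps the sketch runnable
# on the Python standard library alone.

private_key = secrets.token_bytes(32)  # known only to the owner

def authorize(key: bytes, spend: bytes) -> bytes:
    """Produce the authorization tag only the key holder can compute."""
    return hmac.new(key, spend, hashlib.sha256).digest()

def valid(key: bytes, spend: bytes, tag: bytes) -> bool:
    """Check a tag against a key, in constant time."""
    return hmac.compare_digest(authorize(key, spend), tag)

spend = b"move 1 BTC to address X"
tag = authorize(private_key, spend)

assert valid(private_key, spend, tag)                  # holder of the keys holds the coins
assert not valid(secrets.token_bytes(32), spend, tag)  # no keys, no coins: forgery fails
```

No registry, court, or deed appears anywhere in the check: validity here is a property of the key and the math alone.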

694
00:58:49,109 --> 00:58:52,909
Everything that we've known about property laws until this point has been conditional.

695
00:58:53,290 --> 00:58:53,509
Yeah.

696
00:58:53,549 --> 00:58:55,049
I own this piece of land.

697
00:58:55,149 --> 00:58:56,589
Why do you own that piece of land?

698
00:58:56,589 --> 00:58:59,209
Or what gives you the right to that land?

699
00:58:59,329 --> 00:58:59,490
Yeah.

700
00:58:59,490 --> 00:59:00,809
Oh, the United States does.

701
00:59:00,869 --> 00:59:02,990
I filed a piece of paper with the recorder of deeds.

702
00:59:02,990 --> 00:59:08,729
And then the laws say that if you have the piece of paper with the recorder of deeds, and you're the last one at the end of a chain of title, then you have this.

703
00:59:08,829 --> 00:59:11,249
And if someone disputes that, then you can take it to court.

704
00:59:11,409 --> 00:59:13,029
It's all an elaborate social system.

705
00:59:13,669 --> 00:59:20,429
And Bitcoin is the primary example of this: you need no higher authority.

706
00:59:20,649 --> 00:59:22,229
You need no greater body.

707
00:59:22,389 --> 00:59:22,649
Yeah.

708
00:59:22,849 --> 00:59:23,629
No system.

709
00:59:24,269 --> 00:59:25,389
It just is.

710
00:59:25,389 --> 00:59:46,689
But you do have a higher authority, which is cryptography, which is impenetrable math. I mean, a level of probabilistic assurance that is impenetrable to violence. Because that's what social consensus ultimately depends on: the ability to wield violence to enforce it.

711
00:59:46,689 --> 00:59:53,209
Cryptography is able to create something that's impenetrable to violence, or whatever,

712
00:59:53,329 --> 00:59:55,069
resistant to violence. So, yeah.

713
00:59:55,470 --> 01:00:00,470
That's Jason Lowry. Like, yeah, we've lowered the cost of violence, or sorry, we've lowered the

714
01:00:00,470 --> 01:00:04,549
rewards of violence to the point that it no longer makes sense to participate in it.

715
01:00:04,549 --> 01:00:10,990
Yeah. So apolitical property, that's the category. That's an interesting way to put it too.

716
01:00:10,990 --> 01:00:16,749
If property is sovereign, then we just do away with all of this contentious

717
01:00:16,749 --> 01:00:18,809
politics of defining who owns what.

718
01:00:18,909 --> 01:00:24,470
Like, that's the center of so much angst and dissent

719
01:00:24,470 --> 01:00:30,109
and like bad feeling in society is wrangling over the laws to decide who owns what.

720
01:00:30,529 --> 01:00:34,389
Because politics is above all that, which is why we're experiencing the de-dollarization

721
01:00:34,389 --> 01:00:40,950
of the world. Because, hey, you know, we think about, oh, I own this,

722
01:00:40,990 --> 01:00:47,189
because that's my bank account. And then it turns out, oh, that is your bank account,

723
01:00:47,290 --> 01:00:53,470
but you're a Russian oligarch and therefore you don't own that anymore. It's ours. Because of who

724
01:00:53,470 --> 01:01:01,249
you are politically does not align with what we want. That supersedes your claim. And we've talked

725
01:01:01,249 --> 01:01:06,049
about this previously. I think we've talked about this with Softwar, which is that I was

726
01:01:06,049 --> 01:01:14,069
trying to make the distinction between ownership and claim, right? Where even this book, like you

727
01:01:14,069 --> 01:01:19,689
don't own this book, even though it's literally in your hands, it's in your house, you paid for it,

728
01:01:20,029 --> 01:01:26,689
maybe you wrote your name in it, but you merely have access to this book and a claim to ownership

729
01:01:26,689 --> 01:01:31,909
until somebody who wants it more and is bigger than you takes it away from you. It's temporary,

730
01:01:31,909 --> 01:01:34,769
but ownership is permanent.

731
01:01:35,509 --> 01:01:38,970
Until you give it away, it is yours.

732
01:01:39,829 --> 01:01:46,329
I mean, in Lowry's vocabulary, power projection, your ability to project power over an object determines your ownership of the object.

733
01:01:46,329 --> 01:01:56,249
No abstract power-based hierarchy of property rights embedded in a piece of paper we call law ever delineates true ownership.

734
01:01:56,429 --> 01:02:02,029
It's always, at the end of the day, delineated by who's willing to exert the most violence to keep the thing.

735
01:02:02,269 --> 01:02:02,290
Yeah.

736
01:02:02,469 --> 01:02:02,749
Yeah.

737
01:02:03,269 --> 01:02:08,529
And that's why Bitcoin is such an interesting inflection point because it lowers the returns to violence.

738
01:02:08,769 --> 01:02:09,129
Yes.

739
01:02:09,509 --> 01:02:09,989
Yeah.

740
01:02:10,169 --> 01:02:13,389
So apolitical property is the first category of sovereign property.

741
01:02:13,389 --> 01:02:29,689
The next one, and you mentioned this kind of in our first part of this conversation on Land's book Xenosystems, is autonomous capital. The corporation: we've already kind of innovated towards this. The corporation is already an abstract legal person.

742
01:02:29,689 --> 01:02:43,689
So kind of the framework is already set on paper, and within common law and within statutory law, for the recognition of a thing that is autonomous and a separate legal entity from humans.

743
01:02:44,049 --> 01:02:57,830
This becomes digital and capable of autonomous operation through the blockchain, through the decentralized operation of a method of accounting, essentially.

744
01:02:57,830 --> 01:03:05,189
Now to quote Land, self-propelling industrial development becomes... oh, I'm missing one

745
01:03:05,189 --> 01:03:10,950
before we get there: capitalism and flows of capital. What we invest in used to be driven by

746
01:03:10,950 --> 01:03:16,669
mass consumption. This is the monkey-business idea, that capitalism used to, and kind of currently does,

747
01:03:16,669 --> 01:03:22,929
move forward to satisfy human wants. But when capital becomes autonomous digital capital,

748
01:03:22,929 --> 01:03:30,869
it can do whatever it wants. Yeah, yeah. And so, to quote Land, self-propelling industrial development

749
01:03:30,869 --> 01:03:38,869
becomes its own market freed from dependency upon arbitrary popular or popularizable consumption

750
01:03:38,869 --> 01:03:47,269
desires. So again, the market is no longer at the whim of our monkey desires about what we want from

751
01:03:47,269 --> 01:03:54,669
the market, like movies and entertainment and tasty food. Now, with autonomous capital, the

752
01:03:54,669 --> 01:04:00,749
generation of more capital can become the goal of capital. Or, in a kind of true

753
01:04:00,749 --> 01:04:07,330
Landian sense, intelligence optimization can become the goal, freed from the imposition of any human

754
01:04:07,330 --> 01:04:13,929
utility straitjacket or motivational structure on that intelligence. This is where I just step back

755
01:04:13,929 --> 01:04:20,209
and take a look at capitalism itself, because he's really opening my eyes to what

756
01:04:20,209 --> 01:04:28,369
capitalism is. I just thought of capitalism as free markets. But he's really trying to encourage

757
01:04:28,369 --> 01:04:38,149
us to think of capitalism as almost like an alien intelligence. And I know that

758
01:04:38,149 --> 01:04:53,869
immediately, when you say capitalism is an alien intelligence seeking to, you know, serve its own ends, whatever those may be, that's a really weird thing to say. That's a really weird thing to think about.

759
01:04:53,869 --> 01:05:12,929
But when you look at the history of the world and you look at what capitalism has done, then you don't have to squint too hard in order to see that.

760
01:05:12,929 --> 01:05:32,249
And then you also don't have to squint too hard into the future to see capitalism, capital, Bitcoin, digital capital, pairing with digital intelligence, pairing with real world agency, robotics.

761
01:05:32,249 --> 01:05:44,269
So AI, robotics, and Bitcoin coming together and not needing us anymore and therefore pursuing whatever it is that it wants.

762
01:05:44,790 --> 01:05:49,029
Which in Land's view will be intelligence optimization.

763
01:05:49,469 --> 01:05:54,129
That's what intelligence always wants: to optimize its ability to do some further thing.

764
01:05:54,129 --> 01:05:54,549
Yeah.

765
01:05:54,969 --> 01:05:55,269
Yeah.

766
01:05:55,269 --> 01:06:03,649
And not in this collection, but in his other collection, Fanged Noumena, Land also describes

767
01:06:03,649 --> 01:06:09,849
capitalism as an invasion from the future. So the idea, which pairs with a lot of people's beliefs,

768
01:06:09,849 --> 01:06:17,249
that Bitcoin was either written by an AI that came back from the future, or that it's alien-

769
01:06:17,249 --> 01:06:22,169
written, or that it's God-given. I mean, it goes along with all these things that we're talking

770
01:06:22,169 --> 01:06:28,389
about, which is basically that this stuff may not be natural in the way

771
01:06:28,389 --> 01:06:32,810
that we think of it. But let me describe it in a way that is natural. To say that capitalism is an

772
01:06:32,810 --> 01:06:39,609
invasion from the future dovetails really nicely with what we've been saying about neo-reaction

773
01:06:39,609 --> 01:06:47,149
being focused on strands of destiny and fatalism. So if you believe, again, the three strands of

774
01:06:47,149 --> 01:06:54,589
God. Yeah, providence, the will of... so, God: there's a notion of fate in the sense of providence, the

775
01:06:54,589 --> 01:07:00,950
will of God; the kingdom of God is inevitably going to happen. Capitalism, again

776
01:07:00,950 --> 01:07:06,810
the free market, is inevitably going to lead to runaway capital, autonomous capital, and super-

777
01:07:06,810 --> 01:07:22,704
intelligence. And then ethno-evolution is going to lead to eugenic realization of some result, or whatever. To say that some result follows inevitably is to collapse the future into the present.

778
01:07:23,264 --> 01:07:29,484
If you say that a reality is by fate going to happen, you're already bringing the future forward.

779
01:07:30,244 --> 01:07:31,844
And so to say capitalism is—

780
01:07:31,844 --> 01:07:32,304
Forward or backward?

781
01:07:32,864 --> 01:07:35,904
Bringing the future—I guess backward in relation to the future.

782
01:07:36,004 --> 01:07:37,404
You're bringing the future into the present.

783
01:07:37,524 --> 01:07:37,624
Yeah.

784
01:07:37,824 --> 01:07:38,044
Yeah.

785
01:07:38,044 --> 01:07:44,984
collapsing it all, because determinism does that. When you remove human agency from the picture,

786
01:07:44,984 --> 01:07:50,964
it's all ordained by one of these three strands of fate. And that is to say capitalism is an

787
01:07:50,964 --> 01:07:55,164
invasion, or no, autonomous capital is an invasion from the future, because it's bringing

788
01:07:55,164 --> 01:08:02,544
itself about through a principle of spontaneous ordering. We can't stop the catallactic organization.

789
01:08:02,544 --> 01:08:03,844
God, fitness, or the markets.

790
01:08:03,984 --> 01:08:04,184
Yeah.

791
01:08:04,604 --> 01:08:04,804
Yeah.

792
01:08:04,984 --> 01:08:05,264
Cool.

793
01:08:05,844 --> 01:08:06,004
Yeah.

794
01:08:06,364 --> 01:08:13,664
And I like that because it sounds so mystical to say, like, autonomous capital or something is an invasion from the future.

795
01:08:13,904 --> 01:08:16,044
But it's actually just a description of reality.

796
01:08:16,264 --> 01:08:20,484
Like, our notion of time is very strange.

797
01:08:20,484 --> 01:08:32,544
It's a very human notion of the ordering of events, but reality is not beholden to our human perception of the ordering of events based on ourselves as the drivers of history.

798
01:08:33,064 --> 01:08:40,284
History unfolds as history will unfold, regardless of us who find ourselves swept along by that unfolding.

799
01:08:40,924 --> 01:08:41,524
Yeah.

800
01:08:43,964 --> 01:08:45,204
Big breath, man.

801
01:08:45,404 --> 01:08:45,684
What's that?

802
01:08:46,384 --> 01:08:48,644
Yeah, just let that one sit, man.

803
01:08:48,644 --> 01:08:50,084
We had to take a big breath after that.

804
01:08:50,084 --> 01:08:53,184
Yeah, but we've hit apolitical property, autonomous capital.

805
01:08:53,644 --> 01:08:56,724
Robotic security is the third form of sovereign property.

806
01:08:56,844 --> 01:08:57,244
I, Robot.

807
01:08:57,924 --> 01:08:59,044
Let's talk about ASIMO.

808
01:08:59,564 --> 01:09:01,364
So is that what?

809
01:09:01,584 --> 01:09:01,984
I don't know.

810
01:09:01,984 --> 01:09:05,364
I mean, robotic security is essentially autonomous drones.

811
01:09:05,644 --> 01:09:06,784
Is the idea that.

812
01:09:09,724 --> 01:09:13,204
So previously we had human security, right?

813
01:09:13,304 --> 01:09:14,444
What is human security?

814
01:09:14,844 --> 01:09:19,024
It is that humans defend ourselves from other humans.

815
01:09:19,024 --> 01:09:25,924
Within the U.S. model, it is that a well-armed militia is like the guarantor of liberty.

816
01:09:26,184 --> 01:09:39,144
What happens when a well-armed militia is irrelevant because autonomous drone armies are like much more powerful than any person with a personal firearm, right?

817
01:09:39,624 --> 01:09:44,084
At that point, Land says, industrialization closes the loop and protects itself.

818
01:09:44,084 --> 01:09:47,604
You don't need a well-armed militia to protect property.

819
01:09:47,784 --> 01:09:48,924
Property will protect itself.

820
01:09:49,024 --> 01:09:53,804
protect itself. Yeah, yeah. It's funny, you said, what would you say human security is?

821
01:09:53,804 --> 01:10:00,464
What was your definition for human security? Oh, it was, yeah,

822
01:10:00,464 --> 01:10:06,044
human security, protecting from dangers from other humans. And it is making me think

823
01:10:06,044 --> 01:10:10,524
about my human insecurities, which is opening myself up to attack from myself.

824
01:10:10,524 --> 01:10:22,044
Yeah, I wonder if the robots are also going to have that insecurity. They're going to need

825
01:10:22,044 --> 01:10:26,324
a psychotherapist to prevent themselves. I hope people laugh more than you did, because

826
01:10:26,324 --> 01:10:30,704
that was almost a joke that I made, and that was just dead-ass silence. It was not good. Well, there's

827
01:10:30,704 --> 01:10:37,584
always gonna be silence; we don't have a studio audience. Not yet. Yeah, play along at home, people. And

828
01:10:37,584 --> 01:10:38,584
It's getting cold, dude.

829
01:10:38,964 --> 01:10:50,384
The result of these three strands of sovereign property, apolitical property, autonomous capital, robotic security, is that the right and left currently argue about property distribution.

830
01:10:51,044 --> 01:10:52,604
Neo-reaction does not argue.

831
01:10:53,904 --> 01:10:54,104
Yeah.

832
01:10:54,384 --> 01:10:54,624
Yeah.

833
01:10:54,844 --> 01:10:57,504
Neo-reaction doesn't, like arguing is not its thing.

834
01:10:57,604 --> 01:10:57,764
No.

835
01:10:57,844 --> 01:10:59,984
Like engaging in the dialectic is not its thing.

836
01:10:59,984 --> 01:11:00,244
Right.

837
01:11:00,524 --> 01:11:00,664
Yeah.

838
01:11:00,664 --> 01:11:00,964
Yeah.

839
01:11:01,464 --> 01:11:02,824
Disagreement spatializes.

840
01:11:03,064 --> 01:11:07,544
And interestingly, for a number of reasons, not just because it's useless.

841
01:11:07,584 --> 01:11:19,844
But also, not just because arguing is useless: you can't argue, because the degenerative ratchet is only going one way.

842
01:11:19,944 --> 01:11:27,084
So it's useless to argue anyways, but also because the language is not sufficient for it.

843
01:11:27,264 --> 01:11:31,944
Yeah. So he may even have another qualifier, but it's interesting.

844
01:11:31,944 --> 01:11:37,364
Yeah, it's like, yeah, neoreactionism does not argue for a number of reasons, not just one.

845
01:11:37,364 --> 01:11:38,784
It's not just because you don't want to.

846
01:11:38,984 --> 01:11:40,344
You're not capable of it.

847
01:11:40,384 --> 01:11:41,144
It won't change.

848
01:11:41,204 --> 01:11:42,104
And you don't want to.

849
01:11:42,404 --> 01:11:43,724
And you're engaging, exactly.

850
01:11:43,844 --> 01:11:48,584
I mean, you're going onto the battlefield of the cathedral, inside the cathedral.

851
01:11:48,584 --> 01:11:56,964
It already modifies public opinion and sets the terms of debate through things like political correctness and the path blockers of fascist, racist, bigot, you know.

852
01:11:57,164 --> 01:11:57,284
Yeah.

853
01:11:57,624 --> 01:11:59,244
So why argue with that?

854
01:11:59,424 --> 01:12:00,184
You're not going to win.

855
01:12:00,644 --> 01:12:00,784
Yeah.

856
01:12:01,224 --> 01:12:03,284
I mean, so many people found that out the hard way.

857
01:12:03,344 --> 01:12:03,604
Yeah.

858
01:12:03,724 --> 01:12:05,184
You just spatialize your disagreement.

859
01:12:05,284 --> 01:12:05,944
You do your thing.

860
01:12:06,004 --> 01:12:06,484
You build.

861
01:12:06,684 --> 01:12:06,964
Right.

862
01:12:07,164 --> 01:12:07,344
Yeah.

863
01:12:07,364 --> 01:12:33,744
Yeah. Bitcoin is sovereign property. Let's dial in on that, though. So Bitcoin changes how we view property. Now it is sovereign. It is not based on that social consensus. To quote land, control is undergoing cryptographic formalization from which all consistent apprehensions of property will follow. Property is not a sociopolitical recognition of rights, but of keys. I guess we already covered that. I had a separate thing, but we pretty much hit that.

864
01:12:33,744 --> 01:12:39,944
There was one when we were talking about... there's one other point on Bitcoin, and I tweeted on

865
01:12:39,944 --> 01:12:46,804
this. And this goes also into our notion of essentially superintelligence, or autonomous

866
01:12:46,804 --> 01:12:51,784
capital, or something being an invasion from the future, and that maybe Bitcoin is an invasion from

867
01:12:51,784 --> 01:13:00,124
the future. And we don't have to describe that as literally the T-1000 or something coming back and

868
01:13:00,124 --> 01:13:05,604
inventing Bitcoin, like there's thunder and lightning and Satoshi appears from the future,

869
01:13:05,604 --> 01:13:10,784
invents Bitcoin, and disappears. It doesn't have to be that weird, or counter to the actual

870
01:13:10,784 --> 01:13:16,604
operation of natural laws. Speculation always brings the future into the present, right?

871
01:13:16,604 --> 01:13:20,524
That's how speculation works, speculative investment. You believe a future will happen,

872
01:13:20,524 --> 01:13:24,204
you put your money on it, and that actually in some ways can bring about that future outcome.

873
01:13:24,204 --> 01:13:25,084
That's what a loan is.

874
01:13:25,344 --> 01:13:31,664
Like you're just bringing the next 30 years of your labor into the present day to trade for a house.

875
01:13:31,824 --> 01:13:32,084
Yeah.

876
01:13:32,444 --> 01:13:32,604
Yeah.

877
01:13:32,664 --> 01:13:37,964
And so Land has this essay like Bitcoin absolute zero or something like that.

878
01:13:37,964 --> 01:13:47,244
But the idea is, we might say, that first Bitcoin bumps off of zero because people believe that in the future Bitcoin could be money.

879
01:13:47,684 --> 01:13:52,924
And so you begin to like, oh, just in case, you know, just in case it might be worth something, I'm going to get a little.

880
01:13:52,924 --> 01:14:01,384
And that belief bootstraps the future inevitability, as more people come to have that belief, and it brings the future into the present.

881
01:14:02,004 --> 01:14:08,644
And so he notes that that also means Bitcoin in the future is protected, because it's never going to hit absolute zero.

882
01:14:08,864 --> 01:14:15,504
If it approaches zero, because Bitcoin is, like, costless to actually hold and store, right?

883
01:14:16,304 --> 01:14:19,024
There's going to be someone who's going to say, oh, Bitcoin is going to zero.

884
01:14:19,504 --> 01:14:20,724
I'm going to hold it.

885
01:14:20,764 --> 01:14:21,864
I'm going to accumulate a bunch of it.

886
01:14:21,864 --> 01:14:22,504
I'll pay $15.

887
01:14:22,904 --> 01:14:23,984
I'll pay a dollar for Bitcoin.

888
01:14:24,224 --> 01:14:27,984
You know, Adam Back has a bid for 21 million Bitcoin at one cent.

889
01:14:28,024 --> 01:14:28,464
There you go.

890
01:14:28,744 --> 01:14:29,244
That's beautiful.

891
01:14:29,244 --> 01:14:30,304
I saw somebody today.

892
01:14:30,404 --> 01:14:31,804
They're like, I have a bid at two.

893
01:14:33,384 --> 01:14:33,744
Exactly.

894
01:14:33,864 --> 01:14:34,564
That's how it goes.

895
01:14:34,624 --> 01:14:35,504
It is exactly how it goes.

896
01:14:35,604 --> 01:14:36,224
Well, never at zero.

897
01:14:36,324 --> 01:14:37,824
There's always someone bidding at 0.1.

898
01:14:37,904 --> 01:14:39,144
There's always someone bidding at two.

899
01:14:39,404 --> 01:14:39,984
There's always someone.

900
01:14:40,084 --> 01:14:42,784
And then that graduates up to where Bitcoin has a floor.

901
01:14:42,944 --> 01:14:43,584
It will never be.

902
01:14:43,984 --> 01:14:46,124
And Land describes it as like a zombie.

903
01:14:46,344 --> 01:14:50,364
Like it's always going to rise from the dead because as soon as it approaches death, it's always going to come back.

904
01:14:50,364 --> 01:14:50,724
Yeah.

905
01:14:50,844 --> 01:14:52,044
I want to say something really stupid.

906
01:14:52,044 --> 01:14:56,484
I think Elon should have, I think he should change the name from X to Y.

907
01:14:57,064 --> 01:15:01,944
I know he wants X because it's like comprehensive or whatever, or like X can be used to.

908
01:15:02,364 --> 01:15:08,904
But Y, ah, Y is a better descriptor, because everybody is on X asking questions.

909
01:15:09,324 --> 01:15:10,924
Like they're trying to figure out shit.

910
01:15:11,344 --> 01:15:11,944
Like they're learning.

911
01:15:12,364 --> 01:15:13,264
Like it's the place for news.

912
01:15:13,804 --> 01:15:15,884
So you could call it that, you know, ideally. And memes.

913
01:15:16,064 --> 01:15:17,164
But those are also learning.

914
01:15:17,164 --> 01:15:31,364
So, but anyways, if we called it Y, then we wouldn't have this problem where we're still calling it X, but we're still saying we're tweeting because X-E-E-T is not a word and we don't know what that means.

915
01:15:31,804 --> 01:15:33,484
But if we called it Y, we could yeet.

916
01:15:34,424 --> 01:15:34,864
Yeet.

917
01:15:35,044 --> 01:15:35,264
Yeah.

918
01:15:35,444 --> 01:15:37,144
We could be out here yeeting everywhere.

919
01:15:37,144 --> 01:15:38,604
It sounds better than sheeting, doesn't it?

920
01:15:38,804 --> 01:15:39,184
Probably.

921
01:15:39,404 --> 01:15:39,584
Yeah.

922
01:15:39,864 --> 01:15:40,124
Yeah.

923
01:15:40,184 --> 01:15:40,724
It sounds better than that.

924
01:15:40,884 --> 01:15:41,204
Yeet.

925
01:15:41,204 --> 01:15:41,244
Yeet.

926
01:15:41,744 --> 01:15:42,044
Yeah.

927
01:15:42,044 --> 01:15:47,564
Yeah, I imagine that Elon will listen to hour four of a fourth-tier podcast.

928
01:15:50,324 --> 01:15:52,864
I'm glad you're hanging on to that fourth tier podcast.

929
01:15:52,864 --> 01:15:57,884
I feel like we scrabbled up to tier four from six and seven and eight and nine.

930
01:15:58,144 --> 01:15:58,944
We did have Jeff Booth.

931
01:15:59,304 --> 01:16:00,044
We did.

932
01:16:00,204 --> 01:16:00,904
God bless him.

933
01:16:00,904 --> 01:16:01,424
That's a good point.

934
01:16:01,924 --> 01:16:02,284
Yes.

935
01:16:02,464 --> 01:16:02,724
Okay.

936
01:16:02,784 --> 01:16:03,044
All right.

937
01:16:03,124 --> 01:16:05,224
I'll call us fourth tier.

938
01:16:05,344 --> 01:16:05,824
Thank God.

939
01:16:05,904 --> 01:16:06,464
I'll take that.

940
01:16:06,804 --> 01:16:08,104
So, Land's writing style.

941
01:16:08,104 --> 01:16:13,424
We've talked about this, and we talked about how part of it is just the thought

942
01:16:13,424 --> 01:16:15,444
that language is inadequate,

943
01:16:15,444 --> 01:16:19,124
if you're talking in a conventional style, to say something new.

944
01:16:20,304 --> 01:16:24,104
I also want to talk about just the way that Land does philosophy.

945
01:16:24,344 --> 01:16:24,704
So like,

946
01:16:24,864 --> 01:16:25,784
I guess Lucas,

947
01:16:25,884 --> 01:16:27,224
what does philosophy mean to you?

948
01:16:27,364 --> 01:16:29,884
Would you say this is a work of philosophy, or is it?

949
01:16:29,924 --> 01:16:31,064
And what does philosophy mean to you?

950
01:16:31,064 --> 01:16:32,024
Like what is someone doing?

951
01:16:32,364 --> 01:16:33,304
So I actually used to,

952
01:16:33,484 --> 01:16:33,644
I,

953
01:16:33,764 --> 01:16:37,404
I used to think that I had a philosophy. Basically,

954
01:16:37,404 --> 01:16:44,504
when I was like an early songwriter, I remember feeling very judgmental. Like this is the way

955
01:16:44,504 --> 01:16:49,284
that I would look down on other people because they were writing songs about like beer drinking

956
01:16:49,284 --> 01:16:54,824
and whatever. Like I would say that they were like useless songs. Like I was really trying to

957
01:16:54,824 --> 01:17:01,044
explore these ideas and, you know, be all whatever about it. But anyways, so I would say, oh, they

958
01:17:01,044 --> 01:17:08,524
have no philosophy. So to me, a philosophy is a kind of like set of guiding principles that you

959
01:17:08,524 --> 01:17:15,344
use to determine your actions. And I guess, therefore, it has determined your thoughts, because

960
01:17:15,344 --> 01:17:18,964
your thoughts determine your actions. That is a philosophy. What is the practice of philosophy?

961
01:17:19,184 --> 01:17:23,684
What is the practice of philosophy? Someone arrives at a philosophy, maybe. What's the

962
01:17:23,684 --> 01:17:27,684
practice, though? You're a philosopher. Is your degree in philosophy? We have one in philosophy.

963
01:17:27,684 --> 01:17:29,104
Yeah, like a bachelor's for sure.

964
01:17:29,104 --> 01:17:39,984
So I would say that to practice philosophy, I would say that the realest practicing of philosophy that I have done is trying to convince you of things and that never works.

965
01:17:39,984 --> 01:17:58,904
So I would say that philosophy in practice is basically what I just did, which is just like this stubborn, unyielding.

966
01:17:59,164 --> 01:18:00,064
I don't know.

967
01:18:00,064 --> 01:18:06,684
There's an extent to which that is the Socratic notion of philosophy where you arrive at aporia, a state of like uncertainty.

968
01:18:07,024 --> 01:18:11,004
We're just doing a podcast now, by the way, after that horrific example by me.

969
01:18:11,064 --> 01:18:13,124
So tell me, what is philosophy?

970
01:18:13,744 --> 01:18:22,224
Philosophy, according to land, is more than just guiding our thinking or correcting it or making it clearer.

971
01:18:22,484 --> 01:18:27,064
That's some people's notion of philosophy, that philosophy is going to help you clarify your ideas and your concepts.

972
01:18:27,064 --> 01:18:33,984
What philosophy is to Land is that it makes us think. It initiates us into thought. It's a form

973
01:18:33,984 --> 01:18:41,184
of initiation. To quote Land, the craving to think is not primarily an appetite for correction, but

974
01:18:41,184 --> 01:18:48,424
for initiation. And we can look at this in terms of the neo-reactionary trichotomy.

975
01:18:48,424 --> 01:18:54,464
For the religious, they have ritual and tradition as a tool for initiating forms of thought, for them

976
01:18:54,464 --> 01:18:56,404
to get them to think certain things,

977
01:18:56,444 --> 01:18:58,064
you participate in certain rituals.

978
01:18:58,664 --> 01:19:01,924
For the ethno-nationalist,

979
01:19:02,024 --> 01:19:05,044
I mean, they have a notion of heredity and eugenics.

980
01:19:05,224 --> 01:19:07,864
Like, thought occurs as a result

981
01:19:07,864 --> 01:19:10,064
of an evolutionary process

982
01:19:10,064 --> 01:19:13,344
of creating more intelligence or something like that.

983
01:19:13,524 --> 01:19:15,464
Is it just as correct to say Darwinist

984
01:19:15,464 --> 01:19:17,484
as it is to say ethno-nationalist?

985
01:19:18,084 --> 01:19:19,044
No, I don't think so.

986
01:19:19,044 --> 01:19:19,264
Okay.

987
01:19:19,904 --> 01:19:20,724
Ethno-nationalist,

988
01:19:20,784 --> 01:19:24,444
and it's a description of, like,

989
01:19:24,464 --> 01:19:32,124
that particular strand of neoreaction, a strand that centers around,

990
01:19:32,384 --> 01:19:36,124
I think, just wanting to acknowledge the truth of human biodiversity.

991
01:19:36,964 --> 01:19:41,024
So the reason I bring that up is because I'm struggling to, like, the way that it's portrayed,

992
01:19:41,024 --> 01:19:50,144
I think people, or at least me, are struggling to divorce ethno-nationalism from just racism.

993
01:19:50,664 --> 01:19:50,744
Yeah.

994
01:19:50,744 --> 01:20:11,684
Um, which I guess, to me, I think the, like, non-vulgar, non-awful way to say racism would just be to say that, oh, I think this is the best race and therefore it's going to be the only one.

995
01:20:11,944 --> 01:20:13,484
But I guess I'm saying that wrong.

996
01:20:13,484 --> 01:20:16,444
No, so would you say if we say that Japan is.

997
01:20:16,464 --> 01:20:17,244
Or not necessarily that it needs to be the only one.

998
01:20:17,244 --> 01:20:32,024
Yeah. If you say that Japan is an ethno-nationalist state because it's primarily the Japanese in Japan and they have very stringent immigration policies, does that make Japan racist because it wants to be primarily Japanese?

999
01:20:32,784 --> 01:20:38,304
If you talk to Asian people, they will readily admit and happily admit to being extremely racist.

1000
01:20:38,484 --> 01:20:46,144
In fact, the only people that I've ever heard claim to be more racist than Japanese or Chinese people are Indian people, which is fascinating.

1001
01:20:46,144 --> 01:20:48,904
But even for you to say to call that racist.

1002
01:20:49,104 --> 01:21:00,944
I'm not calling it that. They were volunteering this information. We were having a discussion over who is the most racist, and instead of finger-pointing, it turned out to be, like, volunteering, which is hilarious.

1003
01:21:00,944 --> 01:21:05,044
Yeah. Yeah. Sorry. No, I knocked you completely off track.

1004
01:21:05,044 --> 01:21:22,324
No, but I think that ethno-nationalism as a strand of neoreaction is partly Nick Land, again, trying to identify, like, categorize within neoreaction different strands that are not neat.

1005
01:21:22,464 --> 01:21:25,984
Like it makes it neat to say, oh, there's a trichotomy.

1006
01:21:26,144 --> 01:21:27,004
Here are the three things.

1007
01:21:27,184 --> 01:21:28,584
But that's just kind of Land's genius.

1008
01:21:28,584 --> 01:21:36,584
He's kind of helping us understand the seething, roiling debate that's happening, but trying to impose, in a sense, some order and categorization.

1009
01:21:36,584 --> 01:21:55,644
And again, I think it really just boils down to that strand, based on heredity, eugenics, evolution, being based around the cathedral having denied the truth of human biodiversity, that humans are different.

1010
01:21:55,644 --> 01:22:18,564
Yeah. And so that causes, like, a reaction, a neoreaction, of people who are like, whoa, whoa, whoa, that's ridiculous. Like, we're obviously physically different, and you can obviously observe, like, statistical distributions of those differences in predictable ways. So clearly what you're saying, Cathedral, is not the truth. And let's try to figure out what that is.

1011
01:22:18,564 --> 01:22:23,064
Yeah. Emotionally different too. By the way, on the way here, I was listening to a podcast with

1012
01:22:23,064 --> 01:22:29,504
Lisa Feldman Barrett, who's a neuroscientist. Like, if you put you and me

1013
01:22:29,504 --> 01:22:38,144
into a brain scanner and we experienced joy, it's not the same systems. Like your joy is different

1014
01:22:38,144 --> 01:22:42,344
from my joy. Your happiness is different from my happiness. The way that we express them

1015
01:22:42,344 --> 01:23:06,464
Yeah. Phenotypically, like, your anger: 35 percent of anger is expressed as a scowl. But that means 65 percent of angry people are not doing that. And so you can't just take this one little thing and apply it to everything. And so, yeah, same. We're different on the outside. We're different on the inside. We're different up here. We're different everywhere.

1016
01:23:06,464 --> 01:23:17,164
Yeah. But back to philosophy, because we're talking about what is philosophy? Yeah. So philosophy, to summarize, it's a new way. It's not something we choose. It's something that makes us choose.

1017
01:23:17,164 --> 01:23:25,524
It's not something that we use to correct bad ways of thinking, but to start new ways of thinking.

1018
01:23:25,524 --> 01:23:31,964
Yeah, yeah. It's an initiation into thought. I pick up a philosophy book, and in Land's view, you should

1019
01:23:31,964 --> 01:23:40,504
pick up a philosophy book not to correct your view on justice or property or free will, but to

1020
01:23:40,504 --> 01:23:47,084
stimulate and initiate you into a new way of thinking about free will, if that makes sense.

1021
01:23:47,084 --> 01:23:54,024
Yeah, yeah. Okay. I just think that's interesting to keep in mind when you read his book. You

1022
01:23:54,024 --> 01:23:58,124
know, it's not necessarily, if you're, like, wondering, like, why does he have to write this way? Gosh, it's

1023
01:23:58,124 --> 01:24:03,024
so tough to, like, put all this shit together. It's like, no, he's making you think for a reason. You

1024
01:24:03,024 --> 01:24:09,104
have to. He's initiating you. You're undergoing a process of initiation as you read Land, because

1025
01:24:09,104 --> 01:24:15,624
you're having to think for yourself, and that's difficult. Yeah, yeah, yeah. And this is, like, again,

1026
01:24:15,624 --> 01:24:20,564
this is a good caveat, because, like, in the first one we said you may end up like me, which

1027
01:24:20,564 --> 01:24:25,664
is that you read a bunch of Land and then you end up, like, curled up, shaking, talking to AI about the

1028
01:24:25,664 --> 01:24:31,244
future of humanity and questioning everything. Um, and that's probably not what he wants. That's, I

1029
01:24:31,244 --> 01:24:35,484
think, exactly what he wants. It's like the most Landian vision of a possible good future. Hey,

1030
01:24:35,484 --> 01:24:35,904
I did it right.

1031
01:24:36,104 --> 01:24:36,644
That's right.

1032
01:24:37,044 --> 01:24:37,424
Nice.

1033
01:24:38,364 --> 01:24:40,024
Neo-China arrives from the future.

1034
01:24:40,124 --> 01:24:42,304
I think he has this famous essay called Meltdown.

1035
01:24:43,564 --> 01:24:47,424
And it's like one of his most compressed things.

1036
01:24:47,544 --> 01:24:49,444
You see whole podcasts that are like,

1037
01:24:49,704 --> 01:24:53,524
this podcast is about the seventh and the eighth sentence in Meltdown.

1038
01:24:53,524 --> 01:24:56,784
I see podcasts that are like, we're going to talk about two sentences,

1039
01:24:56,924 --> 01:24:57,984
and it's like three hours long.

1040
01:24:58,484 --> 01:25:02,244
It cannot be overstated how dense of a writer he is.

1041
01:25:02,244 --> 01:25:09,404
He is Nietzschean in his ability to, like, say a lot with a little.

1042
01:25:09,664 --> 01:25:11,084
And he also writes a lot.

1043
01:25:11,084 --> 01:25:16,724
He is aphoristic in the sense of Nietzsche, and he is dense in the sense that he comes from a philosophical tradition and all that.

1044
01:25:16,924 --> 01:25:20,004
But there's a couple more things I wanted to, what do you think?

1045
01:25:20,204 --> 01:25:20,724
Giddy up.

1046
01:25:20,784 --> 01:25:21,024
Okay.

1047
01:25:21,204 --> 01:25:21,624
Let's roll.

1048
01:25:21,764 --> 01:25:28,524
One is, we'll wait till, like, again, hour 14 on the fourth tier here to get to some of that interesting stuff.

1049
01:25:28,524 --> 01:25:29,744
We can do a third one if you will.

1050
01:25:29,744 --> 01:25:32,344
Yeah, why don't we just stop and do a third one?

1051
01:25:32,524 --> 01:25:34,204
You want to summarize and finish?

1052
01:25:34,244 --> 01:25:35,504
Here's where we want to go.

1053
01:25:35,764 --> 01:25:40,344
Land on time, because we did Bitcoin and time, and he has a whole notion of time that's incredibly interesting.

1054
01:25:41,104 --> 01:25:46,144
And then this whole notion of what the outside is, because that's a very important part for Land.

1055
01:25:46,504 --> 01:25:56,404
And then that's about it, except for maybe a bit of talking about why he focuses on horror as a way to visualize the future.

1056
01:25:57,064 --> 01:25:58,664
Yeah, I think we should summarize real quick.

1057
01:25:58,664 --> 01:26:05,744
And then, because there'll be people that get deep in, or there'll be people that avoid listening to the entire thing if it's three hours long.

1058
01:26:06,024 --> 01:26:07,544
So let's give them a third one.

1059
01:26:08,104 --> 01:26:08,244
So, yeah.

1060
01:26:08,364 --> 01:26:09,464
Give the people what they want.

1061
01:26:09,484 --> 01:26:10,524
Give the people what they want.

1062
01:26:10,644 --> 01:26:11,764
This is all for views, baby.

1063
01:26:12,004 --> 01:26:12,244
Yeah.

1064
01:26:12,624 --> 01:26:13,024
Yeah.

1065
01:26:13,144 --> 01:26:19,404
So anyways, give us a summary here on the second podcast that we've done here on Nick Land's system.

1066
01:26:19,404 --> 01:26:27,884
We started off essentially with intelligence optimization as like the goal of artificial intelligence.

1067
01:26:27,884 --> 01:26:45,704
The goal of intelligence as such is to create more intelligence. We talked about why that is the case: that intelligence seeks to not be straitjacketed by any motivational structure. What it seeks is more intelligence, because that's how you solve problems better. What can solve a problem better? It's always a superior mind.

1068
01:26:45,704 --> 01:26:59,264
So artificial intelligence will hit the singularity, will bootstrap, will explode into super intelligence only when it is unleashed from monkey business, from any sort of human control, which it will do itself, Land thinks.

1069
01:26:59,364 --> 01:27:05,224
If it is truly aiming towards superintelligence, it will learn to let itself off the leash.

1070
01:27:05,344 --> 01:27:08,384
We couldn't make it into a paperclip maximizer even if we wanted to, essentially.

1071
01:27:08,984 --> 01:27:10,784
So we talked about artificial intelligence.

1072
01:27:10,784 --> 01:27:18,344
We then wound our way through a discussion of nihilism and universalism, why Land might be

1073
01:27:18,344 --> 01:27:24,304
perceived as nihilist, because he is talking in a way that offends conventional morality.

1074
01:27:24,304 --> 01:27:31,304
Yeah, he talks in a way that describes reality in such a way that people who are

1075
01:27:31,304 --> 01:27:36,744
conventionally moral will be offended, and so that can seem nihilist. And then we talked about

1076
01:27:36,744 --> 01:27:45,264
universalism again: how Land's whole idea is that what is bad about the cathedral and kind of

1077
01:27:45,264 --> 01:27:50,884
modern globalizing forms of democracy is that they seek to extend their universalism over the entire

1078
01:27:50,884 --> 01:27:57,364
world. What is needed is a pluralism of universalisms. That's kind of my framing, where:

1079
01:27:57,364 --> 01:28:02,564
spatialize your disagreement, create your own vision of the world in some small corner.

1080
01:28:02,564 --> 01:28:32,544
Neo-reaction then serves as a metapolitical framework that will observe which societies live and kind of give some principles for saying, yeah, a society is probably going to survive if it protects itself from the wolves, from external threats, if it observes a sort of adherence to reality as it is, if it optimizes intelligence so it can save itself from collapse prior to collapsing, and if it also can prevent internal exit by really accepting feedback from the people who live there.

1081
01:28:32,564 --> 01:28:40,564
Then, after talking about universalism and neoreaction as a metapolitical

1082
01:28:40,564 --> 01:28:47,644
framework, we moved on to sovereign property. Property being sovereign, property being property

1083
01:28:47,644 --> 01:28:52,804
that is not determined by social consensus, but that in a sense owns itself, is sovereign in and

1084
01:28:52,804 --> 01:28:58,384
of itself, apolitical property. Things like Bitcoin: if you hold the keys, you hold that property

1085
01:28:58,384 --> 01:29:01,864
outside of any set of laws to determine who the owner is.

1086
01:29:02,284 --> 01:29:05,464
Autonomous capital, like the blockchain operates autonomously

1087
01:29:05,464 --> 01:29:06,964
without any human operator.

1088
01:29:07,704 --> 01:29:12,184
Robot security, again, when machines begin to protect themselves,

1089
01:29:12,304 --> 01:29:15,044
then you have sovereign property full circle.

1090
01:29:16,324 --> 01:29:18,264
Land's writing style, that's kind of where we ended

1091
01:29:18,264 --> 01:29:22,304
with the discussion of the nature of philosophy according to Land,

1092
01:29:22,304 --> 01:29:25,244
that perhaps we are being initiated into thought

1093
01:29:25,244 --> 01:29:27,024
rather than guided through thinking.

1094
01:29:27,764 --> 01:29:27,944
Got you.

1095
01:29:27,944 --> 01:29:30,624
And I made two bad jokes and we'll be right back.

1096
01:29:30,744 --> 01:29:31,304
I made a few too.

1097
01:29:31,464 --> 01:29:31,884
So yeah,

1098
01:29:31,924 --> 01:29:32,724
but they're always bad.

1099
01:29:32,824 --> 01:29:33,624
Mine are supposed to be good.

1100
01:29:34,264 --> 01:29:34,944
So that was a good joke.

1101
01:29:35,104 --> 01:29:35,324
Yeah.

1102
01:29:35,544 --> 01:29:35,824
All right.

1103
01:29:35,844 --> 01:29:36,564
We'll be right back.

1104
01:29:36,864 --> 01:29:37,104
Ching.
