1
00:00:00,000 --> 00:00:07,980
applause for Matt O'Dell, Derek Ross, and Sean Yeager.

2
00:00:08,200 --> 00:00:09,480
Yo, how's it going, guys?

3
00:00:10,760 --> 00:00:15,220
We'll be talking today about the future of digital comms,

4
00:00:16,000 --> 00:00:21,000
identity, social, and open communities.

5
00:00:23,420 --> 00:00:26,740
And my good friends, Derek and Sean, here with us.

6
00:00:26,740 --> 00:00:32,980
I think an interesting place to start is diagnosing the problem.

7
00:00:33,820 --> 00:00:38,740
We've found ourselves in an increasingly digital world.

8
00:00:40,100 --> 00:00:47,280
And the status quo has kind of just been built one foot in front of the other, without any real planning.

9
00:00:47,600 --> 00:00:49,060
And now we're here.

10
00:00:49,800 --> 00:00:51,740
So let's start off with Sean.

11
00:00:51,740 --> 00:01:03,400
When you think about the current state of digital communities and identity and social, where do you diagnose the problems existing?

12
00:01:03,700 --> 00:01:09,380
I think as with most everything, as an admitted Bitcoiner, it starts with broken money.

13
00:01:09,980 --> 00:01:11,940
And with broken money come broken incentives.

14
00:01:13,400 --> 00:01:19,100
And from that flow business models that, as we all know, turn us into the product.

15
00:01:19,100 --> 00:01:24,820
And there has been an increasing, I think, drive

16
00:01:24,820 --> 00:01:27,940
to try to milk more out of the consumer user

17
00:01:27,940 --> 00:01:30,860
and then the advertisers and the businesses.

18
00:01:32,880 --> 00:01:36,040
Cory Doctorow has a colorful phrase.

19
00:01:36,640 --> 00:01:37,840
Maybe I won't utter it here.

20
00:01:38,760 --> 00:01:42,760
But to describe how these cycles roll out.

21
00:01:42,760 --> 00:01:48,940
And so where we find ourselves is not only are we not the user,

22
00:01:49,100 --> 00:01:56,340
we're the product, but I think increasingly we are seen as something to be packaged.

23
00:01:57,240 --> 00:02:02,980
And we see creeping KYC. We see everything happening in the UK with the Online Safety Act.

24
00:02:04,000 --> 00:02:08,680
And yeah, we're not in a good place right now. Very well said. Derek, how do you think about it?

25
00:02:09,900 --> 00:02:16,020
Well, I think that over the past few years, we've all come to a place where we

26
00:02:16,020 --> 00:02:22,380
know somebody or we interacted online with somebody, followed somebody that has been

27
00:02:22,380 --> 00:02:31,660
censored or has been shadow banned, something along those lines. It's becoming more apparent

28
00:02:31,660 --> 00:02:37,640
and it's accelerating. It's kind of odd to see it accelerating. Like Sean just said,

29
00:02:37,640 --> 00:02:44,240
we're seeing that happen across the European Union, across the, you know, in the UK.

30
00:02:44,240 --> 00:02:50,680
We're starting to see this actually even happen recently here in the United States, where

31
00:02:50,680 --> 00:02:57,400
people can have their whole entire livelihood, their business taken away because they built

32
00:02:57,400 --> 00:03:04,720
their business on somebody else's foundation. And they don't own that content. They don't own

33
00:03:04,720 --> 00:03:11,460
their followers. They don't own their entire social graph, and it's disappearing overnight.

34
00:03:11,460 --> 00:03:16,100
Years and years of hard work can be taken away from you, and you can't do anything about it

35
00:03:16,100 --> 00:03:23,960
because you built your entire digital life on somebody else's foundation and it's becoming

36
00:03:23,960 --> 00:03:30,520
very apparent that there needs to be a better way. Yeah, I think there are a couple of

37
00:03:30,520 --> 00:03:40,340
issues that compound on top of each other that result in the current trajectory that

38
00:03:40,340 --> 00:03:48,320
we're going down in terms of big tech and digital platforms. So, I mean, you guys honed in on

39
00:03:49,100 --> 00:03:56,420
censorship and control, which I think is one that people talk about a lot. So, Sean, you've been

40
00:03:56,420 --> 00:04:04,220
exploring like kind of this intersection between, you know, AI and Bitcoin. And the other piece here

41
00:04:04,220 --> 00:04:10,060
that is really interesting to me is like this idea of deepfakes and verifiability. How do you

42
00:04:10,060 --> 00:04:17,980
think about that in the current paradigm? I think, I mean, and just a brief bit of background,

43
00:04:18,380 --> 00:04:23,360
hopefully not a shameless shill: the point of Trust Revolution is to pursue two questions. One

44
00:04:23,360 --> 00:04:30,520
is how do we as developed nations find ourselves in low-trust societies, in that we, I think most of

45
00:04:30,520 --> 00:04:33,400
us can agree, Pew Research and others would certainly back this up. We don't trust the

46
00:04:33,400 --> 00:04:35,860
government. We don't trust the media. We don't trust healthcare. We don't trust education. We don't

47
00:04:35,860 --> 00:04:39,880
trust each other. We don't trust across party lines. That's not a black pill. I think it's

48
00:04:39,880 --> 00:04:45,880
just observably true. The second more hopeful question is how and where can we reclaim the

49
00:04:45,880 --> 00:04:51,900
trust that we have given, or that has been demanded of us and then broken? And how can we build trust

50
00:04:51,900 --> 00:04:56,460
where we believe it should be. So that's all to say, to your question, can we trust our eyes? You

51
00:04:56,460 --> 00:05:01,560
know, can we trust the media that we see and we consume? I think what's hopeful about that is the

52
00:05:01,560 --> 00:05:10,680
ability to utilize public-private key cryptography to sign, authenticate, and attribute media. I think

53
00:05:10,680 --> 00:05:15,500
we're quite a ways away from that being large scale. I think, once again, the incentives are

54
00:05:15,500 --> 00:05:20,700
not necessarily aligned for that to be widely adopted, but I think the tools are there.

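[Editor's note: a minimal sketch of the sign-and-verify flow Sean is describing. Nostr itself uses BIP-340 Schnorr signatures over secp256k1; this sketch uses Ed25519 from the widely available `cryptography` package purely to illustrate signing, authenticating, and attributing media with a key pair.]

```python
# Minimal sketch: signing and verifying a piece of media with a key pair.
# Assumes the `cryptography` package (pip install cryptography). Nostr itself
# uses BIP-340 Schnorr over secp256k1; Ed25519 is used here only to
# illustrate the sign / authenticate / attribute idea.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

private_key = Ed25519PrivateKey.generate()   # held privately by the creator
public_key = private_key.public_key()        # shared with the world

media = b"raw bytes of a photo, video, or post"
signature = private_key.sign(media)          # attribution: only this key signs this way

try:
    public_key.verify(signature, media)      # anyone can check authenticity
    print("media is authentic and attributable to this public key")
except InvalidSignature:
    print("media was altered or was not signed by this key")
```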
55
00:05:21,500 --> 00:05:27,020
And the big question in my mind, to echo yours, is at what point do we reach this inflection

56
00:05:27,020 --> 00:05:41,918
where there is so much questioning and confusion about "is what I'm seeing real?" that there's broader adoption of the tools that we do have, like Nostr and these public key pairs, to address that challenge.

57
00:05:42,578 --> 00:05:46,218
But, I mean, aren't we kind of already there?

58
00:05:47,278 --> 00:05:49,298
In what way? There in terms of...

59
00:05:49,298 --> 00:05:52,118
I think most people, like, when you open your phone, you're like, is that real?

60
00:05:52,138 --> 00:05:53,058
Oh, yes.

61
00:05:53,618 --> 00:05:57,158
Like, we're very close, if not already across the chasm, right?

62
00:05:57,158 --> 00:06:11,018
Yeah. And I'll just say one quick thing there: I think, much as in prior waves of technology, there has been the need to create a certain literacy and a certain ability to scrutinize.

63
00:06:11,158 --> 00:06:21,858
I hope that it incentivizes and motivates people to become more thoughtful about what they consume and what they question or trust.

64
00:06:21,858 --> 00:06:35,338
I think expanding on what you consume is a unique problem in itself because what content I want to consume versus what content I'm forced to consume is very different.

65
00:06:35,338 --> 00:06:43,598
Because we are slaves to the algorithms and what these platforms want us to see.

66
00:06:43,758 --> 00:06:46,978
We don't really have control over the content.

67
00:06:47,158 --> 00:06:49,138
We don't have the control over our attention.

68
00:06:49,918 --> 00:06:51,538
And that's part of the problem too.

69
00:06:51,538 --> 00:06:59,898
So if you didn't want to see certain types of content, it's really hard to not see it using these existing legacy social platforms.

70
00:07:00,078 --> 00:07:00,698
You're being spoon-fed.

71
00:07:01,918 --> 00:07:06,978
So I mean from like a productive point of view, how do you mitigate that?

72
00:07:07,038 --> 00:07:08,478
How do you actually solve that problem?

73
00:07:08,678 --> 00:07:10,938
I mean that's easier said than done.

74
00:07:10,958 --> 00:07:12,238
Yeah, it's easier said than done.

75
00:07:12,238 --> 00:07:18,698
But we need tools for users that allow them to choose their own algorithm, to choose the type

76
00:07:18,698 --> 00:07:25,718
of content they want to see, to choose and curate their social feeds. Just because Elon and

77
00:07:25,718 --> 00:07:30,938
Mark Zuckerberg say that this is the content that you need to see doesn't mean that I want to see it.

78
00:07:31,018 --> 00:07:36,598
Doesn't mean that you want to see it, but I don't have a choice if I use Instagram or Facebook or

79
00:07:36,598 --> 00:07:43,498
X (Twitter). Like, I have to see that algorithmic content. I don't have a choice of choosing,

80
00:07:43,498 --> 00:07:49,638
you know, cat pics as my feed if I want to, you know, if I want a feed of cats or whatever it is.

81
00:07:49,638 --> 00:07:54,878
Sure, I could easily browse a hashtag or something like that, but that's not a good,

82
00:07:54,878 --> 00:07:59,778
you know, that's not a good choice. We need more user tools. We need more user choice.

83
00:07:59,778 --> 00:08:05,798
And there are options out there that give users full control over what they want to consume, full

84
00:08:05,798 --> 00:08:10,578
control over their attention. Because that's what these platforms are monetizing. They're monetizing

85
00:08:10,578 --> 00:08:17,318
our attention, right? Like, we need a way to take that back. It's, you know, what my eyes see.

86
00:08:17,318 --> 00:08:24,138
It's my attention. I should be able to designate what gets my attention. And do you think the

87
00:08:24,138 --> 00:08:28,878
friction point with that, because I do think that's the path forward, the friction point with that is

88
00:08:28,878 --> 00:08:33,518
it requires a level of personal responsibility from the actual user.

89
00:08:33,998 --> 00:08:34,218
Yeah.

90
00:08:34,938 --> 00:08:36,838
Like how do we handle that friction?

91
00:08:37,258 --> 00:08:39,698
There's some people that just want to scroll, right?

92
00:08:39,818 --> 00:08:45,778
They don't have time to build and curate their own feed.

93
00:08:46,078 --> 00:08:47,398
And that's fine.

94
00:08:47,478 --> 00:08:49,258
For that, you have a choice.

95
00:08:49,658 --> 00:08:51,958
But the fact that you don't have a choice is the problem.

96
00:08:52,198 --> 00:08:54,178
If you want the spoon-fed content, great.

97
00:08:55,098 --> 00:08:57,238
If you don't want the spoon-fed content,

98
00:08:57,238 --> 00:08:59,338
you want to be your own algorithm, be in control,

99
00:08:59,558 --> 00:09:01,958
you should have that choice, and a wide variety of choices.

100
00:09:02,318 --> 00:09:05,418
The choices should be open and transparent,

101
00:09:05,698 --> 00:09:08,658
and you should be able to decide which path you want to go down.

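[Editor's note: a toy sketch of the "choose your own algorithm" idea discussed above. The post fields and filters here are hypothetical; the point is that filtering and ranking can be a client-side function the user picks, rather than a platform decision.]

```python
# Toy "bring your own algorithm" feed: the user, not the platform, decides
# what gets filtered and how it ranks. Post fields are hypothetical.
from typing import Callable

posts = [
    {"author": "alice", "topic": "cats", "likes": 42},
    {"author": "bob", "topic": "outrage", "likes": 9000},
    {"author": "carol", "topic": "cats", "likes": 7},
]

def cats_only(post: dict) -> bool:
    """User-chosen filter: 'if I want a feed of cats, I get a feed of cats.'"""
    return post["topic"] == "cats"

def build_feed(posts: list[dict],
               keep: Callable[[dict], bool],
               rank: Callable[[dict], int]) -> list[dict]:
    """Filter and rank entirely on the client, per the user's choice."""
    return sorted((p for p in posts if keep(p)), key=rank, reverse=True)

for post in build_feed(posts, keep=cats_only, rank=lambda p: p["likes"]):
    print(post["author"], post["topic"])
```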
102
00:09:09,278 --> 00:09:11,598
And I would say it's also experiential in the sense that

103
00:09:11,598 --> 00:09:15,238
if you're not on Nostr, if you haven't tried Nostr...

104
00:09:15,778 --> 00:09:16,458
What is Nostr?

105
00:09:16,558 --> 00:09:17,098
What is Nostr?

106
00:09:17,098 --> 00:09:18,138
We didn't even talk about that yet.

107
00:09:18,158 --> 00:09:18,678
What is Nostr?

108
00:09:20,158 --> 00:09:23,438
Well, so like Bitcoin, I'll let Matt talk to this,

109
00:09:23,518 --> 00:09:24,298
it is an open protocol.

110
00:09:24,638 --> 00:09:26,098
No one controls it, no one owns it,

111
00:09:26,098 --> 00:09:28,698
and therefore it is there to be built upon.

112
00:09:28,798 --> 00:09:29,878
And the reason I mention it is,

113
00:09:31,538 --> 00:09:34,678
I think most of traditional social media and communications channels,

114
00:09:34,678 --> 00:09:39,058
one to many, they are not only monetizing our attention,

115
00:09:39,278 --> 00:09:41,158
increasingly they're monetizing our outrage.

116
00:09:42,818 --> 00:09:46,838
And I think as people that I've observed experience an alternative,

117
00:09:48,138 --> 00:09:49,818
Mastodon, others that are out there,

118
00:09:49,978 --> 00:09:51,878
I think we all agree that Nostr is the way to go.

119
00:09:51,878 --> 00:10:01,978
Once you remove the outrage, it is experiential: I feel better, or at least not worse,

120
00:10:02,438 --> 00:10:09,438
as I have engaged with others on Nostr versus X versus Facebook versus others. And so that is all

121
00:10:09,438 --> 00:10:16,158
to say, I think part of the key is just giving people a sense of what that's like. And I think

122
00:10:16,158 --> 00:10:23,278
we can begin, each of us, to sort of rewire those receptors, those dopamine hits that we're

123
00:10:23,278 --> 00:10:28,898
accustomed to getting. But it will take some time. I mean, you're drilling down on basically this

124
00:10:28,898 --> 00:10:38,478
concept of healthy usage of technology. Yes. Which I would say, as a society,

125
00:10:38,478 --> 00:10:46,538
we're probably very deep into unhealthy usage of the tools. And I mean, I see this firsthand

126
00:10:46,538 --> 00:10:53,638
in my own life. I see this across all different aspects of society right now. We have a term for

127
00:10:53,638 --> 00:10:59,738
that nowadays. It's called doomscrolling. Like, it became so apparent. We have AI psychosis.

128
00:10:59,738 --> 00:11:14,837
Yeah, doomscrolling. Everyone does it. A lot of people do it. They know they're doing it, and they continue to do it. But one aspect of this idea of digital health and healthy usage

129
00:11:14,837 --> 00:11:20,737
that I think is incredibly key for our society going forward is all three of us are parents.

130
00:11:20,957 --> 00:11:24,497
Specifically, I mean, I think adults use it in very unhealthy ways,

131
00:11:24,657 --> 00:11:29,777
but the question is, like, how does that affect childhood development?

132
00:11:29,777 --> 00:11:36,237
And for something like Nostr, that's an open protocol that's not controlled by anybody,

133
00:11:36,237 --> 00:11:43,477
how do you think, and we'll start with Sean again, how do you think about handling that issue?

134
00:11:43,477 --> 00:11:48,837
Like, how does society handle that going forward, with kids growing up with

135
00:11:48,837 --> 00:11:57,197
basically just a fire hose of information? Well, there's my little guy right there, my

136
00:11:57,197 --> 00:12:04,257
almost four-year-old, so I'm a dad to a young boy. And so I have a bit of time, but I'll just

137
00:12:04,257 --> 00:12:12,217
sort of maybe share an anecdote, which is that we, full credit to my wife, had given, close your

138
00:12:12,217 --> 00:12:18,577
ears, Lath, had given maybe an hour or two per morning of screen time so that, you know, she at

139
00:12:18,577 --> 00:12:25,077
home could have some space to do some things. It is remarkable, the change, and this will be obvious

140
00:12:25,077 --> 00:12:31,397
to those of you who've done it, but it was remarkable to me that in saying no and ending

141
00:12:31,397 --> 00:12:38,617
that and having zero screen time, the change in our son was incredible. And I personally don't know

142
00:12:38,617 --> 00:12:44,697
of any better reference point in my life than to have observed that firsthand. So I can only imagine

143
00:12:44,697 --> 00:12:51,497
what a young child given a device in their hand, that's not a judgment for anyone who chooses to

144
00:12:51,497 --> 00:12:59,557
do that. But I just can't imagine the damage that that will do. So I feel very passionate about

145
00:12:59,557 --> 00:13:05,197
our collective and, most of all, individual responsibility within our families

146
00:13:05,197 --> 00:13:11,737
to find better ways. So, I mean, right now we're seeing a lot of conversation about

147
00:13:11,737 --> 00:13:20,117
disenfranchised youth getting radicalized on internet communities. It's become a very sensitive

148
00:13:20,117 --> 00:13:27,337
conversation. Some of the quote-unquote solutions that have been proposed involve restricting

149
00:13:27,337 --> 00:13:36,917
speech, restricting access, adding digital ID, adding age restrictions. I mean, we just saw

150
00:13:36,917 --> 00:13:44,477
Bluesky, I think in two states, just added age restrictions to their app. Derek, how do you,

151
00:13:44,477 --> 00:13:48,417
what is the most productive path forward?

152
00:13:48,517 --> 00:13:53,057
Because I think the key here is that that is actually a problem.

153
00:13:53,277 --> 00:13:57,357
I do think disenfranchised youth are getting radicalized

154
00:13:57,357 --> 00:13:59,237
on niche internet communities.

155
00:14:00,117 --> 00:14:02,977
But when you're building out something like Nostr, an open protocol

156
00:14:02,977 --> 00:14:08,557
where you inherently can't age restrict on a top-down level,

157
00:14:08,937 --> 00:14:10,997
what is the most productive path?

158
00:14:10,997 --> 00:14:14,457
How do we actually solve that in a healthy way?

159
00:14:15,417 --> 00:14:17,037
That's a very good question.

160
00:14:17,157 --> 00:14:18,537
And it's probably a very hard question.

161
00:14:19,537 --> 00:14:25,077
I think I'll say part of it goes back to what Sean was alluding to,

162
00:14:25,157 --> 00:14:30,077
is that ultimately parents should parent.

163
00:14:30,897 --> 00:14:36,677
If kids are having issues online, getting radicalized over certain content,

164
00:14:36,677 --> 00:14:38,177
and you don't want that to happen to your kid,

165
00:14:38,217 --> 00:14:42,697
then you need to restrict access to certain applications.

166
00:14:43,317 --> 00:14:48,157
Now, that doesn't mean completely taking it away, because we know that kids today are very social

167
00:14:48,157 --> 00:14:51,297
online, so you can still give them apps.

168
00:14:51,297 --> 00:14:59,477
So the second part of this is we just need more user controls and we need more apps across

169
00:14:59,477 --> 00:15:07,357
the Nostr ecosystem that maybe do focus on restricting, filtering that type of content.

170
00:15:07,477 --> 00:15:12,617
So maybe you have, because Nostr is wide open and you can do anything you want, maybe

171
00:15:12,617 --> 00:15:20,977
somebody builds a Nostr application that is more suitable for the youth. Maybe it restricts certain

172
00:15:20,977 --> 00:15:28,237
types of content. It's only bound to certain content-filtered relays, and you can't use anything else

173
00:15:28,237 --> 00:15:35,177
but that. Now the argument is, well, the kid can take the profile, the nsec, and just use another

174
00:15:35,177 --> 00:15:42,237
app. But if you're the parent, you do the parenting and you lock down access to certain applications.

175
00:15:42,237 --> 00:15:44,757
You only give them access to the parent-approved app.

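[Editor's note: a hypothetical sketch of the kind of kid-focused client Derek describes: an app hard-bound to a parent-approved list of content-filtered relays, refusing everything else. The relay URLs are invented for illustration.]

```python
# Hypothetical kid-focused Nostr client, hard-bound to parent-approved,
# content-filtered relays. The relay URLs below are made up.
APPROVED_RELAYS = {
    "wss://kids.example-relay.com",      # hypothetical filtered relay
    "wss://family.example-relay.com",    # hypothetical filtered relay
}

def connect(relay_url: str) -> str:
    """Refuse any relay outside the parent-approved whitelist."""
    if relay_url not in APPROVED_RELAYS:
        raise PermissionError(f"relay not parent-approved: {relay_url}")
    # A real client would open a websocket to the relay here.
    return f"connected to {relay_url}"

print(connect("wss://kids.example-relay.com"))   # allowed
try:
    connect("wss://anything-goes.example.com")   # blocked by the client itself
except PermissionError as err:
    print(err)
```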
176
00:15:45,217 --> 00:15:46,437
I mean, they're your kids.

177
00:15:46,517 --> 00:15:49,237
You should be able to say what apps they use.

178
00:15:49,277 --> 00:15:54,277
And the personal example of that is I didn't let my kids use TikTok for a very long time.

179
00:15:54,797 --> 00:15:57,797
And my kids are now 14 and 16 years old.

180
00:15:58,117 --> 00:15:59,297
They now use TikTok.

181
00:15:59,517 --> 00:16:05,117
But they wanted to use it years ago when their friends were all using it, you know, 10, 12 years old.

182
00:16:05,397 --> 00:16:07,877
And I said, no, you're not using that app.

183
00:16:07,997 --> 00:16:08,537
I'm sorry.

184
00:16:08,717 --> 00:16:10,197
And they complained a lot.

185
00:16:10,197 --> 00:16:14,957
And I was a parent and said, well, I'm sorry, you're not using it.

186
00:16:15,297 --> 00:16:20,537
And I used my parental rights to restrict my kids' access to something I didn't want them on.

187
00:16:20,977 --> 00:16:21,737
Now they're older.

188
00:16:22,317 --> 00:16:24,297
Sure, I let them do it.

189
00:16:25,137 --> 00:16:26,917
And the same would go for any Nostr app.

190
00:16:27,017 --> 00:16:31,477
I would restrict and block access, if I wanted to.

191
00:16:31,517 --> 00:16:33,517
Because we have the tools to do that.

192
00:16:33,517 --> 00:16:47,495
But then, as I said, on the other side, we do need a Nostr client to step up and build a kid-friendly environment. Well, and I think, just quickly, the thing that's so powerful about this, in my strong

193
00:16:47,495 --> 00:16:54,195
promotion of Nostr, or whatever may come after, is the ability for individuals, for

194
00:16:54,195 --> 00:16:58,995
parents in this particular case, to be given the tools to make the choice.

195
00:17:00,095 --> 00:17:01,115
Yeah, I think that's the core.

196
00:17:01,295 --> 00:17:02,375
It should not come from X.

197
00:17:02,435 --> 00:17:03,595
It should not come from the government.

198
00:17:03,595 --> 00:17:09,115
It should come from the individuals closest to and most invested in that little human's health.

199
00:17:09,795 --> 00:17:15,655
And I think Nostr is a prime example of what an open protocol does with regard to giving us that power.

200
00:17:16,015 --> 00:17:21,515
Yeah, I think you give parents tools so that they can parent better.

201
00:17:21,915 --> 00:17:22,435
Absolutely.

202
00:17:22,895 --> 00:17:24,355
And have them take responsibility.

203
00:17:24,555 --> 00:17:25,995
And it's bigger than Nostr, right?

204
00:17:26,195 --> 00:17:26,595
Absolutely.

205
00:17:26,595 --> 00:17:56,575
I mean, it's kind of bewildering that Apple doesn't have, built into the iPhone or whatever, really granular controls for parents to choose how their kids are interacting with these things. I think you bring it down to almost the OS level, right? Like, because I'm a tech nerd, I know how to go in on my router and block my kids' devices from certain websites. I'll say it's easy, but is it easy for everybody? Probably

206
00:17:56,595 --> 00:18:02,735
not. So we need easier tools for everybody to use. Yeah, I agree. I mean, guys, this has been a great

207
00:18:02,735 --> 00:18:09,055
conversation. We've been a little bit more abstract, so just to bring it all back

208
00:18:09,055 --> 00:18:13,055
together and make it a little bit more actionable to people here that have never used Nostr

209
00:18:13,055 --> 00:18:19,015
and maybe want to play around with the tech. I think, you know, the best way to learn is to,

210
00:18:19,095 --> 00:18:25,315
you know, just get your hands dirty and actually use the tools. I mean, Sean, what would be your

211
00:18:25,315 --> 00:18:30,615
recommendation to someone who's interested in seeing what's being built? Yeah, I'll take

212
00:18:30,615 --> 00:18:37,195
just a, I'll steal someone else's analogy, metaphor: if you were a medieval king and you needed to

213
00:18:37,195 --> 00:18:41,735
issue a directive throughout the kingdom to your military, to someone else, as you would probably

214
00:18:41,735 --> 00:18:46,875
recall, you would have a signet ring. That signet ring would be heated, pressed into wax. It creates

215
00:18:46,875 --> 00:18:52,595
a seal, and that letter is then delivered to Matt, the general. And my signet ring is my private key.

216
00:18:53,175 --> 00:18:56,955
It is difficult to mimic, difficult to forge, presumably hard to steal.

217
00:18:57,615 --> 00:19:02,755
That's my piece of property that allows me to sign.

218
00:19:03,175 --> 00:19:04,635
The seal is the public key.

219
00:19:04,635 --> 00:19:10,895
And so that is all to say, in these ways that have been created and recreated throughout time,

220
00:19:11,395 --> 00:19:12,795
Nostr gives you that ownership.

221
00:19:13,255 --> 00:19:14,875
Now, with that comes great responsibility.

222
00:19:15,275 --> 00:19:16,035
You own that key.

223
00:19:16,155 --> 00:19:17,115
You have that signet ring.

224
00:19:17,115 --> 00:19:31,735
And so from that understanding that you can own your identity, you can own the ability to attribute your creation or publishing of content, it can be quite simple.

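[Editor's note: the signet-ring analogy maps directly onto how a Nostr event is actually stamped. Per NIP-01, the event id is the SHA-256 of a canonical serialization of the event, and the private key then signs that id with a BIP-340 Schnorr signature. Below is a sketch of the hashing step only; the Schnorr signing step is omitted, since it needs a secp256k1 library. Field values are placeholders.]

```python
# Sketch of the "wax seal" on a Nostr event (per NIP-01): the event id is
# the SHA-256 of a canonical serialization of the event; the private key
# then signs that id with a BIP-340 Schnorr signature (omitted here).
import hashlib
import json
import time

pubkey = "ab" * 32          # placeholder 32-byte hex public key
event = {
    "pubkey": pubkey,
    "created_at": int(time.time()),
    "kind": 1,               # kind 1 = short text note
    "tags": [],
    "content": "hello, nostr",
}

# NIP-01 serialization: [0, pubkey, created_at, kind, tags, content], no whitespace
serialized = json.dumps(
    [0, event["pubkey"], event["created_at"], event["kind"],
     event["tags"], event["content"]],
    separators=(",", ":"),
    ensure_ascii=False,
)
event_id = hashlib.sha256(serialized.encode("utf-8")).hexdigest()
print("event id (what the signet ring stamps):", event_id)
```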
225
00:19:32,175 --> 00:19:33,895
So I think Primal's brilliant.

226
00:19:34,695 --> 00:19:38,075
Full disclaimer: Matt, Ten31, investor in Primal.

227
00:19:38,635 --> 00:19:39,915
Fantastic application.

228
00:19:40,175 --> 00:19:42,755
So Primal.net, I think it's a great way to get started.

229
00:19:42,855 --> 00:19:44,875
I think it's one of the best consumer UXs.

230
00:19:44,875 --> 00:19:54,395
There are many others, depending on where you are on the spectrum from, I just want it to work, Apple-esque style, to, you know, like us, we're nerds and want to dig in.

231
00:19:54,575 --> 00:19:56,875
But I would say, in short, Primal.net, take a look.

232
00:19:57,695 --> 00:19:58,495
Great recommendation.

233
00:20:00,435 --> 00:20:02,155
I think he handled that really well.

234
00:20:02,355 --> 00:20:02,475
Yeah.

235
00:20:02,595 --> 00:20:12,175
So while we have a little bit more time, just real quick: vibe coding, Nostr, AI, Bitcoin, that's where your focus is right now.

236
00:20:12,395 --> 00:20:12,835
Yes.

237
00:20:13,015 --> 00:20:13,915
Why is that powerful?

238
00:20:13,915 --> 00:20:24,115
Because Soapbox is building tools that allow people that are creators or have their own community to build an application.

239
00:20:24,575 --> 00:20:25,495
You can vibe code it.

240
00:20:25,735 --> 00:20:29,275
You can build your own app for your own community.

241
00:20:29,855 --> 00:20:34,135
And because it's built on Nostr, you can own all of that content.

242
00:20:34,135 --> 00:20:45,775
So instead of using Discord or Twitter or whatever for your community, you could use Shakespeare to build your own community app, customized how you've always wanted it to be.

243
00:20:45,975 --> 00:20:46,955
And you own it.

244
00:20:47,015 --> 00:20:47,995
You own all the source code.

245
00:20:48,055 --> 00:20:48,775
You own all the data.

246
00:20:49,215 --> 00:20:50,335
It's decentralized.

247
00:20:50,675 --> 00:20:52,175
You can do whatever you want with it.

248
00:20:52,535 --> 00:20:54,135
And nobody can take that away from you.

249
00:20:54,195 --> 00:21:01,195
Whereas if your Discord server gets taken down because you're a streamer or a musician or an artist or something, well, you're screwed.

250
00:21:01,275 --> 00:21:01,995
You can't do anything.

251
00:21:01,995 --> 00:21:06,795
But if you use Soapbox tools and you build with Shakespeare, you can own every piece of the puzzle.

252
00:21:07,455 --> 00:21:10,995
Yeah, and the key there is you don't need closed API access.

253
00:21:11,255 --> 00:21:12,955
You don't need to verify.

254
00:21:13,035 --> 00:21:13,975
You don't need to ask permission.

255
00:21:14,195 --> 00:21:14,795
You just do it.

256
00:21:15,215 --> 00:21:16,995
Yeah, you have the social graph.

257
00:21:17,115 --> 00:21:18,995
You have the identity layer.

258
00:21:19,715 --> 00:21:24,555
You have the comms protocol, all in Nostr, which is basically like an open API for the world for that.

259
00:21:24,555 --> 00:21:32,495
And then on the payment side, you have Bitcoin so that you don't have to get a Stripe API or something like that to integrate payments.

260
00:21:32,495 --> 00:21:34,175
No permission required. Just go do it.

261
00:21:34,495 --> 00:21:41,335
Yeah, you want to build a website that accepts Bitcoin payments for your product that you're selling or for your personal website or something.

262
00:21:41,955 --> 00:21:43,195
You don't need to know any code.

263
00:21:43,575 --> 00:21:45,095
You don't need to be a developer to know how to do it.

264
00:21:45,115 --> 00:21:46,615
You just have a conversation with AI.

265
00:21:47,335 --> 00:21:50,515
And you say, build me this website that does this thing, A, B, C, D.

266
00:21:50,855 --> 00:21:52,555
And a few minutes later, boom, it's done.

267
00:21:52,555 --> 00:21:55,055
And it's yours, and you can do whatever you want with it.

268
00:21:55,695 --> 00:21:56,135
Love it.

269
00:21:56,215 --> 00:21:58,195
Can we have a huge round of applause for Derek and Sean?

270
00:21:58,335 --> 00:21:58,935
Thank you guys.

271
00:21:59,015 --> 00:21:59,275
Thank you.
