1
00:00:02,960 --> 00:00:03,939
How are you, sir?

2
00:00:04,799 --> 00:00:06,819
Man, I'm doing great. Yeah? Yeah.

3
00:00:07,359 --> 00:00:12,019
How was Bitcoin Takeover at South by Southwest last week? It was awesome.

4
00:00:12,344 --> 00:00:16,125
Takeover was the culmination of, like, an entire week of Bitcoin content.

5
00:00:16,664 --> 00:00:31,260
And I was speaking at Takeover. So, like, the whole week, I was kind of dreading it, you know, excited, but also dreading, the way you do when you have a speaking engagement. Yeah. And so I was just getting more and more tired every late night at a Bitcoin event, and I'm like, alright, I gotta have energy for Friday. But,

6
00:00:31,720 --> 00:00:37,260
it ended up going really well. I think, the talk was good, but also the content from all the other speakers,

7
00:00:37,800 --> 00:00:38,780
was pretty solid.

8
00:00:40,454 --> 00:00:46,475
Well, fantastic. Were there any particular takeaways from last week's sessions or

9
00:00:47,414 --> 00:00:47,914
interactions?

10
00:00:48,295 --> 00:00:49,754
What's the vibe?

11
00:00:50,214 --> 00:00:56,250
Yeah. What is the vibe? I mean, people are obviously excited. I think one of the general vibes is that

12
00:00:56,870 --> 00:00:58,410
there has been a shift now

13
00:00:58,870 --> 00:01:02,329
where last year people were worried about being thrown in jail.

14
00:01:02,790 --> 00:01:11,415
And now they're worried about whether it's going to be Bitcoin only, or if, you know, altcoins are gonna make it into the reserve. So it's been quite a shift

15
00:01:11,715 --> 00:01:15,735
in the worries that we have on our plate and the arguments that we're having online.

16
00:01:16,195 --> 00:01:19,895
Absolutely. Yeah. I mean, it is an embarrassment of riches in some ways, and yet

17
00:01:20,515 --> 00:01:21,255
keep pushing.

18
00:01:22,300 --> 00:01:27,040
What was the attendance like? I mean, at the Commons, was it a full house,

19
00:01:27,740 --> 00:01:37,119
usual suspects, or some new faces, hopefully? Yeah. It was pretty packed. I think they were trying to max out the fire code, so I think we hit about 300 people.

20
00:01:37,765 --> 00:01:40,104
It was standing room only in there.

21
00:01:40,405 --> 00:01:41,865
A lot of the usual suspects,

22
00:01:42,244 --> 00:01:43,865
a lot of new faces as well.

23
00:01:44,165 --> 00:01:48,024
I chatted with one guy who was a stay-at-home dad for years,

24
00:01:48,564 --> 00:01:49,064
and

25
00:01:49,600 --> 00:02:06,805
this was his first time kind of venturing back out into the marketplace. He and his wife are swapping: she's gonna be the stay-at-home parent now. And so he was looking to get out and network with people, and he has an interest in Bitcoin. And I thought that was awesome. Like, somebody who's just kind of interested and trying to network, great place to start.

26
00:02:07,105 --> 00:02:10,165
Terrific. Yeah. I think, you know, at least,

27
00:02:11,345 --> 00:02:21,220
over the last few years, given my focus on Lightning and payments, it's been tempting to hope that merchants and small businesses would lead the way, but I think it begins with the individual.

28
00:02:21,520 --> 00:02:25,460
It's all about the individual, so that's great to see. Yeah. Well, speaking of which,

29
00:02:26,400 --> 00:02:28,180
let's dive in. I'm sorry. Okay.

30
00:02:29,175 --> 00:02:35,755
I was thinking to myself this morning that, well, I still have Mutiny Wallet installed on my iPhone. I have not yet been able

31
00:02:36,055 --> 00:02:37,275
to remove that.

32
00:02:38,535 --> 00:02:46,590
But maybe starting there, not to rehash too much of that. I mean, I'll make sure to cover in the show notes some of that backdrop for those who are interested. But,

33
00:02:47,130 --> 00:02:48,110
I mean, it was

34
00:02:48,890 --> 00:02:52,590
was, and is, as it lives on as open source, a bold experiment

35
00:02:53,210 --> 00:02:54,670
in, broadly speaking,

36
00:02:55,210 --> 00:02:56,190
user control

37
00:02:56,490 --> 00:02:57,310
and privacy.

38
00:02:58,025 --> 00:02:59,645
And so I'd love to understand,

39
00:03:00,425 --> 00:03:05,405
you know, coming out of that and transitioning or transforming, if you will, into Open Secret,

40
00:03:06,185 --> 00:03:08,285
what did you take from it? What informed

41
00:03:09,225 --> 00:03:11,565
the creation of Open Secret from

42
00:03:12,750 --> 00:03:13,250
the,

43
00:03:14,189 --> 00:03:16,209
foundations of Mutiny Wallet.

44
00:03:16,510 --> 00:03:25,730
Sure. Yeah. So I wasn't there on day one with the company, Mutiny, but I was there as a user on day one. I was following along. I was right with you, you know, downloading the app,

45
00:03:26,189 --> 00:03:31,785
first running in the web as a progressive web app. And then once they launched on the app store, had that as well.

46
00:03:32,085 --> 00:03:35,305
And it was my lightning wallet of choice. I was using it for everything.

47
00:03:35,845 --> 00:03:45,620
And really, I think the goal was to take this really complex thing of Lightning and help people run it easily on their mobile device. But then the biggest unlock was

48
00:03:46,319 --> 00:03:49,300
being able to synchronize that across all your devices, because

49
00:03:49,920 --> 00:03:53,555
lightning has to be always on in order to accept payments. It has to be,

50
00:03:53,855 --> 00:03:57,075
you know, you have to be running your own server, and that's what made it so complicated. So

51
00:03:57,535 --> 00:04:08,400
the guys thought, let's just try and simplify it, and then people can have it, you know, in their browser. And when they wanna accept payments, how do we do that? Well, they innovated some ways to do, like, blind tokens

52
00:04:09,019 --> 00:04:14,400
and use e-cash to accept payments for you when you're offline. There was really cool stuff going on.

53
00:04:14,860 --> 00:04:17,040
But, the difficult part was

54
00:04:17,345 --> 00:04:19,125
it's a difficult user experience.

55
00:04:19,665 --> 00:04:30,965
And one of the biggest drawbacks was having users manage their own private key. And so when I joined Mutiny, the goal was, let's take this to millions of users, hopefully

56
00:04:31,520 --> 00:04:34,820
billions of users, right, at some point in the distant future.

57
00:04:35,520 --> 00:04:52,865
But it was like this non-starter where, if you go to just a regular, average user who downloads apps and uses them, the first interaction is: write down these 12 words, and if you lose them, you're gonna lose all your money. They just get scared of that and go away. So we were trying to figure out how to solve that problem.

58
00:04:53,725 --> 00:04:55,985
And we started looking at secure enclaves.

59
00:04:56,445 --> 00:05:01,320
The Mutiny guys were at the Sovereign Engineering cohort number one in Madeira.

60
00:05:01,940 --> 00:05:06,520
And we were there chatting with some people and learning about secure enclaves in the cloud

61
00:05:06,820 --> 00:05:13,160
and thought this is a really interesting technology, I wonder if we can utilize this. And so as we were looking at Mutiny and scaling it,

62
00:05:13,715 --> 00:05:23,335
then, you know, there were a lot of variables at play there, but it turned out that, as a company, we needed to kinda pivot away from scaling Mutiny

63
00:05:24,034 --> 00:05:24,435
and,

64
00:05:24,914 --> 00:05:37,150
try to do something else. But one of the something elses we wanted to do was to build an e-cash wallet that could scale to millions of users. And again, we kept running into this problem of the UX, the user experience. Like, we have to win on user experience.

65
00:05:37,690 --> 00:05:56,970
And we looked at secure enclaves and said, this is how we do it. We can use Secure Enclaves to manage the private key for the user. They don't have to worry about it when they first get in, but at some point you can prompt them and say, Hey, it would be great if you downloaded your private key, save it as a backup, you know, but you can do that when you actually have developed a relationship with them rather than right off the bat.

66
00:05:57,770 --> 00:06:08,190
And as we looked around, there was nothing out there for using secure enclaves for mobile apps. They are used primarily inside large organizations for securing, like, internal process controls.

67
00:06:08,650 --> 00:06:22,235
And so we thought, okay, if we are going to use secure enclaves, we need to build a platform to do that for ourselves. And it was at that point we realized, well, instead of having just one e cash wallet or one mutiny wallet that uses secure enclaves,

68
00:06:22,570 --> 00:06:31,950
why don't we open this up and hopefully have hundreds of apps that have the same privacy, the same security, that Mutiny Wallet or, you know, such could have? So

69
00:06:32,330 --> 00:06:38,945
that was a long ramble there a bit, but that's kind of how we got to where we are today. Absolutely. And I think maybe for those,

70
00:06:40,065 --> 00:06:41,125
who aren't familiar,

71
00:06:41,505 --> 00:06:42,485
I mean, I think

72
00:06:43,985 --> 00:06:49,845
rightly or wrongly. I think of trusted execution environments. I think of early Intel chip architecture

73
00:06:50,145 --> 00:06:53,440
or, what is it, the ME engine? I forget some of these. But,

74
00:06:55,340 --> 00:06:55,919
give us

75
00:06:56,460 --> 00:07:02,720
will you, Mark, a primer on secure enclaves? What is it in a nutshell? What is it,

76
00:07:03,340 --> 00:07:04,880
as a technology, and

77
00:07:05,354 --> 00:07:09,215
what does it seek to solve, or what does it, in fact, solve? Yeah.

78
00:07:09,995 --> 00:07:14,574
So a lot of phrases you might hear. You mentioned trusted execution environments, TEEs,

79
00:07:15,275 --> 00:07:16,335
secure enclaves,

80
00:07:18,030 --> 00:07:20,690
secure elements. You might hear confidential computing.

81
00:07:21,230 --> 00:07:28,050
A lot of these are all referring to roughly the same thing. And that is, if you look at your phone, right, you have your phone in your pocket.

82
00:07:28,510 --> 00:07:33,170
It has a secure enclave and they've been in there for over a decade. And it is a secure chip

83
00:07:33,495 --> 00:07:37,435
where you can stick code inside of this chip and the code is running,

84
00:07:37,815 --> 00:07:41,995
but then the data that it talks to, you cannot see. So this data is encrypted

85
00:07:42,775 --> 00:07:47,035
before it enters into the chip. And once it goes into the chip, only the chip can decrypt it

86
00:07:47,820 --> 00:07:56,640
and then run the code against that. And then when it spits it out the other end, it is re-encrypted. And so no human has the ability to intercept that data

87
00:07:57,180 --> 00:08:07,585
in a plain text state. So effectively, that's all it is on your device. It is securing your thumbprint, your Face ID, your wallet, your driver's license, that kind of stuff.

88
00:08:08,365 --> 00:08:13,585
But then they've moved into the cloud now. So Intel was one of the first with Intel SGX.

89
00:08:14,280 --> 00:08:21,020
If you do any research, you'll find that there were some vulnerabilities with that. That was like the older generation. Now they've moved on to the Intel TDX.

90
00:08:21,639 --> 00:08:26,540
Amazon has one they call AWS Nitro, and NVIDIA has their own that they're doing on top of GPUs.

91
00:08:27,240 --> 00:08:27,800
And then,

92
00:08:28,634 --> 00:08:31,455
Amazon recently announced their own GPU thing

93
00:08:31,914 --> 00:08:48,400
just, you know, like, very recently. And I'm curious to see what they're gonna do as far as encrypted GPUs for theirs as well. So there's a lot going on here, but effectively you can just think of it as, the analogy I like to use is, if you're in the kitchen and you are baking your favorite cookies, right?

94
00:08:48,860 --> 00:08:52,320
Well, you're gonna get a recipe and I'm gonna give you a recipe to follow.

95
00:08:53,020 --> 00:09:07,014
And that recipe is like the code that you would put inside the secure enclave. So you take my recipe, but I'm not there in the kitchen with you watching you actually bake. And so I don't see which kinds of ingredients you use. I don't see which kind of flour you use, what kind of chocolate chips or,

96
00:09:07,714 --> 00:09:19,390
you know, which kind of milk you use. So those that data that you put in there is private to you. But the recipe that is being used is open. It's out on the internet. Everybody can see which recipe is being followed.

97
00:09:19,930 --> 00:09:26,670
So it's an analogy that seems to kind of resonate with people, that really the secure element, the secure enclave, is

98
00:09:27,065 --> 00:09:34,445
verifiable code that's out in the open for everybody to audit, but the data that passes through cannot be seen by anyone.

99
00:09:35,225 --> 00:09:37,165
And I would imagine you have run through

100
00:09:37,625 --> 00:09:42,524
a few dozen of these metaphors to lock into one because it is complex. It is challenging

101
00:09:42,825 --> 00:09:45,490
Yeah. Yeah. To get that across, but great job. I like it.

102
00:09:47,089 --> 00:09:49,830
And as you say, Mark, we've been carrying these,

103
00:09:50,610 --> 00:09:51,589
pieces of hardware

104
00:09:51,970 --> 00:09:54,630
in our phones for at least a decade.

105
00:09:55,810 --> 00:09:57,990
What have been some of the most notable

106
00:09:59,485 --> 00:09:59,985
breakthroughs?

107
00:10:01,165 --> 00:10:05,185
You know, and I'll throw in that, correct me if I'm wrong, I believe

108
00:10:05,564 --> 00:10:15,260
wallets, the digital wallets that we carry. I'm an iPhone person, so I am not as deeply familiar with Android, but I believe it's safe to say, on both sides, differential

109
00:10:15,880 --> 00:10:18,380
privacy that I know Apple has put at the center,

110
00:10:19,240 --> 00:10:21,080
be it in the way it,

111
00:10:22,280 --> 00:10:36,035
applies AI to photos and the like. Can you paint a picture there of, you know, what has it brought us, let's say, over the last five to ten years? Sure. I think one thing everyone's getting the benefit of is these ephemeral credit card numbers.

112
00:10:36,415 --> 00:10:41,555
So it used to be I don't know how often you've had to change your credit card number because you had some kinda charge that happened,

113
00:10:41,910 --> 00:10:49,610
But, it's happened to me multiple times. And so I'd call up the credit card company and say this charge was not mine and they would have to give me an all new number. Well now,

114
00:10:50,870 --> 00:11:08,965
with the wallet that is on there, secured by the enclave, Apple's able to generate a new credit card number for you per transaction, per vendor. And so I think that's a big benefit. So now if one is compromised, they don't have to change your entire credit card number over. Another very notable case that was in the news, when was this, like 2015,

115
00:11:09,025 --> 00:11:09,845
or 2017,

116
00:11:10,880 --> 00:11:14,580
was the shooter incident in San Bernardino, California,

117
00:11:15,120 --> 00:11:19,700
where there was a person who, you know, unfortunately was engaged in a shooting event

118
00:11:20,240 --> 00:11:20,740
and

119
00:11:21,120 --> 00:11:28,515
the FBI wanted Apple to unlock the phone for them. They couldn't get in, they couldn't get in through the PIN. The secure enclave was securing the identity on the phone

120
00:11:29,135 --> 00:11:30,035
and Apple

121
00:11:30,735 --> 00:11:38,329
stood their ground and said, we're not going to compromise the device. Basically, the FBI wanted them to ship

122
00:11:39,050 --> 00:11:43,149
modified firmware to the phone that would unlock the secure enclave.

123
00:11:43,529 --> 00:11:50,350
And Apple stood their ground and said, we're not gonna unlock the enclave, because that creates a backdoor that's vulnerable for anyone to get into.

124
00:11:50,825 --> 00:11:53,645
And so this was a test of the enclave

125
00:11:54,185 --> 00:11:58,925
showing that even the FBI, like, was having a hard time getting in and finding this information.

126
00:11:59,305 --> 00:12:02,765
I hesitate to use that as an example because it's such a tragic event.

127
00:12:03,550 --> 00:12:07,010
But it was one to show that the enclaves are,

128
00:12:07,630 --> 00:12:10,290
you know, are are really secure in that regard.

129
00:12:10,830 --> 00:12:14,450
So, and I think, you know, we could spend a lot of time talking about it,

130
00:12:14,830 --> 00:12:20,705
whether we look to the UK, with them now trying to yet again force a backdoor on Apple devices, and Apple,

131
00:12:21,565 --> 00:12:22,065
deactivating

132
00:12:22,365 --> 00:12:22,865
their

133
00:12:23,565 --> 00:12:24,305
iCloud,

134
00:12:24,925 --> 00:12:26,705
end-to-end security in response.

135
00:12:27,485 --> 00:12:31,160
It is those extreme edge cases that test either the,

136
00:12:31,699 --> 00:12:37,160
viability of the technology or in this case, perhaps the commitment of a technology company

137
00:12:37,860 --> 00:12:38,839
to

138
00:12:40,100 --> 00:12:47,815
use it for the good of their customers and not create what we know does not work, which is a backdoor for one but not for all.

139
00:12:49,315 --> 00:12:51,415
So, from what we've seen

140
00:12:52,195 --> 00:12:53,815
secure enclaves enable,

141
00:12:54,755 --> 00:12:55,575
what remains?

142
00:12:56,035 --> 00:12:58,135
Or rather, what are some of the more egregious

143
00:12:58,620 --> 00:12:59,520
challenges, problems,

144
00:13:00,380 --> 00:13:02,400
risks to trust and privacy

145
00:13:02,940 --> 00:13:03,840
that we endure

146
00:13:04,620 --> 00:13:06,560
that you foresee secure enclaves

147
00:13:07,020 --> 00:13:09,600
addressing? And certainly, we'll talk about Maple AI.

148
00:13:09,900 --> 00:13:10,400
Sure.

149
00:13:12,675 --> 00:13:14,935
Man, there are so many, so really it's,

150
00:13:15,475 --> 00:13:16,455
it's kind of

151
00:13:16,835 --> 00:13:22,375
this upgrade for security on the internet that I think we are about to embark upon. HTTP,

152
00:13:22,835 --> 00:13:27,255
everybody types in the URL, they go HTTP, you know, colon slash slash www, whatever.

153
00:13:28,060 --> 00:13:30,320
And that was all just out in plain text

154
00:13:30,860 --> 00:13:32,080
and it was insecure.

155
00:13:32,700 --> 00:13:41,760
So then when all of our banks started coming online and we started having usernames and passwords, we realized we needed this to be secure. So we upgraded the entire internet to HTTPS.

156
00:13:42,780 --> 00:13:43,440
It took

157
00:13:43,824 --> 00:13:47,045
over a decade to get everybody on board, but we eventually made

158
00:13:47,425 --> 00:13:49,445
it. Now pretty much everything goes secure.

159
00:13:50,545 --> 00:13:52,165
However, that only secures

160
00:13:52,704 --> 00:13:53,764
data in transit.

161
00:13:54,225 --> 00:13:58,884
Right? So you're using your password. If you're on a public wifi trying to log into your bank,

162
00:13:59,300 --> 00:14:05,079
it is relatively safe to do that. However, once your data gets into the back end of that bank,

163
00:14:05,380 --> 00:14:11,639
or let's just talk about some random app you have on your phone. You're going on a run around your neighborhood and it's tracking

164
00:14:11,940 --> 00:14:13,160
your location constantly,

165
00:14:13,745 --> 00:14:20,964
you don't really know who that developer is. You don't know who they hired at their company. You don't know who they recently fired at the company who might have taken data from their servers.

166
00:14:21,425 --> 00:14:27,125
And so you're trusting them with your daily location, where you go on a run that could leave you vulnerable.

167
00:14:27,810 --> 00:14:33,830
And that data is not encrypted to you. It is open to them. They can share that with advertisers.

168
00:14:34,370 --> 00:14:36,150
They can share that with the government.

169
00:14:36,530 --> 00:14:50,635
Data hackers can get in and steal that. So it is this privacy trade-off that we don't quite think through a lot. We don't realize that we're doing it, but if you look at every app on your phone, you're effectively leaking your data to unknown people

170
00:14:51,095 --> 00:14:51,915
all the time.

171
00:14:52,295 --> 00:14:58,750
And so I think we are going to witness a third upgrade now to the internet, and that is secure enclaves.

172
00:14:59,930 --> 00:15:01,870
I'm calling it HTTP

173
00:15:02,250 --> 00:15:03,709
SE, for secure enclaves,

174
00:15:04,089 --> 00:15:04,910
but effectively,

175
00:15:06,170 --> 00:15:13,305
I think every app needs to start using enclaves to not only protect user privacy, right? Where you can encrypt every single user individually,

176
00:15:13,845 --> 00:15:16,905
but it also protects the developer from liability.

177
00:15:17,525 --> 00:15:18,985
In 2024,

178
00:15:19,205 --> 00:15:22,985
we did some research on data hacks that happen, data breaches.

179
00:15:23,440 --> 00:15:24,260
And the average

180
00:15:24,560 --> 00:15:28,960
cost to a US company when there was a data breach was about $9,000,000.

181
00:15:28,960 --> 00:15:38,100
In 2023, it was lower. It was like in the fours or fives. But that's a significant liability that you're hanging onto as an app provider if you are custodying

182
00:15:38,965 --> 00:15:44,665
custody, that's a hard word, if you're taking custody of users' personally identifiable information.

183
00:15:45,285 --> 00:15:48,665
And so we think that secure enclaves are gonna be a way to

184
00:15:49,045 --> 00:15:51,065
to protect businesses and users.

185
00:15:52,029 --> 00:15:55,570
And I think, I was listening, after a walk earlier today, to a conversation.

186
00:15:56,190 --> 00:16:01,970
I believe it was Kale, and I don't know the other. It was just one of these, you know, you grab the Fountain podcast app

187
00:16:03,070 --> 00:16:07,904
and do a search and hit play. And it was certainly about a conversation

188
00:16:08,205 --> 00:16:10,305
I'm hearing happen more often,

189
00:16:10,765 --> 00:16:11,665
which is

190
00:16:13,085 --> 00:16:16,865
from don't be evil to can't be evil and from

191
00:16:17,325 --> 00:16:18,225
personal data

192
00:16:18,525 --> 00:16:21,105
as an asset, the new oil, as the cliche goes,

193
00:16:21,770 --> 00:16:23,150
to an absolute liability.

194
00:16:23,530 --> 00:16:28,510
So, you know, collect it all to collect less to collect none.

195
00:16:30,970 --> 00:16:35,310
Talk to me a bit about that. I mean, where could it go? Where are you seeing

196
00:16:35,785 --> 00:16:36,765
some of the more

197
00:16:37,145 --> 00:16:38,605
progressive early adopter,

198
00:16:39,225 --> 00:16:40,765
you know, customers, companies,

199
00:16:41,945 --> 00:16:45,805
start to poke around in terms of their ability to collect less or collect none?

200
00:16:46,905 --> 00:16:51,010
I like the I like that phrase change from don't be evil to can't be evil.

201
00:16:51,550 --> 00:17:01,250
00 in the morning from the government,

202
00:17:02,175 --> 00:17:06,995
to unlock someone's account. And I just don't wanna have that liability on me. I've got a family,

203
00:17:07,455 --> 00:17:13,075
I've got a life to live. So I would rather create an awesome user experience for our users without,

204
00:17:14,015 --> 00:17:15,795
knowing what they're doing effectively.

205
00:17:16,415 --> 00:17:18,270
Is it overly hopeful

206
00:17:18,970 --> 00:17:21,630
to think that there is a trend in that direction

207
00:17:22,010 --> 00:17:22,510
from

208
00:17:23,210 --> 00:17:29,870
collect it all to at least collect as little as possible? And that's in America; perhaps in other places it's different.

209
00:17:31,285 --> 00:17:31,945
But I

210
00:17:32,485 --> 00:17:35,225
think I carve health care aside, you know, the HIPAA,

211
00:17:36,165 --> 00:17:37,945
act and some of the related

212
00:17:38,405 --> 00:17:38,905
legislation

213
00:17:40,005 --> 00:17:41,385
at least creates the facade.

214
00:17:42,165 --> 00:17:47,770
I don't know that they're a sterling example of protecting, you know, PII or personal health information.

215
00:17:49,590 --> 00:17:59,595
So, there or elsewhere, are you seeing a trend, or are we very, very early, and certainly Open Secret is looking to push those boundaries? So I think we're early in many regards.

216
00:18:00,135 --> 00:18:04,155
Before we decided to pivot into Open Secret, we had meetings with

217
00:18:04,695 --> 00:18:07,435
a handful of developers, call it 10 different developers,

218
00:18:08,135 --> 00:18:15,280
from just the regular mobile app space. I come from mobile apps. I worked at a few different startups as an early employee.

219
00:18:15,660 --> 00:18:19,760
And so I've been doing mobile development since really before the app store even launched.

220
00:18:20,380 --> 00:18:27,134
And so we had a lot of conversations with people, and it was about fifty-fifty. Half of them were on board with the idea that

221
00:18:27,595 --> 00:18:32,335
they wanted to provide users their privacy and also protect themselves from liability.

222
00:18:32,794 --> 00:18:39,750
The other half told us, Yeah, I'm not super interested in that. Part of it is that they actually like the ability to monetize their user data.

223
00:18:40,150 --> 00:18:43,690
And so that's been a big part of the business model for so long. Advertising,

224
00:18:43,990 --> 00:18:47,690
user data monetization, you talk about data being the new oil.

225
00:18:47,990 --> 00:18:54,330
So I think it's gonna take time to migrate away from that. And there could also be a hybrid approach where

226
00:18:54,904 --> 00:18:59,404
you actually, and this was a couple of the developers we talked to said they would like to take this approach, where

227
00:18:59,784 --> 00:19:03,804
you lock down the most sensitive information, but then you still can share

228
00:19:04,505 --> 00:19:05,005
non

229
00:19:05,544 --> 00:19:18,280
PII within the system and provide trends and analysis and maybe do the advertising model on top of that. But you do create this separation, which is extremely important. And so that's really where we are trying to

230
00:19:18,660 --> 00:19:20,200
start is that

231
00:19:20,755 --> 00:19:22,135
if you're an app developer

232
00:19:22,515 --> 00:19:24,294
starting out building an app today,

233
00:19:24,595 --> 00:19:27,575
most of them don't even think about privacy. They don't

234
00:19:27,875 --> 00:19:38,480
look long term to say, what's gonna happen if I have all this user data? They're just concerned with, can I build this user experience and will I get downloads, right? They just spin up SQL in a Docker container and start grabbing everything.

235
00:19:38,860 --> 00:20:00,715
Yeah. When you look at tools like Replit or Bolt.fun, where people are just typing in prompts now, and they're just vibe coding an app. And then they put it out there and ask people to start using it. That's what they're concerned with right now. And they don't think about the fact that if this thing goes viral and takes off, they might suddenly have a database full of information that they really shouldn't be holding onto.

236
00:20:01,015 --> 00:20:02,554
And so we would like to

237
00:20:03,150 --> 00:20:10,290
create that same experience, but have privacy turned on by default, have encryption turned on by default. So when they YOLO into something

238
00:20:10,830 --> 00:20:15,570
and vibe code, and all the buzzwords you wanna say, right? Like, when they get into it,

239
00:20:16,015 --> 00:20:25,635
they are protected from day one and their users are protected from day one. There's no reason why it shouldn't be that way. The only reason is that hasn't been built yet. And so that's why we're building it. It's it's

240
00:20:26,495 --> 00:20:30,674
the foot guns are there waiting, I think, when you take that approach.

241
00:20:31,350 --> 00:20:34,010
Well, I think that's a great segue, Mark, into

242
00:20:35,030 --> 00:20:35,530
your

243
00:20:36,150 --> 00:20:40,230
sister company, or product at least, Maple AI. You'll correct me on that.

244
00:20:41,030 --> 00:20:43,610
It is no surprise certainly to me that

245
00:20:44,195 --> 00:20:46,535
LLMs and AI are where you began

246
00:20:46,915 --> 00:20:50,055
because of the sheer volume of data, the uptake, the utilization.

247
00:20:51,155 --> 00:20:51,895
I spend

248
00:20:52,995 --> 00:20:54,455
probably an hour a day,

249
00:20:55,555 --> 00:20:56,615
in various

250
00:20:57,280 --> 00:20:58,340
AI products. So

251
00:20:58,800 --> 00:21:00,180
how does Maple AI

252
00:21:00,800 --> 00:21:02,980
make these interactions more trustworthy,

253
00:21:03,680 --> 00:21:05,140
if that is the right word,

254
00:21:05,520 --> 00:21:08,340
than what's out there today, be it Grok and OpenAI

255
00:21:08,640 --> 00:21:11,460
or others? What's what's fundamentally different?

256
00:21:12,095 --> 00:21:14,835
And second, I'd love to know your take on

257
00:21:15,375 --> 00:21:17,075
today, at least, what are we sacrificing

258
00:21:17,535 --> 00:21:22,515
to get those benefits that, in this case, Maple AI and Open Secret deliver?

259
00:21:22,895 --> 00:21:24,595
Mhmm. What's the trade off?

260
00:21:25,679 --> 00:21:40,899
Well, Maple was really born out of those developer conversations we had when we were pitching Open Secret to developers. Pretty much 100% of them, close to 100%, said, hey, if you had some kind of AI app where I could chat privately with AI, I'd be all over that.

261
00:21:41,644 --> 00:21:46,304
And so that's why we decided to build that. It was a way to

262
00:21:47,005 --> 00:21:49,024
figure out if Open Secret is even possible,

263
00:21:49,325 --> 00:21:57,720
because we didn't wanna build a platform for developers before understanding if the UX was even possible to accomplish. We wanted that rock solid UX of

264
00:21:58,500 --> 00:22:07,000
download an app, log in with email, log in with, you know, whatever OAuth, Google, Twitter, or what have you. And now you're in with a private key and end to end encrypted.

265
00:22:07,380 --> 00:22:09,560
And so Maple was our way to prove that.

266
00:22:10,145 --> 00:22:16,245
And, I mean, I'm really happy that we did because it's a phenomenal app. It's really easy to use and

267
00:22:17,024 --> 00:22:26,450
it helps with those trade-offs that you're alluding to here and that you're asking about. And that is, when you go use ChatGPT or Grok, they're amazing tools. They're phenomenal.

268
00:22:26,910 --> 00:22:31,090
They do a lot of great things. They have a lot of great functionality, a lot of functionality we don't have in Maple

269
00:22:31,630 --> 00:22:36,450
yet, but the trade off you're making is you are sacrificing your privacy to the companies.

270
00:22:36,750 --> 00:22:38,450
So as you type in your information,

271
00:22:39,375 --> 00:22:42,915
everything you type is stored permanently on their servers

272
00:22:43,375 --> 00:22:46,515
and then used to train their models in the future.

273
00:22:46,895 --> 00:22:47,955
Now, for some people,

274
00:22:48,575 --> 00:23:05,145
I mean, so many people, I'm sure you've run across this, so many people you talk to, they don't see that as a problem. Right? I don't have anything to hide, why should I care if I'm giving up my privacy? But our privacy is really important for maintaining open societies, for maintaining open markets, for maintaining these freedoms that we enjoy.

275
00:23:05,705 --> 00:23:13,485
Because when we give up our privacy, we allow authority, you know, we allow people to come in and start inserting themselves in the process,

276
00:23:13,785 --> 00:23:19,405
and they can create roadblocks. They can create censorship. They can shut us down. So we need to maintain that barrier.

277
00:23:20,040 --> 00:23:22,700
Now, there's the phrase, the right to selectively

278
00:23:23,080 --> 00:23:23,580
disclose.

279
00:23:23,960 --> 00:23:29,420
Mhmm. I don't know if that was Snowden or, you know, someone long before him, but that's the phrase that sticks with me is

280
00:23:29,720 --> 00:23:32,380
even if you presume you have nothing to hide,

281
00:23:33,000 --> 00:23:33,660
it is

282
00:23:34,200 --> 00:23:37,235
maintaining, defending the right to selective disclosure.

283
00:23:37,775 --> 00:23:39,795
Yeah. It's a it's a great quote.

284
00:23:40,655 --> 00:23:44,435
So that's that's really where we're at with with AI and with Maple.

285
00:23:45,215 --> 00:23:45,715
And

286
00:23:46,255 --> 00:23:47,635
from a practical standpoint,

287
00:23:48,495 --> 00:23:51,155
the more that you tell to these other services,

288
00:23:52,760 --> 00:24:02,380
the more that it can be used in the future, which can be convenient, but then you have to realize you don't know who is at that company, again, right? OpenAI hires

289
00:24:02,919 --> 00:24:04,059
dozens of people

290
00:24:04,534 --> 00:24:09,275
and employees, so many individuals, and you don't know who is accessing your data.

291
00:24:10,135 --> 00:24:12,635
And it might be very personal information

292
00:24:13,575 --> 00:24:20,840
conversations that you might have just between you and a family member, right? You and your significant other, you are now sharing with,

293
00:24:21,140 --> 00:24:22,280
with random people.

294
00:24:22,660 --> 00:24:35,785
And so Maple AI gives you that ability. And, at a high level, what Maple AI does is it creates a private key for you in the secure enclave. It encrypts the chat that you make on your device, and then it sends it to our servers.

295
00:24:36,165 --> 00:24:44,265
And then not only do we have that encryption happening in the enclave, but we also have a second one, which is encryption on the GPU. So we have the open source

296
00:24:44,645 --> 00:24:56,390
large language model booted up on the GPU. And so your request is passed along encrypted to the GPU as well. So nobody is able to see what is being chatted about. The GPU comes up with a response

297
00:24:56,770 --> 00:25:01,590
with the LLM, sends it back, encrypted back to your device. So the whole pipeline is encrypted,

298
00:25:02,065 --> 00:25:03,765
which means that we have no censorship

299
00:25:04,065 --> 00:25:14,625
in there. We don't sanitize anything. We just give you a raw open source model and let you talk to it. Now, it is based on Meta's Llama 3.3

300
00:25:14,625 --> 00:25:17,125
at the moment. We want to bring other ones online

301
00:25:17,450 --> 00:25:22,510
soon. We're working through that. We actually currently have an LLM provider that we use.

302
00:25:22,970 --> 00:25:29,710
But because they use GPUs that are encrypted, we can verify the whole process and know that they are end to end encrypted as well.

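The round trip Mark describes, a key created in the enclave, the chat encrypted on the device, decrypted only inside the attested enclave, and the reply encrypted on the way back, can be sketched in a few lines. This is an illustrative toy, not Maple's actual protocol: the keystream cipher, function names, and messages are all assumptions standing in for whatever AEAD the real client and enclave negotiate.

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy stream cipher: SHA-256 in counter mode. A real client would
    # use a vetted AEAD (e.g. AES-GCM) keyed via the enclave handshake.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes):
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce, ct

def decrypt(key: bytes, nonce: bytes, ct: bytes) -> bytes:
    # XOR with the same keystream undoes the encryption.
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))

# Device side: the key lives with the enclave; the prompt never travels in the clear.
client_key = secrets.token_bytes(32)
nonce, sealed_prompt = encrypt(client_key, b"plan my marathon training")

# Enclave/GPU side: decrypt, run the model, and seal the reply before it leaves.
prompt = decrypt(client_key, nonce, sealed_prompt)
resp_nonce, sealed_reply = encrypt(client_key, b"reply to: " + prompt)

# Back on the device: only the holder of the enclave key can read the reply.
reply = decrypt(client_key, resp_nonce, sealed_reply)
```

The point of the sketch is the shape of the pipeline: ciphertext in transit and at rest on the server, plaintext only at the two endpoints.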
303
00:25:30,250 --> 00:25:32,910
So we're looking at how we spin up our own

304
00:25:33,255 --> 00:25:33,755
encrypted

305
00:25:34,215 --> 00:25:38,555
models. But yeah, the only kind of censorship you might get is

306
00:25:38,935 --> 00:25:39,995
the model itself,

307
00:25:40,295 --> 00:25:45,515
the data that it was trained on, right? The biases, the weights that it has in it that might

308
00:25:45,910 --> 00:25:47,610
tilt one way or another politically.

309
00:25:47,910 --> 00:25:49,610
But we do our best to

310
00:25:49,990 --> 00:25:57,370
find models that are the least politically motivated possible, and then we don't do anything to insert ourselves in the middle of that process.

311
00:25:58,144 --> 00:26:03,445
And, I mean, when I initially signed up for Maple AI, I think I used GitHub

312
00:26:03,985 --> 00:26:10,965
authentication, and it was absolutely seamless. There was absolutely nothing about it that felt different

313
00:26:11,850 --> 00:26:15,309
other than understanding a bit more about what was going on behind the scenes.

314
00:26:16,890 --> 00:26:18,670
Talk more about those

315
00:26:19,210 --> 00:26:23,450
well, perhaps the trade off is in the model and what's required. I mean, I

316
00:26:24,615 --> 00:26:35,355
you know, I'll shamelessly sort of say that Grok is my preferred at this point. It's just mind-blowing for various reasons. So, you know, where is the line, I guess, in terms of what's possible

317
00:26:36,615 --> 00:26:38,795
with Open Secret and secure enclaves?

318
00:26:39,830 --> 00:26:43,450
And where is your place in that ecosystem as you see it today?

319
00:26:44,230 --> 00:26:52,330
So I'm definitely not going to be building my own data center and doing all the cool stuff that Elon's doing. I think it's top notch what they're trying to accomplish.

320
00:26:53,110 --> 00:26:54,090
So for us,

321
00:26:54,695 --> 00:27:00,235
I don't look at Maple AI as, like, the only AI tool that people should have.

322
00:27:00,695 --> 00:27:14,600
There are so many tools out there. And so it should be one tool in your toolbox. You know, if you are trying to fix something in your house, you're gonna go out in your garage, you're gonna open up your drawer, you open up your box, and there are 20 or 30 different tools in there, and you need a different tool for a different situation.

323
00:27:15,220 --> 00:27:17,880
And so I use Grok all the time. I use ChatGPT.

324
00:27:18,340 --> 00:27:19,000
I use

325
00:27:19,380 --> 00:27:21,960
Venice AI. I use a lot of other AI services

326
00:27:22,420 --> 00:27:23,560
for different purposes.

327
00:27:24,025 --> 00:27:37,165
But I'm so glad that I have Maple there as well, because when there are chats, things that just should be more private, then I have that tool to use. And so Maple actually has a very long roadmap. We have a long strategy for Maple.

328
00:27:37,519 --> 00:27:48,980
It started, you know, as a glint in our eye, as a proof of concept, but then very quickly we realized there's so much more here. So we are going to continue adding features into Maple. It is going to become more feature-rich.

329
00:27:49,360 --> 00:27:55,524
It will be someplace that can sit side by side with ChatGPT and some of these other services.

330
00:27:56,705 --> 00:28:04,304
But then Maple is also a service for other app developers. And so it plays into this whole ecosystem of Open Secret. Let's go back to that,

331
00:28:05,299 --> 00:28:09,559
that running app that you might be using. So you're tracking your runs around your neighborhood.

332
00:28:10,100 --> 00:28:10,600
Well,

333
00:28:10,900 --> 00:28:17,640
if that developer is built on Open Secret, they're using it to secure your location data. So now your location data is private,

334
00:28:18,045 --> 00:28:32,305
but what if they want to use AI to help you make decisions about like, how can you have a better running schedule? Maybe you want to run a marathon. It's like, okay, based on how your running history has gone, let me use AI to, to build you out a plan to get there.

335
00:28:32,990 --> 00:28:37,090
The current option right now is for them to grab an API key from someone like ChatGPT.

336
00:28:37,470 --> 00:28:40,130
And now they have to break that privacy wall. They have to,

337
00:28:40,670 --> 00:28:44,850
take your private, really sensitive information and go give it to OpenAI.

338
00:28:45,550 --> 00:29:00,615
And that is not a great compromise to make. And so Maple is actually part of the Open Secret ecosystem. So they can now have a private AI to chat with, to bring within their app, and stay within that privacy ecosystem.

339
00:29:01,120 --> 00:29:04,419
So we see Maple as both a really solid consumer

340
00:29:05,039 --> 00:29:21,235
app and business app. We have a lot of business users on Maple already, so we will continue to iterate there. And then as that gets stronger, it is going to be stronger for the developers who are using private AI, and vice versa. When developers ask us for things that they want to do in the private AI space,

341
00:29:21,615 --> 00:29:27,715
that will go into Maple AI as an end user product, and both will get better at the same time.

342
00:29:28,120 --> 00:29:32,299
Terrific. And that makes a ton of sense. In fact, you touched on one of the questions I wanted to ask, which is,

343
00:29:33,159 --> 00:29:37,100
it seems to me, I'm no expert here, that in enterprise use,

344
00:29:37,720 --> 00:29:49,434
certainly OpenAI and others, and, you know, not to speak ill of any of them, but they've got their team licenses. Every time I sign into one of these products, they wanna push me to upgrade to a team product or team license.

345
00:29:50,054 --> 00:29:51,355
That to me seems

346
00:29:51,815 --> 00:29:52,315
particularly

347
00:29:52,695 --> 00:29:53,915
problematic in terms

348
00:29:54,295 --> 00:29:54,795
of

349
00:29:55,830 --> 00:29:58,170
RAG or otherwise training these models

350
00:29:58,870 --> 00:29:59,370
on

351
00:29:59,910 --> 00:30:02,250
intellectual property that is, you know,

352
00:30:03,030 --> 00:30:09,770
belongs to the company, sensitive documents, whatever that case may be. Talk to me a bit about what that looks like. And

353
00:30:10,215 --> 00:30:14,375
here too. I mean, you know, are are you bullish on on enterprise uptake,

354
00:30:14,855 --> 00:30:15,355
or,

355
00:30:16,294 --> 00:30:22,875
you know, do they suffer some of the same challenges collectively that individuals do, which is, it's not a problem. We'll just give it to OpenAI.

356
00:30:24,020 --> 00:30:33,000
If you go look around on Reddit and some of the forums, you'll find countless people saying that they have no problem sharing their company information with OpenAI,

357
00:30:33,700 --> 00:30:38,015
which is unfortunate. Right? That's just the nature of it. So some people

358
00:30:38,715 --> 00:30:59,670
don't realize; they just think it's not a big deal. I would question whether those were officers of the company hanging out on Reddit saying those things. But, yeah, probably not. Probably someone who, you know, doesn't get perp-walked out the front door if they break a reg. Yeah. That's right. And I can understand where they're coming from. Right? Maybe you're running some small business in a town, and

359
00:31:00,930 --> 00:31:04,070
and if you are thinking that sharing information with ChatGPT,

360
00:31:04,770 --> 00:31:13,215
that the threat is ChatGPT is gonna come take over your business, that's the wrong threat vector to be thinking about. Really, what it is is that

361
00:31:14,155 --> 00:31:17,615
you are training ChatGPT's models to benefit your competitor.

362
00:31:17,915 --> 00:31:34,050
The one in the same town as you, because they're going to go to ChatGPT and start asking questions about, hey, how do I spin up this thing? How do I do a marketing campaign in this market? And if you were using it six months prior or a year prior, your data is now part of that brain

363
00:31:34,445 --> 00:31:38,065
that is going to make recommendations to your competitors. So you're making the system smarter

364
00:31:39,005 --> 00:31:41,745
to not only benefit you, but benefit those around you.

365
00:31:42,045 --> 00:31:44,304
And as a business person, you,

366
00:31:44,845 --> 00:31:51,000
you need to make that call, right? Do you want to improve it for your competitors or not? Whereas with something like Maple,

367
00:31:51,380 --> 00:31:59,240
you can bring your chats in there. We're adding document upload soon, so you'll be able to upload your documents and do kind of your own RAG-type stuff.

368
00:31:59,715 --> 00:32:01,095
And you'll be able to

369
00:32:01,475 --> 00:32:03,575
go over strategy, marketing plans,

370
00:32:04,035 --> 00:32:12,535
customer service ideas, right? Documentation. There'll be all sorts of things that you can do within Maple that won't be shared with anybody. They will stay local to you

371
00:32:12,910 --> 00:32:31,465
and private to you. And that's really a big competitive advantage for business users: they can have this AI that they bring into their strategy room. Right? You bring the smartest people in your room, you close the door, and you have a conversation. Well, now you can bring Maple in there too. And you can verify. You don't have to trust us. You can verify

372
00:32:31,925 --> 00:32:32,425
cryptographically

373
00:32:33,125 --> 00:32:36,265
that your information is not being spilled out anywhere.

374
00:32:37,445 --> 00:32:38,345
And is there

375
00:32:39,205 --> 00:32:45,360
I think I know the answer to this, but I'll ask it. Are there clear trade-offs in the sense that what you just described,

376
00:32:45,900 --> 00:32:51,120
which is fantastic, sounds like I can have my cake and eat it too. I can benefit from models trained on

377
00:32:51,500 --> 00:32:55,600
this massive corpus of data and information across the Internet.

378
00:32:56,015 --> 00:32:57,075
You know, Grok presumably

379
00:32:57,615 --> 00:33:00,515
excels because it has real-time access to X.

380
00:33:01,215 --> 00:33:04,755
So I can take advantage of that baseline,

381
00:33:06,255 --> 00:33:06,755
augment

382
00:33:07,055 --> 00:33:07,555
through

383
00:33:08,095 --> 00:33:09,795
my own documents, my own data.

384
00:33:10,830 --> 00:33:15,169
Are there obvious, or do you foresee places where it breaks and where,

385
00:33:15,870 --> 00:33:19,730
you know, the the YOLO approach of an OpenAI or whoever,

386
00:33:21,630 --> 00:33:30,605
outpaces those who've elected to apply your similar technology. So are there trade-offs that are apparent? The AI

387
00:33:31,305 --> 00:33:33,165
race right now is very exhausting.

388
00:33:33,705 --> 00:33:37,485
If you're online at all, every week there are new advances happening.

389
00:33:37,980 --> 00:33:39,200
So I definitely feel

390
00:33:39,660 --> 00:33:44,960
the anxiety and the struggle of, like, if we're going to have a product like Maple AI,

391
00:33:45,260 --> 00:33:51,520
we have to be able to keep up in some regard with all the advances that Grok and OpenAI are making,

392
00:33:51,835 --> 00:33:52,655
DeepSeek,

393
00:33:53,035 --> 00:33:53,535
Mistral

394
00:33:54,155 --> 00:33:57,855
Everybody's throwing out new stuff constantly and doing great work.

395
00:33:58,955 --> 00:33:59,935
I think what

396
00:34:00,795 --> 00:34:06,495
really what we need to do is we need to provide a base level of features for our users

397
00:34:07,070 --> 00:34:08,770
and do it in a way that is private.

398
00:34:09,230 --> 00:34:13,970
And then we kinda have the benefit of also building Open Secret that has

399
00:34:14,590 --> 00:34:16,050
Maple built into it.

400
00:34:16,350 --> 00:34:18,130
And we would be extremely

401
00:34:18,430 --> 00:34:26,185
fine and happy if a developer came along and built a Maple competitor on top of Open Secret. I would have no problem with that, because

402
00:34:26,724 --> 00:34:47,805
the technology's all there. The tools are all there for them to do that. And if they wanna go after a market, like if they wanna do image generation, for example, we don't do image gen inside of Maple. There are reasons why we haven't done that yet. We might do it in the future, but if somebody wants to come build that on top of Open Secret, by all means, have at it. Make it a phenomenal business for yourself, scale it to millions of users. That'd be

403
00:34:48,105 --> 00:34:49,245
great. And so really,

404
00:34:50,025 --> 00:34:51,565
we're looking at it as,

405
00:34:52,665 --> 00:35:09,270
we want to use Maple ourselves. I use it daily. Like, I have a personal assistant prompt that I use to prioritize my day. I use it for doing strategy, for doing marketing, all sorts of things. And so I wanna continue to use it, and we will continue to keep Maple

406
00:35:10,050 --> 00:35:12,470
usable for business use cases

407
00:35:13,305 --> 00:35:27,405
as long as there isn't another competitor that is as private. If somebody builds something as private as Maple and more full-featured, then at that point we start to look at the cost-benefit of maintaining Maple or, you know, winding it down. But

408
00:35:27,740 --> 00:35:34,240
I don't think the private AI part of Open Secret will ever go away. That's going to be a mainstay of it. And so

409
00:35:34,780 --> 00:35:38,480
there will always be some really strong private AI functionality

410
00:35:39,339 --> 00:35:53,315
in Open Secret. And there will always be end-user apps on Open Secret that provide private AI. I don't know if that answers your question, but that's the thought. It does. Absolutely. And it seems to me, and I have these conversations often, particularly with friends, colleagues who are developers

411
00:35:53,615 --> 00:35:54,115
who

412
00:35:55,135 --> 00:35:58,940
push back, you know, and they've they've heard the examples of

413
00:36:00,200 --> 00:36:09,580
just half-baked code that Cursor or something, you know, not to pick on someone, but has spit out. And they've got to now spend 10x the time trying to debug it or get it back up and running.

414
00:36:09,924 --> 00:36:12,345
But I think it's inevitable. And so in my mind,

415
00:36:13,365 --> 00:36:13,865
LLMs,

416
00:36:14,325 --> 00:36:15,704
AI broadly, agents

417
00:36:16,565 --> 00:36:17,305
are now

418
00:36:17,684 --> 00:36:20,424
a foundational element of any sort of,

419
00:36:20,885 --> 00:36:21,385
system.

420
00:36:22,005 --> 00:36:25,224
And so I absolutely hear you there. What comes next?

421
00:36:25,550 --> 00:36:29,650
You know, if you're willing to give us a glimpse of what's next in the roadmap, or,

422
00:36:30,349 --> 00:36:33,170
you know, if you had your way,

423
00:36:33,950 --> 00:36:38,530
who would come knocking on Open Secret's door, and what would they be looking to build?

424
00:36:38,910 --> 00:36:40,930
For me, the lowest-hanging-fruit

425
00:36:41,535 --> 00:36:42,755
app ideas out there

426
00:36:43,295 --> 00:36:45,795
are note taking apps, journal apps,

427
00:36:46,255 --> 00:36:52,755
people who want to write down their thoughts, right? Their daily thoughts or their internal struggles.

428
00:36:53,214 --> 00:36:58,690
These are things that are very personal and private. I think those are going to be the first apps that could be written.

429
00:36:59,470 --> 00:37:02,050
And then you plug in. Yeah. Obsidian.

430
00:37:03,470 --> 00:37:06,109
One of the apps we talked to was,

431
00:37:06,510 --> 00:37:13,595
Day One. Day One, right? I used to be a heavy user. And when they got dicey on their data policy, I

432
00:37:14,215 --> 00:37:14,715
stopped.

433
00:37:15,015 --> 00:37:20,875
Yeah. Right? And they did build private end to end encryption into it. Right? It took them a long time.

434
00:37:21,495 --> 00:37:32,390
I was good friends with those guys. I was actually at Apple's developer conference when they won their first Apple Design Award. I got to hold it, take a picture. It was fun, right? So they're great guys.

435
00:37:32,710 --> 00:37:39,505
But we chatted with their main developer. He's not with them anymore. And he said if open secret was around when they were building it, it would have taken,

436
00:37:39,905 --> 00:37:54,245
you know, like, a matter of weeks, maybe a couple months total, to implement it. Whereas it took them a year and a half, because they had to build it for iOS, they had to build it for Android and for web, and they were figuring it out as they went. Whereas Open Secret is just build it once, run it everywhere,

437
00:37:54,600 --> 00:37:56,300
and you get really strong encryption.

438
00:37:57,400 --> 00:38:01,420
So I think those are the first apps and obviously AI is a big part of it.

439
00:38:02,440 --> 00:38:04,300
So there's gonna be a proliferation

440
00:38:04,600 --> 00:38:07,340
of AI apps. And then you mentioned AI agents.

441
00:38:07,645 --> 00:38:09,425
That's really where we're headed.

442
00:38:09,725 --> 00:38:15,745
For those listening, if you've heard of AI agents but you're still confused about what the heck they are, they're effectively

443
00:38:16,445 --> 00:38:27,760
AI bots. They are specialized AI that know how to do a task. So if you are hungry for dinner that night, right, you would go into your main AI, call it ChatGPT or Maple, and you go and say, I'm hungry.

444
00:38:28,540 --> 00:38:29,680
Get me some food.

445
00:38:30,460 --> 00:38:40,095
ChatGPT doesn't know how to, like, fulfill your wishes there. All it can do is say, here, let me give you the steps on how to figure out what you wanna eat, and then here are the steps on how to order food on DoorDash.

446
00:38:40,395 --> 00:38:41,615
And that's where it stops.

447
00:38:41,915 --> 00:38:43,775
But with agents, ChatGPT

448
00:38:44,235 --> 00:38:45,855
or Maple or some other,

449
00:38:46,155 --> 00:38:48,095
AI will be able to say, okay,

450
00:38:48,609 --> 00:39:11,194
here's what you want to eat. And now let me spin up an agent that knows how to go find restaurants in your area. And then it's going to spin up another agent that knows how to order food on DoorDash. And so it will go spin those up, but those agents are not free, right? They cost compute credits, power, electricity, all that stuff. And so we are going to need to have a way to pay all of them. And that's where ecash comes in.

451
00:39:11,815 --> 00:39:36,494
You know, you have Lightning, you have Bitcoin, you have stablecoins, you have credit cards that are trying to do this. I don't think credit cards are gonna be able to shoehorn themselves into this new paradigm of AIs paying each other, because we're talking about three and a quarter points per transaction plus 30 cents. Yeah. Yeah. Exactly. And we're talking about, like, fractions of a penny per transaction, maybe a few cents, really. A few sats is the term that we should be throwing around. And so

452
00:39:36,795 --> 00:39:37,935
what we need is

453
00:39:38,315 --> 00:39:47,670
some kind of wallet that's attached to your AI that can then pay other AI agents, and you just authorize it. You say, okay, here's your budget. You've got $50 in here,

454
00:39:48,050 --> 00:39:52,150
and now just use it up as you need to, and then I'll replenish you in the future.

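The budgeted agent wallet he sketches, an allowance the agent can draw on for small payments until you replenish it, can be expressed as a tiny guard object. This is an illustrative sketch only; the class name, the choice of sats as the unit, and the amounts are assumptions, not any real wallet or Nostr Wallet Connect API.

```python
class AgentWallet:
    """Toy spending guard: the agent draws on a budget you authorized."""

    def __init__(self, budget_sats: int):
        self.budget = budget_sats

    def pay(self, amount_sats: int) -> bool:
        # Refuse any payment that would exceed the remaining allowance;
        # the agent would then have to come back to the user.
        if amount_sats > self.budget:
            return False
        self.budget -= amount_sats
        return True

    def replenish(self, amount_sats: int) -> None:
        # The user tops the allowance back up later.
        self.budget += amount_sats

wallet = AgentWallet(budget_sats=50_000)
assert wallet.pay(3)           # a few sats to the restaurant-finder agent
assert wallet.pay(5)           # a few more to the food-ordering agent
assert not wallet.pay(10**6)   # over budget: payment refused
```

The design point is that authorization lives with the user, not the agent: the agent can only spend what the wallet's budget allows.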
455
00:39:52,450 --> 00:39:52,950
And

456
00:39:53,890 --> 00:39:58,470
I'm hopeful and bullish that ecash is gonna be that solution. I think it was built

457
00:39:58,795 --> 00:39:59,615
for this,

458
00:39:59,995 --> 00:40:05,215
and it satisfies a real need. And so Let me, if I may, Mark, let me pause you there. I mean, I think

459
00:40:05,675 --> 00:40:12,415
on the topic of trust, and I'm dating myself, but I think back to the nineties, the beginning of my career at Microsoft, and we were

460
00:40:12,800 --> 00:40:13,859
we were professing

461
00:40:14,160 --> 00:40:23,060
intelligent agents. You know, you stick around long enough, these things always come full circle. But now as then, I think it comes down to what will it take for me to trust

462
00:40:24,160 --> 00:40:27,119
to let this thing loose? You know, even

463
00:40:27,705 --> 00:40:29,405
I've got a computer science degree,

464
00:40:29,945 --> 00:40:34,685
been in tech for my entire career. I pretty deeply understand the technology. I still don't

465
00:40:35,145 --> 00:40:38,605
know that I know what it will take for me. What do I

466
00:40:39,065 --> 00:40:40,685
need to qualify and quantify

467
00:40:41,640 --> 00:40:42,380
giving this

468
00:40:42,760 --> 00:40:45,020
thing a wallet or budget, which, you know,

469
00:40:45,480 --> 00:40:51,020
on Nostr, Nostr Wallet Connect, I can set budgets. All these things are great. I understand that. But, I mean,

470
00:40:51,799 --> 00:40:54,875
that's sort of the crux of it, I suppose, is how

471
00:40:55,335 --> 00:40:57,595
far away are we, do you think, from this

472
00:40:57,975 --> 00:40:59,435
being in the wild?

473
00:40:59,895 --> 00:41:02,315
And what is your take on how

474
00:41:02,855 --> 00:41:05,275
secure enclaves, Open Secret specifically,

475
00:41:06,869 --> 00:41:12,730
addresses that gap, that trust gap of, okay. Yes. I'm gonna give it a leash and let it run. Yeah.

476
00:41:13,349 --> 00:41:17,930
In my mind, as I'm listening to you, I think you are describing two different

477
00:41:18,630 --> 00:41:22,935
definitions of trust there. So the first one is, do you trust it to be effective?

478
00:41:23,395 --> 00:41:27,655
Right? Do you trust to let these AIs go and do the thing that you want?

479
00:41:28,194 --> 00:41:37,095
And so if you want to eat food for dinner that night, do you trust that it's actually gonna bring you food that you like, or is it gonna bring you some styrofoam that you don't wanna eat at all?

480
00:41:37,930 --> 00:41:44,670
And that is something that we can just play out through the user experience and trial and error, and make sure that we trust that that's gonna happen.

481
00:41:45,290 --> 00:41:47,710
Then there's the deeper version of trust of,

482
00:41:48,010 --> 00:41:57,355
do I actually think that it's going to be benevolent and it's gonna act on my behalf, that it's not gonna have bugs that will suddenly spend my entire wallet in one swoop.

483
00:41:58,055 --> 00:42:04,875
And I think hallucination in the age of LLMs has, perhaps for me at least, reinjected

484
00:42:05,175 --> 00:42:07,275
that concern. You know, you see how

485
00:42:07,640 --> 00:42:09,740
far off the track these things can go.

486
00:42:10,120 --> 00:42:10,940
Yeah, definitely.

487
00:42:11,880 --> 00:42:13,100
And I think agents

488
00:42:14,440 --> 00:42:19,980
hallucination is definitely a big part of chat, but I think agents have an opportunity to be more transactional,

489
00:42:20,744 --> 00:42:27,724
where they can write unit tests and they can prove things and say, if the input is this, make sure the output is this

490
00:42:28,184 --> 00:42:34,924
and don't stray from that. And so I think we might be able to have more provable trust in that kind of design.

491
00:42:35,859 --> 00:42:36,359
But,

492
00:42:37,220 --> 00:42:41,240
you know, you mentioned working at Microsoft. I used to work at Apple. When you're in these big corporations,

493
00:42:41,859 --> 00:42:42,740
you have these,

494
00:42:43,700 --> 00:42:44,200
thresholds

495
00:42:44,500 --> 00:42:55,045
of payments that can be approved, right? So all these different managers you have above you, they each have their own budget level, right? Discretionary spending. They can approve up to $250. The next person can approve up to $250,000

496
00:42:55,045 --> 00:42:55,704
or something.

497
00:42:56,165 --> 00:43:03,464
I think that we probably end up in some kind of space like that. And you're seeing that with Nostr Wallet Connect, being able to set a budget for different apps.

498
00:43:04,200 --> 00:43:19,635
So maybe you say, right, like, I'm okay with you making decisions up to this point on your own. If it needs to go above that, come back to me for approval, whether that's a push notification that shows up on your phone. I mean, how dope would that be if you're, like, sitting in a movie or sitting somewhere at a ballgame

499
00:43:20,015 --> 00:43:30,035
and your phone says, oh, hey, your AI agent wants to do this really important thing for you. Do you approve it? And you're like, oh, yeah, for sure, go for it. Absolutely. Maybe it's negotiating

500
00:43:30,575 --> 00:43:50,994
your car insurance for you. And it's like, hey, I just found that I can save you $500 a year by switching to this new one. Do you authorize me to do it? Yes. Yes, I do. Yes, I would say yes to that. Maybe it gives me a little quick report right there on the screen, and I can say yes to that. So I think we need to figure out how we set up these guardrails so that we do feel comfortable,

501
00:43:51,694 --> 00:43:57,555
you know, at that level of trust. And then with open secret, what we're doing is we are putting the open source code out there.

502
00:43:57,855 --> 00:44:01,954
This is the code that runs inside the enclaves, and then we have reproducible builds.

503
00:44:02,380 --> 00:44:10,720
So any user that wants to can go out, download the code, run the build, and they get a checksum. They get this number that says, this is, you know, the fingerprint

504
00:44:11,099 --> 00:44:13,920
of the build. And then when you go log into Maple,

505
00:44:14,315 --> 00:44:17,615
you actually see the fingerprint that the server is reporting. It's called attestation.

506
00:44:18,155 --> 00:44:32,090
And you as a user can compare those two fingerprints together and verify that they are correct. Now, the software does that for you. It gives you a green verified badge, a little check mark, but users are able to do that. And to me, that's kind of the final piece to this trust that you're asking about.

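The verification loop he describes, rebuild the open source enclave code, hash the artifact, and compare it against the fingerprint in the server's attestation, reduces to a digest comparison. This is a minimal sketch under stated assumptions: real attestation documents are signed structures with platform measurements, and the function names and bare SHA-256 here are illustrative, not Open Secret's actual scheme.

```python
import hashlib
import hmac

def build_fingerprint(artifact: bytes) -> str:
    # Reproducible build: anyone compiling the published source should get
    # a byte-identical artifact, and therefore the same digest.
    return hashlib.sha256(artifact).hexdigest()

def attestation_verified(local_fp: str, attested_fp: str) -> bool:
    # Constant-time comparison of the locally computed fingerprint with
    # the measurement the server's attestation reports.
    return hmac.compare_digest(local_fp, attested_fp)

# User side: rebuild the open source enclave code and hash the output.
enclave_image = b"bit-for-bit reproducible enclave build"
local_fp = build_fingerprint(enclave_image)

# Server side: the enclave attests to the measurement of what it booted.
attested_fp = build_fingerprint(enclave_image)

assert attestation_verified(local_fp, attested_fp)                        # green badge
assert not attestation_verified(local_fp, build_fingerprint(b"tampered"))
```

A mismatch means the server is not running the code the user built, which is exactly the case the green check mark is there to rule out.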
507
00:44:32,950 --> 00:44:38,010
We need to see the code. We need to know what's running on these servers, running in the secure enclaves.

508
00:44:38,630 --> 00:44:43,825
and with companies like Apple, with their private cloud compute and their Apple intelligence,

509
00:44:44,365 --> 00:45:05,610
they're using secure enclaves. However, Apple doesn't wanna open source all of that. And so they're going to third-party auditors, and we have to trust the auditors are looking at the code and verifying it. It's the same thing with Signal. Signal has third-party auditors that come look at their code. And so if you're gonna use Signal to chat, you are doing some trust, right? Because you download the app from the App Store.

510
00:45:05,910 --> 00:45:10,730
You can't verify, even though Signal publishes their code open source,

511
00:45:11,305 --> 00:45:21,805
that the build you're downloading is the exact same code that's out in the open source. You can hope, but now you're trusting the third-party auditors that they are saying, yeah, we checked out this build, it is the same code.

512
00:45:22,265 --> 00:45:31,420
So there's a level of trust there. And with Open Secret, we're trying to get as close to the user as possible with that trust. We wanna bring the code and the user as close together.

513
00:45:32,280 --> 00:45:32,780
So

514
00:45:33,320 --> 00:45:42,315
that's really what we're going after. And obviously companies are gonna do it different ways, and you have to decide as a user for yourself, what is your

515
00:45:42,694 --> 00:45:43,835
what is your risk,

516
00:45:44,535 --> 00:45:52,954
model? How much can you trust someone, and how much do you need to have more eyes on the code yourself? And that brings to mind, I mean, I wonder,

517
00:45:53,360 --> 00:45:56,260
Mark, if you and the team didn't have a lot of interesting conversations

518
00:45:57,600 --> 00:45:59,460
about business model and monetization

519
00:46:00,160 --> 00:46:00,660
versus

520
00:46:01,520 --> 00:46:12,704
don't trust, verify. Was that non-trivial, or was it clear from the beginning that you were gonna push it as close to the user and have as much visibility and transparency as you've laid out?

521
00:46:13,005 --> 00:46:14,944
Because that does seem to me to be

522
00:46:15,484 --> 00:46:16,464
at least the

523
00:46:17,645 --> 00:46:21,265
the reason, if not excuse, that a lot of established companies hold up.

524
00:46:21,670 --> 00:46:22,170
Mhmm.

525
00:46:23,270 --> 00:46:25,530
This definitely was a big part of the conversation.

526
00:46:26,310 --> 00:46:30,010
We very strongly feel like we need to have our code open source.

527
00:46:30,310 --> 00:46:33,930
However, we're trying to build a business, especially a SaaS business.

528
00:46:34,550 --> 00:46:35,050
And

529
00:46:36,155 --> 00:46:41,775
it's a threat, right? Your software can be copied, and someone can spin up a competitor to you.

530
00:46:42,075 --> 00:46:46,255
So we looked at the licenses out there. This was controversial

531
00:46:46,635 --> 00:46:54,860
within our company, in our discussions of which license to go with. We eventually went with the GPL for the secure enclave code.

532
00:46:55,240 --> 00:47:06,275
Maple itself, the client, that is not; that is MIT. People are welcome to fork that and do their own commercial products if they want to, and they don't have to contribute back. But we did pick the GPL for a few reasons.

533
00:47:07,535 --> 00:47:10,115
One of them was we are trying to build a business.

534
00:47:10,415 --> 00:47:16,755
And so if people are going to copy our code and are going to improve it, we would love for those improvements to come back to us

535
00:47:17,130 --> 00:47:21,390
and for there not to be just like a straight competitor right off the bat when we're so small.

536
00:47:22,010 --> 00:47:24,670
Another reason is we feel very strongly

537
00:47:25,210 --> 00:47:25,710
that

538
00:47:26,010 --> 00:47:31,069
secure enclaves, to be effective, need to have their code open source and reproducible.

539
00:47:31,885 --> 00:47:33,985
And so if someone's going to take

540
00:47:34,525 --> 00:47:37,105
the Open Secret enclave code,

541
00:47:37,485 --> 00:47:45,745
we want them to publish it open source as well for the protection of the users. If our code is gonna be securing people's stuff, even if it's on another person's device,

542
00:47:46,119 --> 00:47:50,060
another person's server, we want it to be out in the open so that users can be protected.

543
00:47:50,760 --> 00:47:55,260
So that's that's really the approach that we took. It was not an easy decision to make

544
00:47:56,680 --> 00:48:04,315
because we wanted to provide the strongest privacy possible, but that's eventually where we landed. And I think that takes us to

545
00:48:05,015 --> 00:48:09,835
the last question I'd love to discuss or raise with you, Mark, which is

546
00:48:11,255 --> 00:48:15,275
speaking to business leaders, be they heads of product, heads of engineering,

547
00:48:15,960 --> 00:48:16,460
CXOs,

548
00:48:17,880 --> 00:48:19,020
in a future where

549
00:48:20,760 --> 00:48:25,420
AI, certainly LLMs, I think we can foresee becoming ubiquitous,

550
00:48:27,000 --> 00:48:29,420
other compute intensive and increasingly

551
00:48:29,720 --> 00:48:32,045
data hungry applications become ubiquitous.

552
00:48:33,065 --> 00:48:33,964
What changes

553
00:48:34,505 --> 00:48:36,444
do those incumbent players

554
00:48:37,224 --> 00:48:38,045
need to start

555
00:48:38,505 --> 00:48:39,005
deliberating

556
00:48:39,704 --> 00:48:40,925
and deciding on

557
00:48:41,545 --> 00:48:43,565
in order to to keep pace

558
00:48:44,010 --> 00:48:44,750
and presumably

559
00:48:45,130 --> 00:48:48,030
to make the decisions that that I think you and I would agree

560
00:48:48,330 --> 00:48:49,710
are best for the user

561
00:48:50,170 --> 00:48:50,670
and

562
00:48:50,970 --> 00:48:54,590
provide the trust that that they'll demand. Trust and privacy

563
00:48:54,890 --> 00:49:00,265
have been a nice to have for too long. For decades, they've been a nice to have

564
00:49:00,725 --> 00:49:01,225
because

565
00:49:01,525 --> 00:49:02,745
in order to build

566
00:49:03,525 --> 00:49:05,945
effective cloud environments and to monetize

567
00:49:06,245 --> 00:49:07,305
a SaaS product,

568
00:49:08,005 --> 00:49:12,185
you almost needed to throw privacy out the window to really make it effective.

569
00:49:13,270 --> 00:49:16,730
The technology has finally caught up with things like secure enclaves

570
00:49:17,350 --> 00:49:24,170
and with devices having so much power on them, you can run an LLM on a mobile phone now pretty effectively.

571
00:49:24,550 --> 00:49:28,570
It won't get you everything. The models that we run in the cloud are still way more powerful,

572
00:49:28,885 --> 00:49:29,705
but you can

573
00:49:30,005 --> 00:49:38,905
get enough of what you need on there sometimes. So the computing has caught up to this, and now we are going to see users caring more about privacy.

574
00:49:39,765 --> 00:49:41,305
You look at the political landscape

575
00:49:41,660 --> 00:49:47,760
and just what has happened in the last few months, more people are talking about privacy and and trust and what does that mean for themselves.

576
00:49:48,140 --> 00:49:57,244
I'm not talking just in America. I'm talking, you know, all over the world. Europe is a big example of that, where privacy is really hanging by a thread. It's under attack continuously.

577
00:49:57,785 --> 00:49:58,525
Yeah, definitely.

578
00:49:58,905 --> 00:50:05,165
So I think if you're at a company, if you have an app that's being used by a lot of people,

579
00:50:05,704 --> 00:50:14,680
you need to be watching for a competitor to come out that has the trust and the privacy angle that you don't have. And how can you adapt to that?

580
00:50:15,780 --> 00:50:21,080
With Open Secret, we don't require people to throw away all their code. We can plug into an existing tech stack.

581
00:50:21,380 --> 00:50:25,480
So that was really important to us, that we don't require people to start from scratch.

582
00:50:25,925 --> 00:50:31,625
So you could come talk to us, and we can help you start to, you know, secure just pieces of your data.

583
00:50:32,165 --> 00:50:36,825
But you don't have to, right? You can download our code and do your own secure enclaves. But I think that

584
00:50:37,125 --> 00:50:37,625
every,

585
00:50:38,210 --> 00:50:40,390
every company out there that has a successful app

586
00:50:40,690 --> 00:50:43,190
should start looking at secure enclaves and figuring out

587
00:50:43,490 --> 00:50:54,795
what user data needs to be public and shared within your database, and what does not. Because if it does not need to be shared for mission-critical things to function,

588
00:50:55,095 --> 00:51:05,595
you should be locking that down. You should be privatizing it, securing it. It's not only good for your users, it's good for your business's liability, and it's good as a competitive advantage against those other apps that are gonna come along

589
00:51:05,990 --> 00:51:09,450
that will be trying to be more secure and be more private.

590
00:51:10,309 --> 00:51:14,089
I wonder, Mark, if you have a perspective on what this does in terms of regulatory

591
00:51:15,270 --> 00:51:17,690
advantage or or to just minimize

592
00:51:18,069 --> 00:51:20,170
the attack surface, if if you will,

593
00:51:20,505 --> 00:51:24,924
that as companies and we see this with Apple, we see this with Google, we see this with all the large players,

594
00:51:25,545 --> 00:51:29,644
they're running this gauntlet of overlapping and competing regulatory regimes.

595
00:51:30,105 --> 00:51:31,964
So do you have a sense

596
00:51:33,400 --> 00:51:37,900
of what adopting secure enclaves, adopting Open Secret would do to one's

597
00:51:38,600 --> 00:51:41,740
exposure, let's say, to a lot of these,

598
00:51:43,400 --> 00:51:46,380
messy, sometimes overlapping and competing privacy

599
00:51:46,760 --> 00:51:48,140
and regulatory regimes?

600
00:51:48,974 --> 00:51:52,355
Yeah. That is a heavy question right there. First off,

601
00:51:53,535 --> 00:52:00,115
my portion of this podcast is going over Obscura VPN, so I'm gonna dox myself a little bit there. It's a great product. I love it. Fantastic.

602
00:52:00,575 --> 00:52:04,230
And second, you're not a lawyer. Yeah, exactly. I'm not a lawyer. So there's the other

603
00:52:05,830 --> 00:52:07,770
thing. I think that secure enclaves,

604
00:52:08,470 --> 00:52:10,890
they help you when it comes to regulation, because

605
00:52:11,510 --> 00:52:14,410
you can effectively be hands off, you know,

606
00:52:15,190 --> 00:52:23,145
in the Bitcoin world and the digital currency world, we talk a lot about self custody, right? You have your own keys, you secure your own coins.

607
00:52:23,525 --> 00:52:24,825
And so a business,

608
00:52:25,125 --> 00:52:34,230
if the users have their own keys, it doesn't have custody of their coins, of their money. And so there are all sorts of regulations and liabilities that don't apply to it.

609
00:52:34,790 --> 00:52:37,930
With Open Secret, we are bringing self custody to your data.

610
00:52:38,550 --> 00:52:39,210
And so

611
00:52:39,510 --> 00:52:40,890
all the same things apply,

612
00:52:41,190 --> 00:52:44,890
where the user is in control of their data. You as a provider

613
00:52:45,510 --> 00:52:58,454
don't actually have access. You don't have any control over the data they generate, what they interact with, and what they do with that data. So all sorts of liability no longer applies to you in that regard, and regulations don't apply to you. Now on the flip side,
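Self-custody of data, as described here, boils down to the client holding the key and the server storing only ciphertext. A toy sketch under those assumptions (the HMAC-counter keystream below is a stand-in for a real AEAD cipher such as AES-GCM, and is not for production use):

```python
import hashlib
import hmac
import secrets

def derive_key(passphrase: bytes, salt: bytes) -> bytes:
    # The key is derived on the client; the passphrase never leaves the device.
    return hashlib.pbkdf2_hmac("sha256", passphrase, salt, 100_000)

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy HMAC-counter keystream -- a stand-in for a real cipher.
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(8, "big"), hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> tuple[bytes, bytes]:
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, _keystream(key, nonce, len(plaintext))))
    return nonce, ct  # only this opaque pair is ever sent to the server

def decrypt(key: bytes, nonce: bytes, ciphertext: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(ciphertext, _keystream(key, nonce, len(ciphertext))))

key = derive_key(b"correct horse battery", salt=b"\x00" * 16)
nonce, ct = encrypt(key, b"private chat message")
assert ct != b"private chat message"                      # the server sees only ciphertext
assert decrypt(key, nonce, ct) == b"private chat message" # the client can always recover it
```

Because the provider never holds the key, it cannot read, alter, or hand over the plaintext, which is what moves the data out of the provider's custody in the first place.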

614
00:52:59,319 --> 00:53:05,660
I do wonder if companies that are earliest to embrace secure enclaves might be some kind of a target

615
00:53:06,200 --> 00:53:11,020
for coercion, you know, like Apple experienced with the enclaves on the phone.

616
00:53:11,655 --> 00:53:15,115
Maybe I do get a knock on my door that says there is this one specific user.

617
00:53:15,415 --> 00:53:16,875
We need you to push an update

618
00:53:17,175 --> 00:53:20,875
to this product, knowing that it will actually break the product.

619
00:53:21,495 --> 00:53:29,329
So if somebody's on Maple and we push out a nefarious build that doesn't match the open source version of Maple, your client will not connect. It'll immediately

620
00:53:30,190 --> 00:53:33,650
sever the connection and won't do anything. So you won't be compromised.
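The fail-closed behavior described here, the client severing the connection when the server's attested build doesn't match the published one, can be sketched as (names are illustrative, not Maple's actual client code):

```python
import hmac

def connect_if_verified(expected_fingerprint: str, attested_fingerprint: str) -> str:
    # Fail closed: if the server's attested build does not match the
    # fingerprint of the published open-source build, refuse to send anything.
    if not hmac.compare_digest(expected_fingerprint, attested_fingerprint):
        raise ConnectionRefusedError("attestation mismatch: severing connection")
    return "connected"

print(connect_if_verified("abc123", "abc123"))  # connected
try:
    connect_if_verified("abc123", "evil999")    # a nefarious build is rejected
except ConnectionRefusedError as err:
    print(err)
```

The key design choice is that the check happens before any user data is transmitted, so a mismatched build can never see plaintext, only a refused handshake.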

621
00:53:33,950 --> 00:53:41,230
However, you know, we could push a backdoor for a government to get at one specific user if that user came and logged in. Now,

622
00:53:41,630 --> 00:53:43,089
but again,

623
00:53:43,415 --> 00:53:45,675
it requires that user to

624
00:53:46,135 --> 00:53:51,115
to go along with that code. That's a very detailed question and a very detailed answer, right? I don't wanna dig into all the technicalities.

625
00:53:51,655 --> 00:53:56,234
But we have set it up to make that situation nearly impossible.

626
00:53:56,680 --> 00:54:06,380
But I do wonder, you know, there is this threat that somebody could come put pressure on us. And so we build all of this cryptography, all of this encryption, in there so that

627
00:54:06,760 --> 00:54:09,180
that scenario cannot play out. It's basically,

628
00:54:10,395 --> 00:54:14,815
I don't wanna say impossible, but it's as close to impossible as we can try and make it,

629
00:54:15,275 --> 00:54:16,815
so that we can protect ourselves

630
00:54:17,355 --> 00:54:20,255
from our users, and our users can be protected from us.

631
00:54:20,715 --> 00:54:26,040
It's an excellent point. You know, I came at this from the standpoint, obviously, of what it would do, perhaps,

632
00:54:28,120 --> 00:54:28,780
to demonstrably,

633
00:54:30,120 --> 00:54:32,700
prove that, hey, I can't help you. But then,

634
00:54:33,720 --> 00:54:42,015
little if anything stops these government agencies from believing that you can build a backdoor just for them. Right? So Right. It's a great point.

635
00:54:42,974 --> 00:55:00,460
Well, Mark, I've been looking forward to this conversation and have just been delighted. Really appreciate it. For those, and I'll certainly get this into the show notes, who want to follow you, your work, Open Secret, Maple AI, where should they where should they look? Where should they follow you? Great. Likewise. I've really enjoyed this conversation. Was looking forward to it.

636
00:55:00,920 --> 00:55:02,780
I'm on Twitter and on Nostr.

637
00:55:03,160 --> 00:55:11,260
My username on X is marks underscore f t w, for the win. And then on Nostr, I'm just marks@primal.net.

638
00:55:11,535 --> 00:55:19,155
You can follow me there. And then we have Maple on all the places as well. On X, it's trymaple.ai.

639
00:55:19,454 --> 00:55:22,355
On Nostr, it's just maple.ai@primal.net.

640
00:55:22,415 --> 00:55:31,920
And then Open Secret is there as well. Open Secret Cloud is where you'll find our handle. And then our website is available as well, opensecret.cloud.

641
00:55:32,220 --> 00:55:40,635
And then I recommend any developer that wants to try out secure enclaves or wants to try out Open Secret: first go to Maple. Go to trymaple.ai

642
00:55:40,775 --> 00:55:49,435
and see what it's like for an end user. And that is the light bulb moment. Because when you think about building encryption or doing any of these private key things,

643
00:55:49,930 --> 00:55:55,870
it's very daunting, and you think it's gonna be convoluted and difficult to use. So just go to trymaple.ai,

644
00:55:56,250 --> 00:55:57,070
log in

645
00:55:57,690 --> 00:56:10,625
and you won't notice that you're using end-to-end encryption. It's that easy. And so that's where I would really point people: just go create a free account, give it a try, and then you'll taste what it's like to have an app that is built on Open Secret.

646
00:56:11,005 --> 00:56:17,984
Terrific. Yep. I use it. Great product, smooth as can be, and I'm really excited to see iterations of it and what others build.

647
00:56:18,285 --> 00:56:22,716
Thanks, Mark. Appreciate the time. Talk to you soon. Okay. We'll talk to you. Bye.
