1
00:00:00,000 --> 00:00:06,720
KYC is an interesting topic because it's in that area where data is so valuable.

2
00:00:07,760 --> 00:00:13,120
And yet we are so resistant to moving to some of the next kinds of technologies.

3
00:00:13,520 --> 00:00:16,780
And this is where we get into the difference between, we'll talk about biometrics.

4
00:00:17,220 --> 00:00:19,980
The difference between verification and identification.

5
00:00:20,420 --> 00:00:22,460
One's very good, less intrusive.

6
00:00:23,480 --> 00:00:26,280
Great when it comes to preventing fraud.

7
00:00:26,740 --> 00:00:28,420
Identification, not so much.

8
00:00:29,120 --> 00:00:31,120
Very intrusive, not as accurate.

9
00:00:31,720 --> 00:00:34,100
And even if it were, not very transparent.

10
00:00:34,840 --> 00:00:37,620
Authentication, very transparent because you're doing it.

11
00:00:37,780 --> 00:00:40,100
You are initiating a transaction.

12
00:00:40,840 --> 00:00:47,600
So the confusion that people have between the two leads to "we don't want either one."

13
00:00:51,440 --> 00:00:53,640
James Lee, welcome to Trust Revolution.

14
00:00:54,400 --> 00:00:55,480
Well, thank you very much.

15
00:00:55,540 --> 00:00:56,180
Happy to be here.

16
00:00:56,820 --> 00:00:58,160
I am grateful for your time.

17
00:00:58,160 --> 00:01:04,740
especially when, as you noted, it is a cold, still-icy day in the Beltway.

18
00:01:06,340 --> 00:01:14,760
And so where I'd like to begin, James, is we will certainly spend most of our time on your current work.

19
00:01:14,760 --> 00:01:19,440
But to set the backdrop sort of from inside the machine, let's begin here.

20
00:01:19,440 --> 00:01:36,780
You were a senior vice president at ChoicePoint when the organization was, as goes the story, selling personal data to identity thieves, which is an incident that triggered breach notification laws across the country.

21
00:01:37,540 --> 00:01:47,140
And so to the degree that you care to get into that, what did that experience teach you about how institutions actually think about the data that they hold?

22
00:01:47,140 --> 00:01:49,460
How much time do we have?

23
00:01:50,920 --> 00:01:52,120
As much as you want.

24
00:01:53,000 --> 00:01:54,800
It's actually an interesting story.

25
00:01:55,600 --> 00:02:03,920
It's one of those things where the gentleman who was the president of the company at the time, Doug Curling,

26
00:02:04,360 --> 00:02:11,120
Doug, I think, made the comment one time that anytime you go through an experience like that, you learn something.

27
00:02:11,220 --> 00:02:12,320
It's like talking to your wife.

28
00:02:12,420 --> 00:02:15,920
It may not always be a pleasant conversation, but you're always going to learn something.

29
00:02:15,920 --> 00:02:21,000
And that was certainly the case in ChoicePoint.

30
00:02:21,580 --> 00:02:28,320
We were very much a leading edge in data at that time.

31
00:02:28,420 --> 00:02:31,180
We were big data before anybody used the term big data.

32
00:02:32,460 --> 00:02:40,320
And although we thought we had very good policies and procedures, we were very thorough in our due diligence,

33
00:02:40,320 --> 00:02:49,980
What it showed at the time was that no matter how tight you thought you had the information locked down, no matter how many good processes and procedures you had, they weren't good enough.

34
00:02:51,180 --> 00:02:53,300
And this was not a cyber attack.

35
00:02:54,080 --> 00:03:08,140
This was, today we think of it as a scam or fraud, where the individuals who got a hold of this data set up shell companies with storefronts.

36
00:03:08,140 --> 00:03:12,740
for the sole purpose of being able to access credit reports.

37
00:03:13,940 --> 00:03:16,000
And they had business licenses.

38
00:03:16,540 --> 00:03:18,940
By all accounts, it looked like a legitimate business.

39
00:03:19,100 --> 00:03:21,560
Now, if you had scratched the surface a little bit more,

40
00:03:22,420 --> 00:03:23,980
you would have found out, wait a minute,

41
00:03:24,580 --> 00:03:28,760
and actually the technical things that we had in place did catch it,

42
00:03:28,760 --> 00:03:34,100
but not in time because they immediately began to order reports

43
00:03:34,100 --> 00:03:35,900
in a way that was not intended.

44
00:03:35,900 --> 00:03:39,520
The algorithms were tripped immediately.

45
00:03:39,700 --> 00:03:41,240
This is an anomalous search.

46
00:03:41,660 --> 00:03:44,280
And so they only had like a day, maybe, of access.

47
00:03:44,800 --> 00:03:46,020
Maybe a little bit more than that.

48
00:03:46,120 --> 00:03:50,760
It's been nearly 20 years, so I don't remember all the fine details, but it wasn't very long.

49
00:03:51,920 --> 00:03:56,360
And we had this raging debate within the organization.

50
00:03:56,920 --> 00:04:00,440
Do we follow the California law only?

51
00:04:00,440 --> 00:04:06,040
because at the time, California had the only data breach notice law in the world,

52
00:04:06,200 --> 00:04:08,100
not just in the country, in the world.

53
00:04:09,300 --> 00:04:13,120
So do we just follow that narrow law, which said,

54
00:04:13,120 --> 00:04:18,100
if it's, I believe it was 25,000 people or more who are residents of California,

55
00:04:19,020 --> 00:04:20,400
then you had to issue a data breach notice.

56
00:04:21,100 --> 00:04:26,540
And at that time, our count was around 50,000, 56,000, I believe,

57
00:04:27,260 --> 00:04:29,540
of the potential impact.

58
00:04:29,540 --> 00:04:36,760
Now, we're going to come back to the actual impact later, but the potential impact, that's 56,000 California residents.

59
00:04:37,120 --> 00:04:43,020
And so we had the debate, do we notify everybody whether they're in California or not?

60
00:04:43,040 --> 00:04:47,060
Because there were people who were not residents of California who were in their searches.

61
00:04:48,380 --> 00:04:55,280
And the decision was, well, we should, we have to, you know, the law says California, we're going to follow the law.

62
00:04:56,180 --> 00:04:58,320
Well, that didn't last long.

63
00:04:59,540 --> 00:05:04,200
because once you let that genie out of the bottle,

64
00:05:04,300 --> 00:05:05,480
there's no putting it back in.

65
00:05:06,040 --> 00:05:09,100
Data does not recognize those little dotted lines

66
00:05:09,100 --> 00:05:10,260
that we learned on the map.

67
00:05:11,080 --> 00:05:11,120
Indeed.

68
00:05:11,260 --> 00:05:12,740
Data goes everywhere.

69
00:05:13,660 --> 00:05:17,840
And so you had literally the entire country going,

70
00:05:18,040 --> 00:05:19,980
is my data in their database?

71
00:05:20,720 --> 00:05:22,160
Or did they lose my data?

72
00:05:22,860 --> 00:05:23,940
Who are these people?

73
00:05:24,020 --> 00:05:26,740
Because we were not a household name.

74
00:05:26,740 --> 00:05:33,360
even though, if you had homeowners or auto insurance in this country,

75
00:05:33,360 --> 00:05:39,760
we probably had information about you. So the vast majority of adults in the United States

76
00:05:39,760 --> 00:05:43,340
were in our database and every one of them wanted to know, what did you do with my data?

77
00:05:44,240 --> 00:05:51,500
So we had 100 media calls a day, every day for 30 days, not to mention all the congressional

78
00:05:51,500 --> 00:05:57,200
hearings, the state legislative hearings, regulators, all of the things which come around

79
00:05:57,200 --> 00:06:00,220
that because nobody knew who we were.

80
00:06:00,740 --> 00:06:04,560
Nobody knew what we had and nobody knew what had happened.

81
00:06:05,200 --> 00:06:08,160
And as you say, James, forgive me, this is 20 years ago.

82
00:06:08,560 --> 00:06:09,220
20 years ago.

83
00:06:09,280 --> 00:06:09,440
Yeah.

84
00:06:09,640 --> 00:06:10,700
It was 2005.

85
00:06:13,440 --> 00:06:21,060
So when we expanded the search, we got up to, I don't remember the exact number, let's

86
00:06:21,060 --> 00:06:28,040
say it's 150,000. So we added another 100,000 potential individuals who had been impacted.

87
00:06:28,280 --> 00:06:32,680
And meanwhile, we're also going through inside the company, we're going through the whole

88
00:06:32,680 --> 00:06:38,780
discussion around, well, we know what happened now. We know what we need to do to make sure

89
00:06:38,780 --> 00:06:44,680
that that doesn't necessarily happen again. But let's look at this more broadly.

90
00:06:45,240 --> 00:06:46,800
What should we be doing?

91
00:06:46,900 --> 00:06:48,680
What should the business be?

92
00:06:48,820 --> 00:06:55,760
Not what it can be within the legal structure, but what is it we should actually be doing?

93
00:06:55,760 --> 00:07:02,380
And what level of transparency, because we thought we had a good level to begin with, but we obviously needed more.

94
00:07:02,460 --> 00:07:04,080
What more can we do?

95
00:07:04,340 --> 00:07:13,640
And we spent the next year rebuilding a lot of systems, rebuilding processes, reevaluating customers.

96
00:07:13,640 --> 00:07:30,780
We exited entire customer sets, entire lines of business, because we determined it was not in the best interest of the company, our shareholders, and most importantly, the data subjects, the people whose data we had; it wasn't in their best interest that anybody sell that data.

97
00:07:30,780 --> 00:07:40,920
So we got out of the business in a lot of ways, particularly around things like private investigators, which is a big business.

98
00:07:41,520 --> 00:07:57,460
But ultimately, that area of the business where we sold that data, beyond what we thought we were selling to insurance companies for insurance verification and verification of potential insureds,

99
00:07:57,460 --> 00:08:01,460
we exited that investigatory part of the business.

100
00:08:01,920 --> 00:08:05,220
That 1% of revenue put 100% of the enterprise at risk.

101
00:08:05,840 --> 00:08:07,220
So when you looked at it that way,

102
00:08:07,460 --> 00:08:10,160
it wasn't a very difficult decision.

103
00:08:10,860 --> 00:08:13,600
But more so, what should we be doing?

104
00:08:13,700 --> 00:08:15,720
So that's when we started saying,

105
00:08:16,160 --> 00:08:19,360
well, we have a lot of data about a lot of people,

106
00:08:19,700 --> 00:08:21,240
but they don't know that.

107
00:08:22,340 --> 00:08:25,200
And they're going to be curious about that,

108
00:08:25,200 --> 00:08:26,020
as they should be.

109
00:08:26,020 --> 00:08:30,640
How can we make it easy for them to actually see the data?

110
00:08:31,180 --> 00:08:37,240
How can we follow the same guidelines and requirements as if it were a credit report?

111
00:08:37,340 --> 00:08:41,800
So under the Fair Credit Reporting Act, there are very specific requirements and kinds of access.

112
00:08:41,980 --> 00:08:48,100
You have free credit reports, free right to challenge the data, to have the data corrected.

113
00:08:48,520 --> 00:08:50,660
Let's apply that to public records data.

114
00:08:50,660 --> 00:08:57,640
Let's apply that concept to all of the data we have, which is what we set about doing over the balance of the year.

115
00:08:58,060 --> 00:09:06,340
So ultimately, for us, the learning experience was there are things you can do and there are things you should do.

116
00:09:07,280 --> 00:09:09,640
And the reverse of that is true as well.

117
00:09:09,640 --> 00:09:20,520
So we very much focused on stopping doing things which were not helpful and doing more things which were helpful to individuals with a respect for their privacy.

118
00:09:20,660 --> 00:09:34,280
But one thing that's still in the debate today about the collection and use of data is that sometimes data is actually very useful, and people like it.

119
00:09:34,280 --> 00:09:38,720
It makes things convenient and people like convenience.

120
00:09:39,380 --> 00:09:44,200
And you shouldn't have convenience at the sacrifice of security or privacy.

121
00:09:44,400 --> 00:09:46,260
You can't have all three of those things.

122
00:09:46,260 --> 00:09:52,820
You just have to tune the dials to make sure you've got the right balance there.

123
00:09:52,920 --> 00:09:55,260
But it is certainly possible to do.

124
00:09:55,420 --> 00:10:12,820
And that's what we really set about doing after that was making sure we are being respectful of privacy, also respectful of making things secure, and offering products that help people get what they needed and what they wanted in a way that was both privacy and security-centric.

125
00:10:12,820 --> 00:10:22,620
So it sounds like here 20 years later, we're going to, in short order, get into, pardon me, your work now.

126
00:10:23,960 --> 00:10:35,400
It sounds like you maintain a perspective that data brokers are a net positive. Or how would you now reflect on their role in at least American society and business?

127
00:10:35,780 --> 00:10:37,720
Well, there's a lot of things we need.

128
00:10:38,100 --> 00:10:41,620
Maybe we don't always like who it is that gives it to us.

129
00:10:41,620 --> 00:10:43,420
but we need it nonetheless.

130
00:10:44,040 --> 00:10:45,900
So there are good players and there are bad players.

131
00:10:46,120 --> 00:10:48,860
And that's true of most any kind of industry,

132
00:10:48,980 --> 00:10:49,700
most any kind of business.

133
00:10:49,940 --> 00:10:51,840
There are some people, there are some businesses

134
00:10:51,840 --> 00:10:56,380
that you can fully embrace and endorse what they do

135
00:10:56,380 --> 00:10:57,140
and how they do it.

136
00:10:57,140 --> 00:10:58,480
And then there are others you kind of go,

137
00:10:58,640 --> 00:11:01,260
they're a company.

138
00:11:02,080 --> 00:11:02,760
Right, right, they exist.

139
00:11:03,540 --> 00:11:07,200
Yeah, and they're doing, they're obviously following the law

140
00:11:07,200 --> 00:11:10,140
and somebody likes what they do and whatever.

141
00:11:10,140 --> 00:11:11,580
so

142
00:11:11,580 --> 00:11:13,780
on the whole

143
00:11:13,780 --> 00:11:16,440
we have built

144
00:11:16,440 --> 00:11:18,180
an entire global economy

145
00:11:18,180 --> 00:11:19,980
based on data

146
00:11:19,980 --> 00:11:22,280
the concept that we can

147
00:11:22,280 --> 00:11:24,340
overly restrict or

148
00:11:24,340 --> 00:11:26,540
overly release data

149
00:11:26,540 --> 00:11:28,640
both poles of that

150
00:11:28,640 --> 00:11:30,060
are not realistic

151
00:11:30,060 --> 00:11:32,440
and don't lead to good outcomes

152
00:11:32,440 --> 00:11:34,540
but we need data

153
00:11:34,540 --> 00:11:36,120
but what we need is

154
00:11:36,120 --> 00:11:38,180
at the time

155
00:11:38,180 --> 00:11:39,940
and in the manner and with the

156
00:11:40,140 --> 00:11:43,760
all of the protections all at the right time.

157
00:11:43,940 --> 00:11:46,700
We need to have regulation.

158
00:11:47,020 --> 00:11:50,580
In the United States, we need to have more regulation.

159
00:11:51,120 --> 00:11:55,340
Now, it doesn't have to be regulation that chokes things off

160
00:11:55,340 --> 00:11:57,300
or it makes things particularly difficult,

161
00:11:57,300 --> 00:12:01,100
but we do need more regulation, if for no other reason

162
00:12:01,100 --> 00:12:05,500
than to drive out the illegitimate actors, of which there are plenty,

163
00:12:05,500 --> 00:12:13,900
and to make the legitimate actors more secure so they don't have to make those decisions about,

164
00:12:14,420 --> 00:12:20,020
well, do I take this to the maximum level I can to protect somebody,

165
00:12:20,020 --> 00:12:25,860
or do I dial it back a little bit because my competition is out doing things that are harmful to people,

166
00:12:26,240 --> 00:12:33,240
but they're impacting my financial health, so I can't afford to do the better thing.

167
00:12:33,240 --> 00:12:46,080
Yeah, it's fascinating. I didn't actually even put this here as a prop, but I will make sure my fingers are covering my PII. But, you know, here's another notice, this one from Conduent.

168
00:12:47,080 --> 00:12:49,300
There we go.

169
00:12:49,300 --> 00:12:54,640
Well, I mean, and so you too, James, got free credit reporting, right?

170
00:12:55,080 --> 00:13:04,660
And so, and this is a bit of a digression, but I think it's still on point: I am personally desensitized to it.

171
00:13:05,320 --> 00:13:14,640
I kept it because I'm still debating within reason, you know, within the realm of what's possible, what I want to do with it.

172
00:13:14,660 --> 00:13:17,280
Because in my view, I'd love to get your take on this.

173
00:13:17,280 --> 00:13:25,320
it has become a humiliation ritual of sorts. You know, and I love terms like "threat actor"

174
00:13:25,320 --> 00:13:31,080
and all these are just absolute nonsense. And so, you know, we messed up, we lost your data,

175
00:13:31,240 --> 00:13:37,320
we got breached. We're telling you six months later, here's some free credit reporting.

176
00:13:37,820 --> 00:13:46,500
So, so I guess with that, what is your read, James, on the state of affairs broadly?

177
00:13:47,280 --> 00:13:49,240
and we'll unpack this over time.

178
00:13:49,860 --> 00:13:51,080
Well, let me ask it this way.

179
00:13:51,180 --> 00:13:53,840
When you, in particular, as you did,

180
00:13:53,900 --> 00:13:55,380
received this letter from Conduent,

181
00:13:55,640 --> 00:13:56,700
what's your take?

182
00:13:56,780 --> 00:13:58,240
What are your thoughts and feelings

183
00:13:58,240 --> 00:13:59,860
as you open these notices?

184
00:13:59,860 --> 00:14:03,380
You know, I'm burdened by knowledge.

185
00:14:03,980 --> 00:14:05,520
And the first part is right.

186
00:14:06,440 --> 00:14:07,920
The curse of insight.

187
00:14:08,560 --> 00:14:10,560
Yeah, I don't know anything about this company.

188
00:14:10,760 --> 00:14:11,980
I know nothing about the company,

189
00:14:12,060 --> 00:14:13,260
but I know a lot about data breach.

190
00:14:14,760 --> 00:14:17,060
And this comes at the same time

191
00:14:17,280 --> 00:14:27,160
right about the time we released the ITRC's 20th edition of our annual data breach report.

192
00:14:27,660 --> 00:14:29,480
I think within a matter of days.

193
00:14:29,840 --> 00:14:32,360
Now, this is dated December 31st.

194
00:14:32,460 --> 00:14:32,840
Mine is.

195
00:14:32,920 --> 00:14:34,080
I'm not sure what year it's dated.

196
00:14:34,220 --> 00:14:34,860
It is as well.

197
00:14:34,900 --> 00:14:35,640
I'm looking at it right now.

198
00:14:36,040 --> 00:14:36,240
Yeah.

199
00:14:36,760 --> 00:14:38,140
Dated December 31st.

200
00:14:38,140 --> 00:14:41,820
I received it at the end of January, first part of February.

201
00:14:42,480 --> 00:14:43,660
I think I got early Feb.

202
00:14:43,940 --> 00:14:44,080
Yeah.

203
00:14:44,080 --> 00:14:56,740
And so I immediately looked at that and said, well... and I knew from our research into last year's data breaches that this had occurred a long time ago.

204
00:14:57,000 --> 00:15:00,140
And they're just now getting around to issuing notice.

205
00:15:00,320 --> 00:15:04,520
But more importantly, I look at this and I go, I don't know who you people are.

206
00:15:04,680 --> 00:15:06,880
I don't know how you have my data.

207
00:15:07,300 --> 00:15:09,520
I don't know what data it is.

208
00:15:10,140 --> 00:15:13,700
You say you've got my social security number and address.

209
00:15:14,080 --> 00:15:19,860
That was in the file, and they have no evidence of actual or attempted misuse.

210
00:15:19,980 --> 00:15:20,620
Well, that's great.

211
00:15:20,700 --> 00:15:21,600
How do you know that?

212
00:15:22,100 --> 00:15:25,180
What forensic tools did you use?

213
00:15:25,540 --> 00:15:26,260
How do you know that?

214
00:15:26,480 --> 00:15:27,920
What else is in that file?

215
00:15:27,920 --> 00:15:30,240
And how did you get it?

216
00:15:30,260 --> 00:15:32,020
Because you're a third party.

217
00:15:33,300 --> 00:15:35,140
So who was your customer?

218
00:15:36,760 --> 00:15:39,380
Whose supply chain are you a part of?

219
00:15:39,840 --> 00:15:43,000
So I can go and poke them in the chest and go, what are you doing?

220
00:15:43,980 --> 00:15:44,120
Right.

221
00:15:44,320 --> 00:15:44,820
And please do.

222
00:15:44,820 --> 00:15:45,660
What are you doing to protect my data?

223
00:15:46,300 --> 00:15:46,700
Right.

224
00:15:47,960 --> 00:15:57,720
Well, and that's probably a great, and I've buried the lead here, but let's then do take that as an opportunity, James, to move into from ChoicePoint.

225
00:15:57,720 --> 00:16:07,100
You went to an organization that is now the nation's leading identity crime nonprofit, the Identity Theft Resource Center.

226
00:16:07,540 --> 00:16:12,660
Was there a specific moment when you decided you needed to switch sides?

227
00:16:13,000 --> 00:16:14,460
That's where you needed to be.

228
00:16:16,800 --> 00:16:28,140
Well, so part of the whole ChoicePoint experience was when we recognized, look, we've got to do more and we have to lead by example.

229
00:16:28,500 --> 00:16:31,080
We donated a million dollars to the ITRC.

230
00:16:31,080 --> 00:16:41,460
And after that, after a period of time, I was invited on the board and I served on the board, well, a total of 14 years.

231
00:16:43,000 --> 00:16:44,840
In two different times.

232
00:16:45,180 --> 00:16:46,680
And I actually served three years as chairman.

233
00:16:47,640 --> 00:16:53,220
The person who's actually the CEO of the organization, I was the chairman when she was hired.

234
00:16:54,560 --> 00:16:59,520
And after I left ChoicePoint, because we were ultimately acquired by the parent company of LexisNexis.

235
00:17:00,540 --> 00:17:01,900
And so that was in 2008.

236
00:17:01,900 --> 00:17:09,060
So I left and then I did consulting for a number of years and then ultimately ended my corporate career in cybersecurity.

237
00:17:10,080 --> 00:17:12,280
Particularly application security in Ireland.

238
00:17:13,000 --> 00:17:27,000
But the entire time I stayed in touch on this concept of data breaches and was involved with a number of efforts around data privacy and how to improve data privacy.

239
00:17:27,000 --> 00:17:32,600
I worked with ANSI, the American National Standards Institute, on a couple of projects

240
00:17:32,600 --> 00:17:41,880
to improve data privacy and data identity management around the whole concept of verification,

241
00:17:42,060 --> 00:17:59,470
but doing it in a way that is privacy-centric. So I had kept in touch with this. And then in my cybersecurity work, it was all around keeping data secure, particularly keeping systems where data is held secure

242
00:17:59,470 --> 00:18:04,870
and protected from software glitches and bad code,

243
00:18:04,970 --> 00:18:09,370
which is one of the leading causes of data breaches historically.

244
00:18:09,370 --> 00:18:10,330
Certainly.

245
00:18:10,510 --> 00:18:15,590
Today, the leading cause of data breaches is actually other data breaches, but that's a different – we'll probably get to that.

246
00:18:16,170 --> 00:18:17,890
But I stayed in touch with that.

247
00:18:18,610 --> 00:18:29,250
And then when I left the company in Ireland, when it was a startup, so another round of funders, new team comes in.

248
00:18:29,670 --> 00:18:30,010
Sure.

249
00:18:30,610 --> 00:18:35,610
I was having dinner with the CEO of the ITRC, and she said, you know what?

250
00:18:35,610 --> 00:18:48,790
All those years, you were badgering me to do some things and to grow some things and to expand into more policy work, expand into some products that can be helpful for individuals and businesses.

251
00:18:49,210 --> 00:18:55,070
Take that mountain of data that we've been collecting since 2005 and actually do something with it.

252
00:18:55,350 --> 00:18:57,030
You need to come to San Diego and do that.

253
00:18:57,110 --> 00:18:58,250
So that's exactly what I did.

254
00:18:58,530 --> 00:19:00,850
And then six weeks later, we had a global pandemic.

255
00:19:02,210 --> 00:19:03,430
And we all know how that went.

256
00:19:03,430 --> 00:19:19,470
Well, so now, so with that, ChoicePoint 2005, as you noted, now almost 21 years later, ITRC's own data shows breach transparency is on the decline, if not collapsing.

257
00:19:19,470 --> 00:19:29,330
And so the data brokerage, pardon me, industry sector, have they learned anything from the era that you worked through and built through?

258
00:19:29,570 --> 00:19:34,090
Or are they repeating the same mistakes with better legal cover?

259
00:19:34,430 --> 00:19:46,150
I think what we have seen is your last statement about better legal cover, I would apply that broadly across business period.

260
00:19:46,850 --> 00:19:47,050
Fair.

261
00:19:47,050 --> 00:19:49,190
Not just data brokers.

262
00:19:49,190 --> 00:19:57,690
I do think the data industry, writ large, so that's any company that has data and winds up selling data, whether that is their sole product.

263
00:19:57,830 --> 00:20:00,950
But let's just think of organizations that collect it.

264
00:20:01,050 --> 00:20:02,710
That may not be what they do for a business.

265
00:20:03,110 --> 00:20:04,330
MasterCard comes to mind.

266
00:20:04,410 --> 00:20:04,510
Yeah.

267
00:20:04,870 --> 00:20:09,070
And any of the credit card companies, any of the retailers, they're going to sell those lists.

268
00:20:09,990 --> 00:20:14,390
So they're in the data business along with selling widgets.

269
00:20:14,390 --> 00:20:24,530
What all of those organizations have learned is if you do not have proper cybersecurity, you are going to have a data breach.

270
00:20:24,770 --> 00:20:38,690
If you are going to have a data breach, you are going to have an extraordinary level, depending on your size, an extraordinary level of unbudgeted expense and reputational expense, customer churn.

271
00:20:38,930 --> 00:20:41,210
You're going to have all those bad things happen.

272
00:20:41,210 --> 00:20:45,930
You are probably going to survive, but you're never going to be the same.

273
00:20:47,210 --> 00:20:55,270
And if I may then, James, which I find really interesting, perhaps it's the cynic in me, but I tend to think it is a cost of doing business.

274
00:20:55,410 --> 00:21:05,270
But what I hear you say is it is painful enough to motivate a change in behavior or perhaps it depends on the size of company and size of the breach.

275
00:21:05,270 --> 00:21:13,170
And I think we're both right, because I think there are people who do treat it as it's a cost of doing business and it's cheaper to pay the fine.

276
00:21:13,390 --> 00:21:20,650
There are clearly organizations, and I don't like to engage in name and shame.

277
00:21:21,070 --> 00:21:21,590
No, no, of course.

278
00:21:21,630 --> 00:21:26,210
But if we go back a few years and we look at a company called Blackbaud, publicly traded company.

279
00:21:28,330 --> 00:21:31,290
Massive amount of data, never issued a data breach notice.

280
00:21:32,410 --> 00:21:35,130
Left it all to their customers to do it.

281
00:21:35,270 --> 00:21:38,230
who in turn are non-profits largely, if I recall.

282
00:21:38,410 --> 00:21:41,190
Which are non-profits and who, in many cases,

283
00:21:41,310 --> 00:21:43,530
didn't know their data had even been compromised.

284
00:21:44,250 --> 00:21:45,810
And it wasn't just their data.

285
00:21:45,950 --> 00:21:47,710
It was their donors, their customers,

286
00:21:48,070 --> 00:21:51,450
all of the people who, maybe their patrons,

287
00:21:52,450 --> 00:21:53,590
their arts organizations.

288
00:21:54,710 --> 00:21:55,850
That was the data.

289
00:21:55,850 --> 00:21:57,690
Those were the individuals who were lost.

290
00:21:58,050 --> 00:22:00,710
Most of those individuals were never told.

291
00:22:01,470 --> 00:22:05,050
At the time, we tracked, at the ITRC,

292
00:22:05,270 --> 00:22:10,370
about 600 data breach notices directly related to Blackbaud.

293
00:22:10,590 --> 00:22:17,790
And that was after they were basically forced by the Securities and Exchange Commission to fess up.

294
00:22:17,910 --> 00:22:22,070
And they did it during their SEC filings, not in a data breach notice,

295
00:22:22,450 --> 00:22:25,730
which I don't know about you, but I don't spend a lot of time reading SEC filings.

296
00:22:26,470 --> 00:22:29,030
Indeed, great place to bury that information.

297
00:22:29,930 --> 00:22:33,690
So they were sued by the states' attorneys general.

298
00:22:33,690 --> 00:22:36,830
I think it was 49 of the states sued.

299
00:22:37,810 --> 00:22:43,150
And in the course of that lawsuit, then we find out it wasn't 600 organizations.

300
00:22:43,150 --> 00:22:46,350
It was 12,000, nearly 13,000 organizations.

301
00:22:47,290 --> 00:22:54,850
So the customers and the patrons of those organizations never knew their information had been compromised.

302
00:22:55,890 --> 00:23:00,090
Because they thought it was cheaper to pay the fine.

303
00:23:00,090 --> 00:23:08,110
And that exists in every kind of business, but not every kind of business has the impact that data businesses do.

304
00:23:09,090 --> 00:23:16,350
When you hold the information of an individual, you hold the keys to their life in many instances.

305
00:23:16,630 --> 00:23:24,590
And when that's compromised, you have a deep and abiding impact on those individuals.

306
00:23:24,590 --> 00:23:29,290
and organizations who have gone through that

307
00:23:29,290 --> 00:23:33,210
generally come out with a much deeper respect

308
00:23:33,210 --> 00:23:34,510
for their responsibilities.

309
00:23:35,490 --> 00:23:39,590
Now, cybersecurity is the biggest risk.

310
00:23:40,330 --> 00:23:43,350
That's where most breaches begin today.

311
00:23:44,690 --> 00:23:49,390
And you can't stop all of them,

312
00:23:49,730 --> 00:23:51,870
but you can stop a lot of them.

313
00:23:51,870 --> 00:23:59,010
And if you have the right policies, practices, and procedures in place, you can get that down to a very, very small number.

314
00:23:59,110 --> 00:24:02,970
So you minimize the impact to the maximum degree that you can.

315
00:24:03,050 --> 00:24:04,030
So they're always going to happen.

316
00:24:04,490 --> 00:24:07,730
We're never going to get to zero, but we can always minimize that impact.

317
00:24:07,850 --> 00:24:08,450
We can do more.

318
00:24:08,990 --> 00:24:18,010
So the companies that we want to encourage to do more, we want to recognize them and pat them on the back when they do it, are those companies that take that seriously.

319
00:24:18,790 --> 00:24:20,230
And the rest of them, we need to slap around.

320
00:24:20,230 --> 00:24:28,910
So I'm with you there. I'm with you there. So now with over 20 years, I won't put a number on you,

321
00:24:29,050 --> 00:24:35,310
James, but with 20 plus years of work in this field, let's, you know, I think I saw someone

322
00:24:35,310 --> 00:24:42,230
use the phrase the golden age of identity crime. But you told the Senate, you know, that very thing,

323
00:24:42,490 --> 00:24:47,270
right? That guy being you. So what does that actually look like? What are the

324
00:24:47,270 --> 00:24:53,230
ITRC's advisors hearing from victims that is different from, say, five years ago?

325
00:24:54,030 --> 00:24:55,330
Yeah, great question.

326
00:24:56,250 --> 00:25:01,610
That was in front of the Senate Commerce Committee, and that was right at the tail end of the

327
00:25:01,610 --> 00:25:02,050
pandemic.

328
00:25:02,050 --> 00:25:07,250
So that was 21, I believe, 2021.

329
00:25:07,810 --> 00:25:09,630
So here we are five years later.

330
00:25:10,350 --> 00:25:15,070
And if that was the golden age, I'm not sure what happens after the golden age, but that's

331
00:25:15,070 --> 00:25:16,070
the age we're in now.

332
00:25:16,070 --> 00:25:17,290
At the platinum age.

333
00:25:18,150 --> 00:25:18,390
Yeah.

334
00:25:19,390 --> 00:25:21,710
For a frequent flyer

335
00:25:21,710 --> 00:25:24,170
analogy, we're headed toward

336
00:25:24,170 --> 00:25:26,190
diamond, double diamond

337
00:25:26,190 --> 00:25:26,790
or whatever.

338
00:25:29,170 --> 00:25:32,270
We are seeing things that have

339
00:25:32,270 --> 00:25:33,910
in many respects changed

340
00:25:33,910 --> 00:25:35,610
and in other respects not.

341
00:25:39,150 --> 00:25:39,870
Most

342
00:25:39,870 --> 00:25:42,290
identity crimes today,

343
00:25:42,390 --> 00:25:43,950
whether it's identity theft,

344
00:25:44,770 --> 00:25:46,010
which is when you take

345
00:25:46,070 --> 00:25:51,990
the information. So a data breach is identity theft. But then somebody has to use that. They don't

346
00:25:51,990 --> 00:25:57,050
want to steal it just because it's fun. Maybe once upon a time they did that. Now they don't.

347
00:25:57,130 --> 00:26:03,250
They want to make money off it. And that's when you get into frauds and scams. And so a lot of

348
00:26:03,250 --> 00:26:11,770
what we're seeing today are data-fueled frauds and scams. So whether that's the romance scams

349
00:26:11,770 --> 00:26:16,630
that we hear about now, things like maybe they're AI-fueled.

350
00:26:16,750 --> 00:26:21,090
There's some sort of variation of what we call the grandparent scam,

351
00:26:21,210 --> 00:26:24,950
which is a call in the middle of the night saying your grandchild, your child,

352
00:26:25,090 --> 00:26:29,170
your spouse, your cousin, co-worker, they're in the ER,

353
00:26:29,570 --> 00:26:33,110
or they've been kidnapped, and they need $10,000 right now

354
00:26:33,110 --> 00:26:37,470
to be able to treat them, or something bad is going to happen.

355
00:26:38,150 --> 00:26:40,770
And those kind of things, we see that.

356
00:26:40,770 --> 00:26:44,690
We see crypto scams. We see account takeover.

357
00:26:45,290 --> 00:27:00,470
Because one of the things that we see as a direct result of all this mountain of data, mountain range of data that's been stolen over the years, particularly the last five, is the ability to impersonate another person.

358
00:27:00,470 --> 00:27:10,890
You have enough information between what has been stolen, and then compare that with what's publicly available, probably on your social media account.

359
00:27:12,530 --> 00:27:22,370
Somebody can impersonate you well enough to either take over an account you already have or open up a new account in your name, and you are completely unaware that that is happening.

360
00:27:23,270 --> 00:27:28,050
And, by the way, as I'm sure you're tracking, can now replicate your voice pitch perfect.

361
00:27:28,250 --> 00:27:28,650
Yeah.

362
00:27:29,050 --> 00:27:29,170
Yeah.

363
00:27:29,170 --> 00:27:31,110
Three or four seconds of your voice is all it takes.

364
00:27:31,290 --> 00:27:33,290
Somebody can pull this and pretend to be me.

365
00:27:33,950 --> 00:27:34,150
Yes.

366
00:27:34,790 --> 00:27:35,650
Good luck to them.

367
00:27:36,510 --> 00:27:39,470
If you want to be me, here, I'll give you the key.

368
00:27:42,030 --> 00:27:55,390
But that's something that, you know, from a, if you're the person whose data has been stolen, that realization is just now kind of coming to the forefront.

369
00:27:55,390 --> 00:28:14,210
This most recent report, the research we did for this report on the 2025 data breaches was the first time we've actually seen people be able to say, I know, or at least I think I know, what they did with my data.

370
00:28:15,330 --> 00:28:19,230
So it's attempts at account takeover.

371
00:28:19,550 --> 00:28:21,290
It's actual account takeover.

372
00:28:21,650 --> 00:28:23,810
It's attempt at impersonation.

373
00:28:23,810 --> 00:28:32,170
It's increased spam, increased scam texts, fake emails. Our phones are virtually

374
00:28:32,170 --> 00:28:39,210
unusable at this point, right? Yeah, as a phone. And so people are paying attention, which is absolutely

375
00:28:39,210 --> 00:28:45,610
amazing, and it's great. And they are then taking steps after that they have not taken before,

376
00:28:45,610 --> 00:28:52,130
which is also great. So messaging, to a degree, is getting through. And the frustration

377
00:28:52,130 --> 00:29:00,390
with this constant rise of data breach notices is getting to a point where, you know,

378
00:29:00,410 --> 00:29:06,530
if people ever get organized around this concept, we'll actually see some good public policy come

379
00:29:06,530 --> 00:29:11,970
out of it. Right now, we don't have that groundswell. But boy, it sure looks like the early

380
00:29:11,970 --> 00:29:19,810
indicators of that are there and may very well grow over time because there's nothing that I see

381
00:29:19,810 --> 00:29:25,190
on the horizon that is going to reduce the number of data breaches and data breach notices.

382
00:29:26,370 --> 00:29:32,290
You touched on it, which is you flagged so-called recycled information, so stolen credentials from

383
00:29:32,290 --> 00:29:38,210
old breaches being repackaged for new attacks. And so, you know, if our data is already out there

384
00:29:38,210 --> 00:29:44,710
from multiple breaches, what's the risk model for an individual? So to your prior point,

385
00:29:44,710 --> 00:29:50,810
we're becoming more aware. I hope we're becoming more intolerant. But realistically,

386
00:29:50,810 --> 00:29:59,670
what should we expect as individuals? Tough question, perhaps, but, you know, how bad is

387
00:29:59,670 --> 00:30:03,910
it going to get before it gets better, if it gets better? Yeah, that's the, and I think that's one

388
00:30:03,910 --> 00:30:10,350
of the great unknowns. I think what we're going to see is, again, the continued rise in the volume

389
00:30:10,350 --> 00:30:12,930
and probably velocity of data breaches,

390
00:30:12,930 --> 00:30:15,730
which means we're going to get more data breach notices.

391
00:30:16,190 --> 00:30:19,070
We're at a point where if you're an adult in the United States,

392
00:30:19,570 --> 00:30:24,270
there's not a lot more about you that isn't readily available already.

393
00:30:24,630 --> 00:30:29,590
And that's a staggering statement to say and to absorb.

394
00:30:29,590 --> 00:30:33,270
But your social security number has been available for years.

395
00:30:34,350 --> 00:30:39,730
Probably the thing that is coming on now is driver's licenses.

396
00:30:39,730 --> 00:30:49,230
They have not been very valuable historically, but since the pandemic and during the pandemic, we have seen the value of a driver's license number go up.

397
00:30:49,730 --> 00:30:51,290
A lot easier to get a new driver's license number.

398
00:30:51,290 --> 00:30:51,570
And why is that, by the way?

399
00:30:52,410 --> 00:30:54,850
Because we're using it in ways we've never used it before.

400
00:30:55,390 --> 00:30:57,670
So think about it historically, pre-pandemic.

401
00:30:57,670 --> 00:31:12,410
So let's talk about if you're an adult in the United States in 2019, you used your driver's license when you went to the airport and you used your driver's license when you went to a place where you had to show your age, whether you're buying alcohol or whatever reason.

402
00:31:13,050 --> 00:31:16,310
And if you're a college kid with your fake ID, right?

403
00:31:17,990 --> 00:31:19,270
Those are the only times.

404
00:31:19,450 --> 00:31:23,450
And then if you got pulled over, you know, those are the times you needed your driver's license.

405
00:31:24,350 --> 00:31:25,990
Now, fast forward.

406
00:31:26,130 --> 00:31:27,250
Think about mid-2021.

407
00:31:27,670 --> 00:31:30,090
You could open up a bank account online.

408
00:31:30,410 --> 00:31:33,510
You can open up any kind of account online.

409
00:31:34,250 --> 00:31:39,110
Anytime you need to verify you are who you say you are, you're setting up your account with the IRS.

410
00:31:39,350 --> 00:31:40,610
They're going to make you go through a process.

411
00:31:40,730 --> 00:31:41,270
What do you have to do?

412
00:31:41,290 --> 00:31:42,250
You've got to show your driver's license.

413
00:31:42,530 --> 00:31:45,630
Your insurance company, you're renewing your auto insurance.

414
00:31:45,710 --> 00:31:46,150
What do you got to do?

415
00:31:46,230 --> 00:31:47,090
You've got to show your driver's license.

416
00:31:47,790 --> 00:31:53,450
Your driver's license has become the de facto social security number because the social security number is so compromised.

417
00:31:54,470 --> 00:31:56,250
So bad guys figured this out.

418
00:31:56,250 --> 00:32:03,370
So your social security number in a marketplace where they buy and sell data is a BOGO.

419
00:32:03,850 --> 00:32:06,550
You buy data and they give you the social security number.

420
00:32:06,650 --> 00:32:14,450
There is no financial value in terms of a purchase price to a social security number.

421
00:32:14,970 --> 00:32:17,750
Your driver's license, that number went through the roof.

422
00:32:18,430 --> 00:32:24,530
So the height of the pandemic, you're talking about $300, $400 for a driver's license number and the attendant information.

423
00:32:24,530 --> 00:32:27,170
now it's back closer to $150, $200

424
00:32:27,170 --> 00:32:28,850
social security number

425
00:32:28,850 --> 00:32:29,670
still zero

426
00:32:29,670 --> 00:32:32,010
driver's license $150, $200

427
00:32:32,010 --> 00:32:33,370
depending on the state

428
00:32:33,370 --> 00:32:36,150
and that then is used

429
00:32:36,150 --> 00:32:37,670
to impersonate you to what?

430
00:32:38,090 --> 00:32:38,970
open up accounts

431
00:32:38,970 --> 00:32:40,430
you know

432
00:32:40,430 --> 00:32:42,690
if it's a good enough fake

433
00:32:42,690 --> 00:32:44,130
and some of the fakes are good enough

434
00:32:44,130 --> 00:32:45,430
it gets you through TSA

435
00:32:45,430 --> 00:32:47,110
it gets you through

436
00:32:47,110 --> 00:32:49,110
any other kind of

437
00:32:49,110 --> 00:32:50,430
situation

438
00:32:50,430 --> 00:32:52,830
we have recorded instances

439
00:32:52,830 --> 00:33:00,130
working with people where they have encountered law enforcement coming to them

440
00:33:00,130 --> 00:33:04,250
and either attempting to arrest them or tell them they have arrest warrants,

441
00:33:04,990 --> 00:33:08,070
get judicial notices for failure to appear in court

442
00:33:08,070 --> 00:33:13,330
because the driver's license presented to a law enforcement officer was so good

443
00:33:13,330 --> 00:33:16,310
that it passed their test.

444
00:33:17,030 --> 00:33:22,570
And that person then later found out they were in a different state

445
00:33:22,570 --> 00:33:26,130
in an auto accident or in some way encountered law enforcement,

446
00:33:26,270 --> 00:33:28,650
and they only found out about it when somebody came to arrest them.

447
00:33:30,190 --> 00:33:30,750
Incredible.

448
00:33:32,430 --> 00:33:36,330
And so let me ask you this, James, on that note, you know,

449
00:33:36,410 --> 00:33:40,510
given my field of work certainly over the last four or five years in payments,

450
00:33:41,490 --> 00:33:43,950
KYC is an ongoing raging debate.

451
00:33:44,250 --> 00:33:47,870
And so the question I have for you there, which what you mentioned is fascinating,

452
00:33:48,130 --> 00:33:52,410
zero value to a Social Security number, nominal, growing,

453
00:33:52,570 --> 00:33:58,170
declining value, some significant value to a driver's license. And the story you just told

454
00:33:58,170 --> 00:34:04,210
about being able to deceive a cop with a driver's license is just staggering. So my question is,

455
00:34:04,350 --> 00:34:13,690
to what degree are you tracking and do you see these app-based KYC processes being duped

456
00:34:13,690 --> 00:34:18,450
by stolen or fake identity? Where's that on the radar for you?

457
00:34:18,450 --> 00:34:28,870
It is on the radar. And what we're seeing today, fortunately, is more attempts than success.

458
00:34:29,890 --> 00:34:41,590
Because some of these companies are either using this app-based approach or they're offering app-based KYC solutions.

459
00:34:41,890 --> 00:34:46,210
They're actually more effective than you think they might be sometimes.

460
00:34:46,210 --> 00:34:51,210
It's like everything else. There's a wide variety. There's some that are very intrusive.

461
00:34:52,390 --> 00:35:09,850
Yeah. But they are blocking. So we are seeing more attempts, but we also are seeing more successes overall in account takeover, which means whatever we're doing from an account verification standpoint at the start,

462
00:35:09,850 --> 00:35:11,970
which is

463
00:35:11,970 --> 00:35:13,930
one, it's an area of granularity

464
00:35:13,930 --> 00:35:15,870
we've got to get into more, which is

465
00:35:15,870 --> 00:35:17,190
the differentiation between

466
00:35:17,190 --> 00:35:19,330
when you establish the account

467
00:35:19,330 --> 00:35:21,570
so account initiation and

468
00:35:21,570 --> 00:35:23,590
is the takeover occurring

469
00:35:23,590 --> 00:35:24,410
later

470
00:35:24,410 --> 00:35:28,070
so there's not secondary verification

471
00:35:28,070 --> 00:35:29,590
along the way. So

472
00:35:29,590 --> 00:35:44,520
you log in one time, you verify one time, and then they trust you from there on out. Well, that doesn't work anymore either. You need to have secondary verification and authentication periodically.

473
00:35:44,980 --> 00:35:51,980
Now, some people would argue, depending on how valuable what it is you're trying to protect behind that account,

474
00:35:51,980 --> 00:35:55,880
would indicate, well, I need to do that every time I log in.

475
00:35:55,880 --> 00:36:02,260
and others, you know, if it's my IMDb account,

476
00:36:02,420 --> 00:36:04,280
before I figure out what movie I'm going to go see this weekend,

477
00:36:04,400 --> 00:36:05,260
maybe not so much.

478
00:36:05,840 --> 00:36:08,460
But if I'm getting into my bank account,

479
00:36:08,900 --> 00:36:11,160
they better authenticate me every single time.

480
00:36:11,880 --> 00:36:13,020
And you do see that.

481
00:36:13,680 --> 00:36:19,160
But KYC is an interesting topic because it's in that area

482
00:36:19,160 --> 00:36:21,120
where data is so valuable.

483
00:36:21,120 --> 00:36:27,560
And yet we are so resistant to moving to some of the next kinds of technologies.

484
00:36:27,560 --> 00:36:33,700
And this is where we get into the difference between, we'll talk about biometrics, the difference

485
00:36:33,700 --> 00:36:42,360
between verification and identification. One's very good, less intrusive, great when it comes to

486
00:36:42,360 --> 00:36:51,100
preventing fraud. Identification, not so much. Very intrusive, not as accurate. And even if it were,

487
00:36:51,100 --> 00:36:58,940
not very transparent. Authentication, very transparent because you're doing it. You are

488
00:36:58,940 --> 00:37:07,020
initiating a transaction. So the confusion that people have between the two leads to,

489
00:37:07,140 --> 00:37:13,640
we don't want either one. Right. And I think, pardon me, James, I would add

490
00:37:13,640 --> 00:37:20,220
the, well, the intrusiveness, the friction, the, you know, I mean, we could easily see,

491
00:37:20,220 --> 00:37:27,460
and I've had a couple of these conversations with folks in the field, that every time I open an app,

492
00:37:27,500 --> 00:37:31,320
it needs to scan my face. You know, I'm sure from their standpoint, they could make these arguments.

493
00:37:31,580 --> 00:37:37,500
Perhaps, you know, ITRC could make these arguments. So I guess the gist of my question is,

494
00:37:37,500 --> 00:37:47,940
what do you see in terms of where it ends, right? I mean, diminishing returns, one could be like me,

495
00:37:47,940 --> 00:37:52,780
skeptical that they'll always breach it. It's always going to be insufficient. So sort of where

496
00:37:52,780 --> 00:37:58,980
do we land on this eventually? Which is, I think that's a conversation we've got to have. And I

497
00:37:58,980 --> 00:38:05,160
think it's a matter of degrees. It's back to which is more important to protect. If I'm protecting my

498
00:38:05,160 --> 00:38:11,120
financial assets, my real estate assets, if I have any, my family, whatever it is that I need to

499
00:38:11,120 --> 00:38:17,800
protect, I may be willing to share and go a little bit further on that than I would be on something

500
00:38:17,800 --> 00:38:19,040
of lesser value.

501
00:38:19,520 --> 00:38:27,300
I don't need to show my face, my finger, a code or something for a routine transaction.

502
00:38:27,300 --> 00:38:30,120
If it's even a transaction at all, I might just be seeking information.

503
00:38:30,340 --> 00:38:35,620
I might be browsing, looking for something before I ever make a decision that I better

504
00:38:35,620 --> 00:38:36,060
purchase.

505
00:38:36,460 --> 00:38:41,300
I don't need to have that where I'm not providing data.

506
00:38:41,740 --> 00:38:45,460
I don't need to have a lot of friction at that point.

507
00:38:45,460 --> 00:38:49,500
I need a lot of friction as you go up that value chain.

508
00:38:50,120 --> 00:38:52,700
And at the ITRC, we're big fans of friction.

509
00:38:52,920 --> 00:38:59,540
We think we have gone too far down the path between convenience and security.

510
00:39:01,420 --> 00:39:08,880
But if some people want to go to the maximum degree, it should be their choice, but it shouldn't be the default.

511
00:39:09,880 --> 00:39:12,740
And that, I think, is the crux, absolutely.

512
00:39:12,740 --> 00:39:19,320
Yeah, let's think about where we are with any of the instant payment

513
00:39:19,320 --> 00:39:32,800
processes. You know, most people that we talk to who get scammed today, their

514
00:39:32,800 --> 00:39:38,160
method of sending money, if they're not actually getting cash out of the bank,

515
00:39:38,160 --> 00:39:42,920
is they're using one of those instant transfer products.

516
00:39:43,700 --> 00:39:44,940
Well, it's just like cash.

517
00:39:45,000 --> 00:39:45,960
Once it's gone, it's gone.

518
00:39:46,620 --> 00:39:50,180
Now, maybe there's a way to claw back

519
00:39:50,180 --> 00:39:52,520
depending upon the time of the day, the day of the week,

520
00:39:52,780 --> 00:39:55,960
other things, other circumstances.

521
00:39:56,280 --> 00:39:58,040
But for the most part, that money's gone.

522
00:39:59,040 --> 00:40:02,760
And you as the individual who finds out later that you've been scammed,

523
00:40:03,180 --> 00:40:05,940
you think the financial institution should make you whole.

524
00:40:06,420 --> 00:40:07,640
Maybe they should, maybe they shouldn't.

525
00:40:07,640 --> 00:40:08,560
That's a different debate.

526
00:40:09,600 --> 00:40:16,200
But what we also know from working with victims is when you talk to them, and they'll tell you,

527
00:40:16,580 --> 00:40:25,200
if I had only taken a minute or two more to think about it, if I hadn't made that decision in the heat of the moment, I wouldn't have done it.

528
00:40:26,380 --> 00:40:37,460
Well, what if we didn't make it quite so easy to send large sums of money, which is relative, $500 is large to somebody, and $500 is large to other people.

529
00:40:37,640 --> 00:40:54,500
But whatever the amount is, what if you had the option to say, don't let me make an instant transfer unless I talk to somebody, unless I get a second verification.

530
00:40:55,981 --> 00:40:58,860
There's got to be something to insert some friction.

531
00:40:59,620 --> 00:41:05,380
Milliseconds to complete a transaction is not going to make somebody go someplace else.

532
00:41:05,820 --> 00:41:06,380
Right.

533
00:41:06,380 --> 00:41:18,640
Well, and let me ask you this, James, just in fact, today we'll publish an episode with, pardon me, a gentleman, Jesse Poster, who's the co-founder of a company called Vora.

534
00:41:19,180 --> 00:41:25,060
And so this gets us into the Bitcoin realm and self-custodied money and all these things.

535
00:41:25,060 --> 00:41:46,360
And so what Jesse and team are building at Vora, and as is available in different degrees, are devices and so-called signers or, you know, cold wallets, all these terms float around for those not familiar, that are true self-custody.

536
00:41:46,360 --> 00:41:52,880
And with that, I am able to fully control, in this case, Bitcoin and the way it moves.

537
00:41:53,481 --> 00:41:56,860
So I have extreme ownership and extreme responsibility.

538
00:41:56,860 --> 00:41:57,981
If I screw up, it's on me.

539
00:41:58,040 --> 00:42:00,600
There's nobody to bail me out, claw me back.

540
00:42:00,700 --> 00:42:09,220
So I say all that to ask, in your circles, be it DC or otherwise, to what degree do these

541
00:42:09,220 --> 00:42:12,140
options appear in conversation?

542
00:42:12,140 --> 00:42:18,280
To what degree are they palatable to those making these regulatory or other choices?

543
00:42:18,740 --> 00:42:29,080
Or is it, as you say, that the financial institutions, you know, need to add additional scrutiny, additional verification?

544
00:42:29,500 --> 00:42:34,240
So maybe on that spectrum from it's mine, I control it, I own it, I'll do what I want,

545
00:42:34,320 --> 00:42:40,680
I'll make my own mistakes and be responsible to the institutions need to keep me from hitting my thumb with the hammer.

546
00:42:40,680 --> 00:42:41,600
Yeah

547
00:42:41,600 --> 00:42:44,380
Great question

548
00:42:44,380 --> 00:42:48,200
I think we're in the very very early days

549
00:42:48,200 --> 00:42:49,440
Of these discussions

550
00:42:49,440 --> 00:42:51,900
And it's because of the circumstances

551
00:42:51,900 --> 00:42:53,160
That we've now seen emerge

552
00:42:53,160 --> 00:42:55,380
Where people who are losing large sums of money

553
00:42:55,380 --> 00:42:56,800
Through fraud

554
00:42:56,800 --> 00:42:59,200
As well as you have people who

555
00:42:59,200 --> 00:43:01,060
Have large sums of money

556
00:43:01,060 --> 00:43:03,020
That's on a relative basis

557
00:43:03,020 --> 00:43:04,460
It's large to them

558
00:43:04,460 --> 00:43:07,800
That they want to have more control over

559
00:43:07,800 --> 00:43:10,000
Both are equally valid

560
00:43:10,680 --> 00:43:17,620
But you're trying to wedge both of them under a system that doesn't really recognize the value of either one.

561
00:43:19,300 --> 00:43:34,260
So I think we're early days in this discussion, but there is this growing recognition that this maximalist, friction-free road we've been going down is not serving everyone well.

562
00:43:34,260 --> 00:43:50,600
And we have to maybe come up with a process and a schema that recognizes the spectrum as opposed to shoving everybody in to everybody gets the maximum amount that we can technologically do today at scale.

563
00:43:51,660 --> 00:43:52,740
Right. Fair. Absolutely.

564
00:43:52,740 --> 00:43:56,540
Dial it, not necessarily dial it back, but let's dial it differently.

565
00:43:56,660 --> 00:43:58,100
Let's divide up things differently.

566
00:43:58,481 --> 00:44:05,420
Those discussions, I think, are beginning to happen because everybody's realized that it's very problematic.

567
00:44:06,120 --> 00:44:11,200
And as you might imagine, most large institutions are trying to avoid overregulation.

568
00:44:11,200 --> 00:44:31,981
So if they can avoid overregulation by changing process to give that kind of flexibility, let people decide how much friction they want, then I think that's probably where we'll see more action over the next year or so.

569
00:44:31,981 --> 00:44:38,020
But as you might imagine, these are not easy conversations, and they're not going to be fast conversations.

570
00:44:38,620 --> 00:44:40,680
And there's going to be a lot of bumps along the road.

571
00:44:41,200 --> 00:44:44,940
Well, speaking of the fox and the hen house.

572
00:44:45,500 --> 00:44:45,780
Yeah.

573
00:44:47,780 --> 00:44:48,600
The government.

574
00:44:48,800 --> 00:44:53,320
And so I don't know, and perhaps you do, the degree to which it is still an active conversation.

575
00:44:53,320 --> 00:45:06,560
But before Elon left the building, there was obviously a very lively debate about Doge and accessing Social Security Administration data, you know, combining that with other federal databases.

576
00:45:06,560 --> 00:45:09,960
As someone who tracks this,

577
00:45:11,100 --> 00:45:13,340
what's your read on the government itself,

578
00:45:13,400 --> 00:45:14,700
at least at the federal level,

579
00:45:14,820 --> 00:45:18,940
where are they, from helpful to hindrance,

580
00:45:19,180 --> 00:45:20,520
you know, again, on that spectrum?

581
00:45:21,800 --> 00:45:21,860
Yeah.

582
00:45:22,360 --> 00:45:24,460
And I know it's not one thing, so.

583
00:45:24,660 --> 00:45:27,620
Yeah, yeah, because there are a lot of moving parts.

584
00:45:28,140 --> 00:45:28,320
Yes.

585
00:45:28,320 --> 00:45:32,380
And we're involved in,

586
00:45:32,481 --> 00:45:33,880
there's a lot of litigation over this.

587
00:45:33,940 --> 00:45:36,460
We just had a decision late last week

588
00:45:36,460 --> 00:45:41,860
over the Social Security Administration sharing data with DHS.

589
00:45:42,460 --> 00:45:47,481
And you've had

590
00:45:47,820 --> 00:45:50,020
the litigation around IRS data being shared.

591
00:45:50,020 --> 00:45:56,320
And so there are both good and bad things

592
00:45:56,320 --> 00:46:00,200
within government regulations and laws about data sharing.

593
00:46:00,200 --> 00:46:04,020
And let's set aside the last year,

594
00:46:04,020 --> 00:46:08,020
because one of the things that frustrates

595
00:46:08,020 --> 00:46:11,520
victims of identity crimes,

596
00:46:11,640 --> 00:46:13,600
and those of us who work in these areas

597
00:46:13,600 --> 00:46:15,900
where you're talking about identity theft, fraud, and scams,

598
00:46:16,481 --> 00:46:19,960
is the lack of data sharing within the government.

599
00:46:21,140 --> 00:46:23,380
And there are a lot of barriers.

600
00:46:24,160 --> 00:46:26,600
There are statutory barriers that exist

601
00:46:26,600 --> 00:46:30,400
that thou shalt not share this with anybody else.

602
00:46:30,920 --> 00:46:33,120
In many cases, like the IRS data,

603
00:46:33,120 --> 00:46:37,580
Social Security data, there are very good and valid reasons. For some of the other agencies,

604
00:46:37,580 --> 00:46:46,940
less so. If you're sharing it with another agency, just something as basic as how many

605
00:46:46,940 --> 00:46:52,720
reports of identity theft there were, what were the particulars, not necessarily who or any of the

606
00:46:52,720 --> 00:46:57,240
personal information, but just the fact that it existed, that kind of data doesn't get shared.

607
00:46:57,240 --> 00:47:09,040
We don't have good data around the number of individuals who are actually impacted by fraud, scam, and identity theft.

608
00:47:09,340 --> 00:47:11,680
What data we do have is all self-reported.

609
00:47:12,920 --> 00:47:17,620
And I presume, Lee, James, excuse me, that that goes to incentives?

610
00:47:18,360 --> 00:47:19,240
Or is it incompetence?

611
00:47:19,960 --> 00:47:20,960
Is it complexity?

612
00:47:22,240 --> 00:47:23,580
What's your read on that?

613
00:47:23,580 --> 00:47:26,620
More than anything else,

614
00:47:26,620 --> 00:47:28,540
there is incentive

615
00:47:28,540 --> 00:47:31,080
and it is

616
00:47:31,080 --> 00:47:35,120
nothing more complicated

617
00:47:35,120 --> 00:47:36,440
than turf battles.

618
00:47:36,440 --> 00:47:41,160
Within government, there's always a level

619
00:47:41,160 --> 00:47:42,680
of turfiness,

620
00:47:42,680 --> 00:47:45,180
and that's a big part

621
00:47:45,180 --> 00:47:45,540
of it.

622
00:47:45,540 --> 00:47:49,320
In the business world

623
00:47:49,320 --> 00:47:51,280
and in all of our personal

624
00:47:51,280 --> 00:47:53,320
lives, we can't manage something

625
00:47:53,320 --> 00:47:58,040
we don't know about. I can't fix something I don't know about. And I can't manage risk I'm

626
00:47:58,040 --> 00:48:11,020
unaware of, particularly. Right. And we don't have good data about the real scope of, not just the

627
00:48:11,020 --> 00:48:17,960
volume and velocity of the occurrences, but what is the real impact on real individuals? We don't

628
00:48:17,960 --> 00:48:24,840
have good data about that. It doesn't have to be in a central repository, but it does need to be

629
00:48:24,840 --> 00:48:30,320
collected and shared. You can scatter it across as many agencies as you want, but it does need to

630
00:48:30,320 --> 00:48:37,520
be collected and shared so we can then make good and valid decisions about how to address whatever

631
00:48:37,520 --> 00:48:44,360
the underlying issue is. We don't have that. I think if I got this right, ITRC's data shows

632
00:48:44,360 --> 00:48:51,660
over 25,000 breaches over 20 years and 79 billion exposed records?

633
00:48:53,120 --> 00:48:53,880
Yeah, that's right.

634
00:48:54,620 --> 00:48:56,680
And so, you know, staggering.

635
00:48:57,020 --> 00:49:02,320
And I guess as we talk about multi-agencies, federal government, data brokers,

636
00:49:02,920 --> 00:49:06,500
it certainly occurs to me that at some point the question stops being,

637
00:49:06,960 --> 00:49:08,920
how do we secure these databases better,

638
00:49:09,040 --> 00:49:13,160
but should anyone be holding this much data in the first place?

639
00:49:13,160 --> 00:49:15,640
where are you on that question?

640
00:49:16,440 --> 00:49:17,500
We're a big

641
00:49:17,500 --> 00:49:19,380
fan of

642
00:49:19,380 --> 00:49:20,920
data minimization.

643
00:49:22,880 --> 00:49:23,920
Again, there are

644
00:49:23,920 --> 00:49:25,560
good and valid reasons why

645
00:49:25,560 --> 00:49:28,180
data should be collected and used

646
00:49:28,180 --> 00:49:29,820
but that's different from

647
00:49:29,820 --> 00:49:31,900
should it be stored. So when we talk about

648
00:49:31,900 --> 00:49:32,880
data minimization

649
00:49:32,880 --> 00:49:35,880
it's a multi-step

650
00:49:35,880 --> 00:49:37,860
process. First question is, do I need

651
00:49:37,860 --> 00:49:39,880
the data? If you don't need

652
00:49:39,880 --> 00:49:41,700
the data, then we'll separate

653
00:49:41,700 --> 00:49:46,840
need from want. I might want the data, but the marketing department always wants the data.

654
00:49:47,760 --> 00:49:53,780
But do they need the data? No. So if you don't need it, don't collect it. You know what that does?

655
00:49:53,840 --> 00:50:01,180
That reduces your risk profile because you cannot have data compromise that you do not have.

656
00:50:01,840 --> 00:50:07,580
So data breaches go down right there. Data breaches go down. Victims go down. Data breaches

657
00:50:07,580 --> 00:50:09,481
go down, cybersecurity attacks

658
00:50:09,481 --> 00:50:10,260
go down.

659
00:50:11,080 --> 00:50:13,660
Do you need the data? Yes or

660
00:50:13,660 --> 00:50:15,320
no? Okay, I need the data.

661
00:50:16,820 --> 00:50:17,120
Okay.

662
00:50:17,580 --> 00:50:18,820
Why do you need the data?

663
00:50:19,420 --> 00:50:20,800
And how long do you need the data?

664
00:50:21,580 --> 00:50:23,320
So once that purpose,

665
00:50:24,160 --> 00:50:25,680
that valid purpose, is

666
00:50:25,680 --> 00:50:26,260
fulfilled,

667
00:50:27,280 --> 00:50:29,120
are you statutorily or

668
00:50:29,120 --> 00:50:31,300
regulatorily required to keep it? If not,

669
00:50:31,380 --> 00:50:33,460
get rid of it. Don't store

670
00:50:33,460 --> 00:50:35,500
it. And this gets to

671
00:50:35,500 --> 00:50:37,380
that convenience, that friction part again.

672
00:50:37,580 --> 00:50:41,300
where they ask you, well, do you want to store the credit card?

673
00:50:42,060 --> 00:50:43,400
Don't ask that question.

674
00:50:43,620 --> 00:50:44,320
Just don't do it.

675
00:50:44,481 --> 00:50:47,920
If you don't need it, you've completed the transaction,

676
00:50:48,060 --> 00:50:50,900
you provided the receipt, you don't need it anymore, get rid of it.
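The need/purpose/retention test described above can be sketched as a small decision routine. This is a hypothetical illustration (the field names and rules are invented for the example), not ITRC guidance:

```python
from dataclasses import dataclass

@dataclass
class DataElement:
    name: str
    needed: bool            # needed, not merely wanted (e.g. by marketing)
    purpose_fulfilled: bool  # has the valid purpose been served?
    legally_required: bool   # statutory/regulatory retention duty

def minimization_decision(elem: DataElement) -> str:
    """Apply the need -> purpose -> retention test."""
    if not elem.needed:
        return "do not collect"   # you can't breach data you don't have
    if elem.purpose_fulfilled and not elem.legally_required:
        return "delete"           # purpose served, no duty to keep it
    return "retain encrypted"     # still needed: store it securely

# A completed sale: the card number's purpose is fulfilled, so drop it.
card = DataElement("credit_card_number", needed=True,
                   purpose_fulfilled=True, legally_required=False)
print(minimization_decision(card))  # -> delete
```

The same routine returns "retain encrypted" for something like a tax record that a statute obliges you to keep.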

677
00:50:52,160 --> 00:50:55,000
Having to fill in the credit card number again when they come back.

678
00:50:55,800 --> 00:50:57,360
My browser will do it quickly for me.

679
00:50:58,280 --> 00:50:58,440
Yeah.

680
00:50:59,180 --> 00:51:01,700
The transaction takes 15 seconds longer.

681
00:51:02,540 --> 00:51:04,540
If they want the sweater, they're going to buy the sweater.

682
00:51:04,760 --> 00:51:05,640
Don't worry about it.

683
00:51:05,640 --> 00:51:16,440
And then the third thing is, if you need the data, you have to keep the data, and you've got to store it in a secure fashion.

684
00:51:16,560 --> 00:51:17,420
It's got to be encrypted.

685
00:51:18,240 --> 00:51:21,040
And now increasingly, we've got to prepare for quantum.

686
00:51:21,560 --> 00:51:31,080
So it's got to be encrypted in a way that will be still safe and secure once quantum becomes mainstream.

687
00:51:31,080 --> 00:51:41,420
Of course, there are a lot of people who believe that we're having preemptive attacks to cache data that is encrypted now.

688
00:51:42,240 --> 00:51:46,820
They just hold it so when quantum becomes available, they can go in and decrypt it using quantum.

689
00:51:48,981 --> 00:51:57,660
But the base part of minimization is we over-collect data, and it creates risk, not reward,

690
00:51:57,660 --> 00:52:01,420
not financial value in the long term, for organizations.

691
00:52:02,920 --> 00:52:11,080
Changing that mindset is very difficult and is not going to happen absent regulation.

692
00:52:11,940 --> 00:52:14,740
We've been having these conversations for 20 years.

693
00:52:15,920 --> 00:52:21,280
The power, the computing power we have, the storage capacity we have today,

694
00:52:21,420 --> 00:52:24,400
has led to the massive amounts of data that's been stolen

695
00:52:24,400 --> 00:52:27,580
because we've collected massive amounts of data that we previously couldn't collect.

696
00:52:27,660 --> 00:52:35,501
Was it Edward Snowden during those revelations who called the NSA's approach "collect it all"?

697
00:52:36,180 --> 00:52:41,680
Not to conflate the NSA with your average data broker, but I think to some degree we can.

698
00:52:42,560 --> 00:52:43,640
Well, you can certainly.

699
00:52:43,840 --> 00:52:44,820
It is a mentality.

700
00:52:45,180 --> 00:52:48,580
And I wouldn't limit it to the NSA or to data brokers.

701
00:52:48,880 --> 00:52:49,900
It is any organization.

702
00:52:49,900 --> 00:52:50,220
My point precisely.

703
00:52:50,840 --> 00:52:50,940
Yeah.

704
00:52:52,940 --> 00:52:56,200
Retail is probably the absolute worst when it comes to that.

705
00:52:56,200 --> 00:53:00,280
because they're always profiling their customers.

706
00:53:00,440 --> 00:53:01,900
They're always profiling the purchase.

707
00:53:02,021 --> 00:53:04,940
They're always trying to figure out what it will take to get you to buy more.

708
00:53:06,380 --> 00:53:10,640
Obviously, the social media platforms, which are only just –

709
00:53:10,640 --> 00:53:12,600
they're giant advertising platforms.

710
00:53:12,600 --> 00:53:16,040
Always remember, you are the product.

711
00:53:16,660 --> 00:53:17,920
You are not the customer.

712
00:53:18,340 --> 00:53:30,210
You are the product. And so they're always wanting to collect more, so they can analyze more, so they can be more refined in it. So it's a global obsession.

713
00:53:31,210 --> 00:53:34,390
So we're not going to change that, certainly not overnight.

714
00:53:34,390 --> 00:53:43,670
And it's only going to change if we have a regulatory structure, whether that's self-regulation or government regulation,

715
00:53:43,670 --> 00:53:55,331
that gives people the incentive to collect less, use less, and store less because it reduces their risk.

716
00:53:56,591 --> 00:53:56,690
Absolutely.

717
00:53:56,690 --> 00:54:03,030
And in combination with that, you, I believe, advocate for credit freezes, the use of passkeys,

718
00:54:03,291 --> 00:54:07,190
which are promising, you know, I think still somewhat emerging technology.

719
00:54:07,791 --> 00:54:11,550
Both of these, to differing degrees, put control back in the individual's hands

720
00:54:11,550 --> 00:54:14,470
instead of trusting an institution to protect you.

721
00:54:14,811 --> 00:54:17,811
Is that the direction identity needs to go?

722
00:54:18,010 --> 00:54:20,571
Less custodianship, more individual control?

723
00:54:21,010 --> 00:54:27,111
And what's your over/under on the average individual's ability to do that?

724
00:54:29,771 --> 00:54:35,091
I do think we are moving to a time of more balance

725
00:54:35,091 --> 00:54:37,930
where people are beginning to realize,

726
00:54:38,091 --> 00:54:39,410
look, nobody's going to take care of me.

727
00:54:39,751 --> 00:54:41,291
I'm going to have to take care of myself.

728
00:54:41,550 --> 00:54:48,811
So that's why you're seeing the increase in passkey usage. You're seeing an increase in credit

729
00:54:48,811 --> 00:54:54,990
freezes. You're seeing an increase in people closing old accounts, clearing out

730
00:54:54,990 --> 00:55:06,251
old files and things which have value to an identity criminal but they don't need anymore.

731
00:55:06,251 --> 00:55:11,231
And they're asking organizations to get rid of it too. You don't need all this data on me.

732
00:55:11,231 --> 00:55:12,670
I don't want you to have this on me.

733
00:55:13,010 --> 00:55:15,450
And so the places where they allow you to delete it,

734
00:55:15,510 --> 00:55:17,351
you're seeing people actually do that.

735
00:55:17,591 --> 00:55:20,450
Not in big numbers, but that will grow over time.

736
00:55:21,010 --> 00:55:25,450
So we're seeing this balance emerge, which has historically been,

737
00:55:25,831 --> 00:55:27,071
no, that's somebody else's job.

738
00:55:27,831 --> 00:55:31,010
The company that has my data, that's their job to protect me,

739
00:55:31,091 --> 00:55:32,950
not my job to protect myself.

740
00:55:33,130 --> 00:55:34,190
That was never true.

741
00:55:34,650 --> 00:55:38,450
It was always, it had to be, a joint effort.

742
00:55:38,450 --> 00:55:43,130
And the visible part of the failure was always on the part of the organization.

743
00:55:43,311 --> 00:55:47,630
And the highly visible part is still organizational failure.

744
00:55:48,190 --> 00:55:58,351
But because we weren't taking as individuals the steps beforehand to make that data less useful, it exacerbated the impact.

745
00:55:58,811 --> 00:56:02,970
What we can do as individuals is make our data less useful.

746
00:56:02,970 --> 00:56:07,510
Hard to make it less available, but easy for us to take steps to make it less useful.

747
00:56:07,510 --> 00:56:09,851
And so long as the bad guys

748
00:56:09,851 --> 00:56:11,731
continue to be basically lazy

749
00:56:11,731 --> 00:56:13,530
if they can't do it at scale

750
00:56:13,530 --> 00:56:14,891
and they can't do it automatically

751
00:56:14,891 --> 00:56:17,311
if you throw up roadblocks to their usage

752
00:56:17,311 --> 00:56:18,550
they're going to go away

753
00:56:18,550 --> 00:56:20,831
they're going to move on to somebody else

754
00:56:20,831 --> 00:56:23,251
because you're making it

755
00:56:23,251 --> 00:56:24,470
you're making them work

756
00:56:24,470 --> 00:56:25,751
they don't like to work

757
00:56:25,751 --> 00:56:27,811
so they'll move on to somebody else

758
00:56:27,811 --> 00:56:28,530
who makes it easy

759
00:56:28,530 --> 00:56:31,291
so we're reaching that balance

760
00:56:31,291 --> 00:56:34,150
but I do think that continuum

761
00:56:34,150 --> 00:56:35,550
has to do just that

762
00:56:35,550 --> 00:56:38,630
continue. Businesses have to continue to improve.

763
00:56:39,710 --> 00:56:43,490
Individuals have to continue to take more responsibility for their own

764
00:56:43,490 --> 00:56:47,650
data protection. And ultimately, government has to provide

765
00:56:47,650 --> 00:56:50,891
the framework that incentivizes both to do that.

766
00:56:51,450 --> 00:56:55,490
And that incentive may be a hammer. It doesn't have to be

767
00:56:55,490 --> 00:56:59,510
a financial incentive. It doesn't have to be

768
00:56:59,510 --> 00:57:03,430
a you get out of jail free card. It can be there's a hammer.

769
00:57:03,430 --> 00:57:07,091
but it should be a level of

770
00:57:07,091 --> 00:57:08,751
if you're doing all the right things

771
00:57:08,751 --> 00:57:10,950
all the right way and you still get

772
00:57:10,950 --> 00:57:11,430
attacked,

773
00:57:11,430 --> 00:57:14,510
maybe you do get

774
00:57:14,510 --> 00:57:16,410
not a get out of jail free card

775
00:57:16,410 --> 00:57:17,670
but you get a

776
00:57:17,670 --> 00:57:20,650
pass on the

777
00:57:20,650 --> 00:57:22,751
X multiplier

778
00:57:22,751 --> 00:57:25,091
on punitive damages

779
00:57:25,091 --> 00:57:25,990
or you know

780
00:57:25,990 --> 00:57:27,731
you have to

781
00:57:27,731 --> 00:57:30,891
maybe you get out of some other forms of liability

782
00:57:30,891 --> 00:57:33,210
maybe the

783
00:57:33,210 --> 00:57:37,690
regulators won't make you sign a 20-year consent decree.

784
00:57:37,690 --> 00:57:38,010
Right.

785
00:57:38,490 --> 00:57:38,710
Yeah.

786
00:57:39,071 --> 00:57:46,811
But we have to have all three of those elements working together, individuals, organizations, and government.

787
00:57:46,950 --> 00:57:48,510
And that's what's missing today.

788
00:57:48,510 --> 00:57:57,771
And that's one of the things we brought home in this most recent report, and we're going to talk about it increasingly over time, is no one organization can do it.

789
00:57:58,271 --> 00:57:59,550
Consumers can't do it by themselves.

790
00:57:59,771 --> 00:58:01,050
Businesses can't do it by themselves.

791
00:58:01,271 --> 00:58:02,591
Government shouldn't do it by itself.

792
00:58:03,210 --> 00:58:08,950
So we've all got to work together, or this problem is not going to get better.

793
00:58:09,091 --> 00:58:10,611
It's only going to continue to get worse.

794
00:58:11,170 --> 00:58:21,550
Appreciating the naive nature of this question, I am still interested, James, to ask if you could redesign how identity works in the U.S. from scratch.

795
00:58:21,550 --> 00:58:23,731
If you had that magic wand, what would it look like?

796
00:58:23,731 --> 00:58:33,891
Well, you know, we just changed our mission statement and our vision to be a very simple one:

797
00:58:33,891 --> 00:58:43,071
a world where nobody can use my identity but me. And so I think we would need to design processes

798
00:58:43,071 --> 00:58:49,891
that start at birth, that give you full and complete control over your identity,

799
00:58:49,891 --> 00:58:55,030
and then with all the protections that would come and be required for anybody

800
00:58:55,030 --> 00:58:59,530
that you're sharing it with, but also all the responsibilities that you would incur.

801
00:59:00,530 --> 00:59:06,550
So from a practical standpoint, you know, a credential, you know,

802
00:59:06,871 --> 00:59:15,630
that follows you throughout your life that is completely static,

803
00:59:16,891 --> 00:59:18,550
not sure that's the way to go.

804
00:59:18,811 --> 00:59:19,351
Don't know.

805
00:59:19,891 --> 00:59:22,851
Because how would you secure that?

806
00:59:22,950 --> 00:59:25,430
You'd still have the same problem we have today.

807
00:59:25,930 --> 00:59:29,731
The Social Security number was designed to be a lifetime number.

808
00:59:30,630 --> 00:59:34,530
Now we've got driver's licenses that are designed to last your adult life.

809
00:59:34,530 --> 00:59:44,970
If we could get people over biometrics, you know, a secure biometric would undoubtedly be a part of that.

810
00:59:44,970 --> 01:00:06,030
But we've got to rethink the whole concept of how identity data is captured, used, stored, and then we have to reevaluate when and how we authenticate that identity.

811
01:00:06,030 --> 01:00:14,650
What we're doing today works okay, but obviously we see the flaws.

812
01:00:14,851 --> 01:00:18,150
And when the flaws occur, there's significant impact.

813
01:00:18,530 --> 01:00:24,291
Well, the good part about it is, as you well know, look, there are trillions of transactions a day that involve your identity.

814
01:00:24,831 --> 01:00:27,111
Not yours individually, but individuals' collectively.

815
01:00:27,851 --> 01:00:28,510
Trillions a day.

816
01:00:29,271 --> 01:00:33,791
The vast majority of those go through without a hitch,

817
01:00:33,791 --> 01:00:36,410
and they're secure and they're not at risk.

818
01:00:36,950 --> 01:00:37,710
But that doesn't matter.

819
01:00:38,111 --> 01:00:41,331
It's that slice that don't

820
01:00:41,331 --> 01:00:43,811
and that there are people in the world

821
01:00:43,811 --> 01:00:46,891
who are hell-bent on making sure

822
01:00:46,891 --> 01:00:48,710
that that number grows.

823
01:00:49,030 --> 01:00:52,371
The number of transactions that don't make it through,

824
01:00:52,630 --> 01:00:54,030
they want that number to grow.

825
01:00:54,871 --> 01:00:58,450
So we have to design an identity system

826
01:00:58,450 --> 01:01:02,170
that is built for that world,

827
01:01:02,170 --> 01:01:05,751
whereas the identity system we have today is not.

828
01:01:06,071 --> 01:01:09,731
And it strikes me as a valuable framing,

829
01:01:09,970 --> 01:01:12,291
which is that the Social Security number, initiated,

830
01:01:12,470 --> 01:01:15,950
what, in the '30s, as an accounting identifier,

831
01:01:15,950 --> 01:01:19,391
assumes a benevolent environment.

832
01:01:19,391 --> 01:01:20,630
And I think to your point,

833
01:01:20,690 --> 01:01:25,751
we have to assume a hostile environment

834
01:01:25,751 --> 01:01:27,811
and build accordingly.

835
01:01:28,611 --> 01:01:31,150
Well, and on that note, let's wrap it here, James.

836
01:01:31,150 --> 01:01:43,371
So for someone who's listening, watching, who's just gotten their third, you know, breach notice this year and hasn't done anything, what are two or three things that they should do today?

837
01:01:43,630 --> 01:01:46,010
Whether you've got a data breach notice or not, go freeze your credit.

838
01:01:48,150 --> 01:02:00,470
This is not 20 years ago when credit freezes were first introduced where you had to pay for them and you had to physically mail a letter to the credit bureau and wait for a letter to come back and all.

839
01:02:00,470 --> 01:02:04,550
Look, it's as easy as any online transaction.

840
01:02:04,970 --> 01:02:07,610
It has no impact on your credit score, which some people think it does.

841
01:02:07,650 --> 01:02:08,251
It does not.

842
01:02:09,071 --> 01:02:14,411
So freeze your credit because that's the only thing that is going to stop something bad from happening.

843
01:02:14,591 --> 01:02:19,010
Everything else is basically a trailing indicator.

844
01:02:20,791 --> 01:02:22,871
So freeze your credit.

845
01:02:23,930 --> 01:02:25,731
Adopt passkeys if you haven't already.

846
01:02:25,731 --> 01:02:30,091
I would assume most of the people listening to this are probably well down the path of passkeys,

847
01:02:30,470 --> 01:02:37,930
realizing the value because there's nothing stored at the account side and it's a token on your side.

848
01:02:38,010 --> 01:02:38,791
You never see it.

849
01:02:39,130 --> 01:02:43,210
You cannot self-compromise, which is how a lot of data breaches occur today,

850
01:02:43,751 --> 01:02:46,930
is people self-compromising credentials that criminals then use later.

851
01:02:47,690 --> 01:03:00,050
So passkeys are a vast improvement over what we have today from an account opening and account verification standpoint.
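The property being described here, where the service keeps no reusable secret, only something that can verify you, is public-key challenge-response, the mechanism behind passkeys (FIDO2/WebAuthn). The sketch below uses a deliberately tiny textbook RSA keypair just to show the shape of the exchange; real passkeys use hardware-backed keys and vetted cryptographic libraries, never numbers this small:

```python
import secrets

# Toy RSA keypair (textbook-sized, for illustration only; never use in practice).
# p = 61, q = 53  ->  n = 3233, phi = 3120, e = 17, d = 2753
N, E, D = 3233, 17, 2753

# Enrollment: the service stores ONLY the public key (N, E).
# The private exponent D never leaves the user's device/authenticator.

def sign(challenge: int, d: int = D, n: int = N) -> int:
    """Device side: sign the server's challenge with the private key."""
    return pow(challenge, d, n)

def verify(challenge: int, signature: int, e: int = E, n: int = N) -> bool:
    """Server side: check the signature using only the stored public key."""
    return pow(signature, e, n) == challenge

# Login: the server issues a fresh random challenge; the device signs it.
challenge = secrets.randbelow(N - 2) + 1
assert verify(challenge, sign(challenge))          # legitimate device passes
assert not verify(challenge + 1, sign(challenge))  # tampered challenge fails
```

Because a breach of the server exposes only the public key, there is no password or reusable secret for an attacker to steal and replay.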

852
01:03:00,470 --> 01:03:04,970
If people are still using passwords, obviously,

853
01:03:05,571 --> 01:03:07,670
make sure you're not reusing them

854
01:03:07,670 --> 01:03:10,130
because we still see a very high number of people

855
01:03:10,130 --> 01:03:13,210
using the same password on every account.

856
01:03:13,571 --> 01:03:15,430
There's that convenience factor again.

857
01:03:16,210 --> 01:03:17,391
So don't do that.

858
01:03:17,470 --> 01:03:18,811
Use a password manager.

859
01:03:19,610 --> 01:03:21,990
Also not perfect, but better than nothing.

860
01:03:22,251 --> 01:03:24,311
MFA, not perfect, better than nothing.
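The reuse check itself is mechanical, which is why password managers flag it automatically: grouping accounts by credential is enough. A minimal sketch, with account names and passwords invented for the example:

```python
from collections import defaultdict

def find_reused(passwords: dict[str, str]) -> list[list[str]]:
    """Group account names that share the same password."""
    by_password = defaultdict(list)
    for account, pw in passwords.items():
        by_password[pw].append(account)
    # Any group with more than one account is a reuse cluster.
    return [accts for accts in by_password.values() if len(accts) > 1]

vault = {"bank": "Tr0ub4dor", "email": "Tr0ub4dor", "forum": "xkcd-936"}
print(find_reused(vault))  # -> [['bank', 'email']]
```

Each cluster it prints is a set of accounts where one phished or breached password compromises all of them.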

861
01:03:25,311 --> 01:03:27,231
So, taken in its totality,

862
01:03:27,231 --> 01:03:28,891
if you take those steps,

863
01:03:28,891 --> 01:03:33,990
that's going to get you, from a technical perspective, about as far as you can go

864
01:03:33,990 --> 01:03:38,751
as an individual. Now, there are some other things you can and should consider doing.

865
01:03:38,751 --> 01:03:45,630
Do you like the privacy policies, the data practice policies, the data collection? Do you have the right

866
01:03:45,630 --> 01:03:49,970
to look at your data? Do you have a right to correct your data, delete your data, at all the

867
01:03:49,970 --> 01:03:56,130
organizations where you do business? And if you don't, find someplace else to give them your

868
01:03:56,130 --> 01:03:58,150
hard-earned money. You don't have to

869
01:03:58,150 --> 01:04:00,331
always go to the same organization

870
01:04:00,331 --> 01:04:01,571
if you don't like what they're doing

871
01:04:01,571 --> 01:04:04,150
if they breach your data and you don't think they're

872
01:04:04,150 --> 01:04:06,371
doing enough to protect it, go someplace else.

873
01:04:07,731 --> 01:04:08,351
But that's

874
01:04:08,351 --> 01:04:09,430
a personal choice.

875
01:04:10,110 --> 01:04:12,231
And that's the kind

876
01:04:12,231 --> 01:04:14,071
of thing we have to get people to think

877
01:04:14,071 --> 01:04:16,091
more of today, think critically

878
01:04:16,091 --> 01:04:16,690
about it.

879
01:04:17,591 --> 01:04:20,110
In cybersecurity, it's called the zero-trust

880
01:04:20,110 --> 01:04:20,450
model.

881
01:04:21,851 --> 01:04:24,331
I only want to do business

882
01:04:24,331 --> 01:04:25,811
with people that I trust

883
01:04:26,130 --> 01:04:28,291
who are going to protect my interests.

884
01:04:28,411 --> 01:04:30,190
And if they don't, I'm going to go someplace else.

885
01:04:31,430 --> 01:04:33,710
So we have to do that.

886
01:04:33,990 --> 01:04:36,311
We have to be critical and have a critical eye

887
01:04:36,311 --> 01:04:37,891
towards the things that are coming at us.

888
01:04:41,411 --> 01:04:43,731
There's a scam going around now

889
01:04:43,731 --> 01:04:47,210
with the Social Security annual benefit note.

890
01:04:48,550 --> 01:04:51,470
Some of your folks watching or listening here

891
01:04:51,470 --> 01:04:52,970
may have received one of those.

892
01:04:53,110 --> 01:04:54,771
They are letter perfect.

893
01:04:54,771 --> 01:04:56,391
Letter perfect.

894
01:04:56,391 --> 01:04:59,351
but they're fake

895
01:04:59,351 --> 01:05:01,610
and when you click on it you go to

896
01:05:01,610 --> 01:05:03,110
an info stealer that's going to

897
01:05:03,110 --> 01:05:05,470
when you're trying to verify your

898
01:05:05,470 --> 01:05:07,650
Social Security Administration login

899
01:05:07,650 --> 01:05:09,450
information, just collect it.

900
01:05:09,450 --> 01:05:11,091
It's a fake website.

901
01:05:11,091 --> 01:05:13,591
So, you know, we

902
01:05:13,591 --> 01:05:15,251
have to have this critical eye

903
01:05:15,251 --> 01:05:17,530
to look for those telltale signs which

904
01:05:17,530 --> 01:05:19,391
is harder to do with

905
01:05:19,391 --> 01:05:21,571
AI making things so realistic

906
01:05:21,571 --> 01:05:23,630
so we have to be

907
01:05:23,630 --> 01:05:29,630
that zero trust. We have to have a critical eye toward what we're receiving. And don't be afraid

908
01:05:29,630 --> 01:05:35,311
to ask questions. Don't be afraid to say no. Maybe your mother told you it's rude to ask a question.

909
01:05:35,450 --> 01:05:41,150
It's rude to say no. It's neither. And it's something we have to get far more comfortable

910
01:05:41,150 --> 01:05:46,450
with than we are historically. Most Americans, we're friendly people. We're gregarious. We're

911
01:05:46,450 --> 01:05:53,510
outgoing. We like interacting with people. And the bad guys use that against us. We want to be

912
01:05:53,510 --> 01:05:57,271
helpful. We want to be helpful to other people. We certainly want to be helpful to our

913
01:05:57,271 --> 01:06:01,470
friends and family. So if we think we're dealing with somebody that we know and it turns out we're

914
01:06:01,470 --> 01:06:05,510
not, those kinds of scams

915
01:06:05,510 --> 01:06:09,430
occur every day. So we have to learn the tools of

916
01:06:09,430 --> 01:06:13,470
asking questions. Somebody asks about money and they've never asked you about

917
01:06:13,470 --> 01:06:17,510
money in the 25 years you've known them or the 25 minutes you've

918
01:06:17,510 --> 01:06:21,550
known them. That's a red flag. Follow up

919
01:06:21,550 --> 01:06:29,091
on that. Don't just assume that it's okay. So we've got to kind of change a lot of the ways we

920
01:06:29,091 --> 01:06:34,690
think and do business. And we can sit here and say, gosh, I wish we didn't have to do that. But

921
01:06:34,690 --> 01:06:41,050
unfortunately, my friends, we do. So those are the kind of things, the technical things,

922
01:06:41,050 --> 01:06:44,791
and then the sort of lifestyle things that will make us more secure.

923
01:06:46,030 --> 01:06:49,110
Great, great advice, James. Thank you for that. And so paired with that,

924
01:06:49,110 --> 01:07:00,411
what should a small business owner who relies on third-party software for everything they do be doing differently after reading ITRC's 2025 report?

925
01:07:01,311 --> 01:07:05,231
And it's so hard to tell the difference between a small business and an individual anymore.

926
01:07:05,470 --> 01:07:10,411
In a lot of cases, a lot of gig workers, a lot of single-entity LLCs.

927
01:07:10,970 --> 01:07:14,591
Hopefully, they've got all that software.

928
01:07:15,030 --> 01:07:17,930
It's SaaS, software as a service, all that.

929
01:07:17,930 --> 01:07:20,311
They've got all the automatic updates set.

930
01:07:20,571 --> 01:07:23,710
That is the bare minimum you need to do.

931
01:07:24,411 --> 01:07:28,050
Your antivirus and everything is kind of built into your software now.

932
01:07:28,150 --> 01:07:30,891
It's built into your operating system.

933
01:07:30,891 --> 01:07:35,710
You can always do belt and suspenders and go to a higher level of antivirus.

934
01:07:36,490 --> 01:07:39,150
That's a resource question for most people.

935
01:07:40,291 --> 01:07:46,091
But even before you do that, if you're looking and you want to spend more of your hard-earned resources,

936
01:07:46,091 --> 01:07:49,610
spend it on a managed security service provider, an MSSP.

937
01:07:50,791 --> 01:07:55,490
So have that third-party person who's going to monitor your equipment,

938
01:07:55,650 --> 01:07:59,130
who's going to monitor your traffic in and out of your network,

939
01:07:59,650 --> 01:08:05,331
look for those attacks, block those attacks, keep those tools up and running.

940
01:08:05,990 --> 01:08:11,831
And that is sort of the first step beyond just you doing everything yourself

941
01:08:11,831 --> 01:08:13,510
that you should really contemplate doing.

942
01:08:13,510 --> 01:08:21,791
And that's very important because small businesses are big targets.

943
01:08:22,570 --> 01:08:29,331
People may think that just because I'm small, I'm a one-person shop, I'm this, I'm that, that I don't have anything that anybody wants.

944
01:08:29,411 --> 01:08:30,070
That's not true.

945
01:08:30,431 --> 01:08:33,230
The bad guys will find a way to make money off of anything and anybody.

946
01:08:34,110 --> 01:08:39,511
So you are a target, and you need to do things to defend yourself.

947
01:08:39,511 --> 01:08:45,610
And so as soon as you get a little extra cash and you can, when your business is growing, get that managed service provider.

948
01:08:46,090 --> 01:08:52,431
And then ultimately, you know, the goal would be to have somebody maybe on your own staff who's looking after that as your business grows.

949
01:08:52,550 --> 01:08:54,590
But that is very important.

950
01:08:55,210 --> 01:09:07,431
If you have employees, and not all do, train yourself and them to look for all of those indicators of fraud, because fraud now leads to cyberattacks.

951
01:09:07,431 --> 01:09:14,351
so they're going to try to trick you into maybe paying an invoice you don't owe.

952
01:09:14,951 --> 01:09:17,170
I get two or three of those a day, it seems.

953
01:09:17,170 --> 01:09:22,431
Yeah, so things like that, learn those indicators,

954
01:09:23,331 --> 01:09:30,831
and if you're in an industry that has some sort of association in your state

955
01:09:30,831 --> 01:09:34,070
or in your community, those groups should create resources, too,

956
01:09:34,070 --> 01:09:36,190
for what specifically is being targeted

957
01:09:36,190 --> 01:09:39,610
in your particular business sector.

958
01:09:40,411 --> 01:09:42,291
So take advantage of those things.

959
01:09:43,471 --> 01:09:45,110
But train your staff to look for that too

960
01:09:45,110 --> 01:09:49,590
because the bad guys know the weakest link

961
01:09:49,590 --> 01:09:51,590
in the best security is human beings.

962
01:09:52,270 --> 01:09:53,710
Maybe well-intentioned,

963
01:09:53,990 --> 01:09:55,371
but somebody's going to click on a link.

964
01:09:55,650 --> 01:09:56,891
They're going to answer a phone call

965
01:09:56,891 --> 01:09:58,471
and provide information they shouldn't.

966
01:09:58,871 --> 01:10:00,650
They're going to respond to a text.

967
01:10:00,650 --> 01:10:05,951
It is so hard to make sure that doesn't happen.

968
01:10:06,550 --> 01:10:08,411
But the way you do that is through good training.

969
01:10:08,570 --> 01:10:11,251
And it's not just training the day they show up for work.

970
01:10:11,530 --> 01:10:14,170
It's periodic training because the techniques change.

971
01:10:14,911 --> 01:10:16,411
The technology changes.

972
01:10:16,971 --> 01:10:19,030
So you need to do that on a periodic basis.

973
01:10:20,130 --> 01:10:23,270
Those things are sort of the baseline for small businesses.

974
01:10:23,650 --> 01:10:23,831
Excellent.

975
01:10:24,251 --> 01:10:25,971
Well, James, thank you so much for your time today.

976
01:10:26,011 --> 01:10:26,610
It's been great.

977
01:10:26,710 --> 01:10:27,471
I appreciate it.

978
01:10:27,471 --> 01:10:36,110
I wish you all the success in your Sisyphean, but incredibly important tasks.

979
01:10:36,110 --> 01:10:40,411
And I say that with full sincerity in terms of policy, particularly.

980
01:10:40,670 --> 01:10:44,570
I know you guys are doing great work within the spheres you can control.

981
01:10:44,931 --> 01:10:49,051
But I know on the policy and government front, it is a tough slog.

982
01:10:49,310 --> 01:10:51,351
So thanks for the work you do and thanks for your time today.

983
01:10:52,030 --> 01:10:52,751
Well, thank you very much.

984
01:10:52,810 --> 01:10:53,351
Appreciate it.

985
01:10:57,471 --> 01:10:58,471
Thank you.
