Anna Delaney: Hello and welcome to the ISMG Editors' Panel. I'm Anna Delaney, and this week, we'll discuss the recent Snowflake breach and its impact on yet another victim, the ongoing challenges in combating online fraud, and takeaways from the ISMG Cybersecurity Summit held in Chicago last week. The gang today includes Tom Field, senior vice president of editorial; Suparna Goswami, associate editor at ISMG Asia; and Mathew Schwartz, executive editor of DataBreachToday and Europe. Excellent to see you all.

Tom Field: Thanks for having us over to the cookout.

Suparna Goswami: Welcome back!

Anna Delaney: Suparna, where are you?

Tom Field: Couldn't find you in there.

Suparna Goswami: Yes, it's a very unusual background - the streets of Afghanistan. Afghanistan qualified for the semifinals of the ongoing T20 Cricket World Cup, and this is the first time they have qualified for a semifinal spot. Before this, they hadn't won a single match in a T20 World Cup, so it's amazing that they qualified after defeating very strong teams. I particularly love this picture because, one, cricket happens to be the only source of entertainment in that country; and second, it reminded me of the streets of Argentina when they won the World Cup in 2022. It just goes to show how similar human emotions are across the globe and how sport tends to unite, you know? So I thought that's a lovely picture.

Tom Field: Suparna, bring us up to speed as we sit here today. Who is left?

Suparna Goswami: Sorry?

Tom Field: Who is left in the playoffs as we sit here today?

Suparna Goswami: The semifinals are going on. Tomorrow is Afghanistan versus South Africa, and we also have India versus England. So let's see.

Tom Field: I wonder who you're following.

Anna Delaney: You always share some of the most powerful images, Suparna. You make mine feel very superficial in comparison. Great shots. Tom, I think we're on to a sunset.

Tom Field: Yeah, this is me looking out the airplane window last week heading into Chicago.
Just a beautiful sunset over Lake Michigan.

Anna Delaney: Yeah, for sure. Mathew, another arty landscape.

Mathew Schwartz: It seems appropriate. This is an arty-looking museum. This is the V&A - not in London, but in Dundee, on the River Tay. There's no audio soundtrack, but if there were, you could have heard the shouting and then possibly the cries of frustration from legions of Scottish football fans after they got kicked out of the Euro 2024 competition this past weekend. Not far down the river from here was a viewing area, and it was very loud, but it was a lovely outdoor area where people could gather and watch the game on a big screen. So that was happening here recently as well.

Anna Delaney: Yes, I hear it was a very painful loss.

Mathew Schwartz: Yes.

Anna Delaney: We may talk about that. Great shot. Well, I'm showing a view of Hammersmith Bridge in London, taken on the balmy night of the summer solstice last week.

Tom Field: I understand you've got the hottest day of the year as we're speaking today.

Anna Delaney: Today, yeah - 30 Celsius. We're all suffering. We love to talk about the weather here.

Mathew Schwartz: We do, yeah. It's like winter up here ... just.

Anna Delaney: Really? Well, Tom, on last week's Editors' Panel, you told us that you were looking forward to the ISMG Cybersecurity Summit in Chicago. So how did it go, and what are the takeaways we need to know?

Tom Field: It did not disappoint. Yeah, it was our Midwest Chicago event; we've been doing Chicago events for over a decade now. Good attendance, good speakers, excellent engagement throughout the day. In terms of the sessions, we had the exercises, the roundtables I hosted and the video studio that our friend Michael was hosting throughout the day. No surprise - AI remains the hot topic, in a couple of different ways. We had panels on how AI is being used in cybersecurity defenses, and we talked about the threat landscape.
If you divide it into the threat landscape and defenses: on the threat landscape, a lot more talk about deepfakes, particularly among financial institutions, which are concerned about their use in account takeover and even in opening new accounts - impersonations - the same things that lead to a lot of fraud. And then you hear consistently about the relentless phishing attempts - not just the sophistication of the emails, or whatever is being used to socially engineer the prospective victims, but the relentlessness, the scale of it. Those are things that people are talking about. In terms of defenses, a lot more talk about using generative AI to enhance the work of SOC analysts - to bring information together and help separate some of the false negatives and positives. More on malware analysis, which has been called the killer app of gen AI by some CISOs. And then, interestingly, more CISOs are now talking less about what they're doing internally with gen AI and instead asking: What are my vendors doing? They're asking a lot more about how their key partners are using gen AI and to what benefit. In addition to the AI topics, we certainly discussed supply chain security - software and business supply chain - as well as the staffing and skills crisis and what organizations are doing to try to make cybersecurity positions more attractive, increasingly, to whoever is curious. They're not looking for specific certifications or experience so much as curious people who want to come in and learn, and whom they can work with. In addition to that, we had our solutions room, which you've run, where we bring in a deepfake incident and do a tabletop exercise with the entire room, aided by our own CyberEdBoard members as well as, in this case, the Secret Service and Mandiant, part of Google Cloud. Terrific event. People love the exercise so much, and we bring people up on stage to talk about it. Clearly this deepfake topic resonates. Just to summarize the exercise: on a weekend, the CFO hears from the CEO via a video message saying there's an imperative to wire $5 million immediately for a business transaction. Don't tell anybody else about this.
We don't want word to get out. Do it now. And the CFO does it. It's fraudulent. What happens? So the topic certainly resonates, and a lot of our attendees are taking these exercises, bringing them back to their organizations, following up and using them to seed discussions with their own leadership and even with the board. I have a lot of fun with this, and I'm glad that it's getting such traction. In summary, that's what we did in Chicago last week.

Anna Delaney: Excellent. Did anything surprising come up in the deepfake scenario session?

Tom Field: No, I wouldn't say anything surprising came up there. But I also moderated two different roundtables. One was on continuous exposure management - just trying to get a better handle on visibility into all the devices and networks and the telemetry that's coming from them. The other - not more interesting, but equally interesting and engaging - was about the evolving threat landscape. We had a gentleman there from Mandiant - again, part of Google Cloud - sharing highlights of their latest threat report, looking at the top threat actors and actions as well as the reduction in dwell time. What stuck out to me, talking to the CISOs in the room, was the one who felt he had identity pretty well nailed down through the ubiquitous use of YubiKeys, and he wasn't terribly concerned about ransomware - he felt he could take care of that. He wasn't terribly concerned about software supply chain; he felt he had a good handle on that. But what he said is that the AI guy is the one that's going to bring him down. That seeded a lot of thought for me and gave me something to bring back to our team: this is something we want to explore more with our audience, to find out what their concerns are and how some of those concerns might be addressed. But "the AI guy is the one who's going to bring me down" - that's a line that resonates.

Mathew Schwartz: Yeah, really topical stuff, Tom. I mean, with the elections happening - I think there's one in the U.S. later this year, but there's definitely one happening here in Britain on July 4, of all possible dates in the year.
And there's been a lot of concern about deepfakes being used there. I think the malicious use cases are really bad for financial stuff. Thankfully, there's been a lot of research showing that for elections, people tend to spot when stuff is junk, so that part at least is good, because there was a lot of concern about that 6-12 months ago which I don't think is coming to fruition. So that's a good note on the malicious AI front.

Tom Field: Yeah, I'm not so sure we're as good at spotting junk in the U.S. as perhaps in the U.K. But that aside, I'd be remiss not to mention that we did have an Illinois congressman, Bill Foster, who sat with me for our keynote discussion, and yes, we had to talk about the upcoming election, because, look, we're going to be electing a president in November. It could be the president we have now; it could be the president we had before. What that says is that cybersecurity priorities are subject to change, and if they do, what are the ramifications of that? I wish somebody had an answer, but nobody does. They're all big, and this one's big.

Anna Delaney: Well, an excellent event, rich in education. Thanks for sharing, Tom. Suparna, you have written that a major barrier to combating online fraud is the difficulty of reporting scams on Facebook due to inadequate reporting options and responses. Just how bad is the problem, Suparna?

Suparna Goswami: Oh yes, it was a fascinating topic, and that's why I happened to write a blog about how Facebook does virtually nothing to stop scams on its platform. Look at the kinds of crimes committed on Facebook. You can get the best of phones and cars using stolen identities. You can open bank accounts and credit union accounts. People openly ask whether you want to make quick money and, if you have accounts at certain banks, whether you can carry out these fraudulent activities. In fact, there is a group on Facebook called 'FRAUD UNIVERSITY', which has more than 7,000 members, and they teach you the various ways to commit fraud. And this is just one of several groups - it is not the only one - and what amazes me is how openly they write about it.
They talk about it, but if you try to report any of the scam content to Meta, which is Facebook's parent company, you won't find a scam or fraud reporting option. I'll give you an example that one of my contacts posted on LinkedIn. He tried to report a scam on Facebook - that is, he tried to report it to Meta. First, like I mentioned, he did not find any reporting option; there was no scam or fraud option given. So he selected the next best, obvious choice, which was unauthorized sales. After reporting the group, he waited a week or so for a response; there was no immediate response. Now, you would think that Facebook would delete a group called FRAUD UNIVERSITY after it was reported, but no - Facebook's investigators did not think the group violated any of its guidelines. And even worse, after he reported the group to Meta, the scammers got together and reported him for harassment, and Meta subsequently took down his profile. So just imagine: they did not take down the FRAUD UNIVERSITY group, but they took down his profile, because apparently he was harassing the group. I reported on this because scams have a huge impact on banks as well. It is a known fact that a large percentage of scams originate on social media. A report by the Federal Trade Commission says that one in four people who reported losing money to fraud since 2021 said the scam started on social media or on WhatsApp. A UK Finance analysis of nearly 7,000 authorized push payment fraud cases also found that 70% of the scams originated on an online platform like Facebook. As for Facebook's reaction: every now and then we will hear that they are trying to work out ways to fight the tools used by scammers. But even today, most of us have a difficult time getting the internet platforms to actually take down the scam sites. The problem is that Meta and other social media platforms have very little accountability or regulatory oversight. I'll give you an example. Section 230 of the Communications Decency Act in the U.S.
explicitly protects social media platforms from liability for the content that users post on their sites. That act dates back to 1996, and unfortunately social media is now used for all kinds of scams - like I said, pig butchering, money muling, crypto scams, disinformation, hate speech, everything. Last month, two House representatives introduced a bill to sunset Section 230 in an effort to force Congress to reform the law. But again, the big tech companies rallied together to oppose it, and nothing has been done. Even in other countries, there has not been much of an effort. In the U.K., a few months back, they were rallying to get both the telcos and the tech companies together, but again, there is no law as such. Australia will have a law in a few months where both telcos and tech companies - not only financial institutions - will be liable for scams. So let's see what happens there.

Anna Delaney: What do you think is needed to ensure better accountability and regulation on these platforms? Is it possible?

Suparna Goswami: I think monetary penalties are the only effective incentive. You know, I keep talking to the banks and the financial institutions. They do a lot to tackle fraud. Of course, they are not perfect, but they are doing a lot from their end, yet their efforts alone will not solve the problem. Tech companies have to be made equally liable - or, if not liable, at least equally involved - in tackling the scams. Without their involvement, the public will remain at high risk of online scams. As I wrote, it's not the big tech companies who need protection; it's the consumers who need a bit more protection. Like I said, some countries are acting, but a lot still needs to be done. Even in the United Kingdom, they signed an Online Fraud Charter in November 2023, but after that, very little action has been taken.
Because there's huge money involved, you know, the big tech companies don't really have that incentive, and it's not a top priority for them. Obviously, it's a bad thing - everybody agrees - but it's just not a top priority for them because there is no financial liability involved. So why would they focus on this?

Mathew Schwartz: I think you are seeing some better defenses in the form of banks here in the U.K. They have put a lot more checks in place when you go to pay for something, trying to get you to think twice: Why am I doing this? Has someone told me it has to happen right away? That is obviously a red flag.

Suparna Goswami: Yes, of course. Yeah.

Mathew Schwartz: We had a lot of trouble with fraud and this sort of thing back in the days of print newspapers. You had people listing scams, running scams, stuff like that. There's no way newspapers could police that. And I think that online criminals have taken those skills, or that approach - it's almost the equivalent of malicious classified advertising - and it's very difficult, if not impossible, I would say, for social media firms to block this. So personally, I don't know if holding them, or trying to hold them, liable is the right step.

Suparna Goswami: See, it would be difficult, no doubt about it. And like you said, the U.K. happens to be one of the few countries that have put a lot of liability on the banks. But like I said, it's not the banks alone who should be responsible. You gave a good example with newspapers, but for the tech companies, like I said, if somebody is reporting a scam on a group called FRAUD UNIVERSITY, at least you should have the reporting option of flagging it as fraud or a scam.

Mathew Schwartz: Definitely.

Suparna Goswami: But they have just been blind to it, and it's not that they were only told about it today. It has been there for years - at least post-Covid - but they have taken little action on that. So why have they not taken action? Maybe monetary liability is the way forward. Let's see.

Tom Field: You make a good point, Suparna. There needs to be a way to report this.

Anna Delaney: Excellent points all around. Well, time will tell.
I'm not sure how much luck we'll have with Meta, but it's a serious issue. Mat, on to you. Another victim of the Snowflake breach has surfaced, with Neiman Marcus reporting that nearly 65,000 customers' personal information was exposed as part of a larger campaign affecting about 165 Snowflake customer accounts. Just tell us about this latest development.

Mathew Schwartz: Yes. Those of you into luxury retail might know the name Neiman Marcus. It's a chain of about three dozen Neiman Marcus physical stores, also a couple of Bergdorf Goodman stores - another high-end luxury name - and then a handful of things called Last Call, which I learned this week are their outlet stores. These physical shops, as well as the online presence of what is now known as the Neiman Marcus Group, as you say, Anna, are notifying nearly 65,000 shoppers that their personal information has gone walkabout and may be getting used by criminals. In terms of what was breached: customer names; contact details, like email address; their birth date; and gift card numbers for any gift cards they may have purchased - although apparently the PIN codes weren't stolen, so shoppers can rest assured that their gift cards are still intact. All this aside, this is yet another breach to come to light via an attack on Snowflake. As you mentioned - what is Snowflake? It's so hard to keep track of all of these upstart tech firms with these random names, right? Well, this one has actually been around, as I again learned in the last couple of weeks, for a long time. It started out as a data warehousing platform provider, and it's used by lots of companies so they can throw their data in there and do analytics with it. Other Snowflake users include Ticketmaster, owned by Live Nation Entertainment; Santander Bank; automotive parts supplier Advance Auto Parts; and the Los Angeles Unified School District. What do these organizations have in common? They're not just Snowflake customers; they are also breach victims - thanks to their Snowflake accounts having been hacked. How did this happen? Now, this is an interesting wrinkle. Snowflake gave customers the ability to enable multi-factor authentication - but not in a great way.
It had a single enterprise-managed instance of Duo. Not that Duo gets bad reviews, but instead of offering a plethora of ways for customers to enable MFA, they apparently had to opt into this one way. Because of that, or because of poor uptake or whatever, lots of Snowflake customers didn't have MFA enabled. The current count, as we understand it, is about 165 customers who didn't have MFA enabled and got hit by this Snowflake-customer-targeting campaign. The attackers apparently wrote themselves a little tool that could use credentials they had obtained in other breaches. For example, if your username is user@domain.com and your password is 'puppy', and you use that on Amazon, and they grab that from a breach or wherever - eBay, Amazon, whoever gets breached - they will try it on a range of other sites. They tried it on Snowflake, and they had an automated tool for logging in with these credentials via Snowflake's web user interface - or there's also a command-line tool they were apparently hitting. And they've come up with a lot of victims. These victims have started to come to light because a Breach Forums user - Breach Forums is a data leak market, a little like FRAUD UNIVERSITY; it does what it says on the tin - started to list a lot of stolen credentials that had a tag with the name of the utility the attackers were using. So researchers have traced that back to an increasing number of publicly known victims of this campaign, Neiman Marcus being the most recent. But there are lots of other pools of breach victim data that they're advertising as well. For example, with the Los Angeles Unified School District, a lot of student and employee data, past and present, apparently got stolen and is being listed for sale. And after it was listed for sale and nobody paid for it, the attacker apparently just dumped the data, as ransomware groups are wont to do, because it gets headlines. So we're seeing that cycle, as we've seen it before, play out again with these Snowflake victims. I presume we're going to see a lot more come to light. The person who is part of the group that's leaking data said that some of the victims have already paid a ransom.
You could expect them to try to cash in on the ones that haven't paid a ransom by making a big noise about the fact that this victim was breached and then probably leaking the data for free again, because it gets headlines and makes the group look bigger and better than it probably really is.

Tom Field: Well, Mat, not that anyone's counting, but this is the fourth time Neiman Marcus has made it into the headlines for a significant breach. I'm thinking back to 2013 - maybe 10 years ago - then 2015 and 2020.

Mathew Schwartz: That's correct. Yeah, their first big breach happened around the time of the Target breach.

Tom Field: Right after.

Mathew Schwartz: And Neiman Marcus, following so quickly after Target, helped drive things like the adoption of the Payment Card Industry Data Security Standard that we now rely on to secure payment card data, because that data had gone missing way back when. And then Neiman Marcus, I think, just settled - it was either that breach or a subsequent breach - with a bunch of states in the last few years. So these things tick on very slowly. That was all payment card data that got siphoned off by malware installed on point-of-sale systems. Thankfully, in this case, only the last four digits, I think, of payment cards allegedly got stolen. So they seem to have gotten better with their handling of payment card data, but retailers are still among the targets.

Anna Delaney: I'm sensing a bit of sympathy in your voice when it comes to these victims who didn't implement MFA, just because of the difficulty with the Duo account. Is that right?

Mathew Schwartz: Yeah, definitely. MFA everywhere is a great slogan, especially with CISA, the U.S. cybersecurity agency, recommending it. I've been putting this question to a lot of CISOs in recent months, and also to vendors who offer MFA or who tie into MFA, and what I'm hearing is: yes, enable it everywhere you can. Legacy systems can be a challenge. Cloud-based environments can be a challenge. Sometimes you have to pay more for your cloud environment to give you all the MFA options that you might want.
So this should be much easier than it is. And just as the Neiman Marcus breach more than a decade ago helped usher in a new era of payment card data security, hopefully these breaches are now focusing attention on why MFA wasn't enabled, what we can do, and what vendors should be doing to make it easier. I hope we see a lot more pressure on organizations like Snowflake. After the breach, they've now detailed a bunch of new ways in which you can access MFA, including via open security standards, and they are pledging to provide a way to make it a default for an organization's users. So by default, everyone will have to do MFA, unless maybe they opt out - but that's then on them. That's what we should be seeing. Unfortunately, it's taken a breach to make Snowflake rethink it. Hopefully, this will be a lesson now. [A sample sketch for auditing Snowflake MFA enrollment appears at the end of this transcript.]

Tom Field: And invoking the spirit of our friend Jeremy Grant - not all MFA is created equal.

Anna Delaney: Positive progress, I hope. Thank you, Mat! That was great. And finally, just for fun: If you could interview a sentient robot, what would be the first question you'd ask it? Suparna?

Suparna Goswami: From a cybersecurity point of view, I thought, why not ask it: How do you feel when you see passwords like '1234' or the word 'password' being used as a password?

Tom Field: That would be a big conversation with Mat.

Anna Delaney: They'd have some good humor to share, I'm sure. Tom, go for it.

Tom Field: The shortest question in the English language and the hardest one to answer. Let's put it to the robot: Why?

Anna Delaney: I thought it was going to be 'to be or not to be'. But 'why' is a good one. Mat?

Mathew Schwartz: Well, we've been awash in all things AI and LLMs since the end of 2022, and if that's taught me one thing, it would be to ask the robot: What data were you trained on?

Anna Delaney: Actually, that's a great one. I'd love to know what they make of humans and our relationship with technology - and I'd like them to describe it in one word. Hopefully, it's a polite one.

Tom Field: The robot rolls its eyes and says, "Carbon."

Anna Delaney: Excellent! Educational as always. Thank you so much, everybody.

Suparna Goswami: Thank you.

Mathew Schwartz: Thanks, Anna!

Anna Delaney: Thanks so much for watching. Until next time.
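
Appendix: The credential-stuffing campaign Mathew describes succeeded against Snowflake accounts that allowed password-only logins. For Snowflake administrators who want to gauge their own exposure, below is a minimal sketch - not an official ISMG or Snowflake tool - that flags users who can log in with a password but show no Duo MFA enrollment. It assumes the snowflake-connector-python package is installed, that SHOW USERS output still includes the has_password, ext_authn_duo and disabled columns (true at the time of this episode, but verify against current Snowflake documentation), and that the environment variable names are illustrative.

# Minimal sketch: flag Snowflake users who can log in with a password
# but have no Duo MFA enrollment - the population exposed to the
# credential-stuffing campaign discussed above.
import os

import snowflake.connector


def find_users_without_mfa() -> list[str]:
    # Assumed env var names - supply credentials however your shop manages secrets.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        role="SECURITYADMIN",  # needs privileges sufficient to run SHOW USERS
    )
    try:
        cur = conn.cursor(snowflake.connector.DictCursor)
        cur.execute("SHOW USERS")
        flagged = []
        for row in cur.fetchall():
            # SHOW commands return flags as strings, e.g. "true"/"false".
            has_password = str(row.get("has_password", "")).lower() == "true"
            duo_enrolled = str(row.get("ext_authn_duo", "")).lower() == "true"
            disabled = str(row.get("disabled", "")).lower() == "true"
            if has_password and not duo_enrolled and not disabled:
                flagged.append(row["name"])
        return flagged
    finally:
        conn.close()


if __name__ == "__main__":
    for name in find_users_without_mfa():
        print(f"Password login without MFA: {name}")

# Snowflake has since pledged to let MFA be made the default for an
# organization's users. Per its documentation at the time of writing
# (verify the current syntax; the policy name is illustrative), an
# authentication policy can require MFA enrollment account-wide:
#   CREATE AUTHENTICATION POLICY require_mfa MFA_ENROLLMENT = REQUIRED;
#   ALTER ACCOUNT SET AUTHENTICATION POLICY require_mfa;

Running a check like this from a scheduled job and alerting on any output is one low-effort way to catch password-only accounts before a credential-stuffing campaign does.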