Anna Delaney: Welcome to the ISMG Editors' Panel, live on the final day of InfoSec Europe 2024. I'm Anna Delaney. I'm joined by my colleague Mathew Schwartz and by a special guest: Ian Thornton-Trump, CyberEdBoard member and CISO at Cyjax. Thank you for joining us, Ian.

Ian Thornton-Trump: It's my absolute pleasure. I said I was going to be the non-editorial person on the panel of editors.

Anna Delaney: You're always welcome on our panel, and we're so lucky to have you. So how was it? We made it.

Ian Thornton-Trump: It had a great vibe. I have to say, it was nice to see, especially on the first couple of days, that the traffic was busy, the show was busy. I think there were a number of key themes that came out of it. One, of course, is the tongue-in-cheek "AI everywhere." But there were some other really interesting things going on. I'm seeing a lot of smaller businesses trying to get noticed, and we see some of the big players in the industry really contributing as well. But I think the big story is really the £20,000 that was raised for the National Child Protection Agency over two days, with Pen Test Partners, Cyber House Party and RADs coming together. It was just a great, euphoric moment to have the community put away their grievances, put away the grudges, and just celebrate the awesomeness of information security.

Anna Delaney: Fantastic achievement. So how has the gen AI discussion changed since last year? What do you think?

Ian Thornton-Trump: It's getting more mature, but it's by no means what I'd call ready; it's baking in the oven right now. I saw some folks that have a really good strategy, but maybe they haven't really figured out how to put it into the product in a way that makes business-value sense as opposed to security-value sense. Because the language we need to speak, and this is sort of universally recognized, is that we need to start thinking about the business and less about the actual security tooling that everybody here is trying to sell us.
And I think the story is how you can go in and say: we're looking at this product that has AI to help us defend the business against these threats we know about, whether to national critical infrastructure or just to business operations, and there's a potential for loss there.

Anna Delaney: Yeah. Excellent! Mat, you spoke to lots of people, as did I, but any standout today or yesterday?

Mathew Schwartz: Yeah, definitely. I had a wonderful discussion with the head of the Wales cybersecurity resilience center; I may have gotten the name wrong, but it's the center in Wales devoted to resilience, with a cybersecurity focus. A great discussion about the outreach they're doing, trying to up everybody's game, from them, from CISA, from the National Cyber Security Centre, which is not here at the moment because we're in the election period. We're hearing "MFA, please," at least, and that came up in a lot of the interviews we did. You don't need to have MFA everywhere. I mean, ideally, eventually, yes, but just get started. Take that one little bite of the elephant; we'll get there. Just try to do better, because we're still seeing so much in the way of cybercrime, especially ransomware, but also business email compromise, getting the really easy stuff opportunistically. I know we've been hearing that for years, but it was great to hear it get a real focus. One other interview I would highlight was with Stuart Seymour, the CISO and CSO of Virgin Media O2; it was great. He was speaking here at InfoSec Europe about crisis management, and the great Mike Tyson quote, "Everyone's got a plan till they get punched in the face." He has a military background and was talking about how, anytime you think something is wrong, you should stand up your crisis management response. A lot of people don't want to do that; they think they've done something wrong. He said it's the opposite: get in there. If it's not a problem, stand down. If it still is a problem, you've responded. And I think it's good to have that level of psychological readiness being discussed when it comes to cybersecurity, because the quicker you get out there, the better everything turns out to be.

Ian Thornton-Trump: I want to go back to MFA for a moment because, listen, I think the game has changed.
Fundamentally, if your product doesn't support MFA, you're going to have a tough time selling it in the market today, given the threat landscape, and it does kind of send the message that we don't care about the customer data. Now, making it optional, something the customer has to turn on, it's debatable whose side of the fence that's on. Sure, the customer can maybe make an informed decision. But mandatory MFA after you've sold $1, or have deployed something that is not ephemeral, that is going to be part of the infrastructure? I think we potentially need legislation that says: turn it the hell on. After you've taken a dollar, turn it on. Because if the customer isn't informed about the threat, maybe you have to act on behalf of that customer to, well, keep the customer, right?

Mathew Schwartz: Yeah, great.

Anna Delaney: And what sort of conversations surfaced in terms of the geopolitical landscape, the threat landscape there? I know you've got some thoughts on that.

Ian Thornton-Trump: It's so nice to be asked for those thoughts. I think the big news story is law enforcement, which has had some fantastic wins, and we've started looking at the cyber underground and what effect that has had. Well, it's really interesting. We're seeing more ransomware groups that are reinventing themselves, reorganizing, clearing out the old open-source information we might have about those personas and people, and they're coming back at a ferocious level. We've seen some really big data breaches, data breaches that remind me of the early days, when millions of records were dumped from Yahoo and some of the biggest names in the business. And in the wake of the Ticketmaster data breach, which is truly horrifying for its potential impact on election disinformation, voter suppression and things like this, we're into that paradigm again where we're seeing big data breaches. And patient zero, that one was MFA, or the lack of it.

Mathew Schwartz: Blackout.

Ian Thornton-Trump: Yeah.
Anna Delaney: I heard a lot about regulation, because there's NIS2, there's DORA, and organizations are frantically trying to prepare if they haven't already. And there was a lot of legal advice here saying: okay, we know it's complex, but here's how to prioritize. So, lovely conversations around that. I loved my conversation with the lawyer Jonathan Armstrong as well: what can we learn from the British Post Office scandal, which of course saw over 700 people falsely accused of theft and fraud because of a new IT system? And I loved his insights on the legal and ethical perspectives there and on what we can learn. Brilliant conversations.

Ian Thornton-Trump: I think there are three things that really come out of this. The first is that it will come back to haunt you once it goes into the legal realm, where there's discovery, where there's the ability of the government to find the documents it needs to basically figure out what the heck went wrong. I think that's a big part of the story. The other part of the story is that the vendor-supplier relationship needs to be well defined. You can't have a situation where the vendor is saying X and the consumer is saying Y, because that's an area that just falls apart completely. And then finally, I think, as information security professionals, we're going to have to prove the case that the computer could be wrong under certain circumstances. In British law, the big thing was that the computer was accepted as truth. We've now clearly seen that computers are not always right, and they may do some crazy things, which brings us to the AI story about crazy things and how we're turning loose technology when we don't really know what it's going to do. We know it's going to do some things, and we're hoping right now that it does good things, but hope isn't a plan. So, going back to the other story Mathew was following, about resiliency: man, we're going to have to look at what that looks like for when your AI goes bonkers and angers a whole bunch of your customers.

Anna Delaney: Love that you put in "bonkers." That's one of my favorite words.
What about the feel on the floor this year compared to other years? You've been here for many.

Mathew Schwartz: I'll flash back, if I may, first at least, to previous Infosecurity Europes. They were held at Olympia, for example, which would turn into kind of a sweltering environment: a beautiful agricultural hall of the Eiffel Tower era, iron and glass, that even on a cold London day would get very balmy. So, venue upgrade here at the ExCeL. I wasn't here last year, but I have to say, compared to my previous years, the buzz is remarkable. Tuesday, hopping. Wednesday, you could hardly move on the show floor, and excellent parties afterwards. Thursday, still a strong buzz; a lot of the halls with the lectures, the presentations, are completely full today still. So I would say it's been a very good event.

Ian Thornton-Trump: Day 3 is usually a hard slog. Everybody has been out the night before, perhaps having one or two beverages, perhaps spending some of the very generous bar tab that Pen Test Partners provided. But I will say you're right about the buzz. You're also right about the engagement. I see a lot of people here that aren't just security people; I've seen CEOs, CFOs, the whole C-suite is here. That is part of one of the fundamental observations I had: the color scheme. It's not the garish, AlienVault kind of yellow and black sprayed everywhere.

Mathew Schwartz: Hyper green.

Ian Thornton-Trump: Yeah, or hyper green. It's more business-friendly, I think, and less scary. And I think that's important, because, again, who is signing the CISO's checks? It's the CFO, the CIO and so on. Everybody's got a boss somewhere, and in order to spend your budget, you've got to convince them that it's the right business move.

Anna Delaney: What about swag this year? Collect any freebies on the floor?

Ian Thornton-Trump: I tend not to do that, but I looked at some of the options they have. I mean, you've got the lovely Government of Canada here hawking the new wares they want to bring to the market. They've got some great swag over there.
I find swag is an excuse to have a conversation, and I never have a problem having a conversation.

Mathew Schwartz: Well, I'll step into this swag vacuum that's been created here. There's a great little arcade ... Cisco is sponsoring that with some old arcade games. So I had a little turn on the Donkey Kong and the Galaga, which was good fun. There have been some excellent flat whites; I won't name names, but there are a lot of baristas here this year, which is interesting. Yeah, there's a crux stand, which I have not had the opportunity to partake in, but the queues for it have been substantial. I've seen your usual bouncy balls, your glowing green swords, that sort of thing. But I think there's been a real focus on food this year, actually.

Anna Delaney: Yes, got to fuel your punters. So, one word to describe the event in essence, Mat?

Mathew Schwartz: Buzz.

Anna Delaney: Buzz.

Mathew Schwartz: Successful.

Anna Delaney: Yeah, friendly. I'm going to say it's friendly. I love bumping into people I haven't seen for a year, and it's amazing.

Ian Thornton-Trump: I was meeting people right in the hallway just coming in, and we hadn't seen each other in years; it was a great catch-up. This is, again, sort of a community gathering. But of course, part of that community is our vendors, right? And everybody here, I think, is focused on protecting businesses, and that's a great mission that has value. How well they sell it really comes down to how skilled their staff is.

Anna Delaney: Well, Ian, it's been such a pleasure, and a great craic, as they say.

Mathew Schwartz: Honorary editor on the Editors' Panel.

Ian Thornton-Trump: Amazing! Thank you! Deeply honored, guys. Deeply honored.

Anna Delaney: Thank you, Mat.

Mathew Schwartz: Thanks, Anna.

Anna Delaney: Thank you so much for watching. For ISMG, I'm Anna Delaney.