Anna Delaney: Hello! I'm Anna Delaney, and welcome to the ISMG Editors' Panel at the end of our London Summit 2024. I'm very pleased to be joined by my colleague, Mathew Schwartz.

Mathew Schwartz: Hello.

Anna Delaney: Hello! And CyberEdBoard members - Jon Staniforth, former CISO of Royal Mail, and Helmut Spoecker, vice president, chief security officer, ECS Partner Management. That's a long title.

Helmut Spoecker: Thank you!

Anna Delaney: But thank you so much, gentlemen, for joining me.

Jon Staniforth: Thank you.

Anna Delaney: So Jon, why don't we start with you? You started the day with a phenomenal panel conversation about lessons learned from ransomware attacks. Of course, Royal Mail experienced its own ransomware attack last year, and I thought that was fabulous, invaluable commentary and insight from you. Tell us how that panel went and any other insights you want to share.

Jon Staniforth: Yeah, I think the panel was really good. We transitioned from the technical end and that initial public response and developed it into what it means for the teams involved and the rest of the company. I think the day itself also flowed quite well, with a mixture of the technical end and the softer end. And I think the final session, particularly the tabletop, helped people start to understand that these cyber-related issues are actually much more company-wide - from deepfakes and the potential for CFOs to be tricked into making payments, it means you've got business processes, segregation of duties, different types of compliance. It's not just a CISO anymore. So, that whole journey, I thought, was quite well done during the whole day.

Mathew Schwartz: Yeah. One point on your opening panel - not to play to the current panelist to my right, but somebody turned to me afterward and said that, between you and your co-panelist Heather Lowrie - the former CISO at the University of Manchester, which also suffered a cyberattack last year - it was so apparent that you were cybersecurity leaders. He said your leadership came through.

Jon Staniforth: Oh, that's nice.
Mathew Schwartz: Thinking about how you approached things and communicated that to others - he said that had been a really good message. It was wonderful to see how you not only deal with it but how you inspire those around you.

Jon Staniforth: Oh, that's nice to hear. Thank you.

Anna Delaney: I think, Mat, also what you said at the end of that panel - you've been around a little bit, a little while in the industry.

Mathew Schwartz: We've all been around a little bit in the industry, Anna.

Anna Delaney: And before, it was harder to find these lessons learned. Maybe there was reticence about sharing, but also there were only a few lessons learned. Now we've got a lot more lessons, and thank you so much for that openness and sharing. I think it's invaluable to the community. Helmut, how about you? Any insights that you want to share, or any highlights from the day?

Helmut Spoecker: Oh, many. First of all, I was glad to see that many of my peers share the same thoughts, issues, problems, challenges - whatever we call them - and that they also have similar approaches to resolving them. And yes, I enjoyed two conversations most. The first one was Jonathan and Heather's, because it gave me a lot of insight, and I was very curious about that. I've seen ransomware attacks myself, and so I was curious to see and learn how other leaders in the industry would handle such a situation. I was in the lucky position that we could limit the damage very much. But communication was a challenge for us as well, and so this was a big learning for me.

Anna Delaney: Yeah, communication is so important. Mat, any favorite moments?

Mathew Schwartz: Oh, it's hard to pick favorites. It's dangerous, very dangerous, to pick favorites, especially sitting with some of the attendees. I loved the energy of the day. Your opening panel set a wonderful tone. It was very ably moderated by Ian Thornton-Trump. He brought a lot of energy to it, and he brought a lot of energy to the next panel discussion that I had the pleasure of moderating, about "hackers not hacking in - they're logging in" - talking about identity compromise, which is true. I mean, again, we've been in the industry a long time - hacks, right? We always talk about hacks, but so often, when you get the details of these attacks, it's a teenager phoning somebody up and sweet-talking their way through, or it's somebody guessing a username-password combo, and it works because there wasn't multi-factor authentication on the account. So, it's less sophisticated than it looks. If you're criminally inclined, it's a little easier to achieve, perhaps. So, I liked getting the detail that we got there. There were some other wonderful sessions talking about AI. What does AI mean? It means too many things, probably, to too many different people. But they got into the specificity of what we mean when we say AI and what it is good for. One of my takeaways from what was said on stage is actually humble ambitions: narrow the focus, know what you're putting in so that you can verify what's coming out, especially in a compliance or GRC context. I thought that was really useful. So, lots of fun and interesting sessions today.

Anna Delaney: And so, looking at the agenda here - AI regulation, the supply chain - not necessarily new themes. Was there anything new that you learned today? Was there anything that you will walk away with, you know, thinking that it's a new insight?

Jon Staniforth: I quite liked the Rubrik discussion on AI, because I think it got rid of some of the hype and was a bit more balanced. So, where are people really in the real-life journey, as opposed to the hype journey, on AI? I think that brings a sense of realism that you don't always hear at some of the conferences or some of the talks. It's a little bit too much "all the bad guys are doing everything bad" or "the good guys can use it for everything," and I think that discussion brought it to light. And I particularly liked the fact - I'm in the same camp - that the opportunity AI is bringing companies is to start getting data governance owned much more, and that's always been a challenge for a CISO, as we're only part of that puzzle. So, you've got the ICO, the data quality, and I think, as companies mature more with AI, they are starting to realize they actually need better data governance, and that can only contribute to better cyber hygiene.
Anna Delaney: And thinking about AI-first strategies - they came up. As you said, use cases - specific use cases of where the benefits lie with AI, and the challenges, of course - but it is good to have those concrete conversations. Helmut, anything from you?

Helmut Spoecker: Yeah. On AI, I found it interesting that the limits were shown - like the case study where it was shown: we used this tool to help our SOC, but it didn't. As a matter of fact, it created more work than it took away from us. That was an important aspect I will take home. The other one is that it also confirmed my conviction that AI is potentially helping the bad guys more than it is helping the good guys for now, because it helps exploit vulnerabilities faster than before. So time is our enemy. It always has been, but it's getting worse.

Mathew Schwartz: Well put. Less time to deal with anything.

Helmut Spoecker: That's a quote from Maverick.

Mathew Schwartz: Oh, very good. Here's to Maverick.

Anna Delaney: Well, not to put you on the spot, but is there one word that defines today - where we are at the moment in the industry? I'm thinking AI?

Helmut Spoecker: Awareness.

Anna Delaney: Awareness? Still awareness?

Helmut Spoecker: I found a lot of awareness in the audience - that was inspiring, because it seems that the good guys are gathering around the flag, something like that.

Jon Staniforth: For me, it's continual learning. So, I've been in cyber for 20-odd years, and 20 years ago ...

Mathew Schwartz: Who's counting, Jon?

Jon Staniforth: Yeah, it might grow back. So basically, it was continual learning then, which is why I got into the sector and the area, and it's still that now. And that sharing with other people, other sectors - I think that's continual learning.

Mathew Schwartz: Fantastic! "Resilience" - I don't mean to sound like an eternal optimist, and it was said better in the course of the day, on the supply chain panel that I was moderating with Dom Lucas and Brian Brackenborough. I forget which of them said this, or if both of them did, but they said, "Never forget how far we've come." We focus on a lot of the problems that we're facing, necessarily, but look at where we are, look at all the things we can do now that we couldn't do before, and the level of discussion that we're having. That was inspiring, I think. And from a resilience standpoint, I think we're in a much better place than we were. The discussions I'm hearing, that organizations are having and that regulators are engaging with - we're in a much better place than we used to be.

Anna Delaney: I'm thinking about verification. I think that came up a lot - obviously, it has to - but verifying, say, a deepfake: how can organizations do that? The tools they're using. But it comes down to awareness, it comes down to resilience, to continuous learning. Maybe they all apply. But also, in your session earlier, "Hackers Don’t Hack In – They Log In" - how do you verify who's authentic, who's real?

Mathew Schwartz: And who's ChatGPT, I guess?

Anna Delaney: And on that note, thank you very much, gentlemen. I really appreciate this and the rich insights you brought to the day as well.

Helmut Spoecker: Thank you.

Jon Staniforth: Thank you very much.

Mathew Schwartz: Thank you.

Jon Staniforth: Nice to meet you both.

Anna Delaney: Thanks so much for watching. Until next time.