Michael Novinson: Hello, this is Michael Novinson with Information Security Media Group. We're going to be talking about artificial intelligence and the talent shortage with Kyle Hanslovan. He is the co-founder and CEO at Huntress. Hi, Kyle, how are you?

Kyle Hanslovan: Hey again! Just stoked to be here.

Michael Novinson: Stoked to have you. I know it's been approximately five months since the launch of ChatGPT, and there has been a lot of dialogue in the past few months about all of the things that AI does. What are some of the things that AI really struggles with?

Kyle Hanslovan: I love all the automation, obviously, but a lot of people forget that there are real limitations. My favorite example: if the audience out there were googling the words "creepy Mickey" - that's a weird thing to Google, right? - there are tons of creepy Mickey Mouse images out there. You can train some of these models, whether with attended or unattended machine learning, and still, sometimes these AI models will say that is still Mickey Mouse. But the difference is that a three- to six-year-old child can still spot it and say, "That's not Mickey. That's creepy Mickey," right? And that shows the gaps that sometimes exist in this automation. Oftentimes the automation will help; it'll get you to good enough. But instead of saying it's replacing personnel, I see it as a kind of revolution for augmenting them. I always throw out that creepy Mickey analogy: it's not quite perfect, and that's not the Mickey you're looking for in this case. I think sometimes AI is blown up a little more than it deserves, and it's not quite the solution you're looking for either.

Michael Novinson: So if you're talking about augmentation, what are some of the things that you feel AI does really well? What are some of the things that humans are still better at?

Kyle Hanslovan: Yeah, I'm obviously knee-deep in the cybersecurity side of the house, and we're getting ready to see, or we've already kind of seen, that the generative AI platforms out there right now are building beautiful things: pictures you can't tell are real and, the same thing, text that seems believable.
But when you start really getting into detection of the anomalous versus the nominal, especially trying to figure out somebody who's purposely trying to deceive the detection algorithm, I personally have a bias here: AI is really flourishing at generating new, beautiful things and creating works of art, but when it comes to identifying somebody who's trying to game the algorithm itself, we're just not seeing the same monumental leaps and bounds as you're seeing in other places in the space. So that's me saying I embrace it, I love it. But we still see a whole lot of humans augmenting AI for the very, very near future, and I would even say into the distant future.

Michael Novinson: So at this point, and I know it's still really early days for generative AI, are there any areas where you see it really adding value from a cyber perspective, and if so, where?

Kyle Hanslovan: Yeah, being able to do some of these things is almost like democratizing certain parts of cybersecurity. There used to be a very high bar to create some of these proofs of concept, or to create a pattern or a thought process. We see some of this in software development: the AI can give you a basic example of code that you can then build upon on your own, or tailor to your own needs. That's amazing, because then you take somebody who has a junior baseline expertise and they can build on it. They're standing on the shoulders of a lot of this AI. And again, I've just not seen any of these cases where it's truly blowing things out of the water, maybe in limited circumstances, but not in a truly scalable way, especially when you look at the other side: attackers are starting to use some of this AI as well. They're also getting democratized. They're now writing phishing emails, and they're a heck of a lot better at using the English language in their phishing emails than they were even a couple of months ago, when it was broken English and typos. So it goes both ways: defenders are getting better using it, but attackers are as well.

Michael Novinson: So what are some of the more interesting ways you've seen attackers use it?
You mentioned getting that authentic English in emails. What are some other novel, interesting things you've seen adversaries do with AI?

Kyle Hanslovan: In the early days, we started seeing, you know, the OpenAI ChatGPT abuse that they're trying to lock down, so it won't do bad things, per se. But what was probably the most novel attack was, "Hey, can I get some of these?" Basically, macros. Those are the embedded features that you put inside documents so that when someone opens the document and says, "Yes, enable this automation" or "enable macros," it does bad things. That used to require somebody who knew how to code, or how to cobble things together off of GitHub or Stack Overflow. And now we're seeing actors come out there and create kind of a template with ChatGPT, and then add their basic functionality. Again, that's lowering the bar for a threat actor to be able to do something. And to me, I'm obsessed with that. You know, my whole business is being able to make more junior people stand on the shoulders of great automation and great experience. And I was not surprised to see attackers do the exact same thing.

Michael Novinson: Of course. And sticking with that human element, you were talking about the idea of AI being able to augment humans. I know it's been approximately a year since we really started into this economic downturn, the two-headed monster of inflation and interest rates. What has the economic environment meant in terms of the ability of end-user organizations to access skilled cyber professionals?

Kyle Hanslovan: Yeah, I mean, anybody you talk to says, "I can't find great security people, and I'm even struggling to find the more senior IT people." That whole work-from-home, remote world really put pressure on this. Some of this great talent that was locked away in areas where it was maybe only available to local companies is now available on a remote or global scale. And so the end result is still no great talent. And if you are finding great talent, you're paying for it, which means you have to try to retain it too.
And so, bringing those two conversations together, we're actually seeing a lot of this automation: if you do have that great talent, how do you help automate, manage and take care of the majority of the heavy lifting? That applies in IT, and it applies in security. Can I make that more junior person stand on the shoulders of this technology and have essentially the expertise of somebody who's more mid-level or more senior? And, you know, I'm biased, I'm an Air Force guy, right? One of the things most people forget is that a lot of people say World War II was won by air power. And when you look at air power, we didn't make it so that pilots became amazing pilots, and we didn't just grow infinite numbers of amazing pilots. We actually made the planes easier to fly. I'm seeing a similar thing right now: we don't have a lot of these great techs, these superhuman, security-skilled folks. But what we are doing is, again, making the planes easier to fly. The security products, the IT products, are using this automation to enable, again, more junior talent, just like a more junior pilot, to fly the plane. So I think this has been done before, but I don't see many people thinking about how to leverage this technology that way.

Michael Novinson: I want to double-click on that automation piece and get a sense of what tasks, what functions in a cyber context you're seeing automated, and what's still being left to maybe that more mid-level or more senior personnel.

Kyle Hanslovan: Yeah, if you think of budget driving this, almost everybody we're talking to is asking, how do I do more with the same, and sometimes more with less, because of the economic conditions? But what's beautiful about that is when they're saying, "Hey, I need to do more with the same," and double-clicking on that, they start asking, "Well, can I outsource? Can I have somebody else take care of this?" Somebody with the expertise who can deliver the expertise I need at the price of a product.
And that's where your SaaS products come in. Some of the SaaS products, whether through better automation or by thinking through the value they have to deliver, are driving what I would call a renaissance in IT. The first generation of these products, like EDR, was: well, here it is, now you have to manage it. The second generation is: wait, you could have somebody else manage that at scale, at a price that makes sense. And with the savings you have, you don't necessarily need to do more with less. But if you're talking about how to do more with the same, have somebody like that take care of it for you, and then use those awesome IT and security people you have on your truly hardest business problems. So I'm pretty pumped to see that, in some ways, economic conditions plus new creative technology are really making a big difference, at least to the companies we're talking to.

Michael Novinson: I want to get a sense as well, in terms of the economic constraints, of measuring the effectiveness of security tools and the focus on return on investment. What do users do to assess the question: is the money spent on this product actually worth it? How do they answer this?

Kyle Hanslovan: I always keep it super simple: is the juice worth the squeeze? That is what everybody is asking. One of the things I love is that for a really long time, it was just, "Can I buy a product? Can I feed it all the data?" I need all the data, right? And then you're like, who's going to manage it? Who's going to look at it? Who's going to tell me the signal from the noise? So the thing that I'm loving probably the most right now is that it's not calling technology out. It's just holding technology accountable and saying, "Well, I need an outcome." Like, what's the purpose of all this data if I can't make a decision based on it? One of the simplest things that blows me away is, think of your old-school antivirus. Right? Nothing special about it. But you need to know: is it turned on? Is it up to date? And if you had, for instance, an incident that was quarantined:
Was it really quarantined because the product did its job? Or was it because there's a threat actor doing something multiple times? And what blows me away about asking for those accountable results is that we're starting to talk to more and more people, whether it's CISOs or even just IT directors, who say, "I have this, but I don't have the cycles to manage it." So when you think about that combination with automation, it's like, whoa, I can actually manage this technology now, have somebody take care of the heavy lifting, and have them deliver exactly what I need to do with it. And then, again, have those more senior people take the next heavy steps: there is a threat actor, I need to respond to that. So it's just a really novel time for us to be more accountable and say, "Well, we're not going to let it slide that we just have technology for the sake of technology." And for those that are reporting to board members, or for the outsourcers that report to their own customers, they're having to answer the question of what have I done for you lately, and this is what I spent the budget on. So that's kind of a cool moment for a geek like me, to all of a sudden be able to say that the geeky, nerdy stuff we're doing is actually delivering value to business folks, not just the CFO and CIO, but the CEO and the board, understanding what we have done for them lately.

Michael Novinson: Absolutely. Kyle, thank you so much for the time.

Kyle Hanslovan: Man, you've been awesome. Thank you again.

Michael Novinson: Thank you. We've been speaking with Kyle Hanslovan. He is the co-founder and CEO at Huntress. For Information Security Media Group, this is Michael Novinson. Have a nice day.