[Music]

Patrick Tucker: Hello, welcome back. I'm Patrick Tucker, technology editor at Defense One. Our next panel is "Titans of Technology: Winning the Future of Defense." Joining me is Emmanuel Brasson, advisor for cyber, AI, and disruptive technologies within the French Ministry of the Armed Forces, in the Directorate General for International Relations and Strategy, and Dr. Heather Roff, senior research analyst at the Johns Hopkins University Applied Physics Laboratory. Welcome to you both. A quick reminder to our audience: we do want to hear from you, and we're very eager to incorporate your questions into our programming today. Use the GLOBSEC app to send us questions and I'll be able to see them here. Later on I might ask the room if they have questions; please raise your hand, I'll call on you, and you can approach the microphone. So please keep that in mind: we do want to hear from
you. Okay, one second. Emmanuel, can you hear me okay? Let's let Emmanuel fix this sound problem real quick, and let me turn instead to you, Heather. Real quick: how can the international community strengthen the regulatory framework when it comes to the ethical use of AI in theater operations?

Heather Roff: Thank you, Pat, and thank you, everyone. I'm very honored to be asked to be here today, and I just want to let you know that I'm speaking in my personal capacity. These are personal opinions; they do not represent Johns Hopkins, the Department of Defense, Brookings, or anybody else. I think one of the big things about strengthening the ethical use of AI in theater operations, or tactical operations, joint operations, really any operations generally, is that we have to start with the technology, and we have to start with warfighter-centered design. So
when we're thinking about how to best research, design, develop, and deploy, we have to think about what exactly the technology is going to be used for, and then uphold certain types of principles. The U.S. Department of Defense, for example, has adopted a set of five ethical principles, among them governable, responsible, and equitable. A lot of that has to do with designing for traceability and auditability, ensuring that warfighters are central to design and deployment, and that human responsibility is at the core of all operations. So when we're thinking about how we deploy AI within defense operations, we have to think about what the AI is intended to do. Is it a decision aid? Is it an ISR application? Is it a strike application? Do we have the right kind of data? Is the data properly curated? Do we have appropriate testing, evaluation, verification, and validation
regimes in place to ensure that those systems are working as intended? Do we have some way to observe those applications as they're deployed, to ensure that they are working within the boundaries of intended use and are not learning behaviors outside of that intended-use scope? Do we have some measure of auditability? How do we train and equip officers and operators to continually understand those systems? How do we certify and license autonomous technologies to ensure that their lifetime use is in accordance with what we want them to do, and that they are not learning unintended behaviors outside of that? So this is a very big endeavor, one that really spans doctrine, organization, training, logistics, design, and development. This is a huge endeavor that I think all militaries really have to take into consideration, covering everything from policy to how many maintainers you need for a given
platform and what types of training those individuals need, to how you train lawyers and judge advocates general to ensure that they are making appropriate proportionality calculations and understanding the implications of using particular assets in the use of armed force. This is not a one-shot problem. This is a huge problem that all militaries are ultimately going to have to spend a great deal of time thinking through.

Patrick Tucker: In covering this space, one thing that's been very clear to me, and to a lot of other people watching the effects of emerging technology on military technology, is the accelerating effect of things like machine learning and big data on the pace of operations, and potentially on the pace of warfare. That seems to put a lot of potential future stress on commanders and others to make decisions ever faster, with an ever-shrinking observe-orient-decide-act, or OODA, loop cycle. So, having an AI
principles list at the Department of Defense is a huge accomplishment; it's something that international allies are very excited to see. But is that sufficient, to your mind, to deal with the threat posed by accelerated operations due to the advent of new technologies like machine learning? Is having a principles list enough to offset the potential harm there?

Heather Roff: Well, no, you can't just have a principles list and say "job done," right? What really matters is the application and the institutionalization of those principles. The Joint AI Center within the DoD right now is working very hard on figuring out the ways in which to institute those principles. Going back to testing and evaluation, for example: what does that look like? We don't have a very robust testing regime for machine learning systems. We can't exhaustively test the space of machine learning. We can do
simulations, and we can try to find edge cases, but in reality we can't fully test these systems the way we have tested previous legacy systems, so there's a fundamental shift there. But I think, even with the speeding up of operations, there is an inherent limit. Cyber is one thing; we can talk about how cyber is a bit of a different space. But when it comes to physical operations, we are still limited by time, space, and logistics: how many warships do I have in an area? How many planes do I need to complete a mission? What does my resupply line look like? How do I do joint operations with NATO? How do I run my comms lines? How do I divide airspace? There are limits to operations given the physical constraints of warfare. But we do have a huge amount of data that's coming in.
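The edge-case search described here, sampling inputs at random and hunting for a violation of a safety property because exhaustive testing of a learned component is infeasible, can be sketched roughly as follows. This is a minimal illustration, not any military system: the model, its hidden failure band, and the property are all hypothetical stand-ins.

```python
import random

def toy_model(x: float) -> float:
    """Stand-in for a learned component; misbehaves on a narrow input band."""
    if 0.49 < x < 0.51:
        return 10.0  # hypothetical hidden failure mode
    return x

def falsify(prop, sample, trials=10_000, seed=0):
    """Random-search falsification: returns a counterexample input if one is
    found, else None. Finding nothing is evidence, not proof of correctness."""
    rng = random.Random(seed)
    for _ in range(trials):
        x = sample(rng)
        if not prop(x):
            return x  # property violated on this input
    return None

# Property under test: outputs stay within the input range [0, 1].
cex = falsify(lambda x: 0.0 <= toy_model(x) <= 1.0,
              lambda rng: rng.random())
print("counterexample:", cex)
```

This is the testing asymmetry Heather points to: a failure found this way is conclusive, but a clean run over ten thousand samples says nothing definitive about the rest of the input space.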
We are pretty much observed all the time. When we think about space assets and ISR, your adversary can see you all the time: what you're doing, what movements you're planning. And so that becomes its own challenge. But what I would say is that the OODA loop is speeding up, and it can only speed up on the quality of data. Just because you have data doesn't mean that it's useful. It has to be curated, it has to be appropriate for the mission, and it has to be representative of your adversary's intentions or beliefs or assets. And in warfare, your adversary is not going to tell you exactly what they're going to do. They have incentives to dissemble; they have incentives to use ruses. So the data we're getting is not clean, it is not perfect, and even with massive amounts of it, there's a lot of noise in that data.

Patrick Tucker: Let me turn to you, Emmanuel. Thank you
for joining us, and the same question. On the one hand, we know that artificial intelligence and emerging technologies are going to be a force multiplier; they're going to allow militaries to have much greater effect with fewer assets. At the same time, they also contribute to an accelerating pace of operations, which can be very difficult for commanders, particularly commanders who live and operate in societies with a strong sense of government accountability and command and control, and where individual officers are held accountable for their actions, as opposed to other types of governments. So can you describe a little of that friction from your perspective, from a military perspective? How, on the one hand, do you use technology like AI as a force multiplier, to have as maximal an effect on the battlefield as possible, while at the same time allowing commanders to deal with the accelerated pace of operations?

Emmanuel Brasson: Can you hear me?

Patrick Tucker: Yes, we do.
Emmanuel Brasson: Within the Ministry of Defense, we have identified a number of fields where embedding artificial intelligence can help with this multiplying effect that you just described: whether it's on the battlefield; whether it's in cyberspace, for intelligence and information; whether it's on the materiel side, for logistics and maintenance; or on the medical side, for rescue and evacuation, and so on. And of course, don't forget what I would call the masterpiece of all these fields: planning, decision-making, and situational awareness. In all these domains, artificial intelligence is a factor of acceleration, of multiplied efficiency, and of improved precision, meaning taking better and more precise decisions at the better moment, or calibrating more appropriately the resources and the materiel you
need in the field. Of course, embedding such algorithms, artificial intelligence, and data everywhere can lead to increased complexity for the command. This is also one of the reasons we have identified, as a criterion of critical importance, the need to remain within a rigorous chain of command when performing operations, and in particular the question of maintaining, at every moment and at every level, sufficient human control over all these operations.

Patrick Tucker: Excellent. Let's go to Zoom and take a question from Jim Townsend in the United States.

Jim Townsend: Hello, everyone, this is Jim Townsend. Just a sound check: can you hear me?

Patrick Tucker: I hear you, good.

Jim Townsend: Well, thank you very much. I found both of those presentations absolutely fascinating. By way of background, I've spent about 35 years working NATO and Europe at the Pentagon, so I'm not a technical guy, but you've got to keep up with
technology, particularly dealing with NATO, and make sure there's interoperability. And that's my question for both of you. Yesterday NATO's heads of state and government agreed to establish an innovation accelerator, which is a great name, because the alliance is concerned that NATO allies keep on top of technology changes, and I know some allies are. But what does this mean for you? How do you yourselves coordinate with NATO? In DoD, and I worked in the Pentagon for many years, there is some coordination with NATO through the regional offices there, and a little bit on the acquisition side, but what about the tech side? With this new innovation push there's also a fund that NATO is going to put toward R&D and tech. How are you going to negotiate, interoperate, communicate, and coordinate with what NATO is doing as an institution,
and with what NATO allies are going to be doing individually? Or are we going to be doing things in stovepipes?

Patrick Tucker: Thanks. I'll take that to you, Emmanuel, first, and then Heather.

Emmanuel Brasson: Okay, thanks. This is a very interesting question: how should allies coordinate among themselves and at the alliance level to incorporate such technologies and take the best advantage of them? First of all, I should say that the most important thing is to understand each other mutually on what we're talking about. It sounds basic, but in the case of the responsible use of artificial intelligence, especially in the defense sector and the military domain, this is extremely important, because we know that the limits are not easy to define, or to translate into concrete actions, programs, and developments. Also, regarding developments and military programs, one of the
questions I think is still on the table is that these technologies, whether we are talking about AI in itself or about other disruptive technologies, are inherently dual: dual civil and military. This means we have to decide, among the members of the alliance, what will be led by the alliance, that is, the military alliance, and what will be developed under the responsibility of states. I think this is important, especially when we talk about making use of these technologies in a military context, in a context of potential conflicts, where we have a common reference, which is the application of existing international humanitarian law, and where we have to think, of course, about how, concretely, and by which means IHL can apply to the military use of these technologies.
Patrick Tucker: Heather, you?

Heather Roff: Yeah. Jim, thank you, that was a wonderful question. I think there are multiple ways in which NATO and the alliances, and not just NATO; I think even for the U.S. case, for DoD and the Five Eyes, there are a lot of things that need to be worked out. One of the most pressing things, and one of the biggest elephants in the room, is that even though we're going to start using AI in different types of applications, different militaries are going to use different types of systems because of their national defense postures. So Britain might be leaning more into naval operations, while the U.S. might be trying to do more AI and autonomy applications, at the get-go, within the Air Force. There's going to be variation between the member states in what types of applications they're going to
pursue, then develop, and then deploy within their own militaries. But we are also always hamstrung by our networks and our communications. Until we can figure out a better way to do those networks, communications, and interoperability between members, and to train with our members, we have to train with our members, the legacy systems are always going to hamstring the new systems. We are not going to overturn the apple cart and say everything's new. Everything's not new: we're still going to have Link 16, we're still going to have different types of communication systems that we have to work with, and we're still going to have to divide airspace in such a way that my Patriot battery is over here and your missile defense system is over there. So I think we have to continually think about how we train and equip
our fighters now, and then how we feather these new types of systems into our existing force postures, and then what types of missions they are going to be appropriate for. And to Emmanuel's points: yes, IHL is an overriding concern. We are not going to field systems that are indiscriminate, that do not comply with the laws of armed conflict. But how we prove that they are discriminate, how we prove that we have sufficiently tested them, is going to start to shift and change because of the way in which these systems operate; they're fundamentally different. So as we roll out new applications and new platforms that utilize AI, we're not just going to see these as machine learning systems that are simply learning all the time. No, we're going to put out systems that have a hybrid architecture,
that have different types of 535 00:20:34,880 --> 00:20:36,320 governors on them 536 00:20:36,320 --> 00:20:38,559 you know that um that are not machine 537 00:20:38,559 --> 00:20:40,000 learning all the way down right that are 538 00:20:40,000 --> 00:20:41,679 going to have expert based systems that 539 00:20:41,679 --> 00:20:42,960 we can fully 540 00:20:42,960 --> 00:20:46,240 uh test and validate that you know 541 00:20:46,240 --> 00:20:48,159 ground collision avoidance systems 542 00:20:48,159 --> 00:20:49,440 they're going to have geo-fencing 543 00:20:49,440 --> 00:20:50,480 there's there's just going to be a 544 00:20:50,480 --> 00:20:51,120 different 545 00:20:51,120 --> 00:20:54,640 um kind of stepwise approach to how we 546 00:20:54,640 --> 00:20:55,760 how we feather those in 547 00:20:55,760 --> 00:20:57,679 and and the allies within nato are going 548 00:20:57,679 --> 00:20:59,919 to have to just be much more robust and 549 00:20:59,919 --> 00:21:01,840 transparent about 550 00:21:01,840 --> 00:21:03,840 what those systems are able to do and 551 00:21:03,840 --> 00:21:05,120 how they've tested them and their 552 00:21:05,120 --> 00:21:06,640 communications networks and how they 553 00:21:06,640 --> 00:21:07,919 train with them 554 00:21:07,919 --> 00:21:10,880 um it's just going to be even harder for 555 00:21:10,880 --> 00:21:12,559 for allies to do that 556 00:21:12,559 --> 00:21:15,280 going forward because it's it's not the 557 00:21:15,280 --> 00:21:17,120 same old technology that i can say this 558 00:21:17,120 --> 00:21:18,559 bomb is going to explode 559 00:21:18,559 --> 00:21:20,480 in this location at this time every 560 00:21:20,480 --> 00:21:22,159 single time i drop it it's it's going to 561 00:21:22,159 --> 00:21:24,559 be it's a behavioral shift 562 00:21:24,559 --> 00:21:26,640 more technologies more challenges in 563 00:21:26,640 --> 00:21:28,880 that interoperability space for allies a 564 00:21:28,880 --> 00:21:30,080 very important 
point.

A new question now, from Mauricio Meschoulam in Mexico. Mauricio, hello.

Mauricio: Great to be here, and a very interesting discussion. From Mexico, let me ask you this. We have been hearing about these issues for a long time; we have been reflecting on the ethics of artificial intelligence for as long as I can remember. However, we are subject to artificial intelligence mechanisms every day when we use our cell phones or our computers, simply so that different companies can profit. So how serious is this reflection when we speak about national interests, when we speak about enemies, when we speak about interests and agendas? How should we approach this in the war scenarios that are happening in the world every day? What would be your approach?

Patrick: Emmanuel?

Emmanuel: Thank you very much. This is a very important question, in the sense that we are talking about, especially,
what we should do or not do, and what our potential adversaries would allow themselves to do. Our goal is to stay within strong ethics in our actions and in our use of these technologies. So basically we want to maintain a core of fundamental principles. I talked previously about respect for international humanitarian law, respect for the chain of command, and the requirement of sufficient human control at every level and at every moment.

Given these principles, we have to do some doctrinal exercises to understand how we should respond and how we should behave if at some point we were facing an enemy that would not respect the law and the principles the same way we do. What do we want for our soldiers? We want to protect them in their actions, but at the same time we want to remain exemplary in the way we perform our operations. So this is a reflection that is very important to incorporate from the very beginning, in our developments, in our programs, and in our operations. To simplify: the ethics of our behavior has to be taken into account at every step, from the beginning, so that we can perform as well as possible while staying within the limits we have defined.

Patrick: Sorry to cut you off, Emmanuel, I didn't mean to. Heather, did you have a thought on this?

Heather: Yeah. I was involved with the creation of the DoD's AI ethics principles; I was the special governmental expert attached to the Defense Innovation Board writing those principles. That was a very long exercise, and what I can say, to Emmanuel's point, is that there is a long history
of attempting to have a professional code of ethics for how to fight within the bounds of armed conflict, and most modern Western militaries take that professional code very, very seriously. Where it changes with AI, I think, is that we have to think about what is different: how are we going to fight differently, and how are we going to uphold those same ethical codes of command and control, of upholding LOAC, of upholding the principles of distinction, necessity, and proportionality? We have to think about how we design and develop, test and train, and these are all very big questions. They are not easy problems to solve. It means that all militaries going forward have to take this as a serious commitment. You are not just acquiring a new piece of kit, disseminating it throughout the force, and everything is going to be fine. This is about the way in which you interact with systems that learn, systems whose behaviors might be slightly unpredictable within a set of bounds. You have to equip commanders and operators with the appropriate knowledge of how to use that kit, and with the appropriate judgment about when to use that type of kit. Whether I am looking at a loitering munition, a sub hunter, or a swarm, all of these things are going to rest on a very big set of human judgments, with intelligence reports coming in, with satellite imagery, with dual authentication of all sorts of things. So humans have to deal with an increasingly complex space and make the appropriate judgments in line with their moral values. But having been part of that process of enumerating what those principles are, I do think the militaries are very much committed to doing this.

And even if our adversary says, "I don't want to play by these rules," that is what sets us apart as modern militaries that do comply with a set of normative values in international humanitarian law. If we throw out those principles, then we have no moral standing on the international stage, and civilians will ultimately suffer. Not just our warfighters; civilians will suffer. So it is very important to maintain that set of values, to minimize force, and to protect human life as much as possible if we do have to use force.

Patrick: I think it's a vital point. And if you aren't familiar with the AI ethics principles, I encourage you to check them out, because in many ways they are a blueprint for the way very large corporations should be constructing AI principles. The United States military's AI principles run dozens of pages, as opposed to some you see coming out of Silicon Valley that are maybe 800
words. So it's a real feat. We have time for maybe one last question, so I'll turn it to you, Heather, and if you want to chime in, Emmanuel, go ahead. Watching this space over the years, there are lots of different ways to invest in artificial intelligence. You can invest in new startups that are developing new algorithms. You can invest in more data sets, and in cloud to host that data. You can invest in more testing and experimentation, to help reveal new plateaus of human-machine teaming. So to your mind, where is a military's research money best spent right now? Is it in testing? Is it in procuring flashy new solutions out of Silicon Valley? Is it in cloud hosting, creating that technical basis for interoperability that was just discussed? Where does a military invest now to best harness this future? I'll turn to you, Heather, first.

Heather: That's a great question, Pat, because for everybody it's
a zero-sum game, right? We only have so much money, so where do we put it? Infrastructure is really the key, depending on a state's national interests and on where it is going to pursue AI given those interests. That means: how do I create, maybe it's cloud, right, how do I best secure cloud computing and storage, and do so effectively? How do I upgrade my networks? How do I get more bandwidth? How do I make data interoperable and accessible across a force, so it can be repurposed and relabeled? How do I then create that testing, verification, and validation pipeline, so that I can do software as a service instead of doing acquisitions the very old way, with 30-year cycles and upgrades and someone walking around with a thumb drive going to each computer? So I think the backbone of any force right now is
really the not-sexy stuff. It's the infrastructure: data hosting, cloud computing, software as a service, networks, and communication bandwidth. That is what will enable a force to get the shiny, bright widget.

Patrick: And I'm afraid we're going to have to end on that note; we are out of time. Thank you both for being here today, and thank you to our audience. Coming up next, in Habsburg at four, we have a session called Democracy in the Digital Space: An Alliance for a Healthy Infosphere, and on the Maria Theresa stage, European Cooperation, with representatives of the ministries of defense of Slovakia, the Czech Republic, and Sweden. So thank you all for being here, and thank you to my panelists.

Thank you very much.