Announcer: Ladies and gentlemen, please welcome the moderator of the last session on the Habsburg Stage for day one, Steve Clemons.

Steve Clemons: Hi everyone, I'm Steve Clemons, editor at large of The Hill in Washington, DC. I've been a regular attendee at GLOBSEC here for a long time, and this topic — digital health, digital rules, ethics in the digital space — is one that we've been dealing with at GLOBSEC in Bratislava for many, many years. And I think what is happening is that we've seen the importance of this issue grow. I should mention there are initiatives right now at GLOBSEC, titled Transatlantic Principles for a Healthy Online Information Space. GLOBSEC has invested heavily in this, and there are ten principles; perhaps we'll be touching on a number of them. I won't go through them here, but what I will tell you is that about seven member nations and 14 different large organizations have signed on to these principles. And I'm told that down at the end of the hall, just on this floor, there is a banner of these principles. If you believe in this sort of thing — now maybe you don't, maybe you're one of the anarchists, or you'd game this a different way, and we want to hear from all sides — but if you do believe in this sort of digital health, in digital principles working across borders and lines, they've asked you to feel free to go down and sign that banner. So let me encourage all of you to do that. We have many, many guests online as well, and we'll figure out a way to give you an opportunity to participate digitally too. Let me tell you who is joining us for our panel session.
Our three panelists today are all virtual, and I see all three of them here. It's great to see David Carroll — I wasn't sure we were going to have him. David Carroll, let me start with him for a moment, is associate professor of media design at the Parsons School of Design in New York City. He is probably one of the world's most important crusaders for data protection rights, and I just want to put that on the table: he is the one who created the most significant criminal liability for Cambridge Analytica, over his own voter profile. It's a fascinating thing — he was a subject of a Netflix documentary called The Great Hack. So David, it's great to see you here. We also have with us Renate Nikolay, head of cabinet for the Vice-President for Values and Transparency at the European Commission in Brussels. She has a fantastic Twitter feed — I have to tell you, Renate, I've been enjoying your tweets all day; I just know you have a great soul. And we have Markus Reinisch, who is joining us also virtually. He is Facebook's vice president for public policy for Europe, the Middle East, and Africa. So we've got different corners of this conversation here today, and I'm going to try to keep looking at the camera where my guests are. But to all of you — I'm going to tell those folks that are online: you can participate. Those in the room, or those watching wherever you are, file your questions in the GLOBSEC app; they will pop up on my little notepad here, and we'll be happy to bring them to the panel. Those of you in the room can also pose questions, and you'll do so at the microphone. I think, on the issue of starting with the questions: you know, I'm in media, right? And I've been watching media
become more and more fragmented, and the kind of division that follows. The discussion of what exactly media is, is up for grabs today, because there are a lot of players that do not have that DNA of objective distance in them, that notion of what public responsibility and accountability they should carry. That's just the media dimension, but there are a lot of other players, and there are platforms — like Facebook, which we have here, Twitter, Google, and many, many others as well. The rules across lines matter, and how we collectively shape a world together, particularly as I see it in transatlantic relations, dealing with some of the more insidious players that are out there, is very important. But let me ask David Carroll to help set the stage, if you would, David. You know, I really respect what you did in looking at the question of your own voter profile, which became part of the Cambridge Analytica story and part of a great series. I'm interested in what you think are the top two or three equities we should keep in mind as we discuss this issue of digital health, and how to get it right as we move forward, as opposed to just relitigating the past. Give us your top-line thoughts.

David Carroll: Well, thanks, Steve, it's great to be here — glad I made it at the last minute. The big lessons learned out of the Cambridge Analytica parable, for the future, are these. Number one: the need for a global standard for data rights. The ability of bad actors to exploit the differences between data protection regimes is a significant problem.
Basically, data does not respect borders; it emanates around the earth like an atmosphere. We have to understand it that way, and we risk a splinternet if we do not achieve a very strong global standard. The second issue: you can have the greatest laws on the books, but if they're not enforced, they really don't matter that much. And unfortunately, in the case of Cambridge Analytica, the ultimate victory — my achieving the full data request — came through an act of journalism, because somebody leaked the database to journalists. The data protection regulator was not able to fulfill its task. So even the best data protection authorities don't have the tools or the skills or the resources to tackle the biggest challenges of the day and of the future. And then the last thing is the problem with my story: it emphasizes personal responsibility for one's own data, but data is a collective issue, and privacy is a networked problem. It is not my fault that I had a profile collected, nor is it my fault that I spent huge amounts of money and many years trying to get at the truth, when really it affected every registered voter in the United States. And we haven't really reckoned with the way that privacy invasions affect entire communities, even whole populations.

Steve Clemons: Thank you for that, David. I want to encourage our three panelists to also feel free to react to one another as we weave through this. I'm going to get to Markus in just a moment, but Renate, I want to ask you a question. As you look at this broad subject of creating the scaffolding and a
platform, and a safe space for people's digital rights and digital privacy and digital communications — just getting all of those features right — one of the things I've been thinking about here at GLOBSEC, where there has been a lot of discussion of China and Russia, is tomorrow's global middle class. David Carroll just referred to this, in a way, with the fears of a splinternet. A lot of the future's net added global value, the global middle class, will not be in the transatlantic relationship. It will not be in Europe, it will not be in the United States; it's going to be in Southeast Asia, it's going to be in Asia proper, it's going to be in Africa. And these are places that, if you don't somehow bring them into this conversation — if you did have a splinternet, if you create rules that work in one place, but China, with its allies and its interests, is not part of that — don't you think we run the problem (we're going to get to the companies in a minute) of being left out, to some degree, of that growth? I would love to get your thoughts on what we need to get right, because we often talk about setting rules up, but we don't talk about what the economic consequences would be if you've got a divided world.

Renate Nikolay: Thank you, Steve, for the question. Indeed, I think it's quite important not to forget the global dimension of what we do here, because this digital transformation that we all live in is, in the end, something that will unite us all much more and facilitate global conversations. That's why global convergence of rules is so important, and I see a certain trend in that direction. I was very involved in
the making of the General Data Protection Regulation in the EU; we were a first mover in that context. Nevertheless, what we have seen in the three years since these new data protection rules have been in direct application is indeed a bigger convergence worldwide toward basic principles of data privacy. And of course it's always good if, transatlantically, we can show the way — and today is a good day to talk about that, because we just had President Biden in town for an EU-US summit and for a NATO summit, with very encouraging signals that the US is back and transatlantic cooperation is back. But what we are seeing globally is actually a convergence not only in the US and the EU but also in parts of Asia. For instance, in Japan and in Korea we have very similar data protection regimes, so that we could even consider these partners for adequacy findings on our side. So our approach in Europe is really to contribute, to inspire others, to work toward a bigger convergence worldwide among allies. Let's not fool ourselves: there will always be partners in the world with whom it will be much more difficult to align, because the basic principle of these strong privacy rights is that the individual is at the center — you have your rights, you can enforce them, and you can defend yourself. That is not shared across the globe. But I think if we are united and work in international forums, whether it's the United Nations or the Council of Europe, to aim for a bigger convergence, that's in the interest not only of citizens worldwide but also of business worldwide.

Steve Clemons: Renate, before
I jump to Markus — and we'll have a little fun with him — I want to ask you a question. A few years ago, at the Brussels Forum sponsored by the German Marshall Fund, I interviewed, I think it was, the EU commissioner for the digital single market; he was responsible for some of the rules that you wrote in the GDPR, which were just about to come online. And I was talking about how Europe regulated this sector differently from the United States, and I said: do you think, under the rules and regulations Europe was putting forward, you would get a Facebook being birthed in Europe, or a Google, or a Twitter? And he said no — and he said it on the record, so I can say it. He said: you Americans innovate; we Europeans regulate. That's our social contract. It was a provocative moment, and I just want to put that back to you, because that was your commissioner who said it, and ask: what do we risk losing if we're not careful about how we homogenize those rules? Because part of this discussion is also the incredible innovation in platforms that have developed in a remarkably short period of time. I just want to acknowledge what the commissioner said back then — but your thoughts, real quickly, Renate.

Renate Nikolay: Yeah, I think the situation has changed compared to a couple of years ago, when in Europe we had really just discovered the digital single market as a key objective on our side. Since then we have had quite an important discussion in Europe about digital innovation, even about digital sovereignty, and we see a lot of very promising startup activity and also a lot of investment flowing into this. And if you think about the recovery now, after COVID-19, there's a lot of money also
from the huge recovery package that we have put together in Europe to support our member states, going into the digital transformation. So I would argue that you are actually faced today with a more assertive Europe, one that is also catching up with the United States on the innovation side.

Steve Clemons: Thank you. Markus, we're going to move to you now, and I want to say something before I mention another company other than Facebook: a while back I had dinner with Jack Dorsey of Twitter. It was a private thing, to talk about where the line ought to be — who should tweet, should the president of the United States, you know, be de-tweeted? Where do you draw those lines? How do you enter a world where you're beginning to make censorship decisions? And I have to tell you, honestly, you could see the anguish in his face. So I don't think we should take these conversations lightly, because where you put those lines matters. But I want to ask you, Markus, about how you and Facebook — and I know you used to be at Vodafone, so you know this world well — what are the ways we should be thinking that we might not know, as you've approached some of the challenges of being one of the most popular and most impactful platforms when it comes to political discussions and political activism? Markus, your thoughts.

Markus Reinisch: Yeah — so first, thanks, Steve, and an extremely warm hello to Bratislava.

Steve Clemons: Where are you right now, by the way?

Markus Reinisch: I'm sitting just overlooking the roofs of London — that's where I'm based — enjoying what's left of the night we have. But, you know, the anguish you were describing, that Jack Dorsey was
displaying — that's the anguish that I display as well, almost on a daily basis, because we are pushed toward decisions that are ethically very, very critical, and the question is really who should make these decisions. I think we've said this, and I feel it personally: it shouldn't be us. It should be democratically elected representatives; it should be governments; it should be the European Commission and the co-legislators. There is a really, really delicate balance to strike. You have to strike a balance between freedom of speech and harmful speech; you have to strike a balance between safety and privacy; you have to decide what is truthful and what is deliberate manipulation. And this is not, I think, a role that a private company can ever play. I am therefore really, really happy to see, as a proud European, that we do have a European Commission, a European institution, that is actually going down this route — that is creating the frameworks that safeguard democracy and safeguard society. We have the alphabet soup of the DSA and the EDAP and so on, but these are really the instruments we need. And I just want to go back to Vice-President Ansip, who said, you know, America innovates and Europe regulates. First, that's not true — I don't recognize it; there is a lot of innovation going on in Europe. But even if it were true, Europe has developed a muscle for good, fair, cross-border regulation, and I think that's a very good contribution to make. However, Europe by itself will not be strong
enough to build the frameworks, the global standards, that David was referring to. It needs the transatlantic partnership with the US, and I think that President Biden currently being in Brussels is a fantastic sign that there will be much more cooperation to create a global standard — a standard, as well, that addresses the counter-model that comes from China, which I think is a concern for Europe and for America in equal measure.

Steve Clemons: I think David has slipped off; I think he'll be back in a moment, because I want to come back to him with a heavy question. But let me ask you both, Markus and Renate: one of my best libertarian pals from Germany has just walked in the door, so I'll channel that low-rules, don't-get-too-obsessed-with-government mindset in favor of my friend over here for a moment. As I think about this question, I typically think about political ecosystems as ones where the product of political pressures — the political gravity — takes us to an outcome. And I'm wondering about this space. As a blogger, one of the early political bloggers in the early 2000s, I used to tell people I didn't worry about alternative truth; I didn't worry about folks challenging facts; I didn't worry about scandals and conspiracy theories, because the internet at that time was so disciplining that people would jump in quickly to fix and amend, to tell me where I was wrong. But then you began to see tribes develop, and bubbles, and people who had alternative facts. And I'm interested — you as a platform, and you as a government official — in the big question I think we're struggling with at GLOBSEC, that is, elevating
democracy: saying we need healthy standards to preserve democracy. I sometimes wonder, if you look out at a great number of people in all of our countries, whether democracy is on their minds — or rather tribal autocracies, where rules set the way they want them are just fine with them. So what do the defenders of democracy do when an increasing number of people don't care about it? Let me ask Renate, and then Markus, to comment.

Renate Nikolay: Well, Steve, I think you're putting your finger in the wound. I think we have for far too long simply thought that we can take democracy for granted — that it doesn't need to be nurtured every day, that it doesn't need to be defended by all of us every day. And not least, the events we saw on Capitol Hill at the beginning of January demonstrated that there are no sacred guarantees for democracy; we really have to be out there to defend it. The other part I would add is that the phase you describe as an early blogger — that phase where everybody thought this was all going to be a bright future where we just connect people — that's over as well. The naivety about the digital transformation, that it's only good, is long over. And I think what we are struggling with is indeed to find the right balance between what you can regulate and what you can't regulate — where instead you need to equip society, in a whole-of-society approach, to be better prepared for it. And just to explain a little bit what we do in Europe about that: indeed, on democracy, we have gone out and come forward with regulation, but it is built on a number of years of very strong
experience with self-regulatory approaches and very close cooperation with the platforms — for instance, a code of conduct on online hate speech, or a code of practice against disinformation — which was very useful for all sides, to learn much more from each other. And we are now in a position to regulate some things: for instance, to require that platforms get rid of illegal content online, and if they don't, there are responsibilities, and there are sanctions that come with that. But on the other hand, we also must admit that if we don't want to create censorship — and, in that regard, pay tribute to authoritarian regimes — we need to keep the internet a free space, and that means you cannot regulate everything. There are limits to what you can do on disinformation with hard regulation, and that's where a co-regulation approach comes in. This is where Europe has developed, or is about to develop, an interesting, unique mix: regulation where it matters, and co-regulatory approaches in close cooperation with civil society, with media, with platforms, to actually address the vulnerabilities of democracy — so that we are better equipped when there are disinformation campaigns, whether they are malicious, coming even from outside forces, or whether they simply arise in the echo chambers. That's what we risk.

Steve Clemons: Your candor is really appreciated. Markus, let me ask you that same question, but also ask: as Facebook and your colleagues look at it — these large, major, global, transnational platforms — do you see yourself as a place where you can have a federation of tribes, and that's fine? Or do you see the democratic dimension and the civil rights
and, you know, the civic justice elements of a democracy as part of what you're trying to inculcate in your users? I'd be interested in whether that's one of the points of tension you're struggling with.

Markus Reinisch: Yeah, absolutely. And again, I echo what Renate has said: we need frameworks, and we're working on frameworks, specifically in Europe; we need co-regulation. But we need actions from platforms like us as well. We have a responsibility — not just a legal one, but a moral responsibility — to keep people safe, to protect societies, to protect democracy. And what we do is, obviously, tackle head-on misinformation, tackle head-on hate speech, tackle head-on interference in elections. There are plenty of measures we take: deleting millions of fake accounts every day, expanding a network of independent fact-checkers to identify misinformation, label it, and demote it. There is so much that we can do, and by doing this — back to your point, David — we are actually creating almost a global standard for this, because we are the only organization that has that global reach. But we also have a counter-trend, and I think you mentioned this before, maybe also accelerated by COVID: we have much more assertive governments, and they are becoming more assertive when it comes to digital matters — more assertive about what content should stay on the platform. And sometimes that content is regime-critical content, opposition content; that is a really difficult situation for us. So we have this tension: on the one side, a cross-border service that
tries to reflect values of freedom of speech, of human rights, and so on; but we also have assertiveness nationally, and we see fragmentation. We see this fragmentation in Europe: we have individual national content laws while we're trying to work within a European framework and eventually, hopefully, a global one. But what you mentioned — this tension — is inherent, and it plays out between our global community standards, which are the rules that govern what can be kept on the platform and what is taken off; regional regulation and frameworks; national regulation; and then individual demands from governments across the globe. So it's a very, very difficult and heavy mix for us.

Steve Clemons: Well, thank you. I think we've got David back online — there's David. David, while you were away doing something else, I asked the question of whether people online, and increasingly in our nations, our citizens, are as obsessed with preserving democracy as many people at this conference are — or whether they want to see the ascension of their tribe, and a different set of rules, and they're really not empathetic or concerned across sides. I know you are, and I'm just interested, as an American citizen in this space: do you worry that the bifurcation of interest that is out there means that, by design, we may make some mistakes? I mean, as I deal every day in the media space — and look, The Hill is a very centrist place, which means we have all sides screaming at us — when you look at that, you see very different gravitational force fields depending on where you are politically: how you think, what matters, what you prioritize, how you see the
world. And as we talk about homogenizing rules and systems, those protect individuals who really don't care about that — they care about their side winning. I'm just asking what your thoughts are.

David Carroll: Well, thank you. It's no coincidence that President Biden's infrastructure plan is calling for improvements to broadband, because I'm in a rural area and I'm having trouble with my internet here. So even the basics still have a long way to go. But with regard to your question — just speaking personally — during the pandemic I relocated to a rural area, and some of my neighbors have opposing political views. So it's a really important moment to just meet neighbors and reconnect. We have an urban-rural divide in the US that is further exacerbated by the ways that social media and digital marketing segment us into like-mindedness, for the purposes of better-performing advertising measurements. And so it takes individual citizens to work against this in very personal, one-to-one ways. For me, the reason to pursue the Cambridge Analytica case was that I was very worried about the way that advertising technology, social media technology, and the military-industrial complex were all converging to threaten the very notion of self-governance and self-determination. So I wanted to lift the veil on how manipulation was occurring, right there on the surface and below the surface, so that people could see. And indeed, when people see inside the machine, that is the most effective way to limit its impact: once we can see it, it is less effective at dividing us.

Steve Clemons: Thank you for that. Let me ask you all
another question — and I'll remind you all that you can ask questions through the app. Someone asked me a defense-industry question, which I'm not sure fits, but I'm going to ask it if we have no others; and you can also ask questions here, in just a little bit, at the microphone. But we have a great innovator from Slovakia sitting in here, the CEO of Tachyum — he's a big giant in the artificial intelligence world — and I remember talking to Eric Schmidt of Google at the time. I asked him: how do you think you're going to get the publics in our world to trust AI? And he said: well, AI is going to fix bad health diagnoses, because the health diagnostic system is so horrible — that'll be one side of it. The other thing it will fix right away is identity theft and fraud in finance. He said there are going to be practical ways that AI comes in and changes the prospects and the welfare of people. In this conversation — as I listened to Markus talk about, I don't know, the hundreds of thousands of people out there, I don't know how many you've got, basically getting rid of the millions of bad messages and whatnot — it's an active thing you're having to do, to remove those messages. And I'm just interested in whether we are locked in the wrong frame for thinking about this: a lot of this online health could in fact be put in place by the advances that are coming in AI. Can we build into the systems we're producing something that more automatically produces healthy outcomes? Or is that naive of me, Markus?

Markus Reinisch: Well, you know, first — you mentioned this already — AI is not just there to help elites and to help
big technology companies; it's there to help users and individual citizens. And actually, ironically, the place where we deploy the most, and the most sophisticated, AI is in determining and finding harmful content — hate speech and so on. As I mentioned, 97 percent of hate speech is detected before it is reported, and that's done through AI. So AI is already built to benefit the user. But there's no question that it requires much more transparency, and I agree here with David: that's the concern — the concern that it is a black box and you don't know what's going on in there. Actually, when you look at it, it's relatively simple: if you look at your Facebook News Feed, it's fairly simple why things are ranked the way they rank. But I do believe that there is a need for a framework to build in the ethical element of AI, and again, I'm coming back to this: it is not something you can leave to private companies; I don't think it's even something you can leave to self-regulation. And therefore the attempts, again here in Europe — and Europe is a leader in these things — are extremely welcome: to have a framework for how AI should be deployed by companies, but also by public bodies. Because it's not just us who deal with sensitive data. I would argue that actually some of the public institutions — think about health organizations, health services — have much more sensitive data that, combined with AI, has potentially concerning outcomes. So I think the answer is tackling it through a regulatory framework — an open, transparent regulatory framework — while at the same time putting an onus on companies like ours to be transparent, and then to actually also recognize
actually also 927 00:28:28,640 --> 00:28:29,600 recognize 928 00:28:29,600 --> 00:28:33,440 that ai is something that is used for 929 00:28:33,440 --> 00:28:35,120 you it is not there to polarize 930 00:28:35,120 --> 00:28:37,039 people or to keep people on the 931 00:28:37,039 --> 00:28:37,520 platform 932 00:28:37,520 --> 00:28:40,480 as is often alleged because we're an 933 00:28:40,480 --> 00:28:41,679 advertising business 934 00:28:41,679 --> 00:28:43,600 you know we have advertisers that don't 935 00:28:43,600 --> 00:28:45,440 want to see their advertisement next to 936 00:28:45,440 --> 00:28:46,960 polarizing content so there is an 937 00:28:46,960 --> 00:28:47,679 incentive 938 00:28:47,679 --> 00:28:50,320 to use ai for good and not for bad well 939 00:28:50,320 --> 00:28:50,880 before i 940 00:28:50,880 --> 00:28:53,039 jump to renate and ask her the same 941 00:28:53,039 --> 00:28:54,159 thing about ai 942 00:28:54,159 --> 00:28:56,640 and how europe uses it as 943 00:28:56,640 --> 00:28:59,520 opposed to china or how facebook uses 944 00:28:59,520 --> 00:29:01,200 facial recognition as opposed to how 945 00:29:01,200 --> 00:29:02,960 china uses facial recognition 946 00:29:02,960 --> 00:29:04,880 marcus let me ask you one other question 947 00:29:04,880 --> 00:29:06,880 you know facebook has made the decision 948 00:29:06,880 --> 00:29:08,880 to say bye-bye to president trump for a 949 00:29:08,880 --> 00:29:10,159 couple of years 950 00:29:10,159 --> 00:29:12,159 and you went through a process for doing 951 00:29:12,159 --> 00:29:14,559 that there are a lot of other autocratic 952 00:29:14,559 --> 00:29:17,360 leaders and i remember when emmanuel 953 00:29:17,360 --> 00:29:18,000 macron 954 00:29:18,000 --> 00:29:21,039 spoke to globesec he said very 955 00:29:21,039 --> 00:29:21,919 bluntly in 956 00:29:21,919 --> 00:29:24,960 english the populists in europe are 957 00:29:24,960 --> 00:29:26,799 lying to their citizens 958 00:29:26,799 --> 00:29:28,799 and so i'm interested in whether that 959 00:29:28,799 --> 00:29:30,640 decision on donald trump 960 00:29:30,640 --> 00:29:33,200 leads to the necessary banning of other 961 00:29:33,200 --> 00:29:34,080 leaders 962 00:29:34,080 --> 00:29:36,559 who mimic some of those behaviors 963 00:29:36,559 --> 00:29:37,679 what are your thoughts real quick and 964 00:29:37,679 --> 00:29:39,200 then we'll jump to renate 965 00:29:39,200 --> 00:29:41,120 look our policies even before the 966 00:29:41,120 --> 00:29:42,640 decision on trump 967 00:29:42,640 --> 00:29:44,320 never allowed that public figures can 968 00:29:44,320 --> 00:29:46,159 incite violence i mean 969 00:29:46,159 --> 00:29:47,600 there's not necessarily a change to our 970 00:29:47,600 --> 00:29:49,200 policy but definitely 971 00:29:49,200 --> 00:29:50,559 what happened on the 6th of january in 972 00:29:50,559 --> 00:29:53,279 the us is obviously a pivotal moment 973 00:29:53,279 --> 00:29:55,200 and we had to make this drastic decision 974 00:29:55,200 --> 00:29:56,640 and it was the right decision and as you 975 00:29:56,640 --> 00:29:57,440 rightly say 976 00:29:57,440 --> 00:29:59,279 this was a fundamental decision which we 977 00:29:59,279 --> 00:30:00,720 put in front of our external 978 00:30:00,720 --> 00:30:03,120 independent oversight board and they 979 00:30:03,120 --> 00:30:04,480 came back and they said 980 00:30:04,480 --> 00:30:06,799 look number one your decision was right 981 00:30:06,799 -->
00:30:08,080 yeah you had to 982 00:30:08,080 --> 00:30:11,279 take trump off the platform number two 983 00:30:11,279 --> 00:30:13,200 banning him indefinitely that is more 984 00:30:13,200 --> 00:30:15,120 problematic because your 985 00:30:15,120 --> 00:30:17,200 process was not transparent it's not clear why you did 986 00:30:17,200 --> 00:30:17,840 that 987 00:30:17,840 --> 00:30:20,159 and it's back to you you need to find a 988 00:30:20,159 --> 00:30:21,360 transparent process that you 989 00:30:21,360 --> 00:30:22,559 communicate properly so 990 00:30:22,559 --> 00:30:24,159 that's where we 991 00:30:24,159 --> 00:30:26,159 ended up but what we do as a 992 00:30:26,159 --> 00:30:27,679 response to the decision of the 993 00:30:27,679 --> 00:30:29,840 oversight board is to create standards 994 00:30:29,840 --> 00:30:30,880 for public figures 995 00:30:30,880 --> 00:30:32,240 that don't just apply to one public 996 00:30:32,240 --> 00:30:33,919 figure to ex-president trump but 997 00:30:33,919 --> 00:30:34,960 actually apply 998 00:30:34,960 --> 00:30:37,520 to all politicians to all public persons 999 00:30:37,520 --> 00:30:38,960 who potentially abuse 1000 00:30:38,960 --> 00:30:40,240 the ability of the platform to 1001 00:30:40,240 --> 00:30:42,320 communicate and thereby try to 1002 00:30:42,320 --> 00:30:44,799 incite violence and cause harm thank you 1003 00:30:44,799 --> 00:30:46,320 for that renata let me ask you to 1004 00:30:46,320 --> 00:30:47,120 comment about 1005 00:30:47,120 --> 00:30:50,000 ai and technologies as you see them i 1006 00:30:50,000 --> 00:30:50,320 mean 1007 00:30:50,320 --> 00:30:52,240 brad smith who's also speaking here the 1008 00:30:52,240 --> 00:30:54,080 president of microsoft wrote about 1009 00:30:54,080 --> 00:30:56,960 advances in technology as tools or 1010 00:30:56,960 --> 00:30:57,840 weapons 1011 00:30:57,840 --> 00:30:59,120 and so we're talking about things that 1012 00:30:59,120 --> 00:31:01,360 can be used either way but tell us 1013 00:31:01,360 --> 00:31:02,640 how you think about 1014 00:31:02,640 --> 00:31:05,039 you know ai and other techniques 1015 00:31:05,039 --> 00:31:06,000 coming into 1016 00:31:06,000 --> 00:31:08,000 the system to help perhaps right the 1017 00:31:08,000 --> 00:31:09,919 ship when it comes to digital health 1018 00:31:09,919 --> 00:31:12,399 and digital privacy and you know digital 1019 00:31:12,399 --> 00:31:15,919 accountability and responsibility 1020 00:31:15,919 --> 00:31:18,240 i strongly believe that we need to kind 1021 00:31:18,240 --> 00:31:20,399 of structure systems where technology 1022 00:31:20,399 --> 00:31:21,039 serves 1023 00:31:21,039 --> 00:31:23,360 the people and not the other way around 1024 00:31:23,360 --> 00:31:26,080 and for instance we are already using ai 1025 00:31:26,080 --> 00:31:28,640 systems very well today in 1026 00:31:28,640 --> 00:31:30,399 putting order on the internet when 1027 00:31:30,399 --> 00:31:32,000 it comes to illegal behavior 1028 00:31:32,000 --> 00:31:33,679 there was even a couple of years 1029 00:31:33,679 --> 00:31:36,159 back a global initiative 1030 00:31:36,159 --> 00:31:38,799 to fight against child sexual abuse 1031 00:31:38,799 --> 00:31:39,600 online 1032 00:31:39,600 --> 00:31:42,799 and here ai was the solution to 1033 00:31:42,799 --> 00:31:44,320 actually detect the photos 1034 00:31:44,320 --> 00:31:47,039 very rapidly so just to show you 1035 00:31:47,039 -->
00:31:49,279 that you know even before the hype on ai 1036 00:31:49,279 --> 00:31:50,000 regulation 1037 00:31:50,000 --> 00:31:52,240 started emerging we've been actually 1038 00:31:52,240 --> 00:31:54,399 using it to put order in the wild west 1039 00:31:54,399 --> 00:31:55,679 of the internet 1040 00:31:55,679 --> 00:31:58,080 but indeed in europe we have also put 1041 00:31:58,080 --> 00:31:59,840 regulation on the table 1042 00:31:59,840 --> 00:32:03,039 the so-called ai act which is very much 1043 00:32:03,039 --> 00:32:05,679 driven by the idea of having you know 1044 00:32:05,679 --> 00:32:08,559 ethics by design a human-centric 1045 00:32:08,559 --> 00:32:09,600 approach in ai 1046 00:32:09,600 --> 00:32:12,240 because we want to use the opportunities of 1047 00:32:12,240 --> 00:32:14,080 this technology we want to have better 1048 00:32:14,080 --> 00:32:15,679 health we want to have solutions on 1049 00:32:15,679 --> 00:32:17,840 mobility that are more adapted to our 1050 00:32:17,840 --> 00:32:18,399 needs 1051 00:32:18,399 --> 00:32:20,240 we want to use it in all sectors we also 1052 00:32:20,240 --> 00:32:22,000 want to catch up with regard to 1053 00:32:22,000 --> 00:32:23,360 the huge investments 1054 00:32:23,360 --> 00:32:24,960 that are already happening in the u.s 1055 00:32:24,960 --> 00:32:26,480 and in china on ai 1056 00:32:26,480 --> 00:32:28,399 and this is a collective exercise with 1057 00:32:28,399 --> 00:32:30,159 member states that's what member states 1058 00:32:30,159 --> 00:32:31,679 want to do with us in the eu 1059 00:32:31,679 --> 00:32:32,640 institutions 1060 00:32:32,640 --> 00:32:35,039 but it's very clear that we have to take 1061 00:32:35,039 --> 00:32:36,880 a different approach from some of our 1062 00:32:36,880 --> 00:32:38,159 partners in the world 1063 00:32:38,159 --> 00:32:40,720 we don't want social scoring like in 1064 00:32:40,720 --> 00:32:41,279 china 1065 00:32:41,279 --> 00:32:43,360 we don't think that this is 1066 00:32:43,360 --> 00:32:45,120 something that we want in the european 1067 00:32:45,120 --> 00:32:45,840 area 1068 00:32:45,840 --> 00:32:48,159 so we have made a choice in this ai act 1069 00:32:48,159 --> 00:32:50,320 to also say there are risky areas 1070 00:32:50,320 --> 00:32:51,840 where we want to have a high 1071 00:32:51,840 --> 00:32:54,159 level of regulatory observation 1072 00:32:54,159 --> 00:32:56,240 and supervision because it can really 1073 00:32:56,240 --> 00:32:57,840 turn things around when you're talking for 1074 00:32:57,840 --> 00:33:00,159 instance about using ai in recruitment 1075 00:33:00,159 --> 00:33:03,440 or in public services you know 1076 00:33:03,440 --> 00:33:04,640 and if you want to have a 1077 00:33:04,640 --> 00:33:06,559 sustainable growth 1078 00:33:06,559 --> 00:33:09,200 perspective on ai technology that is 1079 00:33:09,200 --> 00:33:10,000 really 1080 00:33:10,000 --> 00:33:12,240 trustworthy and that consumers and 1081 00:33:12,240 --> 00:33:13,440 citizens in europe 1082 00:33:13,440 --> 00:33:15,440 fully have trust in then you need to 1083 00:33:15,440 --> 00:33:17,360 build it in from the beginning as we 1084 00:33:17,360 --> 00:33:19,120 have done with privacy because the whole 1085 00:33:19,120 --> 00:33:21,200 concept of the data protection rules 1086 00:33:21,200 --> 00:33:23,279 was to have privacy by design 1087 00:33:23,279 --> 00:33:25,200 innovation and that's what we are trying 1088 00:33:25,200 --> 00:33:27,840 to
repeat with the ai act very 1089 00:33:27,840 --> 00:33:28,880 much driven by 1090 00:33:28,880 --> 00:33:31,200 we want to support innovation but in 1091 00:33:31,200 --> 00:33:32,399 areas where we 1092 00:33:32,399 --> 00:33:34,720 think there might be high-risk areas we 1093 00:33:34,720 --> 00:33:37,039 want to supervise a bit more thoroughly 1094 00:33:37,039 --> 00:33:38,960 and as this is a technological 1095 00:33:38,960 --> 00:33:40,799 revolution that is changing 1096 00:33:40,799 --> 00:33:42,799 every day where none of us has the 1097 00:33:42,799 --> 00:33:44,799 real concept for the next 10 1098 00:33:44,799 --> 00:33:47,440 years it has to be a flexible concept 1099 00:33:47,440 --> 00:33:49,840 where we can also add high-risk sectors 1100 00:33:49,840 --> 00:33:51,440 that only emerge when we see the 1101 00:33:51,440 --> 00:33:52,799 applications coming 1102 00:33:52,799 --> 00:33:54,880 so it's really an interesting way of 1103 00:33:54,880 --> 00:33:56,000 dealing with 1104 00:33:56,000 --> 00:33:58,559 a regulatory challenge but it's a 1105 00:33:58,559 --> 00:34:00,799 very common and united kind of front 1106 00:34:00,799 --> 00:34:02,000 that has been building up 1107 00:34:02,000 --> 00:34:04,240 here in the eu where we say what we want 1108 00:34:04,240 --> 00:34:05,519 where we want to lead the way 1109 00:34:05,519 --> 00:34:07,519 hopefully inspire others but also make a 1110 00:34:07,519 --> 00:34:09,520 clear difference between us and 1111 00:34:09,520 --> 00:34:10,480 others 1112 00:34:10,480 --> 00:34:12,079 right and so in just a few minutes we're 1113 00:34:12,079 --> 00:34:14,000 going to go to questions again 1114 00:34:14,000 --> 00:34:15,918 there are some good ones being populated 1115 00:34:15,918 --> 00:34:17,679 over the app and i'll invite people 1116 00:34:17,679 --> 00:34:18,719 up to the mic here and we'll go as 1117 00:34:18,719 --> 00:34:19,918 quickly as we can but let me just 1118 00:34:19,918 --> 00:34:22,000 finally ask david a question and 1119 00:34:22,000 --> 00:34:23,918 i don't know exactly how to ask this but 1120 00:34:23,918 --> 00:34:25,359 i'm going to you know david 1121 00:34:25,359 --> 00:34:26,800 with our colleague and our friend here 1122 00:34:26,800 --> 00:34:28,560 from facebook facebook has 1123 00:34:28,560 --> 00:34:29,440 developed 1124 00:34:29,440 --> 00:34:32,560 community standards it has signed on 1125 00:34:32,560 --> 00:34:34,560 as one of the signatories to codes of 1126 00:34:34,560 --> 00:34:35,760 practice it has 1127 00:34:35,760 --> 00:34:37,839 made decisions that we've seen about you 1128 00:34:37,839 --> 00:34:39,679 know political ads twitter and other 1129 00:34:39,679 --> 00:34:41,280 places have done this as well 1130 00:34:41,280 --> 00:34:43,918 but marcus has shared with us today that 1131 00:34:43,918 --> 00:34:46,000 companies even leagues of companies 1132 00:34:46,000 --> 00:34:48,079 cannot do this on their own they need 1133 00:34:48,079 --> 00:34:49,839 governments they need partnership 1134 00:34:49,839 --> 00:34:51,520 to do this and i'm just interested in 1135 00:34:51,520 --> 00:34:53,359 that legal scaffolding if you look at 1136 00:34:53,359 --> 00:34:54,639 the united states 1137 00:34:54,639 --> 00:34:56,800 the united states still does not have a 1138 00:34:56,800 --> 00:34:58,560 national privacy standard 1139 00:34:58,560 --> 00:35:00,800 europe does california does but the 1140 00:35:00,800 --> 00:35:02,240 united
states is not there a lot of 1141 00:35:02,240 --> 00:35:03,680 things where europe has moved ahead just 1142 00:35:03,680 --> 00:35:04,960 to go back to that minister 1143 00:35:04,960 --> 00:35:06,320 and jumped ahead with regulatory 1144 00:35:06,320 --> 00:35:08,720 approaches that the u.s government has 1145 00:35:08,720 --> 00:35:09,920 not taken 1146 00:35:09,920 --> 00:35:12,160 and i'm just interested since you 1147 00:35:12,160 --> 00:35:13,920 spent so much money 1148 00:35:13,920 --> 00:35:15,440 fighting for your rights in the american 1149 00:35:15,440 --> 00:35:17,599 legal system just real quickly what's 1150 00:35:17,599 --> 00:35:19,119 missing what could help 1151 00:35:19,119 --> 00:35:21,359 move this forward and i'm not talking about 1152 00:35:21,359 --> 00:35:22,960 companies i'm talking about 1153 00:35:22,960 --> 00:35:25,920 public policy washington d.c what 1154 00:35:25,920 --> 00:35:28,880 needs to be there that's not 1155 00:35:28,880 --> 00:35:31,119 sure thank you and hopefully my 1156 00:35:31,119 --> 00:35:32,079 connection will hold 1157 00:35:32,079 --> 00:35:35,359 the key issue the one correction 1158 00:35:35,359 --> 00:35:37,359 i would make is i didn't spend a dollar 1159 00:35:37,359 --> 00:35:39,119 of legal effort in the united states i 1160 00:35:39,119 --> 00:35:41,280 spent it all in the uk 1161 00:35:41,280 --> 00:35:44,720 because the uk and europe have 1162 00:35:44,720 --> 00:35:47,599 the basic right to one's own data the 1163 00:35:47,599 --> 00:35:48,800 right to know 1164 00:35:48,800 --> 00:35:50,800 and we don't have this right in the 1165 00:35:50,800 --> 00:35:51,920 united states 1166 00:35:51,920 --> 00:35:55,119 now it's available in california and 1167 00:35:55,119 --> 00:35:57,599 most recently the state of virginia 1168 00:35:57,599 --> 00:36:00,320 this is the most basic building block to 1169 00:36:00,320 --> 00:36:00,800 solve 1170 00:36:00,800 --> 00:36:02,720 many of the above conversations 1171 00:36:02,720 --> 00:36:04,400 especially the one that was just 1172 00:36:04,400 --> 00:36:07,119 going on regarding ai and accountability 1173 00:36:07,119 --> 00:36:08,480 of ai 1174 00:36:08,480 --> 00:36:11,520 the key idea that we tried to challenge 1175 00:36:11,520 --> 00:36:14,640 was the requirement of the company to 1176 00:36:14,640 --> 00:36:16,160 disclose 1177 00:36:16,160 --> 00:36:20,640 um all of the key aspects of 1178 00:36:24,000 --> 00:36:27,280 and uh 1179 00:36:27,280 --> 00:36:28,800 oh no you're back you're back keep going 1180 00:36:28,800 --> 00:36:31,040 david keep going keep going 1181 00:36:31,040 --> 00:36:34,560 i see you um thank you i'll try to get 1182 00:36:34,560 --> 00:36:35,119 it out 1183 00:36:35,119 --> 00:36:37,440 the key aspects of the cambridge 1184 00:36:37,440 --> 00:36:38,480 analytica case 1185 00:36:38,480 --> 00:36:40,960 were how did they build these profiles 1186 00:36:40,960 --> 00:36:43,440 where did they get the data sources from 1187 00:36:43,440 --> 00:36:45,680 how did their algorithms work under 1188 00:36:45,680 --> 00:36:47,599 uk law they were required to disclose 1189 00:36:47,599 --> 00:36:50,000 that and they managed to conceal it 1190 00:36:50,000 --> 00:36:52,079 so we still have a lot of work to do in 1191 00:36:52,079 --> 00:36:53,839 enforcing the transparency that's 1192 00:36:53,839 --> 00:36:55,599 already mandated in europe 1193 00:36:55,599 --> 00:36:57,920 and how to build similar requirements in 1194 00:36:57,920 -->
00:37:00,320 the united states and other countries 1195 00:37:00,320 --> 00:37:03,280 that are working on regulation to 1196 00:37:03,280 --> 00:37:05,119 create this global standard 1197 00:37:05,119 --> 00:37:07,359 to create the adequacy for international 1198 00:37:07,359 --> 00:37:10,079 trade but the fundamental bedrock is the 1199 00:37:10,079 --> 00:37:11,599 right to know 1200 00:37:11,599 --> 00:37:14,800 and we have so much work to do to make 1201 00:37:14,800 --> 00:37:16,720 that visible to individuals 1202 00:37:16,720 --> 00:37:20,079 a key moment that i'll try to 1203 00:37:20,079 --> 00:37:23,920 capture was when channel 4 1204 00:37:23,920 --> 00:37:26,640 showed a black voter a black 1205 00:37:26,640 --> 00:37:28,240 woman in wisconsin 1206 00:37:28,240 --> 00:37:30,800 her own voter profile and 1207 00:37:30,800 --> 00:37:33,200 showed how she was marked for deterrence 1208 00:37:33,200 --> 00:37:35,520 to be discouraged from participating in 1209 00:37:35,520 --> 00:37:37,280 the 2016 election 1210 00:37:37,280 --> 00:37:40,880 and when she saw that her response was 1211 00:37:40,880 --> 00:37:43,680 this makes me want to vote even more 1212 00:37:43,680 --> 00:37:44,560 meaning in the 1213 00:37:44,560 --> 00:37:48,800 2020 election and so the simple transparency 1214 00:37:48,800 --> 00:37:52,320 of what is behind the screen i think is 1215 00:37:52,320 --> 00:37:53,920 the most important thing that we need to 1216 00:37:53,920 --> 00:37:54,400 achieve 1217 00:37:54,400 --> 00:37:57,760 first and then all other data protection 1218 00:37:57,760 --> 00:38:00,320 issues and ai issues and ethical issues 1219 00:38:00,320 --> 00:38:02,160 and regulatory issues 1220 00:38:02,160 --> 00:38:05,680 sit upon that we need to make 1221 00:38:05,680 --> 00:38:08,320 the data rights issues central to these 1222 00:38:08,320 --> 00:38:09,680 conversations 1223 00:38:09,680 --> 00:38:11,359 thank you so folks what we're going to 1224 00:38:11,359 --> 00:38:13,040 do now i'm going to ask you to be as 1225 00:38:13,040 --> 00:38:15,119 brief as possible so that we can kind of 1226 00:38:15,119 --> 00:38:17,520 cook through some of these questions let 1227 00:38:17,520 --> 00:38:19,440 me start with one from joe just to talk 1228 00:38:19,440 --> 00:38:21,119 about bottom line issues this is an 1229 00:38:21,119 --> 00:38:22,160 interesting question i hadn't thought 1230 00:38:22,160 --> 00:38:22,960 about before he says 1231 00:38:22,960 --> 00:38:25,119 from an ontological point of view do you 1232 00:38:25,119 --> 00:38:26,560 believe there's something like a 1233 00:38:26,560 --> 00:38:27,599 universal 1234 00:38:27,599 --> 00:38:30,480 global norm of freedom of expression 1235 00:38:30,480 --> 00:38:32,000 including limitations renata your 1236 00:38:32,000 --> 00:38:33,839 thoughts real fast 1237 00:38:33,839 --> 00:38:35,760 well we have you know international 1238 00:38:35,760 --> 00:38:38,320 human rights standards even in the 1239 00:38:38,320 --> 00:38:40,400 united nations and there freedom of 1240 00:38:40,400 --> 00:38:42,079 expression is clearly included and 1241 00:38:42,079 --> 00:38:43,599 that's why i would say there's 1242 00:38:43,599 --> 00:38:46,000 already quite a global convergence on 1243 00:38:46,000 --> 00:38:47,760 the importance of freedom of expression 1244 00:38:47,760 --> 00:38:49,520 where we differ a little bit is 1245 00:38:49,520 --> 00:38:51,440 whether that's absolute or how it can be 1246
00:38:51,440 --> 00:38:53,200 reconciled with other interests such as 1247 00:38:53,200 --> 00:38:54,400 security 1248 00:38:54,400 --> 00:38:55,760 marcus i think you would have 1249 00:38:55,760 --> 00:38:57,520 thoughts on that is there a 1250 00:38:57,520 --> 00:38:59,520 kind of universal building block when 1251 00:38:59,520 --> 00:39:01,359 uh 1252 00:39:01,359 --> 00:39:03,040 mark zuckerberg put this all together 1253 00:39:03,040 --> 00:39:05,040 was there sort of a unit of belief in 1254 00:39:05,040 --> 00:39:06,640 that element of expression 1255 00:39:06,640 --> 00:39:08,079 that was part of the dna of your 1256 00:39:08,079 --> 00:39:10,560 platform 1257 00:39:10,560 --> 00:39:12,880 yeah it's the same answer you know that 1258 00:39:12,880 --> 00:39:14,560 there are global standards 1259 00:39:14,560 --> 00:39:16,720 the issue here is that we have 1260 00:39:16,720 --> 00:39:18,320 either a lack of enforcement you know 1261 00:39:18,320 --> 00:39:19,839 coming back to david's theme 1262 00:39:19,839 --> 00:39:22,880 on privacy or a different 1263 00:39:22,880 --> 00:39:24,480 interpretation and sometimes the 1264 00:39:24,480 --> 00:39:25,520 interpretation 1265 00:39:25,520 --> 00:39:27,440 is that people don't recognize some 1266 00:39:27,440 --> 00:39:29,040 of these human rights and these rights 1267 00:39:29,040 --> 00:39:30,320 of freedom of speech 1268 00:39:30,320 --> 00:39:33,440 but also i think more subtly and i 1269 00:39:33,440 --> 00:39:34,960 think renate made the point already is 1270 00:39:34,960 --> 00:39:35,680 that 1271 00:39:35,680 --> 00:39:37,359 the absoluteness of them and i think 1272 00:39:37,359 --> 00:39:38,880 it's just you know as a european 1273 00:39:38,880 --> 00:39:40,320 working for an american company 1274 00:39:40,320 --> 00:39:40,720 there's 1275 00:39:40,720 --> 00:39:43,599 a different ranking of where 1276 00:39:43,599 --> 00:39:45,359 freedom of speech stands and where 1277 00:39:45,359 --> 00:39:47,280 the integrity of the individual stands 1278 00:39:47,280 --> 00:39:49,119 and i think this is normal 1279 00:39:49,119 --> 00:39:49,760 because 1280 00:39:49,760 --> 00:39:52,800 speech is a cultural thing and sometimes 1281 00:39:52,800 --> 00:39:55,200 it's impossible to really have 1282 00:39:55,200 --> 00:39:57,359 deep frameworks that cover everything 1283 00:39:57,359 --> 00:39:59,040 and go across the globe 1284 00:39:59,040 --> 00:40:00,960 so it's a big big challenge and back to 1285 00:40:00,960 --> 00:40:02,560 the question you asked before 1286 00:40:02,560 --> 00:40:04,160 i think you know 1287 00:40:04,160 --> 00:40:06,560 we have to hold on to global 1288 00:40:06,560 --> 00:40:07,680 standards in our 1289 00:40:07,680 --> 00:40:10,319 community standards but that is 1290 00:40:10,319 --> 00:40:12,160 challenged on a daily basis 1291 00:40:12,160 --> 00:40:13,760 of course what's going through my mind 1292 00:40:13,760 --> 00:40:15,280 right now is belarus 1293 00:40:15,280 --> 00:40:17,200 and what we saw happen to a blogger when 1294 00:40:17,200 --> 00:40:19,040 they pulled down a plane a blogger who actually had 1295 00:40:19,040 --> 00:40:20,720 that universal right of expression 1296 00:40:20,720 --> 00:40:22,240 that's just going through my head right now we 1297 00:40:22,240 --> 00:40:23,839 got a question right here yes go ahead 1298 00:40:23,839 --> 00:40:24,880 hi i'm ayman 1299 00:40:24,880 --> 00:40:26,319 from lebanon
i run a freedom of 1300 00:40:26,319 --> 00:40:27,680 expression organization in the mena 1301 00:40:27,680 --> 00:40:28,160 region 1302 00:40:28,160 --> 00:40:30,880 how's it going oh badly this is why i'm 1303 00:40:30,880 --> 00:40:32,640 asking marcus 1304 00:40:32,640 --> 00:40:34,720 you mentioned the great progress that 1305 00:40:34,720 --> 00:40:36,240 facebook made in terms of fighting 1306 00:40:36,240 --> 00:40:37,599 disinformation through fact-checking 1307 00:40:37,599 --> 00:40:38,400 networks 1308 00:40:38,400 --> 00:40:40,400 fighting election interference and 1309 00:40:40,400 --> 00:40:41,760 fighting hate speech 1310 00:40:41,760 --> 00:40:44,960 well i'm inspired by a great book 1311 00:40:44,960 --> 00:40:47,599 beware of small states by david hirst to 1312 00:40:47,599 --> 00:40:48,560 ask you about 1313 00:40:48,560 --> 00:40:50,560 supporting democracy where it's more 1314 00:40:50,560 --> 00:40:52,000 fragile where it's less 1315 00:40:52,000 --> 00:40:54,480 established where it needs to be further 1316 00:40:54,480 --> 00:40:55,200 nurtured 1317 00:40:55,200 --> 00:40:56,640 yes there is progress in terms of 1318 00:40:56,640 --> 00:40:58,720 disinformation in other languages 1319 00:40:58,720 --> 00:41:00,640 but specifically election interference 1320 00:41:00,640 --> 00:41:01,760 in smaller countries 1321 00:41:01,760 --> 00:41:04,160 and fragile democracies right under your 1322 00:41:04,160 --> 00:41:05,440 jurisdiction tunisia 1323 00:41:05,440 --> 00:41:08,720 algeria iraq lebanon a horrible 1324 00:41:08,720 --> 00:41:10,800 information marketplace same for hate 1325 00:41:10,800 --> 00:41:12,880 speech and an uneven level of attention 1326 00:41:12,880 --> 00:41:14,480 right during the latest israel-palestine 1327 00:41:14,480 --> 00:41:16,079 conflict right the attack by the horde 1328 00:41:16,079 --> 00:41:18,240 of supporters of muhammad bin salman 1329 00:41:18,240 --> 00:41:20,560 so what are you doing for these more 1330 00:41:20,560 --> 00:41:21,920 fragile countries 1331 00:41:21,920 --> 00:41:23,839 in languages that are not as widely 1332 00:41:23,839 --> 00:41:25,280 represented on the internet as 1333 00:41:25,280 --> 00:41:27,760 english and other major languages these 1334 00:41:27,760 --> 00:41:29,680 are the places where democracy is most 1335 00:41:29,680 --> 00:41:31,119 fragile and where we need to support it 1336 00:41:31,119 --> 00:41:31,520 the most 1337 00:41:31,520 --> 00:41:32,720 great question thank you let me ask 1338 00:41:32,720 --> 00:41:36,720 david carroll for his thoughts david 1339 00:41:38,000 --> 00:41:41,520 david david is um looking good 1340 00:41:41,520 --> 00:41:43,359 but we're going to jump over to renata 1341 00:41:43,359 --> 00:41:45,359 renata your thoughts and david if you 1342 00:41:45,359 --> 00:41:46,400 can hear me 1343 00:41:46,400 --> 00:41:47,920 you can jump in when you come back and 1344 00:41:47,920 --> 00:41:50,720 you're moving again okay renata 1345 00:41:50,720 --> 00:41:53,359 yeah a very big question i mean 1346 00:41:53,359 --> 00:41:55,040 all i can say there is that 1347 00:41:55,040 --> 00:41:57,200 indeed this is one of the issues 1348 00:41:57,200 --> 00:41:59,440 that we in our code of practice 1349 00:41:59,440 --> 00:42:01,119 where we've been 1350 00:42:01,119 --> 00:42:02,800 working in the past years with 1351 00:42:02,800 --> 00:42:05,440 platforms such as facebook we have 1352 00:42:05,440 --> 00:42:06,960 always
made the point that it's so 1353 00:42:06,960 --> 00:42:08,960 important not to do this only 1354 00:42:08,960 --> 00:42:11,119 in certain jurisdictions but of 1355 00:42:11,119 --> 00:42:12,880 course we had an eu 1356 00:42:12,880 --> 00:42:14,720 perspective but already in the eu as you 1357 00:42:14,720 --> 00:42:15,599 know very well 1358 00:42:15,599 --> 00:42:17,280 we have smaller states and bigger states 1359 00:42:17,280 --> 00:42:19,280 and different kinds of languages 1360 00:42:19,280 --> 00:42:22,079 going outside of the eu i mean we are 1361 00:42:22,079 --> 00:42:24,319 very clear in our cooperation 1362 00:42:24,319 --> 00:42:25,839 with especially partners in the 1363 00:42:25,839 --> 00:42:27,599 neighborhood and that also includes the 1364 00:42:27,599 --> 00:42:28,880 southern neighborhood and the countries 1365 00:42:28,880 --> 00:42:30,319 that you were referring to 1366 00:42:30,319 --> 00:42:31,920 that we want to also 1367 00:42:31,920 --> 00:42:33,440 engage with partners there 1368 00:42:33,440 --> 00:42:35,280 to address the challenges of the digital 1369 00:42:35,280 --> 00:42:37,359 transformation and here that also includes 1370 00:42:37,359 --> 00:42:39,920 disinformation but of course we 1371 00:42:39,920 --> 00:42:41,359 cannot regulate there 1372 00:42:41,359 --> 00:42:43,599 so what we can do is to really 1373 00:42:43,599 --> 00:42:44,400 equip ourselves 1374 00:42:44,400 --> 00:42:46,480 with the kind of initiatives that i have 1375 00:42:46,480 --> 00:42:47,680 tried to explain 1376 00:42:47,680 --> 00:42:50,079 to inspire others so that others 1377 00:42:50,079 --> 00:42:51,599 can build on that but 1378 00:42:51,599 --> 00:42:53,599 otherwise yes i 1379 00:42:53,599 --> 00:42:54,880 understand the problem 1380 00:42:54,880 --> 00:42:57,359 great thank you david did you hear 1381 00:42:57,359 --> 00:42:58,960 the question 1382 00:42:58,960 --> 00:43:02,319 it was regarding the 1383 00:43:02,319 --> 00:43:04,560 international fragile democracies and 1384 00:43:04,560 --> 00:43:05,760 yes yes 1385 00:43:05,760 --> 00:43:07,359 so we went to you first but you just 1386 00:43:07,359 --> 00:43:08,880 stared at us so yeah 1387 00:43:08,880 --> 00:43:11,920 yeah i'm sorry my bad 1388 00:43:11,920 --> 00:43:13,680 connection today i'm gonna have 1389 00:43:13,680 --> 00:43:15,520 martin call you up and get facebook to 1390 00:43:15,520 --> 00:43:16,000 help you 1391 00:43:16,000 --> 00:43:19,280 you know um yes 1392 00:43:19,280 --> 00:43:21,119 i think that the small state issue is 1393 00:43:21,119 --> 00:43:22,400 extremely important and 1394 00:43:22,400 --> 00:43:25,359 goes back to the question of a universal 1395 00:43:25,359 --> 00:43:27,119 data protection 1396 00:43:27,119 --> 00:43:30,000 shield across the world small 1397 00:43:30,000 --> 00:43:33,040 states are also exploited by bad actors 1398 00:43:33,040 --> 00:43:35,760 for abusive practices or to offshore 1399 00:43:35,760 --> 00:43:36,079 and 1400 00:43:36,079 --> 00:43:39,280 launder data so for example 1401 00:43:39,280 --> 00:43:41,359 the last thing that we learned from 1402 00:43:41,359 --> 00:43:43,040 the investigation into cambridge 1403 00:43:43,040 --> 00:43:43,760 analytica 1404 00:43:43,760 --> 00:43:45,119 is that they were in the process of 1405 00:43:45,119 --> 00:43:47,200 offshoring to saint kitts 1406 00:43:47,200 --> 00:43:50,640
and nevis using the caribbean to shield 1407 00:43:50,640 --> 00:43:51,520 themselves from 1408 00:43:51,520 --> 00:43:55,040 data protection laws so we need to 1409 00:43:55,040 --> 00:43:58,240 understand that every citizen 1410 00:43:58,240 --> 00:44:00,640 in every country deserves equal 1411 00:44:00,640 --> 00:44:01,760 protection 1412 00:44:01,760 --> 00:44:06,160 under an equal set of laws and rules 1413 00:44:06,160 --> 00:44:08,079 and then the problem that we also need 1414 00:44:08,079 --> 00:44:10,800 to address is that advertising-supported 1415 00:44:10,800 --> 00:44:14,160 public interest platforms are inherently 1416 00:44:14,160 --> 00:44:16,240 conflicted there's a tension in that what is 1417 00:44:16,240 --> 00:44:18,640 good for a brand may not be good for 1418 00:44:18,640 --> 00:44:22,160 society and there are arguments 1419 00:44:22,160 --> 00:44:23,680 that the brand 1420 00:44:23,680 --> 00:44:25,280 creates an incentive to be good for 1421 00:44:25,280 --> 00:44:27,920 society but not at the expense of profit 1422 00:44:27,920 --> 00:44:30,079 so we need more public 1423 00:44:30,079 --> 00:44:32,160 interest platforms to compete with the 1424 00:44:32,160 --> 00:44:33,839 commercial ones and that's a huge 1425 00:44:33,839 --> 00:44:34,800 challenge 1426 00:44:34,800 --> 00:44:36,800 thank you for that marcus can i 1427 00:44:36,800 --> 00:44:38,240 ask you 1428 00:44:38,240 --> 00:44:39,760 to share your thoughts on that but we 1429 00:44:39,760 --> 00:44:41,680 also have a question here from ivan 1430 00:44:41,680 --> 00:44:43,760 that's come in online and he 1431 00:44:43,760 --> 00:44:46,160 basically wonders to what degree 1432 00:44:46,160 --> 00:44:48,160 it's not just a small nation problem 1433 00:44:48,160 --> 00:44:49,920 but a small company a 1434 00:44:49,920 --> 00:44:50,800 small everybody 1435 00:44:50,800 --> 00:44:52,000 question if you don't have the 1436 00:44:52,000 --> 00:44:54,160 technological sophistication 1437 00:44:54,160 --> 00:44:57,119 or the money to invest in the kind of 1438 00:44:57,119 --> 00:44:59,359 protections we're talking about today 1439 00:44:59,359 --> 00:45:01,280 he's just wondering whether that 1440 00:45:01,280 --> 00:45:03,440 is something where 1441 00:45:03,440 --> 00:45:06,880 the bad guys and the villains always win 1442 00:45:06,880 --> 00:45:08,640 and i guess i would ask you i know 1443 00:45:08,640 --> 00:45:10,160 someone in this room right now 1444 00:45:10,160 --> 00:45:12,160 who has one of the largest 1445 00:45:12,160 --> 00:45:14,640 followings on facebook from palestine 1446 00:45:14,640 --> 00:45:16,400 particularly among youth 1447 00:45:16,400 --> 00:45:18,079 and you know believe me mahmoud abbas 1448 00:45:18,079 --> 00:45:20,000 doesn't like this guy he's 1449 00:45:20,000 --> 00:45:21,839 anti-corruption and pro-democracy and 1450 00:45:21,839 --> 00:45:23,680 abbas has overstayed his term by 12 1451 00:45:23,680 --> 00:45:24,240 years 1452 00:45:24,240 --> 00:45:26,079 so when you look at that he's got a 1453 00:45:26,079 --> 00:45:27,359 whole farm 1454 00:45:27,359 --> 00:45:29,520 of people that that government and the 1455 00:45:29,520 --> 00:45:30,400 egyptians 1456 00:45:30,400 --> 00:45:32,480 have aimed at him and at every digital 1457 00:45:32,480 --> 00:45:33,680 thing he does 1458 00:45:33,680 --> 00:45:35,920 and even facebook has 1459 00:45:35,920 --> 00:45:37,520 been
in this story somewhat so i'm 1460 00:45:37,520 --> 00:45:37,839 just 1461 00:45:37,839 --> 00:45:40,880 interested in how even if facebook can 1462 00:45:40,880 --> 00:45:41,680 deal with 1463 00:45:41,680 --> 00:45:45,200 government actors trying to undo the 1464 00:45:45,200 --> 00:45:46,960 rights of those people who are worried 1465 00:45:46,960 --> 00:45:49,119 about democracy and anti-corruption 1466 00:45:49,119 --> 00:45:51,040 how do we help the 1467 00:45:51,040 --> 00:45:52,720 little guys succeed 1468 00:45:52,720 --> 00:45:54,720 against the resources of governments 1469 00:45:54,720 --> 00:45:56,400 that actually don't want 1470 00:45:56,400 --> 00:45:58,839 the values we're talking about today 1471 00:45:58,839 --> 00:46:00,000 marcus 1472 00:46:00,000 --> 00:46:01,920 yeah i mean we can't be naive about this 1473 00:46:01,920 --> 00:46:03,200 this is 1474 00:46:03,200 --> 00:46:05,839 an arms race between the bad actors 1475 00:46:05,839 --> 00:46:06,240 and 1476 00:46:06,240 --> 00:46:08,240 the people who try to stop this 1477 00:46:08,240 --> 00:46:09,520 behavior so 1478 00:46:09,520 --> 00:46:11,440 it will never be over and yes i think 1479 00:46:11,440 --> 00:46:13,359 you're absolutely right i mean this is 1480 00:46:13,359 --> 00:46:14,960 a resource question to some 1481 00:46:14,960 --> 00:46:16,640 extent and i'll make a very blunt 1482 00:46:16,640 --> 00:46:18,160 point about this 1483 00:46:18,160 --> 00:46:21,839 facebook spent last year the equivalent 1484 00:46:21,839 --> 00:46:24,160 of twitter's revenue on protecting the 1485 00:46:24,160 --> 00:46:26,319 integrity of our platform 1486 00:46:26,319 --> 00:46:28,000 i mean that is an 1487 00:46:28,000 --> 00:46:29,440 enormous 1488 00:46:29,440 --> 00:46:32,160 expense to make we have 35 000 people 1489 00:46:32,160 --> 00:46:32,560 who 1490 00:46:32,560 --> 00:46:34,480 keep the platform safe that 1491 00:46:34,480 --> 00:46:36,480 is probably i 1492 00:46:36,480 --> 00:46:37,520 haven't really checked it but it's 1493 00:46:37,520 --> 00:46:39,040 probably the biggest enforcement army 1494 00:46:39,040 --> 00:46:41,520 the biggest enforcement 1495 00:46:41,520 --> 00:46:44,000 group of people that there is but i 1496 00:46:44,000 --> 00:46:45,839 think there's a myth and i want to just 1497 00:46:45,839 --> 00:46:47,520 clarify it that if you're a small 1498 00:46:47,520 --> 00:46:49,040 state if you're a small group you just 1499 00:46:49,040 --> 00:46:50,480 fall through the cracks i don't think 1500 00:46:50,480 --> 00:46:51,920 that is right and when we think about 1501 00:46:51,920 --> 00:46:53,200 bad actors 1502 00:46:53,200 --> 00:46:55,359 there's definitely one dimension which is behavior 1503 00:46:55,359 --> 00:46:56,400 let's think about 1504 00:46:56,400 --> 00:46:58,240 how you behave as a bad actor and what we 1505 00:46:58,240 --> 00:47:00,480 call cib 1506 00:47:00,480 --> 00:47:03,839 which is sort of 1507 00:47:03,839 --> 00:47:07,280 coordinated inauthentic behavior influencing 1508 00:47:07,280 --> 00:47:10,160 outcomes in a different country and 1509 00:47:10,160 --> 00:47:11,040 we spot that 1510 00:47:11,040 --> 00:47:12,720 we spot it everywhere it doesn't 1511 00:47:12,720 --> 00:47:14,160 matter how big a country is how big a 1512 00:47:14,160 --> 00:47:14,960 group is 1513 00:47:14,960 --> 00:47:16,480 these are mechanisms these again 1514
00:47:16,480 --> 00:47:18,800 algorithms that help us to spot it so every 1515 00:47:18,800 --> 00:47:21,359 size of political or geographical entity 1516 00:47:21,359 --> 00:47:22,800 benefits from this and then you have 1517 00:47:22,800 --> 00:47:24,079 content on the platform 1518 00:47:24,079 --> 00:47:25,920 and yes that is a scale game moderating 1519 00:47:25,920 --> 00:47:27,680 content again 1520 00:47:27,680 --> 00:47:30,000 raises the need for having a lot of 1521 00:47:30,000 --> 00:47:31,200 people on the ground for having the most 1522 00:47:31,200 --> 00:47:32,079 sophisticated 1523 00:47:32,079 --> 00:47:34,319 algorithms and it does become a 1524 00:47:34,319 --> 00:47:36,720 challenge for others and i do remember 1525 00:47:36,720 --> 00:47:38,559 when i was on the same panel with 1526 00:47:38,559 --> 00:47:41,440 renata's vice president when she said 1527 00:47:41,440 --> 00:47:43,040 look are you too big to care 1528 00:47:43,040 --> 00:47:45,280 you know as a platform and my response 1529 00:47:45,280 --> 00:47:46,559 was we 1530 00:47:46,559 --> 00:47:49,280 are just big enough that we can care 1531 00:47:49,280 --> 00:47:51,200 actually because it is very very 1532 00:47:51,200 --> 00:47:53,440 difficult to bring up these resources 1533 00:47:53,440 --> 00:47:55,440 to deal with all these challenges on a 1534 00:47:55,440 --> 00:47:56,559 global stage 1535 00:47:56,559 --> 00:47:58,559 that is it and you know if you 1536 00:47:58,559 --> 00:47:59,839 think it's still not enough 1537 00:47:59,839 --> 00:48:01,520 think about our competitors or smaller 1538 00:48:01,520 --> 00:48:03,359 platforms that come into this market and 1539 00:48:03,359 --> 00:48:04,880 what a challenge they have 1540 00:48:04,880 --> 00:48:06,400 look we've got two minutes left so i'm 1541 00:48:06,400 --> 00:48:07,359 going to ask you lightning round 1542 00:48:07,359 --> 00:48:08,720 questions i'm going to mention 1543 00:48:08,720 --> 00:48:11,040 globe sec's transatlantic 1544 00:48:11,040 --> 00:48:11,839 principles 1545 00:48:11,839 --> 00:48:14,160 for a healthy online information space 1546 00:48:14,160 --> 00:48:14,960 long title 1547 00:48:14,960 --> 00:48:18,400 a bit wonky but the second 1548 00:48:18,400 --> 00:48:20,640 item in this says empower users to make 1549 00:48:20,640 --> 00:48:22,559 informed decisions about their data 1550 00:48:22,559 --> 00:48:24,240 i guess my question is i know that david 1551 00:48:24,240 --> 00:48:25,680 carroll cared about his data 1552 00:48:25,680 --> 00:48:27,520 but i don't know if most people do i've 1553 00:48:27,520 --> 00:48:29,280 always wondered do we need a 1554 00:48:29,280 --> 00:48:30,160 game app like 1555 00:48:30,160 --> 00:48:31,680 angry birds or do we need 1556 00:48:31,680 --> 00:48:33,599 something that a lot more people than 1557 00:48:33,599 --> 00:48:35,040 attend this conference go through 1558 00:48:35,040 --> 00:48:37,040 where we educate about digital literacy 1559 00:48:37,040 --> 00:48:38,480 we talk about digital health 1560 00:48:38,480 --> 00:48:40,400 we talk about how they could be victims 1561 00:48:40,400 --> 00:48:42,160 so that you could get more of a demand 1562 00:48:42,160 --> 00:48:43,040 function 1563 00:48:43,040 --> 00:48:45,359 going on these questions that 1564 00:48:45,359 --> 00:48:46,800 we're talking about today 1565 00:48:46,800 --> 00:48:48,400 and i'm just asking real quick for 1566 00:48:48,400 --> 00:48:50,319 insights on
whether you think again i'm 1567 00:48:50,319 --> 00:48:52,240 naive in thinking that's possible 1568 00:48:52,240 --> 00:48:54,480 because i don't see gravity going 1569 00:48:54,480 --> 00:48:55,440 the right direction 1570 00:48:55,440 --> 00:48:57,359 i see a lot of people overwhelmed by 1571 00:48:57,359 --> 00:48:59,440 this topic and they largely don't care 1572 00:48:59,440 --> 00:49:00,720 they don't see it as something 1573 00:49:00,720 --> 00:49:03,119 they can act on it's kind of similar to health i 1574 00:49:03,119 --> 00:49:04,400 see a lot of problems in medical and 1575 00:49:04,400 --> 00:49:05,440 health literacy 1576 00:49:05,440 --> 00:49:06,880 people just kind of forfeit 1577 00:49:06,880 --> 00:49:08,160 responsibility and let others make 1578 00:49:08,160 --> 00:49:09,280 decisions for them 1579 00:49:09,280 --> 00:49:11,599 how do you empower people and to go 1580 00:49:11,599 --> 00:49:12,319 back to this 1581 00:49:12,319 --> 00:49:14,880 globe sec initiative how do you 1582 00:49:14,880 --> 00:49:16,000 actually get people to 1583 00:49:16,000 --> 00:49:20,160 want this david the number one response 1584 00:49:20,160 --> 00:49:22,319 that people have had after seeing the 1585 00:49:22,319 --> 00:49:24,240 netflix documentary the great hack 1586 00:49:24,240 --> 00:49:27,359 is what can i do to protect my data 1587 00:49:27,359 --> 00:49:30,480 and i don't have good responses to that 1588 00:49:30,480 --> 00:49:33,280 question so there is a hunger for it and 1589 00:49:33,280 --> 00:49:34,240 the tools 1590 00:49:34,240 --> 00:49:38,000 and abilities are not there we 1591 00:49:38,000 --> 00:49:39,760 get our financial information 1592 00:49:39,760 --> 00:49:41,599 proactively disclosed in a monthly 1593 00:49:41,599 --> 00:49:42,559 statement 1594 00:49:42,559 --> 00:49:44,640 so why aren't our data 1595 00:49:44,640 --> 00:49:46,720 profiles proactively disclosed 1596 00:49:46,720 --> 00:49:49,920 to us excellent thank you 1597 00:49:49,920 --> 00:49:51,599 marcus your thoughts and we'll give 1598 00:49:51,599 --> 00:49:53,760 the last word to renata 1599 00:49:53,760 --> 00:49:56,160 i would go in a similar direction i do 1600 00:49:56,160 --> 00:49:57,440 think you know there is 1601 00:49:57,440 --> 00:49:59,440 a fraction of the population 1602 00:49:59,440 --> 00:50:00,960 that doesn't really care 1603 00:50:00,960 --> 00:50:02,640 as well as people that care but 1604 00:50:02,640 --> 00:50:04,000 i think what is important is that the 1605 00:50:04,000 --> 00:50:06,640 people who care have more opportunities 1606 00:50:06,640 --> 00:50:08,559 to express that care and to have more 1607 00:50:08,559 --> 00:50:10,640 control and more transparency over 1608 00:50:10,640 --> 00:50:12,240 the data and i think that's sort of 1609 00:50:12,240 --> 00:50:13,839 the gradual process 1610 00:50:13,839 --> 00:50:16,079 that we work towards as a 1611 00:50:16,079 --> 00:50:18,000 private company with our settings with 1612 00:50:18,000 --> 00:50:18,559 our 1613 00:50:18,559 --> 00:50:20,559 product changes but also i think 1614 00:50:20,559 --> 00:50:21,680 that's where we go with 1615 00:50:21,680 --> 00:50:23,440 all regulations think about the digital 1616 00:50:23,440 --> 00:50:25,119 services act think about 1617 00:50:25,119 --> 00:50:26,880 privacy regulation and so on so i think 1618 00:50:26,880 --> 00:50:28,400 it's a piece of movement 1619
00:50:28,400 --> 00:50:30,000 that is required here 1620 00:50:30,000 --> 00:50:31,520 but more transparency more control 1621 00:50:31,520 --> 00:50:33,520 that's the way to go thank you renata 1622 00:50:33,520 --> 00:50:36,240 you're going to get the last word 1623 00:50:36,240 --> 00:50:38,480 i think steve i'm less pessimistic than 1624 00:50:38,480 --> 00:50:39,280 you about 1625 00:50:39,280 --> 00:50:41,839 the citizens in general i'm a journalist 1626 00:50:41,839 --> 00:50:44,559 it's my job to be pessimistic 1627 00:50:44,559 --> 00:50:48,079 absolutely but on the other hand i see a 1628 00:50:48,079 --> 00:50:49,520 real task here because 1629 00:50:49,520 --> 00:50:51,839 if we don't address media literacy 1630 00:50:51,839 --> 00:50:53,359 and digital literacy 1631 00:50:53,359 --> 00:50:56,160 in societies and at all ages i mean 1632 00:50:56,160 --> 00:50:57,839 there's a lot of talk about 1633 00:50:57,839 --> 00:51:00,160 education school education and 1634 00:51:00,160 --> 00:51:02,160 focusing on that but you have to 1635 00:51:02,160 --> 00:51:03,520 really think about all 1636 00:51:03,520 --> 00:51:05,440 generations there's also a vulnerable 1637 00:51:05,440 --> 00:51:06,800 group of elderly people 1638 00:51:06,800 --> 00:51:08,319 you know if we don't get that right the 1639 00:51:08,319 --> 00:51:09,920 digital divide with all the 1640 00:51:09,920 --> 00:51:11,520 difficulties that come along with it 1641 00:51:11,520 --> 00:51:12,640 can really harm us 1642 00:51:12,640 --> 00:51:14,640 so i think we have understood that 1643 00:51:14,640 --> 00:51:16,960 this is a huge task for us that 1644 00:51:16,960 --> 00:51:18,720 along with regulation and increased 1645 00:51:18,720 --> 00:51:21,040 transparency we need to 1646 00:51:21,040 --> 00:51:22,480 really engage in 1647 00:51:22,480 --> 00:51:24,480 concrete programming and concrete 1648 00:51:24,480 --> 00:51:25,599 tools 1649 00:51:25,599 --> 00:51:27,680 to enable users to actually be better 1650 00:51:27,680 --> 00:51:29,599 informed because otherwise if we lose 1651 00:51:29,599 --> 00:51:31,119 them we will not find a way 1652 00:51:31,119 --> 00:51:32,880 through that 1653 00:51:32,880 --> 00:51:34,880 well listen what a great conversation in 1654 00:51:34,880 --> 00:51:35,920 a short period of time i didn't know 1655 00:51:35,920 --> 00:51:37,119 where this was all going to go but i 1656 00:51:37,119 --> 00:51:37,839 thought we 1657 00:51:37,839 --> 00:51:40,160 actually did surface and reveal some 1658 00:51:40,160 --> 00:51:41,359 very interesting 1659 00:51:41,359 --> 00:51:43,040 parts of this discussion i really 1660 00:51:43,040 --> 00:51:45,200 want to thank renata nicolet 1661 00:51:45,200 --> 00:51:47,280 of the european commission in brussels 1662 00:51:47,280 --> 00:51:49,119 marcus reinish of facebook i'm happy to 1663 00:51:49,119 --> 00:51:50,480 friend you if you want marcus we can 1664 00:51:50,480 --> 00:51:51,440 become pals 1665 00:51:51,440 --> 00:51:54,000 and david i love facebook but david 1666 00:51:54,000 --> 00:51:54,559 carroll 1667 00:51:54,559 --> 00:51:56,720 from parsons school of design watch 1668 00:51:56,720 --> 00:51:57,680 the great hack 1669 00:51:57,680 --> 00:51:59,599 it's such an important show thank you 1670 00:51:59,599 --> 00:52:02,240 all our time is up right now i should 1671 00:52:02,240 --> 00:52:04,000 say that there is a
program about to 1672 00:52:04,000 --> 00:52:04,720 begin 1673 00:52:04,720 --> 00:52:06,319 with the prime ministers of slovakia 1674 00:52:06,319 --> 00:52:08,000 and austria in the maria teresa room it's 1675 00:52:08,000 --> 00:52:09,040 gonna be tough to get in there you'll 1676 00:52:09,040 --> 00:52:10,400 have a lot more fun if you just hang out 1677 00:52:10,400 --> 00:52:11,599 in this room because it's gonna run live 1678 00:52:11,599 --> 00:52:12,960 here but let's give a big round of 1679 00:52:12,960 --> 00:52:16,559 applause to our panel thank you so much 1680 00:52:16,960 --> 00:52:19,280 thanks david thanks marcus thanks renata 1681 00:52:19,280 --> 00:52:21,040 i hope to see you in person one of these 1682 00:52:21,040 --> 00:52:22,720 days 1683 00:52:22,720 --> 00:52:33,020 that'd be nice thank you very much 1684 00:52:33,020 --> 00:52:37,130 [Music] 1685 00:52:37,130 --> 00:52:39,410 [Applause] 1686 00:52:39,410 --> 00:52:40,800 [Music] 1687 00:52:40,800 --> 00:52:42,880 you