Episode 4: April Falcon Doss (Saul Ewing Arnstein & Lehr LLP)

Season 2



In today’s episode of the Padcast, I was joined by April Falcon Doss, Partner and Chair of Cybersecurity and Privacy at Saul Ewing Arnstein & Lehr LLP and author of the outstanding book “Cyber Privacy: Who Has Your Data and Why You Should Care”. In a truly fascinating conversation, April tells us what drove her to write such a thought-provoking book on privacy, one that informs, educates and inspires individuals around the world to make better privacy decisions. During an incredible career, April spent several years as a lawyer at the US National Security Agency, and she provides us with eye-opening (sometimes jaw-dropping) insights into the sometimes-scary consequences of living in an increasingly data-driven world!

Episode highlights include:

  • Is our every move being tracked?
  • Why Facebook knows us better than our family!
  • How scared should we be about the ‘Tech Giants’?
  • Will the ‘Big 5’ be broken up?
  • Do we still have a choice when it comes to privacy?
  • We discuss an unwitting early US privacy hero!


Intro: [00:00:00] There’s been research, fairly well documented, demonstrating that with a certain number of clicks on Facebook, the platform’s algorithms know us better than our friends and family. The platform is better than even our spouses or significant others at predicting what the next thing is that we’re going to like or share.

Anthony Brown: [00:00:20] This is the Padcast, your privacy and data podcast, with me, Anthony Brown, interviewing leaders from across the industry to provide you with news, views, insight and opinion. Hello there, and welcome to another episode of the Padcast, our Privacy and Data Talks podcast. So this is a slightly unusual episode today, only in the sense that it wasn’t actually planned. I’m speaking to April Falcon Doss, who is in the US, and I’ve never actually met or spoken to April before. But what I have done is read her book, which is here, and it’s called Cyber Privacy: Who Has Your Data and Why You Should Care. To anyone that’s not read it, it’s an incredible read. And actually, it’s so good that I contacted April, just for my own sort of research purposes, my own knowledge, and frankly, just because I really wanted to talk to her because the book was so good, and I asked her if we could jump on a Zoom. I just wanted to pick her brains about a few things, and we started talking about the podcast, and we just decided, let’s just have a podcast, and here we are. So April, I’m sorry to sort of throw this on you. It’s so lovely to meet you. We’ve literally had about five or ten minutes, and I think we spoke about your guitar skills and my very limited guitar skills. And then we just decided to do this. So how are you?

April Falcon Doss: [00:01:47] I’m fabulous this morning, Anthony, it is great to talk to you, and it has been so much fun already. And thank you. I’m really looking forward to this. We’ll see where the conversation goes.

Anthony Brown: [00:01:58] Now I’ve just got to introduce April. For anyone that doesn’t know her, her track record is, I mean, just incredible, your experience, what you’ve achieved. So let’s go back here. You’re a lawyer, an attorney. You’ve specialized for over 20 years now in information law, privacy law and cybersecurity along your journey. You spent 12 years at the National Security Agency in the US, or the NSA, where you were an associate general counsel for intelligence law. Following that, you joined Saul Ewing LLP, where you were a partner and chair of the Cybersecurity and Privacy Practice Group, so, you know, essentially big law there. And then it looks to me like you left there and joined as senior minority counsel for the Russia investigation at the United States, or the US, Senate Select Committee on Intelligence, where you spent a year. And then you returned to Saul Ewing, and you are a partner and chair of the cybersecurity and privacy team. Is that correct? Have I got all that right?

April Falcon Doss: [00:03:04] Yeah, it sure is. It sure is. And along the way, fun fact, I did live in the UK for three years. I was over in Cheltenham in the mid-2000s as part of my NSA stint.

Anthony Brown: [00:03:18] Oh wow. Are you allowed to disclose what sort of stuff was going on when you were doing that?

April Falcon Doss: [00:03:22] Well, just as a liaison officer. You know, I think it’s well known that within the national security or intelligence community there’s this Five Eyes collaboration amongst the UK, the US, Australia, Canada and New Zealand. And so all of those nations have liaison officers to each other, and I got to be one of those liaison officers in Cheltenham for three years.

Anthony Brown: [00:03:46] Wow. Well, Cheltenham, that’s certainly a beautiful city and part of the world. I’m kind of on the other side, you know, the eastern side or southeast of the country, but I’m sure you know the UK very well. And it won’t surprise you to know it’s raining right now, as per usual, but there we go. So April, this book is fantastic. Anybody who’s listening to this, I’m sure, has some grasp or good understanding, if not an amazing understanding, of all of the issues that you talk about in your book. But the research in there, and some of the tangible examples that you can really, you know, hang on to, are fantastic, and so many things resonated with me. Not least, and I’ve heard you talk about this, that we’re living in an age where people shouldn’t have to be experts to understand what’s going on with their data. And I understand that’s partly one of the rationales or reasoning behind actually, you know, writing this book. Could you tell us a bit about the journey and the background with the book, please?

April Falcon Doss: [00:04:53] Yeah, absolutely. And you’re spot on. That was really what drove me to write this. You know, you mentioned that I’ve been doing this kind of work for about 20 years, and, you know, 20 years ago, think about it. Facebook didn’t launch until 2004. The first smartphone, the first iPhone, didn’t come out until 2007. Twenty years ago, the internet was used by ordinary individuals, but we had dial-up modems and, you know, very little bandwidth. We weren’t using geolocation services on devices that we have in our hip pocket twenty-four seven, we didn’t have commercial access to home DNA testing kits for genealogy for ninety-nine dollars and a cheek swab, and we didn’t have apps that were serving up targeted location-based advertising, or advertising based on our interests. And so 20 years ago, ideas around data privacy were fairly narrowly confined, in the sense that certainly people were concerned about privacy of health-related information, medical records privacy and banking-related information, and people were concerned about how government agencies were using data. But there has been this explosion in the last 20 years of all of these technologies that we all use and all love using, that create this constantly expanding digital record of everything we do and everything we are interested in. And along with that explosion of technology has come this real expansion of computing capacity, so that now we have these really sophisticated machine learning algorithms that are creating behavioral predictions about us, that are assessing our personalities, that are swaying and influencing our opinions. The information landscape has gotten so complex that I think, unless you immerse yourself in this stuff, it’s really hard to keep track of it all. And I thought, well, people shouldn’t have to immerse themselves in it to understand it. It affects all of us.
People should be perfectly free to, you know, be picking up the guitar or following their favorite football team or doing whatever they’re doing. It shouldn’t be hard to understand just what’s going on. And so that’s really why I wanted to write the book: to just kind of put in one place sort of an overview of these key areas around consumer data privacy, government surveillance, the ways that data is used by platforms and by schools and employers and all those things. So that was the goal.

Anthony Brown: [00:07:31] Yeah. And boy, you pulled it off. And, you know, something that really resonated with me when I was reading it is, if we take ourselves back, probably, what, 15 years or so, maybe less than that, everyone started signing up to Facebook. I think it was probably 2007 or something like that for me, and I’m sure a lot of my friends and what have you as well. And at that time, we just didn’t realize, you know, what we were doing. We saw a free service and we thought it was free. But you talk so much in your book about how it’s really not free. Can you tell us a bit about that, please?

April Falcon Doss: [00:08:08] Well, you know, it’s become almost a cliché in Silicon Valley that if you’re not paying for the product, you are the product. And what that means is that services that don’t charge us a fee, like Facebook or Google Search or any of these other, you know, free apps and platforms and so forth, anything that doesn’t charge us for a service, has to monetize it somehow. And what they do is they monetize our attention. They do that by selling advertising to us. So really, where their income is coming from is the vendors who are placing advertising on the platform and using what the platforms know about us to get that really targeted messaging in front of us. And, you know, this goes back. I remember, I want to say it was around 2009 or so, that I started noticing in my personal Gmail account that I was getting ads popping up that were tied to content in my emails, and I thought, oh, that’s creepy, right? Clearly, Google is reading my emails. But like so many other people, I thought, well, on the other hand, it’s convenient. It’s free. I can access my Gmail account from anywhere in the world. If I change my email account, that’s a whole hassle, because you have to give your contact details to everybody you know. And so I sort of just got accustomed to the idea that Google is scanning all of my emails and sending me this targeted advertising. And so one of the points that I try to make in the book is that it’s not that any of these things are inherently nefarious or bad. I would not argue with anybody that they ought not use Facebook or Google or any other service they want to use. My goal is just to help people be aware, so that some people will say, you know what, I’m comfortable with these uses of my information, but these other uses over here, I’m not comfortable with.
And so maybe I don’t want to use this particular service or this particular platform. Maybe I’d rather pay for a service that is not going to use my data in the same way. But this is really what’s at the heart of it: when we’re not paying out of pocket for a service, our data is getting monetized. And that’s really the key thing to know, that the way the platforms are making money is keeping our eyeballs on screen, and that has a whole series of knock-on effects to democracy, to discourse, to all kinds of things. So I’ll pause there.

Anthony Brown: [00:10:41] Yeah, I mean, it’s so interesting. I mean, for somebody listening to this, or, I’m sure, a huge amount of the population around the world, if the privacy debate comes up and you’re talking to them or trying to explain this to them, they may say, well, do you know what? I don’t mind getting adverts and being marketed to. In fact, it’s helpful, because, you know, it helps me identify what I may or may not want to buy. So what would you say to those sort of people? I mean, where can this lead, and what is so bad about it? I know you want to try to be as objective as possible in some ways, but yeah, where can it lead?

April Falcon Doss: [00:11:19] Yeah. Well, you know, it’s funny. I think the classic formulation of this that I hear is people who say, well, I’m not worried about surveillance because I have nothing to hide. Fair play. I mean, right, that’s great, and it’s perfectly fine, whether that’s in the context of government surveillance or corporate surveillance, workplace surveillance or school surveillance, for individuals who feel that way. Absolutely fine. I’m not sure that everybody feels that way, though. And so I think, again, the starting place is just having that awareness of how much information about us is acquired. And I think some of the things that might surprise people get a little bit, oh, I don’t know, a little bit creepier, where you get into a little bit of an ick factor, right? It’s not just, for example, what I have searched for online that gets captured; it’s a whole set of insights and presumptions and conclusions about the nature of my personality, about what I might be inclined to do. And there is a very deliberate attempt to shape my thinking. Some of those are really kind of straightforward, almost banal ways, and some of them are ways that are a little bit more troubling. So, for example, there’s been research, fairly well documented, demonstrating that with a certain number of clicks on Facebook, the platform’s algorithms know us better than our friends and family do. The platform is better than even our spouses or significant others at predicting what the next thing is that we’re going to like or share. And we see these personality tests or assessments coming out of platforms, as we saw with the Cambridge Analytica scandal with Facebook, which of course has been a focus of real attention in the U.S. and the U.K., where people were getting sort of these personality assessments across the psychological OCEAN scale of openness and neuroticism and so forth.
One way that these personality assessments get used is in shaping political discourse, and I want to come back to that in just a sec. The more banal set is, if you think about it, every time an advert pops up in front of us on our feed, what does it do? It changes the train of our thinking. It directs our thinking down a path we weren’t planning to go. Now, advertising always aims to do that. A billboard on the side of the motorway does the same thing, right? It tries to direct your thinking towards a product or service you weren’t planning to think about. But the way that the internet can capture our attention and direct it, with continuous feeding of new content, the next set of videos on YouTube, the next set of posts on Facebook and so forth, really shapes thinking in some dramatic ways. And so this brings us back to the more troubling aspects, where we are seeing such shaping of conspiracy theories like the QAnon conspiracy theory in the U.S. Absolutely baseless, kind of bonkers stuff, but it has become a powerful force in U.S. politics. Anti-vaxxer conspiracies, this idea that people getting the COVID vaccine are going to be injected with kind of radio-tracking 5G devices, I mean, bonkers stuff, but people get led down these internet rabbit holes. White supremacy, domestic terrorism, all kinds of things. And the key to this is that the way this information is getting fed to us is from algorithms on these platforms, and the algorithms are responding to a combination of things: what they think we will be interested in, based on all of our searches and activity and profile and so forth, and what will keep us on screen. So in addition to that Silicon Valley adage that if you’re not paying for the product, you are the product, another one is that if it enrages, it engages, sort of the modern equivalent of the old journalism idea that if it bleeds, it leads, right?
And so by keeping us in a state of anxiety and anger and fear and resentment, they keep us on screen. Why keep us on screen? Because the longer our eyeballs are on screen, the more advertising can be served up to us, which increases the monetization value of our time. So there’s this really striking interconnection between the data that’s available to the platforms about us and the way that our thinking can then be shaped, sometimes by just traditional advertising, as you said. For some people, it’s very helpful. Wouldn’t we all rather see adverts that are relevant to things we’re interested in? But it can go towards our dark side, and it has some real negative societal consequences that we haven’t really grappled with yet. So that was a long answer to a short question.

Anthony Brown: [00:16:11] Oh, it’s absolutely fascinating. Honestly, it really is. So it’s purely about money, April, at the end of the day, is that it? I mean, I know there’s a few questions within this, but what ultimately do the big four want? I know they’re all very different in what they offer, but what do they want to do? Let’s take Facebook and Mark Zuckerberg, or let’s take Amazon and Jeff Bezos. You know, what do they ultimately want to achieve, do you think?

April Falcon Doss: [00:16:36] I think it’s really about growth of the companies, you know, expansion of corporate profits. And again, there’s nothing inherently nefarious about that goal. It’s the goal of most corporations. I think where we have this real disconnect is that we’re living through a time in which the way these companies make money relies on a set of technologies that raises a lot of ethical issues, and the technology is just leaping ahead far faster than law or policy or sort of a sense of ethics can keep pace. And there’s a whole bundle of issues here. In addition to data privacy, there are issues around, you know, anti-competitive practices, and whether they are already so big that they squelch competition from other kind of upstart companies that might challenge them, that might offer a different business model, perhaps with more privacy protections. There are concerns about whether the platforms have grown so big that they are ungovernable. I mean, we have just seen a major battle between Facebook and the Government of Australia over payment for news, and Google is in that as well, but the real standoff was between Facebook and Australia. We’re seeing the antitrust or consumer protection and competition commissions in countries across the UK, I mean, excuse me, the EU, Australia, the U.S., all investigating these platforms. So there’s a whole host of issues here. And when you have these massive multinational corporations that are so powerful, so well-funded, so widely used, it’s very hard, for example, for a small business to gain traction if they don’t have the ability to have a Facebook page, right, to kind of grow a following, or be on Instagram. The big platforms are where people are, and there is a center of gravity that gets created just by the volume of users on the major platforms.
So I think that what we just saw in this standoff between Facebook and Australia is one indication of the extent to which governments are saying, wait a minute, we need to make sure that it’s still possible on a national level to regulate these companies in ways that are consistent with our national values. So lots to watch here.

Anthony Brown: [00:19:10] Isn’t there just. And it’s quite fitting that you’re drinking, I don’t know if it’s tea or coffee, because I was just about to say, do you think there’s a sense that governments and society, normal people on the street, are waking up and smelling the coffee now? Because your book couldn’t have come at a better time, really. With all of these big stories playing out in the press, this is only increasing awareness. It can only be a good thing, can’t it?

April Falcon Doss: [00:19:38] Yeah, yeah. And I think there have been a couple of real moments of awareness that have been striking for a lot of people, around the impact of the big platforms on democratic discourse and processes: from the 2016 U.S. presidential election, and the Russian attempts to interfere with that and the use of the internet to do it, to the U.K. Brexit referendum, which has been widely speculated as also having been influenced by foreign actors. We’ve seen, you know, election influence operations. We have seen the spread of extremist ideologies online. We’re certainly seeing a groundswell of concern over things like the way that information persists forever on the internet, and certainly in the EU the establishment of the right to be forgotten has been kind of a leading edge there, in saying perhaps there need to be remedies for this. So I think a lot of these things are really coming together to make people more aware that, again, these technologies that we love using, we love the convenience, we love the ability to connect, but people are recognizing that they’ve galloped ahead really fast. We need to maybe pause and consider what the downsides are, so that those can be governed in some way.

Anthony Brown: [00:20:57] Hmm. Do you think, though, because laws and regulations haven’t, shall I say, caught up, are they ever going to catch up, because it’s always evolving, as we know? And can there ever be true competitors to, say, the big four? In many ways, the things that they were doing in the early days you might not necessarily be able to do now. So will there ever be a genuine competitor to a search engine like Google, for example? I know there are search engines out there, and some of them charge, or claim to have better privacy or, you know, not sell your data. But is there ever going to be a really genuine competitor?

April Falcon Doss: [00:21:35] That’s a great question, and I think we’ve reached the point that, in order for that to happen, there really is likely going to need to be some government intervention, because the key to the platform model is market dominance. So, for example, Facebook has been in the practice of buying up any competitors. It bought Instagram, right? It bought WhatsApp. It says, oh look, there’s a part of the social networking market that could draw people off our platform, let’s buy them, right? And it just expands that dominance. And with respect to search, it’s a great example. Again, the research shows that, generally speaking, Google search results are so effective because they rely on personal data, both the personal data of an individual user who’s conducted past searches in the browser, and also the personal data, broadly, of the whole set of users who are using Google search. And so searches on Google can be as much as 40 percent more efficient in bringing to the top information that people are looking for, although they can also bring to the top information that advertisers are paying to have brought to the top. So the point is that these companies, Apple, Amazon, Google, Facebook, as you said, the big four, have become so big that it is very difficult for newer, smaller companies to compete unless there is some government intervention. And we’re seeing really aggressive investigations now, again around the world, certainly in the EU, certainly in the U.S., looking at this anti-competitive problem, and we might very well see some move to break up these companies, to say there’s just been too much consolidation. That’ll take a number of years to play out, but it certainly has regulators’ attention.

Anthony Brown: [00:23:33] You took my next question out of my mouth there, April, because I was going to say, how do you see this panning out? If we had a crystal ball, you know, what do you think could happen? Do you think it’s conceivable they’ll be broken up in one way, shape or form?

April Falcon Doss: [00:23:47] I think it’s possible. You know, it’s interesting. So in the U.S. context, antitrust or anti-competition law has always traditionally looked at two main things. One, what’s the impact to potential upstart rivals? Is it possible to compete? And it’s OK for a company to have market dominance because it’s just better at something, but it can’t unfairly quash competition. And then the second component is really around what’s the impact to the consumer. Traditionally, the impact to the consumer has been looked at in the context of pricing, right? And so if a company has market dominance and that drives up the price of a good or a service to the consumer, then there’s been a sense, oh, there may be an anti-competitive problem here. The challenge that we’ve had in trying to adapt antitrust law to this current climate is that, in the data context, it’s not price that’s disadvantaging consumers. What disadvantages consumers is this combination of lack of choice, manipulation of information, manipulation of viewpoint, attempts to sway viewpoint, that incentive to keep people’s eyeballs on screen, sale of data in some cases and sharing of data in some cases. It’s the aggregation of that, and the sort of algorithmic use of data. And so we really need to be adapting our models of how we think about what cost to the consumer, or negative impact, is. And one quick thought on that, not to go too far into the weeds, but the Federal Trade Commission in the U.S., which is the primary regulator for consumer protection and antitrust, has launched a big study, or an inquiry, into nine major platforms, and one of the things they asked in that inquiry, a very long, detailed set of questions, is what is the dollar value to the platform of each user? And that’s something that’s been really hard to get at. So it requires a shift in mindset.
It used to be that anti-competitive regulation looked at things like the railroads and the steel industry, and this is just a different problem set. But again, I think governments globally are trying to modulate how they look at it and sort of modernize their approaches for the digital economy.

Anthony Brown: [00:26:10] Mm-hmm. Yeah, I mean, to all intents and purposes, Facebook have blinked first, haven’t they, with the Australian government? Yeah, yeah. Again, you know, it can only be a good thing for all of us. I mean, the regulatory bodies, like the ICO in the UK, and governments are going to have to play a strong poker game, aren’t they, over the coming years? You’ve really got to be tough, and you’ve got to stand up to these giants. In some cases, a lot of cases, these businesses have more cash in the bank than some countries do. So it’s very difficult, isn’t it?

April Falcon Doss: [00:26:48] Yeah. Yeah, that’s exactly right. And these are all of the challenges. But I think there is a growing sense that, whether it’s the impact on the news industry or the impact on democratic discourse or the impact on people’s individual lives, all of these things are a bundle of issues that are very deeply intertwined with these privacy questions. And so, you know, back to the very beginning of the conversation, when you asked, well, what about people who aren’t concerned? First of all, again, absolutely OK for people who aren’t concerned. That’s great. That’s fine. But I think for people who do have some concerns, the really striking thing that has changed in the last 20 years is that, where privacy used to be much more straightforward, what information is available about a person and to whom, now it’s become deeply intertwined with all of these other issues, which makes it much more complex and really challenging.

Anthony Brown: [00:27:44] It’s funny as well that you said that. I mean, anyone listening, I promise you this conversation is straight off the cuff, but there you were almost alluding to, well, you did allude to, privacy issues of years gone by and the sort of challenges that people faced then. And something I really wanted to ask you about, because again, your book is so well researched, it’s superb, and some of the stories in there are so interesting. Whether you’ve got an interest in privacy or not, pick it up, because some of these nuggets are brilliant. You talk about, correct me if I’m wrong, I’ve obviously made a note here, a lady called, is it Alvira Roberts? And one of the earliest known privacy cases, was it in the US back in the 1880s? Could you perhaps just throw us right back there and tell us a bit about that case?

April Falcon Doss: [00:28:38] Yeah, absolutely. So the U.S. and its Constitution does not anywhere use the word privacy. And so in the eighteen eighties, there was starting to emerge this sense of concern around what was the realm of information that individuals should be able to keep to themselves what’s sensitive, what’s personal? So in the 1880s, there was this case called Roberts versus De May, and this woman, Mrs Roberts, was pregnant. She was getting ready to give birth to her child. She and her husband were very modest means they lived in a one room home, very small, and the doctor came to deliver her child and the doctor had a companion with him, and the companion was present throughout the labor and delivery. And in this very small one room setting. After the fact, Mrs Roberts and her husband discover that the doctor’s companion wasn’t a medical student, wasn’t another doctor, was was, was just a friend who happens to be with the doctor that evening, she was mortified. I mean, this is the 1880s there was and he was a man. The friend was a man. It was a strange man, practically at her bedside as she’s giving birth. And this was a terribly personal moment and an intimate one, and it felt so intrusive. So the Roberts sued the doctor for having brought this stranger who was not a medical professional in any way into this very intimate setting in the home. And the court said yes, there has to be basis on which to sue for this. And previously, notions of privacy had in the U.S. been tied to things like the privacy of personal papers and the ability to protect that from the government or the privacy of property. But this was the first time that a court really looked and said, no, there’s something intangible about the sense of being able to protect from the world intimate moments. It doesn’t have to be solely about property and papers and intrusion into a space. 
Now that’s that Case was certainly influenced by the fact that it was in the home, but I think a moment like childbirth would have been found to be sensitive and intimate in a similar kind of way, even perhaps outside the home. Interestingly, right after this, very soon after this, there’s this very famous law review article in the U.S. that becomes the foundation for privacy law in the U.S., and it was driven by changes in technology. These two very respected lawyers, one of whom would later become a Supreme Court justice, wrote an article saying that the heart of privacy consisted in the right to be, let alone now. What had happened at this time and the things that they were influenced by was the advent of the camera. And so all of a sudden by the eighteen eighties, there is this growing movement of the early stages of paparazzi who are out there taking photographs of all kinds of things peering over garden walls to take photographs of society weddings really just sort of intruding into what had once been private spaces and even taking photos out on the street, but making people’s actions on the street, although not private now permanent. And that created a different sense of what that means to this idea of the right to be, let alone was actually driven in part. It started with kind of this sense that some things are just intimate and should be protected, and then was very quickly driven by this change in technology. And we certainly see in the EU as well as as there’s the establishment of privacy as a fundamental right coming out of the European Convention on Human Rights. We really see again this concern about the ways that information can be misused and the EU historical context. Of course, it derives really heavily from the World War Two experience. So technology has shaped our notions of privacy for almost a hundred and fifty years now, and law and policy continues to struggle to keep up because technology just evolves more quickly. 
So all these ideas, the right to be let alone, the right to have our own thoughts free of monitoring and surveillance and assessment, and the right to have a sense of personhood and dignity that is away from prying eyes, are things that really are essential to living a complete and whole life. And they are things that are at risk each time there are technology changes. So we really need to think about how we continue to maintain that sense of personhood and autonomy while making use of all the new technology innovations as they come along. Another long answer to a short question.

Anthony Brown: [00:33:08] No, April, it's fascinating hearing you talk, and it's brilliant because as I hear you talking I can remember bits that I've read in the book, and it's just amazing to hear you talking about them. And isn't it amazing that when you break it all down, break privacy down, it's so simple: what it's all about, really, is that fundamental right to keep things private. That's where it all began. Alvira Roberts, I mean, little did she know that she would be an important part of a case in the 1880s, and little did she know what the world would look like fast-forwarding well over a hundred years. Another fact in your book is that 90 percent of the world's data has been created in the last two years, and I think that's perhaps where we'll wrap up, April, and leave that as something for everyone to think about, because that is just mind-blowing. Ninety percent of the world's data was created in the last two years. That may well have been when you wrote the book a year or so ago, so it was probably the two years preceding that, and it's probably just exploded even more since.

April Falcon Doss: [00:34:16] Yeah, it is mind-boggling, isn't it? And so it just means that there's so much to grapple with. Just for a closing thought, there are so many people who work in one way or another in privacy and compliance and corporate ethics, whether in the government or the commercial context, who really want to be wrestling with these things, right? And so it's a great opportunity, a great time for people who are working in these fields to try to help within their organizations: here are some ethical uses, here are some lines that we might not want to cross, here are some things to think about. So for folks who are working in these industries, as well as all of us just walking down the street, living our lives with our smartphones in our hip pocket, hopefully this is some useful context for how to think about these things.

Anthony Brown: [00:35:05] Absolutely. I mean, with people like you around, April, banging the drum and shining the light on everything, we know we're in safe hands. And as we said earlier, there's never been a time when people have been more interested or more aware, and long may it continue. And hopefully we didn't do too badly there, considering we just decided on the day to do it. So stay on the line, April, and I'll finish recording in a minute, but I just want to say thank you so much. To everyone listening, whether you can see this or not, April's book is Cyber Privacy: Who Has Your Data and Why You Should Care, available from all good bookshops and, well, you can guess, one of the 'Big 5' sells it. So thank you, April. It's a wonderful read, and thank you for your time. I'd love to have you back on in the future as well. We'll say goodbye to our listeners and viewers now. I hope you've enjoyed it, take lots away, and go and buy the book. Bye for now.
