The Privacy Insider Podcast
Protecting Privacy at Every Walk of Life with France Bélanger and Donna Wertalik of Virginia Tech
Privacy has entered a new era. AI-generated content, ubiquitous surveillance, and unchecked data collection have created a landscape where individuals are constantly tracked but rarely informed. As digital tools outpace policy and public awareness, the need for accessible privacy education has never been greater.
About Our Guest
University Distinguished Professor France Bélanger and Professor of Practice Donna Wertalik, both of Virginia Tech, are meeting this moment with a blend of academic depth and public communication. Through their joint initiative, Voices of Privacy, they translate decades of research and marketing insight into practical guidance for navigating the digital world. In this conversation, they discuss behavioral patterns, generational attitudes, flawed policy execution, and how building privacy habits early can shift the culture from passive acceptance to informed control.
Episode Highlights:
- (05:10) Meaningful projects often begin with shared values and a simple idea.
- (07:20) Remote work was an early signal of evolving surveillance norms.
- (10:12) Advocacy in communication and branding can drive awareness around digital harm.
- (17:45) Data is constantly collected, and informed choices are essential.
- (23:39) Building privacy awareness requires relatable examples and early habit formation.
- (28:11) Early education is crucial to shifting how privacy is understood.
- (44:49) Strong policies can fall short without thoughtful implementation.
- (51:42) Even experts face trade-offs when balancing convenience and privacy.
Episode Resources:
[00:00:00] France Belanger: It's not about having something to hide, it's about having your life. It's about being your own person and not having everything about you out there.

Arlo: Welcome to the Privacy Insider. My name is Arlo Gilbert. I'm the founder at Osano, but today I'm your host. Today we are joined by a power duo. First, Dr. France Belanger, a scholar who turned privacy from a buzzword into a blueprint. After earning her PhD in Information Systems, Dr. France Belanger rocketed through Virginia Tech's ranks to become a University Distinguished Professor, R.B. Pamplin Professor, and Byrd Senior Faculty Fellow, with an h-index of 52 and about 22,000 citations, placing her in Stanford's top 1% of scientists worldwide.

[00:01:11] Arlo: France's work keeps raising the bar, from multi-level privacy management, revealing how families, teams, and companies co-own data, to mapping the rise of remote work surveillance tech. And she doesn't leave insights on the page. She co-founded the award-winning Voices of Privacy project, turning cutting-edge research into web episodes and TV tips that the public can use right now.
[00:01:40] Arlo: Buckle up. France saw the telework wave coming before Y2K, and she's still steering the privacy conversation today. And the other half of this power duo is Professor Donna Wertalik. Next, get ready for a marketing firecracker with a privacy twist.
[00:01:58] Arlo: Professor Donna Wertalik spent 15 years shaping brands like Nestlé, Schering-Plough, and Ogilvy before leaping into academia as a Professor of Practice, Director of Student Engagement, and founder of Prism, Virginia Tech's award-winning student social media agency. Under her guidance, Prism has bagged more than 10 national creative trophies, while Donna herself boasts six Telly Awards, a stack of ADDYs, and the title of Bloomberg Businessweek favorite professor.

[00:02:27] Arlo: Whether she's unpacking digital identity storytelling, debunking Gen Z's privacy apathy, or showing how micro-influencers can bridge the gap between personalization and trust, Donna delivers monthly privacy tip segments on WDBJ7 with the same infectious energy she brings to the classroom and Voices of Privacy.
[00:02:50] Arlo: Get set for a masterclass in brand savvy and data smarts. Donna Wertalik is in the house. Ladies, professors, welcome to the show.
[00:03:02] Donna Wertalik: Thank you so much. We're so excited to be here. Yes. My name is Donna Wertalik, and I'm a professor at Virginia Tech in the marketing department.
[00:03:11] France Belanger: Thank you so much for having us. We love talking about privacy. So my name is France Belanger. I actually have several titles at Virginia Tech, but I'm basically a professor, and I'm also a book author, a journal editor, and a consultant.
[00:03:27] Arlo: Wow, you sound like y'all must be very busy. So now, tell us a little bit about yourselves. You both have a show that you produce together around privacy. Tell our audience a little bit about it. I'm sure many of them have listened and seen your clips, but let's share, since we've got a large audience who has perhaps not seen some of your clips.
[00:03:46] Donna Wertalik: Voices of Privacy started as an idea. France and I had met many, many years ago. I always say France saved my life, because I lost my wallet, and talk about the ultimate in terms of losing your privacy. So I lost every single credit card, driver's license, all of that, and France, like someone coming out from the lights, said, I have your wallet.

[00:04:06] Donna Wertalik: And that was the start, way back in the day, of us meeting each other, and then just occasionally seeing each other here and there. Then there's the concept, obviously, of privacy. I've been in marketing for 30 years, so online tracking, digital tracking, what's happening with our data, the information.
[00:04:25] Donna Wertalik: Why do marketers need it? Why do consumers have to give it? All of that. So for a bit, I was working with a number of different associations and actually doing some training, and really starting to discuss what happens when we're online, when we can't see people following us, and what happens when we give everything up.

[00:04:43] Donna Wertalik: So it's similar to George Orwell's 1984: someone's watching us. Right. But the other side of it was that Orwell didn't assume we would be giving all the information up freely. I'm going on vacation for two weeks, come on inside my home. Or, you know, whether it was senior citizens or children.
[00:05:01] Donna Wertalik: So I was touching upon all of these subjects, and I'll let France share her angle of it as well, in terms of the areas that she was touching upon. I was doing some marketing at the time for the college. She reached out and said she had an article, and we were both like, maybe we should talk about something bigger than what we were doing.

[00:05:21] Donna Wertalik: And she said, I have a crazy idea. And I said, I love crazy ideas. So we met over coffee, thinking at the time about a book or something along those lines, and we were like, what if it was a show? What if two people that never really worked together, people that were probably so opposite of each other, in terms of not just our work but a lot of different aspects? But one aspect that pulled us both together, outside of our brilliance as females and kind hearts, was our passion, our passion to make a difference. And that, from my perspective, is how Voices of Privacy came about: to make a difference. We feel we are ahead right now, because the world still is unaware of where everything is going.

[00:06:08] Donna Wertalik: But it started with a passion, and it started with a crazy idea. We started in the fall of 2021, just etching things out, and four years later we're sitting here on a podcast, with thousands of subscribers on our social, and viewers, and years of content to educate everyone at a third-grade level, not an academic level.
[00:06:31] Donna Wertalik: So that was my angle of it. But France, please speak.
[00:06:35] France Belanger: Absolutely. So I'm just gonna reemphasize the point that Donna and I are extremely different. Donna is a very marketing-oriented person. She was involved in social media from the beginning. I never had a Facebook account. I've always shied away from every social media, although I am on LinkedIn for practical reasons, and that's part of the privacy conversation.
[00:07:01] France Belanger: And here I'm gonna show my age. When I first started my career, I worked for IBM, and then I was in the telecommunications industry. I did that for about 10 years, then I decided to go into academia, which is the best job in the world. I love what I do. And of course, I brought my telecommunications with me, and my dissertation back in the nineties was on telecommuting, or teleworking, and this is way before COVID happened or anything.

[00:07:28] France Belanger: So I was, you know, saying, okay, people are gonna start working remotely. And just to give you an idea, I was comparing, this is just for the older people here, a 2,400 bps modem to a 9,600 bits-per-second modem. And for those of you who have no clue what that means, if you had a 2,400 bps modem back then, to get to work, first you pushed the button and then you went to get a coffee.

[00:08:00] France Belanger: But depending on the bandwidth you have on 5G now, the communication could be up to a million times faster than what I was studying back then. And when the web started to evolve into e-commerce, I was really drawn to, well, why are people gonna do this? And three of the factors that really came out were privacy, security, and trust.
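That "million times faster" figure holds up to quick arithmetic. As a sketch, assuming a 2.4 Gbps 5G link (a hypothetical round number within real-world 5G ranges, not a figure cited in the episode):

```python
# Compare the 2,400 bps dial-up modem France describes with a modern 5G link.
modem_bps = 2_400             # 2,400 bits per second, early dial-up
g5_bps = 2_400_000_000        # assumed 2.4 Gbps 5G link (illustrative figure)

speedup = g5_bps // modem_bps
print(f"{speedup:,}x faster")  # 1,000,000x faster
```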
[00:08:27] France Belanger: And so you have to think, this is several decades back, and that's when I started studying privacy. Privacy is a fascinating area because it is not just about the technology. It's not even just about the regulations. It's about people's choices. And that people aspect, the fact that people have a role to play in privacy, has kept me super interested.
[00:08:55] France Belanger: And so, after years and years and years of saying people just don't get it, I said, we need to do something about it. And I knew that Donna had that aspect that I didn't have. Plus we knew each other, we liked each other. I was like, she's got that marketing aspect, that social media, that understanding, and really in many ways the other side of what I was fighting, which is digital tracking.

[00:09:24] France Belanger: And so I didn't even hesitate. It was like, she's the person I wanna work with. And that's how we ended up creating Voices of Privacy. And since the beginning, it's always been about trying to educate everybody. We're not saying you shouldn't use the web, you shouldn't be online, we shouldn't talk to Arlo because there's a video of us.
[00:09:45] France Belanger: It's about making decisions.
[00:09:47] Arlo: Informed choices and informed consent are kind of important. So I have to ask: marketing and privacy are usually at odds with each other. Do the two of you find that this relationship has sources of conflict because of the two different backgrounds?
[00:10:06] Donna Wertalik: I'll speak for both of us and say that we respect each other's background and history. So my background, you know, is 30 years in marketing, working for Nestlé and working for all sorts of pharmaceutical products companies. But there was always an aspect of advocacy within my marketing.

[00:10:26] Donna Wertalik: So one of the big projects I worked on was Dove Real Beauty. It was really breaking the seal of what women felt was perfection. So that got me on the advocacy route: we need to put out marketing that's real, that's authentic, versus really impacting young girls' lives at the time and creating all sorts of things.

[00:10:47] Donna Wertalik: So as we've watched, we were the first ones at the college, et cetera, to launch social and all the platforms, because we need to be there. That's where our generational cohorts are coming in, expecting us to have that education, especially in the marketing field. So that's what I primarily focus on.
[00:11:01] Donna Wertalik: But a little bit over a decade ago, I think, I started to present to the Virginia Tech football team, and it was about curating for good. What kind of content can you put out there that, if you or your child or your grandchild saw it, would really make you proud? So it's not about going off camera, it's being on and being very intentional.

[00:11:23] Donna Wertalik: I have learned more than a lifetime of insights from France. I find myself absolutely honored. I respect France so much. She is a global scholar. She is a confidant. I learn from her every single day, and I think we have so much fun with the back and forth. It's like, oh, I didn't know that. Oh, okay. That's good.

[00:11:45] Donna Wertalik: So it really is, and I think it's because we are the right people to do it. Other people might come from different angles, but we really are trying to do the same thing. We have the same goal, so I think that helps. But France, obviously, can speak for herself.
[00:11:58] Arlo: That's amazing. And I have to say, I remember that Dove campaign so vividly. It was really groundbreaking. It was the first time, at least to my knowledge, that a beauty company with that kind of reach had gone outside of the standard of, you know, what you see on the cover of the magazine.

[00:12:21] Arlo: Does that represent reality, or does it represent some distorted story that we're telling? I love that. So you're definitely one of the good marketers, using your powers for good, not for evil. That's wonderful. I'd love to learn a little bit about how both of you got interested in the subject. So, you know, France, you had a much richer background in privacy prior to meeting up with Donna.
[00:12:44] Arlo: Was there anything in your background that got you excited about privacy to start with?
[00:12:49] France Belanger: Not really. I mean, it was really constant. You have to think of, like, 20 years of trying to understand people. And I remember doing an interview with somebody that asked, is your background psychology? And I said no, but by default, you become a psychologist when you try to understand people's behavior, both in security and privacy. And, you know, I have done research in both, and I made a conscious decision at one point to really focus on the privacy side.

[00:13:20] France Belanger: Because, you know, in security, we're trying to take the power away from the users. I mean, if you don't wanna use a strong password, tough luck. You're just not gonna get into the system. And so slowly we're trying to do that. But the people who still click on that link, the people who still share things. I mean, we have so many stories, Donna and I, and we know people, some very close to us, that have been hacked or scammed, and all of that.

[00:13:48] France Belanger: It just never ends, seriously. And so that's when I made the decision. I said, we have to really focus on understanding the people. And also, and I'll just bring this up briefly, there is the aspect of the organization.
[00:14:15] France Belanger: What I've seen in organizations: 20 years ago in cybersecurity, nobody knew, and now they're starting to know. In the privacy world, they don't know. Right now it is completely, I call it the Wild West, because they don't know. They don't know what they have. And the power, and I'm sure you're gonna talk about technologies soon, but the power of the technology to collect more and more and more data, to process data, for people not to know that it's being collected, all of that, it's getting into these organizations.

[00:14:41] France Belanger: Some are consciously thinking, oh, now we really have to be good stewards of that. Others don't even know what they have, and yet others are like, oh, how can we use that?

[00:15:04] France Belanger: But that world is really the Wild West right now to me. So I didn't quite answer your question about why I got into this, but I can tell you why I'm passionate about it.
[00:15:15] Arlo: Fair enough. I think all that matters is that it's the authentic you. So I'm really curious: it's a long time for it to still be the Wild West, 'cause it sounds like we all got online around the same era, and back then they were calling it the Wild West, and then 10 years later they were calling it the Wild West.

[00:15:34] Arlo: Do we have any hope that this will not be the Wild West someday soon?
[00:15:39] Donna Wertalik: I mean, it's interesting. There was a stat actually released today. MIT did a survey with everyone using AI, and our competency scores and creativity scores went down about 83%, because now we have machines thinking for us. And it's a fear. It's quite fearful. So I think the Wild West gets wilder.

[00:16:00] Donna Wertalik: I honestly do. And I think that, as France mentioned, we still don't have a foundational understanding that companies don't have to do it right. And that's one of the biggest things that we saw earlier this year with Google and browsers: okay, you're gonna just choose one time and say, on this browser, I don't want anyone tracking.

[00:16:19] Donna Wertalik: And then all of a sudden: oh no, we're not gonna do it now. So they're still asking for consumer behavior changes. And that's consumer behavior psychology: it's asking people to take a pause, to stop and say, here's a popup, what do I do? Versus mentally, we're conditioned, almost from the days of banner ads, to click it away, click it away, click it away.

[00:16:40] Donna Wertalik: So it kind of serves that same way. So do we need a different prompt for consumers to understand how important it is? Because it really is. It's gonna come from an integrated source: it's gonna come from educators, it's gonna come from policy, it's gonna come from families and students, everyone really embracing this, but understanding that they do have the power. But they need to take that power back quickly before other things occur, and we've seen them. So from my perspective, I think it's getting wilder.
[00:17:13] France Belanger: Yeah, so are we gonna be able to make a difference? I don't know if we're gonna be able to, but we're not gonna stop. So that's the one thing I can say. We do believe that. You know, if you look at the old movies and all the science fiction about people talking to their watches, or the data, and you can't walk on the street without being recognized, we're pretty much there.

[00:17:43] France Belanger: And so once people realize the amount of data about them that is constantly being collected, constantly being used and analyzed, I think then maybe we can at least tell them: make some choices, instead of just, you know, giving a blank check to all of these companies and all of these governments and all of these environments to just know everything about me.

[00:18:13] France Belanger: I don't care. Typical answer. I don't have anything to hide. It's not about having something to hide, it's about having your life. It's about, you know, being your own person and not having everything about you out there. So we can get there, but it's a complicated effort that, as Donna was saying, is really multi-pronged.

[00:18:38] France Belanger: It needs to come from the organizations producing the products. It probably needs more or better regulation. And the people really having that understanding that they still don't have.
[00:18:54] Arlo: Amazing. When you look back at the history of the internet and we think about the path that took us here, my memory is that when we began, the internet was a pretty open, safe, fair place. There wasn't really anything that could be collected about you other than perhaps a username and a password, or maybe they wanted to know your birthdate so they could, you know, show you a graphic on the website on your birthday, or something along those lines.

[00:19:23] Arlo: It was always so innocent, and we were all trained that it was safe and okay to fill these things out. It was open. We had a guest some time ago with whom we talked about the way that marketing has very effectively used signals across lots of different things to be able to cross-correlate identity.
[00:19:45] Arlo: So we often talk to people and they say, well, you know, for example, maybe my zip code, right? That's not personal information, my zip code. There are thousands and thousands of people who live here. And we said, yeah, okay, and then you went to another website and they asked you for your last name and your date of birth.

[00:20:04] Arlo: And then it turns out that one company bought both of those websites. Now they know your zip code, your date of birth, and your name. They definitely know exactly where you live. They know what you order. It's this amazing ecosystem that feels like it almost sprang up accidentally, out of solving problems: I need to make money on this website, it costs me a lot of money to host, so I need to run some banners.

[00:20:29] Arlo: Oh no, the banner prices are going down. Well, this company's offering better banner prices, I'll use them. Right? And it became this slippery slope that somehow has gotten us to the point now where, if you've ever spoken online, if you've ever been on a video, the AI can now clone you and call your family and pretend to be you.
[00:20:48] Arlo: It's really amazing how we've gotten here. So I'd love to talk a little bit about your specific work. France, you've spent a great deal of time in academia talking about privacy, and I guess the first question that really comes to mind whenever I hear talk about teaching privacy is: how do you deal with all the apathy?

[00:21:10] Arlo: Because there are a lot of people who say, hey, you know, I don't mind sharing my name, or I don't mind sharing my email. But often what we hear is: I don't like it, but they've kind of won, the game's over. You know, they won, big tech has dominated, now I'm just a pawn in their machine. How do you think about that?
[00:21:33] France Belanger: Well, from an educational point of view, I'm a big fan of what's called learner-centered education. And so from an education point of view, you have to make them do things. So one of the initiatives, which actually was Donna's idea, is that we have created these videos where students are forced to go and do something: go watch this.

[00:21:58] France Belanger: How to fix my settings on my phone, go watch how to do my Facebook settings. And when people do it, it's easier to train them after they've done it, to see what is there, right? To see what the issues are, to see the choices they have. So I'm very much a fan of, let's get them into what we call learning activities, and then we'll explain the reasoning behind them.
[00:22:29] France Belanger: So from an education point of view, I think that's really the only way they're gonna get there. If we were able to start way back when they were younger, they would develop these habits, right? We wanna develop habits. Very simple example: your phone changes operating system every three, four, five, six months, right?

[00:22:54] France Belanger: There's an update. Every time there's an update, there's a new setting, or a change of setting, or a reset of a setting. Well, a habit is: you do this, then you go check the settings. Nobody does that, but you should. And so if we can start building these habits so that they become kind of second nature, right, then I think it would help in solving the problem.

[00:23:22] France Belanger: I'm gonna stop there, because I wanna leave room for Donna to answer that too, 'cause she does that in her class too.
[00:23:30] Arlo: Great. Well, Donna, why don't you jump in while I take a deep breath, so we have a pause spot. I'd love to have you jump in, Donna.
[00:23:38] Donna Wertalik: Yeah. One of my favorite things to say is, if you had to take a step for every click you have made since you woke up this morning, how far would you walk? How many clicks have you taken? How many people have walked with you that you have no idea how many people have walked with your children, with your parents?
[00:23:56] Donna Wertalik: They have no idea that they're surrounding us and attaching themselves to us, attaching themselves when we go from Facebook to a store to anything along those lines. So, you know, we start by trying to explain that to a certain perspective to students, and then start with the analogy of, okay, you don't care about privacy. And this does differ between generational cohorts, I will say.

[00:24:17] Donna Wertalik: And so for the one now, in terms of Gen Z, 'cause Gen Alpha's coming up, and that's very interesting, some of the habits that they're developing. But for Gen Z, it's as if to say: if you don't care about privacy online, then offline you must always keep all your doors open, all the money on your table, and jewelry and cash, because there's no worry for you.

[00:24:44] Donna Wertalik: Oh, no, no, no. It is a worry. So for us, it's about bridging the virtual and the physical, "phygital" is what we kind of call it, and creating that environment of habits, as France has spoken of, because it truly is on the user. They know how complacent people are, especially at different age stages. Whether you're complacent, or you just don't understand, or technology is not your thing, that's how people get targeted.
[00:25:11] Donna Wertalik: So I think it's a long road in terms of education, and we see it going into immersive. I mean, we learn in headsets now, we're teaching in the metaverse, and now we're avatars. So that's the whole other next level that we're going to, not even as ourselves, putting ourselves out there as avatars.

[00:25:29] Donna Wertalik: So it goes on and on, obviously. And I think it's habit-forming. I think it starts early in the schools, which is what's gonna be really, really needed. And I think there have been some good roadways that we've made, but things are coming at us so fast and so furious, and with the world as it is, I just don't think that people have time.
[00:25:52] Arlo: So let me ask: you both are educators, and you both predominantly focus on teaching at the collegiate and perhaps graduate level. But it sounds like both of you are saying that this has to start a lot earlier, because I know that by the time I got to college, I had a lot of formed habits already that I wasn't gonna change, and now, at a much, much later age, it's even harder to change who I am.

[00:26:20] Arlo: So if you both are talking about earlier education, how do you make that happen? Because it seems like that's a lot of people and a lot of minds to change, and perhaps even getting support of educational institutions and districts and counties and countries. How do you approach that? Where do you start?
[00:26:42] France Belanger: I think we start by convincing. You have to convince the school systems of the importance of doing this. But it really starts with the parents. When the parents demand it, I think it's got more of a chance to happen. And unfortunately, before they demand it, parents need to understand the issues.
[00:27:05] France Belanger: And so one of our episodes was on sharenting. Sharenting is parents sharing all these pictures and information about their children online, like on Instagram, and I don't know how many parents I've talked to who don't realize that their photos have tagging information with them. So say I start posting a picture of my child at the playground, I don't know, once a week.

[00:27:31] France Belanger: And so every Tuesday this child is at that playground. I've just given a lot of information to somebody out there. And this is on top of all the issues of, you know, bullying that can happen from having these pictures. But even just with the information they're sharing about their child, there's also a marketing side: the marketing people are also collecting this information about the children and building life profiles of children.

[00:28:03] France Belanger: And they'll follow them till later in life. But just from a security point of view, to have all of that posted there. So if parents are doing all of that, for various reasons, some because they think it's good, others that are self-oriented, well, we need to educate the parents first, so that they say, you know, we do need to do something for our children to start understanding.
[00:28:30] France Belanger: It's about the value of data. Once they understand the value of data, and we give them tools, you know, protective settings, I think we're gonna make a step in the right direction. So we've already contacted schools, we've already talked to some counties, but we need a very broad effort for this to happen, which of course

[00:28:55] France Belanger: would need to be supported in many ways, 'cause we still only have 24 hours in a day each, not quite enough to do all the privacy work. But yeah, this is something we believe in: it needs to start very early, understand the value of data, understand the basics of protection, and not give up.
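France's point about photo tagging can be made concrete. Phone cameras typically embed GPS coordinates in a photo's EXIF metadata as degree/minute/second rationals; the sketch below, using made-up coordinate values for illustration (not data from the episode), shows how trivially those rationals convert into a map-ready decimal location:

```python
from fractions import Fraction

def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF-style GPS (degrees, minutes, seconds) to decimal degrees.

    ref is 'N'/'S' for latitude or 'E'/'W' for longitude; southern and
    western hemispheres become negative values, as maps expect.
    """
    value = Fraction(degrees) + Fraction(minutes) / 60 + Fraction(seconds) / 3600
    if ref in ("S", "W"):
        value = -value
    return float(value)

# EXIF stores each component as a rational, e.g. seconds of 15.54 as 1554/100.
# These example values are hypothetical, roughly around Blacksburg, VA.
lat = dms_to_decimal(37, 13, Fraction(1554, 100), "N")
lon = dms_to_decimal(80, 25, 30, "W")
print(round(lat, 6), round(lon, 6))  # 37.220983 -80.425
```

Anyone who downloads an unstripped photo can run this kind of conversion on its metadata, which is exactly why a weekly playground picture leaks a precise, repeating location.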
[00:29:15] Donna Wertalik: And tagging onto this, if you also look at it, some companies have encouraged us to share. Even Google does a great campaign, you know, of a dad and his daughter, and from the very start, every day they take a picture, right? And at the end it's this beautiful commercial of this relationship the father and daughter have had for so long.

[00:29:35] Donna Wertalik: So they're encouraging people to do that, which is memories, which is fine. But then it goes back, Arlo, to your point where you were saying, you know, when the web started, everyone was like, oh, it's a safe space, it's okay. And then you add onto that: oh, it's a social media space, it's just my friends that are seeing this.

[00:29:50] Donna Wertalik: And you're never imagining the quiet stalkers, the people that are just watching your photos and looking at everything. You know, there's a great PSA, a public service announcement video, out of the UK called Without Consent, and it will scare the daylights out of you, because it really showcases it. It AI-ages a girl up from five years old to 18, and everything that happened in her life because her parents put her online, and it was a very simple thing.
[00:30:17] Donna Wertalik: So it's a really scary world, and we don't want to go out and promote with fear. But some of the things that we have seen behind the scenes, and these are children, really fire us up. So we do need something, whether it's policy, and/or school policies, or corporations being held further accountable. We know that years ago,

[00:30:38] Donna Wertalik: you know, Mark Zuckerberg was with the Senate. And that was more mind-blowing, I think, than anything else, because they were not understanding even what social media was. And it was kind of like, we have this amazing Senate, right? You grow up, you believe in all of this, and they're not even understanding what data is or is not shared.

[00:30:56] Donna Wertalik: And it kind of put us in a position of, you know, how are we governed? They don't even understand the technologies. How can they help us, or help the company? So that was a big eye-opener, I think.
[00:31:07] Arlo: I remember that so well. In fact, that was one of the inspirations for me to start Osano, watching that exact Senate conversation. And I remember so vividly thinking, these people are very educated, and the vast majority of them are attorneys, and the way they're describing the technology really shows an absence of understanding of the foundations of it.
[00:31:30] Arlo: That's kind of scary. How do we get legislatures to pass meaningful laws when they don't understand the concepts underneath, and instead rely upon the faces of the largest social media organizations and ad companies to come in and tell them how it all works? One thing you mentioned that we haven't talked a lot about, Donna, and I'd love to hear both of your thoughts on it:
[00:31:55] Arlo: you talked about the generational differences. Now, I'm Gen X, and you know, we have lots of snark, and we don't trust the world, and we don't trust the government, and we all apparently like to hold boom boxes over our heads when we're being romantic. But you go beyond Gen X, and you go to Gen Z, Gen Alpha.
[00:32:17] Arlo: How do you see those generations behaving differently with regards to their view about privacy and their understanding and desire for privacy?
[00:32:28] Donna Wertalik: I'll just say, from some of the things that I've seen: even when you look at millennials, they were always knocked for, millennials want this and millennials want that. And now those millennials are parents, and some of those millennial parents are the ones with masks over their kids' faces, and they're not sharing things.
[00:32:45] Donna Wertalik: So has that turned, in that sense? And then when you see Gen Z, you know, there's a break in the generational cohorts even within Gen Z. A 25-year-old versus a 19-year-old, who may both be in Gen Z, could act very, very differently because of the world that they've grown up in, et cetera. But I think it's really interesting, because I feel like Gen Z is a generation that wants to be seen.
[00:33:10] Donna Wertalik: Right. They want to show up. They wanna say, this is my life, this is the highlight reel, everything is great, I vacation all the time, I travel. It's that one-upping each other, and being constantly tracked. And then there's what we're seeing early on with Gen Alpha. I think the oldest are in sixth or seventh grade at this point, maybe something along those lines.
[00:33:28] Donna Wertalik: But their attitude is, I don't know if I wanna be tracked all the time. Maybe I want a dumb phone, a phone that just calls and takes pictures and lets me text, because why should somebody know where I am all the time? That's creepy. And I kind of love that angle of what I've heard, and I'm hoping that we can make "creepy" a bigger thing, get people to just think about it, and get it trending, just like when people stopped smoking, right?
[00:33:55] Donna Wertalik: You know, it's stupid, right? Don't smoke. So what if we could do something along those lines? So from that angle, that's how I look at the generational cohorts.
[00:34:05] France Belanger: I think that older people in general have been shown to be more concerned about privacy, but less able to manage the technology from a privacy perspective, and they're very much at risk. We had a great episode on romance scams where, you know, we interviewed some researchers, and a lot of the victims are older people.
[00:34:32] France Belanger: Older people who seek companionship, you know. And there's somebody who makes them feel really good, and they're not able, or don't have the tools, to tell the difference between an online scam and reality. And it is sad to see the amounts of money that they're being robbed of.
[00:35:01] France Belanger: And yet they're the ones who, when you look at surveys, say they're the most concerned about privacy. You know, it's nice to be concerned, but you need the tools. You need to have the abilities, the knowledge, to be able to do something about it.
[00:35:19] Arlo: Agreed. Now, we know one thing about technology companies: when they want to make something easy, they're very, very good at making it easy. So the fact that these settings, for example on your iPhone or your Android device, or managing your browser preferences, are never front and center...
[00:35:38] Arlo: They're never the first button, they're never the call to action. It's always a secondary thing. And I can only imagine how difficult it must be to be in your seventies or eighties, trying to catch up with technology, while they're playing this game of, you know, hide the button.
[00:35:58] Arlo: Always telling you you have power, but you're gonna have to go navigate 42 menus, and, like you said, on every single operating system update you have to go do it again. It is scary. And we all know somebody who has been impacted in that way. I mean, I have family members who have been
[00:36:15] Arlo: taken, you know, for relatively small dollars, by somebody who wanted medical help, or, you know, "I want to come visit you, but I can't afford the plane ticket." Those kinds of things. And it's really terrifying. So when we think about the privacy world and technology, one of the core themes seems to be that marketing is responsible for a lot of the bad stuff that we're dealing with.
[00:36:42] Arlo: But I'm really interested to hear from both of you: what is it that keeps you up at night? 'Cause now we've got AI. There's always new techniques, there's always new tools available to marketing and sales and technology companies, and they're developing faster than we can keep up with, quite honestly.
[00:37:01] Arlo: What is it that keeps the two of you up at night in terms of the future of technology and how it's changing? Is there anything you see that gets you excited? Is there anything you see that gets you worried?
[00:37:15] France Belanger: So with every technology, after many decades of being in this field, there's always the first scare, and then it gets normalized into everything, and then it's usually used as a kind of counterbalance. And AI is similar, right? AI has a lot of privacy issues. I mean, the scraping of our data that we never agreed to let any companies use.
[00:37:45] France Belanger: All of the things we were talking about before, Arlo, where, you know, you can pretend to be somebody so easily, and all the deepfakes, everything like that. Very, very scary. At the same time, there are some efforts to use AI to help, you know, recognize: boom, alert, this is a scam; boom, this person's not real.
[00:38:05] France Belanger: So there are efforts to do that. What concerns me, from my personal point of view, is that we are at a point where surveillance is literally everywhere. And it sounds bad, but when I walk into a room, I spot all the cameras now. I don't even think about it, but I spot where all the cameras are.
[00:38:30] France Belanger: Maybe because of what I do. Not that I have anything to hide, but, you know, do I really want people to know how long I was at the coffee shop, and how long it took me to park my car properly in the parking spot, all of these things? No, I don't want that, but I don't have a choice. And so, you know, we go back to this idea of choice.
[00:38:53] France Belanger: There are things I can choose. I can choose not to have a Facebook account. I can choose not to say on LinkedIn that, oh, by the way, I'm going to this, this, and this in the next three weeks. It's my choice to be careful. It's my choice to not have all the tagging on my photos. But that means when I look at my photos, they don't tell me the day, the hour, and who was with me when I actually took the picture.
[00:39:18] France Belanger: I have to remember it myself, but it's a choice. So there are things I choose, and there are things that we don't choose. And it is a little scary to me, let me go back to this idea of the films, that we are going back to these hyper-known societies where everything is known about you. And I don't want people to give up.
[00:39:47] Arlo: That's fantastic. Donna, how about yourself?
[00:39:50] Donna Wertalik: Well, a lot of things keep me up at night. But, you know, I think it is the invasion of privacy, and, I think, without most people's knowledge. Do I worry that we're all gonna turn into robots and really lose sight of ourselves? Not a hundred percent, but it's there. It's definitely there.
[00:40:11] Donna Wertalik: And that worries me in terms of our cognitive skills: younger generations not using their creativity and their ability to problem-solve in a unique way, versus having a machine do it. So for the younger students I really, really worry. But I still always worry about my parents and older people getting hacked, and how I can't be Superman and fly around and say, wait, you're getting hacked, or, don't hang up that phone, et cetera, because it is coming at the speed of light. And like I said, there was a concept called the uncanny valley, years and years ago, and it was: can you detect the robots from the real?
[00:40:53] Donna Wertalik: Right. And if you look at it now, and we've done lots of tests on AI, it's machine learning; it just keeps getting better and better. But there were points when you could see it was AI, 'cause the neck looked off and it didn't look real, and now they have all these nuances to it. So when does it stop? When does it stop? It can be used for good, like France said, in terms of, yes, this is a stalker, this and that, but we have a lot of bad out in the world. And, you know, an interview we did was kind of mind-blowing, because these people that are stalking and stealing from older people, younger people, et cetera, in their countries,
[00:41:32] Donna Wertalik: they're viewed as heroes, because they're saving the community. It's the Robin Hood concept, taking from the rich and giving to the poor. There's police that do nothing in their countries, priests that support them. So it's so disconnected, and it is such a world where we only look at it from the US lens.
[00:41:56] Donna Wertalik: If we looked globally, and if more people saw the other side of that.
[00:43:32] France Belanger: I think we are seeing some rollback. I think the states are fighting the momentum, but there are a lot of people trying to push for privacy in the country.
[00:43:55] France Belanger: I think as long as people see privacy as a hindrance, as opposed to part of the value, it's gonna be harder. If they step back and think of it not as, oh, if I have privacy, it stops me from doing things because I have to keep things private, and instead find a better way to leverage it, I think the outcomes would be more positive and more likely. Now, I do wanna comment on these laws, though,
[00:44:25] France Belanger: because if you think of GDPR, there are lots of great things in GDPR: privacy viewed as a human right. You know, all these cookie messages that we all see now, that really annoy us, come from GDPR.
[00:44:39] France Belanger: And so the problem is, even if you have the ideas for the laws, how they're implemented can be problematic.
[00:44:49] France Belanger: And so the idea of GDPR with the cookies was to give everybody more knowledge, 'cause people didn't understand what cookies were. Still don't, but anyway. And to give them the idea that, well, you get to pick what is shared, right? And that's fine, but the way it's implemented is "accept all" or "reject all." And we know that "reject all," in the minds of people, means: I won't have access to everything I want on the website.
[00:45:20] France Belanger: And so the real solution is to click "manage," and then usually the good sites, when you click manage, will have everything turned off by default except what's necessary, and then you click confirm. But that's an extra step that I have to do every single time I go to a website. Whereas if I just click "accept all," it's done.
[00:45:43] France Belanger: So what have we created by having this law? We've created even more people voluntarily agreeing to give companies everything, right? I have another example. There's a law called COPPA, the Children's Online Privacy Protection Act: you're not allowed to collect information from children below the age of 13.
[00:46:03] France Belanger: Many US companies have been fined under that law, but the implementation is: you need to be 13 to use this site, click here to show that you are 13. Well, of course my 10-year-old who wants to play a game is gonna click there. So we've got these things happening, and it's at the implementation level.
[00:46:29] France Belanger: That's why I say it needs to be, and I'm gonna go back to what Donna said way earlier, multi-pronged. We need to have companies embed privacy, we need to have the regulations, we need to have systems that value that and support understanding, and we need to educate people. And as you talk about global, the other thing is we have to understand that privacy is not understood the same way everywhere.
[00:46:53] France Belanger: There are many countries where the word doesn't even exist in the language. I speak a couple of languages, and there are some where "privacy" is not a word per se. You can say private space, you can say private something, but privacy itself doesn't exist. And in some communities, particularly in Africa, they have these very collectivist societies, like umuntu, where the self doesn't exist, right?
[00:47:20] France Belanger: You exist as part of a community, for the good of the community. And so their privacy cannot even be measured or evaluated. So we need to think about this in a much broader picture, about protecting the people. In the end, it's about protection of people, and in this case it's protection of their information. So I know I went a little bit on a tangent there, but it's something that I'm quite passionate about.
[00:47:51] Donna Wertalik: And we talk about this often. I mean, we've talked with senators, we've talked to other policymakers in terms of how this can be changed. And you know, you get really passionate, I think, and you're thinking, okay, we're gonna do a movement, we're gonna do this. And then there's a product, a small product, a kid's product, right?
[00:48:09] Donna Wertalik: Who's looking at it? Who cares? You're supposed to be nine to use it, but parents put kids on it at three, four, wherever it is. It's called Roblox. And it's having them build all these cities and all these worlds, and it's tracking every single bit of data. I actually had a friend that sent a text, and she's like, should my kid not be on this?
[00:48:26] Donna Wertalik: And I'm like, not really, not really. So when you're trying to create policies, you have these companies that are teaching young ones to build and to dream, which is amazing, but on their data and their information. I think if consumers looked at it as a functional habit and said, my privacy is like my kidney. If someone asked me, can I have your kidney? I'd have to really think about that, right? And say, who is it? Why do they need it? Are they this or are they that? But we don't have those pauses, and we're a very time-poor society. Around the world, Europeans especially, they're very different in terms of how they view time,
[00:49:07] Donna Wertalik: how they handle privacy, how they advertise, how they market. It's quite different, and there's a reason why the US, I think, is where it's at. But there are a lot of global companies that operate in the US that have to adhere to GDPR. So maybe the more we get... but I don't think we're anywhere close, sadly.
[00:49:26] Arlo: Yeah, I would agree with both of you. And it sounds like you know a lot more about this than me, so I'm gonna go ahead and take your word as the gospel here. We've taken a bit of time talking about a number of topics, and one of the things we always like to do is just remember that behind these big brains are real people.
[00:49:45] Arlo: You know, we talk a lot about privacy in kind of a finger-wagging way, oftentimes, right? Here's what you should do, here's what you should do. But the truth is, we're human. And even though we might spend our days thinking about the depths of the data and the problems and the challenges, we're still people who wanna go and share a picture or, you know, track our health. So I'm really curious, for both of you: are there any, for lack of a better word, guilty pleasures that you like to indulge in that perhaps don't support your own worldviews of privacy?
[00:50:21] Donna Wertalik: Well, I'll just say that in my professional career I'm a marketer, and so just by default I am on most of the platforms. I have my settings saved, and I watch that very carefully. But I don't think there's ever a day that I don't pause myself and think, wait a minute, should I be doing that?
[00:50:42] Donna Wertalik: Should I be doing this? Whether it's even a phone that's sitting next to you and recording everything, or the prompt from my Alexa, which I'm sure she'll hear right now, saying, do you wanna move to the next one? Which has a whole onslaught of privacy aspects. You find yourself going into regular consumer mode, I think, and you're like, oh, let me just check this.
[00:51:02] Donna Wertalik: Let me just... you slip. And I think you can be as good as you can be, and then there are slips. Like anything, it's psychology, it's behavior, so you just try to get back on track. That would be my telltale kind of thing that I would share.
[00:51:15] France Belanger: We live in a society where a lot of what we do is online, and a lot of benefits come from digital technologies. And so we're just people, like others, who want to have some of the benefits. And you know, in our academic world there's this idea of the privacy calculus, right? The costs and benefits. The costs are everything we've talked about, but the benefits are there.
[00:51:42] France Belanger: And so I've got several things I do that, as a privacy researcher, are not what I would recommend doing. But it's part of my life. I'll give an example. When I drive on the road, I will use either Waze or Google Maps. Well, that means those apps know a lot about where I am day to day. Now, I personally always give my neighbors' addresses, and I switch it every time, so I never leave from the same address.
[00:52:17] France Belanger: That's just, you know, something I do for a little protection. But they know that I go to the Roanoke Airport in Virginia a lot, and so that, you know, is a lot of information about me. They know if I drive too fast. So they know a lot of information about me. I set all the settings to not share, but they still know, they still have it, and they can still use it.
[00:52:40] France Belanger: Same thing with my Apple Watch. I like my Apple Watch, and I use it for all kinds of diagnostics, you know, health and so on. Well, that means somebody has that information, and even though their policies say they're gonna protect it, you know, it's things like that. From a bigger surveillance point of view, because I travel a lot, I have several programs for free access into countries: Global Entry, NEXUS, and so on.
[00:53:08] France Belanger: Well, guess what? They have my face, they have my iris, they have my fingerprints. I am not anonymous, right? But it gives me something that makes my life easier. And I am aware of the issues, but yeah, I do it. You know, the only way to really disappear is to not use a credit card, not have a connected car.
[00:53:30] France Belanger: You can have an old car, you know, walk into the bank with cash, get out of the bank with cash, and still, damn it, you're still in the system somehow. Or live in the woods like you are right now, Arlo. You're in the woods.
[00:53:44] Arlo: I am hiding off in the woods protecting my privacy, yes. Except that now the whole world knows that I'm in the woods protecting my privacy. You know, often when we talk to guests about this topic, it's a little disheartening in some ways, because it's so easy to see that slippery slope that consumers can go down. And knowing that even folks like us, who spend their time thinking about these topics, still struggle with those decisions and that consent,
[00:54:12] Arlo: and recognizing that the world's moving forward in this direction, and if I choose not to participate, I might kind of be stuck in the past. In some ways, it's a really complicated dynamic. So, look, I don't wanna spend the entire day interviewing you, so we will let you go. But I just wanted to let our audience know, first and foremost: if you are interested in learning more about the work that either of these amazing women have done, please go to voicesofprivacy.com. It's an awesome website, an awesome resource. There are tons of great videos from their episodes, audio clips, and you can learn all about their research and their work. So please head over to voicesofprivacy.com. And with that, France, Donna, thank you so much for joining us today.
[00:55:04] Donna Wertalik: Thank you. This was awesome.
Meet the host
Arlo Gilbert is the host of The Privacy Insider Podcast, CIO and cofounder of Osano, and author of The Privacy Insider Book. A native of Austin, Texas, he has been building software companies for more than twenty-five years in categories including telecom, payments, procurement, and compliance.