The Privacy Insider Podcast
What Businesses Get Wrong About Regulators and How to Fix Privacy Fast with Brent Homan of the Office of the Data Protection Authority
Privacy, digital rights, and government responses to new technology are under heavy pressure right now as AI scales and more of life happens online. But data protection is not a partisan or optional issue. No matter who is in power or where people live, safeguarding personal information and basic freedoms takes constant vigilance and a readiness to push back on harmful uses of tech.
Brent Homan, Commissioner of the Office of the Data Protection Authority, shares how modern enforcement works in practice. He explains why privacy is universal, how regulators move quickly through cooperation, and what privacy by design must look like as AI and youth-focused platforms expand.
Episode Highlights:
- 00:00 Introduction.
- 05:05 Many privacy leaders come from wider public-interest work.
- 13:10 Good enforcement aims for fast, fair fixes first.
- 19:39 Ongoing regulator-business dialogue prevents bigger harms.
- 24:54 Cross-border teamwork strengthens oversight.
- 28:14 Youth privacy improves when adults listen and guide.
- 34:32 Digital safety grows through habits and learning.
- 39:23 Privacy supports informed choice, not secrecy.
- 46:13 Privacy-by-design matters most as AI expands.
[00:00:00] Brent Homan: Privacy is a fundamental right that transcends socioeconomic boundaries and, uh, levels of status. I thought, okay,
[00:00:08] this is really universal. It's inherent in people's DNA.
[00:00:35] Arlo: Hello, and welcome to the Privacy Insider Podcast. My name is Arlo Gilbert. I'm the Chief Innovation Officer and founder at Osano. But today I'm your host. We are thrilled to welcome Brent Homan, the Data Protection Commissioner for the Bailiwick of Guernsey. Brent brings over 20 years of experience in privacy and competition law, with a resume that reads like the greatest hits of global enforcement: from Facebook and Cambridge Analytica
[00:01:05] to Equifax, to Clearview AI, Tim Hortons, and even Ashley Madison's breach. He is a pioneer of international cooperation, a co-founder of the GPEN privacy sweep, and a driving force in regulation of the digital economy. From Canada's top privacy cases to leading Guernsey through major challenges, Brent is shaping the future of trust and compliance worldwide.
[00:01:31] Brent, welcome to the show.
[00:01:33] Brent Homan: Arlo, thank you very much. I'm really pleased to be here today.
[00:01:36] Arlo: So, Brent, you're the first regulator we've had on here, and in some ways it feels a little bit like I'm talking to the dark side. Uh, you know, we spent a lot of time trying to help people to avoid regulators and to stay away from being on their radar by doing all the right things. But, but I got to meet you before the show, and you're not on the dark side at all.
[00:01:56] You're a really nice guy to hang out with.
[00:01:58] Brent Homan: Yeah, I hope I'm not on the dark side.
[00:02:01] Arlo: How, how does one get involved in this? Tell us just a little bit about what you do and a little bit maybe about your background.
[00:02:10] Brent Homan: Sure. Well, right now I am the Data Commissioner for Guernsey. So basically I oversee the, the data protection and privacy laws in Guernsey. I didn't start out that way. I am, uh, I'm an economist, uh, by training, and I cut my teeth in the, uh, field of antitrust when I was with the competition authority back in Canada. Uh, before I took this role here in Guernsey, I was a Deputy Privacy Commissioner of Canada, with the Office of the Privacy Commissioner of Canada.
[00:02:39] But I thought I'd come and, uh, have some fun on an island.
[00:02:43] Arlo: That is amazing. So Canada, over to another, uh, over to an island. And, uh, and Canada has had some pretty exciting times in terms of data privacy. I mean, during COVID, I recall that Canada was really on the forefront of creating a lot of the public tracking of COVID. And how do you do that while remaining compliant and protecting privacy?
[00:03:09] And, did you learn a lot at your time in Canada that you've taken to Guernsey?
[00:03:13] Brent Homan: Oh, absolutely. I spent over 10 years at the, uh, Office of the Privacy Commissioner there, and my primary role was enforcement. So I had the opportunity to oversee some very timely investigations, including against, uh, Clearview AI, which was facial recognition technology, as well as other areas in joint investigations, uh, including with Ashley Madison. Remember that website for, um, interesting dating opportunities?
[00:03:45] Arlo: Yes. I, I have, I, I remember it. I have not ever used it, but I do know people who have used it, and uh, I remember that. And the Clearview one, was that just a real, was that just a real layup? I mean, like, you kind of see what they're doing and you immediately are like, they're breaking every law. Which one do we pick?
[00:04:02] Brent Homan: Yeah. Well, what was interesting about Clearview is that it's facial recognition technology. It basically scraped billions of images around the world. So if you had a social media site or any type of presence online, it's likely that your image was in the Clearview database. And clearly, I don't believe you remember ever consenting to that.
[00:04:27] And at the end of the day, you know, what does that mean? That you find yourself in a potential lineup 24 hours a day, 365 days a year without doing anything, because they were, um, marketing this to enforcement agencies. So, um, with that, from engaging with Clearview, they were encouraged to, um, leave the Canadian market.
[00:04:51] Arlo: I like, I like the soft, the soft words there. They were encouraged to leave the Canadian market. Well, so, so you were an economist. Now I wanna know what made you wanna get into economics, 'cause that's a pretty dry field.
[00:05:05] Brent Homan: Do you know what? People say that. And that's what I thought before I got hooked. But I was really into economic philosophy, the inclusion of, uh, looking at rights and ethics. And so what I found is that economics was a great way to understand the way the world works. It's a great way to kind of hone your skills with respect to analytics, and really it's even a great way to learn skills, like how to buy a car without, again, being taken for a ride. So I realized, after doing a degree in, uh, political science, that I was actually good at economics. So that's what I focused on for, um, my graduate studies.
[00:05:48] I became an economist, and then my first job was as an antitrust regulator, reviewing, uh, mergers, first with the, uh, Competition Bureau of Canada. I was eventually an assistant deputy commissioner there. But privacy, I guess I gotta admit, I didn't think much about privacy, Arlo, before I heard about this.
[00:06:06] And then there was a, um, headhunter that came around and talked to me about this opportunity at the Office of the Privacy Commissioner of Canada. And I thought, wow, well, privacy is an emerging field. It's interesting. Sounds like a good gig. I think I'm gonna give it, uh, give it a shot. So that's how I kind of stumbled into it.
[00:06:27] And, and to tell you the truth, it was an experience, Arlo, in Ethiopia that solidified for me how important privacy is. I know that's a little
[00:06:39] Arlo: I, I mean, I heard Ethiopia. I have a lot of questions right now.
[00:06:43] Brent Homan: Okay. Okay. Uh, let me answer them then. In the summer of around 2013, I had the opportunity to accompany, uh, my niece, um, Rachel Homan, who's, uh, she's the current world champion in curling, you know, that sport with rocks and brooms and ice, right?
[00:07:00] And so she was the Goodwill Ambassador for the Canadian Hunger Foundation. And so I went with her on a mission to Ethiopia, to northern Ethiopia. This had nothing to do with privacy or my day job, or so I thought. Okay. So we go there, and the mission in the program was to introduce a savings culture with communities in, uh, in this area, northern Ethiopia.
[00:07:25] And while this area was green and very lush at the time, it was the same area that was subject to, um, the starvation and the droughts that happened in the eighties. So think Bob Geldof and Live Aid, that's the area. And the idea was to introduce a savings program so that the community could put away money
[00:07:50] in times of prosperity that they could draw on in times when, uh, they weren't prosperous. Again, a savings mentality that we take for granted but isn't necessarily there in subsistence areas. So this was a program where the rules were made by the community. There was a men's group that set the rules for, um, the payouts.
[00:08:16] You know, if you lost livestock or if, uh, something happened with your farm, but also with the penalties. And so I looked at that list of penalties. The lowest penalty might be being late for a meeting. Next after that, oh, you failed to make a contribution to the community fund. And then a higher penalty for causing a disruption at the meeting,
[00:08:42] Arlo: I have to interrupt you. What, what kinds of penalties are we talking
[00:08:45] Brent Homan: We're talking about things like, okay, the currency's the birr. So five birr, uh... one birr is equal to, um... or 200 birr is equal to one pound. And, and I can't do the conversion in my head. So, uh, probably it's equal to, uh, a buck fifty US, is
[00:09:03] Arlo: So the, so they use financial penalties. I didn't know if perhaps these were like societal or you have to go stand outside for a few days or something
[00:09:11] Brent Homan: No, they were financial penalties. And so it was interesting. So I was looking at that and then looking at the list, and do you know what the highest penalty was? It was what they would have to pay for revealing the personal information of a member of the community savings group. And that, Arlo, was when I had my first, oh,
[00:09:34] privacy epiphany, realizing that privacy isn't just a luxury good in the Western world. No, no, no. Privacy is a fundamental right that transcends, uh, socioeconomic boundaries and, uh, levels of status. I thought, okay,
[00:09:52] this is really universal. It's inherent in, uh, in people's DNA.
[00:09:57] Arlo: I love it. So you were, you were touched, I mean, you were touched, you found, you found a calling in the strangest and most, you know, serendipitous of ways.
[00:10:06] Brent Homan: And, and Arlo, that's when I also realized that I was just so fortunate to join a community, a profession, that was becoming so important at the time, that being the privacy field. And I know you talked earlier about, you know, the dark side versus the light, but you know what? We're all on the same side.
[00:10:23] 'cause if you talk to privacy professionals in the private sector as well as regulators, um, they're all privacy geeks and they're all about thinking about how do we advance this fundamentally important human right?
[00:10:37] Arlo: Well, this is a great transition because, you know, I, I was entirely kidding about the dark side, but not really. Right? I mean, as, as somebody who's in the world of privacy, I understand that, you know, the privacy commissioners and the data protection groups, they're completely, they're looking out for me.
[00:10:54] Uh, they're trying to take care of me and my rights and all those good things. Now, of course, if you're a business who's been fined, you probably don't look at the regulators with quite the same halo effect. Um, but I am very grateful, uh, that there are regulators out there who are starting to enforce, and I'm so happy to see that we're seeing an increase in enforcement.
[00:11:15] Um, you know, there was a while where we all wondered, are these laws actually gonna have any teeth, or is it just gonna be paper that we don't really do much with? And so thank you for, for being part of the regime that's helping to make sure that our rights are protected.
[00:11:29] Brent Homan: Can, can I just say, Arlo, on that, in terms of enforcement and
[00:11:31] fines? We just recently issued a fine to a medical, uh, service group in Guernsey, and I think there's a way to do fines and enforcement that really, absolutely promotes privacy at the same time. In this situation, we fined the organization a hundred thousand pounds,
[00:11:53] but what we did is we said, hey, if you are able to successfully implement this action plan that elevates the level of data protection within a year, you can forget about paying the last 25,000. So creating an incentive, you know, from an enforcement action, that leads to an overall elevation of the level of services.
[00:12:23] And with this one group, with this medical group, it was critical for services on, uh, on the island of Guernsey.
[00:12:31] Arlo: You know, it is interesting. We have seen, and I don't want to derail the conversation 'cause I, I wanna understand more about what it's really like to be a, a privacy commissioner. But I'm, I'm interested: when you guys identify a violation, uh, you know, you say, hey, we, we just found this medical device company.
[00:12:50] We got a couple reports from, you know, citizens who told us they thought they might have been in violation or, or maybe we discovered it on our own,
[00:12:57] Brent Homan: Or a
[00:12:58] Arlo: and
[00:12:58] Brent Homan: a breach report.
[00:12:59] Arlo: or there was a breach report. Right. A lot of ways they can get on your radar. Do you just immediately send out a, an invoice?
[00:13:07] Brent Homan: No, no,
[00:13:08] Arlo: Well tell like what's the
[00:13:09] process like?
[00:13:09] What
[00:13:10] Brent Homan: Okay. Well, would, would you like to know, uh, something, Arlo? The majority of the actions we take in enforcement are about trying to find solutions outside of the formal process. Okay? Because not every breach report represents a great risk, and there's always an opportunity to say, okay, can we perhaps make adjustments here and make adjustments there?
[00:13:37] And often we find solutions outside of the formal investigative process, because to tell you the truth, at the end of the day, our objective is to get the best possible outcome for not only the citizens and their data rights, but, uh, for the company as well, to get them in a better, in a better position.
[00:13:58] But it's only when we're talking about the highest risks, perhaps the compromise of highly sensitive special data, health data, perhaps financial data, and where we see a systemic problem, systemic risks, that we think, hey, you know, we have to take a little bit more formal action here. So we try to be agile.
[00:14:19] Okay. You know, if we can, if we can mediate a result and get a positive result without going down the formal route, we'll do that. Um, because we have limited resources and unlimited demands on those resources. So we're trying to, you know, we're, we're not holding a hammer and thinking that everything is a nail.
[00:14:41] Arlo: I love that. You know, I, I think it's, it's unique in the sense that most businesses, you know, when you're in business school or when you're learning how to be a business person, you know the idea that if a regulator comes knocking on your door, it usually means the beginning of some horrific,
[00:15:00] multi-year, painful thing that you're gonna have to spend lots of money on. And I think a, I think a lot of companies and general counsels out there have been really trained that the moment a regulator talks to you, immediately call the most expensive law firm that you can and begin building walls.
[00:15:18] Right. Because they've, they've learned through groups like the SEC, right? If the SEC comes knocking, you're in for some misery. But it sounds to me, and, and from what I've read about other data privacy groups, that that's not really how it works with most data privacy groups. They're really just trying to fix the problems.
[00:15:34] They're not trying to go out and, you know, make huge amounts of money for their state. They're just trying to protect their citizens, and I think that's an important, an important message for every listener if you're not a privacy pro: if the regulators call, just pick up the phone and talk to these nice people and, and ask what they want, because it's not like talking to the SEC.
[00:15:54] Brent Homan: No, no, it isn't. And, and, uh, we're one of those jurisdictions that has mandatory, uh, breach reporting requirements, as do many around the world. And the majority of the breach reports that we get end up in guidance and conversations, helping them to identify what the harms are and helping them to shore up their security safeguards.
[00:16:16] It's only, uh, certain ones that end up with a formal investigation. And I think part of that, Arlo, you know, you've, uh, mentioned, you've noticed that it's a little bit different in the privacy world. Also think about the revolutionary pace of technological development and innovation and how it impacts privacy.
[00:16:37] So you have to be agile. You have to do what you can to find solutions that don't result in a four-year investigation where the solution becomes irrelevant to the problem. So if there's an opportunity to solve that problem early on, swiftly, then take it.
[00:16:55] Arlo: Good advice. Well, you know, we, we didn't get to talk about Guernsey. You know, I, I kind of jumped right into enforcement actions and what it's like to be on the dark side. But, but let's step back. I, I don't know much about Guernsey, and I'm guessing a lot of our audience does not as well. I'd, I'd love to learn a little bit about Guernsey, you know, where is it, what is it like there.
[00:17:20] What does your daily work involve? I don't think most people really have an understanding of what the daily life of a commissioner is.
[00:17:27] Brent Homan: Sure, I can. I can take you through that. Well, first, first, Guernsey. Guernsey is this beautiful, rocky, scenic island in the English Channel, just off the coast of, uh, of France. And it's one of the, it's a global financial hub. It's very much a vibrant hub with a thriving financial services sector,
[00:17:50] a high global standard. So it's kind of like a mix between traditional British life and this very much high-tech financial sector as well. That's the main driver of the economy. 60% of the economy is the financial services sector. And so that's kind of what it's, what it's like. In terms of the, the day-to-day life of a commissioner:
[00:18:14] Hey, you know, usually what I do is I get up early in the morning and I spend a bit of, uh, meditative or spiritual, kind of like reflective time, uh, you know, just kind of center yourself in the morning, before I, uh, pack my 10-year-old daughter into the Range Rover. So, uh, we go and drop her off at school.
[00:18:34] And then, you know, uh, although there's no typical day as a commissioner
[00:18:39] here, um, I'll give you a couple things that often do happen. We are very fortunate here to have a very engaged media who is interested in data protection, and, unfortunately, they realize that I'm available and that I will make myself available early in the morning.
[00:18:57] So, often in the morning I might be on, um, on a BBC radio show, uh, at seven thirty when people drive into work. And that can be talking about the latest enforcement action, or it can be talking about, you know, what are great tips to avoid becoming a victim of, um, of identity theft. And then at work,
[00:19:18] I think one of the advantages of being on an island like this is its intimacy. It doesn't take long to get from one place to the other. So, you know, the power of having one-on-one discussions with organizations, and we regulate the public sector and the private sector. Hey, you know, forget about sending emails back and forth.
[00:19:39] Let's get people in the room together and let's have real conversations and let's answer questions in real time. So, so I think we really leverage that intimacy of the island. And I would say another thing that's really important when it comes to, uh, a smaller data protection regulator is that you do have to leverage the power of partnership,
[00:20:01] and that means collaborating with our international data protection, uh, partners, our privacy regulators, because I think, you know, with partnership, it kind of expands your, your capacity to take action, and then it amplifies the impacts of your actions 'cause you're speaking together. We've taken a very prominent role in that area.
[00:20:20] We're, um, a co-chair of the International Enforcement Working Group of the Global Privacy Assembly. And we're also a, uh, co-coordinator of, uh, the global privacy sweep that happens every year. We just finished it a couple of, um, a couple of weeks ago, and this year's theme was protecting children.
[00:20:43] So the whole idea is taking action globally to have a positive effect, not only globally, but as well in protecting our children on the island.
[00:20:53] Arlo: All right. I have, I have a couple of questions here. So, uh, I know in the States we've had, uh, a number of states that have banded together to collectively enforce, because they said, look, whether you're in California or Colorado, the actions you're taking are impacting people in our state too. And I'm imagining that in the, uh, European Economic Area that kind of collaboration also exists.
[00:21:18] Is that, is that the case? You're, you know, you're talking to the French authorities or the German authorities or the British authorities and, and kind of coordinating, because this breach or this action that might, might be violating privacy, yes, it's impacting people in Guernsey, but it's also impacting people
[00:21:35] on the next rock. So how do you, how do you prioritize and how do you decide? Do you, do you coordinate as enforcers and say, we'll take this one, uh, you go take that one, or let's do it together? How does that work?
[00:21:48] Brent Homan: Well, how, how about this? I think the global privacy sweep is a good example of partnership and coordination at its best. And it's, it's one of those, it's a less formal enforcement action, but, um, it involves data protection authorities and privacy authorities from around the world. So we're talking, uh, I, I think close to 30 or 40 countries from around the world participated in this year's sweep.
[00:22:13] And the idea of it was, for one day or one week, focus on an issue that has been identified as a shared global risk. And so this year it was about looking at websites that are either popular with children or targeted at children, because there can be a slight nuance with that. And then, so what we would do is go on.
[00:22:37] We'd replicate the consumer experience. Like, okay, go on, see how hard it is to get past the age, uh, age authentication, uh, see if there's good parental controls, uh, with these websites, and just really kind of get a sense of whether, whether they are privacy protective for children or not.
[00:22:58] So that's what happened. And right now, what's happening is that, as the, as a coordinator, we're getting all the results from all the countries around the world, and we're gonna take a look at them, we're gonna identify trends, and then we'll globally, uh, share our observations. So, uh, there'll be coordinated, um, press statements around the world identifying here are some of the issues that we saw, here are some of the good examples. And by doing that, by speaking with a united voice, it helps to amplify. It helps to kind of really hit home the issues. And, and I'll tell you what we often do in these sweeps that is helpful, uh, after the fact, and we did this last year as well, is that when we see, okay, well, we swept 20 websites,
[00:23:47] 16 of them had issues, we write to those and we say, hey, we conducted the sweep, here's some of the issues we saw. You know, we saw that the privacy policy was, uh, was 20,000 pages long. We, we saw that there was no way to actually, um, log off or stop being a user. We saw, you know, whatever the issues may be.
[00:24:11] And then we invite that organization to, um, say, hey, could you respond to us within, uh, two months and let us know what you're doing in order to address the issues? And that, Arlo, is the magic, because, uh, what we found is that organizations are very receptive, especially when it's a global, um, effort.
[00:24:33] And last year, our global privacy sweep looked at gaming sites and dark patterns. And you know what? 100% compliance. We got back all these responses saying, hey, yes, we'll do this, we'll adjust this, we'll do this. And that was for us, for our data protection authority. But think about what that means,
[00:24:54] 'cause you talked about enforcement. If we had investigated those organizations, it would've taken years and would've cost resources for ourselves and the organizations. So we were able to achieve in a matter of a few months what would've cost hundreds of thousands of pounds and years in order to get an outcome.
[00:25:18] So that, Arlo, is the power of partnership, and, you know, finding a way to get results, um, that benefit people's rights this year rather than too far down the road.
[00:25:30] Arlo: I love that. That's, uh, that's a, it's a really smart approach. Now, you, you've been talking a little about these sweeps and I, and I wanna talk more about the sweeps and about, about children's privacy. But this is an area of interest to you, right? I mean, children's privacy is a, a, an area of passion. And when we chatted prior to the show, you shared, you shared that, you know, you, you spend time helping to educate.
you know, being a commissioner isn't just about the stick; you're actually out there talking to kids and families, and I'd love to understand where this
[00:25:59] passion came from.
[00:26:00] Brent Homan: Well, it's, it's, this passion is not only in the world of privacy. I had this passion when it came to consumer protection as well, in terms of, um, focusing on protecting, uh, children's, uh, rights. And, and I could actually point back to a period of time, a point in time, when I was in Canada and talking to a youth group up in the Yukon.
[00:26:30] Okay, so this might have been, I'd say, about eight years ago. And I was on a cross-country tour talking to, um, uh, different chambers of commerce. And then, uh, I was up in the Yukon and had an opportunity to talk with a youth group. So I thought, great. I went in there and I was all ready and, you know, I was gonna talk to them about all the dangers of, you know, be careful online,
[00:26:58] the dangers of, um, of bullying, uh, the dangers of, um, engaging with people that have bad intentions. And then, Arlo, um, as I was telling them the standard don't, don't, don't, here's what you gotta think about, I noticed I was losing them. And I, and I saw the leader of that youth group, kind of like, his, his, uh, face was going blank.
[00:27:24] And I, I can usually keep people's attention. So I just stopped there and I said, okay, it's clear that you're not picking up what I'm throwing down. So what's the issue? So, so let's, let's have a real conversation. You know, how is it that you feel about this issue? And the gentleman was very respectful, but what he said is, hey,
[00:27:49] we've heard about online bullying, we've heard about the dangers, that's all we hear about. But we live in a small community that is in the high north of Canada, and if there's any opportunity for us to kind of interact with people in other parts of the world, it's online. So it's actually something that is really key for us to develop.
[00:28:14] And then that's when I thought, wow, protecting children isn't just about telling them what to do; really it's about listening and hearing what their interests are. And that's how we thought, okay, well, it's of limited help if you just say don't, don't, don't. What you want to do is allow children and empower children to benefit from
[00:28:36] everything that the online and digital world has to offer, while at the same time identifying and helping them to, um, guard against the risks.
[00:28:46] And that's the program that we have in Guernsey. So we go into schools and we talk to children, and we talk, in a balanced way, about: here are the powers of AI, but here are the issues with deepfakes, and here's how you should, um, guard against, um, sharing, uh, information. And more importantly, actually, uh, what we did is innovate, and thought that, yeah, it's great that we talk to children.
[00:29:13] It's nice that we talk to teachers. It's nice that we talk to leaders, all these people, about protecting children. But then we realized we're missing someone important, and that's the parent. Because everyone always points to the parent. It's like, okay, well, what's the parent doing in order to protect them?
[00:29:29] And then the parent is sitting there hearing all this, all these dangers, and they don't know what to do and they're not equipped with the tools. So we've created a parents' workshop that doesn't just talk about the dangers, but actually is practical. It says, okay, let's go right into the lion's den.
[00:29:50] These are the popular apps, uh, with respect to children, that have risks that you have to be aware of. So we talk about Roblox, we talk about TikTok, we talk about Snapchat, and we give actionable advice in terms of how to protect your children on these platforms, advice that relates to all platforms in general.
[00:30:12] So that's a new thing that we're doing, and kind of part of our way of creating this, this net to support and protect our children while at the same time listening to them. It was, I believe, um, I, I forget who it was that said it takes a, takes a village to raise a child.
[00:30:34] Arlo: Oh, it takes a village. I think that might've been, that might've been a Clinton,
[00:30:38] Brent Homan: Yes. Hillary Clinton
[00:30:39] I believe. Yeah. Well, uh, well, what we say in Guernsey is that it, it takes an island to protect our children.
[00:30:45] Arlo: Well, I gotta tell you, if you, if you ever decide to come to Austin, Texas and put on a parental workshop, count me in
[00:30:52] because I, I have heard and seen plenty of those kinds of generic workshops, and it's exactly what you described. You walk away going, I didn't really learn anything. You know, this was, this was all stuff I know.
[00:31:04] But then, you know, you open up the apps and you get in and you find yourself going, wait, how do I, how do I get access to that? How do I even control this? There's a million settings in here and I'm pretty sophisticated. So that sounds like a really powerful tool.
[00:31:18] Brent Homan: Yeah. And, and not only that, I think, I think the approach that we take, and, and I deliver these workshops personally because I'm a parent, and, and I start out that way, of saying, hey, I approach this subject with humility 'cause I don't have all the answers. And if somebody tells you that they have all the answers to protect children, they're feeding you a line.
[00:31:39] But what I do have is some information, information that I know as a, as a privacy expert and information I know from making all the mistakes that a parent and a dad can make. So, let's talk about that, and I also wanna hear about your experience as well. You know, creating that intimate environment to allow for having real conversations, not just handing out a leaflet and saying, good luck.
[00:32:03] Arlo: Well, you know, since, since you, you opened the Pandora's box on kind of mobile and social, I, I would love to understand a little bit more. I mean, there's, there's been a movement kind of occurring globally. People are calling for social media and mobile phone bans. You know, as you're thinking about these parental workshops and you start thinking about a more global viewpoint, what do you think about these, these calls to action?
[00:32:28] Brent Homan: That's a very interesting, um, topic, Arlo, and, and you're right, it is a movement. It's something that's being considered. There are certain jurisdictions where it's been implemented: in Australia, they've implemented a sort of, um, a ban. And it's also being considered in other jurisdictions. And my personal opinion is that I have complete respect for the motivation behind that. You know, uh, there are a lot of very unfortunate incidents that result in, in, in children, uh, being harmed online. But I don't think that a ban is the answer. In fact, an outright ban on social media or mobile phones for children?
[00:33:12] I think the risk is that it just drives them to different areas, uh, accessing it in a more clandestine, uh, manner. And in fact, then all of a sudden they don't even have the benefit of the protections of having, uh, their, their parents around.
[00:33:27] Uh, I, I,
[00:33:28] Arlo: Like the Prohibition era,
[00:33:30] uh, alcohol. You know, you're drinking bathtub gin instead of something that went through, uh, an approval
[00:33:37] Brent Homan: Yeah,
[00:33:38] Arlo: and a quality control.
[00:33:39] Brent Homan: Yeah. Bans don't have a really good record of effectiveness. They often have a good motivation, but, I guess, think about it that, um, with children or with, uh, adults learning how to drive a car, there's usually a learning process. You, you just don't wait till they're 19, throw them the keys of the car, and say, good luck.
[00:34:03] And, and similarly, as parents, you know, don't we want to be the ones to help to show and model responsible online behavior? You know, the digital world isn't going away. It's, it's their future, and children are actually, pound for pound, smarter than us, but we have more experience, so we still have
[00:34:32] an obligation to help to raise children in the digital world. So I think it is more important to try to model and promote safe habits so that they can benefit from all the promise that the digital world has to offer while being aware of and being protected from the risks. Hey, it's not easy, but, you know, um, none of us asked to be, uh, parents in the most challenging of areas, the digital era, but here it is.
[00:35:04] So, uh, so it's, it's, it's just one other area that we need to raise our kids in.
[00:35:08] Arlo: Well, that's, that's a fascinating insight, and I, and I can't agree with you more that the bans are, are a little silly, because the fact is that as a parent, you could always enforce your own ban if you want. If you don't want your kids to have a phone, don't buy them one.
[00:35:23] Brent Homan: Exactly. And, you know, um, I, I think it's all about balance. And, and some of the things that we teach at the workshop are that you can put in place some very, very easy parental controls that set limits on times, that allow you to kind of, uh, see what your children are engaged in.
[00:35:43] And then we also suggest that, you know, show an interest, ask questions, establish trust, maybe play a game with them, get a sense of what's going on. And, and children and people usually don't react positively to the don't, don't, don't, no, no, no type of approach. But I, I gotta say that it is difficult for parents these days because, uh, they have a lot of anxiety.
[00:36:09] They're fed a lot of, of these very, um, horrific kinds of scenarios. So, realizing that, you know, they can be forgiven, we can all be forgiven for thinking, do you know what? I just wanna protect my child while they're in the home, so that nothing, nothing can happen. But, you know, I, I think that our obligations go further than that.
[00:36:31] Yeah.
[00:36:32] Arlo: Yeah, I think there's nothing in my own personal life that has caused more conflict in my family than when we turned on screen time on my teenage daughter's phone. You've just never seen anybody get that angry. Well, maybe you have, but wow, I was very surprised.
[00:36:48] Brent Homan: Oh, oh, Arlo. Uh, yeah. And it's, um, it's very revealing, and, you know, once you put on screen time, then you see how habit-forming this can be. And, and by the way, we do it as well as adults. Like, how many times are you saying, hey, don't do this, don't do this, and you realize that you're doomscrolling on LinkedIn, and then next thing you know, it's 45 minutes in and you haven't done anything and you haven't done the dishes? Yeah, no, no, abs-
[00:37:15] Absolutely. So, uh, so I think that that's a, uh, that's a good point to make. And, uh, one pointer that we give is that, um, when you are establishing limits, hey, do it with your children and help them to set the limits. What's reasonable for bedtime? And what's most important, though, is that you then stick by them.
[00:37:39] Don't negotiate. Like, like, Arlo, I am a negotiator. Like, part of my job historically has been to negotiate, and I have had an opportunity to negotiate with some of the, some large platforms and organizations in the world. And I've had a fairly good record. But you know what? I lose every time in negotiating against my wife or my child.
[00:38:00] So it's like, five extra minutes, 10 extra minutes, are you sure? Well, I just need to finish this video. And, and if you do that, then, like, uh, I know I lose every time, so I just have to stick to the hard, fast rule.
[00:38:13] Arlo: That's awesome advice. And you know, we, we've talked a lot about coming at things from the angle of, uh, of a privacy regulator, right? You know, you're, you're teaching, you're educating, you're enforcing, you're collaborating, you're partnering. But that's all from the enforcement side. And the thing is that you're an enforcer by day, but when you, when you walk out that door, you're a dad.
[00:38:39] You're just a guy. And like all of us, you know, you, you also probably use social media or, you know, engage in these things. And so I, I'm really interested to understand, you know, we, we often say, you know, do as I say, not as I do. And I'm guilty, of course, of my own, my own habits. But I'm curious, as, as a regulator, do you have any, any kind of, you know, do as I say, not as I do kind of moments?
[00:39:05] Brent Homan: Do you know what, Arlo, I, I might have a little bit of a different take on that, 'cause I, I think I know where you're going. You know, the whole idea of we're privacy champions, so do we protect all our, all our information? Do we not share anything? Do we, you know, do we sit on the island and, and... I don't.
[00:39:23] I actually probably decide to share more information, uh, whether it be with an app or with a website, than you'd typically expect. But you know what? I don't think that amounts to a privacy sin, because privacy isn't about secrecy. It isn't about withholding everything. Privacy is about control and having the agency to exercise that control. It's about having control over your information.
[00:39:51] So, even if I choose, do you know what, I want to share my location with this app because it's gonna help get the pizza to my door quickly, or I want to, um, share my interests because I want to know more about, uh, grooming pugs because I have a pug, then I'm exercising that control. I'm not, I'm not committing a privacy sin by sharing information. I'm saying,
[00:40:18] I'm making an informed decision that yes, I'm going to share this information, and that's, um, that's part of my agency. The problem is if, um, I choose not to share such information, and then I find out later that it's been scraped. But, um, so anyway, in a roundabout way, Arlo, I don't feel guilty that I do share information sometimes, but it's certainly important for me to have control over what I share and what I don't.
[00:40:50] Arlo: That is a wonderful position. And I think, you know, as I, as I think about this, I realize that, look, you know, sweeps don't only happen because the office decides to sit down and go through complaints or they have a particular list. I've, I've heard from other folks that sometimes, sometimes enforcement actions happen because
[00:41:13] the privacy regulator is using the product and they discover something horrible while they're using it, and they're like, ah, I really didn't wanna have to bring my work home with me, but, you know, how do I not look at this? Does that happen to you?
[00:41:27] Brent Homan: Oh, Arlo, I.
have a great example of that. That's a real, I'd say, success story on this island. This wasn't because of a complaint, but we learned that there was a new health program. There's a health organization on the island whose objective was to improve the health of islanders, including children.
[00:41:48] Right? And so they introduced, Arlo, a health wristband program into schools. So this is like a wristband tracker, seeing how many steps the, uh, the kids are taking and, uh, tracking them to help kind of, um, get some data to target for a kind of improvement program. So very,
[00:42:12] very laudable goals.
[00:42:14] This is all about improving the health of children. We heard about it because some of us have children, and some of these, uh, some of these notifications were coming out.
[00:42:25] But when I looked at it, we realized, okay, well, this is about tracking, uh, or collecting, uh, information on children. Steps is still health information.
[00:42:36] So that's special category data, that's sensitive, and it's children. And we realized that this was on an opt-out rather than an opt-in basis. So we kind of engaged right up front with the organization, and we said, we, we recognize the laudable kind of objectives of this, but there are certain areas that we, uh,
[00:43:02] are looking for you to address. And this was, we did this outside of investigation. So the health organization, the health improvement organization, was really quick to engage with us and say, oh, okay, we didn't think about that. And, and in a very short period of time, we engaged with them and schools to ensure that this was on an opt-in, uh, rather than an opt-out basis.
[00:43:28] The transparency was better. We understood what was happening with this health wristband. So, so we got a positive result, where if we had investigated, we'd probably have interrupted a very, very useful, uh, and beneficial program. Uh, but what's interesting is it didn't stop there. We looked at the wristband technology and realized that, while it was being introduced on the island,
[00:43:57] it was done by a company that was selling it to, um, schools all across the United Kingdom. 1,500 schools across the United Kingdom. Okay. And they weren't responsible for the, for the communications like we did on the island, but we thought, hey, why don't we kind of, like, take a look at the product itself?
[00:44:15] And so we engaged with that company and we looked at it. To tell you the truth, a lot of it was in pretty good shape, but we saw opportunities for better transparency about what's happening with the wristband, and so they agreed to make changes. At the end of the day, this little issue that happened on the island that we became aware of,
[00:44:36] it resulted in positive privacy changes, uh, not only on the island, but to the benefit of 1,500 schools across the United Kingdom. So that's how you can be a small island having a supersized, positive effect. And again, because we looked at that through an informal measure outside of investigation, we were able to get a result quickly without, uh, interrupting a very, uh, beneficial health program for children.
[00:45:06] Arlo: So I guess we would call you a privacy superhero. You know, we've talked a lot about a lot of different topics. We talked about your background, we've talked about Canada, Guernsey, children's privacy, Ethiopia, uh, really running the, running the gamut.
[00:45:21] But what we haven't talked about is the future, and I'm really curious to understand, as a regulator, what keeps you up at night right now? I mean, you obviously have existing privacy regulations. You, you've got a, a duty and a clear sense of, of mission to go and, and resolve these things, hopefully in an amicable manner.
[00:45:41] But what do you worry about in terms of the go-forward?
[00:45:44] Brent Homan: Okay. Right now, this is what's keeping me up at night, and, and it's, it's common in terms of a theme, but, um, I have a very specific take on it. It relates to AI and, basically, the frenetic and urgent pace, uh, at which public and private sector organizations are feeling the need to adopt AI, knowing that they need to think about how to do it safely, but not necessarily knowing
[00:46:13] what safety looks like. So you combine that with what I see as the myth that privacy and innovation work at cross purposes with each other, because I think that the only way that you can have a successful, enduring innovation is if you build in privacy from the get-go. And I guess the concern is that there is this very, very, um,
[00:46:40] fundamental feeling amongst organizations that they need to, to adopt AI without going through what they need to think about. And so I think it's our obligation to, uh, help them realize the power, and it's really
[00:46:54] powerful. AI, you know, it, it has done wonders for bringing medical support to parts of the globe that traditionally haven't had it, for bringing education, for helping to guard against, uh, crop failures.
[00:47:10] But, uh, I think what's important, as regulators, is that we find a way to ensure that they can adopt it and implement it into their systems in a responsible manner, rather than MacGyver it into their, uh, business model in a haphazard manner. You know, we want them to be successful with this very powerful product.
[00:47:34] So I think, to me, that's something that I'm thinking about a lot, Arlo, is how can we best assist organizations in the successful implementation of AI, this powerful tool. And, and that's why we're looking at, uh, and we have been kind of releasing guidance in the area, to try to say, okay, well,
[00:47:56] this is what you need to think about and this is what you need to guard against, and, and just to help people along that journey. Because I, I think the sense I'm getting is that everyone's rushing, like, okay, we need to adopt it, we need to adopt AI. And they're like, okay, well, what does that look like?
[00:48:12] And, and it's one of those things where, as powerful as it is, you don't want to implement it the wrong way. So how can we encourage people not to have too much of a panic of implementing it and then figuring out what they did after? I think that's, that's the big challenge that not only, uh, myself, but many regulators are, uh, having, because, you know, we want to support that powerful technology, uh, but obviously in a way that mitigates, uh, the potential harms it can have for, uh, people's, uh, rights.
[00:48:46] Arlo: That was amazing. Well, thank you so much for joining us today, Brent. This has been an incredible episode. I know I've learned a lot. I hope our audience has learned a lot, and I hope you've had a nice time joining us.
[00:48:56] Brent Homan: I've had a great time, Arlo. Thank you very much for, uh, having me on.
[00:49:00] Arlo: Well, folks, if you wanna learn more about Brent's work or you wanna hear more of his thoughts, please tune in to the ODPA Guernsey podcast. Just type in "Guernsey podcast ODPA." There's probably not a lot of them out there, and you can find it on SoundCloud. They do some really interesting stuff. Uh, they, they've done some deep historical dives on privacy back to the Roman era, so tune in, and I'm certain that it will blow your mind.
[00:49:24] Brent, thanks for being here. And for all our guests, have a wonderful day.
Meet the host
Arlo Gilbert is the host of The Privacy Insider Podcast, CIO and cofounder of Osano, and author of The Privacy Insider Book. A native of Austin, Texas, he has been building software companies for more than twenty-five years in categories including telecom, payments, procurement, and compliance.