The Privacy Insider Podcast
The Elephant in the Chatroom: Preserving Privacy and Social Connection with Christine Rosen of The American Enterprise Institute
As we navigate the complex landscape of technology and its impact on society, considering how our digital interactions shape our identities and communities is crucial. The shift towards virtual third spaces—like social media and online platforms—has transformed how we connect and share personal data. This evolution raises important questions about privacy, community building and the future of human interaction.
About Our Guest
Christine Rosen, Fellow of The American Enterprise Institute, explores these themes in her work. With a background in American history, society, and culture, Christine offers unique insights into how technology influences human behavior.
Episode Highlights:
- (07:37) Technology makes things easier but not always better.
- (10:08) Rapid technological adoption challenges societal adaptation.
- (13:41) We've traded deep, messy human experiences for convenience—and barely noticed.
- (16:44) Traditional skills are being lost due to technological advancements.
- (20:34) Concerns about technology replacing human connections.
- (28:45) Technology influences identity formation in young people.
- (40:07) Reviving face-to-face interactions is crucial for well-being.
- (53:48) The need for new community spaces in a digital world.
- (56:07) When no one can explain how a system works, people turn to stories that pretend to.
Episode Resources:
0:02
Hi everybody, this is Arlo Gilbert, Co-Founder and CEO of Osano, a leading data privacy management platform, and you are listening to the Privacy Insider Podcast.
0:13
This show explores the past, present, and future of data privacy for privacy and business leaders alike, as well as anyone who wants to keep privacy top of mind.
0:27
Welcome to the Privacy Insider podcast.
0:30
When was the last time that you had an in-person social interaction?
0:35
When was the last time your kids had an in-person social interaction?
0:40
If thinking about it makes you anxious (it makes me anxious), then this podcast is for you.
0:47
Do you know what a third space is?
0:50
You've been in many of them.
0:52
Third spaces figure prominently in our society.
0:55
They are the places we go to interact outside of our home and our work.
1:01
Things like coffee shops, game nights, parks, restaurants, our friends houses.
1:06
They are where we make connections, find community, and even create our identities.
1:12
But more and more, our most common third spaces don't have four walls, grass, or food and drink.
1:19
They're virtual: social media platforms, gaming sites, dating apps, and more. And we don't act the same way, and we don't connect the same way, on these platforms.
1:32
That also changes what personal data we share, how we share it, and who has access to it.
1:38
What are the consequences, and can we find real connection while keeping our data safe?
1:44
Our guest, Doctor Christine Rosen, thinks about this a lot.
1:48
She wrote a terrific book about it, The Extinction of Experience.
1:52
Christine is a Senior Fellow at the American Enterprise Institute, where she focuses on American history, society and culture, technology and culture, and feminism.
2:02
She has written books about the digital experience, about pop culture and a memoir about growing up in a fundamentalist Christian community.
2:12
Her opinion pieces, articles and reviews have appeared in the Christian Science Monitor, the LA Times, National Review, MIT Technology Review, the New York Times, the Washington Post, Politico, the Wall Street Journal and the New England Journal of Medicine.
2:27
Whoo, I am almost out of breath.
2:30
And that's the short list.
2:32
She's a great thinker and writer, and we are really excited to have her here.
2:36
Christine, welcome to the show.
2:39
Thanks, Arlo.
2:39
I'm really glad to be here.
2:42
Well, let's get started.
2:43
You know, we really are excited to have you here today.
2:46
You know, the topics that you talk about are near and dear to my heart.
2:50
And I think your background is absolutely fascinating.
2:53
Would you mind just telling us a little bit about yourself?
2:56
I mean, starting from the beginning, not starting from, like, age 20.
2:59
Let's hear it from the beginning.
3:00
I want to know who you are.
3:02
Well, I was born and raised in Saint Petersburg, in the great state of Florida, the wacky, weird place that I still love.
3:09
And I went to very fundamentalist Christian schools K through 12, and then went to the University of South Florida on a bassoon scholarship where I studied history.
3:19
Just absolutely fell in love with history.
3:21
So went on to get a PhD in history at Emory University in Atlanta.
3:26
And while a grad student, I moved to our nation's capital, Washington, DC, to do research, thinking I would be there for a year or two and then move on.
3:33
And I ended up staying. I started to work for research institutes and think tanks, doing some ghostwriting and speechwriting, and with a bunch of friends I founded a journal about technology and culture called The New Atlantis.
3:49
And along the way, just discovered that I was fascinated by how technology changes human behavior and some of the ways that we try to adapt, some of the ways we don't adapt but should.
4:00
And, and so now I'm a fellow at the American Enterprise Institute.
4:06
I've raised two boys here in Washington, DC, who are now off in college.
4:11
And it's just been, it's all gone by very quickly.
4:14
But I love my work, and I have wonderful colleagues who don't always agree with me but push back in the most civil and wonderful way whenever I write something that they don't agree with.
4:28
Well, healthy conflict is a lost art these days.
4:31
Before we talk about your work, I do have one really important topic to dive into, which is the bassoon scholarship.
4:39
Tell us about playing the bassoon.
4:41
So the bassoon, for any of your listeners who are unfamiliar, is perhaps the most comical instrument in the woodwind family.
4:48
And it was the only instrument that no one played at my school.
4:52
And so for some reason, my intrepid band director decided this is the kid weird enough to try to play this thing.
4:58
I was 8 years old.
4:59
The bassoon was bigger than I was, but boy, I just saw it as this amazing challenge, which it was; you know, it's like 50-some-odd keys.
5:06
And so I just embraced it.
5:08
And God bless my parents, who had to listen to me learn to play the bassoon before I got good enough to make it sound like something that wasn't a duck expiring.
5:19
So I, yeah.
5:20
And I just, I loved it.
5:21
I love playing and I still play.
5:22
One of my sons is a cellist, so I forced him to play duets with me.
5:26
But it paved my way through college.
5:28
It opened up a number of wonderful opportunities to meet other musicians.
5:31
I had some fantastic teachers along the way.
5:34
So it was a very early part of my life and remains one.
5:38
And I write a little bit about it in the book, in terms of just how your mind and your body fuse with an instrument if you start playing it at a young age, and what that experience is like.
5:50
I love that.
5:51
So you didn't just play the bassoon.
5:54
I mean, you really played the bassoon, because, I mean, I knew plenty of people in college who were in music, and the folks in college who played tended to be pretty passionate and pretty excited about the topic.
6:07
Whereas in high school it was often a social experiment.
6:10
It was mainly my side hustle when I was an undergraduate, because, you know, there aren't that many bassoons in an orchestra, and one of my teachers at the time was a bassoonist in the Florida Orchestra.
6:23
So I auditioned and was able to be a fill in bassoonist.
6:26
So it was amazing money for an undergraduate, and it was a wonderful experience to play with a professional orchestra and to really see the rigor and have to go to all the practices. And I entered concerto competitions where, again, the judges were used to these incredible prodigies on the piano and the violin coming out and playing beautifully.
6:44
And I would march out there and sit down with my bassoon.
6:47
And I think the shock of it all, I actually won one of the contests.
6:51
I think they couldn't believe someone was playing a bassoon solo.
6:53
So I've always appreciated how odd an instrument it is, but it has offered opportunities in a way that I think people who play the violin or other, more common instruments don't get, because it is a weird thing.
7:06
So it says obviously something about my temperament and personality that I just absolutely love the weirdest instrument.
7:13
Golly.
7:14
Well, tell us.
7:16
I'd love to know a little bit about the work that you're doing these days.
7:22
You know, where is your focus, and how did you end up becoming interested in this topic?
7:28
So I did my doctoral work studying the history of genetic science and the ways that human beings have tried to manipulate human breeding and, you know, control others' breeding.
7:40
I studied the eugenics movement, and from that interest I also did some work in early bioethics.
7:49
At each step along the way, the question that seems to fuel my curiosity is: what do human beings think that they can become?
7:56
What do we understand to be permanent parts of what it means to be human?
8:00
Do we have a human nature?
8:01
If so, how do we define that?
8:03
When people disagree about what we can and should improve or change, how do we resolve those conflicts, whether they're about reproduction, whether they're about, you know, technological improvements to our human bodies?
8:15
And at each point, I would find that some of our most overlooked technologies are the things that we take for granted, like, oh, we get a phone, OK, and now we can do all these things on the phone, and we'll adapt to it, and it's great.
8:27
It makes our life easier.
8:29
I really wanted to question whether it was always a good thing.
8:33
And so that's, I guess, a sort of small-c conservative sensibility I have, which is: is every new thing necessarily an improvement?
8:40
So for me, I think there were a lot of conversations early on, pre-social media, when smartphones were not yet on the market, where we were, early Internet, so excited about the opportunities that the Internet offered, which it did, but not always asking the questions about human nature that I thought were important.
8:59
So as a historian, I'd studied a lot of other technological revolutions, like the Industrial Revolution, and in each of those moments in time, the adjustment
9:10
was quite long, and it could be quite brutal for people who were on the losing end of those transformations.
9:17
But things were happening quickly.
9:18
And it wasn't just that; I was also raising kids at the time and seeing the things that were entering their lives.
9:25
It just struck me that we needed to stop and really start looking at these everyday interactions, these everyday behaviors, and question what we were giving up at the same time that we were really appreciating the convenience and the efficiency of our new, new technologies.
9:40
Yeah.
9:40
I'm really curious.
9:41
When you think about the way that technology is impacting people, has there been a substantial change, not just in terms of adapting to new technology, but adapting to the frequency of new technology?
9:56
Because I think about, like, you know, back when I was a kid when they invented the printing press, the printing press itself was very disruptive, but it took a very long time for the printing press to really reach the mass market.
10:13
Same thing with the telephone, right?
10:15
You know, AT&T was putting one of those phones in each of our houses, if you remember that.
10:19
And and it took them a long time to really get that to be a ubiquitous piece of being an American in society.
10:27
And so I'm really curious, when you think about the juxtaposition of both the level of change but also the frequency of change, does that play in?
10:37
I think it absolutely does.
10:38
And if you look at, for example, radio, television, telephones, landline telephones, and even really the desktop computer, when it first sort of entered people's private homes, there were decades in some cases before most people were using these things on a regular basis.
10:55
That gave us time to sort of figure out what the rules were to figure out social norms.
11:00
And social norms are really important.
11:02
They're sort of the grease of society, right?
11:05
They allow us to figure out how to get along, how to change in a healthy way, how not to have overreactions to new things.
11:13
And when you think about the Internet, and then the smartphone, and now social media platforms and all the apps that go with the smartphone on top of that, that's really been compressed into a little more than a decade.
11:23
If you think about people's total use of it.
11:26
And then you look at how much time we all spend using these tools, that's also new because even when people had desktop computers, they might spend a few hours of their leisure time in the evening on the desktop computer.
11:36
And it was this novel experience, to play games, right?
11:39
No, I mean, I remember I was, of course, a super nerd who was, like, managing an early scholarly listserv.
11:46
And I would I just loved it.
11:47
I would like remind people to be polite in the chats and stuff and it, but it was a great experience of connecting to scholars all around the world.
11:54
You're a nerd.
11:55
That's fantastic.
11:56
I embrace my inner academic.
11:58
OK, now I'm really digging into this.
12:00
So I really think, for those of us who were in the transition generations, it's really difficult to understand just how ubiquitous the use of these tools, and their intimate place in our daily lives, has become in a very short span of time.
12:17
And this is why we really struggle to understand how to act in public with a phone when they first came out.
12:23
Is it polite to scream into your phone while talking to someone on FaceTime?
12:28
Note to listeners: it is not, but people do it.
12:30
So I mean, but again, it's we have to have compassion for ourselves and others because it really is a vast transformation.
12:37
Social norms take a long time to develop.
12:40
The changing of norms takes a long time to be sorted.
12:43
And in that sense, we're all just past being toddlers with some of this stuff when it comes to understanding how we should behave.
12:51
When I lived in New York, I remember it was in the early 2000s.
12:55
And they were, you know, phones were really just becoming a thing.
12:59
You know, everybody was starting to get their first phone on Sprint PCS or whatever the network was at the time.
13:04
And I remember the Bluetooth earpieces had come out and I was walking down the street in Midtown to my office and I passed by this guy.
13:12
And I, I mean, he was talking to himself and being loud.
13:16
And I was like, I can't tell: is this a dangerous person who is talking to themself that I should avoid,
13:24
or is this person on their phone?
13:26
I can't tell the difference because I don't know whether they're talking to a real person or somebody who's just in their ear.
13:32
And for me, that was a moment of, wow, this is a, this is a big societal change that's just a little tiny moment.
13:38
But, you know, all of my lifelong training of, you know, avoid danger, here are clues that somebody may not be healthy and in the right space.
13:48
Had to throw them all out the window one day, right?
13:50
Just had to get rid of all of that, that old baggage.
13:53
So tell us a little bit about your book.
13:55
You know, we've been digging into it, and I've bought copies of it, and it is amazing.
14:02
But why don't you tell our audience?
14:03
What's the theory?
14:04
What do you talk about?
14:06
Are there any exciting conclusions that you can reveal to us without making us go to the end of the book?
14:11
You know, share, share a little bit about your book.
14:13
Sure, I'd love to.
14:15
So the book is called The Extinction of Experience.
14:17
And that title I borrowed from a wonderful naturalist named Robert Michael Pyle.
14:22
And he was writing about his concern,
14:25
decades ago, that children were having very little experience outside in nature, getting their hands dirty.
14:31
You know, they were, they were sitting watching TV or they were in like structured play that was controlled by adults.
14:36
And so his concern wasn't just that they weren't outdoors getting muddy and dirty and playing in the world around them.
14:43
It was that if the forest that they played in disappeared, would they care, if they'd never played in it?
14:48
Right.
14:48
So this idea: if you never have this initial experience with nature, with interacting with and understanding species and ecosystems, would you care if one day you open the newspaper, assuming newspapers still existed at the time, and they say, you know, it's gone?
15:04
It wouldn't matter.
15:05
They have no true connection to it.
15:06
And that essay, that concept, just really stuck with me, because I think it extends well beyond a childhood experience with the natural world.
15:16
All of us now live our days with experiences that are mediated in such a way that we no longer see each other face to face.
15:26
All of these ways of behaving as human beings have changed. As you said, someone walking down the street talking to themselves would, very recently in history, have been seen as a madman.
15:35
And the expectations now for people who have stuff in their ears and are focused on the people on their phones are very different.
15:43
When they bump into someone accidentally, or if you try to interrupt them, suddenly you're imposing on them, rather than having these shared rules in space.
15:50
So those kinds of experiences, they are unquantifiable.
15:53
And we live in a society that loves to put a number to everything and measure everything.
15:58
I felt like there was a real need to think about the qualitative experiences that are disappearing or have already gone extinct.
16:04
And once I had that in mind, I just started to observe and look, through my children's experience and through my own experience growing up: what's gone now?
16:12
And is it good that it's gone?
16:14
In some cases, I think we've improved on things with technology.
16:18
In other cases, I think we've set aside deeply important human values and behaviors that we still need, that we've fooled ourselves into believing we don't.
16:27
And those are the things, really, that the book focuses on: things like patience, learning to wait, understanding how we should be in public space, keeping our private pleasures, mediated or unmediated.
16:42
Things like experiencing art, literature, music, family time.
16:46
These all seem like, you know, squishy things; as my economist and technologist friends like to say, you can't put a number on that.
16:52
And my answer, as a humanities person, is: you're absolutely right, but those are still really important things that we have to acknowledge and attend to.
17:01
And when we decide we don't need them anymore, that should be a thoughtful choice, not just, oh, we've got a nifty new thing that'll do this for me.
17:10
The machine will do it for me.
17:11
So I don't need the person anymore.
17:13
So it's a convoluted answer, but all of those questions were circulating when I was writing this book.
17:18
That must have been a pretty fascinating and terrifying intellectual journey to go down that route.
17:27
You talked about how some things turned out to be good and some things turned out to be bad.
17:34
If you had to pick two examples,
17:36
is there one bad and one good that stood out to you during your research?
17:40
Absolutely.
17:41
So the thing that we got rid of that I think probably most people don't miss, although I still will make my plea that everyone should know how to read a map, is the paper map, replaced by GPS.
17:49
And you know, I grew up in Florida, which is perfectly flat.
17:51
I have a terrible sense of direction, and I rely on GPS when I drive, and probably over-rely on it now, because it's so convenient and easy to use.
18:01
But reading maps is still a skill that I think we should cultivate.
18:04
But that's a, that's a general good.
18:06
I have an ongoing debate with a friend of mine over whether automated toll booths are good.
18:11
I have privacy concerns about the tracking.
18:14
And I also worry about, you know, it's one more thing in society where we've taken a human and replaced it with a machine.
18:20
And what does that do to us as humans?
18:21
But those two things I think people generally like.
18:24
But one thing we gave up, and this is very important for children, is teaching handwriting, which again seems really small and not that big a deal.
18:33
But it is, in fact, implicated in embodied cognition, in memory formation, in retention of knowledge.
18:40
And in all these ways, we didn't think it through, even though there were these fascinating neuroscientists going, wait, wait, wait, let's not throw the baby out with the bathwater.
18:49
I know we have to learn to do keyboarding and touch screens and all these new things.
18:53
But there is value in this old way of doing things, and we will show you what it is.
18:58
That's the kind of example where I think a lot of us, myself included, thought who needs to write cursive anymore?
19:04
Turns out, actually, we all should, because, especially when you're a child developing your literacy skills and your embodied cognition skills, it's really important.
19:14
And so we should be thoughtful when we set those things aside.
19:18
Interesting.
19:19
And I love the maps analogy.
19:21
I mean, I certainly learned to read maps.
19:24
That was part of how you learned to drive a car, right? Someone sat you down and said, here's how you find out how to get home.
19:32
Yeah.
19:32
I mean, maps are the kind of thing that you could really nerd out over, right?
19:35
I mean, it's got a little bit of privacy, right?
19:39
The map lines change, the grid references, the A and the Z.
19:45
It's like playing Battleship.
19:47
I completely see what you're talking about.
19:48
That is absolutely a skill that I'm glad I have.
19:52
But I, I don't know if I've taught it to my children.
19:55
Well, wayfinding is an ancient human skill, right?
19:59
That's why we explored and found new lands.
20:02
It's how we find our way back home after we have explored.
20:05
And again, our brains are designed to really be transformed by that tool.
20:11
The studies of London cab drivers, who have to pass the Knowledge exam, show that their brains have somehow incorporated these crazy, you know, winding streets in London that are not a grid.
20:22
And the more they drive those streets, the more it's embedded in their memory, in ways that have fascinated researchers.
20:28
But again, I think wayfinding should never become just a hobbyist pursuit.
20:34
Because at some point, look, if you're off the grid or your GPS goes out, you still have to know how to find your way.
20:41
And so I think in those experiences, and I do tell the story in the book where I did get lost in Northern California when my kids were quite young, and I was like, oh, no, don't worry, Mommy knows where she's going.
20:51
I was completely lying.
20:52
I just didn't want them to panic like I was on the inside.
20:55
But that reminded me of that old skill I had been taught: to read a map.
20:58
And I had a road atlas in every car I ever drove, until GPS came along, right?
21:03
So now I have a road atlas in my car again.
21:06
And I taught my kids to read maps that summer, because I was reminded of my old knowledge that I'd let get a little dusty, and how it actually needed to be put into practice again.
21:18
Yeah, well, and, you know, even things like geocaching, right?
21:23
That's a ton of fun.
21:25
And it's way more fun to do it with a, a compass and a map in your hand than just staring at your screen and trying to follow a Pokémon.
21:34
Yeah, you know, it does feel like we've lost out on some things.
21:38
And is there any new technology that stands out to you as problematic?
21:43
Quite a bit.
21:44
How long do we have?
21:47
No, I'm constantly called a Luddite, which I'm not.
21:50
I use these tools myself everyday.
21:52
The one that concerns me most is friendship chatbots.
21:54
The book was already in production as these things were really starting to come to the market,
21:59
so I didn't get to write a lot about them, but I have since been spending time exploring them.
22:07
So chat bots can be extremely useful in lots of sort of practical ways for businesses.
22:14
But a lot of the friendship-enabling chatbots and the therapy chatbots do worry me, because, again, the way humans are wired, we want to see the human in everything around us; it's why we see a face in our toast every morning.
22:25
We're like, oh, look, it's Jesus in my toast, right?
22:27
You know, we're wired to see faces.
22:30
We're wired to kind of connect in this really wonderful way.
22:33
But what that means is that if you have people who understand human behavior and our need to connect, and who want to make a tool that gets you hooked, that isn't a human but mimics one extremely well, that can be dangerous, particularly for younger people or people without a really healthy sense of self, to connect to an AI friend chatbot.
22:53
I know there's an argument that, well, that's better than nothing, but I actually think it's worse, because I think we owe obligations to each other as humans.
23:00
We should want to connect to other human beings.
23:02
So those concern me, in terms of how they're completely thrown out into the wild and everybody's like, well, we'll just figure it out.
23:10
I worry about that attitude towards something that is going to be not only interacting with us in very sophisticated ways, but also gathering a lot of very private and personal information: our moment-by-moment moods, how we feel, you know, at work versus at home.
23:28
So in all those ways, these are extremely intimate technologies, and we shouldn't just see them as, look at this cool new thing I can talk to that shares my interests and keeps feeding me compliments and telling me how great I am so I feel better.
23:41
It's not that simple.
23:43
We really should understand.
23:44
It's a lot more complicated than that.
23:46
Yeah.
23:47
It's terrifying.
23:48
And I feel like we all know how that movie ends.
23:51
I mean, it started with 2001: A Space Odyssey and the famous line,
23:54
I'm sorry, Dave, I can't do that.
23:57
Followed up, of course, by Her, which seems a little more prescient relative to the topic at hand here with these kinds of friendship chatbots.
24:10
I'm curious, do you worry at all about people becoming, you know, dependent upon these digital virtual friends, and what the consequences of that are, emotionally but also on the privacy side?
24:26
I mean,
24:28
Facebook had to trick us to get us to give them all that information,
24:34
or at least, you know, socially engineer us. A therapist chatbot?
24:39
They know your most intimate secrets.
24:42
What do we do with that?
24:44
Well, one of the things that's most disturbing to me about some of the latest research on how people feel about these chatbots is what they say. And this actually speaks to the way we've already become very conditioned to distancing ourselves from human relationships.
25:01
They'll say, well, I trust the AI more than I trust a person.
25:04
It's not going to tell anyone my secrets.
25:06
It doesn't ever get impatient with me.
25:08
It can't cut me off.
25:09
It lets me just be me.
25:10
I feel like I'm really myself.
25:12
To which I respond: oh my God, no, because the whole point of humans in our lives is that they're the lovely people who, after we rant and rave, say, well, are you sure you're not overreacting? Or, I understand you're upset, but let's just sit with that feeling.
25:28
And, are you sure you're not angry, or are you sad?
25:31
Our friends and our intimates, from our parents and our siblings to our families and our communities, are constantly teaching us how to understand ourselves and helping us form a sense of self.
25:43
If we have a lot of people coming of age in a world where their formation of self comes from something designed to manipulate them, that is feeding off the information they provide to the bot because it services someone else's bottom line.
25:58
And that does it all, as you say, with an absolutely blatant level of trickery.
26:04
And people think, well, that's better than dealing with a difficult human.
26:07
Well, guess what?
26:07
We're all difficult humans deep down.
26:09
That's what makes us human: our contradictions and our self-delusions.
26:13
All of these things we have to work out, we work out best with other people.
26:18
And I worry that this enthusiasm for friendly chatbots now is a sign of just how far we've drifted from an understanding of why we need other people in our lives and why we need face-to-face connection.
26:30
There's also an issue of inequality here, because the people who I fear are going to be given the AI therapy bot are the people who don't have the money to afford another human being to sit in a room with them for an hour every week and talk about their problems and understand them as a whole, embodied person.
26:45
And we see that already in the UK, the health service is often, you know, giving people a chat bot to talk to when they have a mental health concern.
26:55
That is a dereliction of our duty to each other.
26:57
And so I do worry about the ease with which we think we can acclimate to that.
27:02
I think that can only happen in a society that's already spent way too much time becoming comfortable with mediated interactions.
27:11
Yeah.
27:11
I don't know if you remember, there was a bot back in the '60s named ELIZA, and this was a therapist bot.
27:21
And you know, it's, it's so interesting.
27:24
Like, why do you think, and clearly I'm just asking for speculation here, why is it that, on one hand, I think we would all agree that human interaction is important and that we don't want bots to be our friends?
27:39
Like, objectively, I don't think there's anybody who would say I want those things.
27:43
But yet here we are, you know, 60-something years later, still working on the therapist bot, so somebody really wants this thing.
27:52
Where does that come from?
27:54
Why are we so desperate to build machines that can take over these interactions?
28:00
So I think the sort of broader existential question is why we are a society that treats deep human problems as engineering problems, not as human problems.
28:12
So I think a lot of people, well-intentioned people see a mental health crisis in this country and they go, I want to fix that.
28:18
Like you see a loose bolt on a truss on a bridge: I'm just going to go in there with this tool and crank that out, and then it'll all be fixed.
28:27
That works well.
28:28
If you're talking about structural engineering, it works well. It works, actually, when you're designing things with computers. It doesn't work for people, because there are often unintended consequences with engineering choices.
28:40
But I think the whole way of looking at human nature and humans as a problem to be solved, that's the starting point of a lot of the sort of Silicon Valley tech ethos when it comes to people.
28:51
It's why you see a lot of frustration on the part of people in Silicon Valley about how government works, which is very slowly, often inefficiently, and, you know, with tons of unintended consequences and too much money spent.
29:04
They look at that and they go, that's terrible.
29:06
Why is that how we do things?
29:07
So I think at some level you can see structural issues there.
29:11
But when it comes to people, we are so much more complicated.
29:15
And I think embracing our complications will then allow us to find solutions, some of which might include technological tools, but most of which are still about understanding human nature and the best way to understand human nature.
29:27
Number one, spend time around other humans and pay attention. Number two, read history and read literature.
29:33
I mean, ever since we could put our hands on a cave wall to leave a mark, we've been trying to understand ourselves.
29:43
And that impulse, I think, should not be outsourced to technology or to an engineering mindset when it comes to the really difficult things: connection, human flourishing, understanding what is my purpose, what gives me meaning.
29:59
And I think we've outsourced even things like awe to our technologies now, in a way that worries me, because there's so much to be awestruck by in our world.
30:11
The phone you carry in your pocket shouldn't be the main thing that incites that level of appreciation.
30:19
Interesting.
30:19
And so when you think about children, who are clearly a focus throughout the book, you discuss children a lot.
30:30
You know, do we have any particular concerns when you think about things like identity building and the sense of self? Do you have any sense yet of what the impacts might be to our society as a result of these kinds of things?
30:45
Yes, I am concerned a great deal about the impact on children who, from a very young age, spend a lot of their time having mediated interactions.
30:54
They're having real experiences, right?
30:56
These digital experiences, they get an emotional response to the things they do online.
31:01
It's not like it's fake in the way that we all used to think of IRL versus the computer; that boundary doesn't exist for them.
31:10
But what I think we've seen, you know, my friend Jonathan Haidt has written a wonderful book about the mental health crisis, focusing particularly on social media and children.
31:18
But there's another part of formation of self and character development.
31:22
And here's where I become neo-Victorian in my language, intentionally: habits of mind, what we do every day that helps form our sense of who we are and where we belong in the world.
31:33
If most of the interactions in your day are mediated through a screen and through platforms designed to get you to do certain things at certain times in certain ways, and structured to elicit, say on social media, anger, fear, and anxiety rather than happiness and calm, and if you do that enough (and we spend on average seven hours a day in mediated interaction with screens, a higher number for children), that changes who you are, your sense of self, because you become very other-directed.
32:04
What do people think of me?
32:05
Do they approve of this?
32:06
Did I get a like?
32:06
Did I get a retweet? That very other-directedness means that when you talk to kids who are raised on a lot of this stuff and are facing challenging mental health issues, they'll talk about feeling hollow.
32:22
Who am I if I don't know what other people think of me?
32:25
That's what worries me.
32:26
And my friend Nick Carr has a new book out about information technologies; there's an interesting section in there where he calls it the mirror-ball self.
32:34
It's like a self that's like a disco ball and it's constantly refracting off of all the outside inputs rather than having a kind of inner core.
32:42
And developing that sense of selfhood is really what becoming a fully formed human being is about.
32:47
And if you're a good parent, you try to give that to your kids.
32:50
If you're a good friend or sibling or colleague, you try to do that for the people in your life.
32:56
We're not doing that with the same kind of rigor because technology has made it easier not to do it that way.
33:03
So my argument is not that we give up all technology and become Amish, although I love the Amish.
33:08
I write about them in the book.
33:09
It's that we understand there's a difference between going across the street to help your neighbor and check on them, because they're old and widowed and they might need some help,
33:18
and sending them a text message going, hey, you OK?
33:21
Those are two qualitatively different ways of understanding an obligation.
33:24
And I think we tend to choose the easy one.
33:28
That's, again, human nature: we'll choose the easy path almost every time.
33:32
I know I will.
33:33
Yeah.
33:33
I mean, we have to fight that urge, actually. That's what technology has done: it's made it the default in a way that even a generation ago it was not, because you didn't have the option.
33:42
Now you have the option, which actually places the burden and responsibility on you to choose.
33:47
And that's kind of what I'm urging.
33:48
It is kind of tough medicine, in a way.
33:51
And I apply all of this to myself, because we all face these challenges.
33:56
Yeah.
33:56
And, I mean, there's a biological component to all of this, right?
34:00
I mean, when we were little, we got reinforcement from our parents, you know, a pat on the head, good job making that Play-Doh sculpture, or, you know, way to go climbing that tree, or I got internal validation, right?
34:14
I succeeded at building up some blocks that I hadn't, you know, been able to do last week, right?
34:20
I had ways that I got reinforcement, and that dopamine hit is part of this mediated-interaction challenge: the fact that they can get that dopamine hit elsewhere.
34:31
And is that overtaking the kind of intentional habits of our mind?
34:38
Yes.
34:38
And the dopamine hit is much more powerful.
34:40
And the intermittent reward is constant, in a way that even the best parent can't be, since a parent is going to have other things that draw their attention away from the kid.
34:49
And I think you do see this kind of very unhealthy feedback loop where a parent trying to get something done, hands the phone or the iPad to the kid to just get them off their back for 5 minutes.
34:59
Then they try to take it back.
35:01
And what happens?
35:02
The child's like, absolutely not. They're holding that thing like it's the last life raft on the Titanic, because the pleasure they get, the dopamine hit, is real.
35:11
And I'm being a little hyperbolic here, but I've spent a lot of time talking to educators in K through 12 as well as college, and they all say the same thing.
35:20
They say: we can't compete with that thing; we cannot compete with what that screen offers any child.
35:26
And there's no sense in competing.
35:27
That's why a lot of the idea of, like, oh, teach them media literacy, well, what is that?
35:32
That's gone, because it's with them all the time.
35:34
It's on their bodies.
35:35
Eventually they'll perhaps have overlays in terms of glasses or sensors.
35:38
It is so much more powerful and intimate that I think, instead of monitoring the use of it, we have to start thinking about reclaiming some time where those technologies aren't part of the equation at all.
35:54
Whether that means getting phones out of schools during the school day, or whether that means, you know, having a rule for your household that around the dinner table there are no phones.
36:01
And that includes the adults.
36:02
We have to model good behavior too.
36:04
But if you habituate a young mind to that level of stimulation, it's very, very difficult to wean them off of it.
36:12
Now you can.
36:13
And kids I know can, including my own son, who went hiking in the backcountry for four weeks after his senior year of high school.
36:20
He had a smartphone.
36:22
He didn't use it a ton, but he used it, as a teenager.
36:26
He talked about that first week and how they would all sit around at night going, we really miss our phones.
36:30
They took no phones.
36:31
They had a sat phone.
36:32
That's it, you know, for the guide.
36:34
But by the end of the four weeks, he talked about just how transformative it was for them.
36:40
And they all said the same thing.
36:41
My mind slowed down and I noticed more.
36:44
And that might seem like a small thing, but in a world where the pace is 24/7 and we're bombarded with so much stimulation and information all the time, having the ability to slow down and think and just be yourself without stimulation.
36:59
Can you sit, as Pascal asked?
37:00
Right.
37:00
Can you sit in a room by yourself and just be?
37:03
That is the challenge of being human.
37:04
And we're not meeting that challenge right now because we have so many alternatives.
37:09
Yeah.
37:09
I mean, I can speak from personal experience about my ability to sit down; you know, I like to try and practice meditation occasionally to calm myself.
37:18
And, you know, my ability as an adult who didn't grow up with all this stuff has definitely been diminished.
37:25
I can't go for as long as I used to go, which is the opposite of what you would expect to happen.
37:31
But I can't help but wonder if some of it has to do with all this digital stuff in front of me.
37:37
So, you know, we've talked a lot about the emotional side of this, we've talked about the biological side, societal impacts.
37:46
You know, I would love to understand your thoughts in terms of the data, right?
37:52
So, you know, we do tend to focus a little more on the data privacy side of the world in our house over here on this podcast.
37:58
And, you know, stuff like the chatbots, stuff like the sharing with other people.
38:05
To me, this seems like a pretty slippery slope for training an entire generation to have no sense of privacy.
38:14
What's your take?
38:15
Where do we go?
38:16
What's happening with all this data?
38:17
Is it bad?
38:18
Is it good?
38:19
Is it necessary?
38:21
So that last question is the most important, and it should be the beginning of any conversation: is this necessary?
38:27
Do we actually need to know this about you?
38:32
The conformity and ease with which an entire generation has expected no privacy and constant surveillance is deeply worrying to me.
38:40
Now I'm seeing glimmers of rebellion among the Gen Zers about this.
38:45
But if you think about, you know, whether you're getting a watch that tracks your behavior.
38:50
And they say, oh, you have this add-on sensor; there are sensors in development that can, you know, tell you your mood and see how you feel throughout the day.
38:56
Well, and then your spouse can have it too.
38:59
Or if you have an Oura ring, you can, like, sync up your data with your spouse.
39:03
And again, I have friends who do this, and it is always the same old story.
39:06
It starts out with like, oh, we learned so much about each other in three weeks.
39:09
And they're like, he hates it when I talk about work; the Oura data shows his, like, blood pressure spikes.
39:16
I'm like, why would you want to know that about yourself?
39:19
Like just let him sit there and be polite.
39:21
And she didn't know it before, but now she knows it.
39:24
Don't we all need a little bit more of an offstage, backstage area?
39:30
We have our private thoughts, our private feelings, and sometimes we mask them out of politeness, out of concern, for all the reasons that we've been trained to, because we all want to connect and get along.
39:40
Also, your employer doesn't need to know your whereabouts all throughout the day.
39:43
The sociometric badges people are now given in many workplaces don't just track their movements and open the door to the office.
39:51
They also track how often they speak in meetings and who they interact with throughout the day, as their badge pings against someone else's.
39:59
These are all sold to employees as necessary for efficiency, for a healthier workplace.
40:05
And up to a point, I can accept a small bit of that.
40:09
But knowing how many times you speak in a meeting, is that really necessary?
40:13
Because some people are introverts, some are extroverts; some people communicate better one-on-one versus in a group.
40:19
There are a million quirky human reasons why someone might not talk as much in a meeting.
40:24
But again, if you're solving the problem of workplace efficiency and your metric is that everyone should speak on average this amount of time at every meeting, then you've already gone down the road of making the people conform to the demands of the data and the machine rather than coming up with tools that actually work with people.
40:41
So in that sense, the privacy and the surveillance stuff is worrisome enough, but it also doesn't always work.
40:47
And that's where I think it's sold to employees and employers as this amazing new tool.
40:52
When in fact, maybe the old-fashioned practice of, you know, taking a walk with your manager to work out a problem that's on your mind.
40:59
You get some fresh air, you get a little exercise, you talk, it's private, it's not recorded ideally by either party.
41:05
And you work through a problem. That might be a healthier choice than even what all of this data that's being hoovered up by an employer might show them in a spreadsheet.
41:17
Yeah, well, I definitely don't need to know anybody's location in order for them to do an excellent job at our business.
41:24
So, you know, you mentioned Gen Z.
41:26
I've got a good friend of mine who's written some books about Gen Z and the differences in the generations.
41:31
And it's an interesting way to kind of segment those groups.
41:34
I am Gen X, and we're the best. We're the best. Reality Bites.
41:39
I mean, you know, so, Gen X, I feel like we all come in with a little bit of skepticism about the technology, and we're probably reasonably safe.
41:49
But, you know, Gen Z is really interesting, right?
41:52
They grew up with this stuff, but at the same time, they're turning out to be more cautious and aware of their data rights and what happens with the data.
42:03
And then you've got Gen Alpha.
42:05
So what happens? Do you think that we've seen a clue that, despite this mediated interaction and heavy technology, people will grow out of it and learn?
42:17
Or are we going to witness?
42:20
No, no, no, this is going to change the generation. Because I don't know how it turns out with Gen Alpha yet.
42:25
My hope is that it leads to more thoughtful decision-making by younger generations, because they have two glaring examples of thoughtless, eager embrace and unthinking adoption of these tools.
42:40
The millennials, on whom the world performed a massive social experiment, and the boomers, who came to it too late and have, like, destroyed things like Facebook because they just went all boomer on these tools.
42:51
And so I think the Gen Xers are shrugging, of course, but the Gen Zers and then the rising Alphas, what they have seen are a lot of the mistakes.
43:00
They're also growing up in a world where certain ideas have been lost, and this is actually where I think older generations do have a responsibility.
43:08
We have to revive ideas like privacy.
43:11
We have to revive ideas like, you know, being bored and alone with your thoughts as a good thing.
43:17
And we have to revive practices like you all sit around the table and just have a conversation and nobody answers a question by picking up their phone.
43:25
You either end the evening not knowing something, not knowing the answer, maybe having to go find it out yourself, or kind of hearing from others listening.
43:33
I think they have seen technology really disrupt their relations with the most important people in their lives, with their parents.
43:41
You see people pushing strollers while talking on their phone to someone, and the kid is vocalizing and looking around, and the parent is tuned out.
43:48
The parent is not paying attention to that child.
43:50
If you're raised that way, of course you're going to go right for the phone yourself the minute you have one because that's what you've seen modeled for you.
43:56
So I've spoken to a lot of teenagers who are frustrated with their own parents' behavior, and in a way that's quite healthy.
44:03
Their skepticism about institutions, although damaging to trust as a society, has also forced old institutions to really rethink what their values are when it comes to monitoring employees.
44:14
What's required?
44:15
What do you know about me?
44:16
And they'll ask these questions: if I sign on to this, what do you know?
44:19
What's the data that you're getting from me?
44:21
They're curious what it is and employers suddenly have to disclose that.
44:25
So those are all healthy trends, but I do worry about how young they are when they're being inculcated in technological habits versus human ones.
44:35
Because especially from zero to five, they need the human 99.9% of the time.
44:42
And you can give them a little technology, it's not going to actively harm them.
44:46
But if you reverse that ratio, the human starts to become this really difficult thing for them to understand and interact with.
44:53
And that has lifelong consequences.
44:56
Technology is pretty easy to figure out.
44:58
I mean, it's now, you know, designed in such a way that even me, a Gen Xer, can figure it out most of the time.
45:05
But those human skills, those are hard to teach.
45:07
I have to do it with some of our younger employees.
45:09
And it's difficult to teach, like, look people in the eye, listen to what they're saying before you respond.
45:16
All these little skills we take for granted, they have to relearn.
45:21
Yeah.
45:21
And those are also things that are hard to learn at a certain age, right?
45:25
I mean, our brains are prewired to learn things like language at a certain period of time.
45:31
And I assume social norms and things like that fall into similar buckets; you know, you can't teach an old dog new tricks.
45:40
And so I think that's probably spot on.
45:43
Well, yeah, I do have a theory, though, which is: if we want to make all of these younger people care about privacy, about these kinds of things, we have to somehow make it retro, because then it will be cool again.
45:57
So, you know, maybe we can just position good data hygiene as the way our grandparents used to do it.
46:04
And now it'll be cool again.
46:06
It's like a warm cardigan.
46:07
It's like a vinyl record.
46:08
You know, it's awesome.
46:11
Well, you know, let's talk a little bit about the world around us.
46:17
We have seen a lot of dynamics around politics and civility these days.
46:23
And, you know, I think, candidly, kind of a lack of it in many places. I can't help but think about how these things are also contributing to a kind of distortion in our society.
46:38
I'm really curious.
46:39
I don't want to put you on the spot in terms of political views, but how does that political dynamic play into these challenges that we're talking about, around children, around adults, around sharing, around a desire to mediate?
46:56
Really curious about your thoughts on that.
46:58
Well, I can speak from the experience of having lived in Washington, DC, for more than 30 years now.
47:02
I have friends on both sides of the aisle who've served in Democratic and Republican administrations.
47:07
And I've kept those friendships in part because I understand that if you make your world about politics and everything's political, you can't keep friendships because you're going to meet wonderful people with whom you disagree on particular issues.
47:18
So that was always in some ways kind of how this town worked too.
47:23
So you'd see people, you know, denouncing their colleague on the Senate floor, but then behind the scenes, their kids are on the same baseball team and they hang out on the weekends.
47:31
And there was a civility.
47:32
There was actually a lot of cross party socialization.
47:35
It was all very healthy. What the Internet changed, by moving a lot of our political debate online, is that it started to reward different sorts of behavior: moral grandstanding by politicians, appealing to followers rather than constituents.
47:50
Very big distinction.
47:52
That's really important to remember.
47:53
If you're a politician, you should be answering to your constituents.
47:56
But if you're a politician with 2 million Instagram followers, you're not really focused on your constituents.
48:01
You're really focused on the attention you're getting on a platform that has nothing to do with the needs of your local constituents.
48:07
So there's that.
48:07
And the rewards are vast.
48:09
The fame, the, you know, constant yay team and then the polarization that happens as a result.
48:15
We can now all live in our own little fractured worlds and never even reach any sort of agreement on facts.
48:23
This is why we have the whole, like, fact-free universe we live in: everyone has their own facts now, their own truth, their own this and that.
48:28
And that's been enabled by a technology that doesn't have any barrier to entry; that was its conceit.
48:36
Remember the early days of the Internet?
48:38
It's the new town square, it's the new democratic forum.
48:41
These were all really lovely ideas, but those of us who've studied human nature knew this was not going to work, because you actually need physical boundaries and you need to be face to face for some of these sorts of discussions not to spiral.
48:54
Because you remove that barrier and you have the online disinhibition effect where we will say and do things to someone with a screen between us that we would never do in person, I hope.
49:03
But this has been studied.
49:05
So when you try to move a very raucous, young democracy online (because we are still quite young as a democracy), and the shaping tools and the platforms' designs all reward extreme behavior, we get where we are now.
49:19
And there is, though, a really positive thing that's coming out of that.
49:24
So many issues in Washington today are highly polarized, highly partisan.
49:28
There is one thing that you can get a lot of people on both sides of the aisle to agree on.
49:32
And we've had bills working their way through Congress on this, which is, you know, social media might be bad for really young kids.
49:38
You've got people like Katie Britt of Alabama, a very conservative senator, and John Fetterman of Pennsylvania, a very liberal senator, sitting down and going, OK, we've tried this experiment.
49:47
We've seen the effects.
49:49
Maybe we need to have some limits to the architecture and design of some of these tools when it comes to kids.
49:54
And that to me is really heartening.
49:56
If you look at the people who are co-sponsoring some of this legislation.
50:00
And again, you know, I don't think top-down federal solutions are always the answer.
50:04
But when it comes to protecting the safety of children, they're important.
50:08
And we've done it before.
50:09
And the bipartisanship on this issue is really good.
50:13
You see it in schools: people who would hate a teachers' union will sit down with all the teachers, and the teachers and the parents are all like, what do we do about these phones in the classroom?
50:21
And they come up with solutions and they really try to implement them.
50:25
And those are all heartening signs that there are ways out of polarization.
50:29
They do tend to involve taking these conversations back to the smaller human oriented spaces, whether that's face to face with your fellow congressman and staffers on the Hill or it's sitting down at a school board meeting face to face, not performing for the people online who are going to put it on a TikTok out of context, but face to face.
50:48
I mean, I think they should ban smartphones during these meetings, because that allows everybody to just actually be human beings rather than performing seals for the likes online.
50:58
And in that sense, I think we are having that conversation about how these platforms have changed our sense of reality, because we need a shared reality if we're going to actually embrace how wonderful and weird our democracy is.
51:11
And we're going on 250 years next year for the Declaration of Independence.
51:15
I mean, it's an incredible experiment, but it only works if we all have a couple of things we can agree on, whether that's, you know, you can believe whatever you want.
51:22
Just leave me alone.
51:24
And part of what I think the Internet has done is make everything into a political question, where you must vote, you must weigh in on everything.
51:31
You know, I was pumping gas the other day, and up pops, you know, a prompt on the screen on the gas pump.
51:36
It's like, would you take a survey about your experience while pumping gas?
51:39
I'm like, who cares what I think about that?
51:42
But, you know, we're constantly asked our opinions and rewarded for giving the most extreme version of our thoughts.
51:48
And again, we become habituated to behaving that way.
51:52
So it's no surprise that in our politics, we've seen that play out.
51:56
I hope, again, that younger generations are kind of tired of this and they're tuning out.
52:00
They don't like either party and they don't like either party for a reason because they think both are too invested in this way of doing politics and they want to do it a different way.
52:09
That's a really good thing.
52:11
They can build something new.
52:13
You know, you mentioned an interesting phrase when you were talking about that.
52:18
You talked about, you know, your truth, right?
52:21
And you know, kind of what's the real truth.
52:24
And it's funny because we didn't used to have debates about whether or not these things we saw with our eyes were true, right?
52:35
And now that's become something, and we talk about that in a very negative light, from the viewpoint of: look, my friend has gone down a rabbit hole on some political spectrum, and they believe a bunch of new things that I think are comically wrong.
52:54
OK, but as I think about this, you know, isn't that really what religion is, right?
53:02
I mean, isn't religion a kind of collective agreement about a set of truths held despite the evidence to the contrary, or at least the scientific evidence to the contrary?
53:14
And I think one of the things that you mentioned up front, which we didn't spend much time on: you talked about how you were raised in an evangelical family and then you decided that that was not the path that you wanted to follow.
53:29
How do we reconcile that?
53:32
There are some places where shared delusion is OK, if not even good, but there are other places where we say shared delusion is a bad thing. Where are we drawing that line?
53:47
And, you know, it's kind of a broad question, but I'm just curious what you think.
53:51
No, it's a great question, because it's kind of unanswerable in a free society, right?
53:56
Because I think what you end up seeing are people self selecting into communities where everyone shares those same principles and values and lives by them.
54:04
There's a challenge that I think technology has posed to that way of ordering society, where there's always conflict.
54:09
You know, there have been lawsuits about, you know, religious sects that won't vaccinate their kids, for example.
54:14
There have been all kinds of terrible stories of abuse. But also, if you look at, for example, right now, the healthiest group of teenagers in the country are kids raised in faith-based households, who regularly attend some sort of religious service and have a family structure that's organized around, you know, some faith.
54:34
It doesn't even matter which one.
54:35
It's just, it has boundaries and order and a sense of, you know, meaning and purpose.
54:40
And again, like you say, shared understandings and truths.
54:43
The way I grew up, I was always allowed to ask questions, but there was always that point where the questioning got shut down.
54:48
It's like, no, now you're just going to have to accept it. No more questions.
54:52
So for me, being a nerd: well, but I have 10 more questions.
54:55
What do you mean no more questions?
54:56
And so that really stuck with me, and I had parents who were actually incredibly indulgent of my annoying curiosity about everything.
55:03
And they would usually just send me to the library, and I'd ride my bike to the library, and the librarians were like, whatever, go look down that aisle, you'll find an answer.
55:10
And I did, I found answers in books and I found a lot more questions.
55:14
And for me, that meant leaving a faith that, you know, taught me creation science rather than evolutionary theory, and it led to a secular life.
55:22
But I think it's interesting that at the same time that we have all this confusion about truth and reality, we have a society that's become a lot more secular as a whole; you know, a lot more people do not profess any faith or regularly attend any services of any sort.
55:36
So we do have this human need to belong to a tribe, to all kind of agree on some things.
55:44
And so I think what the Internet has given us is a global version of finding a tribe.
55:49
And it's a kind of religious fervor.
55:51
When people, whether it's a political debate or a cultural issue, find all these people on Reddit who agree with them.
55:56
And suddenly they're like, I'm not alone.
55:58
I too believe the earth is flat.
56:00
We're all, we all know this now.
56:02
And then off they go because it's a sense of belonging they seek.
56:05
And this is why, you know, when there were a lot of debates during COVID about the vaccine and people were skeptical, the most successful physicians who were dealing with skeptical patients about anything, whether it's a vaccine or any treatment, were the ones who sat down and had a conversation about those people's questions.
56:22
They say, why do you think that?
56:23
What does that mean to you?
56:25
Like, how does that make you feel?
56:26
They would just be humans with another human, and listen and attend to their concerns and treat them seriously.
56:33
And they could often persuade their patients to do something that was better for them.
56:37
That's, I think, what we've lost on the Internet, where things are very instantaneous.
56:40
You have to instantly have a view; you're either for us or against us.
56:44
And if you try to have any nuance whatsoever, you're out of the debate immediately.
56:49
But what people are seeking in doing that... it doesn't mean they're bad people.
56:54
It means they're seeking connection and meaning and purpose.
56:57
So that suggests to me that we need to find other ways to build institutions in the real world that offer them that same thing.
57:04
And this is where sociologists have studied the decline of these, you know, civic organizations and third places and bowling leagues, the famous Bowling Alone argument that Robert Putnam made; all of these things deteriorated very slowly.
57:17
And we've replaced them with technology.
57:19
And technology is a poor replacement for that.
57:21
Because, you know, I have a weird hobby that throws me in with a lot of different people from all different walks of life.
57:28
And I learn so much when we have conversations just about other ways of looking at the world.
57:34
It's been very helpful and very humbling to realize people have all kinds of different ways of viewing political issues, for example, but it's difficult to find those places.
57:44
They used to be everywhere.
57:45
And if you look around, they're harder to find.
57:47
You have to actively create them now.
57:48
And that is our responsibility to the next generation: to show them how to do that.
57:52
Say, look, you can get a group of friends together.
57:54
Maybe it's a book group, maybe it's a Frisbee league, I don't know.
57:57
Whatever you guys all like to do, just do it and do it regularly and do it in person.
58:00
That really does help overcome this challenge of a fractured sense of reality.
58:05
And in an age of AI, where you really can't trust everything you see and hear, we need to be grounded in what we know to be real if we're going to confront what is coming down the pipeline.
58:17
I'm just sitting here wondering: is there any truth to this?
58:22
If I said that human beings tend to, and I know you're not a psychologist, but, you know, human beings tend to center around communities. At least the perception I have, looking at the polarization of the Internet, is that the vast majority of the community building is around things like the Earth being flat, as a good example.
58:52
They tend to be around something that's just objectively, measurably not true.
59:00
Is this desire to seek out community bolstered by the fact that it's an untrue thing?
59:12
I mean, we don't see a lot of people building, you know, a new cult because everybody likes Frisbee, right?
59:19
Like, that's never what happens.
59:20
It's always, I know a secret and you don't know it.
59:24
And so now come join my cult or believe my conspiracy theory.
59:28
What is it about us that makes it so that the more fantastical the story, the more likely it is to be something that people want to group around and agree on?
59:41
I think this is something that implicates technology, and how both sophisticated and opaque the systems that rule our lives have become.
59:50
I think people seek out those groups because they genuinely and honestly feel like they don't know how anything works anymore.
59:56
And when something breaks down, the people who are supposed to know how it works often don't know how it works either and why it broke down and how to fix it.
1:00:03
And this is going to sound, I'm not trying to be glib here, but I tried to park in a part of Washington the other day and I had the wrong parking app, and I have an old-model phone, and it was like: Nope, no more apps, not doing it, can't do it.
1:00:16
You're full up with apps until you get a new phone.
1:00:18
And I couldn't park legally.
1:00:20
I wanted to park legally.
1:00:21
I wanted to pay my $4.00 or whatever to park on the street.
1:00:25
And I was experiencing such frustration.
1:00:27
This was before a dental appointment.
1:00:28
So I was already all wound up, like, I've got to go to the dentist, and I had this moment where I was like, oh, now I get it.
1:00:35
Imagine if, say, you're a gig worker and you deliver food, and something breaks down with the app on your phone or something goes wrong with a customer, and there's no human being who's going to talk you through it.
1:00:47
It's like you have to submit to the app, you have to show the photo.
1:00:50
They're going to double-check. You're being monitored, surveilled, but you're not being dealt with like a human being.
1:00:55
And a lot of people go throughout their days feeling like that a fair amount of the time.
1:01:00
I mean, I do too.
1:01:01
A lot of systems, a lot of information, a lot of ways of managing our behavior, controlling it, asking questions of it.
1:01:08
And some of it is innocuous, but the overall sense is I don't know how this stuff works.
1:01:13
So think about it: I used to know how to change the oil in my car. Now I have, like, a 10-year-old Honda.
1:01:18
I open it up.
1:01:19
I'm like, it's like a computer.
1:01:21
And you see, even that is a sort of black box; it's a kind of black-box sensibility.
1:01:27
Everyone likes it because, oh, it's easy, it's convenient, it's computerized, it's new.
1:01:30
But then something breaks or we have a question and then you're running up against, why doesn't Amazon have a phone number I can call and reach a human being?
1:01:37
Well, that was deliberate.
1:01:39
They don't want to talk to you about what's going on.
1:01:41
For example, maybe your Alexa is now not giving you the option of not recording every time you want to wake Alexa, which is a new thing.
1:01:49
Maybe you want to talk to someone about what that means for privacy.
1:01:51
Guess what?
1:01:52
You're not going to talk to someone.
1:01:53
So I think that impulse to kind of understand the world... I'm really sympathetic to it, because we really don't understand a lot of the bureaucratic and informational systems that now rule our lives on a regular basis, from something as quotidian as not being able to park without an app.
1:02:10
To something as major as: why did your employer fire you?
1:02:13
Because the algorithm they used to assess your productivity said you were being lazy or unproductive, or maybe you're up for parole.
1:02:20
And the judge says no, because they used a criminal sentencing algorithm that feeds all this information into it.
1:02:27
You don't get access to what they fed into it, but it says no, you're a high risk for recidivism.
1:02:32
So back to jail you go.
1:02:33
This is how we live now.
1:02:35
And I think the conspiracy theorizing grows out of this desire to really have some accountability on the part of those systems and their designers.
1:02:44
Yeah, I think you're probably exactly right.
1:02:47
I mean, you know, we look back through human history, and every time we didn't understand something, right?
1:02:55
We kind of have this viewpoint of: it's magic.
1:02:58
In fact, there's a fairly famous quote, you know: any sufficiently advanced technology is indistinguishable from magic, right?
1:03:08
And, and that really is the perception of many people.
1:03:12
I have no idea how this phone I've got in my hand, that I spend 20 hours a day on... I have no idea how it works.
1:03:22
I have no idea where the stuff goes.
1:03:25
And so it's probably almost easier to believe that, you know, Steve Jobs came back from the dead and he's routing your phone call.
1:03:34
Like that almost makes more sense than the real technical explanation of packet routing and how it's going, right?
1:03:40
Like, to a normal layperson, that is such a complex problem that it's become unknowable, no matter how well they use and know the technology anyway.
1:03:54
Well, we could get off on that one for a while, I'm sure.
1:03:57
Talking for hours.
1:03:59
Oh no, I mean conspiracy.
1:04:00
I'm fascinated by conspiracy theories, and humans have always loved them.
1:04:04
It is actually something in our nature, and it's healthy in the sense that we ask questions and often question power.
1:04:10
Conspiracy theories are often a pathway to holding the power to account.
1:04:13
And then they also are often quite dangerous and terrible.
1:04:19
So we've talked a lot about a lot of bad things.
1:04:22
You know, I guess I kind of look back at it, and we talked about, like, the negative impacts of this mediated society.
1:04:29
You know, is there hope for us?
1:04:31
Are we kind of done?
1:04:34
Have we already crossed the point of no return?
1:04:36
How do we correct this?
1:04:37
So you talked about wayfinding.
1:04:39
Help us.
1:04:40
Where do we go from here?
1:04:42
Well, because I'm trained as a historian and I'm a tech critic...
1:04:46
I don't have to offer solutions.
1:04:47
No, I kid.
1:04:47
I've just thought a lot about this.
1:04:49
I'm very hesitant.
1:04:51
There's no one-size-fits-all solution to a lot of this.
1:04:53
There are things we can do and are doing on the margins, like I said, things to protect kids in terms of some of these platforms, things like that.
1:04:59
But overall, I do have a lot of hope for us, because the thing about human nature is that it is so stubborn and persistent.
1:05:08
And the more you read about people in the past, the more you realize we tend to figure it out.
1:05:15
And I think the conceit of much of Silicon Valley is that they think they have figured things out.
1:05:23
And that's where human nature is this wonderful curb, from mythology onward, reminding anyone who thinks they can build the Tower of Babel or fly too close to the sun.
1:05:31
Well, it's not all a technical problem.
1:05:34
It's a human question you're not answering before you try to solve the technical challenge.
1:05:38
So what I see, and this is actually quite heartening in the debates about AI and some of the ways we will interact and work and play in the future with AI, is: OK.
1:05:49
Well, so what is it that's unique about humans that we shouldn't replicate or can't replicate?
1:05:53
What is it about us that we really value?
1:05:56
And that's the first order question, right?
1:05:58
That's where you start.
1:05:59
Then how do you build technologies that respect those things, truly respect those things?
1:06:04
Because I think what a lot of people are attending to now is the reason they're a little anxious about their own use of technology: what technology might be doing to them or to their kids or to society.
1:06:13
That's a really useful feeling that they should hold onto and say, OK, why do I feel this way in my daily life?
1:06:19
How do I do this?
1:06:21
How do I live without that feeling of anxiety or that need to constantly pick up the phone?
1:06:25
And they have to practice that behavior.
1:06:26
And that's where human nature truly shines.
1:06:29
We can reorder ourselves far better than any algorithm or app can.
1:06:33
We know how to do it, but we have to first recognize the challenge and then figure out solutions that work for us and our communities.
1:06:40
And they're not going to look the same for everyone.
1:06:43
And that's, I think, where experts tend to want to err on the side of: here's my bullet-point list of all the ways we will improve humanity.
1:06:49
I mean, I could make a list, but it's not going to work for you; it'll work for me and maybe for, you know, a couple other people. But it's our obligation.
1:06:57
That's your responsibility: to make that list for yourself and your loved ones, and figure out, you know, how do I raise kids?
1:07:04
How do I be a good member of my community with those values in mind?
1:07:10
So then the conclusion is that it's not going to be technology that will solve these problems; it will be the human squishiness that solves them.
1:07:21
That is both terrifying (you're the problem; I am the solution) and oddly optimistic at the same time.
1:07:27
And I, I asked this question of all our guests.
1:07:29
So let me ask you this: you've spent a lot of time looking at mediation and its impacts on our brains, our society, and historical consequences.
1:07:41
What kind of guilty pleasures do you have?
1:07:43
Is there anything where you kind of go, look, based on everything I know, I shouldn't be doing this, but I'm still going to do it anyway?
1:07:51
So the thing I really struggle with, and I try to do this experiment every day because it is a problem for me, is picking up my phone during interstitial moments of time: when I'm waiting for the bus, or I'm waiting for a colleague to come join me at a meeting, or I'm waiting to meet a friend at a restaurant.
1:08:10
It's there.
1:08:11
There are people I could contact, efficient things I could do, tasks I could check off my to-do list: e-mail, check; text messages, check.
1:08:17
I'm not even on social media.
1:08:19
So I don't even have the appeal of social media.
1:08:21
This is just the phone as a basic communication tool.
1:08:25
I had to stop doing that because I was noticing that I wasn't noticing things around me.
1:08:31
And that was important when a friend of mine who shall remain nameless fell on her way to our table when meeting me at a restaurant.
1:08:39
And I was like, on my phone.
1:08:41
Face-planted, embarrassingly.
1:08:43
Thank God she wasn't injured, but I didn't see it.
1:08:45
Like, I should have rushed to her aid and helped her, you know, not be as embarrassed; instead, some waiter did, everyone saw, and she felt mortified in that moment.
1:08:57
I was like, I have got to look up.
1:08:59
I have to have look up experiences as often as possible.
1:09:02
And so it is really difficult for me to do that.
1:09:05
I still find myself reaching for my phone all the time.
1:09:08
I have to practice not doing that.
1:09:10
And that for me has made me more patient, more aware.
1:09:14
If I have a commute on the bus or something, or a train ride or a flight, I bring a book now, a paper book, because there's no distraction.
1:09:22
And I love the feel of the paper also because I'm, you know, old, but there's no distraction there, right?
1:09:29
I read the book.
1:09:30
The book doesn't read me back.
1:09:31
Like if it's an electronic version of something, it's not tracking my eye movements.
1:09:34
It's just a book and I can close it and it's mine and I can take notes in it that no one else will read but me.
1:09:39
So I have to choose that, though, because my guilty pleasure is really to pick up the phone and see what's going on in my life and in the world.
1:09:47
So, as a human, it turns out you have all the same human foibles that the rest of us humans have when it comes to technology.
1:09:55
You know, I just can't say enough about your book.
1:09:57
So, you know, The Extinction of Experience... you can find it on Amazon.
1:10:01
That's where I got mine.
1:10:03
There are many other places to buy it.
1:10:04
It's a fantastic read.
1:10:06
It's interesting, it's fun.
1:10:08
Christine, thank you so much for joining us today.
1:10:10
This has been a real delight.
1:10:12
Thank you.
1:10:12
I really enjoyed the conversation.
1:10:13
Thanks for having me on.
1:10:16
Thank you for listening to this episode of the Privacy Insider Podcast.
1:10:20
You can find a full transcript of this episode and any show notes at osano.com.
1:10:26
That's www.osano.com.
1:10:30
And while you're there, get access to an excerpt of my book, The Privacy Insider.
1:10:35
How to Embrace Data Privacy and Join the Next Wave of Trusted Brands, which is now available on Amazon for purchase.
1:10:44
Until next month, take care and remember: data privacy is a fundamental human right, y'all.
Meet the host
Arlo Gilbert is the host of The Privacy Insider Podcast, CEO and cofounder of Osano, and author of The Privacy Insider Book. An Austin, Texas, native, he has been building software companies for more than twenty-five years in categories including telecom, payments, procurement, and compliance.
