Making Sense with Sam Harris
#218 — Welcome to the Cult Factory
Sam Harris, Tristan Harris
Sep 24, 2020
Episode Transcript
0:06
Welcome to the Making Sense podcast. This is Sam Harris. Just a note to say that if you're hearing this, you are not currently on our subscriber feed and will only be hearing partial episodes of the podcast. If you'd like access to full episodes, you'll need to subscribe at samharris.org. There you'll find our private RSS feed to add to your favorite podcast player, along with other subscriber-only content.
0:30
And as always, I never want money to be the reason why someone can't listen to the podcast. So if you can't afford a subscription, there's an option at samharris.org to request a free account, and we grant a hundred percent of those requests, no questions asked.
0:47
Welcome to the Making Sense podcast. This is Sam Harris.
0:52
Okay, very brief housekeeping today.
0:55
Just a couple of announcements. First, I will be doing another Zoom call for subscribers, and that will be on October 7th. I'm not sure if that's going to be an open-ended Q&A or whether the questions will be focused on a theme; I'll decide that in the next few days. But anyway, the last one was fun, and hopefully the fun will continue. So I will see you on October 7th, and you should be on my mailing list
1:25
if you want those details. Also, there are a few exciting changes happening over on the Waking Up side of things, so pay attention over there if you're an app user.
1:37
And I think that's it. Okay.
1:42
Well, today I'm speaking with Tristan Harris. Tristan has been on the podcast before, and he is one of the central figures in a new documentary which is available on Netflix now. That film is The Social Dilemma, which discusses the growing problem of social media and the fracturing of society, which is our theme today. So, as you'll hear, I highly
2:12
recommend you watch this film, but I think you'll also get a lot from this conversation. I mean, if you're looking out at the world and wondering why things seem so crazy out there, social media is very likely the reason, or it's the reason that is aggregating so many other reasons. It's the reason why we can't converge on a shared understanding of what's happening. So much of the time we can't agree about
2:42
whether specific events attest to an epidemic of racism in our society, or whether these events are caused by some other derangement in our thinking, or just bad incentives, or bad luck. We can't agree about what's actually happening. And amazingly, we are about to hold a presidential election that it seems our democracy might not even survive.
3:12
Really, it seems valid to worry whether we might be tipped into chaos by merely holding a presidential election. It's fairly amazing that we are in this spot, and social media is largely the reason. It's not entirely the reason; a lot of this falls on Trump, some of it falls on the far left. But the fact that we can't stay
3:42
sane as a society right now, that is largely due to the fact that we are simply drowning in misinformation.
3:53
Anyway, that is the topic of today's conversation, and I was very happy to get Tristan back on the podcast. Apologies for the uneven sound: pre-COVID, we were bringing everyone into studios where they could be professionally recorded; now we're shipping people Zoom devices and microphones, but occasionally the technology fails and we have to rely
4:23
on the Skype signal. So what you're hearing today is Skype. It's actually pretty good for Skype, but apologies if any of the audio sounds subpar. And now I bring you Tristan Harris.
4:42
I'm here with Tristan Harris. Tristan, it's great to get you back on the podcast.
4:47
It's really good to be back, Sam. It's been a while since the first time I was on here.
4:49
Yeah. We will cover similar ground, but a lot has happened since we last spoke, and to my eye everything's gotten worse. So there's more damage to analyze and try to prevent in the future. But before we jump in, remind people who you are and how you come at these
5:12
issues. What's your brief bio relevant to this conversation?
Yeah, well, just
5:18
to say briefly, I guess one of the reasons why we're talking now, and most relevant to my recent biography, is the new Netflix documentary that just came out called The Social Dilemma, in which all these technology insiders are speaking about the Frankenstein that they've created; we'll get into that later. Prior to that, I was a Google design ethicist, coming in through an acquisition of a technology company that I'd started called Apture, which Google acquired,
5:42
and after being at the company for a little while I migrated into a role of thinking about how do you ethically steer two billion people's attention when you hold the collective human psyche in your hands. And prior to that, as is also discussed in the film, I was at Stanford, where I studied computer science and human-computer interaction, but specifically at a lab called the Persuasive Technology Lab, which I'm sure we'll get into, which relates to a sort of lifelong view of how the human mind is vulnerable to psychological influence. I've had a fascination with
6:12
those topics, from cults to sleight-of-hand magic to mentalism and heroes like Derren Brown, who's a mutual friend of ours, and how that plays into the things that we're seeing with technology.
6:22
Yes, I just want to reiterate that this film, The Social Dilemma, is on Netflix now, and yeah, that's the proximate cause of this conversation. And it really is great; it covers the issue in a compelling way. So I highly recommend people go see it, and they don't have to go anywhere, obviously; just open
6:41
Netflix. And there's no irony there. I would count Netflix as, I'm sure, an offender in some way, but their business model really is distinct from much of what we're going to talk about. I mean, they're clearly gaming people's attention, because they want to reduce churn and they want people on the platform and deriving as much value from the platform as possible. But there is something different going on over there with respect to not being
7:12
part of the ad economy and the attention economy in quite the same way. That's a line we can draw later on. But is there a bright line between proper subscription services like that and what we're going to talk about?
7:26
Yeah. I think the core question we're here to talk about is in which ways and where technology's incentives are aligned with the public good, and I think the problem that brings us here today is where technology's incentives are misaligned with the public good through the business model
7:41
of advertising and free models like user-generated content. Clearly, because we live in a finite attention economy where there's only so much human attention, we are managing a commons, a collective environment. And Netflix, like any other actor, including politicians, including conferences, including this podcast or my podcast, we're all competing for the same finite resource. And so there's a difference, I think, in how different business models engage in an attention economy. But in a
8:12
model in which the cost of producing things that are going to reach exponential numbers of people is low, exponential broadcast in the case of Netflix, but also in the case of these other companies, there's a difference when there's a sense of ethics or responsibility or privacy or child controls that we add into that equation. And I'm sure we'll get more into those topics.
8:31
Right. Okay. So let's take it from the top. What's wrong with social media at this point, if you could boil it down to the
8:42
elevator-pitch answer? What is the problem that we are going to unspool over the next hour or so?
8:51
Well, it's funny, because the film actually opens with that prompt, to the blank stares of many of the technology insiders, including myself, because I think it's so hard to define exactly what this problem is. There's clearly a problem of incentives, but beneath that there's a problem of what those incentives are doing and where the exact harms show up. And the way that we frame it in the film, and in a big presentation we gave at the
9:12
SFJAZZ Center back in April 2019 to a bunch of the top technologists and people in the industry, was to say that while we've all been looking out for the moment when AI would overwhelm human strengths, when we'd get the Singularity, when AI would take our jobs, when it would be smarter than humans, we missed this much, much earlier point when technology didn't overwhelm human strengths but undermined human weaknesses. And you can actually frame the cacophony of grievances and scandals and problems that we've seen in the tech industry,
9:41
from distraction to addiction to polarization to bullying to harassment to the breakdown of truth, all in terms of progressively hacking more and more human vulnerabilities and weaknesses. So if we take it from the top: our brain's short-term memory system can hold seven plus or minus two things. When technology starts to overwhelm our short-term and working memory, we feel that as a problem called distraction. Oh my gosh, I can't remember what I was doing. I came here to open an email, I came here to go to Facebook to
10:11
look something up, and now I got sucked down into something else. That's a problem of overwhelming the human limit and weakness of just our working memory. When it overwhelms our dopamine systems and our reward systems, we feel that as a problem called addiction. When it taps into and exploits our reliance on stopping cues, that at some point I will stop talking and that's a cue for you to keep going; when technology doesn't stop talking and it just gives you the infinite, bottomless bowl, we feel that as a problem called addiction or addictive use. When technology exploits
10:41
our need for social approval, giving us more and more social approval, we feel that as a problem called teen depression, because suddenly children are dosed with social approval every few minutes and are hungry for more likes and comparing themselves in terms of the currency of likes. And when technology hacks the limits of our heuristics for determining what is true, for example, that Twitter profile who just commented on your tweet five seconds ago: that photo looked pretty real, they've got a bio that seems pretty real, they've got 10,000 followers. We only have a few cues that we can use to discern what is real.
11:12
And bots and deepfakes, and I'm sure we'll get into GPT-3, actually overwhelm that human weakness, so we don't even know what's true. So I think the main thing that we really want people to get is that, through a series of misaligned incentives, which we'll further get into, technology has overwhelmed and undermined human weaknesses, and many of the problems that we're seeing as separate are actually the same. And just one more thing on this analogy. Collectively, this digital fallout of addiction, teen depression and suicides, polarization, breakdown of
11:41
truth: we think of this as a collective digital fallout, or a kind of climate change of culture. Much like the oil-extractive economy, we have been living in an extractive race for attention. There's only so much, and when it starts running out, we have to start fracking your attention, splitting your attention into multiple streams. I want you watching an iPad and the phone and the television at the same time, because that lets me triple the size of the attention economy. But that extractive race for attention creates this global climate change of culture, and much like climate change, it
12:11
happens slowly, it happens gradually, it happens chronically. It's not this sudden immediate threat; it's this slow erosion of the social fabric. And collectively, in that presentation, we called this human downgrading. You can call it whatever you want. The point is, if you think back to the climate change movement, before there was a cohesive understanding of emissions and their link to climate change, we had some people working on polar bears, and people working on the coral reefs, and some people working on species loss in the Amazon, and it wasn't until we had
12:41
an encompassing view of how all these problems get worse that we started to get change. And so we're really hoping that this film can act as a kind of catalyst for a global response to this really destructive thing that's happened to society.
12:54
Hmm. Okay. So let me play devil's advocate for a moment, using some of the elements you've already put into play, because, obviously, you and I are going to impressively agree throughout this conversation on the nature of the problem. But channeling a skeptic
13:10
here: it's actually not that hard for me to empathize with a skeptic, because, as you point out, it really takes a fair amount of work to pry the scales from people's eyes on this point. And the nature of the problem, though it really is everywhere to be seen, is surprisingly elusive, right? So if you reference something like a spike in teen depression and self-harm and suicide, there's no one
13:39
who's going to pretend not to care about that, and then it really is just the question of, you know, what's the causality here? Is it really a matter of exposure to social media that is driving it? I don't think people are especially skeptical of that, and that's a discrete problem that I think most people would easily understand and be concerned about. But the more general problem, for all of us, is harder to keep in view. So when you talk about things, again, these are things you've already
14:09
conceded in a way: attention has been a finite resource always, and everyone has always been competing for it. So if you're going to publish a book, you are part of this race for people's attention. If you were going to release something on the radio or television, it was always a matter of trying to grab people's attention. And, as you say, it's what we're trying to do right now with this podcast. So when considered through that lens, it's hard to see what is
14:39
fundamentally new here, right? So yes, this is zero-sum, and then the question is whether it's good content or not. I think people want to say, right, that this is just a matter of interfacing in some way with human desire and human curiosity, and you're either doing that successfully or not. And what's so bad about really succeeding, you know, just fundamentally succeeding, in a way that, yeah, you can call it addiction, but really it's just what people find
15:09
valuable. It's what people want to do. They want to grant their attention to the next video that is absolutely enthralling. But how is that different from idly leafing through the pages of a hard copy of Vanity Fair in the year 1987 and feeling that you really want to read the next article rather than work, or do whatever else you thought you were going to do with your afternoon? So there's that. And then there's this sense that the fact that advertising
15:39
is involved, and really at the foundation of everything we're going to talk about: what's so bad about that? Really it's a story of ads just getting better. You know, I don't have to see ads for Tampax anymore, right? I go online and I see ads for things that I probably want, or nearly want, because I abandoned them in my Zappos shopping cart, right? So what's wrong with that? I think most people
16:08
are stuck in that place, and we have to do a lot of work to bring them into the place in the conversation where the emergency becomes salient. And so let's start
16:19
there. Gosh, there's so much good stuff to unpack here. So, on the attention economy: obviously we've always had it. We've had television competing for attention, and radio, and we've had evolutions of the attention economy before: competition between books, competition between newspapers, competition between television, to more engaging television, to more channels of television. So,
16:37
in many ways this isn't new. But I think what we really need to look at is what was mediating where that attention went, and the big word there is smartphones. We check our smartphones, you know, a hundred times or something like that per day. They are intimately woven into the fabric of our daily lives, and ever more so, because with this addiction, or just this addictive checking that we have, in any moment of anxiety we turn to our phone to look at it. So it's intimately woven into where the attention's starting place will come from. It's also taken over
17:07
our fundamental infrastructure for our basic verbs. If I want to talk to you or talk to someone else, my phone has become the primary vehicle for many, many verbs in my life, whether it's ordering food or speaking to someone or figuring out where to go on a map. We are increasingly reliant on the central node of our smartphone to be a router for where all of our attention goes. So that's the first part, this intimately woven nature, and the fact that it's part of the social infrastructure
17:37
on which we rely. We can't avoid it, and part of what makes technology today inhumane is that we're reliant on infrastructure that's not safe, or that's contaminated, for many reasons that we'll get into later. A second reason this is different is the degree of asymmetry between, let's say, that newspaper editor or journalist who was writing that enticing article that gets you to turn to the next page, versus the level of asymmetry when you watch a YouTube video and you think, yeah, this time I'm just going to watch one video and then I've got to go back to work, and you wake up from a trance two hours later and you
18:07
say, man, what happened to me? I should have had more self-control. What that misses is that there's literally Google's billions of dollars of supercomputing infrastructure on the other side of that slab of glass in your hand, pointed at your brain, doing predictive analytics on what would be the perfect next video to keep you here. And the same is true on Facebook. You think, okay, I've sort of been scrolling through this thing for a while, but I'm just going to swipe up one more time and then I'm done. Each time you swipe up with your finger, you're activating a Twitter or a
18:37
Facebook or a TikTok supercomputer that's doing predictive analytics, which has billions of data points on exactly the thing that will keep you here. And I think it's important to expand this metaphor, in a way that you've talked about on your show before, about the increasing computational power of AI. When you think about a supercomputer pointed at your brain, trying to figure out what's the perfect next thing to show you, that's on one side of the screen. On the other side of the screen is my prefrontal cortex, which evolved millions of years ago and is doing the best job it can to do goal
19:07
retention and memory and sort of staying on task, self-discipline, et cetera. So who's going to win in that battle? Well, a good metaphor for this is: let's say you or I were to play Garry Kasparov at chess. Why would you or I lose? It's because there I am at the chessboard, and I'm thinking, okay, if I do this, he'll do this, but if I do this, he'll do this, and I'm playing out a few moves ahead on the chessboard. But when Garry looks at that same chessboard, he's playing out a million more moves ahead than I can, right? And that's why Garry is going to win and beat
19:37
you and I every single time. But when Garry the human is playing chess against the best supercomputer in the world, no matter how many million moves ahead Garry can see, the supercomputer can see billions of moves ahead. And when it beats Garry, who is the best human chess player of all time, it's beaten, like, the human brain at chess, because that was kind of the best one that we had. And so, when you look at the degree of asymmetry that we now have, when you're sitting there innocently saying, okay, I'm just going to watch one video and then I'm out,
20:07
we have an exponential degree of asymmetry, and they know us and our weaknesses better than we know ourselves, to borrow also from a mutual friend, Yuval
20:15
Harari. So I guess I still think the nature of the problem will seem debatable, even at this point, because, again, you're talking about successfully gaming attention, making various forms of content more captivating, you know, stickier. And if people are losing time
20:37
that perhaps they didn't know they were going to give over to their devices, well, they were doing that with their televisions anyway. I mean, long before we had smartphones, the statistics on watching television were appalling. I forget what they were; it's something like the average television was on seven hours a day in the home. So the picture was of people in a kind of Aldous Huxley-like dystopia, just plugged in to the boob tube.
21:08
And being fed bad commercials, and therefore being monetized, in a way that strikes people as not fundamentally different from what's happening now. I guess there was less to choose from; there were maybe three different types of laundry detergent, and it was not a matter of really fine-grained manipulation of people's behavior. But, from the perspective of what seems optimal, it
21:37
still had a character of propagandizing people with certain messages that seem less than optimal. I'm sure you could talk about teens, or just people in general, having body dysmorphia around ideal presentations of human beauty that were unrealistic, and whether Photoshop was involved at that point or not, it was just good lighting and good makeup and, you know, selection effects that make people
22:07
feel obliged to aspire to irrational standards of beauty. All of these problems that we tend to reference in a conversation like this seemed present. I think the thing that strikes me as fundamentally new, and this is brought out in the film by several people, relates to the issue of misinformation and the siloing of information, which really does strike me as genuinely new. And
22:37
there are a few analogies here that I find especially arresting. The one thing that Jaron Lanier says in the film, and he said on this podcast a year or so ago, which I think frames it really well, is: just imagine if Wikipedia would present you with information in a way that was completely dependent on your search history, all the data on you that had been collected, showing your biases and your preferences and the
23:07
ways in which your attention can be gamed, so that when each of us went to Wikipedia, not only was there no guarantee that we would be seeing precisely the same facts; rather, there was a guarantee that we wouldn't be. Right? We're in this sort of shattered epistemology now, and we built this machine. The very machinery we're using to deliver information, really what is almost the only source of information for most people now,
23:37
is a machine that is designed to partially inform people, misinform people, spread conspiracy theories and lies faster than facts, spread outrage faster than disinterested, nuanced analysis of stories. So it's like we have designed an apparatus whose purpose is to fragment our worldview and to make it impossible for us to fuse our cognitive
24:07
horizons, so that if you and I start out in a different place, we can never converge in the middle of this psychological experiment. And that's the thing that strikes me, for which there is no analogue in all previous moments of culture. Yeah, that's
24:22
a hundred percent right. And if we jump to the chase about what is most concerning, it is the breakdown of a shared reality, and the breakdown, therefore, of our capacity to have conversations. And you've said it: if we don't have conversation, we have violence. And when you
24:37
shatter the epistemic basis of how we know what we know, each of us has been living literally in a different reality, a different Truman Show, as Roger McNamee would say, for the last 10 years. And we have to keep in mind, we're about 10 years into this radicalization and polarization process, where each of us has been fed a really more extreme view of reality for quite a long time. What I really want people to do isn't just to ask, is this technology addictive, or these small questions. It's really to rewind the tape and to ask, how has my mind been fundamentally warped?
25:07
And so, just to go back to the point you made a second ago, you know, so what, YouTube is giving us information. Well, first, on that chess match I mentioned, of, you know, are we going to win, are they going to win: 70 percent of the billion hours a day that people spend on YouTube is actually driven by the recommendation system, by what the recommendation system is choosing for us. Just imagine a TV channel where you're not choosing 70 percent of the time, and the question becomes, as you said, well, what is the default programming of that channel? Is it, you know, Walter Cronkite and some kind of semi-reliable
25:37
communal sense-making, as our friend Eric would say, or is it actually giving us more and more extreme views of reality? So, three examples of this. Several years ago, if you were a teenager and looked at a dieting video on YouTube, several of the videos on the right-hand side would be thinspo anorexia videos, because those things were better at keeping people's attention. If you looked at 9/11 videos, it would give you Alex Jones InfoWars 9/11 conspiracy theories. YouTube recommended Alex Jones conspiracy theories 15 billion times
26:08
in the right-hand sidebar, which is more than the combined traffic of the New York Times, Fox News, MSNBC, the Guardian, et cetera, combined. So the scale of what has actually transpired here is so enormous that I think it's really hard for people to get their heads around, because, also, each of us only sees our own Truman Show. So given that I'm saying these stats, you might say, well, I've never seen a dieting video or an anorexia video, or someone else might say, I've never seen those conspiracy theories. That's because it fed you some different rabbit hole. You know, Guillaume Chaslot, who's the ex-YouTube
26:37
engineer in the film, talks about, in an interview we did with him on our podcast, how the algorithm found out that he liked seeing these videos of plane landings, and it's this weird addictive corner of YouTube where people like to see plane landings. Or the example of flat-Earth conspiracy theories, which were recommended hundreds of millions of times. And because we've been doing this work for such a long time and I've talked to so many people, I hear from teachers and parents who say, suddenly all these kids are coming into my classroom and they're saying the Holocaust didn't happen, or they're saying the Earth is flat. And it's like, where are they getting
27:07
these ideas? Especially in a time of coronavirus, where parents are forced to sit their kids in front of the new television, the new digital pacifier, which is really just YouTube; they're basically at the whims of whatever that automated system is showing them. And of course, the reason, economically, why this happened is because the only way that you can broadcast to three billion people in every language is to not pay any human editors, right? You take out all of those expensive people who sat at the New York Times or Washington Post editorial department, or the PBS editorial
27:37
department, saying what's good for kids in terms of Saturday morning or Sesame Street, and you say, let's have a machine decide what's good for people. And the machine cannot know the difference between what we'll watch versus what we actually really want. The easiest example there is, if I'm driving down a freeway, on the 5 in LA, and my eyes go off to the side and I see a car crash, and everybody's eyes go to the side, they look at the car crash, then, according to YouTube, the world must really want car crashes. And the next thing you know, there's a self-reinforcing feedback loop of their feeding us
28:07
more car crashes, and we keep looking at the car crashes, so they feed us more and more. That's exactly what's happened over the last 10 years with conspiracy theories. One of the best predictors of whether you will believe in a new conspiracy is whether you already believe in one, and YouTube and Facebook have never made it easier to sort of open the doorways into a more paranoid style of thinking. And just one last thing before handing it back: this is not to vilify all conspiracy thinking. Some conspiracies are real; some notions of, you know, what Epstein did with
28:37
running a child sex ring, that's all real. So we need a more nuanced way to see this. But when you're put into a surround-sound rabbit hole, where everything is a conspiracy theory, everything that's ever happened in the last 50 years is part of some master plan, and there's actually a secret cabal that controls everything, and Bill Gates and 5G and coronavirus, this is where the thing goes off the rails. And I think this really became apparent to people once they were stuck at home, where you're not actually going out into the world, you're not talking to as many neighbors, and so the primary
29:07
meaning-making and sense-making systems that we are using to navigate reality are these social media products. And I think that has exacerbated the kind of craziness we've seen over the last six
29:17
months. Yeah. Well, you're really talking about the formation of cults, and I know you've thought a lot about cults, and what we have here is a kind of cult factory, or, you know, a cult-industrial complex that we have built inadvertently. And again, the inadvertence
29:37
is really interesting, because it relates directly to the business model. We have decided that the only way to pay for the internet, or the primary way to pay for the internet, is with ads, and when we get into the mechanics of this, that is the thing that has dictated everything else we're talking about. And it really is incredible to think about, because we have created a system where, indisputably, some of the smartest people on earth,
30:07
and to me this is really the point, some of our brightest minds, are using the most powerful technology we've ever built not to cure cancer or mitigate climate change or respond to a very real and pressing problem like an emerging pandemic. They're spending their time trying to get better at gaming human attention more effectively, to sell random products and even random conspiracy theories.
30:37
Right, the in fact they're doing all of this not merely as a in a mode of failing to address other real problems, like in mitigating climate change or responding to a pandemic the consequences of what they're doing is making it harder to respond to those real problems. And we have a climate change and pandemics are now impossible to talk about as a result of what's happening on social media. And this is this is a direct result of how social media is being.
31:07
paid for, or how it has decided to make money. And, you know, as you say, it's making it impossible for us to understand one another, because people are not seeing the same things. Maybe on a daily basis, I have this experience of looking at people out in the world, you know, on my own social media feed, or just reading news accounts of what somebody is into. Say somebody is into QAnon, right? And this
31:37
cult is not too strong a word, this cult of indeterminate size, but massively well subscribed at this point. So, people who believe that not only is child sexual abuse a real problem out there in the world, as more or less everyone believes, but who believe that there are uncountable numbers of high-profile, well-connected people, you know, from the Clintons on down, who are part of a cannibalistic cult of child sexual slavery, you know, where they extract the bodily essences of
32:07
children so as to prolong their lives, right? I mean, it's as crazy as crazy gets. And so when I, as someone who's outside this information stream, view this behavior, people look frankly insane to me, right? And some of these people have to be crazy, right? This has to be acting like a bug light for crazy people, at least of some sort. But most of these people are presumably normal people who are just drinking from a fire hose
32:37
of misinformation, and just different information from the information I'm seeing. And so their behavior is actually inexplicable to me. And there are so many versions of this now, I don't think it's too much to say that we're driving ourselves crazy. We are creating a culture that is not compatible with basic sanity. I mean, we're amplifying incommensurable delusions everywhere all at once, and we've created a system
33:07
where true information, real facts, and valid, you know, skeptical analysis of what's going on aren't up to the task of damping down the spread of lies. And maybe there's some other variable here that accounts for it, but it's amazing to me how much of this is born of simply the choice of a business model.
33:33
Well, I think this is to me the most important aspect of
33:37
What the film hopefully will do. Right now we're living in the shattered prism of a shared reality, where we're each trapped in a separate shard. And like you said, when you look over at someone else and say, how can they believe those crazy things? How can they be so stupid? Aren't they seeing the same information that I'm seeing? And the answer is they're not seeing the same information that you're seeing. They've been living, literally, in a completely different field of information than you have. And that's actually one of the other, I think, psychological... not so much vulnerabilities, but we did not
34:07
evolve to assume that every person you see physically around you would, inside of their own mind, actually be living in a completely different virtual reality than the one that you live in. So nothing from an evolutionary perspective would enable us to have empathy with the fact that each of us has our own little virtual reality in our own minds, and that each of them could be, not just a little bit, but so dramatically different. Because another aspect you mentioned when you brought up cults at the beginning of what you said was the power of groupthink and the power of
34:37
An echo chamber where you know many of what's going many of the things that are going on in conspiracy theory groups on Facebook. I mean the plan demick video spread actually through a massive network of Q and on groups, there's actually been a capturing of the new spirituality and sort of in psychedelics type Community into the Q&A on world interestingly
34:55
Which is... that's just what these people need: acid. Yeah,
34:59
That doesn't sound like a good addition to an already mad world. But I think if we zoom out, the question is, who's in control of
35:07
history right now? Are human beings authoring our own choices, or has the fact that we've ceded the information that feeds into three billion people's brains meant that we have actually ceded control to machines? Because the machines control the information that all three billion of us are getting. It's become the primary way that we make sense of the world. And, to jump ahead and mind-read some of the skeptics out there, some people are saying, well, hold on a second, weren't there filter bubbles and narrow partisan echo chambers with Fox News and MSNBC, and people sticking with those channels?
35:37
Yes, that's true. But I would ask people to question where the editorial departments of those television channels are getting their news. Well, they're just living on Twitter, and Twitter's algorithms are recommending, again, that same partisan echo chamber back to you. You had Renée DiResta on your podcast, who's a dear friend and amazing colleague, talking about how, you know, radicalization spreads on social media. And she worked back in the State Department in 2015, where they noticed that if you followed one ISIS terrorist on Twitter, the suggested
36:07
user system would say, oh, here are suggested people you might want to follow, and it gives you ten more suggested ISIS terrorists to follow. Likewise, if you were a new mom, as she was several years ago, and you joined some new mom groups, specifically groups for, like, making your own baby food, kind of a do-it-yourself organic moms movement, well, Facebook's algorithm said, hold on, what are other suggested groups we might show you that tend to correlate with users in this mom group, that keep people really engaged? And one of the top recommendations was the anti-vaccine conspiracy theory groups. And when you
36:37
join one of those, it says, well, those groups tend to also be in these QAnon groups and chemtrails groups and the flat-Earth groups. And so you see very quickly how these tiny little changes, as they say in the beginning of the film, the business model of just changing your beliefs and identity just one percent, you know, changing the entire world one percent, is a lot. It's like climate change, quite literally, right? Where you only have to change the temperature a tiny bit, and you change the basis of what people are believing, and it changes the rest of reality. Because, as you know from
37:07
bias, when you have a hammer, everything looks like a nail, and technology is laying a foundation of hammers that are looking for specific kinds of nails. Once you see the world through a paranoid, conspiratorial lens, you are looking for evidence that confirms that belief. And that's happening on all sides. It's really a thing that's happened to all of us. This is why my biggest hope, really, is in the global impact of the film, and this is not a marketing push, it's really a social-impact push. Like, I genuinely am concerned that there may be no other way to put Humpty Dumpty back together again
37:37
than to show the world what we have created, that we need a new shared reality about that breakdown of our shared
37:43
reality. There are many aspects to the ad model, and I think... it doesn't take much work to convince people, as we hopefully have begun to here, that the shattering of shared reality is a problem. It's at minimum a political problem, I mean, whether or not it's a social problem for you, you know, out in the world
38:07
or in your primary relationships. To see the kind of hyper-partisanship we see now, and the sheer inability to converge on an account of basic facts that could mitigate that partisanship, I think people feel that that is a kind of assault on democracy. And then you add the piece that bad actors, like the Russians or the Chinese or anyone, can decide to deliberately game that system. I mean, just the knowledge that, you know,
38:37
Russia is actively spreading, you know, Black Lives Matter information and pseudo-information, so as to heighten the anguish and polarization on that topic in America. Just the fact that we built the tools by which they can do that, and they can do it surreptitiously, right? We don't see who's seeing these ads. You don't see the 50,000 people who were targeted in a specific state for a specific reason. That is new and sinister, and I think people
39:08
can understand that. But when we're talking about the problem of sharing our information, or using our information in these ways, I think we should get clear about what's happening here, because this is a distinction several people make in the film: it's not that these platforms sell our data. They don't really sell our data. They gather the data, they analyze the data, and what they sell are more and more accurate predictions of our behavior
39:37
to advertisers, right? And as that gets more refined, you really have, as close as we've ever come, advertising being a kind of sure thing, right? Where it really, you know, it really works. And even there, I think most people
39:57
won't necessarily care about that. Because if you tell them, listen, the thing you really thought you wanted and went out and bought, you were played by the company: the company placed an ad with Facebook, and Facebook delivered it to you because you were the perfect target of that ad. I think the person can, at the end of the day, own all of that process and just subsume it in their satisfaction at having bought the thing they
40:26
now actually want, right? Like, yeah, but I wanted a new Prius, right? I mean, it was time, I needed a new car. There's some way, whether it's confabulatory or not, in which they don't necessarily feel violated. And I think people think they care about privacy, but we don't really seem to care about privacy all that much. More, we care about convenience, and we care about money. At bottom, nobody wants to pay for these things. No one wants to pay
40:56
for Facebook, they don't want to pay for Twitter, they don't want to pay for most of what happens on the internet, and they're happy to be enrolled in this psychological experiment so that they don't have to pay for anything. And the dysfunction of all of that is what we're trying to get across here. But I'm always amazed that the moment you focus on one part of it, other parts of this monstrosity begin to disappear. You know, it's very hard to keep what is wrong with this in view, every moment,
41:26
all at once. And so maybe, for the moment, let's just focus on information and privacy and the ad model, and just how we should think about it. Well, when we talk
41:37
about the advertising model, you know, people tend to think about the good-faith uses, like you're talking about, you know, a Prius or a pair of shoes. What this misses is the geopolitical, World War III-style information warfare that's happening right now. Because a line I say often is, you know, while we've been obsessed with protecting our physical borders as a country,
41:56
we left the digital border wide open. I mean, if Russia or China tried to fly a cruise missile or a bomber plane into the United States, they'd be blasted out of the sky by the Pentagon. But when they try to fly an information bomb into the United States, into our virtual infrastructure of Facebook, they're met by a white glove that says, yes, exactly which zip code and which African-American subdistrict would you like to target? And that is the core problem: we are completely unprotected when it comes to the virtual infrastructure. So the roads and
42:26
the air and the telephone lines that we use here in this country are completely protected from, you know, Russia or China. But most of the activity happening in our country now happens in a virtual, digital, online environment. You know, as Marc Andreessen says, software is eating the world, meaning software and the digital world are consuming more and more of the physical world, of the physical ways that we used to get around and the physical conversations we used to have. And that digital environment is basically the big five tech companies. It's all happening through the landscape of
42:56
YouTube, TikTok, Facebook, etc. And, you know, how does an empire fall? You use the power of an empire against itself. You know, after World War II, we had all these nukes, and the big powers couldn't fight conventional wars with each other, so they had to use subtler methods: plausible deniability, proxy wars, waging economic warfare, diplomatic warfare. But if you are Russia or Iran or Turkey, you know, and you don't want to see the U.S. in a position of global dominance, would you do, you know, a forward-facing attack on the country with all the nukes?
43:26
Obviously not. But would you take the already existing tensions of that country and turn the enemy against himself? That's what Sun Tzu would say to do, and that's what Chinese military strategy would say to do, and Facebook just makes that a trillion times easier. So, you know, if I were China, I would want extreme-right and extreme-left groups to proliferate and fight each other, and, you know, we know that this is basically happening, that there has been stoking of groups on all sides. You know, I can go into your country and create an army of bots that look indistinguishable from regular people. If I'm China, I'm running TikTok
43:56
and I can manipulate the political discourse in your country, with the fact that I have 300 million Americans, you know, on my service. It might even be bigger than that, if I'm remembering correctly. So I think, you know, the advertising model isn't just that it enables these good-faith uses. I think people have to recognize the amount of manipulative and deceptive activity that is, like you said, almost untraceable. I mean, the fact that I'm saying all this to you and the listeners out there will sound like a conspiracy theory unless you know the researchers who are tracking these things. Because if you're just looking at your own feed... I'm living in
44:26
California, I'm not actually part of a targeted group, so I don't really see these things, and anybody who is, is actually invisible to me. So again, our psychological vulnerabilities here: technology is not allowing us to empathize with the people who are closest to being harmed by these
44:41
systems. Yeah. Okay, so I think people can get the central fear here, which is that it seems at best difficult, more likely impossible, to run a healthy democracy
44:57
on bad information. I mean, if we can do it for a few years, we probably can't do it for a century. Something has to change here. We can't be feeding everyone lies or half-truths, different lies and different half-truths, all at once, 24 hours a day, year after year, and hope to have a healthy society, right? So that's a discernible piece of this problem that I think
45:26
virtually everyone will understand. And then when you add the kind of emotional valence of all these lies and half-truths, people get that there's a problem amplifying outrage, right? The fact that the thing that is most captivating to us is the feeling of in-group outrage pointed outward toward the out-group, for whom we have contempt growing into hatred.
45:56
It's the place we are so much of the time on social media, and it runs the gears of this machinery faster than any other emotion. And, you know, if that changes tomorrow, if it turns out that, you know, sheer terror is better than outrage, well, then the algorithm will find that, and it will be amplifying terror. But the thing that you can be sure of, and it's contained in the very word, is that a dispassionate take
46:26
on current events is never going to be the thing that gets this machinery running hottest. Right. And so I think people can get that. But when we talk about possible remedies for this problem, then I really think it is hard to see a path forward. So if you'd like to continue listening to this podcast, you'll need to subscribe at Sam Harris dot-org. You'll get access
46:56
to all full-length episodes of the Making Sense podcast, and to other subscriber-only content, including bonus episodes and AMAs, and the conversations I've been having on the Waking Up app. The Making Sense podcast is ad-free and relies entirely on listener support, and you can subscribe now at Sam Harris dot-org.