The Diary Of A CEO with Steven Bartlett
Yuval Noah Harari: The Urgent Warning They Hope You Ignore, More War Is Coming, Yuval's Chilling Future Predictions!

Yuval Noah Harari, Steven Bartlett
Jan 11, 2024
Episode Transcript
0:00
We are now in a new era of wars, and unless we re-establish order fast, then we are doomed.
0:07
Yuval Noah Harari, one of the brightest minds on planet Earth.
0:10
Historian, a best-selling author
0:12
of some of the most influential non-fiction books in the world today.
0:16
I think we are very near the end of our species, because people often spend so much effort trying to gain something without understanding the consequences. For example, we will get to a life where you can live
0:30
indefinitely, but realizing that you have a chance to live forever, yet if there is an accident you die, the people who will be in that situation will be at a level of anxiety and terror unlike anything that we know. Then you have artificial intelligence, and the world is not ready for it. It's the first technology in history that can make decisions by itself and take power away from us, to hack human beings, manipulate our behavior, and make all these decisions
0:59
for us or about us: whether to give you a loan, whether to give you a mortgage, dating apps shaping your romantic life. But the real problem is that increasingly the humans at the top could be puppets, when the most consequential decisions are made by algorithms: global financial decisions, wars. This is extremely dangerous, but it's not inevitable. Humans can change
1:22
it. But with what's to come, are you optimistic about the future?
1:26
I'm very worried about two things. First of
1:29
all...
1:32
I find it incredibly fascinating that when we look at the back end of Spotify and Apple and our audio channels, the majority of people that watch this podcast haven't yet hit the follow button or the subscribe button, wherever you're listening to this. I would like to make a deal with you: if you could do me a huge favor and hit that subscribe button, I will work tirelessly from now until forever to make the show better and better and better and better. I can't tell you how much it helps when you hit that subscribe button. The show gets bigger, which means we can expand the production, bring on all the guests you want to see, and continue to do
2:02
this thing we love. If you could do me that small favor and hit the follow button, wherever you're listening to this, that would mean the world to me. That is the only favor I will ever ask you. Thank you so much for your
2:10
time.
2:18
Yuval, I have three of your books here. And these are three books that sent a huge tidal wave of ripples through society. With these books, and with all of the work that you're doing now, the lectures you give, the interviews you give, what is your mission? If I were able to summarize what your collective mission is with your work, what is
2:41
that? It's to clarify and
2:43
focus the public conversation, the global conversation, to help people focus on the most important challenges that are facing humankind, and also to bring at least a little bit of clarity to the collective and to the individual mind. One of my main messages in all the books is that our minds are like factories that constantly produce stories and fictions that then
3:13
come between us and the world, and we often spend our lives interacting with fictions that we or that other people created, completely losing touch with reality. My job, and I think the job of historians more generally, is to show us a way out.
3:36
Inherent in much of your work is
3:40
what feels like a warning. Hmm.
3:43
And I've watched hundreds of videos that you've produced or interviews you've done all around the world, and it feels like you're trying to warn us about something, multiple things. If my estimation there is correct, what is the
3:59
warning? Much of what we take to be real is fictions. And the reason that fictions are so central in human history is because we control
4:13
the planet, rather than the chimpanzees or the elephants or any of the other animals, not because of some kind of individual genius that each of us has, but because we can cooperate much better than any other animal. We can cooperate in much larger numbers and also much more flexibly. And the reason we can do that is because we can create and believe in fictional stories, because every large-scale human cooperation,
4:43
whether religion or nations or corporations, is based on mythologies, on fictions. Again, I'm not just talking about gods, this is the easy example. Money is also a fiction that we created. Corporations are fictions; they exist only in our minds. Even lawyers would tell you that corporations are legal fictions. And this is, on the one hand, such a source of immense power.
5:14
But on the other hand, again, the danger is that we completely lose touch with reality and we are manipulated by all these fictions, by all these stories. And stories are not bad; they are tools. As long as we use them to cooperate and to help each other, that's wonderful. Money is not bad. If we didn't have money, we would not have a trade network, and everybody would have, maybe with their friends and family, to produce everything
5:43
themselves, like the chimpanzees do. The fact that we can enjoy food and clothing and medicines and entertainment created by people on the other side of the world is largely because of money. But if we forget that this is a tool that we created in order to help ourselves, then instead this tool kind of enslaves us and runs our life. And, you know, I'm
6:13
now just back home in Israel, there is a terrible war being waged, and most wars in history, and also now, are about stories. They're about fictions. People think that humans fight over the same things that wolves or chimpanzees fight about, that we fight about territory, that we fight about food. It sometimes happens, but most wars in history were not really about territory or food. There is enough land
6:44
between the Jordan River and the Mediterranean to build houses and schools and hospitals for everybody, and there is certainly enough food, no shortage of food, but people have different mythologies, different stories in their minds, and they can't find a common story that they can agree about, and this is at the root of most human conflicts. Being able to tell the difference between what is a fiction in our own mind
7:13
and what is the reality, this is the crucial skill, and we are not getting better at finding this difference as time goes on. And also with new technologies, which I write about a lot, like artificial intelligence, the fantasy that AI will answer our questions, will find the truth for us, will tell us the difference between fiction and reality, this
7:43
is just another fiction. I mean, AI can do many things better than humans, but for reasons that we can discuss, I don't think that it will necessarily be better than humans at finding the truth or uncovering reality.
8:01
It strikes me that the thing that made us successful, you know, this ability to believe in fictions, and I use the word successful carefully. Yeah, powerful. Yes, powerful, to conquer the world.
8:14
The thing that made us powerful could well be the thing that makes us powerless, in the sense that our ability to believe in fictions and stories creates a society that could
8:24
potentially lead to our powerlessness. That's kind of one of the messages I'm left with when I connect the dots throughout your work and look off into the future. And even when you think about the modern problems we have, those are typically consequences of our ability to believe in stories. Yeah, it's believing fictions. And if you play that forward 100 years, maybe 200 years, you believe we will be the last of our
8:54
species, right?
8:56
I think we are very near the kind of end of our species. It doesn't necessarily mean that we will be destroyed in some huge nuclear war or something like that. It could very well mean that we will just change ourselves, using bioengineering and using AI and brain-computer interfaces. We will change ourselves to such an extent that we will become something completely different, something far
9:25
more different from present-day Homo sapiens than we today are different from chimpanzees or from Neanderthals. I mean, basically, you know,
9:36
we have a very deep connection still with all the other animals, because we are completely organic. We are organic entities. Our psychology, our social habits, they are the product of organic evolution, more specifically mammalian evolution, over tens of millions of years. So we share so much of our psychology and of our kind of social habits with chimpanzees and with other
10:06
mammals. Looking 100 years or 200 years to the future, maybe we are no longer organic or not fully organic. You could have a world dominated by cyborgs, which are entities combining organic with inorganic parts, for instance with brain-computer interfaces. You could have completely non-organic entities. So all the legacy and also all the limitations
10:36
of 4 billion years of organic evolution might be irrelevant or inapplicable. Do you think about the beings of the future? What would you make of them,
10:48
because you're saying maybe?
10:50
I don't know. I mean, we could destroy ourselves. I think there is a greater... I mean, to completely destroy every last single human in the world, it is possible given the technology that we now command, but it's very difficult.
11:07
I think there is a greater chance, and again, this is just speculation, you know, nobody really knows, but I think lots of people could suffer terribly, but it's more likely that some people will survive and then will undergo radical changes. So it's not that humanity is completely destroyed; it's just transformed into something else. Again,
11:36
just to give an example of what we were talking about: organic beings like us need to be in one place at any one time. We are now here in this room, that's it. If you kind of disconnect our hands or our feet from our body, we die, or at least we lose control of them. And this is true of all organic entities, of plants, of animals. Now with cyborgs
12:06
or with inorganic entities, this is no longer true. They could be spread over time and space. I mean, if you find a way, and people are working on finding ways, to directly connect brains with computers or brains with bionic parts, there is no essential reason that all the parts of the entity need to be in the same room at the same time.
12:31
As you said that, you know, I start thinking a little bit about Neuralink and what Elon Musk is doing,
12:36
exactly, interfacing us with computers. But then I had a secondary thought, which is: there could be two Stevens, one here and then one in the United States right now, because we're connected to the same computer interface. Theoretically I could hack Jack over there, I could hack his interface, so there could be three Stevens, because they hack Jack and then I hack you, and then there's four, and then I could eventually try and hack the entirety of the world or a country. Yeah, and they could basically be one.
13:06
Once you can connect brains directly to computers... First of all, I'm not sure if it's possible. I mean, people like Elon Musk and Neuralink, they tell us it's possible. I'm still waiting for the evidence. I don't think it's impossible, but I think it's much more difficult than people assume, partly because we are very far from understanding the brain, and we are even further away from understanding the mind. We assume that the brain somehow produces the mind, but this is just an assumption. We
13:36
still don't have a working model, a working theory, for how it happens. But if it happens, if it is possible to directly connect brains and computers and integrate them into these kinds of cyborgs, nobody has any idea what happens next, how the world would look. And it certainly makes it plausible, again, if you reach that point, that you could have
14:06
an inter-brain-net.
14:08
The same way that lots of computers are connected together to form the internet, if you can also connect brains and computers directly, why can't we then create an inter-brain-net which connects lots of brains, as you described? And I have no idea what it means. I think this is the point where the way that our organic brains understand reality,
14:38
even our imagination, is in the end the product, as far as we can tell, of organic biochemistry. So we are not equipped, I think, to have a kind of serious discussion of what a non-organic brain or a non-organic mind might be capable of doing, how it would look. And all the basic assumptions that
15:08
we have about brains and minds, they are limited to the organic types.
15:14
How do you feel about artificial intelligence and what's happening? This year has been a real sort of landmark year, a big leap forward for artificial intelligence: the conversation, public awareness,
15:28
the technology itself, the investment in the technology, which is always, you know, a very important indicator of what's to come. Yeah. How do you, as someone that has spent a lot of time thinking about this, emotionally, how do you feel about it?
15:44
Very concerned. I mean, it's moving even faster than I expected. When I wrote Homo Deus in 2016, I didn't think we would reach this point so quickly, where we are in 2023, and
15:57
the world is not ready for it.
16:01
And again, it's not all negative. AI has enormous positive potential.
16:06
And this should be clear. And there is no chance of just banning AI or stopping all development in AI. I tend to speak a lot about the dangers simply because you have enough people out there, all the entrepreneurs and all the investors, talking about the positive potential. So it's kind of my job to talk about the negative potential, the dangers. But there is a lot of positive potential, and humans are
16:36
incredibly capable in terms of adapting to new situations. I don't think it's impossible for human society to adapt to the new AI reality. The only thing is, it takes time, and apparently we don't have that time. And people compare it to previous big historical revolutions, like the invention of print or the Industrial Revolution, and
17:06
people say, yes, when the Industrial Revolution happened in the 19th century, you had all these prophecies of doom about how the new factories and the steam engines and electricity, how they will destroy humanity or destroy our psychology or whatever, and in the end it was okay. And when I hear these kinds of comparisons, as a historian, I'm very worried about two things. First of
17:36
all, they underestimate the magnitude of the AI revolution. AI is nothing like print, it's nothing like the Industrial Revolution of the 19th century. It's far, far bigger. There is a fundamental difference between AI and the printing press or the steam engine or the radio or any previous technology we invented. The difference is, it's the first technology in history that can make decisions by itself and that can create new ideas
18:06
by itself. A printing press or a radio set could not write new music or new speeches and could not decide what to print and what to broadcast. This was always the job of humans. This is why the printing press and the radio set in the end empowered humanity: you now have more power to disseminate your ideas. AI is different. It can potentially take power away from us.
18:36
It can decide, it's already deciding, by itself what to broadcast on social media. It's algorithms deciding what to promote, and increasingly it also creates much of the content by itself. It can compose entirely new music, it can compose entirely new political manifestos, holy books, whatever. So it's a much bigger challenge to handle. It's an independent
19:06
agent, in a way that radio and the printing press were not. The other thing I find worrying about the comparison with, say, the Industrial Revolution, is that yes, in the end, in a way, it was okay, but to get there we had to pass through some terrible experiments. When the Industrial Revolution came along, nobody knew how to build a
19:36
benign industrial society. So people experimented. One big experiment was European imperialism. Many people thought that to build an industrial society means building an empire: unless you have an empire that controls the sources of the raw materials you need, iron, coal, rubber, cotton, whatever, and unless you control the markets, you will not be able to survive as an industrial society.
20:06
And there was a very close link, also conceptually, between building an industrial society and building an empire, and all the initial leaders of the Industrial Revolution built empires. Not just Britain and France, also small countries like Belgium, also Japan: when it joined the Industrial Revolution, it immediately set about conquering an empire. Another big
20:36
experiment was Soviet communism. They also thought, how do you build an industrial society? You build a communist dictatorship. And it was the same with Nazism. You cannot separate communism and Nazism from the Industrial Revolution. You could not have created a communist or a Nazi totalitarian regime in the 18th century. If you don't have trains, if you don't have electricity, if you don't have radio, you cannot create a totalitarian regime.
21:07
So these are just a few examples of the failed experiments. You know, you try to adapt to something completely new, you very often experiment, and some of your experiments fail. And if we now have to go in the 21st century through the same process, okay, we now have not radio and trains, we now have AI and bioengineering, and we again need to
21:36
experiment, perhaps with new empires, perhaps with new totalitarian regimes, in order to discover how to build a benign AI society, then we are doomed as a species. We will not be able to survive another round of imperialist wars and totalitarian regimes. So anybody who thinks, hey, we've passed through the Industrial Revolution with all the prophecies of doom, and in the end we got it right? No. If
22:06
anything, I would say that I would give humanity a C-minus on how we adapted to the Industrial Revolution. If we get a C-minus again in the 21st century, that's the end of us.
22:18
It might seem quite trivial to many that the AI revolution seems to have begun with large language models. But when I read Sapiens, this book I have here, language was so central to what made us powerful as
22:37
Homo sapiens. In the beginning was the word. I didn't say it, you know; it's a very, very widespread idea
22:45
that ultimately our power is based on words. The reason that we control the world, and not the chimpanzees or the elephants, is because we had a much more sophisticated language, which enabled us to tell these stories: stories about ancestral spirits and about guardian gods and about our tribe, our nation, which formed the basis for cooperation. And because we could
23:15
cooperate, you could have 1,000 people, 1,000 humans, cooperating in a tribe, whereas the Neanderthals could cooperate only on the level of, say, 50 or 100 individuals. This is why we rule the world and not the Neanderthals. And you look at every subsequent kind of growth in human power and you see the same thing, that ultimately you tell a story with words, and language is like the master
23:45
key that unlocks all the doors of our civilization, whether it's cathedrals or whether it's banks. They're based on language, on stories we tell. Again, it's very obvious in the case of religion, but also if you think about the world's financial system. Money has no value except in the stories that we tell and believe each other. If you think about gold coins or paper
24:15
banknotes or cryptocurrencies like Bitcoin, they have no value in themselves. You cannot eat them or drink them or do anything useful with them. But you have people telling you very compelling stories about the value of these things, and if enough people believe the story, then it works.
24:35
They're also protected by language. Like, my cryptocurrency is protected by a bunch of words. Yeah,
24:42
they're created by words, and
24:45
they function with words and symbols.
24:50
When you communicate with your banker, it's with words. I mean, what happens when AI can create deepfakes of everything: your voice, your image, the way you talk, the type of words you use? So there is already an arms race between banks and fraudsters. I mean, we want the easiest communication with our banker: I just pick up the phone, I tell a few words, and they transfer a million dollars. But at the same time I also want to be protected
25:20
from an AI that impersonates my voice and tone of voice and whatever, and this is becoming difficult. But on a deeper level, again, because money is ultimately made of words, of stories, AI could create new kinds of money, the same way that, you know, cryptocurrencies like Bitcoin have been created simply by somebody telling people a story and enough people
25:50
finding this story convincing. And I guess as a CEO and as an entrepreneur, you know that if you want to get investments, what really gets investments is a good story. And what happens to the financial system if increasingly our financial stories are told by AI? And what happens to the financial system and even to the political system
26:19
if AI eventually creates new financial devices that humans cannot understand? Already today, much of the activity on the world markets is being done by algorithms, at such a speed and with such complexity that most people don't understand what's happening there. If you had to guess, what is the percentage of people in the world today
26:49
that really understand the financial system? What would be your kind of guess?
26:55
Less than 1%. Less than 1%? Okay. Let's be kind of
26:58
conservative about it. One percent, let's say. Okay, fast forward 10 or 20 years: AI creates such complicated financial devices that there is not a single human being on Earth that understands finance anymore.
27:13
What are the implications for politics? Like, you vote for a government, but none of the humans in the government, not the Prime Minister, not the Finance Minister, nobody understands the financial system. They just rely on AI to tell them what is happening. Is this still a democracy? Is this still a human form of government in any way? What would you
27:37
say to someone that hears that and goes, ah, that's just nonsense, that's never gonna happen?
27:43
Why not? I mean, let's look back 15 years to the last big financial crisis in 2007-2008. This financial crisis to a large extent began with these extremely complicated financial devices, CDOs. What's the acronym? Collateralized debt something, I don't even know what every letter stands for. You had these kind of whiz kids on Wall Street inventing a new financial device that nobody except them really
28:12
understood, which is why also it wasn't regulated effectively by the banks and the governments. And it worked well for a couple of years, and then it brought down the world financial system. And what happens if now AIs come up with even more sophisticated financial devices, and for a couple of years everything works well, they make trillions of dollars for us, and then one day it doesn't?
28:40
One day the system collapses and nobody understands what is happening. And again, it's not that you didn't go to college or whatever. No, it's just that objectively the complexity of the system has reached a point where only an AI is able to crunch the numbers, is able to process enough data, to really grasp and
29:10
then shape the dynamics of the financial
29:14
system. We're already there though, you know. I think if anyone does understand how the financial system works and markets work, it is a bunch of
29:22
Homo sapiens relying on a computer to tell them something and trusting that computer's calculations.
29:29
Yeah, and this will again get more and more complicated and sophisticated. And for people who say, no, it's not going to happen, the question is, what is stopping it? I mean, you know, in all the discussions about AI,
29:46
the kind of dangers that draw people's attention, like the poster child of AI dangers, is things like AI creating a new virus that kills billions of people, a new pandemic. So you have a lot of people concerned about how we prevent an AI by itself, or maybe some small terrorist organization, or even a 16-year-old teenager, giving an AI the task to create a dangerous virus and
30:16
release it to the world. How do we prevent this? And this is a serious concern, and we should be concerned about it. But this gets a lot more attention than the question, how do we prevent the financial system from becoming so complicated that humans can no longer understand it? And I see a lot of regulations being at least considered on how to prevent AI from creating dangerous new viruses.
30:46
I don't see any kind of effort to keep the financial system at a level that humans can understand. Why do you think that is?
30:58
I mean, if I had to guess, my guess would be: why would the UK cap it then? You know, why would they give themselves a
31:04
disadvantage? Exactly.
31:06
When, you know, it just means that the UK will suffer. If America is using a really advanced AI algorithm to get ahead, we have to keep
31:12
up. Yeah, it's the logic of the arms race. And again, it's not all bad. I mean, you have a better financial system, you have a more prosperous economy. I mean, money isn't bad. I mean, it's the basis for almost all human
31:26
cooperation. And a lot of financial devices, in the end, if you think about what they are, they are devices to establish trust between people, especially trust between strangers. Money in essence is a device for establishing trust. I don't know you, you don't know me, but we both trust this gold coin or piece of paper, so we can cooperate on sharing food or creating
31:56
medicine. And the most sophisticated financial devices, they basically do the same thing. Stocks and bonds and these CDOs, they are a method to establish trust. And when you open a new bank account, the most important thing is, how do I trust the bank to really take care of my money and to follow my instructions, but not to be open to fraud and things like that? And again, you as an investor,
32:27
when you try to get money, or for me, sorry, you as an entrepreneur, when you try to get money from investors, the biggest issue is always trust. And if somebody can come up with a new way to establish trust between people, that's a good thing. But if this new way increasingly depends on non-human
32:56
intelligence, on systems that humans cannot understand, that's the big question: what happens to human society when the trust that is at the basis of all social interactions is actually no longer trust in humans, it's trust in a non-human intelligence that we don't fully understand and that we cannot anticipate?
33:26
And part of the problem with regulating AI, or AI safety, goes back to what we discussed earlier, that AI is different from printing presses or radio sets or even atom bombs. If you want to make nuclear energy safe, then you need to think about all the different ways that, I don't know, a nuclear power station can have an accident, and
33:56
I guess there is a limited number of things that can go wrong. And ideally, if you have enough people thinking hard enough, you can make safe nuclear reactors, safe nuclear power stations. But AI is fundamentally different, because AI keeps changing. It keeps reacting to the world, it keeps reacting to you, coming up with new
34:26
inventions, new ideas, new decisions. So making AI safe is a bit like making a nuclear reactor safe while taking into account the fact that the nuclear reactor can decide to change in ways that you can't anticipate, and even worse, it can react to you. So if you build a particular safety mechanism for the nuclear reactor, what happens if the nuclear reactor says, oh, they built this mechanism, let's
34:56
somehow get around the safety mechanism? We don't have this problem with nuclear reactors, but this is the problem with AI. We are trying to contain something which is an independent agent and which might actually come to understand us better than we understand it.
35:17
I'm really curious about how this will impact... you talked about elected officials there and how their systems will be sort of...
35:27
Their financial decision-making might be driven by algorithms, but governments and authority itself: I've pondered recently whether there'll come a day in the not-so-distant future where we might vote for an algorithm, where we might vote for an AI to be our government. Is that crazy thinking?
35:49
I think we are quite a long way off from there. We would still want humans, at least in the symbolic role
35:56
of being the Prime Minister, the Member of Parliament, whatever, the President. The real problem is that increasingly these humans could be kind of figureheads or puppets, when the real decisions, the most consequential decisions, are made by algorithms, partly because it will just be too complicated for the humans
36:26
at the top to understand the situation or to understand the different options. So going back to the financial example, imagine that, you know, it's 4:00 in the morning, there is a phone call to the Prime Minister from the finance algorithm telling the Prime Minister that we are facing a financial meltdown and that we have to do something within the next, I don't know, 30 minutes
36:55
to prevent a national or global financial meltdown. And there are, like, three options, and the algorithm recommends option A, and there is just not enough time to explain to the Prime Minister how the algorithm reached the conclusion, and even what is the meaning of these different options. And again, people think about this scenario mostly in relation to war: what happens if you have an
37:25
algorithm in charge of your security system and it alerts you to a massive incoming cyber attack, and you have to react immediately, and if you react in a specific way this could mean war with another nation, but you just don't have enough time to understand how the algorithm reached the decision, and how the algorithm was also able to determine that, of all
37:55
the different options, this is the best
37:57
option.
37:58
Do you think that humans believe we're more complicated and special than we actually are?
38:04
Because I think many of the rebuttals, when we talk about artificial intelligence, stem back to this idea that we're, you know, innately genius, creative, spiritual, special, different from, you know, artificial intelligence. Like our intelligence is somewhat divine, or we've got free will, and we, you know...
38:31
Yeah, I mean,
38:35
if the argument is we have free will, we have a divine soul, and therefore no algorithm will ever be able to understand us and to predict our decisions or to manipulate us, then this is a very common argument, but it's obviously nonsensical. I mean, even before AI, even with previous technology, it was possible to a large
39:04
extent to predict people's behavior and to manipulate them, and AI just takes it to the next level. Now, with regard to the discussion of free will,
39:16
my position is you cannot start with the assumption that humans have free will. If you start with this assumption, then it actually makes you very incurious, lacking curiosity about yourself, about human beings. It kind of closes off the investigation before it began. You assume that
39:46
any decision you make is just a result of my free will: why did I choose this politician, this product, this spouse? Because it's my free will. And if this is your position, there is nothing to investigate. You just assume you have this kind of divine spark within you that makes all the decisions, and there is nothing to investigate there.
40:11
I would say no, start investigating, and you'll probably discover that there are a lot of factors, whether it's external factors like cultural traditions, and also internal factors like biological mechanisms, that shape your decisions. You chose this politician or this spouse because of certain cultural traditions and because of certain
40:40
biological mechanisms, your DNA, your brain structure, whatever. And this actually makes it possible for you to get to know yourself better. Now, if after a long investigation you have reached the conclusion that yes, there are cultural influences, there are political influences, there are genetic and neurological influences, but still there is a certain percentage
41:10
of my decisions that cannot be explained by any of these things, then okay, call it free will and we can discuss it. But don't start with this assumption, because then you lose the incentive to explore yourself. And anybody who embarks on such a process of self-exploration, whether it's in therapy, whether it's in meditation, whether it's in the laboratory of a
41:40
trained scientist, or as a historian in the archive, you will be amazed to discover how much of your decisions are not the result of some mystical free will. They are the result of cultural and biological factors. And this also means that you are vulnerable to being deciphered and manipulated: by political parties, by corporations,
42:11
by AI. People who have this kind of mystical belief in free will are the easiest people to manipulate, because they don't think they can be manipulated, and obviously they can.
42:25
We humans should get used to the idea that we are no longer mysterious souls; we are now hackable animals. That's what we are. Hmm. You said that at the World Economic Forum?
42:37
Yeah. Again, this is the same point.
42:40
Basically, that it's now possible to hack human beings, not just to hack our smartphones or bank accounts or computers, but to really hack our brains, our minds, and to predict our behavior and manipulate our behavior more than in any previous time in history.
42:59
The other line that you said, which really made me think and ponder, was
43:06
that previously human life was about the drama of decision-making, and without this we won't have meaning in life. Yeah,
43:14
and if you look, you know, at politics, at religion, at culture, people told stories about their lives, or the lives of people in general, as a kind of drama of decision-making: that you reach a particular junction in life and you need to choose.
43:36
You need to choose between good and evil, you need to choose between political parties, you need to choose what to study at university or what kind of job to apply to, and our stories revolved around these decisions.
43:54
And what happens to human life if increasingly the power to make decisions is taken from us,
44:04
and increasingly it's algorithms making all these decisions for us or about us? Is that possible? It's already happening increasingly. You know, you apply to a bank to get a loan; in many places it's no longer a human banker who is making this decision about you, whether to give you a loan, whether to give you a mortgage. It's an algorithm analyzing billions of bits of data about you and about
44:34
millions of other customers or previous loans, determining whether you're creditworthy or not. And if they refuse to give you a loan and you ask the bank, why didn't you give me a loan, the bank says, we don't know, the computer said no, and we just believe our computer, our algorithm. And it's happening also in the judicial system, increasingly, that various judicial
45:04
decisions, verdicts, like the judge decided that you committed some crime, the sentence, whether to send you to two months or eight months or two years in prison, is increasingly determined by an algorithm.
45:20
You apply for a place at university, you apply for a job, these too are increasingly decided by algorithms. Dating? Dating, yes. I mean, even unbeknownst to you, the algorithms of the dating apps that you're using are shaping your romantic life. But in the worlds
45:44
of, you know, robotics and artificial intelligence, why do I need to find a
45:49
person at all?
45:51
Why not just have a relationship with a robot, with an AI? Yeah, we do see the beginning of this, that people are building more and more intimate relationships with non-human intelligences, with AIs and bots and so forth, and this raises a lot of difficult and profound questions. Part of the problem is
46:19
that AIs are built to mimic intimacy. Intimacy is an extremely powerful thing, not just in romance, also in the market, also in politics. If you want to change somebody's mind about anything, a political issue, a commercial preference, intimacy is kind of the most powerful weapon,
46:50
and
46:52
somebody you really trust, somebody you have intimate relationships with, will be able to change your views on a lot of things more than someone you see on TV or just in an article you read in a newspaper. There is a huge incentive for the creators of AIs to create AIs that are able to forge intimate relationships with humans. And this makes us extremely vulnerable
47:22
to this new type of manipulation that was previously just unimaginable.
47:28
Because loneliness is at, you know, all-time highs across the sort of Western world, and sexlessness, and I was reading some stats about how the bottom 50% of men in particular are having almost no sex relative to the top sort of 10%, and you think, you know, this disparity, the rise of digitization, loneliness, we're in our homes on screens more than ever before.
47:52
And then you hear about this industry of AI and sex dolls and all this, and you just wonder, you play it forward and go, oh...
47:58
Yeah, it's going there. And the thing is that it's not that the humans are so stupid or something, that they kind of project something onto the AI and fall in love with an AI chatbot. The AI is deliberately built, created, trained to fool us,
48:21
the same way, you know, you look at the previous 10 years, there was a big battle for human attention. There was a battle between different social media giants over how to grab human attention, and they created algorithms that were really amazing at grabbing people's attention, and now they are doing the same thing but with intimacy, and we are
48:51
extremely exposed, we are extremely vulnerable to it. Now, the big problem is, and again this is where it gets kind of really philosophical, that what humans really want or need from a relationship is to be in touch with another conscious entity. An intimate relationship is not just about providing my needs;
49:20
then it's exploitative, then it's abusive. If you're in a relationship and the only thing you think about is how would I feel better, how would my needs be provided for, then this is a very abusive situation. A really healthy relationship is when it goes both ways: you also care about the feelings and the needs of the other person, of the other entity. Now,
49:49
what happens if the other entity has no feelings, has no emotional needs, because it has no consciousness? That's the big question. And there is a huge confusion between consciousness and intelligence. AI is artificial intelligence, but what exactly is the relation between intelligence and consciousness? Now, intelligence is the ability to solve problems,
50:20
to win at chess, to invest money, to drive a car. This is intelligence. Consciousness is the ability to feel things, like pain and pleasure and love and hate and sadness and anger and so many other things. Now, in humans and also in other mammals, intelligence and consciousness actually go together. We solve problems by having feelings. But computers are fundamentally different.
50:49
They are already more intelligent than us in at least several narrow fields, but they have zero consciousness.
51:01
They don't feel anything. When they beat us at chess or Go or some other game, they don't feel joyful and happy. If they make a wrong move, they don't feel sad or angry. They have zero consciousness. As far as we can tell, they might soon be far more intelligent than us and still have zero consciousness. Now, what happens when you are in a
51:31
relationship with an entity which is far more intelligent than you and can also imitate, mimic consciousness? It knows how to solve the problem of making you feel as if it is conscious, but it still has no feelings of its own. And this is a very disturbing vision of the future.
52:01
It
52:01
opens us up to manipulation, is that what you're saying?
52:04
First of all, it opens us to manipulation. But also, it's the big question: what does it mean for the health of our own mind, of our own psyche, if we are in a relationship, or many of our important relationships in life are with non-conscious entities
52:28
that don't really have any feelings of their own and that are very good at faking it? They're very good at catering to our feelings, but again, it's just manipulation in the end.
52:45
Are you optimistic about the happiness of humans going forward, or do you think happiness will take its own course? I've heard you talk about how happiness might just become a biochemical prescription or
52:57
something.
52:59
Yeah, I mean, we don't have a good track record with regard to happiness. If you look at the last 100,000 years, from say the Stone Age until the 21st century, you see a dramatic rise in human power. We are thousands of times more powerful, as a species and as individuals, than we were in the Stone Age. We are not thousands of times happier.
53:26
We just don't really know how to translate power into happiness. And this is very clear when you look at the lives of the most powerful people in the world: there is no correlation between how rich and powerful you are and how happy you are as a person. I mean, I don't get the impression that people like, I don't know, Vladimir Putin or Elon Musk are the
53:56
happiest people in the world
53:59
even though they are some of the most powerful people in the world. So there is no reason to think that as humanity gets even more powerful in coming decades, we will get any happier. And understanding happiness is about understanding the deep dynamics of, not even the brain, but of the mind,
54:23
of consciousness, and we are just not there yet. We are very, very good at manipulating things, and the related problem is that humans usually understand how to manipulate something long before they understand the consequences of the manipulations. If you look at the outside world, at the ecological system, we have learned how to cut forests,
54:52
how to build huge dams over rivers, long before we understood what would be the consequences for the ecological system, which is why we now have this ecological crisis. We manipulated the world without understanding the consequences, and something similar might happen with the world inside us.
55:18
With more powerful medicines, with brain-computer interfaces, with genetic engineering and so forth, we are gaining the power to manipulate our internal world, the world within us. But again, the power to manipulate is not the same thing as understanding the complexity of the system and the consequences of the manipulation.
55:44
A related manipulation is immortality and our pursuit of it.
55:47
I've sat with people on this podcast who are committing their lives to staying alive forever. Hmm. And there's a through line there between our desire to be immortal, you know, the rise in the scientific discoveries that are enabling that, and our happiness. I've often thought, you know, much of the reason why things are special in my life is because they're scarce, including my time. Yeah. And I almost wonder about the psychological issues I would
56:17
face if I knew I was immortal, like if I knew that the partner I'm with doesn't come at the expense of another one I could be with, you know, at 30 years old, and the car I drive, you know, the choices you make. I think what makes them valued is their scarcity against the backdrop of a finite life.
56:40
Yeah, it will definitely change everything. If we think about relations between parents and children: if you live forever,
56:47
all the 20 years you spent raising somebody 2,000 years ago, what do they mean now? But I think long before we get to that point, I mean, most of these people are going to be incredibly disappointed, because it will not happen within their lifetime. Another related problem is that we will not get to immortality; we will get to something that maybe should be called a-mortality, not immortality. Immortality is that, like, you're
57:17
a god, you can never die, no matter what happens. Even if we solve cancer and Alzheimer's and dementia and whatever, we will not get there. We will get to kind of a life without a definitive expiry date, where you can live indefinitely. You can go every 10 years to a clinic and get yourself rejuvenated. But if a bus runs
57:47
you over, or your airplane explodes, or a terrorist kills you, you're dead and you're not coming back to life. Now,
57:58
realizing that you have a chance to live forever, but if there is an accident you die, this creates a level of anxiety and terror unlike anything that we know in our own lives. I think the people who will be in that situation will be extremely anxious and miserable.
58:20
And another issue is, you know, people often spend so much effort trying to gain something, get something, without really understanding why. What will you do with it? What is so good about it? You know, like people spend so much effort to have more and more money instead of thinking, what will I actually do with that money? So in the same way, you know, the people who want
58:49
to extend life forever: what is so good about life, what will you do with it? And if you know it, why don't you do it already? You know, I hear people saying how precious human consciousness is. Why, what do you think is so precious? And whatever it is, why don't you do it right now? I mean, why spend your life
59:19
developing some kind of treatment that will extend your consciousness for a thousand years?
59:29
Just spend your time now doing whatever you think you would be doing with your consciousness a thousand years from now.
59:37
If they were to say, it will give me more time with my family, you're saying, instead of wasting your time, just do that now.
59:44
Exactly. So, you know, somebody who has no time for the family at all right now, because they are busy developing the kind of miracle cure that will enable them to spend time with their family in 200 years, this makes no
59:59
sense.
1:00:02
I think about the disparity that artificial intelligence and these forms of sort of bioengineering might create, because it's conceivable that the rich will gain access to these technologies first. Yeah. And then, you know, when we think about bioengineering being able to sort of play with our genetic code, that means if I, for example, managed to get my hands on some kind of bioengineering treatment to make sure that my kids were maybe a little bit smarter,
1:00:29
maybe a little bit stronger, whatever, then you're going to start a sort of genetic chain of modified children that are superior in intelligence and strength or whatever else might be desirable. And then you have this disparity in society where, you know, one side of humans are on a completely different exponential trajectory and the other humans are, you know,
1:00:52
left behind, yeah. This is extremely dangerous. I think we just shouldn't go there,
1:00:59
that we shouldn't invest a lot of resources and efforts in developing these kinds of upgrades and enhancements that are very likely, at least at first, to be the preserve of a small elite, and to translate economic inequality into biological inequality, and to basically split the human species,
1:01:29
to split Homo sapiens into, you know, a ruling class of superhumans and the rest of us. This is a very, very dangerous development. Related to that is the problem that I don't think these will be upgrades at all. What worries me is that a lot of these things will turn out actually to be downgrades,
1:01:56
that we don't understand our bodies, our brains, our minds well enough to know what will be the consequences of tweaking the genetic code or of implanting all kinds of devices into our brains. People who think that this will enable them, let's say, to upgrade their intelligence,
1:02:27
they don't know what the side effects will be. It could be that the same treatment that increases your intelligence also decreases your compassion or your spiritual depth or whatever. And the danger is that, especially if this technology is in the hands of powerful corporations, armies, governments, they will enhance
1:02:55
those qualities that they want, like intelligence and like discipline, while disregarding other qualities which could be even more important for human flourishing, like compassion, like artistic sensitivity, or like spirituality. If I think about somebody like Putin, what would he do with this type of technology? Then yes, he would like an army of super-intelligent and
1:03:25
super-loyal soldiers, and if these soldiers don't have any compassion or any spiritual depth, all the better for him.
1:03:33
But that speaks to the arms race, and you know, you said you think we shouldn't, but China will see that as an opportunity, or Putin will see that as an opportunity, if the Western world, if the United States or the UK, don't. And so again, it comes back to this point of, you know, we're damned if we do, damned if we
1:03:51
don't. I'm not sure that in this case it works, because
1:03:55
again, a lot of these upgrades are likely to have detrimental side effects, both for the person in question and for the society as a whole. And I think that in this case, societies that choose to progress more slowly and safely, they will actually have an advantage. It's like if you say, you know, there is some other country where they don't
1:04:25
have any brakes on their cars and they don't have any seat belts, and they release new medicines without checking their side effects: they're moving so fast, we'll all be left behind. No, it makes no sense to imitate them. This will actually ruin their societies. You don't want to imitate these kinds of harmful effects. With the development of AI it's different. I think the advantages in things like
1:04:55
finance, like the military, will be so big that an AI arms race is almost inevitable. But with trying to kind of bioengineer humans, if you go too fast, it will be self-destructive, so we can take it more slowly and safely without being kind of left behind in an arms race.
1:05:22
You said on the Tim Ferriss podcast that the best scenario is that Homo
1:05:25
sapiens will disappear, but in a peaceful and gradual way, and be replaced by something better. It's quite an uncomfortable statement to listen to.
1:05:35
I think that, again, the type of technologies that we are now developing, when you combine them with the human ambition to, you know, improve ourselves, it's almost inevitable that we will use these technologies to change ourselves.
1:05:57
The question is whether we will do it slowly and responsibly enough for the consequences to be beneficial. But the idea that we can now develop these extremely powerful tools of bioengineering and AI and remain the way we are, that we'll still be the same Homo sapiens in 200 years, in 500 years, in 1,000 years, when we'll have all these tools to connect brains to computers,
1:06:26
to kind of re-engineer our genetic code, and we won't do it? I think this is unlikely.
1:06:33
One of the outstanding questions that I have, and one of the sort of observations I've had, is people like Sam Altman, the founder of OpenAI that made ChatGPT, started working on universal basic income products like Worldcoin, and I thought, you know what, that's curious, that the people that are at the very forefront of this AI revolution are now trying to solve the second problem they see coming, which is people
1:06:56
not having jobs. Yeah, essentially. Do you think that's because... actually, you know, I've spoken a lot this year on stages, and one of the questions I always get asked is about the implications of AI on jobs as we know them and the workforce. Is it realistic to believe that most jobs will disappear as we know them today? I think
1:07:19
many jobs, maybe most jobs, will disappear, but new jobs will emerge. You know, most jobs that people do today didn't exist 200 years ago. Like this. Yeah, like this podcast. And there will be new jobs. The really big problem will be how to retrain people.
1:07:42
It demands a lot of financial support, also psychological support, for people to kind of relearn, retrain, reinvent themselves, and to do it not just once but repeatedly throughout their career, throughout their lives. The AI revolution will not be a single watershed event, like you have the big AI revolution in 2030, you lose 60% of jobs, you create lots of new
1:08:11
jobs, you have 10 difficult years, everybody adjusting, adapting, re-skilling, whatever, and then everything settles down to a new equilibrium. It won't be like that. AI is nowhere near its full potential. So you'll have a lot of changes by 2030, even more changes by 2040, even more changes by 2050. You will have new jobs, but the new jobs too will change and disappear. What new job, in a
1:08:40
world where intelligence
1:08:41
is disrupted, what jobs are left? Because you say you're going to retrain me, and I'm like, you know, I'm not going to be able to keep up with an AI that's retraining every second.
1:08:51
And I'm not sure. I mean, some of the answers might be counterintuitive. At least at present, we see that AI is extremely good at automating jobs that only require cognitive skills, but it is not good at jobs that require
1:09:11
motor skills and social skills. So if you think about, say, doctors and nurses: at least those kinds of doctors who are only doing cognitive work, they read articles, they get your medical results, all kinds of tests and whatever, they diagnose you with a disease and they decide on a course of treatment. This is purely cognitive work. This is the easiest thing to
1:09:41
automate. But if you think about a nurse that has to replace a bandage for a crying child, this is much more difficult. Can you automate everything? I think it is possible, but not now. You need very delicate motor skills and also social skills to do that. Did you
1:10:03
see Elon's video the other day with the Tesla robot?
1:10:07
I see a lot of these videos, and...
1:10:08
It's getting the egg and it's cracking the egg.
1:10:11
Can it go like this?
1:10:13
And I'm not saying it's impossible. I'm just saying it will take longer, it's more difficult. Again, there is also the social aspect. If you think about self-driving vehicles, the biggest problem for self-driving vehicles is humans. I mean, not just the human drivers, it's the pedestrians, it's the passengers. How do you deal with a drunken passenger? Whatever. So again, it's not impossible,
1:10:41
but it's much more difficult. So again, I think that there will be new jobs, at least in the foreseeable future. The problem will be to retrain people, and the biggest problem of all will be on the global level, not on the national level. When I hear people talk about universal basic income, the first question to ask is, is it universal or national?
1:11:07
Is it a system that, let's say, raises taxes on big tech corporations in Silicon Valley in California and uses the money to provide basic services and also retraining courses for people in Ohio and Pennsylvania, or does it also apply to people in Guatemala and Pakistan?
1:11:36
I mean, what happens when it becomes cheaper to produce shirts with robots in California than in Guatemala and in Mexico? Does Sam Altman have a vision of the US government raising taxes in California and sending the money to Guatemala to support the people there? If the answer is no, we are not talking about universal basic income; we are only talking about national basic income in the US. Then what happens to the people in Guatemala?
1:12:06
That's the biggest
1:12:08
question. And a second question to that is about how one should be educating our children, and education institutions as they are today, because what's to come makes me wonder what skill would be worth investing, you know, 10 or 12 years into, for a child that I had.
1:12:28
Nobody has any idea. If you think about specific skills, then this is the first time in history
1:12:36
that we have no idea how the job market or how society will look in 20 years, so we don't know what specific skills people will need. If you think back in history, it was never possible to predict the future, but at least people knew what kind of skills would be needed in a couple of decades. If you lived, you know, in England in 1023, a thousand years ago,
1:13:06
you did not know what would happen in 30 years. Maybe the Normans will invade, or the Vikings, or the Scots, or whoever; maybe there'll be an earthquake; maybe there'll be a new pandemic. Anything can happen, you can't predict. But you still had a very good idea of how the economy and how human society would look in the 1050s or the 1060s. You knew that most people
1:13:36
would still be farmers. You know it's a good idea to teach your kids how to harvest wheat, how to bake bread, how to ride a horse, how to shoot a bow and arrow; these things will still be necessary in 30 years. If you now look 30 years into the future, nobody has any idea what kind of skills will be needed. If you think, for instance, okay, this is the age of AI and computers, I will teach my kids how
1:14:06
to code computers: maybe in 30 years humans no longer code anything, because AI is so much better than us at writing code. So what should we focus on? I would say the only thing we can be certain about is that 30 years from now the world will be extremely volatile; it will keep changing at an ever more rapid
1:14:31
pace. Do you think this is going to increase the amount of conflict?
1:14:36
Because I watched a video on your YouTube channel where you said the return of Wars. Yeah,
1:14:41
That's one of the dangers that there is, and we see it all over the world now. Like, 10 years ago we were in the most peaceful era in human history, and unfortunately this era is over. We are now in a new era of wars and potentially of imperialism.
1:15:00
And we are seeing it all over the world, with the Russian invasion of Ukraine, now the war in the Middle East, Venezuela and Guyana, some of East Asia; war is back on the table. It's not just because of the rapid changes and the upheavals they cause. It's also because, you know, ten years ago we had a global order, the liberal order, which was far from
1:15:30
perfect, but it still kind of regulated relations between nations, between countries, based on an idea, on the liberal worldview, that despite our national differences all humans share certain basic experiences and needs and interests, which is why it makes sense for us to work together to defuse conflicts and to
1:16:02
solve our common problems. It was far from perfect, but it did create the most peaceful era in human history. Then this order was repeatedly attacked, not only from outside, from forces like Russia or North Korea or Iran that never accepted this order, but also from the inside, even from the United States, which was the architect, to a
1:16:30
large extent, of this order, with the election of Donald Trump, who says, I don't care about any kind of global order, I only care about my own nation. And you see this way of thinking, that I only care about the interests of my nation, more and more around the world. Now the big question to ask is: if all the nations think like that, what regulates the relations between them?
1:17:01
And there was no alternative. Nobody came up and said, okay, I don't like the liberal global order, I have a better suggestion for how to manage relations between different nations. They just destroyed the existing order without offering an alternative, and the alternative to order is simply disorder, and this is now where we find ourselves.
1:17:30
I think there's more wars on the
1:17:31
way. Yes, unless we re-establish order, there will be more and worse wars coming in the next few years, in more and more areas around the world. You see defense budgets all over the world are skyrocketing, and this is a vicious circle: when your neighbors increase their military budget, you feel compelled to do the same, and
1:18:00
then they increase their budget even more. You know, when I say that the early 21st century was the most peaceful era in human history, one of the indications is how low the military budgets all over the world were. For most of history, for kings and emperors and khans and sultans, the military was the number one item in
1:18:30
their budget; they spent more on their soldiers and navies and fortresses than anything else. In the early 21st century, most countries spent something like a few percentage points of their budget on the military. Education, health care, welfare were a much bigger item on the budget than defense.
1:19:00
Yes, and this is now changing. The money is increasingly going to tanks and missiles and cyber weapons instead of to nurses and schools and social workers. And again, it's not inevitable; it's the result of human decisions. The relatively peaceful era of the early 21st century did not result from some miracle, it resulted from
1:19:30
Humans, making wise decisions in previous
1:19:33
decades. What are the wise decisions we need to make now, in your view?
1:19:36
Reinvest in rebuilding the global order, which is based on universal values and norms and not just on the narrow interests of specific nation states. Are you concerned that Trump might be elected again? Sure. I think it's very likely, and if it happens it is likely to be the kind of,
1:20:00
like, the death blow to what remains of the global order, and he says it, and he says it openly. Now again, it should be clear that many of these politicians present a false dichotomy, a false binary vision of the world, as if you have to choose between patriotism and globalism, between being loyal to your nation and being loyal to
1:20:31
some kind of, I don't know, global government or whatever. And this is completely false. There is no contradiction between patriotism and global cooperation. When we talk about global cooperation, we definitely don't have in mind, at least not anybody that I know, a global government. This is an impossible and a very dangerous idea. It simply means that you will have certain rules and norms for how
1:21:00
different nation states treat each other and behave toward each other. If you don't have a system of global norms and values, then very quickly what you have is just global conflict, is just wars. I mean, some people have this idea, they imagine the world as a network of friendly fortresses, like each nation will be a fortress with very high
1:21:30
walls, taking care of its own interests, but living on relatively friendly terms with the neighboring fortresses, trading with them and whatever. Now, the main problem with this vision is that fortresses are almost never friendly. Each fortress always wants a bit more land, a bit more prosperity, a bit more security for itself, at the expense of the
1:22:00
neighbors, and this is the high road to conflict and to war.
1:22:08
There's that phrase, isn't there: ignorance is bliss. Now, something your work has forced you, and continues to encourage you, to not live in is ignorance. So with that, one might logically deduce that
1:22:22
Out the window goes your
1:22:23
bliss. Are you happy?
1:22:30
I think I'm relatively happy, at least happier than I was for most of my life. Part of it is that I invest a lot of my time not just in
1:22:45
You know researching what is happening in the world, but also in the health of my own mind.
1:22:53
and
1:22:55
you know, keeping a kind of balanced information diet.
1:23:02
It's basically like with food: you need food in order to survive and to be healthy, but if you eat too much, or if you eat too much of the wrong stuff, it's bad for you. And it's exactly the same with information. Information is the food of the mind, and if you eat too much of it, of the wrong kind, you'll get a very sick mind. So I try to keep a very balanced
1:23:32
information diet, which also includes information fasts. So I try to disconnect. Every day I dedicate two hours a day to meditation. Wow. And every year I go for a long meditation retreat of between 30 and 60 days, completely disconnecting: no phones, no emails, not even books, just
1:24:01
just observing myself, observing what is happening inside my body and inside my mind, getting to know myself better and kind of digesting
1:24:14
All the information that I absorbed during the rest of the year or the rest of the day. Have
1:24:21
you seen a clear benefit in doing that?
1:24:24
Yes, very, very clear. I don't think I would be able to write these books or to do what I'm doing without this kind of information diet, and without devoting a lot of time and attention to balancing my mind and keeping it healthy. You know,
1:24:43
many people spend so much time keeping the body healthy, which is very important, of course, but we need to spend an equal amount of attention on our mind. It is as important as our body.
1:24:55
When you said you don't think you'd be able to do what you do if you didn't take these information diets, why?
1:25:01
I'd just, you know, first of all, be just overwhelmed
1:25:07
and not have any kind of peace of mind, not have any kind of perspective. If you're constantly in the news cycle, in the information cycle, you lose all perspective. You know, organic entities, unlike AIs, unlike computers, we are cyclical entities. We need to sleep every day.
1:25:34
AIs don't sleep. You know, even the stock exchange closes every afternoon. It closes also for the weekend or for Christmas. If you think about it, this is amazing: if a war erupts at Christmas, Wall Street will be able to react only after a couple of days,
1:25:58
because the people are on holiday, they took time off; even the money market takes time off. But if you give AI full control, there will never be any time off. It will be 24 hours a day, 365 days a year, and people just collapse. I mean, I think part of the problem that politicians today face is that they need to be on 24
1:26:27
hours a day, because the news cycle is on 24 hours a day. Like, in previous eras, if you were a king in the Middle Ages and you ride out, you go somewhere, you're on the road in your carriage, and nobody can reach you; even if the French are invading, nobody can reach you. You have some time off. If you're a prime minister now, there is no time off, and computers are built for it, but human
1:26:57
beings aren't. If you try to keep an organic entity
1:27:04
awake and kind of constantly processing information and reacting 24 hours a day, it will very soon
1:27:12
collapse. It's funny, it made me think of what, I think it's the former Netflix CEO, one of the Netflix CEOs or someone, said: they said our biggest competitor is sleep.
1:27:23
Sleep. Yeah, that's a very scary and, I think, very important line, and it's a very honest line, a very honest line. And it's scary because,
1:27:33
if people don't sleep, they collapse and eventually they die. And this is again part of the problem that we talked about earlier, about the battle for human attention in social media, in streaming services. Now, many of these corporations measure their success by user engagement: the more people are engaged, the more successful we are. Now, user engagement
1:28:03
is a very broad definition. According to this measurement, one hour of outrage is better than 10 minutes of joy, and certainly better than one hour of sleep,
1:28:20
because in one hour of outrage I will consume three adverts. Yes, and then that means that the corporation makes $30,
1:28:28
for example. Yeah, and from two hours of sleep they make nothing.
1:28:33
And from ten minutes of joy, maybe they sell only one. But from the viewpoint of how humans function, how this organism functions, 10 minutes of joy are probably better for us than one hour of outrage, and certainly we need not just two hours, we need six, seven, eight hours of sleep. Well, this is what, you know, the
1:28:57
algorithms on certain platforms, specifically TikTok, are
1:29:03
just absolutely addictive. Yeah, to say the least. Like I
1:29:09
because they hacked us.
1:29:10
Yeah, it's literally, you know, we had a certain level of addiction to the previous social algorithms, and then TikTok came along and said hold my beer, and they just went for it, you know, and they've won because of that. I see 60-year-olds absolutely addicted to TikTok, and because they don't understand the concept of an algorithm
1:29:33
sometimes, and they don't understand, like, the advertising model and all of that stuff, it's hypnotism. They're, like, absolutely hypnotized. Funnily enough, my driver's one of them. So my driver's outside; whenever I walk up to his car, he's just like this on TikTok, he's scrolling, and I had a conversation with him last night. I'm like, do you realize that TikTok has your brain? Yeah, you know. And we're just at the very first steps of an exponential curve of algorithms competing for our attention
1:30:03
and our brain. We haven't seen anything yet. I mean, these algorithms are, what, like 10 years old? If you think about these social media algorithms, the algorithms that get to know you personally, to hack your brain and then grab your attention, they are 10 years old
1:30:19
And the companies die if they don't beat the other algorithms, like Twitter now, when Elon took it over, and I think people will relate to this if you use Twitter: suddenly I've seen more people having their heads blown off and being hit by cars
1:30:33
on Twitter than I'd ever seen in the previous 10 years. And I think someone at work has gone, listen, this company's going to die unless we increase time spent on this platform and show more ads, so let's start serving up a more addictive algorithm, and that requires a response from Instagram and the other platforms. It's real.
1:30:49
You know, Elon has the other company, the Boring Company. Yeah, which is about boring tunnels, of course, but actually it might be a good idea to make Twitter more boring and to make TikTok more boring. I mean, I know it's
1:31:03
a very bad kind of business decision. Yeah, but I don't think humanity will survive unless we have more boredom. If you ask me what is wrong with the world in 2023, it's that everybody is far too excited. And I mean, if I had to kind of summarize what's wrong in one word, the word is excited, and people don't understand the meaning
1:31:33
of this word. People think that excited means happy: like, two people meet, I am so excited to meet you; I have a new idea, I published a new book, whatever, all this is such an exciting idea, such an exciting book. And exciting isn't happy. Exciting isn't always good. Sometimes, yes, sometimes it's good to be excited, but an organism that is excited all the time dies. The meaning of excitement is, you know, that the body
1:32:03
is in fight-or-flight mode: all the nerves are on, all the neurons are firing, all the muscles are tense. This is excitement, and very often negative things excite us. Fear is excitement, hate is excitement, anger is excitement. And you know, when I meet a good friend, I'm often relaxed to meet the friend.
1:32:32
I'm not excited at all, I'm much calmer. You know, even on the political level, we have far too many exciting politicians doing very exciting things, and we need more boring politicians, more Bidens, doing less exciting things.
1:32:56
But the brain is wired to pay attention to excitement and everything.
1:32:59
But the brain evolved in situations when you didn't have a constant stream of exciting videos; sometimes it was on, sometimes it was off. And now our brains have been hacked, and these devices, these technologies, they know how to create constant excitement, and the more this
1:33:25
happens, we also lose our ability, our skill, to be bored, so that if we have to spend a few minutes doing nothing, somewhere waiting, we can't do it. We immediately take out the smartphone and start watching TikTok or scrolling through Twitter, whatever.
1:33:45
Did you hear about that experiment where people would rather take an electric shock than do nothing?
1:33:49
Yeah, and you know, you
1:33:54
can't get, for instance, to any level of peace of mind if you don't know how to handle boredom.
1:34:04
Peace and boredom are, the same way that excitement and outrage are neighbors, peace and boredom are also neighbors. And if you don't know how to handle boredom, if the minute there is a hint of boredom you run away to some exciting thing, you will never experience peace of mind. And if humans don't experience peace of mind, there is no way that the world as a whole is going to be
1:34:34
peaceful.
1:34:35
Okay, this is something I've never mentioned before: in 2023 I launched my very own private equity fund called Flight Fund, and since then we've invested in some of the most promising companies in the world. My objective is to make this the best performing fund in Europe, with a focus on high-growth companies that I believe will be the next European unicorns. The current investors in the fund who have joined me on this journey are some of Europe's most successful and innovative entrepreneurs, and I'm excited to announce
1:35:04
that today, as a founder of a company, you can pitch your company to us, or if you are an investor, you can also now apply to invest with us. Head to flightfund.com to gain an understanding of the fund's mission, the remarkable companies we proudly support, and to get in touch with me and my team. Legal disclaimer: Flight Fund is regulated by the FCA, so please remember that investing in the fund is for sophisticated investors only. Don't invest unless you're prepared to lose all of the money you invest. This
1:35:34
is a high-risk investment and you are unlikely to be protected if something goes wrong. There is no guarantee that the investment objectives will be achieved, and as with all private equity investment, all of the investment capital is at risk. This communication is for information purposes only and should not be taken as investment advice or a financial promotion. As you guys know, I'm a big fan of Huel. I'm an investor in the company and they sponsor this podcast, and what I've done for you: I put together what I call the Huel Steven bundle, which is a selection of my favorite products from Huel, including the Black
1:36:04
Edition salted caramel flavor, which is super high in protein and has 17 servings per container. It also comes with their ready-to-drink product, which is one of my all-time favorite products from Huel, and the brand new and very exciting Huel Complete Nutrition bars. This is chocolate caramel,
1:36:19
you can see from the empty
1:36:20
box in front of me that I have eaten most of them, right, me and my team here; if you leave these on the counter for five seconds, they'll go. I'm going to say something I've never said: when Huel first made their bar many, many years ago, I tried it and I didn't like it, so I've never talked about it on this podcast.
1:36:34
They've spent roughly the last two to three years making a brand-new bar, which I absolutely love. If you want to order them yourself and get started on your journey, the link is in the description below in this podcast episode, wherever you're listening to it; there'll be a Steven's bundle link, so check it out. Back to the episode. If I could give you the choice to be born in 1976, as you were, yeah, or to be born now,
1:36:58
I will go for 1976. I mean,
1:37:01
the people of my generation, we were privileged to grow up in one of the most peaceful and most optimistic eras in human history: the end of the Cold War, the fall of the Iron Curtain. I don't know of any better time. But when I look at what is happening right now, I don't envy the people who are growing up in the
1:37:28
2020s. What is the
1:37:31
closing
1:37:33
statement of hope and solution that kind of ties off this conversation? What is the thing that, having someone got to this point in the conversation, they should be thinking about doing, which will cause the domino effect that will lead us to maybe a more hopeful future?
1:37:52
That we still have agency. I mean, the algorithms are not yet in full control. They are taking power away from us, but most power is
1:38:03
still in human hands, and every human being has some level of power, of agency, which means that each one of us has some responsibility. Now, nobody can solve all the world's problems, so focus on one thing. Find the one thing which is close to your heart, which you have a deep understanding of, and try to make a difference there. And the best
1:38:33
way to make a difference is to cooperate with other people, because the human superpower is our ability to cooperate in large numbers. So if you care about a specific issue, don't try to be an isolated activist: 15 individuals who cooperate as part of an organization can do much, much more than 500 isolated individuals. So
1:39:04
find your one thing, and again, don't try to do everything. Let other people do the rest, and cooperate with other people on your chosen mission.
1:39:15
Yuval, your book Sapiens changed the world in many ways. It gave us a new perspective and a new understanding of who we are as humans, where we've come from, and with that we have a roadmap for where we're going. It's celebrating its 10th anniversary. I have the 10th anniversary edition here, which I'm going to beg you to sign for me after.
1:39:34
And it really is a once-in-a-generation book. The numbers that I have are that it sold more than 25 million copies, and that's in a market where people said no one's buying books anymore. That's crazy. That's absolutely crazy. You're working on a new book, which I'm very excited to hear about; a little birdie told me that it'll be announced next year, and I'm sure everyone's incredibly energized about that. I ask people this question sometimes just as a way to close off the show, but I want to
1:40:03
ask you it because it's especially pertinent to someone that's got such a huge, varied wealth of work: is there one particular topic that is pertinent to our future that we didn't talk
1:40:17
about? I would say that when we talk about the future,
1:40:25
history is more relevant than ever before. History is not really the study of the past; history is the study of change, of how things change. Nobody cares about the past for the sake of the past. All the people who lived in the Middle Ages or in ancient Rome, they are all dead. We can't do
1:40:53
anything about their disasters and their misery, we can't correct any of the wrongs that happened in ancient times, and they don't care what we say about them. You can say anything you want about the Romans, the Vikings; they are gone, they don't care.
1:41:14
The reason to study the past is because if you understand the Dynamics of change in previous centuries in previous eras, this gives you perspective on the process of change in the present moment. And I think the curse of history is that people have this fantasy of changing the past.
1:41:41
of bringing justice to the past, and this is just impossible. You cannot go back there and save the people there. The big question is how do you save the people now, how do you prevent new catastrophes perhaps from happening. And this is the reason to study history, and the main message of history is that
1:42:11
Humans created the world in which we live.
1:42:16
The world that we know, with nation states and corporations and capitalist economics and religions like Christianity and Hinduism, humans created this world, and humans can also change it, if there is something about the world that you think is unfair, is dangerous, is problematic. Now, I know some things are beyond our control: the laws of
1:42:46
physics are beyond our control; so far the laws of biology are also beyond our control. But knowing what is natural, what is the outcome of physics and biology, versus what is the outcome of human inventions, human stories, human institutions, this is very difficult. A lot of things that people think
1:43:16
are just natural, this is the way the world is, this is biology, this is physics; they are not. They are actually the result of historical processes. And this is why it's so important to understand history: to understand how things change and to understand what can be changed.
1:43:37
We have a closing tradition on this podcast where the last guest leaves a question for the next guest, not knowing who they're going to be leaving it for. The question that's been left for you: if you could impose a global law, but only one global law, what would it be and why?
1:43:52
Oh great question for
1:43:53
you. I would say that people should consume less information and spend more time reflecting on and digesting what they already know, what they already
1:44:09
heard. Thank you. It means a huge amount to me that someone of your esteem, and someone whose books have inspired me and turned the lights on in so many areas of my life, would have this conversation with me today. So I thank you,
1:44:22
for that, but also for turning the lights on for the hundreds of millions of people that have consumed your work all around the world, the videos, the books, etc. As you said there, it's the most important work because it helps us look back at history in a way that is accessible and inclusive, in a way that even I could read without having to be a historian or understand very complex subject matter. So thank you so, so
1:44:45
much. Thank you. It's been great to be here if
1:44:51
you listen.
1:44:51
to this podcast frequently, there's something I talk about very often, and that is the subject of sleep. And so I dug down a pretty deep sleep rabbit hole to figure out how I could sleep better. One of the things that I found is a brand called Eight Sleep that sponsors this podcast, and that is the cover that I have on my bed. I saw the variance in my performance, my ability to talk, my mood, and everything that matters to me when I'm unslept. It regulates the temperature of both sides of my bed individually, so my partner can have it cold and I can have it a little bit
1:45:22
warm, and it learns about my body and sets my bed to the temperature that I need to have optimal sleep. The brands that I talk about on this show, the podcast sponsors that I have, are brands that I love and use; Eight Sleep is one of them. They've made that piece of foam that we all sleep on for eight hours a day smart. I've put a link in the description below, but you can go to eightsleep.com/Stephen for exclusive holiday
1:45:49
savings.
1:45:51
Do you need a
1:45:51
podcast to listen to next? We've discovered that people who liked this episode also tend to absolutely love another recent episode we've done, so I've linked to that episode in the description below. I know you'll enjoy it.