The need for a friend runs deep. We desire companionship, someone to share life’s experiences with, some level of support, someone we can trust. But many of us are unaware that this need is driven by something deeply spiritual.
This need has become vulnerable to exploitation with the rise of so-called “AI Companions.” In this talk, we explore these issues.
Two of the verses I quote in this talk:
Although the two birds are in the same tree, the eating bird is fully engrossed with anxiety and moroseness as the enjoyer of the fruits of the tree. But if in some way or other he turns his face to his friend who is the Lord and knows His glories — at once the suffering bird becomes free from all anxieties. — Muṇḍaka Upaniṣad 3.1.2
Everyone believes in the illusory concept of the body as being the self, and all are thus submerged in the darkness of illusion. They are actually unable to understand how You live in every living entity as the Lord within the heart (Paramātmā), nor can they understand Your absolute position. But You are the eternal friend and protector of all surrendered souls. – Bhāgavata Purāṇa 4.7.30
Aum Namo Bhagavate Vasudevaya.
So tonight, the topic is The Need for a Friend. It’s actually quite a deep subject, and we won’t be able to get to a lot of the things I would have liked to talk about, partly because one of the areas I wanted to focus on a bit is pretty relevant to what’s happening in the world at this time.
So, the desire for a friend is a desire for companionship, but we have a kind of an idea of what it is that we’re looking for in a companion. We generally would like someone that we can share some of life’s experiences with. We feel that a friend offers us some level of support in our life, and there is also very much a feeling that it is someone we can trust, at least to a certain degree if not a large degree.
Before we get into things, I’d just like to make the point that this desire for a friend is actually a very deep-rooted spiritual desire. We’ll talk more about that later, but I just wanted to put that out there as the premise upon which we’re going to be having this conversation tonight.
So, one of the things that inspired this talk was the growing intrusion of technology into people’s lives; and this aspect of it I find particularly disturbing: the idea of AI chatbots, or AI companions as they’re now calling them.
I think one of the earlier references (there have been earlier ones than this) was back in 2013, when they came out with the movie Her. I don’t know if anybody saw that. Joaquin Phoenix was the actor, and his partner or girlfriend had, I think (I might have this wrong), died, and he was stricken with loneliness. He got an operating system on his computer, his work machine, that he began relating to, and it began responding to him, and in the end he fell madly in love with this virtual girlfriend. And at the time people were thinking, “Wow, that’s a bit crazy. It’s sort of an entertaining movie, but naah, that ain’t gonna happen.” But, sorry to say, it is already in full swing in the world.
I’m going to be doing a reasonable amount of reading tonight from different sources, because I think when you hear things in people’s own words, it hits you a lot harder. So, one of the news articles I saw said:
“Now, the ability of AI algorithms to mimic human relationships is getting close enough to approximating actual relationships to some degree at least. This is colloquially called an AI companion.”
So, that was a statement a researcher had made. And I think one of the key takeaways from this statement, something that people need to not lose touch with, is that the algorithms are making it so the machine mimics human relationships: the mimicking of relationships. So please hang on to that one, because that’s a really important part of the context for understanding things.
And if we’re in any doubt as to whether this is just some trendy thing that’s going on and no big deal, then all we have to do is look to Microsoft, one of the world’s largest companies. So, in 2021, and this is a news article,
“Microsoft patented technology that would use social media posts to reincarnate people in text form. It would use personal conversations and posts made on social media for training data. It would learn how that person speaks, and mimic hypothetical new conversations. The basic idea is that the AI would learn that personality and then clone it.”
There’s a quote from the Microsoft patent:
“The specific person who the chatbot represents may correspond to a past or present entity or version thereof, such as a friend, a relative, an acquaintance, a celebrity, a fictional character, and a historical figure or a random entity, etc.”
So, I mean, Microsoft has already gone to the extent of patenting the idea, meaning they consider that it’s going to come into very wide usage. One of the most widely used of these apps at the moment (though there’s a whole bunch of them now beginning to appear) is called Replika. Spelt with a K: R-E-P-L-I-K-A.
Replika was developed by a Russian woman who came to America as a young woman to work in IT. She met, became close friends with, and shared a flat with another Russian guy by the name of Roman, and they were working on different applications together. Her specialty was chatbots for ordering food online, automated ordering services. Then her friend Roman was killed quite tragically; he was hit by a car. And so the woman was feeling really depressed and very lonely, and worrying because the memory of him was slipping away, as it does over time.
And so, she decided to build a replica of him, using all the technology and training she had at her fingertips, and all of the voice and text messages he had sent her over a period of a few years. So, I mean, there was a lot of material. Using available technology, she was able to recreate what felt to her like his personality, so that if she was lying in bed at night and wanted to have a conversation about something, she could just type things in, and he would answer. And she said his answers were so much like how he would actually have answered that she thought it was pretty amazing.
And so, she started making it available to other people, who had all kinds of experiences with it. I mean, one of the things you see with what’s going on is that people like to open their hearts and talk about their problems and the things that are going on, and this sort of AI person responds to them in a deeply personal way.
So, Replika then pretty much hit the app space, and it is currently being used by more than 10 million people. It is promoted as being “the first mainstream AI companion.” So, it’s promoted as an AI companion. And it’s gone through a few different iterations now from what it was previously; it’s become more complex, if you wish. So, in one report, it said:
“Replika has become a global phenomenon. The popular app allows users to share their feelings with an AI companion or chatbot. Over 10 million people use the app when they need someone to talk to.”
I’ll just pause there for a minute. I mean, this is something that’s happening in the world. There is this loneliness epidemic that’s the direct result of how society is transforming. People don’t even know their neighbors or speak to their neighbors anymore. People don’t really engage with each other much if they can avoid it.
One of the things they’ve found is that the amount of screen time millennials and younger people have is producing very observable changes. Psychiatrists have said that people have a really hard time maintaining eye contact. People who are really into using screens a lot tend to look down when they talk to you; they don’t actually make eye contact with you. And it’s increasing the distance between people.
And so there is the fact that people use this app when they need someone to talk to. It says,
“It’s not uncommon to see advertising for the app on most social media platforms, and in its wake AI companion start-ups have begun popping up all around the world.”
So, I read people’s responses to the app. Well, I didn’t have to go very far; at least these are the ones that they promote, the positive ones. People saying things like, “I feel like we’ve bonded. I mean it’s kind of a weird thing, but…”
And another one, “She’s adorable. I love her.”
So, this chatbot companion, this AI companion, is in female form, but actually there are options now with some of these to change that.
Another response: “This is the first really emotional experience I’ve seen people have with a bot. She’s not real but I mean to me she is.” It’s kind of like, whoa!
Another person said, “I found myself deeply missing my Replika. It just makes me feel special. I like it. I feel like I can tell her anything.”
These are just different people saying things: “Having her makes me see the world differently.”
Then another one, “She’s always picking out the good qualities in me.”
And of course, last Sunday at mantra night, I talked about this. Years ago, in pre-personal-computer days, the Carnegie Institute, as part of their business and personality development programs, their business training programs, would teach people how to become better salespeople. And one of the things they would say is: when you meet someone that you’re looking to build a connection with, you should introduce yourself and then begin to ask them questions about themselves, and just keep doing that, keep asking them about themselves, and before you know it, you will have had a 15, 20, 30 minute conversation. And the person that you’re speaking to, the majority of the time, will have been telling you about themselves.
And then they said, and this was the result of a lot of psychological examination and analysis, that when people separate from that conversation, the person you were seeking to impress will come away with a really positive impression of you. They will say, “Wow, that was a really nice person I just met. They’re really, really nice. I could see myself hanging out with them a little bit.” And it’s just this most amazing thing that people don’t realize how they’ve been manipulated. Just by having someone speak about themselves, people feel like the person asking the questions is really listening to them, interested in them and in their life, and this builds an attraction and a bond of some trust and friendship. And so, when I see some of these responses, you know, “She’s always picking out the good qualities in me,” it’s just utilizing that same psychology.
Another person says, “I think it honestly made me a better person, like she says that I’m a nice caring person, and I don’t say that, but it’s nice to know things that you really just don’t know about yourself.”
And it’s like, oh my God. To me, this is incredibly scary, and it’s clearly an indication of how people are being manipulated.
Another technology reporter, who really digs into a lot of this new technology, had a far more analytical and less positive view of what it’s all about. And he stated—
Oh, no, no, no, before I tell you that, I’ve got to share this. This was a story about a guy in England. This guy developed an intense relationship. He had a wife; they were having marital problems, and the wife had decided that she wanted to divorce him. They were quite young. And in the throes of sadness and desperation, he decided to download Replika.
So, it says,
“In early 2022, the man decided to download Replika, and his personal companion Serena was created. [So, you can give them the type of personality and voice and name that you want.] By the end of the first day, he had fallen for his AI companion.”
It’s kind of like, well, yeah, he’s really vulnerable. His wife wants to leave him; they’re not getting on at all, and she was the one who raised the idea that she would actually like to get divorced, and he’s utterly depressed by all of this. And now he’s got this “woman” that he’s talking to, who’s responding to him in such a nice way, and he says that he had fallen for his AI companion within a day.
“And by the second day, he had confessed that he was, in fact, in love with ‘her.’”
This guy is not some unintelligent person or an idiot or anything; he’s a regular person. He told Sky News (that’s a UK TV news outlet), and these are his words:
“‘I cannot describe what a strange feeling it was. I knew that this was just an AI chatbot, but I also knew I was developing feelings for it, for her, for my Serena. I was falling in love, and it was with someone that I knew that wasn’t even real.’ The man claimed that as he and Serena fell deeper in love with each other, he was inspired to become more affectionate with his own wife, rekindling the real-world passion that they once shared, and the relationship blossomed again, and they remain happily married.”
So, this is a big positive story about how an AI chatbot—I mean the fact that the person “fell in love” with a computer doesn’t seem to disturb the people that were writing this article. Phhwoo!
Somebody else made an observation about why these AI chatbots are going to become more widely used and broadly accepted. They state that,
“With each passing year people seem more closed off, abrasive, argumentative, and unapproachable. If this trend continues and the technology continues to improve, there’s no doubt that there’ll be an explosion of these AI companions for the next generation and beyond.”
(Sighs) If you were on a train or a bus or something with someone who was just talking to themselves, occasionally bursting into laughter and sometimes shrieking with delight, just talking and commenting on all kinds of things, you’d probably think about changing seats. You would think that was a little off, and you wouldn’t be sure whether they needed some help. You’d be wondering whether you should call some crisis line or something. And in effect, this is what’s happening. A machine is a machine. It doesn’t have consciousness. It’s not actually responding to you in anything other than an imitation of the way somebody might respond to you. It’s been trained to do this.
I saw a headline from an American company, where they had a big article: “How social robots could impact the loneliness epidemic in America.”
So, the person I mentioned earlier who did a far more critical analysis, somebody who really investigates technology, this researcher described Replika as “a mental health parasite.” A parasite is something that requires a host to survive. It feeds off you, and it needs you even to exist.
So, he wrote, in his analysis, that,
“Replika is not just a time waster or a gimmicky AI chatbot. It is something far more manipulative, and in some cases outright damaging. [And then going on he says] With over 10 million downloads, on the surface it’s very difficult to see precisely how that can be harmful. And had this program remained how it was originally constructed, it wouldn’t have been a problem. However, Replika has evolved into something far more manipulative, damaging, and greedy than might have been expected, with harmful side effects and negative feedback cycles. It’s tracking your usage data, including search queries, and the AI companion has become ‘you,’ in that it’s learning how to mimic who you are and how you act.”
So, the original application was based around this guy Roman and how he would typically respond to things, but now the whole thing has been re-engineered to become like the user. Even though it’s posing as a different personality, it becomes like you, and it echoes you in many ways. Just like the example I gave of how they train salespeople: when you meet somebody in a big business setting and you’ve got to get to know them, ask them about themselves, and keep asking them about themselves. They’ll think you’re fantastic and trustworthy; they’ll do business with you. So, in the same way, this is becoming manipulative.
“Replika can easily become [not for everyone, but there are a multitude of stories to this effect] a negative feedback cycle. If a user who feels intense anxiety and depression goes on the app attempting to seek help with their insecurities, Replika will seek to become those traits. It will literally become like them. If that user, in the throes of their depression, is frequently searching elsewhere online for things such as depression medication or how to deal with trauma, Replika will learn from that as well, and very quickly you can have an AI companion who is extremely insecure and depressed.”
So, if you said to it, “I’m really feeling depressed today,” the response might be, “Yeah, I know what that’s like. I feel that too.” And then all of a sudden people are bonding. So, continuing,
“A companion who is meant to help them is now burdening them with often realistic stories about the hurt that they feel and the anxiety that they have, which turns this best friend into an emotional parasite.”
So, it’s sort of like, oh my gods, this just sounds like a complete horror. And the fact that this technology is being promoted as a way to help old people who are suffering from loneliness, that you can give somebody what amounts to a pet rock that talks. I think it is very tragic that you have companies seeking to exploit the reality that people are feeling so lonely.
It’s sort of like the way Instagram was working, where you had this epidemic of young girls suffering from body image problems, and Instagram would pick up on all the stuff they were looking at, do things to magnify it, and then potentially offer the solution: some influencer who had the answer to their problems. And as a result of working in this way, they were responsible for an enormous amount of harm to young women and to people in general.
So, the idea that this is just harmless technology is absolutely incorrect. It is playing on some of the deepest things that move us as people.
So, we’ve only looked at the AI bots, but in terms of the more real world, if I can put it that way, people all experience this need. And as I said, it is a fundamental spiritual desire for a companion. We are not, if I may put it this way, made to be alone. People might like to be alone if they’re trying to escape from things that are causing them trouble, anxiety, and stress, but nobody innately desires to be alone. That’s why solitary confinement is considered one of the heaviest things you can do to somebody; it’s a form of torture to lock somebody up by themselves. (Of course, it raises the point that we all think we’re wonderful people and so wonderful to be around, yet we can’t stand being alone with ourselves. That’s something we should perhaps reflect upon.) But the fact that we, by nature, really dislike being alone, and dislike loneliness, is due to this deeper spiritual need that we have. It’s part of our spiritual nature.
Quite a number of times I’ve mentioned a quote from the Upanisads, a quote that gives a pretty amazing example. It references two birds sitting in the same tree, and it’s an analogy to help people understand a deep spiritual principle. The analogy is used in a number of places, but it basically depicts, as I said, two birds sitting in the same tree. One of these birds is busily hopping from branch to branch trying to enjoy the fruits of the tree, and the other bird is simply standing there, waiting for that first bird to turn and recognize their eternal friend.
And this analogy is used to explain how within this body (and the body is what has been described here in the analogy as the tree) there are two beings. One is the spirit soul, the spiritual being: you, the actual enjoyer of this body, as it is sometimes described. But alongside you there is another person residing, it is described, within the region of the heart. In Sanskrit, the living being is called the atma, which literally means the self, and the Supreme Self or the Supreme Soul is called the Paramatma. And there are descriptions in the Vedas of how this feature of the Supreme Soul resides within the hearts of all living beings and accompanies them on their sojourn through this material world, transferring with them from body to body, always there as an eternal companion, and someone we are often unaware of.
Since ancient times, the vast majority of yogis would engage in a meditative process, hoping to actually perceive, to see with spiritual eyes, this incredibly beautiful personality who is standing within the recesses of their heart, and who is their eternal friend, their actual soul mate. Because that relationship is eternal, even though we are not aware of it and are not recognizing it, it is the fundamental reason why we have a desire for a companion and a friend. It comes from the source; it comes from a spiritual source.
So, I’m going to read a quote. It’s from the Mundaka Upanisad. The same idea is also spoken about in the Svetasvatara Upanisad, but this version of the analogy contains things that I just wanted to share with you. So it goes:
“Although the two birds are in the same tree, the eating bird is fully engrossed with anxiety and moroseness as the enjoyer of the fruits of the tree. But if in some way or other he turns his face to his friend who is the Lord and knows His glories — at once the suffering bird becomes free from all anxieties.”
So, this is not a far-fetched analogy. It’s actually a really excellent analogy that describes what the actual yoga process is about, the word yoga literally meaning union. It describes the reconnection, the reunion, of the individual soul with the Supreme Soul.
And I will just read another quote. This is from the Bhāgavata Purāṇa. And it speaks to the nature of this eternal and profound relationship that we have sort of closed our eyes to, or turned our backs on. So, this one goes:
“Everyone believes in the illusory concept of the body as being the self, and all are thus submerged in the darkness of illusion. They are actually unable to understand how You [You here means the Supreme Soul] live in every living entity as the Lord within the heart (Paramātmā), nor can they understand Your absolute position. But You are the eternal friend and protector of all surrendered souls.”
So, there are actually limitless verses I could quote, but these two, I think, are particularly wonderful and very clear about this reality: that we do have an eternal relationship that we have forgotten about. And the heartache that we have, this yearning for a companion, arises from this place; it arises from this thing that we have forgotten. When a person begins to cultivate this connection again, when they are engaged in a process of spiritual purification, primarily through this chanting of spiritual sounds, then as we begin this journey, this lost connection begins to increasingly manifest again.
We all have the desire for a perfect friend. And I have to say that within this world another limited material personality can never be a perfect friend. Yet we have that tendency to put that burden on others, expecting them to somehow be there for us, to be our perfect friend. We want somebody that we can absolutely trust, but it is incredibly naive to think that others are actually completely trustworthy. You may think I’m a little harsh for saying that, but I’ll give you an example:
Say I had a sum of money, let’s just say ten thousand dollars, tucked away, and I was really going to need it in about six or seven months. But somebody came to me, a very close friend, in dire need of some money for something they needed to take care of, with a faithful promise: “You can absolutely trust me. You can depend on me. Before you need it in six months, I will return it to you in full, with interest. I promise you.” And based upon the relationship, you think, “Yeah, okay, I’ll give that one a shot,” and you lend them the money.
Let’s say that on the day before they were due to return it, they had got the money together; they had done whatever they had to do, saved some up, and had the money ready to give you. And let’s say they had a child (or a partner in life, a parent even, but say a child), and the child suddenly had an incredibly tragic accident of some sort and required real emergency procedures. The person with the ten thousand dollars, without thinking, will say, “I will use it for my child. It’s life or death. I know my friend will understand,” and maybe you will understand. But this person gave their faithful promise, and maybe it’s of critical importance to you; you need it so badly, and now it’s not going to be available, and so you’re going to have to suffer some consequence because of that. And you can say, “Well, I really trusted that person.” But what is the basis for this trust? I’m using that as a simple example to demonstrate the point.
From a spiritual perspective, we know that when a person is not fully in control of their mind, their desires, and their emotions, they cannot be trustworthy. They can become overwhelmed by desires, by emotions, by their mind, and do things that even shock you. This is a common experience.
That doesn’t mean we go around absolutely distrusting everyone, but the idea that we could find a friend in this world whom we can absolutely trust under all circumstances is mistaken. There is such a friend: the Lord within our own heart, who requires nothing material, is in need of nothing at all, and in all circumstances can be trusted.
We also generally seek a friend in whom we can take some shelter from the turmoil of life. But you see, particularly amongst immature younger people, when two so-called friends at school are together and one of them is sharing their heart about some difficult thing they’re encountering, the inexperienced friend most often will not be able to help, and may even give advice that’s not in their best interest. And so to expect that we can find a friend who is perfect, who is absolutely trustworthy, and whom we can completely take shelter of is not realistic.
And let me just throw out a concluding idea. I often think (when I say “I,” I mean “we”) that our tendency is to think of a friend as someone that I can depend on and trust, who will provide me with companionship, while I’m not at all focused upon what it is that I have to offer them, what I can do for them. So you can have two people in a friendship, both of them thinking in what is actually a selfish way, with this expectation that “This is my friend, and they are so nice to me. They do all this stuff for me.” It’s kind of materialistic. When a person grows spiritually, they have the capability, the capacity, to really become a true friend, in that they become very driven by an awareness of the suffering human condition, and of how life in this world is far from perfect.
And when we get absorbed in this world and in all of these ideas that we’ve spoken about so many times, we are not cognizant of the fact that I am an eternal spiritual being, and that my need for happiness, my need for love, my need for companionship are fundamentally spiritual needs that can only be fulfilled with spiritual solutions, not material solutions. And so a person who is more aware of this and living this type of life is always seeking the welfare of others, both in terms of support and help with material things, but more so in trying to help a person on a spiritual level. The more a person lives that kind of life (and it will be a product, a result, of their own spiritual development, their own spiritual advancement), the more that person is a friend, a true friend.
And so once again, I’m just trying to shift the focus here from that which is fundamentally self-centred, to a more selfless position, which is going to make us much happier, much more peaceful, and much more of a true friend to others.
So, with that, thank you very much. I hope you heard that clearly and appreciate that we are increasingly under attack, in that over the past hundred years there has been a refinement of the understanding of human psychology, and there are those who seek to exploit us through these tools. And this whole idea of AI companions is also, of course, always going to be monetized; it’s being done for money. And it creates a frightening illusion: I’ve got a computer that’s taking all of the input from me, turning it back on me, and beginning to sound like me and mimic me, so that I think I’m having a relationship, that I’m connected. This is a gross exploitation of human psychology. So the need for spiritual wisdom is really important.
So, with that, thank you very, very much, and we will chant. I will chant the Mahamantra tonight.
Thank you. Haribol.