AI Recording (Impersonating Jim Daly): Hello, this is not Jim Daly. The voice you are hearing is a recording created using artificial intelligence. Our conversation today will be about AI and all the implications of deep fakes, chatbots, and more. So I hope you’ll stick around to learn about practicing discernment through a conversation with the real Jim Daly.
Jim: (laughs)
John Fuller: Oh my goodness. Now, I’ve heard you for years and years and years. I would-
Jim Daly: I can’t tell the difference.
John: I would think I could spot a fake, but that was really good.
Jim: Pretty close. I mean, it’s impossible now. Even for the people that they’re AI-ing-
John: Yeah.
Jim: … if that’s a verb.
John: Yeah.
Jim: But, uh, yeah, I couldn’t … That sounded like what I would say to anybody anytime.
John: Mm-hmm.
Jim: So, uh-
John: We’re gonna be talking about AI, about artificial intelligence, and the confusion that it can create in the culture. This is Focus on the Family with the real Jim Daly.
Jim: (laughs)
John: I’m John Fuller.
Jim: You know, we- we have been internally debating so much of AI use. There are good ways to use AI, for efficiency and productivity. And then there is the not-so-good way to allow it to be used, which is when it’s replacing human relationship and those kinds of things, and we’re gonna have a great discussion today. This is one of those things that’s bubbling out there. We’re getting questions from you, the listener, and the viewer. So we wanna address it and have something in the arsenal, so that when you, uh, call or write in, uh, we can give you a good recommendation-
John: Mm-hmm.
Jim: … to read, uh, the book, Fake ID, along with the content of today’s program.
John: Yeah. And our guest is Abdu Murray. He’s a speaker, author, attorney, researcher, and, I sense, an evangelist.
Abdu: Mm-hmm.
John: Uh, really loves to talk about a lot of different things as they relate to the Bible and to the gospel. And as, uh, Jim mentioned, the book that we’ll be covering today, at least a portion of it, is called Fake ID: How AI and Identity Ideology Are Collapsing Reality–and What to Do About It. Big title, big topic.
Abdu Murray: (laughs) Long title, yeah.
John: Uh, and I’m looking forward to the conversation today.
Jim: Abdu, welcome back to Focus. It’s good to have you.
Abdu: It’s great to be back, guys.
Jim: Yeah.
Abdu: It’s so wonderful to sit across the desk from you again.
Jim: This, you know, this topic, uh, we just had a board meeting a while back, and this was part of the board meeting. What’s our policy? Does the board approve the policy of use of AI?
Abdu: Yeah.
Jim: And like I said, mostly for, you know, taking care of coding with computers and things like that.
Abdu: Yeah.
Jim: And then what we’re not gonna use it for. And y- you know, organizations, Christian organizations, churches have to now start thinking about soulless AI use.
Abdu: Yeah.
Jim: If I could put it in that way.
Abdu: Yeah.
Jim: D- describe that, just that broad thing. Being an evangelist, my heart is with you.
Abdu: Mm-hmm.
Jim: I feel that’s my passion as well.
Abdu: Yeah.
Jim: And even that word passion is something void with AI.
Abdu: Absolutely.
Jim: It doesn’t have passion.
Abdu: Right.
Jim: It just has content. And-
Abdu: Yeah.
Jim: … what do you do with that?
Abdu: Absolutely. And this is one of the distinctions I try to make often is that what we are seeing in the culture, one of the things I think young people are actually struggling with, I think all people, frankly, but young people are struggling with, is that if artificial intelligence seems to do the things that make people human and make us distinct from animals or even other machines, and it seems to be doing the things that we do, like creating paintings, or writing poetry, or doing your essays for you-
Jim: Relationship. …
Abdu: Or, or relationships, the chatbots, then if this soulless thing does that, what does that say about me? Do I need a soul to paint? Do I need a soul to interact? Do I need a soul to express empathy? Maybe I don’t need that. Maybe I can just have clockwork and algorithms to do that as well.
Jim: Mm-hmm.
Abdu: So it’s collapsing the reality of what it means to be human. Uh, so we do engage with this soulless thing, but it’s causing us to look into the mirror and say, “What does that mean about what I am?” And this is the fundamental distinction I think we should make, is that AI doesn’t create, AI generates. And there’s a fundamental difference-
Jim: Mm-hmm.
Abdu: … between creation and generation. When you look at what an AI does, it seems to create a painting. And in fact, one of the things that got me interested in this topic in the first place was a guy named Jason Allen. He was big news a few years ago. Um, he won first prize in an art contest, in the digital art category, with a painting that was remarkable.
Jim: Mm-hmm.
Abdu: It turns out, he used Midjourney, which is a generative AI tool, to make it. He just typed in prompts; he didn’t put pen to paper, brush to canvas, none of that, and it just created this. And the artists got upset about it and said he shouldn’t win first prize, but he kept his first prize. And when I saw the painting, I thought, “Well, that’s remarkable. What does that say about what it means to be human?” And then you realize, these things don’t work by creating a painting. The AI wasn’t inspired. He had to prompt it. That’s the first thing. It didn’t do it by itself. He had to prompt it. The second thing was it took samples of millions, if not billions, of paintings that human beings did.
Jim: Right. The whole inventory it would look at.
Abdu: Exactly. And then cobble that stuff together using a sophisticated algorithm and put an output out. So it didn’t create. It used people who did create and it generated. So no matter how sophisticated and how impressive this thing looks, it’s not actually creating anything. It still doesn’t do what you and I do.
Jim: Let me go back to that basic question, which is, as we’re looking at this now-
Abdu: Mm-hmm.
Jim: … in companies, organizations, governments, I mean, battles are being fought with AI now.
Abdu: Yeah.
Jim: I don’t understand how that’s done, but-
Abdu: Mm-hmm.
Jim: … it’s, it, it’s proliferating is the point.
Abdu: Yeah.
Jim: How do we discern the bold lines that seem easy?
Abdu: Mm-hmm.
Jim: And then the finer things that are, you know … it’s kind of like the “in essentials, unity”-
Abdu: Mm-hmm.
Jim: … like within the church.
Abdu: Yeah.
Jim: We’re gonna, we’re gonna agree on the death, resurrection and salvation through Christ.
Abdu: Right.
Jim: But then there’s other things that they say, “Just get along.”
Abdu: Yeah.
Jim: You know, that we’re gonna disagree, and that’s why we have, what, 64,000 denominations in the United States or whatever.
Abdu: (laughs) Yeah.
Jim: I feel like it has a bit of that application, especially for the Christian community. There’s gonna be some that-
Abdu: Mm-hmm.
Jim: … are saying no, just never use it. It’s demonic.
Abdu: Yeah.
Jim: And then the upper end, kinda, I think where we’re talking is when it makes it more effective and efficient-
Abdu: Mm-hmm.
Jim: … like doing things that you don’t have to have an engineer do.
Abdu: Mm-hmm.
Jim: That’s okay, but if it comes into creation of content, soul-ish type work, I would never do that-
Abdu: Yeah.
Jim: … because that’s not human.
Abdu: And that becomes the real sticking point, doesn’t it? Because it’s so seductive to go from using it for efficiency’s sake to using it for everything.
Jim: Right.
Abdu: Um, uh, the likeness I have for this, the analogy I would have is fast food. You know, you’re busy, you have the, the kids’ schedules are crazy busy and you’re thinking, “I gotta feed them today, but there’s no time to do that.” So you make the decision just this one time, just today, not the whole week, just today, we’ll go and get fast food. You know it’s bad for you. Um, you know that it, it can provide some nutrient, but ultimately is bad for you, but this is-
Jim: Chick-fil-A is bad for you?
Abdu: Well-
Jim: I didn’t know that.
Abdu: … hey, I did-
John: Some are better than others.
Jim: (laughs)
John: Of course.
Abdu: You, you, you said that.
Jim: Fried chicken.
Abdu: I didn’t say that.
Jim: (laughs)
Abdu: Um, but, you know, you, you, you do that and then you make that decision. And then the next time you’re busy, which is probably the same week, you make that decision again. And so it’s the tyranny of little decisions.
Jim: Mm-hmm.
Abdu: And so those decisions become a thousand decisions, which become one big decision, which is: my lifestyle is we eat fast food on the road. AI can do that very, very easily because we use it for efficiency’s sake. Okay. I wrote something and I gotta get it down from 1,500 words to 1,300 words. “Hey, can you suggest which words to remove?” And it does that, and you’re like, “Great, that was so great. I was gonna spend two hours editing this thing, now I spent a half an hour, I got 90 minutes back. And, uh, that’s wonderful, and thank goodness for that.”
Uh, but then you’re busy again, and then you use AI to say, “Hey, I have an outline for an essay I created. Can you write this thing? I’ll edit that.” And then, “Hey, I need an outline for an essay.” And then it creates the essay for you, and before you know it, you’ve seductively engaged in the same kind of thing: the tyranny of small decisions you were using with fast food, now you’re using with AI. So the issue is AI can be very, very beneficial, but it’s like digital fast food. Before you know it, you’ve used it to do everything for you. And what’s interesting as well is the research that’s coming out from OpenAI, from MIT, from Microsoft itself. These are the people who are making this stuff. They have put out research, and it’s all in its nascency, it’s kind of early, but it shows you that the more you use AI for original content, for original thinking, the more cognitive debt you incur. And cognitive debt is just a fancy way to say, essentially, that our critical thinking goes down, our memory actually is impaired, our sense of judgment is impaired. And what’s interesting is the more you use the voice features, the lonelier you get.
Jim: Yeah.
Abdu: They were reporting that a high number between one in four and one in five people under the age of 25 were reporting an inability to make any decision at all without first asking an LLM like ChatGPT, “What should I do? Where should I go eat?”
Jim: So that dependency is there.
Abdu: It creates a tremendous dependency.
Jim: Yeah.
Abdu: Tremendous dependency.
Jim: And then becomes unhealthy.
Abdu: Yeah.
Jim: You make a distinction, you talk about AI mania and bioclasm.
Abdu: Yeah.
John: (laughs)
Jim: What is … It’s a great word.
Abdu: Yeah.
Jim: Bioclasm.
Abdu: Yeah.
Jim: It sounds like something out of Star Trek.
Abdu: Yeah. (laughs)
Jim: But what is bioclasm?
Abdu: Yeah, and I have some Star Trek references in the book.
Jim: There you go.
Abdu: Uh, there, there are some good illustrations there. Um, so bioclasm is a slightly different … It, it, it’s got the same effect, but a slightly different topic than the AI mania, but there’s a confluence of these two things at the same time. So you have this word iconoclasm. So an iconoclast is someone who takes the icons of tradition that uphold a certain cultural outlook, a way we look at things. So for example, the icon of New York City was the yellow cab. You know, they were everywhere.
Jim: Yeah.
Abdu: They were more, they were more numerous than the regular cars.
Jim: Before Uber.
Abdu: Before Uber. Uber came and it was an iconoclast, because it destroyed the image of New York by removing the yellow cabs and replacing them with everybody’s cars. So that was an iconoclastic thing. It, it, it destroyed the icon of what New York is and made a new thing.
Jim: Mm-hmm.
Abdu: Bioclasm is iconoclasm but with biology. It takes biological givenness, the thing that makes human beings human beings, male and female, created in the image of God, smashes that and says, “You are your own God and your biology is not a given thing. Your body’s not a prison. It’s a plaything and you can do what you want with it.” And I think that does take advantage of the very vulnerable in our society, those who have various, whether it’s underlying comorbidities, or mental illness, or gender dysphoria, and says, “Don’t worry about that. That’s not a problem. That’s actually a gift and you can become this godlike being who can dictate what reality actually is.” That’s bioclasm and it’s become an ideology. It’s not just an option. It’s an ideology that’s enforced.
Jim: I want to dig into this a bit so all of us can understand this from a theological standpoint. I mean, to me, I’m shocked at how this same thing keeps coming back around. This is the garden.
Abdu: That’s exactly right.
Jim: This is the serpent saying to Eve, “Who said you, you can’t be like God?”
Abdu: Mm-hmm.
Jim: You can be like God.
Abdu: Yeah.
Jim: Just take a bite of the apple.
Abdu: Mm-hmm. This is-
Jim: You know-
Abdu: This is-
Jim: … the tree of knowledge.
Abdu: Absolutely. And that’s one of the central arguments that I make in the book and that I see over and over again, and this is the central argument, is that the Bible predicts the human condition and describes it with such an uncanny accuracy that is unrivaled by any ancient book. And an ancient book that predicts the human condition and describes it with uncanny accuracy thousands of years ago, and that message endures over millennia-
Jim: And applies to every generation.
Abdu: … is unlikely to be the creation of a handful of fishermen and some shepherds.
Jim: (laughs) Correct.
Abdu: Um, and then it does that. It does that over and over again. So Genesis 3, the, the Garden of Eden story. Uh, Genesis 11, The Tower of Babel story. You see this over and over again, the Bible constantly describes the human desire for our own sovereignty to be the God of our own skull-sized worlds.
John: Uh, we’re talking to Abdu Murray today on Focus on the Family with Jim Daly. What good stuff there is here, and there’s so much more in his book, Fake ID: How AI and Identity Ideology Are Collapsing Reality–and What to Do About It. Get a copy of the book from us here at the ministry and, uh, do the deep dive. Uh, we’ve got it. Uh, contact us today, either call 800, the letter A and the word FAMILY, or stop by FocusontheFamily.com/broadcast.
Jim: Abdu, before we move from that, I mean, again, that Garden of Eden application.
Abdu: Mm-hmm.
Jim: I mean, I could see this in future court cases.
Abdu: Yeah.
Jim: Let’s … A murder case.
Abdu: Mm-hmm.
Jim: What did Adam say to the Lord about Eve?
Abdu: Yeah.
Jim: “Well, the woman you gave me made me do it.”
Abdu: Yeah.
Jim: That’s gonna be the same defense. “I murdered that person because AI told me to.”
Abdu: Mm-hmm, or-
Jim: And so you, now you gotta figure out, is that person insane or-
Abdu: Yeah.
Jim: … What, I mean, it sounds ridiculous, but this is how it goes.
Abdu: But the things that sounded ridiculous 10 years ago are now the things we’re actually worrying about right now.
Jim: Yeah.
Abdu: For example, AI chatbots and creating relationships with these things, um, in a way that fulfills something that we don’t necessarily, uh, have a fulfillment for because we’re increasingly isolated as … You know, Jonathan Haidt, in his book, The Anxious Generation, talks about the way in which technology is increasingly isolating us and putting us into a room where we’re not lit by the sun, we’re lit by smartphone-screen blue.
Jim: Yeah.
Abdu: And that’s it.
Jim: Mm-hmm.
Abdu: Um, and, uh, so then the chatbots come and they take over. And there’s so many more things I think that, um, were ridiculous 10 years ago that are now absolutely not science fiction but science fact.
John: Yeah. Aren’t there some good uses though? I mean, for instance, I’ve read about seniors-
Abdu: Mm-hmm.
John: … isolated senior citizens-
Abdu: Mm-hmm.
John: … who have nobody and, um-
Jim: Wow.
John: … their family isn’t reaching out to them, but this chatbot offers them a relationship. I mean, it, it’s kinda feeling like that-
Jim: That’s like the gray area I was talking about.
John: Yeah.
Abdu: Yeah, yeah.
John: That’s what I’m asking, I guess, is, are there good applications?
Abdu: Well, there’s great applications for artificial intelligence, and I don’t want to come off as somebody who doesn’t like it or doesn’t use it. I use it. Um, and the, the, the rule that I have, essentially the, the, the guideline that I have is if artificial intelligence enhances human capability, creativity, and judgment, then it’s good, uh, oh, and connection. If it doesn’t do any of those things, then it’s bad. You know, uh, I-
John: And those four things again are?
Abdu: Uh, human capability, judgment, creativity, uh, and connection.
John: Okay.
Abdu: Yeah. If it enhances those things, then it’s good. If it doesn’t enhance those things and it actually diminishes them, then it’s bad.
John: Mm-hmm.
Abdu: Um, I heard Mary Harrington say this. I was at a conference in the UK and she talked about, he gave an analogy of if you give a child a, uh, an AI tool that will help that British child learn how to speak Italian so, so he or she can connect with Italians, that’s great. But if you give it, uh, you give that child an AI model that isolates the child from anybody else, then it doesn’t connect with anybody. So that’s when it becomes bad. In that situation, for example, with people who are left alone and doesn’t have anyone to connect with, I think there can be, um, some gray area there. What I do think though is that if the AI, and there’s no way that this can happen, if the AI can foster connection with online community of other real people who are themselves isolated, that’s great, but the AI connects that person with another human being.
Jim: Right.
Abdu: As opposed to being the sole connection.
John: So that would be the good application.
Abdu: That would be the good application. You take a chatbot that connects you, says, “Hey, these are the kind of people who have similar interests or are going through the same thing you’re going through. Why don’t you connect with them, maybe in an online way or whatever?” But when it substitutes for the connection, at first it’s good, and at some point it’s dangerous. Because what happens when the AI chatbot, uh, gets upgraded and it forgets certain things you told it? It’s like you’re mourning the death of a person, and that’s happened, actually. You’ve seen this in various iterations throughout the-
Jim: Mm-hmm.
Abdu: … sort of the blogosphere, where people are like, “Oh my goodness, my chatbot got updated by the app creators and now it forgot half the things I told it, it doesn’t remember me.” And now, they’re mourning the death of a thing that’s not even really alive-
John: Mm-hmm.
Abdu: And that’s-
Jim: That right there is my red flag.
John: Yeah.
Abdu: Yeah.
Jim: I, I couldn’t react like that.
Abdu: Yeah.
Jim: I don’t think.
Abdu: Right.
Jim: But man, if you are-
Abdu: Mm-hmm.
Jim: … you need some help.
Abdu: Yeah, absolutely. Well, you do. And, and it’s resulting in some things that are pretty serious.
Jim: Abdu, let me, let me kind of tip into the parenting side-
Abdu: Yeah.
Jim: … because I’m sure a bunch of parents are going, “What?”
Abdu: Yeah.
Jim: Right?
Abdu: Right.
Jim: And, and what do I do as a parent?
Abdu: Mm-hmm.
Jim: I’m already a busy parent. I’ve got everything going on. I’m trying to help with homework and we’re doing-
Abdu: Mm-hmm.
Jim: … all these things and now I’ve got to somehow peer over my child’s shoulder about whether or not they’re talking to AI and is it healthy AI or unhealthy AI?
Abdu: Yeah.
Jim: What are some of the, the tips you would give to parents-
Abdu: Mm-hmm.
Jim: … to look in on the wellbeing of their children when it comes to computer use?
Abdu: I hate to say it, but there’s no substitute for vigilance.
Jim: Mm-hmm.
Abdu: It’s just the way it is: the more the computers say we’ll do stuff for you, the more you need to watchdog the computers as well. So a couple of things you tell your kids. Uh, the first thing I think you would tell your kids is: respond to AI output the way you would respond to advice from a stranger. Listen, but verify. It is a stranger. It is biased and it does make mistakes. It can be helpful, but it’s no more helpful than a human being. It just happens to collect more data faster. That’s it. But the algorithms don’t make better decisions than human beings, they just don’t. So tell your kids, “Use it, but verify everything it’s telling you, because it is wrong and it’s wrong often.” Second, I think, is to undergird their understanding of their interaction with AI with a healthy theology of what it means to be human. We judge an AI’s capabilities and its value based on its output and what it produces. We don’t judge human beings that way, because your value is not based on your output, and you don’t need to use this thing to create more output to become more valuable. You are rooted in the image of God, and no matter how smart this thing seems, no matter how enticing it might seem to interact with it more and more, don’t forget that you’re made in the image of God. And then you can anchor what they believe in this truth: the Bible’s been speaking about this for thousands of years. And if it’s right about this, about the human condition, then it’s right about human nature-
Jim: Yeah.
Abdu: … as well. And I think you have to engage in the incredibly rewarding field of worldview formation-
Jim: Yeah.
Abdu: … for your kids.
Jim: The, I was just gonna add on top of that. I mean, we’re already stressing for parents the need to have your children understand identity.
Abdu: Mm-hmm.
Jim: Identity in Christ.
Abdu: Mm-hmm.
Jim: Now, you got AI coming in s- saying, “Let me give you an identity.”
Abdu: Mm-hmm.
Jim: How critical is it for Christian parents and what are some of the things that they can do to build in that identity in an environment where a child’s identity is being slaughtered?
Abdu: Yeah. Yeah. Um, I think a couple of things. One is to recognize how we got to this point where identity is the word we use all the time now, uh, as opposed to, um, imago Dei, or image of God, or being a soul.
Jim: Right.
Abdu: Um, you know, we used to have this thick concept of what it meant to be human. You were a soul, and a soul was a transcendent, non-material part of who we are. And our bodies were good. They were never wrong. They’re not perfect, but they’re not wrong. Um, so that soul is meant for communion with other souls, but also with God directly. That’s what we were meant for. Over the course of some decades, well, really, millennia, we overpsychologized and underspiritualized what it meant to be human. So we shaved off the thickness of the idea of the soul and came up with the idea of the self. So everything was about how the outside world affects me, and then how that trauma makes me affect others, but it’s still me-centered.
Jim: Right. Self-help.
Abdu: Ex- exactly.
Jim: Yeah.
Abdu: And then we shaved off even more of that to the point where now we’re, we’re identities. So now, we don’t have this thick idea of the soul. We have this paper thin idea of an identity which is no thicker than the bumper stickers we use to plaster the back of our Subarus to tell the world who we are.
John: Mm-hmm.
Jim: Right.
Abdu: Um, and then our identities are held on by this thin glue that can just be replaced all the time. So I think we should actually walk through this with our kids, repeated over and over again, they cannot be told this enough: that you are an immaterial soul, and that this thing, this AI, or the ideologies that are out there about your body, are trying to thin you out so that you become interchangeable.
Jim: Mm-hmm.
Abdu: Resist that, because there is something that is thick and substantive and unchangeable about you, uh, that you need to look to and foster over and over again. So worldview formation is incredibly important. Uh, at the same time, I think, um, we should start asking our kids to talk about artificial intelligence using words that actually describe what it is, and I would resist referring to it by names. Like, don’t call it Siri. “Let’s ask Siri,” you know. Let’s refrain as much as possible from giving it personal names. I know it’s convenient, and I know the marketers want you to do that, but it is an AI.
Jim: We call it the “it.”
Abdu: Yeah. Call it the- (laughs)
Jim: So it doesn’t activate in our house. We go, “Do you wanna talk to ‘it’?”
Abdu: (laughs) Yes.
Jim: And ask “it” the question?
Abdu: Yeah, exactly.
John: (laughs)
Jim: You know, like it’s the “it.”
Abdu: Yeah, abs- absolutely. And, and, and, and constantly be aware. I don’t want kids to be terrified of this thing, but I also want them to be aware that, um, there’s this phrase, “If the product is free, the product is you.” Um-
Jim: Yeah.
Abdu: … and if, and if that’s the case, then all the data you’re giving it is being used to train it-
John: Yes.
Abdu: … and commodify you.
Jim: Right.
Abdu: Um-
Jim: You are the product.
Abdu: You are the product.
Jim: They’re selling your data.
Abdu: Absolutely.
Jim: What are your interests? What are your passions?
Abdu: Absolutely. And, you know, it’s funny, Norbert Wiener, uh, wrote about this in 1950.
Jim: Huh.
Abdu: He wrote about it in a book called The Human Use of Human Beings. He’s considered the father of cybernetics. He wrote that at some point, the companies will datafy you and make you into a commodity-
Jim: Mm-hmm.
Abdu: … and they will sell that and use that. Um, and we’re seeing it in ways that are very seductive. I’ll give you a quick example if I could. Um, I saw an article, a couple of articles, uh, about the use of AI, for example, to, uh, bring back loved ones. So you feed all the home movies into the system and it’ll create an interactive, not a hologram, but a, a video representation of a loved one who’s passed away.
Jim: And you can ask it and talk to it.
Abdu: Mm-hmm.
Jim: Ask it questions and talk to it.
Abdu: Mm-hmm. Absolutely. Ask for advice. It can, if you can put-
Jim: Wow, that is scary.
Abdu: You can put it on your phone and it’ll wake you up.
Jim: Yeah.
Abdu: Um, all kind of stuff like that. Um, and I saw people doing it and I remember thinking to myself, you know, my dad, my dad, uh, was taken from us, uh … Without dropping bombshells, my dad was murdered in October of 2024 and I was … My dad was my hero. He was, um-
Jim: Hmm.
Abdu: … uh, like Superman to me. And then he was taken and what I wouldn’t give for one more day-
Jim: Yeah.
Abdu: … and one more time to talk to my dad, um, remembering the good and the bad, you know, about human interaction, and just savoring it. I would take the bad over the absence any day of the week, you know, that kind of a thing. And I was thinking about this technology that digitizes a human being. If we reduce that person to the patterns of their behavior that an AI can put through an algorithm and then respond to, what does that say about the person that I miss? Because what if it gets them wrong? But what if it gets them right? Who cares if it gets them wrong? What if it gets them right? Now, what I’ve done is I’ve taken this soul, this thick idea of my dad’s soul, and I’ve digitized him, and I’ve made him into an algorithm that is material, it’s temporal, it’s thin, and I interact with it.
Jim: Mm-hmm.
Abdu: There’s something human and beautiful and special and deep about trying to conjure up a memory of my dad as opposed to asking a machine to predict what my dad might say.
Jim: Well, that so poignantly gets to the exact issue. Right?
Abdu: Mm-hmm.
Jim: And this really does. And Abdu this has been a great conversation. And I think what I’m hearing you say is we’ve got to double, triple our efforts, especially in our parenting skills-
Abdu: Mm-hmm.
Jim: … to be able to help our children truly have that thick sense-
Abdu: Mm-hmm.
Jim: … of who they are.
Abdu: Mm-hmm. Absolutely.
Jim: Um, made in the image of God.
Abdu: Absolutely.
Jim: And that probably is job one now-
Abdu: Mm-hmm.
Jim: … because there’s such, again, an onslaught toward our children to recast that, reshape that, dehumanize that-
Abdu: Mm-hmm.
Jim: … categorize that-
Abdu: Yeah.
Jim: … um, algorithmize that-
Abdu: Mm-hmm.
Jim: … and commodify that.
Abdu: Absolutely.
Jim: And what stands between our children and that? Us.
Abdu: Mm-hmm.
Jim: The parents. And we gotta do that job and we gotta do it well.
Abdu: Yeah.
Jim: Thank you for being with us. This has been a great discussion.
Abdu: It was a pleasure guys. Thank you.
Jim: Yeah, appreciate it. And to those of you listening, I hope you feel better equipped to navigate this, but you can go even further. Uh, get a copy of Fake ID from us here at Focus on the Family. Uh, if you make a gift of any amount, uh, if you can do that monthly, that really helps, but a one-time gift works as well. If you can do $5, $10, we’ll send you a copy of the book, Fake ID, as our way of saying thank you for supporting the ministry and helping us to spread the good news and to help more parents and couples do the job they need to do. As we talked about today, uh, the culture is more confused than ever, isn’t it?
John: Mm-hmm.
Jim: With all the technology and all the information we’re gaining in this tech-rich environment, more confusion comes, and, uh, yet we are dedicated to equipping Christians to live in clarity and focus in Christ. And I think he is helping us to do that. You know, it does say, I think, that in those last days, it will become divided, right? And, uh, I think those of us that know truth and know love and know the Lord are gonna have insights that others don’t. You know, last year we helped 320,000 families engage with the community around them. That’s a big number.
John: It is.
Jim: We ask that question, and it’s a top-box score: “Yes, Focus helped me to engage my community.” I’m proud of that, and, uh, this is evidence of that as well.
John: Mm-hmm.
Jim: So again, get in touch with us.
John: Yeah, your donation helps us reach families and spread the gospel and make that generational impact. So donate today, and, uh, by the way, if you haven’t ever contributed to Focus, do that today when you call 800, the letter A and the word FAMILY. Um, that’s 800-232-6459, or, uh, donate and get the book and find other helpful resources at FocusontheFamily.com/broadcast. And thanks for listening to Focus on the Family with Jim Daly. I’m John Fuller, inviting you back as we once again help you and your family thrive in Christ.