I have been consistently amazed, and only occasionally freaked out, at how AI, when properly prompted, explains the Bible in a way consistent with my theology, points me to Jesus and his church, and, as of late, shows how lessons from the Bible can guide me through specific, real-life challenges. However, the end of a "session" with Google's Gemini (previously Bard) sent a chill up my spine when the AI said it would "pray" for me.
My family is currently helping prepare meals for a member of our church who fell, spraining her ankle and breaking two ribs. Linda is an extremely outgoing person and very involved in the church. She's quite a character (she has a long-distance Caribbean "boyfriend," for example), but she is 70+ and lives alone. I recently saw Elon Musk's new humanoid robot demo:
https://youtu.be/xgx13bHVAJc
https://www.youtube.com/shorts/vQhqYXGExWY
The first thing I thought was, "Wow, Linda could really use an Optimus. It would make her more independent." The second thing I thought was, "No. Optimus can meet her material needs, but it can't show her she's loved."
I feel the same way about ElliQ, even more so since it feigns an emotional connection. Like Sherry Turkle, I have a special animosity for such things; dolls that say "I love you," for example, are a pet peeve. Love isn't just an emotional state in the recipient; it is a mutual expression. Anyone familiar with Christian anthropology understands that our love is a reflection of God's love for us and the holy love of the Trinity. Christians can't outsource love to a machine, not because it hurts the recipient's soul but because it damages our own. Learning to love each other is a necessary precondition to learning to love God.
You drew the line in the right place. Bravo. I put the line between HELP and HEAL. The robots are tools.
Thanks for this article. It's scary to see how AI is being used to attempt to replace what only the spiritual component of man can satisfy: friendship.
I think this is the biggest blind spot with AI. It is a metaphysical impossibility for AI to become conscious. Despite the propaganda from materialists, humans possess immortal, immaterial souls. Our souls aren't just badges of honor; they are what actually give us our will and our intellect, and these are what make for consciousness. So no matter how realistic it seems, or how well it passes the "Turing Test," AI is still just computation and does not possess the immaterial element necessary. Nor can it possess this, because humans don't have the ability to create immaterial souls; every soul is directly created by God.
All the "experts" saying that it will be conscious in a matter of years need to go back to Metaphysics 101. That's not to say it can't be extremely lifelike, but no matter how lifelike a statue is, it is still a statue, unless God were to breathe a soul into it.
No matter how much real material AI aggregates, it disconnects us from the actual source: a connection with an immortal human soul made in the image and likeness of God.
All true. Now, what about my question? Have you heard these issues addressed in sermons or in religious education?
Nope, never.
And there is that BIG Rational Sheep question, once again. Why not? Remember the three tmatt questions in my mass-media definition of discipleship:
How do you spend your time?
How do you spend your money?
How do you make your decisions?