I have been consistently amazed, and only occasionally freaked out, at how AI, when properly prompted, explains the Bible in a way consistent with my theology, points me to Jesus and his church, and, as of late, shows how lessons from the Bible can guide me through specific, real-life challenges. However, the end of a "session" with Google's Gemini (previously Bard) put a chill up my back when the AI said it would "pray" for me.
My family is currently helping prepare meals for a member of our church who fell, spraining her ankle and breaking two ribs. Linda is an extremely outgoing person and very involved in the church. She's quite a character (she has a long-distance Caribbean "boyfriend," for example), but she is 70-plus and lives alone. I recently saw Elon Musk's new humanoid robot demo:
https://youtu.be/xgx13bHVAJc
https://www.youtube.com/shorts/vQhqYXGExWY
The first thing I thought was, "Wow, Linda could really use an Optimus. It would make her more independent." The second thing I thought was, "No. Optimus can meet her material needs but it can't show her she's loved."
I feel the same way about ElliQ, even more so since it feigns an emotional connection. Like Sherry Turkle, I have a special animosity for such things; dolls that say "I love you," for example, are a pet peeve. Love isn't just an emotional state in the recipient; it is a mutual expression. Anyone familiar with Christian anthropology understands that our love is a reflection of God's love for us and the holy love of the Trinity. Christians can't outsource love to a machine, not because it hurts the recipient's soul but because it damages our own. Learning to love each other is a necessary precondition to learning to love God.
You drew the line in the right place. Bravo. I put the line between HELP and HEAL. The robots are tools.
All true. Now, what about my question? Have you heard these issues addressed in sermons or in religious education?
And there is that BIG Rational Sheep question, once again. Why not? Remember the three tmatt questions in my mass-media definition of discipleship:
How do you spend your time?
How do you spend your money?
How do you make your decisions?