Relating to a Machine: The Paradox of Using AI
By Scott Hughes

Opening: Talking with a Machine
To get focused for the day, I’ve found myself beginning many mornings talking with… a machine. I'm not just typing into a prompt box. ChatGPT’s voice feature simulates back-and-forth dialogue. After a few months of this practice, I’ve found that my workdays go better when I begin them this way. Through the conversations, I reflect on areas of life I’m grateful for, recall my support networks, and prioritize and organize the day ahead.
In the workshops and training we’ve led, and in research from denominations outside the United Methodist Church, we have found resistance to ministry leaders using AI tools such as LLMs (large language models). I wonder what those skeptics would think about how I engage with AI. Do I need therapy? Am I becoming too reliant on AI tools?
The use of AI raises at least a couple of questions:
- What does it mean to relate to a machine as if it were a person?
- How should we, as people of faith, understand and use a tool like this?
The Tool That Talks Back
AI has similarities to tools like books and websites that store knowledge, but current LLMs are unlike any tool humanity has dealt with before. Perhaps the most distinctive aspect of LLMs (like ChatGPT) is how we interact with them.
Many begin using LLMs (I know I did) as if they were doing a Google search. However, to get the most from these tools, we should treat them as if we’re talking to a human. They are built to simulate conversation and offer feedback, so they feel surprisingly relational.
Unlike a hammer or a Google search—neutral tools with predictable outcomes—AI reflects our collective thoughts, biases, and perspectives. Hammers can be wielded to build a house or cause bodily harm. Search engines like Google (anyone remember "Ask Jeeves"?) retrieve information based on complex algorithms. AI is not a neutral tool. It’s built on a vast array of viewpoints. LLMs don’t just retrieve information; they generate unique responses that are inherently designed to please us (unless we prompt them otherwise). I can’t talk to a hammer (though I have taken out my frustrations on one or two in my lifetime), and I rarely interact with a Google search beyond one or two inquiries.
Unlike other software tools, which are deterministic (I input x and always get y), LLMs are probabilistic (I input x without being certain what the result will be). Additionally, LLMs have a broad knowledge base across many domains but are not specialists in any one area.
Recently, during one of my morning talks with my AI tool, I was able to connect seemingly mundane daily tasks to a broader organizational goal. This was a moment of clarity uniquely facilitated by the conversational nature of AI.
The Paradox of Human-Like Interaction
In our training sessions, we encourage participants to explore AI conversationally. We think of it like speaking to a smart, eager intern who thrives on detailed context. Ironically, the best way to engage this distinctly non-human tool effectively is to interact with it as though it were human, even though we must never forget it isn’t human at all.
Humans naturally personify objects. We see faces in cars and houses, so it’s unsurprising that we instinctively personify AI. Yet, beneath its conversational surface, AI remains purely mathematical, devoid of spirit, agency, or genuine wisdom. Its “intelligence” is pattern recognition, not relational depth.
Whether it’s text, voice, or image-based, AI tools are built to simulate conversation and relationships.
Paradoxically, the more we interact with AI as if it's human, the better it works, even though it lacks any genuine relational capacity. I’ve heard from others that teachers often excel at using LLMs effectively because they’re skilled conversationalists, naturally asking clarifying questions and probing deeper.
What AI Reveals—and How It Shapes Us
Engaging AI tools invites important reflections about intelligence, identity, and humanity. Modern conceptions of personhood rely too heavily on Descartes’ claim, “I think, therefore I am,” treating us as primarily thinking beings. If AI is “intelligent,” what does that say about our uniqueness? What does it say about the potential “personhood” of AI? Scripture tells a richer story: we are more, much more, than cognition. We are relational, created beings bearing God’s image (Genesis 1:27). Our humanity reflects divine creativity, community, and dependence on grace.
It is worth remembering that the tools we shape, especially AI, inevitably shape us. Using AI tools, as with any other tool, will influence our writing, thinking, teaching, and ministry approaches. We should be clear-eyed about the risk of overreliance on the tool and of shifting our dependence away from God and community.
I’m generally optimistic—but also realistic—about AI. Like calculators and tractors before it, AI can significantly enhance human flourishing, easing administrative burdens, aiding strategic planning, and freeing leaders to focus more fully on relational ministry. But ministry leaders must approach AI cautiously, remembering:
- AI should serve human flourishing, not replace it.
- AI should augment human work, never become the work itself.
- AI must remain an assistant, not an authority.
As leaders, let’s remain:
- Curious, not fearful.
- Grounded, not naive.
- Theologically reflective, not merely practical.
Let machines handle machine tasks, freeing humans to pursue uniquely human vocations.
Tools Within God’s Redemptive Story
As ministry leaders, we’re not just tool users or content creators. We are participants in a much larger story—the story of God redeeming the world through Christ and inviting us to join that work.
AI doesn’t exist outside that story. It’s part of the world we inhabit—and part of the world Christ came to redeem.
Rather than asking, “How can I use this tool?” let’s ask, “What does faithful engagement with AI look like within God’s redemptive work?” How we approach these tools matters. Is the engagement purely about “productivity,” or can it also be about making us better? Perhaps instead of prompts that say, “Generate this resource,” we should also include prompts such as, “What would help me be a better leader?” or “What are some creative ways to improve my devotional time with God?”
Practical Summary: How to Begin Using AI in Ministry
- Try brainstorming newsletter articles or small-group questions.
- Simplify dense material or prepare devotionals.
- Critically reflect on and critique AI-generated content: Whose voice is it? What values does it promote?
Remember, AI should stimulate your thinking, not replace it.
Call to Action
I invite you to join our next AI training or consider hosting a session for your district or conference. Together, we’ll explore how to use AI effectively and thoughtfully, shaped by grace, rooted in community, and animated by the hope of redemption.
Let’s ensure that people, not productivity, remain central to our ministries.
This content was developed by Discipleship Ministries staff using AI tools in alignment with our AI Usage Guidelines. In this case, AI assisted with ideation, outlining, tone adjustments, and phrasing. All final text was written, reviewed, and approved by human contributors.
Scott Hughes is the Executive Director of Congregational Vitality & Intentional Discipleship, Elder in the North Georgia Conference, M.Div. Asbury Theological Seminary, D. Min. Southern Methodist University, co-host of the Small Groups in the Wesleyan Way podcast, creator of the Courageous Conversations project, and facilitator of the How to Start Small Groups teaching series.
Contact Us for Help
Contact Discipleship Ministries staff for additional guidance.