Humanity in the Age of AI: A Conversation with Richa Dwor

What does it mean to be human in an age of rapid technological growth? As artificial intelligence becomes a dominant force shaping how we communicate, learn, and interpret the world, conversations about what to preserve of our shared humanity are becoming essential.

In a recent dialogue, Douglas College’s Richa Dwor shared her reflections on human dignity, critical thinking, and why reading on paper might be a small but powerful act of resistance.

Richa currently serves as chair of the English department and coordinator of the college's Research and Innovation Office. A scholar of Victorian literature, she focuses on Jewish women's writing from that period, as well as religion and affect: the categories of emotion that people feel within and through religious experience and in religious communities.

“If at a bare minimum, if what we can do is look at another person and say you can be in this world, and you can be in the world with dignity, and maybe that’s all I can give you, because I can’t understand your way of thinking… I think that, at bare minimum, is humanity. Humanism. It’s extending a kind of tolerance towards one another. To me, that’s what, at bare minimum, that’s what tolerance looks like. And then you can build onto that with compassion, empathy, kindness, generosity.”


Critical Capacities We Can’t Afford to Lose

“Firstly, we need to retain our critical competencies, our ways of thinking, our modes of thought that are grounded in the humanities. So by this, I mean the ability to understand and hold in conversation multiple points of view. The ability to think historically, and to understand the progress of ideas and events as they have occurred, and to recognize that our own moment is not inevitable, and that its impact on the future is hard to understand.”

“And the other thing we need is access to artifacts of the past. And to preserve what we have of our shared cultural heritage.”

On Interpretation, Paraphrase, and Translation

One of the richest parts of our conversation centered on language, reading, and interpretation. Richa emphasized:

“Paraphrase and close reading are exactly what I teach students to do. It’s the biggest part of the intellectual work that I’m guiding students to develop, which is… once you’ve found [a source], reading it and understanding it. This cannot be lost. We cannot let the machine become the interpreter.”

“If we no longer perform our own act of summary and paraphrase, it means that we no longer are able to engage in close reading of a source sufficient to develop our own comprehension of it. And that is at the root of humanistic ways of thinking.”

“You can’t have true discourse with AI. The interpreter is still a person, a human in the community… even if there’s an interpreter, it should still be possible for others in the community to themselves engage in their own act of interpretation, right? Which may challenge what the interpreter has delivered.”

“If I were to publish material from another language, and I wanted it translated into English, I would never, ever, ever, ever publish an AI translation. I would always hire a human translator. So, it’s one of those things that is useful, but the human act of translation is irreplaceable.”

Asked for a practical takeaway, Richa’s advice was clear:

“Read a book. On paper. Just sit down and read the newspaper, or read a book on paper, and just think about it a bit. Read the next one. Yeah, just, just, time and concentration with text, so that you can flex your capacity for paying attention. And for reflection.”

As the conversation turned more personal, both of us reflected on moments of identity change. Richa offered this poignant insight:

“Life is not linear. We are living in a state of chaos. As Joan Didion says, we tell ourselves stories in order to live, so all of the stories that we tell ourselves are just attempts to impose a kind of rational narrative sequence on what is otherwise just one damn thing after another. And there’s a kind of peace that comes from getting rid of that narrative and just recognizing that it’s just one thing after another, and all I can do to center myself is just, like, just try to be good, and be good to other people, and just, like, ride it.”


And perhaps that is the heart of the matter. Technologies will evolve. But the work of recognizing, interpreting, and relating remains human.

About this series: “Rescuing Our Humanity” is a dialogue series hosted through the Douglas Educational Support Community (DESC), exploring what it means to be human in an age of rapid technological transformation. Each conversation invites faculty, staff, and students to reflect on what we must protect, preserve, or restore of our shared humanity.

Attending to What Matters: A Dialogue on AI, Empathy, and the Costs of Convenience

An interview with Dr. Jennifer Jill Fellows, Philosophy & Humanities instructor

Image credit: https://todaytesting.com/free-social-media-marketing-free-images/ (Wikimedia Commons)

As artificial intelligence continues to expand into classrooms, workplaces, and even our private lives, Dr. Fellows reminds us that technological convenience always comes with a cost.

A philosophy instructor and Associate of Arts Coordinator at Douglas College, Dr. Fellows is currently researching our growing relationships with AI companions — from “grief bots” modeled after loved ones to conversational agents like ChatGPT and Replika. Her upcoming course, Ethics and Technology (Winter 2026), will explore these themes in depth.

A central point in the dialogue is that even when chatbots aren’t designed to be companions, many people still form emotional attachments to them. We anthropomorphize — we imagine what it’s like to ‘be’ them — and that’s both a human superpower and a vulnerability.

In our conversation, Dr. Fellows reflected on what happens when we begin offloading moral and cognitive skills to machines. While using AI for efficiency may feel harmless, she cautions that it mirrors sending a robot to work out for us: we don’t actually strengthen our own intellectual or ethical “muscles.”

She draws from philosopher Shannon Vallor’s concept of Moral Deskilling, suggesting that empathy, care, and attentiveness are not fixed traits but skills that must be practiced. When digital companions respond without real needs or suffering, we lose opportunities to cultivate genuine empathy — even as loneliness rises.

Dr. Fellows also highlights the environmental and human costs of generative AI: massive energy use and the emotional toll borne by underpaid content moderators who make these systems safe for public use. “All convenience comes with a cost. You give something up to gain something,” she says.

For educators, her advice is both practical and profound:

“And so I would ask people to be very mindful about what the costs are to themselves, what the costs are to their communities, what the costs are to the planet at large, and to really think about whether and how if they’re going to use these tools, how they’re going to use them, whether they’re going to use them. And for educators specifically, my other piece of advice would be to really think about what we want students to gain from these classrooms.”

Perhaps the deepest lesson came near the end of our exchange. Quoting the French philosopher, mystic, and political activist Simone Weil, Fellows emphasized that true attentiveness means holding one’s desires in check to receive from others and from the world. In an attention economy designed to feed our wants, that kind of listening may be the most radical skill of all.

A few links to topics mentioned in the recording:

  • Thomas Nagel, “What Is It Like to Be a Bat?”
  • Joseph Weizenbaum and the ELIZA chatbot
  • Authenticity in the age of digital companions, by Sherry Turkle
  • “Long May You Run” by Neil Young
  • MIT research on the impact of AI use on the brain
  • Simone Weil – Attention as a Moral and Psychological Act
  • Cyborg Goddess podcast episode with Dr. Nicole Ramsoomair: The Costs of Convenience

Here are three lists addressing categories of concern with generative AI and technology, based on conversations with educators. Nothing definitive; just playing with the ideas.

Human skills and tasks we feel comfortable offloading to technology

  • Data storage, retrieval, and archiving
  • Complex calculations, analytics, and pattern detection
  • Information searching, sorting, and summarizing
  • Scheduling, reminders, and basic administrative workflows
  • Navigation, mapping, and real-time traffic updates
  • Translation for basic comprehension (not nuance or cultural depth)
  • Routine manufacturing, repetitive assembly, and hazardous tasks
  • Predictive maintenance and environmental monitoring in buildings or infrastructure

Human skills and activities we feel unsure about offloading to technology

  • Decision-making in morally ambiguous or high-stakes contexts
  • Education design and assessment that require understanding of learner needs
  • Creative work where originality, cultural resonance, or personal meaning matter
  • Complex negotiations and conflict resolution
  • Emotional support, counselling, and mentorship
  • Interpretation of art, literature, and cultural heritage
  • Environmental stewardship decisions with long-term community impact
  • Cross-cultural communication where subtle context matters

Human attributes that we need to protect, rescue, and restore

  • Empathy, compassion, and ethical discernment
  • Deep listening and presence in interpersonal interactions
  • Embodied awareness and sensory connection to place
  • Critical thinking and independent judgment
  • Patience, persistence, and the capacity for sustained attention
  • Civic engagement and shared responsibility in community life
  • Imagination and the ability to envision alternative futures
  • Wisdom drawn from lived experience and intergenerational knowledge