Humanity in the Age of AI: A Conversation with Richa Dwor

What does it mean to be human in an age of rapid technological growth? As artificial intelligence becomes a dominant force shaping how we communicate, learn, and interpret the world, conversations about what to preserve of our shared humanity are becoming essential.

In a recent dialogue, Douglas College’s Richa Dwor shared her reflections on human dignity, critical thinking, and why reading on paper might be a small but powerful act of resistance.

Richa currently serves as chair of the English department and coordinator of the college's Research and Innovation Office. A scholar of Victorian literature, she focuses on Jewish women's writing from that period, as well as religion and affect: the categories of emotion that people feel within and through religious experience and in religious communities.

“If at a bare minimum, if what we can do is look at another person and say you can be in this world, and you can be in the world with dignity, and maybe that’s all I can give you, because I can’t understand your way of thinking… I think that, at bare minimum, is humanity. Humanism. It’s extending a kind of tolerance towards one another. To me, that’s what, at bare minimum, that’s what tolerance looks like. And then you can build onto that with compassion, empathy, kindness, generosity.”


Critical Capacities We Can’t Afford to Lose

“Firstly, we need to retain our critical competencies, our ways of thinking, our modes of thought that are grounded in the humanities. So by this, I mean the ability to understand and hold in conversation multiple points of view. The ability to think historically, and to understand the progress of ideas and events as they have occurred, and to recognize that our own moment is not inevitable, and that its impact on the future is hard to understand.”

“And the other thing we need is access to artifacts of the past. And to preserve what we have of our shared cultural heritage.”

On Interpretation, Paraphrase, and Translation

One of the richest parts of our conversation centered on language, reading, and interpretation. Richa emphasized:

“Paraphrase and close reading are exactly what I teach students to do. It’s the biggest part of the intellectual work that I’m guiding students to develop, which is… once you’ve found [a source], reading it and understanding it. This cannot be lost. We cannot let the machine become the interpreter.”

“If we no longer perform our own act of summary and paraphrase, it means that we no longer are able to engage in close reading of a source sufficient to develop our own comprehension of it. And that is at the root of humanistic ways of thinking.”

“You can’t have true discourse with AI. The interpreter is still a person, a human in the community… even if there’s an interpreter, it should still be possible for others in the community to themselves engage in their own act of interpretation, right? Which may challenge what the interpreter has delivered.”

“If I were to publish material from another language, and I wanted it translated into English, I would never, ever, ever, ever publish an AI translation. I would always hire a human translator. So, it’s one of those things that is useful, but the human act of translation is irreplaceable.”

Asked for a practical takeaway, Richa’s advice was clear:

“Read a book. On paper. Just sit down and read the newspaper, or read a book on paper, and just think about it a bit. Read the next one. Yeah, just, just, time and concentration with text, so that you can flex your capacity for paying attention. And for reflection.”

As the conversation turned more personal, both of us reflected on moments of identity change. Richa offered this poignant insight:

“Life is not linear. We are living in a state of chaos. As Joan Didion says, we tell ourselves stories in order to live, so all of the stories that we tell ourselves are just attempts to impose a kind of rational narrative sequence on what is otherwise just one damn thing after another. And there’s a kind of peace that comes from getting rid of that narrative and just recognizing that it’s just one thing after another, and all I can do to center myself is just, like, just try to be good, and be good to other people, and just, like, ride it.”


And perhaps that is the heart of the matter. Technologies will evolve. But the work of recognizing, interpreting, and relating remains human.

About this series: “Rescuing Our Humanity” is a dialogue series hosted through the Douglas Educational Support Community (DESC), exploring what it means to be human in an age of rapid technological transformation. Each conversation invites faculty, staff, and students to reflect on what we must protect, preserve, or restore of our shared humanity.

Attending to What Matters: A Dialogue on AI, Empathy, and the Costs of Convenience

An interview with Dr. Jennifer Jill Fellows, Philosophy & Humanities instructor


As artificial intelligence continues to expand into classrooms, workplaces, and even our private lives, Dr. Fellows reminds us that technological convenience always comes with a cost.

A philosophy instructor and Associate of Arts Coordinator at Douglas College, Dr. Fellows is currently researching our growing relationships with AI companions — from “grief bots” modeled after loved ones to conversational agents like ChatGPT and Replika. Her upcoming course, Ethics and Technology (Winter 2026), will explore these themes in depth.

A central point in the dialogue is that even when chatbots aren’t designed to be companions, many people still form emotional attachments to them. We anthropomorphize — we imagine what it’s like to ‘be’ them — and that’s both a human superpower and a vulnerability.

In our conversation, Dr. Fellows reflected on what happens when we begin offloading moral and cognitive skills to machines. While using AI for efficiency may feel harmless, she cautions that it mirrors sending a robot to work out for us: we don’t actually strengthen our own intellectual or ethical “muscles.”

She draws on philosopher Shannon Vallor’s concept of “moral deskilling,” suggesting that empathy, care, and attentiveness are not fixed traits but skills that must be practiced. When digital companions respond without real needs or suffering, we lose opportunities to cultivate genuine empathy — even as loneliness rises.

Dr. Fellows also highlights the environmental and human costs of generative AI: massive energy use and the emotional toll borne by underpaid content moderators who make these systems safe for public use. “All convenience comes with a cost. You give something up to gain something,” she says.

For educators, her advice is both practical and profound: “And so I would ask people to be very mindful about what the costs are to themselves, what the costs are to their communities, what the costs are to the planet at large, and to really think about whether and how they’re going to use these tools. And for educators specifically, my other piece of advice would be to really think about what we want students to gain from these classrooms.”

Perhaps the deepest lesson came near the end of our exchange. Quoting the French philosopher, mystic, and political activist Simone Weil, Fellows emphasized that true attentiveness means holding one’s desires in check to receive from others and from the world. In an attention economy designed to feed our wants, that kind of listening may be the most radical skill of all.

A few links to topics mentioned in the recording
Thomas Nagel, “What Is It Like to Be a Bat?”
Joseph Weizenbaum and the ELIZA Chatbot
Authenticity in the age of digital companions by Sherry Turkle
“Long May You Run” by Neil Young
MIT Research on impact of AI use on the brain
Simone Weil – Attention as a Moral and Psychological Act
Cyborg Goddess podcast episode with Dr. Nicole Ramsoomair: The costs of convenience

Here are three lists addressing categories of concern with generative AI and technology, and based on conversations with educators. Nothing definitive; just playing with the ideas.

Human skills and tasks we feel comfortable offloading to technology

  • Data storage, retrieval, and archiving
  • Complex calculations, analytics, and pattern detection
  • Information searching, sorting, and summarizing
  • Scheduling, reminders, and basic administrative workflows
  • Navigation, mapping, and real-time traffic updates
  • Translation for basic comprehension (not nuance or cultural depth)
  • Routine manufacturing, repetitive assembly, and hazardous tasks
  • Predictive maintenance and environmental monitoring in buildings or infrastructure

Human skills and activities we feel unsure about offloading to technology

  • Decision-making in morally ambiguous or high-stakes contexts
  • Education design and assessment that require understanding of learner needs
  • Creative work where originality, cultural resonance, or personal meaning matter
  • Complex negotiations and conflict resolution
  • Emotional support, counselling, and mentorship
  • Interpretation of art, literature, and cultural heritage
  • Environmental stewardship decisions with long-term community impact
  • Cross-cultural communication where subtle context matters

Human attributes that we need to protect, rescue, and restore

  • Empathy, compassion, and ethical discernment
  • Deep listening and presence in interpersonal interactions
  • Embodied awareness and sensory connection to place
  • Critical thinking and independent judgment
  • Patience, persistence, and the capacity for sustained attention
  • Civic engagement and shared responsibility in community life
  • Imagination and the ability to envision alternative futures
  • Wisdom drawn from lived experience and intergenerational knowledge

Digital Literacy, AI, and Staying Human in a Changing World

What does it mean to be digitally literate in 2025? For Britt Dzioba, Learning and Teaching Advisor at BCcampus, the answer goes far beyond knowing how to use a computer.

In a recent conversation, Britt reflected on her work at BCcampus on digital literacy development and the open-access Digital Literacy Hub — a resource designed to help faculty, students, and communities build the skills needed to thrive in an ever-changing digital environment. She noted that while the tools and platforms may shift rapidly, the underlying competencies — such as information literacy, critical engagement with technology, and digital well-being — remain essential.

The pandemic acted as a “bucket of ice water” moment, forcing faculty and students to adapt quickly to online learning. Since then, technologies like generative AI have added new layers of complexity, raising both opportunities and concerns. “I think technology can be a wonderful thing if used intentionally and if used critically. This is where I think, especially with AI, … it’s so important to keep the human in it…that is how we use AI critically and ethically, by centering human experience.”

From exploring algorithmic literacy to supporting accessibility for neurodivergent learners, Britt emphasizes the importance of slowing down, setting boundaries, and cultivating critical thinking. She sees small, grassroots actions—like community-led digital projects or mindful approaches to teaching and learning—as powerful antidotes to despair in the face of rapid change.

Ultimately, digital literacy is not just about mastering tools. It is about equipping ourselves to respond—rather than react—to change, and to shape technologies in ways that sustain human connection, equity, and wisdom.

Rescuing the Human in the Age of AI: A Conversation with Nicole Vittoz

By Steven Bishop, Learning Designer, Douglas College

In an age of rapid technological change and expanding AI capabilities, educators are asking deep and necessary questions about the future of learning and what it means to be human. In a recent conversation with Nicole Vittoz, Associate Dean of Humanities and Social Sciences at Douglas College, we explored the tension between emerging technologies and enduring human capacities — and what this means for post-secondary education.

Nicole brings a thoughtful, multidisciplinary lens to this conversation, drawing on her background in psychology and her current role overseeing academic integrity and pedagogy. “…it’s hard to believe it’s been almost two years now since ChatGPT really blasted onto our stage of awareness. Many conversations over those couple of years have initially started off as, how do we stop this?,” she reflected. “…but I think we’re now slowly getting to the stage of ‘yes and’. Yes, these tools exist and we need to work within that new universe and adjust what we’re doing and try to rescue the human in there. But also use that opportunity to reframe what we’re doing, redefine what we’re doing, in some cases, revise what we’re doing. Where necessary, really articulate, don’t assume anything about other people’s motivations for engaging in the type of learning that we engaged in when we were in post-secondary.”

Rather than fearing that generative AI will erode academic standards, Nicole sees this as a moment to “rescue the human.” She emphasizes that traditional assignments like the term paper still hold deep value — not for the product alone, but for the process: developing voice, critical thinking, and intellectual identity. But that process must be made visible, intentional, and supported. “If I want my students to write well, then I’m teaching writing,” she said. “That’s part of my role, even if my course is philosophy or sociology.”

The conversation also touched on the dangers of mistaking convenience for authenticity. AI tools like ChatGPT can mimic voice and polish writing — but at what cost? “When students submit something they haven’t even read, and their name is on it, we have to talk about accountability,” Nicole noted. “It’s not just about plagiarism; it’s about presence and engagement.”

Nicole also highlighted the critical role of skepticism — especially in the humanities. “Our students are usually great at questioning authority,” she observed, “but somehow these technologies have slipped under their radar. Many students trust AI more than a human.” This, she argues, makes it even more urgent to model and teach critical inquiry.

She also reminded us that not all use of AI is negative — tools like text-to-speech and translation software can remove barriers for students with disabilities or language challenges. But the real question becomes: What do we want to offload, and what do we need to protect?

For Nicole, the answer lies in fostering authentic learning, nuanced communication, and ethical reflection. She invites educators to consider transparency frameworks, assignment redesign, and open dialogue with students. And perhaps most importantly, to reclaim the space of conversation itself — where questions unfold, insights emerge, and we remember why education matters.

“Dialogue is an important human attribute,” Nicole said in closing. “We need to keep talking about these things.”

And we will.

“What Makes Us Human?”: A Political Theorist Reflects on Freedom, Technology, and Awareness

In a recent dialogue hosted at Douglas College, I had the pleasure of sitting down with Dr. Jovian Radheshwar, a political science instructor and political theorist, to explore what it means to be human in an age shaped—and increasingly reshaped—by artificial intelligence.

Jovian opened our conversation with a stark observation: we are witnessing a global slide away from freedom, as technology merges with authoritarian politics to normalize social control. Drawing from Greek philosophy, he highlighted how the ancient word idiōtēs referred not to someone lacking intelligence, but to one who is absorbed solely in private concerns, disconnected from the polis — the public world of shared concern. In contrast, true intelligence was linked to being deeply enmeshed in one’s surroundings, aware and responsive to the world.

Jovian offered a compelling critique of today’s techno-culture—especially the AI-driven promise of transcendence from human limitations. Figures like Peter Thiel and movements like Transhumanism, he argues, are rooted not in empathy or collective uplift but in a desire to dominate, to become “supermen” at the cost of humility, biodiversity, and social connection.

We reflected together on what awareness looks like in practice. For me, riding a bike through the city has become a daily exercise in sensing others, slowing down, and learning to pick up on the emotional “weather” of a space. Jovian connected this to urban survival skills—what he called a kind of “street ninja” intelligence—and also to the loss of embodiment that comes from excessive digital distraction.

Our conversation turned to education and the risks of reducing learning to a game of performance metrics. Jovian made a passionate case for embracing failure, for cultivating diverse skills, and for protecting the space to explore what we might be good at—beyond what the system rewards. He quoted Heidegger to illustrate how modern technology shifts us from “taking care” of the Earth to “challenging” it—a shift with profound ethical and ecological consequences.

So, what skills should we protect? What human capacities must we nurture?

Jovian suggests starting with awareness, reflection, the capacity to fail, and the courage to imagine freedom—not as domination, but as mutual care.

This is just the beginning. We both agreed: this conversation needs a Part Two.

Mapping More Than Terrain: Land, Story, and Spirit in the Geography Lab

In a recent conversation with Sasha Djakovic, Geography Lab Technician at Douglas College, I was struck by how deeply experiential and relational geography education can be—especially when it is tied to land-based knowledge and Indigenous ways of knowing.

Sasha’s work, both in the lab and on the land, bridges high-tech tools like GIS and LiDAR drones with ancient stories, place names, and protocols rooted in the Lil’wat Nation’s territory. Through the Lil’wat Archaeological Research Project, he and colleagues like Bill Angelbeck, with guidance from Indigenous leaders such as Jennifer Anaquod, have facilitated a rare kind of learning: one that pairs digital mapping with spiritual respect, physical geography with oral tradition, and student development with cultural humility.

The geography lab’s augmented reality sandbox, for instance, helps students visualize topography and contour lines in three dimensions. But Sasha reminds us that the best geography lab is still outside. In Lil’wat territory, stories like the Copper Canoe become topographical narratives — legends validated not only by tradition but also by modern science. The land itself, with its steep terrain and active volcanoes, speaks back to these stories in powerful ways.

What stood out most in our exchange was the emotional transformation students undergo during their time on these digs. From opening ceremonies with drumming to observing protocols around sacred sites and funerals, students are asked not just to learn—but to feel, to listen, and to honour. This is not just skill-building. It’s soul-building.

At a time when generative AI is rewriting the rules of what skills we offload to machines, Sasha’s reflections remind us of what we must not offload: spiritual sensitivity, ethical responsibility, and deep relational awareness with the land and its stories. These are not just Indigenous ways of knowing—they are human ones, and they’re more vital than ever.

And…good timing! This article about the archeological work came out shortly after our recorded dialogue: Archeological dig on Lil’wat territory uncovers ancient histories and reframes research relationships

Reflections on the Digital Literacy Challenge

A conversation recorded from Douglas College, on the unceded traditional and ancestral lands of the Coast Salish Peoples, including the territories of the q̓íc̓əy̓ (Katzie), qʼʷa:n̓ƛʼən̓ (Kwantlen), kʷikʷəƛ̓əm (Kwikwetlem), xʷməθkʷəy̓əm (Musqueam), and qiqéyt (Qayqayt) First Nations, and from Thompson Rivers University, whose Kamloops campus (Tk’emlúps te Secwépemc) and Williams Lake campus (T’exelc) are located within Secwépemc’ulucw, the traditional and unceded territory of the Secwépemc.

As a participant midway through the BCcampus 2024 Digital Literacy Challenge series, I was delighted to have a chance to chat with the creators, Helen Lee and Melanie Latham.

Helen Lee, Instructional Designer at Justice Institute of BC
Melanie Latham, Coordinator, Educational Technologies, Thompson Rivers University

The series takes participants through the eight competencies outlined in the B.C. Digital Literacy Framework, with strategies for incorporating these skills into teaching practice and for supporting student success through digital literacy.

Participants receive one challenge in their inbox each Monday morning over the course of eight weeks. Each challenge addresses the following:

  • What: a definition of the digital literacy competency 
  • Why: the importance of each competency in teaching practice 
  • How: approaches for developing the competency in our own learning, and how to incorporate these skills in our teaching practice 

Interactive activities and thoughtful prompts help participants learn how to incorporate digital literacy into their teaching practice and generate ideas for teaching these skills to students. Each weekly challenge takes one to four hours to complete, depending on how deeply you want to explore.

There were four optional, synchronous drop-in sessions held over Zoom where participants asked questions and connected with peers in a casual setting. 

We plan to adapt the Digital Literacy Challenge content to our needs at Douglas College, respecting the BCcampus Creative Commons Attribution 4.0 International Licence.

Recorded conversation with Helen Lee, Melanie Latham, and Steven Bishop

Host of the Digital Literacy Challenge synchronous sessions: Britt Dzioba, Learning & Teaching Advisor, BCcampus

Digital Literacy Materials for Post-Secondary Educators

Digital Learning Strategy Forum 2024 – November 13-14 in person and online

Reimagining Post-Secondary Education with Bailey Cove

Karine Hamm (Sports Science Diploma Coordinator) and I met with Bailey Cove, a former student of Karine’s, to discuss her ideas on the topic of Reimagining Post-Secondary Education. Bailey has been volunteering since high school, and provided this list of recent positions she has served in:

  • Douglas Students’ Union (DSU) Director of College Relations (1 year) 
  • Douglas College (DC) Board of Governors (1 year)
  • Douglas Students’ Union (DSU) Director of External Relations (1 year) 
  • DSU Budget and Operations Committee (member)
  • DSU Campus Life Working Group (member)
  • DSU/DC Joint Operations (member)
  • Douglas College Education Council College Board Liaison 
  • Douglas College Campus Planning Committee (member)
  • DSU Campaigns Working Group (Chair)

Bailey shared her thoughts on the student experience of post-secondary education from a highly-informed perspective. Enjoy listening to a discussion of what works, what needs improvement, and what new directions we can imagine.


Links

Bailey Cove on LinkedIn
Bailey Cove on DSU Board of Directors

A few references from the discussion:

The reGENERATE Ideas Challenge (PEAK-Buildings Certificate program students’ submission shortlisted as #16)

Quote from Neighbourhood Houses – Edited by Ming Chung Yan and Sean Lauer: “The community problem is generally considered to comprise the following problems of connection and engagement: the avoidance and superficial level of interaction, the living together at high densities as strangers, and the feeling of isolation while surrounded by others. This can lead to alienation and a social disconnection from the social world around us. As a form of social infrastructure focused on the development of relationships and social capacity, neighbourhood houses have the potential to contribute to the ideal of creating welcoming communities in cities and societies that are often less than welcoming and supportive for marginalized, racialized, and disadvantaged groups.”

Yuval Noah Harari on storytelling – “Homo sapiens is a storytelling animal that thinks in stories rather than in numbers or graphs, and believes that the universe itself works like a story, replete with heroes and villains, conflicts and resolutions, climaxes and happy endings. When we look for the meaning of life, we want a story that will explain what reality is all about and what my particular role is in the cosmic drama. This role makes me a part of something bigger than myself, and gives meaning to all my experiences and choices.”

“Never doubt that a small group of thoughtful, committed citizens can change the world; indeed, it’s the only thing that ever has.” Margaret Mead

“The Trouble with Normal” by Bruce Cockburn