Rescuing the Human in the Age of AI: A Conversation with Nicole Vittoz

By Steven Bishop, Learning Designer, Douglas College

In an age of rapid technological change and expanding AI capabilities, educators are asking deep and necessary questions about the future of learning and what it means to be human. In a recent conversation with Nicole Vittoz, Associate Dean of Humanities and Social Sciences at Douglas College, we explored the tension between emerging technologies and enduring human capacities — and what this means for post-secondary education.

Nicole brings a thoughtful, multidisciplinary lens to this conversation, drawing on her background in psychology and her current role overseeing academic integrity and pedagogy. “…it’s hard to believe it’s been almost two years now since ChatGPT really blasted onto our stage of awareness. Many conversations over those couple of years have initially started off as, ‘How do we stop this?’” she reflected. “…but I think we’re now slowly getting to the stage of ‘yes, and.’ Yes, these tools exist and we need to work within that new universe and adjust what we’re doing and try to rescue the human in there. But also use that opportunity to reframe what we’re doing, redefine what we’re doing, in some cases, revise what we’re doing. Where necessary, really articulate — don’t assume anything about other people’s motivations for engaging in the type of learning that we engaged in when we were in post-secondary.”

Rather than fearing that generative AI will erode academic standards, Nicole sees this as a moment to “rescue the human.” She emphasizes that traditional assignments like the term paper still hold deep value — not for the product alone, but for the process: developing voice, critical thinking, and intellectual identity. But that process must be made visible, intentional, and supported. “If I want my students to write well, then I’m teaching writing,” she said. “That’s part of my role, even if my course is philosophy or sociology.”

The conversation also touched on the dangers of mistaking convenience for authenticity. AI tools like ChatGPT can mimic voice and polish writing — but at what cost? “When students submit something they haven’t even read, and their name is on it, we have to talk about accountability,” Nicole noted. “It’s not just about plagiarism; it’s about presence and engagement.”

Nicole also highlighted the critical role of skepticism — especially in the humanities. “Our students are usually great at questioning authority,” she observed, “but somehow these technologies have slipped under their radar. Many students trust AI more than a human.” This, she argues, makes it even more urgent to model and teach critical inquiry.

She also reminded us that not all use of AI is negative — tools like text-to-speech and translation software can remove barriers for students with disabilities or language challenges. But the real question becomes: What do we want to offload, and what do we need to protect?

For Nicole, the answer lies in fostering authentic learning, nuanced communication, and ethical reflection. She invites educators to consider transparency frameworks, assignment redesign, and open dialogue with students. And perhaps most importantly, to reclaim the space of conversation itself — where questions unfold, insights emerge, and we remember why education matters.

“Dialogue is an important human attribute,” Nicole said in closing. “We need to keep talking about these things.”

And we will.

“What Makes Us Human?”: A Political Theorist Reflects on Freedom, Technology, and Awareness

In a recent dialogue hosted at Douglas College, I had the pleasure of sitting down with Dr. Jovian Radheshwar, a political science instructor and political theorist, to explore what it means to be human in an age shaped—and increasingly reshaped—by artificial intelligence.

Jovian opened our conversation with a stark observation: we are witnessing a global slide away from freedom, as technology merges with authoritarian politics to normalize social control. Drawing from Greek philosophy, he highlighted how the ancient word idiōtēs referred not to someone lacking intelligence, but to one absorbed solely in private concerns, disconnected from the polis — the public world of shared concern. In contrast, true intelligence was linked to being deeply enmeshed in one’s surroundings, aware and responsive to the world.

Jovian offered a compelling critique of today’s techno-culture—especially the AI-driven promise of transcendence from human limitations. Figures like Peter Thiel and movements like Transhumanism, he argues, are rooted not in empathy or collective uplift but in a desire to dominate, to become “supermen” at the cost of humility, biodiversity, and social connection.

We reflected together on what awareness looks like in practice. For me, riding a bike through the city has become a daily exercise in sensing others, slowing down, and learning to pick up on the emotional “weather” of a space. Jovian connected this to urban survival skills—what he called a kind of “street ninja” intelligence—and also to the loss of embodiment that comes from excessive digital distraction.

Our conversation turned to education and the risks of reducing learning to a game of performance metrics. Jovian made a passionate case for embracing failure, for cultivating diverse skills, and for protecting the space to explore what we might be good at—beyond what the system rewards. He quoted Heidegger to illustrate how modern technology shifts us from “taking care” of the Earth to “challenging” it—a shift with profound ethical and ecological consequences.

So, what skills should we protect? What human capacities must we nurture?

Jovian suggests starting with awareness, reflection, the capacity to fail, and the courage to imagine freedom—not as domination, but as mutual care.

This is just the beginning. We both agreed: this conversation needs a Part Two.

Mapping More Than Terrain: Land, Story, and Spirit in the Geography Lab

In a recent conversation with Sasha Djakovic, Geography Lab Technician at Douglas College, I was struck by how deeply experiential and relational geography education can be—especially when it is tied to land-based knowledge and Indigenous ways of knowing.

Sasha’s work, both in the lab and on the land, bridges high-tech tools like GIS and LiDAR drones with ancient stories, place names, and protocols rooted in the Lil’wat Nation’s territory. Through the Lil’wat Archaeological Research Project, he and colleagues like Bill Angelbeck, with guidance from Indigenous leaders such as Jennifer Anaquod, have facilitated a rare kind of learning: one that pairs digital mapping with spiritual respect, physical geography with oral tradition, and student development with cultural humility.

The Geography lab’s augmented reality sandbox, for instance, helps students visualize topography and contour lines in three dimensions. But Sasha reminds us that the best geography lab is still outside. In Lil’wat territory, stories like the Copper Canoe become topographical narratives—legends validated not only by tradition but also by modern science. The land itself, with its steep terrain and active volcanoes, speaks back to these stories in powerful ways.

What stood out most in our exchange was the emotional transformation students undergo during their time on these digs. From opening ceremonies with drumming to observing protocols around sacred sites and funerals, students are asked not just to learn—but to feel, to listen, and to honour. This is not just skill-building. It’s soul-building.

At a time when generative AI is rewriting the rules of what skills we offload to machines, Sasha’s reflections remind us of what we must not offload: spiritual sensitivity, ethical responsibility, and deep relational awareness with the land and its stories. These are not just Indigenous ways of knowing—they are human ones, and they’re more vital than ever.

And…good timing! This article about the archaeological work came out shortly after our recorded dialogue: “Archeological dig on Lil’wat territory uncovers ancient histories and reframes research relationships.”