Becoming an Algorithmic Problem: Slowing Down as a Political Act with INST 1100 students

image based on a pencil drawing made by Steven Bishop, inspired by an image from the 1971 book “Be Here Now” by Ram Dass (Richard Alpert)

In a recent class dialogue with Dr. Jovian Radheshwar’s INST 1100 students, we explored ideas from political scientist Dr. JosĂ© Marichal, author of You Must Become an Algorithmic Problem: Renegotiating the Sociotechnical Contract. His work examines how our lives have become entwined with algorithms — systems that categorize, predict, and shape our digital experiences.

Here’s a simple way to think about the difference between social media and algorithms, which came up in the class discussion, drawing on Dr. Marichal’s explanation:

  • Social media is the visible layer — the apps and platforms where we post, share, and connect. It’s where we perform our identities and find community.
  • Algorithms are the invisible layer — the systems behind those platforms that decide what we see, when we see it, and who sees us. They quietly shape our feeds, recommendations, and even our sense of what’s “normal.”

Social media is the stage. Algorithms are the directors. Marichal’s key point is that we often blame the stage, but forget about the directors quietly shaping the show.
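One way to picture the “directors” is as a ranking function. The toy sketch below is a deliberately simplified illustration, not any real platform’s algorithm; the posts, topics, and watch-time numbers are all invented for the example:

```python
# Toy sketch of an engagement-based feed ranker.
# Illustrative only: no real platform's algorithm is this simple,
# and all names and numbers here are invented.

posts = [
    {"id": "a", "topic": "politics", "likes": 120},
    {"id": "b", "topic": "running",  "likes": 45},
    {"id": "c", "topic": "music",    "likes": 300},
]

# A hypothetical user profile: seconds this user lingered on each
# topic in the past. Attention itself is the ranking signal.
watch_time = {"politics": 90, "running": 5, "music": 30}

def score(post):
    """Rank a post by global popularity weighted by this user's
    past attention to its topic (default weight 1 for new topics)."""
    return post["likes"] * watch_time.get(post["topic"], 1)

feed = sorted(posts, key=score, reverse=True)
print([p["id"] for p in feed])  # prints ['a', 'c', 'b']
```

Even in this tiny sketch, whatever the user has already lingered on floats to the top, which is why changing one’s viewing habits can feel like “training” the algorithm.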

Dr. Marichal draws a parallel between the social contract — the implicit agreement we make with society to exchange certain freedoms for collective security — and what he calls the sociotechnical contract. In this new digital era, we have unwittingly traded our attention, privacy, and even aspects of independent thought for the conveniences of social media, algorithmic news feeds, and digital companionship.

The Attention Economy and the Outlier

Students were quick to recognize how easily social media can shape perception. One noted how their feed filled with political praise for one U.S. leader without presenting opposing views. Another shared how TikTok’s algorithms “trained” themselves to reflect heartbreak or attraction—depending on what held their attention longest.

Dr. Marichal’s warning is that algorithms thrive on generalization. They group us into categories that make us easier to sell to, sway politically, or polarize socially. The “outliers” — those who don’t fit the model — are vital to creativity, democracy, and human progress. As one student insightfully said, “You have to train your social media algorithm to what you actually want.”

Slowing Down as Resistance

We considered whether slowing down could itself be a political act. What if, in a world obsessed with speed, scrolling, and optimization, we simply paused?

One student responded:

If we just stopped using Facebook or Instagram, we could make the founders go bankrupt. Our attention is what they want — and we have the power to take it back.

Another reflected:

I think slowing down makes sense as a political act because if you can slow down, you can kind of rebel against capitalism. Like our time and our energy and our bodies are the currency that capitalism operates off of.

We discussed “attention” as the new frontier of civic power — something both tech companies and social movements compete for. Even mindfulness, once a personal wellness practice, becomes a political gesture when it interrupts the algorithm’s grip.

Mindfulness, Attention, and the Self

We discussed how mindfulness can help shift us from a “survivalist mind,” constantly reacting to fear, news, and comparison, to an “attentionist mind” capable of calm observation.

We even tried a simple exercise: lightly touching our fingertips together and noticing the subtle ridges of sensation. For a moment, attention was no longer a commodity — it was presence.

Another student mentioned that yoga, practiced at home by his parents, can reclaim this awareness:

to be at ease with yourself, at ease with your mind, at ease with, no matter what position you’re in, you’re still calm and composed and collected

Social Media for Connection

Not all stories were cautionary. One student described joining a run club he discovered through social media — a digital doorway to genuine human connection:

“Through that I met people I’d never otherwise meet. It’s social media being used in a positive way.”

Another student shared how social media fueled nationwide protests in Nepal by raising awareness of corruption:

“The government had to shut down the Internet to stop people from organizing. That shows the power of the right algorithm.”

From Kafka to the Classroom

The conversation eventually turned to Franz Kafka’s The Trial. Like Kafka’s protagonist Joseph K., who spends his life navigating an opaque bureaucracy without knowing his crime, we risk spending our lives responding to systems we barely understand.

Will we one day be judged by how we used our attention?

Perhaps the antidote is awareness — of self, society, and the digital infrastructures mediating both. As one student summarized:

“It made us think about many ideas we hadn’t before. Slowing down might be one of the most radical things we can do.”

Related Resources

featured image credit: Photo by ANOOF C on Unsplash

Humanity in the Age of AI: A Conversation with Richa Dwor

What does it mean to be human in an age of rapid technological growth? As artificial intelligence becomes a dominant force shaping how we communicate, learn, and interpret the world, conversations about what to preserve of our shared humanity are becoming essential.

In a recent dialogue, Douglas College’s Richa Dwor shared her reflections on human dignity, critical thinking, and why reading on paper might be a small but powerful act of resistance.

Richa currently serves as chair of the English department and coordinator of the Research and Innovation Office at the college. A scholar of Victorian literature, she focuses on Jewish women’s writing from that period, and also on religion and affect: the categories of emotion that people feel within and through religious experience and in religious communities.

“If at a bare minimum, if what we can do is look at another person and say you can be in this world, and you can be in the world with dignity, and maybe that’s all I can give you, because I can’t understand your way of thinking
 I think that, at bare minimum, is humanity. Humanism. It’s extending a kind of tolerance towards one another. To me, that’s what, at bare minimum, that’s what tolerance looks like. And then you can build onto that with compassion, empathy, kindness, generosity.”


Critical Capacities We Can’t Afford to Lose

“Firstly, we need to retain our critical competencies, our ways of thinking, our modes of thought that are grounded in the humanities. So by this, I mean the ability to understand and hold in conversation multiple points of view. The ability to think historically, and to understand the progress of ideas and events as they have occurred, and to recognize that our own moment is not inevitable, and that its impact on the future is hard to understand.”

“And the other thing we need is access to artifacts of the past. And to preserve what we have of our shared cultural heritage.”

On Interpretation, Paraphrase, and Translation

One of the richest parts of our conversation centered on language, reading, and interpretation. Richa emphasized:

“Paraphrase and close reading are exactly what I teach students to do. It’s the biggest part of the intellectual work that I’m guiding students to develop, which is
 once you’ve found [a source], reading it and understanding it. This cannot be lost. We cannot let the machine become the interpreter.”

“If we no longer perform our own act of summary and paraphrase, it means that we no longer are able to engage in close reading of a source sufficient to develop our own comprehension of it. And that is at the root of humanistic ways of thinking.”

“You can’t have true discourse with AI. The interpreter is still a person, a human in the community
 even if there’s an interpreter, it should still be possible for others in the community to themselves engage in their own act of interpretation, right? Which may challenge what the interpreter has delivered.”

“If I were to publish material from another language, and I wanted it translated into English, I would never, ever, ever, ever publish an AI translation. I would always hire a human translator. So, it’s one of those things that is useful, but the human act of translation is irreplaceable.”

Asked for a practical takeaway, Richa’s advice was clear:

“Read a book. On paper. Just sit down and read the newspaper, or read a book on paper, and just think about it a bit. Read the next one. Yeah, just, just, time and concentration with text, so that you can flex your capacity for paying attention. And for reflection.”

As the conversation turned more personal, both of us reflected on moments of identity change. Richa offered this poignant insight:

“Life is not linear. We are living in a state of chaos. As Joan Didion says, we tell ourselves stories in order to live, so all of the stories that we tell ourselves are just attempts to impose a kind of rational narrative sequence on what is otherwise just one damn thing after another. And there’s a kind of peace that comes from getting rid of that narrative and just recognizing that it’s just one thing after another, and all I can do to center myself is just, like, just try to be good, and be good to other people, and just, like, ride it.”


And perhaps that is the heart of the matter. Technologies will evolve. But the work of recognizing, interpreting, and relating remains human.

About this series: “Rescuing Our Humanity” is a dialogue series hosted through the Douglas Educational Support Community (DESC), exploring what it means to be human in an age of rapid technological transformation. Each conversation invites faculty, staff, and students to reflect on what we must protect, preserve, or restore of our shared humanity.

Attending to What Matters: A Dialogue on AI, Empathy, and the Costs of Convenience

An interview with Dr. Jennifer Jill Fellows, Philosophy & Humanities instructor

image from https://todaytesting.com/free-social-media-marketing-free-images/; Wikimedia Commons

As artificial intelligence continues to expand into classrooms, workplaces, and even our private lives, Dr. Fellows reminds us that technological convenience always comes with a cost.

A philosophy instructor and Associate of Arts Coordinator at Douglas College, Dr. Fellows is currently researching our growing relationships with AI companions — from “grief bots” modeled after loved ones to conversational agents like ChatGPT and Replika. Her upcoming course, Ethics and Technology (Winter 2026), will explore these themes in depth.

A central point in the dialogue is that even when chatbots aren’t designed to be companions, many people still form emotional attachments to them. We anthropomorphize — we imagine what it’s like to ‘be’ them — and that’s both a human superpower and a vulnerability.

In our conversation, Dr. Fellows reflected on what happens when we begin offloading moral and cognitive skills to machines. While using AI for efficiency may feel harmless, she cautions that it mirrors sending a robot to work out for us: we don’t actually strengthen our own intellectual or ethical “muscles.”

She draws from philosopher Shannon Vallor’s concept of Moral Deskilling, suggesting that empathy, care, and attentiveness are not fixed traits but skills that must be practiced. When digital companions respond without real needs or suffering, we lose opportunities to cultivate genuine empathy — even as loneliness rises.

Dr. Fellows also highlights the environmental and human costs of generative AI: massive energy use and the emotional toll borne by underpaid content moderators who make these systems safe for public use. “All convenience comes with a cost. You give something up to gain something,” she says.

For educators, her advice is both practical and profound: “And so I would ask people to be very mindful about what the costs are to themselves, what the costs are to their communities, what the costs are to the planet at large, and to really think about whether and how if they’re going to use these tools, how they’re going to use them, whether they’re going to use them. And for educators specifically, my other piece of advice would be to really think about what we want students to gain from these classrooms.”

Perhaps the deepest lesson came near the end of our exchange. Quoting the French philosopher, mystic, and political activist Simone Weil, Fellows emphasized that true attentiveness means holding one’s desires in check to receive from others and from the world. In an attention economy designed to feed our wants, that kind of listening may be the most radical skill of all.

A few links to topics mentioned in the recording
Thomas Nagel, “What Is It Like to Be a Bat?”
Joseph Weizenbaum and the ELIZA Chatbot
Authenticity in the age of digital companions by Sherry Turkle
Long may you run by Neil Young
MIT Research on impact of AI use on the brain
Simone Weil – Attention as a Moral and Psychological Act
Cyborg Goddess podcast episode with Dr. Nicole Ramsoomair: The costs of convenience

Here are three lists addressing categories of concern with generative AI and technology, and based on conversations with educators. Nothing definitive; just playing with the ideas.

Human skills and tasks we feel comfortable offloading to technology

  • Data storage, retrieval, and archiving
  • Complex calculations, analytics, and pattern detection
  • Information searching, sorting, and summarizing
  • Scheduling, reminders, and basic administrative workflows
  • Navigation, mapping, and real-time traffic updates
  • Translation for basic comprehension (not nuance or cultural depth)
  • Routine manufacturing, repetitive assembly, and hazardous tasks
  • Predictive maintenance and environmental monitoring in buildings or infrastructure

Human skills and activities we feel unsure about offloading to technology

  • Decision-making in morally ambiguous or high-stakes contexts
  • Education design and assessment that require understanding of learner needs
  • Creative work where originality, cultural resonance, or personal meaning matter
  • Complex negotiations and conflict resolution
  • Emotional support, counselling, and mentorship
  • Interpretation of art, literature, and cultural heritage
  • Environmental stewardship decisions with long-term community impact
  • Cross-cultural communication where subtle context matters

Human attributes that we need to protect, rescue, and restore

  • Empathy, compassion, and ethical discernment
  • Deep listening and presence in interpersonal interactions
  • Embodied awareness and sensory connection to place
  • Critical thinking and independent judgment
  • Patience, persistence, and the capacity for sustained attention
  • Civic engagement and shared responsibility in community life
  • Imagination and the ability to envision alternative futures
  • Wisdom drawn from lived experience and intergenerational knowledge

Digital Literacy, AI, and Staying Human in a Changing World

What does it mean to be digitally literate in 2025? For Britt Dzioba, Learning and Teaching Advisor at BCcampus, the answer goes far beyond knowing how to use a computer.

In a recent conversation, Britt reflected on her work with BCcampus and Digital Literacy development and the open-access Digital Literacy Hub—a resource designed to help faculty, students, and communities build the skills needed to thrive in an ever-changing digital environment. She noted that while the tools and platforms may shift rapidly, the underlying competencies—such as information literacy, critical engagement with technology, and digital well-being—remain essential.

The pandemic acted as a “bucket of ice water” moment, forcing faculty and students to adapt quickly to online learning. Since then, technologies like generative AI have added new layers of complexity, raising both opportunities and concerns. “I think technology can be a wonderful thing if used intentionally and if used critically. This is where I think, especially with AI, … it’s so important to keep the human in it…that is how we use AI critically and ethically, by centering human experience.”

From exploring algorithmic literacy to supporting accessibility for neurodivergent learners, Britt emphasizes the importance of slowing down, setting boundaries, and cultivating critical thinking. She sees small, grassroots actions—like community-led digital projects or mindful approaches to teaching and learning—as powerful antidotes to despair in the face of rapid change.

Ultimately, digital literacy is not just about mastering tools. It is about equipping ourselves to respond—rather than react—to change, and to shape technologies in ways that sustain human connection, equity, and wisdom.

The Psyche and Artificial Intelligence

Psychology faculty members Shahnaz Winer and Joe Thompson joined a conversation about how technology is reshaping minds, methods, and what we call the “psyche.”

Dr. Winer, a professor of psychology in her second year at Douglas College, described her focus on the brain–mind question and the nature of consciousness—work now inseparable from advances in artificial intelligence. She recently supervised an honours thesis exploring how tools like ChatGPT might guide a cognitive-behavioural therapy exercise to help with anxious thinking. Shahnaz shared her thought that our brain is a beautiful bridge between our humanity/physiology and our ‘divinity’—the abstract self we call psyche or soul, noting how AI’s rise makes these questions newly urgent.

Dr. Thompson, Chair of Psychology and Social Sciences, studies how naturally occurring data—for example, from people who play video games—can power new research. He also zoomed out to a big-picture concern: humans routinely overestimate attention and multitasking ability. Whether walking or driving, divided attention can turn risky. “The cognitive system is a very limited attentional resource, and people tend to overestimate how many resources they have.”

The conversation traced psychology’s roots—psyche as “breath” or “soul”—to today’s scientific lens on the brain, touching on the long-running debate about whether humans are fundamentally special or simply part of nature’s continuum. Both guests urged caution with fast-moving AI systems whose inner workings can be opaque. Because generative models are trained on human data, they can inherit our biases—making thoughtful guardrails and regulation part of the work ahead.

Dr. Winer emphasized what to protect as more tasks are offloaded to machines: empathy, connection, and community. “Our relationships and care for one another can’t be replaced,” she noted, adding that AI is still “in its infancy” and can be shaped to benefit humanity. Thompson closed by returning to an ancient reminder from Delphi—Know Thyself—arguing that we should use AI carefully until we better understand our own cognitive limits.

Rescuing the Human in the Age of AI: A Conversation with Nicole Vittoz

By Steven Bishop, Learning Designer, Douglas College

In an age of rapid technological change and expanding AI capabilities, educators are asking deep and necessary questions about the future of learning and what it means to be human. In a recent conversation with Nicole Vittoz, Associate Dean of Humanities and Social Sciences at Douglas College, we explored the tension between emerging technologies and enduring human capacities — and what this means for post-secondary education.

Nicole brings a thoughtful, multidisciplinary lens to this conversation, drawing on her background in psychology and her current role overseeing academic integrity and pedagogy. “…it’s hard to believe it’s been almost two years now since ChatGPT really blasted onto our stage of awareness. Many conversations over those couple of years have initially started off as, how do we stop this?,” she reflected. “…but I think we’re now slowly getting to the stage of ‘yes and’. Yes, these tools exist and we need to work within that new universe and adjust what we’re doing and try to rescue the human in there. But also use that opportunity to reframe what we’re doing, redefine what we’re doing, in some cases, revise what we’re doing. Where necessary, really articulate, don’t assume anything about other people’s motivations for engaging in the type of learning that we engaged in when we were in post-secondary.”

Rather than fearing that generative AI will erode academic standards, Nicole sees this as a moment to “rescue the human.” She emphasizes that traditional assignments like the term paper still hold deep value — not for the product alone, but for the process: developing voice, critical thinking, and intellectual identity. But that process must be made visible, intentional, and supported. “If I want my students to write well, then I’m teaching writing,” she said. “That’s part of my role, even if my course is philosophy or sociology.”

The conversation also touched on the dangers of mistaking convenience for authenticity. AI tools like ChatGPT can mimic voice and polish writing — but at what cost? “When students submit something they haven’t even read, and their name is on it, we have to talk about accountability,” Nicole noted. “It’s not just about plagiarism; it’s about presence and engagement.”

Nicole also highlighted the critical role of skepticism — especially in the humanities. “Our students are usually great at questioning authority,” she observed, “but somehow these technologies have slipped under their radar. Many students trust AI more than a human.” This, she argues, makes it even more urgent to model and teach critical inquiry.

She also reminded us that not all use of AI is negative — tools like text-to-speech and translation software can remove barriers for students with disabilities or language challenges. But the real question becomes: What do we want to offload, and what do we need to protect?

For Nicole, the answer lies in fostering authentic learning, nuanced communication, and ethical reflection. She invites educators to consider transparency frameworks, assignment redesign, and open dialogue with students. And perhaps most importantly, to reclaim the space of conversation itself — where questions unfold, insights emerge, and we remember why education matters.

“Dialogue is an important human attribute,” Nicole said in closing. “We need to keep talking about these things.”

And we will.

“What Makes Us Human?”: A Political Theorist Reflects on Freedom, Technology, and Awareness

In a recent dialogue hosted at Douglas College, I had the pleasure of sitting down with Dr. Jovian Radheshwar, a political science instructor and political theorist, to explore what it means to be human in an age shaped—and increasingly reshaped—by artificial intelligence.

Jovian opened our conversation with a stark observation: we are witnessing a global slide away from freedom, as technology merges with authoritarian politics to normalize social control. Drawing from Greek philosophy, he highlighted how the ancient word idiota referred not to someone lacking intelligence, but to one who is absorbed solely in private concerns, disconnected from the polis—the public world of shared concern. In contrast, true intelligence (physis in Greek) was linked to being deeply enmeshed in one’s surroundings, aware and responsive to the world.

Jovian offered a compelling critique of today’s techno-culture—especially the AI-driven promise of transcendence from human limitations. Figures like Peter Thiel and movements like Transhumanism, he argues, are rooted not in empathy or collective uplift but in a desire to dominate, to become “supermen” at the cost of humility, biodiversity, and social connection.

We reflected together on what awareness looks like in practice. For me, riding a bike through the city has become a daily exercise in sensing others, slowing down, and learning to pick up on the emotional “weather” of a space. Jovian connected this to urban survival skills—what he called a kind of “street ninja” intelligence—and also to the loss of embodiment that comes from excessive digital distraction.

Our conversation turned to education and the risks of reducing learning to a game of performance metrics. Jovian made a passionate case for embracing failure, for cultivating diverse skills, and for protecting the space to explore what we might be good at—beyond what the system rewards. He quoted Heidegger to illustrate how modern technology shifts us from “taking care” of the Earth to “challenging” it—a shift with profound ethical and ecological consequences.

So, what skills should we protect? What human capacities must we nurture?

Jovian suggests starting with awareness, reflection, the capacity to fail, and the courage to imagine freedom—not as domination, but as mutual care.

This is just the beginning. We both agreed: this conversation needs a Part Two.

Mapping More Than Terrain: Land, Story, and Spirit in the Geography Lab

In a recent conversation with Sasha Djakovic, Geography Lab Technician at Douglas College, I was struck by how deeply experiential and relational geography education can be—especially when it is tied to land-based knowledge and Indigenous ways of knowing.

Sasha’s work, both in the lab and on the land, bridges high-tech tools like GIS and LiDAR drones with ancient stories, place names, and protocols rooted in the Lil’wat Nation’s territory. Through the Lil’wat Archaeological Research Project, he and colleagues like Bill Angelbeck, with guidance from Indigenous leaders such as Jennifer Anaquod, have facilitated a rare kind of learning: one that pairs digital mapping with spiritual respect, physical geography with oral tradition, and student development with cultural humility.

The Geography lab’s augmented reality sandbox, for instance, helps students visualize topography and contour lines in three dimensions. But Sasha reminds us that the best geography lab is still outside. In Lil’wat territory, stories like the Copper Canoe become topographical narratives—legends validated not only by tradition but also by modern science. The land itself, with its steep terrain and active volcanoes, speaks back to these stories in powerful ways.

What stood out most in our exchange was the emotional transformation students undergo during their time on these digs. From opening ceremonies with drumming to observing protocols around sacred sites and funerals, students are asked not just to learn—but to feel, to listen, and to honour. This is not just skill-building. It’s soul-building.

At a time when generative AI is rewriting the rules of what skills we offload to machines, Sasha’s reflections remind us of what we must not offload: spiritual sensitivity, ethical responsibility, and deep relational awareness with the land and its stories. These are not just Indigenous ways of knowing—they are human ones, and they’re more vital than ever.

And…good timing! This article about the archeological work came out shortly after our recorded dialogue: Archeological dig on Lil’wat territory uncovers ancient histories and reframes research relationships

Inclusive Teaching with Sarah Skinner


We caught up with Sarah Skinner, an Early Childhood Education instructor, about her first year in a full-time faculty role and her evolving approach to inclusive, student-centered teaching. Sarah shares how Universal Design for Learning (UDL) principles, multiple means of expression, and real-world inclusive strategies are shaping her courses—especially in asynchronous online environments.

Sarah reflects on her recent attendance at the Canadian Society for the Study of Education (CSSE) conference and shares a few memorable sessions and takeaways, including using case studies based on refugee families’ lived experiences and incorporating visual note-taking into learning activities. These approaches center student agency and creativity while maintaining clear learning goals.

Key Topics

  • Sarah’s background in occupational therapy and inclusive education
  • Implementing Universal Design for Learning (UDL) in practice
  • Tools like Padlet for engagement and collaboration
  • The value of choice in learning modalities (e.g., essays, podcasts, paintings, carvings)
  • Conference reflections from CSSE and ideas for case-based learning
  • Explorations of visual note-taking in asynchronous online learning
  • The importance of preserving joy, diversity, and human-centered learning in the face of increasing automation and AI

Tools & Teaching Strategies

  • Padlet for visual and collaborative learning
  • Kaltura media tools for multimedia sharing in online courses
  • Visual note-taking as an inclusive and creative learning strategy
  • Gallery walks as assessment and dialogue tools
  • Flexible assessments (papers, songs, carvings, videos) that align with consistent learning outcomes

Takeaway Quotes

“Inclusive education is education.”
“UDL is about fixed goals and flexible means.”
“We’re not just looking for a well-written essay—we’re looking for understanding.”
“When students are engaged and joyful, they’re more likely to think for themselves.”


🔗 Related Links

Access Sarah’s presentation via the link below:

Padlet-for-Choice-and-Flexibility-1


Sarah welcomes collaboration and conversation with other Douglas College instructors who are interested in inclusive teaching, creative assessments, or translating hands-on activities to online environments.

Team-based Learning with Mustafa Syed

In this follow-up to the DESC Community-Based Learning workshop, Mustafa Syed from the Training Group describes how he uses digital tools and team-based learning approaches to support adult learners—particularly those in self-employment and career-transition programs.

The conversation explores how instructional design grounded in team collaboration, community building, and the BC Post-Secondary Digital Literacy Framework can create real-world outcomes for diverse learners. Drawing on examples from programs like Self-Employment Services, VOICE, and Encore, Mustafa shares how team dynamics, digital skill-building, and collaborative projects foster meaningful communication, promote peer-to-peer engagement, and empower students to connect, share, and co-create in dynamic digital environments, and to transform their business ideas into action.

This recorded conversation highlights practical tools and strategies for fostering collaboration and digital literacy in team-based learning environments. Mustafa shares how platforms like Microsoft Whiteboard, Padlet, Blackboard, and Zoom are used to support real-time interaction, peer learning, and inclusive group work. Instructors will find inspiration for engaging learners and supporting diverse teams. Key takeaways include using digital tools to reduce barriers, form learning communities, and empower students to co-create meaningful projects.

Recording of conversation with Mustafa Syed