Becoming an Algorithmic Problem: Slowing Down as a Political Act with INST 1100 students

Image based on a pencil drawing made by Steven Bishop, inspired by an image from the 1971 book “Be Here Now” by Ram Dass (Richard Alpert)

In a recent class dialogue with Dr. Jovian Radheshwar’s INST 1100 students, we explored ideas from political scientist Dr. José Marichal, author of You Must Become an Algorithmic Problem: Renegotiating the Sociotechnical Contract. His work examines how our lives have become entwined with algorithms — systems that categorize, predict, and shape our digital experiences.

Here’s a simple way to think about the difference between social media and algorithms, drawing on Dr. Marichal’s explanation during the class discussion:

  • Social media is the visible layer — the apps and platforms where we post, share, and connect. It’s where we perform our identities and find community.
  • Algorithms are the invisible layer — the systems behind those platforms that decide what we see, when we see it, and who sees us. They quietly shape our feeds, recommendations, and even our sense of what’s “normal.”

Social media is the stage. Algorithms are the directors. Marichal’s key point is that we often blame the stage, but forget about the directors quietly shaping the show.

Dr. Marichal draws a parallel between the social contract — the implicit agreement we make with society to exchange certain freedoms for collective security — and what he calls the sociotechnical contract. In this new digital era, we have unwittingly traded our attention, privacy, and even aspects of independent thought for the conveniences of social media, algorithmic news feeds, and digital companionship.

The Attention Economy and the Outlier

Students were quick to recognize how easily social media can shape perception. One noted how their feed filled with political praise for one U.S. leader without presenting opposing views. Another shared how TikTok’s algorithms “trained” themselves to reflect heartbreak or attraction—depending on what held their attention longest.

Dr. Marichal’s warning is that algorithms thrive on generalization. They group us into categories that make us easier to sell to, sway politically, or polarize socially. The “outliers” — those who don’t fit the model — are vital to creativity, democracy, and human progress. As one student insightfully said, “You have to train your social media algorithm to what you actually want.”

Slowing Down as Resistance

We considered whether slowing down could itself be a political act. What if, in a world obsessed with speed, scrolling, and optimization, we simply paused?

One student responded:

If we just stopped using Facebook or Instagram, we could make the founders go bankrupt. Our attention is what they want — and we have the power to take it back.

Another reflected:

I think slowing down makes sense as a political act because if you can slow down, you can kind of rebel against capitalism. Like our time and our energy and our bodies are the currency that capitalism operates off of.

We discussed “attention” as the new frontier of civic power — something both tech companies and social movements compete for. Even mindfulness, once a personal wellness practice, becomes a political gesture when it interrupts the algorithm’s grip.

Mindfulness, Attention, and the Self

We discussed how mindfulness can help shift us from a “survivalist mind”—constantly reacting to fear, news, and comparison—to an “attentionist mind,” capable of calm observation.

We even tried a simple exercise: lightly touching our fingertips together and noticing the subtle ridges of sensation. For a moment, attention was no longer a commodity — it was presence.

Another student mentioned that yoga, practiced at home by his parents, can reclaim this awareness:

to be at ease with yourself, at ease with your mind, at ease with, no matter what position you’re in, you’re still calm and composed and collected

Social Media for Connection

Not all stories were cautionary. One student described joining a run club he discovered through social media — a digital doorway to genuine human connection:

“Through that I met people I’d never otherwise meet. It’s social media being used in a positive way.”

Another student shared how social media fueled nationwide protests in Nepal by raising awareness of corruption:

“The government had to shut down the Internet to stop people from organizing. That shows the power of the right algorithm.”

From Kafka to the Classroom

The conversation eventually turned to Franz Kafka’s The Trial. Like Kafka’s protagonist Joseph K., who spends his life navigating an opaque bureaucracy without knowing his crime, we risk spending our lives responding to systems we barely understand.

Will we one day be judged by how we used our attention?

Perhaps the antidote is awareness — of self, society, and the digital infrastructures mediating both. As one student summarized:

“It made us think about many ideas we hadn’t before. Slowing down might be one of the most radical things we can do.”

Related Resources

featured image credit: Photo by ANOOF C on Unsplash

Digital Literacy, AI, and Staying Human in a Changing World

What does it mean to be digitally literate in 2025? For Britt Dzioba, Learning and Teaching Advisor at BCcampus, the answer goes far beyond knowing how to use a computer.

In a recent conversation, Britt reflected on her digital literacy work at BCcampus, including the open-access Digital Literacy Hub—a resource designed to help faculty, students, and communities build the skills needed to thrive in an ever-changing digital environment. She noted that while the tools and platforms may shift rapidly, the underlying competencies—such as information literacy, critical engagement with technology, and digital well-being—remain essential.

The pandemic acted as a “bucket of ice water” moment, forcing faculty and students to adapt quickly to online learning. Since then, technologies like generative AI have added new layers of complexity, raising both opportunities and concerns. “I think technology can be a wonderful thing if used intentionally and if used critically. This is where I think, especially with AI, … it’s so important to keep the human in it…that is how we use AI critically and ethically, by centering human experience.”

From exploring algorithmic literacy to supporting accessibility for neurodivergent learners, Britt emphasizes the importance of slowing down, setting boundaries, and cultivating critical thinking. She sees small, grassroots actions—like community-led digital projects or mindful approaches to teaching and learning—as powerful antidotes to despair in the face of rapid change.

Ultimately, digital literacy is not just about mastering tools. It is about equipping ourselves to respond—rather than react—to change, and to shape technologies in ways that sustain human connection, equity, and wisdom.

Rescuing the Human in the Age of AI: A Conversation with Nicole Vittoz

By Steven Bishop, Learning Designer, Douglas College

In an age of rapid technological change and expanding AI capabilities, educators are asking deep and necessary questions about the future of learning and what it means to be human. In a recent conversation with Nicole Vittoz, Associate Dean of Humanities and Social Sciences at Douglas College, we explored the tension between emerging technologies and enduring human capacities — and what this means for post-secondary education.

Nicole brings a thoughtful, multidisciplinary lens to this conversation, drawing on her background in psychology and her current role overseeing academic integrity and pedagogy. “…it’s hard to believe it’s been almost two years now since ChatGPT really blasted onto our stage of awareness. Many conversations over those couple of years have initially started off as, ‘How do we stop this?’” she reflected. “…but I think we’re now slowly getting to the stage of ‘yes, and.’ Yes, these tools exist and we need to work within that new universe and adjust what we’re doing and try to rescue the human in there. But also use that opportunity to reframe what we’re doing, redefine what we’re doing, in some cases, revise what we’re doing. Where necessary, really articulate, don’t assume anything about other people’s motivations for engaging in the type of learning that we engaged in when we were in post-secondary.”

Rather than fearing that generative AI will erode academic standards, Nicole sees this as a moment to “rescue the human.” She emphasizes that traditional assignments like the term paper still hold deep value — not for the product alone, but for the process: developing voice, critical thinking, and intellectual identity. But that process must be made visible, intentional, and supported. “If I want my students to write well, then I’m teaching writing,” she said. “That’s part of my role, even if my course is philosophy or sociology.”

The conversation also touched on the dangers of mistaking convenience for authenticity. AI tools like ChatGPT can mimic voice and polish writing — but at what cost? “When students submit something they haven’t even read, and their name is on it, we have to talk about accountability,” Nicole noted. “It’s not just about plagiarism; it’s about presence and engagement.”

Nicole also highlighted the critical role of skepticism — especially in the humanities. “Our students are usually great at questioning authority,” she observed, “but somehow these technologies have slipped under their radar. Many students trust AI more than a human.” This, she argues, makes it even more urgent to model and teach critical inquiry.

She also reminded us that not all use of AI is negative — tools like text-to-speech and translation software can remove barriers for students with disabilities or language challenges. But the real question becomes: What do we want to offload, and what do we need to protect?

For Nicole, the answer lies in fostering authentic learning, nuanced communication, and ethical reflection. She invites educators to consider transparency frameworks, assignment redesign, and open dialogue with students. And perhaps most importantly, to reclaim the space of conversation itself — where questions unfold, insights emerge, and we remember why education matters.

“Dialogue is an important human attribute,” Nicole said in closing. “We need to keep talking about these things.”

And we will.

“What Makes Us Human?”: A Political Theorist Reflects on Freedom, Technology, and Awareness

In a recent dialogue hosted at Douglas College, I had the pleasure of sitting down with Dr. Jovian Radheshwar, a political science instructor and political theorist, to explore what it means to be human in an age shaped—and increasingly reshaped—by artificial intelligence.

Jovian opened our conversation with a stark observation: we are witnessing a global slide away from freedom, as technology merges with authoritarian politics to normalize social control. Drawing from Greek philosophy, he highlighted how the ancient word idiota referred not to someone lacking intelligence, but to one who is absorbed solely in private concerns, disconnected from the polis—the public world of shared concern. In contrast, true intelligence (physis in Greek) was linked to being deeply enmeshed in one’s surroundings, aware and responsive to the world.

Jovian offered a compelling critique of today’s techno-culture—especially the AI-driven promise of transcendence from human limitations. Figures like Peter Thiel and movements like Transhumanism, he argues, are rooted not in empathy or collective uplift but in a desire to dominate, to become “supermen” at the cost of humility, biodiversity, and social connection.

We reflected together on what awareness looks like in practice. For me, riding a bike through the city has become a daily exercise in sensing others, slowing down, and learning to pick up on the emotional “weather” of a space. Jovian connected this to urban survival skills—what he called a kind of “street ninja” intelligence—and also to the loss of embodiment that comes from excessive digital distraction.

Our conversation turned to education and the risks of reducing learning to a game of performance metrics. Jovian made a passionate case for embracing failure, for cultivating diverse skills, and for protecting the space to explore what we might be good at—beyond what the system rewards. He quoted Heidegger to illustrate how modern technology shifts us from “taking care” of the Earth to “challenging” it—a shift with profound ethical and ecological consequences.

So, what skills should we protect? What human capacities must we nurture?

Jovian suggests starting with awareness, reflection, the capacity to fail, and the courage to imagine freedom—not as domination, but as mutual care.

This is just the beginning. We both agreed: this conversation needs a Part Two.

Mapping More Than Terrain: Land, Story, and Spirit in the Geography Lab

In a recent conversation with Sasha Djakovic, Geography Lab Technician at Douglas College, I was struck by how deeply experiential and relational geography education can be—especially when it is tied to land-based knowledge and Indigenous ways of knowing.

Sasha’s work, both in the lab and on the land, bridges high-tech tools like GIS and LiDAR drones with ancient stories, place names, and protocols rooted in the Lil’wat Nation’s territory. Through the Lil’wat Archaeological Research Project, he and colleagues like Bill Angelbeck, with guidance from Indigenous leaders such as Jennifer Anaquod, have facilitated a rare kind of learning: one that pairs digital mapping with spiritual respect, physical geography with oral tradition, and student development with cultural humility.

The Geography lab’s augmented reality sandbox, for instance, helps students visualize topography and contour lines in three dimensions. But Sasha reminds us that the best geography lab is still outside. In Lil’wat territory, stories like the Copper Canoe become topographical narratives—legends validated not only by tradition but also by modern science. The land itself, with its steep terrain and active volcanoes, speaks back to these stories in powerful ways.

What stood out most in our exchange was the emotional transformation students undergo during their time on these digs. From opening ceremonies with drumming to observing protocols around sacred sites and funerals, students are asked not just to learn—but to feel, to listen, and to honour. This is not just skill-building. It’s soul-building.

At a time when generative AI is rewriting the rules of what skills we offload to machines, Sasha’s reflections remind us of what we must not offload: spiritual sensitivity, ethical responsibility, and deep relational awareness with the land and its stories. These are not just Indigenous ways of knowing—they are human ones, and they’re more vital than ever.

And…good timing! This article about the archeological work came out shortly after our recorded dialogue: Archeological dig on Lil’wat territory uncovers ancient histories and reframes research relationships

Free OER Book: ChatGPT Assignments to Use in Your Classroom Today

As part of DESC’s ongoing work to provide resources to educators across Douglas College, we offer the following Creative Commons licensed book from the University of Central Florida on creating assignments that incorporate AI chat tools such as ChatGPT. These would also work with Douglas College’s access to Microsoft’s Bing AI chat through our Office 365 accounts.

Reference:

Yee, Kevin; Whittington, Kirby; Doggette, Erin; and Uttich, Laurie, “ChatGPT Assignments to Use in Your Classroom Today” (2023). UCF Created OER Works. 8.

Cover of ChatGPT Assignments to Use in Your Classroom Today. Click on the image to download.

Stumbling Blocks and Stepping Stones – “AI in the classroom”

The ChatGPT main screen

Join your peers from across Douglas College as we share “Stumbling Blocks and Stepping Stones”: a monthly series celebrating our struggles and successes in teaching and learning.

Moderated by Tim Paul, Manager, Academic Technology Services and member of Douglas Educational Support Community (DESC), “Stumbling Blocks and Stepping Stones” is structured around 3 short faculty presentations and an optional 30-minute discussion time. Developed under the philosophy of professional development through collegial sharing, we hope that the experience of others will help you to find community, creativity, and the strength to explore new opportunities.

  • Theme: AI in the Classroom: Educator Experiences and Classroom Guidelines
  • Date: Tuesday, November 14
  • Time: 10:35-11:30 am

Our panelists will be:

  • Jim Palmer (Music, LLPA)
  • Nina Blanes (BSN, HS)
  • Doug Beech (Marketing, CBA)

In this session, our panelists will provide examples of how AI is utilized (and not utilized) in their classrooms, shedding light on their approaches to communicating with students about the responsible use of generative technology. Gain valuable insights and practical inspiration for your own teaching methods in this session. 

Visit the Stumbling Blocks and Stepping Stones page for the recording once the session is done.

Three things you need to know about AI “detectors”

Grey background with AI written in white in large letters in the centre of the image

Instructors may not submit student material to AI detectors: terms and conditions require ownership of the intellectual property that is submitted, and students own their IP, not instructors.

  • Douglas College – Freedom of Information and Protection of Privacy Act (FIPPA)
  • Terms and Conditions from Copyleaks AI detector
    • “By accessing and/or using the site or services for any purpose whatsoever, you agree to the collection and use of any information or documents uploaded to the site or services, in addition to the collection of your personal information, all in accordance with these terms and under our privacy policy. Unless you delete any uploaded documents from Copyleaks’ servers in accordance with our privacy policy, we reserve the right to keep such uploaded documents in a data-base and use such documents for all purposes listed in our privacy policy.”

Student work may not be submitted to AI detectors without their consent: students have the right to know if their material is being submitted to a third party (who may profit from their IP) and the right to withhold consent (Rosenfeld v. McGill, 2004).

  • At Douglas College, you are required to get consent from a student before submitting their work to any third-party site that has not been properly vetted.
  • Students must be able to opt out without penalty.

“Detectors” don’t work as advertised: OpenAI recently admitted this by pulling its own “detector.” Beware of the “appeal to statistics” fallacy.


© 2023 Janette Tilley and Nathan Hall. This document is Creative Commons licensed under CC BY-SA 4.0. This license allows reusers to distribute, remix, adapt, and build upon the material in any medium or format, so long as attribution is given to the creator. The license allows for commercial use. If you remix, adapt, or build upon the material, you must license the modified material under identical terms.

Blackboard Ultra – Setting up assignments

A ceramic mug on a desk next to a laptop open to a group Zoom call

Every month, the Blackboard Ultra Champions group meets to discuss topics related to supporting those who are making the switch to Blackboard Ultra. This group is made up of Blackboard Ultra users who share ideas and questions with the goal of supporting all users across the college. From time to time, we will share sections of those recorded meetings. Today’s meeting focused on the area of setting up assignments in Blackboard Ultra. Here is a 20-minute section of that meeting demonstrating some of the nuances of grading and assignment set up. Click on the image below to access that recording.

Easy Transition To Blackboard Ultra – Fall 2023 Workshop Sessions

A man with a beard sitting at a table with a laptop in front of him

If you are teaching a face-to-face class in Winter, Summer, or Fall 2024, and will be using Blackboard to enhance or supplement your course, please register for one of the following Easy-Transition-to-Ultra sessions according to your availability. This opportunity will minimize the time and effort you need to make the transition. By the time you finish this 1-hour session, you will have the information to convert your existing Blackboard Original content into an Ultra Sandbox, Course Master, or upcoming course.

Select a registration date:

Tuesday, Oct. 10, 10:30 am

Wednesday, Oct.11, 1:00 pm

Wednesday, Oct.25, 10:30 am

Thursday, Oct.26, 1:00 pm

Friday, Nov.10, 10:30 am