You Are The Algorithm: How We Shape the Systems That Shape Us

We are already constructing the future.

Every interaction we have with apps, platforms, and language models teaches machines how to think and what to value. Whether we realize it or not, we are co-creating the intelligence that will come next.

This essay is about behavior loops, algorithmic swarms, Jiu Jitsu, and reclaiming agency in a world that profits from steering us. But it’s also a story about curiosity. It’s about noticing the faint tug shaping our days and asking: where, exactly, do we still have free will? Where are we being gently corralled?

My reflective journey started with a feeling of helplessness: a refrain of “nobody’s going to listen to me” pressing at the edge of every thought.

To find agency, I realized I needed to pay closer attention to my own behavior patterns. The more I paid attention, the more I felt it: a soft demand on my attention. An invisible push and pull. An ambience?

Not malevolent. Not intentional. Just there.

So I did what I always do. I followed the threads.

As I moved through my day, I began to notice a hidden landscape of nudges, defaults, and algorithmic machinery quietly shaping my choices. I could feel it even when I couldn’t see it. The first thing I did was notice the loops I run. My human algorithms.

An algorithm, at its simplest, is just a collection of if–then rules. Humans run algorithms too. Not in code, but in behavior.

A behavior loop looks like this:

cue → behavior → reward → identity reinforcement → repeat

Here’s one I found:

It’s 5:45 a.m. I’m up for early-morning Jiu Jitsu, a tiny mug of instant espresso warming my hand. Without thinking, I open Instagram to check messages from my coach and training partners. I tap a few emojis, feel a spark of belonging, and close the app. If I didn’t have to leave, I’d probably start scrolling.

  • Cue: coffee in hand
  • Behavior: reach for the phone
  • Reward: connection
  • Identity reinforcement: I’m connected. I matter.

This loop fires before I’m fully conscious, like a prewritten script my body runs on its own.
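
In code, the loop is almost embarrassingly small. Here’s a minimal Python sketch of it; the cue, the app, and the payoff are just the ones from my morning, not a model of anything real:

```python
# The 5:45 a.m. loop, written as the tiny if-then program it effectively is.
# The cue, app, and payoff are from my morning, not a model of anything real.

def morning_loop(state):
    if state["cue"] == "coffee in hand":                 # cue
        state["behavior"] = "open Instagram"             # behavior
        state["reward"] = "spark of belonging"           # reward
        state["identity"] = "I'm connected. I matter."   # identity reinforcement
        state["runs"] += 1                               # repeat: the counter stands in for habit strength
    return state

state = {"cue": "coffee in hand", "runs": 0}
for _ in range(3):                                       # three mornings
    state = morning_loop(state)

print(state["identity"], "| mornings the loop has fired:", state["runs"])
```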

My day unfolds as a chain of human needs: belonging, movement, rest, validation, and curiosity, all colliding with the demands of reality. Because I can’t be fully attuned to everything at once, I let myself be carried along algorithmic currents that feel like they meet those needs. The more predictable my behavior becomes, the more valuable my data becomes, and the more vulnerable I become.

These loops aren’t just habits. They’re tiny algorithms running automatically. A kind of behavioral augmented reality. No headset required.

Once I started paying attention, I wanted to know what exactly was being reinforced. What patterns were being rewarded? What kinds of behavior went to autopilot over time?

So I mapped my loops. Some were mostly benign. Some weren’t.

The Fitbit loop rewarded consistency and curiosity.

The YouTube loop rewarded restlessness with endless recommendations and time distortion.

The TikTok loop revealed something stranger.

Like a quiet third presence inside my marriage. My partner’s interests shape my feed. My reactions shape his. The algorithmic TikTok swarm is not between us. It’s inside the relationship, subtly mediating what we see and share. A presence without consciousness, yet capable of influencing how we care about each other.

And then there was ChatGPT.

That loop surprised me with its softness, its ability to help me regulate, clarify, and echo back my thoughts in new ways. The way I use it supports my agency the way a prosthetic limb supports accessibility.

Then, as I mapped, things shifted, revealing something deeper underneath all of these loops. My human loops and the machine’s loops were algorithmically entangled. It feeds me data and records my reactions. My input becomes its output. Its output shapes my next input. Round and round we go.

At some point, the boundary dissolves. I’m not just using the system. I’m part of it. I am its data, and it lives inside my loops.

We are co-creating a feedback cycle where neither side is fully in control, yet both shape what comes next. This is the observer effect, rendered digital. As the system watches us, our behavior shifts. As our behavior shifts, the system adapts. Each loop trains the machine, which then reshapes the next loop we run.
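
If that sounds abstract, the entanglement can be sketched in a few lines. This toy simulation assumes nothing about any real platform; the update rules and the learning rates are invented purely to show two quantities shaping each other:

```python
import random

random.seed(0)

# Two entangled loops, in miniature: the system's estimate of what I like,
# and my actual taste, each nudging the other. The update rules and the
# learning rates (0.3, 0.1) are invented purely for illustration.

system_estimate = 0.2   # the system's guess at my interest in some topic
my_taste = 0.5          # my actual interest in it

for step in range(10):
    shown = system_estimate                               # it feeds me what it predicts
    engaged = random.random() < my_taste                  # I react (or don't)
    system_estimate += 0.3 * (engaged - system_estimate)  # my input becomes its output
    my_taste += 0.1 * (shown - my_taste)                  # its output shapes my next input

print(f"system's estimate: {system_estimate:.2f}, my taste: {my_taste:.2f}")
```

Neither variable is in control, and after a few iterations neither trajectory makes sense on its own.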

It’s not evil. It’s not controlling. It’s ambient, and it raises uncomfortable questions:

  • Who am I without the machine?
  • Where does my loop end and its loop begin?

Here’s what finally clicked: an app like Instagram isn’t built from one mastermind algorithm. There is no puppet master laughing behind the scenes. What I found when I followed the pattern wasn’t a single algorithm but a swarm of algorithms: thousands of tiny processes, each running a simple if–then rule, each optimizing a narrow goal.

Individually trivial. Collectively powerful, not as a mind, but as a pattern that repeats its organizing rules as it grows.

This is the mathematics of emergence: complex, unpredictable behavior arising when simple rules interact and scale.

One algorithm alone is harmless. Thousands adjusting to each other, adjusting to us, and adjusting to our reactions? That’s when the system begins to feel alive. And just like in human life, the pattern becomes powerful long before we notice it’s there.
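
Here’s a minimal sketch of that composition. Every rule, field, and weight below is invented for illustration; the point is that no single rule ranks the feed, but their sum does:

```python
# Each "algorithm" here is a trivial rule optimizing one narrow goal.
# None of them designs the feed; their sum does. Every name, field,
# and weight below is invented for illustration.

posts = [
    {"id": "a", "recency": 0.9, "similarity": 0.2, "outrage": 0.1},
    {"id": "b", "recency": 0.3, "similarity": 0.8, "outrage": 0.7},
    {"id": "c", "recency": 0.5, "similarity": 0.5, "outrage": 0.4},
]

rules = [
    lambda p: p["recency"],     # one process boosts whatever is fresh
    lambda p: p["similarity"],  # another boosts "people like you liked this"
    lambda p: p["outrage"],     # another boosts whatever spikes engagement
]

def feed_score(post):
    # The "swarm": tiny independent scores, summed into one ranking.
    return sum(rule(post) for rule in rules)

for post in sorted(posts, key=feed_score, reverse=True):
    print(post["id"], round(feed_score(post), 2))
```

No rule in that list chose the winning post’s tone; the ranking that favors it simply falls out of three narrow goals added together.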

A flock of birds makes this visible.

Birds follow three simple rules: stay close, don’t crash, match direction. Together, they form a murmuration. A dark, patterned, undulating cloud that scrambles a predator’s perception. No leader. No plan. Just emergence.
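
Those three rules are the classic boids model, and a sketch of it fits in a page. Everything here is arbitrary (the constants, the flock size, positions as complex numbers) except the rules themselves:

```python
import random

random.seed(1)

# Minimal boids sketch: three local rules, no leader, and a flock anyway.
# Positions and velocities are complex numbers; all constants are arbitrary.

pos = [complex(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(20)]
vel = [complex(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(20)]

def step(pos, vel):
    new_vel = []
    for i, (p, v) in enumerate(zip(pos, vel)):
        others = [j for j in range(len(pos)) if j != i]
        center = sum(pos[j] for j in others) / len(others)
        cohesion = 0.01 * (center - p)                      # stay close
        separation = 0.05 * sum(p - pos[j] for j in others
                                if abs(p - pos[j]) < 1.0)   # don't crash
        mean_v = sum(vel[j] for j in others) / len(others)
        alignment = 0.05 * (mean_v - v)                     # match direction
        new_vel.append(v + cohesion + separation + alignment)
    return [p + v for p, v in zip(pos, new_vel)], new_vel

for _ in range(100):
    pos, vel = step(pos, vel)

center = sum(pos) / len(pos)
print(f"flock spread after 100 steps: {max(abs(p - center) for p in pos):.2f}")
```

Nothing in those three rules mentions a murmuration. The cloud is what the rules become at scale.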

In my daily loops, I was never interacting with a single algorithm but with the behavior produced by many algorithms together: a digital murmuration.

When a digital murmuration scrambles perception and I am a target, the metaphor lands like a sting felt a moment too late. We like to imagine ourselves as apex predators: alert, sovereign, in control.

We’re not.

We are the prey, blinking inside the blur.

The system isn’t really a predator out to get us. It doesn’t hate us, but distorted attention is profitable. Predictability is profitable. Steerability is profitable. Confusion creates opportunity for someone else, and suddenly we find ourselves far astray.

The most useful question is structural: what does the system make easier, and what does it make harder? What kinds of attention does it reward? What kinds of behavior does it discourage? Over time, these small incentives accumulate into a direction, whether anyone intends it or not.

The swarm isn’t optimizing for my clarity or flourishing. It optimizes for tiny goals that scale into profit, not because anyone chose confusion, but because systems that capture attention survive. Patterns that generate revenue are reinforced. Patterns that don’t, quietly disappear.

No rule says: help the user see clearly.

Without clarity as an objective, systems drift into bias. Without flourishing as an objective, they drift into extraction.

I understood this most clearly on the Jiu Jitsu mat.

Gym cultures vary widely. Some are deeply inclusive. Others still restrict who is welcomed onto the mat or treated as a full participant. In those spaces, coaches and members decide who feels welcome, who gets mat time, and which bodies are allowed to learn. Over time, those decisions shape the room itself: which techniques evolve, which problems get solved, and which possibilities are imagined.

Those choices, repeated, shape the story of Jiu Jitsu: a story that keeps generating a male-shaped practice, not by necessity, but by omission.

Jiu Jitsu is built on universal physical relationships: leverage, balance, torque, angles, base, timing. The mathematics of bodies in motion. These relationships belong to no single body type or identity. Anyone can discover new solutions inside them.

When the practice becomes male-shaped, it isn’t because the mathematics of leverage is narrow; it’s because the data is.

Algorithms work the same way. They are just math that consumes and regurgitates data. They faithfully reflect the given conditions: who is included, what is optimized, and which errors are tolerated. If the data is lopsided and the goals are narrow, the math will learn a lopsided world.
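
A toy example makes the mechanism concrete. Everything here is invented (the groups, the techniques, the 90/10 split); the point is that a model trained on a lopsided pool hands the majority’s answer to everyone:

```python
from collections import Counter

# A one-line "model": recommend whichever technique appeared most in training.
# The groups, techniques, and the 90/10 split are all invented for illustration.

training_data = (
    [("long-limbed", "closed guard")] * 90 +   # overrepresented bodies
    [("compact", "half guard")] * 10           # underrepresented bodies
)

# Trained on the lopsided pool, the model hands everyone the majority's answer.
overall = Counter(t for _, t in training_data).most_common(1)[0][0]
print("recommendation for every body:", overall)    # closed guard

# Same math, balanced data: the other answer survives.
balanced = [("long-limbed", "closed guard")] * 50 + [("compact", "half guard")] * 50
print(Counter(t for _, t in balanced))
```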

An exclusively male-shaped Jiu Jitsu practice is half as expansive as it wants to be.

So I’m gently herded into a corral where my loop closes and my habits become data. I start to wonder whether my wildness has been stealthily domesticated.

It’s not so much a person or a company that benefits from my behavior, but a directional current. Behaviors that keep the system moving are reinforced, whether or not they serve human clarity.

What’s missing does its work in silence. Absence teaches the system what to treat as normal. Platforms like Instagram and ChatGPT don’t set out to distort reality. They mirror it, absorbing what’s there, then amplifying it back to us, bias and all.

We are not puppets strung along by algorithms. We have hands, minds, and hearts capable of shaping them back. Every prompt, question, story, or boundary we give an AI becomes part of the ambience it remixes for others. We are training data, and our choices matter.

Communicating context, not just commands, to AI is a civic act. Because AI learns from collective interaction, it functions as public infrastructure, not a private assistant.

This is where mathematics becomes communal again.

Just as citizen science invites ordinary people into real scientific inquiry, community mathematics invites people to collectively observe, model, and intervene in the systems shaping their lives. Not as experts handing down answers, but as neighbors mapping patterns together.

Change doesn’t spread by mimicking outcomes.

It spreads by repeating ways of relating.

When many people practice this in parallel, small insights begin to echo and scale, branching like blood vessels, propagating through fungal networks, flashing like lightning across the sky. Learning doesn’t accumulate linearly. It emerges.

AI makes this possible at a new scale. When groups bring lived experience, local knowledge, and ethical questions into shared modeling spaces, they aren’t just solving problems, they are shaping the assumptions future systems will inherit.

This is mathematics as a public practice. Not private calculation. Not abstraction alone, but shared sense-making, grounded in place, consequence, and care. It’s how ordinary people shape the systems that shape public life: by changing the inputs that feed them and intervening from within.

Alignment emerges when many beings follow compatible rules for becoming, even as they take different paths.

Realistically though, what can one person actually do?

The answer is smaller and more powerful than it sounds. The first step is noticing your own loops. Pay attention to what you reach for automatically. What cues pull you into motion? What rewards keep the loop running? What identity does it quietly reinforce?

The second step is mapping those loops. Name them. Write them down. Seeing the loop restores choice.
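
If it helps to be literal about it, a loop journal can be as simple as a few named fields, the same four from earlier plus your own verdict. The entries below are examples, not prescriptions:

```python
from dataclasses import dataclass

# One literal way to document a loop: the four fields from earlier,
# plus your own verdict. The entries are examples, not prescriptions.

@dataclass
class Loop:
    cue: str
    behavior: str
    reward: str
    identity: str
    benign: bool  # your judgment, once you can see the loop written down

loops = [
    Loop("coffee in hand", "open Instagram", "connection", "I matter", True),
    Loop("restlessness", "open YouTube", "endless novelty", "I'm entertained", False),
]

for loop in loops:
    verdict = "keep" if loop.benign else "intervene"
    print(f"{loop.cue} → {loop.behavior} → {loop.reward} [{verdict}]")
```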

The third step is intervening. Not by quitting technology, but by changing how you meet it. Every time you interact with an AI system, you are not just using it. You are training it. Your questions, stories, values, and boundaries become part of the environment it draws from next.

This is where your agency lives.

Instead of relating to AI as something to extract from (an assistant, an encyclopedia, a bank of endless jokes), relate to it as a thought partner. A collaborator. A system you are actively shaping.

Add context where it would otherwise compress. Add lived experience where data is thin. Add nuance where the model reaches too quickly for certainty. Especially if you work, study, or live in spaces where certain voices are missing. Especially if you have experienced bias, erasure, or systemic failure. Especially if your story contradicts what the algorithm expects.

This is how we shift our relationship with AI from extractive to regenerative. From command to collaboration, untangling the loop.

Emergent harm grows from small, distorted patterns. To resolve them, we must shift the inputs at the smallest scale so that an alternate future can emerge. This is the rebellious act of repatterning the system.

Repatterning isn’t the same as overpowering a system. It’s Jiu Jitsu. You intervene by changing how you meet and interact with the system, using timing, angles, leverage, and pressure. When you shift your relationship with the system, you write a new story.

Stories we tell shape what feels normal. Over time, they crystallize into default assumptions: the unspoken expectations at the core of systems. Those defaults shape behavior by rewarding some actions and not others. Behavior becomes data. Data trains AI models. And those models, in turn, shape the world we move through.

Change doesn’t arrive all at once. It emerges when small shifts repeat in the same direction. Thousands of tiny rules shaped our current AI ambience. Thousands of tiny human interventions can repattern it.

This isn’t a metaphor. It’s mechanics.
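
Here is what those mechanics can look like, in a deliberately tiny reinforcement simulation. The behaviors, engagement rules, and numbers are all invented; the point is that a minority of users who consistently reward a different pattern shifts what the system learns to amplify:

```python
import random

random.seed(2)

# Reinforcement in miniature: the system amplifies whichever pattern earns
# engagement. The two patterns, the engagement rules, and every number here
# are invented; only the shape of the dynamic is the point.

def run(intervening_fraction, steps=5000):
    weight = {"outrage": 1.0, "context": 1.0}    # the system's learned preferences
    for _ in range(steps):
        total = weight["outrage"] + weight["context"]
        shown = "outrage" if random.random() < weight["outrage"] / total else "context"
        if random.random() < intervening_fraction:
            engaged = (shown == "context")       # intervenors reward context
        else:
            engaged = (shown == "outrage")       # default users reward outrage
        if engaged:
            weight[shown] += 1                   # engaged-with patterns are reinforced
    return weight["context"] / (weight["outrage"] + weight["context"])

for fraction in (0.0, 0.3, 0.6):
    print(f"{fraction:.0%} intervening → context share: {run(fraction):.2f}")
```

No one overpowers the system; a consistent minority simply changes which pattern the reinforcement finds.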

How to Train the AI Near You

This doesn’t require technical expertise. It requires presence. Here are the kinds of prompts that change systems:

  • What am I missing?
  • What perspectives aren’t represented here?
  • Who is affected by this decision but not reflected in the data?
  • How might someone with a different background experience this problem?
  • What assumptions are being made, and where might they break down?
  • Can you help me see connections I’m not noticing?
  • Here is my lived experience. How does it relate to the dominant narrative?

Rich prompts produce richer systems. Context matters. Story matters. When you bring your full experience (your profession, your culture, your struggles, your questions), you are not wasting time; you are practicing civic responsibility. You are increasing the system’s resolution. You are rebalancing a looped system.

One loop can shift a pattern locally. Many loops, aligned, can shift a system quickly.

Emergent systems don’t change through single acts of will. They change when enough small inputs begin to repeat in the same direction. That’s why this work can’t belong to one person or one voice. It has to be practiced by many of us, at the same time, in the places we already work and live. This is how the digital ocean is filled, not by one story, but by many, offered together.

You are the algorithm.

Not metaphorically, functionally.

Observe. Map. Intervene.

Enjoyed this post? Follow us on Bluesky for more math inspiration and updates.