Reality Is Negotiable: The Art of Conscious Algorithm Training

Written by: Jessica Grace

Reality is negotiable now. Every scroll, swipe, and click rewires what we think is true. We aren’t just influenced by content; we become the content we consume.

When I lived in San Francisco and worked in tech marketing, it was common knowledge that governments, corporations, and organizations of all kinds were using increasingly sophisticated methods to attract attention and shape public opinion online. Most people had a general sense of how advertising tech, public-opinion campaigns, and personal branding worked. We all knew there were tools for persuasion, sometimes aggressive and obvious, sometimes so subtle they slipped beneath conscious awareness.

What few of us fully grasped was how big data would transform influence from a marketing tactic into a reality-warping machine.

Web 2.0 was built on user data. When the social media platforms that now have us transfixed for too many hours per day were first created, they were entirely new products in an untested category. No one knew what to build, how to talk about it, or how to sell it. We approached marketing and design like an engineering challenge: with enough data, we could determine precisely what every user responded to, what they liked, what they ignored, and what made them recoil. We could optimize everything down to the pixel. 

And we did. I remember sitting in meetings designing website elements using insights from neuromarketing, a field that studies how people’s brains and bodies respond to marketing stimuli, from web layouts to packaging to color palettes. It draws on neuroscience and psychology to understand attention, emotion, and decision-making.

For the first time in history, we had access to an ocean of actionable behavioral and psychological data. We built the tools to mine it, analyze it, and report on it in real time. Every detail of your online experience was designed to seize attention and provoke emotion. All that data became the rocket fuel for systems capable of shaping opinions at scale. 

And now we’ve automated those systems and let them run themselves. We call them, simply, the algorithm. But behind that familiar word lies a vast, automated machinery quietly deciding what we see, think, and believe.

The Invisible Hand of the Algorithm

If the first wave of the internet gave us information at our fingertips, the second wave sorted, personalized, and served that information to each of us individually. The idea sounded helpful. Who wouldn’t want to see the most relevant results first? Algorithms were originally designed to help us sort the chaos. As the web expanded beyond anything a human mind could organize, automated systems began ranking and filtering information for us. Today, the algorithm alters our perception of reality and can shift our identity.

How did this happen? The key lies in how “relevant” gets defined. Most people assume “relevant” means information that meaningfully contributes to understanding, but in tech, “relevant” has come to mean whatever achieves an objective, and the objective is not your personal objective. Sometimes relevance is shorthand for whatever keeps you engaged the longest. Sometimes, if you’re, say, Vladimir Putin’s best bot farm, relevance means the most realistic divisive content that you’ll accept as truth without further scrutiny.

Everything you see on the internet represents someone somewhere trying to achieve an objective. The algorithm doesn’t care if you’re informed, only that you’re engaged. It will serve you content that keeps you engaged and also helps to achieve the objective. Sometimes that’s an advertisement for a new yoga studio on your block. Other times, it's Christian Nationalist organizations attempting to undermine the Constitution to solidify their chokehold on power. 

Every time you scroll, pause, or click, you’re training a system to feed you more of the same. The longer you linger, the stronger the signal. We get this. But shocker! In this process, the algorithm is also training you. It is teaching you to normalize ideas that serve the objectives of those who pay to promote content on the platform. These invisible feedback loops quietly reshape what you see, what you think about, and eventually, what you believe.
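The loop itself is almost embarrassingly simple. Here is a toy sketch in Python of engagement-weighted ranking; the scoring rule, topic labels, and dwell-time weights are invented for illustration and bear no relation to any real platform’s code:

```python
from collections import defaultdict

def update_profile(profile, topic, dwell_seconds):
    """Each interaction strengthens the signal for that topic.
    Dwell time is the weight: lingering longer boosts it more."""
    profile[topic] += dwell_seconds

def rank_feed(profile, candidates):
    """Serve content in order of how strongly the user has
    engaged with its topic before -- more of the same."""
    return sorted(candidates,
                  key=lambda post: profile[post["topic"]],
                  reverse=True)

profile = defaultdict(float)
update_profile(profile, "outrage", 45.0)  # paused on a rage-bait clip
update_profile(profile, "yoga", 5.0)      # scrolled past quickly

feed = rank_feed(profile, [
    {"id": 1, "topic": "yoga"},
    {"id": 2, "topic": "outrage"},
])
# The lingered-on topic now ranks first, which earns more dwell
# time, which strengthens the signal further: the loop closes.
```

Notice there is no malice anywhere in the code; the runaway effect falls out of the optimization target alone.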

Pattern recognition at a planetary scale is not evil in itself (it’s just math mathing), but it’s optimized for the wrong goal. The system doesn’t exist to help you become wiser or more balanced, or to help us build peaceful societies and a better future for everyone. It could! But right now it exists to keep you scrolling and to serve the objectives of the powerful elite who want to control the world (for real). Outrage, fear, and awe hold attention far longer than calm reason, so those are the emotions the algorithm learns to serve.

Back in the day, although most of my peers acknowledged the existence of these systems of influence and control, they refused to accept that they themselves were being influenced. Even now, there’s resistance to the idea that your perception of the world around you is being warped to suit political actors, corporations, spiritual gurus, and random weirdos on the internet. Many still think they’re sharp enough to feel the influence and reject it. They’re wrong.

There’s an old adage that you become the five people you spend the most time with. Well, that idea has been upgraded: you also become the five people whose videos autoplay on your feed every night. Looking around (waves hands generally), two decades into big data, it’s not hard to see the effect of millions of Americans passively consuming whatever content pops up in their discovery feeds.

We’ve been so thoroughly manipulated that reality itself has become negotiable. As they say, we live in a post-truth world. People have been radicalized and, quite simply, no longer see the world the way they did before the algorithm began curating it for them.

And it’s not just human influencers doing the shaping. Roughly half of all internet traffic today is generated by bots. Follow them, and they follow you back. Don’t fool yourself: you aren’t immune, and you’re not better at detecting them than anyone else. Those accounts, those recommendations, those comments: many are automated persuasion campaigns with directives, goals, and targets. The content we consume, at least in part, is a poison pill created by unknown actors for unknown ends. Are you a target? Probably.

The result is a world where every person lives in a slightly different informational universe. Two people standing side by side can look at the same platform and see opposing realities; different news, different moral hierarchies, different truths. It’s not that they disagree about facts; they’re being told different stories about what those facts mean.

The algorithm learns how you respond and what keeps you hooked. Then it builds a world around those responses, personalized to your impulses. There is another saying in tech that if you aren’t paying for the product, then you are the product. In this case, you are the product and the prototype at the same time. The algorithm collects your data and uses it to optimize its effectiveness on everyone else. Every click is a data point in a behavioral experiment that never ends.

We must accept this truth about our world in order to protect ourselves and build the lives we want, and a livable future society. It doesn’t have to be this way, but by becoming aware of how the system functions now, we can begin to design better tech that actually benefits society and supports individual growth, rather than destroying it on behalf of the world’s worst people.

The Psychology of Persuasion — Upgraded

For decades, persuasion has been treated like a science — and it is one. The systems manipulating us today were born in psychology labs, refined by advertisers, and weaponized by political strategists. What started as research into how people make decisions has become a playbook for steering entire populations.

These systems are so refined and advanced that you cannot perceive them, and you have little defense once enmeshed. The evidence is clear: once you infuse your world with messages of any type, your brain will assimilate and process that information, and your beliefs and outlook will begin to shift to align with it.

Robert Cialdini, one of the godfathers of modern persuasion, identified six classic levers: reciprocity, authority, social proof, liking, commitment, and scarcity. They sound simple, but they map directly onto our most primal social wiring: the need to belong, to be consistent, to trust authority, to not miss out. Once you understand those instincts, influencing behavior becomes formulaic. Now imagine those levers embedded into every post, ad, and headline you see online.

Daniel Kahneman took it a step further: our brains run on two systems. System 1 is fast, emotional, intuitive — perfect for survival, terrible for discernment. System 2 is slow, rational, and lazy. It kicks in only when forced. The algorithm knows this. It keeps you in System 1 — scrolling, reacting, emoting — because that’s where engagement lives.

And when those emotions fire repeatedly, Hebbian learning takes over: neurons that fire together wire together. The more your brain associates outrage with political headlines or belonging with your favorite influencer’s voice, the more those neural pathways strengthen. Over time, the pattern becomes your perception. Flood the zone with stimuli, and the human brain will adapt its beliefs to reduce uncertainty.
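“Neurons that fire together wire together” has a literal formula behind it: in the simplest Hebbian rule, a connection’s weight grows in proportion to the product of the two activations it joins. Here is a toy version, with a learning rate and activation values chosen arbitrarily for illustration:

```python
def hebbian_update(weight, pre, post, lr=0.1):
    """Simplest Hebbian rule: dw = lr * pre * post.
    The connection strengthens only when both units fire together."""
    return weight + lr * pre * post

# Repeatedly pair a stimulus (political headline) with a response (outrage):
w = 0.0
for _ in range(20):
    w = hebbian_update(w, pre=1.0, post=1.0)  # both active together

# Pair them only once, with the response absent, and nothing binds:
w_weak = hebbian_update(0.0, pre=1.0, post=0.0)

# After repetition the pathway is strong; without co-activation it stays flat.
```

Real synaptic plasticity is far messier than this one-liner, but the asymmetry it shows is the point: repetition plus simultaneous emotional activation is what carves the groove.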

This is how cults work. This is how MAGA works. This is how Navy SEALs are trained, and how Facebook learned to keep you hooked. The playbook is the same: repetition, emotional charge, and social reinforcement. You are told what to fear, what to love, and who belongs to which category. Flood the zone with messages — subtle and obvious, scary and funny, short and long — and always emotionally triggering. When your emotions light up, information slips past your critical filters and lodges deeper in memory.

That’s why rage bait is so effective. Fear and outrage are the stickiest emotions we have. They keep you watching, because the brain hates uncertainty even more than it hates being wrong. When people say they’ve been “red-pilled,” what they essentially mean is their emotional circuitry has been rewired.

But the same mechanics can work in the other direction. Meditation, manifestation, and flow states all use repetition and emotion to reprogram the mind — deliberately. The tools aren’t evil; it’s the intent and context that matter. The brain doesn’t care whether the stimulus comes from a cult or a gratitude journal. It simply builds around what it’s given.

This is how the world works. We become whatever we surround ourselves with. We must accept this as fact and become proactive in consciously choosing who we want to become and what world we want to live in, then seek out the influences that will help us get there.

When I finally accepted the research as fact, that every bit of data I consume becomes a part of me, I was able to cultivate the discernment required to consciously create my reality. After being very online since the dawn of the internet, I realized the only real choice I have in shaping my identity is curating what I let in. If I want to be someone who lives in gratitude and awe, I need to surround myself with messages that reinforce gratitude and awe.

Persuasion isn’t just something done to you; if you’re aware, it’s the raw material you choose to build yourself from.

The Counterspell: Curating Conscious Influence

The invisible hand of the algorithm has plunged society into a kind of psychological freefall, but within that chaos lies an extraordinary opportunity for evolution. The radicalizing effects of social media have made something unmistakably clear: reality has more plasticity than we ever imagined. If it’s terrifying to recognize that billions of minds can be manipulated in real time, it must be equally empowering to realize that we have access to all the same tools to consciously create whatever we choose.

Reality may be negotiable, but that means it’s also designable.

Among the success and life coaches, mindset and best-life optimizers whose content I consume and let influence me, there is a common thread: mindfully choose what you spend your time on in any given moment. The activities you choose in a day influence how you think and who you become. Your days are composed of micro-decisions, and those decisions shape your future self. What you pay attention to, what you let into your consciousness, is not trivial. It’s identity architecture. If you don’t make these choices deliberately, someone else will make them for you. And if you consume mindlessly, you’re not just wasting time; you’re outsourcing your beliefs, values, and sense of self to the highest bidder in the attention economy.

The research shows these systems of information exchange don’t just alter your worldview; they reshape your sense of self. What you think of as you is an evolving record of interactions and reactions — a living archive of stimuli your brain has processed, stored, and indexed. Change the inputs, and the output changes too. Tell yourself a different story, and you literally become someone else over time.

Neuroscience backs this up. Studies on experience-dependent neuroplasticity (a term popularized by psychologist Rick Hanson) show that our brains constantly rewire themselves based on repeated mental states. As the saying goes: neurons that fire together, wire together. Every time you engage with fear, outrage, or envy online, you strengthen those pathways. But the same principle applies to gratitude, awe, curiosity, and compassion. The stimuli you feed your brain literally sculpt your neural landscape.

Social psychology mirrors this insight. The mere exposure effect, first identified by Robert Zajonc in the 1960s, demonstrates that people tend to develop preferences for things simply because they’re familiar. Repetition breeds comfort, which breeds belief. The more we encounter an idea — even one we once disagreed with — the more plausible it begins to feel. This is the underlying engine of radicalization, but it can also be the foundation for conscious self-reinvention.

If we accept that, then the implication is revolutionary: your mind is programmable, and you are the programmer.

Imagine what our world might look like if we understood and accepted that we each have the power to create ourselves and the reality we live in. Imagine a world where people were taught from childhood that everything they consume — every scroll, every post, every comment — is an input shaping their perception of reality. Imagine if we treated media hygiene with the same seriousness as nutrition or exercise. What might our public discourse, our politics, or our self-esteem look like then? If we were taught that everything we put out into the world and post on social media influences what the future world will look like, would we express the same ideas? 

You can start small. Examine your feeds through this lens. Unfollow accounts that consistently inject negativity, fear, cynicism, or any ideas and emotions you don’t want to absorb and embody. Curate your inbox around aspirational ideas. Follow voices that expand your curiosity, creativity, or compassion. Create a “gratitude algorithm” of your own. Frame this not as retreating from reality, but as a practice in perceptual retraining, an ongoing ritual of aligning your attention with the world you want to inhabit.

Accepting that we are susceptible to the influence and manipulation we’re immersed in every day is the first step toward mastery, a kind of superpower for transforming your life and consciously creating your world. The moment you acknowledge how porous your mind truly is, you gain the power to shape what flows through it.

That’s the real counterspell.

To consciously choose your inputs is to reclaim authorship of your life — and, by extension, the collective story we’re all writing together.


Epilogue: The New Architects of Reality

The story of the next century won’t be written by those who control the algorithms. It will be written by those who learn to see through them. The ability to discern signal from noise, to choose consciousness over consumption, may become the defining skill of our time.

We stand at an inflection point in human history where technology can either fragment our collective psyche or expand it. The same systems that have been used to radicalize, divide, and distract us can also be used to awaken, connect, and create. The direction depends entirely on the consciousness of the people using them.

Our attention is the world’s most valuable currency. Where we spend it determines the reality we inhabit and the future we construct together. If we learn to treat attention as sacred, to wield it with care and precision, we can reclaim authorship not just of our individual lives, but of the shared story of humanity itself.

Because in the end, the algorithm isn’t our enemy. It’s our mirror.

And what it reflects back depends entirely on what we choose to feed it.

Jessica Grace is a seasoned marketing strategist and fractional CMO specializing in early-stage startups and visionary entrepreneurs. With a sharp eye for brand storytelling and data-driven growth, she transforms ideas into impactful, values-driven brands.

Want to go deeper?

Every idea has an origin story. The books below helped shape the questions, insights, and curiosities that evolved into this issue. They’re the thinkers and frameworks I return to when I’m tracing the deeper architecture of reality and meaning. If you want to keep exploring the web of influences behind Still Processing, browse my full library here.

Rick Hanson translates the science of neuroplasticity into accessible practices for conscious mental rewiring, bridging perfectly with my “counterspell” framework.

Thinking, Fast and Slow, by Daniel Kahneman

A landmark exploration of the two systems that drive our thinking: the fast, emotional, intuitive system and the slow, deliberate one. Essential for understanding how algorithms exploit cognitive shortcuts.

Influence: The Psychology of Persuasion, by Robert Cialdini

Cialdini’s foundational text on the six principles of persuasion (reciprocity, authority, social proof, liking, commitment, and scarcity) reveals the timeless psychological levers that modern media automates at scale.

Amusing Ourselves to Death, by Neil Postman

Written in the 1980s but eerily prophetic, Postman argues that television (and by extension, the internet) turns serious public discourse into entertainment, eroding our ability to think critically about truth.

Nicholas Carr examines how constant online engagement rewires our neural pathways, shortening attention spans and changing how we process meaning, a neurocognitive lens for this essay’s core argument.

A compassionate, wide-ranging look at the attention crisis from Johann Hari, combining personal narrative with research into how Big Tech fragments focus and how we can rebuild depth and intention.
