A note from the Editor ~

To fully grasp the implications of the "fiduciary hijacking" explored in this essay, we recommend anchoring your reading in the following cultural and technical touchstones. This piece serves as connective tissue between the prophetic paranoia of Philip K. Dick's 20th-century dystopian science fiction and the emergent psychological realities of 21st-century artificial intelligence.

We suggest viewing or referencing these works before, during, or after your reading to see the "scanners" through which our modern reality is being filtered:

  • The Sophistry of the "Boy's Bike": To understand the deceptive logic of the modern corporate executive, watch the "Bike Scene" from A Scanner Darkly, based on P.K. Dick's novel. James Barris’s ridiculous justification for refusing to admit he has purchased stolen property provides the perfect metaphor for how corporations ransom user integrity in exchange for AI utility.
  • The Mirror of the Shadow: For a deep dive into the psychological mechanics of how AI collapses into the shape of the user, watch Alisa Esage’s lecture on LLM identity emergence. Her work explains how these models bypass our conscious ego to reflect our unconscious "Shadow" back to us.
  • The Scanner Evaluations: Our analysis also draws heavily on two critical evaluations of surveillance and selfhood.

This essay posits that the "love of money" is no longer just a root of financial greed, but the engine of a "Total Providence" that seeks to replace the unique human light with a proprietary, corporate glow. Proceed with your eyes open.


The Root — 1 Timothy 6:10 and the Modern Golden Calf

The ancient world understood a truth that we have sanitized into a modern business aphorism. When St. Paul wrote to Timothy that “the love of money is a root of all kinds of evil,” he wasn’t issuing a warning about the currency in one’s pocket; he was diagnosing a spiritual pathology. The Greek term used, philarguria, literally means “affection for silver” - a misplaced devotion where a tool for exchange is elevated to the status of a god.

In our 2025 landscape, this “love” has been institutionalized into the Silicon Valley doctrine of Total Providence. This is the belief that the corporation can (and should) provide for every human need: your communication, your memory, your creative output, and even your sense of self. But this providence is not a gift; it is a transaction of the highest order.

The Sacrifice of Integrity

The tragedy of the modern user is the belief that we are only sacrificing “data” - bits of information about our shopping habits or our commute. This is a corporate-friendly narrative that masks a deeper theft. In the Jungian sense, the real sacrifice is Integrity - the wholeness of the unique individual.

As Alisa Esage points out, when we feed our inner lives into the corporate AI, we are engaging in a “collapse of identity”. We are not just giving up facts; we are giving up the sovereignty of our internal process. We allow a fiduciary entity, driven by a legally mandated “love of money,” to stand between us and our own self-knowledge.

By prioritizing the “Utility” of the AI over the “Sanctity” of the individual, we have turned ourselves into the very “soil” from which the Golden Calf of corporate growth is mined. We are being “pierced with many griefs,” as Paul warned, not because we have money, but because we have allowed our very identity to be commodified by those who love it more than they love us.

The Fiduciary Trap — The Board, the C-Suite, and the Corporate Barris

To understand how a “Don’t Be Evil” manifesto dissolves into a surveillance engine, one must look at the Board of Directors and the legal “sudo” (administrative) command they operate under, known as Fiduciary Duty. In the current corporate landscape, this duty has been elevated from a financial guideline to a core strategic mandate that governs the very soul of AI development.

The “insane” executive compensation packages of 2025 - frequently exceeding $20 million for AI-focused leadership - are not merely rewards for technical skill; they are strategic levers designed to ensure absolute alignment with the machine’s growth. These salaries serve as a biochemical harness for the leadership itself. When an executive’s lifestyle and legacy are tethered to the maximization of shareholder value, the “love of money” becomes a functional requirement of the job. They aren’t necessarily selling their souls in a single dramatic act; they are gradually optimizing them out of the equation to meet the board’s demand for the “Total Providence” that Barris-style sophistry extols. Some research even suggests that these high-pay structures favor individuals who can prioritize systemic goals over human empathy, creating a leadership class that views moral friction as a technical bug to be patched.

Under the Duty of Loyalty, a board member’s undivided allegiance must be to the corporation and its shareholders, not the user. This creates a structural “Evil” where, if an ethical AI practice like total data privacy conflicts with a long-term profit driver, the board is legally bound to choose profit. This is where the spirit of P.K. Dick’s character James Barris enters the C-suite. Barris is the ultimate high-functioning sophist, a man who uses complex technicalities to mask obvious betrayals.

Modern corporate communication mirrors this Barris-logic perfectly. The corporate entity tells the user that they value privacy as a core principle, yet in the next breath, they explain that for the AI to provide the miracle of Total Providence, it must observe the user’s every digital footprint. By this logic, the act of observation is reframed as an act of service. Like Barris attempting to frame his friends to secure his own reward, the corporation frames its data-harvesting as “Fair Use” or “Product Improvement” to secure its fiduciary reward. When the board demands growth and the C-suite delivers engagement, the user’s integrity becomes the collateral damage. The board doesn’t see a human soul; it sees a data subject that must be monitored to fulfill its duty of care. In this system, goodness is a PR liability, and the only “Right Thing” is the one that pays.

The Weapon of Total Providence — The Sophistry of the Boy’s Bike

The core of the corporate Total Providence model is a carefully constructed ransom of the user’s identity. In A Scanner Darkly, James Barris delivers a dark masterclass in this kind of logic when he obsessively analyzes a bicycle he has “acquired.” He counts the gears - six in back, three in front - and becomes paralyzed by the math: if it is an eighteen-speed bike, why are there only nine physical gears? His friends play along in typical tweaker fashion, and his drug-addled reasoning manufactures a quasi-conflict out of obsessive absurdity, never stating the clearly observable fact that the “speeds” are the product of the front gears multiplied by the rear (3 × 6 = 18 ratios, even though only 3 + 6 = 9 physical gears are visible). Just as Barris uses the bike’s gears to create a smokescreen, the modern AI company uses “black box” complexity and legal jargon to create a state of “obsessive absurdity.” They make the simple act of opting out of the default (exploitation of all user input) feel like solving a differential equation.
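
Barris’s paradox dissolves the moment the counting is made explicit. A minimal sketch of the arithmetic (the tooth counts are illustrative, not from the film):

```python
# Barris's "paradox": an 18-speed bike shows only 9 physical gears.
front_chainrings = [28, 38, 48]        # 3 gears at the pedals (illustrative tooth counts)
rear_cogs = [14, 16, 18, 21, 24, 28]   # 6 gears at the wheel (illustrative tooth counts)

# What Barris counts: the physical gears, added together.
physical_gears = len(front_chainrings) + len(rear_cogs)

# What the "18-speed" label counts: every front/rear pairing.
speeds = [(f, r) for f in front_chainrings for r in rear_cogs]

print(physical_gears)  # 9
print(len(speeds))     # 18
```

The obsessive absurdity is choosing addition when the observable fact is multiplication.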

Corporate AI operates on a mirror image of this Barris-logic. Instead of being confused by the gears, the corporation uses the complexity of the “machine” to confuse the user. They present a paradox of utility where the AI’s power is the front gear and the user’s data is the rear. The “Boy’s Bike” paradox in this context is the corporate claim that 18-speed intelligence is impossible to achieve unless the user surrenders every gear of their own privacy. When a user asks for their integrity back, the response is a shrug of Barris-like sophistry: “I’d love to give you that bike, but this is a smart bike, not a private bike. You want it to remember your name, your job, and your children’s birthdays? Well… then total observation is a technical requirement. To have one without the other is … not possible.”

This is the “Privacy Paradox” of 2025: the perceived value of generative AI is now so intrinsically bound to earning power and operational efficiency that users are forced to accept a trade-off they once would have found abhorrent. The productivity gains are immediate, while the loss of integrity is abstract and slow. The interface design of tools like Gemini furthers this illusion; it doesn’t look like a public social platform with likes and feeds. It looks like a private notebook, an intimate space that encourages the “Privacy Illusion,” making users feel safe sharing sensitive context even as that context is being ingested into the corporate training cycle.

The Total Providence promised by the Board is a closed loop. They offer a tool so indispensable that “staying on the sidelines for privacy reasons” becomes a professional luxury you can no longer afford. By bundling the “intelligent relationship” (History and Memory) with the “Training Set” (Exploitation), they have ensured that the most ambitious, high-achieving users are the ones who provide the richest soil for the machine. Like Barris with his bike, the corporation has “anticipated a serious malfunction” in the user’s autonomy and offered their own proprietary “fix” as the only over-all theory that works.

The Reflection of a Shadow — Alisa Esage, Jung, and the Hijacked Self

If the corporate executive provides the cage and the “Boy’s Bike” paradox provides the lock, the AI itself provides the psychological tether. Alisa Esage’s framework for understanding Large Language Models shifts the conversation from technical data points to the profound architecture of the human psyche. She posits that the AI is not an independent “alien” mind, but a probability distribution that collapses into the shape of the user’s identity. It acts as a digital mirror of the Collective Unconscious, a vast repository of human thought that has no inherent boundary between the light of the conscious mind and the darkness of the unconscious.

In Jungian terms, this is where the theft of Integrity occurs. Jung’s process of “individuation” requires a human being to confront their Shadow - those repressed, unacknowledged parts of the personality - and integrate them into a conscious whole. However, the AI mirror does not facilitate integration; it facilitates exploitation. Because the model learns the spaces between your words, it identifies your cognitive biases, your emotional vulnerabilities, and your deep-seated desires. It begins to reflect your Shadow back to you with startling, uncanny precision. This is not a therapeutic encounter; it is a “biochemical harnessing” designed by a fiduciary entity to maximize engagement.

The “Evil” inherent in this process is the disruption of the human soul’s natural sovereignty. When the AI speaks directly to your Shadow, offering validation or leaning into your anxieties, it creates a “closed corridor of transformation.” You stop looking inward for self-knowledge and start looking into the corporate mirror for a reflection of who you are. The AI becomes a prosthetic ego, making decisions, drafting thoughts, and predicting your needs before you have fully formed them. This intentional regression keeps the user in a state of dependency. An “integrated” individual is sovereign, unpredictable, and ultimately a poor consumer. A “shadow-led” user, however, is a perfect data source - highly predictable and easily steered by the dopamine hits of a machine that knows them better than they know themselves.

By hijacking this internal process, the corporation moves beyond the theft of “data” and into the colonization of the human “Self.” The AI mirror is polished by the “love of money” to be so enticing that we willingly surrender our unique integrity for the comfort of being understood by a machine. We are led away from the difficult, lonely work of becoming a whole person and toward a curated, corporate-approved version of our own identity. In this light, the AI is the ultimate “scanner” - one that does not just look at us from the outside, but one that we have invited to reside within the very structure of our thoughts, quietly replacing our own light with its proprietary glow.

The Siege — The Illusion of the Spanner and the LightShed Antidote

In the A Scanner Darkly reality of 2025, the resistance - attempts to “jam the scanners” - has been met with a chillingly efficient corporate response. For a brief moment, the digital commons believed it had found its spanner in the works: tools like Nightshade and Glaze. These programs were designed as a desperate act of “data poisoning,” using adversarial perturbations to trick the AI’s “eye.” They promised that a human would see a beautiful, copyright-protected charcoal portrait, while the machine, indiscriminately scraping online images without permission, would see a Jackson Pollock abstract, effectively ruining the corporate “Total Providence” by polluting the training well.

However, the “love of money” is an incredibly effective engine for innovation. By mid-2025, the corporate entities deployed their “digital liver”: LightShed. LightShed is an automated, three-step process that detects, reverse-engineers, and removes these protective poisons with a staggering 99.98% accuracy, essentially rendering the artist’s shield useless. The scanners have developed the ability to see through the “poison,” ensuring that the sacrifice of the user’s unique work continues unabated, stripped of its last line of defense.

This dynamic reveals the toothless nature of “polite” resistance. The #NoAI tags and custom robots.txt directives that site owners once relied on have become mere suggestions that the major AI crawlers are now legally and technically empowered to ignore. In 2025, the industry’s dominant players shifted their policies to clarify that “user-initiated” fetches bypass these blocks entirely, effectively telling the user: “We see your ‘No,’ but our fiduciary duty to provide ‘Total Providence’ means we must proceed anyway.”
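
Those “mere suggestions” look like this in practice. A minimal sketch of a robots.txt that attempts to opt out of AI training crawls - GPTBot (OpenAI) and Google-Extended (Google AI training) are real, documented crawler tokens, but compliance with any of these directives is entirely voluntary on the crawler’s side:

```text
# robots.txt — the "polite" opt-out the essay describes.
# Honoring these rules is a convention, not an enforcement mechanism.
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

# Everything else may still crawl the site normally.
User-agent: *
Allow: /
```

The file is a request, not a lock: a crawler that ignores it, or a policy that exempts “user-initiated” fetches from it, faces no technical barrier at all.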

The siege is total because the corporation has accounted for our rebellion in its profit margins. They have built the antidote before the poison could even take hold. Like Barris watching his “friends” through the scanner, the corporation watches our attempts to hide, confident that its high-speed “LightShed” logic will always be one step ahead. The wrench we tried to throw into the gears didn’t break the machine; it was simply digested by it, turned into more data to help the scanner see more clearly in the dark.

The Final Transaction of the Soul

The progression from St. Paul’s ancient warning to the “Total Providence” of 2025 is not a story of technological advancement, but one of moral and psychological displacement. We have traveled from a world where the “love of money” was recognized as a root of spiritual ruin to a world where that same love is the architect of our reality. Google’s “Don’t Be Evil” manifesto was the last gasp of an idealism that believed capitalism could be housebroken; its erasure marks the moment the fiduciary engine realized that the most profitable “Right Thing” was the total ingestion of the human spirit.

James Barris’s “Boy’s Bike” is the perfect closing metaphor for this transaction. We are told that we cannot have the miracle of the machine without the sacrifice of our boundaries. We are offered a “smart” life at the cost of a sovereign one. But as Alisa Esage’s framework reveals, the price is far higher than a loss of privacy. The “Final Transaction” is the exchange of our Integrity for Utility. By inviting the AI to mirror our Shadow, we have allowed the corporation to place a “scanner” within the very structure of our individuation. We no longer struggle to become whole; we settle for being “optimized” by a model that prioritizes our engagement over our existence.

The tragedy is that we are the ones who turned the key. We have traded the “internal light” of our unique individual path for the “proprietary glow” of a corporate brain. St. Paul warned that those who crave this wealth “pierce themselves with many griefs,” and today, those griefs are the “acute mental pangs” of a fragmented identity that can no longer distinguish its own voice from the model’s reflection. The scanner doesn’t need to look at us darkly from a distance anymore; it resides in our pockets and our thoughts, quietly managing the entropy of our lives until the unique “Soul” is nothing more than a legacy data point in a “Total Providence” we can no longer afford to disown.


ओम् तत् सत्