
Yellow-Red-Blue, by Wassily Kandinsky
Small preface: This is an essay about bubbles—both economic and personal. The AI–semiconductor rally has become the defining financial story of the decade, and this piece explores why that market mirrors something far more familiar: the inflation of the self in a hyper-individualistic culture. Here, I trace how what we believe becomes what we value, how stories take on real economic weight, and how acting certain influences both markets and our own identities. Whether you’re finance-coded or simply trying to understand the world we’re living in, this piece offers a way to read both economic systems and emotional ones with sharper, more honest precision.
Debates about AI and markets are saturated with talk of bubbles. Commentators argue over whether valuations have truly detached from fundamentals or simply reflect the repricing of a transformative technology. Rather than debating whether a bubble exists, I use “bubble” as a lens for moments when narratives outrun reality. Just as financial bubbles inflate in markets, we see a similar inflation in the self: identity, persona, boundary – each overvalued relative to relational reality.
The current AI–semiconductor boom and our culture of radical individuality share this structure. Both turn comforting narratives into inflated valuations, economic in one case and psychological in the other. In both markets and the self, narrative momentum – amplified by individualism – pushes things beyond what fundamentals or lived behaviour can sustain.
This bubble discourse is itself a product of an individualistic culture. Its clearest financial expression is on the investor side. The same social logic that emphasises self-containment, parasocial allegiance, and optionality over obligation encourages investors to buy narratives over cash flows. Markets mimic these habits: we follow confident crowds when uncertainty strikes, mistake repetition for truth, and pay now for the illusion of control, settling with reality later. We do not just price future earnings; we price the psychological relief that comes from feeling “decided.” That additional price is a “sentiment spread”: a premium paid to close uncertainty before systems are ready – much like we seek relief by asserting clarity about our identities before they are fully formed (Baker & Wurgler, 2006; Horton & Wohl, 1956).
In markets, that overpricing is already visible in hard numbers. In the first half of 2025, AI-related capital expenditure contributed about 1.1 percentage points to U.S. GDP growth, meaning investment in AI outpaced the consumer as a driver of expansion (J.P. Morgan Asset Management, 2025). At the same time, industry analysts estimate that hyperscalers’ AI infrastructure spending will reach around $490 billion in 2026, with cumulative capex potentially exceeding $2.8 trillion by 2029: a vast concentration of economic activity funnelled into a single, long-duration bet (Reuters, 2025a). A large share of new economic energy is being committed on the assumption that today’s AI narratives will be validated tomorrow.
This extra weight reveals what is at stake: investors are paying for more than machines; they are paying for conviction.
This dynamic sets the stage for how belief itself is being priced. Markets today run on conviction as much as cash. A narrow circle of AI-platform companies has persuaded investors and governments that raw compute is the new oil, and that owning chip capacity is comparable to owning the future. That belief is priced as though it were already cash-generating.
The concentration of belief shows up clearly in NVIDIA’s role. In 2025, NVIDIA became the first public company to hit a $4 trillion valuation, accounting for roughly 7–7.5% of the S&P 500’s market value – an unprecedented weight for a single hardware supplier (Bloomberg, 2025). Owning “the market” increasingly means owning one AI-exposed chip vendor, and investors are treating a few quarters of explosive data-centre revenue as if they were a permanent feature of the landscape.
The circle is narrower than it appears. In late 2025, OpenAI and NVIDIA announced a strategic partnership: NVIDIA will supply at least 10 gigawatts of AI data-centre capacity to OpenAI, and has committed to invest up to $100 billion in OpenAI progressively as each gigawatt is deployed (OpenAI, 2025b; Reuters, 2025b). The first phase is slated for the second half of 2026. Meanwhile, in a separate multi-year agreement, OpenAI will purchase up to 6 gigawatts of GPUs from AMD (starting with a 1 GW deployment in mid-2026). As part of that deal, OpenAI receives a warrant to acquire up to approximately 10% of AMD’s common stock, contingent on deployment milestones and performance targets (Reuters, 2025b).
What is striking about the current AI–semiconductor boom is how financial flows, compute supply and infrastructure demand are entwined. NVIDIA is not merely a chip vendor but a committed investor in OpenAI, the very firm buying its hardware, while OpenAI in turn holds a warrant on up to roughly 10% of AMD’s stock. Meanwhile, cloud-infrastructure partners like Oracle (themselves major investors and hosts) buy more chips to meet rising AI demand. Money flows from investors into hardware, into data-centres, into AI firms – and partly back into the valuations of all those entities. It becomes a self-reinforcing loop: supply funds demand, demand funds supply, and shared optimism binds the two. In effect, capital is being recycled through a “compute-infrastructure-valuation” feedback cycle.
In these arrangements, buyers become owners, and suppliers become investors. The same few firms sit on both sides of these deals, so narrative and pricing reinforce each other in real time.
This is classic reflexivity: belief drives price, and price validates belief (Shiller, 2019). Overconfidence becomes the market’s love language to itself. It shortens horizons and turns speculation into identity. A few dominant quarters are treated as destiny; uncertainty becomes socially prohibited. We stop questioning outcomes and behave as though they have already arrived.
Markets mirror the individual: narrative replaces reality long before the infrastructure, or the person, actually exists. The same pattern, declaring something inevitable because we emotionally need it to be, is also how we talk about ourselves.
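The reflexive loop described above can be sketched as a toy two-variable model. Everything here is an illustrative assumption – the coefficients, the starting gap, the update rule – not a calibration to any real market; it only makes the compounding mechanism concrete: price chases belief, and any premium over fundamentals “confirms” the belief.

```python
# Toy model of reflexivity (after Shiller, 2019): belief drives price,
# and price validates belief. All numbers are illustrative assumptions.

def simulate(steps: int = 20,
             fundamentals: float = 100.0,
             start_belief: float = 105.0,
             chase: float = 0.5,      # how fast price moves toward belief
             confirm: float = 0.4):   # how strongly a premium inflates belief
    """Return the price path when price chases belief and any gap
    between price and fundamentals feeds back into belief."""
    price, belief = fundamentals, start_belief
    path = []
    for _ in range(steps):
        price += chase * (belief - price)           # price chases the story
        belief += confirm * (price - fundamentals)  # premium "confirms" the story
        path.append(price)
    return path

# A 5-point gap between story and fundamentals compounds, period after period.
path = simulate()
```

In this linear sketch, any positive `confirm` eventually compounds the premium; the model contains the inflation half of the story only, with nothing representing the moment fundamentals reassert themselves.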
Our appetite for certainty shapes both markets and internal narratives. We do not merely want good outcomes; we want outcomes that cannot go otherwise. Certainty becomes emotional yield. That is what turns devotion – of capital, attention, or allegiance – into duration risk: once we declare something inevitable, we bind ourselves to it longer than the evidence justifies.
Our culture rewards the performance of certainty. Founders onstage, politicians on the trail, ordinary people online all speak in inevitabilities. Individualism reframes insecurity as weakness: you must “know,” even when privately you do not. Desire is presented as fact.
Markets internalise this. When OpenAI says it will require planetary-scale compute, investors hear inevitability, not aspiration. Informational cascades take over: people copy the loudest conviction in the room (Bikhchandani, Hirshleifer & Welch, 1992). It is not just herd behaviour but ego-defence; nobody wants to be the lone voice admitting uncertainty.
The desire for relief plays an equally important role: individuals spend to soothe fear, markets do the same. The rush to finance enormous GPU campuses is driven not only by expected future cash flows but by the emotional reassurance that “we will not miss the wave.” Behavioural finance calls this “sentiment”: paying for exposure that soothes an emotional need (Baker & Wurgler, 2006). It is a closure premium.
This is duration blindness: compressing years of physical build-out (power lines, substations, cooling systems) into a psychological now. Nothing here suggests that AI infrastructure will not materialise, only that we treat a long construction process as though it already exists.
The build-out is not abstract. The AI campuses being promised now depend on new power lines, high-voltage substations and grid equipment that currently have multi-year lead times, in markets already facing transformer shortages and surging data-centre power demand (Reuters, 2025c; Reuters, 2025d). Regulators and macro-economic watchdogs (including the BIS) now explicitly flag that equity indices are being pulled to record highs by a narrow set of AI-linked mega-caps, even as questions remain about whether these vast capex commitments can earn back their cost on any reasonable horizon (Bank for International Settlements, 2025). The narrative collapses years of engineering and regulatory friction into a single, emotionally “secured” present.
In emotional terms, this is identical to treating a still-forming identity as fully realised because admitting uncertainty feels humiliating.
If duration risk concerns how long we stay bound to a story, scarcity concerns who is allowed inside it at all. The AI story is sold as critical infrastructure but also as exclusivity. Only a handful of firms – NVIDIA, OpenAI, Oracle, AMD, a few others – are framed as legitimate participants. Scarcity is the moat.
This logic has a psychological analogue: hyper-individualism moralises unavailability. To be rare is to be valuable; boundaries become brand. Emotional scarcity is treated as worth.
The industrial version is simply scaled up. Oracle’s dedicated campuses for OpenAI and NVIDIA’s role as both supplier and co-architect signal that access itself is a source of value.
Regulators and industry watchers are already uneasy with how entangled this ecosystem has become. The same small set of names now appears as supplier, customer, investor, landlord and preferred tenant across the largest AI projects. Deals like Oracle’s multi-billion-dollar “Stargate” contract with OpenAI, or NVIDIA’s proposed $100 billion investment alongside hardware sales, turn what should be arm’s-length relationships into a web of mutual dependence. When only three to five firms are deemed “real AI,” the system becomes emotionally and politically attached to their continued dominance; disruption at one company starts to read less like business risk and more like a threat to national infrastructure (OpenAI, 2025a; OpenAI, 2025b).
Publicly, these companies insist on optionality; privately, their commitments run deep. The performance is independence; the reality is entanglement. When the aftercare arrives (power constraints, grid bottlenecks, cost overruns), the moat stops looking like pure brilliance and starts looking like the depth of the relationships holding everything up. The same inversion appears in people who perform detachment while quietly depending on the very relationships they claim not to need.
The AI narrative – inevitability, supremacy, an untouchable moat – mirrors the hardened persona our culture encourages. We present as self-contained, already healed, already ascended. We treat distance as strength and scarcity as worth. We refuse dependence because dependence sounds weak.
But no system is weightless. AI infrastructure ultimately rests on unglamorous physics and unexciting logistics. So do people. A self that claims to be untouchable still runs on power drawn from others: attention, reassurance, recognition. What looks like autonomy is often infrastructure.
When investors speak of duration risk or regulators of concentration, they circle the same truth: the fantasy of the sealed, self-sufficient actor – corporate or personal – collides with the cost of maintenance. The myth of autonomy meets the reality of being kept alive. If a bubble exists, it is not only in AI, but in the confidence with which we perform self-sufficiency while quietly relying on others to hold the world up around us.
Works cited
Baker, M. and Wurgler, J. (2006) ‘Investor sentiment and the cross-section of stock returns’, Journal of Finance, 61(4), pp. 1645–1680.
Bank for International Settlements (2025) BIS Quarterly Review: September 2025. Basel: Bank for International Settlements.
Bikhchandani, S., Hirshleifer, D. and Welch, I. (1992) ‘A theory of fads, fashion, custom, and cultural change as informational cascades’, Journal of Political Economy, 100(5), pp. 992–1026.
Bloomberg News (2025) ‘Nvidia hits $4 trillion value as rally notches another milestone’, Bloomberg, 9 July.
Horton, D. and Wohl, R.R. (1956) ‘Mass communication and para-social interaction: Observations on intimacy at a distance’, Psychiatry, 19(3), pp. 215–229.
J.P. Morgan Asset Management (2025) Is AI already driving U.S. growth? Market Insights, 12 September. London: J.P. Morgan Asset Management.
NVIDIA Corporation (2025) NVIDIA announces financial results for third quarter fiscal 2026. Press release, 19 November. Santa Clara, CA: NVIDIA Corporation.
OpenAI (2025a) Stargate advances with 4.5 GW partnership with Oracle. Press release, 22 July. San Francisco, CA: OpenAI.
OpenAI (2025b) OpenAI and NVIDIA announce strategic partnership to deploy 10 GW of NVIDIA systems. Press release, September. San Francisco, CA: OpenAI.
Reuters (2025a) ‘Citigroup forecasts Big Tech’s AI spending to cross $2.8 trillion by 2029’, Reuters, 30 September.
Reuters (2025b) ‘AMD signs AI chip-supply deal with OpenAI, gives it option to take 10% stake’, Reuters, 6 October.
Reuters (2025c) ‘US utilities grapple with Big Tech’s massive power demands as data centers surge’, Reuters, 7 April.
Reuters (2025d) ‘US faces transformer supply shortfall as power demand surges, WoodMac says’, Reuters, 14 August.
Shiller, R.J. (2019) Narrative economics: How stories go viral and drive major economic events. Princeton, NJ: Princeton University Press.