Helge Sverre, All-stack Developer
Bergen, Norway
The Boiling Frog Map
March 25, 2026

Yesterday I wrote Alexander Wept, an essay about why developers feel hollow despite AI making everything possible. The core argument: it's not a scarcity problem (you ran out of things to build), it's an abundance problem (infinite possibilities make each one feel small). Plutarch diagnosed this 2,000 years ago. The fountain of tranquility is inside. No tool can clean it for you.

That gave me a framework. Then I tried to apply it to music, and it broke.

The music problem is different

I make AI music. Meme tracks under joke aliases, weird trance experiments, stuff that makes me laugh. I use Suno regularly and I'm not remotely anti-AI-music. But when I tried to map the developer crisis onto the music world, the shape was wrong.

The developer problem is about the maker's relationship to craft. You can build anything, and the act of building — which used to generate flow, meaning, identity — now produces less of those per hour because AI handles the hard parts. The IKEA effect: less effort, less ownership. The output works fine. It's just less yours.

The music problem is about the audience's relationship to other humans. Suno's own investor admitted she'd shifted most of her listening from Spotify to Suno — then deleted the post when someone pointed out it undermined their fair use defense. Suno users generate 7 million tracks per day. Many of them primarily listen to their own generated music. They are both the producer and the entire audience.

That's not the IKEA effect. That's something else entirely. When you listen to a song by another human, you're spending time inside someone else's emotional architecture — their grief, their anger, their joy, processed and expressed in a way that's foreign to your own experience. It's a low-key empathy exercise. Suno replaces that with a mirror. Every song confirms what you already feel, because you prompted it.

It's not an echo chamber in the social media sense, where algorithms feed you increasingly extreme external content. It's more like an anechoic chamber — a room so perfectly insulated that you only hear yourself. And the research on actual anechoic chambers is that people find them deeply unsettling. The absence of external sound doesn't create peace. It creates disorientation.

But here's the thing: does that actually matter? Music is one channel for encountering other people's interiority. It's not the only one. You have conversations, relationships, books, films. If someone stops listening to human-made music but still has a rich social life, are they measurably worse off? Probably not. And let's be honest — 90% of Spotify listening is background noise while commuting. Nobody was deeply engaging with the artist's inner life during Lo-Fi Hip Hop Beats To Study To.

So I started asking: which domains of AI personalization actually carry risk, and which ones are just nostalgia for friction we don't need?

Mapping the risk

I wanted a framework that separated real concern from reflexive luddism. Two axes:

X axis — Redundancy: Can you develop the same human skill through other channels? If AI erodes your ability to do X, are there five other ways to build that capacity, or is X the only path?

Y axis — Load-bearing: How fundamental is the skill that atrophies? Is it a nice-to-have, or does losing it cascade into other failures?
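The two axes can be read as a tiny classifier. As a minimal sketch, here is how the quadrants fall out of the axis definitions; the redundancy/stakes scores below are illustrative guesses, not data from the chart.

```python
def quadrant(redundancy: float, stakes: float) -> str:
    """Classify a domain by where it lands on the map (both axes run 0..1)."""
    if stakes >= 0.5:
        # High stakes: the split is whether other channels can cover the loss.
        return "stealth erosion" if redundancy >= 0.5 else "danger zone"
    # Low stakes: either a harmless convenience gain or a long-completed transition.
    return "who cares" if redundancy >= 0.5 else "already boiled"

# Illustrative placements (scores are made up for the sketch):
domains = {
    "AI companions":      (0.1, 0.9),  # few other places to practice intimacy
    "Boredom tolerance":  (0.9, 0.9),  # redundant everywhere, eroding everywhere
    "Driving/navigation": (0.9, 0.1),  # GPS wins, nothing important is lost
    "Mental arithmetic":  (0.2, 0.1),  # transition completed decades ago
}

for name, (redundancy, stakes) in domains.items():
    print(f"{name}: {quadrant(redundancy, stakes)}")
```

The interesting structural claim of the essay is visible in the first branch: at high stakes, redundancy alone decides whether erosion is obvious or invisible.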

[Figure: the risk map. X axis: Redundancy (low → high). Y axis: Load-bearing stakes (low → high). Quadrants: Danger zone (top-left), Stealth erosion (top-right), Already boiled (bottom-left), Who cares (bottom-right). Plotted: AI companions, Personalized adult content, Parenting advice/delegation, AI therapy, Education/learning, Decision-making, News/information diet, Writing, Coding, Music (creating), Music (listening), Art/visual creation, Cooking, Driving/navigation, Shopping/recommendations, Translation. Categories: Intimacy, Cognition, Craft, Utility, Resilience.]


What clusters where

Top-left: the danger zone. AI companions, personalized adult content, parenting delegation, education. These are domains where the human friction is the skill. Intimacy requires tolerating another person's imperfection. Learning requires productive struggle. Parenting requires judgment under exhaustion with no manual. There's almost nowhere else to build these capacities. Remove the friction and you remove the development mechanism.

Bottom-right: who cares. Driving, shopping, translation, cooking. Low stakes, high redundancy. Nobody's becoming a worse person because GPS handles navigation. These are convenience gains with minimal skill atrophy.

The middle diagonal: the craft zone. Coding, music creation, writing, visual art. This is where the Alexander Wept argument lives. Medium redundancy, medium stakes. Lose any single one and you're fine. Lose all of them simultaneously and you've quietly removed most of the channels where people practice sustained effort toward a hard thing. Each dot is defensible. The aggregate might not be.

The empty corners

Two quadrants were completely empty.

Top-right (high stakes, high redundancy) had nothing. At first this seemed like a gap in my thinking. Then I realized it's structurally empty — almost a contradiction. If a skill is truly load-bearing for human functioning, it tends not to have lots of backup channels. The really important stuff — vulnerability, judgment under ambiguity, tolerating someone else's full complexity — is hard because there are so few places to practice it. If there were ten easy ways to learn intimacy, it wouldn't be the thing that defines and destroys relationships.

Bottom-left (low stakes, low redundancy) looked empty too. Same logic in reverse. If something is the only channel for a skill, we tend to assign it importance by definition. The uniqueness itself creates the perceived stakes. Try naming something that's truly irreplaceable and truly doesn't matter. It's hard. Because if it didn't matter, we wouldn't have noticed it was irreplaceable.

So the map is basically a diagonal. Everything clusters along a line from bottom-right (low stakes, replaceable) to top-left (high stakes, irreplaceable). Reassuring, in a way. The stuff AI is eating fastest is the stuff that matters least.

Except the bottom-left isn't quite empty. It has ghosts — skills that already completed the transition. Mental arithmetic, penmanship, memorizing facts, celestial navigation. Low redundancy, low stakes. The frog boiled decades ago and nobody cared. These are completed transitions, not active erosion. They sit in the corner as proof that some skill loss genuinely doesn't matter.

[Figure: the map again, with the Already boiled quadrant (bottom-left) now populated: Mental arithmetic, Penmanship, Memorizing facts, Celestial navigation.]

Then I found the real exceptions.

The stealth erosion quadrant

Three things broke the diagonal and landed in the top-right corner — high stakes, high redundancy. The "this should be safe but isn't" zone.

Boredom tolerance. This is the sleeper on the entire chart. It's linked to creativity, impulse control, delayed gratification, and self-regulation — basically every long-term life outcome researchers care about. And it's massively redundant. Any idle moment builds it: waiting rooms, long walks, manual labor, commutes, staring at the ceiling. You could lose any single source and be fine.

But you're not losing one source. You're losing all of them simultaneously. AI and personalized content fill every idle moment. Your phone fills the elevator ride. Podcasts fill the walk. AI-generated playlists fill the commute. Netflix autoplays fill the evening. No single app is the villain. The water temperature is the sum of a thousand small comforts, each one perfectly reasonable.

Disagreement tolerance. The capacity to be told no, to encounter friction, to have your ideas challenged and survive it. Massively redundant — you can encounter disagreement at work, in relationships, in sports, in debate. But AI companions that adapt to you, content algorithms that confirm you, assistants that agree with you, feeds that filter for your priors — it's now possible to opt out of friction across every channel simultaneously. No single product removes disagreement from your life. All of them together make it optional.

Incidental physical effort. High stakes for health and longevity, absurdly redundant (any sport, walking, stairs, manual labor). But self-driving cars, delivery bots, AI assistants that reduce errands, remote work — the baseline of accidental physical activity keeps dropping. You can still go to the gym. But "choosing to exercise" is a fundamentally different thing from "movement being woven into the structure of your day." One requires willpower. The other just happened.

These three share a structure: individually safe because of redundancy, collectively dangerous because AI erodes them from every direction at once. The redundancy that should protect them is exactly what makes the erosion invisible. If boredom tolerance depended on a single activity, we'd notice when that activity disappeared. Because it depends on a hundred small idle moments, we don't notice when they're all filled.

Here's the full map:

[Figure: the full map. The Stealth erosion quadrant (top-right) now holds Boredom tolerance, Disagreement tolerance, and Incidental physical effort, alongside everything plotted before.]

The frog, the water, and the thermostat

The boiling frog metaphor gets used for everything. This version is different.

The classic boiling frog story is about one threat, gradually increasing. Climate change. Authoritarianism. Debt. You can point at the stove and say "that's the thing heating the water."

This is a pot with a thousand thermostats, each controlled by a different app, and each one only raises the temperature by a fraction of a degree. The frog isn't being boiled by Suno, or ChatGPT, or algorithmic feeds, or AI companions, or personalized content. It's being boiled by the aggregate. And every individual thermostat operator can truthfully say "my product only adds 0.1 degrees, that's nothing."

They're right. It is nothing. That's the problem.

The actually scary part

Everything above is about adults making choices. There's a version of this that's harder to be sanguine about: children growing up in warm water from birth.

An adult can choose to preserve boredom, seek disagreement, maintain physical activity, expose themselves to art they didn't ask for. They know what they're choosing because they experienced the alternative. A child raised with AI-calibrated entertainment that never lets them hit the "I'm bored" threshold doesn't know what unstructured boredom feels like. They never hit the moment where nothing is provided and they have to invent something. That moment is load-bearing for creativity, self-regulation, social negotiation, and theory of mind. And unlike adult skill maintenance, some of these developmental windows close.

Identity formation works similarly. Who you become is shaped enormously by random exposure — the weird book you stumbled onto, the music a friend's older sibling played, the job you took because nothing else was available. Personalization removes randomness. A child who only encounters what an algorithm predicts they'll like becomes an ever more concentrated version of who they already were at age eight. The future self that would have emerged from accidental exposure to something foreign never materializes.

This is the one area where the boiling frog metaphor breaks down. It's not the same frog. It's the next generation of frogs, born into water that's already warm, with no memory of what cold felt like.


Cleanse the fountain. This is a follow-up to Alexander Wept. The interactive diagram above was built during the conversation that produced this essay.



