Two articles on a screen
If you publish, edit, or commission public argument for a mission-driven organization — communications lead, research director, institutional voice owner — put two articles on the screen, side by side. The first is a well-structured thought-leadership piece on organizational change, produced in under a minute by an intern who has never led anything. The structure is tidy. The prose is clean. The headings are well-chosen. The examples sound plausible. There is nothing in the piece that a reader would, on a first pass, identify as synthetic.
The second is a piece by a practitioner who has been doing the work for thirty years. It is also tidy. Also clean. Also well-structured. A decade ago, the differences between the two would have been obvious inside the first paragraph, and decisive by the end of the first page. The practitioner's sentence rhythm, the specificity of the examples, the small precisions that only experience produces, the willingness to take a position a generalist would not take — all of these would have sorted the two pieces for a serious reader inside thirty seconds.
Today, side by side, the two pieces are indistinguishable on the surface. A reader can still tell them apart, with effort. Most readers will not put in the effort. The environment is too noisy; the stakes of each individual article are too low. The reader will skim, feel that both pieces are roughly competent, and move on. The practitioner's thirty years and the intern's forty seconds have, at the level of the page, been priced the same.
This is not a complaint about AI. It is a plain observation about what has happened to the signals readers used to rely on, and why the collapse of those signals is going to reprice how expertise and credibility work in the sector for the next decade.
Signal collapse, in the sense this book uses the phrase, is the moment craft stops acting as a reliable proxy for depth. Polish used to cost money, so polish filtered for seriousness; when polish became cheap, that filter stopped working.
How signal used to work
Signal, in the information-economic sense, is whatever makes real expertise legible at a distance. It is the thing that lets a reader, in a feed of thousands of pieces, decide which ones deserve attention without having to evaluate each on its merits. Signal works because it is costly to fake. The cost is the filter.
For most of the era in which leaders in this sector built their habits, three kinds of signal did most of the work. The first was surface craft. A well-edited, well-structured, carefully worded piece of writing was expensive to produce. Production cost functioned as a filter. A reader who encountered a polished long essay could reasonably infer that someone had invested in it, which was a decent proxy for the invested-someone having had something to say.
The second was institutional backing. A byline in a serious publication, a talk at a curated conference, a book from a real publisher, a line on a masthead — these were markers of gatekeeping. Gatekeepers had filtered. The reader, by trusting the gatekeeper, got a shortcut around the work of filtering themselves.
The third was accumulated position. A writer who had, over years, produced a recognizable body of work carried a reputation that functioned as its own signal. The reader encountering a new piece by a known writer got to use the prior body of work as a prior probability. It did not guarantee the new piece was good. It reasonably predicted that the new piece would be worth reading.
Three signals, all costly in different ways to fake, all functioning as filters for the serious reader. The information environment of the sector was built on top of them for decades.
What AI did to signal
AI has not removed signal. It has repriced it. Specifically, it has radically reduced the cost of the first signal — surface craft — without touching the cost of the other two. The asymmetry is the entire problem.
Polished prose is now free. On-brand design is now free. Well-structured argument-shaped prose is now free. Paragraphs that sound authoritative, examples that sound plausible, conclusions that sound considered — all of these can be produced in seconds by a tool that does not understand what it is writing about. The surface, in other words, is no longer costly, which means the surface no longer signals. A reader encountering a polished piece cannot infer, from the polish, that anyone invested anything in the thinking underneath it.
That single change cascades into everything downstream. Institutional backing is still signal, but the gatekeeping function has been thinning for years under earlier platform pressure, and AI thins it further by flooding every submission queue with plausible-looking work. Accumulated position is still signal, but a body of work is harder to build because the surface cost that used to protect it from dilution has collapsed. New work gets produced by anyone, at volume, with the same surface polish that used to indicate experience. The reputation layer still exists. It has to do more work than it used to, because the surface layer has stopped doing any.
The result is a legibility crisis. Expertise is as present as it ever was — arguably more, given how many serious practitioners are working — but the sorting mechanism that readers used to use to find it has gone quiet. A reader scrolling through a feed of thousands of pieces is now operating without the filter their reading habits were built around. The reader either invests much more effort in evaluating each piece, which most will not do, or learns to trust a shrinking set of proxies, which most will.
The paradox this produces
The paradox, stated sharply, is this. The leaders in this sector with the most actual expertise are now among the hardest to identify from the outside. Their surface output looks identical to the surface output of a prompt. A reader who is trying to find a serious voice on a subject, using the signals their reading habit was formed by, will often fail to find one — not because the serious voice is not publishing, but because the sorting mechanism can no longer see the difference between the serious voice and the generic filler.
This is the opposite of what most commentary about AI predicted. The popular framing was that AI would flood the sector with low-quality work and that readers would quickly learn to filter it out, leaving serious voices more visible. The pattern in the field is the inverse. AI has produced medium-quality work at scale, not low-quality work, and medium-quality work is precisely the kind that defeats the reader's existing filters. Low-quality work is easy to dismiss. Medium-quality work crowds the channel and teaches readers to expect less from what they encounter, which makes the genuinely high-quality voice even harder to distinguish.
The paradox compounds. As readers train themselves, correctly, to trust surface less, they also become less responsive to the signals real experts have used to identify themselves. A reader burned enough times by polished filler begins discounting polish in general. The expert's polished piece gets discounted along with the filler's polished piece. Real expertise is penalized by the reader's correct adaptation to an environment the expert did not create.
What still signals
The collapse is not total. Several categories of signal remain, and they are — this is the important part for the rest of this book — precisely the categories that AI does not reduce the cost of.
Specificity that only experience produces still signals. A sentence that names a particular constraint, a particular number, a particular failure mode that a reader with equivalent experience will recognize and that a generator would not have produced, still registers. The specificity of lived work is expensive in a way AI does not lower. It is produced by years of encountering the actual terrain, not by training on text about the terrain.
Positions a generator would not take still signal. Real practitioners, over time, develop positions that are costly to hold. They are unfashionable inside their sector, or they disagree with a consensus that feels safe, or they refuse a claim that most peers are making because the practitioner knows, from the work, why the claim is wrong. Generators, by design, converge on whatever is safest and most common. A piece that refuses to converge, that takes a position a generator would not, signals something a generator cannot produce.
Lived consequence signals. A piece of work that names the cost the writer has paid, publicly and specifically, for a position or a choice, signals in a way a generator cannot touch. The generator has never paid a cost. The reader can tell.
Time signatures signal. A body of work that has been built over a decade, with visible evolution, with earlier pieces that were wrong in ways later pieces acknowledge, with a through-line that can be traced, signals in a way no recent piece can fake. A decade cannot be fabricated in a week, no matter how good the tooling gets.
Relationships signal. The practitioners other serious practitioners cite, invite, argue with, and defer to form a network that is expensive to join. The network is not a status game. It is a slow, accumulated set of trust relationships among people who have actually worked together or read each other seriously over years. A voice that moves inside that network signals something that a voice outside it cannot imitate, regardless of surface polish.
Specificity, position, consequence, time, relationships. These are what survive. They are not alternatives to craft. They are what craft has to be in service of in order to signal now.
What organizations have to do
The operational claim for any leader in this sector is short. Stop competing on surface. Compete on everything a generator cannot produce.
Organizations that continue to optimize their communications for surface polish are investing in the single category of signal that has gone to zero. The investment will feel like responsible work because polished output is familiar, legible on dashboards, and easy to defend in a staff meeting. It will also be economically irrational inside the new information environment, because every dollar spent on polish is a dollar spent on an attribute that no longer signals.
Organizations that reallocate, quietly, toward the categories of signal that remain — specificity, position, consequence, time, relationships — will be disproportionately visible to the readers who matter in five years. Not because they tried harder on the surface. Because the surface was not the level they were competing on.
This reallocation is uncomfortable. It asks organizations to hold positions they might rather hedge. It asks staff to write with specificity that feels exposing. It asks leaders to publish on a decade-long time signature rather than on a quarterly one. None of these are standard moves inside the current operating rhythm of the sector. All of them are the moves that pay.
Why the collapse is good news
The closing move of this chapter is not pessimistic, and it should not be read as such. The noise floor in the sector is rising. Real signal — the kind that cannot be faked — is becoming more valuable, not less. The leaders who invest in genuine substance are positioned for the decade, not the quarter. The collapse of the old surface signals is not the end of signal. It is the end of the cheap signals, and the beginning of a period in which only the expensive signals will carry weight.
That is durable good news for anyone willing to build a body of work, over years, with specificity, position, and consequence inside it. It is bad news only for the organizations that were trading on surface, which were going to be outcompeted eventually regardless. AI has accelerated a sort that was already underway.
The next chapter names a specific corollary of this collapse that leaders have felt but not yet diagnosed. As the old signals go quiet, real expertise is not just harder to sort for. It is losing visibility inside the discovery layers readers now lean on. That is a different problem, and it requires a different response.
Read next: Why Expertise Is Becoming Invisible — how the sorting mechanisms of the new information environment are burying the voices readers most need.