There’s a moment, somewhere around late 2022, when the tech world collectively lost its mind over a job title. Not a product. Not a business. A job title: Prompt Engineer. Within weeks of ChatGPT’s public debut, the phrase was everywhere — on LinkedIn profiles, in breathless startup job postings, in newspaper headlines that treated it with the same reverence once reserved for “rocket scientist.”
Anthropic, the AI safety company backed by billions, posted a Prompt Engineer opening at $375,000 a year and didn’t even require a computer science degree. Predictably, the internet went wild.
| Item | Detail |
| --- | --- |
| Job Title | Prompt Engineer |
| Peak Salary Range | $200,000 – $500,000+ USD |
| Job Emerged | Late 2022 (post-ChatGPT launch) |
| Peak Job Search Interest | April 2023 — 144 searches per million (Indeed) |
| Current Search Volume | 20–30 per million (as of 2025) |
| Notable Hiring Company | Anthropic (posted role at $375,000, no CS degree required) |
| Key Voices | Jared Spataro, CMO of AI at Work, Microsoft; Hannah Calhoon, VP of AI, Indeed |
| Companies That Never Hired One | Nationwide, Carhartt, New York Life |
| Microsoft Survey Sample | 31,000 workers across 31 countries |
| Prompt Engineer Ranking (New Roles) | Second from bottom among roles companies plan to add |
| Reference | Wall Street Journal — AI & Jobs Coverage |
It’s hard not to notice, looking back, just how fast the cycle moved. In the spring of 2023, searches for “prompt engineer” on Indeed hit 144 per million — a number that sounds modest until you realize it had been essentially zero eighteen months before. Recruiters were cold-messaging linguistics majors, philosophy graduates, anyone who could demonstrate a knack for coaxing coherent, useful outputs from large language models.
The pitch was simple: you don’t need to code. You just need to know how to talk to the machine. For a certain kind of person — curious, verbal, technically adjacent but not quite technical — it sounded like the answer to a prayer.
By mid-2025, those same searches had flatlined somewhere between 20 and 30 per million. Postings had dried up. The $400,000 salaries were becoming the stuff of nostalgic LinkedIn posts. Microsoft’s own survey of 31,000 workers across 31 countries ranked prompt engineering second from the bottom among roles companies were actually considering adding. The dream job of the AI era had aged almost overnight.
What happened isn’t mysterious, exactly. The models got better. Early versions of ChatGPT and its competitors were brilliant but temperamental — they needed careful handling, specific phrasing, precise framing. A skilled prompt engineer could mean the difference between an AI that produced garbage and one that produced something genuinely useful. That gap was real, and it was worth paying for.
But AI systems, almost by design, are built to close gaps. They became more conversational, more iterative, more capable of asking follow-up questions and inferring intent without needing the input perfectly constructed the first time. In a sense, they started doing the prompt engineering themselves. The job description dissolved into the product.
Jared Spataro, Microsoft’s chief marketing officer of AI at Work, put it bluntly at a Wall Street Journal event earlier this year. “Two years ago, everybody said, ‘Oh, I think prompt engineer is going to be the hot job,’” he said. “It’s not turning out to be true at all.”
Spataro described how Microsoft’s AI research tools now ask clarifying questions, acknowledge uncertainty, and iterate on their own — which is essentially what a skilled prompt engineer was being paid to do from the outside. The expertise got absorbed into the system. The middleman disappeared.
Companies like Nationwide, Carhartt, and New York Life — the kind of large, deliberate organizations that move carefully on hiring decisions — say they never brought on a single prompt engineer. Instead, they trained their current workforce. Nationwide rolled out a companywide AI curriculum, and prompt engineering became one of its most popular modules.
Nationwide’s chief technology officer, Jim Fowler, described it plainly: “We see this becoming a capability within a job title, not a job title to itself.” That framing matters. It suggests the knowledge was always real — just never quite a standalone profession.
There’s something almost poignant about how the role emerged in the first place. It was a transitional artifact, filling the space between what AI could technically do and what most people understood about how to use it. Once that gap narrowed — and it narrowed fast — the role lost its reason to exist. The same dynamic is now rippling through other emerging AI-specific titles.
Specialized AI trainers are being folded into data engineering teams. As machines become more adept at translating human intent into machine output, the roles built around doing that translation are shrinking. The pattern is starting to sound familiar.
These positions aren’t being replaced by a new list of exotic titles, though. What’s taking their place is something more elusive and harder to certify: AI fluency, the capacity to collaborate with these systems rather than merely operate them. Demand for that quality jumped sevenfold between 2023 and 2025, according to McKinsey — but it’s showing up as a requirement inside existing roles, not as a new job posting.
A marketer who understands AI, a lawyer who can use it strategically, a financial analyst who knows its limits — that’s what companies now say they want. Not a specialist, but a generalist who doesn’t need one.
The $500,000 prompt engineer — and yes, some salaries climbed that high, briefly — wasn’t a fraud. The people who held those roles were often genuinely skilled, doing something real and valuable in that narrow window when the technology needed a human interpreter. Then the window closed.
The job’s disappearance isn’t what’s disturbing. It’s how quickly it happened, and the quiet reminder it leaves behind: in the age of AI, even the jobs created by AI aren’t safe from it.