Toby E. Stuart, a professor of business administration at the University of California, Berkeley's Haas School of Business, is the author of the forthcoming book “Anointed: The Extraordinary Effects of Social Status in a Winner-Take-Most World.”
A few weeks ago, the editor of a literary magazine told me: “I can’t tell who wrote the essay I’m reading.” The piece in question had sailed through a plagiarism check, but a few too many sentences rang with predict-the-next-word perfection. Unsure whether it was produced by man or machine, the editor nixed the submission.
This challenge is suddenly ubiquitous. We’ve all by now watched in amazement as generative AI produces prose, poems, music, video, code, and concept art that are indistinguishable from what a competent human can create, but at a tiny fraction of the cost. So what are college admissions officers to make of “personal” statements, newspaper desks of op-ed submissions, scientific journal editors of manuscript submissions, book publishers of proposals, and so on?
When honest evaluation becomes too slow or too difficult, we intuitively default to a next-best shortcut: pedigree. Expect a resurgence of reliance on status symbols we may have thought the world was beginning to leave behind — elite diplomas, warm intros, old-fashioned references, a person’s ZIP code, race, gender, and maybe even their given and family names. One of the early, wide-scale effects that generative AI will have on labor and capital markets is the return of velvet ropes.
The logic dates way back: If an artifact’s authenticity is in question, validate the artisan by their tribe. We’ll see these changes all over. Foundations such as the Wellcome Trust are accepting applications for grants from “established researchers.” It’ll go unsaid, but universities will rely more heavily on PhD program rankings when they recruit faculty. Admissions officers who had begun to move away from standardized test scores may now grasp at any numerical indicators that still seem to justifiably sort applicants, even if only by a hair’s breadth. Law firms that started to lean toward “school-agnostic” hiring policies will reverse course.
This is cognitive triage rather than malice. AI is already doing many wonderful things for us, but it has massively diluted our ability to assess talent and verify authenticity. So gatekeepers everywhere are going to look for logos. In the near term, that will tilt the playing field further away from anyone who lacks status markers. It’ll be no surprise if a first-generation college applicant from Fresno whose personal essay might be AI-assisted faces more doubt than a legacy or Andover kid who is presumptively the better writer. Likewise, a midcareer coder in Belgrade whose work shows well on GitHub will meet greater suspicion than an MIT grad with a referral, even if both their code repositories were mostly generated with the help of Copilot.
The same technology that promises to democratize education, enlarge the circle of creators and productive workers, and equalize talent in the workplace will, ironically, refeudalize selection and recognition in the near term. Whether we end up in a world of more distributed opportunity or more unexamined pedigree may depend on our appetite for doing the harder work of verification, or on our willingness to treat the outputs of artificial and human creators as one and the same. Until we embrace the latter or develop new methods to assess and authenticate human capability in an AI-saturated world, the democratic potential of these technologies will be overshadowed by the hierarchies they otherwise might have helped to dissolve.
