By Jeffrey Selingo
From New York Magazine Intelligencer
When I left for college in the fall of 1991, the internet era was just beginning. By sophomore year, I received my first email address. By junior year, the first commercial web browser was released. The summer after graduation, I worked as a reporter at the Arizona Republic covering the internet’s rise in our everyday lives, writing about the opening of internet cafés and businesses launching their first websites. I was part of an in-between class of graduates who went off to college just before a new technology transformed what would define our careers.
So when Alina McMahon, a recent University of Pittsburgh graduate, described her job search to me, I immediately recognized her predicament. McMahon began college before AI was a thing. Three and a half years later, she graduated into a world where it was suddenly everywhere. McMahon majored in marketing, with a minor in film and media studies. “I was trying to do the stable option,” she said of her business degree. She followed the standard advice given to all undergraduates hoping for a job after college: Network and intern. Her first “coffee chat” with a Pitt alumnus came freshman year; she landed three internships, including one in Los Angeles at Paramount in media planning. There she compiled competitor updates and helped calculate metrics for which billboard advertisements the company would buy.
But when she started to apply for full-time jobs, all she heard back — on the rare occasions she heard anything — was that roles were being cut, either because of AI or outsourcing. Before pausing her job search recently, McMahon had applied to roughly 150 jobs. “I know those are kind of rookie numbers in this environment,” she said jokingly. “It’s very discouraging.”
McMahon’s frustrations are pretty typical among job seekers freshly out of college. There were 15 percent fewer entry-level and internship job postings in 2025 than the year before, according to Handshake, a job-search platform popular with college students; meanwhile, applications per posting rose 26 percent. The unemployment rate for new college graduates was 5.7 percent in December, more than a full percentage point above the national average and higher even than what high-school graduates face.
How much AI is to blame for the fragile entry-level job market is unclear. Several research studies show AI is hitting young college-educated workers disproportionately, but broader economic forces are part of the story, too. As Christine Cruzvergara, Handshake’s chief education-strategy officer, told me, AI isn’t “taking” jobs so much as employers are “choosing” to replace parts of jobs with automation rather than redesign roles around workers. “They’re replacing people instead of enabling their workforce,” she said.
The fact that Gen-Z college interns and recent graduates are the first workers being affected by AI is surprising. Historically, major technological shifts have favored junior employees, who tend to make less money and to be quicker and more enthusiastic about embracing new tools. But a study from Stanford’s Digital Economy Lab in August showed something quite different. Employment for Gen-Z college graduates in AI-affected jobs, such as software development and customer support, has fallen by 16 percent since late 2022. Meanwhile, more experienced workers in the same occupations aren’t feeling the same impact (at least not yet), said Erik Brynjolfsson, an economist who led the study. Why the difference? Senior workers, he told me, “learn tricks of the trade that maybe never get written down,” which allow them to better compete with AI than those new to a field who lack such “tacit knowledge.” For instance, that practical know-how might allow senior workers to better understand when AI is hallucinating, wrong, or simply not useful.
For employers, AI also complicates an already delicate calculus around hiring new talent. College interns and recent college graduates require — as they always have — time and resources to train. “It’s real easy to say ‘college students are expensive,’” Sim Kho told me in an interview. “Not from a salary standpoint, but from the investment we have to make.” Until recently, Kho ran early career programs at Raymond James Financial, where it took roughly 18 months for new college hires to pay off in terms of productivity. And then? “They get fidgety,” he added, and look for other jobs. “So you can see the challenges from an HR standpoint: ‘Where are we getting value? Will AI solve this for us?’”
Weeks after Stanford’s study was released, another by two researchers at Harvard University also found that less experienced employees were more affected by AI. And it revealed that where junior employees went to college influenced whether they stayed employed. Graduates from elite and lower-tier institutions fared better than those from mid-tier colleges, who experienced the steepest drop in employment. The study didn’t spell out why, but when I asked one of the authors, Seyed Mahdi Hosseini Maasoum, he offered a theory: Elite graduates may have stronger skills; lower-tier graduates may be cheaper. “Mid-tier graduates end up somewhat in between — they’re relatively costly to hire but not as skilled as graduates of the very prestigious universities — so they are hit the hardest,” Maasoum wrote to me.
Just three years after ChatGPT’s release, the speed of AI’s disruption of the early career job market is catching the attention of observers at the highest levels of the economy. In September, Fed chair Jerome Powell flagged the “particular focus on young people coming out of college” when asked about AI’s effects on the labor market. Brynjolfsson told me that if current trends hold, the impact of AI will be “quite a bit more noticeable” by the time the next graduating class hits the job market this spring. Employers already see it coming: In a recent survey by the National Association of Colleges and Employers, nearly half of 200 employers rated the outlook for the class of 2026 as poor or fair, the most pessimistic outlook since the first year of the pandemic.
The upheaval in the early career job market has caught higher education flat-footed. Colleges have long had an uneasy relationship with their unofficial role as vocational pipelines. When generative AI burst onto campuses in 2022, many administrators and faculty saw it primarily as a threat to learning — the world’s greatest cheating tool. Professors resurrected blue books for in-classroom exams and demanded that AI tools embedded in software be blocked in their classes.
Only now are colleges realizing that the implications of AI are much greater and are already outrunning their institutional ability to respond. As schools struggle to update their curricula and classroom policies, they also confront a deeper problem: the suddenly enormous gap between what they say a degree is for and what the labor market now demands. In that mismatch, students are left to absorb the risk. Alina McMahon and millions of other Gen-Zers like her are caught in a muddled in-between moment: colleges only just beginning to think about how to adapt and redefine their mission in the post-AI world, and a job market that’s changing much, much faster.
What feels like a sudden, unexpected dilemma for Gen-Z graduates has only been made worse by several structural changes across higher education over the past decade.
First, a huge surge of undergraduates shifted to majoring in fields now being upended by AI. In the aftermath of the Great Recession of 2008, a long-running survey of college freshmen by UCLA found students much more focused on going to college to “get a better job” than on what they previously wanted most: to learn more about things that interested them. That new mind-set showed up in the majors they picked. Between 2010 and 2020, fields such as philosophy, history, and English saw a big drop in popularity. The latter two majors fell by one-third in that ten-year period, while overall humanities enrollment declined by almost a fifth. Where did those students go? Many pivoted to computer science and related fields.
Last year, the number of students majoring in comp-sci alone topped 170,000 — more than double the number from 2014, even as overall undergraduate enrollment fell. Many were responding to a steady drumbeat of advice from groups like Code.org and Girls Who Code, amplified by tech celebrities such as Bill Gates and Mark Zuckerberg and echoed by presidents from Barack Obama to Donald Trump, all urging young people to learn computer programming. Now, ironically, many of those same students are struggling to find work, as the entry-level positions they are seeking tend to be among those most affected by AI. College graduates in their 20s with computer-science and computer-engineering majors have one of the highest unemployment rates, according to a report last year from the Federal Reserve Bank of New York — double that of pharmacy, criminal justice, and biology. Undergrads already seem all too aware of this new state of affairs: Enrollment in computer- and information-sciences programs is down nearly 8 percent this academic year compared to last.
Andrew Wyatt was on track to graduate from the University of Southern California in December with a computer-science degree. He has since switched to economics and data science after realizing how easily the work he’d performed during his internships could be done by AI. “I coded, I analyzed data,” he said of his prior internship. And of his current one: “Now I’m building reports from raw information so the company can make sense of what’s happening in its sales pipeline. None of it’s glamorous.” But even his new major is subject to the same forces. “AI can do it all without me,” he said. His internship experience should have set him up for a full-time data-analytics job, but whenever he gets a callback from one of the two dozen positions he has applied to so far, he’s told, “‘We have AI do the cool stuff, so would you like a sales job instead?’”
While there are no easy answers here, experts still tend to point to practical workplace experience as a durable advantage for recent college graduates. Meaning that, even as he struggles, Andrew is still better off than his peers who aren’t getting internships. Nearly three-quarters of freshmen expect to have intern experiences before they graduate, according to Strada Education Foundation research, but fewer than half of them actually complete one by their senior year. And nothing better predicts “underemployment” after college — meaning you end up in a job where a degree isn’t needed — than failing to secure at least one internship in college, Matt Sigelman, who has studied the labor market for two decades and is president of the Burning Glass Institute, told me. More than half of graduates who did not intern as students, according to Burning Glass, were underemployed five years after college. “It used to be that we went to college and maybe had a job on the side,” Sigelman said. “Now you need a job with college on the side.”
What most colleges offer is still wildly out of sync with this new reality. As institutions, they place the greatest value on what happens in the classroom. There are a handful of schools that build internships or what’s known as cooperative education into the undergraduate curriculum — including Northeastern University, Drexel University, and the University of Cincinnati — so that workplace experience can be structurally part of earning an undergraduate degree. But, for now at least, they remain very much exceptions.
In general, new graduates in the age of AI face an even harsher version of the classic Catch-22: Employers want experience, but no one wants to be the employer to provide that experience. Colleges are almost inevitably going to play a much bigger role in launching students into a career. “Colleges and universities face an existential issue before them,” said Ryan Craig, author of Apprentice Nation and managing director of a firm that invests in new educational models. “They need to figure out how to integrate relevant, in-field, and hopefully paid work experience for every student, and hopefully multiple experiences before they graduate.”
But colleges still operate on a model focused on education rather than employment. One logical place for building new structures for a new era in education, Craig said, is the career center — but faculty tend to see that as a student service, separate from the academic core. Career centers at many colleges exist at the margins, offering job listings, interview coaching, and career fairs, but they feel optional overall. An annual survey from the National Association of Colleges and Employers finds that only about a quarter of students ever tap into their career centers for help.
Craig sees a variety of options for work-based learning in an AI world, everything from apprenticeships out of high school (and in more than just the skilled trades, as is the case now) to — yes — internships in college, as well as real-world research projects for companies that could be built into coursework. There’s a shortage of such work-based learning, Craig says, because it’s costly and time-consuming for both employers and colleges to set up. It’s a problem his investment firm is trying to solve by building so-called intermediaries, common in places like the U.K. and Australia, which take on the work of setting up jobs and hiring students.
That point hit home for me in September, when I sat in on a session at Workday’s annual user conference, where Chris Ernst, the company’s chief learning officer, spoke to a room full of HR executives who hire junior employees. Ernst told them that 70 percent of learning comes from experience, 20 percent from relationships, and only 10 percent from formal instruction. Now, as companies pull back, colleges are being asked to deliver forms of “genuine learning” they were never designed to provide.
Of course, there is an added layer of complexity to this emerging conventional wisdom: It’s not just real-world experience — it’s that plus the ability to use AI tools well that really creates value in the entry-level job market. Demand for AI fluency among employers has jumped nearly sevenfold in two years, according to the McKinsey Global Institute, faster than for any other skill in job postings.
But a study by the American Association of Colleges and Universities and Elon University found that fewer than half of college leaders say their campuses are ready to use AI to prepare students for the future. McMahon’s experience was fairly typical of what recent graduates describe. Her exposure to AI at Pitt, she said, depended entirely on the preferences of individual professors. “For the most part, it was ignored,” she said, “like, you know better than to use it.” Her internship at Paramount was no different: Using AI on the job was barely talked about, even as it was remaking the entertainment industry. When McMahon began applying for jobs after graduation, she didn’t realize how rapidly entry-level expectations had changed. “I had no idea this was an actual threat until it was too late,” she said. “It always felt like something far off, and then suddenly I’m told that’s why I can’t get a job.”
As the new school year got underway this past fall, I moderated two small dinners with college and university leaders from campuses of all sizes. At both, the conversation centered on AI. Several academic leaders talked about ditching blanket AI bans in favor of policies set by individual faculty members, so students would know when, how, and why they could use AI. Others described how faculty and students are already using AI for “messy work”: summarizing lecture notes, creating study guides, brainstorming research ideas, and editing drafts of papers. Such uses of AI, several of them agreed, could open up classroom time to focus less on traditional teaching and assessments and more on developing the foundational skills that Sigelman talks about — critical thinking, communication, and problem-solving.
Still, higher education by its nature is slow to change. It can take years for new majors to be designed and approved by faculty members, and on that timeline new programs risk being outdated before the first cohort even graduates. Meanwhile, professors have few incentives to experiment with AI. Faculty members are largely “one-trick ponies” when it comes to their teaching, said Corbin Campbell, an education professor at American University and author of Great College Teaching. They seldom vary their approach from year to year and mostly teach as they were taught.
The kinds of companies that set trends in hiring — consulting firms, Wall Street banks, law firms — are well aware of this balance between AI literacy and the capacity for old-fashioned critical thinking. Even as they automate the work that used to define entry-level roles, human beings and their judgment are still at the center of the business.
At the investment firm Carlyle, new hires are put through AI training and employees regularly share how they use AI. What used to take weeks of research on potential investments now takes just hours with AI, according to Lúcia Soares, Carlyle’s chief innovation officer. But when employees use AI to generate a report, Soares said, they write a final paragraph summarizing the automated output to ensure they’ve understood and rigorously assessed it. “Judgment, critical thinking, and the ability to influence and persuade are the skills that allow you to advance into higher-level jobs that are becoming the entry point to careers,” Handshake’s Cruzvergara told me.
What makes this debate about how to prepare college students for an automated labor market even more fraught right now is that it’s complicated by other crises happening in higher education — a broken financial model, shrinking enrollment, declining numbers of international students, the cancellation of federal research grants, and an administration in Washington that always seems ready to pick a battle with the sector. Facing those headwinds, colleges can’t focus only on helping the next class of students prepare for a job market where entry-level positions are becoming harder and harder to land.
If colleges can’t stop the clock, Ohio State University wants to at least reset it to start in 2029. That’s the class — this year’s freshmen — that the university has promised will be “fluent in their field of study, and fluent in the application of AI in that field.” The university’s AI Fluency initiative, announced with fanfare in June, is ambitious but still short on specifics. Many colleges are making AI literacy a goal for their students: to know how AI works, how to use different tools, how to evaluate their output, and where their limits are. But for Ohio State’s provost, Ravi V. Bellamkonda, literacy isn’t enough. He likened AI fluency to other foundational math, science, and language concepts students learn in introductory courses and then build on in their majors. “We’re going to ensure that there’s a minimum that’s met across the university,” Bellamkonda said. “How much more than the minimum is up to the major. Certain majors lend themselves to more, some less.”
One school at Ohio State already doing more is, not surprisingly, the university’s business school. The semester after ChatGPT was released in 2022, Vince Castillo, an assistant professor of marketing and logistics, told his students that his classes would be “Amsterdam for AI,” an open market for experimentation as long as everyone was transparent about how they used it. Since then, his approach to one of his most popular courses, Logistics and Supply Chain Analytics, has shifted significantly. Many of his students are business majors without coding skills, a technical barrier that once separated them from computer-science students. “With AI, they can write commands to get a better understanding of what’s in the data,” Castillo said. That gives them a clearer grasp of the material, Castillo said, and pushes them to ask more insightful questions. “They think more strategically now,” he added.
Despite all the grim news, this moment isn’t entirely unfamiliar in higher education. In the late 1990s, colleges treated the internet at first as a digital encyclopedia. Then as an email tool. Only gradually did they build course pages on the web, then full learning-management systems, then online courses — and eventually entire programs built around jobs that didn’t exist when I was a student. The internet didn’t merely add a tool. It rewired how we interacted with information and with one another.
AI optimists believe the same cycle of jobs lost/jobs created will repeat itself. (There is also a camp that predicts it will destroy nearly all human jobs — good luck to all of us in that world.) But as career author Lindsey Pollak reminded me, “We’re terrible at predicting which majors or skills will matter next.” We pushed a generation toward computer science, only to watch many entry-level roles disappear just as they graduated.
In recent months, as I’ve been on a tour with a new book about college admissions, I’ve spoken with parents, high-school students, and counselors in cities around the country. At every stop, parents ask what they should tell their kids in this in-between moment, when AI is moving faster than employers or colleges can adjust. The honest answer is that the future is ambiguous, and learning to navigate ambiguity may be the most important skill they can acquire.