Doom and gloom in the groves of academe

This article is part of The Week’s 20th anniversary section, looking back at how the world has changed since our first issue was published in April 2001.

In 2001, I was a college senior trying to figure out what to do with my life. I can’t claim I was the most diligent student. But I was bright enough to get good grades without trying too hard, at least in courses I enjoyed. Drawn by the apparently favorable ratio of reward to effort, and lacking any strong pull toward other pursuits, I decided to stick around higher education. Two decades later, I’m still here, now as a mid-career professor at George Washington University.

What I didn’t know then but recognize now is that I decided to enter the academy during a short-lived moment of prosperity. The turn of the 21st century wasn’t quite the golden age that extended from World War II to the mid-1970s, when massive increases in government funding turned college campuses into boomtowns. Still, it was a pretty good time, one in which growing enrollments, appreciating endowments, and an anticipated wave of faculty retirements promised a comfortable future for aspiring professors. The state of academia today is very different indeed.

But then, we didn’t know how much change was coming. Beyond economic considerations, 2001 was a period of cultural stability. In April of that year, David Brooks published a widely circulated essay on the “organization kid” — his term for the magnificently disciplined, professionally ambitious, politically detached undergraduates he saw at Princeton. These students found it hard to imagine getting worked up about the culture war disputes of the early 1990s or the antiwar mobilization of the 1960s. Apart from a few malcontents whom tightening admissions standards tended to exclude from elite colleges, the Ivy League students of that era liked school, admired their parents, and expected to succeed.

I studied a few miles up Route 1 at a less distinguished institution. With some adaptations for class and ability, though, I observed similar qualities among my classmates. By the fall semester, the 9/11 attacks had cast a shadow over our comfortable suburban existence, but I don’t remember any lasting sense that the world had changed in fundamental and unfavorable ways. 

Helped by the housing bubble, the good times continued to roll for nearly a decade. By then, I was in graduate school at a university that had grown so rich my department provided a daily continental breakfast buffet to its faculty, students, and staff. Beyond the confines of Harvard Yard, there were warning signs evident to those paying attention, but it was still possible to imagine there would be enough money, students, and jobs to go around forever.

The financial crisis of 2008 dispelled these illusions. In the most immediate sense, it buried the fantasy that the domestic market could absorb constant tuition hikes. Adjusted for inflation, the price of enrollment nearly doubled between the mid-1990s and the late 2000s. Without a backdrop of inflating real estate values, which provided the illusion of increasing wealth, this rate of increase couldn’t be sustained by middle-class families, the core of the university customer base.

The fundamental mistake goes back to that midcentury golden age, when the Higher Education Act of 1965 established a range of financial products to help pay for college tuition. The idea was that the provision of federal loans and grants would supplement existing state appropriations, particularly at the public universities that educate the vast majority of the nation’s students.

It didn’t work out that way. Recognizing that many students had other resources to pay for college, facing other demands on their budgets, and wary of raising taxes, state legislatures reduced their per capita spending on higher education. In the second half of the 20th century, higher education rested on a financial bargain in which the states subsidized supply for a broad range of consumers. By the start of the 21st century, the model changed to the federal government subsidizing demand, while increasing regulatory burdens and pursuit of prestige limited supply. 

Anyone who’s passed Econ 101 can tell you the consequence of subsidized demand and constrained supply is rising prices. And that’s exactly what happened, as the average price of a four-year degree zoomed beyond the median annual income.

Even before 2008, this burden was becoming too great for many families to bear, including the relatively affluent. Universities responded by shifting their enrollment strategies, selectively offering discounts to desirable students. Marketed as “merit aid,” this practice flattered parents’ belief that their children were being rewarded for outstanding academic promise. In fact, it was a sophisticated form of financial engineering designed to entice families into extending themselves to the very limit of their financial capability.

The growing difference between the “sticker price” and the amount actually paid by most students created a problem for universities, though. Forced by regulation and market competition to provide costly services and amenities, they needed to find a sufficient number of “full freight” customers to subsidize those receiving discounts. Some of those customers were domestic students attracted to graduate programs that charged huge prices for dubious benefits. Others were found abroad: foreign students are ineligible for federal aid, but many still seek the prestige of an American degree and pay full price. After 2008, the share of international students in U.S. higher education doubled from about 3 percent to 6 percent. On some campuses, foreign students arrived in even larger numbers and became essential to balancing the books.

The financial shock in 2008 caused some short-term discomfort, but the richest universities were buffered against these developments by the continued growth of their endowments, and they used that wealth to hasten a profound shift in the structure of higher education. Historically, the faculty was the core of the university, assisted by a small cadre of administrators (often professors taking time away from the classroom). The need to control costs while attracting valuable, full-pay students encouraged a huge expansion in the range of student services and an intensification of the amenities arms race. To pay for the administrators necessary to provide these goods, universities curtailed tenure-track hiring, relying instead on the cheaper, short-term instructors who have become the majority of the faculty in the American university.

Here, too, well-intentioned regulations generated perverse consequences. Title VII of the Civil Rights Act of 1964 and Title IX of the Education Amendments of 1972 generated large bureaucracies to ensure compliance with bans on discrimination in hiring and other services. As in the corporate world, HR departments became effectively an arm of federal race and gender policy. The mounting complexity and expense of health insurance and other benefits also encouraged the growth of university bureaucracies, which often came to include the managers of universities’ own hospitals or medical centers.

Some of these distortions in higher education might have been avoided with wiser policy, but other challenges are mostly beyond its influence. The golden age of American universities occurred as the Baby Boomers reached student age. The silver age I remember from my own college days was an echo of that demographic bulge, as Boomers’ children began enrolling during the Clinton administration. But that generational pattern has been interrupted over the last two decades. The birth rate began dropping in 2008, reducing the population of traditional-age students for the foreseeable future. Because older students, a growing portion of the undergraduate population, tend to prefer commuter, part-time, or online options that are often cheaper than residential study, the financial vise in which many institutions already find themselves is only going to get tighter.

These structural shifts are necessary, if not sufficient, causes of the “great awokening” that now dominates public conversation about higher education. The “organization kids” of the 1990s believed they were party to a bargain in which hard work and institutional loyalty would secure success. Their younger siblings — and now some of their children — think they’ve been sold a bad deal in which the mounting costs of higher education, including a grueling admissions process, offer few reliable benefits. 

Unable to determine their economic futures, students fixate on things they can control. That very much includes the personal services and social experiences they believe, with some justification, they’ve paid for. The 1990s-era cliché of “tenured radicals” is largely outdated. Most of the political negotiation on the modern campus occurs between customers (that is, students) and administrators, with faculty hardly to be seen. Whole institutions have been deformed by this reconfiguration of the academic enterprise around a novel brand of managerial progressivism. 

And so faculty of my generation are disillusioned, too. The dark mood isn’t limited to the thousands of Ph.D.s who were trained for tenure-track positions that don’t exist. Even some of those who secured what are known in the trade as “good jobs” are questioning whether the benefits outweigh the costs. COVID-related disruptions and policy changes are just the latest burden. Even if the new rules work for some instructors in some fields and at some levels, online classes are disastrous for the kind of intensive, personalized seminars that drew many of us into the academy in the first place.

But it’s not all doom and gloom in the groves of academe. Despite technological, administrative, and political challenges, great instructors teach great classes to enthusiastic students every single day at every single institution of higher education in the country. College graduates continue to earn more than those without degrees. Even humanities majors get hired. There are new experiments that promise to offer face-to-face instruction at a lower price. 

With the benefit of two decades’ hindsight, would I have made the same decision, then? I have moments of doubt, but most of the time I think I would.

The academy has big problems, many of them self-inflicted. But it also offers great rewards, including an unparalleled, if not unlimited, degree of professional autonomy. Intriguingly, in a provocative recent book, the scholars Paul Reitter and Chad Wellmon argue that the idea that the modern academy is existentially threatened has been a defining feature of the modern academy all the way back to its origins in the early 19th century. The more things change, the more they stay the same.
