Letter from the President


The Heart of the University is the Classroom

“The very possibility of civilized human discourse rests upon the willingness of people to consider that they may be mistaken.” It seems a long time since I quoted those words in last year’s annual report. They were spoken by the great historian Richard Hofstadter in 1968, when America’s universities were plunging toward anarchy, and they feel urgently pertinent now.
 
As our nation seems once again to be breaking apart into raging factions, I want to return to this conviction that the capacity for doubt—the willingness to think and rethink in view of new evidence and experience—is the hallmark of a civilized society. Those of us who teach know that for our students as well as for ourselves the best way to learn is with zeal leavened by humility. The challenge now is to retrieve and nurture that spirit—and the place to begin is the classroom.
 

I

 
The imperative to acknowledge our own fallibility has a long lineage. In the West it stretches back to Judaism, suffused with a dialectical spirit of self-questioning. Early Christianity decried the sin of pride as a mark of reprobation. In ancient Greece Plato subjected all claims to inconvenient questions from his teacher, Socrates, who confessed (as translated by Benjamin Jowett), “I neither know nor think I know.”
 
For many people, self-doubt is an uncomfortable condition that cannot match the appeal of ideology in politics or dogma in religion. But it is a necessary precondition for curiosity, tolerance, or any form of open thought. This was a central insight of the Enlightenment, from which some two centuries ago there arose a new kind of educational institution committed to what my Columbia colleague Stuart Firestein, a distinguished neuroscientist, describes as “knowledgeable ignorance”—the recognition of how little one knows and that everything one thinks one knows is provisional, pending further investigation. It was on that basis, at Berlin in 1810, that the modern university was founded as a site of free inquiry and teaching in which the pursuit of truth proceeds in a spirit of skepticism toward received certitudes and brooks no interference from prelates, princes, or any authority external to the university itself.
 
The functional principle of this new institution—academic freedom—was slow to take hold in the United States. Through most of the nineteenth century America’s universities, such as they were, remained under denominational constraint if not control. In 1878, for example, the president of Vanderbilt University, a Methodist bishop, fired its professor of geology for teaching evolution in defiance of “the plan of redemption.” Soon the clergy were supplanted by men of wealth who had no problem with Darwin, at least not in the dubious form of Social Darwinism whose doctrine of “survival of the fittest” flattered the thriving rich. But when it came to other unwelcome ideas, they were ready to step in to exercise what Thorstein Veblen called “pecuniary surveillance” over “the permissible limits of learning.” At the University of Wisconsin, in 1894, two leading faculty were dismissed for “teaching socialism.”
 
It wouldn’t be difficult to assemble an anthology of similar interventions first by men of the cloth then by titans of industry. In retrospect all these efforts look like rear-guard actions by a regime in retreat. In 1869, speaking at his inauguration as president of Harvard, Charles W. Eliot (a chemist descended from Puritan clergy) declared that “the winnowing breeze of freedom” had begun to “blow through all its chambers.” Arguably the most ventilated of America’s universities had been established four years earlier in upstate New York by the engineer and entrepreneur Ezra Cornell, who, dismayed by the stifling atmosphere at existing institutions, envisioned a new kind of university in which “any person can find instruction in any study.” Soon several new institutions—including Johns Hopkins (1876) and the University of Chicago (1890)—were founded on the premise that no subject or point of view should be ruled a priori out of bounds.
 
Still, the fight over academic freedom continued into the twentieth century. In 1900, the Stanford sociologist Edward Ross gave a public lecture opposing the use of Asian immigrant labor by the railroads. Jane Stanford, widow of the railroad tycoon who had founded the university and now its sole trustee, was enraged, and Ross was fired. In 1915 a group of eminent scholars led by the Columbia philosopher John Dewey established an advocacy organization—the American Association of University Professors (AAUP)—whose Declaration of Principles defined and defended academic freedom as its cardinal value. Two years later, the Columbia trustees, unimpressed, prevailed upon the university president to dismiss several members of the history department for holding a view—opposition to America’s entry into World War I—that was obnoxious to their own. With tacit consent from most faculty, he complied. Attempts at suppression continued during the “Red Scares” that followed both World War I and World War II, but by and large the forces of reaction grew weaker as the faculty’s role in institutional governance, bolstered by the protections of tenure, grew stronger.
 
The shared premise of the new universities—as well as older ones that were attempting to renew themselves—was that the only route to knowledge is through restless discontent with received certitudes. To this end, using language both descriptive and aspirational, the AAUP defined the modern university as “an intellectual experiment station.” At its best, that’s what it became and what it continues to be.
 

II

 
From the start, however, even the most progressive academic leaders knew that universities could sustain their “searching function” (Hofstadter’s phrase) only if they won support, or at least forbearance, from the public.
 
As a legal matter, private institutions were answerable only to themselves. As early as 1819, the principle of self-governance had been affirmed by the U.S. Supreme Court when it ruled that an attempt by the state of New Hampshire to alter the charter of Dartmouth College was unconstitutional. But while older institutions with private endowments were thus protected from government interference, by the early twentieth century they were becoming, in an extra-legal sense, quasi-public. Through a series of laws enacted by Congress culminating with the War Revenue Act of 1917, the modern system of tax exemptions for non-profit organizations and tax deductions for their donors was taking shape. The rationale for conferring tax advantage on institutions and on the philanthropic individuals who support them is that they serve the public interest. But when consensus breaks down over what constitutes that interest, tax advantage starts to look like tax evasion and politicians can be counted on to stress the resemblance.
 
Though the AAUP founders could hardly have been expected to foresee the crisis of trust in which higher education finds itself today, they were sensitive to the risks. They knew that American universities have always had to manage a distinctively American tension between government and private enterprise, including enterprises that do not operate for profit. They knew, too, that the right to autonomy must be balanced with some degree of deference to external forces—always a delicate balance because any true university, while ultimately dependent on public approval, must be a nursery and sanctuary for ideas that incur public disapproval.
 
Over the course of the twentieth century, this mutually wary relationship—now commonly referred to as a compact or contract between academia and society—became more and more consequential, with a great deal at stake for both partners. Following World War II the monetary value of the contract soared with rapid expansion of the National Institutes of Health and, early in the Cold War, the establishment of the National Science Foundation partially in response to technological and military competition from the Soviet Union. The U.S. government now invested huge sums in universities in the form of research grants as well as grants and subsidized loans directed through the Department of Education to millions of students. And because the labor market increasingly rewarded holders of a college degree, the benefits of the compact were widely assumed to outstrip the costs.
 
But lately, for a host of reasons, notably the galloping price of tuition, driven largely by public disinvestment but often blamed reflexively on universities themselves—which do, indeed, share some of the blame—college education has come under attack as a foolish investment for young people seeking work in the trades or the service economy or even in white-collar jobs. Some mega-wealthy entrepreneurs, almost all of whom benefited from attending college, now ridicule it as a time-wasting distraction. And despite the astonishing success of mRNA vaccines in preventing or mitigating Covid-19 infection, as well as countless other therapies and technologies developed in university labs, the reputation of academic scientists (and, to some extent, economists) fell into sharp decline following the shocks of the pandemic—prolonged school closures, mask mandates, enforced isolation, along with persistent inflation triggered by shortages of staple goods and by the federal effort to stabilize the economy. Calls to “follow the science” now seem to many people to lead to perdition.
 
Meanwhile, another even more damning narrative burst into prominence. This one says that universities, dominated by what the current president of the United States calls “Marxist maniacs and lunatics,” have become un-American or even anti-American, intolerant of conservatism and capitalism itself, ruthless toward dissidents, incubators of antisemitism, and so on.
 
It’s mostly a slander. With humanities enrollments declining and expectations rising that the main function of universities is to train students for the workplace, the number of students sitting in classrooms discussing such supposedly subversive authors as Karl Marx or Michel Foucault or Judith Butler is small and shrinking. And even for those who are asked to read them, there’s no reason to assume that they are doing so with uniform assent. “The charge of widespread ‘indoctrination’ by left-wing faculty,” writes the historian David Bell, a liberal centrist, “is nonsense, and, worse, a pretext being used shamelessly by the right to attack the university system and bring it under government control.”
 
Still, there’s no doubt that universities—especially the so-called elite—have made themselves easy targets by furnishing their critics with a rich repertoire of follies and scandals that cast doubt on their professed devotion to academic freedom. They have done too little to counter the view that they’ve become playgrounds of “virtue signaling”—trigger warnings, land acknowledgments, compulsory statements of fidelity to a narrow conception of “diversity,” and so on. Even if this damning portrait of academic culture is an opportunistic exaggeration, there’s enough truth in the caricature that the public is hard pressed to distinguish between reality and appearance. The occasional shouting-down of conservative speakers, for instance, can be counted on to yield incriminating headlines even when the speaker is a provocateur happy to make news by being heckled or harassed.
 
After the monstrous attack by Hamas on Israeli civilians in October 2023, some universities, faced with the task of distinguishing between protected free speech and poisonous hate speech, made matters worse by bungling their response to protests over Israel’s ferocious counterattack. At my own academic home, whose full formal name is Columbia University in the City of New York, non-students who seek to bring toxic hatreds onto campus are only a subway ride away. The situation was managed poorly. But none of us who stood on the sidelines should be confident that we would have defused it before it spun out of control.
 
Well before those sad and damaging events, universities had been exposing themselves to attacks in the name of outraged patriotism. In the always-contested terrain of American history, college courses once likely to conform to a triumphalist story inherited from antebellum and Gilded Age historians who celebrated America’s “exceptionalism” now emphasize exclusions and cruelties aimed at enslaved, indigenous, or exploited peoples who served involuntarily or resisted America’s “Manifest Destiny.” This is a belated corrective to the boosterish narrative that once marginalized millions of Americans in favor of a monochrome story of power and progress. But as Bill Readings warned three decades ago in a prescient book, The University in Ruins (1996), it’s at their own peril that universities abandon their role as “producer, protector, and inculcator of national culture” in the eyes of the broader public.
 
The charge that academia is a nest of insidious leftists has deep roots. It reaches back to William F. Buckley (God and Man at Yale, 1951) and Allan Bloom (The Closing of the American Mind, 1987) and has been carried on with varying degrees of subtlety by conservative writers such as Roger Kimball, Bruce Bawer, and George Will, among many others. More recently, it’s been taken up by liberals who write in a tone closer to that of disappointed friends—writers such as Laura Kipnis, Jonathan Haidt, Yascha Mounk, and Eboo Patel, all of them warriors against the excesses of “Woke,” “DEI,” “Me Too,” and “Cancel Culture.”

I doubt that many people on the left, center, or right (with a few exceptions such as Christopher Rufo and others involved in the Heritage Foundation’s “Project 2025”) anticipated that with the second election of Donald Trump, critiques of the university—whether fanciful or well-founded or somewhere in between—would become grounds for suspending or cancelling billions of dollars in research grants, curtailing the influx of international students, and enforcing “viewpoint diversity” (on the weak premise, as Brown University professor Amanda Anderson puts it, that “individuals occupy identifiable political positions aligned with stable and explicit beliefs that do not change over time”) or “meritocratic” admissions and hiring, as if merit were an objective quality measurable in scores or grades or by counting publications or citations.
 
With the full force of government behind them, enemies of the university are now employing a strategy of divide and conquer. In the absence of a collective response remotely comparable to the defense led by the AAUP a century ago, this strategy has proven to be highly effective. Columbia, Penn, Brown, Cornell, and, most recently, Northwestern have made their deals. Harvard appears to be dealing with one hand and defying with the other. The Board of Visitors at the University of Virginia, buckling under fear of becoming the next target, forced out President James Ryan, a person of luminous integrity. And even with hints of a truce in the air, it seems unlikely that research support will return to previous levels anytime soon. With the rise in taxation rates on endowment income at the wealthiest institutions and an impending drop in Indirect Cost Recovery rates attached to federal grants at all institutions, the damage promises to be difficult if not impossible to reverse.
 

III

 
I’ve taken an historical approach to some of the issues now facing higher education on the premise (the premise, I think, of all historical writing) that knowledge of the past has value for confronting challenges in the present. And though they account for only a small fraction of our Foundation’s grant-making, I’ve focused on well-known research universities because they are the institutions most conspicuously under assault. It should be stressed, however, that collateral damage is rippling through the whole range of higher education—from regional public universities to liberal arts colleges and Historically Black Colleges and Universities, as well as community colleges, which collectively enroll vastly more students, including large numbers of low-income and first-generation students, than the elites. Compensatory cuts in state subsidies to education seem likely to follow reductions in federal Medicaid support. Pell grants continue to lose their purchasing power in relation to the price of college attendance. Federal funding for English language instruction, technical training, and other forms of adult education is threatened by restrictive provisions in the “Big Beautiful Bill” as well as by the dismantling of the Department of Education (DOE). In short, programs designed to help the most vulnerable students are on the chopping block.
 
Those mounting the attack on higher education have many rhetorical advantages. Knowingly or unwittingly, they’ve tapped into an ugly undercurrent of contempt not only toward privileged intellectuals but toward people very much without privilege—immigrants, persons of color, students from disadvantaged backgrounds deemed unprepared or unfit. They also benefit from the fact that the term “university” is an abstraction in which it’s hard to discern the dedicated teachers, researchers, students and support staff who do the day-to-day work in classrooms, labs, dorms, recreational facilities, administrative offices and all the rest, serving not only the academic community but neighbors as well. At my own university it’s been heartbreaking to watch the gutting of Columbia’s Double Discovery Center, which, almost entirely dependent on now-cancelled grants from the DOE, has served, for sixty years, thousands of vibrant high school and middle school students in northern Manhattan with tutoring, mentoring, and counseling—a magnificent exception to the rule that elite institutions hold themselves aloof from the local communities in which they exist.
 
Earlier this fall, nine universities in varying degrees of favor or disfavor with the current administration were invited to sign a “Compact for Academic Excellence in Higher Education.” In exchange for acquiescence to its demands (among them, a five-year tuition freeze and limitations on foreign student enrollment), the compact promised “multiple positive benefits” such as “substantial and meaningful federal grants.” To qualify for these benefits, the signatories were asked to ban anything that would “punish, belittle and even spark violence against conservative ideas.”
 
Do “conservative ideas” include such features of the American past—which some people, no doubt, hope to see restored in the future—as segregation, prohibition of “mixed” marriages, and literacy tests at the polls? Do they include superseded conservative ideas such as free trade, limited government, and an independent judiciary? Its incendiary incoherence notwithstanding, the proffered compact, in the blunt words of Berkeley’s law school dean, Erwin Chemerinsky, amounts to an attempt at “extortion, plain and simple.” So far, thankfully, it’s an offer that the invitees have been willing to refuse.
 
But universities continue to be under surveillance by Big Brother, who has demonstrated his willingness to punish transgressors and to do it again anytime he doesn’t like what he sees or hears. Meanwhile, state legislatures dominated by his followers—notably in Florida and Texas—have banned the teaching of “divisive concepts” and granted new powers to administrators and trustees to review, discipline, or even terminate tenured faculty. Students with grievances about course content or grades or disciplinary procedures now know there are allies outside the university itching for a fight—conservative media, watchdog organizations, legislators, even governors—to whom they can appeal and thereby threaten faculty or staff by whom they’ve been offended. We face the prospect, as Suzanne Nossel, former director of PEN America, writes, of professors “walking on eggshells for fear of not just a viral video or student complaint, but one that triggers government reprisals.”
 
What should we make of the charge that universities have brought these furies upon themselves? For the most part, comparing instances of intellectual intolerance on the left to the wholesale assault from the right amounts to an egregious case of false equivalence. Yet no doubt there are elements of truth in the backlash against “identity politics,” or the sweeping claims of “Critical Race Theory,” or the instinctive deployment of the term “social justice,” or the excoriation of “settler colonialism” as a proxy for antisemitism. Despite the current administration’s mobster tactics, we in academia should consider that once we get through the current crisis, some of the shocks may prove salutary, and that some self-searching is overdue.
 
To that end, the AAUP’s Declaration of Principles remains a valuable resource for reflection. Its authors understood that universities, even though liberated in the previous century from control by the sects that founded them and coercion by the governments that chartered them, bear ultimate responsibility for regulating themselves. In a society where hostility toward intellectuals is always latent and sometimes virulent (Hofstadter wrote the key book on this subject, Anti-Intellectualism in American Life, 1963), they knew, too, that universities would always remain suspect. On the core issue of academic freedom, as the legal scholar Robert Post has remarked, if faculty were to construe that freedom as the right “to research and publish in any manner they see fit,” it wouldn’t be long before public support “would vanish.” Post made that comment sixteen years ago. Today we seem to be approaching the vanishing point.
 

IV

 
Anyone concerned about the future of higher education would do well to recall that even as the authors of the Declaration defended academic freedom as essential for the search for truth, they devoted a section (cited much less frequently than its clarion calls for freedom) to its “corresponding duties” and “correlative obligations.” But what, exactly, did—and does—this call for self-regulation mean?
 
In the natural sciences it has meant that hypotheses must be tested by experiment and experimental results put to the test of replication. In the so-called social and human sciences—where the line between argument and opinion is often thin and work can be ideologically tendentious—it takes the form of “peer review” of journal articles, book manuscripts, and nominations to tenure. These quality-control methods are by no means immune to error or abuse, but they do ensure that propositions and theories, to be taken seriously, must have something more than the status of unfounded claims.
 
But when most people think of the university, they don’t think of the lab or the academic journal or the scholarly conference. They think of the classroom. When it comes to teaching, how well have universities been monitoring themselves?
 
Here’s a frank assessment from the long-serving president of Bard College, Leon Botstein, who has denounced Big Brother more fiercely than all but a very few college presidents while at the same time holding universities to account in a spirit of loyal opposition:
 
Universities have retreated shamelessly from their obligation to teach, particularly undergraduates. Higher education has become a matter of lecture halls, seminars taught by graduate students, standardized testing, fully remote learning, and obscurantist pseudo-professionalism, notably in the humanities and social sciences. Universities have abandoned the act of serious teaching and learning and settled for anonymity and routine.
 
One need not agree in every particular with this list of indictments (in my experience graduate students can be very good teachers), but every honest observer of higher education knows there’s a lot of truth in it. Fifteen years ago, in their data-heavy book Academically Adrift: Limited Learning on College Campuses, sociologists Richard Arum and Josipa Roksa argued in depressing detail that too many universities have been complacent as “students’ academic effort has dramatically declined in recent decades.” They made that argument long before the advent of A.I., which now invites students—at least those who wish to evade responsibility for their own learning—to substitute prefabricated writing and research for their own work.
 
Unfortunately, academia’s loudest critics are generally less concerned with meeting the needs of disengaged or unconfident or underprepared students than with the putative problem of radical professors browbeating students with “woke” ideas. How credible is this charge? The University of California sociologist Steven Brint, another sober critic of academia from within the academic world, put it this way in a recent interview with the Chronicle of Higher Education: “Those who are ‘woke,’ or social-justice oriented, remain a fairly small part of the faculty, let’s say 20 percent as a kind of rough approximation. . . . In some disciplines like gender studies and ethnic studies it approaches 50 percent but, in self-description at least, never gets to be a majority even there. So it exists, but it’s not a majority tendency.”
 
But even if most teachers remain open to a range of perspectives, less sanguine observers will say that students whose thinking doesn’t align with the “progressive” norms of the university nevertheless feel compelled to censor themselves. The classroom thus gives itself up to “the tyranny of the prevailing opinion and feeling” by which a majority—more likely, an imperious minority—seeks to “impose . . . its own ideas and practices as rules of conduct on those who dissent from them,” and thereby to “fetter the development and, if possible, prevent the formation of any individuality not in harmony with its ways, and compel all characters to fashion themselves upon the model of its own.”
 
Those words come not from a contemporary writer but from the greatest exponent of free speech in the Anglo-American tradition, John Stuart Mill. In his classic essay On Liberty (1859), Mill sounded the alarm about “a social tyranny more formidable than many kinds of political oppression.” Today it’s an article of faith among anti-academic pundits that even if teachers don’t browbeat their students, universities are aiding and abetting the habit of self-censorship, which inhibits both faculty and students from saying—or even thinking—what they wish.
 
The “soft power” of conformity is impossible to measure, but it certainly exists. When I consider my own slight experience of it, I confess that I’ve done a little self-censoring myself. For example, in my course on early American literature I had assigned for many years Ben Franklin’s “Advice to a Young Man on the Choice of a Mistress” (1745), in which he advises a young friend on the sexual advantages of older women. Ribald and deliberately outrageous, it guaranteed raucous laughter when I read it aloud, which I did partly for comic relief after a week of theological tracts and hellfire sermons by Franklin’s contemporary Jonathan Edwards, but also because it illustrates the split (Edwards would have been scandalized had he read it) between pietism and rationalism in eighteenth-century America. Then around a decade ago I felt I ought to drop it from the syllabus because the laughs had died down and given way to a few nervous chuckles or silent stares. The last time I assigned it, the only hearty laughter came from the one older student in the room, who asked me after class for the full citation so she could share it with her ex-husband.
 
I’m still not sure where my decision falls on the spectrum between cowardice and good manners. One reason I appreciate the AAUP authors is that they advised continual self-monitoring. “The teacher,” they wrote, “ought also to be especially on his guard against taking unfair advantage of the student’s immaturity by indoctrinating him with the teacher’s own opinions before the student has had an opportunity fairly to examine other opinions upon the matters in question.” Then they added the admonition that professors should “refrain from intemperate or sensational modes of expression” not only in the classroom but in their extra-mural speech as well.
 
In fact, cool disinterest in the classroom is rarely desirable, especially when the subject entails judgments of value. It’s a prescription for bland, dull, and bloodless teaching. Good teachers exemplify what it means to care passionately about whatever subject they are teaching—whether it’s an idea or an historical episode or a work of art or any aspect of human experience. But good teachers also exemplify the habit of self-questioning, of taking seriously arguments that run counter to their own convictions, beginning with the obligation to make students aware that such arguments exist.
 
With that imperative in mind, I find another aid to reflection in that brilliant memoir and cultural history, The Education of Henry Adams (1907), in which Adams recalls wishing during his brief stint teaching medieval history at Harvard that he could have “seated a rival Assistant Professor opposite him, whose business should be strictly limited to expressing opposite views.” (It was a futile wish because even at 19th-century Harvard team-teaching was considered prohibitively expensive.) Adams understood the teacher’s responsibility not as transmitting information or cajoling the class to accept his own line of thought but as helping young people grapple with the complexities and contradictions of interpretation. His overriding purpose in the classroom was to “create conflicts of thought” within and among his students. That should be our purpose too, especially now.
 

V

 
So, I’m tempted to say that the mission of the Teagle Foundation is to Make the Classroom Great Again. I mean this not in a spirit of nostalgia for some mythic past when universities had their values straight and academics lived by them. I mean it to celebrate the many caring teachers whom we at the Foundation are privileged to support as they work to deepen and enrich the college experience so that it becomes something more than mere credential acquisition.
 
Much of what I’ve reported in these pages is dark and alarming, and there’s no downplaying the challenges we face. But amid the ferment there are reasons to be hopeful. I sense, for instance, a new spirit of reflection not only among faculty but also among students and academic leaders as we all try to do our part in defending the institutions to which we owe so much. Any honest defense should include a strong dose of tough love.
 
One intriguing development is the proliferation of centers, institutes, and schools of “Civic Thought” in response to the perception that curricula have been hijacked by tenured radicals. In fact, the first such center, the James Madison Program in American Ideals and Institutions, was founded long before the present fraught moment, at Princeton in 2000, by the natural-rights political philosopher Robert George. Twenty-five years later, Yale established its own center under the direction of Bryan Garsten, a scholar and teacher in the venerable tradition of liberal critics of liberalism.
 
So far, however, the movement has been making headway mainly at public institutions in red states. The effort picked up speed in 2016 with the establishment by the state legislature of a School of Civic and Economic Thought and Leadership at Arizona State University under the direction of the historian of political philosophy Paul Carrese. Since then, many other centers and schools have been created by legislative fiat at state flagship universities, including the University of Texas at Austin, the University of Florida, the University of North Carolina, and Ohio State (now one of five such programs in Ohio). Three other states (Iowa, West Virginia, and Utah) have passed legislation in the past year alone to create similar programs.
 
The future of this movement is unclear. For the moment, these schools and institutes enjoy considerable independence and run on separate budget lines not generally subject to shared faculty governance. Writing in The Washington Post, George Will hails them as agents of salvation that are “reviving universities’ civic seriousness . . . reinvigorating the humanities, inspiring students eager to grapple with big questions, and reversing academia’s forfeiture of its prestige.” This seems overheated and premature. While their courses on such topics as American constitutionalism are drawing growing enrollments, and some now count toward pre-existing general education requirements, there are many challenges ahead.
 
For one thing, adjacent faculty tend to regard them with suspicion, as little more than fronts through which conservative state legislatures can influence university hiring and decision-making. And, to use the prevailing term of opprobrium for the noxious territorialism of academic life, any program that originates outside the existing institution rather than arising organically from within risks becoming just another “silo” in which faculty and students seek solidarity in the company of like-minded peers. Most of the new centers depend, moreover, on a scarce supply of qualified faculty and on funding streams from legislatures that may become less friendly with a shift in the political winds.
 
Still, we seem to be witnessing a determined effort to replicate something first attempted some 250 years ago by the nation’s founders, who tried but failed to establish a new academic institution explicitly designed to prepare young people (exclusively young men, at the time) for citizenship and leadership in the new republic. Distrustful of the old sectarian colleges with roots in this or that religious denomination, the founders envisioned a federal university, to be located in the nation’s capital, whose mission would be to teach what Benjamin Rush called “the principles and forms of government, applied in a particular manner to the explanation of every part of the Constitution and laws of the United States.” George Washington told Congress that the paramount obligation of such a federal university would be to teach “the people themselves to know and to value their own rights; to discern and provide against invasions of them; to distinguish between oppression and the necessary exercise of lawful authority” so they would obtain a “temperate vigilance against encroachments, with an inviolable respect for the Laws.” In the twentieth century, similar motives animated the creation of general education curricula such as the Columbia Core, which began as a “War Issues” course during World War I, and the Harvard “Red Book” curriculum, intended to prepare students for responsible citizenship following World War II. It remains to be seen whether the new 21st-century civic thought institutes manage to adapt and advance those earlier attempts at civic education, or whether they devolve into just another cluster of battle stations in the culture wars.
 
Meanwhile, at Teagle, we have launched our own more local and emphatically non-partisan initiative, Civics in the City, which supports programs in our home city for helping students to grasp “American history and democratic principles while providing opportunities to practice public service, stewardship, and problem solving.” We’re also working with the Jack Miller Center for Teaching America’s Founding Principles and History (JMC), whose mission has long been to prepare young faculty to teach in established history, philosophy, and political science departments. And this past year we helped to launch the new Chang-Chavkin Center for Liberal Education and Civic Life at Bard College, directed by my colleague Roosevelt Montás, whose mission is to prepare young faculty for teaching in non-disciplinary general education programs with a focus on core texts.
 
In these vexing and bewildering times, we at Teagle are determined to continue our work of identifying and supporting faculty committed to teaching the liberal arts with care for students who are just starting to discover their passions and test their talents. To that end, we believe in the indispensable value of confronting challenging texts, privately in reading or publicly in discussion—whether they are essays, poems, plays, stories, sermons, tracts, historical narratives, or any genre that pushes readers to grapple with such antinomies as altruism and ambition, law and justice, and equality and opportunity, which sooner or later confront us all. At stake in the present moment is nothing less than the fate of such an education—properly called a liberal education—in our colleges and universities. By “complicating [the] answers” and “unsettling the certainties,” as Roosevelt Montás puts it in a recent interview, it’s an education powered by the kind of healthy self-doubt that Hofstadter pleaded for some sixty years ago when he feared it was headed for extinction.
 
It's in the service of such an education that we’ve been building a library of online workshops on “How I Teach this Text.” It’s why we hold annual in-person convenings of Cornerstone: Learning for Living grantees, who are hard at work revitalizing general education so that students fresh to college, before committing to this or that pre-professional major, can engage with works that raise enduring human questions. It’s why, through our Transfer Pathways to the Liberal Arts initiative, we work with our partners at the Arthur Vining Davis Foundations to help community college students achieve their dream of transferring into liberal arts colleges that stress small-class discussion. It’s our purpose, too, in providing support for faculty who lead our Knowledge for Freedom programs, which invite low-income high school students into college-level humanities seminars where they encounter ideas that have formed and reformed our culture.
 
The classroom to which the Foundation is devoted is a place where teachers become learners and learners become teachers. It’s a place where students—as well as faculty—can have the surprising experience of walking into the room convinced of one point of view and walking out with new openness to other points of view. Such a classroom works best when it’s a place of genuine diversity in every sense of the word—now, alas, a taboo word—in accord with J.S. Mill’s conviction that
 
the only path to knowledge of any subject—including that most mysterious subject, the self—is by hearing what can be said about it by persons of every variety of opinion, and studying all modes in which it can be looked at by every character of mind.
 
No one has stated better what a good classroom in a good university should be.
 
The future of this vital classroom depends on our will to overcome many countervailing forces. These forces include incentives that drive faculty to focus on disciplinary research and relentless publication at the expense of teaching; a growing truancy problem by which students regard class attendance as tiresome and optional; the incursion of A.I. into every aspect of life, which has the effect of leaving students unaccustomed to the low-tech methods of reading closely and speaking carefully; the budgetary challenge of preserving small classes that can never deliver the economies of scale provided by online instruction or big lectures—not to mention the need for inexperienced faculty to learn how to conduct a discussion that has both spontaneity and structure.
 
I want to close this lengthy rumination the same way I began, with somebody else’s words—this time not from a renowned historian but from a current college student who wrote a column not long ago in my university’s student newspaper, the Columbia Daily Spectator, about the tone and atmosphere we value in the college classroom. Its author, Salvatore Mannella, a college senior and an officer of Columbia’s chapter of College Republicans, was writing in the wake of the assassination of Charlie Kirk:
 
We are students. Every day, we sit and listen to those who know things that we do not. But something changes when we leave the classroom: suddenly we have all the answers.
 
He then offers the modest suggestion that “we should approach our discussions with one another with the same openness and humility with which we approach our studies.”
 
However much we may disagree about values entailed in politics or religion or ethics or aesthetics, surely we can agree that if the norms of the classroom—taking hard questions seriously, thinking before speaking, honoring the right of others to speak and be heard—were to supplant the chanting and shouting that tend to set the tone of so much discourse outside the classroom, we’d all be better off. Who could argue with that?
 
--Andrew Delbanco, President