Journalists who cover education seize on every opportunity to report on outcomes in elementary and secondary schools. They write articles about schools that don’t make adequate yearly progress under No Child Left Behind regulations, end-of-year promotion tests, National Assessment of Educational Progress results, SAT and ACT scores, and high school exit exams. They also take note of dropout and college-going rates.

By contrast, coverage of higher education often seems to accept as an article of faith that college students learn what they set out to learn as long as they pass their courses and get degrees. The diploma seems to matter most to journalists and, frankly, to employers; it is treated as a proxy indicating that a graduate has absorbed a body of knowledge or mastered a set of usable skills. Reporters devote few column inches and little airtime to examining what college students learn.

Journalists’ lack of attention to learning outcomes is not surprising, since most colleges and universities don’t devote much time or many resources to the issue, either. Instead, they tend to fall back on their “best-in-the-world” reputation, cite the need for an independent professoriate, and speak of the difficulty of using any test to gauge the value of higher education. Nevertheless, the debate around measuring learning outcomes is growing louder and occurring in more places across the country.