The End of Higher Education’s Golden Age

Clay Shirky:

Interest in using the internet to slash the price of higher education is being driven in part by hope for new methods of teaching, but also by frustration with the existing system. The biggest threat those of us working in colleges and universities face isn’t video lectures or online tests. It’s the fact that we live in institutions perfectly adapted to an environment that no longer exists.

In the first half of the 20th century, higher education was a luxury and a rarity in the U.S. Only 5% or so of adults, overwhelmingly drawn from well-off families, had attended college. That changed with the end of WWII. Waves of discharged soldiers subsidized by the GI Bill, joined by the children of the expanding middle class, wanted or needed a college degree. From 1945 to 1975, the number of undergraduates increased five-fold, and graduate students nine-fold. PhDs graduating one year got jobs teaching the ever-larger cohort of freshmen arriving the next.

This growth was enthusiastically subsidized. Between 1960 and 1975, states more than doubled their rate of appropriations for higher education, from four dollars per thousand in state revenue to ten. Post-secondary education extended its previous mission of liberal arts education for elites to include both more basic research from faculty and more job-specific training for students. Federal research grants quadrupled; at the same time, a Bachelor’s degree became an entry-level certificate for an increasing number of jobs.