The 21st will reveal its university lineage just as the 20th century did, writes Sheldon Rothblatt.
Do dates such as 1900 or 2000 have any particular significance for understanding changes in university history? The immediate answer is "no" - the question is a weak excuse to review some of the salient features of academic history in the United States.
The major systemic features of US higher education are largely a product of the last third of the 19th century and can be correlated with geographical expansion, population growth, industrialism and urbanism. Modular instruction and examining, the rise of the department as the central unit of teaching and the simultaneous decline of the chairholder are all features of the post-1860s period. Curricular modification in response to niche marketing occurred much earlier and never ceased. Wissenschaft, the ethic of knowledge acquisition derived from Germany, was an established feature by 1900 as younger lecturers, fortified by the example of overseas scholarship, wrestled their entrenched seniors to the mat. On such acts of defiance the separate graduate school was ultimately built.
The "people's universities", the so-called land-grant universities founded with federal government assistance to further low-cost higher education and to promote technical education, date back to the 1860s. In roughly the same period, the philanthropist Ezra Cornell created a market-responsive university of his own in upstate New York, one that is today a hybrid of land-grant inheritance and private income. The road from there to the multiversity was direct. In fact, the multiversity is but a more catchy name for a multiplicity of service functions long ago accepted as a necessary aspect of higher education.
Great polytechnics such as Massachusetts Institute of Technology were in place. The Johns Hopkins University, an institution infused with German conceptions of advanced learning, was up and running from the mid-1870s. Another experiment with a unique history, the University of Chicago, was developed by a Rockefeller in the 1890s. The belief in competition, that institutions improved when forced to compete for students, teachers, resources and prestige, was accepted by great leaders such as Charles Eliot at Harvard.
Even the "old-time college" was innovating. Its adolescent ethos, student-centred, liberal arts education and in loco parentis attitudes did not disappear as some historians have imagined. New colleges were continually created to meet pressures from different constituencies, including African-Americans. Vassar College for women was a wonder of the age, with its monumental buildings and scientific equipment. In another guise, the old-time college invaded all universities to become incorporated as "colleges of letters and science" on the rhapsodical American principle that a single institution could master all forms of education.
These were some of the great systemic achievements of the 19th century that created the 20th-century university. The next great period of innovation and change belongs to the 50 years after 1950.
For convenience, we can say that a quartet of postwar developments is remarkable. First was a commitment to expanding scientific and technical research derived from the experience of global wars and post-1945 Soviet containment policies. At long last Washington DC was an active contributor to the culture of universities. It was also a willing sponsor of the previous century's creation of peer review as the principal means of identifying scientific success.
A second feature was the linking of all state-sponsored segments of higher education into systems with distinct responsibilities, intended to rationalise resources and regularise student opportunities. Non-elite public institutions consented to mission limitations in exchange for guarantees of student transfer.
Another change, not so much systemic as soul-baring, accompanied by measures of guilt and rectification, was the acknowledgement of an unpleasant history of discriminatory practices. Gentlemen's agreements, insensitivity to the aspirations of women and indifference to the educational condition of minorities prevailed until the late 1940s or 1950s with respect to Jews, and later with regard to others. No one anticipated the turn that mea culpa took. Differences of opinion over the best policies for encouraging and promoting the excluded were normal, but the subsequent politicising of campuses, the proliferation of rules to police speech, the theft of student newspapers when unpopular views were expressed and the spillover of disappointment, frustration and anger piled up as the century wound down. Intellectual life itself was affected. Questioning scholarly integrity and attributing hidden agendas to scholarship almost became the order of the day.
But the legacies of the culture and science wars, no matter how insidious, became a sideshow to the fourth contribution. Even a historian, whose gaze is turned congenitally and unflinchingly to the past, may acknowledge that the high-tech universe that broke loose in the 1990s has the potential to shatter most past-century features of the US university. All eyes here in Berkeley are on the staggeringly controversial deal between the Swiss chemical and agricultural giant Novartis and the College of Natural Resources, the carefully drawn $25 million contract that is the first such (anywhere?) to commit an entire regular teaching division of a research university to a business alliance. All past disputes over definitions of pure research, over industrial cooperation, over the pursuit of knowledge for its own sake now seem as quaint as a medieval dispute about Aristotelian syntax. The first half of the 20th century drew out the implications of its parental inheritance. So too will the 21st, but don't hold your breath.
Sheldon Rothblatt is professor of history at the University of California, Berkeley.