The Quad: How national ambitions, international wars explain the UC’s history of prestige
(Nicole Anisgard-Parra/Illustrations director)
By Arthur Wang
July 19, 2018 1:57 pm
This post was updated July 28 at 4:55 p.m.
Students and faculty in Berkeley and Westwood have had a lot worth celebrating lately. The University of California celebrated the 150th anniversary of its founding in March, with most of the events taking place at the inaugural Berkeley campus. Never one to miss out on a good party, UCLA is gearing up for its own festivities as the end of the 2017-2018 school year marks the beginning of a year’s worth of centennial celebrations here at the other top public university.
As such, there’s no better time to reflect on something many Californians and observers of American higher education regard as an unassailable truth: The UC is one of the most prestigious and productive university systems in the world. That standing is certainly reflected in the ludicrously popular lists of “best” American and global public universities. A monograph of the UC’s first century and a half attests to this at length – and don’t even mention what the UC’s own biography has to say about itself.
However, the UC system is not just remarkable for its rankings or considerable research output – it’s also remarkable for the rapidity with which the university has become a veritable global higher education powerhouse. Age is a key predictor of university prestige, since reputation, which pulls in undergraduate aspirants, federal research funds and donor dollars alike, is accrued over time. That the UC seems to have defied the temporal constraint on reputation development is perhaps the university system’s true crowning achievement.
In recognition of 150 years of the UC, I use the work of UC historian John Aubrey Douglass, among others, to look at how California’s original state university became a beacon of the Golden State and a model of American public higher education in less than two centuries. Along with it is an inquiry into what exactly we mean when we call a university “prestigious,” and how that understanding has been shaped by war, the federal government and money.
Before The Big “C,” Big Ag
In the mid-19th century, agriculture played an outsized role in both the American economy and its politics. The Morrill Land-Grant Acts, a staple of AP US History lectures, granted states free federal land in exchange for conducting research and instruction in agriculture and engineering. This subsequently led to the establishment or expansion of many of today’s large state universities.
The UC was partially funded by the Morrill Act, but it faced even more pressure to emphasize agriculture and engineering because the agricultural lobby, representing California’s most powerful economic sector at the time, fought hard for the soon-to-be-established university to serve its needs, even at the expense of “research for research’s sake.”
These pressures would have several consequences for the development of the UC. They helped establish its status as a thoroughly public institution – one that needed to serve the economic and educational needs of the state. The establishment of the precursors to UC Davis and UC Riverside was a direct consequence of the agricultural focus.
A war for(ged) California higher education
After World War II, California legislators stared down both a looming crisis and a historic opportunity.
The Golden State, long known for being a “cornucopia” for its agricultural production, became an industrial powerhouse during the truly global conflict. The Japanese threat and American involvement in the European theater spurred dramatic government investment in military infrastructure and the fledgling aerospace and aviation industries in the state, essentially financing the creation of what would later become collectively regarded as the “military-industrial complex.”
The crisis was that the end of hostilities would inevitably bring an economic recession to the state that had profited most from the business of war. The opportunity presented itself in the millions of service members who would settle in California, and in the massive estimated budget surplus – $230 million, or about $3.3 billion in today’s dollars – the state held in its coffers.
In perhaps one of the most consequential decisions for California public higher education and the UC’s development, Gov. Earl Warren and various state legislators decided that a massive spending program on higher education would reintegrate veterans into the economy and society and stave off recession. This remarkable financial commitment to higher education, responding to a demographic surge that may never recur, helped cement the UC’s golden status.
These commitments would culminate in the 1960 California Master Plan for Higher Education, an ambitious document that defined the community-college transfer framework; the tripartite system delineating the respective characteristics and responsibilities of the UC, the California State University and the California Community Colleges; and, most famously, the no-tuition promise for residents – a commitment which, even after it had long since eroded, meant the UC did not call tuition “tuition” until 2010.
The Plan itself emerged from a trio of legislative studies directly related to concerns about GI-driven population growth and made fiscally feasible by the massive postwar surplus. These included the 1948 Strayer Report, which laid the groundwork for the Cal Grant program; the 1955 Restudy Report, which ignited legislative efforts to establish or integrate new campuses into the UC system; and a 1957 study to identify the sites of all future UCs not named Berkeley or Los Angeles.
Fawning over federal funding
While the Second World War and its consequences eventually spurred a dramatic expansion of the UC and of California public higher education at large, an emergent mechanism for federally funded research at universities would prove another source of institutional wealth and prestige for American universities, including the UCs, the state’s designated research campuses.
Federal research funding for state universities is today unquestionably celebrated as a source of national strength and evidence of confidence in academia, or at least, certain portions of it. Even in an age of unprecedented private philanthropy to universities, Uncle Sam is still client No. 1. Universities and individual academic departments will boast of how much money they are pulling from Washington for research that ostensibly helps the nation. Yet this was not always the case.
For much of the country’s history, education in the United States has chiefly been a state matter. The federal government neither directly administers colleges, as governments do in many other developed countries, nor implements or enforces federal guidelines for how higher education should be run – its enforcement role is largely limited to preventing discrimination, and its power to compel state schools to heed those regulations is limited to the threat of withholding federal funds.
One man, more than anyone else, was responsible for changing the previously held attitude that government and academic research didn’t mix: Vannevar Bush, a scientist and a government and university administrator. During the war, Bush presided over the Office of Scientific Research and Development, a spiritual predecessor of the National Science Foundation primarily focused on developing the weapons that changed warfare, like the modern missile and the atomic bomb.
His 1945 report “Science, the Endless Frontier” is considered the pioneering document in advocating for and conceptualizing the partnership between the federal government and academia. It also unmistakably marries academia with the military – thus forming the basis of the “military-industrial-academic complex.”
Where was the UC in this configuration? Certainly not on the periphery. Bush’s affiliation with the Massachusetts Institute of Technology naturally made that science research powerhouse a key player in a tiny pantheon of schools – Stanford, the California Institute of Technology, Johns Hopkins and UC Berkeley – that have received the lion’s share of military research and development contracts over the years. As John R. Thelin details in “A History of American Higher Education,” these schools were the primary purveyors of “Big Science” because they had the track records, from involvement in past efforts like the Manhattan Project, to carry out defense research. UC Berkeley, which administered the Lawrence Berkeley National Laboratory and was the only public school on the list, was thus a key player.
Sputnik spurs science
As the postwar economic boom cooled off in the following decade, enthusiasm for prolific spending on higher education waned in California and nationwide. That would all change with one event: the launch of Sputnik 1 in 1957, which kicked off the space race.
With it came frantic political outrage as lawmakers, no longer questioning the value of university research, instead began asking why federal and state governments had so inadequately funded research and science education, from preschool to the professoriate, that the Soviets could beat the United States to a major technological achievement. This concern led to the passage of the National Defense Education Act in 1958.
Sputnik’s launch was viewed as an essentially existential threat – a peaceful proof of concept for launching intercontinental missiles – and it decisively squashed negative appraisals of federal involvement in university research from conservatives and wary left-leaning academics alike. By the early 1960s, the era of the “Cold War university” was truly on.
Washington’s multibillion-dollar patronage of university scientific research has no doubt affected how we understand university quality and prestige. Rebecca S. Lowen, in “Creating the Cold War University,” holds up Stanford as a perfect example: its unabashed postwar pursuit of federal research patronage precipitated its meteoric rise as a globally recognized research university.
Lowen makes a point of mentioning Stanford and UC San Diego – a public university truly founded for the Cold War, given its historical STEM emphasis – in the same breath. Institutional classification systems like the Carnegie system all but place at the top the large universities with the highest rates of research activity – rates tied to their intake of federal dollars, which fund the projects that become the journal articles and reports used to quantify research productivity.
The Cold War university legacy has survived in the UC even as the federal research agenda is no longer so singularly focused on military applications. The university continues to administer, in some capacity, the Lawrence Livermore and Los Alamos National Laboratories – two facilities that have conducted more federally funded nuclear weapons research than any others in the country.
Is the presence of government-funded military research the sole criterion of what makes a university great in the public’s eyes? Certainly not. The Cold War is in the history books, and the small liberal arts colleges and community colleges that supplement this intentionally limited look into American higher education history continue to enroll hundreds of thousands of students. Yet there is no doubt that Caltech, Johns Hopkins, Georgia Tech and many other schools would not be looked at the same way without the Cold War and its long shadow. The lasting effect of both hot and cold wars on the University of California’s development, and on how we understand the origins of the modern research university, is a reminder of how one of humanity’s most noble causes – education and the pursuit of new knowledge – was so influenced, and financed, by efforts to inflict profound violence on others.