Revista Electrónica de Investigación Educativa


Vol. 3, No. 2, 2001

Higher education in the United States:
Historical excursions

Jack H. Schuster
jack.schuster@cgu.edu

School of Educational Studies
Claremont Graduate University

150 East Tenth Street
Claremont, California, USA

 

Paper presented at the Congreso Nacional sobre Historia de la Educación Superior en México [National Congress on the History of Higher Education in Mexico], organized by the Universidad Autónoma de Baja California (UABC) and the Asociación Nacional de Universidades e Instituciones de Educación Superior (ANUIES)1
Tijuana, Baja California, Mexico, November 8-10, 2000

Abstract

Higher education in the United States has been transformed in the last decade as never before in its three hundred years of history. Even though its origin is intimately linked to the religious groups of English settlers, nowadays it is characterized by increasing opportunities of access for students and by a decentralized system which allows institutional diversity. This paper is intended to explain both features by means of a retrospective journey along the main trends of current American higher education; and on the way, a critical review is done of its development, policies, similarities and differences when compared with other countries' higher education.

Key words: Higher education, history of education, educational policy.

 

Introduction

The purpose of my remarks is to link some of the most significant current trends in higher education in the United States to their historical origins and to suggest implications for the future. In so doing, it is not my intention to outline the basic structure and scope of higher education in the United States; this audience is well aware of those characteristics, and my recital here would be redundant. Rather, I shall focus on several current trends in higher education in the U.S. and describe their historical "journeys".

In attempting to place these current developments in their proper historical context, I recognize that the future always resists prediction. The literature of American higher education is replete with predictions that failed to materialize. Particularly notable is the fact that many of those unrealized forecasts were authored by (otherwise) well-regarded, very wise observers. Moreover, in an era of accelerated societal change on a global scale, attempts to predict the future are surely all the more hazardous, if not plain foolhardy. So, although I undertake the task of interpreting the historical evolution of higher education in the U.S. with some confidence, I have no illusion that the future will evolve according to my whims. With that caveat established, let me attempt to connect higher education's fascinating past to several current developments and conclude with a few comments about the elusive future.

 

Access and institutional diversity

The most distinctive feature of American higher education today is the huge number of students it accommodates -on the order of 15 million- in a radically decentralized, disorderly conglomeration of nearly four thousand accredited two- and four-year colleges and universities. Viewed in historical perspective, the drive to expand access to higher education has been relentless. Although some resistance to increased access has surfaced from time to time, those critics have invariably been overrun by the forces of democratization and expanded access. The result has been a system -perhaps more properly characterized as a non-system- both massive in size and very uneven in quality. The so-called "gold standard" that the British (and some other Commonwealth countries) for so many years applied to their philosophy and practice of higher education was never adopted in the U.S. (although it had its adherents); rather, the American approach has opted consistently for a basic trade-off: to emphasize minimal barriers to access while permitting quality to vary markedly among different types of institutions and even within a given institution. In some sectors of American higher education, the standard was, and remains, gold; in some other sectors, something less precious than gold suffices. And that variety -designed to accommodate students of very different academic ability levels- constitutes not only the most salient characteristic, but also the greatest strength of this unruly system. A brief historical overview may be helpful to outline how this came about.

The history of American higher education spans more than three and a half centuries, dating to the founding of Harvard in 1636 in the settlement of Newtowne (renamed Cambridge in 1638 for understandable reasons) across the Charles River from Boston. Students whom I have taught over the years in the doctoral seminar on the history of American higher education tend to be impressed that an actual college could have been organized there so long ago, especially because that college had been founded so soon after the Pilgrims had landed on the rocky coast of what was to become the Massachusetts Bay Colony.2 Lawrence Cremin underscored this point: "…a college was founded by a legislative body that had been in existence less than eight years for a colony that had been settled less than ten." (Cremin, 1970, p. 210). I am glad that the students appreciate that remarkable fact. And each year I am even more pleased to point out, in the interest of perspective, that while the infant Harvard still consisted of only a few crude structures,3 comparable activity had been underway in the New World for a century. The students typically are surprised to learn that Spain had been engaged in transplanting universities across the Atlantic long before the founding of Harvard: in Santo Domingo in 1538, in Mexico City and Lima in 1551, and in Bogota in 1580. They learn, too, that the French also started early on, but were not nearly so motivated as the Spanish to export universities; the founding by Jesuits of Laval in Quebec in 1635 -on the eve of Harvard's inception- proved to be an exception, for the French then waited more than two centuries before establishing another colonial university, this one in Algiers in 1879 -and Algeria was not technically a colony but rather was regarded as an integral part of France (Perkin, 1991).

That observation about Mexico's rich history of higher learning, and other New World initiatives, helps to put American higher education's humble beginnings in clearer perspective. So, too, does the observation that the next college in the English colonies of North America was not established for another 57 years, when the College of William and Mary was chartered in the colony of Virginia. This second college was the product of Anglicans, as distinguished from the Puritan Congregationalists who had established Harvard, and William and Mary's founding thereby initiated a pattern of diverse origins and sponsorship that saw various religious denominations sprinkle colleges across the landscape with the devout hope of advancing their particularistic interests: for instance, at Yale, Congregationalists (at odds with their co-religionists at Harvard); Anglicans at King's College (which became Columbia after the War of Independence); Presbyterians at the College of New Jersey (which became Princeton); Baptists at the College of Rhode Island (which became Brown); and Dutch Reformed at Queens (which became Rutgers) (Rudolph, 1962, pp. 10-11). "Variety," summed up Frederick Rudolph, "carried the day, as the relations between college and state and between college and church certainly made clear." (p. 13). However, none of these colleges proved to be narrowly sectarian in their admissions policies or practices -whether that openness was based on a genuinely inclusive institutional philosophy or derived from plain necessity. Rudolph again sums it up well in one of my favorite passages:

The nineteenth-century American college could not support itself on a regimen of petty sectarianism; there simply were not enough petty sectarians or, if there were, there was no way of getting them to the petty sectarian colleges in sufficient numbers. The high mortality rate of colleges in the first half of the nineteenth-century was proof that petty sectarianism did not pay off (p. 69).

And so variety and access characterized higher education in the English colonies from the beginning and throughout the Seventeenth and Eighteenth Centuries, although, as noted, the feature of access was perhaps more a matter of necessity than of principle. In reality, those early colleges remained tiny, struggling for decades to survive, despite their openness to students from diverse religious backgrounds.

Higher education evolved slowly and incrementally over the span of many years. There were nine colleges by 1770, prior to the American Revolution, and another ten by the beginning of the Nineteenth Century.4 The number of colleges grew to 75 by 1840.5 As the U.S. expanded its territory in its inexorable westward drive, colleges increasingly were scattered across sparsely-settled topography. One datum that underscores the American penchant for breeding colleges is that by 1880, while all of England, with a population of 23 million, was served by four universities, a total of 37 colleges were spread across the midwestern state of Ohio, population 3 million. Playing with these numbers, it can be seen that the good people of Ohio had provided a college for every 80,000 of their citizens, while England lagged woefully with one university per 6,000,000! But that contrast tells only one part of the story. For one sobering datum reflects the downside of that overzealous commitment to creating new colleges: Rudolph estimates that "over seven hundred colleges died in the United States before 1860." (Rudolph, 1962, p. 219).

At the turn of the next century, by 1900, the number of colleges had increased to over 500 serving some 238,000 students. Through the first half of the Twentieth Century, the expansion of higher education was steady but nonetheless modest in scale. Growth had been slowed by the harsh Depression years and, subsequently, by the attention to more urgent national priorities necessitated by the Second World War. Although the war had helped to bring an end to the Depression in America, it had the effect of further retarding the growth of higher education.

Not until the conclusion of World War II did higher education experience sharp increases in enrollments. This was stimulated in large part by a remarkable law enacted by Congress. This legislation -the Servicemen's Readjustment Act of 1944- provided for education subsidies to millions of members of the military forces. Its scope was broad, for the GI Bill of Rights (as it was more commonly known) helped returning veterans in a variety of ways, among them by guaranteeing loans to buy houses and to start businesses. But the GI Bill is best remembered -appropriately- for providing payments that enabled some 2,232,000 men and women to attend college (Ravitch, 1983, pp. 12-15). For most of its existence, American higher education had afforded little access to persons of limited financial means; only a very small fraction of the normal college-age population had attended college despite the efforts of most colleges -almost all of them very weak financially- to seek out students. Now, however, the matriculation of large numbers of veterans in enrollment-hungry colleges and universities began to change past patterns, thereby contributing significantly to the democratization of higher education. In the fall of 1946 over one million veterans enrolled, thereby approximately doubling college enrollments. It was not a development, however, that was universally applauded. Indeed, the wisdom of the GI Bill was challenged by many higher education leaders and stridently opposed by some, most notably, Robert Maynard Hutchins, Chancellor of the University of Chicago.6 Despite the vigorous resistance by Hutchins and like-minded traditionalists to what they feared would be a terrible cheapening of higher education -by making colleges and universities available to persons who had not clearly demonstrated a capacity for rigorous academic work- higher education in the U.S. had been unalterably transformed. "For the first time," Diane Ravitch writes, "the link between income and educational opportunity was broken." (p. 14).

Even outspoken critics of the GI Bill were obliged to admit, eventually, that the serious-minded veterans were surprisingly successful in their studies. There was room -or should be- in colleges and universities for those who were not typical middle-class students. Soon after the flood of veterans had begun to change how educators and the general public thought about higher education, one of the most underrated events in the annals of American higher education took place. Harry S. Truman, himself the last American president (1945-1953) without a college education, appointed a President's Commission on Higher Education in 1946. Surveying the challenges that confronted post-war America, the Commission issued an extraordinary series of reports in 1947-48 that urged the further expansion of higher education, with particular attention to the newly emerging two-year community colleges. More pointedly, the Commission boldly called for an end to the racial segregation -both de facto and de jure- that characterized much of American higher education, especially in the Southern states. However, it took years before the far-sighted recommendations of the Truman Commission were implemented.

The next crucial step in the chronicles of expanding access to higher education came about two decades later when Congress, always wary of according the Federal government an influential role in education at any level, but pressed hard by President Lyndon B. Johnson to invest more aggressively in education, including higher education, enacted the pivotal Higher Education Act of 1965.7 The confluence of many factors had been necessary to enable that breakthrough, perhaps chief among them the electoral landslide in the 1964 national elections that had resulted in extraordinarily large Democratic majorities in both houses of Congress.

The Truman Commission's report had provided a beacon that lit the way for reformers who sought to further democratize access to higher learning in America. Since 1965, the federal government has substantially broadened its role in promoting access to higher education, especially through amendments in 1978 and 1980 to the Higher Education Act that significantly expanded financial aid to students from middle class families. The 1970s and 1980s were notable for the sharp increase in the number of "non-traditional" students who matriculated in colleges and universities; this was particularly the case for older adults (especially women) and racial minorities. The long transition from elite to mass higher education had thus proceeded in linear fashion, albeit with occasional but minor detours, over the span of nearly four centuries. Despite the manifest unevenness in quality, American higher education's crowning achievement has been to provide access -opportunity to succeed- for an extraordinary proportion of its citizenry.

That very brief overview of how access to higher education evolved in America is intended as a backdrop, albeit a superficial one, for several current developments that are reshaping how higher education is currently organized and delivered. To do this, I shall identify three salient trends: internationalization, faculty redeployment, and a blurring of auspices or control. I see them, respectively, in their historical context, as being "progressive," "regressive," and "convergent" (for lack of a better term). Other trends could have been selected, but that is a task for another time. The following sections illustrate my interpretation of these three quite different historical trajectories.

 

1. Internationalization: A phenomenon of progression

Movement across national boundaries is as old as universities themselves: Paris to Oxford, Oxford to Cambridge. And on it goes; the physical movement of scholars from one venue to another has never stopped although the flow is sometimes hindered by political realities, including restrictive immigration policies. Surely no nation has benefited more than the United States from the emigration of scholars from other lands. The flow to America began with the beginning of English settlements in North America: "What is especially interesting with respect to the American situation was the extraordinary concentration of educated men in the Great Migration of Puritans to New England….[A]t least 130 university men [were] among those who immigrated before 1646…." (Cremin, 1970, p. 207). Over the ensuing years, America -the quintessential land of immigrants- became the destination of innumerable scholars who streamed to the U.S. in sporadic bursts, attracted by opportunity or propelled out of their homelands by oppression. Most dramatic, of course, were the scholars who escaped Nazi Germany in the 1930s, seeking refuge in the U.S. and elsewhere. Therein lies an ironic touch. Germany had once attracted many hundreds of American scholars and students who, particularly during the second half of the Nineteenth Century, were drawn to German universities, absorbed their values, and returned to seed the American landscape -dominated too long by the old English College model- with visions and practices appropriate to research universities.

The patterns of immigration to the U.S. are in constant flux. In recent decades, the tilt has been toward Asia, or, more precisely, toward the U.S. from Asia. There are various ways to measure the change in the immigration patterns of scholars who obtain faculty positions in the U.S. A recent study provides one lens through which to view this shift (Finkelstein, Seal & Schuster, 1998).8 This study contrasted the characteristics of a relatively new cohort of faculty members (whom we defined as having had seven or fewer years of full-time employment as faculty members) with the characteristics of their more senior colleagues (those with eight or more years of full-time academic employment). Two findings are relevant for present purposes.

First, increasing numbers of non-U.S. natives are becoming faculty members. That is, the proportion of full-time faculty members who are native-born U.S. citizens appears to be shrinking significantly: 83.1 percent among the newer cohort in contrast to 88.5 percent among the more senior cohort.9 Put another way, only about one in nine experienced faculty members had moved to the U.S. from elsewhere compared to about one in six among the newer entrants to full-time academic work.

Second, a clear difference is evident in comparing these non-natives' countries of origin. In both the senior and newer cohorts of faculty, India is the country of birth that heads the list, and in both cohorts natives of India are followed by natives of the United Kingdom. The gap now appears to be widening. In the newer cohort of faculty, natives of India outnumber those from the U.K. by 15 percent (3,633 vs. 3,158), whereas the difference within the more senior cohort had been negligible (4,674 vs. 4,457) (Finkelstein, Seal & Schuster, 1998).10 More strikingly, in the new cohort natives of China (2,736) approach in number those who were U.K.-born (3,158).

In all, the influx of scholars from Asia, particularly East Asia, has risen steeply in recent years both in absolute and relative terms. The steady stream of academic talent continues, lured by academic jobs and, often, attracted in particular by the prospect of research support that may be very difficult or nearly impossible to obtain in one's native land. This process continues, stimulated by favorable immigration policy -especially by the far-reaching U.S. Immigration Act of 1990- despite the very tight academic labor market prevailing in the U.S. in most academic fields. (It must be added that these immigration policies -perhaps like all national policies- serve the perceived national interest of the importing nation with minimal regard for the resulting drain of talent from the exporting nations.) (Schuster, 1994).

In sum, internationalization has become an increasingly prominent dimension of the American academic landscape, further extending the long history of immigration that is inseparable from the story of American higher education. This feature continues to enrich American higher education, even though revolutionary technologies of communication render national boundaries less and less relevant to the work of academicians. The history of internationalization is a story of sustained progression.

 

2. Reconfiguring the faculty: A phenomenon of regression

The development in contemporary American higher education that most intrigues -and alarms- me is a pronounced trend in the types of academic appointments currently being made. It is a story that begins long ago with just a few faculty members who held "regular" academic appointments, that is, full-time, long-term appointments. This pattern was prevalent throughout the Seventeenth and Eighteenth Centuries and that practice only slowly gave way to a professionalized regular faculty in the Nineteenth Century (Finkelstein, 1983). The academic profession through much of the Twentieth Century struggled to expand and secure its professional status. The strategy entailed gaining acceptance for principles of academic freedom and, concomitantly, for the practice of awarding tenure, that is, conferring lifetime security of employment for faculty members who successfully completed a probationary period. Tenure was deemed by the leadership of the profession to be an indispensable means to assure academic freedom. In that quest, the American Association of University Professors, founded in 1915 primarily for that purpose, emerged as the profession's principal strategist for articulating and attempting to enforce the standards of academic freedom and of tenure. Much was accomplished during the next half-century.

When Christopher Jencks and David Riesman wrote The Academic Revolution in 1968, it could be interpreted as a celebration of how the academic profession had succeeded in its long struggle to achieve professionalism (Jencks & Riesman, 1968). This accomplishment was evidenced, they argued, by academics having finally achieved the dominant influence over core academic activities, particularly, academic personnel decisions, curricular decisions, and other issues of academic quality.

But at about the time that this figurative celebration of victory was taking place, important changes -the regression to which I refer- were already beginning to take place, not so visibly at first but gaining momentum in the 1980s and especially accelerating in the 1990s. Most notable are two parallel developments that taken together erode -even reverse- the prior progress. First, the proportion of faculty members who teach only part-time has been rising at a dramatic rate for three decades. In the early 1970s, about 22 percent of faculty members (by headcount) were teaching part-time. By the early 1980s, the proportion had risen steadily to about 32 percent, then to 42 percent in 1992. The percentage appears to have climbed throughout the past decade, and as the new decade begins, the proportion of part-time faculty appears to be approaching 50 percent (Schuster, 1998). This relentless increase in the proportion of part-time faculty constitutes a huge change in the way higher education is being conducted in the U.S. And although the use of part-time faculty is more extensive among the two-year community colleges, the practice has accelerated in all types of institutions (Gappa & Leslie, 1993).

The experience in many other nations, including Mexico, is very different from that in the U.S. In Mexico, among other settings, very heavy reliance on less than full-time faculty members has long been the norm. But in the U.S. this massive shift toward a contingent academic workforce constitutes a highly troubling setback in the eyes of many critics.

A second trend that has almost escaped detection is profoundly important and is also disturbing to many observers. This is the extent to which full-time faculty appointments now depart from traditional practice. The traditional type of appointment for full-time faculty members is initially for a probationary period (customarily six or seven years) followed by a decision -"up or out"- that would result either in conferring tenure or prescribing a terminal year that ordinarily would be used by the unsuccessful candidate for tenure to seek and obtain another academic position. But something very different has been happening, namely, the appointment of full-time faculty members to limited-term positions, that is, appointments off the tenure track. This is a quite recent development. There have always been some full-time faculty members who have not held tenure-eligible appointments, but those numbers, until very recently, were small. In fact, national surveys of faculty members in the 1960s and 1970s indicate that the numbers of such appointments were negligible. Full-time status essentially was tantamount either to having tenure or a probationary appointment leading to a tenure decision. That has all changed. In fact, my colleagues and I have found that during the 1990s, slightly over one half of all new full-time academic appointments have been tenure-ineligible term appointments (IPEDS, 1993, 1995, 1997). That is revolutionary or, more properly described, a reversion to an era that predates the profession's achievement of tenure as a normative practice. Some observers, myself included, view this trend as regressive, a reversal of an historical evolution that was pivotal in securing crucial professional autonomy and appropriate faculty influence over academic affairs.

Thus, given the sharp escalation in the number and proportion of part-time faculty appointments combined with the rapid increase in the number and proportion of full-time faculty appointees who are tenure ineligible, the American faculty is being reconfigured at an amazing rate. Among faculty appointments made in recent years, perhaps only one in four is a traditional tenure-bearing or tenure-eligible appointment (Schuster, 1998).

The principal reasons for these developments are not difficult to ascertain: cost containment and organizational flexibility. The part-time appointments save a lot of money; for those appointees earn much less than their full-time counterparts measured on a per-course basis. And such appointments, almost always for one academic term or one year, provide the institution with maximum flexibility -an increasingly attractive feature in a volatile, uncertain environment. The full-time term appointments also add attractive flexibility for deploying instructional and research staff. That much is undeniable.

There is another dimension to this phenomenon. The full-time faculty in the U.S. is older than ever. The most recent national survey places their average age at 51 (National Opinion Research Center, 2000). Accordingly, a great many retirements are beginning to take place and, correspondingly, a great many replacements are being -and will be- made in the near future. The point here is that the transformation of the faculty, in terms of the kinds of appointments they hold, will likely accelerate with so many faculty members holding traditional-type appointments being replaced by new entrants who are much more likely to hold non-tenure track appointments.

What difference does this make? If these trends result in significant cost savings and expanded organizational flexibility, why be concerned? An adequate response would require more attention than the scope of this paper allows, but much of the answer must be grounded in historical experience. My reading of that history is that the academy would be unable adequately to perform its vital function as an independent critic of society, including its ability to criticize the academy itself, without the stability and protection that full-time, secure academic appointments provide. Moreover, an increasingly contingent faculty has enormous -and troubling- implications for academic culture and practice, including diminished faculty accessibility to students.

In all, the powerful trends now underway in redeploying the faculty constitute a reversion to a more problematic era for the academic profession and for core academic values.

 

3. The blurring of public-private auspices: A phenomenon of convergence

A third development suggests a different sort of trajectory -a trend toward convergence in the auspices or control of colleges and universities. The effects of this perceived convergence are undoubtedly significant -probably very important- although very difficult to measure. I refer to the blurring of the distinction between the public and private (or "independent") sectors of higher education in certain practical (though not legal) respects, especially during the past 25 to 35 years.

The colonial colleges in America were neither distinctly "public" nor "private" institutions in contemporary terms. In this regard, their identity was blurred. Thus Harvard and Yale and Dartmouth and their institutional "siblings" were chartered by local governments -typically by the governor who was the British Crown's agent in the various colonies. And all of these fledgling colleges received some amount of public financial subsidy (Cremin, 1970; Rudolph, 1962). Harvard, for example, received an annual allocation from the General Court (essentially the colony's legislature). But these early colleges were far from being "pure" public entities. Thus, they were typically beholden to whichever Protestant denomination had helped to found and support them and, moreover, the colleges controlled their own admissions, hired their small staffs, and so on. "The distinctions," observed Cremin, "were in process of becoming and therefore [were] unclear and inconsistent" (Cremin, 1970). Marsden described these early colleges as "quasi-public" (Marsden, 1994, p. 68). This condition of fuzziness between public and private identities or auspices changed in the early Nineteenth Century in large part as a result of the so-called Dartmouth College Case which was decided by the U.S. Supreme Court in 1819.

The state legislature of New Hampshire (for by then the colonies had become states in the still-young Union) was unhappy with Dartmouth College's unresponsiveness to the needs of the state's citizens, needs which the legislators believed to be far more pragmatic than what the College's classical, religiously oriented curriculum was providing. To remedy the situation, the legislature attempted to reorder the College's priorities (Rudolph, 1962, pp. 207-213). But wait! Dartmouth College, which had been established in 1769, had its own governing board, dating from its original charter, and was keen to resist the attempted takeover by State authorities. To condense a long, complex story, the College sued the State and prevailed. Writing the Court's opinion, John Marshall -the long-serving Chief Justice of the U.S. Supreme Court- concluded that Dartmouth College was not a public institution subject to public control and that the State could not abrogate the College's contractual rights, which derived from its charter (Rudolph, 1962, p. 210). In other words, if the New Hampshire legislature wanted to control how higher education was to be conducted, it would have to pursue other means such as creating its own college.11

This decision by the U.S. Supreme Court served as a stimulus in two basic ways. It signaled to the colleges scattered among the states that regarded themselves as private that they were protected against attempts by state authorities to convert such private colleges (or private entities of whatever kind) to their own purposes. There was room left for some measure of public oversight of the colleges, but converting a private college into a public one by displacing its governing board, Marshall concluded, would violate the U.S. Constitution. At the same time that the Court assured Dartmouth College of its independence, a correlative effect of that decision was to establish that any state that desired to mold higher education to that state's own priorities would need to establish its own public college. And this the legislators of New Hampshire eventually did, albeit almost five decades later. Indeed, the Supreme Court's decision, by endorsing the autonomy of private colleges, appears initially to have retarded the development of unambiguously public colleges. Prior to the Dartmouth decision, several states had created their own public colleges. (Both North Carolina and Georgia vie for the title of being first, around 1785.) But after the Dartmouth decision, the distinction between public and private colleges (and, later, "universities") became increasingly clear, and ultimately the states seriously undertook the task of creating and nurturing public colleges and universities.

This public-private distinction has led to an American system of higher education so diverse and so radically decentralized, as noted earlier, that it might well be described as a non-system. Whether it is a "system" or "non-system," when considered on this more macro scale, it is amazing to me how these parallel public and private (or "independent") sectors have evolved, over so many years, with neither dominating the other.12 This dimension of American higher education is distinctive -indeed, unique in some respects- when considered from a global perspective. True, in recent decades some other nations have encouraged the private sector's rise to greater prominence -Japan and, more recently, Mexico are among the scattered examples- but the U.S. set in motion long ago its centrifugal approach, featuring a very limited role for central government. Indeed, the federal government did not establish its own cabinet-level (or ministry-level) agency for education until the U.S. Department of Education, whose enabling legislation passed Congress in 1979 by the slimmest of margins, began operation in 1980 (Radin & Hawley, 1988). The newborn Department came perilously close to extermination in its infancy when Ronald Reagan, who as a candidate for president had campaigned against the new federal department, sought after his election, but could not obtain, sufficient Congressional support to destroy it (Schuster, 1982).

The higher education landscape thus has featured these robust side-by-side public and private sectors. It is remarkable that the most prominent universities continue to be divided, roughly equally, between those two sectors. Indeed, the rankings of universities -however useful or irrelevant such rankings may be deemed to be- have not changed substantially throughout the past century, from the Carnegie Foundation's rankings in 1909 (Slosson, 1910) to the latter-day reputational rankings generated by the American Council on Education and the National Research Council. Most interestingly, this outcome is not by design but by happenstance, a kind of historical fluke. And, although public higher education has expanded enormously during the past half century to now account for roughly four fifths of all enrollments in higher education, the independent sector, at least among the more selective (elite) universities and colleges, has maintained, if not increased, its prominence.

If "parallel non-systems" is a fair way to characterize what we have fashioned in the U.S., it is relevant to note that the boundaries of the two sectors are now becoming more blurred than ever, at least since the Dartmouth College Case in effect delineated private from public higher education almost two centuries ago, as discussed earlier. In recent decades, for many reasons, the overlap between the public and private sectors has become more evident. Consider these developments. Public institutions at all levels now engage very actively in fundraising from private sources that formerly "belonged" to the independent sector. Twenty-five or thirty years ago, substantial fundraising by the leading public universities was very uncommon. However, as public funding for higher education became more elusive, that historical condition changed rapidly. It is no doubt shocking to some observers that some eminent public universities today receive less than one quarter, or even less than one fifth, of their overall revenues from the very states that created and claim them. Thus these ostensibly public institutions, while not having changed their legal status, are necessarily responsive to many masters. To be sure, much of the support they receive for research comes from public sources at the national level, that is, federal agencies, prominent among them the National Institutes of Health and the National Science Foundation; these agencies typically sponsor much of the research, especially big science projects, conducted at the research universities. The state may stand primus inter pares among the other supporters, but the big public research universities -the University of Michigan or the University of California, Berkeley, among dozens of others- are less and less dependent on, and thereby less driven by, a state-generated agenda.
This is not to argue that the great state universities have been substantially privatized, although corporate sponsors and individual donors have their priorities. It is to say that the great state universities today negotiate a much more complex environment in which influence comes from many quarters.

Meanwhile, the "independent" colleges and universities, with relatively few exceptions, are so reliant on federal financial aid to their students that withdrawal of that public lifeline would mean institutional death in short order for hundreds and hundreds of private institutions; the intravenous flow of federal funds that underwrites their students' tuition is an indispensable source of revenue. Indeed, it is not uncommon for a small, private liberal arts college to derive upwards of 70 or 80 percent, or even more, of its total revenue from federal monies funneled to it through the grants and loans that its students receive.

And so this story has evolved in stages, from an era when no clear distinction existed between public and private, to a quite clear delineation, and now, in more recent years, to a point where the distinction is again fuzzier. This blurriness is not a matter of whether fundamentally important legal and political differences still remain between the two types of auspices or control. These differences persist. But the sources of funding -and, therefore, any given college or university's priorities and capacities- are considerably more varied now. Thus, as a practical matter, at least in some important respects, the differentiation between public and private has lost some of its meaning as the historical parallel lines of development, having diverged in the Nineteenth Century, have converged in the late Twentieth Century.

 

Summing up

I believe that higher education in the United States is at present changing more rapidly than ever before. Moreover, I am persuaded that this observation likely applies to higher education in much of the world and almost certainly includes Mexico. This claim about the rapidity and scope of change in U.S. higher education may sound to some like a bold, even extravagant, assertion, but, for the sake of perspective, it is well to consider that higher education has not often changed swiftly during the course of its near-millennium history.

To be sure, change is the only constant. As Heraclitus observed, "All is flux, nothing stays still." Higher education in the U.S. has never been static, but most often the rate of change has been moderate, incremental. Now, however, that is, within the past decade or so, the rate of change appears to be accelerating beyond our capacity to measure it or to comprehend its significance. I am convinced that more significant aspects of higher education are now "in play" than ever before. As this history unfolds, some vectors of change that will powerfully shape the future can be seen as progressing in near-linear fashion, some as reverting toward a previous status, and some tracing a unique trajectory. But ever-expanding access is likely to remain a core value into the foreseeable future.

 

References

Bailyn, B. (1960). Education in the forming of American Society: Needs and opportunities for study. New York: Vintage Books, Random House.

Cartter, A. M. (Ed.). (1965). Higher education in the United States. CITY: American Council on Education.

Cremin, L. A. (1970). American education: The colonial experience: 1607-1783. New York: Harper & Row.

Finkelstein, M. J. (1983). From tutor to specialized scholar: Academic professionalization in Eighteenth and Nineteenth Century America. History of Higher Education Annual (Vol. 3, pp. 99-121). University Park, PA: Pennsylvania State University.

Finkelstein, M. J., Seal, R. K., & Schuster, J. H. (1998). The new academic generation: A profession in transformation (Chap. 3). Baltimore, MD: Johns Hopkins University Press.

Gappa, J. M. & Leslie, D. W. (1993). The invisible faculty: Improving the status of part-timers in higher education. San Francisco: Jossey-Bass.

Integrated Postsecondary Education Data System (IPEDS). (1993). Washington, DC: National Center for Education Statistics.

Integrated Postsecondary Education Data System (IPEDS). (1995). Washington, DC: National Center for Education Statistics.

Integrated Postsecondary Education Data System (IPEDS). (1997). Washington, DC: National Center for Education Statistics.

Jencks, C. & Riesman, D. (1968). The academic revolution. Garden City, NY: Doubleday.

Marsden, G. M. (1994). The soul of the American university: From Protestant establishment to established nonbelief. New York: Oxford University Press.

Perkin, H. (1991). History of universities. In P. G. Altbach (Ed.), International higher education: An encyclopedia (Vol. 1, pp. 169-204). New York: Garland Publishing.

Radin, B. A. & Hawley, W. D. (1988). The politics of federal reorganization: Creating the U.S. Department of Education. Oxford: Pergamon Press.

Ravitch, D. (1983). The troubled crusade: American education, 1945-1980. New York: Basic Books.

Rudolph, F. (1962). The American college and university: A history. New York: Alfred A. Knopf.

Schuster, J. H. (1982, May). Out of the frying pan: The politics of education in a new era. Phi Delta Kappan, 63 (9), pp. 583-591.

Schuster, J. H. (1994). Emigration, Internationalization, and the 'Brain drain': Propensities among British Academics. Higher Education, 28 (4), 437-452.

Schuster, J. H. (1998, jan.-feb.). Reconfiguring the professorate: An overview. Academe, 84 (1), pp.48-55.

Slosson, E. E. (1910). Great American universities. New York: Macmillan.

Smith, R. N. (1986). The Harvard century: The making of a university to a nation. New York: Simon and Schuster.

Whitehead, J. S. & Herbst, J. (1986, Fall). How to think about the Dartmouth College case. History of Education Quarterly, 26 (3), 162-172.

1 REDIE thanks Jesús Francisco Galaz Fontes, M. Ed., for his collaboration in editing this keynote lecture.

2 In fact, an even earlier effort had been made to establish a college in the colony of Virginia, but that effort -Henrico College (1620-22)- failed (Cremin, 1970; Bailyn, 1960).

3 The embryonic college was described as "puny." (Smith, R. N., 1986, p. 16.).

4 In 1775 the combined graduating classes of Harvard (the largest at 40), Yale, Kings, Dartmouth, and Philadelphia (to become Pennsylvania) colleges was 107. (Cartter, 1965, p. 20). Note, also, that in 1725 Harvard and Yale together graduated 56 students, but fully 45 years later, in 1770, the combined number of graduates from Harvard, Yale, and New Jersey (that is, Princeton) had reached only 75.

5 Counting colleges is not as easy as might be supposed; different sources give quite different tabulations in the Nineteenth Century.

6 Hutchins went so far as to warn that "Colleges and universities will find themselves converted into educational hobo jungles." (Ravitch, 1983, p. 13).

7 An "omnibus" measure covering many initiatives, the Higher Education Act's most enduring provision was the establishment of several key federal student financial aid programs to provide grants, subsidized loans, and on-campus work opportunities for students. Also in 1965, Congress enacted the landmark Elementary and Secondary Education Act, another key element of President Johnson's "Great Society" initiatives.

8 The authors conducted a secondary analysis of data derived from the 1993 National Study of Postsecondary Faculty, National Center for Education Statistics, U.S. Department of Education.

9 The authors calculate that approximately 29,100 in the new cohort of 172,319 full-time faculty were not native-born U.S. citizens.

                           New Faculty    Senior Faculty
Native U.S. citizen           83.1%           88.5%
Naturalized U.S. citizen       5.3             7.4
Permanent resident             8.8             3.7
Temporary resident             2.8             0.4

(See Finkelstein, Seal, and Schuster, op. cit., Table 7, p. 32; Figure 4, p. 33; Table 7-A, p. 127.)

10 See Tables 7-A, 7-B, 7-C, pp. 34-35; Table A-7A, p. 128.

11 The Court "…in effect restored the old charter of the College and endorsed the right of initiating groups to control what they had created, to gain from the state equal privileges with all other groups and to retain them even against the state itself." (Cremin, 1970, p. 47).

12 The roughly 3,700 two- and four-year colleges and universities include about 1,600 public and 2,100 private institutions. Those bare numbers do not measure strength or attractiveness of the respective sectors, but they do reflect some measure of balance.

Please cite the source as:

Schuster, J. H. (2001). Higher education in the United States: Historical excursions. Revista Electrónica de Investigación Educativa, 3 (2). Retrieved month day, year from the World Wide Web:
http://redie.uabc.mx/vol3no2/contenido-schuster.html