inFocus: Government and Government Overreach (Fall 2013)

Four Ways to Improve Higher Education

Anne D. Neal and William Gonch Fall 2013

America’s colleges and universities are in crisis. They cost far too much: America spends thousands more per post-secondary student than any other OECD nation. For all that money, we achieve mediocre outcomes: 42 percent of students who enter a four-year institution fail to leave that school with a degree within six years, and studies find that many students’ improvement in thinking skills is insignificant or nonexistent. Graduates typically leave with mountains of debt and struggle to find work.

In a trend that would have been unthinkable ten years ago, Americans are expressing skepticism about higher education—not just with their voices, but also with their pocketbooks and their time. At all but the most elite schools, application numbers are down, and students are increasingly choosing less-expensive schools, starting at community colleges, or otherwise seeking to control costs.

Colleges are feeling the pinch. The average “tuition discount rate” at private colleges, which measures the discount colleges offer through grants and loans out of their own budgets, has reached 45 percent. An article in The New York Times notes that net tuition revenue is flat or decreasing at 73 percent of schools. And in a recent study of 1,700 American colleges and universities, Bain & Company found that one-third of those schools are “on an unsustainable financial path.” These developments threaten the budgets of many schools, and they will only worsen over time.

Reform is essential, then, for our colleges and universities. How do we achieve it? Four changes are especially important.

Require a Core Curriculum

It used to be common for students to begin their college careers with a general education program made up of core courses in the humanities, social sciences, natural sciences, and a few other subjects. Often these courses introduced students to great works and ideas; at their best, such courses provided students in disparate fields with a common intellectual experience and turned them into a community of learners. And in any event, they left all students with a common grounding in basic knowledge.

Today, however, only a few schools maintain such curricula, and many college graduates lack even basic knowledge of western culture and the world in which they live.

The American Council of Trustees and Alumni’s (ACTA) “What Will They Learn?” project studies the curricula of more than 1,000 colleges and universities. The 2012-2013 study found that fewer than 19 percent of schools require a foundational course in U.S. history or government, fewer than 14 percent require an intermediate-level foreign language, and more than a third do not even require college-level mathematics. Of the seven subjects studied by “What Will They Learn?”, the average school required about three.

Instead, students take courses such as “Zombie Nation” at SUNY Binghamton, “Tattoos in American Popular Culture” at Pitzer College, or “History of American Television” at Emory. Not only are these real courses at major American universities; every single one of them satisfies a core requirement in its school’s general education program.

Students’ ignorance reflects their poor curricula. A recent ACTA survey found that large majorities of recent college graduates could not correctly answer basic questions about American history and government, such as identifying George Washington as the general at Yorktown or naming the terms of office of Senators and Members of Congress.

Each campus constituency has work to do in changing the culture of ignorance.

Faculty can build centers on campus to teach liberal learning in a rigorous way. Emory University provides an excellent example: a small number of faculty have recently put together a top-flight voluntary core curriculum that covers classical literature, moral philosophy, American political thought, and other key subjects. Indeed, the most storied core curricula often began as faculty initiatives: Columbia University’s famous Great Books program, for example, was founded by enterprising faculty members. Administrators can provide the leadership necessary to implement a rigorous core: to take just one example, University of Chicago President Robert Hutchins worked with Mortimer Adler to implement one of the most famous cores of the 20th century.

Trustees have ultimate responsibility for safeguarding their institutions’ academic health and setting strategic direction; much of the responsibility for graduates’ ignorance rests with them. They should not micromanage faculty choices in the classroom—but they should make sure they know what their institutions require of all students and ensure that such requirements are strong.

Some excellent board members do this already. The Tennessee Board of Regents, which oversees six public universities in Tennessee, has put in place an admirable requirement in American history. The Texas Higher Education Coordinating Board has gone even further: it requires a broad-based curriculum that covers writing, mathematics, American history and government, the creative arts, and several other important subjects. These board members are a model for all trustees: they take responsibility, not just for their institution’s finances, but also for its academic excellence.

And students? Parents? They can stop bankrolling ignorance immediately by taking care to choose undergraduate programs that require every student to master a core body of liberal knowledge.

Empower Trustees

Reflecting on reforms at the City University of New York (CUNY), Board Chairman Benno Schmidt said, “Change in institutional strategy can only come from trustees.”

He would know. Starting in 1998, Schmidt and the rest of the CUNY board oversaw a transformation in the CUNY system. A storied university system that had been allowed to drift for more than two decades, CUNY suffered from weak academic standards, high dropout rates, poor outcomes (in 1999, only 30 percent of freshmen at its four-year colleges graduated within six years), and a lack of confidence among alumni, reflected in low levels of alumni giving. Thousands of students in remedial courses dragged down the performance and standards of CUNY’s four-year schools.

The board hired a reform-minded chancellor and dug in, working hard to turn the school around. Remediation was moved to the junior colleges, and the four-year institutions raised academic and admissions standards. Enrollment soared, especially among under-privileged groups—and so did graduation rates. A new honors college attracted students who turned down offers from NYU and Columbia. The system’s reputation recovered, and alumni giving increased eight-fold.

The transformation was led by the trustees—as any transformation so radical must be, because trustees are in an ideal position to stimulate reform. They are fiduciaries, standing to some degree apart from the institution while caring for its financial and academic well-being. And they are ultimately responsible for its success or failure.

Yet for every story like CUNY’s there are many stories of boards serving as cheerleaders: trustees who raise money and talk up the school but do not hold the administration and faculty accountable for delivering academic quality. Too many rely exclusively on their institutions’ presidents—whom trustees are supposed to oversee—for information about the school’s plans and performance.

Trustees—all trustees—need to break from the prevailing mindset in university governance. They need to recognize that they are leaders first and fundraisers second. They need to study key measures of educational quality—outcomes, graduation rates, the core curriculum. They need to rein in the grotesque inflation in administrative salaries and the lopsided growth that puts more new money into administration and athletics than student instruction. They need to demand a faculty reward system that encourages more time in the classroom and more focus on student success. And they need to ensure before ground is broken for new buildings that the existing ones are fully used, and that means Friday afternoons, Monday mornings, and summer term.

Here the public has a critical role to play. Governors and state legislators appoint the trustees of many public colleges and universities. These trustees are in turn responsible to state officials, and ultimately to the public. It falls to governors and legislators to ensure that their appointees are informed and engaged, and that they govern, not defer to, administrative leaders. If trustees fail to do their job, the governor must be held accountable.

Assess Student Learning and Hold Schools Accountable

In 2011, sociologists Richard Arum and Josipa Roksa rocked the academic world with a simple finding: students are not learning much in college. Arum and Roksa tracked more than 2,300 students longitudinally throughout their college careers at 24 accredited four-year institutions, assessing their learning with a nationally normed instrument, the Collegiate Learning Assessment.

Fully 45 percent of students showed no statistically significant learning gains after two years; 36 percent showed none after a full four years of college. These findings have been replicated by subsequent studies.

In other words, college has become “intellectually unsafe” for too many students.

Responsible colleges should not gamble with students’ futures—or with the hard-earned money of students, their families, and taxpayers. That is why colleges must measure student learning and adopt appropriate strategies, as necessary, to improve teaching and learning on their campuses.

States should require their public universities to administer one of the three major assessments (the CLA, the Collegiate Assessment of Academic Proficiency, or the ETS Proficiency Profile) to their students when they enter and when they graduate—and to publish the results. This would allow students and parents to make informed choices about which school to attend.

Increasingly, states are moving to performance-based funding models that tie the size of a state university’s budget to the university’s success in achieving certain outcomes. And what outcome could be more important than student learning? Longitudinally assessing student outcomes will allow policymakers to reward high-achieving schools and take remedial action to address the problems at struggling institutions.

Private colleges and universities should not be coerced by the state, certainly—but publishing student-learning data is the right thing to do. Trustees and administrators should put in place their own programs to test student learning, and they should publish the results.

Reform the College Accreditation System

Consider this: not only did one-third of students in Arum and Roksa’s study demonstrate no learning gains, but every single one of those students attended a fully accredited institution.

For decades, federal law has tasked the six main regional accrediting agencies with determining which colleges within their geographic regions are of sufficient quality to be eligible for federal student aid, a life-or-death financial issue for almost every school. The accreditors are supposed to ensure that schools that receive federal money meet basic standards of educational quality. Students and families depend on accreditors for the same reason: to ensure that their very expensive investment will result in a quality education.

Unfortunately, accreditors have utterly failed to ensure quality. Arum and Roksa’s findings are merely the tip of the iceberg: dozens of accredited schools graduate fewer than one in four students; some graduate fewer than one in ten. Increasing evidence points to the conclusion that students at these schools spend a great deal of money and learn little, yet the schools retain their accreditation.

While doing nearly nothing for academic quality, accreditors manage to stifle innovation by erecting barriers to entry into the market. They also regularly interfere in governance responsibilities properly left to the trustees, often obstructing reform efforts.

Federal financial aid represents such a large fraction of schools’ budgets that very few schools can survive without access to federal funds—or push back when accreditors overstep their bounds. The accreditation process can take years and cost more than a million dollars. Start-up schools fight through a years-long twilight existence before they are able to compete.

In these circumstances it is unsurprising that few new institutions enter the market. In addition, accreditors often impose irrational standards on schools or take arbitrary action that second-guesses university leaders. For example, in late 2012 accreditors placed the University of Virginia “on warning” because the board attempted to remove its president—an action that, although the board pursued it hastily and without proper care, was well within its rights. In December 2011, the Southern regional accreditor reprimanded the Florida governor for suggesting publicly that the Florida A&M board should suspend its president after the disturbing hazing death of a drum major. Accreditors have impeded the University of California Regents’ attempt to rein in out-of-control administrative costs. And they have imposed unhelpful requirements on hiring decisions, often requiring schools to hire a Ph.D. over a non-Ph.D. regardless of the other skills the non-Ph.D. candidate might offer.

Accreditors have also pressured schools to decrease faculty teaching loads, and even tried to pressure a school with a rigorous great-books curriculum into becoming more “open.”

In these ways and others, accreditors exist to protect the established business model in academia. Accreditors use their control over financial aid to enforce their standards on universities. If policymakers wish to reverse these trends they need to break the link between accreditation and federal financial aid. If new educational institutions were not forced into the accreditors’ straitjacket we would see a proliferation of new educational models and, through the power of the market, real progress on quality and cost.

The accreditors could have a new, constructive role if the federal government broke the current system of regional-monopoly accreditation and allowed accreditors to compete with one another to offer the most compelling certifications. Former Boston University president Jon Westling summarized how this would play out:

Accreditation agencies should be, in effect, accredited by their customers. If they have anything worthwhile to offer colleges and universities, colleges and universities will pay them for it. Generally, colleges and universities will pay to be reviewed by the agency that has the strictest standards that the institution thinks it can pass. A free market in accreditation agencies will quickly stratify, with the toughest agencies attracting the best colleges and universities. The public will benefit from a genuine ranking system.

If accreditation agencies started acting like business consultancies and consumer-information agencies, the needed changes to higher education would come swiftly.

Reform: The Way Forward

Ballooning costs, minimal learning, and a regulatory system that resists new entrants and new ideas—these problems plague American higher education. The last problem requires federal action: congressional reform of accreditation would also help address issues of quality and cost. Meanwhile, drawing on American higher education’s great strength, its decentralization, trustees and administrators at any institution can begin to act on quality and cost right now. Many are doing so already—but far more must join in the effort if we are to provide a quality, affordable education to the next generation of Americans.

Anne D. Neal is President of the American Council of Trustees and Alumni (ACTA) and William Gonch is Senior Program Officer, Communications. Some portions of this essay have appeared in other ACTA publications.