By Andrew Gillen, Jeff Selingo, and Mandy Zatynski
As a knowledge-based workforce has transformed the American economy over the last several decades, few people have questioned the value of higher education. Enrollment has surged at all types of colleges—up by more than one-third in just the last decade—as the college credential has become the ticket to a better life. From a purely economic standpoint, the numbers back up the prevailing wisdom that college is worth it: College graduates earn more and are less likely to be unemployed than those with only high school diplomas.
But with the onset of the Great Recession in 2008, the parameters of the value debate in higher education began to shift. College prices continued to climb even as average household wealth declined. Average tuition today eats up nearly 40 percent of median earnings in the United States, whereas a decade ago it consumed less than a quarter. At the same time, total student debt surpassed the $1 trillion mark in 2012.1 Since 2000, the average federal PLUS loan for parents has increased by about one-third, to around $12,000.
Higher debt, along with stories of college graduates living in their parents’ basements or working as baristas at Starbucks, is leading prospective students and their families to increasingly ask the value question: What will we get in return for our investment in college, especially if we are taking on significant debt? Often it’s not that students and families are questioning the value of college per se, just the value of attending certain colleges.
The answer to the value question is ambiguous, often dependent on factors unique to each student and college pairing, such as campus preference, location, or even fit. Graduation rates and earnings data are helpful, but in their current state, incomplete and difficult to access and interpret. Students need simpler tools that allow them to pull up the information they need—from graduation rates that account for all students (or students like them) to lifetime career earnings that go beyond the first, often poorest, year-after-graduation salaries. With more comprehensive, accessible data, institutions will have a clearer picture of their outcomes, and students and their families will have a better chance of answering the value question.
MOST BANG FOR THE BUCK
In February, President Barack Obama thrust this return-on-investment question into the national spotlight when he used his State of the Union address to introduce the College Scorecard. This new resource, the president said, would allow parents and students “to compare schools based on a simple criteria: where you can get the most bang for your educational buck.” Now there would be a government-backed tool that allowed students to compare colleges the same way consumers size up cars or televisions in Consumer Reports.
The idea of applying economic measures to a degree makes most academics uncomfortable. Such measures fail to account for higher education’s contributions to society, and they miss the less tangible benefits of a college degree, such as improved health, civic engagement, and broad knowledge of the world.
Moreover, not everyone gets the same benefits out of education. “When you come into Staples, you come out with office supplies; when you go into a car dealer, you come out with four wheels and a motor,” says Michael Hout, the Natalie Cohen professor of sociology and demography at the University of California at Berkeley. “It’s not clear what you come out with, with a college degree. It’s a different thing for everybody.”
Yet despite its unease with the idea, higher education for decades has been selling its economic returns as the primary reason students and families should pay ever-increasing tuition prices. Indeed, the College Board publishes a report every three years titled Education Pays, which presents detailed evidence about the benefits of higher education.
The difference now for higher education is that the data allow comparisons between individual institutions, and by that measure, not all college degrees are created equal. Colleges can no longer simply cite the national averages that they have relied on since the 1970s to sell their degrees at nearly any cost.
In 1974, Jacob Mincer wrote Schooling, Experience, and Earnings, a book whose ideas have dominated the discussion about college rates of return ever since. While many had realized that labor market earnings were affected by schooling and work experience, Mincer’s key contribution was a clever arrangement that allowed for an easy estimation of what came to be called “the rate of return to education.”2 The Mincer earnings equation has been used to estimate this “rate of return to schooling” ever since, and most analysts find that it is “on the order of 6-10 percent,”3 meaning that every additional year of schooling tends to increase annual earnings by 6 to 10 percent.
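Concretely, the Mincer equation regresses the natural log of earnings on schooling and experience. A standard textbook form (the notation here is illustrative, not taken from the article’s sources) is:

```latex
\ln(w_i) = \beta_0 + \beta_1 S_i + \beta_2 X_i + \beta_3 X_i^2 + \varepsilon_i
```

where \(w_i\) is individual \(i\)’s earnings, \(S_i\) is years of schooling, and \(X_i\) is years of labor-market experience. The estimated \(\beta_1\), typically about 0.06 to 0.10, is what is conventionally read as the “rate of return” to an additional year of schooling.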
This is a large boost in earnings and, when maintained over decades of paid employment, it means that on average, there will be a large difference between the earnings of college graduates compared with high school graduates. Indeed, some calculations find that over their lifetimes, college graduates earn $1 million more than high school graduates.
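As a back-of-the-envelope sketch of how such a lifetime gap can accumulate (every salary and career-length figure below is a hypothetical assumption, not data from the studies cited):

```python
# Illustrative figures only; real estimates use actual wage data.
hs_salary = 35_000     # assumed median salary with a high school diploma
ba_salary = 60_000     # assumed median salary with a bachelor's degree
career_years = 40      # assumed length of a working life

# A constant annual premium, summed over a full career
lifetime_gap = (ba_salary - hs_salary) * career_years
print(f"${lifetime_gap:,}")  # $1,000,000
```

Real lifetime-earnings estimates also discount future dollars and account for wage growth, but even this crude multiplication shows how a modest annual premium can compound into a seven-figure difference.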
THE COLLEGE DISCONNECT
Considering this large discrepancy in earnings, higher education should be highly valued and seen as necessary. But surprisingly that’s not the case among a large swath of Americans. Only 37 percent of men and 50 percent of women think that colleges provide an excellent or good value for the money spent by students and their families.4 What’s even more curious, given the boost in earnings, is that many students who would likely benefit from college do not enroll or enroll only to drop out without receiving a degree.
There are many reasons for the disconnect: Qualified students may be unaware of or ignore information on the benefits of attending college. They may have other pressing matters—for example, financial and family obligations. It might be that the measurement of college value is skewed. Or it could just be that smart, directed students will succeed financially whether they have a college degree or not.
Unaware of the Benefits of College
When students are aware of the higher rate of return on investment in higher education, evidence shows that they tend to pursue more education. One recent analysis by Ran Abramitzky and Victor Lavy studied the effect of a sharp change in the rate of return to education in Israeli kibbutzim (communities).5 Some of these communities functioned as communes, where the earnings of all members were pooled together and distributed among the community. Since any earnings needed to be shared with the entire community, the rate of return to education was virtually zero for any individual member. In the late 1990s and early 2000s, some of these communities reformed to allow individuals to keep more of their personal earnings. Since more education-enhanced income could be kept by the individual earning it, “these reforms caused a sharp and salient increase in the returns to education for kibbutz members.” By exploiting differences in the timing of these changes across different communities, Abramitzky and Lavy were able to determine that “students in kibbutzim that reformed early increased their investment in education.” In other words, once students were able to keep the higher earnings that often accompany additional education, they tended to acquire more education.
But some students simply lack information about the economic value of a college degree. In the United States, students from families making less than $50,000 a year tend to “systematically underestimate the returns to education,”6 which with other economic factors can lead to lower enrollment rates among low-income students. Children from families who earn more than $90,000 have a one-in-two chance of getting a bachelor’s degree by age 24. That falls to a one-in-four chance for those from families earning between $60,000 and $90,000, and a one-in-seventeen chance for those earning under $35,000. Students from high-income families are also four times more likely than those from low-income families to attend a selective college.
Measured Rate of Return is Wrong
Another explanation for lower than expected enrollment is that the rate of return may not be the appropriate measurement for students to use in determining whether they should enroll in college.
The first problem is that Mincer’s earnings equation ignores the costs of education.7 This would be similar to a restaurant determining its profits by only counting its sales, without taking into account the cost of the raw ingredients or the labor needed to turn those ingredients into meals. Failing to account for costs means that the rate of return as estimated by the Mincer equation will be higher than the true rate of return, and since the cost of attending college has grown over time, this overestimate has grown more severe over time.
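The difference between ignoring and including costs can be sketched with an internal-rate-of-return calculation. Every numeric input below (tuition, foregone wages, earnings premium, career length) is an illustrative assumption, not data from the studies cited:

```python
def irr(cash_flows, lo=-0.9, hi=1.0):
    """Internal rate of return: the discount rate at which the net
    present value of the cash flows is zero, found by bisection."""
    def npv(rate):
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))
    for _ in range(100):
        mid = (lo + hi) / 2
        # NPV falls as the rate rises for this cost-then-payoff pattern
        if npv(mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical degree: 4 years paying $20,000 tuition while forgoing
# $30,000 in wages, then a $25,000 annual earnings premium for 40 years.
flows = [-50_000] * 4 + [25_000] * 40
rate = irr(flows)  # roughly 10 percent per year, after costs
```

Omitting the cost years, as the Mincer specification effectively does, can only overstate the return; the gap between the two estimates widens as tuition rises.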
A second way in which the high rate of return may be wrong is that it incorrectly credits education with higher earnings when they really may be due to something else. Natural intellectual ability is perhaps the most important alternative explanation for the higher earnings of those who graduate from college. The argument is that students with high intellectual ability will tend to stay in school longer and earn more later in life. But the higher earnings are not necessarily due to additional formal schooling; they are due to the student’s innate intellectual ability.
Not Everyone is Average
Even if the average rate of return for going to college is high, not everyone enjoys the average return. In fact, scholars have observed returns as low as −32 percent (in which case more education actually lowers earnings) and as high as 51 percent.8 This astounding variation in financial outcomes is certainly one reason that students are wise to not base their decisions entirely on the average rate of return. A number of factors influence a student’s outcome.
Academic Performance Matters
How students perform academically is one driver of differences in earnings. Students at the top of their classes have more job opportunities than their less academically stellar peers, and the quality of those jobs in terms of financial compensation also is generally better.
A student’s choice of major is also an important driver of earnings. The Center on Education and the Workforce at Georgetown University has been studying the economic value of college degrees for years, and a recent study analyzed differences in median earnings of recent graduates. It found that median earnings “vary dramatically” by major, “from $29,000 for Counseling Psychology majors to $120,000 for Petroleum Engineering majors.”9
In addition to what you major in, where you enroll may also affect the economic returns from a college degree. Researchers have been probing the question of whether it matters where you go to college for several decades and, for the most part, have found that the more selective the institution, the higher the economic returns for a graduate.
In one study, Caroline Hoxby, a Stanford economist, separated hundreds of schools into eight groups based on selectivity.10 She looked at men who entered these colleges in 1960, 1972, and 1982. A student who entered one of the colleges in the most selective group in 1982 could expect to make $2.9 million over his career, compared with about $1.75 million for a student who enrolled in a college in the least selective group.
In 2009, Mark Hoekstra, a professor at Texas A&M University, compared the salaries of young men who were barely admitted to an unnamed state flagship university with those of applicants who fell just below the admissions cutoff and were ultimately rejected.11 While the two groups were nearly identical in their academic profiles, getting in made a significant difference to their financial futures: those who attended the state flagship went on to earn wages that were 20 percent higher.
TOOLS FOR MEASURING VALUE
When U.S. News & World Report started ranking colleges in the 1980s, it ushered in a new era of consumer information for students, parents, and counselors searching for colleges. Colleges obliged by publishing dozens of admissions brochures, and later, slick websites. The federal government followed with College Navigator, which displays virtually every piece of data the U.S. Department of Education collects on higher education institutions.
Within the last year, new consumer-information websites have come to market that allow users to interact with data in more in-depth ways, allowing detailed comparisons between and within institutions like never before.
The White House released the College Scorecard12 in February, the day after the president plugged the new concept in his State of the Union address. The website allows users to browse colleges based on specific criteria, such as location, enrollment, and majors, or go right to detailed information about individual colleges. The information about specific institutions is separated into five categories: net price, graduation rate, student loan default rate, median borrowing, and employment.
The College Scorecard is a good start, but as many observers have pointed out, there is much room for improvement. Users of the Scorecard, for instance, can’t make side-by-side comparisons of institutions unless they print out information on each college they are considering. The Scorecard gives a link to an individual institution’s Net Price Calculator, but that requires students and their families to enter their financial information multiple times. And the section on employment is blank while the U.S. Department of Education figures out how to provide information on earnings.
Economic Success Measures
These state databases report earnings13 for recent college graduates broken down by college and major. They were first launched in 2012 by College Measures, a partnership of the American Institutes for Research and Matrix Knowledge, a consulting firm, and now include Arkansas, Tennessee, Virginia, Colorado, and Texas. Other states are expected to follow.
These databases are one of the first attempts to bring precise earnings down to the academic program level within a college, and some of the findings defy conventional wisdom. In several states, average first-year salaries of graduates with two-year degrees are higher than those with bachelor’s degrees. Technical degree holders from community colleges often earn more their first year out of school than those who studied the same field at a four-year university. In Tennessee, for example, graduates in health professions from Dyersburg State Community College not only finish two years earlier than their counterparts at the University of Tennessee at Knoxville, but they also earn $5,200 more, on average, that first year after graduation.
The databases have been heavily debated because of data limitations. Since they depend on unemployment insurance records for earnings data, they include only graduates who live in the state and exclude those who are self-employed. In all but Virginia, they cover only public colleges. And all of them track only first-year earnings, which for most college graduates tend to be the lowest of their careers. Designers of the tool say they are working to include more years of wage data.
College Reality Check
This site from The Chronicle of Higher Education takes many of the same elements of the College Scorecard and puts them in an interactive tool that allows users to compare up to five institutions at the same time. Users can search for colleges based on selectivity, location, net price, and graduation rates or navigate directly to college pages. By selecting one of five income ranges, users can also see what the college might actually cost for students like them.
Unlike the Scorecard, College Reality Check14 includes wage data, but it is not as detailed as what is included in Economic Success Measures. For wage data, this site depends on payscale.com, a website that collects self-reported salaries from users.
All these new data sources have been met with skepticism by higher education leaders, who worry that prospective students will place too much emphasis on the economic returns of a college degree. Even so, a few institutions are beginning to design their own tools that attempt to answer the return-on-investment question. Using surveys and social media, St. Olaf College in Northfield, Minn., built a website15 with detailed employment and salary data for 92 percent of its Class of 2011. The website lets users view graduates’ employers and job titles and sort them by major.
MEASURING VALUE: AN IMPOSSIBLE TASK?
One reason college officials dislike focusing solely on the economic returns of higher education is that most believe a college degree serves multiple purposes, many of which are difficult to value. A college education is also what economists often refer to as an “experience good,” meaning its quality and value can’t be discerned in advance. It’s only after you have experienced the product (obtained the education) that you can place a true value on it (and arguably not even then, since you have little to compare it to).
But choosing a college is not the only time in our lives when we make a decision based on limited information about the potential outcomes. Take health care. Despite efforts to inform patients and make it easier to pick doctors and insurance plans, many people are still not clear on the choices they’re making, says Nancy Kendall, an ethnographer at the University of Wisconsin at Madison.
More often, people rely on word-of-mouth, asking friends and colleagues for physician recommendations. When they have a negative experience, they are likely to switch; but when the experience is tolerable, Kendall says, “they tend to just stay because it’s hard to figure out what would be better.”
Those who make very explicit decisions, however, tend to have a major goal in mind: top-notch OB/GYN care, for example. The same goes for college, Kendall says. A student interested in research is going to look for an institution with quality undergraduate research opportunities. “These are the kinds of pieces that are hard to make scorecards for, but it’s what people really want,” Kendall says. So, she asks, why not create materials similar to those that new employees receive when they’re signing up for health insurance? Categorize colleges, so students know what different types of institutions can do for them. They could narrow their choices based on career interests, location, or other priorities. She calls it a “flexi-card,” because while there are core pieces all students want to know, college decisions are often made amid the unpredictable details of personal preference.
In the meantime, Richard Arum, a professor of sociology and education at New York University, says prospective students should push for answers on learning outcomes: How does an institution measure them? Where are there opportunities for improvement, and how is the institution addressing those? “If they can’t answer those, they’re not attending to the academic quality of the programs,” says Arum, co-author of Academically Adrift, a 2010 book that found many students didn’t learn much in college. Ask for copies of class syllabi, Arum says, to see what types of reading and writing assignments are required. It’s time-intensive and perhaps a tedious way to sort through college decisions, but, Arum adds, it’s the best students can do with the limited information available.
MOVING FORWARD: BETTER WAYS TO EVALUATE RETURN ON INVESTMENT
Prestige in higher education is measured not by outputs, such as how much students learn or how they fare in the labor market, but mostly by the inputs tracked in the U.S. News & World Report rankings: factors such as faculty salaries, SAT scores, and acceptance rates. As a result of this prestige race, higher education institutions spend an inordinate amount of time, money, and effort investing in the measures that move the rankings but do not necessarily improve a student’s return on investment.
Without adequate measures of institutional outputs, prospective students find themselves lost as they try to differentiate among the colleges they are considering. Their ultimate decisions on where to enroll have real consequences. Consider data produced by College Measures in Virginia. According to this data, graduates of the business program at the University of Richmond earn, on average, $24,000 more a year than those of Virginia State University or Ferrum College. George Mason University business graduates earn about $22,000 more.16
That is a significant difference, made even more considerable when one looks at the likelihood of graduating from any of those institutions at all. The six-year graduation rate at Ferrum is 31 percent; at Virginia State, 41 percent; at George Mason, 63 percent; and at Richmond, 87 percent. So not only do some graduates at Virginia State and Ferrum have average first-year salaries that are significantly lower than those of the other two institutions, they are also less likely to even make it to graduation day.
Students in every state should have access to this type of information—and more—as they weigh their college decisions. A system to better measure return on investment needs to be national in scope, since a patchwork of state systems will leave many gaps in coverage. Among the factors it should measure:
Graduation and default rates. Both graduation rates and default rates should be expanded to provide more complete and accurate information. Default rates, for example, are reported by cohort, defined as all students who entered repayment within a certain period. However, it would be more useful to distinguish between the default rate of graduates and the default rate of dropouts—and even among graduates with different majors. Similarly, current graduation rates only account for first-time, full-time students, but these students make up less than half of all students currently enrolled in college. Graduation rates should be tracked for all students.
Beyond this, the data to calculate input-adjusted measures should be publicly available. Raw graduation rates make colleges that serve at-risk students look worse than colleges that cater to the affluent. For example, a college that enrolls many low-income students will tend to have a lower graduation rate, even if it provides the same education as a college that enrolls only high-income students. This problem can be avoided by devising input-adjusted graduation rates, which in this example would take into account the income of enrolled students.
Lifetime earnings. First-year earnings matched by College Measures are simply too limiting given that employees’ salaries are often volatile in the years right after college graduation. A more useful dataset would show lifetime earnings, sortable by institution and major, and connect to other government data sources, so policymakers could more easily track the earnings of those who received government aid, such as Pell grants or student loans.
Career mapping. When viewed in isolation, career earnings can be misleading, if for example an institution places most of its graduates in public-service fields. A better consumer information system would give students and policymakers a snapshot of the types of jobs graduates from particular colleges and majors end up taking.
Student satisfaction surveys. Satisfaction means a lot, from restaurant outings to doctor visits. If the experience is a good one, that person is likely to recommend it to friends and other peers. College is no exception. By uniformly collecting and reporting results of student satisfaction surveys, prospective candidates would have much richer information about students’ experiences in class and on campus, what kind of value they put on their four (or more) years at an institution, and whether they believe the experience helped them land a job.
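The graduation- and default-rate recommendation above can be made concrete: a single blended cohort default rate can mask very different outcomes for graduates and dropouts. A minimal sketch, using made-up borrower records (the field names and figures are hypothetical):

```python
from collections import defaultdict

def default_rates(records):
    """Default rate per completion status, plus the blended cohort rate."""
    totals, defaults = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r["status"]] += 1
        defaults[r["status"]] += r["defaulted"]  # bool counts as 0 or 1
    rates = {status: defaults[status] / totals[status] for status in totals}
    rates["cohort"] = sum(defaults.values()) / sum(totals.values())
    return rates

# Six hypothetical borrowers from one repayment cohort
borrowers = [
    {"status": "graduate", "defaulted": False},
    {"status": "graduate", "defaulted": False},
    {"status": "graduate", "defaulted": True},
    {"status": "dropout", "defaulted": False},
    {"status": "dropout", "defaulted": True},
    {"status": "dropout", "defaulted": True},
]

rates = default_rates(borrowers)
# Here the 50 percent cohort rate hides a dropout default rate
# twice that of graduates.
```

The same grouping logic extends to majors or demographic categories once the underlying records are reported at that level.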
Higher education has been selling the degree premium for decades as a reason to pay its ever-escalating prices. Now the time has come for colleges and universities to help build a system that gives better information on the value of a college education.
NOTES
2. For small values, the difference between natural logged values is approximately equal to the percentage change. By taking the natural log of earnings as the dependent variable, the coefficient on the “years of schooling” variable in a regression can be interpreted as the percent increase in earnings from an additional year of schooling. While it is not an actual rate of return, it was similar enough to be called one in the 1970s, and the terminology has stuck ever since.
3. Daron Acemoglu and Joshua Angrist, “How Large are the Social Returns to Education? Evidence from Compulsory Schooling Laws,” NBER Working Paper No. 7444, December 1999, http://www.nber.org/papers/w7444.
4. Paul Taylor et al., “Women See Value and Benefits of College; Men Lag on Both Fronts, Survey Finds” (Washington, DC: Pew Research Center, 2011), http://www.pewsocialtrends.org/files/2011/08/Gender-and-higher-ed-FNL-RPT.pdf.
5. Ran Abramitzky and Victor Lavy, “How Responsive is Investment in Schooling to Changes in Redistribution Policies and in Returns,” NBER Working Paper No. 17093, May 2011, http://www.nber.org/papers/w17093.
6. Julian R. Betts, “What Do Students Know About Wages? Evidence From a Survey of Undergraduates,” The Journal of Human Resources 31, no. 1 (Winter 1996): 27-56.
8. Andrew Leigh and Chris Ryan, “Estimating Returns to Education Using Different Natural Experiment Techniques,” Economics of Education Review 27, no. 2 (April 2008): 149-160.
9. Anthony P. Carnevale, Jeff Strohl, and Michelle Melton, “What’s It Worth? The Economic Value of College Majors” (Washington, DC: Georgetown University Center on Education and the Workforce, 2011).
10. Caroline Hoxby, “The Return to Attending a More Selective College: 1960 to the Present,” in Forum Futures: Exploring the Future of Higher Education, eds. Maureen Devlin and Joel Meyerson (Jossey-Bass, 2001): 13-42.
11. Mark Hoekstra, “The Effect of Attending the Flagship State University on Earnings: A Discontinuity-Based Approach,” The Review of Economics and Statistics 91, no. 4 (November 2009): 717-724.
12. “College Affordability and Transparency Center College Scorecard,” accessed May 7, 2013, http://www.whitehouse.gov/issues/education/higher-education/college-score-card.
13. Virginia only follows graduates who get their first jobs within the state.