Reengineering the University

What's the best way to make scientists?


The information-technology revolution that should have made the traditional university obsolete happened in 1439, when Johannes Gutenberg brought moveable-type printing to Europe. Until then, books had been hand-copied and were too expensive for all but the wealthiest seekers of knowledge. Instead, students would listen as a lecturer (from the Latin legere, “to read”) recited the contents of these unattainably expensive devices. Centuries later, even as the Internet further reduces the cost of knowledge distribution, the lecture hall continues to dominate higher education. If we hope to ever fully capture the benefits of all this innovation—that is, to improve our system for imparting information to students as dramatically as we’ve improved our system for communicating with everyone else—we need to understand why the lecture hall survived Gutenberg.

The most obvious reason is that people like to spend time together. We practice yoga in instructor-led groups, gather at conferences to read papers, and fly thousands of miles to close deals in person. Sometimes we simply learn better in person. But that is not the only relevant social tendency in play. People also like (or need, for professional reasons) to place themselves within a hierarchy. And so universities affirm to the world not just that their students have been educated, but that they have been educated by them. This is particularly true of the schools people find most desirable. When you go to Stanford University, after all, you don’t just get a Stanford education; you also get a Stanford degree.

These social functions build considerable conservatism into the system. If people want prestigious degrees from established institutions, then by definition they can’t buy them from innovators. What’s more, the top schools are quite picky about which students they will teach. Most businesses want more customers, so when a new technology allows them to provide a product or service more efficiently, they compete to do so. But selective universities are in part selling prestige. And since universities gain prestige, in part, by being selective, they have little incentive to reach more people.

If we want to broaden access to learning, then, what we need is a new kind of degree—a rigorous, widely recognized credential with which people may credibly assert that they possess certain knowledge and skills. Separating the question of what you know from the question of where or how you were taught would allow teaching institutions to focus on delivering knowledge the best way possible.

Science educators are well-positioned to lead the way. Scientific inquiry is geared toward generating consensus in ways that literary theory or history are not. Designing universally applicable standards for what constitutes mastery of organic chemistry or basic astronomy will be difficult—but still much easier than applying the same concept to the humanities. And once those standards are set, educational institutions will be free to focus entirely on teaching instead of administering tests and handing out degrees. Science changes over time, of course, and accrediting and educational institutions alike will need to continually revise and update their standards. Government regulators are likely to play a more forceful role than they currently do in monitoring or directly running accrediting institutions. But once we’ve decoupled learning from credentialing, we can expect a huge range of pedagogical approaches to flourish. Autodidacts will have the chance to learn on their own and prove it to credentialers, but others might benefit from a more formal, or indeed more social, approach—discussion groups, say, or perhaps a full-on campus-based education.

In some ways, it is the ineluctable materiality of science itself that will help science educators speed the adoption of digital instruction. The practical need to do lab work in person, with a limited number of students, means that the prestige economics of selective universities won’t be threatened by encroaching virtuality. Initial offerings such as Apple’s iTunes U and OpenCourseWare have tremendous promise, but they don’t yet provide students with the experience of participating in original research. The best researchers, meanwhile, will still want to work with the best students, so traditional screening will persist despite the increasingly easy access to information.

Students and teachers could also err too far in the other direction, however. Splitting education from credentials will increase access and affordability, but it could just as easily bring about hyperspecialization and further separation of science from the humanities. Even as we widen access, therefore, we should avoid replacing broad education with an overly simplistic, “customer-driven” à la carte approach. The university should remain a venue for the shared pursuit of wide understanding.

Getting the mix right will require a great deal of experimentation, but these are experiments well worth undertaking, and ideally without too much concern for commercial success. Making scientists is not the same thing as making a product. Thomas Jefferson, who among his many other achievements launched his own university, would have been saddened by the notion that a college education had no higher purpose than to provide students with status and employment. He argued—persuasively, if idealistically—that a system of higher education should address students as whole people. It should not only “enlighten them with mathematical and physical sciences, which advance the arts, and administer to the health, the subsistence, and comforts of human life” but also “form them to habits of reflection and correct action, rendering them examples of virtue to others, and of happiness within themselves.”

That’s a lot to ask. But if educators and policy planners harness the Internet to their own humanistic ends, rather than submit to its commercial logic, they will be acting in the best interests both of science and of humanity as a whole. Putting the resources employed by the world’s most talented students at the fingertips of everyone with an Internet connection will do more than add to the sum of our knowledge; it will substantially broaden the set of people who end up seeking that knowledge for the sake of understanding rather than simply to make a living. Like the Internet or the printing press before it, science is a tool that can help people see the world as it truly is—a pursuit that’s useful not just to astronomers and engineers, but to all of us.

Matthew Yglesias, a fellow at the think tank the Center for American Progress Action Fund, is the author of Heads in the Sand. He lives in Washington, D.C.
