Nudge Nation: A New Way to Prod Students Into and Through College (EdSector Archive)
10 September 2013 | by Ben Wildavsky
When Harvard law professor Cass Sunstein took his teenage daughter to the Lollapalooza music festival during a Chicago heat wave some years ago, the huge electronic displays that typically show performance schedules also flashed periodic admonitions: DRINK MORE WATER. YOU SWEAT IN THE HEAT: YOU LOSE WATER. “The sign was a nudge,” wrote Sunstein and his coauthor Richard Thaler, one of many described in their bestselling 2008 book, Nudge. Without coercing concertgoers to behave in a certain way, it provided information designed to prompt them to make wiser decisions—increasing their water intake to prevent dehydration.
Thanks in part to Thaler and Sunstein’s work, the power of nudges has become well established—including on many college campuses, where students around the country are beginning the fall semester. While online education and software-driven pedagogy on college campuses have received a good deal of attention, a less visible set of technology-driven initiatives has also gained a foothold: behavioral nudges designed to keep students on track to succeed. Just as e-commerce entrepreneurs have drawn on massive troves of consumer data to build the recommendation algorithms that let firms such as Netflix and Amazon unbundle the traditional storefront experience through customized, online delivery, the architects of campus technology nudges rely on data analytics, or data mining, to improve the student experience.
By giving students information-driven suggestions that lead to smarter actions, technology nudges are intended to tackle a range of problems surrounding the process by which students begin college and make their way to graduation.
New approaches are certainly needed. Just 58 percent of full-time, first-time college students at four-year institutions complete a degree within six years. Among Hispanics, blacks, and students at two-year colleges, the figures are much worse. In all, more than 400,000 students drop out every year. At a time when postsecondary credentials are more important than ever, around 37 million Americans report their highest level of education as “some college, no degree.”
There are many reasons for low rates of persistence and graduation, including financial problems, the difficulty of juggling non-academic responsibilities such as work and family, and, for some first-generation students, culture shock. But academic engagement and success are major contributors. That’s why colleges are using behavioral nudges, drawing on data analytics and behavioral psychology, to focus on problems that occur along the academic pipeline:
• Poor student organization around the logistics of going to college
• Unwise course selections that increase the risk of failure and extend time to degree
• Inadequate information about academic progress and the need for academic help
• Unfocused support systems that identify struggling students but don’t directly engage with them
• Difficulty tapping into counseling services
These new ventures, whether originating within colleges or created by outside entrepreneurs, are doing things with data that just couldn’t be done in the past—creating giant databases of student course records, for example, to find patterns of success and failure that result when certain kinds of students take certain kinds of courses. Like many other technology initiatives, these ventures are relatively young and much remains to be learned about how they can be made most effective. Already, however, nudge designers are having a good deal of success marrying knowledge of human behavior with the capacity of technology to reach students at larger scale, and lower cost, than would be possible in person.
COOLING ‘SUMMER MELT’
For education researcher Benjamin Castleman, the idea of using technology nudges to keep teenagers on track to college was a logical extension of his thinking about adolescent cognitive development. For several years he and his colleagues had conducted experiments gauging the effectiveness of various interventions designed to prevent “summer melt,” the troubling phenomenon in which high school graduates, who have been accepted to a college and plan to attend, never show up in the fall. Summer melt affects 10 to 20 percent of teens nationally, and the numbers are higher among low-income, first-generation-college teenagers. After counselors in the summer melt experiments reported that texting was by far the most effective way to set up counseling sessions with students, Castleman, who is finalizing his Harvard Graduate School of Education doctoral dissertation and recently joined the faculty of the University of Virginia, realized that making even greater use of texting would capitalize on adolescent impulsiveness.
As a onetime high school teacher, Castleman says in an interview, he knew that if students are given the choice between “wading through a bunch of [college entry] paperwork they find confusing, without any help,” or hanging out with friends and working at summer jobs, they tend to put off important pre-college tasks until too late in the summer. Why not turn teenage impulsiveness into an asset by allowing students to complete key pre-college tasks immediately from their mobile phones?
Castleman did just that in a 2012 summer experiment with his collaborator Lindsay Page, a researcher at Harvard’s Center for Education Policy Research. In randomized experiments involving thousands of low-income students in the Dallas Independent School District and districts in Boston, Lawrence, and Springfield, Mass., researchers sent personalized texts to recent high school graduates in the treatment groups to remind them about tasks such as registering for freshman orientation and placement tests. The texts offered help with deciphering financial aid letters and more. The project was coordinated with the colleges that most district graduates attend, so the reminders and accompanying web links took students to the right places to complete tasks and were tailored to the specific deadlines and requirements of each student’s intended school. Interestingly, although each text offered the option to connect students to live counselors for personalized assistance, relatively few students (just six percent in Dallas) sought this help. Results were striking. Just 10 to 12 text messages sent over the summer raised college enrollment among low-income students by more than 4 percentage points in Dallas and by more than 7 percentage points in Lawrence and Springfield, Mass. Castleman notes that the texting intervention had no impact in Boston, where students can access a wide range of college-planning support services, both during the school year and during the summer after graduation. The total cost of this technology nudge: $7 per student, including the cost of counselors’ time.
Why was the intervention so effective? “The summer is a uniquely nudge-free time in students’ educational trajectory,” says Castleman. Given that so many college-intending adolescents receive few reminders about completing key tasks—and that so many are prone to procrastination—well-designed prompts can fill a void.
According to Castleman, “The text intervention has the potential to be several times more cost-effective at increasing college entry among students from disadvantaged backgrounds than other comparable interventions,” such as additional college counseling during the summer after high school graduation.
This study surely won’t be the last word on summer melt. Many factors prevent disadvantaged students from enrolling in college; Castleman, Page, and other researchers continue to explore numerous ideas for tackling those barriers. But as policymakers seek to improve college-going rates, a cheap, scalable, personalized, technology-based intervention such as the Castleman-Page texting campaign seems to have a lot of appeal.
CHOOSING THE RIGHT COURSES
Getting students through the campus gates is just the first step. New undergraduates must make course-selection decisions that have significant implications for their future persistence and success—yet they don’t always receive the best advice. Many low-income and first-generation students can’t rely on guidance from family members. And all undergrads must rely on faculty and staff advisors who don’t necessarily do a good job helping them puzzle through the course catalogue to discern which classes, taken in which order, make the most sense on the way to fulfilling requirements and maximizing the likelihood of academic success.
These problems, according to Tristan Denley, provost and vice president for academic affairs at Austin Peay State University in Tennessee, are among the reasons that nationwide on average bachelor’s degree students take 14 percent more courses than they need to graduate. This wastes resources, leads to a longer time-to-degree, and puts students at risk for not graduating at all. Denley, a mathematician who arrived at Austin Peay in January 2009 from the University of Mississippi, set out to tackle the course-selection challenge as part of a broader look at how Austin Peay could make better use of data to maximize student success.
The idea for a new system came to Denley while he was traveling in Europe and reading three books: Super Crunchers, which describes how large bodies of data can shed light on decision making across disciplines; Moneyball, the Michael Lewis book about how Billy Beane, the general manager of the Oakland A’s, used statistical analysis to rethink the optimal way to assemble a successful team; and Nudge. He realized, Denley says in a telephone interview, that “data-informed choice architecture” could help students make better course selections. “Every campus is sitting on terabytes of historical grade data. It’s just a matter of ... can you take that data, analyze it in the right kind of way, and communicate it?”
That’s just what Denley did, creating the Degree Compass system, which combines data mining and behavior nudges to match students with “best-fit” courses. It draws on data from hundreds of thousands of past students, scrutinizing their classes, grades, and majors. Then it gives current students—and their advisers—course recommendations based on how well similar undergraduates with similar course-taking histories have performed in the past. In certain respects, it’s similar to the you-might-also-like choices on Netflix, Amazon, and Pandora. But Degree Compass is less concerned with students’ likes and dislikes than with predictive analytics—calculating which course selections will best help undergraduates move through their programs of study most successfully—and most expeditiously.
Information generated by Degree Compass also goes to department chairs as “enterprise-scale” reports, which allow them to intervene with students who are having difficulty and need extra tutoring or mentoring.
Students aren’t forced to take any of the recommendations generated by Degree Compass. This is why the system is a technology nudge rather than a more directive tool. But there’s reason to believe that students should take the nudges seriously. While it is too soon for any extensive retention and graduation data to be available, there is already some evidence that Degree Compass can accurately predict grades. Initially, the system correctly predicted whether students would earn a C grade or higher 90 percent of the time. Now, Denley says, more recent research has found that Compass can predict whether a student will get either an A, B, C, or F in a given class with 90 percent accuracy.
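The actual Degree Compass model is proprietary, but the core idea Denley describes, predicting a student's grade in a candidate course from how past students with similar transcripts fared in it, can be sketched in a few lines. Everything below (the function name, the overlap-based similarity weight, the 4-point grade scale) is a hypothetical illustration, not the real algorithm:

```python
def predict_grade(student_courses, candidate_course, past_records):
    """Predict a grade in candidate_course as a weighted average of the
    grades past students earned in it, weighting each past student by
    how many courses they share with the current student.

    student_courses: set of course IDs the current student has taken
    past_records:    list of dicts mapping course ID -> grade (4.0 scale)
    Returns a predicted grade, or None if no past student took the course.
    """
    weighted_sum = 0.0
    total_weight = 0.0
    for record in past_records:
        if candidate_course not in record:
            continue
        # Similarity = number of courses this past student shares with
        # the current student, excluding the course being predicted.
        shared = student_courses & (set(record) - {candidate_course})
        weight = len(shared) + 1  # +1 so every past grade counts a little
        weighted_sum += weight * record[candidate_course]
        total_weight += weight
    return weighted_sum / total_weight if total_weight else None
```

A production system of this kind would draw on hundreds of thousands of records and a far richer similarity measure (majors, grades earned, course sequencing), but the weighting intuition is the same.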
The next step for Degree Compass is MyFuture, launched in November 2012 to match students not just with individual classes but with entire majors. Again, the goal is to look at an individual student’s academic record and predict future grades, match that against the trove of data on the university’s servers, and recommend the academic specialties most likely to lead that student successfully to a degree. Moving beyond major recommendations, MyFuture also gives students links showing possible career options for a given major, as well as average salary data and current job prospects in the field.
GETTING THE ACADEMIC GREEN LIGHT
The need for better information, of course, doesn’t end when students pick their classes. On the contrary, undergraduates moving through their coursework often don’t know as much as they should about their academic progress, which means that struggling students don’t seek out help early enough to make a difference in their class performance. Here, too, technology nudges using data analytics are proving to be valuable tools for letting students know when they are at academic risk.
Probably the best known of these nudges is Purdue University’s Course Signals program, which was piloted in 2007 and has reached nearly 30,000 students in 122 courses and 246 course sections. Created by John Campbell, Purdue’s associate vice president for academic technologies (who recently moved to West Virginia University as associate provost and chief information officer), Course Signals relies on the kind of large data-set mining now common in the corporate world. Its proprietary algorithm predicts students’ risk status in the classroom by drawing on a wide range of data:
• Class performance, based on marks earned to date
• Effort, measured by how often students interact with Blackboard Vista, the learning management system used by Purdue
• Previous academic history, including high school grades and standardized test scores
• Student characteristics, such as age and number of credits taken
Students who log onto the learning management system—or who receive a personalized email from their professors asking them to log on—see one of three traffic signals: Red (Stop and get help), Yellow (Caution, you are falling behind), or Green (Keep on going). By clicking on the signal, students facing difficulties can receive specific feedback from the instructor, including guidance about which tutoring or study resources they can access to improve their academic progress. If they wish, they can do this on the go: Signals is available through the Blackboard mobile app.
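Purdue's algorithm itself is proprietary, but the traffic-light logic it feeds can be sketched as a weighted risk score over the four kinds of data listed above. The weights, normalization constants, and red/yellow/green cutoffs below are invented for illustration and are not the real Course Signals formula:

```python
def course_signal(grade_pct, lms_logins_per_week, prior_gpa, credits_attempted):
    """Map four risk factors to a red/yellow/green signal.

    All weights and thresholds here are hypothetical; the actual
    Course Signals algorithm is proprietary.
    """
    # Normalize each factor to 0 (low risk) .. 1 (high risk).
    performance_risk = 1.0 - min(grade_pct / 100.0, 1.0)
    effort_risk = 1.0 - min(lms_logins_per_week / 5.0, 1.0)
    history_risk = 1.0 - min(prior_gpa / 4.0, 1.0)
    load_risk = min(credits_attempted / 18.0, 1.0)
    # Weight current performance most heavily, then effort, then history.
    risk = (0.4 * performance_risk + 0.3 * effort_risk
            + 0.2 * history_risk + 0.1 * load_risk)
    if risk >= 0.5:
        return "red"     # Stop and get help
    if risk >= 0.25:
        return "yellow"  # Caution, you are falling behind
    return "green"       # Keep on going
```

The design point is that no single input decides the signal: a student with strong grades but weeks of silence on the learning management system can still drift into yellow, which is exactly the early-warning behavior the program is after.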
The program has encouraged students to seek out academic support, thus improving retention rates. When Signals was used for some sections of a large biology course, for example, students who received messages from the Signals system were much more likely to visit the biology resource center than their counterparts in non-Signals classes. Overall, the Signals students received better final grades. The four-year retention rate for the 2007 cohort of Purdue undergrads who used Signals at least once is 87.4 percent, versus 69.4 percent for those who did not—an impressive 18 percentage point difference.
The spike in student retention, at 93 percent, was even higher for students who took two or more Signals courses. That is particularly noteworthy given that those students were less prepared for college than their non-Signals counterparts, as measured by SAT scores that were 50 points lower. “In short,” wrote Campbell and two former Purdue colleagues, James E. Willis, III, and Matthew D. Pistilli, in a May 2013 EDUCAUSE Review article, “by receiving regular, actionable feedback on their academic performance, students were able to alter their behaviors in a way that resulted in stronger course performance, leading to enhanced academic performance over time.” In brief, they conclude, the program’s impressive results stem from “using big data to offer direct feedback.”
That said, initial graduation rate results for Signals participants have been more mixed. Students from the 2007 cohort who took one Signals class saw a 4 percentage point boost in their four-year graduation rates, to 45 percent, compared to those never enrolled in a Signals course. But those who took two or more Signals courses saw a 2.5 percentage point decline in their four-year graduation rates. There’s an explanation for this counterintuitive finding, according to Purdue officials. Students taking more than one Signals class often are academically weaker than their peers, who may have tested out of large “gateway” courses by taking Advanced Placement classes in high school. Thus, students who take two or more Signals courses often take longer to earn their degrees because of the time they must spend taking required courses. What’s more, many Purdue majors are designed as five-year programs. For these reasons, Purdue officials predicted that five-year graduation rates for undergraduates who have taken two or more Signals classes would surpass those of their peers. They turned out to be correct. The five-year grad rate for non-Signals students in the 2007 cohort was 61 percent. Undergraduates who took two or more Signals classes, despite average SAT scores that were 50 points lower than their non-Signals peers, had a 71 percent graduation rate—a 10 percentage point difference.
Technology nudges that rely on algorithms created by analyzing large volumes of student data might seem likely to prompt privacy concerns. Among students taking Signals classes, however, that has not been the case, according to Campbell. “I was always in genuine fear that I’d have my picture on the front of the student newspaper with the caption ‘Big Brother Is Watching,’” he says in an interview. But in the age of social media and pervasive Googling, he adds, “students have a very different sense of privacy and how data can be used.” Among some faculty members, however, there has been apprehension that giving students a high number of yellow and red signals could be viewed as a sign of deficient teaching. Says Campbell, “I get more Big Brother questions from the faculty than I do from students.”
COPING IN CLASS AND LIFE
The technology nudges described here—texting to reduce “summer melt,” help with course selection, and warnings about problematic course performance—are each targeted at a specific moment in a student’s educational trajectory. Another initiative, a for-profit start-up called Persistence Plus, aims to provide more general behavioral interventions throughout undergraduates’ academic careers. Calling itself “the Weight Watchers of college success,” Persistence Plus provides students with regular personalized nudges either via text or through iPhone or Android apps. Using data from partner universities’ data management systems, the company sends undergrads messages about time management and class deadlines, offers help coping with setbacks, and connects students to their peers in social networks organized around academic goals.
The Boston-based firm was founded in late 2011 by Jill Frankfort and her husband Kenny Salim, who is now superintendent of the Weymouth School District. Incubated in part at the Ewing Marion Kauffman Foundation’s labs program for education start-ups, Persistence Plus has begun a series of pilot projects to test its model in different settings. In one partnership, Persistence Plus worked with the University of Washington Tacoma to nudge students in online introductory math classes. The goal was to see whether behavioral intervention could help lower-division undergraduates who are studying online, a population that often performs poorly. Students received daily reminders, often requesting a response. Behavioral research shows that students are more likely to complete a task when they commit to doing it not only at a specific time, but at a specific place as well. So the Persistence Plus iPhone app might text this message: “Students who pick specific times to finish assignments do better. Your personal essay is due soon. When and where will you finish it up?” To which a student might reply, “Tuesday at the library.”
Text messages also ask students about their sense of well-being: “The beginning of the semester can sometimes feel overwhelming. How are you feeling?” If an undergrad’s response indicates that he or she is having personal difficulties, Persistence Plus sends the student messages geared toward the challenge he or she faces, or can connect students who have larger needs to campus counselors for individual support. For now, the firm’s behavioral interventions combine what it calls “machine and human intelligence, and manual and automated process.” Along with differentiating the messages it sends based on what students report about their experiences, Persistence Plus staff can also become personally involved. (On one occasion, according to Frankfort, a student told Persistence Plus—but not any faculty or staff—that she was about to fail all her classes because of a medical problem. The firm was able to help her arrange a medical withdrawal.) But over time it plans to automate and personalize the experience in a way that can be scaled “to an unlimited number of students.”
The Tacoma pilot found that students who received nudges performed better academically than those who didn’t. The company is still in a data-gathering, start-up stage—it is developing pilot projects with several universities in 2013–2014. In this experimental spirit, Persistence Plus has asked students to design their own motivational messages in an effort to find out which are most effective. (One student’s message: “Stick with it, sit down and study your ass off, you brilliant bastard.”) Over time, the company plans to assess the effectiveness of the range of messages students design, then tailor its nudges to different student personalities.
Perhaps even more than some other technology nudges, the Persistence Plus model suggests that worries about the potentially dehumanizing effects of technology in an educational setting may be misplaced. The firm’s model, while still a work in progress, may show just the opposite—that technology can be used to personalize student support through outreach of a kind that simply isn’t possible using traditional face-to-face methods.
Frequent messages keyed to course deadlines and personal well-being give students the message, both literally and figuratively, that the university cares about them and their success.
COUNSELING BY PHONE
An underlying philosophy behind behavioral nudges, though rarely stated directly, is that nudges are paternalistic—they tell students what to do (albeit without coercion) rather than assuming they can figure out pathways to academic success on their own. That’s a good thing, according to Stanford University economist Eric Bettinger. Together with a Ph.D. student, Rachel Baker, he conducted a study for the National Bureau of Economic Research assessing the effectiveness of another for-profit service, InsideTrack, which provides telephone counseling to undergraduates in a directive way that seems to be good for them. As Bettinger and Baker write:
Oftentimes in higher education, we assume that students know how to behave. We assume that they know how to study, how to prioritize, and how to plan. However, given what we know about rates of college persistence, this is an assumption that should be called into question.
The largest coaching firm in the country, InsideTrack, works with more than 40 colleges, including public, private, and for-profit institutions. It advises students, many of them nontraditional undergraduates enrolled in online programs, about issues that can get in the way of their academic progress. The company’s philosophy is to try to connect students’ life goals with their schoolwork. Practical advice to students addresses common problems such as academic time management. But counselors (the firm employs more than 300, located primarily in Portland, Ore., San Francisco, and Nashville), also spend significant time on practical, life-management skills that can make a big difference in whether students stay enrolled—helping students deal with challenges from financial problems to working around job schedules to child care responsibilities.
The results of the Bettinger/Baker study, released in March 2011, were striking. After just six months of coaching, 8,000-plus students in the test group were five percentage points more likely to stay in school than the 5,500 in the control group. After one year, when the coaching program ended, a dismaying number of all students—more than half—had dropped out. But the 5 percentage point spread between the coached and non-coached students remained, diminishing only slightly by the 24-month mark. Coaching had an impact regardless of a student’s SAT or ACT scores, Pell Grant status, or age, though male students were more likely to benefit than female students.
While phone calls are not as cutting edge as texts or apps, they share the key characteristic that makes technology nudges appealing: scalability. In-person counseling is more expensive—and in some cases less convenient for students—than advice delivered over the phone. A key finding of the Bettinger/Baker paper is that the cost per student of the telephone-coaching intervention was $500 per semester—not trivial, but a figure that compares favorably to the modest retention effects achieved through measures such as paying undergrads to attend campus counseling or boosting need-based financial aid by $1,000 or more annually.
The cost-savings of phone counseling are possible not just because of technology but because of specialization. Counseling typically is part of a large bundle of services offered on a college campus (though rarely in the systematic, intense form provided by InsideTrack). Like Persistence Plus, InsideTrack unbundles one slice of counseling. By specializing, by focusing on the nudges most likely to influence student behavior, by using technology (including, at times, texting and email in addition to phone calls), and by taking its services to scale, the company is able to offer less expensive counseling than the traditional model. Based on the evidence to date, that combination is producing clear benefits for students.
BUILDING POLICIES FOR SUCCESS
The examples cited here are by no means the only instances of colleges or companies giving students technology-driven prompts to persist to graduation. But many colleges and universities continue to use more traditional—and less effective—student support methods. What might be done to encourage more widespread use of technology nudges—whether to combat summer melt, promote better course selection and scholarly progress, provide more effective coaching with personal and academic challenges, or to give students personalized digital encouragement simply to persist through college?
Louis Soares, a Center for American Progress senior fellow, offers several policy recommendations for expanding personalized higher education technology in an October 2011 report. He suggests the U.S. Department of Education could create competitive grants to promote the use of technology tools with a special focus on low-income students. The government could try to integrate existing data collected by the Education Department with user-generated data from colleges to inform and improve student decision-making. And the Department also could generate guidelines for how this data could be shared “in a social environment” while protecting student privacy.
These are worthwhile ideas. But ultimately the promise of behavioral nudges, delivered via technology, seems unlikely to be unleashed first and foremost through public policy. Colleges themselves need to recognize the value of these new tools and move to adopt them as they prove their worth. This will happen more quickly on some campuses than on others. Nudge initiatives, like other educational technology ventures—from digital textbooks to customized, computer-driven teaching modules—are going through a period of trial and error. Early results seem good, and some high-quality assessments have begun. But advocates who want nudges to spread will need to ensure that they are studied rigorously by outside analysts.
If nudges aren’t as effective as they should be—and they won’t always be—their creators will need to be willing to make course corrections. In an increasingly data-driven world, this is healthy and should become routine. If behavioral nudges continue to become more effective, and if they become part of the internal culture of more universities, they should become an increasingly important tool for promoting student success.
 Richard H. Thaler and Cass R. Sunstein, Nudge: Improving Decisions About Health, Wealth, and Happiness, Yale University Press, 2008.
 “Graduation Rate Fast Facts,” National Center for Education Statistics, accessed August 29, 2013, http://nces.ed.gov/fastfacts/display.asp?id=40; Jeffrey J. Selingo, College (Un)Bound: The Future of Higher Education and What It Means for Students, New Harvest, 2013; “The 37 Million Person College Dropout Crisis,” Complete College America Blog, December 5, 2012, accessed August 29, 2013, http://www.completecollege.org/blog/post/the_37_million_person_college_dropout_crisis/.
 Ian Ayres, Super Crunchers: Why Thinking-by-Numbers Is the New Way to Be Smart, Bantam, 2007.
 Michael Lewis, Moneyball: The Art of Winning an Unfair Game. Norton, 2003.
 James E. Willis, III, John P. Campbell, and Matthew D. Pistilli, “Ethics, Big Data, and Analytics: A Model for Application,” EDUCAUSE Review, May 6, 2013.
 Matthew D. Pistilli, Kimberly Arnold, and Matt Bethune, “Signals: Using Academic Analytics to Promote Student Success,” EDUCAUSE Review, July 18, 2012.
 Most examples in this section are drawn from a July 2012 case study published in EDUCAUSE Review: “Analytics, Nudges, and Learner Persistence,” by Frankfort, Salim, and two University of Washington officials, Colleen Carmean and Tracey Haynie.
 Quoted in Scott Jaschik, “The Power of the Nudge,” Inside Higher Ed, March 10, 2011, from which the summary of the Bettinger/Baker study in the following paragraphs is drawn.