Purdue University in Indiana has for years touted the ability of its early-warning system, Signals, to improve student retention, but a series of blog entries analysing the institution’s claims has found no causal connection between use of the system and students’ tendency to stick with their studies.
Purdue’s method of structuring its early-warning system has permeated the industry, and research invalidating its results could have sent shockwaves through its competitors. The university’s approach is not being called into question, however, only its claims to boost retention - which, on one hand, is likely to come as a relief to the many software providers that have attempted to recreate what Signals does. On the other hand, the rationale given by many for using such early-warning systems is in fact to improve retention.
Signals combines demographic information with online engagement data and produces a red, yellow or green light to show students how well they are doing in their courses - and gives that information to their professors so they can step in before students drop out or fail. Ellucian, which provides administrative software, sells it as the commercial product Course Signals, while educational software providers Blackboard and Desire2Learn offer many of the same features through Retention Center and Student Success System, respectively.
Michael Caulfield, director of blended and networked learning at Washington State University at Vancouver, decided to take a closer look at Signals after Purdue claimed in a September press release that taking two Signals-enabled courses increased students’ six-year graduation rate by 21.48 per cent. Mr Caulfield described Purdue research scientist Matt Pistilli’s statement that “two courses is the magic number” as “maddening”.
Comparing the retention rates of the 2007 and 2009 cohorts, Mr Caulfield suggested that much of what Purdue described as data analysis simply measured how many courses students took. As Signals moved beyond its pilot in 2008 and more students across campus enrolled in at least one such course, Mr Caulfield found the retention effect “disappeared completely”.
Put another way, “students are taking more … Signals courses because they persist, rather than persisting because they are taking more Signals courses”, Mr Caulfield wrote.
His findings were last month corroborated by Alfred Essa, McGraw-Hill Education’s vice-president of research and development and analytics, who wrote a simulation that substituted “received a piece of chocolate” for “took a Signals-enabled class”.
“The simulation data shows us that the retention gain for students is not a real gain (i.e., causal) but an artifact of the simple fact that students who stay longer in college are more likely to receive more chocolates,” he concluded. “So, the answer to the question we started off with is ‘No.’ You can’t improve retention rates by giving students chocolates.”
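The confounding Mr Essa describes is easy to reproduce: anything handed out each semester will pile up for students who stay enrolled longer, so it will look predictive of graduation even when it does nothing at all. The Python sketch below is not Mr Essa’s simulation but an illustration of the same idea, with arbitrary assumed values for the dropout and chocolate rates:

```python
import random

random.seed(42)

NUM_STUDENTS = 10_000
MAX_SEMESTERS = 8       # assumed length of a degree programme
DROPOUT_RATE = 0.10     # assumed chance of leaving in any given semester
CHOCOLATE_RATE = 0.5    # assumed chance of receiving a chocolate each semester

students = []
for _ in range(NUM_STUDENTS):
    chocolates = 0
    graduated = True
    for _ in range(MAX_SEMESTERS):
        # Dropping out is completely independent of chocolates received.
        if random.random() < DROPOUT_RATE:
            graduated = False
            break
        if random.random() < CHOCOLATE_RATE:
            chocolates += 1
    students.append((chocolates, graduated))

def graduation_rate(group):
    return sum(1 for _, graduated in group if graduated) / len(group)

fewer = [s for s in students if s[0] < 2]
more = [s for s in students if s[0] >= 2]
print(f"graduation rate, fewer than two chocolates: {graduation_rate(fewer):.1%}")
print(f"graduation rate, two or more chocolates:    {graduation_rate(more):.1%}")
# The second group graduates far more often, even though chocolates have no
# causal effect: persisting longer simply means collecting more of them.
```

The gap between the two printed rates is the kind of apparent “retention gain” that vanishes once the direction of causality is taken into account.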
Mr Essa helped design the Student Success System as a strategy director at Desire2Learn, but said that it and other products inspired by Signals do not face an existential crisis.
“The aim of these early warning alert systems at the course level is just to make sure that students are performing well,” Mr Essa said. “It’s a huge leap to go from that and say, ‘Oh, and we’re also going to improve your retention rates directly.’ ”
Dr Pistilli defended the claims about Signals’ ability to increase retention - with the caveat that more research needs to be done. “The analysis that we did was just a straightforward analysis of retention rates,” he said. “There’s nothing else to it.”
To ensure an empirically grounded analysis of Signals, Mr Essa urged Purdue to give researchers access to as much data as possible. Dr Pistilli said he was open to taking part in that conversation, but pointed out that granting open access could violate students’ privacy rights.
With Signals marking its fifth anniversary this year, Dr Pistilli said “it was probably just a matter of time for people to start looking for these pieces and begin to draw conclusions”. In that sense, the discussion about early-warning systems resembles that of other ed-tech innovations, like flipping the classroom and massive open online courses, where hype drowns out any serious criticism.
“I think part of the answer is we’re really bad at statistical reasoning,” Mr Essa said. “Even experts get tripped up by statistics, and it’s very easy to make claims like this, but it’s difficult to dig in and try to make sense of it.”
He added: “Maybe one of the conclusions that could be derived from this is that we really don’t have a strong community to test and validate these claims? Maybe that’s really the starting point of discussion in the academic community. As we move forward with new technologies in learning analytics, how and who will be evaluating the claims that people put forward?”