When is the right time to intervene with a student who might be struggling in their learning journey? A complex question, but one we sought to answer, in part, with a big data project using data collected on our students over many years. The project, called the Student Journey Project, was the brainchild of our COO, Associate Professor Rhys Johnson, and utilized the expertise of Associate Professor Keith B. Carter and his KBC consultancy as well as our own IT, Academic and Operations talent. The home of the project and the data was the School of Diploma Studies, of which the author is Dean. This article explores the drivers, decisions, and details of that project, as well as some early results.
Firstly, we had to define what data we wanted and make it less big, or even to question whether it was big at all. You see, we’d been challenged at a recent Kaplan Industry Advisory Board Meeting by the Chair of the IT Board, Mr. Shreeniwas Iyer, who described “Big Data” as a tautology. “Big is like the prefix multi in multimedia, in the 21st century it’s redundant; just as all media is now convergent and so by nature any one medium has multiplicity, so too data is of such a vast volume that the big is assumed, it’s all just data”, he asserted. He should know: Mr. Iyer is a senior coder at Quantcast, a company that handles advertising data volumes in the billions every day.
We came to see what Mr. Iyer meant as we completed the User Acceptance Testing on this project, which enabled us to mine 82 million data points pertaining to our students here in Singapore. Our goal was to make data-driven decisions on when to intervene with students who were at risk of not passing or graduating.
1. The Drivers
A. Accounting for Failure in Accounting
Accounting can be the hindrance of many a well-intentioned commerce, business or related interdisciplinary student (trust me, mine is the voice of the bookkeeping-challenged). The three D’s cause much consternation: Double-sided entries, Debits/Credits, and D grades (and I’m not talking about Distinctions). At Kaplan, for non-accounting majors, first-semester accounting is consistently the lowest-performing subject in terms of students’ first-time pass rates.
We knew from the literature and our own experience that our challenge was around early intervention, that is, making a meaningful connection with the student over and above that of their dedicated professor, in order to identify the barriers to their learning before it is too late. Our courses are delivered in ways that reflect our diverse student population, ranging from part-time working adults to full-time students, so it was imperative that we have extra real-time data in order to act in a timely manner. Moreover, some 13 of our courses offer accounting, with different lecturers on different days at different times in different semesters, a challenge that required a technical solution to accommodate the variability and scale of delivery.
B. Divergent Systems with Important Data
Having the right data at our disposal so as to arrive at the right time to intervene and rescue students required the convergence of a myriad of systems (see Figure 1 below): the Student Management System for final performance data, the Moodle Learning Management System for students’ online participation and formative (midstream) performance data, the Digital Entry System logs for real-time attendance data, and the Navision accounting system for payment confirmation. An aggregator consolidated all of this, and Tableau then provided the dashboards.
Figure 1: Key Technology Data Sources Subsumed Into Tableau Dashboards
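To make the aggregation step concrete, here is a minimal sketch of merging the four sources into one 360-degree student profile. The system names (Student Management System, Moodle, Digital Entry System, Navision) come from the article, but the record layouts, field names, and sample values are entirely hypothetical, illustrative placeholders, not the project’s actual schema:

```python
# Hypothetical extracts, keyed by student ID.

# Final performance data from the Student Management System
sms = {"S001": {"gpa": 1.8}, "S002": {"gpa": 3.4}}

# Online participation data from the Moodle LMS
moodle = {"S001": {"logins": 3}, "S002": {"logins": 27}}

# Real-time attendance data from Digital Entry System logs
entry_logs = {"S001": {"attendance_pct": 55}, "S002": {"attendance_pct": 92}}

# Payment confirmation from the Navision accounting system
navision = {"S001": {"fees_paid": True}, "S002": {"fees_paid": True}}

def aggregate(student_ids):
    """Merge all four sources into one profile per student."""
    profiles = {}
    for sid in student_ids:
        profile = {}
        for source in (sms, moodle, entry_logs, navision):
            # Missing records simply contribute nothing to the profile.
            profile.update(source.get(sid, {}))
        profiles[sid] = profile
    return profiles

profiles = aggregate(["S001", "S002"])
print(profiles["S001"])
```

In the actual project this consolidated view is what fed the Tableau dashboards; the sketch simply shows the join-by-student-ID idea.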
2. Decisions to Make: Questions Are the Answers
Our straightforward approach should not obscure the effort required to reach a conclusion. With 82 million data points available to us, our temptation was to quickly generate answers. But as we found out, formulating the right questions was crucial to the development of our program. Enter Associate Professor Keith B. Carter of the National University of Singapore School of Computing, and his colleague, Ms. Ann Luo.
Associate Professor Carter came in and showed us the direction we needed in order to get the right answers. He started working with us low-tech, setting out to ensure we clearly defined our query so we could better plan a course of action. He took the time to learn what we wanted to know, challenging our Socratic paradox: we knew that we knew nothing. As the project plan below demonstrates, Step 1 was a discovery phase aimed at reaching consensus on which Business Questions we wanted answered (Carter, 2016). While the framework below seems to give each step a consistent time frame, were it drawn to scale, Step 1 would be several times bigger than the other steps. Like the metaphor of working out the most efficient and effective way to cut down a tree, Keith taught us to sharpen our axes before we approached the tree.
Our axes were our business questions; we came up with questions around the 5 W’s and 1 H:
• Who are our students by entry qualification?
• What is their demographic profile: age, nationality, and so on?
• When do they withdraw (attrite) if they do?
• Where do they live and does this impact results?
• How does their attendance and other variables relate to their performance?
• Why is early success or failure a predictor of later success or failure?
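One way such questions become actionable is by translating them into simple flags over the aggregated profiles, for example the “How does attendance relate to performance?” question. The thresholds and field names below are illustrative assumptions, not the project’s actual rules:

```python
# Hedged sketch: turn a business question into an at-risk flag.
# The attendance and GPA thresholds are hypothetical examples.

def flag_at_risk(student, attendance_floor=70, gpa_floor=2.0):
    """Return True when a profile suggests early intervention is warranted."""
    return (student["attendance_pct"] < attendance_floor
            or student["gpa"] < gpa_floor)

# Two illustrative students from a hypothetical cohort.
cohort = [
    {"id": "S001", "attendance_pct": 55, "gpa": 1.8},
    {"id": "S002", "attendance_pct": 92, "gpa": 3.4},
]

at_risk = [s["id"] for s in cohort if flag_at_risk(s)]
print(at_risk)  # ["S001"]
```

A rule like this is only a starting point; the point of the dashboards was to let staff review the full 360-degree profile before deciding how and when to reach out.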
With this starting list of questions, the iterative process of changes and modifications kept us coming back to our priorities and ensured the project’s alignment with the broader institutional Mission and Vision around the student experience, pedagogy and the learning journey.
3. Details: Three Little Gems From the Big Mine
Of the countless working groups and approaches implicit in the detail of the Framework (Figure 2), three stood out for their contribution to the resulting dashboards and subsequent timely interventions for students:
1. Data Quality Group: having many different pairs of eyes both looking at the data and coming to consensus on definitions of what the data represented was vital for validating not only the data but also the project. For example, we had previously attributed some of our inaction to uncertainty about the student data; with consensus that the data was valid, we could no longer do so;
2. Data Visualization Proof of Concept (three months): with the data assured, agreeing on the aesthetic of the dashboards was vital in ensuring they could be used by a range of users, from high-end strategic users to those who would need the 360-degree profile of a student with which to contact them promptly via the best channel;
3. Staff Training on the actionable nature of the intelligence: reinforcing the mindset that a duty to act on the data was paramount, or as one project member said, “The dashboard won’t meet the student, we will”.
4. Early Days, But Some Results
From big data, little patterns show. Even in User Acceptance Testing (UAT), we began to see illuminated the profiles of students with a greater tendency to struggle than others. We also saw the significance of accounting and other first-semester subjects as predictors of future overall performance. This only served to vindicate our efforts, but it was time to act on the data!
As the results for the Diploma Course pilot below show, the dashboards revealed an at-risk population, defined as students who failed all subjects in a term, of 12.6 percent of the whole cohort, which was reduced to 8.7 percent post-intervention. The almost 4 percent improvement across the overall course population was made up of 2.4 percent of students now passing all subjects (19 percent, or 8, of the 42 who received intervention) and a further 1.5 percent passing up to half their subjects (12 percent, or 5 students).
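The pilot figures hang together arithmetically, which is worth checking. The cohort size of roughly 333 students is inferred from the article’s own percentages (42 intervened students representing 12.6 percent of the cohort), not stated directly:

```python
# Sanity-check the reported pilot figures using only numbers from the
# article. Cohort size (~333) is inferred, not given explicitly.

cohort_size = round(42 / 0.126)   # 42 at-risk students were 12.6% of the cohort
passed_all = 8                    # 19% of the 42 who received intervention
passed_half = 5                   # a further 12% of the 42

# Students leaving the "failed all subjects" group, as a share of the cohort
improvement = (passed_all + passed_half) / cohort_size * 100
post_intervention_at_risk = 12.6 - improvement

print(cohort_size)                           # 333
print(round(improvement, 1))                 # 3.9 percentage points
print(round(post_intervention_at_risk, 1))   # 8.7
```

The recomputed 3.9-point improvement and the 8.7 percent post-intervention at-risk figure match the reported results, and 8/333 and 5/333 reproduce the 2.4 percent and 1.5 percent shares respectively.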
A solid start, and more would-be accountants for the world. Ideally, though, we would like to enable all students to succeed, surely the role of any institution, and one made easier in part by actionable intelligence: the right information, in the right hands, at the right time, in order to improve outcomes. In these projects we are using it to change lives!
Acknowledgements: In addition to Profs. Carter and Johnson and Ms. Ann Luo, mentioned in the article, the following should be credited with the project’s success: the Kaplan Singapore Student Journey Project Team, consisting of Constance Chee, Alan Lam, Peter Baeck, Nelson Ang, Phani Vemuri, and Sherly Setianingsih, and countless others who worked with us.