What does it take for a university to become a predictive analytics powerhouse? King’s College London shared some key learnings at our recent event.
Inspire. Inform. Innovate.
That was the overarching theme of our recent joint event with King’s College London and Microsoft, A Higher Education Journey to Predictive Analytics, held at the iconic Bush House in London.
We were discussing KCL’s ongoing initiative to move beyond reactive use of siloed data to embrace proactive, predictive analytics – and the role Microsoft technology and Adatis support have played in the project’s initial successes.
With so many excellent speakers and so much great insight packed into one day, it’s difficult to distil all the learnings from KCL’s journey into one recap – so here are just a few of our key takeaways from the event:
1: No university has ever said it didn’t have enough data
The first session of the day saw Microsoft UK’s Director of Education, Chris Rothwell, discussing how data factors into Higher Education in the 4th Industrial Revolution.
All educational institutions, whether they’re primary schools or universities, are discussing data, and what they can achieve with it. But it’s never a case of ‘we don’t have enough data’ – in fact, there’s often more than anyone knows what to do with.
Every organisation is collecting mountains of information from an ever-growing range of sources; the real problem is that data is rarely in the right place or being used for the right purposes.
This was reiterated neatly by Richard Salter, KCL’s Director of Analytics, later in the day: many institutions are data-rich, but intelligence-poor. In short, everyone understands the value of data, but not everyone is in a position to leverage it effectively.
2: The golden record can be elusive, but it’s vital
The first step on the journey to effective analytics is establishing a single source of truth – an agreed-upon golden record that underpins everyone’s understanding of the whole organisation.
Of course, in a university – with separate faculties working in silos and everyone working to slightly different definitions – this is easier said than done. Even something as simple as how to count student enrolment might not be consistent across different departments.
Bringing all that data together, ensuring its quality and accuracy, and making it available to those who need it (not just top management) should be the priority in any higher education analytics project.
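To make the enrolment example concrete: the problem is rarely the counting itself, but that each department applies its own definition. A minimal sketch (all names and statuses here are hypothetical, not from KCL’s actual systems) of how differing definitions produce different numbers from the same records, and why an agreed golden-record definition matters:

```python
from dataclasses import dataclass

@dataclass
class Student:
    id: str
    status: str  # e.g. "enrolled", "deferred", "withdrawn"

# The same underlying records, seen by every department.
students = [
    Student("s1", "enrolled"),
    Student("s2", "deferred"),
    Student("s3", "enrolled"),
    Student("s4", "withdrawn"),
]

def admissions_count(records):
    # Hypothetical admissions definition: anyone not withdrawn.
    return sum(1 for s in records if s.status != "withdrawn")

def faculty_count(records):
    # Hypothetical faculty definition: only actively enrolled students.
    return sum(1 for s in records if s.status == "enrolled")

def golden_enrolment(records):
    # The agreed, institution-wide definition, applied uniformly
    # everywhere - the "single source of truth".
    return sum(1 for s in records if s.status == "enrolled")

print(admissions_count(students))  # 3 - includes the deferred student
print(faculty_count(students))     # 2
print(golden_enrolment(students))  # 2 - one number everyone works from
```

The point isn’t which definition is “right” – it’s that the institution agrees on one and applies it everywhere, so every report starts from the same number.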
3: Your detractors can be your greatest advocates
Any major transformation will likely be met with some resistance, especially in long-established institutions where data and technology habits have been entrenched for years.
(Richard recounted a memorable example where a fire safety officer was dealing with no fewer than 158 separate spreadsheets, which were manually updated and trawled through for reports each quarter.)
Dissenters can be disruptive – but if you can communicate the value of the project and bring them over to your side, they will be the best advocates you could ask for. The important thing to remember is that active dissent equals engagement; often, there will be validity in the concerns they raise, and that can help improve your outcomes in the long run.
What you need to watch out for is passive resistance – the signs that some of your stakeholders aren’t really engaged with the process at all. It can be difficult, or even impossible, to get everyone on board, so you’ll need to consider how you minimise their disruptive impact on the project.
One useful method is to give them part of the project they ‘own’ – it doesn’t have to be a major component, even something as simple as the font style or colour palette used in reports can work. If they have a sense of accountability for the initiative, however small, they’re more likely to get involved.
4: Be prepared for the goalposts to shift
A recurring theme in several of the day’s sessions was the perceived value of a business case and a defined strategy.
Though these are both important factors for shaping your journey, institutions risk getting caught up in trying to achieve a rigid set of objectives using specific methods. If you can’t alter your course, you might end up with an analytics ecosystem that isn’t fit for purpose.
Change will happen, and you need to be ready to adjust accordingly. As our own Martyn Bullerwell explained in his session: if you ignore changes to your goals, you’ll probably find yourself at the end of the project with exactly what you asked for, but not what you need.
Expert partners (like Adatis and Microsoft, for example) can be on hand to help you redirect your development efforts and technology to meet your new objectives, without having to go back to square one.
5: Your first deployment won’t be your last
Good code is finished. Great code is deployed. Perfect code is neither.
Rolling out a new programme or feature can be deeply satisfying – but the work doesn’t end there. To be successful, you have to be prepared to iterate repeatedly on everything you’ve deployed.
Your journey will see you return to the initial stages of data preparation and analytics modelling many times – Fatma Ali, Lead Data Analyst for KCL, estimates that 80% of your time should be spent on data prep to avoid the ‘garbage in, garbage out’ trap.
Refinement is the most important part of the process. As you continue to surface new data and bring in different parts of your institution’s ecosystem, there’ll be key opportunities to encourage improvements in data capture and entry in the first instance. And once you can trust the quality of your data, there’s really no limit to what you can do with it.
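One practical way to encourage better data capture is to validate records at the point of entry, so problems are flagged before they pollute the analytics layer. A minimal sketch of that idea (field names are illustrative assumptions, not KCL’s actual schema):

```python
# Validate incoming records before they enter the analytics
# pipeline - catching "garbage" at the point of entry rather
# than discovering it downstream in a report.

REQUIRED_FIELDS = {"student_id", "course_code", "enrolment_date"}

def validate(record: dict) -> list:
    """Return a list of data-quality problems (empty means clean)."""
    problems = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    if record.get("student_id", "").strip() == "":
        problems.append("blank student_id")
    return problems

clean = {"student_id": "s1", "course_code": "CS101",
         "enrolment_date": "2019-09-01"}
dirty = {"student_id": " ", "course_code": "CS101"}

print(validate(clean))  # [] - nothing to fix
print(validate(dirty))  # missing enrolment_date and blank student_id
```

In practice the checks would be richer (formats, reference data, duplicates), but the principle is the one Fatma described: invest in data preparation up front so you can trust what comes out the other end.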
Start your own journey to predictive analytics
The higher education marketplace is getting more complex and competitive – and a lack of data maturity is likely to hold institutions back.
If you want to learn more about King’s College London’s story, you can read the case study here. And if you’d like to start thinking about what your institution’s journey might look like – and what you can achieve – let’s talk.