Our changing relationship with data – there may be trouble ahead

by Andy Youell | Jul 26, 2022 | Strategy & Transformation

Our latest guest blog author, Andy Youell, is the Executive Director – Regulation at the University College of Estate Management. Andy has spent three decades working in HE data and was a member of the review of 2020 qualification grades commissioned by the Welsh Government.

Throughout the information technology revolution, Moore’s Law has been central to our thinking about the power of this technology. Although Gordon Moore’s 1965 research was focused on the number of transistors that could be put onto a silicon chip, it has become the universal descriptor for the exponential rise in the speed, capacity and efficiency of data-processing technologies.

Another major trend has shaped data in recent decades. It is as significant as Moore’s Law, yet it has no name and is rarely discussed: our expectations of the value that can be derived from data seem to be rising at an exponential rate. This is exemplified by the language we use to describe data analysis. We used to do Business Statistics until Management Information took over; Business Intelligence was the term for the new millennium; now people talk about driving Insight from data, and at the Office for Students (OfS) the Chief Data Officer is titled the Director of Data, Foresight and Analysis.

Setting high expectations can be a good thing. It drives progress and innovation and, in the world of data, Moore’s Law provides the platform from which we can deliver greater intelligence, insight and foresight. However, the value we derive from data is not wholly dependent on the power of the technology; people and organisations use this technology, but Moore’s Law does not apply to them. We are therefore reaching a point where the expectations of data processing are starting to exceed our capabilities.

The exams fiasco of 2020 is perhaps the most high-profile recent example of a data project that went badly wrong, and with such profound consequences for the students involved. The mistaken belief that an algorithm could improve upon the teachers’ predicted grades was allowed to become reality due to the absence of rigorous oversight and feedback loops to the highest level of policy makers.

The OfS regulatory regime is an example of data expectations pulling ahead of data capabilities. A great deal of significant decision-making is driven by algorithms, yet the underlying HE datasets take a ‘one-size-fits-all’ approach to describing a sector that is increasingly diverse and dynamic. In the High Court case between Bloomsbury Institute and the OfS, the court found that the provider had been denied a place on the OfS Register because of flaws in the interpretation of the Bloomsbury data, and because significant policy decisions were being taken deep in the weeds of the algorithms. More recently, HE providers have been poring over their B3 indicators, trying to make sense of how their complex and fluid reality manifests itself in the metrics that determine the assessment of ‘quality’.

At the heart of this issue is the fundamental truth that deriving value from data should not be considered a technical or technology issue; successful data projects depend on high standards of management and a strong governance layer. This should include robust challenges to the design assumptions, strong feedback loops from implementation back to policy objectives, and the willingness to call on external expertise and audit in more specialist areas.

The technology revolution that is described by Moore’s Law has changed the world, but the disconnect between our expectations and our capabilities is beginning to emerge as a significant risk. In the absence of appropriate controls, the likelihood and impact of failures will continue to grow.