
Collaboration for quality

This year’s student data collection process has presented some big challenges for the partnership between Jisc staff and colleagues across the HE sector. We’re enormously grateful for the dedication, commitment and expertise of all those involved in data preparation and submission across a diverse range of UK HE providers.


With brand new systems, structures and tools, the HESA Data Futures Programme has transformed the process of student data collection. This transformation means we need to carry out enhanced analysis of the quality of the data delivered at the end of the process.

Our open data publications, and the source data that underpins them, have a range of onward uses from regulation and funding to league tables, business planning, and journalism.

Part of my team’s role, in producing our statistical publications, is to assess the quality of the data we use for this purpose. The steps we take to mitigate and explain any issues with the data are intended to ensure that users can trust our data publications and use them with confidence.

Quality assurance

This year we are enhancing our data quality assessment processes, working with other key data users to ensure their requirements are considered. This enhanced approach will use the stock of quality intelligence developed as part of the data collection activity. We will build upon this by undertaking three further quality assurance testing stages:  

Field-by-field comparison with previous years

Firstly, we will undertake checks on each data field used in our publications, comparing the distribution of field categories with equivalent data from previous years. Any changes we see which are beyond expected ranges will be flagged for further investigation.
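A check of this kind can be sketched as follows. This is a minimal illustration, not the actual tooling: the field values, categories and the 5-percentage-point threshold are all hypothetical assumptions.

```python
from collections import Counter

def category_shares(values):
    """Return the share of each category among a field's values."""
    counts = Counter(values)
    total = sum(counts.values())
    return {cat: n / total for cat, n in counts.items()}

def flag_shifts(current, previous, threshold=0.05):
    """Flag categories whose share moved by more than `threshold`
    (as a proportion) between two collection years."""
    cur, prev = category_shares(current), category_shares(previous)
    flags = {}
    for cat in set(cur) | set(prev):
        delta = cur.get(cat, 0.0) - prev.get(cat, 0.0)
        if abs(delta) > threshold:
            flags[cat] = round(delta, 3)
    return flags

# Illustrative values for a single field (e.g. mode of study)
prev_year = ["FT"] * 70 + ["PT"] * 30
this_year = ["FT"] * 60 + ["PT"] * 40
print(sorted(flag_shifts(this_year, prev_year).items()))
# [('FT', -0.1), ('PT', 0.1)]
```

In practice the expected range would vary by field, and flagged shifts would feed into further investigation rather than being treated as errors outright.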

Continuity testing

Secondly, we will undertake ‘continuity testing’. This involves linking records for each student to their records in previous years (where such records exist) to make sure that characteristics we would not expect to change (e.g. personal characteristics) have been recorded consistently over time.
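The linkage step can be illustrated with a small sketch. The student identifier, record shape and the particular "stable" fields shown here are assumptions for illustration only.

```python
# Characteristics we would not expect to change year on year
STABLE_FIELDS = ("date_of_birth", "sex")

def continuity_errors(current, previous, stable=STABLE_FIELDS):
    """Link each current record to last year's record (where one
    exists) and report stable characteristics that have changed."""
    errors = []
    for sid, rec in current.items():
        prev = previous.get(sid)
        if prev is None:
            continue  # new entrant: no prior record to link to
        for field in stable:
            if rec.get(field) != prev.get(field):
                errors.append((sid, field, prev.get(field), rec.get(field)))
    return errors

# Hypothetical records keyed by a persistent student identifier
prev_year = {"S1": {"date_of_birth": "2001-04-02", "sex": "F"}}
this_year = {"S1": {"date_of_birth": "2001-04-20", "sex": "F"},
             "S2": {"date_of_birth": "2003-09-15", "sex": "M"}}
print(continuity_errors(this_year, prev_year))
# [('S1', 'date_of_birth', '2001-04-02', '2001-04-20')]
```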

Comparison of statistical tables with previous years

Thirdly, we will construct the statistical tables for our publications and compare with equivalent tables using data from previous years; any variations we see that are outside expected ranges will be flagged for investigation.
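A table-level comparison might look like the sketch below, where each table is reduced to a mapping from cell to count. The cell labels and the 10% tolerance are illustrative assumptions, not the published thresholds.

```python
def table_variances(current, previous, tolerance=0.1):
    """Compare a statistical table (cell -> count) with last year's
    equivalent and flag cells whose relative change exceeds
    `tolerance`."""
    flagged = {}
    for cell in set(current) | set(previous):
        prev = previous.get(cell, 0)
        cur = current.get(cell, 0)
        if prev == 0:
            change = float("inf") if cur else 0.0
        else:
            change = (cur - prev) / prev
        if abs(change) > tolerance:
            flagged[cell] = change
    return flagged

# Hypothetical table cells: (provider, mode of study) -> headcount
prev_table = {("Provider A", "FT"): 1000, ("Provider A", "PT"): 200}
this_table = {("Provider A", "FT"): 1020, ("Provider A", "PT"): 150}
print(table_variances(this_table, prev_table))
# {('Provider A', 'PT'): -0.25}
```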

All of these checks will be undertaken at UK aggregate level, UK nation level and at individual HE provider level. We will prioritise quality issues which affect the most significant data uses with the greatest impacts for the HE sector.
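Running the same check at several aggregation levels amounts to rolling provider-level counts up before comparing. A toy illustration, with made-up providers and counts:

```python
def roll_up(provider_counts, provider_nation):
    """Aggregate provider-level counts to nation totals and a UK
    total, so the same variance checks can run at each level."""
    nation = {}
    for provider, n in provider_counts.items():
        key = provider_nation[provider]
        nation[key] = nation.get(key, 0) + n
    return nation, sum(provider_counts.values())

counts = {"Provider A": 1170, "Provider B": 830, "Provider C": 500}
nations = {"Provider A": "England", "Provider B": "England",
           "Provider C": "Wales"}
by_nation, uk_total = roll_up(counts, nations)
print(by_nation, uk_total)
# {'England': 2000, 'Wales': 500} 2500
```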

Transparent information

The actions we take based on the outputs of the quality assessment might range from clear explanations of known weaknesses to the suppression or removal of data in the most serious cases. We will publish a comprehensive set of quality information with our 2024 statistical products in the form of a ‘user guide’, alongside any targeted steps we may take in specific cases.

Relevant funding and regulatory bodies have oversight of Jisc statistical publications as well as using the data for their own statutory purposes. We will work collaboratively with these bodies to ensure they are fully informed of our findings.

I hope readers of this blog will be reassured that we will be paying close attention to data quality and that we are ready to take appropriate actions if significant issues are found.

Jonathan Waller, Director of Information & Analysis
