Evolution from DLHE to a centralised survey
Following what we learned from the first consultation, we completed a review of the suitability of the DLHE for retaining National Statistics designation. Methodological considerations had been raised both in the remit for the review and by respondents to the consultation. We knew that the DLHE needed to be perceived as highly trustworthy, but we wanted to understand what this would entail in practical terms. The analysis of the suitability of the DLHE for official statistics purposes offered us a framework in which to consider these issues and followed recognised good practice.
Given the extensive public uses of graduate destinations and outcomes data, it was vital that there was a high level of confidence in the robustness of the data, commensurate with retention of National Statistics designation. Therefore, the highest standards of quality should be assured in its production. The NewDLHE needed to deliver a comprehensive level of assurance, and HESA should produce a design that met this requirement. In terms of the UK Statistics Authority’s quality assurance matrix, this meant that we should aim for the assurance level “A3 – Comprehensive assurance” in order to be fully compliant. This would align with other data with a high level of public interest.
Quality assurance self-assessment
We undertook a quality assurance self-assessment of the DLHE, as HESA would seek to maintain National Statistics designation for its equivalent to the DLHE Statistical First Release (SFR – now referred to as a “Statistical Bulletin”) under any future data source resulting from the review. To maintain this designation, as well as wider public trust in the data, HESA confirmed that it should be aiming for the highest assurance level possible. In terms of the UK Statistics Authority’s quality assurance matrix, the aim should be to achieve “A3 – Comprehensive assurance”. The DLHE survey was judged not to meet this standard: it was mostly at “A2 – Enhanced assurance” level, with some aspects at “A1 – Basic assurance” and some at “A3 – Comprehensive assurance”.
In DLHE, many providers chose to outsource data collection to a third party contractor. In this situation, the contractor was responsible for collecting data under exclusive contract to the provider. Operational decisions by the contractor and provider about how the methodology was implemented could conceivably introduce bias. These effects were not necessarily visible to HESA and in some cases may not have been recognised as significant, as implementation depended on a distributed workforce with varying resources and skills.
The self-assessment made several recommendations to achieve comprehensive assurance. The key recommendation was the need to reconfigure the former DLHE methodology to further enhance quality assurance mechanisms. We took this into account when deciding how to develop a model. We determined that the majority of necessary methodological improvements could be achieved either through a centralised approach, or through a substantially enhanced audit process. An audit process would investigate processes and practices, backed up by an enhanced analytical quality function at HESA and the publication of materials generated through these processes. If the collection process remained distributed, this audit would necessarily have had to include a substantial sample resurvey.
We compared options for delivering the survey either on a centralised basis or continuing with a distributed model. The quality assurance self-assessment followed the ONS guidelines, and explained the work that was done to determine the required features of the Graduate Outcomes survey. These changes were agreed as the outcome of a major review process, which we also cover in more detail elsewhere.
The decision to pursue the open centralisation model was ultimately taken by HESA governance mechanisms and reflected the best value option available to meet requirements. The open centralisation model retains much of the control and oversight that HE providers previously had over the data collection process, while ensuring the methodology is applied systematically and openly by a trusted third party (HESA). It was designed to create system-level efficiencies and agility that can be shared widely and funded fairly.
For Graduate Outcomes, HESA and our suppliers collect survey data directly, to produce Official and, subject to assessment, National Statistics. HESA is accountable for this both to our customers and directly to the Office for Statistics Regulation. Methodological decisions will be taken in the interests of our users and in line with the Code of Practice for Statistics. The performance of our suppliers is subject to ongoing standardised quality assurance supported by consistent Key Performance Indicators (KPIs). Issues in collection will still occur, as they did previously for providers, but we have the advantage of a central concentration of statistical skills to recognise, address, and correct for these issues systematically. Our oversight also makes subjecting the survey to continuous improvement more practical and efficient.