Graduate Outcomes survey review – our approach to continuous improvement
When we launched the Graduate Outcomes survey in December 2018, we made sure that continuous improvement remained at the heart of our approach. This ensures the survey continues to meet the sector’s requirements and supports HESA’s mission to collect, analyse and disseminate high quality data. In 2020 we launched a project aimed at reviewing all of the core components of the Graduate Outcomes survey.
The review process
With many different workstreams, systems, suppliers and services within the umbrella of the survey, we wanted to be ambitious with the review but ensure it was manageable. Due to the scale of the review and the logistical challenge of conducting it alongside administration of the live survey, it was split into phases. In consultation with the Graduate Outcomes Steering Group we identified three priority topics in scope of phase one of the review: data collection instruments, survey methodology and brand awareness.
We also agreed several review objectives to ensure that, when making decisions, we had a clear goal in mind. These were to reduce survey administration costs, increase response rates, reduce data collection and processing burden, improve data quality and increase alignment with users’ requirements.
The review comprised a series of evaluations which were conducted in partnership with our suppliers, independent technical experts, HESA colleagues and representatives from the higher education sector.
A number of technical modifications have been implemented as a result of this review, and several analytical and strategic issues were also explored (as summarised below). Perhaps the most significant of these is the review of the Graduate Outcomes questionnaire.
A group of HESA colleagues and members of the Steering Group formed a working group which was tasked with a review of the entire questionnaire. The objectives of this review were to evaluate the usefulness of existing questions and determine the need for new questions. The group recommended changes to the Steering Group for further advice and consultation. Once approved, all changes were signed off by the following Statutory Customers: OfS, HEFCW, SFC and DfE(NI).
As you can imagine, this was quite a lengthy process and we have tried to make it as comprehensive as possible! While we haven’t finished reviewing every single question, we have made good progress in identifying some redundant questions and improvements to other existing questions.
A summary of the upcoming questionnaire changes has been provided below, but first let’s reflect on the outcomes of some of the other review topics.
Outcomes of phase one
We reviewed the suitability of the 15-month reference period and concluded there was no firm requirement for change. We also weighed up the benefits of a census versus a sample survey; this analysis found no evidence that a sample would reduce burden or cost, and no evidence that it would increase response rates.
We now have a better understanding of what works and have improved the feedback to the sector on incomplete/invalid contact details. View our blog ‘Improving Graduate Outcomes response rates: why quality contact details matter’ for more information.
Data collection instruments
The survey platform, call management system and questionnaire (read more about this below) were reviewed. As a result, a number of design changes have been implemented, making the survey more user friendly and improving data quality. Details of the main data quality improvements were published alongside our statistical releases earlier this year in the Survey Quality Report.
The first set of findings from exploratory work using linked Graduate Outcomes and Student data were published this year, with positive results overall. Further work to understand differences between the two data sources and extend the analysis to other groups of interest is planned.
Brand awareness and graduate engagement
We are finalising a mission and vision for Graduate Outcomes which allows us to create a communications strategy, incorporating a social media strategy. We will then use them as a basis for our ongoing engagement with graduates, the sector and the wider community of users, researchers and analysts.
Each of the above topics is a collection of several complex items. Some of these have already been explored and others will be up for discussion and evaluation during the next phase of the review.
Changes to the survey questionnaire for year four (C20072)
As mentioned previously, one of the aims during phase one was a review of the Graduate Outcomes questionnaire, with a view to identifying redundant questions, improvements to existing questions and requirements for new ones.
While the review is still ongoing, most existing questions have been reviewed and we have categorised all survey questions as follows:
- Questions to be deleted (12)
- Existing questions to be updated (11)
- No change to the questions (25)
- Questions on hold for further investigation in 2022 (11 core questions and opt-in banks)
The decision to delete questions from the survey was driven by the need to minimise data collection and make the survey more cost effective. Therefore, a question was deleted where a firm requirement for the onward use of data could not be identified or the data had very limited usability, according to sector representatives.
We are pleased to report that in removing these 12 questions from the survey we have secured a small but noticeable monetary saving. We intend to reinvest these savings into future survey enhancements through initiatives aimed at improving online response rates and further adding value to this service for our users. This project is still at a very early stage and we will provide further information in due course.
To find more detail about the questions that have been removed or changed and the reason for this, visit the C20072 coding manual, where you’ll also find the updated survey routing diagram.
New dropdown list to collect town/city
Making enhancements to the survey that improve data quality is a key focus, and we’re delighted to introduce a new dropdown list which will collect the graduate’s town/city (for EMPCITY and BUSEMPCITY). Respondents will be able to select an appropriate answer from the list provided. An ‘other’ option and a free text box will allow them to specify their location should it not appear in the list. One of the main advantages of moving away from a free-text response layout to a pre-populated selectable list is that it enhances the quantity and quality of useable data by encouraging consistency in data collection.
The dropdown has been created from a combination of standard geographies across the UK to provide graduates with a wide range of options. It is sourced from several locations: the Office for National Statistics’ Built-up Area (BUA) classification (comprising BUAs and BUA sub-divisions) for England and Wales, defined as part of their postcode directory; National Records of Scotland geographies, which provide an approximation to towns and cities; and the District Electoral Area geography as defined by the Northern Ireland Statistics and Research Agency as part of the Central Postcode Directory.
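To illustrate the general idea (this is a hypothetical sketch, not HESA’s actual survey platform code, and the option list and function names are invented for the example), a dropdown-plus-‘other’ question can be coded so that a selection from the list is stored directly, while anything outside the list is stored as ‘Other’ alongside the respondent’s free-text answer:

```python
# Illustrative sketch only: how a dropdown with an 'other' free-text
# fallback might record a town/city response. The option set below is a
# tiny hypothetical subset of the real list built from UK geographies.
TOWN_CITY_OPTIONS = {"Birmingham", "Cardiff", "Glasgow", "Belfast"}

def record_town_city(selected: str, free_text: str = "") -> dict:
    """Return a coded response for a town/city question.

    If the selection matches a list option, store it as-is; otherwise
    code the response as 'Other' and keep the respondent's free text.
    """
    if selected in TOWN_CITY_OPTIONS:
        return {"code": selected, "other_text": None}
    return {"code": "Other", "other_text": free_text.strip() or None}

print(record_town_city("Cardiff"))
print(record_town_city("Other", "  Stirling  "))
```

Keeping the coded value separate from the free text is what delivers the consistency benefit described above: list selections arrive pre-standardised, and only the ‘other’ responses need manual cleaning.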
We are now continuing with phase two of the review, including the next phase of the questionnaire review and we look forward to sharing another update with you in 2022.