Reflecting on Graduate Outcomes survey response rates in unpredictable times
Having recently closed the second cycle of Graduate Outcomes, we wanted to pause to reflect on the progress that has been made in the collection of this important data set.
To say 2020 has been an unusual year is an understatement, and here at HESA we have all adapted and responded to the ever-changing environment as best we could.
One of the most important decisions we had to make in early spring was whether to continue with the Graduate Outcomes survey, asking graduates about their destinations, reflections and wellbeing. We decided to press ahead: we were acutely aware of the significance of this work for our data users and for society as a whole, and the survey results would provide robust scientific evidence to inform future understanding of the pandemic's impact on our lives. Logistically too, we felt well-equipped to carry on.
With increasing demand for online survey participation and a growing sense of respondent fatigue, we anticipated an adverse impact on the survey’s response rates. However, these fears have since been allayed by a growth in the overall response rate achieved by the close of the collection year.
2018/19 vs 2017/18 response rates
This year, for the UK domiciled population, we achieved an overall response rate of 52.8% (compared with 51.6% last year). While this is a year-on-year improvement, it is still lower than the 60% target set by the sector. Over the coming weeks we will review the survey’s performance over the last 12 months, identifying key strengths we need to build on and areas for further improvement. We will also publish an end of collection report early in the new year, looking back at this year’s survey.
Do response rates matter?
While a higher response rate is desirable, simply increasing response rates does not guarantee less bias. Bias occurs when graduates with certain characteristics are less likely to complete the survey (and are therefore under-represented) and their outcomes differ in some way from those of graduates who do take part. For example, if male maths graduates are less likely to respond to the survey and more likely to be unemployed than female maths graduates, then the survey results will not accurately represent the population of maths graduates. Bias skews survey results and could lead data users to draw misinformed conclusions and base decisions on them.
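To make this concrete, here is a minimal sketch of the arithmetic behind non-response bias. The figures are entirely hypothetical and chosen only to mirror the example above: two groups of graduates where the group with the higher unemployment rate also has the lower response rate.

```python
# Hypothetical, purely illustrative numbers: (population, unemployment
# rate, response rate) for two groups of maths graduates.
groups = {
    "male":   (500, 0.20, 0.30),
    "female": (500, 0.10, 0.60),
}

# True unemployment rate across the whole population.
true_unemployed = sum(n * u for n, u, _ in groups.values())
true_rate = true_unemployed / sum(n for n, _, _ in groups.values())

# Rate observed among respondents only: the low-responding,
# higher-unemployment group is under-represented in the sample.
respondents = sum(n * r for n, _, r in groups.values())
observed_unemployed = sum(n * r * u for n, u, r in groups.values())
observed_rate = observed_unemployed / respondents

print(f"true rate: {true_rate:.1%}, observed rate: {observed_rate:.1%}")
```

With these made-up inputs the survey understates the true unemployment rate, even though nobody answered inaccurately: the skew comes purely from who responded.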
It is therefore crucial that we manage the risks of bias. Our research into the 2017/18 Graduate Outcomes survey showed that, at a sector-wide level, results from the survey are unbiased. It is important to note that while this conclusion applies to the survey as a whole, the results for small groups (for example, subjects within providers) are unavoidably less reliable than those for larger groups. We also recognise that some providers have lower response rates than others, which inevitably affects the extent to which they can use data from this survey.
Our work on improving response rates and investigating provider-level variability has identified a few components of the survey that are strongly correlated with low response rates. Among the UK domiciled population, the most noteworthy are the availability, quality and usability of contact details, and graduates' awareness of the survey brand. We will revisit these topics briefly in the last section, but first let's position Graduate Outcomes response rates within the wider spectrum of surveys.
How does Graduate Outcomes compare with other surveys at home and abroad?
In July 2020 the Office for National Statistics highlighted the fact that survey response rates are falling in general. Organisations are struggling to contact respondents, and when they do it is hard to convince people to complete surveys.
Various large-scale surveys in the UK achieve response rates ranging from under 30% to just under 60%. A few examples are:
- Labour Force Survey (July-Sept 2020 - 27.5%)
- Family Resources Survey (2018/19 - 50%)
- Understanding Society (2009/10 Wave 1 of a longitudinal survey - 57.3%)
- European Social Survey (UK response rate in 2018 - 41%)
- British Social Attitudes (2019 - 44.8%).
These are a few examples of some of the most established surveys in the UK. They are not comparable with each other as they have different topics, target populations and data collection methods, but their response rates do give a sense of the challenges facing anyone undertaking a large survey. Graduate Outcomes is different again, so for better comparisons we need to look further afield.
Looking at graduate surveys conducted by some other countries around the world, we find these response rates:
- Graduate Outcomes survey in Ireland (2019 – 51%)
- Australian Graduate Outcomes Survey (2020 – 42.3%)
- Eurograduate Pilot Survey in eight European countries (2018/19 - a range of response rates, the highest being 21.9%).
These again provide a broad range of response rates to benchmark our progress against, keeping in mind that social, cultural and political differences between countries may have an impact, as well as any methodological differences between the surveys.
How are we improving response rates for Graduate Outcomes?
Increasing response rates to social surveys is a science as well as an art. For instance, a methodologically robust survey design cannot yield a high response rate if we do not connect with respondents effectively, answering the question “what’s in it for me?” Raising awareness of the survey brand and communicating the benefits of responding are just as important as having good contact details for graduates.
Since the inception of Graduate Outcomes we have worked systematically with the HE sector to enhance the survey design, user engagement, communication strategy, data collection methodology and the collection of good-quality contact details. This work will continue for the lifetime of the survey. We have worked on a range of changes with a view to improving response rates; these have been based on both academic research on survey methods and the sector's own unique knowledge of its students and graduates.
A current priority is raising awareness of the survey among graduates. One way we are approaching this challenge is through our increased presence on social media channels, which, unsurprisingly, are a popular mode of communication in the target population. Higher education providers also play a pivotal role in raising awareness of the survey among graduates. Your well-established relationship with students and graduates places you in a unique position to inform them about the significance of the data collected by the survey. This will not only reassure graduates but also motivate them to take part. We know a few institutions are highly active in this already, but similar practice has not yet permeated the entire sector.
We will also be focussing further on the effectiveness of contact details in the coming year. We know that good contact details improve our chances of contacting as many graduates as possible. We are also beginning to understand more about how providers compare in relation to the quality of contact details supplied. Many providers are trying extremely hard to supply the best contact details possible for graduates. Some are finding it difficult and where this is the case, we would like to hear from you. There may not be an easy solution to the problem, but the more we know about the challenges you face the better placed we will be to find solutions and share best practice.
As we embark on the third cycle of Graduate Outcomes we will build on the strong foundation of the past few years. But this building will never be fully finished; we will always continue to review our systems, methods and processes to create an even stronger survey for the future.