Operational survey information

As the Graduate Outcomes survey is live, we've shared some useful information for providers outlining how HESA is delivering the survey.

The content will grow and change over time: to respond to provider queries, we will add to this information gradually so it can be made available to you sooner. In addition, as we learn from each cohort we may change the approaches outlined on these pages. We will endeavour to update this page as relevant.

Contact centre methodology

How are IFF introducing themselves at the start of the interviews?
IFF interviewers introduce the Graduate Outcomes survey as the reason for the call and state that they are calling on behalf of the graduate's provider. If they are challenged further, they will explain that they are a research agency that has been appointed by HESA to carry out this work. If required, the interviewer can also advise that the survey has been commissioned by the UK higher education funding and regulatory bodies.

How do you ensure that where a graduate is in a different time zone, this is reflected in the time of call?
Where we have collected the data fields for international telephone numbers, the computer-assisted telephone interviewing (CATI) system automatically allocates the appropriate time zone to ensure the graduate is contacted at the correct time of day. The CATI system determines the location of the telephone number and assigns it to the correct call queue. This is another reason why personal contact data needs to be as accurate as possible.
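
As a rough illustration of the kind of scheduling logic involved, the sketch below maps a telephone number's dialling code to a time zone and checks whether the graduate's local time falls inside a permissible calling window. The dialling-code table and the calling hours are illustrative assumptions, not the actual CATI configuration.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Hypothetical mapping of international dialling codes to time zones;
# the real CATI configuration is not published and will be far richer.
DIALLING_CODE_ZONES = {
    "+44": "Europe/London",
    "+61": "Australia/Sydney",
    "+1": "America/New_York",
}

CALL_WINDOW = (9, 20)  # assumed permissible local calling hours (09:00-20:00)

def can_call_now(phone_number: str, now_utc: datetime) -> bool:
    """Return True if the graduate's local time falls inside the call window."""
    for code, zone in DIALLING_CODE_ZONES.items():
        if phone_number.startswith(code):
            local = now_utc.astimezone(ZoneInfo(zone))
            return CALL_WINDOW[0] <= local.hour < CALL_WINDOW[1]
    return False  # unknown dialling code: hold the record back for manual review

print(can_call_now("+61 412 345 678", datetime.now(timezone.utc)))
```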

What does the telephone number come through as? Will it look like a spam call?
We have assigned a relevant geographical telephone number in the Confirmit system to reflect each institution’s geographical location.

What happens when a graduate answers, but is unable to take the call?
In these instances, where we know the contact details are correct but the timing of the call is inconvenient, the interviewer will ask for the graduate's preferred date and time to undertake the survey. The interviewer will log these details in the CATI system and the graduate will not be made available for calling until the specified date and time. The graduate is also given the option of completing the survey online.

How does the CATI system react to a graduate switching modes (phone to online and vice versa)?
If a graduate is called and states that they would prefer not to take the survey over the telephone, the IFF interviewer asks whether they would be willing to complete the survey online instead. If they consent, the interviewer checks the email address with the graduate and closes the interview.

On closing the interview, the graduate will automatically be sent an invitation to participate in the online survey, via the email address checked on the call. Where a graduate has already started the survey online, they will be withheld from telephone interviewing until it is deemed unlikely that they will go on to complete the survey.

At this point, the graduate will be made available in the CATI system for telephone interviewing. An IFF interviewer will call the graduate and, upon introducing the survey and gaining their consent to participate, will start the survey at the point the graduate reached online.
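
The mode-switching behaviour described above could be summarised, very loosely, as the routing sketch below. The field names, the seven-day grace period and the queue labels are assumptions for illustration; the actual Confirmit/CATI implementation is not published.

```python
from datetime import datetime, timedelta, timezone

# Assumed grace period before a stalled online partial is released back to telephone.
ONLINE_GRACE_PERIOD = timedelta(days=7)

def route_graduate(record: dict, now: datetime) -> str:
    """Decide which contact queue a graduate record should sit in."""
    if record.get("online_completed"):
        return "closed"                 # survey finished, no further contact
    if record.get("prefers_online"):
        return "email_invite"           # interviewer checked the email address, send online link
    started = record.get("online_started_at")
    if started and now - started < ONLINE_GRACE_PERIOD:
        return "withheld_from_cati"     # give the graduate time to finish online first
    return "cati_queue"                 # available for telephone interviewing

now = datetime.now(timezone.utc)
print(route_graduate({"online_started_at": now - timedelta(days=2)}, now))  # withheld_from_cati
```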

What’s the approach to quality assurance?
All interviews are recorded digitally to keep an accurate record of interviews. A minimum of 5% of each interviewer's calls are reviewed in full by one of IFF's Team Leaders. Quality control reviews are all documented using a series of scores. Should an interviewer score below the acceptable level, this will be discussed with them, an action plan agreed and signed, and their work further quality controlled.

Team Leaders rigorously check for tone/technique, data quality and conduct around data protection and information security. In addition, where Oblong are unable to code verbatim responses, these will be returned to IFF, who will take steps to obtain and supply better quality verbatim responses by listening back to the interview and, where necessary, calling the graduate again.

How are opt-outs handled?
When a graduate states on the call that they wish to opt out, the IFF call handler will action this on the system so that the graduate will not be contacted by any form of communication.

Where are IFF’s interviewers based?
IFF operates a CATI centre in their offices in London, supported by interviewers working remotely within the United Kingdom. These remote workers access the same system and interface as London operators and are treated no differently from interviewers based in the London office. Each machine used by home workers is under the complete control of IFF's IT department.

What’s the approach to training IFF’s interviewers?
Interviewers receive full training covering practical, theoretical and technical aspects of the job requirements. For quality control purposes, Team Leaders provide ongoing support throughout, harnessing interviewer skills and coaching around areas for improvement. This is done through top-up sessions, structured de-briefs and shorter knowledge sharing initiatives about “what works”.

For Graduate Outcomes, all interviewers receive a detailed briefing upon commencing interviewing, covering the purpose of the survey, data requirements (for example level of detail needed at SIC and SOC questions), running through each survey question, and pointing out areas of potential difficulty so objections and questions can be handled appropriately and sensitively.

Data classification (SIC/SOC)

Who is completing the data classification coding for Graduate Outcomes?
Oblong is our supplier for the coding of occupations and industries that graduates are working in (known as SIC and SOC coding). They are business data experts, with their main focus being the classification of businesses and database cleaning/enhancement. They have been SIC coding DLHE for the past 6 years, using specially developed coding software, in combination with a highly experienced manual research team.

The classification of Graduate Outcomes is key to allowing analysis and understanding of this large data source, and accuracy and consistency are paramount given the scrutiny and importance of the data. Learn more about Oblong.

What’s the approach to SIC coding?
Over the years, Oblong has developed self-learning software to deal with the classification of company data. This software has been finely tuned to work with HESA data. Graduate Outcomes will use their Business Data, Unity matching software suite and AutoSIC software to add industry classifications - SIC codes - to companies that employ graduates. The dedicated manual research team quality check most of the data and infill the gaps where the system can't add a SIC.

The fields Oblong will use for SIC coding are listed below; a simplified sketch of how automatic coding with manual infill might work follows the list:

  • Company Name
  • Company Town/City
  • Company Postcode
  • Country
  • Company Description
  • Job title (to help with School/Healthcare classifications)
  • Course title
  • JACS level 3 grouping
  • Level of qualification
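
As a simplified sketch (not Oblong's proprietary AutoSIC software), the snippet below shows how fields such as company name, description and job title could drive a keyword-based SIC lookup, with unmatched records routed to manual infill. The keyword table is an assumption; the SIC 2007 codes shown are standard published codes.

```python
from typing import Optional

# Illustrative keyword table only; the real AutoSIC system is self-learning
# and uses far richer business data than simple keywords.
SIC_KEYWORDS = {
    "primary school": "85200",   # SIC 2007: primary education
    "hospital": "86101",         # SIC 2007: hospital activities
    "software": "62012",         # SIC 2007: business and domestic software development
}

def auto_sic(company_name: str, company_description: str, job_title: str) -> Optional[str]:
    """Return a SIC code if a keyword matches, otherwise None (manual infill)."""
    text = " ".join([company_name, company_description, job_title]).lower()
    for keyword, sic in SIC_KEYWORDS.items():
        if keyword in text:
            return sic
    return None  # no confident match: route the record to the manual research team

print(auto_sic("Springfield Primary School", "State-funded primary education", "Teaching assistant"))
```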

What’s the approach to SOC coding?
As part of the Graduate Outcomes survey, Oblong has also been contracted to add occupation classifications - SOC codes - to summarise the type of job each graduate undertakes for the company they work for. It will use their new self-learning AutoSOC software to add classifications. This will be followed up by manual quality checks on most of the data and manual infill on those the system can't classify.

The fields Oblong will use for SOC coding are:

  • Company Name
  • SIC code
  • Job title
  • Job Duties Description
  • Qualification Required?

Oblong will also take into account whether the company is an NHS organisation and whether the graduate is self-employed, freelance, running their own business, supervising staff, or owns the business.

The coding system uses several methods to SOC code a record. It looks for keywords in the job title and job duties fields, and takes into account whether the qualification was required, before choosing the SOC. The system learns from data that has been previously coded (including manually SOC coded records), so if it sees a record with similar details to one it has seen before, it can assign the same SOC as before. Oblong are manually reviewing all of the SOC codes the system produces, and then following this up with a second manual quality check and a final consistency check at the end of each cohort.
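
A heavily simplified sketch of that kind of rule is shown below. The keyword rules, the way the 'qualification required' flag is used and the example SOC codes are illustrative assumptions; Oblong's AutoSOC software is self-learning and proprietary.

```python
from typing import Optional

# Each rule: (keywords, expected qualification_required flag, SOC code). Illustrative only.
SOC_RULES = [
    ({"civil engineer"}, True, "2121"),                    # civil engineers
    ({"care assistant", "care worker"}, False, "6145"),    # care workers and home carers (SOC 2010)
]

def auto_soc(job_title: str, job_duties: str, qualification_required: bool) -> Optional[str]:
    """Pick a SOC code from keyword matches; None means the record needs manual coding."""
    text = f"{job_title} {job_duties}".lower()
    for keywords, needs_qualification, soc in SOC_RULES:
        if any(k in text for k in keywords) and needs_qualification == qualification_required:
            return soc
    return None

print(auto_soc("Graduate civil engineer", "Designing road drainage schemes", True))  # 2121
```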

How will Oblong ensure consistency?
To start with, every SIC and SOC code will be manually checked; down the line, Oblong expects to manually check around 75%. This approach aims to provide consistent information across many years of data. For telephone interviews, Oblong has an open channel with IFF Research (our contact centre supplier) to discuss data collection improvements that assist with collecting better data for more accurate coding through this channel.

How will Oblong / HESA respond to cases where providers do not agree with the classifications?
Oblong is highly experienced in both SIC and SOC coding. Consistency is absolutely vital for the centralised approach. HESA will review the data file regularly to ensure this consistency is maintained. If challenges reveal a systemic error, HESA will make the final decision on whether a change is required. As consistency is the key principle, any change will be applied across the board rather than individually.

Engagement plan

To aid providers’ communication with graduates, we’ve outlined the general engagement plan below that will be implemented for each cohort. See more information on survey timings (course end dates, contact periods and census weeks).

Our full engagement strategy is an internal only document and incorporates an intricate plan which includes a range of methods, timings and scenarios. This has been crafted using best practice data collection and research methods. We have also worked with experts from the ONS, Confirmit (and liaised with the Graduate Outcomes steering group) to create a robust strategy which carefully balances the need to gain responses using a blend of methods across all of the contact details we hold for each graduate.

The strategy will be reviewed over time and we will make changes and refinements where needed.

Engagement methods

The primary engagement method will be email - we will also send email reminders at regular intervals. Graduates will receive text messages (SMS) from ‘GradOutcome’ which also contain links to the online survey (where we have mobile numbers). Graduates will also receive phone calls from IFF Research (our contact centre) on behalf of providers.

Timetable

The engagement plan for cohort D can be found below. See more on response rates.

Week 1: Survey fieldwork starts

  • First online survey invitations and SMS sent to all graduates (for those we have email addresses or mobile numbers) to EMAIL01 / UKMOB01 contact details (find out more about how contact details are prioritised in our operational FAQs)
  • Telephone calls commence for graduates where no valid email address exists

Week 2

  • First online survey invitations and SMS sent to all graduates (for those we have email addresses or mobile numbers) to EMAIL02-10 / UKMOB02-10 contact details
  • Telephone interviews commence (and continue throughout the contact period)

Weeks 3/4: Email and SMS reminders sent to non-respondents
Week 5: Email and SMS reminders sent to partial-respondents
Weeks 6/7: Email and SMS reminders sent to non-respondents
Weeks 8/9: Email and SMS reminders sent to partial-respondents
Week 10: Email and SMS reminders sent to non-respondents
Week 11: Email and SMS reminders sent to partial-respondents
Weeks 12/13: Final email and SMS reminders
Week 13: Survey fieldwork ends
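
To make the shape of the schedule explicit, the sketch below encodes the reminder plan above as a simple lookup. The week numbers and audiences are taken directly from the plan; the data structure itself is purely illustrative and not part of HESA's systems.

```python
# Reminder schedule for cohort D as outlined above: (weeks, audience).
REMINDER_PLAN = [
    ((3, 4), "non-respondents"),
    ((5,), "partial-respondents"),
    ((6, 7), "non-respondents"),
    ((8, 9), "partial-respondents"),
    ((10,), "non-respondents"),
    ((11,), "partial-respondents"),
    ((12, 13), "final reminders"),
]

def reminders_for_week(week: int) -> list:
    """Return which reminder audiences are contacted by email/SMS in a given week."""
    return [audience for weeks, audience in REMINDER_PLAN if week in weeks]

print(reminders_for_week(9))   # ['partial-respondents']
print(reminders_for_week(13))  # ['final reminders']
```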

Communications

You can view the sample email and SMS message being sent to graduates in survey materials below.

Encouraging response rates

Administration of invitations and reminders is carefully managed and considers timing, frequency, volume and journey of respondents. Every successful cycle of reminders informs the delivery of future reminders. All graduates across the entire sector are treated equally in terms of the communications they receive from us.

Partials to complete

Encouraging the graduates who have started the survey to finish it is a key part of our engagement strategy. Neha Agarwal, our Head of Research and Insight, has shared some key aspects of our approach in the blog below.

Read 'from partial to complete' blog by Neha Agarwal

Focusing on the mandatory questions

The majority of the mandatory Graduate Outcomes questions are the same as those required for a valid response in DLHE. Maintaining a set of mandatory questions ensures we have the correct routing in place and the required data for SIC/SOC coding purposes.

To maximise response rates, we have taken a number of steps to ensure each graduate is encouraged to complete these mandatory questions as a priority:

  • non-mandatory questions can be skipped
  • the subjective wellbeing and graduate voice questions are placed at the end of the survey, after the mandatory questions have been completed
  • opt-in question banks are also placed at the end of the survey, after the mandatory questions have been completed.

Operational FAQs

How are contact details prioritised for surveying?
For C18071, we’ve provided some guidance on the number of contact details required from each provider and how they should be returned to us. This is all detailed in the C18071 coding manual. We've summarised it here:

EMAIL: We send the email invitations and reminders to EMAIL01 first, followed by EMAIL02-10. It is recommended that the ‘best’ email address for a graduate is returned in the first position either in the webform or in the XML file.

UKMOB / UKTEL / INTTEL: It is recommended that the ‘best’ telephone numbers for a graduate are returned in the first position for UKMOB / UKTEL / INTTEL either in the webform or in the XML file. Graduates will be called on their first telephone number before subsequent numbers are tried, and once successful contact is made, this contact detail is prioritised for any future calls.

Mobile numbers are more likely to result in successful contact and therefore, UKMOB (mobile) numbers are called before UKTEL (landlines), followed by INTTEL (international) numbers, where applicable.
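
As a rough illustration of the calling order described above, the sketch below flattens a graduate's telephone fields into the sequence in which they would be tried. The field names follow the C18071 record, but the sorting logic itself is a simplified assumption, not the actual CATI behaviour.

```python
# Telephone field types in the order they are tried, per the guidance above.
PHONE_TYPE_PRIORITY = {"UKMOB": 0, "UKTEL": 1, "INTTEL": 2}

def ordered_phone_numbers(contacts: dict) -> list:
    """Flatten UKMOB/UKTEL/INTTEL number lists into a single calling order.

    Each list is assumed to already be sorted so the 'best' number sits in
    position 01, as providers are asked to return them.
    """
    ordered = []
    for field in sorted(contacts, key=lambda f: PHONE_TYPE_PRIORITY.get(f, 99)):
        ordered.extend(contacts[field])
    return ordered

print(ordered_phone_numbers({
    "UKTEL": ["+44 24 7600 0000"],
    "UKMOB": ["+44 7700 900123", "+44 7700 900456"],
}))
# ['+44 7700 900123', '+44 7700 900456', '+44 24 7600 0000']
```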

Can you explain how to determine the ‘best’ contact details?
By ‘best’ we mean the contact details that are most likely to elicit a response to the survey. This can be determined by recent contact with the graduate via that contact detail.

Some providers will be using email platforms (e.g. Mailchimp, Raiser's Edge) to despatch their communications to graduates, which often provide rich insight into the behaviour of the email recipient. This includes whether the recipient has opened the email, clicked on a link or replied to it. This provides the evidence required to determine that the graduate is actively using this contact detail, which is therefore useful in terms of survey engagement. If the provider is using these systems to carry out the suggested graduate contact during the 15 months post-course completion period, this provides a regular feed of information about the contact detail to help make this determination.

Where providers do not have access to this information, and multiple email addresses for a graduate are known, then provided these remain accurate for the graduate, it is recommended these are all returned to give HESA additional opportunities to contact the graduate.

Click here to view the definition of 'best'.

Can you share the rationale for the pre-notification strategy for cohort D?
We are implementing a new system called MailJet to help us with issues related to email providers (e.g. Gmail, Office 365) blocking our email invitations. We have also been working to trial new activity that aims to improve our response rates, and a pre-notification email (a ‘warm up’) is one strategy that we have been considering.

From 14 August to 30 August, we will stagger the delivery of a pre-notification email to approved cohort D graduates that shares key information about the survey and lets them know they’ll receive their unique survey link in early September. Read more in the email issued to providers on 8 August 2019.

Can you provide more detail about the subjective wellbeing survey questions, including how you ask graduates and how the data will be used?
You can read more about this in our blog - 'Asking graduates how they feel' by Neha Agarwal, Head of Research & Insight.

Can you share more detail about what each of the statuses in the provider portal progress bar mean?
The provider portal user guide explains what each status means and who's included within each one. 

Are providers allowed to contact graduates within a cohort (once it has opened for surveying)?
Once a cohort has commenced surveying, we welcome providers’ engagement with their graduates at a brand recognition level, using non-direct channels such as provider websites and social media platforms including LinkedIn and Twitter. We have provided a suite of communications materials for this purpose. View our hints and tips guide which suggests ways you could support the overall awareness of Graduate Outcomes.

We suggest that you do not try to contact graduates directly (e.g. via targeted email) as there is a likelihood of crossover in contact and this could put graduates off. In addition, at this stage we are unable to provide reporting for those who've chosen to opt out, so we cannot risk contacting these graduates. Engagement should be spread evenly across the entire cohort, at a brand level, and should not target a particular group more than others as this could lead to bias.

Before you do anything, you should always seek advice from your data protection officer on the GDPR status of your contact with graduates. Over time, we endeavour to work more closely with providers to identify the most effective engagement methods and learn from each other to create best practice. We will then be able to prepare more detailed guidance for providers.

Can you share survey paradata or response rates with providers?
Because Graduate Outcomes is a centralised survey, our strategy does not include individualised reporting of this kind, so we cannot share survey paradata (data about the process by which the data was collected) at a provider level. This means that any developments to the survey design or engagement strategy will be carried out across the entire survey by reviewing the entire data set.

We will also not be looking to share any overall sector statistics, as any release would require considerable contextual information and we are unable to prioritise the required activity at this time. We want providers to rest assured that this analysis and activity is taking place, but in a measured and controlled manner. HESA is approaching this first year of Graduate Outcomes in an agile way and, as part of active survey management, is taking every step to react to all of the information available. At the same time, we need to ensure the integrity of the live survey.

How are you encouraging graduates who've started the survey but not yet completed to finish the survey?
This is a central part of our engagement strategy. Broadly speaking, graduates in the ‘started survey’ group will receive regular email and SMS reminders encouraging them to finish the survey. In addition, they will be allowed appropriate time to complete it online before they are followed up via telephone. Read more about how we aim to turn partials into completes.

Can you share your stance on provider incentives?
We strongly recommend that you do not create your own incentives. Depending on the type of population breakdowns a provider may have, incentives could bias the results in favour of that population group. For a centrally run survey like Graduate Outcomes, incentives must be rolled out across the entire population so that all respondents are treated equally. We are looking into the possibility of offering incentives in future cohorts, taking into account legal, ethical and practical considerations. We will communicate with our steering group and the rest of the sector as this work progresses.

How many times will a graduate be contacted to complete the survey?
It is impossible to provide a simple answer to this question as it depends on what type of engagement we have with each graduate across the cohort and across modes. The outline engagement plan (above) shows the proposed contacts we will make for the online survey (via email and SMS message), and calls are carefully scheduled to complement this. Before a call is made, we will ensure that where a graduate has already started the survey online, they have had time to complete it in this mode. It then depends on how successful the contact details are and whether any contact is made using them (i.e. whether they pick up).

Successful contact with graduates heavily relies on the quality of contact details. Invalid and inactive email addresses and phone numbers will inevitably result in no contact. 

Response rate targets

In discussion with the Graduate Outcomes steering group, HESA has set the following response rate targets for Graduate Outcomes:

Target group               Response rate target
UK domiciled full-time     60%
UK domiciled part-time     60%
Research funded            65%
EU domiciled               45%
Non-EU domiciled           25%

The above targets are applicable at a national, provider, undergraduate and postgraduate level. In addition, we will also monitor response rates for a range of socio-demographic and course characteristics, such as age, gender, ethnicity and course studied.

We understand the importance of good quality outputs for the sector as well as other users of our data. We will continuously monitor the progress of Graduate Outcomes against the targets set out above. We will employ robust research and statistical methodologies to ensure we are able to produce estimates that meet the required statistical quality standard as well as our users’ requirements.

Survey materials

To enable providers to support the survey, final versions of the survey materials and communications with graduates can be viewed below.

Emails

There are a number of emails in the engagement strategy:

A pre-notification (warm up) email is being trialled for cohort D in our first collection. This will be sent to all graduates prior to the start of the contact period to provide key information about the survey and to let them know that they’ll receive their unique survey link in early September.

Pre-notification email - English

Pre-notification email - Welsh

The email invitations sent within the contact period are derived from the core email text which is shared below. This has been updated for cohort D (17/18) and includes some look and feel changes. The other iterations vary depending on the nature of the email and its role in the engagement strategy. Changes are mainly limited to the subject line and first paragraph.

Email - English

Email - English and Welsh

SMS / text message

There is one version of the SMS text which is generic so it can be used across the engagement plan.

SMS / text message

Survey questions

You can view the survey questions on the page below split by type. The data items and routing diagram can be found in the Graduate Outcomes Survey Results coding manual.

View the final Graduate Outcomes survey questions