Online survey design
The survey questionnaire is hosted on an online platform using specialist survey software. All questions are programmed in the software’s coding language. The system is widely used by leading sector bodies to conduct surveys. It manages survey contact with graduates both online and through telephone interviewers, meaning there is live interaction between the different channels.
The same online version is accessed by telephone interviewers, ensuring that the same survey is used across both modes of data collection at any given point in time. This is aimed at reducing survey measurement error. The system also includes technology that makes it instantly smartphone compatible, making the survey more accessible to the target audience.
The online survey is accessed through a URL (link) which is unique to each graduate and sent to them via email or SMS (text message). The survey can be conducted on multiple devices (desktop and mobile) with in-built compatibility functions that enable seamless transfer from one device to another. Respondents are provided with data validation prompts to help them with specific questions as they go through the survey. This minimises the risk of respondent error, particularly in self-administered surveys. The following are examples of questions that use validation checks:
- Activity – if a respondent indicates they are in employment as well as retired, they see a prompt that requires them to check their answer and correct if necessary.
- Salary – if a respondent’s salary seems too low or too high, given the currency and intensity of employment, they are asked to check their answer. Typical salary ranges are obtained from ONS’ Annual Survey of Hours and Earnings.
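The two checks above can be sketched in code. This is a minimal illustration only: the category names, salary thresholds, and function names are hypothetical, standing in for the survey software's own programmed validation rules and for ranges derived from ONS data.

```python
# Illustrative sketch of the two validation prompts described above.
# All category names and thresholds are hypothetical, not the live survey's.

def check_activity(activities):
    """Flag contradictory activity selections, e.g. employed and retired."""
    if "employed" in activities and "retired" in activities:
        return "Please check your answer: both 'employed' and 'retired' were selected."
    return None  # no prompt needed

# Hypothetical plausible annual salary ranges (GBP) by employment intensity,
# standing in for ranges obtained from ONS' Annual Survey of Hours and Earnings.
SALARY_RANGES = {
    "full-time": (10_000, 150_000),
    "part-time": (1_000, 80_000),
}

def check_salary(salary, intensity):
    """Prompt the respondent if a salary falls outside the plausible range."""
    low, high = SALARY_RANGES[intensity]
    if not (low <= salary <= high):
        return f"The salary entered ({salary}) looks unusual; please check it."
    return None  # within the expected range
```

Both functions return a prompt message rather than rejecting the answer outright, mirroring the described behaviour of asking respondents to check and correct if necessary.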
When completing the survey online, respondents are unable to go back and change their answers to previous questions. This is done for data protection reasons. We are unable to confirm whether the contact details submitted by providers are unique to the graduate because, in several cases, the email addresses used to contact the graduate were previously supplied to them for a different purpose. We must therefore mitigate the data protection risks this may cause.
There is a possibility that a graduate has two email addresses, one available only to the graduate in question and the other accessible to another individual. In such instances, personal data entered by a graduate using a link from one of these emails could be accessed by the person with access to the other email address. With the ‘back’ button removed from the online survey, no one other than the respondent can view information already entered into the survey.
This issue does not affect surveys conducted over the telephone as interviewers confirm the name of the respondent before the survey begins.
Graduate Outcomes depends upon strong collaboration with providers. While HESA manages the planning, delivery and data capture elements of the survey centrally, providers fulfil equally important roles in collating and submitting good contact details and publicising the survey to their graduate communities. It is important that graduates understand the official status of the survey, and although few will have heard of HESA, most are likely to feel more confident about the credentials of the survey when it is visibly supported by their ‘home’ providers.
To support this recognition by respondents, we collect providers’ logos for co-branding the survey and a link appears at the end of the survey to a relevant area of each provider’s website (e.g. their careers service). Email invitations and reminders to complete the survey are sent under the name of the provider (but from the central Graduate Outcomes email address). We also include the name of the provider in interviewers’ scripts. All of these approaches are intended to convey the collaborative approach underpinning this survey and reassure graduates about its legitimacy. The principle of survey customisation was agreed with providers during survey design consultations.
To minimise the risk of introducing bias, we ask providers to refrain from attempting to drive up response rates through direct engagement with graduates during the live survey period. We also strongly discourage the use of incentives for the same reason. Providers can, however, make full use of non-direct channels for promotion, for example social media.