Resources from the Graduate Outcomes Conferences
We have collected content from the Graduate Outcomes conferences held in London and Manchester. The content is grouped as follows:
- Answers to questions asked at the conferences, grouped by common themes
- Full slide decks of the conference presentations, available as PDFs
If you have any further queries please contact [email protected]
Questions and answers from the Graduate Outcomes conferences
[I am] concerned that conversations with degree apprentices might need an adapted approach during data collection for Graduate Outcomes. Could we ensure the scripts cater for this, perhaps utilising the apprenticeship marker in the computer-assisted telephone interviewing (CATI) system to drive different conversation paths?
We have walked through the survey from the perspective of an apprenticeship student and feel they would still feel able to complete the survey. However, should this become an issue, we will consider adapting our approach for future collections.
What are you going to do with the wellbeing information? Is it just a general ‘graduates are happy’ kind of thing, or are you intending to create some other target or happiness rating? How are the wellbeing questions going to be reported on? What support is offered to students expressing issues with anxiety, happiness and an ability to trust?
This is further discussed in Rachel Hewitt’s blog reflecting on the conferences.
What are [providers] expected to do if graduates do report high levels of anxiety? What can we do? Will HESA / call centres be referring respondents to further help since they will be the first ones to ask the question direct to the graduate? What if the graduate has said they don’t want to be contacted by us? When collecting wellbeing data on how graduates are feeling, do you then have a process of referring graduates on to further support when inevitable concerns around mental health are received? By asking for this information you have a degree of responsibility for passing it on. What support will be available to leavers who respond negatively to the subjective wellbeing questions?
We do not expect providers to follow up with graduates on their responses to these questions; HESA is considering the best approach for the inclusion of these questions. However, we will include a question in the survey seeking a graduate’s consent for follow-up about any of their survey answers by their provider, in case you wish to follow up on any of the survey data.
Will the subjective wellbeing questions definitely be mandatory? Are they required for a valid response?
These questions will be asked as a mandatory part of the core survey. However, they are asked at the end of the core section and are not required for a valid response.
Section H on subjective wellbeing doesn’t seem to flow with the rest of the survey. I feel an introduction is needed for this section to contextualise the questions.
We will incorporate introductory text before this section to contextualise these questions and create a smooth transition from the previous section.
We have concerns about the wellbeing questions. If graduates feel uncomfortable answering these, the engagement with subsequent opt-in and provider questions [may drop]. How can HESA reassure [providers] this won’t happen?
HESA will provide introductory text at the beginning of the subjective wellbeing questions to contextualise this section and provide a better understanding for graduates. Answering these questions is not mandatory, so graduates can choose to move past them.
How do we input into the creative / artistic portfolio process?
We are looking to further develop questions for graduates who are developing a creative, artistic or professional portfolio for year two of Graduate Outcomes. If you would like to take part in this process, please email [email protected] marking ‘For the Attention Of’ (FAO) the Graduate Outcomes team.
Will HESA issue any guidance about how to [target] communications to students who didn’t ‘graduate’ and don’t see themselves as such? i.e. Aim = HOO but achieve C20?
Positive feedback on the survey name was reported from cognitive testing; however, many leavers are not graduates – will the name potentially affect participation from non-graduates? (CHE, PGD, PGC, etc.)
Communications to graduates will be tailored around their provider and the subject they undertook. Whilst not all who take part in the survey will be ‘graduates’, cognitive testing found the name ‘Graduate Outcomes’ to be recognisable even to these leavers.
Graduates who opt-out or disengage would not be aware data about them may be gathered from other sources. More clarity please.
Graduates are made aware of uses of their data through the Student collection notice and the Graduate Outcomes collection notice which will follow.
New activity response ‘unemployed and looking for work’ – old DLHE response item reintroduced for Graduate Outcomes. Why [was this] removed in first place?
This was removed for the C13018 return, following a post-implementation review. However, having tested this option with graduates in the cognitive testing we reintroduced this wording. This wording aligns with that of LongDLHE.
What if the graduates that are being contacted need some career support? How will they be signposted?
Providers have the opportunity through the collection system to upload a careers URL, which will be displayed to graduates at the end of the survey on completion.
Have you done any testing of the outcomes of the cognitive testing? How varied are the results to that which might have been gathered by DLHE?
We have not carried out any further testing other than the cognitive testing but will continue to review any requirement for further testing of the survey. We do not expect the Graduate Outcomes survey responses to be comparable to DLHE, due to the different timing of the survey and change in the survey design.
How do we capture students and reflect their activities when the university has a relationship with the employer and the employer selects students to be sent on the course paying the course fees and a living cost to the student? The student is then expected to reach a certain grade in their course to be offered a job by the employer.
These graduates would still be surveyed through the Graduate Outcomes survey. We feel they would still be able to complete the survey but will continue to review the appropriateness of the survey questions to this group.
What information will be gathered about a student’s postgraduate study if it has been completed in full by the time of the survey, 15 months after first graduation? Subject? Type of course, etc.? What about those who did a 1-year PGT within the 15 months – can this be identified, since job role may be affected by this, whether at the same or a different institution?
See section F of the survey: https://www.hesa.ac.uk/innovation/outcomes/survey.
We’ll also capture linked Student record data for these graduates. Linked data will be used over self-reported data in the survey where these sources conflict. Once data has been collected we will review this to see what impact it should have on our outputs to avoid misleading conclusions being drawn.
How will you code if a graduate reports further study with a private training provider, e.g. not college or with institution?
The question (D3) that captures this information allows for free text responses.
The survey is centred on the activities of leavers within a specified census week, but some questions (notably section H) refer to how a leaver was feeling yesterday. How will data be compared and used if the questions and answers will not be referring to the same point in time for all leavers?
The difference in time capture doesn’t affect the comparability of the data being captured for its specific need. The subjective wellbeing questions are being kept consistent in line with their use in other surveys to maintain statistical comparability.
How many contacts will be made to each graduate? If 10 (for example) will that be 10 to each contact type (email / landline / mobile) or 10 split across all contact types?
When will the HESA Graduate Outcomes Engagement Strategy be published?
What is the fieldwork schedule – number of emails, calls and will SMS be used? How many times will graduates be contacted?
The engagement strategy is currently being refined. Further information on this will follow in due course.
How do HESA envisage responding to potential Student Union-led boycotts of the Graduate Outcomes survey given its link to TEF and the boycott we have seen surrounding NSS?
HESA have worked closely with the National Union of Students (NUS) throughout the NewDLHE review and the development of Graduate Outcomes. The NUS are supportive of the changes made to DLHE and we will continue to engage with them on the development of the survey.
Will a paper version of the questionnaire be available?
A postal (paper) version of the questionnaire will be available to graduates who are uncontactable by email or phone (expected to be a very small percentage). It’s important to note that postal will only be used when no email or phone is available, not when attempts to contact via email or phone have failed.
At what point is explicit refusal taken? When the graduate informs the institution or when they inform the contractor?
What will be classified as a refusal to the survey? When will we be made aware of this?
Explicit refusal will be recorded when a graduate informs HESA or the call centre that they do not wish to take part in the survey following an attempt at contact.
What if caller hears an international dialling tone? Continue call? Might wake up graduate if in foreign country – no awareness of time zone.
We will define handling of international calls with the call centre, including timing of these calls to ensure the most appropriate time to contact.
Are there scope / plans for the invitation to take the survey (online) to be personalised? Can HEPs add their own message?
Other than inclusion of the provider logo, to maintain consistency there will not be the opportunity to personalise survey invitations.
Many of our UK graduates relocate to Africa to set up businesses. There will be limited linked data. Will they be contacted? Will HESA phone internationally? Will you call international graduates?
How will institutions be informed about responses / non-responses? Will it be done in time for the institution to specifically target non-response groups?
Dashboards, supplied through the provider portal, will inform providers of response rates during the survey period and will be updated on a near-real time basis.
Where online responses are low, [it can] take 7-9 phone calls to get a positive response. What assurances are there that we won’t be disadvantaged, and the [call] centre will persist in calling?
We will take all required steps to meet the response rate targets for all graduates.
Have the specifics with regards to limited response been worked out, i.e. if someone puts the phone down halfway and doesn’t respond later on – how will it be used?
If a partial response has been provided, then the call centre will follow-up with the graduate through calling attempts. If the graduate fails to continue with the survey, the data will only be used where they have answered all questions required for a valid response.
You mentioned you will be focussing on driving online engagement to complete the survey under Graduate Outcomes. Can you please share how you intend to do this?
We have used best practice in survey design to ensure the survey is appropriate for online delivery, we are developing the online survey system with a ‘smartphone / tablet first’ approach, and we have taken learning from existing surveys.
You mentioned you are producing an engagement strategy that you hope [providers] will use to engage graduates over the 15-month period. Can you confirm when these engagement strategies will be available for us to use?
The engagement strategy is for HESA’s use in contacting graduates to take part in the survey. The roles and responsibilities section of the website sets out the approach providers should take in the 15-month window.
Can the survey be completed offline with the data sent once reconnected? Thinking of all our London commuters with no reception on the tube.
The survey is not downloaded and therefore runs as a live webpage; however, graduates should be able to complete the page they are viewing and submit it once the connection is re-established.
Any plans for incentives?
Graduate Outcomes incentive idea. Money for charity per completed response. Student ticks charity of choice or charity back at university or an amount to university charity. Ethical.
If incentives are allowed as part of the survey will they be provided on a provider basis or to everyone in the GO populations? Who will cover the costs of the incentives?
HESA are working with the steering group to consider plans for incentives. Our focus is ensuring any approach we take is cost-effective and achieves increased response rates. We will consider as part of this the idea of incentivising through donations to charity. Incentives would be provided across the whole Graduate Outcomes population. Cost of the incentives will come from the Graduate Outcomes subscriptions.
When we speak to graduates now, they can chat for quite some time about their course and how they feel it prepared / did not prepare them and those specifics are quite useful. If graduates volunteer extra info like this, will it be passed on?
Only information collected through the Graduate Outcomes survey questions will be retained. This is largely due to ensuring General Data Protection Regulation (GDPR) compliance.
Currently, The Department for Education (DfE) use DLHE to look at destinations of teachers at 6 months. Indeed, they contact us after each DLHE for any supplementary information we have. Have they signed up to changing their timeframe to 15 months or will they continue to make us provide the information about the 6-month timeframe?
DfE are currently exploring their requirements for Graduate Outcomes data; however, they believe they may be able to use existing data sources, rather than use Graduate Outcomes for their requirements.
Did I understand correctly that other bodies can opt-in to optional question banks for our graduates? If so, I assume we don’t pay for this, and how do we ensure that asking optional questions does not alter response rates? Opt-in question banks – you mentioned some institutions may be ‘opted-into’ by other bodies, e.g. UK Research and Innovation (UKRI) and National College for Teaching and Leadership (NCTL). Can you provide more information on this, please?
The cost of any opt-in bank will be covered by those requesting the opt-in bank, e.g. if UKRI were to ‘opt-in’ to the research bank, they would cover the cost of these questions and specify the population they want these questions to be asked of. Opt-in banks are asked at the end of the survey and are not required for a valid response. If graduates drop out during this section, their responses to the core survey will still be retained.
What are provider questions?
Provider questions are bespoke questions that institutions can create (under governance from HESA) and ask of their own graduates. Provider questions will be implemented for year two.
What is the difference between opt-in questions and provider questions?
Opt-in question banks are consistent question banks available for all providers and statutory organisations to ‘opt into’. Provider questions are bespoke question sets created and applied by individual providers to ask of their own graduates only.
When do we need to have decided on whether we wish to select any of these (opt-in)? What is the deadline for confirming if an institution wishes to include opt-in questions?
Do these have to come from the opt-in question banks, or can we submit our own (for our distinct student cohort)?
Bespoke provider questions for individual providers will not be available in the first year of the survey, however from year two onwards this option will become available.
How will you encourage more to choose the opt-in questions?
We do not intend to encourage stakeholders to take up opt-in questions; this is at the discretion of providers / statutory customers.
How many opt-in questions are you expecting? How many do you need to get a suitable sample size?
We do not currently have an estimate of how many providers will take up opt-in question banks. Data will only be published where we have a sufficient level to meet our publication thresholds; however, providers will receive the raw data supplied for their graduates.
Why is the referral to the leavers careers service an optional question? As an institution that offers a lifetime service we feel it is not conducive to ongoing support to have to pay additionally so leavers can be referred to us.
Not all providers have a careers service, or the resources within their careers service, to follow up directly with individual graduates while the survey is running; this question therefore became part of the ‘opt-in banks’. You will be able to supply a link at the end of the survey to direct graduates to your careers service.
We believe that if [provider] opts-in to undertaking additional questions as part of the Graduate Outcomes that this data should only be available [to the individual provider] and not considered part of their core questions.
Opt-in question banks may have national comparability and include areas of interest wider than just individual providers. We will only publish data where we have a sufficient level to meet our publication thresholds.
Response rates are the concern [in discussion with other providers at the conference] over lunch. Not all institutions have the infrastructure to update / maintain contact details. Will these considerations be made? What have been the discussions with HESA and OfS on this?
Graduate Outcomes will take the burden of running the survey collection from providers; however, the requirement for maintaining high quality contact details is retained. Providers should be able to evidence best efforts have been made to hold and maintain these contact details for graduates.
Contact collection opens in September; when is the latest that the first upload must be made by?
9 November for Student record providers, and 28 November for AP Student record providers. For more information see the data collection schedule: https://www.hesa.ac.uk/collection/c17071/data_collection_schedule.
Will there be any wriggle room for special designation? In terms of contact details and response rates?
We are unsure what is meant by special designation in this circumstance, but we expect high quality contact details to be supplied for all graduates by all providers.
Do we need to provide logos in specific format to enable scalability across devices?
Yes – we will provide the format specification for logos to providers in advance of the collection system opening for submissions.
Do we upload contact details directly into Confirmit or to HESA? Is there capability built in, to [enable providers to] send updated details in [as we receive them] when we have late responses to our marketing efforts?
Contact details will be uploaded directly to HESA through the provider portal, and then transferred to the Confirmit system. Contact details can be updated continually up until the final week of the survey period for the active cohort. More details can be found in the data collection schedule: https://www.hesa.ac.uk/collection/c17071/data_collection_schedule.
Clarification on the GRADSTATUS field please, in particular ‘not contactable by email / phone’?
GRADSTATUS captures the extenuating circumstances of a graduate not being contactable at all, or not contactable by email or phone. The code for not contactable by email or phone should only be used when a graduate is uncontactable through these methods; for example, graduates who are in prison. This field will have validation thresholds in place which, if exceeded, will require a valid explanation.
[Please explain the hierarchy of] roles who sign-off [the data]?
As the survey is now centralised and being undertaken by HESA, providers are no longer required to sign-off the data at the end of the process.
Does the [provider] have to submit details for everyone? What is the scope to target those we know are in employment?
The provider will be required to submit contact details for everyone that is listed in the Graduate Outcomes population, regardless of whether some graduates are already in employment.
The student return deadline is at the time of submission of contact details for cohort A. What happens if we miss anyone, will they be in cohort B?
The student sign-off deadline is several weeks before the start of the survey period, so the Graduate Outcomes population should be finalised before surveying starts. If students have been omitted incorrectly in the Student record after sign-off, we will deal with this through the usual fixed database processes and discuss with the relevant statutory body as to how to survey these students on an ad-hoc basis.
Do HESA require any data to be returned to them after the contact details? E.g. proof of effort to check details or compulsory contact point?
We will be reviewing the quality of contact details following year one. Where significant issues are identified we will be seeking providers to supply documentation of the processes undertaken to collect and maintain these contact details.
Will future collection points align with Data Futures timing?
Yes – the collection points do not currently match the Data Futures timings but will align with them. The shift to Data Futures will allow us to provide the population to you much sooner.
How will any questions, asked by graduates to the call centre, be handled and passed back to the [provider]?
Graduates will be advised to contact their provider directly with any provider-specific questions.
What level of detail will be available on the real-time dashboard? When will further details of the HESA dashboard and its functionality be available?
Dashboards will provide information on the response rates, percentage of graduates in each activity and Standard Industrial Classification / Standard Occupational Classification (SIC/SOC) information. Further information on the dashboards will be available in Autumn.
Can we feedback / feed in on dashboards?
Yes, we will continue to refine the dashboards based on feedback.
Who will have access to dashboards?
Access to the dashboards will be provided by your Graduate Outcomes record contact, using our familiar Identity system (IDS).
Does linked data count towards response rate target?
No, the response rate is a survey response rate.
What are the response rate targets for international students? How can we ensure we have data for our high population of international students?
The response rate targets for international students are still being finalised with the steering group, but we envisage setting the targets at around 25 percent for non-EU domiciled and 45 percent for EU domiciled graduates.
What [provider] data will be allowed – is it definitely none at all?
Provider responses are not in scope for Graduate Outcomes. This was agreed with the steering group where the decision was taken that it wouldn't be appropriate to take these responses given the change to methodology.
Since [providers] are no longer responsible for collecting data, who holds responsibility if response rates are not met? Are there plans in place if this does start to look likely?
Provided that sufficient high quality contact details are supplied by providers, the responsibility for meeting response rates lies with HESA. HESA will be closely monitoring the responses whilst the survey is ongoing, and where response rate targets are not being met, will take mitigating action such as increasing the level of contact with graduates.
As there is not going to be linking on further study information this time, will providers be able to return these as a provider response?
We will be linking further study information from year one of Graduate Outcomes.
Will [providers] have contact with the call centre? E.g. if a graduate requests a call on a certain day / time, can we pass this on?
Graduates will be able to get in contact with HESA via email if they have any queries about the survey, including wanting to arrange calls. Providers will be able to direct graduates to this email address.
Will call centres be used at evenings and weekends?
We will be agreeing a call schedule with the appointed call centre, but we intend to use evenings and weekends for calling.
Will Longitudinal Educational Outcomes (LEO) data be used for people who do not engage with Graduate Outcomes?
We are exploring with the Department for Education (DfE) whether we can get access to LEO data for those who have not completed the survey.
We (for DLHE) aim to get an even spread of responses across departments. This helps inform decisions as the data is as representative as possible. Can Confirmit / HESA also offer this level of service?
Yes, we will be monitoring response rates at subject level.
If you hit response rates for international cohorts (for which the targets are very low) will you continue to push for further responses?
Once response rate targets have been met we will not continue to push for further responses, due to the cost associated with this. However, if response rate targets are felt to be too low, we will review these beyond year one.
Response rate likely lower but presumably sufficient to get representative sample so was DLHE over-surveying. Was the difference between 60 [percent] and 80 [percent] wasted effort? Or is lower response rate not sufficient?
The change in response rate is largely to do with the change in survey timing. Whilst 80 percent was achievable at 6 months, it is unlikely to be achievable at 15 months post-graduation.
[With] regard to working on which census cohorts of students will fall into… When is a degree defined as complete?
In terms of ‘complete’ for being included in the Graduate Outcomes population and students falling into the different cohorts, this is determined by ENDDATE in the Student / AP Student records.
Where do students taking re-sits fit into this? For HE within FE our partner university boards might be a factor. Clarification?
Once the award has been given, the student will fall into the population based on their ENDDATE. For example, a student with an August ENDDATE whose award is given in December due to a re-sit will still fall into the cohort A period the following year, as the ENDDATE is August and the award has simply been given at a later date. Provided that the award has been given before the student return closes in the following year, they will fall into the population.
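The rule above can be sketched in code. This is an illustrative sketch only: the cohort month boundaries below (A: August–October, B: November–January, C: February–April, D: May–July) are assumptions for illustration, not HESA’s authoritative definitions, which are set out in the Graduate Outcomes coding manual.

```python
from datetime import date

def cohort_for(enddate: date) -> str:
    """Return the survey cohort implied by ENDDATE.

    The month boundaries here are illustrative assumptions only.
    """
    month = enddate.month
    if 8 <= month <= 10:
        return "A"
    if month >= 11 or month == 1:
        return "B"
    if 2 <= month <= 4:
        return "C"
    return "D"  # May to July

def falls_into_population(award_date: date, return_close: date) -> bool:
    """An award conferred after ENDDATE (e.g. following a re-sit) does not
    change the cohort; the graduate is in scope provided the award is given
    before the relevant Student record return closes."""
    return award_date <= return_close

# A graduate with an August ENDDATE whose award is confirmed in December
# still sits in the cohort implied by the August end date.
print(cohort_for(date(2018, 8, 31)))  # prints "A"
```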
As an HE in FE provider, our students’ first degree is a 2-year foundation degree – the 1-year top-up to a bachelor’s falls into the 15-month survey ‘gap’.
What weighting do the questions in section G have on the survey? Our students will be surveyed 3 months after finishing the bachelor’s [degree] and are less likely to be in work / study / other than at the current DLHE 6-month census point.
We will be reviewing the use of section G once we have received the data for the first survey.
As an HE in FE provider, our students undertake a 2-year foundation degree (technically their first degree), then top up to a full bachelor’s in the following year.
Will students be surveyed 15 months after the completion of their 2-year foundation, then after completing their 1-year bachelor’s?
This depends on whether these are two separate instances, with two separate end dates and awards. If so, and both awards meet the criteria, the student will fall into two different cohorts and be surveyed after each. If this is one instance with one award at the end (a successful bachelor’s, with the foundation degree as an interim award), they will be surveyed once, on completion of their bachelor’s.
As an HE within FE provider, current student data is provided through the Individualised Learner Record (ILR). This is currently provided once a year, and on the [website] HESA says it will remain an ILR submission.
Will we be required to complete ILR returns more frequently, e.g. one submission for every census date? Further clarification welcomed.
The ILR will be completed as normal. Graduates who have completed will fall into the population and relevant cohorts for that survey year.
The 4-phase annual cycle is hugely labour intensive and in our case meant we needed to create a 10-page communications plan. Can this be simplified?
The four survey periods have been created to survey graduates as close to the 15-month point post-graduation as possible and to spread the range of the surveying across the year. Not all providers will have graduates who fall into all four cohorts. Therefore, we do not intend to alter this approach.
You mentioned that you will be building a ‘richer picture’ by asking questions about activity pre- and post-census week. If all data users in the sector, e.g. league tables, only look at census week outcomes then we are in some double-digit percentage trouble for those in postgraduate study who will have just finished before the census week. What work are you doing to influence data users such as league tables to use this ‘richer picture’ rather than just the census week outcome?
We will be working closely with league table providers to ensure they are aware of the full range of data available through the Graduate Outcomes dataset and to ensure that necessary caveats to the data are included.
Is there potential for the ‘Graduate level employment’ measure to be replaced by one of the new data items? For example, graduate voice items?
We are still exploring the outputs, however graduate voice measures will likely be additional to existing measures, rather than replacing them.
With reference to weighting, will there be any regional adjustment?
We are still exploring the approach that will be taken to weighting. Once we have more information on this, we will publish it on the Graduate Outcomes area of the HESA website.
How will data align with Teaching Excellence Framework (TEF) metrics?
How will TEF metrics be measured in light of New DLHE or Graduate Outcomes?
That is a decision for the OfS, but we will keep them updated on progress on the implementation of Graduate Outcomes.
Will raw data be released to [providers] and if so, will coding convention be retained from DLHE?
Yes, the full raw dataset will be provided to providers. The schema and coding of the data will not match DLHE, due to the changes brought about in Graduate Outcomes.
Will raw and weighted measure results be presented or just the weighted version? You would only weigh if not representative – is there evidence that it isn’t?
Providers will receive raw data at the end of the survey. We are investigating the use of weighting for the data. Taking steps to address potential non-response bias is a requirement of National Statistics classification. For more information, see https://www.hesa.ac.uk/innovation/records/reviews/newdlhe/quality-assurance.
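As an illustration of the kind of non-response adjustment referred to above, the sketch below shows simple post-stratification weighting: each respondent is weighted by the ratio of their stratum’s share of the population to its share of the respondents, so that weighted results reproduce the population profile. The strata and counts are invented for illustration only; HESA’s actual weighting methodology is still under investigation.

```python
# Minimal sketch of post-stratification weighting for non-response bias.
# Strata and counts are invented for illustration only.
population = {"UK": 8000, "EU": 1000, "Non-EU": 1000}   # graduates in scope
respondents = {"UK": 4800, "EU": 450, "Non-EU": 250}    # survey responses

pop_total = sum(population.values())
resp_total = sum(respondents.values())

# Weight for each stratum: population share divided by respondent share.
# Under-responding strata (here, non-EU graduates) are weighted up.
weights = {
    s: (population[s] / pop_total) / (respondents[s] / resp_total)
    for s in population
}

# After weighting, the respondent profile matches the population profile.
for s in population:
    weighted_share = weights[s] * respondents[s] / resp_total
    assert abs(weighted_share - population[s] / pop_total) < 1e-9
```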
It was suggested that HESA are reviewing the use of experimental survey publications during the in-year collection of Graduate Outcomes data. Has it been decided what type of data will be available, who this will be available to and whether this will be available for wider publication prior to the complete data set being available?
Data will be provided to providers after each survey cohort. We are still considering the use of any additional publications prior to the full dataset becoming available.
[Regarding] General Data Protection Regulation (GDPR), can we use contact details collected from any avenue, e.g. reference request from graduate to tutor? Where should this data be stored to comply with GDPR?
Further details about how you should collect contact details are available in our data protection guidance. Storage and use of the contact details for any purpose other than Graduate Outcomes is subject to your own status as a data controller under the General Data Protection Regulation (GDPR).
Does the public intent of Graduate Outcomes cover us to contact graduates about just the survey [even] if we do not hold consent for them?
Yes. Further information is available through our data protection guidance.
Where information is inaccurate, will you go back to graduates to confirm, e.g. where they give a flippant response to job title – ‘Dogsbody’?
Where significant issues with data quality are identified, we will follow up with graduates. However, we intend to build most of the data quality checks into the online / telephone forms to avoid having to follow up with graduates.
Why are providers being charged twice for this? If it is a statutory data collection, why is this fee an extra on top?
Graduate Outcomes does not fall under the main HESA subscription cost. To provide clarity, we have separated all costs related to Graduate Outcomes into its own subscription.
When will we get clarity on costs? When will we know the cost / or estimated cost (opt-in)?
The subscription page provides information about costs and a timeline for further updates.
Will you give us a refund if large numbers complete online? Will the cost of the survey increase if the calling team is required to contact leavers more times via the phone than outlined in the Engagement Strategy? If so is it known what the cost will be?
HESA will either credit back or reduce / increase the following year’s subscription to account for any discrepancies in year one’s subscription. This will be applied on a sector basis, rather than on an individual institution basis.
Level of Scottish engagement in review and implementation? How will HESA engage with the Scottish sector? In particular about Longitudinal Educational Outcomes (LEO)?
Scottish providers, the SFC and the Scottish Government have been closely involved and consulted throughout the NewDLHE review and the implementation of Graduate Outcomes. The NewDLHE review strategic and working groups had representation from a Scottish provider, the SFC and the Scottish Government, as does the Graduate Outcomes steering group. This ongoing engagement will extend to the linked data. We will ensure we take a UK-wide approach when seeking input into the development of Graduate Outcomes.
[I’d like a more detailed] understanding of the range / number of contractors HESA will be managing and their different functions, e.g. IT platform, Confirmit, call centre, coding, etc.
Confirmit will deliver the contact management system, which manages survey contact with graduates both online and by telephone. This system will deliver the emails, send the list of graduates to be called to the call centre, and provide the computer-assisted telephone interviewing (CATI) platform and the online survey platform.
We are in the process of procuring a call centre and a coding organisation. Once this process is complete we will announce the organisations we are working with for these services.
We contracted out our DLHE [in 2018] and felt that in some cases, interviewers were unprepared and insufficiently knowledgeable about the survey, the University and the Higher Education sector in general, which impacted the quality of the data captured. How do HESA plan to ensure that interviewers are sufficiently prepared, and what quality assurance (QA) procedures have been put in place to ensure that the contractors’ performance meets the expectations and standards of HESA and [providers]?
We are utilising a pre-qualified and established framework for procuring the call centre to ensure the call centre has experience in this area. We will be working with the call centre to develop the training programme for telephonists and we will be closely monitoring the performance of the call centre and taking action where necessary.
If Standard Occupational Classification (SOC) coding is to be completed by a different provider to the calling company, will there be collaborative work between HESA and the two organisations in order to ensure that appropriate data is collected in order to accurately code data?
HESA will manage the relationship to ensure all information required for SOC and Standard Industrial Classification (SIC) coding is provided when conducting telephone surveys.
As with most / all customer supplier relationships, we expect there will be service level agreements (SLAs) in place. Both between HESA and the [providers] and HESA and Confirmit / call centre provider. Please can you provide details of these?
HESA does not manage its relationship with providers through SLAs. We are not able to share the SLAs agreed with Confirmit, as these form part of a commercial relationship.
Please [can providers have access to] more training and conferences.
We are currently considering the further training programme for Graduate Outcomes, and more information will follow.
How will you gather feedback after the first year?
We will be engaging across the sector following the first year of Graduate Outcomes to further refine and develop the survey and processes.
Have you thought about getting information from the British Medical Association (BMA) for medics?
We will continue to explore opportunities to utilise linked data sources throughout the survey, so as to avoid asking graduates questions where existing data sources can be used. This includes looking into whether we could obtain information from the BMA.
Presentations from the Graduate Outcomes conferences
For your convenience, the presentation slides from the Graduate Outcomes conferences are now available as downloadable PDFs.
If you have any further queries, please contact Liaison.