Provider delivery FAQs

This page outlines some useful FAQs about the delivery of final survey data to providers for the 17/18 collection.

Here we have brought together all of the key points and guidance on the delivery of final provider survey data.

Guidance note on use of data

The below guidance note includes important instructions on the use of the final data and the embargo until HESA’s first statistical release.

Guidance note on use of final 17/18 provider data

C17072 Survey Results coding manual

We advise providers to make use of the C17072 Survey Results coding manual to understand the format of this delivery as it provides the data specification and derived fields.  

C17072 Survey Results coding manual

Methodology statements part one and two

These statements provide a comprehensive overview of the survey’s history and operations.

Methodology statements part one and two

Provider portal user guide

The provider portal user guide has been updated to include guidance on the newly created ‘Collection results’ tab. This guidance has been added to the C17071 coding manual.

Provider portal user guide

Introduction to the provider portal e-learning module

Link to the provider portal 

Link to HESA Graduate Outcomes provider portal log in page. If you have not assigned the relevant role to access the data, you will need to do this in the HESA Identity System first.

Link to the provider portal 

Can I compare this data to DLHE data?
We advise you not to compare this data to DLHE. This is because Graduate Outcomes is a new product with significant differences from its predecessor. In light of this, HESA is actively considering the variety of potential onward uses of the data and steps we might take to mitigate misunderstandings or misapplications of the data. We share more about this in our blog: 'Don't mistake Graduate Outcomes for DLHE'.

The published outputs make it clear that Graduate Outcomes is a different instrument from its predecessor. These differences include: 

  • Time frame change - 15 months after the graduate leaves (rather than six), allowing graduates to establish themselves in employment. 
  • Centralisation - the contact centre, fieldwork and SIC/SOC coding are undertaken centrally, no longer by the provider, allowing for greater consistency.
  • New questions - these include the length of time with their employer, how many jobs they've had since graduating, whether they have had responsibility for supervising staff etc.
  • Graduate voice - new questions provide a nuanced understanding of success in outcomes as judged by the graduate. 
  • Opt-in questions - giving providers the opportunity to ask graduates additional questions. 
  • Wellbeing - the inclusion of subjective wellbeing questions developed by the ONS.

These differences are substantial and affect how data is collected, compiled and can be analysed when compared to DLHE. HESA will therefore be taking a proactive stance in relation to correcting high-profile mis-representations of the data where these may occur. In addition, we are working with sector groups to design the survey outputs, so that the final products will be the result of a collaborative effort.

Bearing all of this in mind, and following guidance from the Office for Statistics Regulation, we decided early on to release all Graduate Outcomes data outputs as ‘experimental statistics’ for this year. You can read more about this in our blog: ‘"The true method of knowledge is experiment" - why Graduate Outcomes statistics are experimental’.

The above also applies to providers comparing their final year one data with previous DLHE data. Users will not have a time series (a series of comparable data from previous years) at their disposal until future years of Graduate Outcomes data become available, so comparison should be avoided.

Where can I find out more about response rates?
Please refer to our existing FAQs:

Response rate FAQs

In addition, HESA is working to understand the factors that contribute to lower response rates at providers whose volume of graduates should enable the survey to achieve an average response rate. Some of the factors associated with this variability include the quality of contact details, provider demographics and brand awareness among graduates. We are also exploring ways of improving response rates for providers with low rates, provided this does not introduce bias into the national results.

I believe that a graduate has been incorrectly coded for SOC – what can I do about this? 
We have published the outcomes of our assessment of SOC coding for 17/18 on the HESA website.

View the outcomes of our assessment of SOC coding for 17/18

As previously communicated, any feedback supplied to HESA after the 6 January 2020 deadline cannot be incorporated into year one (17/18) outputs. This was to ensure we remained on track for data delivery. However, we remain open to this feedback to allow us to fully inform the coding of year two (18/19). 

What can I do with my data? 
Before downloading the data, it is essential that all providers read and understand the guidance about the use of this final data and the statistical release instructions. This outlines the nature and use of this data during the pre-release embargo period (note - now that the Statistical Bulletin 2017/18 has been published, this embargo has expired).

View final provider data usage guidance note

When will I receive the national data? 
Please visit the open data - graduates page to access the published outputs and supporting materials. You'll also find updates on the timing of future releases on the upcoming releases page.

Where can I find out more about the methodology for Graduate Outcomes?
We have recently shared our Graduate Outcomes methodology statement which contains details of the most important aspects of survey design, data collection, analysis and dissemination for Graduate Outcomes. It is aimed at the users of Graduate Outcomes survey data as well as those with an interest in survey methodology. Part two of the statement includes a detailed section on weighting.

Visit the Graduate Outcomes methodology statement

Where can I find out more about the data items and the survey routing? 
Please refer to the C17072 Survey Results coding manual for more details or specifically the survey results file structures for downloadable files. For derived fields, view the specifications. The survey routing diagram can also be found in the coding manual along with details of the data items.

Visit the C17072 coding manual

How can I access my data?
Your provider’s data will be made available to the Survey Results user role. We have updated the provider portal user guide with instructions on how to access this data. In short, we have created a new ‘Collection results’ tab where this can be downloaded.

Why can I not access this data?
There are a few reasons for this:

  • You do not hold the Survey Results user role. This must be assigned in the HESA Identity System by the Record Contact or Admin user.
  • Your provider is required to complete the data usage sign off form and it has not been returned to HESA. If this is the case, please contact Liaison.
     

What is included in the final data?
This data comprises the final survey results (data entered by graduates) as well as several derived variables (including aggregations) which have been generated from the underlying survey data. The derived columns are primarily aimed at providing a set of comprehensive measures to help providers use this data meaningfully and allow for comparisons with other HE providers. Please refer to the file structures for downloadable files.

It will also contain the final year one SIC / SOC coded data following the extensive assessment of provider feedback and rigorous consistency checking. Read more about the outcomes of the SOC assessment process

What format is the data provided and what’s in the file?
The collection results will be downloaded in a ZIP file which will contain the C17071 data in .tsv format. The ZIP file will contain a ‘ReleaseDescription’ file which lists the contents of the ZIP file. Within each ZIP file there will be 10 files:

  • 4 data files.
  • 4 record description files which describe the structure of a datafile.
  • 1 valid entries file which lists all the codes and their meanings.
  • 1 release description file, as previously mentioned, which lists all the files in the ZIP file.
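
For readers who want to see what working with this delivery might look like in practice, below is a minimal sketch of opening the ZIP and loading one of the .tsv data files in Python using pandas. All file names in the sketch are placeholders; the actual file names are listed in the ‘ReleaseDescription’ file inside your own download.

```python
# Minimal sketch: list the contents of the collection results ZIP and load
# one of the .tsv data files. All file names below are placeholders - check
# the ReleaseDescription file inside your own download for the real names.
import zipfile
import pandas as pd

zip_path = "collection_results.zip"  # hypothetical local file name

with zipfile.ZipFile(zip_path) as archive:
    print(archive.namelist())  # should list the 10 files described above

    # .tsv means tab-separated values; keep_default_na=False preserves blank
    # strings so unanswered questions remain distinguishable from NaN.
    with archive.open("survey_results_1.tsv") as handle:  # placeholder name
        data = pd.read_csv(handle, sep="\t", dtype=str, keep_default_na=False)

print(data.shape)
```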

Will this data change after the final delivery?
As with all HESA collections, further deliveries of the data may occur if the data delivered is deemed to have changed significantly. Such a change can be caused by fixed database changes in other collections or by errors in the calculation of derived fields. The same process that is used in other HESA collections will be applied to decide whether another delivery iteration is required.

*Update on additional provider delivery*
Following the release of final 17/18 provider survey data at the end of March, we provided an important update regarding the content of this delivery (dated 6 April). This related to SOC coding, the ACTSKILLS data item, and the XACTIV03 derived field. To rectify these issues, we are expecting to release a new iteration of the collection results data into the provider portal week commencing 18 May. This delivery will include additional SOC coding but will exclude the XACTIV03 derived field. Please refer to our email of 29 April for more information.

How have you defined the new XACTIVITY derived field? 
The derived activity field for Graduate Outcomes is broadly based on the most important activity of the graduate with a few exceptions: where graduates state that their most important activity is employment, and they have an additional activity of further study (or vice versa), the graduate is categorised as being in both work and further study. Those graduates whose most important activity is unemployment or doing something else, but who are due to start work or study are also identified separately.

We received some feedback from the sector around how certain categories of graduate activities were being treated, in particular where these related to standard labour market classifications: 

  • Categorisation of graduates who indicate that their most important activity is unemployment, yet also indicate that they are in work, which could be seen as contradictory by those looking for labour market data.
  • Categorisation of graduates who indicate that their most important activity is doing something else (including travel, caring for someone or retired) and are also due to start work or further study.

Further analysis of the Graduate Outcomes data has taken place and HESA is proposing that the previous activity derivation (XACTIV03) is replaced by a new one (XACTIVITY) and used in published outputs.

Visit the derived field specifications

The key changes are:

  • Reclassify those graduates responding that their most important activity is unemployment, but also responding that they have another activity that is either paid work, self-employment, running their own business or developing a portfolio into the appropriate employment category, on the basis that they meet the standard labour market definition of employment (in that they are working for pay or the expectation of profit). It is intended that those graduates responding that their most important activity is unemployment, but also responding that they have another activity of voluntary / unpaid work, are retained within unemployment as their activity indicates that they are most probably economically active, not in employment, and there is no implied contradiction.
  • For consistency, those graduates who state that their most important activity is voluntary / unpaid work have been separated out from the other work categories. We are adopting the term ‘employment’ for the collective group of paid work, self-employment, running own business and developing portfolio. This fits closely with accepted international and national standards for describing employment and provides more information about the different activities in which graduates are involved.
  • Rename ‘study’ to ‘further study’ to highlight the distinction between the original study undertaken by the student and that undertaken since leaving.
  • The original proposal for XACTIV03 was to categorise graduates as ‘Due to start work’ or ‘Due to start study’ where their most important activity was unemployment or other (includes travel, caring for someone, retired). Given that ‘Due to start work’ has been historically grouped with unemployment in some of the published outputs (including the UK Performance Indicators), this could have an unduly negative impact on perceptions of which graduates are in unemployment. Respondents who state their most important activity is one of the ‘other’ categories are less likely to be active in the labour market at the point of survey (meaning they are actively seeking work and able to take it up in a short timeframe), and hence it is unreasonable to assume that they are unemployed now just because they are due to start work or study soon. The intention is therefore to retain these graduates in the ‘Other including travel, caring for someone or retired’ category, based on their most important activity, and relabel the ‘Due to start work’ and ‘Due to start study’ categories as ‘Unemployed and due to start work’ or ‘Unemployed and due to start further study’.
  • To reflect that ‘Other work’ and ‘Other study’ categories include graduates who do not complete the intensity of work and/or study questions, rename as ‘Unknown pattern of employment’ and ‘Unknown pattern of further study’.
  • Combining the two work and further study categories into one.
  • Rename ‘Other’ to ‘Other including travel, caring for someone or retired’ to reflect the content of this category.
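
To illustrate how the rules in the list above fit together, below is a simplified, non-authoritative sketch of the categorisation logic in Python. The function name, input values and labels are invented for the example; the actual XACTIVITY derivation is defined in the derived field specifications.

```python
# Illustrative sketch only - not the official XACTIVITY specification.
# Input values and labels are simplified placeholders; the real derivation
# is defined in the C17072 derived field specifications.

EMPLOYMENT_ACTIVITIES = {"paid work", "self-employment", "own business", "portfolio"}

def classify_activity(most_important, other_activities):
    """Return a top-level activity group for one graduate.

    most_important   - the activity the graduate said was most important
    other_activities - set of any additional activities they reported
    """
    in_employment = (most_important in EMPLOYMENT_ACTIVITIES
                     or bool(EMPLOYMENT_ACTIVITIES & other_activities))
    in_study = most_important == "further study" or "further study" in other_activities

    if most_important in EMPLOYMENT_ACTIVITIES or most_important == "further study":
        if in_employment and in_study:
            return "Employment and further study"
        if most_important in EMPLOYMENT_ACTIVITIES:
            return "Employment"
        return "Further study"

    if most_important == "unemployed":
        # An additional paid activity moves the graduate into employment;
        # voluntary / unpaid work does not.
        if EMPLOYMENT_ACTIVITIES & other_activities:
            return "Employment"
        if "due to start work" in other_activities:
            return "Unemployed and due to start work"
        if "due to start further study" in other_activities:
            return "Unemployed and due to start further study"
        return "Unemployed"

    if most_important == "voluntary or unpaid work":
        return "Voluntary or unpaid work"

    # Travel, caring for someone, retired, etc. stay in the 'Other' group,
    # even if the graduate is due to start work or study.
    return "Other including travel, caring for someone or retired"


print(classify_activity("unemployed", {"paid work"}))      # Employment
print(classify_activity("paid work", {"further study"}))   # Employment and further study
print(classify_activity("caring for someone", {"due to start work"}))  # Other including ...
```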

If a graduate is both unemployed and in further study, what will they be classed as in XACTIVITY?
The XACTIVITY field focuses mainly on the most important activity returned by the graduate. There are a few exceptions, such as when a graduate indicates their most important activity is unemployment, but they are also in some kind of employment which is seen as a slight contradiction. The same contradiction is not apparent with study, so these graduates are retained within unemployment in line with the most important activity.

Why have those doing voluntary/unpaid work been separated out from those in paid employment?
The XACTIVITY categorisation is designed to align more closely with the standard labour market definition of employment (working for pay or the expectation of profit), which is why a decision was made to show voluntary or unpaid work in a separate category for the purpose of the top-level activity group.

What is the meaning of null in the data?
Null or blank data indicates that the graduate did not answer this question. This could be because they weren't asked the question due to routing, they skipped over it, they stopped just before reaching it, or they didn't start the survey at all.
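
As an illustration, if the data file is read with blank strings preserved (as in the earlier sketch), a quick way to see how many graduates left each question unanswered is to count empty cells per column. The file name below is a placeholder.

```python
# Sketch: count blank (unanswered) responses per column in a results file.
# "GO_survey_results.tsv" is a placeholder - use the data file from your
# own Collection results download.
import pandas as pd

data = pd.read_csv("GO_survey_results.tsv", sep="\t", dtype=str,
                   keep_default_na=False)

# A blank cell means the graduate was not asked, skipped, stopped before,
# or never started the survey - it is not a distinct answer in itself.
blank_counts = (data == "").sum().sort_values(ascending=False)
print(blank_counts.head(10))
```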

How has the subjective wellbeing data been aggregated?  
HESA has agreed with the steering group that we will neither publish nor disseminate this data at provider level for the 2017/18 survey. Subjective wellbeing data delivered to HE providers will be restricted in coverage to their own graduates only and will consist of aggregated frequency counts for each question without any further disaggregation (it is not provided at individual graduate level).  

We have provided derived fields for each question which outlines the banding of responses applied. 

Where has data been inferred from a routing issue?
In cohort A and part of cohort B, if a graduate in further study failed to select either the provider they are attending or “Other” from the drop-down provider list, they were not routed to the question on country of study (this was corrected midway through cohort B). In order to correct this routing issue as far as possible, the graduates affected (under 2,500 graduates) were fuzzy matched (using combinations of student identifier, names, date of birth, sex, postcode and other characteristics) to determine whether the further study was carried out at an HEP or AP that submits data to HESA. Where there is a strong match at an equivalent level of study, the provider name and country are imputed. For all other remaining records, the country code is set to ‘Not known’.
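
For readers interested in the general technique, the following is a highly simplified, hypothetical sketch of fuzzy matching on a combination of identifying attributes. It is not HESA's actual matching process; the record layout, weights and threshold are invented for the example.

```python
# Hypothetical sketch of fuzzy matching on a combination of attributes -
# not HESA's actual matching process. Record layout, weights and threshold
# are invented for illustration.
from difflib import SequenceMatcher

def name_similarity(a, b):
    """Crude string similarity between two names (0.0 - 1.0)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_score(graduate, student_record):
    """Score how likely two records refer to the same person (0.0 - 1.0)."""
    score = 0.0
    if graduate["student_id"] and graduate["student_id"] == student_record["student_id"]:
        score += 0.5
    score += 0.3 * name_similarity(graduate["name"], student_record["name"])
    if graduate["dob"] == student_record["dob"]:
        score += 0.1
    if graduate["postcode"] == student_record["postcode"]:
        score += 0.1
    return score

def best_match(graduate, candidates, threshold=0.8):
    """Return the strongest candidate above the threshold, or None."""
    if not candidates:
        return None
    score, record = max(((match_score(graduate, c), c) for c in candidates),
                        key=lambda pair: pair[0])
    return record if score >= threshold else None
```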

There are typos and misspellings in the responses – why wasn’t this addressed in the QA review? 
At the end of the collection process, data returned for questions that permit a free-text response went through a cleansing process, in order to improve data quality. This is usually where the respondent has not chosen a value from the drop-down list provided but has instead selected “other” and typed their own answer. 

However, this does not include resolving misspellings or the removal of profanity. We have clearly outlined this in the final data guidance note and recommend that providers take it into account in the processes they establish for using this data, as it could contain sensitive information, opinions or references.

I believe an answer is incorrectly given by the graduate, why haven’t you changed it?
We have been approached by some providers to make changes to graduates’ actual survey responses where providers have determined that a graduate either misinterpreted the question or the supplied answer is inaccurate (based on assumptions made by the provider). A fundamental principle of HESA and Graduate Outcomes is that we do not make changes to data returned by respondents. In the absence of unbiased contradictory evidence, answers given in the survey are deemed to be true from the perspective of that graduate.

Are partial responses included in the data and how are they distinguished? 
Partial and complete responses will be included in the same dataset. We have created a derived field (ZRESPSTATUS) that will distinguish between the two. 

Why is ZRESPSTATUS blank?
Derived fields are only calculated for graduates who have some results data returned, no matter how little results data. If ZRESPSTATUS is blank, this means the graduate did not engage with the survey at all.

What is the difference between a blank ZRESPSTATUS and a ZRESPSTATUS = 02?
A blank ZRESPSTATUS indicates that the graduate did not engage with the survey at all.
A ZRESPSTATUS of 02 indicates that the graduate did not meaningfully engage in the survey and therefore cannot be considered to be ZRESPSTATUS 03 or 04. For example, they may have clicked the link or answered one question and deleted their answer.
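
In practice this means a results file can be split into non-engagers, non-meaningful engagers and respondents using ZRESPSTATUS. The sketch below assumes the file is read with blanks preserved and that codes 03 and 04 denote meaningful (partial and full) responses; check the C17072 coding manual for the authoritative value meanings. The file name is a placeholder.

```python
# Sketch: split graduates by response status. Blank = never engaged,
# "02" = no meaningful engagement, "03"/"04" assumed to be partial/full
# responses - confirm against the C17072 coding manual.
import pandas as pd

data = pd.read_csv("GO_survey_results.tsv", sep="\t", dtype=str,
                   keep_default_na=False)  # placeholder file name

never_engaged = data[data["ZRESPSTATUS"] == ""]             # did not engage at all
not_meaningful = data[data["ZRESPSTATUS"] == "02"]          # clicked but no usable answers
respondents = data[data["ZRESPSTATUS"].isin(["03", "04"])]  # partial / full responses

print(len(never_engaged), len(not_meaningful), len(respondents))
```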

If you’re not weighting the data, why is there a derived field for it?
The XWEIGHT01 derived field was provided before we made a final decision on weighting. Now we have decided there is no need to weight the data, providers will not need to use this field. All graduates with the XWEIGHT01 derived field have it set to 1. Find out more about weighting in the methodology statement part two.

Why are there $ values in my file under the SOC fields e.g. XMS2010SOC?
XMS2010SOC will only exist if MIMPACT = 01, 02, 03, 04 or 05. If MIMPACT is something else, then $$$$$ will exist in the file. Please see the derived field technical specifications for further guidance on this.  
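
A simple way to handle this when analysing the file is to treat cells made up entirely of $ characters as ‘not applicable’ rather than as real SOC codes. The sketch below assumes pandas and uses a placeholder file name.

```python
# Sketch: treat '$' padding in derived SOC fields as "not applicable".
# XMS2010SOC is populated only when MIMPACT is 01-05; otherwise the cell
# is filled with $ characters. File name is a placeholder.
import pandas as pd

data = pd.read_csv("GO_survey_results.tsv", sep="\t", dtype=str,
                   keep_default_na=False)

is_padding = data["XMS2010SOC"].str.fullmatch(r"\$+")
data.loc[is_padding, "XMS2010SOC"] = ""  # or pd.NA, depending on your workflow

print((~is_padding).sum(), "graduates have a usable XMS2010SOC value")
```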

What fields in the Collection survey results will tell me the domicile, mode of study and level of study for a graduate?
Providers will need to link to their relevant Student return for this information. The population file may also contain some of these attributes. The Collection results delivery contains Graduate Outcomes survey results only.
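
As one illustration of such linking, the sketch below joins the survey results to a provider's own Student record extract, assuming both files share a common student identifier such as HUSID. The file names and the exact set of Student fields (e.g. DOMICILE, MODE, COURSEAIM) are assumptions for the example; use the identifiers and fields defined in your own returns and in the Collection results file structures.

```python
# Sketch: join Graduate Outcomes survey results to a Student return extract
# to pull in domicile, mode and level of study. File names, the common
# identifier (HUSID) and the Student field names are assumptions.
import pandas as pd

go_results = pd.read_csv("GO_survey_results.tsv", sep="\t", dtype=str,
                         keep_default_na=False)                          # placeholder
student_extract = pd.read_csv("student_return_extract.csv", dtype=str)   # placeholder

linked = go_results.merge(
    student_extract[["HUSID", "DOMICILE", "MODE", "COURSEAIM"]],
    on="HUSID",   # assumed common student identifier
    how="left",
)
print(linked[["HUSID", "DOMICILE", "MODE", "COURSEAIM"]].head())
```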

Will HESA be providing a derived field that aggregates the activity reflection questions for WRK, STU and ACT?
We are not looking to create derived fields that aggregate the graduate reflection questions, because keeping them separate reminds and highlights to data users that there is a distinction between the activities of graduates answering each version of these questions. Each graduate should only be asked one set of these questions depending on their activity/ies, so it shouldn't be too complex to aggregate this data in a way that is meaningful for provider analysis.

What is HESA’s position on weighting the data?
We have recently shared our methodology statement part two which contains details of the most important aspects of survey design, data collection, analysis and dissemination for Graduate Outcomes. It is aimed at the users of Graduate Outcomes survey data as well as those with an interest in survey methodology.

Visit the detailed section on weighting

On 21 May, we published a detailed technical account from HESA researchers about how we reached a decision not to weight the final data. The paper is mainly aimed at academics, statisticians, and other interested parties wishing to understand the weighting research and its conclusions.

Read 'New HESA research asks should we weight?'

Some fields are missing in my data – why have you not followed up with the graduate to obtain an answer? 
Some fields may be missing because the respondent was not asked certain questions or they chose not to answer them. Our follow-up strategy is primarily aimed at non-respondents, followed by partial respondents. Despite all our efforts to follow up partial respondents, some may not complete the survey.

We also recontact graduates where their records are initially returned as uncodeable for SIC / SOC and where they completed the survey over the phone. For the online mode, however, we are unable to make follow-up contact due to the logistics of carrying this out at such a large scale.

Where can I find out more about the engagement strategy and the materials used to contact graduates?
You can find access to this information, including copies of the survey itself, on our Operational survey information page.
 

Why can I no longer see the ‘SIC / SOC’ or ‘survey results’ raw reports in the portal?
Now that the final data has been made available, we are required under GDPR to remove access to data that is factually incorrect once the latest data is available. Therefore, on 31 March, we rescinded access to the ‘SIC / SOC’ and ‘survey results’ reports. Advance notice of this was issued to providers on 20 March 2020 by email. You now have access to this data as final data in the new ‘Collection results’ tab.

Will I continue to have access to the two response rate tabs?
Once the statistical outputs are released, we will be withdrawing access to the ‘provider survey response rates’ and ‘sector survey response rates’ reports. Providers can save their provider-specific report prior to this or create their own response rate calculation from the final delivery. The sector response rates will be included in the statistical outputs. 
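
As one illustration of a provider-calculated figure, the sketch below derives a simple response rate from the final delivery using ZRESPSTATUS. This may not match the definition used in HESA's published rates, so treat it as a rough internal figure; the file name and the assumption that codes 03 and 04 denote meaningful responses should be checked against the coding manual.

```python
# Sketch: a simple response rate from the final delivery. Assumes every
# surveyed graduate appears as a row and that ZRESPSTATUS 03/04 indicate
# meaningful (partial/full) responses - confirm against the coding manual.
import pandas as pd

data = pd.read_csv("GO_survey_results.tsv", sep="\t", dtype=str,
                   keep_default_na=False)  # placeholder file name

responded = data["ZRESPSTATUS"].isin(["03", "04"]).sum()
rate = responded / len(data)
print(f"Response rate: {rate:.1%} ({responded} of {len(data)} graduates)")
```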
 

When will the official statistics be announced?
The Graduate Outcomes Statistical Bulletin is planned for publication at 9.30am on Thursday 18 June 2020. The Open Data release is planned for 9.30am on Tuesday 23 June. Both releases will be published on the graduates Open Data area of the HESA website.

What's the difference between the different publications?

  • The Statistical Bulletin is the first of HESA’s releases and provides high level findings and characteristics of graduate outcomes at a national level. It provides a useful overview of the survey results and will be published alongside a summary assessment of data quality.
  • Shortly after, we plan to publish a larger suite of Open Data tables providing a more detailed view of the data.
  • Finally, as we communicated earlier in April, the publication of UK Performance Indicators: Graduate outcomes, 2017/18 has been delayed. At present we are expecting release during autumn 2020. Read more about how HESA intends to engage with the sector on its design in our measuring performance FAQs

What’s the nature of the statistics being released?
The primary aims for HESA are delivering the data to the regulator and other Statutory Customers and publishing the output in accordance with the UK Code of Practice for Statistics. Given that this is the first year of the survey, the output will be published as Experimental Statistics. The survey methodology and analytical approach will then be formally reviewed by the Office for Statistics Regulation (OSR) in the summer with a view to designating the output as a National Statistic in the future.

Visit the upcoming releases page to get the latest on the timings (dates will be confirmed four weeks before publication) and read our blog about the nature of the releases as experimental statistics.