Adding value to UK graduate labour market statistics: The creation of a non-financial composite measure of job quality

Section 2: Data

While higher education is now a devolved matter across the four nations of the UK, the responsibility for the collection and dissemination of data on students has resided with a single organisation, the Higher Education Statistics Agency (HESA), now part of Jisc, since the mid-1990s. Each academic year, providers are required to submit data on all students enrolled on their courses to this body, covering aspects such as demographic characteristics, subject of study and qualification aims. Additionally, the rapid expansion in university participation in the late 1980s and early 1990s raised questions about how the sector should be funded in the longer term, and there was therefore growing interest among students and policymakers in understanding the public and private benefits of higher education. Consequently, for the last thirty years HESA have gathered administrative records on students annually and have also surveyed graduates each year shortly after they complete their studies.

Up until the academic year 2016/17, the primary questionnaire relating to graduates was the Destinations of Leavers from Higher Education (DLHE) survey, which asked individuals to supply information on their activities six months after they had qualified. Since then, DLHE has been superseded by the Graduate Outcomes survey. As with DLHE, this is sent to all qualifiers from a particular academic year and participation is optional; the first cohort invited to take part comprised those who graduated in 2017/18. There is some alignment in the topics covered by the two questionnaires, with both asking graduates in employment about matters such as their annual earnings and the type of organisation they work for. However, Graduate Outcomes differs from DLHE in a few ways. Firstly, the survey is administered fifteen months after an individual completes their qualification, rather than six. Secondly, the content of the questionnaire has been partially altered to better reflect present policy objectives and the need to capture subjective (as well as objective) outcomes. For example, the financial crisis of 2008 led to an acknowledgement that an economy cannot be judged solely by Gross Domestic Product (GDP) and that it is important also to focus on citizen wellbeing (Stiglitz et al. 2009). In the UK higher education sector specifically, there was a desire to see information assembled on how graduates perceive their employment outcomes, with the importance of subjective data on job quality also noted by the MJQWG.

Questions on wellbeing, and on graduates' perceptions of the extent to which their work is meaningful, utilises their skills and aligns with their career objectives, were thus new additions to the Graduate Outcomes survey that did not form part of DLHE. In the section on wellbeing, graduates are asked to rate the following aspects of their life on an 11-point scale running from 0 to 10, where 0 represents ‘not at all’ and 10 ‘completely’. These match the four questions put forward by the Office for National Statistics (2018), which are presently used in the data dashboards that track national wellbeing over time.

  • Overall, how satisfied are you with your life nowadays?
  • Overall, to what extent do you feel that the things you do in your life are worthwhile?
  • Overall, how happy did you feel yesterday?
  • Overall, how anxious did you feel yesterday?
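
To make the structure of these responses concrete, the sketch below shows how the four wellbeing items might be loaded and validated in Python. The file name and column names are hypothetical illustrations, not the survey's own variable names.

```python
import pandas as pd

# Hypothetical column names for the four ONS wellbeing items; the actual
# Graduate Outcomes variable names may differ.
ONS4 = ["life_satisfaction", "worthwhile", "happy_yesterday", "anxious_yesterday"]

df = pd.read_csv("graduate_outcomes_extract.csv")  # hypothetical extract

# Each item is scored on an 11-point scale from 0 ('not at all') to
# 10 ('completely'); values outside that range are treated as missing.
for col in ONS4:
    df[col] = pd.to_numeric(df[col], errors="coerce")
    df.loc[~df[col].between(0, 10), col] = pd.NA
```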

Meanwhile, the part of the survey asking graduates to reflect on their activity to date invites those in employment to indicate the extent to which they agree or disagree with the following three statements:

  • My current work is meaningful
  • My current work fits with my future plans
  • I am utilising what I learnt during my studies in my current work

Responses are recorded on a five-point Likert scale ranging from ‘strongly disagree’ to ‘strongly agree’. All three of these indicators appear to align closely with the job design and nature of work dimension developed by the MJQWG. In the next section, we empirically investigate whether there is evidence that all three relate to the same underlying component and, if so, whether we can create a composite measure that would support communication of job quality information in the public domain.
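
As an illustration of how the three statements could be prepared for that analysis, the sketch below maps the five Likert categories to integer scores and forms a simple unweighted mean as one possible composite. The coding scheme, column names and mean-based composite are all assumptions made for illustration; they are not the survey's own coding or our eventual method.

```python
import pandas as pd

# Illustrative coding of the five Likert categories; the survey's own
# coding scheme may differ.
LIKERT = {
    "Strongly disagree": 1,
    "Disagree": 2,
    "Neither agree nor disagree": 3,
    "Agree": 4,
    "Strongly agree": 5,
}

# Hypothetical column names for the three job-quality statements.
ITEMS = ["work_meaningful", "fits_future_plans", "using_what_i_learnt"]

df = pd.read_csv("graduate_outcomes_extract.csv")  # hypothetical extract
for col in ITEMS:
    df[col] = df[col].map(LIKERT)

# An unweighted mean of the three scores is one naive composite; the next
# section examines whether the items in fact load on a single component.
df["job_quality_score"] = df[ITEMS].mean(axis=1)
```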

For our quantitative work, we focus on the first two collections of the Graduate Outcomes survey, covering those who qualified in either 2017/18 or 2018/19. Survey responses are linked to the administrative records on students held by HESA to enrich the dataset used in our analysis. We restrict our sample to UK-domiciled graduates whose sole activity at fifteen months was paid employment in the UK for which they were remunerated in pounds sterling; those to whom we could not assign a UK region, or who were domiciled in Guernsey, Jersey or the Isle of Man, were excluded, though these cases constituted less than 0.5% of the sample. Additionally, graduates must have responded to all three statements above in their survey submission. The final sample contained a total of 286,240 observations (note that all totals in this paper are rounded to the nearest five in line with the HESA rounding methodology, which is designed to prevent the disclosure of personal information).
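
The sample restrictions described above amount to a sequence of row filters, followed by HESA's rule of rounding counts to the nearest five. The sketch below illustrates this with hypothetical field names and values; the actual linked HESA records use their own coding frames.

```python
import pandas as pd

df = pd.read_csv("linked_records_extract.csv")  # hypothetical linked dataset
ITEMS = ["work_meaningful", "fits_future_plans", "using_what_i_learnt"]

mask = (
    df["domicile"].eq("UK")                       # UK domiciled (excludes Crown dependencies)
    & df["uk_region"].notna()                     # a UK region could be assigned
    & df["activity"].eq("Paid employment only")   # sole activity at fifteen months
    & df["employment_country"].eq("UK")           # employed in the UK
    & df["pay_currency"].eq("GBP")                # remunerated in pounds sterling
    & df[ITEMS].notna().all(axis=1)               # responded to all three statements
)
sample = df[mask]

def round_to_5(n: int) -> int:
    """Round a count to the nearest five, per HESA's rounding methodology."""
    return 5 * round(n / 5)

print(round_to_5(len(sample)))
```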
