We ran two public consultations to gather input on our plans during the NewDLHE review.
Our first consultation ran in the summer of 2016 and invited respondents to comment on the top-level principles behind collecting outcomes data. Our second consultation on our detailed model ran between March and April 2017.
Consultation one – summer of 2016
We invited all interested parties to explore the desirable characteristics of a future replacement for the DLHE survey. We explained that we aimed to identify the features of a data product that supported the aims of a wide range of uses and users. We also wanted to gauge support for some of the ideas we presented as tentative proposals. To encourage a deliberative and exploratory approach, we explained that there would later be time for sifting, analysing and refining ideas to produce a deliverable data product, followed by a further consultation on that product.
At this early stage, we wanted to engage all interested parties in a wide-ranging debate about the future of student destinations and outcomes data: shaping a long-term settlement for data about graduates that supports public information, policymakers’ decisions, and our collective understanding of the role of graduates in the economy and society.
We received a total of 206 valid responses and published a summary of them. Overall, the responses offered strong support for the use of linked data, for the continuation of a census survey, and for a change to deployment timescales, moving to a single survey at between 12 and 18 months.
There was also strong endorsement of the high-level scope of the survey, continuing the topics covered in L/DLHE, alongside very strong support for new measures of graduate outcomes and the addition of measures around graduate entrepreneurship and placements. We also published the findings of the costing exercise conducted as part of the first consultation, which revealed the full costs providers faced in delivering the distributed DLHE survey, based on a sample of 111 detailed costings.
Consultation two – March to April 2017
The second consultation offered an opportunity for stakeholders to comment on the draft model proposed in response to our review findings and to indicate their support for it. The model we proposed was synthesised from the many sources of expertise that fed into the review.
The second consultation received 187 responses and delivered a clear mandate to proceed with implementing our proposed model. Key results included:
- We received strong support for our proposed survey design with over 80% in favour.
- Over 70% were in favour of the implementation plan.
- The survey practicalities, including the open centralisation methodology, received solid support with over 60% in favour.
Comments received were supportive of the direction of travel, although a number of queries and concerns were raised on aspects of the model. We continued to publish responses to questions and issues raised by stakeholders.
Some of the issues raised reflected the complexities involved in moving from a devolved model to a centralised one. An example is the position on response rates. Under the administrative data model of DLHE, stratified response rate targets for sample groups defined by qualification aim and domicile were closely monitored. Under LDLHE, the sample survey contractor achieved a high response rate. However, given the unique nature of the proposals (the closest parallel is the Australian Graduate Outcomes Survey, which achieves c. 39% overall response rate at 12 months from an online-only survey), we could only offer a guide to our approach, establishing 70% as a challenging response rate target to be kept under review.
Subsequent work during the implementation period focused on reducing the potential for bias in the survey through various approaches, and on ensuring that the survey is managed according to professional standards for the quality assurance of surveys. These standards necessarily place the achievement of high response rates alongside other important quality factors, such as ensuring appropriate approaches to call prioritisation and incentives, identifying the approach to be taken in producing confidence intervals for our findings, and producing survey weights. HESA’s statistics and econometrics staff have subsequently produced papers on these matters, which the Graduate Outcomes Steering Group has discussed extensively.