
PraxisAuril 2022 conference – summary of feedback

Summary headlines:

HESA colleagues were delighted to attend the twentieth PraxisAuril annual conference in 2022 and engage with a diverse range of colleagues from across the knowledge exchange (KE) community. Working in partnership with colleagues from The University of York and the University Commercialisation and Innovation Policy Evidence Unit (UCI), University of Cambridge, we delivered an interactive session titled “Beyond the £ - Measuring and Valuing University Knowledge”. The focus of the one-hour session was to explore measures of the value of KE – whether social or economic – and how a quantitative data collection such as HE-BCI can appropriately demonstrate this as something other than income. As part of these discussions, we updated delegates on the progress of the HE-BCI major review to date and presented the six identified priority areas for review.

We invited feedback from an estimated 150 delegates - split across 18 tables - in response to three focused questions:

  1. Which metrics currently exist in the HE-BCI data collection and are sufficient in measuring value (social, economic, cultural, etc)?
  2. Which metrics currently exist in the HE-BCI data collection but are insufficient in measuring value and can be removed?
  3. Which metrics are deemed appropriate in measuring value and should be incorporated into the HE-BCI data collection following review?

Thematic analysis of responses

Feedback demonstrated a consensus that the variety of engagement types captured across Tables 1–5 of the HE-BCI data collection reflects those undertaken by providers. However, there were often differing opinions on the metrics currently used to exemplify the value of KE in each area. Whilst the demographic of the delegates was not formally recorded, a broad range of roles, disciplines and levels of strategic involvement was represented across the discussion. Overall, the value and impact of KE was thought to be both economic and social; however, the feedback reflected the range of different values and how the importance of KE activities varied. The most notable division was the contrast in significance attached to income-generating activities and those producing tangible outputs (contract research and creation of intellectual property) compared with cultural outputs and interactions (public and community engagement and events). This dichotomy was highlighted during discussions between individuals representing science and technology disciplines and those with a focus on arts and humanities, and it recurs throughout the feedback as explained below.

The feedback provided by attendees demonstrated well-considered responses. Although not extensive, their responses offered suggested directions for the development of the HE-BCI record and useful examples of alternative measures of the value of KE. Furthermore, the cross-referencing of different themes of the HE-BCI data collection (reflected across Tables 1–5) in answers to all three questions suggested that no single area of the record is sufficient to demonstrate the value of KE in its current form. For the purposes of logical thematic analysis, we have collated responses against each of the five record tables, as below:

Table 1: Research related activities

Advocates of the current income-related metrics used to report collaborative research were often representatives from science, technology, engineering, and mathematics (STEM)-focused disciplines, with many holding strategic and management positions. Their responses cited collaborative research and consultancy as commonplace features of KE between themselves and partner organisations, and the current funding metrics as a necessary feature of the record. They supported the continued split of income by partnerships involving different partner sizes, for example small and medium-sized enterprises (SMEs) and non-SMEs; however, they often caveated this support with a need to further develop definitions to align with those used in industry. Providers stated this was helpful in aiding the required disaggregation of business sizes. One table developed this requirement further, stating that the SME split should be further disaggregated to identify collaborations with both micro and small-to-medium-sized partners. They suggested that by highlighting these differences they could more accurately demonstrate their strategic commitment to supporting current agendas such as levelling up and connecting capabilities, whereby partnerships with smaller organisations are encouraged and incentivised.

Additionally, the requirement to develop guidance supporting the valuation of in-kind contributions was unanimously highlighted across all discussion tables. Feedback demonstrated the need for greater clarity on the required level of accuracy and accountability when assigning values, and delegates felt examples would be beneficial in supporting providers and external partners in attributing values to contributions at the point of initial contract. They shared that taking examples from other data collection resources and guidance would help to align their own work and provide a basis for assigning values repeatedly and reliably. Whilst they did not make direct reference to such resources, providers noted that consistency in methodology and guidance would help to reduce burden and improve consistency across the sector. Furthermore, they felt this would develop their confidence both in the onward use of in-kind data by themselves and external colleagues, and in the completion of the data collection.

Table 2: Business and Community services

Facilities- and equipment-related income was deemed a continued requirement, and little feedback was provided on how this area might be improved. Similarly, continuing professional development (CPD) and continuing education (CE) were cited as meaningful and required features of the record; however, the use of ‘learner days’ was agreed to be inadequate to demonstrate the breadth or value of teaching, and more broadly as a representation of the value and impact of KE. One table felt that the current metric provided consistency of measurement across all submitting providers, but recommended splitting learner days into hours to accommodate shorter courses. Others felt that the current use of time as a proxy for value was insufficient and instead shared a requirement for metrics to focus on measuring quality rather than quantity. Although highlighted as a difficult concept to measure, suggestions of alternative metrics included reporting repeated partnerships between providers and the organisations enrolling employees on their courses: an organisation’s repeated investment in its employees through the teaching delivered at a provider demonstrates a perceived value in the quality of the knowledge exchanged.

Table 3: Regeneration and development programmes

A need for greater emphasis on place and geography featured in discussions across all areas of the data collection, particularly when reporting regeneration and development and public and community engagement (P&CE). Providers cited the use of case studies and narratives as suitable for outlining the location of the communities and public engaged. Banding the proximity of an external partner to the provider was deemed a good measure of the reach of regeneration and development partnerships. The inclusion of geography in these ways was thought to demonstrate a provider’s commitment to levelling up and place-based agendas.

There was discussion around the need for greater representation of the types of regeneration programmes providers were frequently involved in. This was particularly important to align with the differing initiatives of the funding bodies, for example in Scotland, where programmes such as the European Structural Fund will cease and the UK Shared Prosperity Fund will be introduced. Providers felt that aligning these areas of the record could reduce associated burdens and improve their own internal data recording processes.

Table 4: Intellectual Property

The relevance of intellectual property (IP), licences, patents and trademarks was identified as a contentious area for discussion. Notably, representatives from research-intensive providers and those orientated towards STEM disciplines were divided on whether IP is a sufficient measure of value. Whilst some suggested the current use of the number of patents filed was suitably indicative of research outputs and KE in these areas, others felt that a provider’s ability to ‘game’ the system by producing high quantities of low-quality IP applications and filings was problematic. A few individuals commented on providers ‘churning out’ licences and patents for the purpose of increasing numbers, and some felt these outcomes were often not valuable exchanges of knowledge. They stated that comparison of quantity rather than quality was misleading and promoted distrust in data outputs dependent on these metrics, for example the Knowledge Exchange Framework (KEF) and Higher Education Innovation Funding (HEIF) allocations.

Similarly, providers with a greater emphasis on Social Sciences, Humanities and the Arts for People and the Economy (SHAPE) provision stated that the outcomes of valuable KE within their disciplines are less frequently demonstrated through the production of IP. Comparing the quantity of IP outputs without consideration of academic discipline, or of their quality, is therefore misleading and disadvantageous to some. Whilst they cited the clustering methods used in the KEF as helping to alleviate unfair comparison across disciplines, the requirement of the HE-BCI data collection to report this data was not appropriate for all providers.

Similarly, the use of counts of spinouts was stated by some to be a useful indicator of impactful and valuable KE. However, there was a shared requirement for a refined definition of spinouts and further information on the necessary relationship between them and the provider to qualify their eligibility for inclusion in the data collection. Additionally, delegates discussed the point at which a spinout ceases to be eligible and the extent to which the provider’s contribution to its social and economic value ceases. They responded well to suggestions of disaggregating counts based on the percentage of provider investment in spinouts, and felt this could be a meaningful way to demonstrate the provider’s perception of their value and identify the ‘transition’ from spinout to stand-alone organisation. A few representatives shared concerns about spinouts similar to those expressed regarding IP, stating that a provider may produce a high number of low-quality spinouts of little value or impact.

Additionally, many delegates advocated linking external datasets to the HE-BCI data collection as both reducing burden and improving the accuracy of data. It was stated that the required estimated employment and turnover data for spinouts is often arduous to obtain – a difficulty that worsens with the passing of time from company registration onwards. Furthermore, representatives from providers with fewer resources – in both staff and data recording systems – cited that maintaining relationships with their alumni to monitor this information over a period of three years was often difficult if not impossible. It was therefore stated that provider capacity may correlate with inequalities in the consistency or accuracy of data, undermining its use as a good measure of the value of KE of this kind. They referenced the prospective linking of Companies House (CH) data as having the potential to alleviate these issues and stated it would be a better and more accurate source of the true value of provider spinouts across a variety of measures (economic, social, etc.).

Similarly, there was a unanimous desire to link the HE-BCI data collection with Intellectual Property Office (IPO) data. This would reduce the burden on providers of monitoring the use of IP, and the breadth of information held by the IPO was deemed more comprehensive and accurate. It was discussed that the metrics held by the IPO were more reliable and more demonstrative of the value and impact of IP than those in the HE-BCI data collection.

Table 5: Social, community and cultural engagement: designated public events

Representatives from SHAPE-orientated providers often cited public and community engagement (P&CE) as an area of the record of particular importance to them. However, where there was consensus for these types of interactions to remain in the record, there was also a desire for revision of the current metrics. The current reporting of the ‘number of attendees’ at events and activities was stated to be an inadequate reflection of the quality of KE interactions. Their comments mirrored those regarding IP and spinout creation in that quantity measures overlooked quality, which is necessary to demonstrate and evaluate the value of KE. Although there was support for the need to better reflect quality, there were few suggestions of how this could be done. Whilst delegates acknowledged that Part B of the HE-BCI data collection is quantitative, one recommendation was the use of supplementary qualitative case studies. Feedback suggested case studies as a demonstrable way to report the impacts and value of engagement of these types, allowing fuller explanation of the audiences interacted with and the local needs that KE activities sought to address. Many representatives explained that they often obtain testimonials and participant feedback following their engagements, which is then used strategically to shape and design future interactions. Furthermore, they outlined using case studies in submissions to other data collections such as the KEF and Knowledge Exchange Concordat (KEC) applications. It was therefore felt that case study inclusion in the HE-BCI data collection would be advantageous in demonstrating the impacts and value of KE, reducing burden by aligning collections and de-duplicating effort.

Additionally, providers confirmed Table 5 of the HE-BCI data collection as a suitable place to record engagements with public and not-for-profit organisations rather than just individuals. They cited partnerships with regional stakeholders such as local authorities (LAs) and schools as commonplace, demonstrating significant and highly valuable examples of KE. Furthermore, they felt it reflected a provider’s strategic commitment to local needs and co-development – both useful in evidencing their civic missions and place-based agendas.

Interestingly, several delegates noted a requirement to disaggregate in-person and online engagement types. Previously, these were collated into one metric defined by the activity (public lecture, performing arts, etc.); adapting to online event delivery during the restrictions of the COVID-19 pandemic led to the inclusion of specific guidance for reporting them. For the purposes of the C19032 and C2032 HE-BCI data collections, providers were advised that where an interaction was delivered online but intended to have been delivered in person, it should be considered eligible for return if the content and ‘experience’ of participants was comparable to that of in-person attendees. Whilst this was deemed problematic and open to ambiguity by providers, the increased use of online platforms was stated in feedback as becoming routine practice. Therefore, to alleviate such ambiguities, many shared a desire for events, even those identical in content, to be returned as either in-person or online. For hybrid events, providers would make a judgement on which categorisation was most appropriate based on attendee experience.

Across the record

Whilst delegates often discussed each of the three posed questions against the five tables of the HE-BCI data collection, some feedback was deemed applicable across the record, particularly regarding the coverage of the record and the granularity of financial data reported. It was suggested that, to measure the full value of KE, staff with both academic and non-academic contracts should be consistently included in record coverage, considering that support staff and those with non-research contracts make sizeable contributions to KE interactions through both the preparation and delivery of knowledge. Similarly, delegates reported students as prominent agents of knowledge exchange, with examples of their work with local community groups and schools as commonplace. Students were reported as often holding both paid and voluntary positions in these roles.

Additionally, whilst the use of income as a proxy measure for the value of impact was widely regarded as inadequate, it was felt that the detail of income should be further elaborated beyond the use of £000s. Collaborations and projects of smaller scale, or generating in-kind rather than direct monetary income, were often excluded from returns due to not meeting the current financial threshold. Enabling the return of interactions receiving smaller quantities of income would better represent KE activities, thus enhancing the measure of the breadth and value of KE. This point was supported by comments that providers’ increased efforts to collaborate with micro-enterprises and SMEs appeared as a disadvantage when the data was analysed, since work with partners of these sizes, and with community or charitable groups, typically produces smaller financial income for the provider.

Finally, it was felt that the inclusion of metrics across all areas of the record to show repeated interactions – whether collaborations, partnerships, or events – would help to identify the breadth of provider commitments and relationships with external partners. It was suggested that extrapolating counts of repeated interactions from aggregated figures would help distinguish providers with established connections and commitments from those engaged in a higher quantity of short-term partnerships. Whilst discussions did not suggest that comparing the two measures would necessarily reflect on a provider, whether negatively or otherwise, delegates felt it would be useful in demonstrating their own strategic commitments, developing their own key performance indicators, and monitoring the value of KE.


Overall, the feedback shared across the 18 discussion tables was consistent with the planned priority areas of the review. Discussions highlighted consideration of the values of KE as being both social and economic and supported the development of the record to enhance the demonstration of these. The exercise was deemed highly beneficial by provider representatives and HESA colleagues alike in confirming that the six areas for development met the needs of the sector. Many delegates reported that the exercise had instilled confidence in HESA’s ability to appropriately review the HE-BCI data collection and that running a consultation session made them feel included and accounted for in the review process.
