Data Futures Alpha progress report: August 2021

In this edition

HESA website sessions

We held two website review sessions: ahead of the sessions, participants completed a Top Tasks survey. The Top Tasks methodology focusses on customer needs to refine website content and layout. The methodology has been used successfully in the HE sector, as well as in organisations like Gov.uk, Google, and the BBC.

The top five tasks correlated with our understanding of requirements as we move towards implementation:

  1. Access the Student – Data Futures coding manual
  2. Confirm understanding of guidance
  3. Find out about the transition to Data Futures
  4. View the data items
  5. Find out when the first Student – Data Futures collection begins

Participants also completed a worksheet asking what they want or need to do when they visit the HESA website. This user research demonstrates that providers’ requirements are evolving in line with the programme milestones.

Responses included:

[When I visit the HESA website, I want to do … so that…]

  • Access the data dictionary, to constantly reaffirm understanding of different fields in the return.
  • Check deadlines for returns, so that I can plan to ensure returns are finished in good time.
  • Look for the latest news, so that I can keep up with new developments.
  • Access examples/scenarios that match my institution's characteristics, so that I can ensure that I am manipulating the data appropriately and can contact Liaison quickly about any missing scenarios/guidance.
  • Check for open consultations, to allow [my organisation’s] voice to be heard.
  • Look at programme timelines, to plan staff time.
  • Access clear and timely information on news and updates/amendments, to be well informed of changes that could impact the returns.

During the live sessions, we asked:

  • What works for you?
  • What doesn’t work?
  • What else would you like to see on the website?

Feedback and suggestions included:

  • Focus on timelines and transition as we move towards implementation.
  • Move key content, such as timelines, coding manual links, and information on transition, to the top of the Data Futures landing page.
  • The e-learning content is useful across the organisation, not just for operational contacts.
  • Review how to make the user journey more intuitive across content.

Both sessions were very productive and informative, and we will take forward actions as we move towards transition and implementation.

Data migration update

We held a further data migration session on 31 August to answer any queries arising from the work.

The migrated dataset is intended to look exactly like what a provider would submit during the first year, based on their 2019/20 data, as if it had been returned to the new system.

We have included a number of extra fields to help explain how the migration works, but the core idea is to present the ‘legacy’ data in the new structure.
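As a rough illustration of what putting ‘legacy’ data in the new structure might look like, the sketch below nests a flat legacy-style record inside entity groupings. The field and entity names used here (HUSID, COMDATE, Engagement, StudentCourseSession) are assumptions for illustration only, not the actual Data Futures schema.

```python
# Illustrative sketch only: field and entity names are assumptions,
# not the real HESA legacy or Data Futures data models.

legacy_record = {  # flat, legacy-style student record
    "HUSID": "2011012345678",
    "COMDATE": "2019-09-23",
    "COURSEID": "BSC-CS",
}

# The same values re-homed in a nested, entity-based structure,
# with a placeholder session derived from the commencement date.
migrated = {
    "Student": {
        "HUSID": legacy_record["HUSID"],
        "Engagement": {
            "CourseId": legacy_record["COURSEID"],
            "StudentCourseSession": [
                {"SessionStart": legacy_record["COMDATE"]},
            ],
        },
    },
}
```

The point of the shape change is that fields which were flat columns in the legacy return hang off the entity they describe, which is why participants are asked to check the fields attached to each entity.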

Our Head of Solutions Development explained that we want participants to start exploring the data, check that it makes sense, and confirm that their dataset looks as they would expect if they were returning it in the Data Futures data model. We are keen for participants to check the fields that hang off the different entities, and to flag any that do not make sense.

Participants shared some of their initial activities:

  • Comparing student instance counts and instance profiles against credibility reports: numbers tallied as expected.
  • Matching up submitted data with migrated data.

We explained how student course session works as a placeholder in the migrated datasets:

  • Historic data has no concept of when a year of study starts and ends.
  • The commencement date and the anniversary of commencement are used to create placeholder student course sessions.
  • For non-standard year types, we have sometimes created two student course sessions, but only one where the student has gone dormant:
    • A session might finish without a new one starting.
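The anniversary-based splitting described above can be sketched as follows. The function name, the exact splitting rule, and the treatment of reporting-period boundaries are all assumptions for illustration; the real migration logic (including dormancy handling) is more involved.

```python
from datetime import date

def placeholder_sessions(commencement: date, period_start: date, period_end: date):
    """Derive placeholder student course sessions from a commencement date.

    Illustrative assumption: each session runs from one anniversary of
    commencement to the next, clipped to the reporting period.
    (29 February anniversaries are ignored for brevity.)
    """
    sessions = []
    # Find the anniversary on or before the start of the reporting period.
    anniversary = commencement.replace(year=period_start.year)
    if anniversary > period_start:
        anniversary = commencement.replace(year=period_start.year - 1)
    # Emit one session per anniversary-to-anniversary year overlapping the period.
    while anniversary < period_end:
        next_anniversary = anniversary.replace(year=anniversary.year + 1)
        sessions.append((max(anniversary, period_start),
                         min(next_anniversary, period_end)))
        anniversary = next_anniversary
    return sessions
```

For a student commencing on 23 September 2019 and a 2019/20 reporting period, this sketch yields two placeholder sessions, one ending and one beginning at the anniversary, which mirrors the "sometimes two sessions" behaviour described above.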

A full data migration exercise will happen during Beta. During Alpha, we are trying to fix any obvious issues, so that when we receive the 2020/21 data we can run another migration exercise for Beta participants. At go-live, we expect an iterative process to address any issues we spot.

We have offered further individual and group data migration sessions, to be arranged from mid-September onwards, mindful of the business-as-usual demands on our operational colleagues this month.

Coming up in September’s Alpha progress report

  • End of Alpha: wash-up session and participant feedback
  • Online validation kit session notes