Data Futures Alpha progress report: June 2021
In this edition
- Welcome to Alpha
- Moderated usability testing: HESA Data Platform
- Coming up in July's Alpha progress report
The Data Futures Alpha phase launched in May 2021 and 14 providers are participating.
We held a ‘Welcome to Alpha’ Microsoft Teams meeting on 13 May. All providers were represented on the call, with between 1 and 6 attendees per provider.
We hosted an interactive whiteboard session and below are the questions we asked and some of the attendees’ responses*.
(*a ‘bonus goal’ was to ‘stop feeling like a novice using Microsoft Teams’ – an additional perk for Alpha participants!)
Why have you chosen to take part in Alpha?
'To have a greater understanding of [Data Futures] and to assist other [colleagues] to understand the importance and getting buy-in to make changes to business processes, etc'.
'We have an in-house student records system…so have quite a different approach to creating a student return'.
'To understand plans, get clarity on timescales, ability to feed back and get early sight of things'.
'To achieve [a] clear view, and work with our software supplier to get best solution possible'.
'A full understanding of what is expected and an idea of the timings that we will need to complete the tasks with current resources'.
'It is a way of us helping HESA to see how the data return can work for alternative providers'.
'Gain further information on the model and requirements'.
'Getting some early views of the issues for [Northern Irish] providers'.
'To ensure that our understanding of the guidance is correct to implement the changes needed at our institution'.
'Building relationships and networking'.
Please tell us what you would like to achieve by the end of Alpha?
'To develop our understanding of the data model / guidance and since we have in-house systems, we hope it will help with development'.
'To feel more confident about the new process and have helped it get off to the best start'.
'A view of the HESA Data Platform and what's different from the existing system'.
'A realistic perspective of the actual [Data Futures] burden'.
'Increase knowledge of model and get some reassurance that we have all the data and can get [it] into correct format'.
'Have a say for alternative providers with the aim of making the process easier for some in the future'.
How can HESA best support you during Alpha?
'Provide more specifics on requirements and expectations'.
'Be available to answer questions and don't be horrified at the lack of in-depth knowledge'.
'Be clear on dates for activities - we [want] to plan to clear timelines to help us manage our people [participating in Alpha]'.
'Listen to feedback, take this onboard and feedback on how this can be implemented along the process'.
Feedback and queries
We are committed to providing consistent and authoritative information across the Data Futures programme: please contact [email protected] if you have any queries or feedback.
We launched moderated usability testing on 27 May.
Each session includes one provider and three HESA colleagues:
- Session lead
- Session moderator
- Technical support (observing non-verbal interactions)
We used a test script and test data files in each session.
Providers completed standardised activities, so we could compare findings and outcomes across the sessions:
- Upload test files and understand any schema errors.
- Find and view the quality rules report following an upload.
- Find and view the credibility report.
- Complete and sign off a submission.
The HESA Data Platform (HDP) has the functionality required for the tests, but the interface does not yet show the final design (more on this in the July Alpha progress report).
Providers were encouraged to comment and think aloud as they worked through each test, so we could capture their initial responses to any changes to the process or the interface. As part of the testing script, we gave as few prompts or instructions as possible, so we could understand how users engage with the system, including how intuitive it is. The testing is intended to really challenge the HDP, and we do not want to miss any opportunities for improvement.
As is the nature of Alpha testing, we expect to encounter issues so that we can resolve them ahead of the Beta phase. So far, we have not come across any unknown bugs. This does not mean we are ‘bug-free’, but all the bugs encountered so far had already been noted and are planned for resolution in future development and releases. We’ve received lots of useful input on new features and suggestions for improvement, and we’ll say more about this in future editions of the progress report.
We asked for testimonials following the sessions:
'The usability session was very interesting as I have not done anything similar before, and I thought it was managed really well by the team at HESA. I appreciated the fact that they took notes while I worked, rather than me having to try and write down my thought process. I felt supported and that the team were non-judgmental as I worked my way through the exercises and I liked the feedback at the end'.
'It is clear that the team are taking on board the way that providers interact with the system, so if you have any difficulties or concerns I would definitely encourage you to raise them. They won't know what is and isn't clear or working unless you tell them. Also have an open mind when approaching the new system: things may have different names, or be organised slightly differently, but the sooner you can engage and familiarise yourself, the better'.
We have been holding User Interface (UI) sessions where Alpha participants get a first look at the HDP interface, based on the prototype concept, and have an opportunity to give feedback on the visual 'look and feel' of the system and to share their opinions on what they like and dislike.
We have also hosted quality assurance ‘deep dive’ sessions, and we will provide a summary of what we covered and of participants' feedback.
We will be hosting a data migration session, to present our plans and approach to participants for feedback and input.