Sheffield Hallam University.
Hive IT worked with Dr Lynne Barker of Sheffield Hallam University to develop an entirely new assessment of cognitive ability for use after traumatic brain injury (TBI), stroke, and mild cognitive impairment (MCI)/early dementia. Our goal was to build an interactive analogue of real-world (cooking) behaviour (Cog:LAB) that benchmarks current brain injury levels more quickly and cost-effectively than current NHS measurement techniques.
- Client
- Sheffield Hallam University
- Who they are
- We worked specifically with Dr Lynne Barker, Reader in Cognitive Neuroscience, Centre for Behavioural Science and Applied Psychology, Department of Psychology, Sociology, and Politics at Sheffield Hallam University
- Location
- Sheffield
- Requirements
- Build a prototype application to aid in the assessment of cognitive functions.
Problem
In real life, complex tasks such as cooking are often significantly affected when a person sustains a brain injury (or early in the course of a neurodegenerative disorder) because they require the coordination of multiple cognitive functions. Current assessment tests are expensive, and evaluation is time-costly for both clinicians and patients.
Dr Lynne Barker wanted to develop a new interactive assessment of cognitive ability, with the aim of collecting preliminary data from mixed neurological groups to test the usability, reliability and sensitivity of the measure compared to treatment as usual (TAU), informing patient trials for a follow-on grant submission.
Result
Hive IT developed an interactive analogue of real-world (cooking) behaviour. Cog:LAB is a tablet application used in situ with patients to gather the relevant metrics for assessment. We followed a user-centred design process, conducting two rounds of user testing to mitigate the technological uncertainties of using the application in a patient-based setting.
By observing patients using tablet devices in scenarios similar to those of a clinical assessment, we have been able to make iterative design changes to meet the physical and neurological needs of our end users wherever possible. The application is now being used in patient trials to benchmark against current assessment measures and to understand how the prototype can be developed further.
Approach
From the outset of the project we needed to understand the complex nature of brain injuries and work with Dr Lynne Barker to understand what information she was trying to gather and why. We ran several workshops with Lynne, mapping out the expected user journey, asking why things had to be done in a certain way, understanding how patient assessments work, and establishing where the boundaries lay in supporting users to engage with the application without aiding them in completing the required tasks.
Working closely with Lynne to really understand the project allowed us to set a clear remit and begin development. From that point, we made sure that (as with everything!) we took a user-centred approach - engaging with end users over various sessions to observe them using the prototype, and iterating accordingly to improve the interface and assessment techniques.
Engaging with users
A key part of the project was making Cog:LAB relatable to multiple users. Most similar applications on the market are aimed at children, or at least designed in a very childlike manner. However, when working with patients with brain injuries or dementia it was important to make the kitchen environment recognisable in a real-world context. When we introduced an object that was too stylised, many patients became blocked from moving on with the task and focussed solely on the inaccuracies instead. We learned early in the project that we had to develop assets for the application (such as pans, beans, cookers, toasters and bacon) that were fairly realistic, so as not to distract the user from the task they were being asked to complete.
Whilst the application does not measure patients' dexterity, it has been a strong consideration throughout the design and development process. We engaged with users to understand how they would hold the device, use it and react to different interfaces, and we saw first hand some of the symptoms of a brain injury and the impact these can have on a person's dexterity.
As such, we iterated our design to adapt the interaction model. Instead of using traditional swiping gestures, we limited interaction to tapping on icons. The design also had to carefully factor in the limited screen real estate available whilst still accurately portraying a kitchen scenario, with icons large enough that users were not hindered in completing the task by limited dexterity. Additionally, we ensured that all instructions and information were available as audio, so that users could absorb information in whichever way suited them best.
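As a rough illustration of this tap-only pattern, the sketch below shows how a kitchen object might respond to a single tap and read its instruction aloud in Unity. It is a minimal example under our own assumptions: the class and field names are hypothetical rather than taken from the Cog:LAB codebase, and it assumes a scene set up with an EventSystem and a raycaster so that taps reach the object.

```csharp
// Hypothetical sketch of a tap-only kitchen item; names are illustrative only.
// Assumes an EventSystem in the scene and a PhysicsRaycaster on the camera
// (plus a collider on this object) so pointer events reach it.
using UnityEngine;
using UnityEngine.EventSystems;

public class TappableKitchenItem : MonoBehaviour, IPointerClickHandler
{
    [SerializeField] private AudioSource instructionAudio; // spoken version of the on-screen instruction
    [SerializeField] private string itemName = "pan";

    // A single tap is the only gesture required - no swiping or dragging.
    public void OnPointerClick(PointerEventData eventData)
    {
        Debug.Log($"Patient tapped the {itemName}");

        if (instructionAudio != null)
        {
            instructionAudio.Play(); // read the relevant instruction aloud
        }
    }
}
```

Keeping every interaction to a single tap on a large target, with an audio version of each instruction, reflects the design choices described above.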
Unity for development
On receiving the initial brief from the client, we analysed the available software to identify the most suitable tools for development. Unity was chosen as an established, major application development toolkit, allowing for rapid development and support throughout the iterative cycles. Unity also supported the initial target platform and allows expansion to other mobile platforms as and when required in future development of the application. 3D models were created in Blender 3D and imported into the Unity engine. Within Unity we modelled the interactions and constraints around each of the cooking tasks, whilst measuring the key metrics required to produce ‘results’ for the patients. The prototype application was then deployed to individual devices, which were used to undertake initial trials for further feedback.
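To make the metrics side more concrete, the sketch below shows one way the timings and errors for a cooking task could be recorded inside Unity and exported as simple rows of ‘results’. The class, method names and CSV layout are our own illustrative assumptions, not the actual Cog:LAB implementation.

```csharp
// Illustrative metrics recorder for a cooking task; not the real Cog:LAB code.
// Steps are identified by name and timed from when they begin to when they complete.
using System.Collections.Generic;
using UnityEngine;

public class TaskMetricsRecorder : MonoBehaviour
{
    private readonly Dictionary<string, float> stepStartTimes = new Dictionary<string, float>();
    private readonly List<string> rows = new List<string>();
    private int errorCount;

    // Called when the patient starts a step, e.g. "put pan on hob".
    public void BeginStep(string stepName)
    {
        stepStartTimes[stepName] = Time.time;
    }

    // Called when the patient taps something out of sequence.
    public void RecordError(string stepName)
    {
        errorCount++;
        rows.Add($"{stepName},error,{Time.time:F2}");
    }

    // Called when the step is finished; records how long it took.
    public void CompleteStep(string stepName)
    {
        float startedAt;
        if (stepStartTimes.TryGetValue(stepName, out startedAt))
        {
            rows.Add($"{stepName},completed,{Time.time - startedAt:F2}");
        }
    }

    // Collapse everything into a simple CSV string for later analysis.
    public string ExportCsv()
    {
        return "step,outcome,value\n" + string.Join("\n", rows) + $"\nerrors,total,{errorCount}";
    }
}
```

A component like this could sit in each task scene and be fed events by the tappable items, keeping the measurement logic separate from the interaction logic.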
Technology
Throughout this project we used:
- Adobe XD - UI design
- Blender 3D - 3D modelling
- Unity - application development and modelling of interactions
What's next?
Following wider engagement with patients in a clinical setting, we have more detailed user feedback and are now planning a second round of development to create a normative database, before moving into further phases of deployment. We'll also be working towards achieving Class IIa medical device software approval for the application, in line with the new medical device regulations. Our dream is that Cog:LAB can be rolled out to support more effective and accurate assessment of brain injuries within clinics across the UK and globally - supporting the advancement of research and development within this very specialised field.