Dr Jim Roberts, independent advisor in healthcare innovation for PDD, focuses on the human aspect of medical technology development.
Digitisation has played a pivotal role in improving communications and efficiency across much of the healthcare industry and is set to be even more widely adopted going forward. Use of the NHS app during the pandemic helped accelerate the digitisation of the UK’s health and social care system. In addition, in 2022 the government announced that it would set aside £2 billion of funding to support the rollout of electronic patient records across all NHS trusts. This is part of a larger digital transformation strategy that will serve as the linchpin for all healthcare reforms.
The objective is to streamline working practices in ways not seen since the start of the NHS 75 years ago. It will, however, create an organisation almost completely reliant on digitisation and the systems brought in to deliver it. Therefore, it is incredibly important that those systems are designed and optimised with the day-to-day running of patient care front and centre. Otherwise, there’s a real risk of the end product actually slowing down or complicating treatment and services.
How can we make sure that the technologies being developed to help digitise our healthcare are grounded in the needs of patients and healthcare providers at a time when they need them most?
Real-world implications
When you look at how medical technology is designed and how it finds its way into hospitals and care settings, you will find a process that is often fragmented. New devices are procured based on their cost and feature sets, but once they are used in real-world clinical contexts, flaws can emerge. This stems from the disconnect between the intended use the technology was developed for and the realities of those contexts. That disconnect can delay the adoption of advanced technologies in healthcare, hindering efforts to provide the best care and negatively affecting patient outcomes, healthcare systems and industry.
Inevitably, when you start using a new product or system, unintended consequences can arise which even the most appropriate commissioning and procurement processes can’t always predict.
Electronic Patient Records (EPRs) are a good example. They provide invaluable information about a patient and their medical history as they transition through a hospital care pathway without the need for physical paperwork. They can help limit miscommunication, and in some cases provide essential safety guardrails to stop over-administration of therapies. However, some healthcare providers with limited software skills find these systems daunting, and often must learn to use different EPRs going from one NHS trust to another – a burden on their already hectic working lives.
Another example is virtual wards – digital spaces where patients can register ahead of surgery and get personalised advice on what to do, and what not to do, ahead of an operation. They have proven their worth by reducing last-minute surgery cancellations (and their devastating impact on waiting lists and patient care). Yet, to use a virtual ward, you need to be digitally literate and able to engage with a virtual environment. Not every patient has the capacity, knowledge, or access to do that.
Even when the positive impact of these newly adopted technologies is apparent, as healthcare innovators, we must strive to go further, putting patients and healthcare providers right at the centre of the innovation process.
Healthcare innovation often exists within complex systems, where multiple user groups and stakeholders have different needs and expectations of what a device or digital system can achieve and what it should do for them. With new digital technologies, we need to consider the needs of each distinct user group, liaising directly with them to remove any incorrect assumptions that might exist. What’s more, we need to fully understand the real-world setting of how a device is used to deliver value to patients and healthcare providers, keeping users and human-centred design practices in mind from the beginning.
What’s the solution?
When it comes to the digitisation of healthcare, we need a paradigm shift: from a model where medtech companies sell ‘oven-ready’ solutions to healthcare systems, to a model that fosters better collaboration and partnership between the two parties. One where stages of discovery, conceptual design, user testing, monitoring, maintenance, and iteration are an intrinsic part of the process, involving the very patients and healthcare providers expected to use the solution in their day-to-day lives and jobs.
Taking time to do research and have conversations at the start of any digitisation design or redesign effort ensures that all actions are based on real insights and that any changes made will truly be for the better. From here, as the development process moves forward, teams can start filtering ideas in line with commercial and business objectives. This is not a linear process but a cyclical one. When all stakeholders work in close collaboration, it is easier to ensure that solutions not only respond to user needs but are also viable, technically feasible and grounded in reality.
So, in the case of EPRs, building software systems from the ground up and engaging users throughout the development will increase the likelihood of success and adoption. Making EPRs a nationwide standard will greatly help healthcare providers, who will only need to learn one system. Finally, if training is treated as an equally important element of the system to be designed in parallel (and not just bolted on at the end), then a more successful and inclusive end product is likely to emerge.
New doesn’t always mean better
As we acknowledge the value of digitisation in healthcare, it is worth noting that adding new technology components to an existing product or system can result in additional costs, complexity, and an increased potential for use errors. We must also remember that new technology features typically take the shape of an additional physical product that people must store, carry, and remember to have with them. Consider the difficulties that this might present for users.
We also need to recognise that the value of technology in the earlier stages of healthcare product development can be difficult to assess. As user research and testing progress, it is not uncommon for features that initially seemed attractive to prove less relevant or helpful. Therefore, however enticing connected technologies might be, we must stay grounded in real user needs and be mindful not to overestimate their benefits. At the same time, we must consider the wider context of use within our healthcare systems to ensure that any critical requirements in terms of data types or communication pathways are addressed early in the process.
Moving forward
The successful adoption of technology is rarely driven by what that technology can do, but by how people interact with it and perceive its benefits. This is particularly true in healthcare. Trust and perception are critical in ways that can be hard to make explicit. Age, background, physical and cognitive abilities, and socio-cultural influences will significantly affect a patient’s willingness to interact with a new device.
As connected technologies continue to permeate healthcare environments, and patients continue to take a more active role in the decisions around their health, keeping users at the centre of medical device development is more imperative than ever. Only then will we be able to develop tech-enabled products that are appealing, improve quality of care and maintain the integrity, safety, and effectiveness of our healthcare systems.