On March 4th, I attended the Digital Health Rewired conference in London on behalf of Digital Health Today. The attendees and presenters took every opportunity to highlight technologies and techniques for improving the efficiency of healthcare delivery globally. However, most sessions focused on the UK’s National Health Service (NHS).
Here are the three most important themes that I took away from the day.
Coronavirus Creates an Opportunity to Accelerate Digital Transformation.
In 2018, the NHS had an 8.5% vacancy rate (100,500 vacancies, including doctors and nurses), the highest of any UK public sector service. With staff stretched this thin, many UK healthcare providers lacked the capacity to design, develop and deploy digital transformation projects. However, there is now an unprecedented level of citizen interaction with the NHS through digital channels, according to Sarah Wilkinson, CEO of NHS Digital. She highlighted that the Coronavirus epidemic has created a 'must act' environment, emboldening healthcare innovators and forcing teams to make decisions that never made the agenda before.
As an example of the growth in digital engagement, she shared that on the previous Friday (February 28th, 2020), NHS Digital stress-tested the 111 call centres and verified 12x peak load capacity, which was deemed sufficient. The following Monday, the BBC highlighted the COVID-19 pathways deployed in 111, and the call centres experienced 19x peak load.
The digital innovator community makes big promises about the potential for technology to make care delivery more effective and efficient. Now we have an opportunity to double down, put our chips on the table and deliver impact for patients.
Today’s clinical impact from digital transformation is coming from more straightforward solutions than you might think.
There were several presentations at the event about blockchain, artificial intelligence and other emerging technologies which promise to transform the industry. However, these sessions were forward-looking; the speakers who presented real-world case studies drew on more straightforward digital health innovations.
For example, Matt Neligan, Director of Primary Care Commissioning Transformation for NHS England, shared a case study from the NHS Whitley Medical Centre in Reading. The practice suffered from an overworked provider staff and was unable to fill two open GP positions. They knew something had to change because they didn't have enough time to manage their complex cases properly. By deploying digital patient messaging, appointment scheduling and electronic consultation (GP at Hand) services, they decreased face-to-face consultation volume by 43% while maintaining 87% patient satisfaction.
It is tempting to get excited by the 'next big thing'. However, UK healthcare innovators should remember that NHS adoption of digital technologies happens in a decentralized manner. Sometimes the most significant clinical impact can come from shipping the basics.
No one can keep up with the torrent of clinical knowledge coming from research papers released every day. Including computers. That’s a problem.
Information, the commodity that reduces uncertainty, is crucial to fueling the digital transformation of healthcare.
However, information doesn't equal knowledge. While tremendous resources go into researching and publishing clinically relevant findings, mobilizing that knowledge and applying it to improve patient outcomes is a monumental task.
Jeremy Wyatt, Emeritus Professor of Digital Healthcare at the University of Southampton, gave a fascinating presentation about publishing research in understandable language: not for humans, but for machines.
Amazon, Google, Microsoft, and other technology giants are working to apply natural language processing (NLP) to extract clinical knowledge from the body of research published online. However, these approaches depend on the effectiveness of the algorithms. Because the algorithms differ across cloud computing providers, the clinical knowledge extracted could vary depending on which NLP system processed the report.
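To make the variability concrete, here is a deliberately toy sketch: two different extraction heuristics run over the same (invented) abstract sentence and surface different "knowledge". The abstract text, patterns, and vocabulary are all hypothetical illustrations, not any vendor's actual NLP pipeline.

```python
import re

# Hypothetical abstract sentence, invented for illustration only.
abstract = ("Metformin reduced HbA1c by 0.9% versus placebo "
            "in adults with type 2 diabetes.")

def extract_a(text):
    """Extractor A: a naive pattern that looks for drug-effect triples."""
    m = re.search(r"(\w+) reduced (\w+) by ([\d.]+%)", text)
    if not m:
        return {}
    return {"drug": m.group(1), "outcome": m.group(2), "effect": m.group(3)}

# Extractor B: simple keyword spotting against a tiny, assumed vocabulary.
VOCAB = {"metformin", "placebo", "hba1c"}

def extract_b(text):
    """Extractor B: returns only a flat list of recognized entities."""
    tokens = re.findall(r"[\w%.]+", text.lower())
    return {"entities": sorted(t for t in tokens if t in VOCAB)}

print(extract_a(abstract))  # a structured effect triple
print(extract_b(abstract))  # a flat entity list, with no effect size at all
```

Both extractors are "correct" on their own terms, yet a downstream system fed Extractor B's output would never learn the effect size. That, in miniature, is the inconsistency risk the speakers raised.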
In contrast, researchers could publish two versions of their studies: one for human reading and the other encoded in a language designed for computers to index and analyze. In this paradigm, downstream NLP would not be necessary.
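As a rough illustration of what the machine-readable version might look like, here is a hypothetical structured encoding of a single study finding. The field names, coding choices, and values below are assumptions invented for this sketch; they do not reflect any standard Professor Wyatt proposed.

```python
import json

# Hypothetical machine-readable companion to a human-readable paper.
# Every field name and value here is an illustrative assumption.
finding = {
    "study_id": "example-trial-001",  # invented identifier
    "population": "adults with type 2 diabetes",
    "intervention": "digital coaching app",
    "comparator": "usual care",
    "outcome": {
        "measure": "HbA1c reduction",
        "effect": -0.4,
        "unit": "percentage points",
        "confidence_interval_95": [-0.6, -0.2],
    },
}

# Serialized alongside the paper, this could be indexed directly,
# with no NLP guesswork about what the authors meant.
machine_readable = json.dumps(finding, indent=2)
print(machine_readable)
```

The hard part, as the talk acknowledged, is not the serialization format but agreeing on the taxonomy behind fields like "measure" and "population".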
Unfortunately, there are challenges on the horizon. Publishing two versions of reports would be more expensive, and it is unclear who would pay. Also, the taxonomy researchers use to classify their findings would have a significant impact on the utility of computer-processed knowledge. How the research community would make these decisions was not shared in detail. It seems that finding a method that is normalized enough to be useful yet sufficiently flexible for the diversity of use cases will be a non-trivial endeavor.