We've come such a long way in 70 years. Many of the major advances in that time can be attributed to a better understanding of hygiene and antibiotics, and to pharmaceuticals more generally. As advances in pharmaceuticals become more costly, digitally enabled health and wellbeing are likely to deliver the greater gains.
The history of analogue medical devices goes back hundreds, or even thousands, of years. For example, surgical knives are believed to date from Mesolithic times (around 8000 BC), syringes from the 1500s, and the first stethoscope from 1816.
There have been transformational developments in digital health technologies from the 1970s onwards. It may be hard to remember a time when there was no such thing as intensive care (as we now understand it), yet it emerged within living memory: critical care medicine, with its focus on continuous monitoring and intervention, was established in the late 1950s. Imaging is another area that has grown in significance beyond X-rays, largely since the 1970s, when computerised tomography (CT) and magnetic resonance imaging (MRI) were introduced. Computing is now fast enough that imaging can be used in real time during surgery, and interactive 3D images (built up from 2D slices) are becoming practicable.
These are part of another phase of rapid development, driven partly by the availability of consumer devices, including wearables, that are becoming accurate enough to substitute for professional equipment. Big data is a further driver: genomics is improving our understanding of the interrelationships between genes and their combined influence on health, while consumer genetic testing kits are making new health-relevant information available to individuals.
As the digital computer and the NHS reach their 70th birthdays, we are seeing huge advances in the technologies that address relatively simple problems, but much less progress on the technologies for complex ones. Go into any hospital and look at the complexity of the systems clinicians have to use: a general ward may have 20–30 different interactive technologies, all with different user interfaces, all of which every nurse is expected to be able to use. From a patient perspective, someone managing multiple health conditions has to integrate information across the different tools and specialisms they engage with. The friction is growing as what is theoretically possible pulls away from what is currently practicable.
What do the next 70 years promise? It is of course hard to say. A paperless NHS? Probably not by 2020, but maybe by 2088. Patient-controlled electronic health records? Maybe, if people are appropriately educated and supported in managing the burden of care; this will require us to address health inequalities brought about by differentials in income, education, technology literacy, health literacy, and so on. The huge challenge is not the technology but the individual and social factors, and the regulations, around it. Meeting it will require a new approach to data privacy and security, funding models and regulations that are fit for the 21st century, and education for clinicians, technologists and the public to ensure these changes are beneficial for all.
Of course, the NHS is just one healthcare delivery organisation amongst many globally. Some other health providers are doing things on a shoestring yet overtaking the West in many ways by being agile, for example by investing directly in mobile technology.
However, whatever advances we see in technology, care is still first and foremost about the human touch. The technology is there to support people.