In the developed world the really scary diagnoses are uncommon, and more often than not a patient's symptoms can be safely interpreted as benign. This breeds a false sense of security: there is evidence that doctors fail to recognise the presentations of some nasty diseases.
Innovators in medicine have focused on this problem for some time. For example, a research team reported in 2009 that skin cancer was much easier to diagnose with the aid of a handheld device that draws attention to cancerous changes. The problem, however, was that doctors needed to attend a course on dermatology and pass an exam before it was safe to let them loose on patients with the instrument. The published report was upbeat despite the fact that one in three doctors failed to complete the required training. The outcome of this research (and of many other research programs funded with millions in taxpayer dollars) was an academic paper that will never influence the early diagnosis of the disease.
Less than five years later, some of the same team were back to test a simpler device, but with a similar requirement that doctors be educated before it could be deployed successfully. The negative results were hardly surprising. The team concluded that cancer was more likely to be diagnosed early if doctors followed guidelines.
History has taught us that just because an intervention may benefit patients does not mean it will be embraced by overburdened care providers trying to earn a living. The most successful innovators understand the need to tailor interventions to the needs of both the health professional and her patient. They realise that tools that are inconvenient or cumbersome are doomed to novelty status.
Committees that determine which ideas are worthy of funding often ignore the lessons of agile, intuitive, creative and effective innovations: such tools are more likely to be reliable, can be developed relatively cheaply and don't need an instruction manual.
How hard is it to adopt your innovative ideas in practice?