Editorials

Health informatics: a required skill for 21st century clinicians

BMJ 2018; 362 doi: https://doi.org/10.1136/bmj.k3043 (Published 12 July 2018) Cite this as: BMJ 2018;362:k3043
Douglas B Fridsma, president
American Medical Informatics Association, Bethesda, MD, USA
fridsma{at}amia.org

Literacy in informatics should be a formal requirement of all medical education

The world is estimated to produce more than 2.5 quintillion bytes of data every day (a quintillion is 1 followed by 18 zeros), and, by 2025, the volume of genomic data will likely surpass that of astronomy, YouTube, and Twitter combined.1

With the increase in health data, health professionals also have new kinds of technology to collect, analyse, and use that information. They have electronic health records to document care, clinical data warehouses to organise data around diseases or quality indicators, population health analytics to identify predictive characteristics for populations at risk of disease, and new technologies that use machine learning and artificial intelligence.

Information technology has changed the way that healthcare professionals practise. And, while many health professionals see the potential of these changes to improve the quality and cost effectiveness of healthcare, many are also frustrated.2 They struggle to adapt without knowing the underlying science of information behind these new tools.

Health informatics is the science of how we collect, analyse, and use health information to improve health and healthcare. But, despite its importance in 21st century medicine, this science is not routinely taught to health professionals. In hospital settings we often rely on health IT companies to teach health professionals how to operate their tools. This may teach clinicians where to click in the electronic health record to get through their day, but it doesn’t give them the depth of knowledge they need to optimise these tools to improve complex patient care. For example, during the Ebola outbreak in 2014, an infectious patient was inadvertently discharged from a hospital in the United States when important travel information was listed in a section of the electronic record where the clinician would not normally think to look. The healthcare team knew how to enter data but didn’t understand how the data were being used.3

To prevent harm to patients, clinicians need fundamental training in how to collect, analyse, and use health data, training that is not tied to a specific technology. Without that foundation, we face the educational equivalent of drug companies teaching medical students the mechanics of writing prescriptions for their products without the essential pathophysiology, pharmacology, and microbiology that make them safe and effective prescribers. We need to move beyond the basic mechanics of how to use information technology and teach healthcare providers the underlying science of health information.

In the US, the American Medical Informatics Association has led the establishment of clinical informatics as a specialty (now with nearly 1700 board certified professionals4) and, more recently, has supported the development of accreditation and certification standards for physicians and other healthcare professionals.5

A similar transformation is now underway in England with the Topol review, Preparing the Healthcare Workforce to Deliver the Digital Future. Echoing efforts in the US, the interim report, published on 28 June, suggests establishing the research foundation of health informatics (through targeted education in health data sciences and bioinformatics), appointing “clinical informatics translators” who can support leadership through chief clinical information officers and other clinical informatics professionals, and ensuring broad expertise in informatics across all healthcare professionals.6 The review proposes three guiding principles: that new technology should be supported by trustworthy evidence and ethical frameworks, should empower patients, and, “whenever possible,” should allow more time for patient care.

Both the US and England recognise that we need more than just a few highly trained professionals. Health informatics should be a fundamental skill, allowing every clinician7 to exploit technology to improve care, fully partner with patients, guide them to the best sources of information, and help them understand the underlying biases (and challenges) in the data that are widely available to them.

Literacy in informatics should be a formal requirement of all medical education, biomedical research, and public health training. Currently, there are few formal requirements for medical students to learn health informatics. But focused and concerted training in health informatics is essential if health professionals are to realise the full benefit of the data and tools that are already part of the practice of medicine and to help develop the new and improved tools of the future.

Footnotes

  • Competing interests: I have read and understood BMJ policy on declaration of interests and have no relevant interests to declare.

  • Provenance and peer review: Commissioned; not externally peer reviewed.

References