Cognitive health care in 2027: Harnessing a data-driven approach in personalized health care


“Precision medicine,” or care that is highly personalized to each person’s genome, is likely to revolutionize the health care of the future. And cognitive technologies will play a pivotal role: handling enormous amounts of data, one of the imperatives of cognitive health care, requires much more than “artisanal” analytic capabilities.

Introduction

The next 10 years will likely see a revolution in the use of cognitive technologies for health care. Admittedly, the industry has not been a leader in the use of data and analytics. Disconnected systems, poor data quality, and difficult-to-change patient and provider behaviors have long complicated health care information, and “imprecision medicine” has generally been the rule. But there are clear signs of change in the $3 trillion US health care industry1 that we believe will come to fruition over the next decade. And from an information technology perspective, cognitive technologies are probably the only resource that can make that revolutionary vision of personalized health care possible.

More precision for providers, payers, and life sciences firms

“Precision medicine” is the shorthand term, adopted by the US government and many others, for care that is highly personalized to each individual’s genomic, behavioral, social, and environmental factors. For the federal government (specifically the National Institutes of Health and the Centers for Medicare & Medicaid Services), the primary focus is a data set of 1 million or more individuals whose information on all of these factors is captured and analyzed. Other precision medicine initiatives are underway for specific groups (such as the “Million Veterans” initiative for military veterans) and diseases (such as the Oncology Precision Network for cancer), and the United Kingdom has a “100,000 Genomes Project” to sequence that many patient genomes. These initiatives focus chiefly on health care providers, helping them develop the treatment approaches that are most effective for individual patients. Genomic-driven medicine has already had some initial successes in fields like cancer.2

There are similarly transformational efforts underway among payers and life sciences companies. Payers for health care, both governmental and private-sector, also have incentives to change their business models. Precision health care rewards value-based care rather than the volume of treatments, which should ultimately lead to better outcomes. And not only acute and chronic care but also disease prevention plans can become personalized and more effective. Some payers are already working on individualized care plans, and some have formed new business units to create and market them.

Precision medicine will likely be just as revolutionary for life sciences companies, which have already developed some drugs designed for specific genome profiles. Many of these companies are also analyzing genomic, metabolic, and clinical data to identify biomarkers that can facilitate early diagnosis of disease and indicate whether a particular drug will prove effective for a particular individual. One consortium of hospitals, researchers, and a startup, for example, is conducting “Project Survival” to identify effective biomarkers for pancreatic cancer.3 At other firms, real-world data sources are being used to identify molecules that might be particularly effective (or ineffective) in clinical trials. Another key focus is using machine learning to accelerate drug development processes and help predict the most promising molecules and compounds. Some AI-based startups are developing their own new drugs based on extensive analysis of clinical data; others are partnering with “big pharma” firms or university researchers.

What’s making all this possible?

As in other domains of data science, the availability of digital data is key to precision medicine. Each human genome, for example, contains about a gigabyte of data before compression; proteome and biome data would increase that amount dramatically. Over a million humans have had their genomes sequenced, and the number is growing rapidly as the price falls. Medical devices, mobile phones, activity trackers, and health sensors generate data continuously. Electronic medical records (EMRs) accumulate patient data over a lifetime. The availability of all these data makes it possible—albeit still difficult—to personalize diagnoses and treatment plans to the individual level, and necessitates new data aggregation, storage, and modeling approaches. Traditional “artisanal analytics” cannot power precision medicine effectively.
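As a rough sanity check on that gigabyte figure, here is a back-of-the-envelope sketch; the numbers are round approximations, not a precise bioinformatics calculation:

```python
# Back-of-the-envelope estimate of uncompressed genome size (round numbers).
base_pairs = 3.2e9        # approximate length of a human genome
bits_per_base = 2         # four nucleotides (A, C, G, T) -> 2 bits each

bytes_total = base_pairs * bits_per_base / 8
print(f"~{bytes_total / 1e9:.1f} GB uncompressed")  # ~0.8 GB

# Raw sequencer output (reads plus quality scores) can run to tens or
# hundreds of gigabytes per genome before processing.
```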

Machine learning is one of the most common techniques for dealing with large volumes of rapidly changing data. It encompasses a variety of statistical algorithms, can involve large numbers of highly granular models, and can quickly generate new models as new data arrive. It can be used to predict (disease onset, for example), to detect patterns in data (a drug’s effects on populations or individuals, for example), or to classify (patient subpopulations, for example). Machine learning can also be used for the prosaic but important task of combining data across disparate sources. To create a “Patient 360” view, for instance, a “predictive matching” approach can aggregate records for the same patient, stored under slightly different names and addresses, that are federated across multiple databases.
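To make the record-linkage idea concrete, below is a minimal, self-contained sketch of fuzzy patient matching. The field names, weights, and threshold are illustrative assumptions; production systems typically use probabilistic matching models trained on labeled record pairs rather than simple string similarity.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Normalized string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def match_score(rec_a: dict, rec_b: dict) -> float:
    """Weighted similarity across the identifying fields of two patient records."""
    weights = {"name": 0.5, "address": 0.3, "birth_date": 0.2}  # illustrative weights
    return sum(w * similarity(rec_a[f], rec_b[f]) for f, w in weights.items())

# The same patient, recorded slightly differently in two source systems.
emr_record = {"name": "Katherine O'Neil",
              "address": "12 Elm St, Boston MA",
              "birth_date": "1984-03-07"}
claims_record = {"name": "Kathryn ONeil",
                 "address": "12 Elm Street, Boston, MA",
                 "birth_date": "1984-03-07"}

MATCH_THRESHOLD = 0.85  # in practice, tuned on labeled match/non-match pairs
score = match_score(emr_record, claims_record)
print(f"score {score:.2f}: {'same patient' if score >= MATCH_THRESHOLD else 'no match'}")
```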

Even while generating complex models, some machine learning approaches allow for a degree of transparency: information on why the model is suggesting a certain course of action. This is often important in prescriptive or diagnostic models in health care, since patients and physicians are unlikely to accept “black box” recommendations. Less transparent forms of machine learning, such as neural networks and deep learning, also have a role in cognitive health care. Deep learning-based image recognition, for example, is being used to identify potentially cancerous lesions in medical images and to identify abnormalities and pathologies in cells and tissues.4 Because such identification usually serves as a first-level screening, the interpretability of the outcomes is typically less of an issue than in other areas of medicine.
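As a simple illustration of what that transparency can look like, the sketch below fits a logistic regression to synthetic patient data; each coefficient directly shows how a risk factor pushes the prediction up or down. The features and data are invented for illustration, and scikit-learn is assumed to be available.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500

# Synthetic patient features (purely illustrative): age, BMI, smoker flag.
feature_names = ["age", "bmi", "smoker"]
X = np.column_stack([
    rng.normal(55, 12, n),    # age in years
    rng.normal(27, 4, n),     # body mass index
    rng.integers(0, 2, n),    # smoker (0/1)
])

# Toy outcome: risk rises with age and smoking.
logits = 0.06 * (X[:, 0] - 55) + 1.2 * X[:, 2] - 0.5
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logits))

model = LogisticRegression(max_iter=1000).fit(X, y)

# The "explanation": one signed coefficient per risk factor.
for name, coef in zip(feature_names, model.coef_[0]):
    print(f"{name:>7}: {coef:+.3f}")
```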

Among payers, there is growing interest in technologies that address member and patient engagement, most of which employ some form of natural language processing (NLP). NLP-based “chatbots,” or intelligent virtual agents, can answer common patient questions, issue reminders about treatments and appointments, and capture subjective patient conditions. Among some providers, NLP is also being used to extract meaning from unstructured text, such as clinical notes or research articles.
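Below is a deliberately simple, rule-based sketch of extracting structured facts from a free-text clinical note; the vocabularies are tiny and invented. Real clinical NLP systems map text to medical ontologies (such as SNOMED CT or RxNorm) using trained models rather than hand-written keyword lists.

```python
import re

# Tiny illustrative vocabularies, not a real clinical terminology.
MEDICATIONS = {"metformin", "lisinopril", "atorvastatin"}
CONDITIONS = {"type 2 diabetes", "hypertension", "hyperlipidemia"}

def extract_entities(note: str) -> dict:
    """Pull known medications and conditions out of a free-text note."""
    text = note.lower()

    def found(term: str) -> bool:
        return re.search(rf"\b{re.escape(term)}\b", text) is not None

    return {
        "medications": sorted(m for m in MEDICATIONS if found(m)),
        "conditions": sorted(c for c in CONDITIONS if found(c)),
    }

note = ("Patient with type 2 diabetes and hypertension; continued on "
        "metformin 500 mg BID and started lisinopril 10 mg daily.")
print(extract_entities(note))
# {'medications': ['lisinopril', 'metformin'],
#  'conditions': ['hypertension', 'type 2 diabetes']}
```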

What’s standing in the way?

While the technical feats we have discussed are possible today, at least as pilots or proofs of concept, they are not easy to execute at scale, and deploying them in mission-critical care settings will likely be more difficult still. First, cognitive technologies are still evolving, and they pose real technical challenges: some organizations have found it very difficult, for example, to use them for ambitious medical objectives like predicting cancer or creating customized treatment pathways. Still, we expect many of these technical barriers to be sufficiently addressed within a decade.

Second, a substantial amount of integration with existing systems is required, particularly in health care provider organizations. Cognitive solutions will need to be embedded into providers’ EMRs and revenue cycle systems, and those systems themselves will have to evolve and become more open to accommodate new cognitive functionality effectively.

Another long-term challenge for the life sciences and health care industry is collaboration and the integration of data. There is no consensus within the industry about who owns the data about a patient, for example, or about who can do what with them. The industry should consider more collaborations and partnerships across sectors and organizations, and to some degree this is already happening. Project Survival, the effort to find a pancreatic cancer biomarker mentioned above, involves collaboration among a big data drug development startup (Berg Health), an academic medical center (Beth Israel Deaconess in Boston), a nonprofit (Cancer Research and Biostatistics), and a network of oncology clinicians and researchers (the Pancreatic Research Team). It seems likely that this form of cancer will eventually succumb to such a collaborative effort.

Many health care organizations will also likely face the same talent shortages with cognitive technologies that organizations in other industries encounter. There is a general shortage of the skills needed to engineer cognitive solutions: costs are falling for software and hardware, but experienced software developers and system architects remain expensive and often difficult to hire. The problem is particularly pronounced among providers, many of which lack the resources to attract top data science talent. In addition to the talent that develops cognitive applications, we’ll also need a generation of clinicians who are comfortable with smart machines augmenting their knowledge.

All of these challenges are surmountable. Over the next decade, universities will churn out more talented people, more collaborations will be created, and cognitive software is expected to become more capable and easier to use. We see a bright future for leaders and organizations wishing to harness a data-driven approach to help treat and cure disease in human populations. It will be an exciting and constructive time to be in the field of cognitive health care.