Generating Knowledge from Best Care: Advancing the Continuously Learning Health System

By Edward Abraham, Carlos Blanco, Celeste Castillo Lee, Jennifer B. Christian, Nancy Kass, Eric B. Larson, Madhu Mazumdar, Stephanie Morain, Katherine M. Newton, Alexander Ommaya, Bray Patrick-Lake, Richard Platt, John Steiner, Maryan Zirkle, Marianne Hamilton Lopez
September 6, 2016 | Discussion Paper

In this Perspective, the authors aim to facilitate the growth of learning health environments by highlighting strategies and examples of operational and research collaborations within delivery system settings in the United States. Informed by empirical data, conceptual literature, and authors’ experiences, the paper explores barriers to successful research and operational collaborations within the current health care context, as well as strategies that various health systems have used to move further toward the goal of continuous learning.

Specifically, the authors highlight activities by the National Academy of Medicine (NAM) (1) and the Patient-Centered Outcomes Research Institute (PCORI) to engage various stakeholders, specifically health executives, in the evidence-generation process; present an overview of collaborative activities that have resulted from these initiatives, including the findings from two research projects designed explicitly to explore the priorities, decisions, and analytic needs facing health executives at the delivery-system level as they consider further transitioning to a learning health system; and finally offer a series of priorities for action (see Box 1), with corresponding case examples, to demonstrate strategies for generating knowledge from care. While all types of research are essential to continuous learning, this Perspective focuses on those evidence-generating activities that are designed explicitly to be conducted within the care delivery system, embedded within the practice of routine care, and whose findings are used to improve the clinical enterprise.



It is the authors’ intention that this paper will add to the emerging body of knowledge around learning health systems while also serving as an actionable blueprint for institutions interested in transforming to a continuously learning system.


An Opportunity for Collaboration

In 2014, operational leaders in Kaiser Permanente (KP) Colorado resolved to reduce missed appointments in their primary care clinics. While these leaders considered simply initiating telephone and text-message reminders throughout the delivery system, KP’s commitment to evidence-based practice led them to collaborate with researchers from their organization to implement a strategy that could simultaneously meet operational need, improve quality, and lead to generalizable knowledge. They initially randomized patients to receive, or not to receive, telephone and text-message reminders at one of their 27 primary care clinics and then replicated the randomized intervention at a second clinic. In both the pilot and replication sites, rates of missed appointments declined significantly, while appointment cancellations increased. The intervention increased appointment access by 17-38 slots per week at each site. Based on this evidence, the intervention was then disseminated to all primary care clinics in the system. Perhaps most telling, the intervention fostered an ongoing collaboration between operational leaders and researchers to further refine the reminder process and to better understand the characteristics of members missing appointments (Steiner et al., 2016). This example of evidence generation and dissemination—as well as the additional examples provided throughout this paper—is illustrative because it is relevant to the priorities of a delivery system, engages both operational and research leaders, and exemplifies the possibilities of a continuously learning health system.

In a continuously learning health system, as defined by the NAM Learning Health System Series, “science, informatics, incentives, and culture are aligned for continuous improvement and innovation, with best practices seamlessly embedded in the delivery process and new knowledge captured as an integral by-product of the delivery experience” (italics added) (2). This vision calls for mutually productive, meaningful partnerships between researchers and those responsible for delivering health services.

To transform into a learning health system, the authors believe that the knowledge used by delivery systems must be generated by a research enterprise that is fully integrated, with researchers embedded in the system itself, and operations and clinical professionals helping to identify key questions. Additionally, a learning health system will advance by facilitating a variety of embedded learning activities (especially pragmatic clinical trials and quasi-experimental intervention studies) (3, 4) within delivery systems. These embedded learning activities require collaboration between various stakeholders at all stages of the knowledge generation cycle, from identification of the problem to devising and testing solutions and then finding ways to implement and refine them. It is also the authors’ viewpoint that, while there are examples of successful research and operational alignment and integration—many of which are highlighted below—widespread progress toward these goals continues to face numerous barriers.


Current Health Care Context

Overcoming barriers requires an understanding of the current health care context. In particular, health delivery systems are operating within an environment that increasingly demands continuous improvements in the quality, outcomes, and systems of care under increasingly tight margins. Enormous amounts of data are generated to meet these requirements. Data are generated as well for other ongoing activities within health systems such as administration, care delivery, care improvement, performance transparency, registries, patient-powered research networks and online forums, and consumer genomic databases. Yet, the vast majority of the data are not captured for broader systematic (i.e., research) learning, creating a missed opportunity. As demonstrated by the examples highlighted throughout this paper, enhanced linkage of the research enterprise to care delivery, and of care and operational challenges to research questions, can increase the relevance of research questions and findings, accelerate knowledge transfer, and lead to care and outcome improvement. For example, as demonstrated by Harvard Pilgrim Health Care (see Box 2), there is great value in understanding what type of care works or does not work, for whom, and under which circumstances (Greene et al., 2012).



Despite the promise of embedded learning activities, barriers to operational and research collaboration remain. A dearth of research studies is not the problem. Even with the thousands of randomized clinical trials (RCTs) and clinical studies published yearly, a lack of evidence exists to inform clinical decisions made by providers and patients (Bracken et al., 2014; Patsopoulos, 2011), clinical practice guidelines are often based on low-quality evidence (Level C: 48% of ACC/AHA guidelines in one study) (Tricoci et al., 2009), and translation of research into practice continues to take far too long (Slote et al., 2011).


Barriers to Continuous Learning

The path toward continuous learning faces numerous barriers, in part because clinical and research systems have been developed and maintained separately from one another in personnel, agenda setting, data infrastructure, and funding.


Lack of Generalizability

Research studies often do not reliably generate evidence that is generalizable for clinical decision making. The use of strict inclusion and exclusion criteria restricts a broader clinical population from participating in studies. In 2003, Masoudi and colleagues demonstrated that only a minority (13% to 25%) of persons with heart failure in clinical practice would qualify for enrollment in clinical trials (Masoudi et al., 2003). Similarly, studies demonstrate that a majority of persons with asthma would be ineligible for selected randomized clinical trials (Travers et al., 2007; Herland et al., 2005). Most individuals with psychiatric disorders would not qualify for clinical trials of medications or psychotherapy (Blanco et al., 2008; Okuda et al., 2010; Hoertel et al., 2012). The Food and Drug Administration (FDA) has also produced a white paper reporting that patients with serious but common conditions, such as psychiatric and cardiac diseases, were frequently excluded from evaluations of new medicines (FDA, 2011). Although there are ongoing efforts to develop statistical methods to validly extrapolate results from clinical trials to general populations of individuals with the target disorder, much work remains to be done in this area (Cole et al., 2010; Blanco et al., 2016). As such, even when lengthy clinical trials are completed, clinicians may not feel that study results can be applied to their own, often more complex, patient populations.


Limited Relevance

Current research findings are not consistently relevant to the broader population, business strategies, and immediate needs facing decision makers at the delivery system level. Within the current health context, investigators and funding agencies often identify research priorities on their own, rather than working with health executives, payers, clinicians, and patients to define questions or to provide input for the design or conduct of research studies. Front-line clinicians generally are not trained to engage in evidence-generation activities, nor do they have the time, incentives, or funding to do so, and they are typically overlooked during development of research priorities. Furthermore, the outcomes and questions being assessed are not the most pressing clinical questions for practicing clinicians. For example, a systematic review evaluating treatments for depression concluded that few studies provided data on the long-term effectiveness of treatment, the functional status of patients, or the outcomes of patients treated in typical practice settings (Mulrow et al., 1999). The absence of longer-term follow-up and of typical practice settings is not unique to depression treatment; it is more the rule than the exception, yet these are precisely the questions that clinicians and patients most want answered.

Investigators, clinicians, operational leaders, and health executives have few incentives to collaborate on research priorities or enhance the applicability of results. Investigators are encouraged to seek grants and publish findings but are not evaluated on the impact of their findings in changing practice. Furthermore, promotion committees are not designed or inclined to reward either impact or collaboration, both of which can potentially slow down research completion. Operational leaders and health executives may see most research studies as expensive and interfering with current operations and patient care, while offering uncertain and distant knowledge that may not benefit them directly.


Inconsistent Taxonomy

While all types of research are crucial to continuous learning, this Perspective specifically focuses on activities that can be conducted within the care delivery system, by design both embedded within the practice of routine care and intended to influence the clinical enterprise. Yet, the authors recognize that even the term “research” covers many kinds of activities and means different things to different stakeholders. For instance, research to health system leaders could mean “narrowly focused operations research that addresses an immediate question and may or may not generalize outside my institution,” while funding leaders might define research as “implementing protocols to test hypotheses that may or may not lead to changes in clinical care after several more years of investigation.” When working toward collaboration, there is a risk of talking past one another.


Competing Priorities

Health systems are operating within tight margins and desperately need to identify means of generating revenue and controlling costs. Most have been organized on a fee-for-service basis, meaning that additional tests and procedures contribute positively to their financial bottom line. Better care is not always cheaper or reimbursable, and what is most lucrative is not always best care. Under traditional models of research, the knowledge generated from studies may be important to health but may go unimplemented because of competing incentives. Health system chief executive officers (CEOs) face an abundance of competing demands and increasing volumes of data. Amidst these demands, CEOs may not see or understand how research could provide a return on investment, especially when research competes with what many health system CEOs experience as unlimited demands for funding. The funding provided by a grant may be a fraction of the revenue that could be generated through clinical care, and the salaries allowable under grants may be considerably smaller than those of the participating clinicians, requiring the institution to subsidize, in one way or another, a portion of those clinicians' salaries.

Additionally, within the current environment, funders do not routinely fund researchers to move to the implementation stage. When making funding decisions, funders do not necessarily ask for a plan for who would authorize the integration of findings if appropriate and whether those stakeholders are part of the project team.


NAM Partnerships to Advance Continuous Learning

To respond to the ever-evolving health environment, and to further explore the potential of operational and research collaboration, the NAM partnered with PCORI in 2014 to develop a two-session workshop titled “Health System Leaders Working Towards High-Value Care Through Integration of Care and Research” (IOM, 2015). The goal of the project was to engage health system leaders as essential partners in the development of research that aligns with the pace and priorities of health delivery centers and systems. The first workshop convened health care system leaders and researchers to discuss activities and opportunities. The second workshop convened health system CEOs considering strategic priorities and approaches to implementation. Foundational to the workshop was the idea that continuous and seamless assessment of the effectiveness and efficiency of care is basic to a continuously learning health care system, and constant integration of what was learned is central to a continuously improving one.

The 2014 workshop resulted in multiple outcomes:

  • Two independent research initiatives (described in Boxes 3 and 4);
  • The development of the NAM Executive Leadership Network;
  • A follow-up 2016 NAM/PCORI meeting that engaged additional health leaders, policy leaders, and researchers interested in practice-embedded research; and
  • A new PCORI funding opportunity through its National Patient-Centered Clinical Research Network (PCORnet), which brings together health systems from across the country into Clinical Data Research Networks (CDRNs) and offers a standard way of organizing and aggregating data to facilitate multisite research.


See details of each initiative below. Each of these activities highlights an increasing interest and willingness of health executives to engage in the evidence-generation process. The authors believe that achieving a continuously learning health system depends on a well-informed and equipped cadre of stakeholders, including health care system leaders, who have patient-centered values and motivation, aligned interests, and levers for making progress, and can benefit directly from a better synergy of effort through collaboration.


Research Studies

A survey of health care executives

In consultation with the NAM and a small group of PCORnet awardees and stakeholders, the Group Health Research Institute (GHRI) conducted a survey between the 2014 NAM workshops to “explore the benefits and challenges of integrating research into practice from the perspective of C-suite leaders . . . and from the perspective of persons engaged in developing Clinical Data Research Networks (CDRNs) and others attending the first NAM workshop” (Johnson et al., 2014). Key findings from the survey are presented below (see Box 3).



Interviews with Health Systems

At the conclusion of the 2014 workshops, researchers from the Johns Hopkins Berman Institute of Bioethics, in consultation with the NAM, identified and interviewed leaders from U.S. health care institutions engaged in a transition toward learning health care. The goals of this project were to explore how institutional leaders in “learning health care” transformed their health care systems or institutions to proceed further along the path to becoming learning health care systems, and then to document the nature and extent of ethical and regulatory challenges they faced in this transformation (Morain and Kass, 2016). Key findings from the interviews became the impetus for the development of this NAM Perspective and are presented below (see Box 4) and discussed throughout the remainder of this piece.



Executive Leadership Network for a Continuously Learning Health System

The NAM Leadership Consortium for a Value & Science-Driven Health System developed the Executive Leadership Network (ELN) for a Continuously Learning Health System to facilitate the cooperative engagement, communication, and leadership of health system executives to accelerate individual, organizational, and system-wide capacity and progress for health care that continuously learns and improves. Network participants advise on the issues, strategies, and returns from continuous learning capacities that simultaneously support operational decision making, performance improvement efforts, and the generation of better evidence—including rapid-cycle learning efforts, patient data sharing across entities, focus and harmonization of measurement activities, ethical framework and IRB streamlining for evidence development from the care experience, and improving the implementation of reliable clinical decision support tools.


2016 Accelerating Clinical Knowledge Generation and Use Meeting

In a January 2016 meeting, Accelerating Clinical Knowledge Generation and Use (5), the NAM and PCORI convened health system leaders and researchers to explore strategies to improve linkages and synergy among health care organizations with related interests and to consider questions and demonstration projects that PCORI might support to advance the field. Participants worked to identify compelling questions on system performance, measurement, and operations that might be answered from structured and routinely captured care delivery data, and provided input on the priority of the questions and standardized data needed; explored communication strategies to help improve linkages, synergy, access to information, and progress among health care organizations with related interests; characterized core clinical data elements and system characteristics necessary to generate usable knowledge in real time, including use of PCORI’s common data model; and considered demonstration projects that PCORI might support, along with approaches for expanding strategic priorities.


The Health Systems PCORnet Demonstration Project

Another key initiative was the development of the PCORI Health Systems PCORnet Demonstration Project, which is designed to “provide the CDRNs participating in PCORnet with an opportunity to test their capacity to conduct collaborative research with health systems leaders across the network” (6). According to its funding announcement, the Health Systems PCORnet Demonstration Project will have the following objectives and guidelines:

  • The project will be of interest and add value to multiple health systems.
  • The project must include at least two health systems that span two or more CDRNs.
  • The project will leverage PCORnet data resources.
  • Topics will be rated as priorities by the CEOs and systems leaders, and their input will be included in the PCORI Funding Announcement responses.
  • Proposed projects will be identified, vetted, and involve iterative review and discussion among researchers, clinicians, and health systems leaders.
  • Projects may be comparative, may be descriptive, or may evaluate the utility of new data sources for addressing specific questions of health systems leaders.
  • The project will pay particular attention to the needs of a learning health care system to optimize decisions made by health systems, care providers, and patients and their families.
  • The project may involve linkage to administrative claims data.
  • The project will evaluate the impact of the research on PCORnet’s capacity to support collaborative research and extend the breadth of PCORnet’s shared tools and resources using PCORnet infrastructure.


In Step 1 of the Health Systems PCORnet Demonstration Project, which occurred between June 2015 and January 2016, each participating CDRN implemented stakeholder engagement processes within their networks to prioritize a set of health system–oriented research topics to inform the Step 2 funding announcement described above. By aligning with PCORnet researchers and utilizing the PCORnet data infrastructure, these projects are designed “to identify and prioritize a set of data-driven research activities of high interest to health systems and clinicians.”


Priorities for Action

To enable the engagement of well-informed and equipped stakeholders and successfully transition to a learning health system, the authors propose a cultural shift in the practice of research and operations, in which research priorities also come from clinicians, evidence-generation activities are embedded, collaboration is rewarded, funding agencies support research whose results can be readily applied in routine clinical settings, evidence-based clinical and operational improvements are reimbursed by payers, and research findings inform clinical decisions and improve health outcomes. Reaching these goals requires a concentrated effort from leadership at the national funding and policy-making levels, from multistakeholder teams in delivery systems, and from those designing and providing education at our nation’s medical and research institutions. It also requires that health systems come to view continuous learning as helpful to their mission and bottom line.

The authors have identified a number of key priorities for action, with the belief that they will foster the growth of learning health care environments, including

  • Emphasizing the bidirectional relationship between health operations and research: research that is responsive to key clinical and operational challenges is more relevant, and those able to consider how findings can be incorporated into clinical or operational protocols should be part of research teams;
  • Participating in networks and relationships that facilitate a faster, more relevant research process and better use of finite resources;
  • Embedding learning activities (especially pragmatic clinical trials, systems/management studies, and quasi-experimental studies) within delivery systems, which allows learning to be more relevant and more generalizable and establishes processes that are more easily replicated over the long term;
  • Involving a broad range of stakeholders: patients and clinicians may be best situated to identify their highest priority unanswered questions, operations managers will know which operational challenges are most central, and researchers can consider how to best answer these questions and in ways that can appropriately be generalized back to the functioning clinical and operational systems;
  • Prioritizing training for a variety of stakeholders while identifying and increasing opportunities for learning; and
  • Focusing on implementation and dissemination of results: how results will be translated into practice should be considered when learning projects are planned, with key operational stakeholders at the table to ensure the design facilitates future implementation.


Additionally, realizing that there are a number of examples of successful partnerships, the authors looked across health systems to highlight both research- and process-specific case examples that demonstrate these priorities for action. It is our intention that the proposed priorities for action, as well as the compilation of case examples, will add to the emerging body of evidence around learning health care while also serving as an actionable roadmap for institutions interested in transforming to a continuously learning system.

Underlying all of these priorities, and demonstrated repeatedly throughout the case examples, is an essential commitment to culture change. Transformation to a learning health system requires the development of a culture committed to continuous learning. A key finding from the Johns Hopkins research study (described in Box 4) is the importance of top leadership communicating the culture of the institution and representing a commitment to a learning health system as an organizational priority. Leaders must be instrumental in communicating a commitment to learn and then asking “how” to implement new opportunities for evidence generation rather than “whether” to implement them, and in developing an environment committed to multistakeholder engagement and innovation.

Transformation also requires a culture of collaboration among clinical leaders, operational administrators, and researchers. While all three groups have much to gain, each must also adapt to the legitimate needs and expectations of the others. At their best, learning health system projects and programs must blend the timeliness and relevance expected by clinical/operational leaders with the rigor required by researchers. The topics chosen for learning health system projects should both improve care and contribute to scientific knowledge. Inevitably, compromises between these expectations will be necessary. Such compromises can only be made in a culture of trust, which in turn requires an investment of time as well as financial and emotional capital. A collaborative culture is slow to build and easy to squander. Therefore, all partners in a learning health system must take a long-term, strategic perspective. If a group of leaders can navigate the obstacles and instill such a culture in their organization, it can provide a uniquely productive and professionally satisfying haven for its participants and can serve as a model to others.


Enhanced Bidirectional Relationship Building

As described above, within the current health context, the two activities of clinical operations and research operate in largely separate environments with different (and at times competing) players, funding streams, incentives, and priorities. The authors believe that research can move more quickly once research interests are aligned with operations. Likewise, operations will be more evidence-based and thus the quality of care improved once operations stakeholders are engaged in the development of research priorities and their needs and strategies are reflected in the research agenda.

Building these relationships requires that health executives promote the benefits of integration, including ideas related to seamless integration of research and practice, and that they create structures, funds flow, and processes, and allocate time and resources to those collaborations. Of key importance in relationship building is that leaders work to overcome barriers related to competing priorities between researchers and clinicians, and that they collaborate to build an environment that promotes transparency and creates opportunities to ask questions together, to develop innovative solutions to health care delivery and population health problems, and to build synergistic and productive partnerships. By working through this process, researchers, clinicians, and leaders should discover mutual interests, leading to an environment that values and rewards collaboration through new models for tenure and promotions and shared financial rewards. In the process-specific case examples below, one medical center, Wake Forest Baptist Medical Center (see Box 5), uses a shared strategic commitment to “becoming a Learning Healthcare System” and a centralized funding flow to align clinical, research, and educational programs, while Johns Hopkins School of Medicine’s institution-wide vision of a learning health system (see Box 6) is communicated across its system to align clinicians and researchers.




Networks for Evidence Generation

Another approach to aligning research with the health delivery system is participation by health systems in multi-institutional networks that allow a faster, more relevant research process and better use of finite resources.

A number of interconnected networks exist, both within single institutions and among multiple health systems, that are designed to foster the development and conduct of evidence-generation activities across institutions. Examples include PCORnet, the National Patient-Centered Clinical Research Network; the National Institutes of Health's Health Care Systems Research Collaboratory, National Center for Advancing Translational Sciences, Big Data to Knowledge Initiative, and Precision Medicine Initiative™ Cohort Program; the Health Care Systems Research Network (formerly known as the HMO Research Network, which includes topical research networks in cancer [the Cancer Research Network] and mental health [the Mental Health Research Network], among others); and the FDA Sentinel Initiative. Through these efforts, multi-institutional collaborations are studying aspirin dosing, bariatric surgery outcomes, and the use of antibiotics in children, to name a few. Box 7 provides two examples of multistakeholder research prioritization efforts from the PCORI Health Systems PCORnet Demonstration Project, while Box 8 demonstrates a multi-institutional pragmatic study led by the Hospital Corporation of America.





Embedded Learning Activities

Embedded learning activities in health care delivery systems can be greatly facilitated by six attributes, summarized as “right team, right question, right design, right data, right analysis, and right interpretation” (7):

  • Right team. Systems should assemble teams of operational/clinical experts, patients, and researchers with diverse perspectives and skill sets. These teams should operate under the sponsorship of supportive executive leaders who are committed to incorporating rigorously derived, local evidence into their decisions.
  • Right question. These teams should propose research questions that are clear, answerable, of high operations and/or clinical priority, and actionable. Leaders should be able to define a course of action that would result from any plausible answer to a good question. For example, if the question involved the effectiveness of a new clinical strategy or approach to delivering care, leaders should anticipate how the program can be disseminated if it is effective, and should be willing to terminate it or modify it if not.
  • Right design. The design of a data analysis or evaluation must balance multiple competing constraints such as the urgency of a decision, the level of confidence necessary to make that decision, and the costs and feasibility of collecting data to answer the question. Within those constraints, the design should be as rigorous as possible. Observational data analyses should consider issues of data quality, as well as missing data, and should account for the well-known biases that can affect the interpretation of observational analyses such as comparative effectiveness studies. Studies of interventions should incorporate rigorous designs, and consideration should be given to the use of randomization to conduct pragmatic trials.
  • Right data. Research in a learning health system must take advantage of the rich clinical data provided by electronic health records, augmented by administrative data and clinical information from laboratory, imaging, pharmacy, and other sources. Researchers should invest the time to understand the provenance, biases, and limitations of these data.
  • Right analysis. Analytic approaches should be as sophisticated as the constraints of the process allow. Researchers may be particularly helpful in areas such as applying advanced, multivariate techniques to the analysis of observational data, accounting for hierarchical data structures (patients nested within clinician practices, in turn nested within clinics), and accounting for potential sources of bias in patient selection.
  • Right interpretation. The results of these analyses should be conveyed to decision makers and patients in a format that is interpretable and actionable. In conventional research, the findings of a multiyear clinical trial must often be condensed into one or two figures and a few data tables. Similar parsimony should apply to the presentation and interpretation of findings from embedded research.
  • Right action. This step is the culmination of all the preceding “rights” and reinforces the expectation that, in a learning health system, knowledge should lead to interventions. These interventions should be evaluated rigorously, so that their outcomes can inform the next round of increasingly relevant and sophisticated questions.
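As one illustration of the “right analysis” attribute, the short simulation below shows why accounting for hierarchical data structures matters. All numbers are hypothetical and simulated for illustration only: when patients are nested within clinics, a naive standard error that treats every patient as independent can badly understate the uncertainty of a clinic-level estimate.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical example (simulated): 20 clinics, 50 patients each. Each
# clinic has its own baseline (a random intercept), so patient outcomes
# within a clinic are correlated rather than independent.
n_clinics, n_per = 20, 50
clinic_effect = rng.normal(0, 2.0, size=n_clinics)           # between-clinic variation
y = clinic_effect[:, None] + rng.normal(0, 1.0, size=(n_clinics, n_per))

flat = y.ravel()
# Naive SE: treats all 1,000 patients as independent observations.
naive_se = flat.std(ddof=1) / np.sqrt(flat.size)
# Cluster-aware SE: analyzes clinic means, respecting the hierarchy.
clinic_means = y.mean(axis=1)
cluster_se = clinic_means.std(ddof=1) / np.sqrt(n_clinics)

print(f"naive SE: {naive_se:.3f}, cluster-aware SE: {cluster_se:.3f}")
```

In this simulated setting the cluster-aware standard error is several times larger than the naive one; an analysis ignoring the nesting would report misleadingly precise results. In practice, teams would use multilevel (mixed-effects) models rather than this simple means-of-means comparison, but the intuition is the same.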


Box 9 highlights an embedded research study within the Mount Sinai Health System (MSHS): an interrupted time-series design evaluating an intervention to restrict intravenous acetaminophen use, with implications for costs and prescribing behavior across groups within the hospital.
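The interrupted time-series approach used in studies like the one in Box 9 is commonly analyzed with segmented regression. The sketch below is illustrative only, with simulated, hypothetical monthly order counts (not MSHS data): the model estimates the immediate level change in utilization at the point a restriction policy takes effect, while allowing for pre-existing and post-intervention trends.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated monthly counts of IV acetaminophen orders: 24 months before
# and 24 months after a (hypothetical) restriction policy.
n_pre, n_post = 24, 24
t = np.arange(n_pre + n_post)
post = (t >= n_pre).astype(float)             # 1 after the intervention starts
t_since = np.where(post == 1, t - n_pre, 0)   # months elapsed since intervention

true_level_drop = -120.0                      # assumed immediate effect (simulated)
y = 400 + 2.0 * t + true_level_drop * post + 1.0 * t_since
y = y + rng.normal(0, 10, size=t.size)        # month-to-month noise

# Segmented regression: y = b0 + b1*t + b2*post + b3*t_since
X = np.column_stack([np.ones_like(t, dtype=float), t, post, t_since])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
level_change = beta[2]                        # estimated drop at the intervention

print(f"estimated immediate level change: {level_change:.1f} orders/month")
```

A real analysis would also check for autocorrelation in the monthly series and for co-occurring interventions, but this captures the core logic: the coefficient on the post-intervention indicator estimates the policy's immediate effect, separated from the underlying trend.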


Involve a Variety of Stakeholders

A successful learning environment involves shared strategies, linked activities, and the involvement of stakeholders from throughout the health care system, each representing different departments, disciplines, and perspectives. By actively involving a variety of stakeholders, systems can increasingly demonstrate the value of operational and research collaboration, and increase the likelihood that research and operational priorities align.

To involve a variety of stakeholders will require looking beyond the traditional research environment and beyond traditional research activities to involve patients and other key stakeholders (e.g., advocates, community leaders, clinicians, board members, health executives, payers, and policy makers). GHRI researchers, who have a history of collaborating with clinicians and unit leaders (Buist et al., 2016), partnered with the community and policy makers (see Box 10) to draw national attention to prescribing practices for opioid analgesics in chronic pain. This work illustrates the potential for learning health system work to generate important and ongoing public policy and research efforts, as well as generalizable knowledge. Box 11 is an example of clinician leadership in evidence generation and practice at Intermountain Healthcare.



Prioritize Training

Among the challenges identified by respondents in the Johns Hopkins University interviews with health executives (see Box 4) was that the traditional model of medical training fails to prepare clinicians for the evidence-driven, team-based approach needed for a successful learning health care system. As previously argued by Catherine Lucey, the goal of medical education should not be to produce physicians; it should be to improve the health of patients and populations (Lucey, 2013). Respondents to the interviews noted the importance of education in “systems sciences,” reinforcing arguments in the literature that physicians working in complex health systems need knowledge and understanding of health care financing, population health, quality improvement, informatics and health care technology, and team-based care (Gonzalo et al., 2015). Box 12 below demonstrates how the University of Michigan is working to educate clinicians in systems sciences and has formally integrated the concept of continuous learning within its medical institution.



The authors believe that, while it is important to train the next generation of clinicians and researchers, in a continuously learning system everyone needs training. Health systems should also prioritize training for stakeholders already in their employ.


Focus on Implementation and Dissemination of Results

One of the motivators for developing this Perspective was the agreement among its authors that, while there are examples of institutions engaged in continuous learning, there are few mechanisms for scaling and spreading these best practices within and between health systems. Implementation and dissemination of the lessons learned from the practice of care, and its impact on cost, better practices, and generalizable knowledge, should be a priority among health executives, clinicians, researchers, and patients.

The transformation to a continuously learning system is critically dependent on dedicated resources and a commitment to sharing, adopting, and sustaining these best practices. A number of the examples above include implementation and dissemination strategies, including peer and research networks. The networks are important conduits for putting research evidence into daily clinical practice.

As health systems and researchers embark on embedded activities, they should simultaneously reach out to external funding sources to identify or create opportunities for collaborative support. This is again why the involvement of multiple stakeholders is crucially important—a supportive board member or health executive may be able to make a particularly compelling case to external funders. Additionally, consideration should be given at the beginning of a study to intervention and implementation strategies, including ongoing funding for services and the development of new partnerships once the original research funding is no longer present. Box 13 demonstrates how the VA continues to expand, communicate, and successfully implement its primary care–mental health integration program throughout its system.




As illustrated throughout this piece, the transformation to continuous learning requires several key components. First, it requires strong leadership that both signals a cultural change and demonstrates a commitment to multistakeholder collaboration. Interviews with C-suite executives and institutional leaders suggest that without commitment from the top, change will be difficult; once commitments are in place, strategy and the choice of specific projects must reflect the thinking of multistakeholder teams, be deliberate and limited to a few priorities at any one time, and build in systems for implementation and translation once complete. Second, it requires engagement of health executives, academic leaders, and research institutes to create the necessary physical and cultural infrastructure, provide financial incentives, and offer professional rewards for partnerships. Third, successful transformation depends on local teams of clinicians, operational front-line leaders, and researchers who identify problems that matter to them and to their patients, begin a shared process of solving those problems, and disseminate solutions.

The learning health system is increasingly recognized as critical to improving the quality of patients’ care, delivering care efficiently, increasing the justice of care, and using research as a tool to improve quality and operations, rather than risking its being viewed as an end in itself. As the authors have demonstrated above, multiple institutions across the United States are addressing the structural, leadership, and collaborative elements necessary to move closer toward transforming to a continuously learning system. They are creating models and examples from which we can all learn. It is important that institutions are testing models for improvement, and we can and must continue to identify which models do and do not work, to help all institutions continue on this path.






  1. Formerly the Institute of Medicine (IOM).
  2. See
  3. “Pragmatic clinical trials (PCTs) test clinical interventions (e.g., treatments, diagnostic tests, and delivery strategies) that are widely used in practice and for which there is often clinical equipoise. Similar to traditional explanatory trials of novel therapeutics, PCTs use randomization to decrease selection bias. In contrast, PCTs rely on extant data sources (e.g., electronic medical records [EMRs]) and test interventions that can be implemented with minimal research infrastructures. Thus, PCTs have drawn interest as vehicles for decreasing the cost of clinical research and for creating learning health systems, which, as articulated by the Institute of Medicine, seek to generate new knowledge as an integral by-product of the delivery experience. However, realizing this vision for PCTs will require innovative approaches for engaging clinicians, improving the efficiency of subject recruitment, improving the reliability of EMR data, and new paradigms for the regulatory review of low-risk trials to decrease unnecessary hurdles to practice-based knowledge generation” (Rosenthal, G. E. 2014. The role of pragmatic clinical trials in the evolution of learning health systems. Transactions of the American Clinical and Climatological Association. 125:204-218).
  4. “Although PCTs are often a gold standard for determining intervention effects, in the area of practice-based research, there are many situations in which individual randomization is not possible. Alternative approaches to evaluating interventions have received increased attention, particularly those that can retain elements of randomization such that they can be considered “controlled” trials.” (Handley, M. A., D. Schillinger, and S. Shiboski. 2011. Quasi-Experimental Designs in Practice-based Research Settings: Design and Implementation Considerations. Journal of the American Board of Family Medicine. 24(5):589-596). These quasi-experimental design approaches—the interrupted time-series design, the stepped-wedge design, and a variant of this design, a wait-list crossover design—are gaining popularity.
  5. The Accelerating Clinical Knowledge Generation and Use meeting agenda and summary are available at
  6. See
  7. The attributes are part of the framework used for promoting the learning health system in Kaiser Permanente-Colorado. See



  1. Agency for Healthcare Research and Quality. Universal ICU Decolonization: An Enhanced Protocol. September 2013, Rockville, MD. Available at: (accessed July 28, 2020).
  2. Bernstein, J. A., C. Friedman, P. Jacobson, and J. C. Rubin. 2015. Ensuring public health’s future in a national-scale learning health system. American Journal of Preventive Medicine. 48:480-487.
  3. Blanco, C., A. N. Campbell, M. M. Wall, M. Olfson, S. Wang, and E. V. Nunes. 2016. Towards national estimates of treatment effectiveness for substance use. Journal of Clinical Psychiatry. 78(1):e64-e70.
  4. Blanco, C., M. Olfson, R. D. Goodwin, E. Ogburn, M. R. Liebowitz, E. V. Nunes, and D. S. Hasin. 2008. Generalizability of clinical trial results for major depression to community samples: Results from the National Epidemiologic Survey on Alcohol and Related Conditions. Journal of Clinical Psychiatry. 69(8):1276-1280.
  5. Bracken, M. B., B. Djulbegovic, S. Garattini, J. Grant, I. Chalmers, A. M. Gülmezoglu, D. W. Howells, J. P. A. Ioannidis, and S. Oliver. 2014. How to increase value and reduce waste when research priorities are set. Lancet. 383:156-165.
  6. Buist, B., E. Chang, M. Handley, R. Pardee, G. Gundersen, A. Cheadle, and R. Reid. 2016. Primary care clinicians’ perspectives on reducing low-value care in an integrated delivery system. The Permanente Journal. 20(1):41-46.
  7. Calfee, D. P., C. D. Salgado, A. M. Milstone, A. D. Harris, D. T. Kuhar, J. Moody, K. Aureden, S. S. Huang, L. L. Maragakis, and D. S. Yokoe. 2014. Strategies to prevent methicillin-resistant Staphylococcus aureus transmission and infection in acute care hospitals: 2014 update. Infection Control & Hospital Epidemiology. 35:S108-S132.
  8. Cole, S. R. and E. A. Stuart. 2010. Generalizing evidence from randomized clinical trials to target populations: The ACTG 320 trial. American Journal of Epidemiology. 172(1):107-115.
  9. Deyo, R. A., M. Von Korff, and D. Duhrkoop. 2015. Opioids for low back pain. BMJ. 350:g6380.
     Food and Drug Administration (FDA). White Paper: Inventory of Clinical Trials Protocols and Clinical Study Data. Available at: (accessed March 2016).
  10. Gonzalo, J. D., P. Haidet, K. K. Papp, D. R. Wolpaw, E. Moser, R. D. Wittenstein, and T. Wolpaw. 2015. Educating for the 21st-Century Health Care System: An Interdependent Framework of Basic, Clinical, and Systems Science. Academic Medicine 92(1): 35-39.
  11. Greene, S. M., R. J. Reid, and E. B. Larson. 2012. Implementing the learning health system: From concept to action. Annals of Internal Medicine. 157:207-210.
  12. Herland, K., J. P. Akselsen, O. H. Skjonsberg, and L. Bjermer. 2005. How representative are clinical study patients with asthma or COPD for a larger “real life” population of patients with obstructive lung disease? Respiratory Medicine. 99(1):11-19.
  13. Hoertel, N., Y. Le Strat, C. Blanco, P. Lavaud, and C. Dubertret. 2012. Generalizability of clinical trial results for generalized anxiety disorder to community samples. Depression and Anxiety. 29(7):614-620.
  14. Huang, S. S., E. J. Septimus, K. Kleinman, J. Moody, J. Hickok, T. Avery, J. Lankiewicz, A. Gombosev, L. Terpstra, F. Harford, M. Hayden, J. Jernigan, R. Weinstein, V. Fraser, K. Haffenreffer, E. Cui, R. E. Kaganov, K. Lolans, J. B. Perlin, and R. Platt. 2013. Targeted versus universal decolonization to prevent ICU infection. New England Journal of Medicine. 368:2255-2265.
  15. Huang, S. S., E.J. Septimus, M. K. Hayden, K. Kleinman, J. Sturtevant, T. R. Avery, J. Moody, J. Hickok, J. Lankiewicz, A. Gombosev, R. E. Kaganov, K. Haffenreffer, J. A. Jernigan, J. B. Perlin, R. Platt, and R. A. Weinstein. 2016. Effect of body surface decolonization on Bacteriuria and Candiduria in intensive care units: An analysis of a cluster-randomised trial. The Lancet Infectious Diseases. 16(1):70-79.
  16. Huang, S. S., E. J. Septimus, T. R. Avery, G. M. Lee, J. Hickok, R. A. Weinsten, J. Moody, M. K. Hayden, J. B. Perlin, R. Platt, and G. T. Ray. 2014. Cost savings of universal decolonization to prevent intensive care unit infection: Implications of the REDUCE MRSA trial. Infection Control & Hospital Epidemiology. 35(S3):S23-S31.
  17. Institute of Medicine. 2015. Integrating Research and Practice: Health System Leaders Working Toward High-Value Care: Workshop Summary. Washington, DC: The National Academies Press.
  18. Johnson, K., C. Grossman, J. Anau, S. Greene, K. Kimbel, E. Larson, and K. Newton. 2015. Integrating Research into Health Care Systems: Executives’ Views. NAM Perspectives. Discussion Paper, National Academy of Medicine, Washington, DC.
  19. Lucey, C. R. 2013. Medical education: Part of the problem and part of the solution. JAMA Internal Medicine. 173:1639-1643.
  20. Masoudi, F. A., E. P. Havranek, P. Wolfe, C. P. Gross, S. S. Rathore, J. F. Steiner, D. L. Ordin, and H. M. Krumholz. 2003. Most hospitalized older persons do not meet the enrolment criteria for clinical trials in heart failure. American Heart Journal. 146(2):250-257.
  21. McClellan, T., and B. Turner. 2010. Chronic noncancer pain management and opioid overdose: Time to change prescribing practices. Annals of Internal Medicine. 152(2):123-124.
  22. Morain, S. R. and N. E. Kass. 2016. Ethics issues arising in the transition to learning health care systems: Results from interviews with leaders from 25 health systems. eGEMs (Generating Evidence & Methods to improve patient outcomes). 4(2): Article 3.
  23. Mulrow, C. D., J. W. Williams Jr, M. Trivedi, E. Chiquette, C. Aguilar, and J. E. Cornell. 1999. Treatment of Depression: Newer Pharmacotherapies. Rockville, MD: Agency for Health Care Policy and Research; AHCPR publication 99-E014. Available at: (accessed July 28, 2020).
  24. Okuda, M., D. S. Hasin, M. Olfson, S. S. Khan, E. V. Nunes, I. Montoya, S. M. Liu, B. F. Grant, and C. Blanco. 2010. Generalizability of clinical trials for cannabis dependence to community samples. Drug and Alcohol Dependency. 111(1-2):177-181.
  25. Patsopoulos, N. A. 2011. A pragmatic view on pragmatic trials. Dialogues in Clinical Neuroscience. 13:217-224. Available at: (accessed July 28, 2020).
  26. Post, E. P., M. Metzger, P. Dumas, and L. Lehmann. 2010. Integrating mental health into primary care within the Veterans Health Administration. Families, Systems, & Health. 28(2):83-90.
  27. Pronovost, P. J., C. G. Holzmueller, N. E. Molello, L. Paine, L. Winner, J. Marsteller, S. M. Berenholtz, H. J. Aboumatar, R. Demski, and C. M. Armstrong. 2015. The Armstrong Institute: An academic institute for patient safety and quality improvement, research, training, and practice. Academic Medicine. 90(10):1331-1339.
  28. Pronovost, P. J., C. M. Armstrong, R. Demski, T. Callender, L. Winner, M. R. Miller, J. M. Austin, S. M. Berenholtz, T. Yang, R. R. Peterson, J. A. Reitz, R. G. Bennett, V. A. Broccolino, R. O. Davis, B. A. Gragnolati, G. E. Green, and P. B. Rothman. 2015b. Creating a high-reliability health care system: Improving performance on core processes of care at Johns Hopkins Medicine. Academic Medicine. 90(2):165-172.
  29. Septimus, E., J. Hickok, J. Moody, K. Kleinman, T. R. Avery, S. S. Huang, R. Platt, and J. Perlin. 2016. Closing the translation gap: Toolkit-based implementation of universal decolonization in adult intensive care units reduces central line-associated bloodstream infections in 95 community hospitals. Clinical Infectious Diseases. 63(2):172-177.
  30. Septimus, E. J., M. K. Hayden, K. Kleinman, T. R. Avery, J. Moody, R. A. Weinstein, J. Hickok, J. Lankiewicz, A. Gombosev, K. Haffenreffer, R. E. Kaganov, J. A. Jernigan, J. B. Perlin, R. Platt, and S. S. Huang. 2014. Does chlorhexidine bathing in adult intensive care units reduce blood culture contamination? A pragmatic cluster-randomized trial. Infection Control & Hospital Epidemiology. 35(S3):S17-S22.
  31. Morris, Z. S., S. Wooding, and J. Grant. 2011. The answer is 17 years, what is the question: Understanding time lags in translational research. Journal of the Royal Society of Medicine. 104:510-520.
  32. Steiner, J. F., M. R. Shainline, M. C. Bishop, and S. Xu. 2016. Reducing missed primary care appointments in a learning health system: Two randomized trials and validation of a predictive model. Medical Care. 54(7):689-696.
  33. Tew, J., J. Klaus, and D. W. Oslin. 2010. The Behavioral Health Laboratory: Building a stronger foundation for the patient-centered medical home. Families, Systems, & Health. 28(2):130-145.
  34. Travers, J., S. Marsh, M. Williams, M. Weatherall, B. Caldwell, P. Shirtcliffe, S. Aldington, and R. Beasley. 2007. External validity of randomized controlled trials in asthma: To whom do the results of the trials apply? Thorax. 62:219-223.
  35. Tricoci, P., J. M. Allen, J. M. Kramer, R. M. Califf, and S. C. Smith. 2009. Scientific evidence underlying the ACC/AHA clinical practice guidelines. JAMA. 301(8):831-841.
  36. Von Korff, M., and G. Franklin. 2016. Responding to America’s iatrogenic epidemic of prescription opioid addiction and overdose. Medical Care. 54(5):426-429.



Suggested Citation

Abraham, E., C. Blanco, C. Castillo Lee, J. B. Christian, N. Kass, E. B. Larson, M. Mazumdar, S. Morain, K. M. Newton, A. Ommaya, B. Patrick-Lake, R. Platt, J. Steiner, M. Zirkle, and M. Hamilton Lopez. 2016. Generating Knowledge from Best Care: Advancing the Continuously Learning Health System. NAM Perspectives. Discussion Paper, National Academy of Medicine, Washington, DC.


The authors would like to thank Rosheen Birdie, National Academy of Medicine, for her valuable assistance in facilitating the development of the paper. Thank you also to all the institutions that provided case examples.


The views expressed in this Perspective are those of the authors and not necessarily of the authors’ organizations, the National Academy of Medicine (NAM), or the National Academies of Sciences, Engineering, and Medicine (The Academies). The Perspective is intended to help inform and stimulate discussion. It has not been subjected to the review procedures of, nor is it a report of, the NAM or the Academies. Copyright by the National Academy of Sciences. All rights reserved.

Join Our Community

Sign up for NAM email updates