Clinical trials are a means of gathering information about medical products or services. In medicine, however, they are also the fundamental means of advancing the underlying science and have enabled many of the advances made in health care over the last few years. In many ways, clinical trials are the core infrastructure of the practice of medicine.
For an activity so vital to the field, the clinical research process is in serious trouble, especially in the United States. The cost of clinical trials for manufacturers of pharmaceuticals, biologics, and medical devices, as well as for public health investigators, continues to escalate. In pharmaceuticals (where this trend has been best documented), costs have increased 7.4 percent annually over inflation for the last 20 years (DiMasi et al., 2003). The rising cost of clinical research is multifactorial but has been attributed to increased protocol complexity; an increase in local, regional, national, and international regulations and guidance, along with a lack of harmonization among them; excessively risk-averse interpretations of regulations; and increasing time and financial pressures on clinician-investigators (Bollyky et al., 2010). Meanwhile, the need to conduct research in this environment has fueled the creation over the last three decades of a large and relatively new industry: clinical research organizations (CROs). While CROs improve efficiency for sponsors, their business practices also contribute to the current cost structure of clinical research.

Irrespective of the factors driving the cost increases, the implications are clear. The greater the cost of research, the fewer new medical products come to market, the less we know about the products that do, and the fewer investigations of public health research questions are initiated. While we advance the concept of the "learning health care system" (Institute of Medicine [IOM], 2003), the cost of clinical research moves us further from this goal every day.