Health Technology for All: An Equity-Based Paradigm Shift Opportunity

By Jennifer MacDonald, George Demiris, Michelle Shevin, Sonoo Thadaney-Israni, Timothy Jay Carney, and Anna Cupito
December 14, 2022 | Discussion Paper


Executive Overview

The health of the U.S. population faces a seemingly intractable shortfall for large numbers of people, including and especially people of color, who have long confronted the consequences of structural racism. Yet, rather than focusing on the equitable cultivation of health and wellness, the policies and practices that draw the most attention and priority from U.S. health care and related organizations have been those of cost and efficiency. Without equity as an intentional foundation, incentives toward cost and efficiency have exacerbated marginalization and optimized resources for only a select subset of the population—deepening disparities and, paradoxically, ballooning national cost and inefficiency.

Compounding the gap has been the impact of limited access to health technology that could otherwise be useful in countering some of the discrepancies. Health-relevant technologies, defined broadly as those technologies that impact individual and population health, have followed the patterns of structural discrimination that result in poorer information, resources, and support. With a paradigm shift centering on equity for individuals, families, and communities, a different future is possible.

Health-relevant technologies may provide the capabilities that enable the nation to counter structural impediments, but this will require sustained leadership, intentionality, and tactical action. Addressing structural racism requires structural solutions. Diverse datasets; equitable and person-centered criteria for evaluating and implementing health-relevant technologies; and new policy, incentive, and accountability structures are necessary elements for the future. Importantly, the most essential tenet of an effective future paradigm is to honor and engage the wisdom of individuals, families, and communities that have been marginalized and have too long been striving against inequity themselves. In this paper, the authors prompt an urgent call to action for health sector stakeholders—including academic institutions, funding organizations, health system leaders, technology companies, and policy leaders—to consider their role in a future paradigm where equity and progress are inextricably linked.


Prioritizing What Matters: A Call to Action 

America’s innovation ecosystem evermore robustly tracks, surveils, and measures individuals, but the same level of data gathering and fidelity, especially concerning equity, does not occur for organizational culture and behaviors, datasets and algorithms, technologies, policies and practices, systems and structures, and health care outcomes. This fact calls for a shift in what and how measurement occurs, the questions data and technology try to answer, and how civil and human rights are safeguarded amid the ongoing wave of service delivery transformation.

In 1998, 64 individuals from 29 countries gathered in Salzburg, Austria, to envision a person-empowered future of health care (Delbanco et al., 2001). Working from a concept of “nothing about me without me,” they set forth a revolutionary view of engagement and partnership that remains relevant but largely unrealized in U.S. health care today. Technology design and person-centeredness were essential in their framework, and more than a decade before electronic health records (EHR) adoption had reached even 10% of U.S. hospitals (Henry et al., 2016), these thought leaders envisioned providers “drawing, where possible, on computer-based guidance and communication technologies” (Delbanco et al., 2001). Their model, described in Healthcare in a Land Called People Power, applied continuous feedback to inform system evolution and partnered persons and communities with professionals at all levels of administration and government (Delbanco et al., 2001). Notably, what is described is not a future without data analytics, but rather one where the use of data is geared toward individual and collective empowerment, not toward cost reduction or efficiency for already inequitable systems.

Since that meeting, initiatives like OpenNotes have endeavored to empower patients with health information, and major institutions have recognized the criticality of Patient and Family Advisory Councils (Open Notes, 2020; Agency for Healthcare Research and Quality, 2013). Efforts like the Social Interventions Research and Evaluation Network Gravity Project, founded to generate coding standards for social determinants of health data capture in EHR systems, have even strived to include extra-institutional human factors in the design of our health-relevant technology (Gravity Project Team, 2019). Yet, there is far more to be done. As Lovis (2019) described in a visionary JAMA article:

[T]he scientific and technological progress in handling information and its further processing and cross-linking for decision support and predictive systems must be accompanied by parallel changes in the global environment, with numerous stakeholders, including citizen and society. . . . This requires science and society, scientists and citizens, to progress together.

Those with power to influence the health and care of others share responsibility to employ equity and inclusion in the design and implementation of technologies and health systems. When equity informs design and code, health systems, health technology innovators, individuals, families, and communities can rest assured that these products will produce the best outcomes possible, ultimately optimizing both health and cost.

Using a lens of equity and inclusion to address challenges, major stakeholders—including policy makers, health system leaders, academic institutions, technology companies, funders, communities, and individuals—must consider whether an issue is fundamentally a resource problem (e.g., inadequate money, people, knowledge, infrastructure) or fundamentally an equity problem (e.g., the uneven distribution of money, people, knowledge, infrastructure). With this answered, equity metrics can further be used to drive solutions toward persons and communities where impact potential is greatest and to design interventions most likely to improve the health of disenfranchised communities. During each step toward design and implementation, community wisdom should be employed to inform and magnify the impact of an intervention.

To realize this vision of a new health innovation ecosystem, those with power to influence the national health landscape, including policy makers, health system leaders, and technology companies, have a profound leadership opportunity. These leaders have the ability, and the ethical imperative, to create and incentivize accountability for health equity and the empowerment of persons, families, and communities. The authors of this paper have outlined a list of priority actions that major stakeholders can take to reset the system and begin to move toward centering equity in all health care innovation and improvement. The paper also offers compelling illustrations of the problem that the priority actions are designed to address. Additionally, the authors assert that it is imperative for all to lead the change that is needed.


Opportunities to Lead

Starting points for all:

  • Integrate responsibility for achieving equity into every organizational unit and management plan
  • Actively listen to and amplify underrepresented individual and community voices
  • Refocus on equity as the central goal of health and technology innovation and the touchstone by which to assess progress
  • Internalize and assert that without equity, progress indices are not meaningful
  • Understand the history, nature, and profile of structural racism and marginalization, then identify and address the remnants of harmful historical, social practices in your current thinking or work
  • Prioritize measuring equity and inclusion above the much easier measurement of efficiency and throughput
  • Invest in measurement and evaluation that is culturally responsive, participatory, and non-extractive

Technology companies:

  • Establish meaningful equity and inclusion data practices and metrics in collaboration with communities, and continually practice intentional involvement of communities in data collection and interpretation
  • Ensure equity and inclusion are key principles in technology design and enhancement
  • Prioritize resources to design and promote technologies that ameliorate disparities and advance equity
  • Employ leaders personally accountable for:
    • understanding national/state/local disparities
    • engaging communities and relevant partner organizations in the advancement of equitable health outcomes
    • ensuring data practices center equity
    • embedding community wisdom in the configuration, implementation, and ethical oversight of purchased technology and new technology design
  • Assess and be accountable for policies and practices to hire, retain, and promote a diverse employee population that builds trust with communities


Health system leaders:

  • Center equity as the key problem to be solved in health and health care, first for the human benefit and second as an inherent necessity to achieve cost and efficiency goals
  • Evaluate for equity in the design and implementation of health-relevant technologies, and use buying power to drive equitable practices and outcomes
  • Employ diverse executives personally accountable for:
    • processing national/state/local disparity data
    • engaging communities and relevant partner organizations in the advancement of equitable health outcomes
    • ensuring data practices center equity
    • embedding community wisdom in the configuration, implementation, and ethical oversight of purchased technology and new technology design
  • Train employees and lower-level decision makers on personal responsibility and methods to advance equity
  • Accountably assess policies and practices to hire, retain, and promote diverse employee populations that reflect and build trust with the communities they serve
  • Establish a population/public health imperative where the principles of equity for all are not circumvented by the needs, desires, and influences of the immediate bottom line


Academic institutions:

  • Prioritize resources to promote the study of technology solutions to advance equity and the most efficient and effective ways that systems can be “reset” to focus on equity and be actively anti-racist
  • Employ leaders personally accountable for:
    • understanding national/state/local disparities
    • engaging communities and relevant partner organizations in the advancement of equitable health outcomes
    • ensuring data and research practices center equity
    • embedding community wisdom in the configuration, implementation, and ethical oversight of purchased technology and new technology design
  • Embed health equity, civil rights history, and community engagement into core curricula for health, social science, technology, and civic professions
  • Assess and have accountability for policies and practices to hire, retain, and promote a diverse employee population
  • Formulate a plan for the coordination of resources, accountability, and mobilization across the many federally funded academic centers of research on equity and health disparities, meeting their mission of making change in an evidence-based way


Funding organizations:

  • Prioritize authentic engagement with communities in all funding opportunities
  • Establish meaningful equity and inclusion data practices and metrics in collaboration with communities
  • Ensure equity is the central criterion for funding awards
  • Coordinate investments with aligned organizations to maximize synergy and impact on disparities


Individuals, families, and communities:

  • Recognize the unique role of personal and community perspectives and lived experience in facilitating a paradigm shift in technology and health care
  • (Those in positions of relative privilege or safety) Amplify the voices of those marginalized and advocate for equity and inclusion
  • (Those in positions of relative privilege or safety) Form partnerships to generate awareness, and set expectations for system behavior


Policy leaders:

  • Establish meaningful equity and inclusion data practices and metrics that reflect the key health indicators, concerns, and priorities of the communities served
  • Establish national/state/local objectives and define necessary accountable action to achieve those objectives in ways that include and do not marginalize or exploit individuals, families, and communities
  • Describe, incentivize, and create an accountable assessment of the impact of health-relevant technologies and related system processes on disparities
  • Ensure transparency and reproducibility of automated algorithms, employing human adjudication for values-based assessment
  • Describe, incentivize, and create an accountable assessment for hiring, retaining, and promoting diverse employee populations that will reflect and build trust with the communities they serve
  • Engage and support health system leaders in centering equity in all that they do


The authors pose this question to readers: How are you meeting this opportunity and imperative to center health equity in your sphere of influence?


Why It Matters

More than a century after the American intellectual and cofounder of the NAACP, W. E. B. Du Bois, called attention to how structural racism was creating and exacerbating disparities in health outcomes and the everyday lives of Americans of color, our nation reached a moment of reckoning. Intertwined crises are impossible to ignore, as a global COVID-19 pandemic continues to disproportionately impact communities of color, and Americans of all races are calling for racial justice following the incomprehensible killings of George Floyd, Breonna Taylor, and Ahmaud Arbery, among many others. Today, more than 50 years after the Civil Rights Act, disgraceful inequities exist in health and health care, and the dearth of measurement, interventions, and accountability to resolve disparities deepens. While health equity, person-centeredness, and the importance of the social determinants of health have risen to the mainstream discourse in recent years (Centers for Disease Control and Prevention [CDC], 2020), the authors of this paper argue that merely adjusting or refining our current health and health care systems will neither reverse deep and historical inequities nor achieve the financial and business efficiency aims of the industry. A bold reset is needed—on an immutable foundation of equity and inclusion.

Health care and health systems in the United States, in the same pattern as other sectors, employ practices, policies, and behaviors reflective of structurally embedded institutional racism, which have resulted in both intentional and unintentional harm to populations of color for decades and have proven difficult to detect, remediate, and eliminate (Institute of Medicine [IOM], 2003). Systems and institutions, as well as the research and business interests that inform care, were designed largely by, and in the interests of, White men, then White women and other people of privilege, who could afford to access an increasingly expensive system (Bailey et al., 2017). This design exacerbated inequities, and prominent historical figures within it generated profit and prestige from the mistreatment of people of color (Wailoo, 2018). Lack of access to basic health care inevitably results in negative health outcomes, and the impact on communities of color is compounded by the realities of structural racism and social determinants of health, including disparities in employment opportunities, transportation, safe housing, and a host of others (Paradies et al., 2015). Health technologies and biomedical innovation, critical to the progress of improving health and health outcomes, have historically been designed and implemented by the White dominant culture and available only to those who could afford these innovations—overwhelmingly wealthy, White, privileged populations. Critically, the datasets that inform and power new health innovations lauded as the future of medicine are also generated from those who have had access to the system—failing to represent the diverse user populations they impact (Zhang et al., 2017; Popejoy and Fullerton, 2016).

Many corporations and entities that set the direction of the private U.S. health care system and create our health technologies are, at baseline, driven by profit. While many engage in philanthropy and promote focus on the public good, their efforts on equity often face greater impediments. The transactional and volume-based financial and incentive structures of the health care ecosystem—applicable to both for-profit and nonprofit entities—dictate that cost and efficiency, not equity, are prioritized in solution delivery.

Tweaks have been made to technologies and datasets over the last decade to try to address the inherent inequities of both, but these have not scaled to sustainable systems change. For example, large datasets of faces, many collected by scraping the internet or using surveillance cameras, have been removed upon scrutiny from researchers and reporters; however, there has been little discussion of how to determine appropriate use at scale or to minimize inequitable practices going forward (Almeida et al., 2021; Crawford and Paglen, 2019). The authors of this paper propose that this continual tweaking and adjusting will never succeed: structural problems require structural solutions, not marginal improvements. Making unjust systems more efficient only accelerates the inequities they help to perpetuate. And where data-driven efficiency is narrowly pursued in systems already contributing to inequity, inequality measurably speeds up. Unless equity is the main target of creation or optimization, the chasms between minoritized groups and the health care system will only grow (Younge et al., 2019).

To develop health technology, and eventually a health care system, that truly serves all Americans, we need a bold and collective “reset.” Health technologies, and especially data-driven technologies, need to center patients, families, and communities—ensuring that the people who are represented in the data and who will be using the technology and accessing the health system are represented in it. Health technology products must work for and serve patients, families, and communities. Only by refocusing on and including the entire U.S. population that our health system needs to serve can technology and data contribute to a more equitable, affordable, and sustainable public health ecosystem. This call to action goes beyond “human-centered design”; what is needed is a structural and cross-sector shift toward prioritizing people, communities, and the public interest.

Centering health systems and innovation on the people who will use both will improve care and health technology for all. And while prioritizing short-term efficiency can trade off equity, such as in the case of the Optum/UHG algorithm, the authors argue that prioritizing equity will actually lead to greater efficiency in the system in the long term (Obermeyer et al., 2019). This redefinition of the meaning of efficiency has been evolving since the 1980s, in the context of corporate responsibility and economic efficiency, and it applies to the efficiency of the health and health care landscape as well (Bodislav, 2012; Jones, 1980). Centering impacted people will save money and increase long-term efficiency in the system by realizing the benefits of prevention, improving health equity, reducing lifetime health care costs, and improving health outcomes. In the interdependent ecosystem of U.S. health care, this is an action that must be undertaken by academic institutions, health systems, health insurance plans, corporations, technology companies, and policy makers alike.

As investment, research, and development increasingly focus on health data and related technologies, we have better information on the results of centuries of structural racism. Disparities are evident in datasets and biased or opaque analytic algorithms, and the cause cannot be explained by mathematics and computation; the source is deeply historical and sociological. Still, artificial intelligence (AI) tools, such as machine learning algorithms and other advanced analytical technologies, offer the tantalizing promise of not only making sense of massive volumes of information but also automating the process of turning data-driven insights into decisions. As researchers have demonstrated, the use of machine learning inherently codifies historical discrimination and other forms of bias into automated processes (Maddox et al., 2019; Matheny et al., 2019).

Without an explicit focus on equity, AI applied to our health system will exacerbate the harm of the enduring legacies of racial injustice and oppression even as ostensibly neutral optimization targets like cost-savings and efficiency are pursued. In 2019, research revealed that an algorithm determining care plans for the world’s largest health care company was making racially biased recommendations by focusing on cost as the main marker of care needed, rather than other indicators of a patient’s health (Obermeyer et al., 2019). Other researchers have noted the potential risk of harm to Black women if algorithms are developed based on genetic testing for germline mutations that indicate a higher risk of breast cancer, as this population is less likely to receive additional genetic screening (Parikh et al., 2019). Many AI-enabled clinical decision support tools are being rapidly developed and implemented, but there is little public data on these tools’ efficacy or their impact on equity (Robbins and Browdin, 2020). These and many other examples presented later in this paper demonstrate the dire urgency to chart a new trajectory in the health sector: a fundamental redefinition of priorities and progress and a massive, coordinated, immediate intervention to cease harms to marginalized human beings and instead cultivate health.
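The cost-as-proxy failure described above can be made concrete with a small, entirely synthetic sketch. Here, two groups have identical underlying health need, but one group's access barriers suppress the costs it generates; a model that ranks patients by cost (the proxy label) then under-selects that group for extra care. All groups, numbers, and variable names below are hypothetical, for illustration only.

```python
# Synthetic illustration of proxy-label bias: training or ranking on cost
# rather than health need encodes access disparities into the output.
import random

random.seed(0)

def simulate_patient(group):
    """Return (true_need, observed_cost) for one synthetic patient."""
    need = random.uniform(0, 10)                 # underlying health need
    access = 1.0 if group == "A" else 0.5        # group B faces access barriers
    cost = need * access + random.gauss(0, 0.5)  # spending reflects access, not need
    return need, cost

patients = [("A", *simulate_patient("A")) for _ in range(5000)]
patients += [("B", *simulate_patient("B")) for _ in range(5000)]

# Ranking by the proxy label (cost) fills a "high-risk" program almost
# entirely with group-A patients, despite equal true need across groups.
patients.sort(key=lambda p: p[2], reverse=True)
top = patients[:1000]
share_b = sum(1 for g, _, _ in top if g == "B") / len(top)

mean_need = {g: sum(n for gg, n, _ in patients if gg == g) / 5000
             for g in ("A", "B")}
print(f"mean true need  A: {mean_need['A']:.2f}  B: {mean_need['B']:.2f}")
print(f"group B share of program slots: {share_b:.2f}")
```

The point of the sketch is not the particular numbers but the mechanism: no variable in the model is labeled "race," yet the choice of optimization target reproduces the disparity on its own.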

The remainder of this paper illustrates the critical need for the aforementioned call to action and why all stakeholders play a necessary role in changing the focus of the health systems and related industries from cost and efficiency to centering equity. The authors describe where challenges exist and emphasize how equitable solutions can address gaps in the current system in the following areas:

  • Understanding a century’s journey toward valuing health equity
  • Prioritizing communities to overcome health challenges
  • Moving toward a person-centered health information ecosystem
  • Realizing that our data are inequitable
  • Ensuring that data governance evolves to include equity and privacy
  • Restarting the system
  • Advancing toward a rights-based regulatory framework


Understanding a Century’s Journey Toward Valuing Health Equity

Given the persistence of racially disproportionate health disparities today, an increasing number of cross-sector voices are acknowledging the importance of health equity. More than a century of civil rights leadership raised a conversation on structural racism in health care, and much of the recent conversation can still be attributed to impacted communities elevating their own voices. Health care systems, especially those powered by modern technologies, are at the epicenter of an interconnected web of ongoing racism, oppression, and disparate health outcomes; our journey to a better future must begin with listening to and engaging the communities experiencing harm.

While dimensions of inequity in health and health care are many, it is important to name racism as a key and primary driver of these inequities. Race and racism are social constructs—cultural, systemic, institutionalized, and structural; sharpened over centuries; inherent in the nation’s history; and antithetical to a healthy and productive society. The absence of overt racism does not mean that progress has been made, as this allows existing, inequitable structures to persist. A new direction must be established to overcome the inherently and structurally racist current health care environment, including with regard to social determinants of health.

The documented journey of naming and understanding these inequities through a scientific lens traces back to the 1890s. A full century before health equity and social determinants of health were formally defined, W. E. B. Du Bois was publishing sociological studies with health equity at the core (Mansky, 2018). In the seventh ward of Philadelphia, Du Bois personally conducted more than 5,000 detailed interviews with residents and their families, applying archival research and descriptive statistics to topics of health, education, occupation, religion, social structure, and family life. Historically, measurements of individuals and communities have been used to justify, rather than overcome, unconscionable class and race-based disparities (Eubanks, 2018). This is what spurred Du Bois’ research. The dominant ideology of the time, which inexcusably persists today, placed blame for challenges faced by Black communities on Black people themselves, citing perceptions not based on evidence. Du Bois proved that the disparities these communities faced were, in fact, the direct result of systemic, structural oppression.

Du Bois showcased the wisdom of his interviewees, their families, and their communities with intellectual and scientific rigor, demonstrating that these communities had been systematically shut out of the best education, training, and jobs based on the color of their skin (Johnson, 2019). Through detailed statistical analysis and sociological study, he demonstrated how the effects of racism made the American dream of a healthy family and thriving community exponentially harder to achieve for these communities. His visionary depiction of this Philadelphia community explained structural racism and its impacts to a world that wasn’t listening (Eubanks, 2018). Over the years, many others have recognized the need to address equity, but there is still substantive work yet to do. While recent efforts, such as President Biden’s January 2021 Executive Order on Advancing Racial Equity and Support for Underserved Communities Through the Federal Government, are a call to action, they illustrate the magnitude of the structural racism that lives on in the 21st century (White House, 2021).


Prioritizing Communities to Overcome Health Challenges

Unfortunately, the challenges of the 1890s persist today, as the current wave of health-relevant technological advancement focuses on tools that promote greater efficiency of the status quo, deepening existing inequalities by exacerbating already unjust processes or making them more cost-efficient. Modern medicine, and related fields such as scientific research, solution design, and technology development, would benefit from Du Bois’ work. The authors plead with readers to not only read this paper, but to seek out the compelling, direct wisdom of Du Bois’ work, such as The Philadelphia Negro and The Health and Physique of the Negro American. These publications take a social epidemiological approach to sharing a fact-based, empirical set of data to emphasize the humanity of Black people (Du Bois, 1899, 1906). Tragically and unconscionably, there is still a need to amplify his insights more than a century later. This is a leadership opportunity for all of us.

While the establishment of patient and family advisory councils is a growing trend in health care, and human-centered design thinking is increasingly prevalent in the development of new technologies (Holeman and Kane, 2019), the voices and power of people most impacted by inequality are not yet adequately or equitably included. As such, these actions can feel performative. To create equitable and useful solutions, we must ensure these voices are empowered and included at each stage of advancement and innovation: advisement, design, development, implementation, measurement, and evaluation. This is especially important when setting data governance principles; such councils may be best positioned to serve as key decision makers when establishing these standards.

Passive or ineffective engagement of patients and families results in a vicious cycle of poorly defined and poorly organized inputs and less desirable, non-impactful outcomes, typically referred to as a state of “garbage in, garbage out.” Examples of structural racism effectuated through health technology abound, including the algorithm found to be driving racist care plan recommendations for one of the largest corporations in the United States, a case in which optimizing for cost caused the algorithm to prioritize the health of (wealthier) White patients (Obermeyer et al., 2019). Even with the best of intentions, honing existing practices and structures without first solving for equity will inevitably result in bias and harm. A recent publication in JAMA found that clinical applications of deep learning algorithms are disproportionately trained on health data from just three states—California, Massachusetts, and New York—leading to concerns about the generalizability of their recommendations (Kaushal et al., 2020).

Other examples result not from historically biased or non-representative data but from non-diverse design teams. Health wearables and mobile health trackers have been criticized for failing to prioritize the needs of diverse populations. For example, optical sensors used for heart rate monitoring are less accurate on darker skin, which is thought to be at least partially a result of failures to test and train the sensors with diverse users (Hailu, 2019). In a basic search of publicly available health apps, few to none target the well-known, unconscionable maternal morbidity and mortality disparities long experienced by Black women and families. Because people may be marginalized along multiple intersecting aspects of identity and circumstance—including ethnicity, gender, orientation, language, education, age, geography, and digital access—such failures and harms compound when their input is not directly sought. Similar examples across the technology industry abound and confirm the urgency to include and empower those who experience harms and will be using the technology, actively representing their health needs and related challenges. Community-based participatory research methods and citizen science tools can provide a platform to engage and amplify the voices of persons, families, and communities (Katapally, 2019). However, these approaches are typically underfunded and labor and time intensive, making them difficult to scale and sustain.
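One practical form of the evaluation these examples call for is to disaggregate a device's or model's error by subgroup, rather than reporting a single aggregate figure that can mask exactly the disparity at issue. A minimal sketch, using entirely hypothetical paired heart rate readings:

```python
# Disaggregated accuracy audit: report mean absolute error (MAE) per
# subgroup, not just overall. All readings below are hypothetical.
from statistics import mean

# (subgroup, reference_hr, device_hr) -- hypothetical paired readings
readings = [
    ("lighter skin", 72, 71), ("lighter skin", 88, 87), ("lighter skin", 65, 66),
    ("darker skin", 74, 66), ("darker skin", 90, 81), ("darker skin", 63, 57),
]

def mae(rows):
    """Mean absolute error between reference and device readings."""
    return mean(abs(ref - dev) for _, ref, dev in rows)

overall = mae(readings)
by_group = {g: mae([r for r in readings if r[0] == g])
            for g in {g for g, _, _ in readings}}

print(f"overall MAE: {overall:.1f} bpm")  # the aggregate hides the disparity
for group, err in sorted(by_group.items()):
    print(f"{group}: MAE {err:.1f} bpm")
```

The same disaggregation applies to any health-relevant model or sensor: an acceptable aggregate error is not evidence of equitable performance until it holds for each community the product will serve.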

New programs, such as the National Institutes of Health (NIH) All of Us program, hold promise (NIH, 2021), and many point to the national-level data on patients, families, and communities held by public agencies as a resource that can be used to improve the delivery of health services (Data Across Sectors for Health, 2019). But beyond the serious privacy and security implications of linking such data across agencies, these datasets often oversample the most impacted communities, leading to data-driven systems not reflective of the broader population, and further exacerbating existing inequalities (Eubanks, 2018). Further, unless equity and inclusion are explicitly centered, queries of those systems—and the insights derived—may hold inherent bias, finding fault in the population sampled rather than identifying the barriers they face. This was the case when the federal government began collaborating with Silicon Valley beginning in the 1960s. Without meaningful racial and ethnic minority voices or representation in design and development, the resulting software encoded the implicit biases of its creators and explicitly reproduced structures of monitoring, exclusion, segregation, and disparate treatment of people of color (McIlwain, 2019). This is precisely what Du Bois sought to change.

The opportunity to address racism and structural inequity in health care and other facets of society is well-documented, and yet community-informed tools that would empower people and compel change are not readily available. There are few purposefully designed financial incentives and no short-term, efficiency-based impetus to compel organizations and agencies to prioritize equity and the voices of communities. While some EHR systems and health-relevant technologies are beginning to include data on the social determinants of health and basic human needs, datasets and algorithms remain flawed, and academic and policy leaders have yet to settle on standards, measures, and goals that are necessary for meaningful progress and accountability on health equity (Chen et al., 2020). It is not sufficient to better measure socioeconomic influences on health outcomes—it is imperative to track them back to their root causes and develop systemic solutions to address them.

The needed intervention today, on the part of technologists and public and private sector industry leaders alike, is to reconsider design priorities and actively unwind such overt and subtle structures that reflexively disadvantage some people and favor others. In a world replete with health-relevant technologies, we have the opportunity to pause and consider the whole landscape: the health system itself, the racist structure it rests on, the health technologies developed to date, their fundamentally inequitable design, and the great potential for health technology to truly transform well-being and care for everyone if these systems are rethought and rebooted.

This fundamental, cross-sector disruption is necessary to drive toward an equitable system that is accountable to those it serves and produces technologies that reduce, rather than further entrench, inequities in health, health care, and access.


Moving Toward a Person-Centered Health Information Ecosystem

The Office of the National Coordinator for Health Information Technology defined patient-generated health data as “health-related data—including health history, symptoms, biometric data, treatment history, lifestyle choices, and other information—created, recorded, gathered or inferred by or from patients or their designees” (Shapiro et al., 2012). This definition highlights the importance for system designers to recognize and identify who truly needs to be involved in the development of systems. For example, the traditional health care system upholds clinicians as gatekeepers, but emerging tools—such as those supported by application programming interfaces (APIs) mandated by the 21st Century Cures Act—may be used outside traditional clinical settings and not include clinicians (Shanbhag and Bender, 2020; Cohen and Mello, 2019). The Act requires that data held by health systems be accessible to patients via APIs.
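As an illustration of what Cures Act-style API access looks like in practice, the sketch below parses a FHIR R4 searchset Bundle—the JSON format such patient-access APIs return for a query like `GET [base]/Observation?patient=[id]`—into simple rows. The sample data and field selections are hypothetical; real responses contain many more elements.

```python
# Hypothetical sample of a FHIR R4 searchset Bundle, the response format
# of the patient-access APIs required under the 21st Century Cures Act.
SAMPLE_BUNDLE = {
    "resourceType": "Bundle",
    "type": "searchset",
    "entry": [
        {"resource": {
            "resourceType": "Observation",
            "code": {"coding": [{"system": "http://loinc.org",
                                 "code": "8867-4",
                                 "display": "Heart rate"}]},
            "valueQuantity": {"value": 72, "unit": "beats/minute"}}},
        {"resource": {
            "resourceType": "Observation",
            "code": {"coding": [{"system": "http://loinc.org",
                                 "code": "85354-9",
                                 "display": "Blood pressure panel"}]}}},
    ],
}

def extract_observations(bundle):
    """Return (LOINC code, display, value, unit) for each Observation."""
    rows = []
    for entry in bundle.get("entry", []):
        resource = entry.get("resource", {})
        if resource.get("resourceType") != "Observation":
            continue  # a searchset may also carry OperationOutcome, etc.
        coding = resource.get("code", {}).get("coding", [{}])[0]
        quantity = resource.get("valueQuantity", {})
        rows.append((coding.get("code"), coding.get("display"),
                     quantity.get("value"), quantity.get("unit")))
    return rows

print(extract_observations(SAMPLE_BUNDLE))
```

The point of the sketch is that, once such an API is available, a patient-facing tool outside the clinical setting can consume the same structured data the health system holds.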

Presently, data owned and used by health care systems are deployed to support health services, conduct research, and optimize cost, quality, safety, and efficiency to minimize risk and maximize profit. Although patient-generated health data support health care system operations, patients are not at the heart of any of these data transactions. A personalized information ecosystem aimed primarily at informing and empowering the patient would follow the patient for their entire life cycle, regardless of provider, insurer, or location where the data were generated. This personalized dataset would need to prioritize achieving better outcomes rather than driving transaction-based monetization and must be built on a foundation that prioritizes privacy as a right, not a widget for “Surveillance Capitalism” (Zuboff, 2019). In this new health ecosystem, EHR platforms—designed to be used for billing, with a focus on cost and efficiency—would need to shift toward prioritizing the needs of persons, families, and communities (Krumholz, 2020; Mandl and Kohane, 2020). This shift would require coordinated action from all stakeholders, including policy makers, academic institutions, funding organizations, health system leaders, technology companies, and individual patients and their advocates.

Here again, the engagement of all individuals, patients, families, and communities—particularly those historically marginalized—is necessary to achieve this vision of a person-centered health information ecosystem. Empowered with knowledge of what data they need to access, what data points are most critical for them to understand, how they can access data, how to choose what they are willing to share, and what they must protect, individuals become a true partner in achieving their life and health goals. This partnership is better for individuals, families, and communities, and it creates the engagement and prevention necessary to reduce system costs. This vision is within our reach—and can fundamentally pivot our health care system functions and improve our nation’s health.


Realizing That Our Data Are Inequitable

Extensive research shows that race-based prejudices and biases grievously impact the level of care and compassion shown to people based on their skin color both within and beyond the health care system (Hoffman et al., 2016). Leadership across sectors has an immediate opportunity to act to address this. In describing the rationale to track health outcomes by demographic factors, including race, gender, sexual identity, disability status, and geographic location, the Department of Health & Human Services’ Healthy People 2020 defines a health disparity as:

a particular type of health difference that is closely linked with social, economic, and/or environmental disadvantage. Health disparities adversely affect groups of people who have systematically experienced greater obstacles to health based on their racial or ethnic group; religion; socioeconomic status; gender; age; mental health; cognitive, sensory, or physical disability; sexual orientation or gender identity; geographic location; or other characteristics historically linked to discrimination or exclusion.

Structural disparities are well known to impact health outcomes, and racism and other established systems of structural inequity and underinvestment in marginalized communities have perpetuated unequal care for generations.

Much of the current health innovation landscape depends on and produces individual-level data. Whether used to train algorithms or reinvent care coordination, numerical representations about individual health and identity, including and especially those who experience disparities, are invaluable inputs to the development of health-relevant technologies. Increasingly, data leveraged en masse have the potential to impact health care decision making, the provision of resources, and the development of technologies. The value of data as a commodity has been compared to that of other valuable materials; data have even been called “the new oil.”

Any data point is a measurement about something or someone at a particular point in time. But data are invisibly shaped by what questions are asked; by who or what takes the measurement; by who analyzes and interprets the information; and by the rules, norms, and historical context of implicated systems. Effects are then generated by how the data are operationalized and in service of what goals—goals often set by the same entities shaping the data. These very human margins of error and obfuscation directly result in the disparate outcomes of technologies designed to be “neutral” (Obermeyer, 2018). In fact, no technology is neutral; every tool is shaped by the context of its design and implementation. As the collection and use of data have moved from paper to basic technologies to machine learning and AI, the potential impact of bias and error inherent in the design is ever greater.

Health data collected during decades of overt racial exclusion and injustice cannot be “neutral,” yet despite deep and well-documented flaws, this same data powers our research, innovation, and health technologies. These data not only create a flawed representation of reality—as seen through the eyes of those historically empowered to ask questions and shape rules and norms—but are the direct result of historically racist and exclusionary systems. Technology applied without an explicitly anti-racist lens will thus perpetuate or exacerbate racism and inequity by default.

Further, data collected, analyzed, and interpreted without a lens of equity may cause us to ignore key human impacts and perpetuate further disparities. For example, maternal mortality rates are much higher among Black women than White women. If Black women are not involved in identifying which data to collect, and data systems are not set up to capture that data, we may miss collecting critical information. If these data structures are not redesigned with Black women and with equity in mind, it may be easy to assume that these disparities exist due to individual behaviors rather than the structural determinants of health and racist systems that have influenced Black women’s health. As with the wearable heart rate monitor example above, data and technology developed without a diverse and representative set of empowered stakeholders will continue to leave behind those whom biased systems have never favored. As an example of a remedy, the Health Level 7 (HL7) Fast Healthcare Interoperability Resources (FHIR) Argonaut Project has endorsed a FHIR Questionnaire resource that, if adopted by EHR systems, could dramatically lower administrative burdens for such input from Black women to be incorporated at scale.
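To make the remedy concrete, the following is a minimal, hypothetical sketch of a FHIR Questionnaire and a matching QuestionnaireResponse, expressed as Python dictionaries. The items, wording, and helper function are illustrative assumptions, not an Argonaut-endorsed instrument.

```python
# Hypothetical FHIR Questionnaire for structured patient-reported input;
# linkIds, item types, and question text are illustrative only.
maternal_health_questionnaire = {
    "resourceType": "Questionnaire",
    "status": "draft",
    "title": "Postpartum symptom and support screening (illustrative)",
    "item": [
        {"linkId": "1", "type": "boolean",
         "text": "Have you had severe headaches or vision changes?"},
        {"linkId": "2", "type": "boolean",
         "text": "Do you feel your concerns are heard by your care team?"},
        {"linkId": "3", "type": "string",
         "text": "What support would be most helpful to you right now?"},
    ],
}

def build_response(questionnaire, answers):
    """Pair answers with questionnaire items as a QuestionnaireResponse."""
    items = []
    for item in questionnaire["item"]:
        value = answers.get(item["linkId"])
        if value is None:
            continue  # unanswered items are simply omitted
        key = "valueBoolean" if item["type"] == "boolean" else "valueString"
        items.append({"linkId": item["linkId"], "text": item["text"],
                      "answer": [{key: value}]})
    return {"resourceType": "QuestionnaireResponse",
            "status": "completed", "item": items}

response = build_response(maternal_health_questionnaire,
                          {"1": True, "2": False, "3": "Lactation support"})
print(response["item"][0])
```

Because Questionnaire and QuestionnaireResponse are standard FHIR resource types, an EHR that supports them could ingest this kind of structured input without a custom integration for each care setting.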

Because the data we choose not to collect are as relevant as those we do, the call for increased measurement to hold ourselves accountable to equitable outcomes must not be confused as a justification for a broader surveillance infrastructure. Measurement must be done in partnership with empowered communities, reflecting the desired outcomes of those who are represented in resulting datasets, not just those of a technology’s designers, deployers, or funders. The artist and researcher Mimi Onuoha (2018) has described data voids, including “the number of trans people killed or injured in instances of hate crime,” as well as “poverty and employment statistics that include people who are behind bars,” that begin to demonstrate which systemic realities have been traditionally ignored rather than measured and repaired. At a national and global level, in 2014, the United Nations Committee on the Elimination of Racial Discrimination called on the United States to standardize data collection and analysis regarding disparities in maternal and infant mortality. To date, this has not been achieved—reinforcing a core argument of this paper, that the collection of data that would problematize fundamental structures of power and the inequitable status quo has not been prioritized.


Ensuring That Data Governance Evolves to Include Equity and Privacy

We are not without precedent for intervention. Yet established codes of ethics, such as human and civil rights frameworks, and environmental accountability measures, such as the carbon footprint, have not yet been extended to the governance of health-relevant data and technology. Combined with a collective reorientation toward prioritizing equity, embedding rights frameworks into our innovation efforts will be transformational.

If our innovation ecosystem is currently seeking to decrease costs by improving the efficiency of systems regardless of deepening inequities, what would it take to reverse this inertia and move us to collectively and sustainably drive toward equity? Though structural inequity is now steering our innovation ecosystem by default, evolving technologies have the potential to exacerbate or ameliorate disparities. We can collectively change course. By standardizing and maintaining a person-centered, rights-based approach to health research, policy making, innovation, and definitions of progress, we can drive our systems to deliver equity.

Specific to health-relevant technologies, this shift in operating logic must go beyond “patient-centered care” to a true person-, family-, and community-centered approach, engage and empower diverse populations with disparate needs over their life course, and source wisdom directly from these populations. As the use of technology continues to become more prevalent in all aspects of our lives, the amount of health-relevant data we produce will continue to increase. It is critical to take a rights-based approach toward integrating these data into health services and to strive toward the Quintuple Aim (Matheny et al., 2019), which adds equity and inclusion to the existing aims of improving patient experience, reducing cost, improving population health, and supporting care team well-being (see Figure 1).

The possibility of linking personal health information (PHI) with other data sources holds the potential to move from a reactionary system that provides care to the sick to a proactive ecosystem that cultivates health and wellness. The line between PHI data and consumer data is already blurring, as data streamed from both consumer and medical devices become a part of our daily lives and are increasingly used to make decisions about the services, experiences, and opportunities we may access. As this trend advances, grave equity concerns unsurprisingly arise, often leading to exacerbated privacy concerns.

In trying to account for historically racist data sources, some developers are rushing to include more diverse data in the design of their algorithms. However, extractive data practices (Hoffman, 2020) can harm the targeted populations and be exploitative, such as when contractors for a major technology corporation reportedly targeted and tricked homeless people in Atlanta into providing data to improve facial recognition for darker skin (Elias, 2019).

Further, despite clear equity and accuracy problems with these data, insurers are beginning to require activity tracking for policy eligibility (Barlyn, 2018), incurring warnings from privacy advocates (Hart, 2018; O’Neill, 2018). Popular health apps routinely share sensitive data like blood pressure and menstrual cycle information with Facebook and other private corporations (Cole, 2019; Neidig, 2019) and with insurance companies and employers (Harwell, 2019). Those same corporations may refuse to share data with the individuals on whom data have been collected and to be transparent regarding the use of that data (Williamson-Lee, 2019). Voluntary efforts by such industry leaders as the Apple App store present an improvement in the perceived transparency of some apps, but the long-term impact is unclear (McGraw and Mandl, 2021; Perez, 2020). Indeed, most diabetes apps lack even a basic privacy policy (Blenner et al., 2016). Emerging efforts, such as those led by an alliance of patient-caregiver groups, physicians, and health systems, to encourage a broader code of conduct and improved security and interoperability are necessary but at present insufficient (Creating Access to Real-time Information Now through Consumer-Directed Exchange [CARIN] Alliance, 2019).

Patient-generated data have been challenged as unreliable and inadequately protected, but even more challenging is the prospect of generating meaningful and equitable medical decision making from them, given that they typically must be incorporated into existing platforms, which are themselves flawed with bias. As described above, without representative datasets, the algorithms being developed for myriad interventions in health, including clinical decision support, will replicate and deepen existing disparities and inequities. Finding ways to center the patient voice in such “data-driven” technologies is paramount. Furthermore, such algorithms are typically proprietary and lack both transparency and reproducibility—making bias and its deleterious effects insidious. This “black box” of algorithmic decision making is a driver of the status quo, incongruous with a charted course toward equity.

Ultimately, all personal data are health-relevant data in a monetized ecosystem, requiring care and protection at the level of sensitive health information. But far from being protected by HIPAA and the Common Rule, emerging troves of health-relevant data are being monetized and traded between private companies. In addition, large datasets of protected health information are controlled by academic institutions, research institutions such as the NIH, and health systems, which poses significant challenges to interoperability and questions over who benefits from the research that was conducted using data collected from millions of individuals (National Academy of Medicine [NAM], 2020). Significant advances in data analytics require the ability to combine disparate sets of identifiable data, so even companies with monopolies on search and location data stand to benefit from harvesting health-relevant data on individuals and collaborating with health systems and academic medical centers, which control large quantities of that data (Stanford Medicine, 2021; Vaidya, 2021).

An article in the Wall Street Journal in November 2019 on Google’s Project Nightingale described how the company had harvested “tens of millions” of medical records from health provider Ascension, including patient names, lab results, diagnoses, hospitalization records, and prescriptions (Copeland, 2019). As described by Sidney Fussell in the Atlantic:

Google says it doesn’t combine its user data with Ascension patient data. But the fact remains that the data it already has on all its users are tremendously revealing. Your IP address contains information about where you live, which in turn is associated with social determinants of health such as income, employment status, and race. Search terms such as nearest food pantry or nearest HIV test can offer further clues about income level, sexual orientation, and so on. (Fussell, 2019)

Regulations governing PHI, such as HIPAA, were not designed for the digital age, and companies do not need basic consent to collect, store, package, mine, and sell the vast majority of health-relevant data (Warzel, 2019).

Critics of regulation often assert that the consumer or patient is responsible for protecting their own privacy while citing that data sharing is essential to health innovation (Miner, 2019). But as Ciara Byrne argued, trading privacy for survival is a tax on the poor (Byrne, 2019). It is no longer possible or reasonable to avoid making use of the digital services that track and surveil us on the back end. By one estimate, it would take 76 eight-hour days for the average user to read all of the privacy policies, terms, and conditions (T&Cs) they agree to in a single year (Wagstaff, 2012).

And even where users take care to make use of platforms’ built-in privacy controls, companies are profiting from data monetization. As Representative Janice Schakowsky, Chair of the House Consumer Protection and Commerce Subcommittee, summed it up at a hearing held on February 26, 2019:

Modern technology has made the collection, analysis, sharing and sale of data both easy and profitable. . . . Personal information is mined from Americans with little regard for the consequences. In the last week alone, we learned that Facebook exposed individuals’ private health information [that] they thought was protected in closed groups and collected data from third-party app developers on issues as personal as women’s menstrual cycles and cancer treatments. People seeking solace may instead find increased insurance rates as a result of the disclosure of that information. (U.S. Congress, 2019)

It is clear we need a multisector, large-scale rethinking of data governance based on the insidious ways data are used to manipulate, influence, and control individuals and communities. Individuals need more control over who has access to, and profits from, sensitive information about them.


Restarting the System

Large-scale efforts have been made in recent years to redirect health innovation and practice toward new targets and goals. The proliferation of concepts like the aforementioned Quintuple Aim, “value-based care,” and “patient-centeredness” demonstrates widespread understanding that the priorities that both underpin and guide how health care is delivered and health technologies are created must shift at the highest level. However, these efforts have too often not been explicit about anti-racism and structural inequity in our society. As has been reiterated throughout this paper, it is clear that these systems are responsible for inequities, and we must address these systems intentionally in the way we design health technologies, measure health behavior, and intervene for health outcomes.

Decisions regarding measuring health behavior are often made without a focus on equity. For example, in the last century, the world addressed global hunger by increasing yield per acre, not nutrition per acre (which is understandably harder to measure). Today, yield per acre is considerably higher, but nutrition per acre is much lower, leading to obvious disparities. The privileged have access to highly nutritious food, and the less privileged have access to less nutritious food, if they have access at all. The most insidious portion of this equation is that, by the standards of the goals set, global hunger is being addressed by creating more food per acre; in contrast, nutrition per acre is unmeasured and overtly worsening in many places.

A similar focus on measuring only access to health care leads to further exacerbation of inequities, in that the privileged have access to “good” medicine that actually cares for their health and proactively looks after their well-being, whereas the less privileged have “empty medicine” that is tailored to someone else’s needs and does not address the root issues of their own health challenges. Similar to accessing only less nutritious food, these populations receive inefficient care, which compounds with cross-sector disparities to cause poor outcomes.

Traditionally, when health systems and researchers seek to optimize systems, cost and efficiency are identified as the key problems to be solved—with equity an afterthought at best. Naturally, companies seeking to deliver technologies purchased by those systems, or by relevant stakeholders, will target what is important to the customer. When health systems and researchers intentionally center on human beings, specifically including those most impacted by structural racism, and make cost and efficiency a second-tier priority in the short term, inequity becomes the core problem to be solved. This does not mean cost is irrelevant, as cost-effective health care is core to ensuring access and sustainability. Neither does it disregard the need for efficiency; in fact, efficiency is paramount—applied on a foundation of equity. But a myopic focus on deriving cost and efficiency for a narrow set of individuals is inherently flawed logic; cost and inefficiency for those excluded from that equation—both in human and financial terms—balloons, as it is now (Tikkanen and Abrams, 2020).

The authors of this paper assert that sustained cost reduction and efficiency can only be achieved by solving for equity. A fundamental, cross-sector reset is needed as systems have been built on hundreds of years of accumulating inequities. This shift will benefit individual humans, as well as systems, corporations, and the nation. By centering equity and the people who will use health care and health technology, health care and health technology will be more effective than ever. Health care systems will be better able to efficiently prevent harm and create room to care for those who need it. Health care systems and health technology can empower individuals and communities with actionable data and information relevant to their health and life goals. Individual actions to improve health will be not only possible but accessible.

Ultimately, centering equity will reduce the life cycle cost of health, health care, and health technology. But it is only by engaging persons and communities—not as recipients, but as leading partners—in the optimization of health that we can retune our operating logic to achieve an ethical, cost-effective future.

One recent example of action toward this necessary shift is the set of Equity Metrics mandated by the State of California to determine the COVID-19 safety level of each county (California Department of Public Health, 2020). California requires measurement of the disparity in infection rates across neighborhoods, targeting investments to support disproportionately impacted populations (California Department of Public Health, 2020). Unsurprisingly, this mandate triggered many counties to prioritize and fund health equity.
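A disparity metric of this kind is straightforward to compute. The sketch below uses hypothetical numbers to compare test positivity in a county's least-advantaged neighborhood quartile against the county-wide rate, in the spirit of, though not replicating, California's published methodology.

```python
# Hypothetical test counts by neighborhood quartile;
# quartile "q1" represents the least-advantaged neighborhoods.
county = {
    # quartile: (positive tests, total tests)
    "q1": (180, 2000),
    "q2": (120, 2000),
    "q3": (90, 2000),
    "q4": (60, 2000),
}

def positivity(pos, total):
    """Test positivity as a simple proportion."""
    return pos / total

# County-wide rate pools all quartiles; the equity comparison asks how
# much worse the least-advantaged quartile fares relative to that rate.
countywide = positivity(sum(p for p, _ in county.values()),
                        sum(t for _, t in county.values()))
least_advantaged = positivity(*county["q1"])
disparity_ratio = least_advantaged / countywide

print(f"county-wide positivity: {countywide:.1%}")            # 5.6%
print(f"least-advantaged quartile: {least_advantaged:.1%}")   # 9.0%
print(f"disparity ratio: {disparity_ratio:.2f}")              # 1.60
```

A mandate keyed to a metric like this makes the disparity itself, not just the aggregate rate, the number a county must improve to reopen, which is what drove counties to fund equity work.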


Advancing toward a Rights-Based Regulatory Framework

What would it take to move toward a health innovation ecosystem that prioritizes the agency, dignity, and consent of those who use it? Thoughtful regulations on the use of health-relevant data are a useful way to start. A Rock Health and Stanford study published in November 2019 found that just 10% of those surveyed for the study would be willing to share their health information with technology companies. Researchers noted that “survey takers were less willing to share data today than they were in 2017. For example, in 2017, 86% of participants said they would share data with physicians, while today that number has dropped by 13 percentage points” (Lovett, 2019). Many researchers and health pundits have pointed to a lack of trust on the part of consumers and patients; however, blaming patients for a lack of trust ignores why that distrust exists in the first place. If individuals had not been wronged in the past, they would not be distrustful now. As set forth in Artificial Intelligence in Health Care: The Hope, the Hype, the Promise, the Peril, centering human rights values requires a courageous change in thinking and input from those who have lost trust (Matheny et al., 2019).

Writing in Psychology Today, Dr. Elias Aboujaoude (2019) argues that “tougher privacy laws have become a psychological and public health necessity.” California’s Consumer Privacy Act, likely to set a precedent for state-level privacy protections in the United States, came into effect on January 1, 2020 (Goel, 2019). Notably, the law includes provisions to fill gaps in existing health data protections like HIPAA, extending a regulatory framework to all for-profit entities operating in the state, not just so-called “covered entities” (Goel, 2019). The American Medical Association put forward a new proposal for privacy practices, which gives patients more control over their data when used by some private entities but does not apply those protections to use by health care providers and health insurance plans (McGraw et al., 2020). OpenNotes has submitted comments on modifying HIPAA rules to the Office for Civil Rights, but there is no indication that substantive action is being considered (OpenNotes, 2019). To feel centered and comfortable in the vision of a new health ecosystem, individuals will need to be confident that data they have asked to be protected will, in fact, be protected. The developments described above are promising but insufficient in scope.

Paradoxically, regulations like HIPAA, ostensibly designed to enable appropriate data sharing, are often interpreted as overly restrictive, dissuading providers from sharing data even when a patient has approved such sharing (Berwick and Gaines, 2018). The opposite side of the coin from necessary data protections is the ability to share patient data across systems and silos when requested. This sharing will require enhanced system interoperability, currently a huge barrier to sharing patient data across health locations and systems, let alone outside of the health care ecosystem. Current data management and regulatory mechanisms, which treat consumer and patient data in silos and hamper interoperability, are inadequate to support equitable and improved health outcomes. Putting control over data in the hands of individuals is essential for designing a healthier future, but it must be done in a way that prioritizes end-user needs, not the needs of systems, and does not place an outsized burden on marginalized population groups. Individuals must understand what data they have access to, what the data mean, and what it means to share or restrict access to said data.

Individuals can share their data in many ways that improve health for all, including by informing research studies and other investigations into population health. The ways data will be used, shared, and analyzed in research must be made absolutely clear at the start of the study. In the New York Times, Abdullah Shihipar argued for “legislation that restricts agencies’ access to data for reasons other than the purpose for which it was collected” (Shihipar, 2019). The intent is not to shut down the use of data for innovation or research, but rather to ensure that data use is centered firmly on the public interest—in other words, not solely to control costs or increase revenues in the health care system, but first to promote the health and well-being of those whose data are represented.

There are also relevant efforts to curb the excesses of extractive capitalism that could be applied to the business of health, such as the Business Roundtable’s 2019 decision to de-prioritize shareholder value—the lobbying group for large companies is emphasizing benefits to employees, value to customers, and environmental responsibility, not just shareholder profits (Gelles and Yaffe-Bellany, 2019). Such fundamental reorientation away from direct financial expenditures as a primary driver can be transformational. When the authors of a recent study showed that Black patients would be incorrectly flagged by an algorithm already in use to guide care planning, they were able to significantly de-bias the model simply by removing “cost” as the target of optimization (Obermeyer et al., 2019). Data that have been or are being collected today are increasingly available to whoever can pay for them, beyond health and into the public and private spheres and across public agencies. These data may or may not be reliably de-identified (Rocher, 2019), and may or may not be held in secure containers. Current laws and regulations are insufficient to ensure that these data are appropriately protected and shared. Interoperability, data security, and ethical guardrails are fundamentally important, but because of nascent algorithmic capabilities, it is easy to leave rights-based frameworks out of the conversation. This is not new territory, but the stakes are ever higher. This is a critical juncture to ensure that we do not leave humanistic progress behind in the push toward technological advancement (Price and Cohen, 2019).
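The label-choice effect reported by Obermeyer et al. can be illustrated with synthetic data: when access barriers suppress one group's spending, ranking patients by predicted cost under-selects that group for extra care, while ranking by a direct health measure does not. All numbers and parameters below are hypothetical, chosen only to exhibit the mechanism.

```python
import random

random.seed(0)

# Hypothetical population: both groups have the same illness burden, but
# group B faces access barriers that suppress its observed spending.
patients = []
for _ in range(10_000):
    group = random.choice(["A", "B"])
    illness = random.gauss(3, 1)              # chronic-condition burden
    access = 1.0 if group == "A" else 0.6     # spending suppressed for B
    cost = max(0.0, illness * access + random.gauss(0, 0.2))
    patients.append((group, illness, cost))

def share_of_group_b(ranked, top_k=1000):
    """Fraction of group B among the top_k highest-ranked patients."""
    top = sorted(ranked, reverse=True)[:top_k]
    return sum(1 for _, g in top if g == "B") / top_k

# Same patients, two optimization targets: observed cost vs. health need.
by_cost = share_of_group_b([(c, g) for g, _, c in patients])
by_health = share_of_group_b([(i, g) for g, i, _ in patients])

print(f"share of group B flagged, cost target:   {by_cost:.0%}")
print(f"share of group B flagged, health target: {by_health:.0%}")
```

Because illness burden is identical across groups here, the health-based ranking flags both groups at roughly equal rates, while the cost-based ranking almost entirely excludes the group whose spending was suppressed, which is the essence of the de-biasing result.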



Health innovation must center the ethical imperative of equity before, and in order to achieve, cost and efficiency goals. This ethos is not opposed to the logic that guides the economic imperative; it is simply broader and looks further upstream than traditional economic modeling. Far from being antithetical to innovation, progressive granting of rights and robust regulations are core to ensuring “progress” does not come at the expense of the already disempowered, a historical trend that is not only unconscionable but also costly and inefficient in the long term. There is no meaningful progress without equity.

As we move toward a future in which we are increasingly surveilled in all aspects of our lives, it is critical to adapt and adhere to rights-based frameworks (Matheny et al., 2019) that ensure transparent knowledge and value creation, empower the most marginalized, and move toward a reality in which power is more equitably distributed. Though rights frameworks are still evolving, notably toward becoming more collectivist in nature, we have solid and established bases of human and civil rights from which to work.

As the authors of this paper have argued, the high cost of health care in this country is a symptom of a deeper issue: a culture that has historically treated some lives as more valuable than others. This is a structural issue, and costs will not be sustainably contained until we confront the reality of racism and its pernicious effects on public health. The uncontrolled growth of health care costs and sinking rates of effectiveness are perpetuated emergencies, and we cannot address their real costs without confronting our willingness to keep funding a system that produces dramatically different outcomes based on skin color and other personal characteristics, with no clear path toward equity.

In the United States, there is a pervasive spread of data-driven business models that some groups, such as Amnesty International, have argued are a threat to human rights (Amnesty International, 2019). This reality is colliding with a complicated public health stakeholder ecosystem. The tech industry sees health care as a key market opportunity. Providers and payers within the health system are heavily incentivized to drive toward compliance and wide margins. Meanwhile, patients and families across income and health literacy levels are too often left scrambling to coordinate their own care. Across this stakeholder landscape, the proliferation of personal data adds complexity to an ecosystem traditionally managed in silos. Data flow continuously across sectors in pursuit of business opportunities and profit, while the individuals represented in datasets are left without access to their own information.

As private sector companies increasingly build products and services that seek to influence end-user health and behavior, the role of the consumer becomes inextricably intertwined with the roles of person, patient, and citizen (Sarasohn-Kahn, 2019). A consensus is emerging that all personally identifiable data are health data, which raises broad questions of privacy, profit, and equity (Warzel, 2019). Further, the personal data disclosed in the process of consuming private sector goods and services hold important implications and insights into our habits and health at both individual and population levels. Some argue that these “social determinants”—including where we live, what we buy, and how we spend our time—are better predictors of health status and health risks than information in the medical records maintained by our doctors and other health care providers. However, due to the potential for this information to be used to negatively impact persons, families, and communities, it is essential to go beyond simple awareness of the importance of these determinants to a truly person-centered, rights-based framework.

The authors of this paper are not the first to argue for centering equity and anti-racism in efforts to improve population health—but rather the latest to amplify the message (Sarasohn-Kahn, 2019). In tying together decades of scholarship on the costs of inequity and the dangers of technology applied absent a focus on individual and collective rights, our effort is a multisector call to action to redefine concepts of value and progress at a fundamental level. The call to action at the beginning of this paper includes a revolutionary “resetting” of many systems to refocus on equity, understanding that cost-effectiveness and efficiency will follow. The authors have argued that rights and regulations are structures that promote, not constrain, innovation, ensuring its benefits are equitably distributed and not corrosive to societal well-being and public health in the long term. Mistakes will be made; downstream and unintended consequences are guaranteed, particularly when information technology is concerned. But collectively, we—all of us, policy makers, health system leaders, academic institutions, technology companies, funders, communities, and individuals—can improve our efforts to anticipate and prevent deleterious impacts, especially for the most marginalized, by amplifying their wisdom. We can agree not to prioritize short-term growth and profit over equity or the dignity and agency of individuals, families, and communities. We must do a better job of measuring the complex social factors that influence public health outcomes, holding ourselves accountable to promoting equity and centering impacted communities in the design and deployment of health-relevant technologies. We can achieve a more equitable and healthy future, but we must have the courage to see beyond short-term results toward the long-term life cycle benefits for all. We must all answer the call to action and meet this imperative to center health equity in our sphere of influence.



References

  1. Aboujaoude, E. 2019. Digital Privacy: Don’t Get Over It. Psychology Today, May 24. Available at: (accessed October 5, 2020).
  2. Agency for Healthcare Research and Quality. 2013. Working With Patients and Families as Advisors Implementation Handbook. Available at: (accessed October 5, 2020).
  3. Almeida, D., K. Shmarko, and E. Lomas. 2021. The ethics of facial recognition technologies, surveillance, and accountability in an age of artificial intelligence: a comparative analysis of US, EU, and UK regulatory frameworks. AI and Ethics.
  4. Amnesty International. 2019. Surveillance giants: How the business model of Google and Facebook threatens human rights. Available at: (accessed October 5, 2020).
  5. Bailey, Z. D., N. Krieger, M. Agénor, J. Graves, N. Linos, and M. T. Bassett. 2017. Structural racism and health inequities in the USA: evidence and interventions. The Lancet 389(10077):1453-1463.
  6. Barlyn, S. 2018. Strap on the Fitbit: John Hancock to sell only interactive life insurance. Reuters, September 19. Available at: (accessed October 5, 2020).
  7. Berwick, D. M. and M. E. Gaines. 2018. How HIPAA Harms Care, and How to Stop It. JAMA 320(3):229-230.
  8. Blenner, S. R., M. Köllmer, A. J. Rouse, N. Daneshvar, C. Williams, and L. B. Andrews. 2016. Privacy Policies of Android Diabetes Apps and Sharing of Health Information. JAMA 315(10):1051-1052. 10.1001/jama.2015.19426.
  9. Bodislav, D. A. 2012. The new economy: efficiency, equity and sustainable economic growth. Calitatea 13(1):21.
  10. Byrne, C. 2019. Trading privacy for survival is another tax on the poor. Fast Company, March 18. Available at: (accessed October 5, 2020).
  11. California Department of Public Health. 2020. Blueprint for a Safer Economy: Equity Focus. Available at: (accessed October 16, 2020).
  12. CARIN (Creating Access to Real-time Information Now through Consumer-Directed Exchange) Alliance. 2019. The CARIN Trust Framework and Code of Conduct. Available at: (accessed June 11, 2021).
  13. Centers for Disease Control and Prevention. 2020. Healthy People 2020. Available at: (accessed October 5, 2020).
  14. Chen, M., X. Tan, and R. Padman. 2020. Social determinants of health in electronic health records and their impact on analysis and risk prediction: A systematic review. Journal of the American Medical Informatics Association 27(11):1764-1773.
  15. Cohen, I. G., and M. M. Mello. 2019. Big Data, Big Tech, and Protecting Patient Privacy. JAMA 322(12):1141-1142. 10.1001/jama.2019.11365
  16. Cole, S. 2019. Health Apps Can Share Your Data Everywhere, New Study Shows. Vice, March 20. Available at: (accessed October 5, 2020).
  17. Copeland, R. 2019. Google’s ‘Project Nightingale’ Gathers Personal Health Data on Millions of Americans. The Wall Street Journal, November 11. Available at: (accessed October 5, 2020).
  18. Crawford, K., and T. Paglen. 2019. Excavating AI: The Politics of Images in Machine Learning Training Sets. Available at: (accessed June 24, 2021).
  19. Data Across Sectors for Health. 2019. Sector: Health Care, Primary Stakeholder: Hospitals/Health Systems. Available at: (accessed October 5, 2020).
  20. Delbanco, T., D. M. Berwick, J. I. Boufford, E. Levitan, G. Ollenschläger, D. Plamping, and R. G. Rockefeller. 2001. Healthcare in a land called PeoplePower: nothing about me without me. Health Expectations 4(3):144-50.
  21. Du Bois, W. E. B. 1899. The Philadelphia Negro: A Social Study. Philadelphia, PA: University of Pennsylvania.
  22. Du Bois, W. E. B. 1906. The Health and Physique of the Negro American: Report of a Social Study Made under the Direction of Atlanta University: Together with the Proceedings of the Eleventh Conference for the Study of the Negro Problems, Held at Atlanta University, on May the 29th, 1906. Atlanta University Publications, Number Eleven. Atlanta, GA: Atlanta University Press.
  23. Elias, J. 2019. Google contractor reportedly tricked homeless people into face scans. CNBC, October 3. Available at: (accessed October 5, 2020).
  24. Eubanks, V. 2018. Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. New York, NY: St. Martin’s Press.
  25. Fussell, S. 2019. Google’s Totally Creepy, Totally Legal Health-Data Harvesting. The Atlantic, November 14. Available at: (accessed October 5, 2020).
  26. Gelles, D., and D. Yaffe-Bellany. 2019. Shareholder Value Is No Longer Everything, Top C.E.O.s Say. The New York Times, August 19. Available at: (accessed October 5, 2020).
  27. Goel, S. 2019. California Consumer Privacy Act and the future of the health data economy. Med City News, November 7. Available at: (accessed October 5, 2020).
  28. Gravity Project Team. 2019. The Gravity Project: A Social Determinants of Health Coding Collaborative Project Charter. The SIREN Project. Available at: (accessed October 5, 2020).
  29. Hailu, R. 2019. Fitbits and other wearables may not accurately track heart rates in people of color. Stat News, July 24. Available at: (accessed October 20, 2020).
  30. Hart, R. D. 2018. Don’t share your health data with insurance companies just for the perks. Quartz, September 11. Available at: (accessed October 5, 2020).
  31. Harwell, D. 2019. Is your pregnancy app sharing your intimate data with your boss? The Washington Post, April 10. Available at: (accessed October 5, 2020).
  32. Henry, J., Y. Pylypchuk, T. Searcy, and V. Patel. 2016. Adoption of Electronic Health Record Systems among U.S. Non-Federal Acute Care Hospitals: 2008–2015. ONC Data Brief Available at: (accessed October 5, 2020).
  33. Hoffman, A. L. 2020. Terms of Inclusion: Data, Discourse, Violence. New Media and Society. Available at: (accessed October 5, 2020).
  34. Hoffman, K. M., S. Trawalter, J. R. Axt, and M. N. Oliver. 2016. Racial bias in pain assessment and treatment recommendations, and false beliefs about biological differences between blacks and whites. Proceedings of the National Academy of Sciences 113(16):4296-4301.
  35. Holeman, I., and D. Kane. 2019. Human-centered design for global health equity. Information Technology for Development 26(3).
  36. IOM (Institute of Medicine). 2003. Unequal Treatment: Confronting Racial and Ethnic Disparities in Health Care, edited by B. D. Smedley, A. Y. Stith, and A. R. Nelson. Washington, DC: National Academies Press.
  37. Johnson, G. 2019. The times and life of W.E.B. Du Bois at Penn. Penn Today, February 22. Available at: (accessed October 5, 2020).
  38. Jones, T.M. 1980. Corporate Social Responsibility Revisited, Redefined. California Management Review 22(3):59-67. doi:10.2307/41164877.
  39. Katapally, T. R. 2019. The SMART Framework: Integration of Citizen Science, Community-Based Participatory Research, and Systems Science for Population Health Science in the Digital Age. JMIR mHealth and uHealth 7(8):e14056.
  40. Kaushal, A., R. Altman, and C. Langlotz. 2020. Geographic Distribution of US Cohorts Used to Train Deep Learning Algorithms. JAMA 324(12):1212-1213.
  41. Krumholz, H. M. 2020. An ‘Epic’ pushback as U.S. prepares for new era of empowering patient health data. Stat News, January 27. Available at: (accessed October 5, 2020).
  42. Lovett, L. 2019. Rock Health, Stanford report: Consumerization of health reshaping doctor-patient relationships, data conversations. MobiHealthNews, October 29. Available at: (accessed October 5, 2020).
  43. Lovis, C. 2019. Unlocking the power of artificial intelligence and big data in medicine. JMIR Publications 21(11). Available at: (accessed October 5, 2020).
  44. Maddox, T.M., J. S. Rumsfeld, and P. R. O. Payne. 2019. Questions for Artificial Intelligence in Health Care. JAMA 321(1):31-32.
  45. Mandl, K. D., and I. S. Kohane. 2020. Epic’s call to block a proposed data rule is wrong for many reasons. Stat News, January 27. Available at: (accessed October 5, 2020).
  46. Mansky, J. 2018. W. E. B. Du Bois’ Visionary Infographics Come Together for the First Time in Full Color. Smithsonian Magazine, November 15. Available at: (accessed October 5, 2020).
  47. Matheny, M., S. Thadaney Israni, M. Ahmed, and D. Whicher, Editors. 2019. Artificial Intelligence in Health Care: The Hope, the Hype, the Promise, the Peril. NAM Special Publication. Washington, DC: National Academy of Medicine. Available at: (accessed October 5, 2020).
  48. McGraw, D, D. B. Mendelsohn, and M. Savage. 2020. The Right Prescription For Health Data Privacy: Reflections On The AMA’s New Privacy Principles. Health Affairs Blog. Available at: (accessed September 29, 2020).
  49. McGraw, D., and K. D. Mandl. 2021. Privacy protections to encourage use of health-relevant digital data in a learning health system. npj Digital Medicine 4(2).
  50. McIlwain, C. D. 2019. Black Software: The Internet & Racial Justice, from the AfroNet to Black Lives Matter. New York, NY: Oxford University Press.
  51. Onuoha, M. 2018. On Missing Data Sets. Available at: (accessed October 5, 2020).
  52. Miner, L. 2019. For a Longer, Healthier Life, Share Your Data. The New York Times, May 22. Available at: (accessed October 5, 2020).
  53. NASEM (National Academies of Sciences, Engineering, and Medicine). 2017. Communities in action: Pathways to health equity. Washington, DC: National Academies Press.
  54. Neidig, H. 2019. Report: Apps are sharing sensitive data with Facebook without informing users. The Hill, February 22. Available at: (accessed October 5, 2020).
  55. NIH (National Institutes of Health). 2021. All of Us Research Program Overview. Available at: (Accessed June 10, 2021).
  56. O’Neill, S. 2018. As Insurers Offer Discounts For Fitness Trackers, Wearers Should Step With Caution. NPR, November 19. Available at: (accessed October 5, 2020).
  57. Obermeyer, Z., B. Powers, C. Vogeli, and S. Mullainathan. 2019. Dissecting racial bias in an algorithm used to manage the health of populations. Science 366(6464):447-453.
  58. OpenNotes. 2019. OpenNotes Submits Comments on Modifying HIPAA Rules to Office of Civil Rights. Available at: (accessed December 15, 2019).
  59. OpenNotes. 2020. Everyone on the Same Page. Available at: (accessed February 22, 2020).
  60. Paradies, Y., J. Ben, N. Denson, A. Elias, N. Priest, A. Pieterse, A. Gupta, M. Kelaher, and G. Gee. 2015. Racism as a determinant of health: a systematic review and meta-analysis. PLOS ONE 10(9): p.e0138511.
  61. Parikh, R. B., S. Teeple, and A. S. Navathe. 2019. Addressing Bias in Artificial Intelligence in Health Care. JAMA 322(24):2377-2378.
  62. Perez, S. 2020. Apple launches its new app privacy labels across all its App Stores. Available at: (accessed June 11, 2021).
  63. Popejoy, A. B., and S. M. Fullerton. 2016. Genomics is failing on diversity. Nature 538:161-164.
  64. Price, W. N., and I. G. Cohen. 2019. Privacy in the age of medical big data. Nature Medicine 25(1):37-43.
  65. Robbins, R. and E. Brodwin. 2020. An invisible hand: Patients aren’t being told about the AI systems advising their care. STAT News, July 15. Available at:
  66. Rocher, L., J. M. Hendrickx, and Y. A. de Montjoye. 2019. Estimating the success of re-identifications in incomplete datasets using generative models. Nature Communications 10(3069).
  67. Sarasohn-Kahn, J. 2019. Patients, Health Consumers, People, Citizens: Who Are We In America? Available at: (accessed February 22, 2020).
  68. Shanbhag, A., and J. Bender. 2020. Application Programming Interfaces in Health IT. Available at: (accessed June 10, 2021).
  69. Shapiro, M., D. Johnston, J. Wald, and D. Mon. 2012. Patient-Generated Health Data. Office of Policy and Planning, Office of the National Coordinator for Health Information Technology.
  70. Shihipar, A. 2019. Data for the Public Good. The New York Times, October 24. Available at: (accessed October 5, 2020).
  71. Stanford Medicine. 2021. Using Artificial Intelligence to Improve Clinicians’ Experiences with the Electronic Health Record. Available at: (Accessed June 3, 2021).
  72. Tikkanen, R., and K. Abrams. 2020. U.S. Health Care from a Global Perspective, 2019: Higher Spending, Worse Outcomes? Commonwealth Fund, January 30. Available at: (accessed October 23, 2020).
  73. United Nations Committee on the Elimination of Racial Discrimination. 2014. Concluding observations on the combined seventh to ninth periodic reports of the United States of America. September 25, 2014. Available at: (accessed October 23, 2020).
  74. United States Congress. 2019. Opening Statement Chair Jan Schakowsky Subcommittee on Consumer Protection and Commerce Committee on Energy and Commerce Hearing on “Protecting Consumer Privacy in the Era of Big Data.” Available at: (accessed October 5, 2020).
  75. Vaidya, A. 2021. Mayo Clinic, Google eyeing ‘AI factory’ as collaboration moves forward. Available at: (Accessed June 3, 2021).
  76. Wagstaff, K. 2012. You’d Need 76 Work Days to Read All Your Privacy Policies Each Year. Time, March 6. Available at: (accessed October 5, 2020).
  77. Wailoo, K. 2018. Historical Aspects of Race and Medicine: The Case of J. Marion Sims. JAMA 320(15):1529-1530. doi:10.1001/jama.2018.11944
  78. Warzel, C. 2019. All Your Data Is Health Data. The New York Times, August 13. Available at: (accessed October 5, 2020).
  79. Whicher, D., M. Ahmed, S. Siddiqi, I. Adams, M. Zirkle, C. Grossmann, and K. L. Carman, Editors. 2021. Health Data Sharing to Support Better Outcomes: Building a Foundation of Stakeholder Trust. NAM Special Publication. Washington, DC: National Academy of Medicine.
  80. White House. 2021. Executive Order On Advancing Racial Equity and Support for Underserved Communities Through the Federal Government. Available at: (Accessed July 4, 2021).
  81. Williamson-Lee, J. 2019. Who Owns Your Health Data? OneZero, March 27. Available at: (accessed October 5, 2020).
  82. Younge, A., D. Yusuf, E. Voegeli, and J. Truong. 2019. Automating New York City and Encoding Inequality. Boston, MA: Harvard University.
  83. Zhang, X., E. J. Pérez-Stable, P. E. Bourne, E. Peprah, O. K. Duru, N. Breen, D. Berrigan, F. Wood, J. S. Jackson, D. Wong, and J. Denny. 2017. Big Data Science: Opportunities and Challenges to Address Minority Health and Health Disparities in the 21st Century. Ethnicity & Disease 27(2):95-106.
  84. Zuboff, S. 2019. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. New York, NY: Public Affairs.


Suggested Citation

MacDonald, J., G. Demiris, M. Shevin, S. Thadaney-Israni, T. J. Carney, and A. Cupito. 2022. Health Technology for All: An Equity-Based Paradigm Shift Opportunity. NAM Perspectives. Discussion Paper, National Academy of Medicine, Washington, DC.

Author Information

George Demiris, PhD, is Mary Alice Bennett university professor at the University of Pennsylvania. Michelle Shevin, MA, is senior program manager, Technology and Society, at the Ford Foundation. Sonoo Thadaney-Israni, MBA, is executive director, Presence & Program in Bedside Medicine at Stanford University. Timothy Jay Carney, PhD, MPH, MBA, is founder and chief advisor at the Global Health Equity Intelligence Collaborative, LLC and adjunct assistant professor at University of North Carolina Chapel Hill. Anna Cupito, MPH, was an associate program officer at the National Academy of Medicine until 2021.


This paper benefited from the thoughtful input of Sameer Badlani, MD, FACP, University of Minnesota; Saurabha Bhatnagar, MD, Department of Veterans Affairs; Aneesh Chopra, MPP, CareJourney; Lauren Choi, JD, MA, Blue Cross Blue Shield Association; Lori Gerhard, U.S. Administration for Community Living; Leslie Kelly Hall, Healthwise; Amanda Kong, University of North Carolina; Kameron Matthews, MD, JD, FAAFP, Veterans Health Administration; Matt Quinn, Telemedicine & Advanced Technology Research Center; Liz Salmi, OpenNotes; and Neil Smiley, Loopback Analytics. Ayodola Anise and Asia Williams from the National Academy of Medicine provided valuable support to the development of this paper.

Conflict-of-Interest Disclosures

Timothy Jay Carney reports serving on the Gravity Advisory Board, the HIMSS SDOH Workgroup, the ASPE-PCORI Advisory Board, and the HHS SDOH Taskforce (Data Subcommittee) in his CDC capacity.


Questions or comments about this paper should be directed to Anna Cupito at


The views expressed in this paper are those of the authors and not necessarily of the authors’ organizations, the National Academy of Medicine (NAM), or the National Academies of Sciences, Engineering, and Medicine (the National Academies). The paper is intended to help inform and stimulate discussion. It is not a report of the NAM or the National Academies. Copyright by the National Academy of Sciences. All rights reserved.

Join Our Community

Sign up for NAM email updates