How Can We Dismantle Health Equity Barriers In Research?
Research funders must be vigilant in working purposefully toward health equity. As a community, we must examine biases that may be hidden in our processes and may inadvertently promote inequities. Such bias can enter at many points, including the data systems used in research, the research topics that are solicited, and the evaluation of grant proposals. Like many of our peer organizations, the Donaghue Foundation is taking steps to promote greater equity by elevating awareness and revising our procedures.
The move to use real-world data in research offers many benefits, but funders and researchers must be mindful that such data hold the potential to perpetuate historic inequities. An article in the Washington Post in October 2019 reported on research published in Science showing that even nonhuman systems can contribute to racial disparities in the care that people receive.
In the example described in Science, a computer algorithm sold by Optum underestimated the health needs of the sickest black patients, even though it used a race-blind predictor of future health care needs: cost. The algorithm used health costs to identify patients with complex needs for special care management programs. But because black patients incurred less in annual medical costs than white patients (owing to lower levels of health care coverage and more difficulty paying out-of-pocket expenses), black patients were assessed as being less at risk. Collectively, however, black patients suffered from nearly 50,000 more chronic diseases than white patients. The Optum algorithm drove care decisions that may have hampered access to care for nearly half of the black patients.
As the author of a commentary entitled “Assessing Risk, Automating Racism,” which accompanied the original Science article, noted, “Coded inequity is perpetuated precisely because those who design and adopt such tools are not thinking carefully about systemic racism.” In that commentary, author Ruha Benjamin cautions against regarding technology-enabled processes as lacking bias. She describes what she calls the “New Jim [Crow] Code”—that is, “automated systems that hide, speed, and deepen racial discrimination behind a veneer of technical neutrality.”
The researchers who studied the Optum algorithm contacted the company to let it know what they had found. Somewhat to their surprise, they were warmly received and have been collaborating on experimental solutions since. “Our early results show that simply by changing the label [the outcome the algorithm was asked to predict], we reduced bias by 84 percent,” the authors write in a Health Affairs Blog post.
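To make the mechanism behind the label change concrete, here is a minimal synthetic sketch in Python. The cohort, the cost model, and every number in it are hypothetical (the actual Optum algorithm and the study's data are not reproduced here); the sketch only illustrates the general point from the Science study: ranking patients by a cost label under-selects a group that incurs lower costs for the same illness burden, while ranking by a direct measure of illness does not.

```python
import random

random.seed(0)

# Hypothetical synthetic cohort: two groups, A and B, with identical
# illness burden, but group B incurs lower costs for the same conditions
# (a stand-in for barriers in coverage and out-of-pocket affordability).
def make_patient(group):
    conditions = random.randint(0, 8)            # chronic-condition count
    cost = conditions * 1000 + random.gauss(0, 200)
    if group == "B":
        cost *= 0.6                              # same illness, lower spend
    return {"group": group, "conditions": conditions, "cost": cost}

cohort = ([make_patient("A") for _ in range(500)] +
          [make_patient("B") for _ in range(500)])

def share_of_b(selected):
    """Fraction of the selected patients who belong to group B."""
    return sum(p["group"] == "B" for p in selected) / len(selected)

k = 200  # slots available in the care management program

# Label 1: rank by predicted cost (the proxy the study found biased).
by_cost = sorted(cohort, key=lambda p: p["cost"], reverse=True)[:k]

# Label 2: rank by illness burden directly (the "changed label");
# a small random tiebreaker avoids favoring whichever group is listed first.
by_need = sorted(cohort, key=lambda p: (p["conditions"], random.random()),
                 reverse=True)[:k]

print(f"Group B share when ranking by cost:  {share_of_b(by_cost):.2f}")
print(f"Group B share when ranking by need:  {share_of_b(by_need):.2f}")
```

Running the sketch shows group B claiming far fewer program slots under the cost label than under the need label, even though both groups are equally sick by construction; nothing in the code references group membership, which is the sense in which such a proxy can be "race-blind" yet still biased.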
Even more notable, those same researchers have offered to work with members of the health care industry to ensure that future algorithms used in population health management are “constructed and applied fairly,” and they are offering these services pro bono. “Fortunately,” as Sendhil Mullainathan, one of the Science study’s authors, argued recently in the New York Times, “it is much easier to remove bias from algorithms than from people.”
Changing The RFPs
At the Donaghue Foundation, we are also being more vigilant about potential bias in data. Formerly, in the application instructions for our Another Look program, we said, “Describe the dataset you will be using in your research. Include the purpose of the dataset, how and when the data were collected, any checks used to assure the validity of the data, and any non-random biases that might impact the data.”
In 2020 the Another Look program announcement will state, “In all programs, the Foundation requires applicants to demonstrate that they have reviewed their research questions and protocols, sampling, and data analysis plans to ensure that they are not inadvertently masking or exacerbating racial, ethnic, or gender health inequities.”
We are also sharpening our language to invite research questions that address inequities. For example, the announcement for our 2019 Greater Value Portfolio program had as an eligible topic, “Offer solutions to problems of poor-quality, low-value care that disproportionately impact vulnerable populations such as uninsured individuals and members of minority groups.” For 2020, the language in the Greater Value Portfolio program announcement will be even more intentional, stating that we seek interventions that reduce disparities in health status.
Attracting Applicants And Selecting Grantees
Research funders must also be aware of potential implicit biases in their grantee selection process. Knowing the applicant and his or her institution may be like putting your thumb on the scale. That is why Donaghue’s Letters of Intent, the initial phase of our application review process, have been going to our external reviewers blinded: no information is provided that identifies the principal investigator or his or her institution. We believe that the blinding results in a wider pool of candidates and institutions being invited to submit full applications.
To reduce potential bias in all forms, including our affinity for organizations or investigators we’ve worked with before, and to eliminate implicit gender or racial bias, we will continue to use a blinded process for reviewing Letters of Intent.
We recognize that this is much more challenging to do when reviewing full applications. We can’t take out the information fields that reveal the applicant’s identity and institutional support, as these are needed to assess whether the applicant has the experience to do the project. An interesting experiment conducted by the National Institutes of Health to do just that has been described as “harder than it looks” and “a heavy lift.” In short, anonymity doubles the necessary review activities.
Applicants’ gender, race, and ethnicity may drive their interest in particular research topics and questions. A mismatch between those topics and what interests funders can make the grant application process ripe for unconscious bias. Further study of NIH funding processes found that “topic choice alone accounts for over 20 percent of the funding gap [between African American researchers and white researchers receiving NIH grants] after controlling for multiple variables, including the applicant’s prior achievements.”
The NIH study authors recommend taking steps to ensure more diversity in applicant pools. At Donaghue, we are debating whether to collect gender, race, and ethnicity data from applicants as a way to understand and validate the diversity of our applicant pool, and we are reviewing the practices of our peer funders. We have not decided yet whether this approach makes sense for us. It might undermine the value of blinding as a mechanism for soliciting Letters of Intent from a diverse applicant pool. We are making concerted efforts to reach out to affinity groups, professional associations, and membership organizations (such as America’s Essential Hospitals), whose participants we would like to attract as applicants and grantees.
Another avenue we are taking to increase the diversity of our applicant pool is increasing the diversity of our review committees in all areas, including career stage, race, gender, research interests, and scholarly expertise.
Since the research article about racial bias in an algorithm intended to improve health appeared in Science in October 2019, it has generated a great deal of discussion and brought significant attention to this topic. The article made Altmetric’s Top 100 list of the most-discussed scientific articles of 2019 across all research disciplines.
For Donaghue it has sparked a broader conversation about bias in its many forms that we will continue to have with the foundation’s reviewers and advisers, funding colleagues, and others in the research community—discussing what we each need to do to dismantle barriers to health equity in research.
Lynne Garner, Nancy C. Yedlin, Jennifer Salopek, Can We Dismantle Health Equity Barriers in Research?, Health Affairs Blog, February 5, 2020, https://bit.ly/2UERNoR, Copyright © 2020 Health Affairs by Project HOPE – The People-to-People Health Foundation, Inc.