Statement for the Record
U.S. Senate Committee on Commerce, Science, and Transportation
Subcommittee on Science, Manufacturing, and Competitiveness
Hearing: “Less Hype, More Help: AI That Improves Safety, Productivity, and Care”
Hearing Date: March 3, 2026
Dear Chairman Budd, Ranking Member Baldwin, and Members of the Subcommittee:
I write as Acting Chairman of the National Council on Disability (NCD), an independent, bipartisan federal agency that advises Congress, the President, and other federal agencies on matters affecting the lives of people with disabilities, to provide this statement for inclusion in the written record of this Subcommittee’s hearing, “Less Hype, More Help: AI That Improves Safety, Productivity, and Care.” NCD offers this statement to make policymakers aware of the susceptibility of artificial intelligence (AI) to develop explicit and implicit biases about people with disabilities, and to advise policymakers on effective ways to ensure that these technologies are developed with data sets that include people with disabilities.
While NCD’s research recognizes the potential benefits of using AI to improve healthcare outcomes for people with disabilities, these technologies have vulnerabilities that could negatively affect the diagnosis and treatment of people with disabilities and could supply policymakers with erroneous information rather than accurate solutions.
The ultimate goal of AI is to create machines that can make the same decisions as humans.1 NCD’s 2024 report, titled The Implicit and Explicit Exclusion of People with Disabilities in Clinical Trials, analyzed the use of AI in clinical trials.2 One study NCD examined described how technologies such as AI, machine learning, and natural language processing can be incorporated into several aspects of clinical trial research.3 Examples include data mining, prescreening potential participants, and automatically sending invitations to participants who have been prescreened. Academic researchers and the pharmaceutical industry are using AI to mine and utilize data from electronic sources such as health records and devices.4
Our 2024 report found that a contributing factor to healthcare disparities for people with disabilities was physicians’ erroneous assumptions about the values and expectations of their patients, assumptions that mirror widespread, stigmatizing societal views about people with disabilities.5 Due to these concerns, NCD found that disability cultural competence should be a core strategy for the healthcare system in order to reduce healthcare disparities for people with disabilities.6 Strong evidence exists that cultural training for healthcare professionals improves providers’ knowledge, understanding, and skills for treating patients from culturally, linguistically, and socioeconomically diverse backgrounds.7
While NCD’s 2024 research was limited to AI and clinical trials, we reasonably believe our findings and recommendations can be generalized to other forms of AI technology in healthcare. Our findings were based on published studies, legislation, and clinical trial protocols; interviews with subject matter experts and trial participants; surveys of healthcare providers and participants; and feedback from stakeholders at the National Institutes of Health (NIH) and the Food and Drug Administration (FDA). Because AI is intended to develop the same decision-making capabilities as humans, NCD is similarly concerned that AI technologies may inadvertently develop the same assumptions and biases about people with disabilities. For this reason, NCD advises policymakers to review the use of AI in healthcare generally and to establish regulations as needed to ensure that these technologies are built on data sets that include people with disabilities, so that implicit and explicit biases are not accidentally developed or “learned.”
Thank you for the opportunity to provide a brief summary of NCD’s relevant research, analysis, and recommendations on ways to improve the use of AI technology in the healthcare system for people with disabilities. We welcome the opportunity to brief the Subcommittee and its staff in depth on any of these or related topics at your direction and request.
To that end, please do not hesitate to contact our Executive Director, Ana Torres-Davis (atorresdavis@ncd.gov), or our Director of Legislative Affairs and Outreach, Anne Sommers McIntosh (amcintosh@ncd.gov), who will be glad to address any follow-up requests or provide a more in-depth briefing on any of our reports and advisement.
Respectfully,
Neil Romano
Acting Chairman
National Council on Disability
1. Harrer S, Shah P, Antony B, et al., “Artificial Intelligence for Clinical Trial Design,” Trends in Pharmacological Sciences, 2019;40(8):577–591. doi:10.1016/j.tips.2019.05.005.
2. National Council on Disability, “The Implicit and Explicit Exclusion of People with Disabilities in Clinical Trials” (2024), available at https://www.ncd.gov/2024/08/14/federal-report-illuminates-need-for-disability-inclusion-in-clinical-trials.
3. Von Itzstein MS, Hullings M, Mayo H, et al., “Application of Information Technology to Clinical Trial Evaluation and Enrollment: A Review,” JAMA Oncology, 2021;7(10):1559–1566. doi:10.1001/jamaoncol.2021.1165.
4. Woo M, “An AI Boost for Clinical Trials,” Nature, 2019;573(7775):S100–S102. doi:10.1038/d41586-019-02871-3.
5. Shakespeare T, Iezzoni LI, Groce NE, “Disability and the Training of Health Professionals,” Lancet, 2009;374(9704):1815–1816.
6. Association of American Medical Colleges, Cultural Competence Education for Medical Students, 2005. https://www.aamc.org/download/54338/data/culturalcomped.pdf.
7. Govere L, Govere EM, “How Effective Is Cultural Competence Training of Healthcare Providers on Improving Patient Satisfaction of Minority Groups? A Systematic Review of Literature,” Worldviews on Evidence-Based Nursing, 2016;13(6):402–410. Accessed January 16, 2024. https://sigmapubs.onlinelibrary.wiley.com/doi/full/10.1111/wvn.12176.