Ziad Obermeyer
Ziad Obermeyer (US: /ˈziː.jæd ˈoʊ.bɚˌmaɪ.ɚ/; Arabic: زياد أوبرماير) is a Lebanese American physician, researcher, and academic, known for his work at the intersection of machine learning and health policy.[1][2][3] He is the Blue Cross of California Distinguished Associate Professor of Health Policy and Management at the University of California, Berkeley.[4]
His research focuses on helping researchers and healthcare personnel make better decisions by ‘seeing’ the world the way algorithms do. His work on algorithmic racial bias has been influential in shaping the discourse and policy surrounding artificial intelligence (AI),[5][6][7][8][9] particularly in how organizations build and use algorithms,[10] and how state and federal lawmakers[11] and regulators[12] can hold AI accountable. In 2024, Obermeyer testified before the US Senate Finance Committee on artificial intelligence in healthcare.[13]
Obermeyer is co-founder of Nightingale Open Science,[14] a non-profit that makes massive new medical imaging datasets—like electrocardiogram waveforms, sleep monitoring data, and digital pathology—available to algorithm developers for free,[15] and Dandelion,[16] a venture-backed startup that aims to jump-start AI innovation in health.[17] He is also a founding and core faculty member of the Berkeley–UCSF joint program in Computational Precision Health.[18][19]
Obermeyer is a Chan Zuckerberg Biohub Investigator[20] and a Research Associate at the National Bureau of Economic Research.[21] He was named an Emerging Leader by the National Academy of Medicine in 2020,[22] and one of the 100 most influential people in Artificial Intelligence by TIME Magazine in 2023.[23]
Early life and education
Obermeyer was born in Beirut, Lebanon, and raised in the United States.[24] He earned a Bachelor of Arts (A.B.) degree from Harvard College,[25] a Master of Philosophy (M.Phil.) in History and Science from the University of Cambridge,[26] and a Doctor of Medicine (M.D.) from Harvard Medical School, graduating magna cum laude.[26][27] He later served as an Assistant Professor at Harvard Medical School.[25]
Obermeyer trained as an emergency physician at Mass General Brigham (MGB) in Boston, Massachusetts.[27] Emergency medicine offered him a breadth of exposure to medicine and physiology that no other single specialty could. His clinical practice inspired him to explore the role of data science in optimizing medical decision-making under uncertainty,[28] and to advance the science behind predicting which patients will suffer sudden cardiac death, develop complications from COVID-19, or develop metastatic cancer.[29]
Before pursuing a career in medicine, Obermeyer worked as a consultant at McKinsey & Company, advising pharmaceutical and global health clients in New Jersey, Geneva, and Tokyo.[30]
Research contributions
Obermeyer’s research focuses on enhancing medical decision-making by using machine learning to identify patients who are more likely to have a heart attack and would benefit from further testing,[3] to surface patterns that doctors miss in underserved patients,[2] and to link individual body temperature set points to health outcomes.[31]
Obermeyer has also shown how widely used algorithms affecting millions of patients automate and scale up racial bias.[1]
Machine Learning Approach to Low-Value Health Care
In a study led by Obermeyer and Sendhil Mullainathan, the authors contend that models of healthcare must incorporate physician error, and that policies focused solely on incentive problems can produce large inefficiencies.[3] They used machine learning to study decision-making in healthcare, focusing on how physicians diagnose heart attacks and comparing physicians' choices with an algorithmic model that estimates each patient's risk of heart attack. Their findings reveal two key inefficiencies: overtesting and undertesting.[3] Physicians may administer tests to low-risk patients who would not benefit, while patients the model predicts to be at high risk are left untested and may suffer adverse health consequences, including death.
Mullainathan and Obermeyer confirmed these findings by analyzing shift-to-shift variation in testing, tracking how different doctors decide under similar circumstances. They concluded that over- and undertesting are not fully explained by financial incentives but instead point to systematic errors in judgment. Misdiagnosis, they reason, stems from physicians using overly simple models of risk, overweighting factors stereotypical of a heart attack, such as chest pain, at the expense of others.[3]
Racial bias in a healthcare algorithm
In a landmark study of commercial algorithms that hospitals rely on to guide follow-up care decisions, Obermeyer, Mullainathan, and others investigated a widely used algorithm developed by the Optum unit of UnitedHealth Group,[32] typical of an industry-wide approach and affecting millions of patients in the United States.[33][1] The authors found evidence of racial bias: Black patients were assigned the same level of risk as white patients while being considerably sicker, as evidenced by signs of uncontrolled illness.[33] They estimated that this bias reduces the number of Black patients identified for extra care by more than half: fewer than 18% of the patients the algorithm flagged as needing more care were Black, compared with about 82% who were white.[33] Remedying the disparity would increase the percentage of Black patients receiving additional help from 17.7% to 46.5%.[1]
“What the algorithm is doing is letting healthier white patients cut in line ahead of sicker black patients,” Obermeyer explained in an interview with the Wall Street Journal in 2019.[32]
Obermeyer et al. attributed the bias to the algorithm's use of health costs as a proxy for health needs: because less money is spent on Black patients with the same level of need, the algorithm falsely concludes that Black patients are healthier than equally sick white patients.[1] Reformulating the algorithm so that it no longer uses cost as a proxy for need eliminated the racial bias in predicting who needs extra care.
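The cost-as-proxy mechanism can be illustrated with a small simulation. The sketch below is hypothetical (not the study's actual code, data, or model): it assumes, for illustration, that equal sickness generates less recorded spending for Black patients, and shows that ranking patients by predicted cost then selects far fewer Black patients for extra care than ranking by health need directly.

```python
# Illustrative sketch of label-choice bias: if an algorithm ranks patients by
# *cost* rather than *health need*, and less is historically spent on Black
# patients at the same sickness level, the cost ranking under-selects them.
import random

random.seed(0)

# Hypothetical patients: (group, sickness), sickness uniform in [0, 1).
patients = [("black", random.random()) for _ in range(500)] + \
           [("white", random.random()) for _ in range(500)]

def observed_cost(group, sickness):
    # Assumed disparity: same sickness produces less recorded spending
    # for Black patients (e.g. due to unequal access to care).
    return sickness * (0.7 if group == "black" else 1.0)

def top_decile(score_fn):
    # Flag the 10% of patients with the highest score for extra care.
    ranked = sorted(patients, key=score_fn, reverse=True)
    return ranked[: len(patients) // 10]

# Proxy label: rank by cost. Corrected label: rank by sickness itself.
by_cost = top_decile(lambda p: observed_cost(*p))
by_need = top_decile(lambda p: p[1])

frac_black_cost = sum(g == "black" for g, _ in by_cost) / len(by_cost)
frac_black_need = sum(g == "black" for g, _ in by_need) / len(by_need)
print(frac_black_cost, frac_black_need)  # cost proxy flags far fewer Black patients
```

Swapping the label from cost to sickness restores roughly proportional selection, mirroring the paper's finding that removing the cost proxy eliminates the bias.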
Impact
Following the algorithmic racial bias study, New York's insurance regulator launched an investigation into a UnitedHealth Group Inc. algorithm.[34] The state's Department of Financial Services, joined by the New York Department of Health, sent a letter to UnitedHealth chief executive David Wichmann asking the company to stop using the algorithm, citing concerns that it constituted a "discriminatory business practice." Under New York state law, insurers are prohibited from relying on, producing, or promoting a discriminatory algorithm.[34][35]
Obermeyer and colleagues collaborated with the developers of the algorithm to address its biases. Their findings also prompted concerns from insurers, hospitals, and other stakeholders that similar racial biases might exist in their own predictive models. In response, the researchers launched an initiative at the University of Chicago Booth School of Business to provide pro bono assistance to health systems and organizations seeking to identify and mitigate bias in their algorithms.[36]
The study was widely covered in the media and has shaped policy surrounding AI accountability.[37] Senators Cory Booker (D-New Jersey) and Ron Wyden (D-Oregon) released letters to the Federal Trade Commission and the Centers for Medicare and Medicaid Services asking how the agencies look for and prevent bias in healthcare algorithms.[38][39] They asked the Federal Trade Commission to investigate whether decision-making algorithms discriminate against marginalized communities, and wrote to five of the largest healthcare companies about their internal safeguards against bias in their technology.[38][39]
Algorithmic Bias Playbook
Obermeyer and colleagues at the University of Chicago Booth School of Business released the Algorithmic Bias Playbook in 2021,[40][41] a resource designed to help policymakers and regulators, C-suite leaders, and technical teams working in healthcare define, measure, and mitigate racial bias in live algorithms.
Racial bias in government-allocated CARES Act funding
In a study published in the Journal of the American Medical Association, Obermeyer investigated an algorithm used to allocate a $175 billion relief package under the federal CARES Act, uncovering funding inequities and large-scale racial bias.[42][43] The algorithm allocated funding based on hospitals' revenues rather than their recorded numbers of COVID-19 cases, significantly benefiting large, well-resourced hospitals, while underfunded hospitals serving predominantly Black populations received disproportionately less money despite managing larger numbers of COVID-19 cases.
The study adds to a growing body of evidence that communities of color were harder hit during the pandemic than wealthier white communities.[43]
Algorithmic approach to reducing unexplained pain disparities in underserved populations
Obermeyer and computer scientist Emma Pierson investigated unexplained pain disparities, using a deep learning approach to assess the severity of osteoarthritis in underserved communities.[2] Their findings showed that traditional radiographic assessments accounted for only 9% (95% CI, 3–16%) of racial disparities in pain, while algorithmic predictions explained 43%, or 4.7 times more (95% CI, 3.2–11.8×).[2]
The study suggested that using knee X-rays to predict patients' experienced pain could reveal that much of underserved patients' pain originates from factors within the knee that are not captured by traditional radiographic measures.
Obermeyer and Pierson attributed the algorithm's effectiveness to the racial and socioeconomic diversity of its training set, which enrolled large, diverse patient populations from across the US.
Views on AI
Obermeyer is optimistic about the promise of AI in healthcare. He views it as a powerful tool for generating novel empirical observations from real-world data, many of which are inaccessible to the human eye.[44] Data-driven decision-making, he argues, could address two key challenges in healthcare, suboptimal results and high cost, offering a rare combination of improved health outcomes and reduced spending.[13]
However, Obermeyer has also highlighted the risks of poorly designed algorithms, advocating for AI accountability across the public and private sectors, academia, and government. He has said that in addition to government efforts to regulate dangerous algorithms, granting researchers access to data remains essential.[45]
Policy efforts
Obermeyer has worked with state and federal lawmakers and regulators to hold AI accountable.[13]
He testified before the US Senate Finance Committee on February 8, 2024, on artificial intelligence and healthcare.[13] He recommended that the committee pursue three measures: transparency from AI developers about what their algorithms produce; independent evaluation of algorithms for accuracy and potential racial bias; and valuation and reimbursement of AI products according to established principles from health economics and outcomes research.[13]
Awards
TIME Magazine named Obermeyer one of the 100 most influential people in artificial intelligence in 2023, in recognition of his work at the intersection of machine learning and health.[23]
The National Bureau of Economic Research appointed Obermeyer as a Research Associate in 2023.[21]
Obermeyer was named Chan Zuckerberg Biohub Investigator in 2022.[20]
Obermeyer’s study "Dissecting racial bias in an algorithm used to manage the health of populations" was awarded the Victor R. Fuchs Award for Lifetime Contributions to the Field of Health Economics from the American Society of Health Economists (ASHEcon) in 2021,[46] and the Responsible Business Education Award from the Financial Times in 2022.[47]
The study also won the ‘Editors’ Pick’ in STAT Madness 2020.[48] STAT senior writer Sharon Begley highlighted the significance of that year's pick, noting that “in addition to being rigorous, important, and innovative, it exemplifies a growing challenge in health care and biomedicine: separating hype from reality in terms of what artificial intelligence can do, from discovering drugs to diagnosing patients.”[48]
In 2012, Obermeyer received the Early Independence Award from the Office of the Director, National Institutes of Health, which supports exceptional junior scientists, for his work defining predictors of unexpected death in the United States.[49]
Non-academic work
[edit]Nightingale Open Science
Obermeyer is a co-founding member of Nightingale Open Science, a non-profit platform that aims to boost data access, competitiveness, and research quality by connecting researchers with world-class medical data and making massive new medical imaging datasets available for research.[50]
The project, launched in December 2021 with $6 million in funding, was backed by Schmidt Futures,[51] the philanthropic venture led by former Google CEO Eric Schmidt and his wife, Wendy Schmidt.
Nightingale Open Science focuses on high-dimensional data, such as imaging and waveforms, which are ideally suited to machine learning, and on datasets that could enable breakthroughs on unresolved medical challenges. One such challenge is sudden cardiac death, which accounts for some 300,000 American fatalities every year,[52] often with no identifiable cause; another is cancer, whose mortality rates and late-stage diagnoses have improved little despite better screening since the 1990s.
The data include 40 terabytes of medical imagery, which Obermeyer spent two years collecting with hospitals in the United States and Taiwan.[53] They include X-rays, electrocardiogram waveforms, and pathology specimens from patients with a range of conditions, including high-risk breast cancer, sudden cardiac arrest, fractures, and COVID-19. Each image is labeled with the patient’s medical outcomes, such as the stage of a breast cancer and whether it resulted in death, or whether a COVID-19 patient needed a ventilator.
The researchers behind the platform advocate developing algorithms that learn from nature rather than from humans, by linking imaging data to outcomes and actual changes in patients’ health instead of relying solely on a physician’s interpretation of medical imaging.
Dandelion Health Data Consortium
Obermeyer is co-founder of Dandelion Health Data Consortium, a company that uses health data for product development and predictive analytics.[54]
In June 2023, Dandelion Health launched a pilot program[55] funded by the Gordon and Betty Moore Foundation and the SCAN Foundation to audit and evaluate the performance of algorithms against intended populations and uncover potential racial, ethnic, and geographic bias at no cost.[56]
The service allows any AI developer to securely upload an algorithm to Dandelion’s computing environment. The platform then runs the algorithm on its diverse national dataset and provides overall performance metrics, as well as metrics for pre-specified groups of interest, under a patient-privacy-first, open-access model.[55]
References
- ^ a b c d e f Obermeyer, Ziad; Powers, Brian; Vogeli, Christine; Mullainathan, Sendhil (2019-10-25). "Dissecting racial bias in an algorithm used to manage the health of populations". Science. 366 (6464): 447–453. Bibcode:2019Sci...366..447O. doi:10.1126/science.aax2342. PMID 31649194.
- ^ a b c d Pierson, Emma; Cutler, David M.; Leskovec, Jure; Mullainathan, Sendhil; Obermeyer, Ziad (January 2021). "An algorithmic approach to reducing unexplained pain disparities in underserved populations". Nature Medicine. 27 (1): 136–140. doi:10.1038/s41591-020-01192-7. ISSN 1078-8956. PMID 33442014.
- ^ a b c d e Mullainathan, Sendhil; Obermeyer, Ziad (2022-04-08). "Diagnosing Physician Error: A Machine Learning Approach to Low-Value Health Care". The Quarterly Journal of Economics. 137 (2): 679–727. doi:10.1093/qje/qjab046. ISSN 0033-5533.
- ^ "Ziad Obermeyer". UC Berkeley Public Health. 2019-07-16. Retrieved 2025-04-10.
- ^ Ledford, Heidi (2019-10-24). "Millions of black people affected by racial bias in health-care algorithms". Nature. 574 (7780): 608–609. Bibcode:2019Natur.574..608L. doi:10.1038/d41586-019-03228-6. PMID 31664201.
- ^ "Rooting Out AI's Biases | Hopkins Bloomberg Public Health Magazine". magazine.publichealth.jhu.edu. Retrieved 2025-04-10.
- ^ "Racial bias found in widely used health care algorithm". NBC News. 2019-11-07. Retrieved 2025-04-10.
- ^ Ross, Casey (2021-06-21). "'Nobody is catching it': Algorithms used in health care nationwide are rife with bias". STAT. Retrieved 2025-04-10.
- ^ "AI could make health care fairer—by helping us believe what patients say". MIT Technology Review. Retrieved 2025-04-10.
- ^ "Playbook". The University of Chicago Booth School of Business. Retrieved 2025-04-10.
- ^ "Booker, Wyden Demand Answers on Biased Health Care Algorithms | U.S. Senator Cory Booker of New Jersey". www.booker.senate.gov. Retrieved 2025-04-10.
- ^ "Attorney General Bonta Launches Inquiry into Racial and Ethnic Bias in Healthcare Algorithms". State of California - Department of Justice - Office of the Attorney General. 2022-08-31. Retrieved 2025-04-10.
- ^ a b c d e "Artificial Intelligence and Health Care: Promise and Pitfalls | The United States Senate Committee on Finance". www.finance.senate.gov. Retrieved 2025-04-10.
- ^ "About". www.ngsci.org. Retrieved 2025-04-10.
- ^ Murgia, Madhumita (2022-01-03). "Trove of unique health data sets could help AI predict medical conditions earlier". Financial Times. Retrieved 2025-04-10.
- ^ "Dandelion Health". Dandelion Health. Retrieved 2025-04-10.
- ^ Grinstein, Jonathan D. (2024-09-18). "Dandelion Health Cracks Clinical Data Bottleneck Limiting AI in Healthcare". Inside Precision Medicine. Retrieved 2025-04-10.
- ^ "New Program Trains Data Scientists Who are Transforming Medicine | UC San Francisco". www.ucsf.edu. 2024-10-08. Retrieved 2025-04-10.
- ^ "Faculty". Computational Precision Health. Retrieved 2025-04-10.
- ^ a b "Jennifer Ahern, Ziad Obermeyer named Chan Zuckerberg Biohub Investigators". UC Berkeley Public Health. 2022-01-14. Retrieved 2025-04-10.
- ^ a b "NBER Appoints 54 Research Associates, 3 Faculty Research Fellows". NBER. 2023-10-02. Retrieved 2025-04-10.
- ^ "Ziad Obermeyer named an emerging leader in health research by National Academy of Medicine". UC Berkeley Public Health. 2020-05-14. Retrieved 2025-04-10.
- ^ a b "TIME100 AI 2023: Ziad Obermeyer". Time. 2023-09-07. Retrieved 2025-04-10.
- ^ "Featured Researcher: Ziad Obermeyer". NBER. Retrieved 2025-04-10.
- ^ a b "Ziad Obermeyer | Health Care Policy". hcp.hms.harvard.edu. 2017-02-01. Retrieved 2025-04-10.
- ^ a b "Ziad Obermeyer, M.D., M.Phil. | AcademyHealth". academyhealth.org. Retrieved 2025-04-10.
- ^ a b "Clinical and Machine Learning Expert, Dr. Ziad Obermeyer Joins GNS Healthcare's Strategic Advisory Board". BioSpace. 2018-09-18. Retrieved 2025-04-10.
- ^ Communication, Brigham Office of Strategic (2018-06-11). "Q&A: Emergency Medicine Physician Ziad Obermeyer". Brigham Clinical & Research News. Retrieved 2025-04-10.
- ^ Ransbotham, Sam; Khodabandeh, Shervin (2023-02-14). "Helping Doctors Make Better Decisions With Data: UC Berkeley's Ziad Obermeyer". MIT Sloan Management Review. Retrieved 2025-04-10.
- ^ "Ziad Obermeyer". UC Berkeley Public Health. 2019-07-16. Retrieved 2025-04-10.
- ^ Obermeyer, Ziad; Samra, Jasmeet K.; Mullainathan, Sendhil (2017-12-13). "Individual differences in normal body temperature: longitudinal big data analysis of patient records". BMJ. 359: j5468. doi:10.1136/bmj.j5468. ISSN 0959-8138. PMC 5727437. PMID 29237616.
- ^ a b Evans, Melanie; Mathews, Anna Wilde (2019-10-24). "Researchers Find Racial Bias in Hospital Algorithm". Wall Street Journal. ISSN 0099-9660. Retrieved 2025-04-10.
- ^ a b c Chakradhar, Shraddha (2019-10-24). "Widely used algorithm for follow-up care in hospitals is racially biased, study finds". STAT. Retrieved 2025-04-10.
- ^ a b Evans, Melanie; Mathews, Anna Wilde (2019-10-26). "New York Regulator Probes UnitedHealth Algorithm for Racial Bias". Wall Street Journal. ISSN 0099-9660. Retrieved 2025-02-11.
- ^ "NY Regulators Probe for Racial Bias in Health-Care Algorithm". GovTech. 2019-10-29. Retrieved 2025-04-10.
- ^ Begley, Sharon (2020-04-06). "Discovery of racial bias in health care AI wins STAT Madness 'Editors' Pick'". STAT. Retrieved 2025-02-11.
- ^ Simonite, Tom. "Senators Protest a Health Algorithm Biased Against Black People". Wired. ISSN 1059-1028. Retrieved 2025-04-10.
- ^ a b "Booker, Wyden Demand Answers on Biased Health Care Algorithms | U.S. Senator Cory Booker of New Jersey". www.booker.senate.gov. Retrieved 2025-04-10.
- ^ a b Wetsman, Nicole (2019-12-04). "There's no quick fix to find racial bias in health care algorithms". The Verge. Retrieved 2025-04-10.
- ^ "Ziad Obermeyer and colleagues at the Booth School of Business release health care Algorithmic Bias Playbook". UC Berkeley Public Health. 2021-06-24. Retrieved 2025-02-11.
- ^ "Playbook". The University of Chicago Booth School of Business. Retrieved 2025-02-11.
- ^ Kakani, Pragya; Chandra, Amitabh; Mullainathan, Sendhil; Obermeyer, Ziad (2020-09-08). "Allocation of COVID-19 Relief Funding to Disproportionately Black Counties". JAMA. 324 (10): 1000–1003. doi:10.1001/jama.2020.14978. ISSN 0098-7484. PMC 7414421. PMID 32897336.
- ^ a b Ross, Casey (2020-08-07). "Study finds racial bias in the government's formula for distributing Covid-19 aid to hospitals". STAT. Retrieved 2025-04-10.
- ^ "Obermeyer bio & talk abstracts". Google Docs. Retrieved 2025-04-10.
- ^ "Obermeyer says government regulation of AI wouldn't stop creative or dangerous uses". www.washingtonpost.com. Retrieved 2025-04-10.
- ^ "2021 ASHEcon Award Winners – ASHEcon". Retrieved 2025-02-11.
- ^ Jack, Andrew (2022-01-19). "Academic research award: smart ideas with real-world impact". Financial Times. Retrieved 2025-02-11.
- ^ a b Begley, Sharon (2020-04-06). "Discovery of racial bias in health care AI wins STAT Madness 'Editors' Pick'". STAT. Retrieved 2025-02-11.
- ^ "2012 NIH Director's Early Independence awards recognizes 14 scientists". National Institutes of Health (NIH). 2015-09-30. Retrieved 2025-02-11.
- ^ "Nightingale Open Science - Open data for healthcare AI research". www.ngsci.org. Retrieved 2025-02-11.
- ^ Palmer, Katie (2022-01-25). "Can open datasets help machine learning solve medical mysteries?". STAT. Retrieved 2025-02-11.
- ^ Huikuri, Heikki V.; Castellanos, Agustin; Myerburg, Robert J. (2001-11-15). "Sudden Death Due to Cardiac Arrhythmias". New England Journal of Medicine. 345 (20): 1473–1482. doi:10.1056/nejmra000650. ISSN 0028-4793. PMID 11794197.
- ^ Murgia, Madhumita (2022-01-03). "Trove of unique health data sets could help AI predict medical conditions earlier". Financial Times. Retrieved 2025-02-11.
- ^ Ransbotham, Sam; Khodabandeh, Shervin (2023-02-14). "Helping Doctors Make Better Decisions With Data: UC Berkeley's Ziad Obermeyer". MIT Sloan Management Review. Retrieved 2025-02-11.
- ^ a b "Dandelion Health launches pilot to evaluate AI performance and potential bias". Healthcare IT News. 2023-06-21. Retrieved 2025-02-11.
- ^ "Dandelion Health". Dandelion Health. Retrieved 2025-02-11.