Research Misconduct

It’s Time to Safeguard Genomic Data

On Jan. 24, 2026, the New York Times reported that DNA sequences contributed by children and families to support a federal effort to understand adolescent brain development were later co-opted by other researchers and used to publish “race science” claims about intelligence.

The story is jolting not only because the data were misused in bad faith, in ways that are potentially racist and stigmatizing, but also because the misuse violates a basic expectation each family brings to research participation: that their child’s DNA will be safeguarded and used, per their consent, to help other children thrive.

The Times report arrives as genomic research has become increasingly routine. Over the past two decades, the cost and speed of generating genetic data have improved dramatically, making it feasible to collect DNA at scale. Scientists use these datasets to advance public-health goals, including understanding how biology and the environment interact in real life. The Adolescent Brain Cognitive Development (ABCD) Study highlighted by the Times was designed to examine these relationships by following a large cohort of children.

Precisely because so much genetic data is now collected, stored, and shared, the Times account raises the specter of a broader ethical vulnerability in genomic science: data systems built for beneficial research can be exploited for purposes to which volunteers who contributed their DNA did not agree. Past research transgressions underscore the need for more vigilant governance and stronger safeguards to protect participants from downstream misuse.

The Belmont Report, authored in the wake of numerous ethical violations, most notably the Tuskegee Study, remains the ethical backbone of U.S. research, emphasizing respect for persons, beneficence, and justice. These principles translate into obligations to “do no harm”: ensuring informed consent (with added protections for children), minimizing risk in proportion to scientific value, and distributing burdens and benefits fairly. Ethical science is not possible without volunteers, and volunteers cannot meaningfully volunteer if these principles are violated.

Genomic research volunteers have seen these Belmont principles breached before. The Havasupai Tribe case remains a stark example of how biospecimens can be collected with consent for one purpose and later used for another, causing profound harm to a community’s dignity. In the early 1990s, tribe members consented to DNA collection to study diabetes. Later, scientists used that same DNA for unauthorized studies on topics the tribe considered culturally taboo and deeply harmful. The Havasupai case taught us that biospecimen research can cause harm even without a traditional “privacy breach,” because DNA can be repurposed in ways that stigmatize individuals and communities.

The 2018 Revised Common Rule tried to address this risk by increasing transparency, especially around broad future use and whole-genome sequencing, but it did not create a comprehensive genomic privacy regime to govern downstream control. The Common Rule also fails to confront the risk of genetic-data misuse: it permits Institutional Review Boards (“IRBs”), the bodies charged with confirming that regulatory criteria for approval are met and that risks to participants are minimized, to categorize studies involving DNA collection procedures as “minimal risk.”

This “minimal risk” designation, defined as harm no greater than what people ordinarily encounter in daily life or during routine exams, focuses on the physical act of providing the DNA sample rather than the enduring, high-stakes informational risks of genomic research. By equating DNA collection with routine exams, the framework bypasses the heightened scrutiny those informational risks warrant. Because DNA is enduring and revealing, downstream risks can persist even when identifiers are removed, especially when data are shared, combined with other datasets, or analyzed at population scale.

Further complicating the issue, to promote good-faith sharing, the NIH requires that federally funded scientists deposit de-identified data (often collected under minimal-risk review) into a shared repository. This is the very repository that the Times report alleges was easily breached, which is especially problematic given that the unauthorized, ill-intentioned users accessed genomic data from children. When children are research participants, U.S. regulations explicitly add protections in Subpart D of the Common Rule. Even in minimal-risk research, the regulatory posture recognizes children’s vulnerability and the need for heightened safeguards.

Whatever one thinks about broader debates on data sharing, children cannot meaningfully assent to lifetime informational exposure, and parents agree to participate with a reasonable expectation that governance will match the sensitivity of the contribution. Genomic data therefore need greater protection, built on the assumption that adversarial actors, domestic and foreign, will seek to exploit them.

To “do no harm,” we propose a framework with additional safeguards for NIH repositories and private holders of genomic data:

  1. Enhance consent requirements: Consent materials should state exactly what is being collected, how the data will be secured, and how they may be used in the future. Scientists must tell participants that while de-identification reduces risk, it does not eliminate the possibility of misuse or stigmatizing interpretations.
  2. Treat genetic data as more than minimal risk: Under the Common Rule, institutions may treat studies that collect DNA as minimal risk, lowering the requirements for review and monitoring. All studies involving DNA should be deemed more than minimal risk to elevate oversight and protection of volunteers.  
  3. Restrict access and require a legitimate purpose: Access should be limited to vetted users with a clearly stated, ethically appropriate research aim. Data Use Agreements should contain strict data sharing limits. Requests should be evaluated against the scope of the original consent and the risk of foreseeable harm.
  4. Build protections that assume bad actors exist: Good-faith compliance is not a security strategy. Repositories should verify who is requesting access, conduct audits, confirm IRB approval, use technical safeguards that reduce copying, and impose consequences for misuse.
  5. Create a higher protection tier for minors’ DNA: Children’s data warrants extra care because families cannot anticipate lifetime downstream uses. Repositories holding pediatric genetic data should apply stricter access review, tighter sharing limits, stronger monitoring, and clearer expectations about when and how families are informed of misuse.

Genomic science is one of the most powerful and rapidly evolving tools in modern medicine and public health. If volunteers believe their DNA can be turned against their communities, participation will fall, datasets will skew, and the very science we hope to advance will suffer. We owe our fellow citizens governance that meets the modern technological moment. 

Note: Per Bill of Health policy, we would like to disclose that AI was used in the research and editing of this blog post. 

The authors alone are responsible for the content and views expressed herein, which do not necessarily represent the views, policies, or positions of their affiliated institutions or any other organization.

About the authors

  • Lauren Hammer Breslow

    Lauren Hammer Breslow, J.D., M.P.H., is an Adjunct Professor at the Elisabeth Haub School of Law at Pace University, where she teaches Bioethics & Medical Malpractice and Professional Identity Formation. Lauren also serves as an unaffiliated member of The Rockefeller University Institutional Review Board.

  • Vanessa Smith

    Vanessa Smith, MPSc, CIM, CIP, is a Certified IRB Professional serving as Associate Director of the Human Research Protection Program at The Rockefeller University and as a national IRB consultant. She is currently pursuing an advanced graduate degree in Bioethics from the Montefiore Einstein Center for Bioethics at Albert Einstein College of Medicine.