WWII's Aftermath and the Nuremberg Code
The atrocities committed in Nazi medical experiments during WWII led directly to the 1947 Nuremberg Code, which introduced core principles: voluntary informed consent, scientific necessity, and participant welfare. These became foundational ethical standards for research worldwide.
Declaration of Helsinki and Belmont Report
In 1964, the Declaration of Helsinki further established globally recognized ethical guidelines, emphasizing participant autonomy and scientific rigor. It marked a turning point: physicians assumed clearer responsibilities to protect participants and uphold scientific integrity. In the U.S., the 1979 Belmont Report articulated the principles of Respect for Persons, Beneficence, and Justice following revelations from the Tuskegee Syphilis Study, in which African American participants were misled and denied care.
Dr. Louis Lasagna and the Rise of Modern Ethical Standards
A pioneering clinical pharmacologist and advocate for patient rights, Dr. Louis Lasagna is recognized as a key figure in shaping the ethical and methodological frameworks of modern clinical trials. In the mid-20th century, Lasagna emerged as a thought leader in the movement to formalize informed consent and improve the scientific integrity of human research.
In 1964, he authored a widely adopted revision of the Hippocratic Oath, emphasizing the ethical obligation of physicians to respect patient autonomy—years before formal federal regulations were in place. He also advocated for placebo-controlled trials, helping to validate the use of controls and blinding in drug development. His influence extended to the FDA, where he served as an advisor during the development of stronger regulatory frameworks after the thalidomide crisis.
Lasagna's contributions bridged ethics and science, helping to lay the groundwork for Good Clinical Practice (GCP) standards. He is also remembered for "Lasagna's Law," the observation that the pool of eligible trial participants shrinks dramatically once a study begins, making enrollment far harder than early assumptions of abundance suggest.
The Tuskegee Syphilis Study
From 1932 to 1972, the U.S. Public Health Service conducted the Tuskegee Syphilis Study in rural Alabama, observing the natural progression of untreated syphilis in 399 African American men—all without their informed consent. The men were misled into believing they were receiving free medical care, when in fact treatment was deliberately withheld, even after penicillin became the standard cure in the 1940s. Participants were never told they had syphilis and were subjected to painful procedures under false pretenses. When the press finally exposed the study in 1972, it triggered national outrage, congressional hearings, and a deep reckoning with the systemic racism embedded in public health research.
The Thalidomide Tragedy and the Birth of Modern Drug Regulation
The thalidomide tragedy of the early 1960s profoundly shaped modern drug regulation. Prescribed widely in Europe for morning sickness, thalidomide caused severe birth defects in over 10,000 infants. Although the drug was never approved in the U.S., the near-miss prompted swift legislative action: the 1962 Kefauver-Harris Amendments required proof of both safety and efficacy before drug approval and mandated informed consent for clinical trial participants.
The FDA's Formation and Role
U.S. federal regulation of drugs began in earnest with the 1906 Pure Food and Drug Act, but the 1938 Federal Food, Drug, and Cosmetic Act (prompted by the Elixir Sulfanilamide disaster) required proof of safety before new drugs could reach the market. The 1962 Kefauver-Harris Amendments further required drug efficacy data and rigorous oversight, firmly establishing the FDA as the primary regulator of clinical trials.