
Why revisit outdated treatments?


Modern health care often celebrates its successes: vaccines, antibiotics, minimally invasive surgery, cancer immunotherapy, and digital diagnostics. Yet the path to these breakthroughs is littered with ideas that failed, sometimes disastrously. Examining the medical practices that societies abandoned, from mercury purges to lobotomies, can feel ghoulish, but it serves an important purpose. These stories remind us why ethics, regulation, resilience, and continuous learning are crucial. They also reveal how social beliefs, limited scientific understanding, and profit motives shaped decisions. By revisiting these lost lessons, we can better safeguard today’s health‑care system and resist unproven “miracle cures” that persist on the fringe.


Mercury cures: toxic purges disguised as medicine


For centuries, physicians used mercury compounds as purported cures for ailments ranging from melancholy and constipation to syphilis, influenza, and parasites. An excerpt from James Delbourgo’s book on 18th‑century medicine recounts how calomel (mercurous chloride) became a popular universal remedy. Calomel was a potent cathartic that induced violent purges; patients would salivate and drool as the mercury destroyed tissues. Doctors took these extreme reactions as evidence that the medicine was working and administered dose after dose.


The consequences were severe. Chronic mercury poisoning caused tremors, cognitive decline, and death. Despite mounting evidence of harm, mercury remained widely used until the early 20th century, partly because there were few alternatives and partly because of entrenched medical doctrine. It wasn’t until the rise of germ theory and the development of safer antibiotics that mercury cures fell out of favor. The lesson is clear: without rigorous testing and the willingness to overturn tradition, dangerous treatments can persist for generations.


Bloodletting and leeches: bleeding away the bad humors


Bloodletting, or venesection, has an even longer history. According to a History.com article, bloodletting likely originated in ancient Egypt and spread to Greece, where physicians like Erasistratus believed illnesses stemmed from an overabundance of blood. The influential physician Galen further developed the humoral theory, asserting that good health depended on the balance of four humors: blood, phlegm, yellow bile, and black bile. His writings made bloodletting common throughout the Roman Empire. In medieval Europe, the practice was the standard treatment for maladies such as plague, smallpox, epilepsy, and gout. Monks were forbidden to perform it, so barbers stepped in, giving rise to the barber–surgeon and the candy‑striped pole that still adorns barbershops.


In the 18th and 19th centuries, bloodletting reached its zenith. It was employed during childbirth (Marie‑Antoinette was bled while in labor) and even for a president’s sore throat: George Washington’s physicians drained an estimated 5–7 pints of his blood in less than 16 hours, and he died shortly afterward. By the late 1800s, new treatments and scientific studies discredited bloodletting, limiting its use to a few conditions, such as hemochromatosis. The practice endured for millennia because it fit the prevailing theory and because its immediate effects (weakness and calmness) were misinterpreted as healing. Modern medicine’s reliance on evidence‑based trials emerged partly in reaction to these tragedies.


Trepanation: drilling holes to release evil spirits

Trepanation, the drilling or scraping of a hole in the skull, is one of the oldest documented surgical procedures. A historical review in Surgical Neurology International notes that trephined skulls have been found across Europe, Asia, and the Americas, dating back to Neolithic times. Prehistoric healers likely performed trepanation to treat intractable headaches, epilepsy, or what they believed was demonic possession. Many patients survived these crude surgeries, as evidenced by bone healing around the edges of the openings.


During the Middle Ages and the Renaissance, trepanation continued for head injuries and “madness.” Artists even depicted the “extraction of the stone of madness,” reflecting the belief that mental illness could be cured by removing an evil object from the brain. Without an understanding of neurology or infection, however, trepanation often led to bleeding, brain damage, and infection. Today, craniotomy is performed for specific medical reasons in sterile conditions, but the history of trepanation reminds us how desperation and superstition can drive dangerous interventions.


Lobotomy: removing personality for psychiatric relief


Few abandoned treatments loom as large in the public imagination as the frontal lobotomy. In the 1930s, Portuguese neurologist António Egas Moniz and surgeon Almeida Lima introduced leukotomy (cutting the brain’s white matter) to treat severe mental illness at a time when asylums were overcrowded and effective treatments were scarce. As described in Surgical Neurology International, the frontal lobotomy was developed to relieve institutional overcrowding and to manage patients for whom no effective options existed. American psychiatrist Walter Freeman popularized the procedure in the United States.


Lobotomy peaked in the 1940s, and an estimated 60,000 procedures were performed in the U.S. and Europe between 1936 and 1956. Freeman even pioneered the infamous “ice pick” lobotomy, a transorbital approach performed in minutes in outpatient settings.

Initially, lobotomy was hailed as a miracle cure that could calm agitated patients. The reality was devastating: patients often emerged emotionally blunted, with impaired judgment, memory problems, and seizures. By the 1950s, evidence of harm had mounted, and new antipsychotic medications such as chlorpromazine offered safer alternatives; the prominent physiologist John Fulton declared the era of lobotomy over in 1952.


Reports followed showing that most lobotomy patients remained incapacitated. The lobotomy saga underscores the dangers of adopting irreversible interventions without evidence and the importance of patient consent and ethical oversight.


Milk transfusions: trying to replace blood with dairy


Perhaps one of the strangest fads of the 19th century was the milk transfusion. In 1854, Canadian physicians James Bovell and Edwin Hodder injected a patient with milk, believing that its fat particles would transform into white blood cells and restore vitality. Their first patient improved briefly, but subsequent patients developed chest pain, eye movement disturbances (nystagmus), and headaches, and most died. Despite poor outcomes, the practice spread, especially for tuberculosis patients. Physician Joseph Howe later experimented on dogs and humans; all of the dogs died, and a woman who received human milk stopped breathing and required resuscitation. By the 1880s, human and animal milk transfusions were discredited, particularly after the development of blood typing and safer transfusion techniques. This episode highlights the danger of assuming that a substance superficially resembling a bodily fluid will behave like one inside the body.


Patent medicines and cure‑alls: poisons in fancy bottles


The 18th and 19th centuries saw a booming industry of patent medicines, proprietary cure‑alls sold via newspapers, mail‑order catalogs, and traveling salesmen. The Oregon Health & Science University exhibit on historic medicines explains that these remedies often contained alcohol, opium, cocaine, or mercury. Manufacturers marketed them as universal tonics for “female weakness,” “nervous exhaustion,” or “bilious attacks,” promising quick and easy cures. They were popular because they were cheaper and more accessible than physician care, and because physicians themselves often used harsh treatments such as bleeding and purging.


These patent medicines were dangerous: children became addicted to morphine‑laden syrups, adults suffered mercury poisoning, and some tonics contained enough alcohol to intoxicate unsuspecting consumers. Public outcry and investigative journalism eventually led to the Pure Food and Drug Act of 1906, which prohibited adulterated and misbranded drugs and required accurate labeling of dangerous or addictive ingredients. This law became the foundation of modern drug regulation. The lesson is that without regulation and transparency, profit‑driven enterprises may exploit consumers’ fears and lack of knowledge.


Cannibalism cures and tobacco smoke enemas: macabre practices


An article from the Association of American Medical Colleges (AAMC) details some of the most macabre cures ever practiced. During the Renaissance, cannibalism cures (also called medical cannibalism) involved consuming human blood, fat, or mummified flesh to cure disease. “Medical vampirism” peaked in the 16th and 17th centuries; people drank fresh blood at executions, believing it could treat epilepsy or weakness. Mummified remains imported from Egypt were ground into powder and taken as painkillers, a practice that persisted until it was outlawed in the 18th century.

Another example is the tobacco smoke enema. In the 18th century, European physicians believed that blowing smoke into a patient’s rectum could revive drowning victims or treat intestinal ailments. Specialized kits were sold for this purpose, and the procedure was popular until the dangers of tobacco were recognized. These practices illustrate how cultural beliefs and fad therapies can override common sense when people are desperate for cures.


Sanatoriums for tuberculosis: isolating instead of treating


Before antibiotics, tuberculosis (TB) was one of the deadliest diseases. With no curative drugs available, physicians believed that fresh air, rest, and isolation were the best remedies. The journal Pathologica describes how public sanatoriums became the primary method of controlling TB. These institutions isolated patients in rural settings and provided rest and nutrition. The demand for sanatoriums reflected growing awareness of public‑health measures: Italy expanded from roughly 12,000 sanatorium beds in 1923 to about 32,000 by 1930, yet even this was insufficient. Wartime conditions worsened TB mortality; deaths in Italy jumped from 50,000 to 73,000 between 1915 and 1918. The antibiotics introduced in the late 1940s finally rendered mass isolation unnecessary. The TB sanatorium era underscores the importance of investing in research and pharmaceutical development rather than relying solely on isolation.


Lessons for today: resilience, regulation, and ethics

Studying these abandoned treatments isn’t mere curiosity; it yields concrete lessons for modern health‑care policy and practice:


  1. Evidence must trump tradition. Bloodletting and mercury purges persisted for centuries because they fit prevailing theories. Modern practitioners must continually question and test assumptions through randomized controlled trials and data analysis. Accelerated drug approval pathways should be balanced with rigorous post‑marketing surveillance.


  2. Resilience requires investment. When outbreaks strike, we need well‑trained personnel and infrastructure. The success of C19_SPACE, a COVID‑era program that rapidly upskilled ICU nurses so hospitals could treat thousands more patients each month, shows how proactive training can save lives. Conversely, TB sanatoriums illustrate that simply isolating patients without investing in cure research prolongs suffering.


  3. Regulation protects the public. The Pure Food and Drug Act of 1906 drove many patent‑medicine poisons off the market, just as today’s FDA regulates drug safety. Calls to loosen oversight of alternative supplements echo earlier eras when unregulated potions hurt people. Vigilant regulation ensures that treatments are safe and effective before they reach the public.


  4. Ethics and consent are paramount. Lobotomies were often performed on vulnerable patients without informed consent, causing irreversible damage. Modern clinical research and practice must respect autonomy, obtain informed consent, and consider the long‑term consequences of interventions.


  5. Beware of profit‑driven quackery. Patent‑medicine peddlers and the creators of tobacco smoke enemas made money exploiting fear. In today’s world, online misinformation about miracle cures and untested supplements can be just as dangerous. Health‑care professionals and regulators must counter misinformation with evidence, education, and enforcement.


  6. Cultural humility matters. Cannibalism cures and trepanation reflect cultural beliefs that modern observers may find disturbing. Today’s clinicians work with diverse populations, and understanding cultural contexts is essential for building trust and delivering ethical care.


  7. Continuous education and adaptability are essential. As knowledge evolves, so must practice. Clinicians need training not only in new technologies like AI and telemedicine but also in critical thinking to evaluate evidence. Health‑care systems should adopt a mindset of perpetual learning.


Conclusion: using history to build a resilient future


Modern medicine has come a long way since tobacco enemas and mercury pills, but new challenges, from AI hype and gene therapies to personalized medicine and pandemics, will test our commitment to evidence, ethics, and regulation. Looking back at the lost lessons of old medicine shows how quickly unproven treatments can take hold when desperation, profit, or tradition outweighs science. It also demonstrates the power of investment in people and research: once antibiotics emerged, TB sanatoriums became obsolete, and once antipsychotics arrived, lobotomies faded.


As we confront future pandemics, chronic‑disease burdens, and health‑care inequities, we should honor the courage of past clinicians while avoiding their missteps. Upholding rigorous scientific methods, cultivating cultural humility, protecting patient autonomy, and investing in education will help ensure that today’s revolutionary treatments do not become tomorrow’s cautionary tales. Through resilience, regulation, and ethical commitment, we can build a health‑care system that learns from the past to safeguard the future.


Replace outdated practices with exceptional talent. Get started with CWSHealth.
