
The Bizarre History of Phlebotomy


    A Brief History of Phlebotomy

    Phlebotomy in the past was nothing like the sophisticated and relatively painless process of obtaining blood samples it is today. In fact, the definition of phlebotomy used to be “the art or practice of opening a vein for letting or drawing blood as a therapeutic or diagnostic measure; venesection; bleeding.”

    If it sounds like bloodletting, it was precisely that. In this blog, we’ll go over the history and weirdness of phlebotomy.

    Key Takeaways

    • Galen of Pergamon discovered that veins and arteries carry blood and not just air.
    • The practice of bloodletting continued for over 1400 years.
    • George Washington died due to bloodletting.

    Ancient Origins: Egypt, Greece, and Beyond

    Egypt

    Phlebotomy has been practiced across many ancient civilizations. Historians state that ancient Egyptians were the first ones to practice bloodletting around 1000 BC to cure diseases like acne or the plague, and some also believe it was practiced to cast out evil spirits.

    Greece

    While bloodletting has been a significant part of numerous cultures, its origins are usually traced to ancient Greece. In the 2nd century AD, Galen of Pergamon discovered that arteries carry blood, not just air as was previously believed.

    It was a huge breakthrough; however, Galen’s understanding of how blood moved through the body was limited. He believed that blood does not circulate but flows outward from the liver through the veins, and that upon reaching the ends of the veins, both blood and air become stagnant. Beyond that, he also believed that blood passed from the right to the left ventricle through invisible pores in the septum.

    Since Galen was a dominant medical figure in Rome, he promoted the idea that humoral imbalances come from specific organs and claimed that blood was the most significant humor.

    According to humoralism, four bodily fluids—blood, yellow bile, black bile, and phlegm—determined a person’s temperament, and imbalances led to sickness.

    He also advocated leeching, in which leeches were applied to draw out the supposedly stagnant blood and restore the patient’s health.

    Given the importance of his breakthrough and his standing in the medical world at the time, his practices were soon adopted throughout Europe and Asia.

    Middle Ages

    Because Galenic physiology remained the dominant framework for the next 1400 years, bloodletting was a common practice in the Middle Ages. What marked a significant change, however, was that bloodletting was no longer performed by physicians but by barber-surgeons.

    These barber-surgeons also performed amputations and pulled teeth. Believe it or not, the red stripe on a barber pole originated from the time when barbers performed bloodletting; the red stripe was symbolic of the bloodletting portion of their profession, as blood-soaked towels were usually hung outside of their shops.

    These barber-surgeons didn’t appear out of nowhere, however. Monastery monks hired barbers because the monks were required to keep their heads shaven for health and sanitary reasons. Soon enough, the monks trained these barbers to perform bloodletting, giving rise to the barber-surgeon.

    During the Middle Ages, the two most common methods of bloodletting were:

    • Making an incision into a vein in the forearm or neck. 
    • Scraping the skin and using a suction cup to drain blood.

    At some point in the 16th century, surgery became more sophisticated, and the job of bloodletting went back to the physicians.

    The Renaissance and Enlightenment

    You might think that as science advanced during the Renaissance and Enlightenment, practices like bloodletting would fade away. But you’d be wrong. In fact, bloodletting reached its absolute peak of popularity during this time.

    Despite groundbreaking discoveries, like William Harvey’s 1628 publication demonstrating the circulation of blood (which completely debunked Galen’s theories), the tradition of bleeding patients persisted with incredible stubbornness. Physicians developed complex charts detailing which specific veins to bleed for which specific ailments, aligning the practice with astrology and calendars.

    The tools also became more “sophisticated.” The lancet, a spring-loaded device with a blade, was used for a quick, “clean” cut into a vein. For a more intense session, a scarificator was used—a brass box with multiple blades that would spring out to make a series of parallel cuts. A heated cup was then placed over the cuts to create a vacuum and draw out the blood. It was a brutal, systematized version of the ancient practice.

    The 18th Century

    The 18th century, often called the Age of Enlightenment, was ironically the golden age of bloodletting. This period saw the rise of a medical philosophy known as “heroic medicine,” which championed aggressive and often violent treatments. The thinking was that a severe disease required an equally severe intervention to shock the body back into balance. Bloodletting was the star player in this philosophy.

    Physicians of this era carried lancets and bleeding bowls as essential tools of the trade. The practice was no longer just for dire illnesses; it became a routine prescription for almost any complaint, including indigestion, acne, mood swings, and the common cold. It was so commonplace that many people would be bled seasonally as a preventative measure to “purify” their blood and maintain good health.

    A leading figure of this movement in America was Dr. Benjamin Rush, a signer of the Declaration of Independence. During the 1793 yellow fever epidemic in Philadelphia, he advocated for massive bloodletting, believing the disease was caused by a “morbid excitement” in the arteries that could only be relieved by draining huge quantities of blood.

    The 19th Century

    The 19th century was the critical turning point where tradition finally clashed with scientific evidence, marking the beginning of the end for routine bloodletting. This century was a battleground between the old guard of heroic medicine and a new generation of skeptical, data-driven physicians. Several key developments led to bloodletting’s downfall:

    1. The Rise of Statistics: In the 1830s, French physician Pierre Charles Alexandre Louis conducted one of the first clinical trials. By carefully tracking and counting the outcomes of pneumonia patients, his numerical method demonstrated with hard data that patients who underwent bloodletting had no better survival rates—and in some cases, fared worse—than those who did not. This was a revolutionary blow to a practice built on centuries of anecdote.
    2. New Diagnostic Tools: For the first time, doctors had new ways to see what was happening inside the body without resorting to theory. The invention and adoption of the stethoscope, the clinical thermometer, and early blood pressure measurement devices gave physicians objective data about a patient’s condition, reducing the need for speculative treatments like bleeding.
    3. The Germ Theory of Disease: The groundbreaking work of Louis Pasteur and Robert Koch in the latter half of the century completely upended the humoral theory. Once it was understood that specific illnesses were caused by microorganisms (germs) and not an imbalance of internal fluids, the entire rationale for bloodletting collapsed. You can’t bleed out a bacterial infection.

    By the end of the 19th century, bloodletting had been relegated from a primary medical treatment to a fringe practice, viewed by the emerging scientific medical establishment as a symbol of an archaic and dangerous era. This decline cleared the way for phlebotomy to be reborn not as a treatment, but as the essential diagnostic procedure we know today.

    Infamous Examples: When Bleeding Turned Deadly

    The history of bloodletting is littered with tragic examples of its failure. Two of the most famous cases highlight just how dangerous this cure could be.

    George Washington (1799)

    The first President of the United States fell ill with a severe throat infection, likely epiglottitis. Over the course of less than 12 hours, his physicians drained an estimated 5 pints (about 2.4 liters) of blood from him, which is roughly 40% of an adult male’s total blood volume. Weakened and in shock from the massive blood loss, George Washington died shortly after. Many modern historians believe that while his illness was severe, the aggressive bloodletting was the primary factor that hastened his death.

    King Charles II of England (1685)

    After suffering a seizure, King Charles II was subjected to a horrifying array of treatments by his royal physicians. The first thing they did, naturally, was let 16 ounces of blood from his arm. When that didn’t work, they followed up with cupping, blistering, purges, and enemas. Over the next few days, they bled him repeatedly. The king’s health rapidly deteriorated, and he died within a week. The cure was, in this case, a form of torture that almost certainly contributed to his demise.

    The Rise of Modern Phlebotomy

    The contributions made by physicians like Pierre Louis and the acceptance of the germ theory of disease finally put an end to bloodletting.

    However, the practice of accessing veins didn’t disappear; it transformed. Instead of being a treatment, it became a powerful diagnostic tool.

    Today, modern phlebotomy is a safe, precise, and essential medical procedure. Here’s how it works:

    1. Preparation: A trained professional, a phlebotomist, identifies a suitable vein, usually in the arm. They apply a tourniquet to make the vein more prominent.
    2. Sterilization: The site is cleaned thoroughly with an antiseptic wipe to prevent infection.
    3. The Draw: Using a sterile, single-use needle attached to collection tubes (often a vacuum-sealed system like a Vacutainer), the phlebotomist draws a small, carefully measured amount of blood—usually just a few milliliters.
    4. Analysis: The blood sample is then sent to a lab where it can be analyzed for hundreds of different markers, from cholesterol levels and blood cell counts to the presence of diseases and genetic conditions.

    The only remnant of bloodletting that survives today, therapeutic phlebotomy, is used for a few rare medical conditions in which a patient’s body produces too many red blood cells (polycythemia) or too much iron (hemochromatosis). But even this is a highly controlled, sterile procedure based on modern diagnostics.

    Want to learn the art of safe and modern phlebotomy? Choose the CCI Training Center Phlebotomy Technician Program.


    This article is written by

    Carey Maceira
    An accomplished leader in the allied health career education sector, Carey enjoys managing career education programs, teaching, and mentoring adult students. Her success in working in the field drives her to go above and beyond each and every day.


    Frequently Asked Questions (FAQs)

    What physician discovered that both veins and arteries were filled with blood, not air?

    Galen of Pergamon discovered that arteries carry blood and not just air, as it was previously believed.

    When did phlebotomy originate?

    While it is hard to pinpoint the exact origin of phlebotomy, the act of bloodletting is considered the beginning of modern phlebotomy.

    What were the most common methods of bloodletting?

    Cutting into a vein on the forearm and cupping were the two most commonly used bloodletting methods.

    Is bloodletting still practiced today?

    Yes. In some parts of the world, cupping and leeching are still practiced as a form of alternative medicine.
