Phlebotomy in the past was nothing like the sophisticated and relatively painless process of obtaining blood samples it is today. In fact, the definition of phlebotomy used to be “the art or practice of opening a vein for letting or drawing blood as a therapeutic or diagnostic measure; venesection; bleeding.”
If it sounds like bloodletting, it was precisely that. In this blog, we’ll go over the history and weirdness of phlebotomy.
Key Takeaways

- Phlebotomy began as bloodletting, a practice found in ancient Egypt, Greece, and many other civilizations.
- Galen's humoral theory made bloodletting a standard treatment for roughly 1,400 years.
- In the Middle Ages, bloodletting was performed by barber-surgeons, the origin of the red stripe on the barber pole.
- Bloodletting peaked in the 18th century and collapsed in the 19th, once statistical evidence and germ theory discredited it.
- Modern phlebotomy is a safe diagnostic procedure; therapeutic bloodletting survives only for rare conditions like polycythemia and hemochromatosis.
Phlebotomy has been practiced across many ancient civilizations. Historians state that ancient Egyptians were the first ones to practice bloodletting around 1000 BC to cure diseases like acne or the plague, and some also believe it was practiced to cast out evil spirits.
While bloodletting has been a significant part of numerous cultures, its origins are usually traced to ancient Greece. In the 2nd century AD, Galen of Pergamon discovered that arteries carry blood, not just air, as was previously believed.
It was a huge breakthrough; however, Galen's understanding of how blood moved through the body was limited. He believed that blood does not circulate but flows outward from the liver through the veins, and that upon reaching the ends of the veins, both blood and air stagnate. Not only that, he also believed that blood passed from the right to the left ventricle through invisible pores in the septum.
Since Galen was a dominant medical figure in Rome, he promoted the idea that humoral imbalances come from specific organs and claimed that blood was the most significant humor.
According to humoralism, four bodily fluids—blood, yellow bile, black bile, and phlegm—determined a person’s temperament, and imbalances led to sickness.
He also advocated leeching, the practice of having leeches suck out the supposedly stagnant blood to improve one’s health.
Given the importance of his breakthrough and his place in the medical world at the time, his practices were soon adopted by the rest of Europe and Asia.
Since Galenic physiology held sway for the next 1,400 years, bloodletting remained a common practice in the Middle Ages. What marked a significant change, however, was that bloodletting was no longer performed by physicians but by barber-surgeons.
These barber-surgeons also performed amputations and pulled teeth. Believe it or not, the red stripe on a barber pole originated from the time when barbers performed blood-letting; the red stripe was symbolic of the bloodletting portion of their profession, as blood-soaked towels were usually hung outside of their shops.
However, these barber-surgeons didn’t just pop into existence out of nowhere. Monastery monks hired these barbers as they required a shaven head for health and sanitary reasons. Soon enough, the monks trained these barbers to perform bloodletting and gave rise to barber-surgeons.
During the Middle Ages, the two most common methods of bloodletting were:

- Venesection: cutting into a vein, usually on the forearm, with a blade to let blood flow out
- Cupping: placing heated cups over the skin, often after scoring it, to create suction and draw out blood
At some point in the 16th century, surgery became more sophisticated, and the job of bloodletting went back to the physicians.
You might think that as science advanced during the Renaissance and Enlightenment, practices like bloodletting would fade away. But you’d be wrong. In fact, bloodletting reached its absolute peak of popularity during this time.
Despite groundbreaking discoveries, like William Harvey’s 1628 publication demonstrating the circulation of blood (which completely debunked Galen’s theories), the tradition of bleeding patients persisted with incredible stubbornness. Physicians developed complex charts detailing which specific veins to bleed for which specific ailments, aligning the practice with astrology and calendars.
The tools also became more “sophisticated.” The lancet, a spring-loaded device with a blade, was used for a quick, “clean” cut into a vein. For a more intense session, a scarificator was used—a brass box with multiple blades that would spring out to make a series of parallel cuts. A heated cup was then placed over the cuts to create a vacuum and draw out the blood. It was a brutal, systematized version of the ancient practice.
The 18th century, often called the Age of Enlightenment, was ironically the golden age of bloodletting. This period saw the rise of a medical philosophy known as “heroic medicine,” which championed aggressive and often violent treatments. The thinking was that a severe disease required an equally severe intervention to shock the body back into balance. Bloodletting was the star player in this philosophy.
Physicians of this era carried lancets and bleeding bowls as essential tools of the trade. The practice was no longer just for dire illnesses; it became a routine prescription for almost any complaint, including indigestion, acne, mood swings, and the common cold. It was so commonplace that many people would be bled seasonally as a preventative measure to “purify” their blood and maintain good health.
A leading figure of this movement in America was Dr. Benjamin Rush, a signer of the Declaration of Independence. During the 1793 yellow fever epidemic in Philadelphia, he advocated for massive bloodletting, believing the disease was caused by a “morbid excitement” in the arteries that could only be relieved by draining huge quantities of blood.
The 19th century was the critical turning point where tradition finally clashed with scientific evidence, marking the beginning of the end for routine bloodletting. This century was a battleground between the old guard of heroic medicine and a new generation of skeptical, data-driven physicians. Several key developments led to bloodletting’s downfall:

- Statistical evidence: In the 1830s, the French physician Pierre Louis applied his “numerical method” to patient records and showed that bloodletting did far less for pneumonia patients than its champions claimed.
- Germ theory: The work of Louis Pasteur and Robert Koch replaced humoral imbalance with microorganisms as the explanation for infectious disease, removing bloodletting’s theoretical justification.
- Better physiology: A clearer understanding of blood volume and circulation made it obvious that draining blood weakened patients rather than restoring balance.
By the end of the 19th century, bloodletting had been relegated from a primary medical treatment to a fringe practice, viewed by the emerging scientific medical establishment as a symbol of an archaic and dangerous era. This decline cleared the way for phlebotomy to be reborn not as a treatment, but as the essential diagnostic procedure we know today.
The history of bloodletting is littered with tragic examples of its failure. Two of the most famous cases highlight just how dangerous this cure could be.
The first President of the United States fell ill with a severe throat infection, likely epiglottitis. Over the course of less than 12 hours, his physicians drained an estimated 5 pints (about 2.4 liters) of blood from him, which is roughly 40% of an adult male’s total blood volume. Weakened and in shock from the massive blood loss, George Washington died shortly after. Many modern historians believe that while his illness was severe, the aggressive bloodletting was the primary factor that hastened his death.
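For readers who want to check the arithmetic above, here is a minimal sketch, assuming a total blood volume of roughly 6 liters for an adult male and US pints:

```python
# Sanity check on the Washington figures: 5 pints drained,
# quoted as "about 2.4 liters" and "roughly 40%" of blood volume.
US_PINT_LITERS = 0.473        # liters per US pint
TOTAL_BLOOD_LITERS = 6.0      # assumed adult male total blood volume

drained_liters = 5 * US_PINT_LITERS                 # about 2.4 liters
fraction_lost = drained_liters / TOTAL_BLOOD_LITERS # about 0.4

print(f"Drained: {drained_liters:.1f} L ({fraction_lost:.0%} of total blood volume)")
```

With a smaller assumed blood volume (around 5 liters), the same 5 pints comes closer to half the body's blood, which is why historians describe the treatment as catastrophic either way.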
After suffering a seizure, King Charles II was subjected to a horrifying array of treatments by his royal physicians. The first thing they did, naturally, was let 16 ounces of blood from his arm. When that didn’t work, they followed up with cupping, blistering, purges, and enemas. Over the next few days, they bled him repeatedly. The king’s health rapidly deteriorated, and he died within a week. The cure was, in this case, a form of torture that almost certainly contributed to his demise.
The contributions made by physicians like Pierre Louis and the acceptance of the germ theory of disease finally put an end to bloodletting.
However, the practice of accessing veins didn’t disappear; it transformed. Instead of being a treatment, it became a powerful diagnostic tool.
Today, modern phlebotomy is a safe, precise, and essential medical procedure. A typical blood draw works like this:

- The phlebotomist verifies the patient’s identity and the tests that have been ordered.
- A tourniquet is applied, and a suitable vein, usually in the inner elbow, is located.
- The site is cleaned with antiseptic, and a sterile, single-use needle is inserted into the vein.
- Blood flows into labeled collection tubes; the needle is then removed and the site bandaged.
- The samples are sent to a laboratory for analysis.
The only remnant of bloodletting, known as therapeutic phlebotomy, is reserved for a few rare medical conditions in which a patient’s body produces too many red blood cells (polycythemia) or stores too much iron (hemochromatosis). Even then, it is a highly controlled, sterile procedure guided by modern diagnostics.
Want to learn the art of safe and modern phlebotomy? Choose the CCI Training Center Phlebotomy Technician Program.
FAQs

Who discovered that arteries carry blood?
Galen of Pergamon discovered that arteries carry blood and not just air, as was previously believed.

When did phlebotomy originate?
While it is hard to pinpoint the exact origin of phlebotomy, the act of bloodletting is considered the beginning of modern phlebotomy.

What were the most common bloodletting methods?
Cutting into a vein on the forearm and cupping were the two most commonly used bloodletting methods.

Is bloodletting still practiced today?
Yes. In some parts of the world, cupping and leeching are still practiced as a form of alternative medicine.