Surgery in the 19th and early 20th centuries saw important advances that tackled its three great problems: pain, infection and bleeding:
1) Anaesthesia was developed using nitrous oxide, ether and eventually chloroform, allowing painless operations.
2) Semmelweis and Lister pioneered antiseptic and aseptic techniques, such as handwashing and the use of carbolic acid, reducing post-operative infections.
3) Landsteiner’s discovery of blood groups in 1901, and later developments such as blood banks during the World Wars, made safe blood transfusions possible to treat bleeding.
These breakthroughs transformed surgery from a dangerous last resort into mainstream medical practice.
Nominated Topic 2009: Surgery c.1850-c.1950
The situation at the beginning of the 19th century
In 1800, surgery rarely went beyond setting broken bones, removing growths and performing
amputations. However, surgery was beginning to lose its reputation as a “second rate” branch of
medicine and well-trained surgeons were emerging from medical schools. The first 15 years of the
nineteenth century saw the Napoleonic Wars between France and the rest of Europe, and new surgeons
at this time had lots of practice on the battlefield and the quarterdeck.
As the first half of the 19th century went by, surgeons began to go beyond the traditional operations. In Germany, operations were developed to cure cleft palates and squints. In America, internal cysts were removed. For example, the surgeon Ephraim McDowell removed a 15-pound ovarian cyst from Jane Todd Crawford of Kentucky. The operation lasted 25 minutes, Mrs Crawford sang hymns to drown the pain, and she lived for a further 31 years!
But this was unusual. Mortality rates in surgery were high. In 1800,
about 40% of patients died, mostly through post-operative infections.
Even surgeons who tried hard to be clean, like Spencer Wells in London,
had high mortality rates – Wells’ was 25%.
The three main problems causing this death rate were pain, infection and
bleeding. This meant operations had to be done in the quickest time
possible so that shock, loss of blood and the chance for germs to enter the
wound were minimised. In 1824, it took 20 minutes to amputate a leg through the hip joint; in 1834,
the same operation was done in 90 seconds.
The most important English surgeons at this time were:
• John Abernethy, who became Professor at the Royal College of Surgeons in 1814 and placed great
emphasis on anatomy, turning surgery from a craft into a science.
• Astley Cooper, who operated on King George IV and spent years perfecting his operations on
hernias. He was the first surgeon to successfully operate on an aneurysm (a ballooning in a large
artery which would burst and lead the patient to bleed to death) after practising on a cadaver in the
dead-room next door.
• Robert Liston, who was renowned for holding the blade between his teeth so that he could save
time in operations. He taught anatomy at Edinburgh Medical School from 1818 and then moved to
University College London in 1835.
Hospitals in London provided operating theatres for the greatest surgeons and
operating day was a weekly show with celebrity surgeons performing before
their colleagues, students and the public. However, top surgeons, like Astley
Cooper of Guy’s Hospital, earned vast fees from performing
operations on private patients in their homes.
How was the problem of pain dealt with?
Since the times of the Ancient Greeks, surgeons had tried to find ways to deaden the pain during
operations. The main products used were alcohol and opium – but experience showed that they were
dangerous. For example, alcohol thins the blood, so a drunk patient might not feel much pain but
would probably bleed to death during the operation.
In 1795, Humphry Davy experimented with nitrous oxide (laughing gas) as a painkiller. In 1800, he wrote a report of his experiments, showing that neat nitrous oxide would kill animals, but that when mixed with oxygen it produced reversible unconsciousness. He used it on a human patient to relieve the pain of an inflamed gum and suggested that it might be useful in surgical operations, but no-one took up his ideas. Later, nitrous oxide was used by the dentist Horace Wells in the USA for the extraction of teeth. He developed a bellows system to administer the gas and gave a public demonstration in Massachusetts – which went badly wrong, and his patient suffered in agony. Wells lost medical support, grew depressed, became a drug addict and committed suicide whilst in jail for hurling sulphuric acid at prostitutes.
The next substance to be tried to deaden pain was ether. This had been discovered in the sixteenth century in Germany and had been used to make people cough up phlegm to balance their humours. An American doctor, William Clarke, used ether to deaden the pain of a patient having a tooth extracted in 1842, and another American doctor, Crawford Long, used ether to remove a cyst from a patient’s neck. Both operations were successful. An American dentist, William Morton, then popularised the use of ether in dentistry in the USA.
News of ether soon spread to Europe. The first use of it in England was by Robert Liston in December 1846, during an amputation. The newspaper headlines the next day read: “Hail Happy Hour! We Have Conquered Pain!” However, ether had problems. It irritated the lungs and led to long bouts of coughing. It caused many patients to vomit. It was flammable – which was not good in an age of candles and gas lamps. An alternative anaesthetic was needed.
James Young Simpson discovered the anaesthetic properties of chloroform in 1847. Simpson was a professor of midwifery in Edinburgh. One evening, he took home some chemicals with his assistants to try to find a decent anaesthetic. Someone knocked over the chloroform bottle, and when his wife had dinner brought in, Simpson and his assistants were all found asleep. Later, Simpson tried giving half a teaspoon of chloroform on a rag to a woman in labour and was so pleased with the results that another 30 patients were given chloroform that week.
The key event in the acceptance of chloroform was its use by
Queen Victoria in the birth of Prince Leopold, 7 April 1853.
John Snow administered the anaesthetic and the Queen
recorded in her journal that the effect of chloroform was “soothing, quieting and delightful beyond
measure”. With royal approval, chloroform began to increase in popularity. However, there were
protests against its use including:
• It was seen as cowardly to have pain relief – men were not real men if they took it
• It was seen as anti-religious: the Bible taught that childbirth was supposed to be painful
• It was possible to kill people by giving them an overdose of chloroform
• It actually killed more people in operations – this was true. As the patient was no longer thrashing
about, surgeons began to attempt more risky and longer operations, inevitably killing more patients
through bleeding and infection.
John Snow tackled the problem of overdosing on chloroform by developing a portable inhaler during the 1850s, which regulated how much chloroform was given to patients.
It was accepted that knocking people out entirely was a risk. Surgeons also recognised that not all operations required patients to be totally unconscious. A search developed for substances which would numb a particular area for local surgery – such as the removal of a cyst. Coca leaves were traditionally used in South America for deadening pain and, in 1859, the active ingredient – cocaine – was isolated. By 1885, cocaine was being produced commercially as a local anaesthetic.
A few more developments in dealing with the problem of pain came in the twentieth century:
• In 1902, anaesthetics began to be injected into the bloodstream to control doses even more precisely.
• Synthetic substances for local anaesthesia were developed, such as the ester procaine in 1905.
• In the 1930s, artificial respiration was developed, which allowed surgeons to use curare – a muscle-relaxing drug.
How was the problem of infection dealt with?
In the nineteenth century, it was common for a surgeon to operate in an old blood-caked frock coat and to wash his hands only after the operation. The operating theatre would contain a wooden table and sawdust to soak up the blood. All of these things meant germs were rife and sepsis (infection of the blood and tissue) was common.
The link between sepsis and cleanliness had already been noted. In 1795, Alexander Gordon had argued that mothers who contracted childbed fever after childbirth had been infected by their doctors or midwives, and had recommended that the operator should wash before coming into contact with the mother. In 1843, the American Oliver Wendell Holmes had argued that doctors were bringing “germs” into contact with mothers, but he was dismissed by other doctors.
The sepsis problem came to a head in the 1840s in Vienna. The Vienna General Hospital had the largest maternity clinic in the world. There were two great wards: in Ward One, childbed fever raged and the mortality rate was 29%; in Ward Two, childbed fever was rarer and the mortality rate was 3%. An assistant physician, Ignaz Semmelweis, tried to work out what the difference was. He knew that Ward One was handled by the medical students and that Ward Two was handled by midwifery students. Medical students came straight to the wards from the autopsy rooms with soiled hands and instruments; midwifery students didn’t. When the two groups swapped over, Ward Two became the place to die. His suspicions were confirmed when a doctor cut his finger during an autopsy and died of septicaemia (blood poisoning) with the exact symptoms of the women dying of childbed fever.
In 1847, Semmelweis ordered that everyone in the wards wash their hands with chlorinated water.
Mortality rates plummeted. However, colleagues refused to believe his theories about putrid particles
being passed from the medical students to the women – remember this was before the discovery of
Germ Theory – and resisted his attempts to clean up. Semmelweis left Vienna for Budapest and
introduced chlorine disinfection to the maternity wards at his new hospital. Childbed fever rates fell
below 1%, and in 1861, he published a book on childbed fever. However, in 1865, Semmelweis died
in a Viennese mental asylum.
By this time, antiseptics of a general kind, like Semmelweis’s, were used widely. Iodine was used for
bathing wounds, and other substances like bromine, creosote, zinc chloride and nitric acid were used
for washing. Florence Nightingale had introduced the idea of spotless hospital environments.
Whitewashing walls was common. Surgeons were urged to use soap and it was known that dressings
should be changed. But no-one had yet introduced one clear, effective way of limiting infection and
got everybody to use it. This was the achievement of Joseph Lister.
Lister studied at University College London and became an assistant surgeon in Edinburgh in 1854,
before moving on to head up surgery in Glasgow in 1860. In Glasgow, patients were often contracting
sepsis. He began to research where sepsis was coming from,
experimenting on frogs. Then he read Pasteur’s Germ Theory in
1865 and became convinced that sepsis was being caused by
microbes in the air.
Lister now understood that he needed to get rid of these microbes. Normally skin provided the barrier to these microbes – if the skin was open, he would need a chemical barrier instead. Using knowledge of the carbolic acid used in treating sewage, he dressed a compound tibia fracture with a bandage soaked in carbolic. The boy, James Greenlees, who had been run over by a cart, walked out of the infirmary fully healed in six weeks. The experiment was repeated nine months later and worked again. From this success, Lister worked out a ritual for operations – antisepsis (killing infection in the wound by smothering it in carbolic-drenched dressings and tin foil) and asepsis (preventing new infections entering the wound by spraying the whole operating theatre with carbolic acid). He wrote up all his findings in 1867 in the medical journal The Lancet and went on to develop antiseptic ligatures for wounds and a “donkey engine”, a steam-driven device to spray the operating theatre with carbolic acid.
Criticisms of his methods immediately followed:
• Some doctors denied the existence of bacteria in the air and said all the carbolic was unnecessary.
• Lister kept changing his methods to improve them – but this made people think he was changing them because they didn’t work.
• Carbolic acid slowed down an operation – many people believed that speed was the most important thing.
• When people tried to copy Lister’s methods and weren’t so careful, they didn’t work.
• Nurses and assistants in the operations complained of the carbolic fumes and the damage the acid did to their hands.
By the late nineteenth century, operations were a strange mix of the old and new. Theatres were full of
carbolic, but hands were just rinsed not scrubbed. Operations were still based around setting broken
bones, amputating and removing cysts. The old blood-stained coats were still worn. But at the end of
the century, as Koch’s identification of germs continued, Lister’s ideas were accepted more and more
and his antiseptic (getting rid of germs) surgery began to develop into aseptic (no germs in the first
place) surgery. Rubber gloves were worn (1894) and face masks began to be used (1897). Koch’s
work showed that heat was more effective than carbolic for sterilising surgical instruments and the
spray was abandoned in 1890. By 1910, operating theatres were filled with people wearing sterilised
gowns, masks and gloves, using metal furniture and operating under electric lights. Surgeons were
actively pursuing higher and higher standards of cleanliness to reduce the death rates.
How was the problem of bleeding dealt with?
The idea of blood transfusions had started back in the 17th century, when doctors had tried to pump animal blood into patients, but with no success. In the 19th century, human-to-human transfusions had been tried, but no-one could explain why they sometimes worked and sometimes didn’t.
In 1901, Karl Landsteiner discovered blood groups. He noted that when patients died during a transfusion, it was because their blood was clumping together (agglutinating). Looking carefully at the blood cells under a microscope, he found that their surfaces carried different types of marker substances, which he called antigens. The first type of blood he called Group A and the second Group B. When the antigens of the patient matched the antigens of the transfused blood, all was well. If they didn’t match, the blood would clump and the patient would die. As he continued to test this theory, he discovered that some blood cells had no antigens (so he called them Group O) and some blood cells had two types of antigens (so he called them Group AB). Other blood groups were discovered and Landsteiner received a Nobel Prize for his work in 1930. He went on to discover the rhesus factor – positive and negative blood types – in 1940.
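The matching rule Landsteiner worked out can be summed up in a few lines of code. Below is a minimal sketch in Python of the simplified modern ABO rule (ignoring the rhesus factor he found later); the dictionary and function names are illustrative, not from the source:

    # Antigens carried on the red blood cells of each ABO group.
    ANTIGENS = {"A": {"A"}, "B": {"B"}, "AB": {"A", "B"}, "O": set()}

    def transfusion_safe(donor: str, patient: str) -> bool:
        # Safe when the donor's cells carry no antigen that the
        # patient's own cells lack (simplified modern ABO rule).
        return ANTIGENS[donor] <= ANTIGENS[patient]

    assert transfusion_safe("A", "A")       # matching groups: all is well
    assert transfusion_safe("O", "AB")      # Group O carries no antigens: safe for anyone
    assert not transfusion_safe("AB", "A")  # mismatched antigen: the blood agglutinates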
Landsteiner’s work made it possible to have a successful transfusion every time. The problem was that a donor was needed on the spot. When WW1 broke out and thousands of soldiers were dying of gunshot wounds in the trenches, it was not possible to have donors on the spot. The war drove doctors to find an answer and, in 1914, it was found that sodium citrate stopped blood from clotting when it came into contact with the air. From this, doctors began to experiment with mixing blood with anti-clotting agents so that it could be transported to the Front. In 1916, this was achieved and led to the British Army setting up the first blood depot.
As people saw WW2 approaching in the 1930s, blood banks were set up in readiness for the carnage. In 1940, the American doctor Charles Drew worked out a way to separate the blood cells from plasma and found that, kept separately, these products did not clot. They could be stored and transported anywhere in the world before being put back together when needed in an operation. During WW2, it became possible for civilians to receive blood transfusions. As surgeons returned from WW2 with expertise in using blood banks, they began to demand the same facilities for patients in peacetime, and national blood banks were set up permanently. Britain still has the National Blood Transfusion Service.
What has happened to surgery since these breakthroughs?
As surgery became safer, surgeons became more adventurous in the types of operation they would
attempt. In the late 19th and early 20th centuries, surgeons began digging further into the
body, treating the bowels, the pancreas, the liver and the abdomen. Cancers were cut out, gallstones
removed. Surgeons began to operate on the heart, lungs and brain. Surgery grew in popularity and
people began to see surgery as the way to treat all ills. For example, William Arbuthnot Lane argued
that lengths of gut should be removed to cure constipation. Certain operations became fashionable
such as the removal of children’s tonsils in the 1920s and hysterectomies in the 1930s.
WW1 decisively advanced skin transplants. Shells caused horrific facial injuries. Harold Gillies set up a plastic surgery unit in Aldershot and dealt with 2000 cases of facial damage after the Battle of the Somme. Skin from other parts of the patient’s body was used to cover facial damage and to encourage facial skin to grow back again. Gillies’ cousin, Archibald McIndoe, joined the work in the 1930s, and then treated 4000 airmen from the Battle of Britain in WW2. McIndoe called his work reconstructive surgery, as he took years and many operations to slowly give men back their hands and faces. These advances marked a shift in people’s thinking about surgery – no longer was it just about “cutting things out”; it now began to be about restoring and replacing what was already there.
Reasons for changes: new technology and scientific understanding
The following technology and knowledge moved surgery along:
• Ophthalmoscope – 1851 – allowed the interior of the eye to be seen
• Louis Pasteur’s Germ Theory – 1864 – allowed Lister to make his link to antiseptics
• Oesophagoscope – 1868 – allowed foreign objects in the gullet to be seen and removed
• Robert Koch’s heat sterilisation – 1890 – ended the use of carbolic to sterilise operating theatres
• Rectoscope – 1895 – to see up the rectum
• Wilhelm Röntgen’s X-rays – 1895 – allowed the inside of the body to be seen without opening it up
• Gastroscopes – late 1890s – to see into the stomach
• Microscope used directly on the body – 1899 – to see the cornea
• Karl Landsteiner’s antigens – 1901 – allowed blood groups to be discovered
• Cardiograph – 1903 – allowed the beating of the heart to be monitored during an operation
• Marie Curie’s radiation treatments – 1904 – burned diseased cells away
• First microsurgery – 1921 – performing an operation while looking through a microscope
• Ultraviolet microscope – 1925 – allowed viruses to be seen, due to 2500x magnification
• Flexible gastroscopes – 1930s – to see into the stomach with greater ease and less pain
• Electron microscope – 1934 – revealed cell structure, due to 7000x magnification
• Catheterisation – 1940 – placing tubes in a patient to drain off fluid
• Heart-lung machine – 1940s – to keep the body working during an operation
• Ian Donald’s ultrasound – 1950s – for assessing foetal development