Global Health
Following World War II, the deadliest conflict in history, the World Health Organization (WHO) was formally founded on April 7, 1948. Its vision aimed for universal health, advocating that the highest standards of care should be accessible to everyone.
The World Health Organization
From the early 20th century, international health agencies like the Pan American Health Organization (1902), L’Office International d’Hygiène Publique (1907), and the Health Organization at the League of Nations (1923) emerged, focusing on disease control, eradication, and quarantine measures.
Formally established on April 7, 1948, inheriting tasks from previous organizations, the WHO was mandated to promote the highest possible level of health globally. Initially working with a $5 million budget, it tackled malaria, tuberculosis, venereal diseases, leprosy, trachoma, and children’s health.
The WHO Today
By 2020, the WHO had 194 member states and a $4.2 billion budget. Key functions included issuing global health guidelines, sanitary regulations, health education, mass vaccination campaigns, and collecting global health data. The WHO’s notable achievement was smallpox eradication.
In the face of the 2020 COVID-19 pandemic, the WHO served as a global information centre, providing advice, scientific updates, and global mortality figures. World Health Day on April 7 commemorates the WHO’s establishment, aiming to promote global health awareness.
Wash Your Hands: Hygiene
Ancient texts mentioning regular bathing and head-shaving to prevent lice reveal an awareness of cleanliness and good hygiene among the Egyptians, Greeks, and Romans. Surprisingly, for the next 2,000 years the significance of hygiene went largely unappreciated. During the medieval era, public health conditions declined as urban populations swelled and personal hygiene took a back seat. It was in the 1840s that two astute physicians, Ignaz Semmelweis in Austria and Oliver Wendell Holmes in the United States, discerned the critical connection between inadequate hygiene and ill health.
Dialysis
In the 1920s, German physician Georg Haas pioneered early kidney dialysis attempts on human patients. Using various machines of his own design, he initially employed hirudin, an anticoagulant derived from leech saliva, which caused allergic reactions. Later shifting to heparin, an anticoagulant that occurs naturally in humans, Haas still faced challenges: his procedures were too brief to yield therapeutic effects, and none of his patients survived.
Kolff’s Innovation
The pivotal moment arrived in 1945, when Dutch physician Willem Kolff conducted a week-long dialysis on a 67-year-old patient with acute kidney failure. Employing a rotating drum dialyser, the precursor to modern kidney dialysis machines, Kolff ingeniously crafted the apparatus from wooden bed slats, cellophane sausage casing, and an electric motor. Blood, mixed with heparin, traversed cellophane tubes wound around the rotating drum through an electrolyte solution (dialysate). This innovative process filtered toxins out through diffusion, approaching equilibrium before the purified blood returned to the patient.
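The principle at work in the drum is passive diffusion: solutes cross the cellophane membrane from the side of higher concentration until both sides approach equilibrium. A minimal two-compartment sketch of that equilibration, with invented concentrations and rate constant rather than clinical values:

```python
# Illustrative only: solute diffusion across a semipermeable membrane,
# as in Kolff's rotating drum. Concentrations and the rate constant k
# are invented for demonstration, not clinical values.

def dialyse(blood_conc, dialysate_conc, k=0.1, steps=50):
    """Step a simple two-compartment diffusion model toward equilibrium."""
    for _ in range(steps):
        # Flux is proportional to the concentration gradient (Fick's law).
        flux = k * (blood_conc - dialysate_conc)
        blood_conc -= flux
        dialysate_conc += flux
    return blood_conc, dialysate_conc

# A urea-like toxin: high in blood, absent from fresh dialysate.
blood, dialysate = dialyse(blood_conc=40.0, dialysate_conc=0.0)
print(f"blood: {blood:.2f}, dialysate: {dialysate:.2f}")  # both near 20.0
```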
Evolution to Modern Dialysis
Hemodialysis, employing a dialyser to filter blood, remains the predominant method, but peritoneal dialysis has emerged as a viable alternative for approximately 300,000 kidney patients globally. This at-home procedure involves dialysis solution flowing into the abdomen via a catheter, where the peritoneum filters out waste products, which are subsequently drained from the body. Patients can perform this process multiple times a day or opt for machine-operated dialysis during sleep.
While the technology pioneered by Kolff thrives, the primary challenge lies in the overwhelming number of kidney failure patients. With over two million people worldwide undergoing dialysis (90,000 of whom receive kidney transplants), treatment is estimated to reach only about a tenth of those in dire need, due to barriers of access and affordability.
Steroids and Cortisone
Cortisol’s Curative Power
The pivotal turning point came in 1948, when American physician Philip Hench and his colleagues discovered that cortisol, a hormone naturally produced in the adrenal cortex, could effectively alleviate rheumatoid arthritis. This breakthrough laid the foundation for the development of a novel class of anti-inflammatory drugs known as corticosteroids, or steroids. Hench, along with colleague Edward Kendall and Swiss researcher Tadeus Reichstein, was awarded the Nobel Prize in Physiology or Medicine in 1950 for their groundbreaking contributions.
Jaundice Clues and Insights
In the mid-19th century, British physician Alfred Garrod’s distinction between rheumatoid arthritis and gout provided crucial groundwork for subsequent research. By the 1920s, the prevailing belief linked most cases of rheumatoid arthritis to infections, but Hench questioned this hypothesis. Notably, in 1929, while leading the Mayo Clinic’s rheumatic diseases department, Hench observed that a patient’s rheumatoid arthritis improved after developing jaundice. Further exploration revealed similar relief in patients with various arthritic conditions during jaundice or pregnancy.
The Search for Substance X
By 1938, Hench had meticulously studied over 30 cases where jaundice offered temporary relief to arthritis symptoms. Recognizing that conditions like pregnancy and allergies also induced similar relief, he hypothesized a natural steroid hormone’s involvement. Collaborating with Kendall, they delved into adrenal glands as the potential source of this hormone, eventually identifying and naming it “substance X.”
Throughout the 1930s, Kendall and others researched cortin, an adrenal cortex extract containing vital hormones. By 1941, Hench and Kendall had identified “compound E” as the sought-after substance. After a hiatus during World War II, post-war collaboration with Merck allowed Kendall to produce larger quantities of compound E, enabling Hench to conduct crucial studies on rheumatoid arthritis. This pioneering research not only revolutionized understanding but also paved the way for effective treatments for autoimmune disorders.
Lithium and Bipolar Disorder
The Guinea Pig Experiment
Intrigued by his hypothesis, Australian psychiatrist John Cade conducted experiments injecting urine from bipolar patients into guinea pigs. Astonishingly, urine from manic patients proved more lethal for the animals than urine from non-bipolar individuals. Upon introducing lithium, a former treatment for gout, into the equation, Cade observed a significant reduction in toxicity. Large doses of lithium also induced passivity in the guinea pigs. Drawing parallels, Cade administered lithium to ten bipolar patients, witnessing remarkable improvements. Despite minimal recognition when published in 1949, these findings spurred further exploration of lithium’s potential.
Mogens Schou’s Validation
Danish psychiatrist Mogens Schou provided crucial validation in 1970 through research affirming lithium’s efficacy in treating bipolar disorder, and the United States officially approved lithium as a medication for bipolar disorder that same year. European countries had already recognized lithium as a treatment for mania since the 1960s. Today, lithium stands as a primary medication for managing bipolar disorder, a testament to the transformative impact of Cade’s pioneering research and Schou’s subsequent endorsement.
Chlorpromazine and Antipsychotics
World’s First Antipsychotic
In 1952, French psychiatrists Jean Delay and Pierre Deniker of St Anne’s Hospital, Paris, began utilizing chlorpromazine to address mania and schizophrenia in inpatients. The drug exhibited remarkable efficacy in managing agitation and overexcitement, earning its classification as a “major tranquillizer,” later recognized as an “antipsychotic.”
Following successful small-scale trials conducted by Canadian psychiatrist Heinz Lehmann, chlorpromazine entered widespread use in the United States in 1954. By the 1960s, it became a prevalent prescription in Europe and North America, particularly for patients dealing with schizophrenia and bipolar disorder. Its mechanism of action involves blocking dopamine receptors in the brain, reducing the transmission of messages between brain cells and alleviating psychotic symptoms like delusions and hallucinations. Notably, chlorpromazine also diminished the reliance on treatments such as electroshock therapy.
Despite subsequent developments in antipsychotic medications since the 1960s, none have matched the success of chlorpromazine, establishing its status as the world’s first and enduringly impactful antipsychotic.
Behavioural and Cognitive Therapy
Amidst the post-World War II era, the pressing need for efficient short-term treatments for anxiety and depression in returning troops catalyzed a revolutionary shift in psychological therapy. Behavioural therapy, rooted in measurable external factors and rejecting Freudian introspection, gained prominence. American psychologist B.F. Skinner, a trailblazer in behaviorism, developed a science of behaviour by 1953, laying the foundation for modern psychotherapeutic practices and paving the way for cognitive behavioural therapy (CBT).
Conditioning Behavior
Skinner’s theories were built upon the research of earlier behaviorists Ivan Pavlov and John Watson. Pavlov’s classical conditioning experiments with dogs in the 1890s showcased learned responses. Skinner expanded on this concept, introducing “operant conditioning” in 1938. This process involved shaping behavior through positive or negative reinforcement, rewarding steps toward desired behavior and discouraging undesired conduct.
Cognitive Revolution
The 1960s marked a cognitive revolution, challenging Skinner’s exclusively behaviorist approach. Cognitive therapists like Aaron Beck argued that conditioned responses alone couldn’t explain all behaviors; distorted or dysfunctional thinking played a crucial role. Beck’s cognitive approach involved identifying, evaluating, and correcting negative perceptions or automatic thoughts, anchoring the foundation of cognitive therapy.
Integration into Cognitive Behavioral Therapy (CBT)
Therapists in the 1970s began merging behavioral and cognitive theories, giving rise to CBT. This comprehensive approach focused on modifying visible behaviors and reprogramming conscious thoughts, showcasing remarkable efficacy in numerous studies.
A Third Wave
The 1990s ushered in a third wave of therapies, broadening the scope of CBT. This wave emphasized transforming individuals’ relationships with thoughts and emotions rather than altering thought content. Though CBT continually evolves, its roots in scientific experimentation and clinical case studies persist. Skinner’s enduring emphasis on reinforcement as a catalyst for behavioral change continues to shape the landscape of psychotherapy and psychology as a science.
Ultrasound
The groundbreaking application of ultrasound technology in obstetrics was spearheaded by British physician Ian Donald at the University of Glasgow in 1956. Collaborating with engineer Tom Brown and obstetrician John McVicar, they engineered the first successful ultrasound diagnostic scanner.
Utilizing high-frequency sound waves beyond human hearing, ultrasound, or medical sonography, emerged as a noninvasive alternative for obtaining crucial information about the fetus. Unlike X-rays, which pose radiation risks to the fetus, ultrasound employs a transducer to send waves into the body and capture the returning echoes. A computer then transforms these echoes into detailed images.
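The core calculation behind every such scan converts echo timing into depth. Taking the average speed of sound in soft tissue as roughly 1,540 m/s (a standard calibration assumption), the depth d of a reflecting boundary follows from the round-trip time t of the echo:

$$ d = \frac{c\,t}{2}, \qquad c \approx 1540\ \text{m/s} $$

The factor of two accounts for the pulse travelling to the boundary and back; an echo returning after about 65 microseconds, for instance, corresponds to a boundary roughly 5 cm deep.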
Diagnostic Firsts
Ian Donald’s endeavours followed earlier experimentation with ultrasound as a diagnostic tool. In 1942, Austrian neurologist Karl Dussik and his brother, Friedrich, ventured into locating brain tumours through ultrasound beam transmission measurements. American George Ludwig, in the late 1940s, utilized ultrasound to detect gallstones in animals. Concurrently, British physician John Wild, aided by electrical engineer John Reid in 1951, developed the first handheld contact scanner.

The narrative expanded with the endeavours of Swedish cardiologist Inge Edler and German physicist Hellmuth Hertz in 1953. They achieved the first successful echocardiogram, utilizing ultrasound to delve into the intricacies of heart function. These collective milestones highlight the transformative journey of ultrasound, from early explorations to Ian Donald’s pivotal contribution to shaping modern diagnostic practices.
Chromosomes and Down Syndrome
In 1958, Marthe Gautier, a pioneering French paediatrics researcher in Paris, made a groundbreaking revelation about Down syndrome. Delving into slides at a hospital laboratory, she observed that individuals with Down syndrome possessed three copies of chromosome 21 instead of the typical two.
Two years prior to Gautier’s breakthrough, geneticists at Lund University, Sweden, had already unveiled a fundamental aspect of human genetics. They established that nearly all body cells contain 23 pairs of chromosomes, totalling 46, with one chromosome of each pair inherited from each parent. In contrast, sperm and egg cells possess a single set of 23 unpaired chromosomes. Fertilization merges these cells into a new entity with 23 pairs of chromosomes.
The Intricacies of Trisomy
Current genetic understanding affirms that the presence of an additional copy, termed trisomy, can occur during meiosis, the process of producing sperm and egg cells in reproductive organs. While trisomy can manifest in any chromosome, trisomy 21, leading to Down syndrome, is the most prevalent, impacting approximately one birth in every 1,000. This genetic anomaly results in distinct physical features, such as a flatter facial profile and poor muscle tone, alongside a spectrum of learning disabilities from mild to moderate.
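The arithmetic behind these counts is worth making explicit. In typical fertilization each gamete contributes 23 chromosomes; when nondisjunction during meiosis leaves a gamete with an extra chromosome 21, the resulting cell carries 47:

$$ \underbrace{23}_{\text{gamete}} + \underbrace{23}_{\text{gamete}} = 46 \ \text{(typical)}, \qquad \underbrace{24}_{\text{gamete with extra chr. 21}} + 23 = 47 \ \text{(trisomy 21)} $$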
Beyond Down syndrome, trisomy 18 can cause Edwards syndrome, characterized by heart defects, affecting around one in 6,000 newborns. Marthe Gautier’s pivotal discovery, coupled with subsequent genetic insights, has significantly enhanced our comprehension of these chromosomal abnormalities and their diverse impacts on human development.
Interferon
In 1957, virologists Alick Isaacs and Jean Lindenmann at the U.K.’s National Institute for Medical Research unveiled interferon, a pivotal class of cytokines renowned for impeding viral infections.
Viruses, orchestrating their spread by commandeering cells, encounter interference from the body’s natural defence mechanism – interferons.
Amidst initial enthusiasm for interferon’s potential in antiviral drug development and its prospects in cancer treatment due to its inhibitory impact on cell growth, challenges surfaced. In the 1960s, Finnish scientist Kari Cantell discovered that infecting white blood cells with the Sendai virus prompted the production of alpha interferon. The 1980s witnessed a breakthrough in genetic engineering at a Swiss laboratory, leading to mass production of alpha and other interferons.
While early animal research hinted at interferon’s promise in cancer suppression, its application in patients revealed significant side effects, including flu-like symptoms, nausea, and severe depression. Despite these challenges, low doses of interferon persistently find application in treating various cancers, hepatitis, and multiple sclerosis, underscoring its enduring role in medical interventions.
Pacemakers
The human heart, orchestrating over two billion beats in an average lifetime, pulsates with remarkable regularity. Yet, for approximately three million people globally, maintaining this rhythm hinges on the intervention of artificial pacemakers.
In 1951, Canadian engineer John Hopps pioneered the first functional pacemaker—an external, cumbersome, mains-powered apparatus that patients manoeuvred on a trolley. Seven years later, Swedish engineer Rune Elmqvist and cardiac surgeon Åke Senning, leveraging small batteries and compact transistors, crafted an implantable pacemaker, marking a pivotal shift from external to internal cardiac regulation.
In a courageous leap, Else-Marie Larsson convinced Elmqvist and Senning to test the device on her ailing husband, Arne. Racing against time, Elmqvist fashioned components from resin in a plastic cup, and on October 8, 1958, Senning implanted the prototype. While the initial model needed replacement the next day, a subsequent iteration functioned flawlessly. Arne Larsson received 25 more pacemakers over 43 years and lived to the age of 86.
The 1960s heralded patient-controlled, variable-rate pacemakers, enhancing adaptability. Lithium batteries, introduced in 1972, extended battery life from a mere two years to a decade. Recent strides feature pill-sized pacemakers and sensors enabling automatic adjustments in heart pace based on body activity, exemplifying the ongoing commitment to refining cardiac care.
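The “automatic adjustments” these sensors enable can be pictured as a simple control loop: an activity signal nudges the pacing rate between a programmed resting rate and an upper limit. The sketch below is purely illustrative; the rates and the linear mapping are invented for demonstration and bear no relation to any real device’s firmware.

```python
# Illustrative rate-adaptive pacing logic. All numbers are invented
# for demonstration and do not reflect any real pacemaker's design.

def target_rate(activity, rest_rate=60, max_rate=120):
    """Map a normalized activity signal (0.0-1.0) to a pacing rate (bpm)."""
    activity = min(max(activity, 0.0), 1.0)   # clamp the sensor reading
    return rest_rate + activity * (max_rate - rest_rate)

for level in (0.0, 0.3, 0.8):
    print(f"activity {level:.1f} -> pace at {target_rate(level):.0f} bpm")
```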
Lymphocytes and Lymphatics
The lymphatic system, a vital drainage network and a linchpin in infection defence, relies on its intricate vessels, known as lymphatics. These vessels transport toxins and waste in lymph fluid, while lymphocytes, a subtype of white blood cells, discern and combat invading germs.
In 1959, British physician James Gowans achieved a breakthrough by uncovering the circulation of lymphocytes between the lymphatic system and the bloodstream. This revelation marked a pivotal stride in comprehending the indispensable role played by lymphocytes and lymphatic circulation in the body’s immune defence.
A Vital System
Following the delivery of nutrients and oxygen by blood to body cells, blood plasma carries away cellular waste. Some plasma, along with other fluids, infiltrates body tissues and eventually drains into lymphatic vessels, forming lymph—a clear fluid akin to blood plasma. Lymph meanders through the body, purging cellular waste before rejoining the bloodstream. The system encompasses approximately 600 nodes housing a meshlike tissue that filters lymph, ensnaring germs and toxins. Lymph is also the carrier of lymphocytes, minuscule pale blood cells initially described by British surgeon William Hewson in 1770.
While lymphocytes were identified in inflammatory reactions and bacterial diseases, their transient existence puzzled researchers until Gowans’ revelation. Contrary to previous assumptions, he demonstrated that lymphocytes don’t vanish but are absorbed into the lymphatic system. They circulate through tissues and lymph nodes before returning to the bloodstream. Far from being short-lived, lymphocytes boast a lifespan of up to 15 years, continuously recirculating. Gowans proposed that these cells, which carry antibodies, disperse them throughout the body by traversing tissues. Furthermore, he revealed that lymphocytes, through interaction with antigens on a pathogen’s surface, initiate an immune response. Today, they are acknowledged as the central cells orchestrating the body’s targeted, adaptive immune system.
Hormonal Contraception
In the mid-20th century, while condoms and diaphragms dominated contraception, the scientific groundwork for hormonal contraception had been laid since the 1920s. In 1951, birth control advocate Margaret Sanger urged biologist Gregory Pincus in the U.S. to develop a hormonal contraceptive in pill form. Simultaneously, in Mexico City, chemist Carl Djerassi, employed by Syntex, synthesized norethindrone, an artificial version of progesterone, the female sex hormone.
Recognizing progesterone’s ability to inhibit ovulation in animals, Pincus collaborated with gynecologist John Rock. By 1953, they were conducting trials of a birth control pill for women, later coined “the Pill.” Legal restrictions and opposition from the Catholic Church prompted them to relocate trials to Puerto Rico in 1955. Branded as Enovid, their drug contained hormone levels tenfold higher than modern pills. The 200 volunteers, unaware of the potential side effects, experienced symptoms such as dizziness, nausea, headaches, and blood clots. In 1960, the U.S. Food and Drug Administration approved Enovid as an oral contraceptive, despite its elevated hormone levels (later halved in 1961).
Pincus and Djerassi, pivotal figures in the Pill’s development, have each been dubbed the “father of the Pill” for their groundbreaking contributions. The Pill marked a paradigm shift in contraception, offering women a transformative method that would shape reproductive choices for generations to come.
The FDA and Thalidomide
In 1937, a harrowing incident unfolded as over 100 US citizens, including numerous children, succumbed to the effects of a new drug named elixir sulfanilamide. The elixir, evaluated solely for taste and appearance, lacked toxicity testing. While sulfanilamide itself was benign, the solvent diethylene glycol in which it was dissolved proved fatal. The ensuing public outcry catalyzed the passage of the 1938 Food, Drug, and Cosmetic Act, laying the foundation for drug control mechanisms in the U.S. The law mandated companies to establish the safety of new drugs and permitted government inspections of manufacturing facilities.
Within the Food and Drug Administration (FDA), a pivotal figure emerged in the form of pharmacologist Frances Oldham Kelsey. Even while pursuing her doctorate, Kelsey was part of the team responsible for drug approvals. In 1960, she assumed a critical role in evaluating thalidomide, a drug proven effective for alleviating nausea in pregnant women. Despite thalidomide’s approval in 40 countries, Kelsey, exercising meticulous scrutiny, rejected its clearance.
The repercussions unfolded in 1961, when reports surfaced in Germany and the U.K. revealing severe congenital defects in babies born to mothers who had taken thalidomide. The drug, traversing the placenta, induced deformities in the fetus. While over 10,000 children worldwide were affected, half of them dying within months of birth, Kelsey’s vigilance resulted in only 17 cases in the U.S. This tragic episode underscored the imperative need for stringent drug regulation, ultimately reshaping the landscape of pharmaceutical oversight.
Tobacco and Lung Cancer
As of 2018, global data from the World Health Organization revealed lung cancer as the most prevalent cancer worldwide, with 2.1 million diagnoses and 1.76 million deaths, nearly one in five of all cancer-related fatalities. A staggering 80% of these deaths were attributed to tobacco smoking. Despite the stark correlation between tobacco and lung cancer, cigarette companies vehemently denied the association for decades, funding biased research to support their stance and deploying statisticians to challenge contrary evidence.
The British Doctors Study
In 1951, epidemiologists Richard Doll and Austin Bradford Hill initiated the British Doctors Study, aiming to quantify the connection between smoking and lung cancer. Surveying over 40,000 physicians, most of them smokers in that tobacco-heavy era, the study ran until 2001. By 1965, it had unequivocally established that smokers faced elevated risks of lung cancer and other ailments, with those who began smoking before World War II losing an average of ten years of life. To fortify the findings against challenges from tobacco companies, Hill applied what became known as the Bradford Hill criteria for inferring causation.
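The kind of quantification Doll and Hill pioneered boils down to comparing disease rates between exposed and unexposed groups. A minimal sketch with hypothetical cohort numbers (not the study’s actual data):

```python
# Hypothetical cohort numbers, for illustration only; these are not
# the British Doctors Study's actual figures.
smokers, smoker_deaths = 25_000, 350         # lung cancer deaths among smokers
nonsmokers, nonsmoker_deaths = 15_000, 15    # and among non-smokers

risk_smokers = smoker_deaths / smokers            # 0.014
risk_nonsmokers = nonsmoker_deaths / nonsmokers   # 0.001
relative_risk = risk_smokers / risk_nonsmokers    # 14.0

print(f"Relative risk of lung cancer death: {relative_risk:.1f}x")
```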
Triggering Cancer
While radon gas, asbestos, and air pollution contribute to lung cancer, around 8% of cases have genetic origins linked to mutations on chromosomes 5, 6, or 15. However, the predominant cause remains tobacco smoke, laden with carcinogens triggering cancer. These substances activate oncogenes and suppress natural tumour suppressor genes, propelling abnormal cell proliferation.
Conditions like emphysema and bronchitis, induced by inhaled particulates, heighten lung cancer susceptibility. Treatment advancements, from the first pneumonectomy in 1933 by Evarts Graham to radiotherapy in the 1940s and chemotherapy in the 1970s, have improved outcomes. Despite modern approaches combining radiotherapy, chemotherapy, and surgery, lung cancer prognosis remains challenging.
Treating Lung Cancer
In the quest for effective lung cancer treatment, TRAIL therapy has emerged as a promising avenue. TRAIL, also known as CD253, is a cytokine that targets specific cancer cells and holds the potential to destroy them without harming healthy tissue. Administered intravenously, TRAIL faces the challenge of cancer cell resistance, yet ongoing trials continue the pursuit of treatments that could revolutionize cancer care.
Palliative Care
British physician Dame Cicely Saunders envisioned a dignified approach to dying, advocating compassionate treatment and access to effective pain relief. Her groundbreaking theory of “total pain” encompassed the physical, emotional, social, and spiritual aspects of distress. St Christopher’s Hospice, which Saunders founded in London in 1967, became the embodiment of her vision, where individualized care, tailored medical treatment, and holistic support from a specialized team extended until the patient’s final moments.
During a pivotal era in British healthcare, marked by the establishment of the National Health Service (NHS) in 1948, the focus on terminally ill patients was minimal. Many spent their last hours in hospitals with generic pain management, highlighting a gap in end-of-life care.
Evolution in End-of-Life Practices
The presence of a doctor at a patient’s deathbed, now common, was historically rare: doctors primarily aimed to cure disease rather than to aid the terminally ill. In medieval Europe, early death was common, but medical advances in the late 19th century extended life and, with it, brought protracted and painful deaths. Opium or laudanum, administered by a doctor, became crucial during the end-of-life process.
In the early 20th century, finding adequate end-of-life pain relief remained a challenge, with morphine the default remedy. Patients’ anxiety stemmed from both physical pain and the isolation of dying. From 1948, NHS-funded hospitals became the predominant place of death for terminally ill people in Britain.
Contemporary Palliative Care
Pain is now recognized as multidimensional, categorized as physical, psychosocial, emotional, and spiritual. Health professionals employ assessments, including patient histories and situational evaluations, matched with severity tools endorsed by organizations such as the World Health Organization (WHO).
Contrasting the Victorian era’s one-size-fits-all pain relief, contemporary palliative medication is intricate. It involves a nuanced combination of analgesics (opioid and non-opioid painkillers) and adjuvant drugs to manage pain, like antidepressants and anti-anxiety pills. This comprehensive approach reflects Cicely Saunders’ commitment to engaging with patients’ lives and offering comfort.
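One way to picture how severity assessment feeds prescribing is the WHO’s stepwise analgesic ladder. The sketch below is a toy illustration only: the score bands and drug classes are simplified for demonstration, and real palliative prescribing is individualized by clinicians.

```python
# Toy triage of a 0-10 pain score to a simplified treatment step,
# loosely modelled on the WHO analgesic ladder. Bands and drug
# classes are simplified for illustration, not clinical guidance.

def analgesic_step(pain_score):
    """Map a 0-10 pain score to a simplified analgesic-ladder step."""
    if pain_score <= 3:
        return "Step 1: non-opioid analgesic, with or without adjuvants"
    elif pain_score <= 6:
        return "Step 2: weak opioid, plus non-opioid and adjuvants as needed"
    else:
        return "Step 3: strong opioid, plus non-opioid and adjuvants as needed"

for score in (2, 5, 9):
    print(score, "->", analgesic_step(score))
```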
Palliative care not only benefits patients and their families but also frees the broader medical service to perform other crucial tasks. Demand for palliative care is escalating globally due to an aging population, with an estimated 40 million people requiring it annually. However, there is significant progress still to be made: the WHO reported in 2020 that only 14% of those in need received such care.