
Greatest Medical Discoveries In The Past 100 Years

Discussion in 'General Discussion' started by Mahmoud Abudeif, Aug 26, 2019.

    It may be tempting to think that medicine—and, more generally, science—advances in leaps and bounds thanks to the work of geniuses. But the reality is much more quotidian, with knowledge accreting over time. One discovery leads to another in slow succession, and new understandings gradually emerge.

    In the end, the advance of medicine does not depend on the chance appearance of a genius on the scene—although geniuses don't hurt. Rather, it relies on the painstaking research of many dedicated men and women, sometimes carried out over years and even decades.

    With the idea that modern healthcare is rooted in discoveries of the past, let’s take a look at seven major medical milestones of the 20th century.

    Antibiotics: 1929

    Antibiotics represent the pinnacle of human ingenuity, and are—in part—responsible for the prosperity experienced in developed countries across the world. In the 1950s, antibiotics were referred to as “wonder drugs,” and lauded by patients, physicians, and policy makers for their ability to transform once-feared bacterial infections into curable conditions.

    In 1929, Sir Alexander Fleming, FRCS, of St. Mary's Hospital Medical School in London, United Kingdom, shared his observation that the culture medium on which a Penicillium mold grew attacked certain types of bacteria. But chemists and bacteriologists working separately were unable to isolate the active substance from the mold's secretions. In 1940, Oxford researcher Sir Howard Florey brought together a multidisciplinary team that was finally able to isolate the drug penicillin. Funds from the Rockefeller Foundation in the United States, as well as the Medical Research Council in the United Kingdom, helped support clinical testing and laboratory-scale production of the drug.

    Penicillin became widely popular in the years surrounding World War II—not only for the treatment of battle wounds, but also for the treatment of syphilis. Moreover, penicillin led to a surge in healthcare utilization in the post-war era.

    Intriguingly, in the years before the discovery and dissemination of penicillin, infections took on a moral tenor:

    “Even more profound were the moral consequences of the use of the drugs,” wrote Robert Bud, Science Museum, London, United Kingdom, in an article published in BMJ. “Until the mid-1930s prevention rather than cure had been the general means of control of most infections. Injunctions to the healthy were complemented by a moral disdain for those who lapsed and then succumbed to disease. The introduction of antibiotics in the 1940s converted illness into a strictly technical problem. In richer countries the avoidance of ‘germs’ gradually ceased to be a duty.”

    Tissue culture: 1949

    American scientists John Enders, PhD, Thomas Weller, MD, and Frederick Robbins, MD, announced in 1949 that they had grown poliovirus in cultured human embryonic skin and muscle cells, taking tissue culture mainstream. This discovery led to methods of measuring immunity to polio and earned the trio the Nobel Prize in Physiology or Medicine in 1954.

    Ready access to tissue culture ushered in a new era of virus discovery. Interestingly, the development of tissue culture methods would have been impossible without the discovery of antibiotics, which were used to limit bacterial contamination.

    Risks of smoking: 1950

    People had sensed for some time that smoking was harmful before research supported these suspicions. For instance, Henry Ford decried smoking as immoral, and a generation of Americans believed that it could stunt growth. But two landmark case-control studies published in JAMA and the BMJ in 1950 triggered substantial interest in the risks and harms of smoking. Further research was followed by the first-ever drop in the prevalence of smoking, much to the chagrin of Big Tobacco, which was more than willing to fight dirty—particularly through false advertising campaigns—in order to retain market share. Fortunately, as you may have noticed, smoking has now hit an all-time low among US adults.

    Antipsychotics: 1952

    Before the discovery of antipsychotics and other psychotropics, asylums housed patients who were stigmatized, considered dangerous, and given little hope of recovery. Some of these patients received psychoanalysis, but most did not; instead, they were treated more like prisoners than patients.

    By the 1940s, university researchers and drug manufacturers had begun exploring psychopharmacology and developing new compounds to treat psychiatric illness. The groundwork had been laid earlier: acetylcholine was recognized as a neurotransmitter by 1926, antihistamines were identified in 1937, and lysergic acid diethylamide (LSD) followed in 1943. Meanwhile, insulin coma therapy, electroconvulsive therapy, and leucotomy (ie, prefrontal lobotomy), as well as sedatives including bromides, barbiturates, and paraldehyde, were being used to treat those with mental illness.

    In 1950, chemist Paul Charpentier synthesized compound 4560 RP, later named chlorpromazine, a member of the phenothiazine group of antihistamines. Building on the work of various investigators, by 1954 chlorpromazine had been administered in double-blind trials in Canada, the United Kingdom, and the United States. Psychoanalysts refused to accept the drug as a substitute for analytic psychotherapy, but its effects were undeniable: starting in 1956, the number of patients confined to UK asylums dropped substantially, and antipsychotics and antidepressants subsequently became widely used.

    “Without the discovery of drugs such as chlorpromazine we might still have the miserable confinements … a world of desperate remedies,” wrote psychiatrist Trevor Turner, Homerton Hospital, London, United Kingdom, in an article published in BMJ. “Then the attendant's role was akin to a zookeeper's: feeding, scrubbing, and forcibly treating hundreds of ‘demented’ patients. The psychiatric workforce was largely cut off from surgical and physician colleagues, was of poor quality, and was readily mocked.”

    DNA: 1953

    As late as 1952, geneticists didn't know how DNA worked. All of this changed with the 1953 discovery of the double helix by James Watson, PhD, and Francis Crick, PhD. Their discovery of DNA's structure was rooted in Gregor Mendel's 1866 principles of single-gene inheritance, as well as Sir Archibald Garrod's elucidation of the inheritance pattern of alkaptonuria in 1923.

    Drs. Watson and Crick, along with Maurice Wilkins, PhD, were honored with the Nobel Prize in Physiology or Medicine in 1962 for the discovery. But in the eyes of many, the prestige of this award will be forever tinged by sexism. In addition to Dr. Wilkins, Rosalind Franklin, PhD, helped produce the x-ray diffraction images that were instrumental to Drs. Watson and Crick's deduction that DNA is a three-dimensional helix. These images were shared with them without her permission, and she was never credited. Although Nobel Prizes can be bestowed only on living scientists—and Dr. Franklin died of ovarian cancer in 1958, possibly as a result of her work with x-rays—many feel that she never received her due recognition.

    On a more positive note, however, in January 2004, Chicago Medical School (North Chicago, IL) announced its intention to change its name to the Rosalind Franklin University of Medicine and Science.

    Immunology: 1958

    The field of immunology came into its own with the 1958 discovery of histocompatibility antigens (human leukocyte antigens, or HLA) by French researcher Jean Dausset, MD. The immune system uses the pattern of HLA antigens on the surface of cells as a sort of unique biological barcode. When the body does not recognize the HLA antigens on a foreign cell, the host produces antibodies and other substances to attack and destroy it.

    Oral rehydration therapy: 1960s–1970s

    As every physician knows, cholera kills by means of copious fluid loss in the form of diarrhea. Before the advent of oral rehydration therapy, healthcare facilities in India, including West Bengal, lacked enough intravenous solution to rehydrate every patient infected with cholera.

    Both American and Indian researchers worked to hammer out the formulation and administration of oral rehydration therapy in the 1960s and 1970s. A key step was the discovery that glucose enhances the absorption of sodium and water across the intestinal brush-border membrane without causing morphological changes in the gut epithelium of patients with cholera.

    Based on this work, a successful rehydration solution was implemented. In the early 1970s, thousands of starving Bangladeshi refugees poured into refugee camps, and the fear was that a cholera epidemic would ensue. Indeed, 30% of patients who contracted cholera in these camps died within a few days. It was there that Dilip Mahalanabis, MD, Johns Hopkins Center for Medical Research and Training, Calcutta, West Bengal, India, successfully used an oral rehydration solution that replaced the water and electrolytes lost in vomiting and diarrhea. His formulation reduced mortality in the camps to less than 1%.

    Source
     
