• June 01, 2021 2:12 PM | Anonymous member (Administrator)

    by Sarah Halter

    July 6, 2021 Update: If you missed our May 23, 2021 virtual program on “Medical Education and Body Snatching in Indiana,” never fear! The recording is now available here. One famous incident I mentioned during the program has sometimes been referred to as The Harrison Horror. The story was too long to tell in full, but as promised, here are the fascinating details. [Image: print of body snatchers at work, Library of Congress]

    In the early spring of 1878, General Benjamin Harrison visited his father at his home, Point Farm, near Cincinnati, Ohio. (He was still General Harrison, because he hadn't become the President of the United States yet. That didn't happen until 1889. Learn more about President Harrison from the Benjamin Harrison Presidential Site here.)

    During that visit, John Scott Harrison was in pretty good health, so when he died on May 26th, it was a bit of a shock despite his age. Just the week before, in fact, General Harrison had received a letter saying that his father had ridden 12 miles on horseback to attend the funeral service and burial of a distant nephew, a young man named Augustus Devin, who had died unexpectedly at just 23 years old. But on that Sunday morning, May 26, 1878, when General Harrison and his family came back home from church, a waiting telegram informed him that his father had passed away in the night. Right away, General Harrison and his wife, Caroline, got on a train and headed down to be with his family and lay his father to rest. [Image: photo of President Benjamin Harrison, 1896, Library of Congress]

    The old family mansion was suddenly a bustling place. Hundreds of people came to offer condolences to the Harrison family. They began making plans to bury John Scott Harrison at Congress Green Cemetery, near where Augustus Devin had been buried just the previous week. While visiting the cemetery before the funeral, family members noticed that there was something odd about Augustus' grave. It looked like it had been disturbed. When they got a closer look, to everyone's shock and horror, it was clear that the grave had been robbed… Augustus' body had been stolen.

    If someone stole Augustus' body right out of his grave, might someone do the same to John Scott Harrison's body? This was the beloved patriarch of the family and the last son of William Henry Harrison, the US President who famously died a month into his presidency. His body and his grave had to be protected at any cost, and the Harrison family could afford to pay for it.


    On May 29th, the vast funeral procession proceeded to Congress Green Cemetery. General Harrison himself supervised the lowering of the sturdy and secure metal casket that contained John Scott Harrison's body into the freshly dug grave. The grave was eight feet deep, and at the bottom was a brick vault into which the casket was placed. The walls of the vault were thick, and the bottom was lined with a stone floor. Workmen placed three massive stones on top of the vault, two at the foot end of the casket and one extra large stone at the head of the casket, where body snatchers usually struck. But they didn't stop there. Next the stones were cemented together, and then several men stood watch at the open grave for several hours while the cement dried. Then finally, the rest of the hole on top of the stones was filled in with dirt. General Harrison, fearful that the body snatchers would return, paid a watchman $30 to stand guard over the grave for 30 nights until the body decomposed enough to make it useless for dissection.


    General Harrison and his wife returned to Indianapolis after the funeral feeling pretty confident, I imagine, that his father's body was safe. He went on about his business as one of the city's best-known lawyers. His brother John stayed in Cincinnati the night of the funeral so that bright and early the next morning, he could begin the search for the missing body of Augustus Devin.

    John Harrison and a constable named Lacey set out to search all of the medical schools in the area. They got a tip that a wagon was seen pulling up in the alley behind the Ohio Medical College building at 3am the previous night. A large object was removed from the wagon and carried into the college building, and then the wagon drove on. It was a promising lead. And so, they began the search there. [Image: Ohio Medical College, 1835, The Ohio History Connection]

    When they arrived, the irritated janitor reluctantly showed them around the building, taking them from room to room, so that they could see for themselves that there were no illegal bodies at the Ohio Medical College.

    After John and the constable had carefully searched the whole building without finding anything suspicious, they were about to leave when Constable Lacey noticed an odd thing. He saw an oddly placed air shaft with a windlass. The windlass had a rope tied to it that hung down into the shaft. And the rope looked taut, like something heavy might be hanging from it. They ordered the janitor to pull it up. As the rope was pulled up, slowly a figure began to emerge. It was a body with a rope tied around its neck. But was it Augustus Devin? Was their search over already, in the very first place they tried?

    It was not.

    The head and shoulders were covered with a cloth, but John Harrison could tell by looking at the rest of the naked body that it belonged to a much older man than Augustus Devin. But whether it was Augustus or not, it was still a human body, one that was likely acquired illegally by the medical school. So the constable used his stick to remove the cloth and uncover the man's face.


    It was John Scott Harrison, whom the family had just buried the day before. General Harrison probably shouldn't have paid the watchman in advance to guard the grave for 30 days.

    John Harrison found a local undertaker to take custody of the body until he could figure out what to do, and he sent a telegram to General Harrison in Indianapolis, to let him know what had happened.

    But by the time General Harrison received word from John, he had already heard the shocking news elsewhere. Three of his relatives who went to the cemetery to visit John Scott Harrison's grave earlier that morning were horrified to discover that the grave had been dug up, and the two smaller stones at the foot end of the casket had been lifted on end. There was a hole in the top of the vault, and the sealed casket had been pried open. After all the trouble General Harrison went through to ensure his father's eternal rest, his body had still been stolen.

    Was one of the body snatchers present in the crowd at the burial? Had someone staked out the funeral to know what precautions were taken? This was a common practice for body snatchers, but removing the body through the foot end of the casket was not. Whoever stole the body obviously knew what they were dealing with ahead of time. 

    General Harrison immediately made arrangements to return to Cincinnati, thinking that he now had two bodies to locate. Right away, he met with the police, but he also hired the Pinkerton Detective Agency to find the culprits and track down the bodies. He wasn't messing around now, and he wasn't going to leave the whole case in the hands of the Cincinnati Police Department, who so far had not been very helpful. [Image: photograph of Pinkerton Detective Agency report on the Harrison body snatching, Benjamin Harrison Presidential Site]

    But with General Harrison back in town, the police now seemed more inclined to take action. They even made an arrest. Mr. Marshall, the janitor in the Ohio Medical College who had shown the constable and John Harrison around the building, was arrested and charged with "receiving, concealing, and secreting" John Scott Harrison's body which had been "unlawfully and maliciously removed from its grave."

    This did not please the medical school staff, who all rallied around Mr. Marshall and posted his $5,000 bail.

    This action by the medical school faculty did not please the good citizens of Cincinnati, who were a little creeped out by the medical school anyway and who were outraged by the body snatching that had been going on to supply it with bodies. It was shocking that the body of someone like John Scott Harrison might be treated so outrageously. And if it could happen to him, it could happen to anyone.

    The medical school faculty, realizing that they were now even more firmly planted on the wrong side of public opinion, issued a statement expressing their "…deep regret that the grave of Honorable J. Scott Harrison had been violated" which is a pretty poor apology, I think. It's not really an apology at all. 

    One journalist reporting on the angry public reaction to this whole mess wrote of the medical school, "...it would have been better for it to say nothing at all... And heroic doses of the Ohio Penitentiary are the best medical treatment the people of Cincinnati can prescribe for it."

    The Harrisons reburied their father and continued the search for Augustus Devin. Ohio Medical College faculty and staff were questioned again, this time before a grand jury. But they didn't have much to say, even under oath. A local journalist got a tip that a well-known and prolific body snatcher from Toledo, Ohio named Charles Morton and his gang of ghouls were responsible for both thefts. But no one could find him. He used several aliases, sometimes going by Gabriel Morton, or Dr. Christian, or Dr. Gordon.

    Finally Ohio Medical College professors admitted to the grand jury what everyone already knew...that like most other medical schools in the country, theirs had entered into a contract with unnamed body snatchers to receive a regular supply of cadavers each year so that they had the "material" they needed to properly educate their students. These professors insisted, though, that they were as shocked as anyone else that none other than John Scott Harrison had turned up in their dissecting room. They were under the impression that private burials were not to be disturbed. Bodies were supposed to come from public burying grounds, or places where paupers and unclaimed bodies from hospitals and prisons were buried…people they seemed to think mattered less.

    The doctors who taught in these schools agreed that body snatching was a problem, but they also saw it as a necessary evil. They felt sorry for the families and understood their anger, but they also supported the physicians who were driven to such means as purchasing bodies from body snatchers. It's tricky, isn't it? There were no imaging technologies, and the only way to better understand, and therefore better heal, the body was to look inside it and study it firsthand.


    The janitor of another medical school in Cincinnati came forward confirming that they too purchased bodies from Morton. And he said that while school was not in session, Morton paid him to use the medical building as a workspace for preparing and shipping these bodies all over. Some of the bodies had gone to Ann Arbor, Michigan, disguised with labels that read "Quimby & Co.," so a detective set off right away to Ann Arbor. The barrels were easy to track, it turned out, and he quickly located a barrel labeled "Quimby & Co." Sure enough, inside that barrel of pickled bodies was....finally...poor Augustus Devin.

    When the Harrisons in Cincinnati heard the news, they were quite relieved. Devin was laid to rest for the second time...for real this time. And though the whole business wasn't over...there was still an upcoming trial after all against Charles Morton and the janitor Mr. Marshall. And the Harrisons filed civil suits for the costs of the investigation and the pain and suffering caused by the whole terrible ordeal....But General Harrison could finally go home knowing that his father and Augustus were finally "home."

  • March 16, 2021 9:12 AM | Anonymous member (Administrator)

    by Haley Brinker

    Snake oil. Those two words immediately conjure images of fraudulent hucksters, traveling salesmen with dubious morals, and a host of other suspicious characters hawking questionable wares across the United States in the late nineteenth and early twentieth centuries. In modern times, calling someone a snake oil salesman is the equivalent of calling them a liar or a charlatan, peddling too-good-to-be-true products or ideas to make a quick buck. However, the history of snake oil itself is a little more interesting. Snake oil is and was a real product, and some scientists today acknowledge that it might actually work to cure bodily ills.

    So, what is snake oil exactly? During the 1800s, over 100,000 Chinese immigrants came to the United States in order to find work building the Transcontinental Railroad. They brought with them their families, their culture, and, most important to this story, their medicines. One medicine, perhaps corked into a small, glass bottle, was snake oil. This was no ordinary snake oil, either. It came from the Chinese water snake, and this snake was rich [1]. No, not that kind of rich. The oil from Chinese water snakes is chock full of omega-3 acids. These have been known to help with things like arthritis and other muscle and joint pain [2]. The Chinese immigrants working on the railroad (all the live-long day) would have been exhausted and probably incredibly sore. The perfect cure? Snake oil! They might have even shared some with their fellow workers, building the miracle-cure aura of this product and spreading the news far and wide.

    “But wait,” you say, “then why do people call snake oil fake?” That part comes next. Seizing on the newfound popularity of this miracle product, but lacking the ability to rustle up a Chinese water snake, the salesmen made do. Cue Clark Stanley, a Texan with panache and a scheme to get rich. Stanley called himself the “Rattlesnake King” and traveled across the United States, dressed as a cowboy, and put on shows [3]. In front of crowds, Stanley would slice open a live rattlesnake, throw it into boiling water, and bottle up the oil that rose to the top [1]. He claimed that this was what was in each bottle of “Stanley’s Snake Oil” and people lined up for a chance to buy it [3].

    Unfortunately, shockingly, some cowboys just can’t be trusted, and Clark Stanley was a prime example of this. When the United States government decided to analyze the oil in 1917, they found that the Rattlesnake King’s oil was actually just a combination of mineral and fatty oils with a few additives, none of which derived from snakes [1]. They found him guilty of misrepresentation of his product, and he was fined twenty dollars [2]. It’s worth noting that, even if Clark Stanley’s snake oil really did have genuine rattlesnake oil in it, it probably wouldn’t have been very effective. Modern studies have shown that the original snake oil from China, made from Chinese water snakes, contains around “20 percent eicosapentaenoic acid,” which is a type of omega-3. Rattlesnakes, on the other hand, have only a little over eight percent of the acid. It’s also worth noting that salmon, which is far easier to procure and much less dangerous to handle, has about eighteen percent [4]. Admittedly, though, Salmon King just doesn’t have the same ring to it.

    After the Rattlesnake King’s treachery became public, it was only a matter of time before the term “snake oil salesman” became synonymous with fakes, frauds, and falsifiers. It was popularized when snake oil salesmen began popping up in American Westerns. It’s mentioned by the poet Stephen Vincent Benét in John Brown’s Body and again in The Iceman Cometh by Eugene O’Neill [1]. These products of popular culture further propelled the terminology into the public lexicon. While snake oil itself (the good stuff from Chinese water snakes) is a great remedy in traditional Chinese culture, its benefits have been lost from sight in western culture, sinking beneath the surface of the slippery sea of slimy salesmen.

    [1] Gandhi, L. (2013, August 26). A History Of 'Snake Oil Salesmen'. Retrieved September 24, 2020, from https://www.npr.org/sections/codeswitch/2013/08/26/215761377/a-history-of-snake-oil-salesmen

    [2] Haynes, A. (2015, January 23). The history of snake oil. Retrieved September 24, 2020, from https://www.pharmaceutical-journal.com/opinion/blogs/the-history-of-snake-oil/20067691.blog?firstPass=false

    [3] Clark Stanley: The Original Snake Oil Salesman. (n.d.). Retrieved September 25, 2020, from https://ancientoriginsmagazine.com/clark-stanley

    [4] Graber, C. (2007, November 01). Snake Oil Salesmen Were on to Something. Retrieved September 25, 2020, from https://www.scientificamerican.com/article/snake-oil-salesmen-knew-something/

    Photo Credit: https://www.nlm.nih.gov/exhibition/ephemera/medshow.html, attributed to: Clark Stanley's Snake Oil Liniment, True Life in the Far West, 200 page pamphlet, illus., Worcester, Massachusetts, c. 1905

  • March 05, 2021 12:30 PM | Anonymous member (Administrator)

    by Norma Erickson

    On the weekend of March 5-7, when the Indianapolis Motor Speedway hosts a mass COVID-19 vaccination event, it will not be the first time the track has had a brush with a pandemic. Although no races ran in 1917 and 1918 because of World War I, it was still a very active place during a very dangerous time.

    The Speedway founders, recognized chiefly as promoters of the automobile in the early 20th century, were by no means stuck to the ground. The airplane was also a fascinating invention to them--so much so that they hosted an Aviation Week in June of 1910, with pursuit races and attempts at altitude records. This fit perfectly with Speedway principal Carl Fisher’s conviction that competition was the best way to prove a technology’s value. Newspapers quoted the Wright Brothers as claiming this was the “most important event of its kind ever attempted.”

    Fisher’s esteem for aviation led him to write an article in 1915, after Europe became embroiled in war, arguing that a fleet of airplanes was essential for the United States’ national defense. With such a profound interest in flying machines and a facility that was already equipped for use as an airfield, he offered the Speedway to the US Army to support its fledgling aviation efforts.

    In 1918, the Army established an airplane repair depot on the grounds, bringing in four Aero squadrons. Almost 700 mechanics and support personnel eventually manned the facility beginning in February--the same month that the earliest cases of influenza were reported. Less than a month after the Speedway Depot ramped up, hundreds of soldiers were becoming severely ill at Camp Funston, Kansas, the first army installation hit with the virus--one that was training and shipping out large numbers of soldiers to other bases in the US and in Europe. The wave of illness that swept through the summer was especially hard on overcrowded army camps.

    The first wave that started that spring seemed to ebb in the summer, but the virus mutated and by October, it had reached its deadly stride.  At that time, the Speedway Depot imposed a quarantine on its workforce--not to stop spread to the community, but to keep outsiders from infecting its soldiers and civilian employees. The first case at the Depot was reported on October 6th. Cleaning indoor areas was increased, and the floors were oiled to keep down dust. Sleeping cots were moved farther away from each other than usual to prevent any contact and were taken outside each day to air out and receive exposure to sunlight.

    By October 31st, the camp saw 22 infections and 2 deaths. By November new infections retreated nationally, the virus going into hiding until the next wave in 1919, but the city remained vigilant. The Speedway Dope newsletter published this reminder to the men: “In a memorandum issued from headquarters to the men of this command directs them to comply with these city regulations. The city health authorities have ruled that every person entering a crowded gathering of any kind, whether in stores, theaters, or other public places must wear a protective mask of gauze.” The men were cautioned to “live up strictly to the rules regarding wearing masks…for their own protection and the benefit of the Post.”

    Captain William Menkel, the Depot’s commander, noted in an article in the Indianapolis Star during the illness' peak in October of 1918, that the community had shown great concern and hospitality to the soldiers there, offering all kinds of supplies and good wishes.

    Fast forward to 2021. When the IMS President J. Douglas Boles says “We’re a community-first organization, and we’re extremely proud to assist with this important effort to keep Hoosiers safe and healthy,” the relationship of mutual concern and cooperation between the Community and the Indianapolis Motor Speedway comes full circle to confront today’s pandemic.

    Images from "Speedway: An Aviation Hub During World War I" from the Indiana History Blog via the Indiana Historical Bureau, a Division of the Indiana State Library:

    1) Speedway's aviation repair depot was bordered by Main St. on the west, 14th St. on the north, Polco St. on the east, and roughly contemporary Ford St. and 10th St. on the south. Photo courtesy of William Menkel, "New Plans for Old: The Work of the Aviation Repair Depots," Aerial Age Weekly, September 1, 1919

    2) 821st Aero Squadron. Photo courtesy of the Indianapolis Motor Speedway Collection

    3) Mask up order printed in the Depot's newsletter Speedway Dope November 29, 1918

  • March 02, 2021 2:35 PM | Anonymous member (Administrator)

    by Sarah Halter

    “I come to present the strong claims of suffering humanity. I come to place before the Legislature of Massachusetts the condition of the miserable, the desolate, the outcast. I come as the advocate of helpless, forgotten, insane men and women; of beings sunk to a condition from which the unconcerned world would start with real horror.”

    ~Dorothea Dix, 1843

    A lot of people remember Dorothea Dix for her work during the Civil War as the Superintendent of Army Nurses for the Union Army. She set very high standards for nurses. Her work increased the role of nursing during the war and gave the field a much-deserved boost by pressing for formal training and more respect for nurses.

    But before the War, her activism focused on asylum reform and improving the care and treatment of the mentally ill in the United States. This interest began while she was teaching classes to female prisoners in East Cambridge, Massachusetts. She noticed that many of the individuals incarcerated there had medical needs that weren't being addressed, and she noticed that many of the inmates there were not criminals at all. Many were just paupers. In the basement there, she found four people suffering from severe mental illnesses whose cells were "dark and bare and the air was stagnant and foul.” The poor and the mentally ill who had committed no crime, or whose crimes were directly related to their poverty or mental illness, were locked up alongside violent criminals and treated in the same inhumane way.

    Beginning in the 1840s, Dix traveled to many states, including Indiana, visiting the places where people with mental illnesses were housed and treated (jails as well as mental hospitals) and lobbying State governments for better facilities to care for these “helpless, forgotten” people.

    During the 1844-45 Indiana legislative session, on January 13, 1845 to be precise, an "Act to provide for the procuring a suitable site for the erection of a State Lunatic Asylum" was passed and approved. A small committee of physicians that included John Evans, Livingston Dunlap, and James Blake was appointed to choose a site for the new hospital and collect information from around the country about the best plans, specifications, and methods for locating and managing the hospital. Dr. Evans traveled to Ohio, Pennsylvania, and other states to visit similar hospitals and to learn about their design and operations. (Though his legacy is tainted*, his work was instrumental in establishing the Indiana Hospital for the Insane, and he served as the hospital’s first Superintendent.)

    In August of 1845, the group settled on 160 acres of farmland west of Indianapolis along the National Road as the site for the hospital. It was swampy land. But it was also two miles out of the city, and there weren't many residents in the area to complain about a mental hospital in their back yard. It had a high point that would command a pretty nice view of the surrounding area. Generally, the committee thought it would be well-suited for the hospital, and it was affordable at $13 and 12 1/2 cents per acre. The State acquired the property right away. In January of 1846, another act of legislation provided funding for construction of the buildings themselves, and work began on the main building: a three-story building with an additional basement and attic.

    The first five patients were admitted to the hospital on November 21, 1848, though the building took several more years to truly finish and new wards were still being added to the wings of the building until the early 1870s as the hospital population grew.

    Throughout the hospital’s history, it was plagued by many challenges, including insufficient funds, corruption, and justified allegations of abuse and neglect. But there was another side of the hospital that is important to remember, too. It offered hope and healing to thousands of Hoosiers who previously would have ended up in the poor houses and jails that Dix and others condemned. Indiana had taken Dorothea Dix’s words to heart.

    (photo: Dorothea Dix, retouched, undated photograph, Library of Congress) 

    *In mentioning John Evans and his work in Indiana, it is important to also acknowledge his role in the Sand Creek Massacre. In 1864, as Governor of Colorado Territory, John Evans issued a proclamation authorizing the citizens of Colorado to “kill and destroy, as enemies of the country, wherever they may be found…hostile Indians.” Recent research indicates that he likely neither knew about nor helped plan the Sand Creek Massacre that occurred that year, but his policies toward and prior treatment of the Native Americans in his territory make his culpability in the massacre clear. One hundred and fifty Cheyenne and Arapaho, mostly women and children, were brutally killed, and in the aftermath of the massacre, Evans defended what happened, showing little or no remorse for the suffering and loss that occurred.

  • February 18, 2021 1:56 PM | Anonymous member (Administrator)

    by Norma Erickson

    The COVID-19 pandemic has revealed many cracks in the American healthcare system. Healthcare disparity for racial and ethnic groups is just one of the problems standing in the way of battling this disease. Pacific Islander, Latino, Black, and Indigenous Americans all have a COVID-19 death rate double or more that of White and Asian Americans. Multiple factors contribute to this disparity, and someone who is not a person of color may find them difficult to understand.

    For African Americans, the specter of mistrust is one barrier that must be addressed in the current health crisis. Sometimes such doubt is attributed to skepticism about new technologies. Perhaps deeper still, memories of disrespect and mistreatment by the healthcare system linger, as current examples make their way into the news and reinforce that belief. These fears are leading to vaccine hesitancy among Black Americans. To address these fears and feelings, you must exhume and examine their origins.

    Time and time again, one particular incident is spotlighted as an example of how African Americans were mistreated and abused by a public health organization, leading to mistrust of the healthcare system. The Tuskegee Study of Untreated Syphilis in the African American Male was certainly one of the most orchestrated, egregious medical assaults against Black men, allowing some study subjects to live uncured when a viable treatment was available. But some might agree and say—“that was not a good thing, but what does that have to do with here and now? After all, that happened in Tuskegee, Alabama; it didn't happen here.” Au contraire. Wherever your “here” may be, tragedies likely occurred, but not on the scale of the Tuskegee study. They certainly did happen in Indianapolis’ history.

    In the first decade of the 20th century, physicians in Indianapolis’ Black community surrendered to the fact that they were not going to be granted privileges to treat their patients in the municipal City Hospital and that those patients who were admitted would not be well-treated. They decided to establish their own clinics and hospitals to offer safe, dignified care. A few years prior to this, two incidents, both very much in the public eye, demonstrated the Black community’s belief that their lives would always be in peril if they had to enter the hospital.

    In November of 1904, two student nurses mistook a pitcher that contained a strong disinfectant for a jug of warm water that had been prepared for patient use. Their error resulted in two women receiving deadly carbolic acid enemas. One patient was White, the other Black; the incident was an example of the questionable safety of hospitalization for both communities because City Hospital was a training ground for both student doctors and nurses.  Since City Hospital was the only such facility in Indianapolis that admitted them, Black patients would always be at risk of being “clinical material” for trainees. Often, this is still the case in teaching hospitals. Many institutional policies protect the patient now—but can they be trusted?

    In March of 1905, Nurse Mary Mayes of the Flower Mission Hospital, a new tuberculosis facility on the grounds of the City Hospital, visited the home of Thomas Jones, a seriously ill African American man with a wife and child. Two Black physicians had recently examined Jones, and one wrote an order for him to be admitted to City Hospital. A neighbor, for unknown reasons, took the note to the Flower Mission Hospital instead of City Hospital. Mrs. Mayes conducted a routine home visit to determine eligibility, but she called the City Hospital to send an ambulance to transport the very sick man there—not to the TB hospital. Since tuberculosis cases were prohibited from City Hospital, Mayes apparently did not think Jones was suffering from the disease. Eventually, his temperature reached 103 degrees and he began to bleed from his nose. When the City ambulance did not arrive, Mayes asked C.M. Willis, a Black undertaker, to take the man and his wife to the hospital in his carriage. Before Jones left her care, Mayes collected a sputum specimen to send to a lab for testing. When they arrived at the hospital, the doctor on duty, an intern, looked at the man—still in the carriage—and saw blood on the front of his clothes. He immediately assumed the patient was coughing up blood because he had tuberculosis. The doctor did not take Jones’s temperature, nor did he move the patient to an examination room. Willis was then told to take the ill man to the county poor house. He did so, and thirty minutes later, Thomas Jones died.

    This event caused an uproar in the Black community, but it was not unknown by the rest of Indianapolis. Reports of this case occupied the pages of the Indianapolis Star and the Indianapolis News for three weeks. Why was the patient turned away? Did the patient really have tuberculosis? Who was ultimately to blame for refusing to treat him? This story might never have made it to the press if the hospital was not already under scrutiny for the other horrific incident that happened just a few months prior.

    There were few immediate consequences of these tragic events. The nurses were exonerated on the grounds that they were overworked, and the doctor on duty received a ten-day suspension for not following policy. What of Mr. Jones’s laboratory tests? Both the antemortem and postmortem samples were negative for tuberculosis bacilli.

    These appalling incidents show how City Hospital’s poor standard of care—by both physicians and nurses—advanced the Black community’s decision to establish its own hospitals. Besides the mistrust of medical treatment, there was also fiscal mistrust, fed by decades of unfulfilled promises by politicians and White doctors. One major struggle involved the 1939 accusations by Black citizens that the city had received $157,000 from the Federal Public Works Administration to build a new wing under the pretense that a portion of it would be used for Black nurses and interns. The money was spent, but the hospital did not staff the wards as promised. Every year that passed (until 1942, when Dr. Harvey Middleton was allowed to join the hospital staff) meant that Black physicians could not fully practice their profession; every year that passed without Black nurses on the City Hospital wards (until 1940, when Easter Goodnight was employed) denied Black women the financial advantage of a good salary and respected professional opportunities. And every year that passed without African Americans receiving treatment from providers they trusted widened the gap that kept them from adequate care.

    So, you see, it’s not just about Tuskegee.

  • February 13, 2021 9:10 AM | Anonymous member (Administrator)

    by Haley Brinker

    With Valentine’s Day fast approaching, it seems fitting to offer a beauty routine fit for royalty. There is the slight issue that the products in this post may result in death, but beauty is pain, right? While we at the Indiana Medical History Museum highly recommend not following this regimen (the whole death thing), the steps involved were used by numerous ladies throughout history to lure in their Prince Charming. Of course, they perhaps weren’t able to stay with their princes as long, due to the slow poisoning to which they were subjecting themselves. But, like they say, love never truly dies.

    Every beauty guru around knows that a good face of makeup starts with taking exquisite care of one’s skin. It is the foundation on which we place the foundation. Historical women were spoiled for poisonous choice as to which debilitating chemical they would use to create the deathly pale complexion that was trendy at the time [1]. In the 1700s, a mixture of white lead and vinegar was the go-to complexion-paler. Referred to as ceruse, this mixture gave users the ghostly pallor they so desired, while simultaneously hiding any unsightly smallpox scars [1]. One of the most famous of these products was Empress Josephine Face Bleach, and it contained everything a girl could want in terms of chemicals that would eat away at her skin, such as zinc oxide, lead carbonate, and mercuric chloride [2]. Of course, creams can be so cumbersome to carry on the go. What about a product that one could just ingest in the morning and have a clear complexion all day? Nineteenth-century ladies would simply take arsenic wafers. Yes, the aptly named wafers made of literal arsenic were very popular and would supposedly help a woman achieve a better complexion [3]. Could a lady have serious health problems or even die after prolonged use of such products? Absolutely. Would she look deathly pale up until her expiration, thus making her the belle of the proverbial ball? You know it!

    “But wait,” you say, “what if I want a product that can do it all, while also slowly killing me?” Look no further than the beautiful lady herself, Belladonna! What can’t this highly poisonous and herbaceous photo-synthesizer do? This deadly multitasker was used as a face wash to “take off pimples and other excrescences from the skin” or used to “whiten the complexion” in order to achieve the aforementioned deathly pale look [4]. Miss Bella doesn’t stop there, either. Those red berries? They were sometimes crushed to form blush and redden the cheeks [4]. After applying all of that deadly nightshade, a particularly daring lady might want to do something to freshen up those eyes. Not to worry, a few drops of belladonna in the eyes will cause the pupils to dilate [1], which was, apparently, something people used to be into. Who knew?

    Of course, no look is complete without a kissable pout. To keep with the theme of beauty to die for, do it like the ancient Egyptians with a little bromine mannite-based lip color!  What’s bromine mannite, you ask? Bromine mannite is a halogen compound with an alcohol sugar [5]. I know when I’m stalking the aisles at Sephora, I’m constantly complaining about the lack of halogen-based beauty products. If it can light up a room, why can’t it light up my face? Apparently, bromine mannite created a lovely “red-brown” shade that could make even Mark Antony swoon. He might have just fainted, though, as this lipstick would not only have poisoned the wearer, but anyone they might have kissed, too [6].

    There you have it, folks! This beauty routine is to die for, and I mean that literally. These products are highly toxic and could cause death. A lot of ladies and gentlemen dream of looking gorgeous at parties, but these products will have their users looking absolutely breathtaking at their wake.

    [1] Wischhover, Cheryl. “The Most Dangerous Beauty through the Ages,” December 17, 2013. https://www.thecut.com/2013/12/most-dangerous-beauty-through-the-ages.html.

    [2] Rance, Caroline. “Empress Josephine Face Bleach.” The Quack Doctor, October 9, 2018. http://thequackdoctor.com/index.php/empress-josephine-face-bleach/.

    [3] Peiss, Kathy Lee. Hope in a Jar: the Making of America's Beauty Culture. Philadelphia, PA: University of Pennsylvania Press, 2011.

    [4] Forbes, Thomas R. “Why Is It Called ‘Beautiful Lady’? A Note on Belladonna.” Bulletin of the New York Academy of Medicine 53, no. 4 (May 1977): 403–6.

    [5] Rattley, Matt. “Ambiguous Bromine.” Nature Chemistry 4 (June 2012): 512.

    [6] Freeman, Shanna. “How Lipstick Works.” HowStuffWorks, March 9, 2009. https://health.howstuffworks.com/skin-care/beauty/skin-and-makeup/lipstick5.htm.

    PHOTO: "Arsenic Complexion Wafers 1896" by Nesster is licensed with CC BY 2.0. To view a copy of this license, visit  https://creativecommons.org/licenses/by/2.0/

  • November 15, 2020 2:53 PM | Anonymous member (Administrator)

    by Norma Erickson

    (photo: Sisters of Charity’s first hospital, Indianapolis News, June 10, 1911)

    When Vice President-elect Kamala Harris made her speech the night her running mate Joe Biden was projected as the next President of the United States, she poignantly recognized “Women who fought and sacrificed so much for equality, liberty and justice for all, including the Black women, who are often, too often overlooked, but so often prove that they are the backbone of our democracy.” She confessed she stood on the shoulders of Black women who came before her, struggling to transform our nation from a society that derided and excluded people of color to one that could, as a whole, become a better place when all were lifted to an equal standing. In the decades that surrounded the turn of the twentieth century, women of all races and even classes undertook an effort to improve society, approaching the problem from different value systems.

    For white women who embraced the Progressive ideas of the time, this work became known as “municipal housekeeping”. Rooted in the idea that the woman was the mistress of her household domain, the existence of a healthy, well-run home depended on a healthy, well-run public sphere. They sought to “clean up City Hall” and improve many facets of life and work outside the home.

    (photo: ad from The Freeman, January 25, 1913)

    Some Black women were inspired by the Social Gospel Movement, which recognized that Society—not just the Individual—required salvation. One historian framed Black clubwomen’s motives as their desire to take control of their lives and to fulfill the Social Gospel through action, like the women who followed the historical Jesus of the New Testament. Social reforms became the vehicle for saving individuals and, by extension, the civic realm. As with the white push for change, the goals of Black women included uplift specifically for women. The marriage of these two manifestations of faith—uplift for salvation and female empowerment—became important for many of them.

    Before going deeper into the health care history that involved these women, one misconception must be pointed out: being Black in Indianapolis in the first decades of the twentieth century did not automatically mean you were poor. There might have been only one Madame C. J. Walker, the famous self-made millionaire, but there were many successful Black businesswomen and wives of businessmen who lived a comfortable life and desired respectful treatment. Even working-class women, many of whom were domestics, desired the same respect and were members of clubs that provided social interaction and improvement activities, including adequate and dignified healthcare. One example of their battle for respectability and their quest to improve lives stands out—the founding of a hospital for African Americans in 1911.

    Several women’s clubs worked for improved healthcare in the African-American community of Indianapolis. Their efforts included directly providing care, supporting facilities, creating places for care, and training caregivers. They undertook these projects with the firm conviction that women possessed unique abilities that allowed them to carry out their missions of care, and to do so with as much autonomy as possible. The most ambitious of these projects was the Sisters of Charity Hospital.

    The Grand Body of the Sisters of Charity (GBSC), not to be confused with the Catholic women’s religious orders of similar name, was formed in Indianapolis in 1874 in response to the needs of the large numbers of southern Blacks moving to the city near the end of the Reconstruction that followed the Civil War. Many women’s clubs formed for a variety of ends, some social or utility minded (for instance, sewing clubs) and some with public goals in mind (one example is the Women’s Improvement Club). Many of them embraced the motto of the National Association of Colored Women: Lifting as We Climb. The GBSC differed slightly from other women’s clubs in that it operated as a lodge, with member benefits ranging from burial costs to financial assistance when needed. The underlying purpose of the hospital was to care for lodge members, but service was also extended to the entire Black community.

    Originally housed in a former residence at 15th and Missouri Streets in 1911 (where the parking garage for the IU Health Neuroscience Center now stands), the hospital moved to another house at 502 California Street in 1918 (now an open lawn on the IUPUI campus). The hospital also served the community by formally training young women as nurses, a professional activity that held great prospects for the advancement of Black women. The Sisters also worked with the juvenile courts and “wayward” girls. However sorely needed these services were, such a small institution had a difficult time keeping up with the maintenance and improvements that would have made it a suitable place for surgery or maternity care. Keep in mind that the Sisters of Charity Hospital and Lincoln Hospital were providing a place for care and treatment that should have been accessible to Black doctors, nurses, and patients as citizens and taxpayers of Indianapolis. It closed around 1921.

    The Sisters of Charity pursued the quest for uplift for their community and briefly accomplished a unique achievement. The Sisters of Charity Hospital was a rare instance of an African American hospital owned and operated by black clubwomen in a northern state. 

    (photo: site of former SoCH at IU Health Neuroscience Center Garage 15th and Missouri, Google Maps 11-14-2020)

  • October 08, 2020 11:27 AM | Anonymous member (Administrator)

    by Haley Brinker

    The idea of drinking human blood or consuming bones might sound like something out of a horror movie to people today, but it was a fairly common practice during the early modern period of history. It actually goes back even further. Medical cannibalism can trace its roots all the way back to ancient Rome, where spectators of gladiatorial fights would drink the blood of fallen gladiators in an attempt to cure their own ills [1]. This vital blood was also thought to cure conditions like epilepsy [4]. Now, some might say that this could just be a rare case; a few ancient vampires among a sea of ‘normal people.’ They would be wrong. Medical cannibalism was incredibly widespread. (Image: "Cannibalism in Russia and Lithuania 1571")

    The popularity of medical cannibalism hit its peak in the 1500s and 1600s [2]. The practice of consuming body parts in various, creative ways was everywhere in Europe during this time. Egyptian mummies were thought to be incredibly powerful, so grave robbers went to Egypt to steal them [1]. Now, any movie archaeologist or horror movie enthusiast would eye this practice warily; these robbers were begging to be cursed by the spirits of the former pharaohs. However, no such reports seem to have survived. Not everyone believed that mummies needed to be Egyptian in order to be medicinally powerful; many thought it just needed to be the mummified cadaver of any “healthy man” [5]. The demand for mummy parts was so high, though, that it created a black market of sorts, with industrious would-be grave robbers creating mummies of their own [3]. Like Ina Garten, they believed in the power of homemade. With this can-do attitude, they made local mummies by robbing the graves of the local poor or criminals, sometimes even just using animals and passing them off as human remains [3].

    With medical cannibalism being so popular, it, of course, had its famous supporters throughout history. King Charles II was a believer in the power of human remains to cure the medical maladies of the living. He believed in a medicine called “spirit of the skull,” which contained real human skull [1]. In fact, he wanted to make it so badly that he paid six thousand dollars for the recipe, and he referred to the result as “King’s Drops” [3]. Another enormous fan of consuming literal human body parts in order to cure common ailments was the 17th-century brain scientist Thomas Willis. He believed that one could cure excessive bleeding by mixing together the tantalizing concoction of human skull powder and delicious chocolate [2]. Who doesn’t love a little chocolate when they’re feeling down?

    The 16th-century Swiss-German physician Paracelsus preferred the power of more “fresh corpses” [1]. Now, while it seems that he was a vampire working to create an army of other vampires, that is, unexcitingly, not the case. More affluent would-be blood drinkers could go to their local apothecary to acquire the hemoglobin they so desired [2], while those of less wealth and status would simply attend a public execution and kindly ask for a cup of the deceased criminal’s blood from the executioner himself [1]. Paracelsus believed that when someone died suddenly (by hanging or execution, for instance), their “vital spirits” could “burst forth to the circumference of the bone” and the living could use their highly powerful body parts to heal their ailments [3].

    The list of supporters didn’t end there, either. Marsilio Ficino, an Italian scholar of the 15th century, believed that the elderly should “suck the blood of an adolescent” who was in good spirits and of sound body to regain some of their former vigor [3]. Saint Albertus Magnus stated that a distillation of blood could “cure any disease of the body” [3]. Elizabeth Bathory’s belief in bathing in the blood of young women doesn’t seem so far-fetched now, does it? Heinous? Yes. A horrific crime of tremendous magnitude? Absolutely. A belief system totally out of line with the times? Nope.

    Bones and blood weren’t the only ‘useful’ remedies at the time. The practitioners of medical cannibalism were what some might call… creative. Blood was thought to be the “vehicle of the soul,” so it was thought to be especially powerful [4], but how to deal with the pesky taste of drinking warm, human blood? Marmalade! Blood marmalade, to be precise. A Franciscan apothecary in the 1600s had a delightfully descriptive recipe for this culinary confection [1]. Step one (the most important step, as we all know) was to find a donor with the following traits: “warm, moist temperament, such as those of a blotchy, red complexion and rather plump of build” [3]. It is quite difficult to pin down exactly what a ‘moist’ temperament is, but I’m sure those at the time had someone in mind as soon as they read the recipe. Bones were allegedly useful as well. It was believed that ‘like treated like,’ so skull powder was a great cure for any ailments of the head [3]. Even objects near the cadaver could hold power. A moss that grew on skulls, called usnea, which literally means “moss of the skull,” was thought to prevent nosebleeds if one simply held it or shoved it right into the nose [1].

    As stated previously, bones and blood weren’t the only parts of the body that could ‘cure.’ Human fat was thought to have all sorts of medicinal properties. For instance, fat could prevent bruising of the skin [3]. The fatty fun doesn’t stop there, though. It was believed that the magical properties in human fat could be used to create something called a ‘Thieves Candle.’ This human-fat-containing candle was thought to be able to “paralyze enemies” [2]. Fat was so important to medicine that the local executioners would directly deliver the fat from executed criminals right to the apothecaries around town [3].

    While this practice of consuming human remains was widespread and incredibly popular at the time, it didn’t prevent white Europeans from condemning tribal practices involving cannibalism with extreme revulsion. Puritans rejected the Catholic doctrine of “transubstantiation” [5]; they believed that transforming bread and wine into the body and blood of Christ and then consuming it was a form of cannibalism [2]. Cannibalistic ritual practices performed by Native Americans were seen as ‘barbaric’ and used as an example of why they should be subjugated by the Europeans [3]. The juxtaposition is striking: Native American cannibalistic practices were social and sacred, performed in order to “reintegrate the deceased into the tribe” [3]. Europeans, on the flip side, often didn’t know whose remains they were consuming. Often, the bodies used for medical cannibalism belonged to those on the lowest rungs of the societal ladder: the poor, the disenfranchised, the ‘other.’

    Using ritual cannibalism as a stick with which to beat down those the Europeans deemed ‘less’ was very common. During the subjugation of the Irish by the English, Irish skulls were dug up and sent to German pharmacies and apothecaries to be ground into powder and sold as a commodity [3]. Joseph Hall, a former bishop of Exeter, delivered a fiery sermon referring to the Turkish people as “bloody, man-eating cannibals, mongrel troglodytes feeding upon bloody carcasses” [3]. Bishop Hall was apparently fine with his own people consuming bones mixed with chocolate and alcohol, or smearing a little blood marmalade on crusty bread, but not with the social and religious rituals of respect performed by non-white, non-Protestant peoples.

    While the practice of medicinal cannibalism gradually dwindled, a book published in Germany in the early 1900s noted that a pharmaceutical company was still offering “genuine Egyptian mummy” in its catalog [5]. The human body is still used in medicine today, but these practices, such as blood transfusions and organ donations, are far more medically sound and don’t require any visits to the local executioner.


    [1] Sugg, R. (2008). The art of medicine, Corpse medicine: Mummies, cannibals, and vampires. The Lancet, 371(9630), perspectives. doi:https://doi.org/10.1016/S0140-6736(08)60907-1

    [2] Dolan, Maria. “The Gruesome History of Eating Corpses as Medicine.” Smithsonian.com. Smithsonian Institution, May 6, 2012. https://www.smithsonianmag.com/history/the-gruesome-history-of-eating-corpses-as-medicine-82360284/.

    [3] Lovejoy, B. (2016). A Brief History of Medical Cannibalism. Lapham's Quarterly, 9(5).

    [4] Himmelman, P. K. (1997). The Medicinal Body: An Analysis of Medicinal Cannibalism in Europe, 1300-1700. Dialectical Anthropology, 22(2).

    [5] Gordon-Grube, K. (1988). Anthropophagy in Post-Renaissance Europe: The Tradition of Medicinal Cannibalism. American Anthropologist, 90(2).

  • September 28, 2020 12:11 PM | Anonymous member (Administrator)

    by Norma Erickson

    By the time the Lincoln Hospital opened in December of 1909, the African American doctors of Indianapolis could no longer accept the state of medical practice in the city. Shut out of its hospitals, they could not continue to care for patients who required hospitalization, a situation that led to disastrous outcomes for some Black patients. Sometimes the disconnect that occurred when a patient was moved from home to hospital left a very sick person vulnerable to mistakes.

    One such case happened in March of 1905, when Thomas Jones, a seriously ill African American man, was denied an examination at the City Hospital. He had recently been seen by two Black physicians; one wrote an order for him to be admitted to City Hospital. A carriage was called, and when the driver, Black undertaker C.M. Willis, arrived at the hospital, the intern on duty looked at the man in the carriage, saw blood on the front of his clothes, and immediately determined that he had tuberculosis. The doctor did not take Jones’s temperature nor remove the patient to an examination room, because the clerk on duty would not help him. Tuberculosis cases were prohibited from City Hospital, so the intern told Willis to take him to the county poor house. He did so, and thirty minutes later Thomas Jones died. Since everyone knew City Hospital would not accept TB patients, the physician would not have requested that Jones be admitted there had he believed Jones had the disease, nor would the nurse who saw him in his home have called for the carriage to take him there. The nurse had collected a sputum sample at his home before he was removed. When tested later, the sample was negative for TB. The Black community was outraged, and reporting on the case appeared in both the Indianapolis Star and the Indianapolis News for three weeks.

    During this era, the role of hospitals was undergoing great changes. No longer merely places for the poor to receive treatment, they underwent modernization that allowed life-saving surgeries to take place. But Black physicians did not have access to those facilities, even in public, tax-supported institutions like City Hospital. Black patients who would have preferred to receive treatment in a hospital rather than at home were put off by the uncomfortable environment of all-white medical and nursing staffs. Between the physicians’ loss of revenue and prestige as surgeons and the patients’ low confidence in the system, it was clearly time for a new approach by the African American community. If the segregationist rules would not change, then it was time for a hospital that served African Americans, and the only way to get one was to start their own.

    Like Ward’s Sanitarium, the Lincoln Hospital launched a nurse training program that attracted students from around the state. It also included a free dispensary to treat the poor, just like the public hospital. Women’s clubs stepped up to gather funds and donate goods. Two prominent white men, a business owner and a politician, donated substantially to get the project off to a start. The physicians published a first annual report that highlighted glowing successes and also revealed the cases they lost. Five years later, the hospital closed.

    The reason most often cited was the lack of funding. That certainly could be true, but could there be another reason? Could it be that the Black doctors of Lincoln Hospital allowed it to end because it was time to make a push to be installed at the City Hospital? For five years, they managed a facility and demonstrated their abilities to successfully perform operations. One of their own would run for city council and win that year, dangling the hope of making changes at City Hospital almost within their reach. War had begun in Europe, bringing a possibility that young Blacks would enter military service soon—another way to prove the mettle of the Race.

    But the entrance of Black physicians into Indianapolis’ public hospital would not happen for another thirty years, and access to both adequate and trusted healthcare would continue to deteriorate.

    Next month: The Sisters of Charity Hospital

  • September 14, 2020 3:05 PM | Anonymous member (Administrator)

    by Haley Brinker

    The story of radium began in the laboratory of Marie Curie and her husband, Pierre, in 1898. It was there that they discovered the power of this element, a discovery that would earn them a Nobel Prize in Physics [1], but radium’s story was just beginning. Soon, the entire world would be at rapt attention, hungry for any news of or product containing what some called a “miracle element” [2].

    By the early 1900s, radium was synonymous with health. Products of all kinds touting the benefits of radium for the human body were everywhere. Radium water was incredibly popular, and one company, Bailey Radium Laboratories, proclaimed far and wide that its product, Radithor, was the tonic to cure all of the public’s bodily ills [3]. It was even picked up by a celebrity sponsor of sorts, Eben MacBurney Byers. He loved Radithor so much, and spoke its praises so highly, that he consumed over a thousand bottles in only five years [3]. There were radioactive toothpastes, radioactive makeup, and even entire spas dedicated to the healing power of radium [1]. One product, the Radio-Active Pad, even claimed that it could cure blindness (yes, blindness) when worn “on the back by day and over the stomach at night” [4]. Consumers at the time were spoiled for choice. They could have their radium in the form of a kissable pout with lipstick. They could spend a day with the girls, luxuriating in a lush spa, cucumbers covering their eyes, while they received treatments enhanced with the restorative powers of this wondrous cure-all.

    Radium branding was so popular, in fact, that some companies simply named their products after the element without actually putting any into them. One of these was “Radium Brand Creamery Butter,” which likely didn’t contain any of its namesake element [4]. Where people today might jump at products labeled ‘organic’ or ‘gluten free,’ folks during the radium craze lunged for any product that claimed an association with radium. This was the power of popularity. Radium was trendy; radium was chic.

    Like all fads, the public’s love affair with radium would not last forever. Stories began circulating around the world, highlighting the serious health problems of those who were ingesting radium. Our Radithor-loving celebrity endorser, Eben MacBurney Byers, was soon afflicted with a host of health problems. After five years and over a thousand bottles of Radithor, the radium in his system had caught up with him. An attorney, Robert Hiner Winn, went to interview Byers in 1930 on behalf of the Federal Trade Commission. When he met Byers, he was dumbfounded; the previously hardy and healthy man was now a shadow of his former self. Byers’ radium poisoning was so severe that “half of his face was missing” [3]. The health tonic Byers had spent years promoting and using had been his downfall. He had been misled, and he was not the only victim.

    During the peak of radium mania, the US Radium Corporation had set up factories to produce watches with glow-in-the-dark dials. These dials were hand painted with radium-laced paint by young women who, to keep the numbers crisp and precise, pointed the tips of their brushes with their lips. Each lip-pointed brush stroke was “slowly irradiating” the women “from within.” Soon, the women began to come down with symptoms of radium poisoning [2]. Bright young women trying to make a good living became bed-ridden, in worse health than most septuagenarians, before they had even reached the age of thirty.

    With these two stories, and a plethora of others, the world turned its back on radium. The government, namely the U.S. Food and Drug Administration, outlawed patent medicines that contained radium as an ingredient [2]. Ingesting radium as medicine became a thing of the past, a bad memory in the public consciousness. Its popularity and downfall give new meaning to the warning that anyone eyeing a product that seems too good to be true should remember: buyer beware.


    [1] Crezo, Adrienne. “9 Ways People Used Radium Before We Understood the Risks.” Mental Floss. Mental Floss, October 9, 2012. https://www.mentalfloss.com/article/12732/9-ways-people-used-radium-we-understood-risks.

    [2] Moss, Matthew. “The Radium Craze – America's Lethal Love Affair by Matthew Moss.” The History Vault. The History Vault, January 15, 2015. https://thehistoryvault.co.uk/the-radium-craze-americas-lethal-love-affair-by-matthew-moss/.

    [3] Brumfield, Dale M. “The Blessings of Radium Water Made His Head Disintegrate.” Medium. Medium, March 18, 2019. https://medium.com/lessons-from-history/the-blessings-of-radium-water-made-his-head-disintegrate-3ac052cb8620.

    [4] Orci, Taylor. “How We Realized Putting Radium in Everything Was Not the Answer.” The Atlantic. Atlantic Media Company, March 7, 2013. https://www.theatlantic.com/health/archive/2013/03/how-we-realized-putting-radium-in-everything-was-not-the-answer/273780/.

Copyright © 2021-2022 Indiana Medical History Museum

3270 Kirkbride Way, Indianapolis, IN 46222   (317) 635-7329
