• March 02, 2021 2:35 PM | Anonymous member (Administrator)

    by Sarah Halter

    “I come to present the strong claims of suffering humanity. I come to place before the Legislature of Massachusetts the condition of the miserable, the desolate, the outcast. I come as the advocate of helpless, forgotten, insane men and women; of beings sunk to a condition from which the unconcerned world would start with real horror.”

    ~Dorothea Dix, 1843

    A lot of people remember Dorothea Dix for her work during the Civil War as the Superintendent of Army Nurses for the Union Army. She set very high standards for nurses, and her work increased the role of nursing during the war and gave the field a much-deserved boost by pressing for formal training and more respect for nurses.

    But before the War, her activism focused on asylum reform and on improving the care and treatment of the mentally ill in the United States. This interest began while she was teaching classes to female prisoners in East Cambridge, Massachusetts. She noticed that many of the individuals incarcerated there had medical needs that weren't being addressed, and that many of the inmates were not criminals at all. Many were just paupers. In the basement there, she found four people suffering from severe mental illnesses whose cells were “dark and bare and the air was stagnant and foul.” The poor and the mentally ill who had committed no crime, or whose crimes were directly related to their poverty or mental illness, were locked up alongside violent criminals and treated in the same inhumane way.

    Beginning in the 1840s, Dix traveled to many states, including Indiana, visiting the places where people with mental illnesses were housed and treated, jails as well as mental hospitals, and lobbying state governments for better facilities to care for these “helpless, forgotten” people.

    During the 1844-45 Indiana legislative session, on January 13, 1845 to be precise, an "Act to provide for the procuring a suitable site for the erection of a State Lunatic Asylum" was passed and approved. A small committee of physicians that included John Evans, Livingston Dunlap, and James Blake was appointed to choose a site for the new hospital and collect information from around the country about the best plans, specifications, and methods for locating and managing the hospital. Dr. Evans traveled to Ohio, Pennsylvania, and other states to visit similar hospitals and to learn about their design and operations. (Though his legacy is tainted*, his work was instrumental in establishing the Indiana Hospital for the Insane, and he served as the hospital’s first Superintendent.)

    In August of 1845, the group settled on 160 acres of farmland west of Indianapolis along the National Road as the site for the hospital. It was swampy land, but it was also two miles out of the city, and there weren't many residents in the area to complain about a mental hospital in their backyard. It had a high point that would command a pretty nice view of the surrounding area. Generally, the committee thought it would be well suited for the hospital, and it was affordable at $13 and 12½ cents per acre. The State acquired the property right away. In January of 1846, another act of legislation provided funding for construction of the buildings themselves, and work began on the main building: a three-story structure with an additional basement and attic.

    The first five patients were admitted to the hospital on November 21, 1848, though the building took several more years to truly finish, and new wards were still being added to its wings into the early 1870s as the hospital population grew.

    Throughout the hospital’s history, it was plagued by many challenges, including insufficient funds, corruption, and justified allegations of abuse and neglect. But there was another side of the hospital that is important to remember, too. It offered hope and healing to thousands of Hoosiers who previously would have ended up in the poorhouses and jails that Dix and others condemned. Indiana had taken Dorothea Dix’s words to heart.

    (photo: Dorothea Dix, retouched, undated photograph, Library of Congress) 

    *In mentioning John Evans and his work in Indiana, it is important to also acknowledge his role in the Sand Creek Massacre. In 1864, as Governor of Colorado Territory, John Evans issued a proclamation authorizing the citizens of Colorado to “kill and destroy, as enemies of the country, wherever they may be found…hostile Indians.” Recent research indicates that he likely neither knew about nor helped plan the Sand Creek Massacre that occurred that year, but his policies toward and prior treatment of the Native Americans in his territory make his culpability in the massacre clear. One hundred and fifty Cheyenne and Arapaho, mostly women and children, were brutally killed, and in the aftermath of the massacre, Evans defended what happened, showing little or no remorse for the suffering and loss that occurred.


  • February 18, 2021 1:56 PM | Anonymous member (Administrator)

    by Norma Erickson

    The COVID-19 pandemic has revealed many cracks in the American healthcare system. Healthcare disparity for racial and ethnic groups is just one of the problems standing in the way of battling this disease. Pacific Islander, Latino, Black, and Indigenous Americans all have COVID-19 death rates double or more those of White and Asian Americans. Multiple factors contribute to this disparity, and someone who is not a person of color may have difficulty understanding them.


    For African Americans, the specter of mistrust is one barrier that must be addressed in the current health crisis. Sometimes such doubt is attributed to skepticism about new technologies. But deeper down, memories of disrespect and mistreatment by the healthcare system still linger, and current examples that make their way into the news reinforce that belief. These fears are leading to vaccine hesitancy among Black Americans. To address these fears and feelings, we must exhume and examine their origins.


    Time and time again, one particular incident is spotlighted as an example of how African Americans were mistreated and abused by a public health organization, leading to mistrust of the healthcare system. The Tuskegee Study of Untreated Syphilis in the African American Male was certainly one of the most orchestrated, egregious medical assaults against Black men, allowing some study subjects to live uncured when a viable treatment was available. But some might concede the point and say, “That was not a good thing, but what does that have to do with here and now? After all, that happened in Tuskegee, Alabama; it didn't happen here.” Au contraire. Wherever your “here” may be, similar tragedies likely occurred, even if not on the scale of the Tuskegee study. They certainly happened in Indianapolis’ history.

    In the first decade of the 20th century, physicians in Indianapolis’ Black community surrendered to the fact that they were not going to be granted privileges to treat their patients in the municipal City Hospital, and that those of their patients who were admitted would not be well treated. They decided to establish their own clinics and hospitals to offer safe, dignified care. A few years prior to this, two incidents that were very much in the public eye demonstrated the Black community’s belief that their lives would always be in peril if they had to enter the hospital.

    In November of 1904, two student nurses mistook a pitcher that contained a strong disinfectant for a jug of warm water that had been prepared for patient use. Their error resulted in two women receiving deadly carbolic acid enemas. One patient was White, the other Black; the incident illustrated the questionable safety of hospitalization for both communities, because City Hospital was a training ground for both student doctors and nurses. Since City Hospital was the only such facility in Indianapolis that admitted them, Black patients would always be at risk of being “clinical material” for trainees. This is often still the case in teaching hospitals. Many institutional policies protect the patient now, but can they be trusted?

    In March of 1905, Nurse Mary Mayes of the Flower Mission Hospital, a new tuberculosis facility on the grounds of the City Hospital, visited the home of Thomas Jones, a seriously ill African American man with a wife and child. Two Black physicians had recently examined Jones, and one wrote an order for him to be admitted to City Hospital. A neighbor, for unknown reasons, took the note to the Flower Mission Hospital instead of City Hospital. Mrs. Mayes conducted a routine home visit to determine eligibility, but she called the City Hospital to send an ambulance to transport the very sick man there, not to the TB hospital. Since tuberculosis cases were prohibited from City Hospital, Mayes apparently did not think Jones was suffering from the disease. Eventually, his temperature reached 103 degrees and he began to bleed from his nose. When the City ambulance did not arrive, Mayes asked C.M. Willis, a Black undertaker, to take the man and his wife to the hospital in his carriage. Before Jones left her care, Mayes collected a sputum specimen to send to a lab for testing. When they arrived at the hospital, the doctor on duty, an intern, looked at the man, still in the carriage, and saw blood on the front of his clothes. He immediately assumed the patient was coughing up blood because he had tuberculosis. The doctor did not take Jones’s temperature nor move the patient to an examination room. Willis was then told to take the ill man to the county poor house. He did so, and thirty minutes later, Thomas Jones died.

    This event caused an uproar in the Black community, but it did not go unnoticed by the rest of Indianapolis. Reports of the case occupied the pages of the Indianapolis Star and the Indianapolis News for three weeks. Why was the patient turned away? Did the patient really have tuberculosis? Who was ultimately to blame for refusing to treat him? The story might never have made it to the press if the hospital had not already been under scrutiny for the other horrific incident that had happened just a few months prior.

    There were few immediate consequences of these tragic events. The nurses were exonerated on the grounds that they were overworked, and the doctor on duty received a ten-day suspension for not following policy. What of Mr. Jones’s laboratory tests? Both the antemortem and postmortem samples were negative for tuberculosis bacilli.

    These appalling incidents explain how City Hospital’s poor standard of care, by both physicians and nurses, advanced the Black community’s decision to establish its own hospitals. Besides the mistrust of medical treatment, there was also fiscal mistrust, built on decades of unfulfilled promises by politicians and White doctors. One major struggle involved accusations by Black citizens in 1939 that the city had received $157,000 from the Federal Public Works Administration to build a new wing under the pretense that a portion of it would be used for Black nurses and interns. The money was spent, but the hospital did not staff the wards as promised. Every year that passed (until 1942, when Dr. Harvey Middleton was allowed to join the hospital staff) meant that Black physicians could not fully practice their profession; every year that passed without Black nurses on the City Hospital wards (until 1940, when Easter Goodnight was employed) denied Black women the financial advantage of a good salary and respected professional opportunities. Every year that passed without African Americans receiving treatment from providers whom they trusted widened the gap that kept them from adequate care.

    So, you see, it’s not just about Tuskegee.


  • February 13, 2021 9:10 AM | Anonymous member (Administrator)

    by Haley Brinker

    With Valentine’s Day fast approaching, it seems fitting to offer a beauty routine fit for royalty. There is the slight issue that the products in this post may result in death, but beauty is pain, right? While we at the Indiana Medical History Museum highly recommend not following this regimen (the whole death thing), the steps involved were used by numerous ladies throughout history to lure in their prince charming. Of course, they perhaps weren’t able to stay with him as long, due to the slow poisoning to which they were subjecting themselves. But, like they say, love never truly dies.

    Every beauty guru around knows that a good face of makeup starts with taking exquisite care of one’s skin. It is the foundation on which we place the foundation. Women throughout history were spoiled for poisonous choice when it came to which debilitating chemical they would use to create the deathly pale complexion that was trendy at the time [1]. In the 1700s, a mixture of white lead and vinegar was the go-to complexion-paler. Referred to as ceruse, this mixture gave users the ghostly pallor they so desired, while simultaneously covering any unsightly smallpox scars they wished to hide [1]. One of the most famous of these products was Empress Josephine Face Bleach, and it contained everything a girl could want in terms of chemicals that would eat away her skin, such as zinc oxide, lead carbonate, and mercuric chloride [2]. Of course, creams can be so cumbersome to carry on the go. What about a product that one could just ingest in the morning and have a clear complexion all day? Nineteenth-century ladies would simply take arsenic wafers. Yes, the aptly named wafers made of literal arsenic were very popular and would supposedly help a woman achieve a better complexion [3]. Could a lady have serious health problems or even die after prolonged use of such products? Absolutely. Would she look deathly pale up until her expiration, thus making her the belle of the proverbial ball? You know it!

    “But wait,” you say, “what if I want a product that can do it all, while also slowly killing me?” Look no further than the beautiful lady herself, Belladonna! What can’t this highly poisonous, herbaceous photosynthesizer do? This deadly multitasker was used as a face wash to “take off pimples and other excrescences from the skin” or to “whiten the complexion” in order to achieve the aforementioned deathly pale look [4]. Miss Bella doesn’t stop there, either. Those red berries? They were sometimes crushed to form blush and redden the cheeks [4]. After applying all of that deadly nightshade, a particularly daring lady might want to do something to freshen up those eyes. Not to worry: a few drops of belladonna in the eyes will cause the pupils to dilate [1], which was, apparently, something people used to be into. Who knew?

    Of course, no look is complete without a kissable pout. To keep with the theme of beauty to die for, do it like the ancient Egyptians with a little bromine mannite-based lip color! What’s bromine mannite, you ask? Bromine mannite is a compound of the halogen bromine with a sugar alcohol [5]. I know when I’m stalking the aisles at Sephora, I’m constantly complaining about the lack of halogen-based beauty products. If it can light up a room, why can’t it light up my face? Apparently, bromine mannite created a lovely “red-brown” shade that could make even Mark Antony swoon. He might have just fainted, though, as this lipstick would have poisoned not only the wearer but anyone they might have kissed, too [6].

    There you have it, folks! This beauty routine is to die for, and I mean that literally. These products are highly toxic and could cause death. A lot of ladies and gentlemen dream of looking gorgeous at parties, but these products will have their users looking absolutely breathtaking at their wake.

    [1] Wischhover, Cheryl. “The Most Dangerous Beauty through the Ages,” December 17, 2013. https://www.thecut.com/2013/12/most-dangerous-beauty-through-the-ages.html.

    [2] Rance, Caroline. “Empress Josephine Face Bleach.” The Quack Doctor, October 9, 2018. http://thequackdoctor.com/index.php/empress-josephine-face-bleach/.

    [3] Peiss, Kathy Lee. Hope in a Jar: the Making of America's Beauty Culture. Philadelphia, PA: University of Pennsylvania Press, 2011.

    [4] Forbes, Thomas R. “Why Is It Called ‘Beautiful Lady’? A Note on Belladonna.” Bulletin of the New York Academy of Medicine 53, no. 4 (May 1977): 403–6.

    [5] Rattley, Matt. “Ambiguous Bromine.” Nature Chemistry 4 (June 2012): 512.

    [6] Freeman, Shanna. “How Lipstick Works.” HowStuffWorks, March 9, 2009. https://health.howstuffworks.com/skin-care/beauty/skin-and-makeup/lipstick5.htm.

    PHOTO: "Arsenic Complexion Wafers 1896" by Nesster is licensed with CC BY 2.0. To view a copy of this license, visit  https://creativecommons.org/licenses/by/2.0/

  • November 15, 2020 2:53 PM | Anonymous member (Administrator)

    by Norma Erickson

    (photo: Sisters of Charity first hospital, Indianapolis News, June 10, 1911)

    When Vice President-elect Kamala Harris made her speech the night her running mate Joe Biden was projected as the next President of the United States, she poignantly recognized “Women who fought and sacrificed so much for equality, liberty and justice for all, including the Black women, who are often, too often overlooked, but so often prove that they are the backbone of our democracy.” She confessed she stood on the shoulders of Black women who came before her, struggling to transform our nation from a society that derided and excluded people of color into one that could, as a whole, become a better place when all were lifted to an equal standing. In the decades that surrounded the turn of the twentieth century, women of all races and even classes undertook an effort to improve society, approaching the problem from different value systems.

    The work of white women who embraced the Progressive ideas of the time became known as “municipal housekeeping.” Rooted in the idea that the woman was the mistress of her household domain, it held that a healthy, well-run home depended on a healthy, well-run public sphere. These women sought to “clean up City Hall,” and they improved many facets of life and work outside the home.

    (photo: ad from The Freeman, January 25, 1913)

    Some Black women, meanwhile, were inspired by the Social Gospel Movement, which recognized that Society, not just the Individual, required salvation. One historian framed Black clubwomen’s motives as a desire to take control of their lives and to fulfill the Social Gospel through action, like the women who followed the historical Jesus of the New Testament. Social reforms became the vehicle for saving individuals and, by extension, the civic realm. As with the white push for change, the goals of Black women included uplift specifically for women. The marriage of these two manifestations of faith, uplift for salvation and female empowerment, became important for many of them.

    Before going deeper into the healthcare history that involved these women, two misconceptions must be pointed out. First, being Black in Indianapolis in the first decades of the twentieth century did not automatically mean you were poor. There might have been only one Madame C. J. Walker, the famous self-made millionaire, but there were many successful Black businesswomen and wives of businessmen who lived a comfortable life and desired respectful treatment. Second, even working-class women, many of whom were domestics, desired the same respect and were members of clubs that provided social interaction and improvement activities, including adequate and dignified healthcare. One example of their battle for respectability and their quest to improve lives stands out: the founding of a hospital for African Americans in 1911.

    Several women’s clubs worked for improved healthcare in the African American community of Indianapolis. Their efforts ranged from directly providing care and supporting facilities to creating places for care and training caregivers. They undertook these projects with the firm conviction that women possessed unique abilities that allowed them to carry out their missions of care, and to do so with as much autonomy as possible. The most ambitious of these projects was the Sisters of Charity Hospital.

    The Grand Body of the Sisters of Charity (GBSC), not to be confused with Catholic women’s religious orders with similar names, was formed in Indianapolis in 1874 in response to the needs of the large numbers of southern Blacks moving to the city near the end of the Reconstruction of the South that followed the Civil War. Many women’s clubs formed for a variety of ends, some social or utility minded (for instance, sewing clubs) and some with public goals in mind (one example is the Women’s Improvement Club). Many of them embraced the motto of the National Association of Colored Women: Lifting as We Climb. The GBSC differed slightly from other women’s clubs in that it operated as a lodge, with benefits that met members’ needs ranging from burial costs to financial assistance. The underlying purpose of the hospital was to care for lodge members, but service was also extended to the entire Black community.

    Originally housed in a former residence at 15th and Missouri Streets in 1911 (where the parking garage for the IU Health Neuroscience Center now stands), the hospital moved to another house at 502 California Street in 1918 (now an open lawn on the IUPUI campus). The hospital also served the community by formally training young women as nurses, a professional activity that held great prospects for the advancement of Black women. The Sisters also worked with the juvenile courts and “wayward” girls. However sorely these services were needed, such a small institution had a difficult time keeping up with the maintenance and improvements necessary to make the hospital a suitable place for surgery or maternity care. Keep in mind that the Sisters of Charity Hospital and Lincoln Hospital were providing a place for care and treatment that should have been accessible to Black doctors, nurses, and patients as citizens and taxpayers of Indianapolis. It closed around 1921.

    The Sisters of Charity pursued the quest for uplift for their community and briefly accomplished a unique achievement. The Sisters of Charity Hospital was a rare instance of an African American hospital owned and operated by black clubwomen in a northern state. 


    (photo: site of former SoCH at IU Health Neuroscience Center Garage 15th and Missouri, Google Maps 11-14-2020)

  • October 08, 2020 11:27 AM | Anonymous member (Administrator)

    by Haley Brinker

    The idea of drinking human blood or consuming bones might sound like something out of a horror movie to people today, but it was a fairly common practice during the early modern period of history. It actually goes back even further. Medical cannibalism can trace its roots all the way back to ancient Rome, where spectators at gladiatorial fights would drink the blood of fallen gladiators in an attempt to cure their own ills [1]. It was also thought that this vital blood could cure conditions like epilepsy [4]. Now, some might say that this could just be a rare case; a few ancient vampires among a sea of ‘normal people.’ They would be wrong. Medical cannibalism was incredibly widespread. (Image: "Cannibalism in Russia and Lithuania 1571")

    The popularity of medical cannibalism hit its peak in the 1500s and 1600s [2]. The practice of consuming body parts in various, creative ways was everywhere in Europe during this time. Egyptian mummies were thought to be incredibly powerful, so grave robbers went to Egypt to steal them [1]. Now, any movie archaeologist or horror movie enthusiast would eye this practice warily; these robbers were begging to be cursed by the spirits of the former pharaohs. However, no such reports can be located. Not everyone believed that mummies needed to be Egyptian in order to be medicinally powerful; many thought the mummified cadaver of any “healthy man” would do [5]. However, the demand for body parts from mummies was so high that it created a black market of sorts, with industrious would-be grave robbers creating mummies of their own [3]. Like Ina Garten, they believed in the power of homemade. With this can-do attitude, they made local mummies by robbing the graves of local poor people or criminals, sometimes even just using animals and passing them off as human remains [3].

    With medical cannibalism being so popular, it, of course, had its famous supporters throughout history. King Charles II was a believer in the power of human remains to cure the medical maladies of the living. He believed in a medicine called “spirit of the skull,” which contained real skull [1]. In fact, he wanted to make it so badly that he paid six thousand dollars for the recipe, which he referred to as “King’s Drops” [3]. Another enormous fan of consuming literal human body parts in order to cure common ailments was the 17th-century brain scientist Thomas Willis. He believed that one could cure excessive bleeding by mixing together the tantalizing concoction of human skull powder and delicious chocolate [2]. Who doesn’t love a little chocolate when they’re feeling down?

    The 16th-century Swiss-German physician Paracelsus preferred the power of “fresh corpses” [1]. Now, while it seems that he was a vampire, working to create an army of other vampires, that is, unexcitingly, not the case. More affluent would-be blood drinkers could go to their local apothecary to acquire the hemoglobin they so desired [2], while those of less wealth and status would simply attend a public execution and kindly ask the executioner himself for a cup of the deceased criminal’s blood [1]. Paracelsus believed that when someone died suddenly (by hanging or execution, for instance), their “vital spirits” could “burst forth to the circumference of the bone,” and the living could use their highly powerful body parts to heal their ailments [3].

    The list of supporters didn’t end there, either. Marsilio Ficino, an Italian scholar of the 15th century, believed that the elderly should “suck the blood of an adolescent” who was in good spirits and of sound body to regain some of their former vigor [3]. Saint Albertus Magnus stated that a distillation of blood could “cure any disease of the body” [3]. Elizabeth Bathory’s belief in bathing in the blood of young women doesn’t seem so far-fetched now, does it? Heinous? Yes. A horrific crime of tremendous magnitude? Absolutely. A belief system totally out of line with the times? Nope.

    Bones and blood weren’t the only ‘useful’ remedies at the time. The practitioners of medical cannibalism were what some might call… creative. Blood was thought to be the “vehicle of the soul,” so it was considered especially powerful [4], but how to deal with the pesky taste of drinking warm, human blood? Marmalade! Blood marmalade, to be precise. A Franciscan apothecary in the 1600s had a delightfully descriptive recipe for creating the culinary confection that is blood marmalade [1]. Step one (the most important step, as we all know) was to find a donor with the following traits: “warm, moist temperament, such as those of a blotchy, red complexion and rather plump of build” [3]. It is quite difficult to pin down exactly what a ‘moist’ temperament is, but I’m sure those at the time had someone in mind as soon as they read the recipe. Bones were allegedly useful as well. It was believed that ‘like treated like,’ so skull powder was a great cure for any ailments of the head [3]. Even objects near the cadaver could hold power. A moss that grew on skulls, called usnea, which literally means “moss of the skull,” was thought to prevent nosebleeds if one simply held it or shoved it right into the nose [1].

    As stated previously, bones and blood weren’t the only parts of the body that could ‘cure.’ Human fat was thought to have all sorts of medicinal properties. For instance, fat could prevent bruising of the skin [3]. The fatty fun doesn’t stop there, though. It was believed that the magical properties in human fat could be used to create something called a ‘Thieves Candle.’ This human-fat-containing candle was thought to be able to “paralyze enemies” [2]. Fat was so important to medicine that the local executioners would directly deliver the fat from executed criminals right to the apothecaries around town [3].

    While the practice of consuming human remains was widespread and incredibly popular at this time, it didn’t prevent white Europeans from condemning tribal practices involving cannibalism with extreme revulsion. Puritans didn’t support the Catholic belief in “transubstantiation” [5]. They believed that transforming bread and wine into the body and blood of Christ and then consuming it was a form of cannibalism [2]. Cannibalistic ritual practices performed by Native Americans were seen as ‘barbaric’ and used as an example of why they should be subjugated by the Europeans [3]. It is an interesting juxtaposition, because Native American cannibalistic practices were social and sacred and were done in order to “reintegrate the deceased into the tribe” [3]. On the flip side, Europeans often didn’t know whose remains they were consuming. Often, bodies used for medical cannibalism belonged to those on the lowest rungs of the societal ladder: the poor, the disenfranchised, the ‘other.’

    Using ritual cannibalism as a stick with which to beat down those whom Europeans deemed ‘less’ was very common. During the subjugation of the Irish by the English, Irish skulls were dug up and sent to German pharmacies and apothecaries to be ground into powder and sold as a commodity [3]. Joseph Hall, a onetime bishop of Exeter, delivered a fiery sermon referring to the Turkish people as “bloody, man-eating cannibals, mongrel troglodytes feeding upon bloody carcasses” [3]. Bishop Hall was apparently fine with his own people consuming bones mixed with chocolate and alcohol or smearing a little blood marmalade on crusty bread, but not with social, religious rituals of respect performed by non-white, non-Protestant individuals.

    While the practice of medicinal cannibalism gradually dwindled, a book published in Germany in the early 1900s noted that a pharmaceutical company was still offering “genuine Egyptian mummy” in its catalog [5]. The human body is still used in medicine today; however, these practices, such as blood transfusions and organ donations, are far more medically sound and don’t require any visits to the local executioner.



    Bibliography

    [1] Sugg, R. (2008). The art of medicine, Corpse medicine: Mummies, cannibals, and vampires. The Lancet, 371(9630), perspectives. doi:https://doi.org/10.1016/S0140-6736(08)60907-1

    [2] Dolan, Maria. “The Gruesome History of Eating Corpses as Medicine.” Smithsonian.com. Smithsonian Institution, May 6, 2012. https://www.smithsonianmag.com/history/the-gruesome-history-of-eating-corpses-as-medicine-82360284/.

    [3] Lovejoy, B. (2016). A Brief History of Medical Cannibalism. Lapham's Quarterly, 9(5).

    [4] Himmelman, P. K. (1997). The Medicinal Body: An Analysis of Medicinal Cannibalism in Europe, 1300-1700. Dialectical Anthropology, 22(2).

    [5] Gordon-Grube, K. (1988). Anthropophagy in Post-Renaissance Europe: The Tradition of Medicinal Cannibalism. American Anthropologist, 90(2).


  • September 28, 2020 12:11 PM | Anonymous member (Administrator)

    by Norma Erickson

    By the time the Lincoln Hospital opened in December of 1909, the African American doctors of Indianapolis could no longer carry on under the existing state of medical practice in the city. Shut out of the hospitals of Indianapolis, they could not continue to care for their patients who required hospitalization, a situation that led to disastrous outcomes for some Black patients. Sometimes the disconnect that occurred when a patient was moved from home to hospital left a very sick person vulnerable to mistakes.

    One such case happened in March of 1905, when Thomas Jones, a seriously ill African American man, was denied an examination at the City Hospital. He had recently been seen by two Black physicians; one wrote an order for him to be admitted to City Hospital. A carriage was called, and when the driver, undertaker C.M. Willis, arrived at the hospital, the intern on duty looked at the man in the carriage, saw blood on the front of his clothes, and immediately determined that he had tuberculosis. The doctor did not take Jones’s temperature nor move the patient to an examination room, because the clerk on duty would not help him. Tuberculosis cases were prohibited from City Hospital, so the intern told Willis to take him to the county poor house. He did so, and thirty minutes later Thomas Jones died. The physician knew that City Hospital would not accept TB patients and would not have requested admission there if he had believed Jones had tuberculosis; nor would the nurse who saw Jones in his home have called for the carriage to take him there. The nurse had collected a sputum sample at his home before he was removed. When tested later, the sample was negative for TB. The Black community was outraged, and reporting on the case appeared in both the Indianapolis Star and the Indianapolis News for three weeks.

    During this era, the role of hospitals was undergoing great changes. No longer merely places for the poor to receive treatment, they underwent modernization that allowed life-saving surgeries to take place. But a Black physician did not have access to such facilities, even in public, tax-supported institutions like City Hospital. Black patients who would have preferred treatment in a hospital to home care were put off by the uncomfortable environment of all-white medical and nursing staffs. Between the physicians’ loss of revenue and prestige as surgeons and the patients’ low confidence in the system, it was clearly time for a new approach by the African American community. If the segregationist rules did not change, then it was time for a public hospital for African Americans, and the only way to get one was to start their own.

    Like Ward’s Sanitarium, the Lincoln Hospital launched a nurse training program that attracted students from around the state. It also included a free dispensary to treat the poor, just like the public hospital. Women’s clubs stepped up to gather funds and donate goods. Two prominent white men, a business owner and a politician, donated substantially to get the project off the ground. The physicians published a first annual report that highlighted glowing successes and also acknowledged the cases they lost. Five years later, the hospital closed.

    The reason most often cited was a lack of funding. That certainly could be true, but could there be another reason? Could it be that the Black doctors of Lincoln Hospital allowed it to end because it was time to make a push to be installed at City Hospital? For five years, they had managed a facility and demonstrated their ability to perform operations successfully. One of their own would run for city council and win that year, bringing the hope of change at City Hospital almost within reach. War had begun in Europe, bringing the possibility that young Blacks would enter military service soon, another way to prove the mettle of the Race.

    But the entrance of Black physicians into Indianapolis’ public hospital would not happen for another thirty years, and access to both adequate and trusted healthcare would continue to deteriorate.

    Next month: The Sisters of Charity Hospital


  • September 14, 2020 3:05 PM | Anonymous member (Administrator)

    by Haley Brinker

    The story of radium began in the laboratory of Marie Curie and her husband, Pierre, in 1898. It was there that they discovered the power of this element, a discovery that would earn them a Nobel Prize in Physics [1], but radium’s story was just beginning. Soon, the entire world would be at rapt attention, hungry for any news of or product containing what some called a “miracle element” [2].

    By the early 1900s, radium was synonymous with health. Products of all kinds touting the benefits of radium for the human body were everywhere. Radium water was incredibly popular, and one company, Bailey Radium Laboratories, proclaimed far and wide that its product, Radithor, was the tonic to cure all of the public’s bodily ills [3]. It was even picked up by a celebrity sponsor of sorts, Eben MacBurney Byers. He loved Radithor so much, and spoke its praises so highly, that he consumed over a thousand bottles in only five years [3]. There were radioactive toothpastes, radioactive makeup, and even entire spas dedicated to the healing power of radium [1]. One product, the Radio-Active Pad, even claimed that it could cure blindness (yes, blindness) when worn “on the back by day and over the stomach at night” [4]. Consumers at the time were spoiled for choice. They could have their radium in the form of a kissable pout with lipstick. They could spend a day with the girls, lavishing in a lush spa, cucumbers covering their eyes, while they received treatments enhanced with the restorative powers of this wondrous cure-all.

    Radium in branding was so popular, in fact, that some companies simply put it in their names without actually putting any into their products. One of these was “Radium Brand Creamery Butter,” which likely didn’t contain any of its namesake element [4]. Where people today would jump at products labeled ‘organic’ or ‘gluten free,’ folks during the radium craze lunged for any product that claimed it was associated in any way with radium. This was the power of popularity. Radium was trendy; radium was chic.

    Like all fads, the public’s love affair with radium would not last forever. Stories began circulating around the world, highlighting the serious health problems of those who were ingesting radium. Our Radithor-loving celebrity endorser, Eben MacBurney Byers, was soon afflicted with a host of health problems. After five years and over a thousand bottles of Radithor, the radium in his system had caught up with him. An attorney, Robert Hiner Winn, went to interview Byers in 1930 on behalf of the Federal Trade Commission. When he met Byers, he was dumbfounded; this previously hardy and healthy man was now a shadow of his former self. Byers’ radium poisoning was so severe that “half of his face was missing” [3]. The health tonic Byers had spent years promoting and using had been his downfall. He had been misled, and he was not the only victim.

    During the peak of radium mania, the US Radium Corporation had set up factories to produce watches with glow-in-the-dark dials. These dials were hand-painted with radium-laced paint by young women who, to keep the numbers crisp and precise, pointed the tips of their brushes with their lips. Each lip-pointed brush stroke was “slowly irradiating” the women “from within.” Soon, the women began to come down with symptoms of radium poisoning [2]. Bright, young women attempting to make a good living became bed-ridden, in worse health than most septuagenarians, and they had not even reached the age of thirty.

    With these two stories, and a plethora of others, the world turned its back on radium. The government, namely the U.S. Food and Drug Administration, outlawed patent medicines that contained radium as an ingredient [2]. Ingesting radium as medicine became a thing of the past, a bad memory in the public consciousness. Its popularity and downfall give meaning to a phrase that anyone eyeing a product that seems too good to be true should remember: buyer beware.


    Bibliography

    [1] Crezo, Adrienne. “9 Ways People Used Radium Before We Understood the Risks.” Mental Floss. Mental Floss, October 9, 2012. https://www.mentalfloss.com/article/12732/9-ways-people-used-radium-we-understood-risks.

    [2] Moss, Matthew. “The Radium Craze – America's Lethal Love Affair by Matthew Moss.” The History Vault. The History Vault, January 15, 2015. https://thehistoryvault.co.uk/the-radium-craze-americas-lethal-love-affair-by-matthew-moss/.

    [3] Brumfield, Dale M. “The Blessings of Radium Water Made His Head Disintegrate.” Medium. Medium, March 18, 2019. https://medium.com/lessons-from-history/the-blessings-of-radium-water-made-his-head-disintegrate-3ac052cb8620.

    [4] Orci, Taylor. “How We Realized Putting Radium in Everything Was Not the Answer.” The Atlantic. Atlantic Media Company, March 7, 2013. https://www.theatlantic.com/health/archive/2013/03/how-we-realized-putting-radium-in-everything-was-not-the-answer/273780/.


  • August 31, 2020 1:14 PM | Anonymous member (Administrator)

    by Haley Brinker, IMHM graduate intern from the Public History Department at IUPUI

    In the Bacteriology Laboratory of the Indiana Medical History Museum, you’ll find a photograph of Dr. John Hurty, hard at work at his desk. Next to this photograph, you’ll discover a large poster depicting a goblinesque typhoid germ, beckoning and inviting you to meet it at the town pump. This poster, commissioned by the ever public-health-conscious Dr. Hurty and created by cartoonist Gaar Williams, reflects the very serious typhoid problem threatening the health of Indiana’s citizens at that time. To combat this problem, Dr. Hurty recognized that memorable posters leaving little room for confusion about their messages would make it easier for the public at large to understand the public health issues facing them.

    Another of these posters (above) depicts a Creature from the Black Lagoon lookalike, rising from a bottle of milk, while a helpless, diapered child looks on, his rattle his only defense. Looking at this poster today, one can’t help but wonder what on earth could be so deadly about drinking something so seemingly harmless as milk.

    To put it simply, milk, prior to pasteurization and federal regulation, was absolutely disgusting. One analysis showed that a sample of milk in New Jersey had so many bacterial colonies that the scientists just stopped counting. Dairymen at the time often used cost-saving and morally questionable tricks to ensure that they could milk (sorry) the most profit out of their product. One such trick was thinning the milk with water. In one case, a family reported that their milk appeared to be “wriggling.” Upon investigation, it was discovered that the milkman had used “stagnant” water from nearby, which was apparently full of tiny insect eggs that grew into tiny insect larvae, causing the “wriggling” the family had noticed. Aside from being a scene out of one of your elementary school lunchtime nightmares, it further illustrated the need to regulate the industry. After the thinning process, the milk would sometimes be discolored. To solve this problem, the dairymen simply added things like chalk or plaster to turn it back to the crisp, white color their customers expected. Then it gets nauseating. In order to make doctored dairy look “richer” and more cream colored, a puree of calf brains would sometimes be added to the mixture.

    Samples of milk tested during that time often contained “sticks, hairs, insects, blood, and pus,” but it gets worse. There was also a lot of manure present. There was so much manure in Indianapolis’s milk in 1900 that “it was estimated that the citizens of Indianapolis consumed more than 2000 pounds of manure in a given year.” How could the powers that be possibly fight against all the rampant bacteria and the illness they caused? With formaldehyde, of course! What better way to cure society’s ills than with embalming fluid in the food we eat and the milk we drink? Even our illustrious Dr. Hurty was on board at the beginning. However, he soon realized that it was doing more harm than good. Formaldehyde-related outbreaks of illness would often occur and could even be deadly, especially in children. In 1901, Hurty stated that over 400 children had died from milk tainted with either the chemical, dirt, or bacteria.

    When the federal government finally got around to passing the Pure Food and Drug Act in 1906, the practice of putting formaldehyde in food was finally banned. While government-mandated pasteurization of dairy was still a long way off, the tireless efforts of Dr. Hurty to remove formaldehyde from milk helped pave the way for legal change to better protect the public from those who would profit at the expense of its health.

    TO SEE MORE OF HURTY'S COMMISSIONED CARTOONS AND LEARN MORE ABOUT INDIANA's 1899 PURE FOOD & DRUGS ACT, VISIT THE ONLINE EXHIBIT "FOOD FIGHT!"

                                       

    References:

    Blum, D. (2018). The 19th-century fight against bacteria-ridden milk preserved with embalming fluid. Smithsonian Magazine. Retrieved August 6, 2020, from https://www.smithsonianmag.com/science-nature/19th-century-fight-bacteria-ridden-milk-embalming-fluid-180970473/

    Thurman B. Rice, MD. “Dr. Thaddeus M. Stevens- Pioneer in Public Health [Chapter XIV].” In The Hoosier Health Officer: A Biography of Dr. John N. Hurty, 57–60, n.d.


  • August 24, 2020 9:43 AM | Anonymous member (Administrator)

    by Norma Erickson

    It’s sometimes difficult to grasp why racial health disparities still exist in the twenty-first century. There are many aspects to the problem. One that everyone today can relate to is money. How is healthcare paid for, and who pays for it?

    In the late 1800s and early 1900s, there were few choices. Starting with the most expensive: the very rich were cared for in their homes, where their physician made house calls and private duty nurses provided round-the-clock care. If one had the means, a private sanitarium (a for-profit hospital typically owned by one doctor, or sometimes a group of them) cared for patients in need of surgery or other higher-level care. If you had a little money, the public or municipal hospital offered affordable care for paying patients, and the patient’s own doctor could still have charge of their case.

    The public hospital also admitted the poor, whose care fell to the hospital staff physicians. In the case of municipal hospitals with connections to medical colleges, such as Indianapolis City Hospital, interns and student nurses gave care under the guidance of professional staff. For minor care and medications, the very poor could access publicly funded dispensaries; again, these often doubled as teaching sites.

    At the end of the Civil War, most of the nation’s African American population lived in the South, existing in an agriculture-based economy that placed no expectations on education. Eventually, many would leave to find better opportunities in the North’s large cities. Indianapolis was a very interesting northern city because, unlike some of the larger metropolitan areas, its African American population grew at a relatively slow pace. This allowed the white population to become more familiar with their new neighbors and allowed the establishment of businesses and occupations that crossed over the color line, a line of social segregation between the races that stood solidly until the latter years of the twentieth century.

    The Black community developed class strata, just as the white community did. There were well-to-do folks, a hardworking middling group, laborers, and the indigent on both sides. On the Black side of the line, no matter the group, an underlying element was missing, one that most of the white side enjoyed as a given: respect. African Americans could find that respect within their own environment, but truly adequate healthcare existed only on the other side of the line, where respect was hard to gain. For many in the middle group (small business owners, craftspeople, high-level service workers like train porters), the public hospital was the only option, and they knew that even if they paid, they would be admitted to the worst section of an aging building, without access to their own doctor and at the mercy of a staff that might not respect them.

    The leaders of the Black community understood that providing and receiving healthcare was an economic issue. The community was missing out on opportunities for employment (as nurses and as specialists in developing technologies) and on higher-level physician skills that required modern surgical equipment and support.

    Except for the Alpha Home for Women, which cared for aged Black women, no institutional medical facilities for Blacks existed in the city until 1896, when a new physician, Fernando Beamouth, opened a sanitarium at 651 North Senate Avenue. The Freeman, a major Black newspaper, noted that this was the first sanitarium in the state for Black patients and only the second in the nation to be started by a doctor of color. Beamouth died in 1897. In August 1903, several prominent men in the Black community, including Dr. Sumner Furniss, tried to purchase a building in the 900 block of North Meridian to start a clinic, but abandoned the project when white neighbors objected.

    Later, around 1906 (the actual date is unclear), Dr. Joseph H. Ward opened his sanitarium on Indiana Avenue. This first viable effort mostly served the portion of the population able to pay for private care. For the first few years, Ward did not advertise his sanitarium in the newspapers, but the society pages occasionally announced hospitalizations there, naming patients known as elite members of the Black community. Later, he was Madame C.J. Walker’s personal physician. He likely cared for a few charity patients as well.

    Beamouth, Ward, and Furniss were also members of the Black business league. Ward acted on the fuller economic function of healthcare as a source of good-paying jobs by starting a nurse training program. His sanitarium filled a gap for the elite, but the middle class needed an alternative to the City Hospital. In 1909, Furniss and several other Black doctors formed the Lincoln Hospital, which would function as a public hospital for members of the African American community able to pay for care. The Lincoln Hospital and its physicians will be the subject of next month’s blog post, The Struggle for Adequate Healthcare for African Americans in Indianapolis, 1906-1925, Part III.

    Photo: Officers of the National Negro Business League, at Indianapolis in 1904 from the collection of the Schomburg Center for Research in Black Culture at the New York Public Library. Dr. Sumner Furniss is the first on the left in the second row. 

  • August 19, 2020 3:02 PM | Anonymous member (Administrator)

    by Sarah Halter

    Despite the ongoing pandemic and our temporary closure, these are exciting and productive times at the Indiana Medical History Museum.

    This organization has come a long way in recent years. Among other things, we are making it a priority to better manage and care for all of our collections, and, as much as possible, make them accessible to the public. In late 2019, after successfully completing a large project to catalog the Museum’s library collection, we began a similar project to catalog, organize, and better protect our extensive archival collection. Our goals were to improve accessibility of the materials, identify holes in the collection, better track conditions, prioritize materials for digitization, and better manage and track use of the materials.

    We don't yet know the precise extent of our archival collection, but we estimate that it contains approximately 5,500 documents (personal papers, research notes, pamphlets, charts, instruction sheets, loose records, photographs, sketches, advertisements, class photos, etc.), including many oversized or rolled documents, plus hundreds of pieces of framed artwork, ledger books, and 16mm film reels, and about 11,000 (!) glass plate negatives.

    As was the case with the library collection before we completed Phase I of this project, we just don't know everything we have. We can't always locate materials that we know we have, because storage locations in many cases have changed numerous times over the years. Our archival collections have been disorganized and inadequately protected on shelves that are sometimes unstable and frequently inefficient and unsecured. To protect and make better use of these materials, we must organize and store them using archival-quality materials and secure (in some cases fire- and water-resistant) shelves and cabinets. Last month we were awarded a $15,000 Heritage Support Grant provided by the Indiana Historical Society and made possible by Lilly Endowment, Inc. to help us accomplish this.

    This is such important work. It’s critical, in fact, to our mission to preserve and present Indiana’s rich medical history. We are stewards of a wonderful collection that contains a wealth of knowledge and many rare and very historically significant materials. When this project is completed, these materials will be much more useful for our internal research, publications, and exhibits. And most will be available to patrons, as well, when we reopen to the public and establish our Reading Room hours.

    We miss seeing you all here in the Old Pathology Building for tours and programs. But we’re making good use of this time to improve our digital and virtual offerings and to improve your experience and your access to our collections when it’s safe to have you back. Thanks for your patience and your continued support! It means so much to us.


    PHOTOS

    Top: The IMHM collection includes many pieces of artwork, including works created by patients. The works of the transient artist John Zwara are among the most exceptional. We have 22 of his paintings, 21 of which were done while he was a patient at Central State Hospital in the spring and summer of 1938. Most depict the grounds of the hospital as they were at that time. He painted several of the hospital’s large buildings, like this one of the Pathological Department that now houses the IMHM, as well as areas of the grounds.

    Bottom: Our collection includes many ledgers of autopsy records from Central State Hospital, as well as admissions, bookkeeping, and other types of records from a number of other hospitals. Here is a ledger from Long Hospital in Indianapolis.

Copyright © 2021-2022 Indiana Medical History Museum

3270 Kirkbride Way, Indianapolis, IN 46222   (317) 635-7329
