
Heroes or morons? Scientists who become their own guinea pigs and their self-inflicted mishaps

By Erin Blakemore, The Washington Post

Published June 29, 2017

Werner Forssmann had a plan - a plan he knew his superiors would never approve. The 24-year-old German surgeon was frustrated by how difficult it was to access the human heart, but he doubted he'd get permission to perform a risky new procedure. And so, in 1929, he tried it on himself, thereby joining an age-old club: scientists who use themselves as guinea pigs.

Forssmann's plan was rudimentary and extremely dangerous. With the help of a nurse who hadn't realized what was about to happen, he pushed an oiled urinary catheter through a vein in his arm and almost all the way to his heart, then rushed to another floor of his clinic to X-ray the results. An appalled colleague fought to remove the catheter but was unable to do so before Forssmann pushed it all the way into his heart, proving the procedure wouldn't kill a patient.

Self-experimentation takes many forms, but at its core it combines curiosity with a willingness to treat oneself as a test subject.

To Lawrence K. Altman, a physician who wrote a history of how self-experimenters shaped medicine, the practice shows "a degree of altruism" that has been good for science. When researchers are willing to use themselves as guinea pigs, that "should be reassuring" to people who eventually receive those treatments, says Altman, a global fellow at the Woodrow Wilson International Center for Scholars and a medical columnist for the New York Times. "The researcher is willing to take the risk before asking anyone else to take the risk."

Forssmann catheterized his own heart nine more times. He lost his job when he published his results without getting permission, but 26 years later, he was sitting in a bar when his wife rushed in with a message: He'd just been awarded the Nobel Prize for his discovery. Today, cardiac catheterization is common, widely and safely used to find heart defects, deliver medicine and open up blocked arteries.

Jonas Salk was part of the club, too. In 1953, he injected himself with an experimental polio vaccine and became the first human who didn't have polio to receive the shot. He was so confident in the new vaccine, which used "killed" polio virus, that he injected his wife and young sons as well. They and others in Salk's lab were the first recipients of the vaccine, which is still administered today. Salk's vaccine - and variations that followed - helped eradicate polio in the Americas.

Scientists experiment on themselves for all kinds of reasons. As in the case of Barry Marshall, who in 1984 drank a beaker filled with tainted beef broth to prove that bacteria cause ulcers, they may feel their work is too risky to try on anyone but themselves. Or as with Wendy Suzuki, a neuroscientist who made a mid-career detour a decade ago to study the effects of exercise on her own brain, they may use personal experiments with lower stakes to find new avenues for research.

For others, including Kyong-Sup Yoon, self-experimentation begins with sheer necessity. The biologist and assistant professor at Southern Illinois University at Edwardsville was studying the effects of a common medication on head lice when he developed an itchy conundrum. Lice were in short supply, and when he tried sourcing them at elementary schools, he often returned to the lab empty-handed. So he decided to farm them on his own legs.

Yoon attached a plastic container to his leg, seeded it with head lice and let them feed on his blood. He admits the stakes were low; unlike body lice, their hair-loving cousins don't transmit disease. But the personal sacrifice was high. "It was really painful," Yoon says.

Despite escaped lice and worries about the social cost of becoming an insect farm, Yoon hosted the insects between 2001 and 2004, growing up to 300 at a time. He used the experiment to develop a method for growing large numbers of lice in the lab. Yoon's research has shed light on drug resistance in lice and how they transmit disease.

Not all self-experiments are done on the body. At the turn of the 20th century, Austrian scientist Karl Landsteiner used vials of his own blood to prove that different blood types in humans exist - and that incompatible ones will clump if mixed together. Today, 14.6 million blood transfusions based on that discovery - which won Landsteiner a Nobel - are performed each year in the United States alone, according to the Centers for Disease Control and Prevention.

Sometimes, the risks prove fatal to the experimenter. Jesse Lazear, a scientist working with Walter Reed to help prove how yellow fever was transmitted in Cuba, died after inoculating himself with an infected mosquito in 1900.

Self-experimentation is not just a relic of the past. Even today, all sorts of people - biotech execs trying out their own drugs and psychologists trying to hack their sleep habits and their weight - are keeping the practice alive and well. Sometimes it's a matter of convenience or lack of funds, but it can also serve to create the proof of concept needed to kick-start a more extensive trial.

"You can prove that [a treatment is] not too dangerous to use on humans by using it on yourself," says George Annas, director of the Center for Health Law, Ethics & Human Rights at the Boston University School of Public Health. "But you're just one data point."

A researcher's confidence based on self-experimentation should have no effect on whether others receive a treatment, Annas warns. Altman disagrees: "It's hard to see how a personal bias can affect the results of an overall data set in a trial," he says.

In any experiment, the researcher must abide by consent laws and reasonably protect a patient's safety. And any treatment that makes it to the general public must be backed by much more data than a single patient.

Annas points out ethical conundrums for researchers who make themselves part of the experiment. "The research is just too biased," he says, noting that people who experiment on themselves may ignore the risks of a procedure.

When stakes are high, it can be tempting to ignore those risks. In 1986, as the AIDS epidemic raged, French immunologist Daniel Zagury injected himself with the first dose of an experimental vaccine to fight HIV. "I am not crazy," he told the Los Angeles Times. Because there was a chance that the vaccine, which contained a protein from the virus, might damage the immune system or cause AIDS to develop, he didn't feel he could give the vaccine to anyone before trying it on himself. But soon after injecting himself, he used the vaccine on a small group of children in Congo, which was then known as Zaire.

Zagury tried to paint his initially secretive actions as compassionate and proper, but the publicity that followed revelations about his use of children - and the deaths of three adults who were later given the product - cast a negative light on the search for an AIDS vaccine. Although a probe against Zagury was eventually dropped, questions remain about his research practices, and the National Institutes of Health restricted collaborations with him. His experimental vaccine never made it to market.

As medical ethics have evolved, so has self-experimentation. Researchers no longer need to infect themselves with cholera - as Max von Pettenkofer did in 1892 to test how the disease spreads - or place themselves in a cyanide-filled chamber to show the effects of poisonous gases on pulmonary function - as Joseph Barcroft did in 1917. (Both scientists survived.)

Because there is no central database of self-experimentation, Altman says, there's no way to know how many scientists may be testing new therapies or ideas on themselves. Given the endless curiosity of researchers, though, the temptation to become a human guinea pig is unlikely to die anytime soon. As medical advances occur, and more and more experiments are done, Altman says, "there's more and more opportunity."
