How Animal Testing Harms Humans – by Amez-Droz

Ms. Amez-Droz is program manager of the Open Health Project at the Mercatus Center at George Mason University in Arlington, Va.

The Senate last week unanimously passed Rand Paul and Cory Booker’s bipartisan FDA Modernization Act 2.0 to end the mandate that pharmaceutical drugs be tested on animals before human trials. The bill has a good chance of either passing the House this fall or being included in a year-end package. It’s a good reminder of how onerous regulation hinders innovation and harms people.

It took nearly a century to initiate a change in the law despite massive advances in drug-discovery technology. In 1938, Congress ordered that animal testing be conducted as part of the Food and Drug Administration’s drug-approval process. While that method might have made sense with the drug-testing capabilities of the time, studies have since shown that animal testing can be a poor predictor of toxic response in human beings. The new bill would make such testing optional, allowing pharmaceutical manufacturers to choose the most effective toxicity-testing techniques.

The decision to enshrine animal testing in law was misguided, but it teaches a valuable lesson: The more specific the mandate, the more harmful, innovation-hindering and costly the results.

Mandating specific technologies hurts patients. Ninety percent of drugs that undergo Phase 1 trials—the first trials on human subjects—are never commercialized. The reason is often toxicity: the drug passed animal testing but turned out to be harmful to humans. Patients suffer in the course of such trials, physically and mentally, having pinned their hopes on finding relief from a drug that cleared animal testing only to prove toxic.

By imposing an outdated testing method on drug developers, Congress inadvertently encouraged the administration of potentially harmful drugs to human subjects. If the law changes, pharmaceutical companies could rely on more-accurate tests before beginning human trials, allowing safer and more effective drugs to advance toward commercialization.

Scientists have developed effective alternatives that allow for rapid drug discovery. For instance, with organs-on-chips—three-dimensional structures that mimic human organs—researchers can test new drugs and observe how diseases respond. This allows them to predict accurately how a drug would perform in a human subject, rendering animal testing obsolete. But companies still have to conduct animal testing to comply with the law. If Congress enacts the FDA Modernization Act 2.0, pharmaceutical companies would be free to rely on these modern testing methods before human trials begin.

This would cut research-and-development costs and allow for quicker drug discovery, increasing the number of pharmaceutical products on the market. Given that bringing a drug to market can already cost $1 billion or more, removing such legal barriers would be welcome. Ultimately, it would mean more competitive drug prices.

Here’s the best part: The bill doesn’t contain an exhaustive list of authorized testing methods. It is broad enough to allow researchers to employ the latest drug-discovery techniques in their preclinical testing, and it encourages them to keep innovating. Right now, computer models relying on artificial intelligence are changing the industry. They could accelerate the pace of discovery and application submission to the FDA while providing more-granular information about chemical interactions in patients taking several different drugs.

How many other decades-old laws are depriving people of affordable, high-quality treatments? We may never know. But this mandate should serve as a cautionary tale for future lawmakers. Enshrining state-of-the-art technology into law risks undermining the development of better methods. Policy makers should take the long view when legislating drug approval or other technologically complex areas.
