Evidence from recent weeks suggests that the battle over high drug pricing is one that big pharma risks losing, raising questions about the economic model of an industry that relies heavily on US profits to reward investors and finance new drugs. Pharmaceutical executives say their ability to price drugs in the US according to what the market will bear allows them to cover the considerable cost of finding new, often revolutionary, treatments; the cost of developing and winning approval for a new drug now runs at $2.6bn, according to the Tufts Center for the Study of Drug Development, compared with $802m in 2003.
If there is one thing that the pharma industry and those who foot the US healthcare bill agree on, it is that the US is paying too much relative to other western countries, especially in Europe. The industry has recently come under attack in the media, once again, for spending more on marketing and sales than on R&D. Much of the recent press coverage of drug pricing has taken a narrow view of the topic.
Many observers want the U.S. government to mandate lower drug prices — for example, by allowing re-importation of drugs or by having Medicare negotiate drug prices. It is tempting to believe that price reductions reduce companies’ profits without affecting innovation. However, the absence of R&D into unprofitable third-world diseases, such as Ebola, and the increase in drug approvals for rare diseases under the U.S. Orphan Drug Act belie this belief. Indeed, economists estimate that a 10% increase in market size increases the number of new drugs by 40%. Price regulation has been shown to delay the launch of new drugs, limit their availability, and reduce the pace of innovation.
Much of the debate on the cost of medicines focuses on the initial U.S. price. But prices vary across different sectors of U.S. health care, among nations, and over time. And for most new drugs, patents expire approximately 12 years after market introduction. In the U.S. today, generics account for 86% of prescriptions.
Focusing only on the cost of a medicine — without considering its health-improving or life-saving benefits, or the consequent reductions in other health care expenses — ignores its real value. While the cost is immediate, the benefits often don’t accrue for years. I co-led a team that brought alendronate from the laboratory to worldwide use. The incidence of osteoporosis-related hip fractures in women has declined by 40% since the mid-1990s, when alendronate and (subsequently) similar drugs were introduced in the United States. Preventing such fractures avoids suffering for individuals and saves thousands of lives. It also lowers the cost of caring for people who would otherwise have endured a fracture.
Improving patients’ adherence to their medication regimens does increase spending on drugs, but the long-term savings can be so compelling that, for example, some insurers now offer all diabetes medicines to their members with no copay. The nonpartisan Congressional Budget Office acknowledges that the use of prescription drugs can reduce overall health care spending.
Industry representatives agree that Americans are paying a large share of the industry’s research costs, but add that the nation enjoys the benefits. Dr. Gerald Mossinghoff, president of the Pharmaceutical Manufacturers Association, said that France’s stringent price controls had resulted in a drug industry that he described as “no longer world class,” adding, “There really is a cause-and-effect relationship between economic pressure and the amount of research.”