Health insurance systems that provide payment for new innovations also encourage medical advances. Medical treatments can be very expensive, and their cost would be beyond the reach of many people unless their risk of needing health care could be pooled through insurance (either public or private). The presence of health insurance provides some assurance to researchers and medical suppliers that patients will have the resources to pay for new medical products, thus encouraging research and development. At the same time, the promise of better health through improvements in medicine may increase the demand for health insurance by consumers looking for ways to assure access to the type of medical care they want.
The continuing flow of new medical technology also results from other factors, including the desire of professionals to find better ways to treat their patients and the level of investment in basic science and research. Direct providers of care may incorporate new technology because they want to improve the care they offer their patients, but they also may feel the need to offer the “latest and best” as they compete with other providers for patients. Health care professionals, like people in other occupations, also may be motivated by professional goals (e.g., peer recognition, tenure, prestige) to find ways to improve practice. Commercial interests (such as pharmaceutical companies and medical device makers) are willing to invest large amounts in research and development because they have found strong consumer interest in, and financial reimbursement for, many of the new products they produce. In addition, public and private investments in basic science research lead directly and indirectly to advancements in medical practice; these investments in basic science are motivated not necessarily by an interest in creating new products but by the desire to increase human understanding.
An estimated $111 billion was spent on U.S. health research in 2005. The largest share was spent by Industry ($61 billion, or 55%), including the pharmaceutical industry ($35 billion, or 31%), the biotechnology industry ($16 billion, or 15%), and the medical technology industry ($10 billion, or 9%). Government spent $40 billion (36%), most of it by the National Institutes of Health ($29 billion, or 26%), followed by other federal government agencies ($9 billion, or 8%) and state and local government ($3 billion, or 2%). Other Organizations (including universities, independent research institutes, voluntary health organizations, and philanthropic foundations) spent $10 billion (9%). About 5.5 cents of every health dollar was spent on health research in 2005, a decrease from 5.8 cents in 2004. 8 It is not known how much of this health research spending went specifically to medical technology, though by definition most of the Industry spending ($61 billion) did. In 2002, medical technology industries devoted a greater share of sales to research and development than did other U.S. industries: 11.4% for the Medical Devices industry and 12.9% for Drugs and Medicine, compared with 5.6% for Telecommunications, 4.1% for Auto, 3.9% for Electrical/Electronics, 3.5% for All Companies, and 3.1% for Aerospace/Defense. 9
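The percentage shares above follow directly from the dollar amounts. As a minimal sketch (Python, using only the rounded figures quoted in this paragraph, so subtotals may differ by a billion or a percentage point due to rounding in the source):

```python
# Illustrative tabulation of the 2005 U.S. health research spending
# figures cited above (dollar amounts in billions, as given in the text).
TOTAL = 111  # estimated total U.S. health research spending, 2005

spending = {
    "Industry (total)": 61,
    "  Pharmaceutical": 35,
    "  Biotechnology": 16,
    "  Medical technology": 10,
    "Government (total)": 40,
    "  National Institutes of Health": 29,
    "  Other federal agencies": 9,
    "  State and local": 3,
    "Other Organizations": 10,
}

for category, billions in spending.items():
    share = 100 * billions / TOTAL  # percent of total spending
    print(f"{category:<34} ${billions:>3}B  {share:5.1f}%")
```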
Policy Issues
Whatever the value of advances in medical care, the rapid growth in health care costs increasingly strains personal, corporate, and government budgets, and policymakers and the public must consider how much health care the nation can afford. Can the U.S. continue to spend an expanding share of GDP on health (from 7.2% in 1970 to a projected 20% by 2015)? If the answer is no, then society must consider ways to reduce future growth in health spending. And because, as described earlier, the development and diffusion of new medical technology is a significant contributor to the rapid growth in health care spending, it is new technology that we would look to for cost savings.
Currently, most suggestions to slow the growth of new medical technology in the U.S. focus on cost-effectiveness analysis. Other approaches have problems: some used by other countries are unpopular in the U.S. (rationing, regulation, budget-driven constraints), some have been tried and found not to have a significant impact on technology-driven costs (managed care, certificate-of-need approval), and others are expected to have only limited impact on health care spending (consumer-driven health care, pay-for-performance, information technology). Cost-effectiveness analysis involves unbiased, well-controlled studies of a technology’s benefits and costs, followed by dissemination of the findings so they can be applied in clinical practice. The use of inappropriate technology could then be controlled through coverage and reimbursement decisions, using financial incentives for physicians and patients to choose cost-effective treatments. The cost-effectiveness findings could be applied at the health plan level 14 or through a centralized, institutional process, such as Britain’s National Institute for Health and Clinical Excellence (NICE). If implemented at the national level, questions about the structure, placement, financing, and function of a centralized agency would have to be resolved. 15 Other issues include whether money would actually be saved by reducing the use of costly technology whose marginal value is low, how the cost impact would be monitored, and whether a cost containment approach would discourage technological innovation.
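For concreteness, cost-effectiveness studies of the kind described above commonly summarize a technology as an incremental cost-effectiveness ratio (ICER): the extra dollars spent per additional unit of health benefit, often measured in quality-adjusted life years (QALYs), the metric NICE uses in its appraisals. A minimal sketch, in which the costs, QALY estimates, and threshold are entirely hypothetical:

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: additional dollars spent
    per additional quality-adjusted life year (QALY) gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical comparison: a new treatment versus standard care.
ratio = icer(cost_new=75_000, cost_old=50_000, qaly_new=6.0, qaly_old=5.5)

THRESHOLD = 100_000  # hypothetical willingness-to-pay per QALY
verdict = "within" if ratio <= THRESHOLD else "above"
print(f"ICER: ${ratio:,.0f} per QALY ({verdict} the threshold)")
```

A coverage or reimbursement rule of the sort described above would favor treatments whose ICER falls below the chosen threshold; in practice, the underlying studies, the QALY estimates, and the threshold itself are all subject to debate.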