September 26th, 2012
GrantWatch Blog invited the author, vice president of research and evaluation at the Robert Wood Johnson Foundation, to report on a webinar that the funder held this month.
The fortieth anniversary of the Robert Wood Johnson Foundation (RWJF), a national funder, is a time for reflection on how we at the foundation have done, what we could do better, and how we should do our work in the future. This past week, in a webinar, Bob Blendon, one of the early staff members at the foundation, noted that when the RWJF began, it wanted to bring an evidence revolution to health problems and solutions.
Early leaders of the RWJF funded evaluations both as a way of holding the foundation accountable for its work and as a way of providing information—disseminating innovations that worked. Blendon (who worked at the RWJF from 1972 to 1987) discussed the foundation’s first great success—the Emergency Medical Services (EMS) Program, which started in 1972 with forty-four sites. The evaluation of the EMS program found that people were getting emergency care faster than before but weren’t being taken to the closest, most appropriate hospital. Part of the EMS program’s goal was realized; the other part would take the next decade to fix.
More recently, the RWJF also used evaluation to support the development of health policy. At the webinar, Jim Knickman (who worked at the RWJF from 1992 to 2006) described the Cash and Counseling Demonstration and Evaluation, in which elderly and disabled Medicaid recipients were allowed to purchase services from whomever they wanted, including family members, and buy equipment that would assist them, rather than receiving services from a Medicaid-approved agency. In the late 1990s, this was very controversial: policy makers were worried that costs would skyrocket, that family caregivers weren’t trained to do this, and that there would be fraud.
The evaluation, conducted by Mathematica and jointly funded by the federal government and the RWJF, found that consumers in Cash and Counseling programs were more satisfied and had better outcomes and that there was no fraud. The evaluation assuaged policy makers, and the program was adopted as a Medicaid option in the Deficit Reduction Act of 2005. Today, fifty states have some form of the program, and the Veterans Administration offers it in seventeen states. The keys to its success were strong evidence and a partnership with the federal government, which improved the likelihood of acceptance.
Alan Cohen (who worked at the RWJF from 1984 to 1992) discussed the difficult problem of evaluating human capital programs that prepare people for leadership and scholarship in health and health care. The outcomes of such programs may not be apparent for two or three decades. The evaluation of the RWJF Clinical Scholars program avoided that problem by assessing the program two decades after it started. Evaluators Jack Rowe and Rashi Fein found that clinical scholar alumni had succeeded in medical careers but had not been active in health policy. In addition, placement of program sites only on the East and West coasts disadvantaged physicians in the middle of the country.
That evaluation led to changes: more emphasis was placed on health policy, and sites in the middle of the country were selected. Evaluations of human capital programs today are more likely to be formative—that is, assessing whether recent alumni are moving along the right path and thus providing feedback for program refinement.
Research has been used to identify and define problems. Linda Aiken (who worked at the RWJF from 1974 to 1987) discussed the development of a new approach to measuring access to care. She mentioned that with RWJF funding, researchers Ronald Andersen and Lu Ann Aday conducted a series of influential access-to-care surveys. The methods they developed are the standard today for federal surveys on access to care. Not only were the methods cutting edge at the time, but the surveys were used to answer pressing policy questions. For example, analysis of one of these surveys in the 1980s debunked the claim that charity care could provide an adequate substitute for health insurance.
Our research on tobacco, which I discussed during the webinar, has been used quite differently from most of our previous research. The foundation funded research that focused on establishing which policies could reduce smoking, especially among children. For example, Frank Chaloupka and his colleagues investigated the impact of tobacco taxes on smoking. They found that higher cigarette prices led to reductions in smoking by teenagers and young adults. The Campaign for Tobacco-Free Kids and the SmokeLess States National Tobacco Policy Initiative coalitions used the research to advocate for policy changes.
We celebrate successes, but we also learn from mistakes. In volume XIII of the RWJF’s To Improve Health and Health Care anthology, the other authors and I reviewed “programs that did not work out as expected.” Research and evaluation projects have also failed—often because of poor design, a situation that wasn’t discussed in the September webinar. In a report that will be released next month, FSG, a consulting firm, examines the RWJF’s work on substance abuse from 1986 to 2009. The researchers conclude that the RWJF’s evaluations in this area did not contribute to learning and to making better programs to prevent substance abuse. Since then, much of that problem has been corrected by funding evaluations that provide ongoing feedback as well as measure outcomes.
Research and evaluations can also fail to provide timely information, as was discussed in the webinar. In 1986 the RWJF funded the AIDS Health Services Program to provide services in eleven communities. Because of the great concern about AIDS, Congress adopted the RWJF’s approach in the Ryan White Comprehensive AIDS Resources Emergency (CARE) Act in 1990, before the foundation’s evaluation was finished.
In 1980 the RWJF, the John A. Hartford Foundation, and the federal government launched the National Hospice Study. Before study results were available, Congress passed a hospice benefit for Medicare beneficiaries. In this case, early results from the study informed that legislation.
My take is that the RWJF’s strategy for evaluation and research was naïvely similar to the voice in the Field of Dreams movie: if you build it (and evaluate it), they will come. Early on, the foundation was wildly successful with this humble approach. Times were different then. Today, we need to be strategic about how to translate and communicate our research in terms that advocates, influential persons, and policy makers can understand, and we need to be strategic about how to turn research into policy.
Research topics, designs, and uses have changed at the RWJF. Nevertheless, the commitment of this foundation to evidence-based philanthropy remains steadfast. The RWJF is still committed after all of these years to objective research—conducted by independent scholars and usually published in peer-reviewed journals—about practical issues.
View the RWJF’s series of anthologies, which look at the successes and shortcomings of many of the foundation’s programs.