The VA Post-Scandal: New Law And New Leadership

October 2nd, 2014

Editor’s note: For more on this topic, see the Health Affairs Blog posts from Theodore Stefos and James Burgess, and from Jonathan Bush.

In the wake of the recent scandals in the Department of Veterans Affairs (VA), new leadership was installed: former Procter & Gamble CEO Robert McDonald was confirmed as Secretary by the Senate on July 29, and the Veterans Access, Choice and Accountability Act of 2014 became law on August 7. The Act, fashioned with VA cooperation and described as a VA overhaul, provides resources for the Veterans Health Administration (VHA), demands greater accountability and transparency, and introduces the ability of certain VHA enrollees to choose private health care.

The Act’s bipartisan support and the rapidity of its design and passage (perhaps a model of productive legislative discussion) reflect strong support in the country for veterans. As the new law and the new VA leadership pass the one-month mark, with two years remaining in this administration, the focus will be on how effectively VA implements the law and makes other strides.

Thoughts On The VA Scandal And The Future

June 13th, 2014

For eight years, until May 2013, I directed the Department of Veterans Affairs (VA) medical research program from its Central Office and became familiar with the operations of the Veterans Health Administration (VHA). It was my only VA job and I felt honored to be part of the VA’s vital mission, as did most VA employees I met. Based on this experience, I have some ground level observations on the state of the VA and its future planning in light of the present scandal.

VA’s Scope and Assets

VA has three components: a large health system (VHA), a benefit center (the Veterans Benefits Administration, or VBA), and the highly regarded National Cemetery Administration. All report to the VA Secretary but have different missions, issues, and management requisites. For example, VHA was a pioneer in the electronic health record (EHR), while VBA has had a more recent, painful conversion to information technology (IT). VHA is run by the Under Secretary for Health, on whom VA Secretaries rely almost totally, given their general lack of experience in health care.

VHA is divided into 21 networks and has 8.9 million enrollees (out of the 22 million U.S. veterans). It cares for 6.4 million veterans annually at over 1,700 sites of care, including 152 hospitals, about 820 clinics, 130 long-term care facilities, and 300 Vet Centers for readjustment problems, and it operates a suicide hotline as well as homelessness and other programs. It has helped train two-thirds of U.S. physicians and has made groundbreaking medical research contributions. These assets create strong constituencies for VA both within and outside the veterans’ community.

Applying Comparative Effectiveness Research To Individuals: Problems And Approaches

October 29th, 2013

A Comparative Effectiveness Research (CER) study shows that surgery is better than medical treatment for a particular cardiac condition. My patient is 78 years old and has complicated diabetes; does the study apply? Another patient is 48 years old and otherwise healthy. Does it apply here?

Can the overall results of a CER study be applied to all patients in the target population? Are there substantial, undetected variations among patients in the results of CER? What is the extent of exceptions? These are important policy questions in applying results of CER to day-to-day decisions, clinical guidelines, performance measures and other facets of the modern healthcare system.

The “gold standard” approach to CER is the randomized clinical trial (RCT), a scientific comparison of two or more clinical strategies, with the downsides that it is generally conducted in a special environment and usually has a rather narrow (and possibly unrepresentative) population spectrum. Two variants, the Practical (or Pragmatic) Clinical Trial (PCT) and the Large Simple Trial (LST), include a wider spectrum of patients and more diverse clinical settings.

These approaches provide “average” results, and for the most part averages are thought to apply to a large segment of the population for which they are intended. However, there are clearly differences in effect (heterogeneities of treatment effect, or HTEs) that manifest among CER study subjects and presumably, to a greater extent, in the intended population outside the study. Two approaches may be equivalent on average, but one may be better in a particular group, and such differences may be less apparent when the study’s population base is narrow. Many factors contribute to these HTEs for CER and other trials: comorbidities, severity of illness, genetics, age, medication adherence, susceptibility to adverse events, ethnicity, site, economics, and others.
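How an overall average can mask these subgroup differences is easy to demonstrate with a small numerical sketch. All figures below are hypothetical and are not drawn from any actual CER study:

```python
# Illustration of heterogeneity of treatment effect (HTE):
# two treatments look equivalent on the population average,
# yet each is better in a different subgroup.
# All numbers are invented for this example.

# Mean improvement score by (subgroup, treatment)
outcomes = {
    ("younger", "A"): 8.0, ("younger", "B"): 4.0,  # A better for younger patients
    ("older",   "A"): 2.0, ("older",   "B"): 6.0,  # B better for older patients
}

# Study population: half younger, half older
weights = {"younger": 0.5, "older": 0.5}

def average_effect(treatment):
    """Population-average outcome, weighted by subgroup size."""
    return sum(weights[g] * outcomes[(g, treatment)] for g in weights)

print(average_effect("A"))  # 5.0
print(average_effect("B"))  # 5.0 -- identical on average...

# ...yet the within-subgroup difference (A minus B) is large:
for g in weights:
    diff = outcomes[(g, "A")] - outcomes[(g, "B")]
    print(g, diff)  # younger +4.0, older -4.0
```

Here the two treatments are indistinguishable on the population average, yet each is clearly superior in one subgroup; this is exactly the pattern that a narrow or unrepresentative study population can hide.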

The Privacy Conundrum And Genomic Research: Re-Identification And Other Concerns

September 11th, 2013

No matter what the arena — finance, health care, or national security — questions surrounding the provision of personal data are always the same: how much benefit vs. how much risk? Who handles these data, and can those individuals be trusted? How do organizations guard against data misuse? What are the legal safeguards to protect privacy, and are they sufficient in an era when more data are shared more widely?

Nowhere is the privacy discussion more personal than in genomics, the very hardwiring of our existence. Genomic data are unique to individuals (or identical twins) and, except for occasional mutations, do not change over a lifetime, thereby rendering disclosures permanent. Genomic data also have special properties regarding privacy, especially as comprehensive whole genome sequencing becomes the major technique.

The benefits of amassing genomic data in sufficient case numbers for validity and making this knowledge available to an appropriately wide body of expert investigators are extensive. Research derived from genomic databases offers potentially large health payoffs. Genomics can help scientists predict who will develop a disease (e.g., Huntington’s Disease) and tailor treatments. It also holds the potential to bring about a paradigm shift in how we think about and classify disease; i.e., allowing us to move from the pathology-based approach begun in the late 19th century — which focuses on the progression of disease in a specific organ — to a biochemical- and genomics-based approach. This new approach is already being applied to a number of diseases, including certain cancers.

Seven Ways For Health Services Research To Lead Health System Change

May 30th, 2013

With ACA implementation now at hand — and with it, the formation of accountable care organizations (ACOs) — health services research (HSR) has an especially important role to play. As ACOs take steps that will substantially change health care delivery, the ability to measure and improve health system performance and to acquire these data efficiently will be in greater demand. Is HSR up to the challenge?

As the “basic science” of the health care system, HSR focuses on access, cost, and quality. Health services researchers work to identify and assess vital signs of a well-functioning health care system and develop performance measures for examining system aspects. They also seek to improve the system by identifying gaps in quality and then testing and disseminating solutions to those problems. HSR’s goals and often its approaches are different from the development, testing, and translation of new drugs or other interventions.

But as we’ve seen all too frequently, the necessity of responding to ongoing change within the health system outpaces HSR’s ability to produce timely evidence. As a result, large scale changes are sometimes instituted on the basis of imperfect evidence (e.g., tying financial incentives to measures of physician or hospital performance before studies are completed). Or, as we’ve also seen, by the time a research project comes full cycle — proposed, funded, conducted, and published — external or internal forces impacting the health system may have rendered the original question moot. Within the VA, for example, studies regarding the effectiveness of telehealth interventions often have been overtaken by these technologies’ rapid dissemination throughout the system. For HSR to fulfill its mission of helping transform health care, it first must be able to transform itself.

Reforming the Research Regulatory System

April 24th, 2013

There is a growing consensus that the regulatory system for research is in need of reform. Established 21 years ago by the Common Rule, the system has maintained a rigorous environment to assure that risks to research subjects are addressed and that transparency is maintained.

The trigger for these regulations is a definition of research as a “systematic investigation…designed to develop or contribute to generalizable knowledge.” When this definition is satisfied, an intensive set of requirements ensues, including review, approval, and continued oversight by an Institutional Review Board (IRB); reporting requirements; the necessity for informed consent (often highly complex); and other administrative components. If projects are not “generalizable” (e.g., local hospital programmatic or quality review), they fall strictly under health care system purview rather than under Common Rule regulatory oversight.

The current system has a strong moral imperative and has been critical to mitigating risk for research subjects and providing transparency. However, it is burdensome and fails to take into account the considerable progress made in both the research and clinical enterprises over the last few decades: in research, technological advances in generating data on routine care, and in healthcare, much more stringent oversight.

New Approaches To Learning In The Learning Healthcare System

January 14th, 2013

A goal of twenty-first century healthcare is to establish and enhance the Learning Healthcare System (LHS). As discussed in numerous forums, journals, and social media, the LHS is viewed as critical to improving healthcare. Fundamentally, the LHS converts data about care and operations into knowledge that it translates into evidence-based clinical practice and health system change. In doing so, the LHS relies on health information technology, databases, the electronic health record (EHR) and, importantly, a research infrastructure. The continuing narrow evidence base for clinical care, combined with the need for substantial amounts of data to fill large evidentiary gaps, has, among other factors, fostered the LHS concept.

To assure the utility and validity of data converted and then translated into improvements by the LHS, we need rigorous research approaches that are also efficient. Research is defined as “a systematic investigation…designed to develop or contribute to generalizable knowledge.” At present, evidentiary inputs to the LHS range from activities not generally considered research (e.g., programmatic and quality improvement evaluations) to various forms of research that are sufficiently rigorous, but for various reasons, can be difficult to employ and translate into the routine workings of the LHS.

For example, the randomized clinical trial (RCT) — considered the cornerstone, or “gold standard” methodology — provides the best data, but by its very nature, is separate from the workings of clinical care in a given healthcare system (HCS). Instead, the RCT functions in an alternate environment precisely controlled for the approach’s particulars. Further, the RCT is costly and time-consuming, and its rather narrow entry criteria may diminish its ability to be generalized to routine patients (e.g., those with comorbidities, or those outside of the entry criteria of age and severity).

The Million Veteran Program: Building VA’s Mega-Database for Genomic Medicine

November 19th, 2012

This year marks the fiftieth anniversary of Watson and Crick (and Wilkins) being named Nobel Prize recipients for discovering the structure of DNA, the molecule that carries the genetic code. In the half century since, there has been an exponential growth of knowledge and accomplishment based on their findings. More recently, a confluence of scientific and technical advances has made possible vast progress in our understanding of human disease, its diagnosis, and the most effective treatment(s). Among these advances are genetic testing, high performance computing platforms, and the electronic health record (EHR), which together offer the possibility of clinically rich databases that link genetic information to treatment outcomes.

These and other advances have made it clear that the genetic predispositions to adult diseases are in many cases extremely complex. In its early phases, human genetics focused on single genes for single diseases that generally occurred in childhood; e.g., Tay-Sachs disease. The genomics of adult diseases—such as coronary heart disease—is associated with complexity resulting from multigene interactions and strong environmental influences (e.g., lifestyle and exposures) that may in some cases result in organ-specific “epigenetic” changes that modify DNA.

A prominent example of how these various factors come together can be seen by looking at diabetes. Having a gene associated with diabetes may modestly increase one’s chances of developing this condition from—let us say—6 to 12 percent. But whether diabetes actually results is influenced by additional factors, such as the sequences of other genes, environmental influences (such as diet and exercise), and age.
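That arithmetic can be sketched in a few lines. The 6 and 12 percent figures mirror the illustrative numbers above; the lifestyle multipliers are invented for this example and have no clinical basis:

```python
# Hypothetical sketch of how a risk gene shifts disease probability.
# A crude multiplicative model: baseline risk is doubled by the gene
# (6% -> 12%, as in the text) and scaled by an invented lifestyle factor.

BASELINE_RISK = 0.06          # illustrative population baseline
GENE_RELATIVE_RISK = 2.0      # carrying the gene doubles risk: 6% -> 12%

# Invented multipliers standing in for environmental influences
LIFESTYLE_MULTIPLIER = {"active": 0.8, "sedentary": 1.5}

def diabetes_risk(has_gene, lifestyle):
    """Combine baseline, genetic, and lifestyle factors (capped at 1.0)."""
    risk = BASELINE_RISK
    if has_gene:
        risk *= GENE_RELATIVE_RISK
    risk *= LIFESTYLE_MULTIPLIER[lifestyle]
    return min(risk, 1.0)

print(diabetes_risk(False, "active"))     # about 0.048
print(diabetes_risk(True, "sedentary"))   # about 0.18
```

A real risk model would be far more elaborate (other genes, age, and interactions among factors), but the sketch captures the point: the gene shifts the odds, while environment determines how much of that shift is realized.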
