by Suzanne M. Rivera, Ph.D.
For legitimate reasons, the human research enterprise is frequently regarded with suspicion. Despite numerous rules in place to protect research participants’ rights and welfare, there is a perception that research is inherently exploitative and dangerous.
Consequently, most people don’t participate in research. This is not only a fairness problem (few people undergo risk and inconvenience so that many can benefit from the knowledge derived), but also a scientific problem, in that the results of studies based on a relatively homogeneous few may not be representative of, or applicable to, the whole population. Larger numbers of participants would improve statistical power, allowing us to answer important questions faster and more definitively. And more heterogeneous subject populations would give us information about variations within and between groups (by age, gender, socio-economic status, ethnicity, etc.).
Put simply, it would be better for everyone if we had a culture that promoted research participation, whether active (like enrolling in a clinical trial) or passive (like allowing one’s data or specimens to be used for future studies), as an honorable duty. (Of course, this presumes the research is done responsibly and in a manner consistent with ethical and scientific standards, and the law.)

Two events last week got me thinking more seriously about the idea that participation in research ought to be thought of more like a responsibility of citizenship (like voting) and less like a danger from which people need protection.
The first was a symposium at Case Western Reserve University on the occasion of the awarding of an ethics prize to environmentalist David Suzuki. He spoke about the need for a paradigm shift in the discourse about conservation versus growth, such that we would start approaching decisions with an eye toward effects on future generations, rather than measuring outcomes based on current-year balance sheets or four-year election cycles.
The second was President Obama’s acceptance speech at the Democratic National Convention. In it, he also promoted long-term thinking, offering a definition of citizenship that asks us to consider not only our responsibilities today but also “the idea that this country only works when we accept certain obligations to one another, and to future generations.”
As I listened to both Suzuki and Obama, I thought about the significant changes in behavior affecting public health that were made possible during my lifetime through deliberate culture change. Over the last forty years, public attitudes and norms about smoking, seatbelts, domestic violence, and littering have changed radically. As a result of new scientific knowledge, new laws, and campaigns to persuade the public that we’d all be better off if we would do or see things differently, we’ve actually changed what we do and how we think about it.
A debate is underway in policy circles about whether and how we should change the current regulations that govern research in the U.S. Some advocate changing the rules to increase protections related to use of existing data and specimens, even when personal identifiers have been removed. Others want to see the rules streamlined, with limits in place only for the riskiest kinds of studies.
Practically speaking, the lack of consensus probably means we will see only modest tinkering around the edges of rules that were promulgated more than 30 years ago. This is unfortunate, because the current rules have not kept pace with technology or public attitudes.
But even more than a change in rules, what we need is a true paradigm shift from research protectionism to research promotion. This would require a public awareness campaign about the benefits of research and the safeguards in place to protect participants’ rights and welfare.
The best course for future generations would be for research participation to be viewed as a valued and honorable expression of shared responsibility for improving human health and welfare.