In the third year of what was supposed to be a five-year study, the data and safety monitoring board overseeing the Systolic Blood Pressure Intervention Trial called it to a halt as the gap in deaths between the two groups widened.
Participants in the more aggressively treated group—who were given medication to keep their systolic blood pressure at 120 mm Hg or below—were clearly faring better than those in the comparison group, which was treated to a common target of 140 mm Hg or less. In the first group, 155 participants died; in the second, 210 did. To proceed with the trial would not have been ethical, agreed researchers at the National Heart, Lung and Blood Institute.
But although the trial showed that more aggressively controlling hypertension saved lives, it raised other questions, and it left behind a wealth of pristine clinical data, potentially chock-full of hidden answers.
Now, the public will get a chance to learn more from that trial. In a new competition, the New England Journal of Medicine, which published the results of the Systolic Blood Pressure Intervention Trial, or Sprint, last November, is calling on data analysts, researchers and anyone else who's interested to take a fresh look at the Sprint data and glean new findings.
“We're pretty sure that such information is there, but we don't know what it is,” said Dr. Jeff Drazen, editor-in-chief of the New England Journal of Medicine. “We're challenging everybody in the world. You can be in Bangladesh or New Zealand or somewhere in China and access the data set and teach the world something.”
Participants in the Sprint Data Analysis Challenge could, for instance, uncover a characteristic identifying patients for whom lowering blood pressure so aggressively is unnecessary, or even excessively risky. Findings unearthed during this re-examination of the Sprint data could spur follow-up clinical trials.
“We want to get the most we can from the work that was done,” Drazen said.
In a way, the medical and research worlds also owe this secondary analysis to the 9,361 participants in Sprint, Drazen suggested.
“They put themselves at risk, and as a result, there were people who actually died because of one group assignment, compared to the other,” Drazen said. “We want to respectfully honor that commitment to improving the health of people in the world.”
The idea that data-sharing improves medical research has gained steam in recent years. If the Sprint challenge produces useful information, it would add evidence to shore up that belief.
“Although clinical trials generate vast amounts of data, a large portion is never published or made available to other researchers,” the National Academies Press said in issuing a January 2015 report, Sharing Clinical Trial Data: Maximizing Benefits, Minimizing Risk. “Data sharing could advance scientific discovery and improve clinical care by maximizing the knowledge gained from data collected in trials, stimulating new ideas for research, and avoiding unnecessarily duplicative trials,” it added.
In January 2016, the International Committee of Medical Journal Editors published an editorial in the New England Journal of Medicine, stating that responsibly sharing data from interventional clinical trials constituted “an ethical obligation.” Sharing such data is a serious undertaking, they said, but it can be done with responsible planning and leadership.
“Done well, sharing clinical trial data should also make progress more efficient by making the most of what may be learned from each trial and by avoiding unwarranted repetition,” they concluded.
To access the data in the Sprint challenge, entrants have to obtain approval or an exemption certificate from an institutional review board or ethics committee. That clearance will show that entrants are serious about the work and that they're qualified to do it.
“It's not a terribly limiting requirement, but it's one that shows that people who are joining the challenge do so at a respectful level,” Drazen said. “It's not just anybody on the street.” As for the number of entrants, “I hope to get somewhere more than 30 and less than 10,000,” he quipped.
Applications for the Sprint challenge opened Sept. 15; applicants will be able to access the data starting Nov. 1. They must pass a qualifying round by demonstrating their competency in working with the data before they can enter the challenge round, which runs from Dec. 1 to Feb. 14, 2017. They are allowed to use other publicly available data sets in addition to the Sprint data.
In sharing their findings, participants have to explain not only what they discovered but also how they did so. That requirement, plus the fact that everyone will have access to the same data, will prevent participants from falsifying their findings, Drazen said.
A few of the final details are still being hammered out, like the number of judges and the scoring system. There will likely be six judges—two clinical trialists, two data analysts and two patients—whose scores of zero through 10 will be based on three criteria: the utility of the findings, their originality and the quality of the methods used. Part of the final score will also depend on the results of a public vote.
“We realize it's subjective,” Drazen said.
The winner will receive $5,000 and will present his or her findings at a summit on sharing clinical trial data in April 2017 in Boston. Second- and third-place winners will be awarded $2,500 and $1,500, respectively, and will also receive trips to the summit.