ORLANDO, FL – Computer-assisted evaluation of tumor response reduced errors and time of evaluation compared with standard-of-care manual tumor response evaluation methods in patients with metastatic renal cell carcinoma (mRCC), according to a study presented at the 2017 Genitourinary Cancers Symposium.1
“Objective response to systemic therapy as measured on computed tomography (CT) determines critical end points in patient care, such as progression-free survival,” said Brian C. Allen, MD, assistant professor of radiology in the department of abdominal imaging at Duke University School of Medicine in Durham, North Carolina. “RECIST 1.1 is the most common objective tumor response criteria and is based on uni-dimensional changes in tumor length.”
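For readers unfamiliar with the criteria, the response categories in RECIST 1.1 are derived from the sum of the longest diameters of target lesions. The sketch below applies the published thresholds (≥30% decrease from baseline for partial response; ≥20% and ≥5 mm increase from the smallest prior sum for progression); the function name and the simplification to target lesions only are illustrative assumptions, not part of the study.

```python
def recist_response(baseline_sum_mm, nadir_sum_mm, current_sum_mm):
    """Simplified RECIST 1.1 categorization from target-lesion measurements.

    Inputs are sums of the longest diameters (mm) of target lesions at
    baseline, at the nadir (smallest sum so far), and at the current scan.
    This sketch omits non-target lesions and new-lesion rules.
    """
    if current_sum_mm == 0:
        return "CR"  # complete response: all target lesions resolved
    # Progressive disease: >=20% increase from nadir AND >=5 mm absolute increase
    growth_mm = current_sum_mm - nadir_sum_mm
    if growth_mm >= 5 and nadir_sum_mm > 0 and growth_mm / nadir_sum_mm >= 0.20:
        return "PD"
    # Partial response: >=30% decrease in sum of diameters from baseline
    if (baseline_sum_mm - current_sum_mm) / baseline_sum_mm >= 0.30:
        return "PR"
    return "SD"  # stable disease: neither PR nor PD criteria met
```

Because each step of this categorization (measuring, summing, comparing against baseline and nadir) is ordinarily done by hand, arithmetic and transcription errors can creep in, which is the failure mode the study set out to quantify.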
Dr Allen explained that tumor size changes are often delayed and underestimate tumor response in patients with mRCC treated with anti-angiogenic targeted therapy. Because RECIST 1.1 tasks are mostly performed manually, there is a risk of human error. Dr Allen and his colleagues compared the effectiveness of standard-of-care manual evaluation and computer-assisted tumor response evaluation (CARE).
Eleven readers from 10 different institutions independently categorized tumor response according to RECIST 1.1, Choi criteria, and MASS criteria using paired baseline and initial post-therapy CT studies from 20 randomly identified patients with mRCC who received sunitinib as part of a phase 3 trial.
For the standard-of-care group, investigators provided readers with remote access to a standard viewer. All measurements, data, and calculations were entered manually into a Web-based Qualtrics survey to mimic the use of an electronic data capture device. CARE uses a custom image-viewing and semi-automated software platform called eMASS that gave readers stepwise guidance, interactive error identification and correction methods, automated tumor metric calculations, response categorization, and data/image archival.
The manual method was associated with at least 1 error in nearly one-third of the 20 patients, on average, compared with a 0.0% error rate with CARE (P < .001).
“Mean patient evaluation time with CARE was twice as fast as the standard-of-care method, with 6.4 minutes vs 13.1 minutes, respectively,” Dr Allen added. “Efficiency was improved through automation of multiple steps.”
This study was limited by its retrospective design, and the true error rate may have been underestimated. The study was not powered or designed to assess objective response reclassifications, and the costs of the automated software were not addressed.
“CARE reduced errors and time of evaluation, indicating better overall effectiveness than manual tumor response evaluation methods that are the current standard of care,” said Dr Allen. “CARE may be widely applicable to a variety of solid tumors and therapies.”
1. Allen B, Florez E, Sirous R, et al. Comparative effectiveness of tumor response assessment methods: standard-of-care versus computer-assisted response evaluation. Paper presented at: 2017 Genitourinary Cancers Symposium; February 16-18, 2017; Orlando, FL.