Treatment advances for children and for adolescents and young adults (AYAs; patients aged 15 to 39 years) diagnosed with acute lymphoblastic leukemia (ALL) have improved survival rates over recent decades, except among neonates. More precise risk stratification of patients classified as clinically standard or intermediate risk under National Cancer Institute (NCI) criteria, using minimal residual disease (MRD) detection of leukemic cells, can safely allow intensification chemotherapy to be truncated from two courses to one for patients at low MRD-stratified risk of relapse.
The UKALL 2003 study also revealed low 5-year rates of central nervous system (CNS) relapse despite infrequent use of cranial irradiation in the study population, bolstering the case that prophylactic cranial radiotherapy might be safely omitted from ALL treatment in young patients at low MRD-stratified risk of relapse. High-throughput deep sequencing, an MRD detection method still in development, might further improve the precision of risk stratification, potentially increasing the number of candidates for reduced intensification chemotherapy.
Introduction
A recent analysis found that 5-year ALL survival rates have climbed from 83.7% during the early 1990s to 90.4% in the first 5 years of the 21st century.1 More than a third of childhood ALL deaths between 1990 and 2005 occurred among children classified as clinically standard risk by NCI criteria.1
“Optimum use of chemotherapy, precise risk classification, and improved supportive care have increased the proportion of children with acute lymphoblastic leukemia (ALL) who have survived to 5 years to 90% or higher in several contemporary trials,” notes Ching-Hon Pui, MD, of St. Jude Children’s Research Hospital and the University of Tennessee Health Science Center in Memphis, TN.2 “This achievement has shifted research emphasis to the reduction of toxic effects, especially in patients with a low risk of relapse, while preserving gains in long-term survival.”
ALL risk stratification for planning postinduction treatment intensity has traditionally relied on clinical, cytogenetic, and morphologic criteria for early response assessment, but this approach yields less specific risk groups than those identified by monitoring of minimal residual disease (MRD).3 MRD has been shown in several large trials to be a sensitive and specific predictor of relapse risk and “the strongest adverse prognostic factor” in ALL.3,4 Undetectable MRD at the end of induction therapy is associated with a “negligible” risk of relapse, whereas patients with MRD values exceeding 0.001% at that point have a relapse risk exceeding 20%.3
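To make the threshold logic above concrete, the short sketch below groups a hypothetical end-of-induction MRD measurement using only the figures quoted in the text (undetectable MRD versus MRD above 0.001%); the function name, the intermediate "low-positive" band, and the Python wording are illustrative assumptions, not any trial's protocol.

def classify_eoi_mrd(mrd_fraction):
    """Illustrative end-of-induction MRD grouping (hypothetical, not a protocol).

    mrd_fraction is the leukemic-cell fraction of nucleated cells
    (0.0001 == 0.01%); None means MRD was undetectable at the assay's
    limit of detection.
    """
    if mrd_fraction is None:
        return "undetectable MRD: negligible reported relapse risk"
    if mrd_fraction > 0.00001:  # above 0.001% of cells
        return "MRD > 0.001%: reported relapse risk exceeds 20%"
    return "low-positive MRD: grouping depends on protocol-specific cut points"

print(classify_eoi_mrd(None))
print(classify_eoi_mrd(5e-5))  # 0.005% of cells -> higher-risk group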
MRD Monitoring Methodologies
Several methods are in clinical use for MRD monitoring. Two widely used modalities are flow cytometry, which can detect a single leukemic cell among 10,000 healthy cells (a sensitivity of 0.01%), and allele-specific oligonucleotide PCR (ASO-PCR), which has a sensitivity of 0.001% but is time-consuming, requiring patient-specific reagents and assay conditions.4 Several authors have noted the need for more precise MRD detection to enable better MRD-based risk stratification.2,4 (MRD results were unavailable for a third of participants in the UKALL 2003 study, Dr. Pui noted.2,3)
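As a quick check on the sensitivity figures above, the sketch below converts a detection limit expressed as a percentage into the approximate number of nucleated cells that must be analyzed to contain, on average, one leukemic cell; the function is a hypothetical illustration of the arithmetic, not part of any assay software.

def cells_per_leukemic_cell(sensitivity_percent):
    """Approximate cells needed to contain one leukemic cell at the stated limit."""
    return round(100.0 / sensitivity_percent)

print(cells_per_leukemic_cell(0.01))   # flow cytometry: about 10,000 cells
print(cells_per_leukemic_cell(0.001))  # ASO-PCR: about 100,000 cells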
High-throughput deep sequencing to identify all leukemic gene rearrangements at diagnosis might allow more precise monitoring of clonal evolution and MRD during and after therapy, and it has compared favorably with PCR and flow cytometry for MRD detection.4