Individual radiosensitivity testing in 2025: Current advances and future directions for personalized radiotherapy

Radiation therapy (RT) is a cornerstone of modern oncology, with approximately 60% of cancer patients receiving RT during their treatment course (Borras et al., 2015). Technological innovations, such as intensity-modulated RT (IMRT) and proton therapy, have significantly improved dose conformity and tumor targeting. However, these advances do not address a fundamental biological challenge: interindividual variability in radiosensitivity (RS). While most patients tolerate RT within expected toxicity thresholds, 5–15% develop severe, sometimes debilitating side effects, including fibrosis, cardiovascular damage, and secondary malignancies (Domina et al., 2018, Foray et al., 2016). Conversely, other patients harbor radioresistant tumors that may require dose escalation to achieve effective tumor control. The inability to accurately predict these responses remains a major obstacle to truly personalized RT.

RS is determined by a complex interplay of genetic, molecular, and immune factors. Early studies on rare genetic syndromes, such as Ataxia Telangiectasia (AT) and Nijmegen Breakage Syndrome (NBS), provided critical insights into the role of DNA repair defects in extreme radiation hypersensitivity (Rothblum-Oviatt et al., 2016, Bakhshi et al., 2003). However, these conditions account for only a minority of cases. In the general population, common single-nucleotide polymorphisms (SNPs) in genes such as ATM, TGFB1, and TXNRD2 have been associated with radiation toxicity risk, but their clinical utility remains limited by polygenic complexity and inconsistent validation (International Radiogenomics Consortium (RgC), 2016, Talbot et al., 2012, Edvardsen et al., 2013).

Beyond genetics, functional assays have been developed to assess RS directly. The Radiation-Induced Lymphocyte Apoptosis (RILA) assay, which measures apoptosis levels in CD8+ T-lymphocytes after ex vivo irradiation, has emerged as one of the most promising predictors of late radiation toxicity (Ozsahin et al., 2005). Prospective studies across multiple cancer types have demonstrated a strong correlation between high RILA scores and reduced rates of fibrosis and other late effects (Ozsahin et al., 2005, Azria et al., 2010, Azria et al., 2015). Other functional approaches, including fibroblast clonogenic survival assays and γ-H2AX foci quantification, offer complementary perspectives on RS but suffer from methodological limitations, such as the need for tissue biopsies or prolonged culture times (COPERNIC project investigators et al., 2016, Deschavanne and Fertil, 1996).

The advent of multi-omics technologies—genomics, transcriptomics, proteomics, and metabolomics—has further expanded our ability to characterize RS. Integrative analyses have identified radiation-specific gene expression signatures, such as a 12-gene panel predictive of RS in head and neck squamous cell carcinoma (Liu et al., 2020). Proteomic studies have highlighted oxidative stress response pathways as potential modulators of RS, opening avenues for therapeutic intervention (Oike et al., 2025). However, the translation of these discoveries into clinical practice is hindered by the need for large-scale validation and the complexity of integrating multi-omics data into routine decision-making.

An emerging but underexplored determinant of RS is the immune system’s role in modulating radiation responses. Recent evidence suggests that immune senescence—characterized by an accumulation of dysfunctional T-cells with pro-inflammatory properties—may be a key driver of late radiation toxicity (Nguyen et al., 2020, Paun et al., 2017). Senescent immune cells, particularly pro-inflammatory IL-17-producing helper T-cells (Th17 CD4+ T-lymphocytes), have been implicated in the development of radiation-induced fibrosis and chronic inflammation (Nguyen et al., 2020). These findings highlight the need to move beyond purely genetic and cellular markers toward a more integrated view of RS that includes immune dynamics.

This review provides a comprehensive synthesis of current RS testing strategies, critically evaluating their biological basis, clinical relevance, and translational potential. By addressing key gaps—such as the role of immune responses and the standardization of predictive assays—we aim to outline a path toward biomarker-driven RT personalization. The ultimate goal is to move from a one-size-fits-all paradigm to individualized radiation strategies that minimize toxicity while maximizing therapeutic efficacy.

We refer to acute/early toxicity as events occurring during RT or within three months of the start of treatment, and to late toxicity as effects that manifest months to years after treatment. Acute reactions are typically driven by epithelial injury and inflammatory cascades and are often reversible; late effects reflect fibro-atrophic remodeling, vascular damage, and chronic inflammation and tend to be more persistent. This distinction matters for biomarker interpretation, because some assays preferentially capture mechanisms of late injury: the RILA assay, for example, has been validated primarily for late endpoints (Ozsahin et al., 2005, Azria et al., 2010, Azria et al., 2015). Throughout this review, an RS test denotes a clinically oriented procedure that yields an individual-level risk estimate for a predefined endpoint, whereas a predictive biomarker is a measurable biological feature quantified by an assay.
