Abstract
Rationale The diffusing capacity of the lung for carbon monoxide corrected for haemoglobin (DLCOcor) measures gas movement across the alveolar–capillary interface. We hypothesised that DLCOcor is a sensitive measure of injurious allograft processes disrupting this interface.
Objectives To determine the prognostic significance of the DLCOcor trajectory on chronic lung allograft dysfunction (CLAD) and survival.
Methods A retrospective analysis was conducted of all bilateral lung transplant recipients at a single centre, between January 1998 and January 2018, with one or more DLCOcor measurements. Low baseline DLCOcor was defined as the failure to achieve a DLCOcor >75% predicted. Drops in DLCOcor were defined as declines ≥15% below the best previously achieved value.
Results 1259 out of 1492 lung transplant recipients were included. The median (IQR) time to peak DLCOcor was 354 (181–737) days and the mean±sd DLCOcor was 80.2±21.2% pred. Multivariable analysis demonstrated that low baseline DLCOcor was significantly associated with death (hazard ratio (HR) 1.68, 95% CI 1.27–2.20; p<0.001). Low baseline DLCOcor was not independently associated with CLAD after adjustment for low baseline forced expiratory volume in 1 s or forced vital capacity. Any DLCOcor decline ≥15% was significantly associated with death, independent of concurrent spirometric decline. Lower percentage predicted DLCOcor values at CLAD onset were associated with shorter post-CLAD survival (HR 0.75 per 10%-unit change, p<0.01).
Conclusion Low baseline DLCOcor and post-transplant declines in DLCOcor were significantly associated with survival, independent of spirometric measurements. We propose that DLCOcor testing may allow identification of a subphenotype of baseline and chronic allograft dysfunction not captured by spirometry. There may be benefit in routine monitoring of DLCOcor after lung transplantation to identify patients at risk of poor outcomes.
In a cohort spanning 20 years, the DLCO trajectory after lung transplantation is significantly associated with long-term outcomes including chronic lung allograft dysfunction and survival. https://bit.ly/3g3mvCk
Introduction
Graft survival after lung transplantation remains inferior to that demonstrated in other solid organ groups [1]. Longitudinal monitoring of allograft physiology after lung transplantation has been traditionally performed using the spirometric indices, forced expiratory volume in 1 s (FEV1) and forced vital capacity (FVC). Chronic lung allograft dysfunction (CLAD) is a major contributor to graft loss and is defined as the persistent and irreversible decline in FEV1 [2]. Loss of lung volume, based on longitudinal volumetric monitoring, is the sine qua non for the restrictive allograft syndrome (RAS), a phenotype with important clinical and prognostic implications [3, 4]. CLAD with gas trapping, based on an elevated residual volume to total lung capacity ratio, also predicts worse graft survival [5]. There is a paucity of research regarding the utility of alternative lung function tests in the definition of lung allograft dysfunction phenotypes and in prognostication.
The single-breath diffusing capacity of the lung for carbon monoxide (DLCO) measures the capacity of the lungs to exchange gas across the alveolar–capillary interface. The DLCOcor, corrected for haemoglobin (mL·min−1·mmHg−1), is the product of two simultaneous and separate measurements: the accessible alveolar volume and the rate constant for alveolar carbon monoxide uptake (KCO) [6–10].
Current knowledge regarding the observed and predicted DLCO measurements after lung transplant is based on early observational studies with sample sizes between six and 34 patients. Improvements in the post-operative percentage predicted DLCO values were observed in single lung transplantation for COPD and idiopathic pulmonary fibrosis [11–13]. Larger improvements in DLCO were observed after bilateral (in comparison with single) lung transplantation [14]. With regards to the DLCO trajectory, maximal achieved values occurred at 12 months after transplant and appeared to decline over time [13, 15]. Reductions in DLCO were observed during episodes of acute rejection and infection [16]. To the best of our knowledge, there have been no published studies to assess the effect of DLCO measurements after lung transplantation on CLAD and survival.
We hypothesised that DLCO is a sensitive measure of injurious allograft processes that can disrupt the alveolar–capillary interface and is a predictor of poor outcomes after lung transplantation. Our aims were to describe the trajectory of DLCO measurements after lung transplantation, and to determine the prognostic significance of the DLCO trajectory on CLAD and survival.
Methods
Subjects
A retrospective cohort analysis was conducted using a database of all lung transplant recipients at the Toronto General Hospital, encompassing 20 years between January 1998 and January 2018. Bilateral lung transplant recipients with one or more DLCO measurement(s) were included. Recipients transplanted at other centres were excluded. Single lung transplant patients were excluded from all outcome analyses. The study was approved by the institutional research ethics board (protocol number 15-9531-AE).
DLCO measurements
All study patients underwent measurement of DLCO as a 10-s single-breath-hold manoeuvre, reported as per American Thoracic Society guidelines [6, 17]. Daily checks for gas volumes, weekly syringe calibration and twice-monthly biological calibrations were performed for quality control.
Our hospital post-transplant lung function surveillance protocol includes DLCO, with static lung volumes, at 3, 6, 9, 12, 18 and 24 months after transplantation and yearly thereafter. Values were obtained on the following commercial equipment at the Toronto General Hospital: Sensor Medics Vmax in 1997–1999, Morgan MDAS in 1999–2004 and Medisoft ExpAir 1.32.03 in 2004–2018. All observed DLCO measurements were corrected for the nearest available haemoglobin value (DLCOcor). The median (interquartile range (IQR)) time between DLCOcor and closest haemoglobin measurement was 0 (2) days. The Global Lung Function Initiative reference values for DLCO were used to generate predicted values [18].
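The paper does not state which adjustment equation was applied; as an illustrative sketch (not the authors' code), the function below implements the widely used Cotes haemoglobin adjustment from the ATS/ERS single-breath DLCO standards, which leaves the observed value unchanged at a standard haemoglobin of 14.6 g/dL for adult men and 13.4 g/dL otherwise. The function name is hypothetical.

```python
def dlco_hb_corrected(dlco_observed: float, hb_g_dl: float, adult_male: bool) -> float:
    """Correct an observed DLCO to a standard haemoglobin concentration.

    Uses the Cotes adjustment from the ATS/ERS standards (haemoglobin in
    g/dL); the correction factor equals 1 at Hb 14.6 g/dL for adult men
    and at Hb 13.4 g/dL for women and children.
    """
    standard_term = 10.22 if adult_male else 9.38
    return dlco_observed * (standard_term + hb_g_dl) / (1.7 * hb_g_dl)

# An anaemic male recipient's observed DLCO is scaled up toward the value
# expected at a normal haemoglobin concentration.
corrected = dlco_hb_corrected(20.0, hb_g_dl=10.0, adult_male=True)  # ≈ 23.8
```

In this formulation, anaemia (which lowers the measured CO uptake) scales the observed value up, so serial DLCOcor values remain comparable despite fluctuating haemoglobin.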
Immunosuppression and CLAD treatment protocols
Refer to the supplementary material [19].
Clinical definitions
The baseline DLCOcor was defined as the single maximum DLCOcor value achieved after lung transplantation. Low baseline DLCOcor was defined as the failure to achieve a baseline DLCOcor >75% predicted. Drops in DLCOcor were defined as declines ≥15% below the best previously achieved value. Sustained declines were defined as irreversible reductions in DLCOcor.
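These definitions can be made concrete with a short sketch (illustrative only, not the study code): the baseline is the running best DLCOcor % pred, a drop is any value ≥15% below the best previously achieved, and a low baseline means the overall best never exceeds 75% predicted.

```python
def flag_dlco_events(values, low_baseline_cutoff=75.0, drop_fraction=0.15):
    """Flag study-defined DLCOcor events in a chronological series of
    % predicted values: returns (low_baseline, indices_of_drops)."""
    best_so_far = float("-inf")
    drops = []
    for i, v in enumerate(values):
        # A drop is a decline >= 15% below the best previously achieved value.
        if best_so_far > 0 and v <= best_so_far * (1 - drop_fraction):
            drops.append(i)
        best_so_far = max(best_so_far, v)
    # Low baseline: failure to ever achieve > 75% predicted.
    low_baseline = best_so_far <= low_baseline_cutoff
    return low_baseline, drops

low, drops = flag_dlco_events([55, 70, 82, 65, 80])
# → low is False (best 82 > 75); a drop at index 3 (65 is ≥15% below 82)
```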
Clinical outcomes
The primary outcome was the time from transplant to the onset of CLAD (see supplementary material for definitions of CLAD and CLAD phenotypes) [2, 20]. Secondary outcomes included all-cause graft survival from transplant to death or re-transplantation and post-CLAD survival from CLAD onset to death or re-transplantation.
Statistical analysis
Descriptive statistics were summarised by mean±sd or median (IQR) for continuous variables, and counts (%) for categorical variables. To assess for significant differences between groups, the Chi-squared test was used for categorical variables and the Wilcoxon rank-sum test for continuous variables. The DLCOcor % pred trajectories were visualised with spaghetti plots to identify clinically relevant patterns. A multivariable logistic regression model was used to determine the association between baseline peri-operative variables and low first DLCOcor, defined as below the median % pred and measured ≤4.5 months after transplant. Time-dependent multivariable Cox hazards models were used to determine the association between low baseline DLCOcor and both CLAD and graft survival. For this analysis, the start point was the time of transplant; patients were assumed not to be in the low baseline DLCOcor status until the first DLCO measurement, with status updated at each subsequent measurement. Variables of interest potentially associated with CLAD and survival outcomes after transplantation were established a priori and included recipient age, donor age, donor–recipient sex matching, native lung disease, cytomegalovirus serostatus matching and transplantation era [1]. The landmark approach was adopted to adjust for spirometric decline (of ≥12%) in the measurement of associations between low baseline DLCOcor and both CLAD and graft survival [21]. Time-dependent multivariable Cox hazards models were used to determine the association between any decline in DLCOcor and both CLAD and graft survival. Concurrent spirometric (FEV1 or FVC) decline of ≥12% at the time of each declined DLCOcor was included as a covariate [22]. For the declines analysis, all patients started in the "no decline" status and remained so until the second DLCO measurement, with status changing thereafter based on the measured values.
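The models were fitted in R; purely to illustrate the time-dependent coding described above, the hypothetical helper below lays one patient's measurements out in the standard counting-process (start, stop, covariate) format used for time-dependent Cox models, carrying "not low" status from transplant (t=0) until the first measurement and the most recent measured status thereafter.

```python
def to_counting_process(meas_times, low_flags, followup_end):
    """Build (start, stop, covariate) rows for a time-dependent Cox model.

    meas_times: increasing times (days post-transplant) of DLCO measurements.
    low_flags: whether each measurement met the low-baseline definition.
    followup_end: time of death/re-transplant/censoring.
    """
    rows = []
    prev_time, prev_flag = 0.0, False  # assumed not-low before first measurement
    for t, flag in zip(meas_times, low_flags):
        # The covariate value in (prev_time, t] is the last known status.
        rows.append((prev_time, t, prev_flag))
        prev_time, prev_flag = t, flag
    rows.append((prev_time, followup_end, prev_flag))
    return rows

rows = to_counting_process([90, 180], [False, True], followup_end=365)
# → [(0.0, 90, False), (90, 180, False), (180, 365, True)]
```

This mirrors the description above: status changes only when a new measurement is made, avoiding immortal-time bias from attributing a later status to earlier follow-up.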
A multivariable Cox proportional hazards model was used to determine the association between the DLCOcor at CLAD onset and post-CLAD survival. Model specifications are summarised in the supplementary material. Statistical analyses were performed using R version 3.4.3 and Prism version 9.1.0. Statistical significance was set at a two-sided level of 0.05.
Results
DLCOcor trajectory after transplantation
1723 patients underwent lung transplantation during the study period (figure 1). 231 patients were excluded because they were transplanted at another centre (n=6) or had no DLCOcor measurement recorded (n=225). Of the remaining 1492 patients, 1259 were bilateral lung transplants and 233 were single lung transplants. The majority of patients (77.2%) completed the DLCOcor surveillance as per protocol. The mean±sd baseline DLCOcor after single lung transplantation, at 61.3±16.5% pred, was lower than that after double lung transplantation, at 80.2±21.2% pred. The differences are summarised in supplementary table S1 and supplementary figure S1. Single lung transplant recipients were excluded from all subsequent analyses, which focused only on the 1259 double lung transplant recipients with a total of 9543 available DLCOcor measurements. The median (IQR) number of DLCOcor measurements per patient was 7 (4–10). The median (IQR) days between DLCOcor and closest haemoglobin measurement was 0 (0–2). Individual trajectories of DLCOcor % pred values, visualised in figure 2, demonstrated high inter-patient variability. The median (IQR) time to peak DLCOcor was 354 (181–737) days. For comparison, the median (IQR) time to peak FEV1 was 278 (180–713) days.
Peri-operative determinants of the first DLCOcor measurement
1074 patients had a first DLCOcor measured ≤4.5 months after transplant and the median first DLCOcor was 68.4% pred. In a logistic regression analysis, increasing donor age by 5 years (OR 1.04, 95% CI 1.03–1.05; p<0.01), Canadian listing status 3 at transplant admission (OR 1.15, 95% CI 1.03–1.28; p=0.01), post-transplant intensive care unit (ICU) length of stay (OR 1.01, 95% CI 1.01–1.02; p<0.001) and primary graft dysfunction grades 2–3 (OR 1.11, 95% CI 1.01–1.21; p=0.02) were significantly associated with an increased risk of low first DLCOcor (supplementary table S2). Higher A rejection scores, defined as the sum of all A grades divided by the number of available evaluable biopsies ≤4.5 months after transplant, were protective for low first DLCOcor (OR 0.92, 95% CI 0.86–0.97; p=0.004).
Baseline DLCOcor
500 (39.7%) patients had low baseline DLCOcor (<75% pred). There was a significant difference in the mean±sd baseline FEV1 (66.8±19.0% pred) in patients with low baseline DLCOcor, compared with 94.5±19.8% pred in those with normal baseline DLCOcor (p<0.01). Recipient and donor characteristics measured at the time of transplant are summarised in table 1, comparing patients with low baseline DLCOcor and those with normal baseline DLCOcor.
Clinical outcomes
The median (IQR) overall follow-up time was 4.3 (2.0–8.2) years post-transplant. 588 (46.7%) out of 1259 patients met the criteria for a diagnosis of CLAD and 615 (48.8%) out of 1259 had no CLAD. In 56 (4.4%) out of 1259 recipients, there was lung allograft dysfunction without CLAD and the alternative diagnosis in all of these cases was infection. 546 (43.4%) out of 1259 died within the study period. Causes of death, available in 482 (88.3%) out of 546 patients, included CLAD (43.2%), bacterial sepsis (20.5%) and malignancy (12.0%).
Association between low baseline DLCOcor and CLAD
In 116 recipients, a diagnosis of CLAD preceded the baseline DLCOcor measurement and these recipients were excluded from this analysis. Univariable analysis focusing on CLAD demonstrated that low baseline DLCOcor, modelled as a time-dependent variable, was significantly associated with a shorter time to CLAD (hazard ratio (HR) 1.25, 95% CI 1.06–1.48; p=0.008). Multivariable analysis demonstrated that low baseline DLCOcor was independently associated with a shorter time to CLAD (HR 1.29, 95% CI 1.08–1.55; p=0.005). The results are summarised in table 2. A landmark analysis was performed at 2 years post-transplant in order to adjust for baseline FEV1 or FVC up to that pre-specified time point. Low baseline DLCOcor at 2 years after transplant, adjusted for low baseline FEV1 or FVC, demonstrated a positive, albeit not statistically significant, association with CLAD (supplementary table S3a). For visual illustration, figure 3a demonstrates a significant difference in the Kaplan–Meier curves for CLAD-free survival for patients with low baseline DLCOcor compared to those with normal baseline DLCOcor. To further illustrate the prognostic significance of DLCOcor independent of spirometry, figure 3b shows the relative contribution of DLCOcor to CLAD-free survival in the context of low or normal FEV1. Even among patients with normal baseline FEV1, those with low baseline DLCOcor had reduced CLAD-free survival compared to those with normal baseline DLCOcor. These findings were similar in the context of low or normal FVC (supplementary figure S3a).
Association between low baseline DLCOcor and survival
In univariable analysis focusing on survival, low baseline DLCOcor, modelled as a time-dependent variable, was significantly associated with reduced survival (HR 3.26, 95% CI 2.64–4.01; p<0.001). In multivariable analysis, low baseline DLCOcor was independently associated with reduced survival (HR 3.42, 95% CI 2.76–4.25; p<0.001). The results are summarised in table 3. Landmark analysis of low baseline DLCOcor at 2 years after transplant showed that this association was independent of low baseline FEV1 or FVC (supplementary table S3a). Figure 3c demonstrates a significant difference in the Kaplan–Meier curves for graft survival for patients with low baseline DLCOcor compared to those with normal baseline DLCOcor. Figure 3d demonstrates the relative contribution of DLCOcor on graft survival in the context of low or normal baseline FEV1. Among patients with normal baseline FEV1, patients with low baseline DLCOcor had significantly worse survival than patients with normal baseline DLCOcor. Similar findings were observed in the context of low or normal baseline FVC (supplementary figure S3b).
Post-transplant declines in DLCOcor
With regards to the DLCOcor trajectory after the best achieved value, 372 (31.0%) patients maintained stable values within 10% of the best achieved. 829 (69.0%) patients demonstrated a sustained decline in DLCOcor of ≥10% from baseline. In 673 (56.0%) patients, the sustained decline was ≥15% below the best achieved value.
In 155 (12.9%) patients, a sustained decline in DLCOcor occurred prior to CLAD onset, with a median (IQR) lead time of 149 (73.8–307.5) days. In univariable analysis, any decline in DLCOcor was significantly associated with CLAD (HR 1.39, 95% CI 1.14–1.69; p=0.001). In multivariable analysis, after adjustment for concurrent FEV1 or FVC decline, there was no significant association with CLAD (HR 1.06, 95% CI 0.84–1.34; p=0.63) (table 4). In both univariable and multivariable analyses, any decline in DLCOcor was independently associated with death (HR 2.49, 95% CI 1.97–3.15; p<0.001), even after adjustment for concurrent FEV1 or FVC decline (table 4).
DLCOcor at CLAD onset and post-CLAD survival
149 (12.4%) out of 1201 patients had a DLCOcor measured within 30 days before or after CLAD onset. The median (IQR) DLCOcor at CLAD onset was 63.9 (50.4–77.2)% pred, significantly lower than the baseline value of 80.8 (69.8–90.4)% pred (p<0.01). In univariable analysis, the percentage predicted DLCOcor at CLAD onset was significantly associated with post-CLAD survival (HR 0.78 per 10%-unit increase, 95% CI 0.69–0.89; p<0.01). In multivariable analysis, adjusting for concurrent percentage predicted FEV1 or FVC, lower DLCOcor % pred values at CLAD onset remained independently associated with shorter post-CLAD survival (HR 0.75 per 10%-unit change, 95% CI 0.66–0.86; p<0.01) (table 5). Figure 4a demonstrates a significant difference in the Kaplan–Meier post-CLAD survival curves, based on DLCOcor % pred values at CLAD onset, dichotomised by a median cut-off.
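Because a Cox hazard ratio is multiplicative on the hazard scale, the reported per-10%-unit effect can be rescaled to other contrasts; the arithmetic below is purely illustrative of how to read the estimate.

```python
# HR 0.75 per 10%-unit increase in DLCOcor % pred at CLAD onset.
hr_per_10 = 0.75

# A 20%-unit higher DLCOcor corresponds to the square of the per-10 HR,
# and a 10%-unit *lower* DLCOcor to its reciprocal (an increased hazard).
hr_per_20_higher = hr_per_10 ** 2   # ≈ 0.56
hr_per_10_lower = 1 / hr_per_10     # ≈ 1.33
```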
DLCOcor trajectories in CLAD phenotypes
Complete CLAD phenotyping with DLCOcor data was available in 174 patients. This included bronchiolitis obliterans syndrome (BOS) in 104 (60%) recipients, RAS in 16 (9%), mixed CLAD in nine (5%), undefined in 14 (8%) and 31 remained unclassified (18%). There was a clinically significant difference in the post-transplant DLCOcor % pred trajectory based on CLAD phenotype, with RAS, mixed and undefined showing greater decline than BOS and unclassified patterns (figure 4b). DLCOcor measured at CLAD onset (±30 days) was available in 40 patients with complete CLAD phenotyping and was significantly lower in RAS and undefined compared to BOS (p<0.01) (supplementary figure S2) [20].
Discussion
Our results, from a large retrospective cohort of lung transplant recipients spanning a 20-year period, demonstrate a high inter-patient variability in the DLCOcor trajectory after transplant. The post-transplant baseline diffusing capacity and any declines in DLCOcor were associated with CLAD; however, this was not independent of the spirometric trajectory. The post-transplant baseline diffusing capacity, any declines in DLCOcor and the percentage predicted DLCOcor were independently associated with graft survival, importantly after adjustment for the spirometric trajectory.
Prior knowledge regarding the DLCOcor after lung transplant was largely based on small, observational studies. Early studies demonstrated that a maximal achieved DLCO occurred at 12 months after single lung transplantation [13, 15]. In our study we observed a median time to a best achieved DLCOcor of 354 days after double lung transplant, longer than the median time to maximal FEV1. To explain this period of physiological maturation, we hypothesise that prolonged recovery after implantation ischaemia–reperfusion, and/or early alloreactive injuries may also improve the KCO over this time period [23]. Recovery from surgical trauma, thoracic pain and diaphragm weakness may improve the accessible alveolar volume during the first year.
Early studies demonstrated a significantly greater improvement in the observed DLCO after double compared with single lung transplantation [14]. Our study confirms that higher baseline DLCOcor % pred values are obtained after double lung transplant (80.2%) than after single lung transplant (61.3%) [14]. This may be explained by the only partial correction in accessible alveolar volume after single lung transplant. Of interest, in our study only 16.1% of recipients achieved ≥100% pred diffusing capacity after transplant. 39.7% of recipients never achieved a normal (>75%) baseline diffusing capacity. This finding is consistent with studies examining the spirometric trajectory after lung transplant [24]. There are probable alloimmune and non-alloimmune injurious processes which may never allow the allograft to recover fully after lung transplant.
We examined peri-operative variables associated with low first DLCOcor. The risk associated with increasing donor age may be explained by previous work demonstrating age-related declines in diffusion capacity independent of alveolar volume, suggesting alterations of the alveolar–capillary membrane [25]. Urgency listing status at transplant admission and post-transplant ICU length of stay are both markers of physiological vulnerability to allograft injury in the peri-operative period. High-grade primary graft dysfunction and need for post-transplant extra-corporeal membrane oxygenation probably reflect ischaemia–reperfusion injury of the transplanted lungs. Our findings are in keeping with previous work examining functional outcomes using cardiopulmonary exercise testing after lung transplant. In that study, lung volumes and diffusion capacity were significantly lower for recipients with grade 3 primary graft dysfunction within 72 h compared to those without [26]. We hypothesise that ischaemia–reperfusion may cause persistent disruption of the alveolar–capillary interface affecting gas transfer, and ventilatory restriction affecting the accessible alveolar volume. Unexpectedly, higher A rejection scores were protective for low first DLCOcor. We hypothesised that recipients undergoing early transbronchial biopsy surveillance may select for a cohort with higher earlier DLCOcor values. While the association remained significant, there was a significant change in the odds ratio for low DLCOcor after adjustment for the number of biopsies the recipient underwent in the first 4.5 months. We believe that this result is hypothesis-generating. The association between DLCOcor and acute rejection deserves further attention with an appropriately designed study.
A key finding in our study is that recipients with low baseline DLCOcor are at increased risk of CLAD and early graft loss. This indicates a “horse-racing” effect, the concept that low baseline values predict future low lung function and reduced graft longevity [27, 28]. Increased physiological vulnerability to cumulative post-transplant injurious processes may explain these associations. Baseline lung allograft dysfunction, defined as the failure to achieve both FEV1 and FVC ≥80% pred after transplant, has previously been shown to be a dynamic risk state associated with reduced graft survival [24]. Importantly, the DLCOcor effect on survival that we see in our study is independent of FEV1 and FVC. We demonstrate that recipients with low baseline DLCOcor and normal baseline FEV1 showed significantly worse overall graft survival than patients with normal baseline DLCOcor and normal baseline FEV1 and also worse compared to patients with normal baseline DLCOcor and low baseline FEV1. Results were similar when analysing FVC in a similar fashion. This underscores the potential importance of incorporating the baseline percentage predicted DLCOcor value, as an additional metric to FEV1 and FVC, for physiological phenotyping and prognostication after lung transplant.
The majority of recipients (56.0%) demonstrated a sustained decline of ≥15% in DLCOcor in relation to baseline values at some point during their post-transplant trajectory. The absence of an independent association between any DLCOcor decline and subsequent CLAD is of interest. A wide range of pathological processes may cause a reduction in either the accessible alveolar volume or gas transfer. Some of these phenomena may represent non-CLAD causes of allograft dysfunction, or intercurrent pathological processes in patients with CLAD, underscoring the importance of clinical interpretation. We argue that DLCOcor decline and spirometric decline appear to be related metrics of CLAD. In a cross-sectional study, the diffusing capacity for nitric oxide allowed for early detection of BOS, and this requires further longitudinal prospective evaluation [29]. Any DLCOcor decline was independently associated with reduced survival. DLCOcor decline likely represents increased physiological vulnerability to biological processes leading to reduced patient survival.
The percentage predicted DLCOcor at CLAD-onset predicted post-CLAD survival, independently of concurrent spirometric decline. Significant injury of the alveolar–capillary interface, affecting the transfer factor, is the likely explanation for this finding, independent of the accessible alveolar volume. Thus, we argue that the diffusing capacity may provide additional information to assist clinicians in prognostication of recipients with a diagnosis of CLAD.
There was a clinically significant difference in the DLCOcor trajectories based on established CLAD phenotypes, with lower values in patients with restriction and fibrosis-like opacities on radiology. Further work is required to define the early inciting injurious allograft processes affecting the accessible alveolar volume and alveolar–capillary interface leading to CLAD and the fibrotic CLAD phenotypes. Whether DLCOcor may be helpful in subphenotyping of CLAD should be explored in future studies with larger numbers of patients with relevant measurements.
Our study spans a long period of time and encompasses multiple transplant eras. During this time there have been significant alterations to transplant practice including immunosuppression, use of ex vivo lung perfusion and clinical phenotyping. We observed a greater proportion of patients with low baseline DLCOcor in the latter era. However, transplant era was included as a covariate in all of our multivariable models and did not act as a significant confounder to the associations presented in this report.
There are limitations to the results presented in our study. Our study population is biased towards a healthier cohort, namely those patients who could perform one or more DLCOcor measurements. Technical limitations with DLCOcor measurements in recipients with advanced allograft dysfunction may have limited the number of measurements in the lowest range. We noted high inter- and intra-patient variability in diffusion capacity measurements. High inter-session variability in the measurement of DLCOcor in healthy individuals, which is dependent upon the baseline DLCOcor and method of testing used, has been demonstrated previously [30]. The use of three different commercial systems over the 20-year period may also contribute to variability in observed values. Quality control was consistent over the 20-year period, including daily checks for gas volumes, twice-monthly biological calibrations and monthly analyser linearity checks for primary standard gases. Given the consistency in quality control over the past 20 years, we believe that our results were consistent and comparable. Further work is required to elucidate the inter-session variability in DLCOcor after lung transplantation.
Conclusion
The post-transplant baseline diffusing capacity and any declines in DLCOcor were associated with CLAD; however, this was not independent of the spirometric trajectory. The post-transplant baseline diffusing capacity, any declines in DLCOcor and the percentage predicted DLCOcor were independently associated with graft survival, importantly after adjustment for the spirometric trajectory. Routine monitoring of DLCOcor after lung transplantation may improve the identification of patients at risk of poor outcomes.
Supplementary material
Supplementary Material
Please note: supplementary material is not edited by the Editorial Office, and is uploaded as it has been supplied by the author.
Supplementary file 1. Supplementary methods. ERJ-03639-2020.Supplement_1
Supplementary file 2. Supplementary tables S1 and S2. ERJ-03639-2020.Supplement_2
Supplementary file 3. Supplementary figures S1 and S2. ERJ-03639-2020.Supplement_3
Supplementary file 4. Supplementary figure S3. ERJ-03639-2020.Supplement_4
Supplementary file 5. Supplementary table S3. ERJ-03639-2020.Supplement_5
Shareable PDF
This one-page PDF can be shared freely online.
Shareable PDF ERJ-03639-2020.Shareable
Acknowledgement
The authors thank the Toronto General Hospital Pulmonary Function Lab staff (Toronto, Canada) for their contributions towards data collection, including Henry Furlott and Lauren Day.
Footnotes
This article has supplementary material available from erj.ersjournals.com
Author contributions: All authors made substantial contributions to the conception and design of the work; the acquisition, analysis and interpretation of data for the work; drafting and revision for intellectual content and final approval of the version for publication. All authors agree to be accountable for all aspects of the work in ensuring that questions related to accuracy or integrity are appropriately investigated and resolved.
Conflict of interest: D.R. Darley has nothing to disclose.
Conflict of interest: J. Ma has nothing to disclose.
Conflict of interest: E. Huszti has nothing to disclose.
Conflict of interest: R. Ghany has nothing to disclose.
Conflict of interest: M. Hutcheon has nothing to disclose.
Conflict of interest: C-W. Chow has nothing to disclose.
Conflict of interest: J. Tikkanen has nothing to disclose.
Conflict of interest: S. Keshavjee has nothing to disclose.
Conflict of interest: L.G. Singer has nothing to disclose.
Conflict of interest: T. Martinu has nothing to disclose.
Support statement: D.R. Darley is a recipient of a St Vincent's Clinic Foundation Travelling Scholarship.
- Received September 26, 2020.
- Accepted June 3, 2021.
- Copyright ©The authors 2022. For reproduction rights and permissions contact permissions{at}ersnet.org