Abstract
TDM with dried blood spots can simply and simultaneously monitor treatment in patients with MDR-TB and HIV co-infection http://ow.ly/TKIo300mwIY
To the Editor:
HIV and tuberculosis (TB) are among the leading causes of death due to an infectious disease worldwide [1]. HIV co-infection was present in 12% of all TB cases in 2014; the African region accounted for 74% of those [1]. It is clear that both infections must be addressed simultaneously. Multidrug resistance (MDR) and extensive drug resistance (XDR) further aggravate the problem [2]. The risk of a poor MDR-TB treatment outcome is up to 10-fold higher among HIV co-infected individuals [3]. The pill burden for patients with MDR-TB and HIV co-infection is high, consisting of a combination of at least six to 10 different drugs [2]. Therefore, drug–drug interactions, adverse drug reactions and/or suboptimal plasma concentrations are common [3]. Ultimately, the goal is to tailor both TB and HIV treatment (precision treatment) to optimise outcome and reduce adverse drug events in a way that is cost-effective and feasible worldwide. Of note, in nonaffluent settings with high disease co-endemicity, limited resources demand a minimalist yet effective approach. What is needed are tools to determine the concentrations of all drugs, and to assess treatment response by means of HIV viral load, and sputum smear and culture.
We postulate that therapeutic drug monitoring (TDM), the determination of plasma concentrations of drugs, could be of great value in optimising the management of complex medication schemes for the treatment of MDR-TB with HIV co-infection [4]. TDM has not yet been included in routine care of HIV- or TB-infected patients, but is recommended in the case of MDR-TB with HIV co-infection [5]. For TDM, plasma concentrations are measured in specialised, centralised laboratories; this poses logistical and financial problems, especially in resource-limited areas. In addition, TDM using venipuncture for all drugs used in the treatment of MDR-TB with HIV co-infection would be a burden for the patient and the healthcare system. We propose a patient-friendly method, called dried blood spot (DBS) sampling, that can help resolve this problem. DBS uses a drop of blood on a filter paper, collected by finger prick, for the analysis of drug concentrations [4]. DBS has several advantages over venous sampling [4]. An attractive feature in the setting of infectious diseases is the minimised biohazard risk due to the dried paper matrix. Important for TDM is the increased sample stability of DBS, which eliminates cold-chain transport and thereby reduces costs and logistical problems, especially in resource-limited areas with high humidity and temperature [4]. Here, we summarise all aspects needed for guiding and monitoring treatment of MDR-TB/HIV patients, and assess whether the different aspects of the treatment of MDR-TB with HIV co-infection could be covered by DBS.
For patients with MDR-TB, antiretroviral treatment (ART) should be initiated within 2–12 weeks after initiation of MDR-TB treatment, depending on the CD4+ count and clinical condition; in those with CD4+ counts <50 cells per mm³, early (<2 weeks) treatment initiation was shown to reduce mortality in HIV patients by one-third [5]. At the start of ART, subject to feasibility in resource-poor settings, genotyping for resistance testing ought to be performed, as well as clinical monitoring of serum sodium, potassium, bicarbonate, chloride, blood urea nitrogen, alanine aminotransferase, aspartate aminotransferase, total bilirubin and creatinine, and haemoglobin, white cell count and CD4+ count by venous sampling [5]. Patients are re-assessed clinically 1 month after a change of treatment and subsequently at 6 months [5]. However, serum creatinine and potassium need to be measured every 1–3 weeks for MDR-TB patients with HIV co-infection [6]. HIV viral load testing is recommended 2–8 weeks after the start of ART or a change of treatment, and is repeated every 3–4 months until the viral load is <200 copies per mL, and every 6 months thereafter [5]. CD4+ count should be measured 3 months after initiation of ART, and subsequently every 3–6 months in the first 2 years of ART [5].
According to previous studies, TDM is best performed 2 weeks after the start of treatment, after a dose change, or after a new drug is added when there is a possibility of a drug–drug interaction between antiretroviral (ARV) and MDR-TB drugs [7, 8]. For assessment of treatment efficacy, sputum smear and culture are performed every 1–2 months [6].
Guidelines for ART and for programmatic management of MDR-TB suggest certain laboratory testing schedules. TDM could be aligned with these time-points during treatment to increase feasibility and to reduce the burden for the patient [7]. Based on the available guidelines, we constructed a schedule for TDM in patients with MDR-TB with HIV co-infection that is optimal and least time-consuming (figure 1).
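As a rough illustration of how such an alignment can be worked out, the guideline intervals described above can be projected onto a common calendar to find visits at which tests coincide. This is only a sketch: the midpoint intervals chosen, the 24-week horizon and all names below are illustrative assumptions, not part of any guideline.

```python
# Sketch: align TDM sampling with recurring guideline-based monitoring
# time-points. Intervals are illustrative midpoints of the ranges cited
# in the text; the 24-week horizon is an arbitrary assumption.

HORIZON_WEEKS = 24

def schedule(start_week, interval_weeks, horizon=HORIZON_WEEKS):
    """Weeks at which a recurring test falls due within the horizon."""
    return list(range(start_week, horizon + 1, interval_weeks))

tests = {
    "creatinine/potassium": schedule(2, 2),   # every 1-3 weeks
    "sputum smear/culture": schedule(4, 6),   # every 1-2 months
    "HIV viral load":       schedule(4, 14),  # 2-8 weeks post-ART, then 3-4-monthly
    "CD4+ count":           schedule(12, 18), # 3 months, then 3-6-monthly
    "TDM (DBS)":            [2],              # 2 weeks after start or dose change
}

# Invert the schedules: which tests coincide at each week, i.e. which
# visits could be combined to reduce the burden for the patient.
by_week = {}
for name, weeks in tests.items():
    for w in weeks:
        by_week.setdefault(w, []).append(name)

for week in sorted(by_week):
    print(f"week {week:2d}: {', '.join(sorted(by_week[week]))}")
```

Running this shows, for example, that a TDM sample at week 2 coincides with the creatinine/potassium check, so no extra visit would be needed for it.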
TDM could be performed using DBS by a single assay combining HIV viral load quantification and pharmacokinetic assessment [9–11]. Several assays describing the simultaneous determination of plasma concentrations of ARV drugs in DBS are available [9]. For second-line anti-TB drugs, this also seems feasible using similar analytical procedures [11, 12].
Serum creatinine has also been shown to be quantifiable using DBS [13]; nevertheless, clinical monitoring through venipuncture is performed routinely in less specialised, noncentral laboratories and could therefore still best be performed by venous sampling. This is in contrast to TDM, viral load assessment and CD4+ count, which need to be determined in central laboratories and which are difficult to perform through venous sampling for the aforementioned reasons. However, CD4+ count measurement by DBS has not yet been fully developed, although initial steps have been made in that direction [10].
To further tailor treatment to the individual patient, genotypic resistance testing needs to be performed at the start of treatment. Genotypic resistance testing for several HIV-1 quasispecies has been validated for DBS in multiple studies [10]. In addition, human leukocyte antigen genotyping through whole-genome sequencing of DBS-derived samples has already been performed. The use of whole-genome sequencing opens up perspectives for future genetic studies from DBS; for instance, studies into Mycobacterium tuberculosis mutations conferring resistance to second-line anti-TB drugs and thus resulting in XDR-TB [14].
Additionally, pharmacogenetic testing to identify enzyme polymorphisms can also be performed with DBS. DBS has been shown to have several advantages over current DNA isolation kits: it allows DNA to be isolated in fewer steps and in less time, and reduces the costs of transport and DNA isolation. Identifying enzyme polymorphisms, such as cytochrome P450 polymorphisms, can help individualise treatment, because these enzymes are involved in several drug–drug interactions and can therefore predict possible drug toxicity or inefficacy [3, 15].
The next step will be to evaluate this approach to further enhance treatment of MDR-TB/HIV co-infection. We showed that DBS can be used for different aspects of monitoring, such as measuring drug concentrations and genotypic resistance testing of HIV, for use in TDM [9, 11, 13, 14]. TDM could be a tool to decrease toxicity, increase efficacy and detect resistance early, thereby limiting further resistance development in a potentially cost-effective way, by avoiding hospitalisation due to toxicity or drug–drug interactions. Further implementation of DBS in TDM needs to be examined. In addition, outcomes of this proposed monitoring need to be compared with outcomes of current care, by means of a randomised controlled trial or operational research, to determine the added value of TDM using DBS.
Footnotes
Conflict of interest: None declared.
- Received March 23, 2016.
- Accepted April 28, 2016.
- Copyright ©ERS 2016