4. Electronic Theses and Dissertations (ETDs) - Faculties submissions

Permanent URI for this community: https://hdl.handle.net/10539/37773


Search Results

Now showing 1 - 10 of 654
  • Item
    Prevalence and factors of HIV associated Oral Kaposi Sarcoma at Wits Oral Health Centre, Johannesburg
    (University of the Witwatersrand, Johannesburg, 2024-07) Chetty, Hasita; Padayachee, Sankeshan; Mafojane, Tumane
Introduction: HIV-associated Oral Kaposi Sarcoma (OKS) is a neoplasm predominantly occurring in immunocompromised patients. Therefore, it is often observed in the HIV-positive and AIDS population (Moore & Chang, 2003). KS is caused by the Kaposi Sarcoma Herpes Virus (KSHV), or Human Herpes Virus-8 (HHV8) (Naidoo et al., 2016). The risk of acquiring KS increases in the presence of HIV infection and immunosuppression (Kamulegeya & Kalyanyama, 2008). HIV-KS can occur at any stage of HIV infection but is more prevalent at the stage of AIDS or severe immune impairment (Khammissa et al., 2012). Both HHV8 infection and HIV/AIDS are highly prevalent in Africa (Kamulegeya & Kalyanyama, 2008). There is a lack of current evidence documenting the relationship between HIV/AIDS and OKS since the implementation of ART; this study therefore intends to augment the existing literature. The study aimed to determine the prevalence and factors of HIV-associated OKS in histopathologically diagnosed patients attending the Wits Oral Health Centre, Johannesburg, between 2008 and 2018. This period predates and postdates the roll-out of ART in the South African public health sector in 2012 and speaks to the relationship between ART and the occurrence of HIV-associated OKS. Materials and Methods: This was a cross-sectional study using records from the Wits Oral Health Centre (WOHC) and the National Health Laboratory Service (NHLS). The study period ran from January 2008 to December 2018. Prevalence of OKS was calculated based on patients with a positive histopathological diagnosis of OKS within the study period from the NHLS database. Ethical Considerations: Permission was requested from the Academic Affairs and Research Management System (AARMS) of the National Health Laboratory Service (NHLS) and from WOHC to access patient files. Records of patients with a positive OKS histopathological diagnosis were noted. They were allocated a study number and patient number on a data sheet. 
The corresponding patients’ files were accessed from WOHC to obtain further information (as per the data sheet). Patient confidentiality was maintained, as no names were recorded and files were allocated a number for cross-referencing between NHLS and WOHC data. Ethics clearance was obtained from the Wits Human Research Ethics Committee to carry out this study. Results: The prevalence of OKS in this population (137679 patients seen at WOHC) was 0.017432% (n=24). More females presented with OKS than males. The mean age at presentation was 39.11 years (SD 12.459). There was a significant relationship between high viral loads and a low CD4 count. The palate was the site most frequently biopsied in diagnosed OKS cases. Conclusion: The prevalence of OKS over the study period was very low. The mean age of OKS presentation was 39.21 years. More females presented with HIV-associated OKS than males, and the palate was the site from which most biopsy samples were taken in OKS diagnoses. There is a significant relationship between high viral loads and low CD4 cell counts. This study suggests that a possible reason for the low number of OKS cases could be the administration of ART by state institutions in South Africa; this can be further investigated to establish the effect of ART on OKS.
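The period-prevalence figure reported in this abstract is a simple proportion of confirmed cases over the source population. A minimal sketch of that arithmetic, using the counts stated in the abstract:

```python
# Period prevalence of oral Kaposi sarcoma (OKS), expressed as a percentage
# of all patients seen at the Wits Oral Health Centre over the study period.
# The counts come from the abstract; the calculation itself is generic.

def period_prevalence_pct(cases: int, population: int) -> float:
    """Prevalence as a percentage of the source population."""
    return 100.0 * cases / population

oks_cases = 24            # histopathologically confirmed OKS diagnoses
wohc_patients = 137_679   # patients seen at WOHC over the study period

print(round(period_prevalence_pct(oks_cases, wohc_patients), 6))  # ≈ 0.017432
```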
  • Item
    Occupational exposure to chemicals, and health outcomes, among nail technicians in Johannesburg, South Africa
    (University of the Witwatersrand, Johannesburg, 2023-08) Keretetse, Goitsemang; Brouwer, Derk H.; Nelson, Gill
Introduction: Nail technicians are exposed to chemicals emitted from activities performed in nail salons, including simple buffing of nails, basic manicures and pedicures, application of nail polish, and the application and sculpting of artificial nails. The various products used during these processes may contain volatile organic compounds (VOCs), which pose a health risk to both the nail technicians and their clients. Associated health effects include skin, eye, and respiratory irritation, neurologic effects, reproductive effects, and cancer. The aim of this study was to assess occupational exposures to chemicals, and associated health effects, within the formal and informal sectors in Johannesburg, South Africa. In this study, informal nail technicians are defined as those working in nail salons that are not licensed or registered with any formal enterprise or establishment, or working in their own capacity. The objectives were 1) to estimate the prevalence of self-reported symptoms associated with the use of nail products, 2) to measure exposures to chemicals in nail products used in the formal and informal nail salons, 3) to investigate the feasibility and reliability of self-assessment of exposure as a method of estimating exposure to chemicals, and 4) to investigate the association between respiratory symptoms (chronic and acute) and chemical exposures in both formal and informal nail technicians. Methods: This was a cross-sectional study. A questionnaire, adapted from other studies, was piloted before being administered to the participating nail technicians. Data were collected from 54 formal and 60 informal nail technicians, regarding sociodemographic characteristics, perceptions of working with nail products, and self-reported symptoms of associated health effects. A subset of 20 formal and 20 informal nail technicians was selected by convenience sampling from the 114 participants for the exposure assessment phase. 
The two groups were further divided into two groups of 10 for the controlled/expert assessment of exposure (CAE) and the self-assessment of exposure (SAE). Personal 8-hr exposure measurements were performed using VOC and formaldehyde passive samplers attached in the participants’ breathing zones over three consecutive days. For the SAE approach, participants conducted their own exposure measurements, while the CAE approach was fully conducted by the principal researcher. Task-based measurements were carried out using a photoionization detector (PID) to measure peak concentrations during specific nail application activities. A probabilistic risk assessment was conducted to estimate the carcinogenic and non-carcinogenic lifetime risks from exposure to VOCs. Chemical analysis was conducted by a SANAS-accredited laboratory. After correcting for their respective evaporation rates, relative to the evaporation rate of d-limonene (the VOC with the lowest evaporation rate), the adjusted total VOC (TVOC) concentrations were calculated using the 13 VOCs that were detected at a frequency of 30% or more. VOC concentration data below the limit of detection (LoD) were imputed using the regression on order statistics (robust ROS) approach. The self-reported symptoms were categorised into neurological effects, respiratory effects, eye irritation, and skin irritation. The ACGIH additive effects formula was used to calculate the combined respiratory effect of selected VOCs. Different statistical tools were used to analyse the data for each objective. Results: Formal and informal nail technicians used different nail products, performed different nail applications, serviced different mean numbers of clients, and were exposed to different concentrations of selected VOCs. 
Acetone concentrations were higher in formal nail salons, due to the soak-off method used for removing existing nail applications, while methyl methacrylate (MMA) concentrations were higher in informal nail salons, related to acrylic methods being used more frequently in the informal than the formal nail salons. All VOC concentrations were below their respective occupational exposure limits, with the exception of formaldehyde (0.21 mg/m3). TVOC levels were higher in formal nail salons, due to the bystander effect from multiple nail technicians performing nail applications simultaneously. Sixty percent of the informal nail technicians reported health-related symptoms, compared to 52% of the formal nail technicians, and informal male nail technicians reported more symptoms than their female counterparts. All nail technicians' median and 95th percentile non-cancer risks exceeded the acceptable risk of 1 for xylene, 2-propanol, and benzene, while the cancer risk estimates (medians and 95th percentiles) for benzene and formaldehyde exceeded the US EPA cancer risk threshold of 1 x 10^-6. Conclusion: This is the first study to assess exposures to VOCs in the often-overlooked informal sector and compare these exposures with those in the formal sector of the nail industry. Personal breathing zone concentration data for nail salon workers were generated in this study, including the informal sector, which is always challenging to access for research. Although banned in many countries, MMA is still used in South Africa in the informal nail sector. The SAE study showed that participatory research is feasible and enables a more reliable estimate of the exposure by expanding the amount of data. Using a combination of shift and task-based measurements was particularly effective in creating exposure profiles of employees and identifying activities that require targeted interventions. 
There is a need for the nail industry, especially the informal salons, to be more closely regulated, concerning the hazardous chemicals frequently encountered in nail products. Nail salons should reduce exposure frequency by regulating working hours, making informed decisions regarding the procurement of nail products, and adopting safe work practices to reduce emissions from harmful chemicals and thus exposure among nail salon workers and their clients.
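The ACGIH additive effects formula mentioned in this abstract treats co-occurring agents acting on the same target organ as one mixture: each measured concentration is divided by its exposure limit and the quotients are summed, with a sum above 1 indicating that the combined exposure exceeds the mixture limit. A minimal sketch, with illustrative concentrations and limits rather than the study's data:

```python
# ACGIH additive effects (mixture) formula for agents with similar
# toxicological effects: E = sum(C_i / OEL_i). E > 1 means the combined
# exposure exceeds the mixture limit even if each agent is individually
# below its own OEL. The agents, concentrations, and OELs below are
# hypothetical placeholders, not values from the thesis.

def mixture_index(exposures: dict[str, tuple[float, float]]) -> float:
    """exposures maps agent -> (measured concentration, OEL), same units per agent."""
    return sum(conc / oel for conc, oel in exposures.values())

illustrative = {
    "acetone":       (120.0, 500.0),  # mg/m3, hypothetical
    "toluene":       (30.0,  75.0),
    "ethyl acetate": (80.0,  400.0),
}

E = mixture_index(illustrative)
print(round(E, 2), "-> exceeds mixture limit" if E > 1 else "-> within mixture limit")
```

Note that a group of agents can comply with every individual OEL yet still fail the mixture criterion, which is why the thesis applies it to the combined respiratory irritants.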
  • Item
Preventing Coal Mine Dust Lung Disease: Application of Bayesian Hierarchical Framework for Occupational Exposure Assessment in the South African Coal Mining Industry
    (University of the Witwatersrand, Johannesburg, 2023-10) Made, Felix; Brouwer, Derk; Lavoue, Jerome; Kandala, Ngianga-Bakwin
Background: Coal is the world's largest energy source for power generation, accounting for nearly 36% of the fuel used to produce power. South Africa is the world's top exporter and the seventh-largest producer of coal. In the upcoming years, South Africa's coal production output is expected to rise. Coal mine dust lung disease (CMDLD) is an irreversible lung disease caused by prolonged exposure to the dust emitted during coal production. In safety evaluations, exposure is typically reported as an eight-hour time-weighted average dust concentration (TWA8h). In occupational exposure contexts, occupational exposure limits (OELs) are used as thresholds below which workers can be exposed repeatedly without adverse health effects. Workers are usually grouped into homogeneous exposure groups (HEGs) or similar exposure groups (SEGs). In South Africa, a HEG is a group of coal miners who have had similar levels and patterns of exposure to respirable crystalline silica (RCS) dust in the workplace. Several statistical analysis methods for compliance testing and homogeneity assessment have been put into use internationally as well as in South Africa. The international consensus on occupational exposure analysis is based on guidelines from the American Industrial Hygiene Association (AIHA), the Committee of European Normalisation (CEN), and the British and Dutch Occupational Hygiene Societies (BOHS). These statistical approaches are based on Bayesian or frequentist statistics and consider the 90th percentile (P90) and 95th percentile (P95), within- and between-worker variances, and the lognormal distribution of the data. Current practices in South Africa could result in poor or incorrect risk and exposure control decision-making. 
Study Aims: The study aimed to improve the identification of coal dust overexposure by introducing new methods for compliance (reduced dust exposure) and homogeneity (similar dust exposure level) assessment in the South African coal mining industry. Study Objectives: The objectives of this study were: 1. To compare compliance of coal dust exposure by HEGs using the Department of Mineral Resources and Energy Code of Practice (DMRE-CoP) approach and other global consensus methods. 2. To investigate and compare the within-group exposure variation between HEGs and job titles. 3. To determine the posterior probabilities of locating the exposure level in each of the OEL exposure categories by using the Bayesian framework with prior information from historical data, and to compare the findings with the DMRE-CoP approach. 4. To investigate the difference in posterior probabilities of the P95 exposure being found in each OEL exposure category between prior information acquired from experts and the current information from the data, using Bayesian analysis. Methods: The TWA8h respirable coal dust concentrations were obtained in a cross-sectional study with all participants being male underground coal mine workers. The occupational hygiene division of the mining company collected the data between 2009 and 2018. The data were collected according to the South African National Accreditation System (SANAS) standards. From the data, 28 HEGs with a total of 728 participants were included in this study. In objectives 1 and 2, all 728 participants from the 28 HEGs were included in the analysis. For exposure compliance, the DMRE-CoP accepts a 10% exceedance of exposure above the OEL (P90 exposure values from HEGs should be below the OEL). The 10% exceedance was compared to the acceptability criterion from the international consensus, which uses a 5% exceedance above the OEL (P95 exposure below the OEL) of the lognormal exposure data. 
For exposure data to be regarded as homogeneous, the DMRE-CoP requires that the arithmetic mean (AM) and P90 fall into the same DMRE-CoP OEL exposure category. The DMRE-CoP assessment of homogeneity was also compared with the international approaches, which include the Rappaport ratio (R-ratio) and the global geometric standard deviation (GSD). A GSD greater than 3 and an R-ratio greater than 2 would both indicate non-homogeneity of the exposure data of a HEG. The GSD and DMRE-CoP criteria were used to assess the homogeneity of job title exposures within a HEG. In objective 3, a total of nine HEGs, with 243 participants, were included in the analysis. To investigate compliance, a Bayesian model was fitted with a Markov chain Monte Carlo (MCMC) simulation. A normal likelihood function with the GM and GSD from the lognormal data was defined. The likelihood function was updated using an informative prior, derived as the GM and GSD with restricted bounds (parameter space) from the HEGs' historical data. The posterior probabilities of the P95 being located in each DMRE exposure band were produced and compared with the non-informative results and with the DMRE-CoP approach, which uses a point estimate in the form of the P90. In objective 4, a total of 10 job titles were selected and analysed. The selection of job titles was based on whether they had previous years' data that could be used to develop prior information in the Bayesian model. The same job titles were found across different HEGs, so, to ensure the mean was not different across HEGs, the median difference of a job title's exposure distribution across HEGs was statistically compared using the Kruskal-Wallis test, a non-parametric alternative to analysis of variance (ANOVA). Job titles with statistically non-significant exposure differences were included in the analysis. Expert judgements about the probability of the P95 being located in each of the DMRE exposure bands were elicited. 
The IDEA ("Investigate", "Discuss", "Estimate", and "Aggregate") expert elicitation procedure was used to collect expert judgements. The SHELF tool was then used to produce the lognormal distribution of the expert judgements, as a GM and GSD, to be used as an informative prior. A similar Bayesian analysis approach as in objective 3 was used to produce the probability of the P95 falling in each of the DMRE exposure bands. The possible misclassification of exposure arising from the use of bounds in the parameter space was tested in a sensitivity analysis. Results: In objectives 1 and 2, 21 HEGs out of 28 were non-compliant with the OEL across all methods. According to the DMRE-CoP approach, compliance with the OEL, or exposure below the OEL, was observed for 7 HEGs. The DMRE-CoP and CEN both had 1 HEG with exposures below the OEL. The DMRE-CoP showed 6 homogeneous HEGs; however, based on the GSDs, 11 HEGs were homogeneous. The GSD and the DMRE-CoP agreed on homogeneity in exposures of 4 (14%) HEGs. It was discovered that, by grouping according to job titles, most of the job titles within non-homogeneous HEGs were homogeneous. Five job titles had AMs above those of their parent HEGs. For objective 3, the application of the DMRE-CoP (P90) revealed that the exposure of one HEG was below the OEL, indicating compliance. However, no HEG had exposures below the OEL according to the Bayesian framework. The posterior GSD of the Bayesian analysis from the non-informative prior indicated a higher variability of exposure than the informative prior distribution from historical data. Results with a non-informative prior had slightly lower values of the P95 and wider 95% credible intervals (CrI) than those with an informative prior. 
All the posterior P95 findings from both the non-informative and informative prior distributions were classified in exposure control category 4 (i.e., poorly controlled, since exceeding the OEL), with posterior probabilities in the informative approach slightly higher than in the non-informative approach. Job titles were selected as an alternative group to assess compliance in objective 4. The posterior GSD indicated lower variability of exposure from the expert prior distribution than from the historical data prior distribution. The posterior P95 exposure was very likely (at least 98% probability) to be found in exposure control category 4 when using the prior distribution from expert elicitation, compared to the other Bayesian analysis approaches. The probabilities of the P95 from experts' judgements and from historical data were similar. The non-informative prior generally showed a higher probability of finding the posterior P95 in lower exposure control categories than both the expert and historical data prior distributions. The use of different parameter values to specify the bounds showed comparable results, while the use of no parameter space at all put the posterior P95 in exposure category 4 with 100% probability. Conclusions: In comparison to other approaches, the DMRE-CoP tends to show that exposures are compliant more often. Overall, all methods show that the majority of HEGs were non-compliant. The HEGs that suggested non-homogeneity revealed that the constituent job titles were homogeneous. Application of the GSD criterion indicated that HEGs are more likely to be considered homogeneous than when using the DMRE-CoP approach. When using the GSD and the DMRE-CoP guidelines, alternative grouping by specific job titles showed a greater agreement of homogeneity. The use of job titles showed that using HEGs following the current DMRE-CoP guidelines might not reveal high-exposure job titles and would overestimate compliance. 
Additionally, since job titles within a HEG may be homogeneous or have a different exposure to the parent HEG, exposure variability is not properly captured when using HEGs. In compliance assessment, it is important to use the P95 of the lognormal distribution rather than the DMRE-CoP approach, which uses the empirical P90. Our findings suggest that the subgrouping of exposure according to job titles within a HEG should be used in the retrospective assessment of exposure variability and compliance with the OEL. Our results imply that the use of a Bayesian framework with an informative prior, from either historical data or expert elicitation, can confidently aid concise decision-making on coal dust exposure risk. In contrast to informative prior distributions derived from historical data or expert elicitation, Bayesian analysis using the non-informative uniform prior distribution places HEGs in lower exposure categories. Results from non-informative prior distributions typically show high levels of uncertainty and variability, so a decision on dust control would be reached with less confidence. According to this study, the Bayesian framework should be used in the assessment of coal mining dust exposure along with prior knowledge from historical data or professional judgement. For exposure findings to be reported with high confidence, and for sound decisions to be reached about risk mitigation, exposure risk assessment should use historical data to update the current data. The study also promotes the use of experts in situations where it is necessary to combine current data with prior knowledge, but historical data are unavailable or inapplicable.
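The compliance criteria described in this abstract rest on the lognormal exposure model: for a group with geometric mean GM and geometric standard deviation GSD, ln(X) is normal, the 95th percentile is GM·GSD^1.645, and the exceedance fraction above the OEL follows from the normal CDF on the log scale. A minimal sketch, with illustrative GM, GSD, and OEL values rather than the thesis data:

```python
import math

# Lognormal exposure model used in OEL compliance testing:
# ln(X) ~ Normal(ln GM, ln GSD), so P95 = GM * GSD**1.645 and the
# exceedance fraction is P(X > OEL) = 1 - Phi((ln OEL - ln GM)/ln GSD).
# GM, GSD, and OEL below are hypothetical, not values from the thesis.

def lognormal_p95(gm: float, gsd: float) -> float:
    """95th percentile of a lognormal distribution (1.645 = z for P95)."""
    return gm * gsd ** 1.645

def exceedance_fraction(gm: float, gsd: float, oel: float) -> float:
    """Fraction of exposures expected to exceed the OEL."""
    z = (math.log(oel) - math.log(gm)) / math.log(gsd)
    phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF
    return 1.0 - phi

gm, gsd, oel = 1.2, 2.5, 2.0  # mg/m3, hypothetical HEG summary and OEL
p95 = lognormal_p95(gm, gsd)
ef = exceedance_fraction(gm, gsd, oel)
print(f"P95 = {p95:.2f} mg/m3, exceedance = {ef:.1%}")
# P95 above the OEL and exceedance above 5% would be non-compliant under the
# international (P95) criterion; a GSD above 3 would additionally flag the
# group as non-homogeneous.
```

The Bayesian analyses in the thesis place posterior probability on these same quantities (the P95 landing in each exposure band) rather than computing them from point estimates, which is what allows prior information from historical data or experts to enter the decision.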
  • Item
    Development of an instrument to measure the quality of care after the withdrawal of life-sustaining treatment in the adult intensive care unit
    (University of the Witwatersrand, Johannesburg, 2023-09) Korsah, Emmanuel Kwame; Schmollgruber, Shelley
Background: The majority of deaths in the intensive care unit occur after the withdrawal of life-sustaining treatment. Most patients die within 24 hours after treatment has been withdrawn. The short time interval between treatment withdrawal and death has highlighted the urgent need to prioritize the quality of care provided for patients and their families during this period. In South Africa, the quality of care provided for patients after treatment withdrawal is complicated by cultural differences, challenges, and ethical dilemmas. Currently, no instrument exists to measure the quality of care provided for patients at the end of life and their families after treatment is withdrawn in the adult ICU. Existing measuring instruments have been developed for Western countries, with no consideration for the South African context. Hence, using these measuring instruments, especially in a country where non-Western cultures exist, may be inappropriate, unrealistic, and liable to fail, necessitating revision. Purpose: To develop an instrument to measure the quality of care provided for patients after the withdrawal of life-sustaining treatment and their family members in the adult intensive care unit. Methodology: An exploratory sequential mixed-methods research design was used. The study was conducted in two phases, namely: domain identification and item generation; and instrument development and validation. In Phase 1, a summary of findings from a scoping review of the literature and qualitative interviews with nurses, doctors, and family members were used to generate relevant content domains and items. Relevant items generated were synthesised and reduced to develop the first version of the measuring instrument in Phase 2. The instrument underwent further expert panel review for relevance and clarity. A content validity index and modified kappa statistic were computed. 
Comments and feedback from the panel of nine experts were also used to assess the face validity of the instrument. Results: From an initial set of 143 items, the development and content validation process yielded a final instrument of 64 items across seven domains. These included patient- and family-centered decision making (9 items), communication among the ICU team and with patients and families (12 items), continuity of care (3 items), emotional and practical support for patients and families (12 items), symptom management and comfort (7 items), spiritual care (5 items), and modifying the ICU environment (12 items). The instrument recorded an appropriate level of content validity. The overall content validity index of the instrument was high (S-CVI/Ave = 0.97) when using the average approach and moderate (S-CVI/UA = 0.77) when using the universal agreement approach. The moderate value of the S-CVI/UA is defensible given the high number of content experts, which makes universal agreement difficult. The instrument items also obtained excellent kappa values that ranged from 0.89 to 1.00. Conclusion: The researcher developed and validated the content of an instrument to measure the quality of care provided for patients and their families after the withdrawal of life-sustaining treatments in the adult ICU. This instrument will support the provision of care for patients and their families following treatment withdrawal, and the training and education of healthcare providers in end-of-life care. It will also aid future research in the care of critically ill and dying patients in the ICU. Future research should conduct further assessments and pilot test the instrument.
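The content validity indices reported in this abstract follow a standard recipe: each item's I-CVI is the proportion of experts rating it relevant (3 or 4 on a 4-point scale), S-CVI/Ave averages the I-CVIs across items, and S-CVI/UA is the proportion of items on which all experts agree. A minimal sketch with a made-up ratings matrix (nine experts, as in the study, but the ratings themselves are illustrative):

```python
# Content validity index (CVI) computation: item-level I-CVI, then the two
# scale-level summaries (S-CVI/Ave and S-CVI/UA). Ratings are on a 4-point
# relevance scale, with 3 and 4 counted as "relevant". The ratings matrix
# below is illustrative, not the study's data.

def i_cvi(ratings: list[int]) -> float:
    """Proportion of experts rating the item 3 or 4."""
    return sum(r >= 3 for r in ratings) / len(ratings)

def s_cvi(items: list[list[int]]) -> tuple[float, float]:
    """Scale-level CVI: (average approach, universal agreement approach)."""
    icvis = [i_cvi(r) for r in items]
    ave = sum(icvis) / len(icvis)                    # S-CVI/Ave
    ua = sum(v == 1.0 for v in icvis) / len(icvis)   # S-CVI/UA
    return ave, ua

# Three hypothetical items, each rated by nine experts
ratings = [
    [4, 4, 3, 4, 4, 3, 4, 4, 4],  # all relevant -> I-CVI = 1.00
    [4, 3, 4, 4, 2, 4, 3, 4, 4],  # one dissent  -> I-CVI = 8/9
    [4, 4, 4, 3, 4, 4, 4, 4, 3],  # all relevant -> I-CVI = 1.00
]
ave, ua = s_cvi(ratings)
print(f"S-CVI/Ave = {ave:.2f}, S-CVI/UA = {ua:.2f}")
```

This also illustrates why the abstract's S-CVI/UA (0.77) sits well below its S-CVI/Ave (0.97): a single dissenting expert drops an item out of the universal-agreement count while barely moving the average.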
  • Item
Histo-morphological Perturbations in the Testes of Diabetic Sprague Dawley Rat Following Atripla® and Alcohol Co-administration
    (University of the Witwatersrand, Johannesburg, 2023-08) Owembabazi, Elna; Nkomozepi, Pilani; Mbajiorgu, Ejikeme Felix
A decline in male fertility is increasingly becoming a major concern in this millennium. However, the associated effects of diabetes, alcohol abuse, and cART use, and their interaction, which are clinically important in male reproductive health, have not received proper attention. Each of these conditions (diabetes, alcohol abuse, and cART use) has been shown to negatively impact the male reproductive system. Moreover, diabetes, alcohol abuse, and cART are intricately connected and thus can occur simultaneously in an individual, a scenario that might provoke severe reproductive dysfunctions. Therefore, this study investigated the impact of diabetes, alcohol abuse, cART use, and their combinations on testicular histomorphometry, reproductive hormone profile, oxidative and inflammatory markers, germ cell proliferation and apoptosis, and androgen receptor expression. A total of forty-eight (48) HIV-naive rats were divided into two main groups, non-diabetic and diabetic, which were further subdivided into four subgroups of six rats each. The non-diabetic groups (1-4) were: Group 1: control; Group 2: alcohol-treated (A); Group 3: combination antiretroviral therapy-treated (cART); and Group 4: alcohol plus cART-treated (A+cART). The diabetic groups (5-8) were: Group 5: diabetic only (DM); Group 6: diabetic treated with alcohol (DM+A); Group 7: diabetic treated with cART (DM+cART); and Group 8: diabetic treated with both alcohol and cART (DM+A+cART). The rats were fed normal rat chow and terminated after 90 days of treatment. Blood was drawn through cardiac puncture into plain vacutainers; thereafter, animals were perfused with 0.1 M phosphate buffer, and the testes were harvested, weighed, and then fixed in 10% neutral buffered formalin for histology and immunohistochemistry analysis. During the experimental period, hyperglycemia, low glucose tolerance, and polydipsia were observed in the diabetic groups (5-8) only. 
Although a general decrease in testis weight, volume, and size was found in all treated groups compared to the control group, a significant (p<0.05) decrease was detected in the DM+A group only. With the exception of the DM+A+cART group, the epithelial area fraction, epithelial height, and tubule area and diameter were reduced significantly in all treated groups. The luminal area fraction and luminal diameter, which were significantly reduced in the cART, DM, and DM+cART groups, were increased in A+cART. Further, connective tissue and interstitial area fractions increased significantly in all treated groups. The spermatogonia increased significantly in the A, cART, DM, and DM+A+cART groups relative to the control group, but were reduced significantly in DM+A. However, spermatocytes, round spermatids, and elongated spermatids decreased significantly in all treated groups, with the exception of spermatocytes in the alcohol group. All treated groups showed a decrease in the number of Sertoli cells relative to the control, but a significant decrease was found only in the DM+A group; all treated groups had significantly decreased Leydig cell diameter and volume. Johnsen’s testicular score was significantly reduced in the A, A+cART, DM, and DM+A treated groups. Additionally, seminiferous tubule (ST) lesions of varying severity, such as shrinkage of STs, lifting of the epithelium, widened intercellular space, karyolysis, epithelial sloughing, ST atrophy, multinucleated giant cells, and germinal epithelium derangement, were observed in the treated animal groups. Further, the thickness of the ST basement membrane increased significantly in the cART, DM, and DM+cART groups, while testis capsule thickness increased significantly in the A+cART, DM+A, and DM+A+cART groups. The testicular interstitial connective tissue fibers, viz. collagen, reticulin, and elastin, were reduced significantly in all treated groups, except for reticulin, which was non-significantly decreased in the alcohol (A) group. 
Furthermore, luteinizing hormone was significantly elevated in the A and A+cART groups but significantly reduced in the DM+A+cART group. However, follicle-stimulating hormone increased significantly in all treated groups, with the exception of the DM+cART group. Testosterone levels were significantly reduced in the DM, DM+A, and DM+A+cART groups, but no significant difference (p>0.05) was found in inhibin B levels in any treated group compared to the control. In addition, the staining intensity and number of Sertoli and Leydig cells expressing androgen receptor were significantly reduced in all treated groups, except for the number of Sertoli cells expressing androgen receptor in the alcohol group. Expression of inflammatory markers (IL-1β, TNF-α, and IL-6) was upregulated in all treated groups, with the exception of IL-1β in the A+cART group and TNF-α in the cART and A+cART groups. The markers for oxidative stress (iNOS, MDA, and 8-OHdG) and apoptosis (caspase 3) were upregulated in all treated groups, with the exception of MDA in the alcohol group. Although the number of germ cells expressing the proliferation marker Ki-67 was significantly reduced in all treated groups, the staining intensity was significantly increased compared to the control group. The results show that diabetes, alcohol abuse, cART, and their combinations have deleterious effects on testicular histoarchitecture and function, which are suggested to result from the upregulation of oxidants and cytokines and from androgen receptor depletion. The results further suggest that the interaction arising from the co-presence of cART and alcohol in the diabetic condition could mildly diminish their independent effects, due to their common cytochrome P450 metabolic pathway. This study yielded invaluable data on the contribution of the cART-alcohol-diabetes interaction to male reproductive dysfunction and will hopefully help clinicians in managing reproductive challenges in this category of patients.
  • Item
    HIV infection, antiretroviral therapy and the haemostasis of pregnancy
    (University of the Witwatersrand, Johannesburg, 2023-07) Schapkaitz, Elise; Libhaber, Elena; Jacobson, Barry; Büller, Harry
The human immunodeficiency virus (HIV) epidemic affects an estimated 30% of pregnant women living in South Africa. Increasing evidence suggests that women living with HIV are at a heightened risk for venous thrombo-embolism (VTE), which is a significant contributor to maternal mortality. In addition to a higher prevalence of obstetric and venous risk factors, this increased risk of VTE has been attributed to the effects of HIV and/or its treatment. HIV is characterized by immune activation and inflammation, which promote endothelial dysfunction and activation of coagulation. This is more pronounced with untreated HIV, yet this pro-inflammatory and pro-thrombotic balance may persist with long-term suppressive antiretroviral therapy (ART). However, the extent to which ongoing inflammation disrupts maternal haemostasis and predisposes pregnant women living with HIV to its prothrombotic consequences is currently unknown. The aims of the work presented in this thesis, in women living with HIV with access to ART, were firstly to identify antepartum and postpartum risk factors for VTE; secondly to assess procoagulant changes in maternal haemostasis; and thirdly to determine risks of thrombosis and bleeding associated with thromboprophylaxis for VTE prevention. An epidemiological case-control study was performed in 128 cases with pregnancy-related VTE and 640 matched controls. This study found at least a two-fold increased risk for VTE among pregnant and postpartum women living with HIV. In addition, antepartum risk factors, which may explain the disproportionate VTE risk in HIV, included medical co-morbidities and chronic hypertension, while postpartum risk factors included a personal history of VTE, medical co-morbidities, systemic infection, prolonged hospital admission and postpartum haemorrhage. Opportunistic infections, ART and the degree of immunosuppression were not associated with VTE risk. 
A sub-study followed and investigated antiphospholipid antibodies (aPL) in 215 women with thrombosis and/or obstetric complications. In this study, 15 (13.2%) of the women with HIV were positive at baseline for one of the five criteria aPL. The prevalence of aPL was not significantly increased among women with HIV, as compared to HIV negative women. Furthermore, the aPL profiles were not significantly different between the two groups. Lupus anticoagulant (LAC) positivity, on a single occasion, was associated with thrombosis (p < 0.003). Subsequently, two prospective cross-sectional studies were conducted which assessed endothelial activation as well as fibrinolysis, coagulation and platelet activation in pregnant women with HIV, in each trimester. The studies included three groups: HIV negative, HIV with virological suppression (< 50 copies/mL) and HIV with a viral load (VL) of > 50 copies/mL. Endothelial activation was evaluated by measuring von Willebrand factor (VWF) antigen, VWF propeptide, multimer patterns and ADAMTS-13 antigen, activity, and antibody levels. The results showed an increase in the ratio of VWF propeptide to VWF antigen in the first, second and third trimester in the HIV virologically suppressed group (1.7 ± 0.7, 1.7 ± 0.4, 1.6 ± 0.5) and the HIV group with VL > 50 copies/mL (1.9 ± 0.9, 1.7 ± 0.9, 1.6 ± 1.1) compared to the HIV negative group (1.4 ± 0.6, 1.3 ± 0.4, 1.2 ± 0.3, p < 0.05). Virological suppression was not associated with a significant reduction in this ratio in any trimester. In addition, increased high molecular weight multimers were observed in the HIV groups, despite only a mild reduction in ADAMTS-13 activity compared to the HIV negative group (p < 0.001). Thereafter, fibrinolytic activity was evaluated by measuring d-dimer and plasminogen activator inhibitor-1 (PAI-1). 
Coagulation activity was determined by measuring thrombin-antithrombin (TAT) complex concentrations, while platelet factor-4 and the platelet indices, namely mean platelet volume (MPV) and platelet distribution width, were measured as markers of platelet activation. The results showed increased log d-dimer levels in the first, second and third trimester in the HIV virologically suppressed group (-1.2 ± 0.5, -0.9 ± 0.4, -0.5 ± 0.3) and the HIV group with VL > 50 copies/mL (-1.1 ± 0.4, -0.7 ± 0.4, -0.5 ± 0.5) compared to the HIV negative group (-1.4 ± 0.2, -1.1 ± 0.3, -0.8 ± 0.3, p < 0.05). Additionally, log PAI-1 levels were increased in the first, second and third trimester in the HIV virologically suppressed group (1.0 ± 0.4, 1.3 ± 0.4, 1.5 ± 0.4) and the HIV group with VL > 50 copies/mL (0.8 ± 0.5, 1.2 ± 0.4, 1.5 ± 0.3) compared to the HIV negative group (0.4 ± 0.5, 0.8 ± 0.3, 1.3 ± 0.3, p < 0.05). Virological suppression was not associated with a significant reduction in first and third trimester d-dimer and PAI-1 levels. Thrombin-antithrombin complex levels were not increased in the HIV virologically suppressed group, as compared to the HIV negative group, beyond the first trimester. With regard to platelet parameters, only log MPV measured in the third trimester was decreased in the HIV virologically suppressed group (2.3 ± 0.1) and the HIV group with VL > 50 copies/mL (2.3 ± 0.1) compared to the HIV negative group (2.5 ± 0.2) (p < 0.001). The last study was a longitudinal study of 129 pregnant women at intermediate or high risk of VTE who received thromboprophylaxis. Venous thrombo-embolism occurred antepartum in 1.4% (95% confidence interval (CI) 0.04-7.7) of intermediate and 3.4% (95% CI 0.4-11.7) of high risk pregnancies. Major, clinically relevant non-major and minor bleeding events occurred in 7.1% (95% CI 2.4-15.9) of intermediate and 8.5% (95% CI 2.8-18.7) of high risk pregnancies. 
Owing to the small number of events, this study could not assess HIV as a predictor of thrombosis and bleeding. In conclusion, the findings described in this thesis contribute to our knowledge of VTE in pregnant women living with HIV in the following ways. Firstly, HIV emerged as a significant antepartum and postpartum risk factor for VTE. Traditional obstetric and venous risk factors were also linked to the risk of thrombosis and could be useful for identifying women with HIV who may benefit from postpartum and/or antepartum thromboprophylaxis. Secondly, this thesis identified heightened markers of endothelial activation and impaired fibrinolysis. Markers such as the ratio of VWF propeptide to VWF antigen, d-dimer and PAI-1 may provide a biological mechanism for the increased risk of pregnancy-related VTE in HIV. Finally, this thesis provided rates of thrombosis and bleeding in women who received thromboprophylaxis in pregnancy and the postpartum period, which can be used to advise women with HIV of the associated risks.
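The case-control design described above lends itself to a simple worked example. The sketch below computes an odds ratio with a Woolf (log-scale) 95% confidence interval from a 2x2 exposure table; the counts are hypothetical, chosen only to illustrate a roughly two-fold risk, and are not the thesis data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Woolf (log) 95% CI for a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts for illustration only (not the thesis data):
# 60 of 128 cases and 180 of 640 controls exposed.
or_, lo, hi = odds_ratio_ci(60, 128 - 60, 180, 640 - 180)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A confidence interval whose lower bound exceeds 1.0, as here, is what supports a statement such as "at least a two-fold increased risk".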
  • Item
    Life History Trade-offs associated with Evolution of Cancer
    (University of the Witwatersrand, Johannesburg, 2023-07) Worsley, Catherine Mary; Durand, Pierre; Mayne, Elizabeth; Veale, Rob
    The evolution of multicellularity requires cooperation between single cells to form new multicellular individuals. Changes in levels of selection occur during this process, with selection at the multicellular level overriding that at the single cell level. For a multicellular individual to function, somatic mutations and selection must be under tight regulation. Nevertheless, mutations and selective environmental pressures can select for cells with fitness advantages relative to normal cells, resulting in cancer. Therapeutic drugs and radiation are forms of artificial selection that can drive the development and selection of cell populations that are resistant to treatment. Cancer occurs because of the failure of multicellular systems to suppress somatic evolution. This somatic evolution results in tumour cells with a wide range of phenotypes with either fast (proliferating) or slow (quiescent) life history strategies. Evolutionary theory provides a framework for understanding what drives the formation of these phenotypes and the ecological niche that supports them, and helps in predicting tumour progression and response to therapy. The key hypothesis of this study was that selective pressures in the tumour microenvironment drive trade-offs between tumour cell survival, proliferation, and apoptosis. An extensive literature review was conducted to identify key selective pressures affecting tumour progression. Low extracellular pH was identified as a component of the tumour microenvironment that affects life history trade-offs, and particularly drives escape from immune-mediated destruction. A protocol was then developed to expose cancer cells to low pH in cell culture. Breast carcinoma and oesophageal squamous cell carcinoma cell lines were selected for these experiments based on the prevalence of these cancers and because of their different anatomical locations. Exposure to low pH induced different levels of apoptosis in each cell line. 
This also affected cell cycle progression and the secretion of growth factors and immunomodulatory cytokines. The oesophageal cell line, WHCO6, adapted to moderate acidity levels with some cells undergoing apoptosis. Factors released by these cells supported the growth and survival of related cells. In contrast, in the breast carcinoma MCF-7 cell line, low pH induced high rates of apoptosis, and factors released by dying cells stimulated death in related cells. This study highlights that different life history strategies are employed by different cancer types. It also shows the importance of the tumour microenvironment, and acidity in particular, in driving tumour cell adaptation and survival. This study also identifies apoptosis as a pro-tumorigenic driver of cancer progression, which has important therapeutic implications.
  • Item
    Initial loss to follow up among tuberculosis patients: the role of ward-based outreach teams (WBOTs) and short message service (SMS) technology
    (University of the Witwatersrand, Johannesburg, 2023-03) Mwansa-Kambafwile, Judith Reegan Mulubwa; Menezes, Colin; Chasela, Charles
    Introduction: In South Africa, tuberculosis (TB) is still a serious public health problem, with rates of initial loss to follow up (initial LTFU) varying between 14.9% and 22.5%. Poor clinician-patient communication resulting in lack of clarity on next steps, patients not prioritizing their healthcare and patients not knowing that their results are ready at the clinic are some reasons for initial LTFU. This PhD study aimed to assess the effectiveness of Ward-based Outreach Teams (WBOTs) or Short Message Service (SMS) technology in reducing TB initial LTFU in Johannesburg, South Africa between 2018 and 2020. Methods: A mixed methods approach comprising two phases (formative and intervention) was employed. In the formative phase, secondary data were analyzed for frequency distributions to determine the rates of initial LTFU in the study area. In addition, in-depth interviews with WBOT Managers and with TB Program Managers were conducted to determine their perceived reasons for TB initial LTFU. In the intervention phase, two interventions (WBOTs/SMS technology) were tested using a three-arm randomized controlled trial (RCT) comparing each of the interventions to standard of care (SOC). The WBOTs delivered paper slip reminders, while the SMS intervention entailed sending reminder SMS messages to patients as soon as TB results were available. Chi-square tests, Poisson regression and Kaplan-Meier estimates were used to analyze the data. The RCT was followed by in-depth interviews with WBOT members and with some of the trial participants who had tested TB positive and had received reminder messages. To identify themes in the qualitative studies, both inductive and deductive coding were used in the hybrid analytic approach. Results: From the formative phase, the TB initial LTFU among the 271 patients was found to be 22.5% and the overall time to treatment initiation was 9 days. 
Interviews with managers revealed that relocation and “shopping around” were the main patient related factors found as the reasons for initial LTFU. Health system related factors for initial LTFU were communication and staff rotations. In terms of TB related work, WBOTs screened household members for TB and referred them for TB testing. The services of the WBOT/TB programs which were found to be integrated were: referral of symptomatic patients for TB testing and adherence monitoring in patients already on TB treatment. There was minimal involvement of the WBOTs in the treatment initiation of patients diagnosed with TB. Findings from the trial were that 11% (314/2850) of the participants tested positive for TB. The 314 TB patients were assigned to one of the 3 arms (SOC=104, WBOTs=105, and SMS=105). Overall, 255 patients (81.2%) were initiated on treatment across all study arms. More patients in the SMS arm were initiated on TB treatment than in the SOC arm (92/105; 88% and 81/104; 78% respectively; P=0.062). Patients in the SMS arm also had a shorter time to treatment initiation than those in the SOC arm (4 days versus 8 days; P<0.001). A comparison of the WBOTs arm and the SOC arm showed similar proportions initiated on treatment (45/62; 73% and 44/61; 72% respectively) as well as similar times to treatment initiation. Findings from the post-trial interviews showed that delivery of the reminder paper slips by the WBOTs during the trial was something new, but possible to incorporate into their daily schedule. The patient interviews revealed that various emotions (happiness, fear, worry etc.) were experienced upon receipt of the reminder messages. Participants also reported that receiving the reminder message did influence their decision to go back to collect the results. Conclusion: Reminder messages to patients are beneficial in TB treatment initiation. National TB programs can use SMS messaging because it is an affordable and feasible method. 
Although implementation of the WBOTs intervention was suboptimal, findings show that with proper integration of TB and WBOT programs, WBOTs have the potential to contribute to improved treatment initiation.
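The comparison of treatment-initiation proportions between the SMS and SOC arms (92/105 vs 81/104, P=0.062) can be reproduced with a standard two-proportion z-test, which is equivalent to a chi-square test without continuity correction. A minimal sketch using only the Python standard library (the thesis does not state its exact test variant, so this is illustrative):

```python
import math

def two_proportion_p(x1, n1, x2, n2):
    """Two-sided two-proportion z-test under the pooled-variance
    normal approximation (equivalent to Pearson chi-square on a
    2x2 table without continuity correction)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                 # pooled proportion
    se = math.sqrt(p * (1 - p) * (1/n1 + 1/n2))
    z = (p1 - p2) / se
    return math.erfc(abs(z) / math.sqrt(2))   # two-sided p-value

# SMS arm vs standard of care: 92/105 vs 81/104 initiated on treatment
p_value = two_proportion_p(92, 105, 81, 104)
print(f"p = {p_value:.3f}")
```

Run on the trial's reported counts, this yields a two-sided p-value close to the reported P=0.062.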
  • Item
    Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) transmission dynamics and social contact patterns
    (University of the Witwatersrand, Johannesburg, 2023-03) Kleynhans, Jacoba Wilhelmina; Cohen, Cheryl; Tempia, Stefano
    Background: Understanding the community burden and transmission dynamics of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) can assist in making informed decisions on prevention policies. Methods: From August through October 2018, before the SARS-CoV-2 pandemic, we performed a cross-sectional contact survey nested in a prospective household cohort in an urban (Jouberton, North West Province) and a rural community (Agincourt, Mpumalanga Province) in South Africa to measure contact rates in 535 study participants. Participants were interviewed to collect details on all contact events (within and outside of the household). During the SARS-CoV-2 pandemic we enrolled 1211 individuals from 232 randomly selected households in the same urban and rural community, and followed the cohort prospectively for 16 months (July 2020 through November 2021), collecting blood every two months to test for SARS-CoV-2 antibodies. Using these longitudinal SARS-CoV-2 seroprevalence estimates and comparing them with reported laboratory-confirmed cases, hospitalizations and deaths, we investigated the community burden and severity of SARS-CoV-2. We also performed a case-ascertained household transmission study of symptomatic SARS-CoV-2 index cases living with HIV (LWH) and not LWH (NLWH) in two urban communities (Jouberton, North West Province and Soweto, Gauteng Province) from October 2020 through September 2021. We enrolled 131 SARS-CoV-2 index cases at primary healthcare clinics. The index cases and their 457 household contacts were followed up for six weeks with thrice weekly visits to collect nasal swabs for SARS-CoV-2 testing on reverse transcription real-time polymerase chain reaction (rRT-PCR), irrespective of symptoms. We assessed household cumulative infection risk (HCIR), duration of virus detection and the interval between index and contact symptom onset (serial interval). 
By collecting high-resolution household contact patterns in these households using wearable sensors, we assessed the association between contact patterns and SARS-CoV-2 household transmission. Results: During the contact survey, we observed an overall contact rate of 14 (95% confidence interval (CI), 13-15) contacts per day, with higher contact rates in children aged 14-18 years (22, 95%CI 8-35) compared to children <7 years (15, 95%CI 12-17). We found higher contact rates in the rural site (21, 95%CI 14-28) compared to the urban site (12, 95%CI 11-13). When comparing the household cohort seroprevalence estimates to district SARS-CoV-2 laboratory-confirmed infections, we saw that only 5% of SARS-CoV-2 infections were reported to surveillance. Three percent of infections resulted in hospitalization and 0.7% in death. People LWH were not more likely to be seropositive for SARS-CoV-2 (odds ratio [OR] 1.0, 95%CI 0.7–1.5), although the sample size for people LWH was small (159/1131 LWH). During the case-ascertained household transmission study for SARS-CoV-2, we estimated a HCIR of 59% (220/373) in susceptible household members, with similar rates in households with an index LWH and NLWH (60% LWH vs 58% NLWH). We observed a higher risk of transmission from index cases aged 35–59 years (adjusted OR [aOR] 3.4, 95%CI 1.5–7.8) and ≥60 years (aOR 3.1, 95% CI 1.0–10.1) compared with those aged 18–34 years, and index cases with a high SARS-CoV-2 viral load (using cycle threshold values (Ct) <25 as a proxy, aOR 5.3, 95%CI 1.6–17.6). HCIR was also higher in contacts aged 13–17 years (aOR 7.1, 95%CI 1.5–33.9) and 18–34 years (aOR 4.4, 95% CI 1.0–18.4) compared with <5 years. Through the deployment of wearable sensors, we were able to measure high-resolution within-household contact patterns in the same households. 
We did not find an association between the duration (aOR 1.0, 95%CI 1.0-1.0) and frequency (aOR 1.0, 95%CI 1.0-1.0) of close-proximity contact between SARS-CoV-2 index cases and household members, and transmission. Conclusion: We found high contact rates in school-going children, and higher contact rates in the rural community compared to the urban community. These contact rates add to the limited literature on measured contact patterns in South Africa. The burden of SARS-CoV-2 is underestimated in national surveillance, highlighting the importance of serological surveys to determine the true burden. Under-ascertainment of cases can hinder containment efforts through isolation and contact tracing. Based on seroprevalence estimates in our study, people LWH did not have higher SARS-CoV-2 community attack rates. In the household transmission study, we observed a high HCIR in households with symptomatic index cases, and found that index cases LWH did not infect more household members compared to people NLWH. We found a correlation between age and SARS-CoV-2 transmission and acquisition, as well as between age and contact rates. Although we did not observe an association between household contact patterns and SARS-CoV-2 transmission, we generated SARS-CoV-2 transmission parameters and community and household contact data that can be used to parametrize infectious disease models for both SARS-CoV-2 and other pathogens to assist with forecasting and intervention assessments. The availability of robust data is important in the face of a pandemic where intervention strategies have to be adapted continuously.
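The household cumulative infection risk reported above (59%, from 220 of 373 susceptible contacts) is a binomial proportion, and its uncertainty can be sketched with a Wilson score interval. This is an illustration under an assumed interval method; the thesis does not state which CI construction was used:

```python
import math

def wilson_ci(x, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial
    proportion of x successes out of n trials."""
    p = x / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    margin = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - margin, centre + margin

# Household cumulative infection risk: 220 of 373 susceptible contacts
p = 220 / 373
lo, hi = wilson_ci(220, 373)
print(f"HCIR = {p:.0%} (95% CI {lo:.1%}-{hi:.1%})")
```

The Wilson interval is preferred over the simple Wald interval for proportions near 0 or 1, though at 59% with n = 373 the two are nearly identical.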
  • Item
    Biomarkers to predict Tuberculosis treatment response
    (University of the Witwatersrand, Johannesburg, 2023-06) Boshielo, Itumeleng Tania; Tiemessen, Caroline; Kana, Bavesh
    Tuberculosis (TB) is a chronic disease caused by Mycobacterium tuberculosis (Mtb). Despite the implementation of multifaceted TB prevention and control efforts, a significant number of people still die from TB. Consistent with this, an uptick in TB-related mortality was recently noted, which has been ascribed to the negative effects of Coronavirus disease-2019 (COVID-19) on TB programs. The complex life cycle of Mtb is largely due to the use of immune evasion mechanisms to establish initial infection, remain dormant in the host, and reactivate pathogenicity under favourable circumstances. The prolonged TB treatment regimen is necessitated by the slow response of bacterial populations to standard TB chemotherapy, a phenomenon that may be caused by persistent, drug-tolerant bacteria. Scientific literature has provided evidence for these types of bacterial populations in the form of Differentially Culturable Tubercle Bacilli (DCTB). It has been demonstrated that DCTB represent drug tolerant bacteria that appear to be cleared at a slower rate than organisms detected by routine culture methods. However, it remains unclear if DCTB populations elicit different immune responses when compared to their conventionally culturable counterparts. Herein, we address this question by optimizing a laboratory model for the generation of DCTB in vitro and testing the capacity of clinical isolates of Mtb from Lineage 2 (Beijing) and Lineage 4 (LAM) to adopt the DCTB state. The amount of DCTB in our model was quantified using the most probable number (MPN) assay, in the presence of culture filtrate (CF) as a source of growth factors to resuscitate DCTB, together with colony forming unit counts. As demonstrated by the limited growth on agar plates and increased growth in liquid media supplemented with CF from an axenic culture of Mtb, our findings demonstrated that carbon starvation was able to generate DCTB from clinical Mtb strains. 
After generating these populations, we stimulated whole blood with DCTB and conventionally culturable populations and report on the stimulation of a select set of cytokines (IFN-γ, IL-4, IL-5, IL-6, IL-12p70 and TNF-α) using a Bead Array Multiplex Immunoassay. In comparison to H37Rv-DCTB and LAM-DCTB, Beijing-DCTB induced significantly reduced levels of IL-5 and TNF-α. When comparing cytokine production between culturable and DCTB populations within a single strain, we noted that LAM-DCTB was delayed in the production of IFN-γ, whilst Beijing-DCTB was not able to induce production of this cytokine when compared to conventionally culturable counterparts. These data suggest that shifting to a non-replicating DCTB state does indeed affect the ability of clinical isolates to induce immune responses. Based on these observations, we next set out to determine if DCTB affects immune responses during treatment of Mtb infected individuals. In prior work, using a prospective observational cohort, we demonstrated a substantive heterogeneity in clearance of DCTB in individuals with drug susceptible TB. We were able to classify these response patterns into three broad groups: (I) participants who were able to clear DCTB within the first two weeks of treatment (treatment-responsive); (II) those with delayed ability to clear these organisms (delayed-responsive); and (III) a group of individuals where DCTB did not change substantively during treatment (non-responders). Given these stark differences in treatment response patterns, we hypothesized that the immune responses associated with these patterns would be substantively different. In the second component of this work, we set out to identify immune biomarkers that predict an effective response of DCTB to TB treatment. To quantify cytokines, chemokines and growth factors in plasma from these groups, we used a 65-plex Luminex assay, with a broad selection of targets. 
Statistically significant differences between these groups were analysed using the Kruskal-Wallis test with Dunn’s multiple comparisons, with p < 0.05 considered statistically significant. When compared to patients who had TB and HIV co-infection, the number of cytokines that may possibly be used to report on the effectiveness of TB treatment was significantly higher in Mtb-only infected patients. This suggests that HIV infection significantly reduces the number of cytokines that can be used to report on TB treatment response. The ROC analysis of I-TAC, G-CSF and VEGF-A showed that these cytokines have significant discriminatory power to distinguish treatment-responsive and non-responsive patients from healthy controls (HCs), using DCTB as the measure of treatment response. No unifying cytokine signature that predicted DCTB response in all groups was identified. Together, our results indicate that some inflammatory markers are elevated in individuals with TB who rapidly clear bacteria during treatment. Given that these responses are based on DCTB, which represent drug tolerant populations, these select cytokines may be useful in evaluating the effectiveness of novel shorter TB treatment regimens.
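As an illustration of the Kruskal-Wallis test named above, the sketch below computes the H statistic for three groups in pure Python (no tie correction; the values are toy numbers, not study data). Dunn's post-hoc pairwise comparisons would follow only when H is significant:

```python
def kruskal_wallis_h(*groups):
    """Kruskal-Wallis H statistic (no tie correction): ranks all
    observations jointly, then compares rank sums across groups."""
    pooled = sorted(v for g in groups for v in g)
    rank = {v: i + 1 for i, v in enumerate(pooled)}  # assumes no ties
    n = len(pooled)
    s = sum(sum(rank[v] for v in g) ** 2 / len(g) for g in groups)
    return 12 * s / (n * (n + 1)) - 3 * (n + 1)

# Toy "cytokine levels" for three response groups (illustrative only):
h = kruskal_wallis_h([1, 2, 3], [4, 5, 6], [7, 8, 9])
print(h)  # H ≈ 7.2 for these fully separated groups
```

H is compared against a chi-square distribution with (number of groups - 1) degrees of freedom; real analyses should also apply the tie correction that the simple ranking above omits.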