by Martin Chebet, Kathy Burgoine, Joseph Rujumba, Noela Regina Akwi Okalany, Peter Olupot-Olupot, Thorkild Tylleskär, Andrew D. Weeks, Agnes Napyo, David Mukunya, Ingunn Marie S. Engebretsen
BackgroundIn sub-Saharan Africa, stillbirth rates remain high. To design effective interventions to reduce stillbirths, accurate determination of their aetiology is important. Conventional autopsy for confirmation of the cause of death is not acceptable or feasible in several societies in sub-Saharan Africa; minimally invasive tissue sampling (MITS) is a recently developed, less invasive alternative. In this study, we explored the acceptability of MITS in the community and among healthcare workers in Uganda to guide future implementation.
MethodsA qualitative study was done among community members and healthcare workers in Mbale in Eastern Uganda. We undertook in-depth interviews and focus group discussions in English or local languages. Interviews were audio-recorded and transcribed prior to formal content analysis. The themes were organised using NVivo software and presented according to Sekhon’s theoretical framework of acceptability.
ResultsOverall, participants preferred the idea of MITS to conventional autopsy because of the perception that it was fast, maintained the facial appearance and kept the body intact. It was thought that the procedure would improve the detection of the cause of stillbirths, which in turn would help to prevent future stillbirths. It would also resolve conflicts in the community between community members or the women and the healthcare workers about the cause of a stillbirth. It was suggested that some community members may not approve of MITS because of their religious beliefs; the fear that body parts may be extracted and stolen for witchcraft or organ donation; and a lack of trust in the healthcare system. To implement the procedure, it was suggested that extensive community sensitisation should be done, space limitations in healthcare facilities should be overcome, healthcare workers should be trained, and limited human resources should be addressed.
ConclusionThe implementation of MITS in Mbale, Eastern Uganda, is likely to be acceptable given sufficient training and sensitisation.
by Tessy Luger, Felix Uhlemann, Florestan Wagenblast, Thomas Läubli, Barbara Munz, Manfred Schmolz, Monika A. Rieger, Benjamin Steinhilber
BackgroundWork-related musculoskeletal disorders (WMSDs) are prevalent in occupations characterised by high repetition and high force demands. Both factors not only evoke inflammatory and degenerative processes in affected musculoskeletal tissue, but also systemic responses identified by biomarkers in blood serum. Clarifying methodological aspects of biomarkers may provide insights into their predictive role in the pathway of developing WMSDs. This study will primarily assess the reliability of systemic inflammatory biomarkers (CRP, TNF-α, IL-6, IL-1β) and immune cell reactivity by repeated measures in workers with constant workloads over time.
MethodsThis observational cross-sectional study will include two groups of workers: an exposed group, including workers exposed to higher upper-extremity physical workloads, especially affecting the elbow/forearm/hand area; and an unexposed group, including office workers exposed to lower upper-extremity physical workloads. Recruited persons are screened against eligibility criteria, followed by a medical anamnesis and blood analysis. Enrolled participants undergo nine repeated measurement sessions, once every two weeks, including blood sampling. Blood analyses will determine values of systemic inflammatory biomarkers and the reactivity of immune cells. The relative test-retest reliability of biomarkers and immune cell reactivity over time is assessed by the intra-class correlation coefficient (ICC), applying two-way mixed-effects models. The absolute test-retest reliability is assessed by the standard error of measurement (SEM).
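For readers unfamiliar with the reliability statistics named above, the sketch below shows how an ICC from a two-way model and the standard error of measurement can be computed for repeated biomarker measurements. The toy data, the specific choice of ICC(3,1), and all variable names are illustrative assumptions, not details taken from the protocol.

```python
# Minimal sketch: ICC(3,1) and SEM for test-retest reliability of a biomarker
# measured repeatedly in the same workers. Values are illustrative only.
import numpy as np

# rows = participants, columns = repeated measurement sessions (e.g., IL-6, pg/mL)
scores = np.array([
    [1.2, 1.4, 1.3],
    [2.1, 2.0, 2.3],
    [0.8, 0.9, 0.7],
    [1.7, 1.6, 1.8],
])
n, k = scores.shape

grand_mean = scores.mean()
row_means = scores.mean(axis=1)
col_means = scores.mean(axis=0)

# Two-way ANOVA mean squares (sessions treated as fixed -> ICC(3,1), consistency)
ss_rows = k * np.sum((row_means - grand_mean) ** 2)
ss_cols = n * np.sum((col_means - grand_mean) ** 2)
ss_total = np.sum((scores - grand_mean) ** 2)
ss_error = ss_total - ss_rows - ss_cols

ms_rows = ss_rows / (n - 1)
ms_error = ss_error / ((n - 1) * (k - 1))

icc_31 = (ms_rows - ms_error) / (ms_rows + (k - 1) * ms_error)

# Standard error of measurement from the pooled SD and the ICC
sem = scores.std(ddof=1) * np.sqrt(1 - icc_31)
print(f"ICC(3,1) = {icc_31:.3f}, SEM = {sem:.3f}")
```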
DiscussionKnowledge of, and models currently describing, the pathological role of systemic inflammatory biomarkers are based on highly controlled laboratory rat experiments. This study has the strength of assessing a human population under real-life conditions. The major challenge lies in participant recruitment, given the intensive and complex study design. The results of this study could provide the foundation for initiating a cohort study and could inform work-related stress-recovery concepts for occupations with different physical demands, helping to identify workers who may be at risk of developing WMSDs. Trial registration: German Clinical Trials Register (DRKS00031872, 25 May 2023).
by Chengya Feng, Xiaohe Lu, Zimian Fan, Xinxing Wang
BackgroundHuman papillomavirus (HPV) is the most prevalent sexually transmitted infection. Copper is essential for immune function, but its association with HPV infection remains unclear. This study aims to investigate the relationship between dietary copper intake and HPV infection.
MethodsThis cross-sectional study analyzed 8,071 participants from the National Health and Nutrition Examination Survey (2003–2016). Copper intake was assessed using two 24-hour recalls, and HPV status was confirmed by DNA testing. Weighted multivariable logistic regression and restricted cubic splines (RCS) were used.
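As a rough illustration of the modelling approach described in the Methods, the following Python sketch fits a weighted logistic regression with a spline term for copper intake. Column names, covariates, the input file, and the use of freq_weights (a simplification of the full NHANES survey design, which also involves strata and clusters) are all assumptions; patsy's cr() natural cubic spline stands in for the restricted cubic spline.

```python
# Minimal sketch: weighted logistic regression of HPV status on dietary copper,
# plus a spline term for a flexible dose-response curve. Names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("nhanes_subset.csv")  # hypothetical analysis file

# Linear term: adjusted OR per 1 mg/day of copper
m_lin = smf.glm(
    "hpv_positive ~ copper_mg + age + C(race) + C(marital_status)",
    data=df,
    family=sm.families.Binomial(),
    freq_weights=np.asarray(df["survey_weight"]),
).fit()
print(np.exp(m_lin.params["copper_mg"]))          # odds ratio
print(np.exp(m_lin.conf_int().loc["copper_mg"]))  # 95% CI

# Flexible dose-response: natural cubic spline (an RCS analogue in patsy)
m_spline = smf.glm(
    "hpv_positive ~ cr(copper_mg, df=4) + age + C(race) + C(marital_status)",
    data=df,
    family=sm.families.Binomial(),
    freq_weights=np.asarray(df["survey_weight"]),
).fit()
print(m_spline.summary())
```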
ResultsAfter adjusting for multiple confounders, dietary copper intake was significantly inversely associated with vaginal HPV infection (odds ratio [OR], 0.79; 95% confidence interval [CI], 0.67–0.92). Compared with women in the lowest quartile of dietary copper intake, those in the highest quartile had a lower adjusted OR for vaginal HPV infection (OR, 0.60; 95% CI, 0.48–0.73). RCS analysis revealed an L-shaped association with a threshold at 1.2 mg/day of copper intake. Subgroup analyses showed that marital status significantly moderated the association between copper intake and HPV infection.
ConclusionAn L-shaped association was observed between copper intake and HPV infection, suggesting that maintaining an optimal level of copper intake may be associated with a reduced risk of HPV infection and related diseases.
by Francesco Colussi, Jacopo Favaro, Claudio Ancona, Edoardo Passarotto, Maria Federica Pelizza, Eleonora Lorenzon, Simone Ruzzante, Stefano Masiero, Giorgio Perilongo, Giovanni Sparacino, Irene Toldo, Stefano Sartori, Maria Rubega
Brain maturation from birth to adolescence involves profound transformations in neural dynamics, which can be studied in a minimally invasive manner using quantitative EEG. Most of the results published in the literature are based on spectral analysis approaches, which are extremely effective in detecting and assessing EEG rhythms. However, some aspects of EEG dynamics can only be investigated using nonlinear approaches, the use of which is still relatively unexplored in the pediatric population. The aim of the present paper is to assess the EEG differentiation of wakefulness from deep sleep (quiet sleep in neonates, stage N3 in older children) and its maturation across a wide developmental window (0–17 years) using the fractal dimension. Specifically, the Higuchi fractal dimension (HFD) algorithm is used to analyse both wakefulness and sleep EEG recordings collected from 63 infants (aged 0–1 year) and 160 children (aged 2–17 years). To ensure methodological consistency, a data-driven criterion for the selection of HFD user parameters is implemented to enhance reproducibility. Our results show that HFD during wakefulness increases during the first year of life, followed by a stabilization or slight decrease in later years. In contrast, HFD during sleep exhibits a more stable profile, with only a mild increase over development. These findings are consistent with known neurodevelopmental processes (including synaptogenesis, pruning, and white matter maturation) and support the interpretation of HFD as a sensitive marker of large-scale integrative brain dynamics. These physiological trajectories of HFD in both wakefulness and sleep could be used as a reference for future clinical applications in pediatric neurology and developmental monitoring.
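For context, Higuchi's algorithm estimates a curve-length-based fractal dimension from a time series: the signal is split into k interleaved sub-series, a normalised curve length L(k) is averaged for each k, and the HFD is the slope of log L(k) against log(1/k). A minimal sketch follows; the kmax value and the white-noise example are illustrative assumptions, not the paper's data-driven parameter choice.

```python
# Minimal sketch of Higuchi's fractal dimension (HFD) for a single EEG channel.
import numpy as np

def higuchi_fd(x: np.ndarray, kmax: int = 8) -> float:
    """Estimate the Higuchi fractal dimension of a 1-D signal."""
    x = np.asarray(x, dtype=float)
    n = x.size
    log_k, log_l = [], []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):  # m = 0 .. k-1 indexes the k interleaved sub-series
            idx = np.arange(m, n, k)
            if idx.size < 2:
                continue
            # Curve length of the sub-series with Higuchi's normalisation factor
            num = np.abs(np.diff(x[idx])).sum() * (n - 1)
            lengths.append(num / ((idx.size - 1) * k * k))
        log_k.append(np.log(1.0 / k))
        log_l.append(np.log(np.mean(lengths)))
    # HFD is the slope of log L(k) against log(1/k)
    slope, _ = np.polyfit(log_k, log_l, 1)
    return slope

rng = np.random.default_rng(0)
print(higuchi_fd(rng.standard_normal(1024)))  # white noise -> close to 2
```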
by Jakob Morén, Barbro Persson, Anna Sörman, Åke Lundkvist, Hanin Shihab, Marie Studahl, Malin Veje, Göran Günther, Gabriel Westman
BackgroundTick-borne encephalitis is a viral infection of the central nervous system that may cause severe illness and long-term sequelae whose underlying mechanisms are not completely understood. Autoantibodies against the N-methyl-D-aspartate receptor (anti-NMDAR) may be triggered by immunologic events, occur sporadically, and can cause autoimmune encephalitis. Following herpes simplex encephalitis and Japanese encephalitis, anti-NMDAR autoantibodies may develop and have been associated with relapse or impaired cognitive recovery. Tick-borne encephalitis has been shown to trigger anti-NMDAR encephalitis in sporadic cases, but the frequency of autoimmunization is unknown.
ObjectivesThe objective of this study was to assess the frequency of intrathecal anti-NMDAR antibody development following tick-borne encephalitis and to explore whether such antibodies could be relevant to cognitive complaints.
MethodsAdult patients with tick-borne encephalitis were included retrospectively from one cohort and prospectively from another. A stored post-acute cerebrospinal fluid sample was required for anti-NMDAR analysis. Two commercial kits (Euroimmun AG, Lübeck, Germany) were used to detect anti-NMDAR IgG antibodies in cerebrospinal fluid.
ResultsA total of 71 cerebrospinal fluid samples from 53 patients were analyzed for anti-NMDAR antibodies. Samples were obtained at a median of 91 days (range 21–471) after onset of central nervous system symptoms. Anti-NMDAR antibodies were detected in two samples from a single tick-borne encephalitis patient, corresponding to 1.9% of patients (95% CI: 0.05–10.1%).
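The reported interval is consistent with a Clopper-Pearson exact binomial confidence interval for 1 positive patient out of 53, as the short check below illustrates; the use of statsmodels here is an assumption, since the paper does not state how the CI was computed.

```python
# Worked check: exact (Clopper-Pearson) 95% CI for 1 positive out of 53 patients.
from statsmodels.stats.proportion import proportion_confint

k, n = 1, 53
low, high = proportion_confint(k, n, alpha=0.05, method="beta")  # "beta" = exact
print(f"{k / n:.1%} (95% CI: {low:.2%}-{high:.1%})")  # -> 1.9% (0.05%-10.1%)
```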
ConclusionsThe development of intrathecal anti-NMDAR autoantibodies following tick-borne encephalitis is a rare event, and further studies are needed to clarify their potential relevance to cognitive outcomes in a minority of cases. Testing for anti-NMDAR antibodies in cerebrospinal fluid may be considered in patients who experience clinical deterioration following an initial recovery.
by Mostafa Bondok, Moses Kasadhakawo, John Onyango, Oscar Turya, Khumbo Kalua
PurposeTo determine the prevalence and causes of blindness and vision impairment (VI) among adults aged ≥50 years in Western Uganda.
MethodsA population-based cross-sectional survey was conducted in Western Uganda (July–August 2023) using RAAB7. Adults aged ≥50 years who had resided in the study districts for at least six months in the past year were eligible. Participants were identified through door-to-door household visits using a two-stage cluster sampling approach. Primary outcomes included the prevalence of blindness and VI and their causes. Secondary outcomes included cataract surgical coverage (CSC), effective CSC (eCSC), refractive error coverage (REC), and effective REC (eREC).
ResultsA total of 3,125 participants were examined (54.1% female). The adjusted prevalence of blindness and vision impairment was estimated from presenting visual acuity (PVA).
ConclusionBlindness and vision impairment remain major public health issues in Western Uganda, primarily due to untreated cataract and uncorrected refractive error. Poor post-operative outcomes highlight the urgent need to improve surgical quality. These findings may guide targeted interventions and policy to strengthen eye care services.
by Yu Zhang, Chen Chen, Tianhang Zhu, Wei Luo, Ranran Zhou, Wanlong Tan
Glucocorticoids play a pivotal role in tumorigenesis and cancer progression. However, the prognostic significance of glucocorticoid signaling-related genes remains poorly understood, particularly in kidney renal clear cell carcinoma (KIRC). Collected samples indicated that KIRC patients exhibited significantly elevated serum glucocorticoid levels compared to healthy donors.
by Mireia Solé Pi, Luz A. Espino, Péter Szenczi, Marcos Rosetti, Oxána Bánszegi
A long-standing question in the study of quantity discrimination is what stimulus properties control choice. While some species have been found to choose based on the total amount of the stimuli, without using numerical information, others rely on number rather than on any continuous magnitude. Here, we tested cats, dogs, and humans using a simple two-way spontaneous choice paradigm (involving food for the cats and dogs, and images for the humans) to see whether numerosity or total surface area has a greater influence on their decisions. We found that cats showed a preference for the larger amount of food when the ratio between the stimuli was 0.5, but not when it was 0.67; dogs did not differentiate between stimuli presenting the two options (smaller vs. larger amount of food) regardless of the ratio between them, but humans did so almost perfectly. When faced with two stimuli of the same area but different shapes, dogs and humans exhibited a preference for certain shapes, particularly the circle, while cats’ choices seemed to be at chance level. Furthermore, cats’ and dogs’ reaction times were equal across conditions, while humans were quicker when choosing between stimuli in trials where the shape was the same but the surface area differed, and even more so when asked to choose between two differently sized circles. The results suggest that there is no universal rule for processing quantity; rather, quantity estimation seems to be tied to the ecological context of each species. Future work should focus on testing quantity estimation in different contexts and with different sources of motivation.
by Bethelhem Bashe, Desalegn Dawit Assele, Worku Ketema, Mulugeta Sitot Shibeshi
BackgroundCerebral palsy is a frequent physical disability of childhood, causing motor impairment, sensory impairment, cognitive and behavioral issues, and secondary musculoskeletal deformities, with a global incidence of 1–4 per 1,000 children. It significantly impacts children’s quality of life and imposes an economic burden on families and healthcare systems. There is limited evidence of the risk factors of cerebral palsy in Ethiopia, including in the study setting. We investigated factors associated with cerebral palsy among children attending Hawassa University Comprehensive Specialized Hospital.
MethodsAn institution-based, unmatched case-control study was conducted among children who visited Hawassa University Comprehensive Specialized Hospital from January 2019 to December 2023. Consecutive cases were recruited until the required sample size was reached, and controls were randomly selected. Data were extracted from 80 case and 160 control charts. Binary logistic regression analysis was used to identify risk factors for cerebral palsy. Adjusted odds ratios with 95% confidence intervals were reported to show the strength of the associations, with significance declared at a predefined p-value threshold.
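A minimal sketch of this type of analysis in Python is shown below: a logistic model on case-control status, with coefficients exponentiated to adjusted odds ratios. The file name and variable names are hypothetical, and the actual analysis may have included additional covariates.

```python
# Minimal sketch: binary logistic regression for an unmatched case-control study,
# reporting adjusted odds ratios (AORs) with 95% CIs. Names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cp_case_control.csv")  # hypothetical chart-extracted data

model = smf.logit(
    "cerebral_palsy ~ maternal_infection + low_birth_weight"
    " + prolonged_labor + perinatal_asphyxia + cns_infection_infancy",
    data=df,
).fit()

aor = np.exp(model.params)     # adjusted odds ratios
ci = np.exp(model.conf_int())  # 95% confidence intervals on the OR scale
print(pd.concat([aor.rename("AOR"),
                 ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```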
ResultsA total of 240 participants (80 cases and 160 controls) were enrolled in the study. Maternal infection during pregnancy [AOR: 4.1; 95% CI: 1.39, 12.1], low birth weight [AOR: 4.1; 95% CI: 1.49, 11.2], prolonged labor [AOR: 3.2; 95% CI: 1.47, 7.00], history of perinatal asphyxia [AOR: 2.65; 95% CI: 1.06, 6.65], and central nervous system infection during infancy [AOR: 3.4; 95% CI: 1.21, 9.64] were risk factors for cerebral palsy.
ConclusionPerinatal asphyxia, maternal infection, low birth weight, prolonged labor, and CNS infection during infancy are significantly associated with cerebral palsy. Public health education should promote awareness about cerebral palsy, encourage antenatal care, and educate healthcare professionals on emergency obstetrics and newborn care. Appropriate measures should be taken to reduce the incidence of CNS infections during infancy.
by Muluken Chanie Agimas, Mekuriaw Nibret Aweke, Berhanu Mengistu, Lemlem Daniel Baffa, Elsa Awoke Fentie, Ever Siyoum Shewarega, Aysheshim Kassahun Belew, Esmael Ali Muhammad
IntroductionMalaria is a global public health problem, particularly in sub-Saharan African countries, which account for roughly 90% of malaria deaths worldwide. To reduce the impact and complications associated with delayed treatment of malaria among children under five, comprehensive evidence on the magnitude and determinants of delayed treatment is needed. However, there are no pooled, region-level estimates for the Horn of Africa to support decision-makers.
ObjectiveTo assess the prevalence and associated factors of delay in seeking malaria treatment among under-five children in the Horn of Africa.
MethodPublished and unpublished papers were searched on Google, Google Scholar, PubMed/Medline, EMBASE, SCOPUS, and the reference lists of published articles. The search strategy was built using Medical Subject Headings (MeSH) terms combined with key terms from the title. The Joanna Briggs Institute critical appraisal checklist was used to assess the quality of articles. A sensitivity test was conducted to evaluate the heterogeneity of the studies. The visual funnel plot and Egger’s and Begg’s statistics in the random-effects model were used to evaluate publication bias and small-study effects. The I² statistic was also used to quantify the amount of heterogeneity between the included studies.
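As background to the pooling step, the sketch below implements the standard DerSimonian-Laird random-effects calculation, Cochran's Q, and the I² statistic for a handful of made-up study effects; the numbers are purely illustrative, and the specific estimator is an assumption consistent with the random-effects model named above.

```python
# Minimal sketch: DerSimonian-Laird random-effects pooling with I^2,
# on the log-odds scale. All inputs are illustrative, not study data.
import numpy as np

yi = np.log(np.array([2.1, 2.8, 2.4, 3.1]))  # per-study effects (log OR)
se = np.array([0.30, 0.25, 0.40, 0.35])      # their standard errors

w_fixed = 1 / se**2
y_fixed = np.sum(w_fixed * yi) / np.sum(w_fixed)

# Cochran's Q and the DerSimonian-Laird between-study variance tau^2
q = np.sum(w_fixed * (yi - y_fixed) ** 2)
df_q = len(yi) - 1
c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (q - df_q) / c)

i2 = max(0.0, (q - df_q) / q) * 100  # % of variability due to heterogeneity

# Random-effects weights and pooled estimate
w_re = 1 / (se**2 + tau2)
y_re = np.sum(w_re * yi) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))
print(f"pooled OR = {np.exp(y_re):.2f} "
      f"(95% CI: {np.exp(y_re - 1.96 * se_re):.2f}-"
      f"{np.exp(y_re + 1.96 * se_re):.2f}), I^2 = {i2:.0f}%")
```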
ResultsThe pooled prevalence of delayed treatment for malaria among under-five children in the Horn of Africa was 48% (95% CI: 34%–63%). History of child death (OR = 2.5, 95% CI: 1.73–3.59), distance greater than 3,000 meters (OR = 2.59, 95% CI: 2.03–3.3), drug side effects (OR = 2.94, 95% CI: 1.86–4.67), formal education (OR = 0.69, 95% CI: 0.49–0.96), middle income (OR = 0.42, 95% CI: 0.28–0.63), and the expensiveness (OR = 4.39, 95% CI: 2.49–7.76) and affordability (OR = 2.13, 95% CI: 1.41–3.2) of transport costs were factors associated with malaria treatment delay among children.
Conclusion and recommendationsAbout one out of two parents in the Horn of Africa delayed seeking malaria treatment for their children. High transportation expenses, long travel distances (greater than 3,000 meters) to medical facilities, and concern about drug side effects were major risk factors contributing to this delay. On the other hand, middle income was found to be protective against treatment delays. These results highlight how crucial it is to improve access to healthcare services, both financially and physically, to minimize delays in treating malaria among the area’s children.
by Tadesse Tarik Tamir, Berhan Tekeba, Alebachew Ferede Zegeye, Deresse Abebe Gebrehana, Mulugeta Wassie, Gebreeyesus Abera Zeleke, Enyew Getaneh Mekonen
IntroductionSolitary childbirth—giving birth without any form of assistance—remains a serious global public health issue, especially in low-resource settings. It is associated with preventable maternal complications such as hemorrhage and sepsis, and poses significant risks to newborns, including birth asphyxia, infection, and early neonatal death. In Ethiopia, where many births occur outside health facilities, understanding the spatial and socio-demographic patterns of solitary childbirth is vital for informing targeted interventions to improve maternal and child health outcomes. This study aims to identify and map the spatial distribution of solitary childbirth across Ethiopia and to analyze its determinants using data from the 2019 national Interim Demographic and Health Survey.
MethodWe analyzed data from the 2019 Interim Ethiopian Demographic and Health Survey to determine the spatial distribution and determinants of solitary birth in Ethiopia. A total weighted sample of 3,884 women was included in the analysis. Spatial analysis was used to determine the regional distribution of solitary birth, and multilevel logistic regression was employed to identify its determinants. ArcGIS 10.8 was used for the spatial analysis, and Stata 17 for the multilevel analysis. Fixed effects were reported as adjusted odds ratios with 95% confidence intervals.
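The multilevel (random-intercept) logistic model implied by this description can be written as follows; the notation is ours, not the paper's, with woman i nested in cluster j.

```latex
% Two-level random-intercept logistic model (notation assumed):
\[
  \operatorname{logit}\Pr(Y_{ij}=1)
    = \beta_0 + \boldsymbol{\beta}^{\top}\mathbf{x}_{ij} + u_j,
  \qquad u_j \sim \mathcal{N}(0, \sigma_u^2),
\]
\[
  \text{AOR}_k = \exp(\hat\beta_k),
  \qquad
  \text{95\% CI} = \exp\bigl(\hat\beta_k \pm 1.96\,\widehat{\mathrm{SE}}(\hat\beta_k)\bigr).
\]
```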
ResultThe prevalence of solitary childbirth in Ethiopia was 12.73% (95% confidence interval: 11.71%–13.81%). The western and southern parts of Oromia, all of Benishangul-Gumuz, most parts of the SNNPR, and the western part of the Amhara region were hotspot areas for solitary birth. Having no formal education, not attending ANC visits, and residing in pastoral regions were significantly associated with higher odds of solitary birth in Ethiopia.
ConclusionA notable proportion of women experience childbirth alone, highlighting a significant aspect of maternal health in the country and reflecting both the challenges and improvements in childbirth practices. The distribution of solitary births exhibited spatial clustering, with hotspot areas located in the western and southern parts of Oromia, all of Benishangul-Gumuz, most parts of the SNNPR, and the western part of the Amhara region. Lack of education, not having an ANC visit, and being a resident of a pastoral region were significant determinants of solitary birth. The implementation of maternal and child health strategies in Ethiopia could benefit from considering the hotspot areas and determinants of solitary birth.
by Adedapo Olufemi Bashorun, Larry Kotei, Abdoulie F. Jallow, Ousubie Jawla, Emmanuel U. Richard-Ugwuadu, Muhammed Jagana, Lamin Bah, Amadou Tijan Bah, Karamo Conteh, Mamadou S.K. Jallow, Mehrab Karim, Bai Lamin Dondeh, Anne Segonds-Pichon, Gary M. Clifford, Iacopo Baussano, Bruno Pichon, David Jeffries, Ed Clarke
Human papillomavirus (HPV) infection is a primary cause of preventable deaths from cervical cancer, a condition of profound inequality with approximately 90% of deaths occurring in low- and middle-income countries, particularly in sub-Saharan Africa. In May 2018, the WHO Director-General declared a Joint Global Commitment to Cervical Cancer Elimination, highlighting the critical role of HPV vaccines in achieving this goal. However, there is a lack of systematically collected data on HPV prevalence in The Gambia, and impact data from high-income countries may not be reliably extrapolated to West African settings due to geographical variation in HPV types and distinct behavioural, biological, and sociodemographic exposures. The Gambia introduced a two-dose HPV vaccination schedule in 2019, but coverage has been very low, interrupted mainly by the COVID-19 pandemic. This presents a key opportunity to generate vital baseline data on HPV prevalence in the population before a potential scale-up of vaccination efforts. The PHASE survey, a multi-stage cluster survey, aims to establish baseline population prevalence estimates of high-risk and low-risk, vaccine-type and non-vaccine-type HPV infection in 15- to 49-year-old females in The Gambia by measuring urinary HPV-DNA. The survey will also quantify the effects of various exposures on HPV prevalence, including sexual behaviour; the presence of other sexually transmitted infections (STIs), namely Neisseria gonorrhoeae (NG), Chlamydia trachomatis (CT), Trichomonas vaginalis (TV), Mycoplasma genitalium (MG), and syphilis, as well as the blood-borne viruses human immunodeficiency virus (HIV), hepatitis B, and hepatitis C; obstetric history; socio-demographic characteristics; and cervical cancer screening and/or treatment. Additionally, the study will provide important antimicrobial resistance (AMR) data for NG and MG in sub-Saharan Africa, a region poorly represented in global surveillance programs. These data are needed to guide regional treatment guidelines and advocate for new solutions, including gonococcal vaccines. The AMR data are expected to immediately influence recommendations on the appropriate choice of antibiotics for syndromic STI management in West Africa and hence to address an important driver of AMR in the sub-region. Leveraging the Health and Demographic Surveillance System (HDSS) funded by the Medical Research Council Unit The Gambia as its sampling frame, the survey will use validated diagnostic assays and culturally sensitive data collection methods to ensure both scientific rigor and local relevance. Tools such as Audio Computer-Assisted Self-Interviewing (ACASI) technology, developed in consultation with local community advisory boards, are included to reduce social desirability bias in reporting sexual behaviour. This approach aims to maximize both the reliability and cultural appropriateness of the findings. This study directly addresses the critical need for baseline epidemiological data on HPV in a West African setting to accelerate vaccine impact and drive new interventions towards cervical cancer elimination. By understanding other factors that influence HPV (such as other STIs and sexual behaviour), the study aims to ensure that, when the vaccine’s impact is measured later, changes in confounding factors that may affect HPV prevalence can be accounted for.
The study will also establish the population prevalence of the measured STIs and their relationship to common symptoms and other adverse health outcomes related to STIs.
by Wanbo Lu, Qibo Liu, Haofang Li
This paper employs the mixed-frequency Granger causality test, reverse unconstrained mixed-frequency data sampling models, and Chinese data from January 2006 to June 2024 to test the nexus between consumer confidence and the macroeconomy. The results show, first, that changes in the real estate market, GDP, and the urban unemployment rate are Granger causes of consumer confidence and that, in reverse, consumer confidence is a Granger cause of the CPI. Second, GDP and the real estate market (the CPI and the urban unemployment rate) have a significant positive (negative) impact on consumer confidence, while industrial production, the interest rate, and the stock market do not. Third, the “animal spirits” extracted from consumer confidence cannot lead to noticeable fluctuations in China’s macroeconomy, suggesting that they will not dominate economic growth, even though they affect the macroeconomy slightly and inevitably. The results are robust after replacing the dependent variable and considering the influence of the global financial crisis and the COVID-19 pandemic.
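For orientation, the snippet below runs a standard single-frequency Granger causality test with statsmodels; it illustrates the basic idea only, since the paper's tests are mixed-frequency. The data file, column names, and lag order are assumptions.

```python
# Minimal sketch: does consumer confidence Granger-cause the CPI?
# (Single-frequency stand-in for the paper's mixed-frequency tests.)
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

df = pd.read_csv("china_macro_monthly.csv", parse_dates=["month"])

# Tests whether the series in the SECOND column Granger-causes the FIRST.
res = grangercausalitytests(
    df[["cpi", "consumer_confidence"]].dropna(), maxlag=6
)
for lag, out in res.items():
    # p-value of the SSR-based F test at each lag
    print(lag, round(out[0]["ssr_ftest"][1], 4))
```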
by Juliana Rodrigues Tovar Garbin, Franciéle Marabotti Costa Leite, Ana Paula Brioschi dos Santos, Larissa Soares Dell’Antonio, Cristiano Soares da Silva Dell’Antonio, Luís Carlos Lopes-Júnior
A comprehensive understanding of the factors influencing the epidemiological dynamics of COVID-19 across the pandemic waves—particularly in terms of disease severity and mortality—is critical for optimizing healthcare services and prioritizing high-risk populations. Here we aim to analyze the factors associated with short-term and prolonged hospitalization for COVID-19 during the first three pandemic waves. We conducted a retrospective observational study using data from individuals reported in the e-SUS-VS system who were hospitalized for COVID-19 in a southeastern state of Brazil. Hospitalization duration was classified as short or prolonged based on a 7-day cutoff, corresponding to the median length of hospital stay during the second pandemic wave. Bivariate analyses were performed using the chi-square test for heterogeneity. Logistic regression models were used to estimate odds ratios (ORs) and their respective 95% confidence intervals (CIs), with statistical significance set at 5%. When analyzing hospitalization duration across the three waves, we found that 51.1% (95%CI: 49.3–53) of hospitalizations in the first wave were prolonged. In contrast, short-duration hospitalizations predominated in the second (54.7%; 95% CI: 52.4–57.0) and third (51.7%; 95% CI: 50.2–53.2) waves. Factors associated with prolonged hospitalization varied by wave. During the first wave, older adults (≥60 years) (OR=1.67; 95%CI: 1.35–2.06), individuals with ≥10 symptoms (OR=2.03; 95%CI: 1.04–3.94), obese individuals (OR=2.0; 95%CI: 1.53–2.74), and those with ≥2 comorbidities (OR=2.22; 95%CI: 1.71–2.89) were more likely to experience prolonged hospitalization. In the second wave, the likelihood of extended hospital stays was higher among individuals aged ≥60 years (OR=2.04; 95%CI: 1.58–2.62) and those with ≥2 comorbidities (OR=1.77; 95%CI: 1.29–2.41). In the third wave, prolonged hospitalization was more frequent among older adults (OR=1.89; 95%CI: 1.65–2.17), individuals with 5–9 symptoms (OR=1.52; 95%CI: 1.20–1.92), obese individuals (OR=2.2; 95%CI: 1.78–2.73), and those with comorbidities (OR=1.45; 95%CI: 1.22–1.72 and OR=2.0; 95%CI: 1.69–2.45). In conclusion, we identified variations in hospitalization patterns across the pandemic waves, although the differences were relatively subtle. These variations likely reflect gradual shifts in the risk factors associated with prolonged hospital stays. Our findings highlight the importance of implementing targeted public health interventions, particularly those designed to reduce disease severity and improve clinical outcomes among vulnerable populations at greater risk of extended hospitalization.
by Esther Ba-Iredire, James Atampiiga Avoka, Luke Abanga, Abigail Awaitey Darkie, Emmanuel Junior Attombo, Eric Agboli
IntroductionThe alarming rate of drug-resistant tuberculosis (DR-TB) globally is a threat to treatment success among positive tuberculosis (TB) cases. The prevalence and trend of DR-TB, and the socio-demographic and clinical risk factors contributing to it, are currently unknown in the four study regions of Ghana. This study sought to determine the prevalence and trend of DR-TB, identify the socio-demographic and clinical risk factors that influence DR-TB, and analyse the relationship of underweight and adverse drug reactions with treatment outcomes among DR-TB patients in the four regions of Ghana.
MethodThis was a retrospective review covering five years, from January 2018 to the end of December 2022. The data were retrieved from the DR-TB registers and folders at the Directly Observed Treatment (DOT) centres in the four regions. The data were analysed using STATA version 17.
ResultsThe prevalence of DR-TB in 2022 was 10.1% in Ashanti, 5.3% in Eastern, 27.8% in Central, and 2.7% in the Upper West region. The overall prevalence of DR-TB for the period 2018–2022 was 13.8%. The socio-demographic and clinical risk factors that influenced DR-TB in the four regions included age and marital status (aOR 3.58), among others.
ConclusionThe study shows that the prevalence of DR-TB in Ghana is low, probably not because cases have reduced but because of an inadequate number of GeneXpert machines to detect them. Age, marital status, education, alcohol intake, previously treated TB, adverse drug reactions, underweight, and treatment outcome are factors influencing the development of DR-TB. Therefore, interventions aimed at improving the nutritional status of DR-TB cases and minimising adverse drug reactions will improve treatment outcomes.
by Connie Nait, Simple Ouma, Saadick Mugerwa Ssentongo, Boniface Oryokot, Abraham Ignatius Oluka, Raymond Kusiima, Victoria Nankabirwa, John Bosco Isunju
BackgroundDespite advances in HIV care, viral load suppression (VLS) among adolescents living with HIV (ALHIV) in Uganda continues to lag behind that of adults, even with the introduction of dolutegravir (DTG)-based regimens, the Youth and Adolescent Peer Supporter (YAPS) model, and community-based approaches. Understanding factors associated with HIV viral load non-suppression in this population is critical to inform HIV treatment policy. This study assessed the prevalence and predictors of viral load non-suppression among ALHIV aged 10–19 years on DTG-based ART in Soroti City, Uganda.
MethodsWe conducted a cross-sectional study among 447 ALHIV attending three urban HIV clinics in Soroti City. Data were abstracted using a structured questionnaire and analyzed in STATA 15.0. Modified Poisson regression with robust error variance was used to identify predictors of viral load non-suppression. Adjusted relative risks (aRR) and 95% confidence intervals (CIs) were reported, with statistical significance set at p ≤ 0.05.
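A minimal sketch of the modified Poisson approach named above is given below: a Poisson GLM on the binary non-suppression outcome with a robust sandwich covariance, so that exponentiated coefficients estimate relative risks. The dataset and variable names are hypothetical.

```python
# Minimal sketch: modified Poisson regression with robust error variance,
# yielding adjusted relative risks (aRRs) for a binary outcome.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("alhiv_soroti.csv")  # hypothetical abstraction dataset

model = smf.glm(
    "non_suppressed ~ age_group + sex + prior_non_suppression"
    " + non_fixed_dose_dtg + poor_adherence + no_mmd",
    data=df,
    family=sm.families.Poisson(),
).fit(cov_type="HC0")  # robust (sandwich) standard errors

arr = np.exp(model.params)     # adjusted relative risks
ci = np.exp(model.conf_int())  # 95% CIs on the RR scale
print(pd.concat([arr.rename("aRR"), ci], axis=1))
```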
ResultsOf the 447 participants, 53.5% were female, with a median age of 16 years (IQR: 14.0–17.6). The majority (72.9%) were from Soroti district and had been on DTG-based ART for a median of 42.5 months (IQR: 37.0–48.0). Most were receiving multi-month dispensing (MMD) (75.2%) and were active in care (98%). The prevalence of viral load non-suppression was 19.2% (86/447). Independent predictors of non-suppression included older age (15–19 vs. 10–14 years) (aRR: 1.69; 95% CI: 1.08–2.67), male sex (aRR: 1.48; 95% CI: 1.05–2.11), prior non-suppression before switching to DTG (aRR: 1.76; 95% CI: 1.19–2.59), use of non-fixed dose DTG regimens (aRR: 2.03; 95% CI: 1.23–3.33), history of poor adherence (aRR: 4.36; 95% CI: 2.05–9.26), and not receiving MMD (aRR: 2.83; 95% CI: 1.93–4.15).
ConclusionNearly one in five adolescents on DTG-based ART in Soroti City had viral non-suppression, despite optimized treatment regimens. Targeted interventions, particularly enhanced adherence counseling for older and male adolescents, expanded MMD coverage, and provision of fixed-dose regimens, are urgently needed to improve VLS among ALHIV. These findings underscore the need for adolescent-centered HIV care strategies to close the viral suppression gap and advance progress towards epidemic control.
by Mazlum Uruc, Burak Menek
BackgroundDevelopmental Coordination Disorder (DCD) is a neurodevelopmental condition that adversely impacts motor skills, sensory processing, and daily activity participation. Telerehabilitation has recently emerged as a promising method to improve therapy access and foster family involvement. This study investigated the effects of integrating telerehabilitation with sensory-based intervention on motor performance, sensory processing, and participation in children with DCD.
MethodsThis randomized controlled trial included 20 children aged 3–7 years with a confirmed diagnosis of DCD. Participants were randomly assigned to either a sensory-based intervention (SBI) group or a telerehabilitation sensory-based intervention (TBSI) group. Both groups received weekly face-to-face sensory-based therapy for eight weeks. Additionally, the TBSI group participated in 30-minute weekly home-based telerehabilitation sessions. Outcome measures included the Canadian Occupational Performance Measure (COPM), the Functional Independence Measure for Children (WeeFIM), and the Dunn Sensory Profile.
ResultsBoth groups demonstrated statistically significant improvements; however, the TBSI group showed significantly greater gains in WeeFIM motor, cognitive, and total scores, as well as in COPM performance and satisfaction scores.
ConclusionsTelerehabilitation is an effective intervention for improving motor and cognitive functions, sensory processing, and daily life participation in children with DCD. The findings support the integration of telerehabilitation into sensory-based approaches as part of a holistic model of care in occupational therapy practice.
Trial registrationClinicalTrials.gov NCT06977256.
by Yiming Jin, Rong Lu, Mingyuan Wang, Zihao Xu, Zhen Liu, Shuhong Xie, Yu Zhang
ObjectiveIn this study, we aimed to analyze the blood screening detection strategies employed for voluntary blood donation in a specific region of East China and evaluate the efficacy of the blood safety detection system.
Donors and MethodsA total of 539,117 whole blood samples were collected from voluntary blood donors between January 2018 and July 2021, as well as in 2023 and 2024. The samples were screened for hepatitis B surface antigen (HBsAg), hepatitis C virus (HCV) antibodies, human immunodeficiency virus antibodies/antigen (HIV Ab/Ag), and Treponema pallidum (TP) antibodies using enzyme-linked immunosorbent assay (ELISA). Alanine aminotransferase (ALT) levels were measured using a rapid method. Chemiluminescence immunoassay technology was used to detect five hepatitis B virus (HBV) markers. Polymerase chain reaction was employed to detect HBV DNA, HCV RNA, and HIV RNA. The reactivity rates of each marker were analyzed.
ResultsThe overall positivity rate for blood testing among donors in this region was 0.76% (4,078/539,117). The positivity rates for the individual markers, in descending order, were: anti-TP (0.20%) > HBsAg (0.18%) > ALT (0.13%) > anti-HCV (0.085%) > nucleic acid testing (0.080%) > HIV antigen/anti-HIV (0.079%). No significant differences were observed (P > 0.05). Before 2023, the positivity rates for ALT and HBsAg exhibited occasional fluctuations, followed by a significant decline. Conversely, in 2024, a slight upward trend in the HIV positivity rate was noted.
ConclusionThe current multitiered blood screening and detection strategy in this region exhibits complementary advantages, ensuring effective blood safety. However, the observed slight upward trend in the HIV positivity rate among voluntary blood donors highlights the necessity for enhanced pre-donation counseling and risk assessment for key populations.
by Esther Mofiyinfoluwa Ola, Temitope Helen Balogun, Rasheed Olayinka Isijola, Oluwaremilekun Grace Ajakaye
Parasitic infections are a major cause of morbidity and mortality in Nigeria, with malaria and schistosomiasis having the highest burden. This study investigated the prevalence of malaria, urogenital schistosomiasis, and co-infections and their impact on the nutritional status of schoolchildren in two communities in Ondo State. A total of 185 participants from Ipogun and Oke Igbo were screened for malaria and schistosomiasis infection using the ParaHit malaria rapid diagnostic test kit and urine microscopy. Anthropometric measurements were used to assess the nutritional status of the participants. In this study, a higher prevalence of malaria was recorded in Oke Igbo, with 36 individuals (57.1%), compared to 60 individuals (49.2%) in Ipogun. Urogenital schistosomiasis was also more prevalent in Oke Igbo, affecting 18 individuals (28.6%), while only 5 individuals (4.1%) were affected in Ipogun. Co-infection with both diseases was more common in Oke Igbo, with 13 cases (20.6%), compared to 4 cases (3.3%) in Ipogun. However, malnutrition rates were similar between the two communities, with 60 cases (77.9%) in Ipogun and 28 cases (75.5%) in Oke Igbo. Notably, participants with either malaria or urogenital schistosomiasis, as well as those co-infected, exhibited a higher frequency of chronic malnutrition. The likelihood of co-infection was significantly associated with gender and locality, with individuals in Oke Igbo being 0.78 times less likely to be co-infected (P = 0.00; CI = 0.09–0.49), while males were 2.19 times more likely to have co-infections (P = 0.02; CI = 1.13–4.28). This study emphasised the significant health burden posed by malaria and urogenital schistosomiasis co-infections among schoolchildren in Ondo State, highlighting the need for comprehensive health and nutritional interventions to address the challenges associated with these parasitic diseases.
by Vijeeth Guggilla, Jennifer A. Pacheco, Alexandre M. Carvalho, Grant R. Whitmer, Anna E. Pawlowski, Jodi L. Johnson, Catherine A. Gao, Chad J. Achenbach, Theresa L. Walunas
BackgroundAdults with immunosuppression are more likely to develop severe COVID-19 than adults without immunosuppression. Less is known about differences in outcomes for adults with immunosuppression who are hospitalized with COVID-19.
MethodsA retrospective cohort study of adults hospitalized with COVID-19 at Northwestern Medicine hospitals between 03/01/2020 and 05/31/2022 was performed. Regression analyses were performed comparing in-hospital mortality, intensive care unit (ICU) admission, oxygenation requirements, and hospital/ICU length of stay among patients without immunosuppression (n = 9079) and patients with immunosuppression (n = 873).
ResultsPatients with immunosuppression had significantly higher mortality than patients without immunosuppression (OR: 1.33, 95% CI: 1.11–1.60). This effect was even stronger when controlling for age at admission, diabetes, obesity, SARS-CoV-2 variant era, and COVID-19 medication use (adjusted OR: 1.78, 95% CI: 1.46–2.16). ICU admission (adjusted OR: 1.64, 95% CI: 1.41–1.90) and invasive ventilation (adjusted OR: 1.68, 95% CI: 1.36–2.06) were also significantly higher in patients with immunosuppression. Hospital length of stay (median: 7 days) and ICU length of stay (median: 2.5 days) were longer in patients with immunosuppression than in patients without immunosuppression (median hospital stay: 5 days).
ConclusionsPatients with immunosuppression had worse outcomes than patients without immunosuppression. Subgroup analyses showed that patients with solid organ transplant had the worst outcomes overall. Patients with HIV had similar outcomes to patients without immunosuppression unless CD4 cell count was low.