by Lei Xiong, Ke Li, Wendy Siuyi Wong
Background: Digital media usage has become an integral part of daily life, but prolonged or emotionally driven engagement, especially during late-night hours, may lead to concerns about behavioral and mental health. Existing predictive systems fail to account for the nuanced interplay between users’ internal psychological states and their surrounding ecological contexts.
Objective: This study aims to develop a psychologically and ecologically informed behavior prediction model to identify high-risk patterns of digital media usage and support early-stage intervention strategies.
Methods: We propose a Dual-Channel Cross-Attention Network (DCCAN) architecture composed of three layers: signal identification (for psychological and ecological encoding), interaction modeling (via cross-modal attention), and behavior prediction. The model was trained and tested on a dataset of 9,782 users and 51,264 behavior sequences, annotated with labels for immersive usage, late-night activity, and susceptibility to health misinformation.
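The dual-channel, cross-attention idea described above can be sketched compactly. The following is a minimal, hypothetical PyTorch sketch; the encoder types, dimensions, and module names are our assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of a dual-channel cross-attention model: two modality
# encoders, mutual cross-attention, and a prediction head. Sizes are made up.
import torch
import torch.nn as nn

class DualChannelCrossAttention(nn.Module):
    def __init__(self, psych_dim=32, eco_dim=24, d_model=64, n_heads=4):
        super().__init__()
        # Signal-identification layer: encode each modality separately.
        self.psych_enc = nn.GRU(psych_dim, d_model, batch_first=True)
        self.eco_enc = nn.GRU(eco_dim, d_model, batch_first=True)
        # Interaction-modeling layer: each channel attends to the other.
        self.psych_to_eco = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.eco_to_psych = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Behavior-prediction layer: fused features -> one risk logit.
        self.head = nn.Linear(2 * d_model, 1)

    def forward(self, psych_seq, eco_seq):
        p, _ = self.psych_enc(psych_seq)           # (B, T, d_model)
        e, _ = self.eco_enc(eco_seq)               # (B, T, d_model)
        p_att, _ = self.psych_to_eco(p, e, e)      # psychological queries attend to ecological keys
        e_att, _ = self.eco_to_psych(e, p, p)      # ecological queries attend to psychological keys
        fused = torch.cat([p_att.mean(1), e_att.mean(1)], dim=-1)
        return self.head(fused)                    # logit for, e.g., immersive usage

model = DualChannelCrossAttention()
logit = model(torch.randn(8, 20, 32), torch.randn(8, 20, 24))  # batch of 8 sequences
```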
Results: The DCCAN model achieved superior performance across all three tasks, especially in immersive usage prediction (F1-score: 0.891, AUC: 0.913), outperforming LSTM, GRU, and XGBoost baselines. Ablation studies confirmed the critical role of both psychological and ecological signals, as well as the effectiveness of the cross-attention mechanism.
Conclusions: Incorporating psychological and ecological modalities through attention-based fusion yields interpretable and accurate predictions for digital risk behaviors. This framework shows promise for scalable, real-time behavioral health monitoring and adaptive content moderation on media platforms.
Our primary objectives were (1) to develop and validate an administrative data algorithm for the identification of hand trauma cases using clinical diagnoses documented in medical records as the reference standard and (2) to estimate the incidence of hand trauma in a universal public healthcare system from 1993 to 2023 using a population-based research cohort constructed with the validated case identification algorithm.
A population-based retrospective validation study.
Ontario, Canada, from 2022 to 2023 (validation) and from 1993 to 2023 (estimation).
Our reference standard was the known hand trauma status of 301 patients (N=147 with hand trauma) who presented to an urban tertiary-care hand trauma centre in Toronto, Ontario.
(1) The sensitivity, specificity, positive and negative predictive values of the optimal algorithm to identify hand trauma using provincial health administrative data and (2) age-standardised and sex-standardised incidence rates of hand trauma among men and women, by age, and by area of patient residence.
The optimal algorithm had a sensitivity of 73.8% (95% CI 66.6% to 81.0%), specificity of 80.1% (95% CI 73.8% to 86.5%), positive predictive value of 78.1% (95% CI 71.2% to 85.0%) and negative predictive value of 76.1% (95% CI 69.5% to 82.7%). Over the study period, the age-standardised and sex-standardised incidence of hand trauma increased from 384 to 530 per 100 000. The greatest increase was observed in males and individuals aged 0–19 and 80+, with higher incidence rates in Southern compared with Northern Ontario.
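For readers who want to reproduce this style of validation summary, the sketch below computes sensitivity, specificity, PPV and NPV with Wald-type 95% confidence intervals from a 2×2 table. The counts are hypothetical illustrations, not the study's actual cross-tabulation, and the Wald interval is an assumption about the interval method.

```python
# Minimal sketch: diagnostic accuracy metrics from a 2x2 confusion table.
from math import sqrt

def metric_with_ci(numerator, denominator, z=1.96):
    """Proportion with a Wald-type 95% confidence interval."""
    p = numerator / denominator
    half = z * sqrt(p * (1 - p) / denominator)
    return p, max(0.0, p - half), min(1.0, p + half)

# Hypothetical algorithm-vs-reference counts (147 cases, 154 non-cases).
tp, fn, fp, tn = 108, 39, 31, 123

for name, num, den in [
    ("sensitivity", tp, tp + fn),   # true positives among reference-standard cases
    ("specificity", tn, tn + fp),   # true negatives among non-cases
    ("PPV", tp, tp + fp),           # cases among algorithm positives
    ("NPV", tn, tn + fn),           # non-cases among algorithm negatives
]:
    p, lo, hi = metric_with_ci(num, den)
    print(f"{name}: {p:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```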
Our algorithm enabled identification of hand trauma cases using health administrative data suitable for population-level surveillance and health services research, revealing a rising burden of hand trauma from 1993 to 2023. These findings can support improved surveillance, resource allocation and care delivery for this public health problem.
by Natthakul Akarapredee, Chalirmporn Atasilp, Chonlaphat Sukasem, Pimonpan Jinda, Rattanaporn Sukprasong, Jiraporn Jensuriyarkun, Soravit Wongjitjanyong, Patompong Satapornpong, Natchaya Vanwong
Introduction: Irinotecan is a chemotherapy agent commonly prescribed for metastatic colorectal cancer but often leads to neutropenia. Variations in genes encoding drug-metabolizing enzymes and transporters may affect the toxicity and effectiveness of irinotecan. This study aimed to examine the impact of these genetic polymorphisms on irinotecan outcomes in Thai colorectal cancer patients.
Methods: The study retrospectively analyzed 41 metastatic colorectal cancer patients treated with irinotecan-based chemotherapy. Genotyping was conducted for 23 single nucleotide polymorphisms in genes including UGT1A1, CYP3A4, CYP3A5, CES1, ABCB1, ABCC2, ABCC5, ABCG1, ABCG2, and SLCO1B1. Toxicity and efficacy were assessed, with statistical significance set at a Bonferroni-corrected P value threshold.
Results: In terms of toxicity, UGT1A1*6 was significantly associated with both all-grade and severe neutropenia in the first cycle, and the ABCC2 -24C > T variant was linked to all-grade neutropenia in the second cycle (p = 0.001). For efficacy, patients with wild-type UGT1A1*6 had longer progression-free survival (PFS), and the SLCO1B1 521T > C variant was also associated with improved PFS.
Conclusion: UGT1A1*6 and ABCC2 -24C > T variants emerge as potential predictors of irinotecan-induced neutropenia, while UGT1A1*6 and SLCO1B1 521T > C may serve as markers of prolonged PFS in Thai patients. Validation through larger prospective studies is essential to confirm and refine these genetic associations.
To examine the association between socioeconomic status (SES), financial subsidies and awareness-related factors such as age, cancer stage and family history, and the uptake of cancer genetic testing, with a focus on equitable access to care.
Retrospective cohort study.
Tertiary care cancer genetics service in Singapore.
The study population included 2687 individuals of all ages, genders and ethnicities who attended pretest counselling between 2014 and 2020 and were eligible for genetic testing for hereditary cancer syndromes.
The primary outcome was the uptake of genetic testing. The main explanatory variables were SES (proxied by Housing Index), subsidy status, age, cancer stage and family history. Analyses examined whether associations varied across SES and age subgroups.
Receipt of financial subsidies was strongly associated with testing uptake (adjusted OR 9.15, 95% CI 2.68 to 31.20). Uptake exceeded 90% among subsidised individuals across all socioeconomic strata, compared with 56–68% among non-subsidised individuals, with the largest gains in the lowest SES group (43 vs 28 percentage points (pp) in the highest). The level of subsidy was not associated with uptake. Younger patients (18–39 years) had higher uptake than those aged 60+ (66% vs 57%); patients with advanced cancer (stage IV) had the highest uptake (82% vs 57–66% in earlier stages); and family history was associated with increased uptake, strongest for having a child with cancer (+28 pp). Interaction analysis suggested that the additive effects of subsidies were greatest in lower SES groups and in older adults.
Financial subsidies were strongly associated with higher genetic testing uptake. Awareness indicators like age, cancer stage and family history were associated with higher uptake. The association between subsidies and uptake varied by SES and age, suggesting that subsidies may help reduce disparities and improve equitable access to genetic testing services.
by Berihun Agegn Mengistie, Getie Mihret Aragaw, Tazeb Alemu Anteneh, Kindu Yinges Wondie, Alemneh Tadesse Kassie, Alemken Eyayu Abuhay, Wondimnew Mersha Biset, Gebrye Gizaw Mulatu, Nuhamin Tesfa Tsega
Background: Precancerous cervical lesions, or cervical intraepithelial neoplasia (CIN), represent a significant precursor to cervical cancer, posing a considerable threat to women’s health globally, particularly in developing countries. In Africa, the burden of premalignant cervical lesions is not well studied. Therefore, the main purpose of this systematic review and meta-analysis was to determine the overall prevalence of precancerous cervical lesions and identify determinants among women who underwent cervical cancer screening in Africa.
Methods: This study followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. The protocol for this systematic review and meta-analysis was registered on the International Prospective Register of Systematic Reviews (PROSPERO) (ID: CRD42025645427). We carried out a systematic and comprehensive search on electronic databases such as PubMed and Hinari. In addition, Google Scholar and ScienceDirect were utilized to find relevant studies related to precancerous cervical lesions. Data from the included studies were extracted using an Excel spreadsheet and analyzed using STATA version 17. The methodological quality of the eligible studies was examined using the Joanna Briggs Institute (JBI) assessment tool. Publication bias was checked by using the funnel plot and Egger’s tests. A random-effects model using the DerSimonian-Laird method was used to estimate the pooled prevalence of precancerous cervical lesions in Africa. The I² and Cochran’s Q statistics were used to assess the level of statistical heterogeneity among the included studies.
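As a companion to the methods, here is a minimal sketch of DerSimonian-Laird random-effects pooling with Cochran's Q and I². The three study rows are made up; this illustrates the general technique, not the authors' STATA workflow.

```python
# Illustrative DerSimonian-Laird random-effects pooling of prevalence estimates.
import numpy as np

def dersimonian_laird(effects, variances):
    w = 1.0 / variances                        # fixed-effect (inverse-variance) weights
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)     # Cochran's Q heterogeneity statistic
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)              # between-study variance estimate
    w_star = 1.0 / (variances + tau2)          # random-effects weights
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, se, tau2, i2

# Hypothetical study-level prevalences (proportions) with variance p(1-p)/n.
p = np.array([0.15, 0.19, 0.12])
n = np.array([1200, 800, 2000])
pooled, se, tau2, i2 = dersimonian_laird(p, p * (1 - p) / n)
print(f"pooled = {pooled:.3f} "
      f"(95% CI {pooled - 1.96*se:.3f} to {pooled + 1.96*se:.3f}), I^2 = {i2:.0f}%")
```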
Results: A total of 112 eligible articles conducted in Africa, encompassing 212,984 study participants, were included in the quantitative meta-analysis. The pooled prevalence of precancerous cervical lesions in Africa was 17.06% (95% confidence interval: 15.47%–18.68%). In this review, having no formal education (AOR = 4.07, 95% CI: 1.74, 9.53), being a rural dweller (AOR = 2.38, 95% CI: 1.64, 3.46), a history of STIs (AOR = 3.94, 95% CI: 2.97, 5.23), a history of multiple partners (AOR = 2.73, 95% CI: 2.28, 3.28), early initiation of coitus (AOR = 2.77, 95% CI: 2.11, 3.62), being HIV-seropositive (AOR = 3.33, 95% CI: 2.32, 4.78), and a low CD4 count were significantly associated with precancerous cervical lesions.
Conclusions: In Africa, the overall prevalence of precancerous cervical lesions is high (17%). The findings of this review highlight that health professionals, health administrators, and all other concerned bodies need to work in collaboration to expand comprehensive cervical cancer screening methods in healthcare facilities for early detection and treatment of cervical lesions. Further priorities include increasing community awareness and health education, expanding visual inspection of the cervix with acetic acid in rural areas, offering special attention to high-risk groups such as HIV-positive women, encouraging adherence to antiretroviral therapy for HIV-positive women, addressing risky sexual behaviors and practices, and advocating early detection and treatment of precancerous cervical lesions.
by Sompot Jantarawong, Wipapan Khimmaktong, Pharkphoom Panichayupakaranant, Yutthana Pengjam
The ternary complex of curcuminoid-rich extract (CRE-Ter) is a water-soluble Curcuma longa extract formulation containing 14% w/w curcuminoids, hydroxypropyl-β-cyclodextrin, and polyvinylpyrrolidone K30. This study aimed to investigate the biomolecular effects of CRE-Ter on differentiation of bone cells (murine MC3T3-E1 preosteoblasts), atrophy of muscle cells (dexamethasone-treated murine C2C12 myotubes), and irisin expression. In MC3T3-E1 preosteoblasts, CRE-Ter treatment significantly and dose-dependently increased alkaline phosphatase activity, calcium deposition, and expression of Bmp-2, Runx2, and collagen 1a. CRE-Ter at 5, 10, and 20 µg/mL significantly upregulated β-catenin expression. CRE-Ter improved the atrophy of dexamethasone-treated C2C12 myotubes, significantly and dose-dependently decreasing proinflammatory cytokine (TNF-α and IL-6) expression while increasing FNDC5 and irisin expression and nitric oxide production. Dexamethasone promoted β-catenin and total p38 expression in C2C12 myotubes; CRE-Ter at 2.5–20 µg/mL reversed the increase in β-catenin expression, whereas 2.5 µg/mL reversed total p38 expression. Crosstalk experiments further revealed that conditioned medium from C2C12 myotubes enhanced osteocalcin expression in MC3T3-E1 osteoblasts. Molecular docking simulations using CB-Dock2 showed strong interactions between each curcuminoid molecule and irisin. Therefore, CRE-Ter may stimulate osteoblast differentiation, ameliorate myotube atrophy, and increase irisin expression, indicating its therapeutic potential in osteoporosis, sarcopenia, and osteosarcopenia.
by Jin-Hwa Kim, Ji-Soo Jeong, Jeong-Won Kim, Eun-Hye Chung, Su-Ha Lee, Je-Won Ko, Youn-Hwan Hwang, Tae-Won Kim
Moutan Cortex (MC), the dried root bark of Paeonia suffruticosa, is used in traditional Chinese and Korean medicine to treat enteritis for its anti-inflammatory properties. This study compared the pharmacokinetic (PK) profiles of paeonol and paeoniflorin in normal and dinitrobenzene sulfonic acid (DNBS)-induced colitis rats and determined how repeated low-dose MC [MC(L), 0.5 g/kg] or high-dose MC [MC(H), 2.5 g/kg] alters PK and disease severity. Using ultra-performance liquid chromatography–tandem mass spectrometry, we found that, compared with controls, DNBS modestly increased paeonol AUClast (NC: 247.8 ± 63.7 vs DNBS: 337.0 ± 120.8 hr*ng/mL) and slightly decreased paeoniflorin AUClast (NC: 474.1 ± 11.7 vs DNBS: 463.7 ± 106.8 hr*ng/mL) (ns). After repeated dosing, the maximum plasma concentration (Cmax) of paeonol was higher in the MC(H) than in the MC(L) group (MC(L): 63.81 ± 29.74 vs MC(H): 4221.5 ± 1579.2 ng/mL), and the paeoniflorin Cmax in the MC(H) group was likewise higher than in the MC(L) group (MC(L): 60.5 ± 15.3 vs MC(H): 164.7 ± 74.7 ng/mL).
by Hailemariam Gezie, Endalk Birrie Wondifraw, Muluken Amare Wudu, Habtam Gelaye, Fekadeselassie Belege Getaneh
Background: Neural tube defects (NTDs) are severe congenital anomalies resulting from the incomplete closure of the embryonic neural tube, affecting around 300,000 newborns globally each year and leading to significant mortality and disability. While high-income countries have seen a reduction in NTD prevalence, developing nations like Ethiopia continue to face high rates. Families impacted by NTDs often endure emotional challenges, including grief, anxiety, and social isolation. This study aims to investigate the birth prevalence of NTDs and the associated parental stress, emphasizing the wider effects on families.
Methodology: An institution-based cross-sectional study was conducted in Dessie and Debre Berhan comprehensive specialized hospitals from July 24, 2023, to July 24, 2024, to evaluate the birth prevalence of NTDs and the associated parental stress among parents of children aged 1 month to 12 years diagnosed with NTDs. A total of 308 parent-child pairs participated in the study. Data were gathered using a pretested questionnaire and an 18-item Parenting Stress Scale. Statistical analysis was performed using Stata version 17, where linear regression was utilized to identify significant predictors after verifying the necessary assumptions. The findings were presented in multiple formats for clarity and comprehensibility.
Results: The overall birth prevalence of neural tube defects was found to be 0.0052 (95% CI: 0.0038, 0.0067), which translates to 52 cases per 10,000 deliveries. Key factors associated with increased parental stress included being a mother (β = 2.51), older parental age (β = 0.18), the child’s age (β = 0.81), a prior history of having children with NTDs (β = 7.88), and the presence of a ventriculoperitoneal shunt in the child (β = 4.66).
Conclusion: The findings of this study indicate that the birth prevalence of NTDs is becoming a significant public health concern. Additionally, several factors contributing to increased parental stress were identified, including older parental age, the child’s age, a previous history of NTDs in siblings, and the presence of a ventriculoperitoneal shunt. These results highlight the urgent need for targeted support and resources for affected families to help mitigate the psychological impact associated with these conditions.
Acute coronary syndrome (ACS) is the leading cause of morbidity and mortality among individuals with cardiovascular disease, accounting for half of all global cardiovascular-related deaths. No prior research has examined ACS treatment outcomes and associated factors in the study area. This study aimed to evaluate the risk factors and treatment outcome of ACS patients admitted to public hospitals in Harari Regional State, Eastern Ethiopia.
A retrospective hospital-based cross-sectional study was conducted among 308 ACS patients. Patient records from admissions between 1 November 2018 and 31 October 2023 were reviewed, with data collected between 10 January and 10 February 2024 using a structured checklist adapted from previous research. Statistical analysis was performed using SPSS V.25.0, with bivariable and multivariable logistic regression used to identify significant associations.
The mean patient age was 56.4±16 years, with males comprising 77.3% of participants. Half (51.6%) resided in rural areas, and only 16.2% presented within 12 hours of symptom onset. Overall, 81 patients (26.3%) experienced a poor treatment outcome for ACS, including 39 (12.7%) in-hospital deaths, 24 (7.8%) referrals to higher-level facilities and 18 (5.8%) who left against medical advice. Factors significantly associated with poor outcome included hospital presentation more than 72 hours after symptom onset (AOR 2.734 (95% CI 1.006 to 7.435)) and reduced left ventricular ejection fraction (LVEF).
Poor treatment outcome was independently predicted by the presence of ischaemia features on echocardiography and reduced LVEF.
The MD Anderson Oropharynx Cancer (MDA-OPC) cohort is a unique single-institution, prospective longitudinal cancer cohort. The cohort aims to enhance the therapeutic index of OPC management by supporting data needs for independent investigators to conduct rigorous observational studies examining exposures and factors associated with acute and late toxicities, cancer progression, recurrence, new malignancies and quality of life in OPC survivors.
A total of 1811 patients with OPC with a minimum follow-up of 6 months have been consented to our prospective registry between 18 March 2015 and 29 December 2023. Clinical and treatment (Tx) data are available on all patients, including previously untreated patients (1443, 80%). Most previously untreated patients (97%) consented to longitudinal patient-reported outcomes and functional assessments for critical time points including pre-Tx, during-Tx and post-Tx at 3–6 months, 12 months, 18–24 months and annually up to 5 years.
The median age for the MDA-OPC cohort is 66 years (range, 25–96), with the majority being male (89%), white (92%) and with human papillomavirus (HPV)/p16-associated OPC (88%) primarily located in the tongue base or tonsil (90%). For previously untreated patients, 79% were diagnosed with stage I/II disease, and nearly half underwent curative intent chemoradiation. Overall survival was significantly higher for HPV/p16-associated OPC at 1 year (98% vs 93%) and 5 years (83% vs 54%).
Future work includes expansion of the MDA-OPC cohort and survivorship surveillance to 10 years under the recently funded OPC-SURVIVOR research programme (P01CA285249), which aims to identify non-invasive, clinic-ready biomarkers and examine novel phenotypes and mechanistically matched mitigation strategies for latent OPC sequelae. Additionally, we aim to expand our advanced data infrastructure by integrating large data streams from parallel clinical trials and imaging registries.
Neutropenic fever (NF) has a crude mortality rate of 3–18%. International guidelines recommend that all patients with NF receive ultrabroad-spectrum antibiotics (UBSAs) within 1 hour of emergency department (ED) registration. However, over 70% of patients presenting to hospital with suspected NF (sNF) cannot obtain an absolute neutrophil count (ANC) result within 1 hour, do not have NF and do not require UBSAs. In ED and hospitalised patients with sNF, we hypothesise that the ASTERIC protocol effectively and safely reduces the use of UBSAs compared with standard care alone.
This pragmatic, parallel, multicentre, type 1, hybrid effectiveness-implementation, stepped-wedge, before-and-after, cluster randomised controlled trial aims to evaluate whether antibiotic prescribing can be safely reduced through implementing a multifaceted antibiotic stewardship intervention (ASTERIC) in adult patients with sNF presenting to EDs. sNF was defined as a fever with a single oral temperature of ≥38.3°C (101°F) within 24 hours before ED registration or a temperature of ≥38.0°C (100.4°F) sustained over a 1-hour period, following last chemotherapy or targeted therapy within 6 weeks for any solid tumour, or in any period following therapies against leukaemia, lymphoma, myelodysplastic syndrome, aplastic anaemia or multiple myeloma, or receipt of haematopoietic stem cell transplantation (HSCT). The study will involve eight hospitals in Hong Kong with variable baseline practice. We will include 704 adult patients (352 patients in pre-implementation and post-implementation periods, respectively) with sNF (tympanic temperature ≥38.3°C) and 48 staff participants (6 staff participants in each hospital). Healthcare professionals will receive a multifaceted stewardship intervention consisting of risk assessment tools, fast-track ANCs, a decision tool for patient management and antibiotic use, supported by an educational package and staff interaction programmes (ASTERIC protocol). Patients’ blood ANC, and cancer therapy and chronic illness therapy scores will be measured. The RE-AIM (Reach, Effectiveness, Adoption, Implementation, and Maintenance) and Proctor conceptual frameworks will be followed for evaluation of implementation. The main outcome measures are the mean total dose of UBSAs prescribed in 7 days and serious adverse events at 30 days. Data analysis will incorporate intention-to-treat, per-protocol and as-treated analyses for service outcomes (effectiveness, safety, quality of life assessments and cost-effectiveness) and mixed methods for implementation outcomes, informed by the Theoretical Domains Framework. We expect that the study results will inform health policy with improvement in hospital services in treating stable sNF, evidenced by improved safe antibiotic stewardship, early antibiotic de-escalation and reduced costs and length of stay.
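To make the quoted fever criterion concrete, here is a toy encoding of the sNF temperature definition. The function and variable names are our own illustrative assumptions, not part of the ASTERIC protocol materials; "sustained" is approximated as a run of consecutive readings staying at or above 38.0°C for at least one hour.

```python
# Toy check of the two sNF temperature thresholds described above.
from datetime import datetime, timedelta

def meets_snf_fever_criterion(readings, registration_time):
    """readings: iterable of (timestamp, oral_temp_celsius) pairs."""
    window_start = registration_time - timedelta(hours=24)
    recent = sorted((t, x) for t, x in readings if t >= window_start)
    # Criterion 1: a single oral temperature >= 38.3 C within 24 h before ED registration.
    if any(x >= 38.3 for _, x in recent):
        return True
    # Criterion 2: >= 38.0 C sustained over a 1-hour period, approximated as a
    # run of consecutive readings all >= 38.0 C spanning at least one hour.
    run_start = None
    for t, x in recent:
        if x >= 38.0:
            run_start = run_start or t
            if t - run_start >= timedelta(hours=1):
                return True
        else:
            run_start = None  # a cooler reading breaks the run
    return False

now = datetime(2025, 1, 1, 12, 0)
obs = [(now - timedelta(hours=2), 38.1),
       (now - timedelta(minutes=90), 37.5),   # sub-38.0 reading breaks the run
       (now - timedelta(minutes=50), 38.1)]
print(meets_snf_fever_criterion(obs, now))    # False: neither criterion is met
```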
The institutional review boards of all study sites approved this study. This study will establish whether the ASTERIC protocol safely improves antibiotic stewardship and clinical management in adult patients with sNF. We will disseminate the findings through peer-reviewed publications, conference presentations and educational activities. All patients with sNF will be managed under the new protocol, which is agreed at hospital level. Randomisation is at hospital level, not patient level. Patient consent is sought for follow-up and data access, not for treatment. Staff consent is sought for interviewing.
by Sang Ah Lee, Jin-Myung Kim, Hye Eun Kwon, Youngmin Ko, Joo Hee Jung, Sung Shin, Young Hoon Kim, Sung-Han Kim, Hyunwook Kwon
Purpose: Optimal perioperative antibiotic prophylaxis in kidney transplantation remains undefined despite routine antibiotic administration to prevent infections. In this retrospective observational cohort study with historical comparison, we compared the clinical efficacy of 6 days of ampicillin/sulbactam vs. a single dose of cefazolin.
Materials and methods: We retrospectively analyzed 2322 kidney transplantation recipients at a single center, with the evaluation period spanning from 2015 through 2021. Patients were divided into 2 groups based on the perioperative antibiotic regimen received: 971 patients received ampicillin/sulbactam, and 1351 received cefazolin. This study focused on evaluating the impact of these regimens on postoperative infection incidence and the 6-month acute rejection (AR) rates.
Results: The cefazolin group exhibited a tendency toward higher urinary tract infection rates within 1 month after transplantation (3.4% vs. 2.2%, p = 0.078). There were no significant differences in surgical site infections between the groups. The 6-month AR rates were significantly lower in the cefazolin group than in the ampicillin/sulbactam group (5.1% vs. 7.9%, p = 0.009). Cefazolin was also confirmed to be significantly associated with reduced 6-month AR rates in the multivariable logistic regression analysis (odds ratio 0.63, 95% confidence interval [0.45-0.89], p = 0.009).
Conclusion: In this study, we observed that a single dose of cefazolin as perioperative antibiotic prophylaxis may lead to higher rates of postoperative urinary tract infections, but it could potentially lower the incidence of acute rejection within six months.
by Danai Sangthong, Pradit Sangthong, Warin Rangubpit, Prapasiri Pongprayoon, Eukote Suwan, Kannika Wongpanit, Wissanuwat Chimnoi, Pacharathon Simking, Sinsamut Sae Ngow, Serge Morand, Roger W. Stich, Sathaporn Jittapalapong
Phylogenetic and population genetic analyses were conducted on tick specimens collected from cattle in northern, northeastern, central, and southern regions of Thailand. Morphological identification indicated these ticks consisted of three species: Rhipicephalus microplus from all four regions, R. sanguineus from the northern and northeastern regions, and a Haemaphysalis species only collected from the northeastern region. Analysis of cytochrome c oxidase subunit I gene (COI) sequences identified R. microplus clades A and C, while clade B was not detected in this study. The same analysis indicated specimens morphologically identified as Haemaphysalis were H. bispinosa, confirming previous reports of their prevalence in northeastern Thailand. H. bispinosa showed low haplotype and nucleotide diversity, suggesting either a bottleneck or founder effect. Both R. microplus clades displayed high haplotype diversity and low nucleotide diversity, a pattern associated with population expansion. Genetic structural analysis revealed significant genetic differences in R. microplus clade A, especially between mainland (northern, northeastern, and central regions) and peninsular (southern region) populations, which indicated limited gene flow between these areas while suggesting movement of these ticks across the mainland. The sequence analyses described in this report enhance understanding of the natural history of ticks in Thailand and are expected to guide and strengthen tick control strategies across Southeast Asia.
by Efthymios Papadopoulos, Dmitry Rozenberg, Andy Kin On Wong, Sharon Hiu Ching Law, Sarah Costa, Angela M. Cheung, Shabbir M. H. Alibhai
Background: Skeletal muscle index (SMI), grip strength, and physical performance have been shown to predict clinically relevant outcomes in geriatric oncology. However, their predictive ability for chemotherapy toxicity is poorly understood. We examined whether SMI, grip strength, or physical performance are independently associated with severe toxicity among older adults receiving chemotherapy.
Methods: Older adults (≥65 y) who had received chemotherapy at an academic cancer center between June 2015 and June 2022 were included in the analysis. SMI prior to chemotherapy was determined via computed tomography (CT) as the entire cross-sectional area of muscle (cm²) at the third lumbar vertebra (L3) divided by the square of patient height in meters. Grip strength and lower extremity physical performance were measured prior to chemotherapy. Multivariable logistic regression was used to examine the independent associations of SMI, low grip strength, and low physical performance with severe (grade ≥3) chemotherapy toxicity.
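The SMI definition above is a simple ratio, which the short sketch below makes explicit. The muscle area and height values are hypothetical and are not taken from this study.

```python
# Worked example of the SMI definition: L3 muscle area / height squared.
def skeletal_muscle_index(l3_muscle_area_cm2: float, height_m: float) -> float:
    """SMI in cm^2/m^2 from a single L3 CT slice."""
    return l3_muscle_area_cm2 / height_m ** 2

smi = skeletal_muscle_index(l3_muscle_area_cm2=140.0, height_m=1.72)
print(f"SMI = {smi:.1f} cm^2/m^2")  # 140 / 1.72^2 = 47.3 cm^2/m^2
```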
Results: Of the 115 older adults in the study, 71.3% were male. The most common disease site was genitourinary (53.9%) and most participants received chemotherapy with palliative intent (67.8%). A total of 69 (60.0%) participants experienced at least one grade ≥3 toxicity during the study. In multivariable analyses, low grip strength per the Sarcopenia Definitions and Outcomes Consortium (SDOC) definition was significantly associated with grade ≥3 toxicity (adjusted odds ratio (OR): 2.77, 95% CI: 1.03–7.45, p = 0.044). SMI either as a continuous (OR: 1.03, 95% CI: 0.97–1.09, p = 0.40) or categorical variable (OR: 1.17, 95% CI: 0.47–2.89, p = 0.74) was not predictive of grade ≥3 toxicity. Similarly, low physical performance was not significantly associated with grade ≥3 toxicity (OR: 2.06, 95% CI: 0.86–4.95, p = 0.11).
Conclusion: Low grip strength may predict grade ≥3 toxicity among older adults receiving chemotherapy. Integrating grip strength into geriatric assessment may help clinicians identify older adults who might be at greater risk for severe chemotherapy toxicity.
While loneliness has been recognised as a global public health concern, there are still knowledge gaps about how to prevent or reduce loneliness. The Social Relationship Expectations (SRE) Framework (Akhter-Khan et al., 2023) has been developed to enhance mechanism-based interventions targeting individuals’ expectations for social relationships. However, no scale has yet been developed to measure these expectations. We aim to measure SRE across the six interdependent dimensions identified in the theoretical framework and across diverse settings. This protocol outlines the methodology for developing the SRE scale.
The scale will be developed using both inductive and deductive techniques in a multicountry observational study. First, items will be extracted from published qualitative studies on loneliness and SRE across 15 lower-middle-income countries and from a qualitative focus group study with older Myanmar and Thai adults. Second, using a Delphi process for item development, experts across five world regions (Africa, the Americas, Asia, Europe and Oceania) will be involved in the item selection and scale creation process. A preliminary item pool will be administered in English, German and Chinese. Classical test theory as well as network analysis will be used to assess the dimensionality of the scale, understand item relationships and clusters, and select the final items for the SRE scale.
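One classical-test-theory step in item selection is internal-consistency screening. Below is a minimal Cronbach's alpha sketch on simulated item responses; it illustrates the general technique rather than the protocol's specified analysis, and the simulated data are entirely hypothetical.

```python
# Minimal classical-test-theory check: Cronbach's alpha on an item matrix
# (rows = respondents, columns = candidate scale items).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    k = items.shape[1]                        # number of items
    item_vars = items.var(axis=0, ddof=1)     # per-item variance
    total_var = items.sum(axis=1).var(ddof=1) # variance of the sum score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))                          # shared trait
responses = latent + rng.normal(scale=0.8, size=(200, 6))   # 6 correlated items
print(f"alpha = {cronbach_alpha(responses):.2f}")
```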
Ethics approval for the scale development has been obtained from King’s College London (reference number: MRSP-24/25-46512). Informed consent will be obtained from all participants prior to completing the cognitive interviews and online surveys. Results will be disseminated in peer-reviewed journals in collaboration with coauthors across different countries and disciplines.
To explore, from nurses' perspectives, the determinants affecting implementation of an Electronic Health Record-based information system and their associations with implementation fidelity, based on the Theoretical Domains Framework (TDF).
Exploratory sequential mixed-method design.
In stage one, semi-structured interviews with 53 purposively selected nurses, analysed using directed content analysis, explored the TDF domains influencing implementation of the information system. In stage two, a cross-sectional survey informed by the qualitative findings was conducted among 482 nurses to identify the most relevant and relatively important TDF domains using generalised linear regression models.
The qualitative interviews generated 13 TDF domains identified as major influencing factors, including technology characteristics, knowledge, attitudes, role agreement, self-efficacy, goal-setting, information circulation, and communication among nurses. Quantitative findings showed that 70% of nurses used the information system to print the written form, and only 34% offered verbal education consistently. Regression analysis identified nine domains as relevant and important factors for implementation fidelity: knowledge, skills, role identity, beliefs in consequences, beliefs in capabilities, intentions, goals, memory and decision processes, and environmental context.
Our findings confirmed previous evidence on determinants of implementing digital health technologies, including knowledge, competencies, perceived effectiveness, role agreement, intentions, decision processes, and environmental context. Additionally, we highlighted the importance of goal-setting for successful implementation.
This study investigated the factors most relevant to successful implementation of the nurse-led information system for post-acute care, from nurses' perspectives. These results can guide nurse practitioners in implementing similar initiatives and support evidence-based decision-making. Researchers can also further investigate the relationships between the identified determinants.
Journal Article Reporting Standards for Mixed Methods Research.
No patient or public contribution.
by Oumarou I. Wone Adama, Iman Frédéric Youa, Alexandra Bitty-Anderson, Arnold Junior Sadio, Rogatien Comlan Atoun, Yao Rodion Konu, Hezouwe Tchade, Martin Kouame Tchankoni, Kokou Herbert Gounon, Kparakate Bouboune Kota-Mamah, Abissouwessim Egbare Tchade, Godonou Amivi Mawussi, Fiali Ayawa Lack, Fifonsi Adjidossi Gbeasor-Komlavi, Anoumou Claver Dagnra, Didier Koumavi Ekouevi
Introduction: In Togo, the syndromic approach is used for the diagnosis and management of sexually transmitted infections (STIs). The aim of this study was to evaluate the syndromic approach for diagnosis of STIs among female sex workers (FSW) in Lomé, Togo.
Methods: A cross-sectional study was carried out from September to October 2023 among FSW in Lomé, Togo. FSW aged 18 years and above were included. A gynecological examination was performed for syndromic diagnosis, and the Xpert® CT/NG assay was used to screen vaginal swabs for Chlamydia trachomatis (CT) and Neisseria gonorrhoeae (NG). The performance (predictive values) of the syndromic approach to STI diagnosis was evaluated using the Xpert® CT/NG test as the gold standard.
Results: A total of 357 FSW were recruited. The median age of FSW was 32 years (IQR 26–40 years) and 8.2% had attained a higher level of education. The prevalence of syndromic STI among FSW was 33.3%. Vaginal swabs were positive for CT (8.4%) and NG (8.7%), with a prevalence of bacterial STIs (CT and/or NG) of 14.3%. The syndromic approach to STI diagnosis demonstrated a positive predictive value of 24.3%.
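A PPV this low follows directly from Bayes' rule at this prevalence. The snippet below shows the arithmetic; the sensitivity/specificity pair is hypothetical, chosen only to produce numbers of similar magnitude to those reported, not taken from the study.

```python
# Bayes'-rule check: PPV as a function of prevalence and test accuracy.
def ppv(prevalence, sensitivity, specificity):
    tp = prevalence * sensitivity              # true-positive mass
    fp = (1 - prevalence) * (1 - specificity)  # false-positive mass
    return tp / (tp + fp)

print(f"PPV = {ppv(prevalence=0.143, sensitivity=0.55, specificity=0.71):.1%}")
# With modest accuracy at 14.3% prevalence, most syndromic positives are
# false positives, which is consistent with the low PPV reported above.
```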
Conclusion: The prevalence of STIs is relatively high among FSW in Lomé. According to this study, the diagnosis of STIs using the syndromic approach has limited relevance. National STI screening and management policies urgently need to be rethought, incorporating recent technological advances.
by Ariffin Kawaja, Aminath Shiwaza Moosa, Eric Kam Pui Lee, Ian Kwong Yun Phoon, Andrew Teck Wee Ang, Zi Ying Chang, Aileen Chelsea Ai’En Lim, Jonathan Yap, Weiting Huang, Ding Xuan Ng, Melvin Yuansheng Sng, Hao Yuan Loh, Chirk Jenn Ng
Introduction: Recent hypertension guidelines recommend ambulatory blood pressure monitoring (ABPM) for accurate diagnosis and monitoring. However, patients’ experiences with cuff and wearable ABPM devices in primary care remain unclear. This study compared the acceptance of three devices (oscillometry cuff, tonometry wrist, and photoplethysmography chest devices) among patients with hypertension in primary care.
Methods: A multi-method study was conducted. Thirty-five participants with hypertension were recruited from two public primary care clinics in Singapore. All participants used cuff-based and either wrist or chest wearable devices for 24 hours. Structured surveys and in-depth audio-recorded interviews were used to gather feedback on their views, experiences, and challenges using the devices. The interviews were thematically analysed, and the surveys were analysed using descriptive statistics.
Results: All participants used the cuff device (n = 35), while the wrist and chest devices were used by two-thirds (n = 22) and a third (n = 11) of the participants, respectively. The device usability questionnaire found that most participants were satisfied with the chest device, which did not disrupt their daily activities. Conversely, the arm cuff device interfered with daily activities (48%) and sleep (26%), was cumbersome (32%), and caused embarrassment (26%). The wrist device was uncomfortable (33%) and painful (22%) for some participants. The qualitative data were categorised into five themes: comfort, convenience, perceived accuracy, impact on routine, and impact on sleep. Participants found the chest device more comfortable and convenient than the cuff and wrist devices. The cuff device was perceived as the most accurate due to its inflation-based BP measurement. All devices minimally affected routines and sleep, though participants expressed safety concerns about the cuff device, particularly while driving.
Conclusion: While wearable ABPM devices offer increased comfort and convenience and reduced impact on patients’ daily activities, concerns regarding their accuracy must be addressed before the widespread adoption of these devices in routine clinical practice.
by Chalachew Genet, Wendemagegn Enbiale, Anna Rommerskirchen, Rajiha Abubeker, Wudu Tafere, Tsehaynesh Gebre-Eyesus, Michael Getie, Alem Tsega, Muluken Acham, Addisu Melese, Tewachew Awoke, Wondemagegn Mulu, Degu Ashagrie, Tadele Amsalu, Achenef Motbainor, Endalew Gebeyehu, Mulugeta Kibret, Bayeh Abera, Endalkachew Nibret, Abaineh Munshea
Introduction: Extended-spectrum β-lactamase (ESBL)- and carbapenemase-producing Escherichia coli (E. coli) and Klebsiella pneumoniae (K. pneumoniae) in raw cow milk are among the leading contributors to the spread of antimicrobial resistance (AMR). Due to the misuse and overuse of antibiotics in dairy farms, cow’s milk has become a reservoir of ESBL- and carbapenemase-producing E. coli and K. pneumoniae, posing a growing public health threat, especially in areas where the consumption of raw milk is common. However, compared to the clinical sector, the prevalence of ESBL- and carbapenemase-producing E. coli and K. pneumoniae in the food sector is under-studied.
Objective: This study aimed to determine the prevalence of ESBL- and carbapenemase-producing E. coli and K. pneumoniae in raw bulk cow milk from Dairy Cooperatives in Northwest Amhara, Ethiopia.
Methods: A cross-sectional study was conducted from January to April 2025 among 257 dairy cooperative member farms. Sociodemographic and related data were collected using a structured questionnaire. Five milliliters of raw bulk cow milk were collected aseptically from each farm across four Dairy Cooperatives (DCs) (DC-A to DC-D). Ten microliters of each milk sample were inoculated directly onto MacConkey agar. Escherichia coli and K. pneumoniae were identified using standard microbiological techniques. Antimicrobial susceptibility testing was performed using the Kirby-Bauer disk diffusion method. ESBL and carbapenemase production were confirmed phenotypically via combination disk tests and the modified carbapenem inactivation method, respectively.
Results: The prevalence of E. coli and/or K. pneumoniae in raw cow milk was 21% (95% CI, 16.5–26.4%), with respective individual prevalences of 8.2% and 14.8%. ESBL-producing E. coli and K. pneumoniae accounted for 23.8% and 15.8% of isolates, respectively, while 2.6% of isolates (only K. pneumoniae) were carbapenemase producers. Resistance to ampicillin and amoxicillin-clavulanic acid exceeded 70%. All E. coli and 94.7% of K. pneumoniae isolates remained susceptible to carbapenems. Nearly half of all isolates (45.8%) were multidrug resistant (MDR), and 51.9% of MDR isolates were co-resistant to at least six antibiotics. Having additional non-farming occupations (AOR: 4.17, 95% CI: 1.49–11.67), large herd size (AOR: 3.21, 95% CI: 1.26–8.18), having pet animals (AOR: 6.53, 95% CI: 1.39–30.7), and use of a calabash milk pail (AOR: 7.37, 95% CI: 1.45–37.49) were significantly associated with a positive milk culture for E. coli and/or K. pneumoniae.
Conclusion: Raw milk in Northwest Amhara harbors ESBL- and carbapenemase-producing E. coli and K. pneumoniae, posing a substantial public health risk that is compounded by MDR and resistance to critically important antimicrobials. Strengthened AMR surveillance, improved farm hygiene, restricted antibiotic use, and public education on milk safety are urgently needed.
by Yoo Kyung Choi, Seok Hyun Son, Hong Seok Jang, In-Ho Kim, Sea-Won Lee, Soo-Yoon Sung
Background: Radiotherapy for locally advanced esophageal cancer can induce lymphopenia, potentially worsening outcomes. This study examines the association between clinical outcomes and the effective dose to the immune cells (EDIC), a measure of lymphocyte radiation exposure.
Methods: We retrospectively analyzed 107 patients with locally advanced esophageal squamous cell carcinoma treated with definitive concurrent chemoradiotherapy (CCRT). The EDIC was calculated from the mean lung dose, mean heart dose, and integral total body dose using established models. Patients were stratified into high (n = 42) and low (n = 65) EDIC groups using a cut-off value of 4.28 Gy. Survival outcomes, including overall survival (OS), progression-free survival (PFS), locoregional failure-free survival (LRFS), and distant metastasis-free survival (DMFS), were assessed.
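For orientation, EDIC is a weighted combination of mean organ doses. The sketch below follows the commonly cited Jin et al. formulation for conventionally fractionated thoracic radiotherapy; the coefficients are recalled from the literature and should be treated as an assumption to verify against the model the authors actually cite.

```python
# Sketch of an EDIC calculation; coefficients assume the commonly cited
# Jin et al. model and illustrative input doses, not this study's data.
def edic_gy(mean_lung_dose, mean_heart_dose, mean_body_dose, n_fractions):
    """Effective dose to immune cells (Gy) from mean organ doses (Gy)."""
    blood_in_lung = 0.12 * mean_lung_dose
    blood_in_heart = 0.08 * mean_heart_dose
    # Remaining circulating blood, scaled by the number of fractions.
    blood_elsewhere = (0.45 + 0.35 * 0.85 * n_fractions / 45) * mean_body_dose
    return blood_in_lung + blood_in_heart + blood_elsewhere

print(f"EDIC = {edic_gy(10.0, 15.0, 4.0, 28):.2f} Gy")  # ~4.9 Gy for these inputs
```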
Results: The 5-year OS and PFS rates were significantly lower in the high EDIC group than in the low EDIC group (51.9% vs. 66.6%, p = 0.043; 20.8% vs. 31.8%, p = 0.002, respectively). Multivariate analysis identified high EDIC as an independent predictor of poorer OS (hazard ratio (HR): 2.06, 95% confidence interval (CI): 1.1–3.86, p = 0.024) and PFS (HR: 1.7, 95% CI: 1.04–2.78, p = 0.034). Similarly, the 5-year LRFS and DMFS rates were significantly lower in the high EDIC group than in the low EDIC group (24.1% vs. 34.9%, p = 0.003; 29.0% vs. 44.0%, p = 0.018, respectively).
Conclusion: A higher EDIC is an independent predictor of poor survival in patients with esophageal squamous cell carcinoma undergoing CCRT. Reducing radiation exposure to the immune system through optimized radiation planning and lymphocyte-sparing techniques may improve patient outcomes.