This study aimed to examine the level of vicarious posttraumatic growth among intensive care unit nurses in China and explore the mediating role of death coping ability in the relationship between moral resilience and vicarious posttraumatic growth.
A multicentre, cross-sectional study was conducted in accordance with the STROBE guidelines.
Between January and March 2025, a questionnaire survey was conducted among 666 intensive care unit nurses from nine tertiary Grade A hospitals across five provinces in China. Participants completed three standardised instruments: the Rushton Moral Resilience Scale, the Coping with Death Scale–Short Version, and the Vicarious Posttraumatic Growth Inventory. We used IBM SPSS 27.0 for descriptive statistics, univariate analyses, and correlation analyses, and employed AMOS 27.0 to perform structural equation modelling for testing mediation effects.
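The mediation test described here follows standard logic: an indirect effect equal to the product of the path from moral resilience to death coping ability and the path from death coping ability to vicarious posttraumatic growth, with a bootstrap confidence interval. A minimal Python sketch of that logic is shown below; the variable names and simulated data are hypothetical, and the sketch stands in for, rather than reproduces, the SPSS/AMOS structural equation model used in the study.

```python
# Hedged sketch: regression-based mediation with a percentile bootstrap.
# Variable names (resilience, coping, growth) and the simulated data are
# illustrative only; the study itself fitted the model in AMOS.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 666
resilience = rng.normal(size=n)
coping = 0.5 * resilience + rng.normal(size=n)                  # assumed a path
growth = 0.4 * coping + 0.3 * resilience + rng.normal(size=n)   # assumed b and c' paths
df = pd.DataFrame({"resilience": resilience, "coping": coping, "growth": growth})

def indirect_effect(data):
    # a path: predictor -> mediator; b path: mediator -> outcome (controlling for predictor)
    a = smf.ols("coping ~ resilience", data).fit().params["resilience"]
    b = smf.ols("growth ~ coping + resilience", data).fit().params["coping"]
    return a * b

point = indirect_effect(df)
boot = [indirect_effect(df.sample(frac=1, replace=True)) for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {point:.3f}, 95% bootstrap CI [{lo:.3f}, {hi:.3f}]")
```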
Intensive care unit nurses demonstrated a moderate level of vicarious posttraumatic growth. Moral resilience was positively associated with both death coping ability and vicarious posttraumatic growth. Death coping ability was found to play a partial mediating role in the relationship between moral resilience and vicarious posttraumatic growth.
Moral resilience and death coping ability are key factors associated with vicarious posttraumatic growth among intensive care unit nurses. Nurses with stronger moral resilience are more likely to cope constructively with death-related stress, which may support psychological growth in trauma-intensive environments.
This study highlights the need to enhance intensive care unit nurses' moral and emotional capacities through ethics education, emotional coping training, and institutional support strategies. Strengthening these competencies may foster professional development and mental wellbeing in critical care settings.
by Andrea C. Aplasca, Peter B. Johantgen, Christopher Madden, Kilmer Soares, Randall E. Junge, Vanessa L. Hale, Mark Flint
Amphibian skin is integral to normal physiological processes in the body and supports both innate and adaptive immunity against pathogens. The amphibian skin microbiota comprises a complex assemblage of microbes and is shaped by internal host characteristics and external influences. Skin disease is a significant source of morbidity and mortality in amphibians, and increasing research has shown that the skin microbiota is an important component of host health. The Eastern hellbender (Cryptobranchus alleganiensis alleganiensis) is a giant salamander declining in many parts of its range, and captive-rearing programs are important to hellbender recovery efforts. Survival rates of juvenile hellbenders in captive-rearing programs are highly variable, and mortality rates are overall poorly understood. Deceased juvenile hellbenders often present with low body condition and skin abnormalities. To investigate potential links between the skin microbiota and body condition, we collected skin swab samples from 116 juvenile hellbenders and water samples from two holding tanks in a captive-rearing program. We used 16S rRNA gene sequencing to characterize the skin and water microbiota and observed significant differences in the skin microbiota by weight class and tank. The skin microbiota of hellbenders housed in tanks in close proximity were generally more similar than those of hellbenders housed farther apart. A single taxon, Parcubacteria, was differentially abundant by weight class only and was observed in higher abundance in low-weight hellbenders. These results suggest a specific association between this taxon and low-weight hellbenders. Additional research is needed to investigate how husbandry factors and potentially pathogenic organisms, such as Parcubacteria, impact the skin microbiota of hellbenders and, ultimately, morbidity and mortality in the species.
by Jinghui Xie, Haofang Guan, Maohui Liu, Weijun Ding
Background: Current obesity treatments include behavioral interventions, pharmacotherapy and surgery. Recently, the interaction of ‘medicinal food’ products such as the plant Crataegus pinnatifida with the gut microbiota has shown promise as an alternative therapeutic strategy for obesity.
Methods: We obtained secondary metabolites (SMs) of the obesity-related gut microbiota and of Crataegus pinnatifida from the gutMGene and NAPSS databases. Bioinformatics analysis was used to elucidate key targets and signaling pathways, whereas molecular docking, molecular dynamics simulation and quantum chemical calculations identified crucial SMs involved in these pathways. The toxicity and physicochemical properties of these SMs were also assessed.
Results: Phosphoinositide-3-kinase regulatory subunit 1 (PIK3R1), a key mediator in the phosphoinositide 3-kinase (PI3K)/protein kinase B (Akt) pathway that is crucial for regulating insulin signaling and adipogenesis, emerged as the central hub of the protein–protein interaction (PPI) network. Quercetin, kaempferol and naringenin chalcone were predicted to bind strongly to PIK3R1, suggesting their potential as therapeutic agents against obesity.
Conclusion: The synergistic combination of Crataegus pinnatifida and the obesity-related gut microbiota holds promise as a novel therapeutic strategy for obesity by targeting PIK3R1 and modulating the PI3K/Akt signaling pathway. Further experimental validation is necessary to confirm these findings.
by Xinyu Zhang, Yoo Jung Oh, Yunhan Zhang, Jianfeng Zhu
The digital age has fueled a surge in ADHD self-diagnosis as people turn to online platforms for mental health information. However, the relationship between validation-seeking behaviors in these online communities and users’ self-perception has received limited scholarly attention. Drawing on self-verification theory and utilizing natural language processing to analyze 452,026 posts from the r/ADHD subreddit, our study uncovers distinct patterns in validation-seeking behaviors. Results show that (a) self-diagnosed individuals with ADHD are more likely to seek social validation and media validation and to report higher levels of negative self-image and internalized stigma than clinically diagnosed individuals; (b) social validation was strongly associated with both positive and negative self-perceptions; and (c) diagnosis status significantly moderated these relationships, such that the effects of social validation on self-image and stigma were consistently weaker for the self-diagnosed group. Theoretically, this study extends self-verification theory by demonstrating that professional verification hierarchically moderates self-verification effectiveness. This implies a practical need for clinicians to acknowledge online validation seeking and for digital communities to affirm user experiences while mitigating stigma.
by Xie Qiu, Shuo Hu, Shumin Dong, Haijun Sun
Objective: To develop a predictive framework integrating machine learning and clinical parameters for postoperative pulmonary complications (PPCs) in non-small cell lung cancer (NSCLC) patients undergoing video-assisted thoracic surgery (VATS).
Methods: This retrospective study analyzed 286 NSCLC patients (2022–2024), incorporating 13 demographic, metabolic-inflammatory, and surgical variables. An Improved Blood-Sucking Leech Optimizer (IBSLO), enhanced via Cubic mapping and opposition-based learning, was developed. Model performance was evaluated using AUC-ROC, F1-score, and decision curve analysis (DCA). SHAP interpretation identified key predictors.
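Decision curve analysis, listed among the evaluation methods above, compares the net benefit of acting on a model's predicted risks against treat-all and treat-none strategies across threshold probabilities, where net benefit = TP/N − (FP/N)·p_t/(1 − p_t). A minimal sketch of that calculation is given below; the logistic-regression stand-in and simulated data are assumptions for illustration, not the IBSLO-optimised AutoML pipeline used in the study.

```python
# Hedged sketch of decision curve analysis (net benefit) for a binary risk model.
# The logistic-regression stand-in and simulated data are illustrative; the study
# applied DCA to an IBSLO-optimised AutoML model for postoperative complications.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=286, n_features=13, weights=[0.7], random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=1)
risk = LogisticRegression(max_iter=1000).fit(X_tr, y_tr).predict_proba(X_te)[:, 1]

def net_benefit(y_true, risk, pt):
    """Net benefit at threshold pt: TP/N - (FP/N) * pt/(1-pt)."""
    pred = risk >= pt
    tp = np.sum(pred & (y_true == 1))
    fp = np.sum(pred & (y_true == 0))
    n = len(y_true)
    return tp / n - fp / n * pt / (1 - pt)

for pt in (0.1, 0.2, 0.3, 0.5):
    nb_model = net_benefit(y_te, risk, pt)
    nb_all = net_benefit(y_te, np.ones_like(risk), pt)   # treat-all strategy
    print(f"pt={pt:.1f}  model={nb_model:.3f}  treat-all={nb_all:.3f}  treat-none=0.000")
```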
Results: The IBSLO demonstrated significantly superior convergence performance versus the original BSLO, ant lion optimizer (ALO), Harris hawks optimization (HHO), and whale optimization algorithm (WOA) across all 12 CEC2022 test functions. Subsequently, the IBSLO-optimized automated machine learning (AutoML) model achieved ROC-AUC/PR-AUC values of 0.9038/0.8091 (training set) and 0.8775/0.8175 (testing set), significantly outperforming four baseline models: logistic regression (LR), support vector machine (SVM), XGBoost, and LightGBM. SHAP interpretation identified six key predictors: preoperative leukocyte count, body mass index (BMI), surgical approach, age, intraoperative blood loss, and C-reactive protein (CRP). Decision curve analysis demonstrated significantly higher net clinical benefit of the AutoML model compared with conventional methods across broad threshold probability ranges (training set: 8–99%; testing set: 3–80%).
Conclusion: This study establishes an interpretable machine learning framework that improves preoperative risk stratification for NSCLC patients, offering actionable guidance for thoracic oncology practice.
To identify the barriers and facilitators in the implementation of fertility preservation (FP) shared decision-making (SDM) in oncology care.
Qualitative descriptive study.
Qualitative interviews with 16 female patients with cancer and seven healthcare providers were conducted between July 2022 and April 2024. Data were analyzed using directed content analysis, guided by the implementation science framework.
We identified 22 categories comprising 38 codes as barriers to SDM implementation and 17 categories comprising 26 codes as facilitators. Findings revealed that, at the innovation level, accessibility, feasibility, interdisciplinary collaboration, and quality improvement efforts were decisive in the implementation of FP SDM. At the individual level, healthcare providers' awareness and attitudes towards FP and SDM, as well as patients' knowledge, attitudes, and capabilities in FP SDM, were crucial factors in the implementation of FP SDM. In social, economic, and organizational contexts, support from significant others, social awareness about FP, multidisciplinary care, financial assistance, and educational resources were determinants in implementing FP SDM.
Implementing FP SDM among female patients with cancer necessitates a strategic approach that considers barriers and facilitators. Educating and promoting FP SDM among the public and healthcare providers, combined with incentivizing policies, can enhance individual knowledge and awareness while achieving systemic improvements, facilitating its successful implementation.
This study provides insights into barriers and facilitators and proposes strategic approaches to enhancing FP SDM implementation, contributing to improved quality of life for cancer survivors and advancements in clinical practice.
Perioperative adverse events increase morbidity and mortality. The rate and severity of complications and the risk for subsequent mortality are increased after high-risk procedures and in elevated-risk patients. Over the past decades, a multitude of prognostic studies identified perioperative risk factors at the population level. However, to allow for the advancement of precision surgery strategies, improved risk prediction on the individual patient level is warranted. Comprehensive, consecutive, multisource, structured, high-quality patient-related and procedure-related data sets, together with thorough follow-up and combined with state-of-the-art machine-learning analyses, are needed to facilitate precise prediction of perioperative complications. Therefore, we designed and currently conduct the Heidelberg Perioperative Deep Data study (HeiPoDD). Here, we report the rationale and design of the HeiPoDD study.
HeiPoDD is a prospective, single-centre, exploratory cohort study aiming to build a large-scale deep-data repository and corresponding biomaterial collection. A total of 1040 adult patients planned for elective high-risk, non-cardiac surgery for any indication at Heidelberg University Hospital, Germany, will be included. The study-specific data set includes clinical data, laboratory values, and genome and proteome analyses, as well as plasma, serum and peripheral blood mononuclear cells (PBMCs) collected before surgery and at days 1, 3 and 7 postsurgery. Urine samples are collected before surgery and at day 1 postsurgery. Structured follow-up for perioperative complications and outcomes such as redo surgery, length of intensive care stay and length of hospital stay is conducted at days 30 and 90 and at 1 year postsurgery, and for disease progression and survival at 3 and 5 years postsurgery. All study data will be transferred to the HeiPoDD registry to allow merging with all available routine clinical data from the hospital information system, including imaging studies as well as haemodynamic and respiratory biosignals. Biomaterials will be stored in the HeiPoDD biomaterial bank to allow further analyses.
The trial protocol and amendments were approved by the ethics committee of the University of Heidelberg (S-758/2021). The protocol is registered with the German Clinical Trial Register (DRKS00024625). Participating patients’ data will be recorded only in pseudonymised form. After completion of the study, data collected during the study will be kept on file for up to 30 years. Biomedical samples collected during the study and entered into the biobank will be held for the same amount of time. The findings will be disseminated in peer-reviewed academic journals.
To develop and validate a machine learning-based risk prediction model for delirium in older inpatients.
A prospective cohort study.
Eighteen clinical features were prospectively collected from electronic medical records during hospitalisation to inform the model. Four machine learning algorithms were employed to develop and validate risk prediction models. The performance of all models in the training and test sets was evaluated using a combination of the area under the receiver operating characteristic curve (AUC), accuracy, sensitivity, Brier score and other metrics before selecting the best model for SHAP interpretation.
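For illustration only, the sketch below mirrors the evaluation workflow described above: fitting several candidate classifiers and scoring them on AUC, accuracy, sensitivity and Brier score. The synthetic data and the particular learners are assumptions, not the study's delirium dataset or its four algorithms.

```python
# Hedged sketch: comparing candidate classifiers on AUC, accuracy, sensitivity
# and Brier score, mirroring the evaluation metrics described in the Methods.
# Synthetic data and the specific learners chosen here are illustrative only.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import (accuracy_score, brier_score_loss, recall_score,
                             roc_auc_score)
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=973, n_features=18, weights=[0.85], random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=42)

models = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=300, random_state=42),
    "gradient_boosting": GradientBoostingClassifier(random_state=42),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    prob = model.predict_proba(X_te)[:, 1]
    pred = (prob >= 0.5).astype(int)
    print(f"{name:>20}  AUC={roc_auc_score(y_te, prob):.3f}  "
          f"acc={accuracy_score(y_te, pred):.3f}  "
          f"sens={recall_score(y_te, pred):.3f}  "
          f"Brier={brier_score_loss(y_te, prob):.3f}")
```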
Data from 973 older inpatients were used for model construction and validation. The AUCs of the four machine learning models in the training and test sets ranged from 0.869 to 0.992, accuracy from 0.931 to 0.962, and sensitivity from 0.564 to 0.997. Compared with the other models, the Random Forest model exhibited the best overall performance, with an AUC of 0.908 (95% CI 0.848–0.968), an accuracy of 0.935, a sensitivity of 0.992 and a Brier score of 0.053.
The machine learning model we developed and validated for predicting delirium in older inpatients demonstrated excellent predictive performance. This model has the potential to assist healthcare professionals in early diagnosis and support informed clinical decision-making.
By identifying patients at risk of delirium early, healthcare professionals can implement preventive measures and timely interventions, potentially reducing the incidence and severity of delirium. The model's ability to support informed clinical decision-making can lead to more personalised and effective care strategies, ultimately benefiting both patients and healthcare providers.
This study was reported in accordance with the TRIPOD statement.
No patient or public contribution.
by Juliana Rodrigues Tovar Garbin, Franciéle Marabotti Costa Leite, Ana Paula Brioschi dos Santos, Larissa Soares Dell’Antonio, Cristiano Soares da Silva Dell’Antonio, Luís Carlos Lopes-Júnior
A comprehensive understanding of the factors influencing the epidemiological dynamics of COVID-19 across the pandemic waves—particularly in terms of disease severity and mortality—is critical for optimizing healthcare services and prioritizing high-risk populations. Here we aimed to analyze the factors associated with short-term and prolonged hospitalization for COVID-19 during the first three pandemic waves. We conducted a retrospective observational study using data from individuals reported in the e-SUS-VS system who were hospitalized for COVID-19 in a southeastern state of Brazil. Hospitalization duration was classified as short or prolonged based on a 7-day cutoff, corresponding to the median length of hospital stay during the second pandemic wave. Bivariate analyses were performed using the chi-square test for heterogeneity. Logistic regression models were used to estimate odds ratios (ORs) and their respective 95% confidence intervals (CIs), with statistical significance set at 5%. When analyzing hospitalization duration across the three waves, we found that 51.1% (95% CI: 49.3–53.0) of hospitalizations in the first wave were prolonged. In contrast, short-duration hospitalizations predominated in the second (54.7%; 95% CI: 52.4–57.0) and third (51.7%; 95% CI: 50.2–53.2) waves. Factors associated with prolonged hospitalization varied by wave. During the first wave, older adults (≥60 years) (OR=1.67; 95% CI: 1.35–2.06), individuals with ≥10 symptoms (OR=2.03; 95% CI: 1.04–3.94), obese individuals (OR=2.0; 95% CI: 1.53–2.74), and those with ≥2 comorbidities (OR=2.22; 95% CI: 1.71–2.89) were more likely to experience prolonged hospitalization. In the second wave, the likelihood of extended hospital stays was higher among individuals aged ≥60 years (OR=2.04; 95% CI: 1.58–2.62) and those with ≥2 comorbidities (OR=1.77; 95% CI: 1.29–2.41). In the third wave, prolonged hospitalization was more frequent among older adults (OR=1.89; 95% CI: 1.65–2.17), individuals with 5–9 symptoms (OR=1.52; 95% CI: 1.20–1.92), obese individuals (OR=2.2; 95% CI: 1.78–2.73), and those with comorbidities (OR=1.45; 95% CI: 1.22–1.72 and OR=2.0; 95% CI: 1.69–2.45). In conclusion, we identified variations in hospitalization patterns across the pandemic waves, although the differences were relatively subtle. These variations likely reflect gradual shifts in the risk factors associated with prolonged hospital stays. Our findings highlight the importance of implementing targeted public health interventions, particularly those designed to reduce disease severity and improve clinical outcomes among vulnerable populations at greater risk of extended hospitalization.
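The odds ratios above come from logistic regression; a minimal sketch of how such estimates are typically derived, by exponentiating model coefficients and their confidence limits, is shown below. The simulated data and predictor names are hypothetical and only echo the kinds of variables discussed in the abstract; this is not the e-SUS-VS analysis.

```python
# Hedged sketch: odds ratios and 95% CIs from a logistic regression, obtained by
# exponentiating coefficients and their confidence limits. Data are simulated;
# predictor names only echo the kinds of variables discussed in the abstract.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 5000
df = pd.DataFrame({
    "age60": rng.integers(0, 2, n),      # 1 = aged 60 or older (hypothetical coding)
    "obesity": rng.integers(0, 2, n),    # 1 = obese
    "comorb2": rng.integers(0, 2, n),    # 1 = two or more comorbidities
})
# Simulate a prolonged-stay outcome with assumed effect sizes.
linpred = -1.0 + 0.5 * df.age60 + 0.7 * df.obesity + 0.8 * df.comorb2
df["prolonged"] = rng.binomial(1, 1 / (1 + np.exp(-linpred)))

fit = smf.logit("prolonged ~ age60 + obesity + comorb2", df).fit(disp=0)
ors = np.exp(fit.params)       # odds ratios
ci = np.exp(fit.conf_int())    # 95% confidence limits on the OR scale
for term in ("age60", "obesity", "comorb2"):
    print(f"{term}: OR={ors[term]:.2f} (95% CI {ci.loc[term, 0]:.2f}-{ci.loc[term, 1]:.2f})")
```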
by Dong Min Jung, Yong Jae Kwon, Yong Wan Cho, Jong Geol Baek, Dong Jae Jang, Yongdo Yun, Seok-Ho Lee, Gahee Son, Hyunjong Yoo, Min Cheol Han, Jin Sung Kim
Volumetric modulated arc therapy (VMAT) for lung cancer involves complex multileaf collimator (MLC) motion, which increases sensitivity to interplay effects with tumour motion. Current dynamic conformal arc methods address this issue but may limit the achievable dose distribution optimisation compared with standard VMAT. This study examined the clinical utility of a VMAT technique with monitor unit limits (VMATliMU) that mimics conformal arc delivery to reduce interplay effects while maintaining plan quality. VMATliMU was implemented by applying monitor unit limitations during VMAT reoptimisation to minimise MLC encroachment into target volumes. Using mesh-type reference computational phantom CT images, treatment plans were generated for a simulated stage I lung cancer case prescribed 45 Gy in three fractions. VMATliMU, conventional VMAT, VMAT with leaf speed limitations, dynamic conformal arc therapy, and constant dynamic conformal arc therapy were compared. Plans were optimised for multiple isodose line prescriptions (50%, 60%, 70%, 80%, and 90%) to investigate the impact on dose distribution. Evaluation parameters included MLC positional accuracy using area difference ratios, dosimetric indices, gradient metrics, and organ-at-risk doses. VMATliMU prevented MLC encroachment into the internal target volume across the 60%–90% isodose lines, showing superior MLC accuracy compared with the other methods. At the challenging 50% isodose line, VMATliMU had 4.5 times less intrusion than VMAT with leaf speed limits. VMAT plans had better dosimetric indices than dynamic conformal arc plans. VMATliMU reduced monitor units by 5.1%–19.2% across prescriptions. All plans met the clinical dose constraints, with the aortic arch dose below tolerance and acceptable lung doses. VMATliMU combines VMAT’s dosimetric benefits with the dynamic conformal arc’s simplicity, minimising MLC encroachment while maintaining plan quality. Reduced monitor units lower low-dose exposure, treatment time, and interplay effects. VMATliMU can be implemented in existing treatment planning systems that support monitor unit limits, offering a practical solution for lung stereotactic body radiation therapy.
by Esther Ba-Iredire, James Atampiiga Avoka, Luke Abanga, Abigail Awaitey Darkie, Emmanuel Junior Attombo, Eric Agboli
Introduction: The alarming rate of drug-resistant tuberculosis (DR-TB) globally threatens treatment success among positive tuberculosis (TB) cases. The prevalence and trend of DR-TB, and the socio-demographic and clinical risk factors contributing to DR-TB, in the four regions of Ghana studied here are currently unknown. This study sought to determine the prevalence and trend of DR-TB, identify socio-demographic and clinical risk factors that influence DR-TB, and analyse the relationship of underweight and adverse drug reactions with treatment outcomes among DR-TB patients in four regions of Ghana.
Method: This was a retrospective review conducted over 5 years, from January 2018 to December 2022. Data were retrieved from the DR-TB registers and folders at the Directly Observed Treatment (DOT) centres in the four regions. The data were analysed using STATA version 17.
Results: In 2022, the prevalence of DR-TB was 10.1% in the Ashanti region, 5.3% in the Eastern region, 27.8% in the Central region, and 2.7% in the Upper West region. The overall prevalence of DR-TB for the period 2018–2022 was 13.8%. The socio-demographic and clinical risk factors influencing DR-TB in the four regions included age and marital status (aOR 3.58, P-value …).
Conclusion: The study shows that the prevalence of DR-TB in Ghana is low, probably not because cases have decreased but because of inadequate GeneXpert machines to detect them. Age, marital status, education, alcohol intake, previously treated TB, adverse drug reactions, underweight, and treatment outcome are factors influencing the development of DR-TB. Therefore, interventions aimed at improving the nutritional status of DR-TB cases and minimising adverse drug reactions will improve treatment outcomes.
Chronic heart failure (CHF) is a progressive life-limiting condition that necessitates early implementation of advance care planning (ACP). However, patients and caregivers encounter emotional, informational, and cultural barriers to effective ACP engagement. This meta-synthesis consolidates qualitative evidence to deepen our understanding of ACP practices in CHF care.
This study aimed to explore experiences of CHF patients and their caregivers in ACP, which is defined as a proactive decision-making process to establish future treatment plans based on patients' values. The study also aimed to identify barriers and facilitators influencing ACP decisions and assess the impact of flexible, personalized ACP approaches on care quality.
Using qualitative meta-synthesis, we analyzed 10 qualitative studies on CHF patients' and caregivers' ACP experiences. Data were thematically synthesized to identify emotional, relational, and practical factors that influence engagement in ACP.
Three themes emerged: (1) heart failure patients and caregivers face difficulties in ACP (difficulties arising from patients, from the family, and from society); (2) multidimensional drivers and impacts of ACP (drivers of advance care planning, acceptance and implementation of ACP, and emotions and effects of ACP); and (3) flexible, personalized ACP delivers tangible benefits (timing and effectiveness of ACP discussions, personalized needs of patients and caregivers for ACP, and patients' and caregivers' affirmation of ACP benefits).
ACP plays a critical role in improving end-of-life care quality and reducing emotional and decision-making burdens on caregivers. Flexible and personalized ACP strategies supported by trained healthcare professionals more effectively meet the unique needs of patients and families. To overcome persistent barriers and promote broader ACP adoption, healthcare systems should prioritize provider communication training, ACP education, and support systems tailored to diverse cultural contexts.
Guideline-based strategies to prevent chronic kidney disease (CKD) progression and complications are available, yet their implementation in clinical practice is uncertain. We aimed to synthesise the available evidence on the concordance of CKD care with clinical guidelines to identify gaps and inform future CKD care.
Systematic review and meta-analysis.
We systematically searched MEDLINE (OVID), EMBASE (OVID) and CINAHL (EBSCOhost) (to 18 July 2025) for observational studies of adults with CKD reporting data on the quality of CKD care. We assessed data on quality indicators of CKD care across domains relating to patient monitoring (glomerular filtration rate and albuminuria), medication use (ACE inhibitors (ACEIs) and angiotensin receptor blockers (ARBs), and statins) and treatment targets (blood pressure (BP) and HbA1c). Pooled estimates (95% CIs) of the percentage of patients who met the quality indicators for CKD care were calculated using a random-effects model.
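The pooling step described above can be illustrated with a DerSimonian-Laird random-effects model applied to logit-transformed study proportions. The sketch below uses invented study-level counts, not the review's data, and the logit transform is an assumed (though common) choice.

```python
# Hedged sketch: DerSimonian-Laird random-effects pooling of proportions on the
# logit scale, then back-transformation. The study-level inputs are invented for
# illustration and do not reproduce the review's dataset.
import numpy as np

events = np.array([80, 150, 60, 400, 220])   # patients meeting the indicator (hypothetical)
totals = np.array([100, 200, 90, 500, 300])  # patients assessed (hypothetical)

p = events / totals
y = np.log(p / (1 - p))                      # logit-transformed proportions
v = 1 / events + 1 / (totals - events)       # approximate within-study variances

w_fixed = 1 / v
y_bar = np.sum(w_fixed * y) / np.sum(w_fixed)
q = np.sum(w_fixed * (y - y_bar) ** 2)       # Cochran's Q
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (q - (len(y) - 1)) / c)      # DL between-study variance

w = 1 / (v + tau2)
pooled = np.sum(w * y) / np.sum(w)
se = np.sqrt(1 / np.sum(w))
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
inv_logit = lambda x: 1 / (1 + np.exp(-x))
print(f"pooled proportion = {inv_logit(pooled):.1%} "
      f"(95% CI {inv_logit(lo):.1%} to {inv_logit(hi):.1%})")
```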
A total of 59 studies across 24 countries, including 3 003 641 patients with CKD, were included. Across studies, 81.3% (95% CI: 75.0% to 87.6%) of patients received eGFR monitoring, 47.4% (95% CI: 40.0% to 54.7%) had albuminuria testing, and 90.0% (95% CI: 84.3% to 95.9%) had BP measured. ACEIs/ARBs were prescribed for 56.7% (95% CI: 51.5% to 62.0%) of patients, and statins for 56.6% (95% CI: 48.9% to 64.3%). BP (systolic BP ≤140/90 mm Hg) and HbA1c (…)
Current evidence shows substantial variation in CKD care quality globally. Guideline-concordant care varied according to quality measures and across patient groups, with gaps in indicators like albuminuria testing. These findings underscore the need for effective quality improvement strategies to address gaps in CKD care, including increased albuminuria testing for risk stratification, together with systematic measures for monitoring care quality.
CRD42023391749.
To explore mental health help-seeking behaviours among East Asian American dementia caregivers and construct a theory grounded in their behaviour patterns.
Qualitative using constructivist grounded theory design.
We recruited 20 East Asian American dementia caregivers between August 2023 and March 2024 using purposive sampling. We conducted one-on-one interviews and analysed the data using constructivist grounded theory coding.
We constructed a theory including six concepts and 22 categories. While ‘providing care’, caregivers manage caregiving tasks and personal life, experiencing caregiving challenges. ‘Individual capacity’ is a key to perceiving caregiving situations and ‘considering seeking support’. Various factors can affect ‘using support’. Different types of support can be used separately or in combination. When receiving adequate support, caregivers can ‘gain benefits from support’. These benefits, alongside individual capacities, can shape caregivers' ‘outlook on the present and the future’.
This study explains the mental health help-seeking process within East Asian culture, broadening perspectives on diverse populations and highlighting insights into culturally tailored services.
This study offers clinicians and communities insights into the mental health help-seeking process among East Asian American dementia caregivers and highlights strategies to encourage their use of mental health services.
This theory incorporates aspects of East Asian culture, addressing a research gap in studies of Asian Americans. It may enhance understanding of culturally tailored approaches and facilitate future funding for research and services, considering cultural diversity.
The Standards for Reporting Qualitative Research.
No Patient or Public Contribution.
by Minjung Lee
Efficient and effective public health surveillance during epidemics relies heavily on active and voluntary public participation, including timely COVID-19 testing and disclosure of results to contacts. This study aimed to investigate predictors of COVID-19 testing and disclosure hesitancy, with a focus on the role of responsibility attribution, during the early stages of the Omicron epidemic in South Korea. A cross-sectional survey was conducted with 1,000 participants between February 25 and March 2, 2022. Chi-square tests and multivariable logistic regression models were used for analysis. Findings showed that 41.5% of participants expressed hesitancy toward COVID-19 testing, and 59.4% expressed hesitancy toward disclosing test results to contacts. Greater attribution of responsibility to individuals was significantly associated with increased hesitancy toward testing (OR = 0.75, 95% CI = 0.63–0.90, p …).
by Bijuan Chen, Zhouwei Zhan, Sisi Yu, Jiali Huang, Chuying Chen, Jie Wang, Jianji Pan, Shaojun Lin, Yun Xu
Background: Laryngeal cancer attributable to occupational asbestos exposure remains a significant public health concern, particularly in industrialized regions. This study analyzes the burden, trends, and contributing factors of laryngeal cancer due to asbestos exposure in China from 1990 to 2021.
Methods: Data were obtained from the Global Burden of Disease Study (1990–2021). We analyzed age-standardized death rates, disability-adjusted life years (DALYs), years lived with disability (YLDs), and years of life lost (YLLs). Temporal trends were assessed using joinpoint and decomposition analyses, and an age-period-cohort (APC) model was applied to examine mortality and DALY trends across different cohorts.
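For context on the burden measures named above, DALYs are conventionally computed as YLL plus YLD, where YLL is deaths multiplied by residual standard life expectancy and YLD is prevalent cases multiplied by a disability weight. The sketch below uses invented inputs purely to show these identities; it is not a GBD calculation for laryngeal cancer.

```python
# Hedged sketch of the standard burden identities: YLL = deaths x residual life
# expectancy, YLD = prevalent cases x disability weight, DALY = YLL + YLD.
# All input numbers are invented for illustration, not GBD estimates.
deaths = 100                    # hypothetical deaths in a given year
life_expectancy_remaining = 15  # hypothetical average residual life expectancy (years)
prevalent_cases = 1200          # hypothetical prevalent cases
disability_weight = 0.25        # hypothetical average disability weight (0-1 scale)

yll = deaths * life_expectancy_remaining
yld = prevalent_cases * disability_weight
daly = yll + yld
print(f"YLL={yll}, YLD={yld}, DALY={daly}")
```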
Results: In 2021, there were 234 deaths and 4,430 DALYs due to laryngeal cancer attributable to occupational asbestos exposure, predominantly affecting males. Mortality rates declined from 1990 to 2008, followed by a rise until 2012, and a subsequent decline. YLDs showed a consistent increase over time. APC analysis revealed higher mortality and DALY rates in older age groups and earlier birth cohorts. Decomposition analysis indicated that epidemiological changes were the largest driver of increased deaths in men, followed by population growth and aging. For DALYs, aging and population growth were key drivers, while epidemiological changes mitigated the burden.
Conclusions: The burden of laryngeal cancer attributable to asbestos exposure has declined overall, but disability rates continue to rise, particularly among males. Effective strategies targeting prevention, early detection, and management of asbestos exposure are needed to reduce the disease burden in China.
This study aims to explore the trajectories and co-occurrence of perceived control and caregiver self-efficacy among patients with heart failure (HF) and their caregivers within 3 months post-discharge and identify associated risk factors.
A prospective cohort design.
The study was conducted from March to June 2024 in Tianjin, China. Information on perceived control and caregiver self-efficacy was collected 24 h before discharge and at 2 weeks, 1 month and 3 months after discharge. Group-Based Dual Trajectory Modelling (GBDTM) and logistic regression were used for analysis.
The study included 203 dyads of patients with HF and their caregivers (HF dyads). Perceived control followed three trajectories: low curve (15.3%), middle curve (57.1%) and high curve (27.6%). Caregiver self-efficacy also followed three trajectories: low curve (17.2%), middle curve (56.7%) and high stable (26.1%). GBDTM revealed nine co-occurrence patterns, the most common (36.7%) being ‘middle-curve group for perceived control and middle-curve group for caregiver self-efficacy’, while 16.7% were in the ‘high-curve group for perceived control and high-stable group for caregiver self-efficacy’. Age, gender, household income, NYHA class, symptom burden and psychological resilience were identified as risk factors for perceived control trajectories; marital status, regular exercise and psychological resilience were identified as risk factors for caregiver self-efficacy trajectories.
We identified distinct trajectories, co-occurrence patterns and risk factors of perceived control and caregiver self-efficacy among HF dyads. These findings help clinical nurses to better design and implement interventions, strengthening the comprehensive management and care outcomes for HF dyads.
These findings highlight the interactive relationship between perceived control and caregiver self-efficacy trajectories, suggesting that interventions should strengthen both to improve personalised treatment plans and outcomes for HF dyads.
This study adhered to the STROBE checklist.
Patients and their caregivers contributed by participating in the study and completing the questionnaire.
To examine the relationship between weight loss and problems with oral intake in institutionalised older adults.
A 1-year longitudinal observational study.
Data were obtained from a prospective study conducted in three nursing homes and two long-term care facilities in Japan. Participants' problems with oral intake were assessed using items published in 2021 by the Japanese Ministry of Health, Labour and Welfare. Baseline and follow-up factors were compared between individuals who experienced a weight loss of 5% or more and those who did not. Separate multivariable logistic regression models were constructed for each oral intake assessment item to examine its independent association with weight loss of 5% or more, accounting for transitions in each item between baseline and the 1-year follow-up.
In total, 172 institutionalised older adults were included in the analysis. Among them, 57 (33.1%) participants experienced a weight decrease of 5% or more. The emergence of somnolence or clouding of consciousness during meals at the 1-year follow-up in participants without these signs at baseline was independently associated with a weight loss of 5% or more, after adjustment for baseline characteristics.
Recognising signs of somnolence or clouding of consciousness during meals may be useful for the early detection and prevention of weight loss in institutionalised older adults.
Early detection of individuals at risk is essential to prevent significant weight loss and its associated adverse outcomes. Recognising somnolence or clouding of consciousness during meals may enable earlier detection and intervention to prevent weight loss and improve the quality of care for older adults.
Strengthening the Reporting of Observational Studies in Epidemiology.
No patient or public contribution.
Prescribing patterns for hyperopia in children vary widely among eye care providers worldwide. This scoping review aims to identify and map the current literature on optical correction and catalogue outcomes reported, particularly in the domains of vision, vision-related functional outcomes and quality of life (QoL) in school-aged children with hyperopia.
This protocol was developed in accordance with the Joanna Briggs Institute’s Manual for Evidence Synthesis. We will include studies involving school-aged children with hyperopia, without restrictions on sex, gender, race, ethnicity, type of optical correction, length of intervention, publication date or country of origin. We will include studies with internal or external comparison groups. We will exclude studies associated with myopia control treatments or with ocular and visual pathway pathologies affecting vision or visual function. We will search Cochrane CENTRAL, Embase.com and PubMed. Examples of data to be extracted include population demographics, visual acuity, study-specific definitions of refractive error, treatment regimens for optical correction, vision and vision-related functional outcomes, and QoL (general or vision-related) as quantified by validated instruments.
Informed consent and Institutional Review Board approval will not be required, as this scoping review will only use published data. The results from the scoping review will be disseminated by publication in a peer-reviewed scientific journal and at professional conferences.