PLOS ONE Medicine&Health

Midlife and old-age cardiovascular risk factors, educational attainment, and cognition at 90 years – a population-based study with 48 years of follow-up

by Anni Varjonen, Toni Saari, Sari Aaltonen, Teemu Palviainen, Mia Urjansson, Paula Iso-Markku, Jaakko Kaprio, Eero Vuoksimaa

We examined the associations of midlife and old-age cardiovascular risk factors, education, and midlife dementia risk scores with cognition at 90+ years, using data from a population-based study with 48 years of follow-up. Participants were 96 individuals aged 90–97 from the older Finnish Twin Cohort study. Individual cardiovascular risk factors assessed via questionnaires in 1975, 1981, 1990, and 2021–2023 included blood pressure, body mass index, physical activity, and cholesterol; self-reported educational attainment was also collected. The Cardiovascular Risk Factors, Aging, and Dementia (CAIDE) score and an educational-occupational attainment score were used as midlife dementia risk scores. Cognitive assessments included semantic fluency, immediate and delayed recall from a 10-word list learning task, and a composite cognitive score. Regression analyses were conducted with dementia risk factors predicting cognition at 90+ years, adjusting for age, sex, education, follow-up time, and apolipoprotein E genotype (ε4 carriers vs non-carriers). Results showed that higher education and a higher educational-occupational score were associated with better performance on all cognitive measures. Those with high midlife blood pressure scored significantly higher in all cognitive tests than those with normal blood pressure. Conversely, those with high old-age blood pressure scored lower in semantic fluency and the composite cognitive score, but not in immediate or delayed recall. Other cardiovascular risk factors and the CAIDE score did not show consistent associations with cognition. Education appears to have a long-lasting protective effect in cognitive aging, whereas midlife and old-age cardiovascular risk factors were not consistently associated with cognition at 90+ years.

Factors associated with prolonged hospitalizations for COVID-19 during the first three waves of the pandemic: Evidence from a Southeastern State of Brazil

by Juliana Rodrigues Tovar Garbin, Franciéle Marabotti Costa Leite, Ana Paula Brioschi dos Santos, Larissa Soares Dell’Antonio, Cristiano Soares da Silva Dell’Antonio, Luís Carlos Lopes-Júnior

A comprehensive understanding of the factors influencing the epidemiological dynamics of COVID-19 across the pandemic waves—particularly in terms of disease severity and mortality—is critical for optimizing healthcare services and prioritizing high-risk populations. Here we aim to analyze the factors associated with short-term and prolonged hospitalization for COVID-19 during the first three pandemic waves. We conducted a retrospective observational study using data from individuals reported in the e-SUS-VS system who were hospitalized for COVID-19 in a southeastern state of Brazil. Hospitalization duration was classified as short or prolonged based on a 7-day cutoff, corresponding to the median length of hospital stay during the second pandemic wave. Bivariate analyses were performed using the chi-square test for heterogeneity. Logistic regression models were used to estimate odds ratios (ORs) and their respective 95% confidence intervals (CIs), with statistical significance set at 5%. When analyzing hospitalization duration across the three waves, we found that 51.1% (95% CI: 49.3–53.0) of hospitalizations in the first wave were prolonged. In contrast, short-duration hospitalizations predominated in the second (54.7%; 95% CI: 52.4–57.0) and third (51.7%; 95% CI: 50.2–53.2) waves. Factors associated with prolonged hospitalization varied by wave. During the first wave, older adults (≥60 years; OR=1.67; 95% CI: 1.35–2.06), individuals with ≥10 symptoms (OR=2.03; 95% CI: 1.04–3.94), obese individuals (OR=2.0; 95% CI: 1.53–2.74), and those with ≥2 comorbidities (OR=2.22; 95% CI: 1.71–2.89) were more likely to experience prolonged hospitalization. In the second wave, the likelihood of extended hospital stays was higher among individuals aged ≥60 years (OR=2.04; 95% CI: 1.58–2.62) and those with ≥2 comorbidities (OR=1.77; 95% CI: 1.29–2.41).
In the third wave, prolonged hospitalization was more frequent among older adults (OR=1.89; 95% CI: 1.65–2.17), individuals with 5–9 symptoms (OR=1.52; 95% CI: 1.20–1.92), obese individuals (OR=2.2; 95% CI: 1.78–2.73), and those with comorbidities (OR=1.45; 95% CI: 1.22–1.72 and OR=2.0; 95% CI: 1.69–2.45). In conclusion, we identified variations in hospitalization patterns across the pandemic waves, although the differences were relatively subtle. These variations likely reflect gradual shifts in the risk factors associated with prolonged hospital stays. Our findings highlight the importance of implementing targeted public health interventions, particularly those designed to reduce disease severity and improve clinical outcomes among vulnerable populations at greater risk of extended hospitalization.
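The odds ratios reported here come from logistic regression models; for intuition, an unadjusted OR and its Wald 95% CI can be computed directly from a 2×2 table. A minimal sketch (the counts below are hypothetical, not taken from the study):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Wald 95% CI from a 2x2 table:
       a = exposed with outcome,    b = exposed without outcome
       c = unexposed with outcome,  d = unexposed without outcome"""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: prolonged vs short stays among older vs younger patients
or_, lo, hi = odds_ratio_ci(120, 80, 150, 167)
print(f"OR={or_:.2f}; 95% CI: {lo:.2f}-{hi:.2f}")
```

The study's ORs are adjusted estimates from multivariable models, so they will generally differ from a crude 2×2 calculation like this.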

Diversity, distribution, and population structure of <i>Escherichia coli</i> in the lower gastrointestinal tract of humans

by Rasel Barua, Paul Pavli, David Gordon, Claire O’Brien

Several studies report the diversity and population structure of Escherichia coli (E. coli) in the human gut, but most used faecal specimens as the source of E. coli for analysis. In the present study, we collected mucosal biopsies from three different locations: the terminal ileum, transverse colon, and rectum from 46 individuals. To identify unique strains, we fingerprinted about 3300 isolates of E. coli via the multiple-locus variable-number tandem-repeat analysis (MLVA) technique. A representative isolate of each strain per individual then underwent PCR for phylogrouping, and specific phylogrouped strains were further screened to determine whether they belonged to one of four common human-associated sequence types (ST69, ST73, ST95, and ST131), and to identify B2 sub-types. We detected on average 2.5 unique strains per individual. Unique strain counts were distributed as follows: 35% (16/46) of individuals carried only one strain, 22% (10/46) carried two, 24% (11/46) carried three, and 4% (2/46), 9% (4/46), and 7% (3/46) carried four, five, and six strains, respectively. Strain richness did not depend on gender, age, or disease status. The most abundant phylogroup in all gut locations was B2, followed by A, B1, and D. Strain richness overall and across gut locations was lower if an individual’s dominant strain belonged to phylogroup B2. ST95, ST131, and ST73 constituted more than half of the total B2 strains. Analysis of B2 sub-types revealed that sub-types IX (STc95) and I (STc131) were more common than other sub-types. The phylogroup and ST of strains at different gut locations did not vary significantly. However, there were multiple examples of individuals who carried strains detected only in one gut location. The present study suggests that particular phylogroups and STs are likely to dominate in different locations in the lower gut of humans.

Weighted Hypoxemia Index: An adaptable method for quantifying hypoxemia severity

by Diane C. Lim, Cheng-Bang Chen, Ankita Paul, Yujie Wang, Jinyoung Kim, Soonhyun Yook, Emily Y. Kim, Edison Q. Kim, Anup Das, Medhi Wangpaichitr, Virend K. Somers, Chi Hang Lee, Phyllis C. Zee, Toshihiro Imamura, Hosung Kim

Objective

To quantitate hypoxemia severity.

Methods

We developed the Weighted Hypoxemia Index to be adaptable to different clinical settings by applying 5 steps to the oxygen saturation curve: (1) identify each desaturation/resaturation event i by setting an upper threshold; (2) exclude artifactual events by setting a lower threshold; (3) calculate the weighted area for each event i as (Δi × Φi); (4) calculate a normalization factor Ω for each subject; (5) calculate the Weighted Hypoxemia Index as the sum of all weighted areas multiplied by Ω. We assessed the predictive value of the Weighted Hypoxemia Index for all-cause mortality and cardiovascular mortality using the Sleep Heart Health Study (enrollment 1995–1998; 11.1 years mean follow-up).
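The five steps can be sketched in a few lines. The snippet below is an illustrative interpretation only: the event-detection rule, the reading of Δi (depth area) and Φi (duration weight), and the choice Ω = 1/total-sleep-time are assumptions, not the authors' exact definitions.

```python
def weighted_hypoxemia_index(spo2, upper=90, lower=50, sample_sec=1):
    """Sketch of the five steps (variable meanings assumed, see lead-in).

    spo2: oxygen-saturation samples (%), one per sample_sec seconds.
    Step 1: events are maximal runs below `upper`.
    Step 2: runs dipping below `lower` are treated as artifact.
    Step 3: each event i contributes area-below-threshold x duration.
    Steps 4-5: the sum is normalized by total sleep time.
    """
    events, cur = [], []
    for s in spo2:
        if s < upper:
            cur.append(s)
        elif cur:
            events.append(cur)
            cur = []
    if cur:
        events.append(cur)
    # Step 2: drop artifact events breaching the lower threshold
    events = [e for e in events if min(e) >= lower]
    areas = []
    for e in events:
        depth_area = sum(upper - s for s in e) * sample_sec   # Δi (assumed)
        duration = len(e) * sample_sec                        # Φi, linear weight (assumed)
        areas.append(depth_area * duration)
    total_sleep_sec = len(spo2) * sample_sec                  # Ω = 1 / TST (assumed)
    return sum(areas) / total_sleep_sec
```

A brief dip to 40% in the trace below is excluded as artifact, while the run at 85–88% contributes a depth-times-duration weighted area.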

Results

We set varying upper thresholds at 92%, 90%, 88%, and 86%, a lower threshold of 50%, calculated area under the curve and area above the curve, with and without a linear weighted factor (duration of each event i), and used the same normalization factor of total sleep time.

Conclusion

The Weighted Hypoxemia Index offers a versatile and clinically relevant method for quantifying hypoxemia severity, with potential applications to evaluate mechanisms and outcomes across various patient populations.

Work ability during the COVID-19 pandemic: A cross-sectional study in a low-income urban setting in Brazil

by Ana Paula Cândido Oliveira, Daniela Alencar Vieira, Cristiane Wanderley Cardoso, Tereza Magalhães, Rosangela Oliveira Anjos, Eduardo José Farias Borges Reis, Kionna Oliveira Bernardes Santos, Guilherme Sousa Ribeiro

Work ability is a subjective concept that reflects the balance between an individual’s perception of the physical, mental, and social demands of work and their competence and resources to meet those demands. The COVID-19 crisis significantly impacted health, work, and socioeconomic conditions worldwide. However, few studies have examined work ability in disadvantaged urban communities during this period. To analyze factors associated with work ability within the context of social vulnerability during the COVID-19 pandemic, we conducted a cross-sectional study in a low-income neighborhood in Salvador, Brazil, between February and June 2022. Sociodemographic, health, and labor data were collected, and work ability was assessed using the Work Ability Index (WAI), a widely used tool for evaluating work ability. Multivariable analyses based on a hierarchical model were run to investigate factors associated with low WAI scores. The study included 292 workers aged ≥16 years (59.6% women; median age 41 years). Most workers (84.6%) were classified as having adequate work ability based on their WAI scores. Multivariable analyses found that inadequate work ability was more frequent among women (prevalence ratio [PR]: 1.89; 95% confidence interval [CI]: 1.02–3.48), individuals who self-rated their health as moderate/good (PR: 5.91; 95% CI: 1.45–24.05) or poor/very poor (PR: 21.62; 95% CI: 5.14–90.91) compared to those with excellent/very good health, and those reporting diabetes (PR: 2.1; 95% CI: 1.13–3.9). Working >40 hours per week (PR: 0.47; 95% CI: 0.28–0.96) was negatively associated with inadequate work ability, suggesting that individuals with adequate work ability may be selected for longer working hours. A history of COVID-19 was not associated with inadequate work ability. These findings suggest that targeted interventions to improve work ability in low-income communities should prioritize women and workers with chronic health conditions, such as diabetes.

The changing role of substances: trends, characteristics of individuals and prior healthcare utilization among individuals with accidental substance-related toxicity deaths in Ontario Canada

by Shaleesa Ledlie, Alice Holton, Pamela Leece, Bisola Hamzat, Joanna Yang, Gillian Kolla, Nikki Bozinoff, Rob Boyd, Mike Franklyn, Ashley Smoke, Paul Newcombe, Tara Gomes

Objective

To investigate trends and the circumstances surrounding fatal substance-related toxicities directly attributed to alcohol, stimulants, benzodiazepines or opioids and combinations of substances in Ontario, Canada.

Methods

We conducted a population-based cross-sectional study of all accidental substance-related toxicity deaths in Ontario, Canada from January 1, 2018 to June 30, 2022. We reported monthly rates of substance-related toxicity deaths and investigated the combination of substances most commonly involved in deaths. Demographic characteristics, location of incident, and prior healthcare encounters for non-fatal toxicities and substance use disorders were examined.

Results

Overall, 10,022 accidental substance-related toxicity deaths occurred, with the annual number of deaths nearly doubling between the first and last 12 months of the study period (from 1,570 to 2,702). Opioids were directly involved in the majority of deaths (84.1%; N = 8,431), followed by stimulants (60.9%; N = 6,108), alcohol (13.4%; N = 1,346), and benzodiazepines (7.8%; N = 782). In total, 56.9% (N = 5,698) of deaths involved combinations of substances. Approximately one-fifth of individuals had been treated in a hospital setting for a substance-related toxicity in the past year, with the majority of these encounters being opioid-related (17.4%; N = 1,748). Finally, 60.9% (N = 6,098) of people had a substance use disorder diagnosis at the time of death.

Conclusions

Our study shows not only the enormous loss of life from substance-related toxicities but also the growing importance of combinations of substances in these deaths. A large proportion of people had previously interacted with the hospital system for substance-related toxicity events or a substance use disorder, representing important missed opportunities to provide appropriate care.

Patient-Selection of a Clinical Trial Primary Outcome: The ENHANCE-AF Outcomes Survey

by Randall S. Stafford, Eli N. Rice, Rushil Shah, Mellanie T. Hills, Julio C. Nunes, Katie DeSutter, Amy Lin, Karma Lhamo, Bryant Lin, Ying Lu, Paul J. Wang

Introduction

Before the initiation of the ENHANCE-AF clinical trial, which tested a novel digital shared decision-making tool to guide the use of anticoagulants in stroke prevention for patients with atrial fibrillation, this study aimed to identify the most appropriate, patient-selected primary outcome and to examine whether outcome selection varied by demographic and clinical characteristics.

Methods

Our cross-sectional survey asked 100 participants with atrial fibrillation to rank two alternative scales based on the scales’ ability to reflect their experiences with decision-making for anticoagulation. The Decisional Conflict Scale (DCS), a 16-item scale, measures perceptions of uncertainty in choosing options. The 5-item Decision Regret Scale (DRS) focuses on remorse after a healthcare decision. We included adults with non-valvular AFib and CHA2DS2-VASc scores of at least 2 for men and 3 for women. Multivariable logistic regression with backward selection identified characteristics independently associated with scale choice.

Results

The DCS was chosen over the DRS by 77% [95% confidence interval (CI): 68 to 85%] of participants. All subgroups indicated a preference for the DCS. Those with higher CHA2DS2-VASc scores (≥5, n = 26) selected the DCS 54% of the time compared with 86% of those with lower scores (p = 0.002). Multiple logistic regression confirmed a weaker preference for the DCS among those with higher CHA2DS2-VASc scores.

Conclusions

Individuals with atrial fibrillation preferred the DCS over the DRS for measuring their decision-making experiences. As a result of this survey, the DCS was designated as the ENHANCE-AF clinical trial’s primary endpoint.

Exploring the patient’s recovery journey and information needs following a shoulder fracture: A qualitative interview study

by Pauline May, Firoza Davies, Gillian Yeowell, Chris Littlewood

Background

Shoulder fractures (proximal humerus fractures) are common, painful, debilitating injuries. Recovery is a long process often hindered by complications such as mal-union and frozen shoulder. The purpose of this qualitative study was to explore the experiences and information needs of people at different time points after a shoulder fracture and how views on recovery change over time.

Methods

This longitudinal telephone interview study used a semi-structured approach based on a pre-planned interview topic guide. Recruitment was from June to November 2023. Participants were interviewed approximately two months and five to six months after their injury. Interviews were audio-recorded and transcribed verbatim. Data were analysed using thematic analysis.

Results

Fourteen participants were recruited (age range 44–80 years; three male). The themes identified were dependence, vulnerability, information needs, and recovery. Loss of function and identity were associated with dependence. Feelings of vulnerability were still present for most participants at six months post-injury. Information needs evolved: information about the extent of the injury and practical advice were needed first, while later participants emphasized the importance of reassurance and expected timelines for recovery. Recovery meant regaining function and independence and returning to meaningful activities, which most participants had not fully achieved by six months.

Conclusions

This study is the first to explore information needs and experiences along the timeline of recovery from a shoulder fracture. What recovery means to individual patients, along with the extent to which feelings of vulnerability affect recovery, are important factors to consider. Clinicians should be aware of the full impact of these injuries to guide patients on their recovery journey, including identifying feelings of vulnerability and supporting patients in regaining their identity. Adopting a person-centred care approach and considering the changing priorities and information needs of patients throughout their recovery journey may lead to improved patient care.

Real-world treatment patterns and outcomes among patients with unresectable stage III non-small cell lung cancer

by Ashwini Arunachalam, Sneha Sura, John Murphy, Paul Conkling, Jerome Goldschmidt

Background

In 2018, the treatment options for unresectable stage III non-small cell lung cancer (NSCLC) changed with the approval of durvalumab, an immune checkpoint inhibitor (ICI), as consolidation therapy for patients without disease progression following concurrent chemoradiotherapy (cCRT). Despite durvalumab’s clinical benefit, many patients receiving this therapy developed progression. This study evaluated treatment patterns and clinical outcomes in real-world community oncology practices for patients with unresectable stage III NSCLC who received cCRT.

Methods

This study used The US Oncology Network’s electronic health record database (iKnowMed), supplemented by chart review, and included adults diagnosed with unresectable stage III NSCLC initiating cCRT between 11/01/2017 and 10/31/2019, with follow-up through 04/30/2022. cCRT included concurrent treatment with platinum-based chemotherapy and radiation therapy (±14 days). Real-world overall survival (rwOS) and real-world progression-free survival (rwPFS) were estimated from cCRT initiation using the Kaplan–Meier method.
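The Kaplan–Meier method used for rwOS and rwPFS can be illustrated with a minimal product-limit estimator (a sketch for intuition, not the study's analysis code):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier (product-limit) survival estimate.

    times:  follow-up time per patient
    events: 1 = event observed (e.g. death/progression), 0 = censored
    Returns [(t, S(t))] at each distinct time with at least one event.
    """
    data = sorted(zip(times, events))   # sort by follow-up time
    n_at_risk = len(data)
    surv, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        d = sum(1 for tt, e in data[i:] if tt == t and e == 1)  # events at t
        m = sum(1 for tt, e in data[i:] if tt == t)             # leaving risk set at t
        if d:
            surv *= (n_at_risk - d) / n_at_risk
            curve.append((t, surv))
        n_at_risk -= m
        i += m
    return curve
```

The returned curve holds the stepwise survival probability after each distinct event time; censored patients leave the risk set without dropping the curve.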

Results

Among 426 patients, 61.5% received durvalumab post-cCRT (cCRT+durvalumab) and 38.5% did not (cCRT alone). Death (28.3%) and disease progression (22.2%) were the most common reasons for not initiating durvalumab. The median age was 70 years in the cCRT+durvalumab cohort and 71 years in the cCRT alone cohort, and 71.8% and 61.6%, respectively, had an Eastern Cooperative Oncology Group performance status of 0–1. In the cCRT+durvalumab cohort, 51.5% discontinued durvalumab, primarily due to adverse events (35.8%) and disease progression (28.4%). Median rwOS was 50.2 (95% confidence interval [CI]: 41.4, not reached) and 11.6 (95% CI: 6.5, 15.9) months for cCRT+durvalumab and cCRT alone, respectively. Median rwPFS was 28.5 (95% CI: 23.3, 36.4) months for cCRT+durvalumab and 6.3 (95% CI: 4.3, 9.3) months for cCRT alone. Subsequent treatment was received by 23.7% (cCRT+durvalumab) and 26.2% (cCRT alone), of whom 59.7% and 46.5%, respectively, received an ICI.

Conclusion

Four out of ten patients did not receive consolidation durvalumab, mainly due to death or disease progression. Even among patients who initiated durvalumab, many relapsed and were retreated with ICIs. These findings underscore the need to refine treatment strategies for better outcomes in unresectable stage III NSCLC.

Benzothiazinone analogs as Anti-<i>Mycobacterium tuberculosis</i> DprE1 irreversible inhibitors: Covalent docking, validation, and molecular dynamics simulations

by Mahmoud A. A. Ibrahim, Doaa G. M. Mahmoud, Alaa H. M. Abdelrahman, Khlood A. A. Abdeljawaad, Gamal A. H. Mekhemer, Tamer Shoeib, Mohamed A. El-Tayeb, Peter A. Sidhom, Paul W. Paré, Mohamed-Elamir F. Hegazy

Mycobacterium tuberculosis is a lethal human pathogen, with the key flavoenzyme for catalyzing bacterial cell-wall biosynthesis, decaprenylphosphoryl-D-ribose oxidase (DprE1), considered an Achilles heel for tuberculosis (TB) progression. Inhibition of DprE1 blocks cell-wall biosynthesis, making the enzyme a highly promising antitubercular target. Macozinone (PBTZ169, a benzothiazinone (BTZ) derivative) is an irreversible DprE1 inhibitor that has attracted considerable attention because it exhibits additive activity when combined with other anti-TB drugs. Herein, 754 BTZ analogs were assembled in a virtual library and evaluated against the DprE1 target using a covalent docking approach. After validation of the employed covalent docking approach, BTZ analogs were screened. Analogs with a docking score below –9.0 kcal/mol were advanced to molecular dynamics (MD) simulations, followed by binding energy evaluations utilizing the MM-GBSA approach. Three BTZ analogs – namely, PubChem-155-924-621, PubChem-127-032-794, and PubChem-155-923-972 – exhibited higher binding affinities against DprE1 than PBTZ169, with ΔGbinding values of –77.2, –74.3, and –65.4 kcal/mol, respectively, versus –49.8 kcal/mol. Structural and energetic analyses of the identified analogs against DprE1 throughout the 100 ns MD simulations demonstrated the stability of the identified BTZ analogs. Physicochemical and ADMET characteristics indicated the oral bioavailability of the identified BTZ analogs. The obtained in-silico results provide promising anti-TB inhibitors that are worth subjecting to in-vitro and in-vivo investigations.

Physical activity, obesity and risk of atherosclerotic cardiovascular diseases among patients with hypertension and diabetes attending a teaching hospital in Edo State, Nigeria

by Tijani Idris Ahmad Oseni, Sulaiman Dazumi Ahmed, Pauline Etuajie Eromon, Neba Francis Fuh, Isaac Newton Omoregbe

Introduction

Preventing atherosclerotic cardiovascular diseases (ASCVD) can best be achieved by promoting a healthy lifestyle through improvements in diet, physical activity, and avoidance of tobacco use and exposure to second-hand smoke. The study aimed to determine the association between physical activity and obesity and the risk of atherosclerotic cardiovascular diseases among patients with hypertension and diabetes attending Irrua Specialist Teaching Hospital (ISTH), Irrua, Nigeria.

Methodology

The research was a descriptive, cross-sectional study of 394 systematically selected consenting patients with hypertension and diabetes presenting to a teaching hospital in Irrua, Edo State, Nigeria. Cardiovascular risk was assessed using the Framingham 10-year General Cardiovascular Disease risk score. Anthropometric measurements, blood pressure, and blood glucose were obtained. Data were collected with a semi-structured questionnaire and analysed with Stata version 16. Chi-square tests and logistic regression were used to test for associations, with the significance level set at p = 0.05.

Results

The study included 394 participants with a mean age of 54 ± 15.47 years. Respondents were mostly female (55.3%), physically inactive (70.3%), and overweight (42.4%), and 41.8% had a high risk of developing CVD in 10 years using the Framingham categorisation. Physical activity and obesity were significantly associated with cardiovascular risk.

Conclusion

The study found a statistically significant relationship between physical inactivity, obesity, and the risk of atherosclerotic cardiovascular diseases. Increasing physical activity levels need to be a top priority at all levels of healthcare as well as the general population.

Monitoring emerging pathogens using negative nucleic acid test results from endemic pathogens in pig populations: Application to porcine enteric coronaviruses

by Ana Paula Serafini Poeta Silva, Guilherme Arruda Cezar, Edison Sousa Magalhães, Kinath Rupasinghe, Srijita Chandra, Gustavo S. Silva, Marcelo Almeida, Bret Crim, Eric Burrough, Phillip Gauger, Christopher Siepker, Marta Mainenti, Michael Zeller, Rodger G. Main, Mary Thurn, Paulo Fioravante, Cesar Corzo, Albert Rovira, Hemant Naikare, Rob McGaughey, Franco Matias Ferreyra, Jamie Retallick, Jordan Gebhardt, Angela Pillatzki, Jon Greseth, Darren Kersey, Travis Clement, Jane Christopher-Hennings, Melanie Prarat, Ashley Johnson, Dennis Summers, Craig Bowen, Kenitra Hendrix, Joseph Boyle, Daniel Correia Lima Linhares, Giovani Trevisan

This study evaluated the use of polymerase chain reaction (PCR)-negative test results for endemic enteric coronaviruses as an alternative approach to detecting the emergence of animal health threats with similar clinical disease presentations. This retrospective study, conducted in the United States, used PCR-negative test results from porcine samples tested at six veterinary diagnostic laboratories. As a proof of concept, the database was first searched for transmissible gastroenteritis virus (TGEV)-negative submissions between January 1st, 2010, and April 29th, 2013, when the first porcine epidemic diarrhea virus (PEDV) case was diagnosed. Secondly, TGEV- and PEDV-negative submissions were used to detect the porcine delta coronavirus (PDCoV) emergence in 2014. Lastly, the best-performing detection algorithms were implemented to prospectively monitor the 2023 enteric coronavirus-negative submissions. Time series of weekly TGEV-negative counts were modeled with Seasonal Autoregressive Integrated Moving Average (SARIMA) models to control for outliers, trends, and seasonality. The SARIMA fitted values and residuals were then subjected to anomaly detection algorithms (EARS, EWMA, CUSUM, Farrington) to identify alarms, defined as weeks of higher TGEV-negativity than predicted by the models in the period preceding the PEDV emergence. The best-performing detection algorithms had the fewest false alarms (number of alarms detected during the baseline) and the longest time to detect (number of weeks between the first alarm and PEDV emergence). These were CUSUM, EWMA, and Farrington flexible using SARIMA fitted values, which had lower false alarm rates and identified alarms 4 to 17 weeks before the PEDV and PDCoV emergences. No alarms were identified in the 2023 enteric negative test results. The negative-based monitoring system functioned both during the PEDV propagating epidemic and in the presence of a concurrent propagating epidemic at the PDCoV emergence. It demonstrated its applicability as an additional tool for diagnostic data monitoring of emerging pathogens whose clinical disease resembles that of the monitored endemic pathogens.
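One of the anomaly detectors named above, the EWMA control chart, can be sketched over model residuals. A minimal example — the smoothing weight, control-limit multiplier, and baseline length are assumed defaults, and the real pipeline first fits SARIMA to the weekly counts:

```python
import statistics

def ewma_alarms(residuals, baseline_weeks, lam=0.3, L=3.0):
    """Flag weeks whose exponentially weighted moving average of
    residuals exceeds the upper control limit, i.e. weekly counts
    running persistently higher than the model predicts."""
    mu = statistics.mean(residuals[:baseline_weeks])
    sigma = statistics.stdev(residuals[:baseline_weeks])
    # steady-state upper control limit for the EWMA statistic
    ucl = mu + L * sigma * (lam / (2 - lam)) ** 0.5
    z, alarms = mu, []
    for week, x in enumerate(residuals):
        z = lam * x + (1 - lam) * z   # update the EWMA statistic
        if z > ucl:
            alarms.append(week)
    return alarms

# 40 quiet baseline weeks, then a sustained excess of negative submissions
residuals = [-1, 1] * 20 + [5, 5, 5]
print(ewma_alarms(residuals, baseline_weeks=40))
```

Because the EWMA statistic carries memory, a sustained shift triggers an alarm within a week or two of onset, while isolated noisy weeks are smoothed out.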