

Title: Identification of Novel Biomarkers and Pathways for Coronary Artery Calcification in Nondiabetic Patients on Hemodialysis Using Metabolomic Profiling
Background

A better understanding of the pathophysiology of coronary artery calcification (CAC) in patients on hemodialysis (HD) will help to develop new therapies. We sought to identify differences in metabolomic profiles between patients on HD with and without CAC.

Methods

In this case-control study, nested within a cohort of 568 incident patients on HD, cases were patients without diabetes with a CAC score >100 (n=51), and controls were patients without diabetes with a CAC score of zero (n=48). We measured 452 serum metabolites in each participant. Metabolites and pathway scores were compared using Mann–Whitney U tests, partial least squares–discriminant analyses, and pathway enrichment analyses.

Results

Compared with controls, cases were older (64±13 versus 42±12 years) and less likely to be Black (51% versus 94%). We identified three metabolites in bile-acid synthesis (chenodeoxycholic, deoxycholic, and glycolithocholic acids) and one pathway (arginine/proline metabolism). After adjusting for demographics, higher levels of chenodeoxycholic, deoxycholic, and glycolithocholic acids were associated with higher odds of having CAC; comparing the third with the first tertile of each bile acid, the ORs were 6.34 (95% CI, 1.12 to 36.06), 6.73 (95% CI, 1.20 to 37.82), and 8.53 (95% CI, 1.50 to 48.49), respectively. These associations were no longer significant after further adjustment for coronary artery disease and medication use. Per 1-unit higher first principal component score, arginine/proline metabolism was associated with CAC after adjusting for demographics (OR, 1.83; 95% CI, 1.06 to 3.15), and the association remained significant with additional adjustment for statin use (OR, 1.84; 95% CI, 1.04 to 3.27).

Conclusions

Among patients on HD without diabetes mellitus, chenodeoxycholic, deoxycholic, and glycolithocholic acids may be potential biomarkers for CAC, and arginine/proline metabolism is a plausible mechanism to study for CAC. These findings need to be confirmed in future studies.
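As a methods aside: the metabolite comparisons described above rely on Mann–Whitney U tests. A minimal pure-Python sketch of the U statistic (illustrative only, with made-up toy values; the study's actual analysis code is not shown here, and a real analysis would use a library routine such as scipy.stats.mannwhitneyu, which also provides a p-value):

```python
def mann_whitney_u(xs, ys):
    """Mann-Whitney U statistic for two independent samples.

    U counts, over all pairs (x, y), how often x > y, with ties
    counted as 0.5. Illustrative sketch only.
    """
    u = 0.0
    for x in xs:
        for y in ys:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

# Toy example: a metabolite's levels in cases vs. controls (hypothetical values).
cases = [3.1, 2.8, 3.5, 4.0]
controls = [1.9, 2.2, 2.5, 3.0]
print(mann_whitney_u(cases, controls))  # 15.0
```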
Award ID(s):
1934962
NSF-PAR ID:
10288785
Journal Name:
Kidney360
Volume:
2
Issue:
2
ISSN:
2641-7650
Page Range / eLocation ID:
279 to 289
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Background and Aim:

    Copper is an essential trace metal serving as a cofactor in innate immunity, metabolism, and iron transport. We hypothesize that copper deficiency may influence survival in patients with cirrhosis through these pathways.

    Methods:

    We performed a retrospective cohort study involving 183 consecutive patients with cirrhosis or portal hypertension. Copper from blood and liver tissues was measured using inductively coupled plasma mass spectrometry. Polar metabolites were measured using nuclear magnetic resonance spectroscopy. Copper deficiency was defined by serum or plasma copper below 80 µg/dL for women or 70 µg/dL for men.

    Results:

    The prevalence of copper deficiency was 17% (N=31). Copper deficiency was associated with younger age, race, zinc and selenium deficiency, and higher infection rates (42% vs. 20%, p=0.01). Serum copper correlated positively with albumin, ceruloplasmin, and hepatic copper, and negatively with IL-1β. Levels of polar metabolites involved in amino acid catabolism, mitochondrial transport of fatty acids, and gut microbial metabolism differed significantly according to copper deficiency status. During a median follow-up of 396 days, mortality was 22.6% in patients with copper deficiency compared with 10.5% in patients without. Liver transplantation rates were similar (32% vs. 30%). Cause-specific competing risk analysis showed that copper deficiency was associated with a significantly higher risk of death before transplantation after adjusting for age, sex, MELD-Na, and Karnofsky score (HR: 3.40; 95% CI, 1.18–9.82; p=0.023).

    Conclusions:

    In advanced cirrhosis, copper deficiency is relatively common and is associated with an increased infection risk, a distinctive metabolic profile, and an increased risk of death before transplantation.

     
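The copper-deficiency definition above is a simple sex-specific threshold on serum or plasma copper. A minimal sketch of that rule as a hypothetical helper (not the authors' code):

```python
def is_copper_deficient(copper_ug_dl, sex):
    """Apply the study's sex-specific cutoffs: serum/plasma copper
    below 80 ug/dL for women or below 70 ug/dL for men."""
    threshold = 80.0 if sex == "female" else 70.0
    return copper_ug_dl < threshold

# The same measured level can be deficient for a woman but not for a man.
print(is_copper_deficient(75.0, "female"))  # True
print(is_copper_deficient(75.0, "male"))    # False
```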
  2.
    Abstract

    Aims

    Coronary artery calcium (CAC) scoring is an established tool for cardiovascular risk stratification. However, the lack of widespread availability and concerns about radiation exposure have limited its universal clinical utilization. In this study, we sought to explore whether machine learning (ML) approaches can aid cardiovascular risk stratification by predicting guideline-recommended CAC score categories from clinical features and surface electrocardiograms.

    Methods and results

    In this substudy of a prospective, multicentre trial, a total of 534 subjects referred for CAC scores with electrocardiographic data were split into 80% training and 20% testing sets. Two binary-outcome ML logistic regression models were developed to predict CAC scores equal to 0 and ≥400. The CAC = 0 model yielded an area under the curve, sensitivity, specificity, and accuracy of 84%, 92%, 70%, and 75%, respectively; the CAC ≥400 model yielded 87%, 91%, 75%, and 81%. We further tested the CAC ≥400 model to risk stratify a cohort of 87 subjects referred for invasive coronary angiography. Using an intermediate or higher pretest probability (≥15%) to predict CAC ≥400, the model predicted the presence of significant coronary artery stenosis (P = 0.025), the need for revascularization (P < 0.001), notably bypass surgery (P = 0.021), and major adverse cardiovascular events (P = 0.023) during a median follow-up period of 2 years.

    Conclusion

    ML techniques can extract information from electrocardiographic data and clinical variables to predict CAC score categories and similarly risk-stratify patients with suspected coronary artery disease.
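The sensitivity, specificity, and accuracy figures quoted above come from a binary confusion matrix. A minimal sketch of those calculations on hypothetical labels (not the study's data or modelling pipeline):

```python
def classification_metrics(y_true, y_pred):
    """Sensitivity, specificity, and accuracy from binary labels (1 = positive)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    sensitivity = tp / (tp + fn)   # true-positive rate
    specificity = tn / (tn + fp)   # true-negative rate
    accuracy = (tp + tn) / len(y_true)
    return sensitivity, specificity, accuracy

# Hypothetical predictions from a CAC >= 400 classifier (made-up labels).
y_true = [1, 1, 1, 0, 0, 0, 0, 0]
y_pred = [1, 1, 0, 0, 0, 0, 1, 0]
sens, spec, acc = classification_metrics(y_true, y_pred)
```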
  3.
    Background:

    Both lifestyle and genetic factors confer risk for cardiovascular diseases, type 2 diabetes, and dyslipidemia. However, the interactions between these two groups of risk factors have not been comprehensively characterized, partly because earlier studies estimated genetic risk poorly. Here we set out to develop enhanced polygenic risk scores (PRS) and systematically investigate multiplicative and additive interactions between PRS and lifestyle for coronary artery disease, atrial fibrillation, type 2 diabetes, total cholesterol, triglycerides, and LDL-cholesterol.

    Methods:

    Our study included 276 096 unrelated White British participants from the UK Biobank. We investigated several PRS methods (P+T, LDpred, PRS continuous shrinkage, and AnnoPred) and showed that AnnoPred achieved consistently improved prediction accuracy for all six diseases/traits. With enhanced PRS and combined lifestyle status categorized by smoking, body mass index, physical activity, and diet, we investigated both multiplicative and additive interactions between PRS and lifestyle using regression models.

    Results:

    We observed that a healthy lifestyle reduced disease incidence by a similar multiplicative magnitude across different PRS groups. The absolute risk reduction from lifestyle adherence was, however, significantly greater in individuals with higher PRS. Specifically, for type 2 diabetes, the absolute risk reduction from lifestyle adherence was 12.4% (95% CI, 10.0%–14.9%) in the top 1% of PRS versus 2.8% (95% CI, 2.3%–3.3%) in the bottom PRS decile, a ratio of >4.4. We also observed a significant interaction effect between PRS and lifestyle on triglyceride level.

    Conclusions:

    By leveraging functional annotations, AnnoPred outperforms state-of-the-art methods in quantifying genetic risk through PRS. Our analyses based on enhanced PRS suggest that individuals with high genetic risk may derive similar relative but greater absolute benefit from lifestyle adherence.
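The ">4.4" ratio reported for type 2 diabetes follows directly from the two absolute risk reductions given in the abstract. A sketch of that arithmetic:

```python
# Absolute risk reductions from lifestyle adherence (values from the abstract).
arr_top_1pct_prs = 0.124   # 12.4% in the top 1% PRS group
arr_bottom_decile = 0.028  # 2.8% in the bottom PRS decile

# High-genetic-risk individuals gain roughly 4.4x the absolute benefit.
ratio = arr_top_1pct_prs / arr_bottom_decile
print(round(ratio, 1))  # 4.4
```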
    Introduction

    Studies have reported that antidiabetic medications (ADMs) are associated with lower risk of dementia, but current findings are inconsistent. This study compared the risk of dementia onset in patients with type 2 diabetes (T2D) treated with sulfonylurea (SU) or thiazolidinedione (TZD) with that of patients with T2D treated with metformin (MET).

    Research design and methods

    This is a prospective observational study within a T2D population using electronic medical records from all sites of the Veterans Affairs Healthcare System. We identified patients with T2D who initiated ADM from January 1, 2001, to December 31, 2017, were aged ≥60 years at initiation, and were dementia-free. A SU monotherapy group, a TZD monotherapy group, and a control group (MET monotherapy) were assembled based on prescription records. Participants were required to take the assigned treatment for at least 1 year. The primary outcome was all-cause dementia, and the two secondary outcomes were Alzheimer's disease and vascular dementia, defined by International Classification of Diseases (ICD), 9th Revision, or ICD, 10th Revision, codes. The risks of developing outcomes were compared using propensity score weighted Cox proportional hazard models.

    Results

    Among 559 106 eligible veterans (mean age 65.7 (SD 8.7) years), the all-cause dementia rate was 8.2 cases per 1000 person-years (95% CI 6.0 to 13.7). After at least 1 year of treatment, TZD monotherapy was associated with a 22% lower risk of all-cause dementia onset (HR 0.78, 95% CI 0.75 to 0.81) compared with MET monotherapy, and MET and TZD dual therapy with an 11% lower risk (HR 0.89, 95% CI 0.86 to 0.93), whereas the risk was 12% higher for SU monotherapy (HR 1.12, 95% CI 1.09 to 1.15).

    Conclusions

    Among patients with T2D, TZD use was associated with a lower risk of dementia, and SU use with a higher risk, compared with MET use. Supplementing SU with either MET or TZD may partially offset its prodementia effects. These findings may help inform medication selection for elderly patients with T2D at high risk of dementia.
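The incidence rate quoted above (8.2 cases per 1000 person-years) is a standard rate calculation. The counts below are hypothetical, chosen only to illustrate the formula, not taken from the study:

```python
def rate_per_1000_person_years(n_cases, total_person_years):
    """Incidence rate expressed per 1000 person-years of follow-up."""
    return 1000.0 * n_cases / total_person_years

# Hypothetical counts: 82 incident dementia cases over 10 000 person-years.
print(rate_per_1000_person_years(82, 10_000))  # 8.2
```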
  5. Abstract Background

    Psychological stress is prevalent among reproductive‐aged men. Assessing semen quality for epidemiological studies is challenging because data collection is expensive and cumbersome, and existing studies evaluating the effect of perceived stress on semen quality are inconsistent.

    Objective

    To examine the association between perceived stress and semen quality.

    Material and methods

    We analyzed baseline data on 644 men (1,159 semen samples) from two prospective preconception cohort studies during 2015–2021: 592 in Pregnancy Study Online (PRESTO) and 52 in SnartForaeldre.dk (SF). At study entry, men aged ≥21 years (PRESTO) and ≥18 years (SF) trying to conceive without fertility treatment completed a questionnaire on reproductive and medical history, socio‐demographics, lifestyle, and the 10‐item version of the Perceived Stress Scale (PSS; possible score range: 0–40). After enrollment (median weeks: 2.1, interquartile range [IQR]: 1.3–3.7), men were invited to perform in‐home semen testing, twice with 7–10 days between tests, using the Trak Male Fertility Testing System. Semen quality was characterized by semen volume, sperm concentration, and total sperm count. We fit generalized estimating equation linear regression models to estimate the percent difference in mean log‐transformed semen parameters across four PSS groups (<10, 10–14, 15–19, ≥20), adjusting for potential confounders.

    Results

    The median PSS score was 15 (IQR: 10–19), and 136 men (21.1%) had a PSS score ≥20. Comparing men with PSS scores ≥20 versus <10, the adjusted percent difference was −2.7 (95% CI: −9.8; 5.0) for semen volume, 6.8 (95% CI: −10.9; 28.1) for sperm concentration, and 4.3 (95% CI: −13.8; 26.2) for total sperm count.

    Conclusion

    Our findings indicate that perceived stress is not materially associated with semen volume, sperm concentration, or total sperm count.

     
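The four PSS exposure groups used above (<10, 10–14, 15–19, ≥20) are a simple binning of the PSS-10 score. A minimal sketch of that grouping as a hypothetical helper (not the study's analysis code):

```python
def pss_group(score):
    """Assign a PSS-10 score (possible range 0-40) to one of the four
    exposure groups used in the analysis: <10, 10-14, 15-19, >=20."""
    if not 0 <= score <= 40:
        raise ValueError("PSS-10 scores range from 0 to 40")
    if score < 10:
        return "<10"
    elif score < 15:
        return "10-14"
    elif score < 20:
        return "15-19"
    return ">=20"

# The reported median score of 15 falls in the third group.
print(pss_group(15))  # 15-19
print(pss_group(22))  # >=20
```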