Title: A validated, real-time prediction model for favorable outcomes in hospitalized COVID-19 patients
Abstract: The COVID-19 pandemic has challenged front-line clinical decision-making, leading to numerous published prognostic tools. However, few models have been prospectively validated and none report implementation in practice. Here, we use 3345 retrospective and 474 prospective hospitalizations to develop and validate a parsimonious model to identify patients with favorable outcomes within 96 h of a prediction, based on real-time lab values, vital signs, and oxygen support variables. In retrospective and prospective validation, the model achieves high average precision (88.6%, 95% CI 88.4–88.7, and 90.8%, 95% CI 90.8–90.8) and discrimination (95.1%, 95% CI 95.1–95.2, and 86.8%, 95% CI 86.8–86.9), respectively. We implemented and integrated the model into the EHR, achieving a positive predictive value of 93.3% at 41% sensitivity. Preliminary results suggest clinicians are adopting these scores into their clinical workflows.
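As a rough illustration of the evaluation reported above (not the authors' implementation), the sketch below trains a placeholder classifier and computes average precision, discrimination (AUROC), and a PPV-targeted operating point with scikit-learn; the feature matrix, labels, and the logistic-regression stand-in are all hypothetical.

```python
# Minimal sketch (not the authors' code): evaluating a favorable-outcome
# classifier with the metrics reported above, using scikit-learn.
# `X` stands in for real-time lab values, vital signs, and oxygen-support
# features; `y` is 1 if a favorable outcome occurred within 96 h of prediction.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import average_precision_score, roc_auc_score, precision_recall_curve

rng = np.random.default_rng(0)
X = rng.normal(size=(3345, 20))          # placeholder feature matrix
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=3345) > 0).astype(int)

X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
scores = model.predict_proba(X_val)[:, 1]

print("average precision:", average_precision_score(y_val, scores))
print("AUROC (discrimination):", roc_auc_score(y_val, scores))

# Pick the lowest threshold whose precision (PPV) meets a 93% target,
# then report the sensitivity (recall) achieved at that operating point.
prec, rec, thr = precision_recall_curve(y_val, scores)
ok = np.where(prec[:-1] >= 0.93)[0]
if ok.size:
    i = ok[0]
    print(f"threshold={thr[i]:.3f}  PPV={prec[i]:.3f}  sensitivity={rec[i]:.3f}")
```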
Award ID(s): 1928614
NSF-PAR ID: 10347275
Journal Name: npj Digital Medicine
Volume: 3
Issue: 1
ISSN: 2398-6352
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Survival and second malignancy prediction models can aid clinical decision making. Most commonly, survival analysis studies are performed using traditional proportional hazards models, which require strong assumptions and can lead to biased estimates if those assumptions are violated. Therefore, this study aims to implement an alternative machine learning (ML) model for survival analysis: the Random Survival Forest (RSF). In this study, RSFs were built using data from the U.S. Surveillance, Epidemiology, and End Results (SEER) program to (1) predict 30-year survival in pediatric, adolescent, and young adult cancer survivors; and (2) predict the risk and site of a second tumor within 30 years of the first tumor diagnosis in these age groups. The final RSF model for pediatric, adolescent, and young adult survival has an average concordance index (C-index) of 92.9%, 94.2%, and 94.4% and an average time-dependent area under the receiver operating characteristic curve (AUC) at 30 years since first diagnosis of 90.8%, 93.6%, and 96.1%, respectively. The final RSF model for pediatric, adolescent, and young adult second malignancy has an average C-index of 86.8%, 85.2%, and 88.6% and an average time-dependent AUC at 30 years since first diagnosis of 76.5%, 88.1%, and 99.0%, respectively. This study suggests the robustness and potential clinical value of ML models for alleviating physician burden by quickly identifying the highest-risk individuals.
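A minimal sketch of the modeling and evaluation approach named above (Random Survival Forest, C-index, time-dependent AUC), using scikit-survival on synthetic data; the covariates, follow-up times, and hyperparameters are stand-ins, not the SEER variables or settings used in the study.

```python
# Minimal sketch (not the study's code): fit a Random Survival Forest and
# score it with the C-index and time-dependent AUC, as in the abstract above.
import numpy as np
from sksurv.ensemble import RandomSurvivalForest
from sksurv.metrics import concordance_index_censored, cumulative_dynamic_auc
from sksurv.util import Surv

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 8))                           # placeholder covariates
time = rng.exponential(scale=15 + 5 * X[:, 0].clip(min=0), size=n).clip(0.1, 30)
event = rng.random(n) < 0.6                           # True = event observed

y = Surv.from_arrays(event=event, time=time)          # structured survival array
train, test = slice(0, 400), slice(400, n)

rsf = RandomSurvivalForest(n_estimators=200, min_samples_leaf=15, random_state=0)
rsf.fit(X[train], y[train])

risk = rsf.predict(X[test])                           # higher = higher predicted risk
cindex = concordance_index_censored(event[test], time[test], risk)[0]

eval_times = np.array([5.0, 10.0, 15.0, 20.0])        # years since diagnosis
auc, mean_auc = cumulative_dynamic_auc(y[train], y[test], risk, eval_times)
print(f"C-index: {cindex:.3f}  time-dependent AUC: {np.round(auc, 3)}")
```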
  2. Transparent electromagnetic interference (EMI) shielding is needed in many optoelectronic applications to protect electronic devices from surrounding radiation while allowing for high visible light transmission. However, structures with very high transmission (over 92.5%) and high EMI shielding efficiency (over 30 dB) have yet to be achieved in the literature. Bayesian optimization is used to optimize different nanophotonic structures for high EMI shielding efficiency (SE) and high average visible light transmission (T̄_vis). Below 90% average visible light transmission, sandwich structures consisting of high-index dielectric/silver/high-index dielectric films are determined to be optimal, achieving 43.1 dB SE and 90.0% T̄_vis. The high-refractive-index dielectric layers reduce absorption losses in the silver and can be engineered to provide antireflection through destructive interference. However, for optimal EMI shielding with T̄_vis above 90%, the reflection losses at the air/dielectric interfaces need to be further reduced. Optimized double-sided nanocone sandwich structures are determined to be best, achieving 41.2 dB SE and 90.8% T̄_vis as well as 35.6 dB SE and 95.1% T̄_vis. K-means clustering is utilized to show the performance of characteristic near-Pareto-optimal structures. Double-sided nanocone structures are shown to exhibit omnidirectional visible transmission with SE = 35.6 dB and over 85% T̄_vis at incidence angles of 70°.
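A toy sketch of the Bayesian-optimization step described above, using scikit-optimize's gp_minimize over hypothetical layer thicknesses; the figure of merit below is a crude placeholder, not the transfer-matrix optical/EMI model the paper would actually evaluate.

```python
# Minimal sketch (not the paper's code): Bayesian optimization of a
# dielectric/Ag/dielectric sandwich, searching layer thicknesses that trade off
# shielding efficiency (SE) against average visible transmittance (T_vis).
import numpy as np
from skopt import gp_minimize
from skopt.space import Real

def figure_of_merit(params):
    d_top, d_ag, d_bot = params                 # layer thicknesses in nm
    se = 10 * np.log1p(d_ag)                    # thicker silver -> more shielding (toy model)
    t_vis = np.exp(-d_ag / 40.0) * (1 - 0.001 * abs(d_top - d_bot))  # thicker silver -> less transmission
    # Minimize the negative of a weighted combination of the two goals.
    return -(0.5 * se / 50.0 + 0.5 * t_vis)

space = [Real(20, 80, name="d_top_nm"),
         Real(5, 25, name="d_ag_nm"),
         Real(20, 80, name="d_bot_nm")]

result = gp_minimize(figure_of_merit, space, n_calls=40, random_state=0)
print("best thicknesses (nm):", np.round(result.x, 1), "objective:", round(result.fun, 3))
```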
  3. The complexity of transplant medicine pushes the boundaries of innate human reasoning. From networks of immune modulators to dynamic pharmacokinetics to variable postoperative graft survival to equitable allocation of scarce organs, machine learning promises to inform clinical decision making by deciphering prodigious amounts of available data. This paper reviews current research describing how algorithms have the potential to augment clinical practice in solid organ transplantation. We provide a general introduction to different machine learning techniques, describing their strengths, limitations, and barriers to clinical implementation. We summarize emerging evidence that recent advances allow machine learning algorithms to predict acute post-surgical and long-term outcomes, classify biopsy and radiographic data, augment pharmacologic decision making, and accurately represent the complexity of the host immune response. Yet, many of these applications exist in pre-clinical form only, supported primarily by evidence from single-center, retrospective studies. Prospective investigation of these technologies could unlock the potential of machine learning to augment clinical care and health care delivery in solid organ transplantation.
  4. Falls impose a substantial burden on individuals, health care systems, and society as a whole. Technologies that can detect individuals at elevated fall risk before a fall occurs could help reduce this burden by targeting those individuals for rehabilitation. Wearable technologies in particular, which can continuously monitor aspects of gait, balance, vital signs, and other aspects of health known to be related to falls, may be useful and are in need of study. A systematic review was conducted in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) 2009 guidelines to identify articles related to the use of wearable sensors to predict fall risk. Fifty-four studies were analyzed. The majority of studies (98.0%) utilized inertial measurement units (IMUs), most often located at the lower back (58.0%), sternum (28.0%), and shins (28.0%). Most assessments were conducted in a structured setting (67.3%) rather than with free-living data. Fall risk was calculated based on retrospective falls history (48.9%), prospective falls reporting (36.2%), or clinical scales (19.1%). Measures of the duration spent walking and standing during free-living monitoring, linear measures such as gait speed and step length, and nonlinear measures such as entropy correlate with fall risk, and machine learning methods can distinguish between fallers and non-fallers. However, because many studies generating machine learning models did not list the exact factors being considered, it is difficult to compare these models directly. Few studies to date have used their results to give patients feedback about fall risk or to supply treatment or lifestyle suggestions for fall prevention, though end users consider these important. Wearable technology demonstrates considerable promise in detecting subtle changes in biomarkers of gait and balance related to an increase in fall risk. However, more large-scale studies measuring increasing fall risk before a first fall are needed, and the exact biomarkers and machine learning methods used need to be shared so that results can be compared and the most promising fall risk measurements pursued. There is also a great need for fall risk monitoring devices to supply patients with information about their fall risk and with strategies and treatments for prevention.
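To make the pipeline described above concrete, here is a minimal sketch (not drawn from any reviewed study) that turns a lower-back IMU acceleration trace into simple gait-timing features and trains a classifier to separate fallers from non-fallers; the signals, labels, sampling rate, and thresholds are synthetic assumptions.

```python
# Minimal sketch: IMU-derived gait features plus a classifier for fall risk.
import numpy as np
from scipy.signal import find_peaks
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

FS = 100.0  # IMU sampling rate in Hz (assumed)

def gait_features(accel_vertical):
    """Step-timing features from a vertical acceleration trace."""
    peaks, _ = find_peaks(accel_vertical, distance=int(0.4 * FS))  # >=0.4 s between steps
    step_times = np.diff(peaks) / FS
    if step_times.size < 2:
        return np.array([0.0, 0.0, 0.0])
    return np.array([
        step_times.mean(),                    # mean step time (s)
        step_times.std(),                     # step-time variability
        step_times.std() / step_times.mean(), # coefficient of variation
    ])

# Synthetic cohort: 60 subjects, 30 s of walking each; "fallers" get noisier gait.
rng = np.random.default_rng(0)
t = np.arange(0, 30, 1 / FS)
X, y = [], []
for subject in range(60):
    faller = subject % 2
    cadence = 1.8 + rng.normal(0, 0.15 if faller else 0.05)   # steps per second
    sig = np.sin(2 * np.pi * cadence * t) + rng.normal(0, 0.3 + 0.3 * faller, t.size)
    X.append(gait_features(sig))
    y.append(faller)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated AUC:",
      cross_val_score(clf, np.array(X), np.array(y), cv=5, scoring="roc_auc").mean())
```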
  5. Planning, the process of evaluating the future consequences of actions, is typically formalized as search over a decision tree. This procedure increases expected rewards but is computationally expensive. Past attempts to understand how people mitigate the costs of planning have been guided by heuristics or the accumulation of prior experience, both of which are intractable in novel, high-complexity tasks. In this work, we propose a normative framework for optimizing the depth of tree search. Specifically, we model a metacognitive process via Bayesian inference to compute optimal planning depth. We show that our model makes sensible predictions over a range of parameters without relying on retrospection and that integrating past experiences into our model produces results that are consistent with the transition from goal-directed to habitual behavior over time and the uncertainty associated with prospective and retrospective estimates. Finally, we derive an online variant of our model that replicates these results. 
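A toy sketch in the spirit of the abstract above (not the authors' Bayesian metacognitive model): choose a planning depth by trading off the expected gain from searching deeper against the computational cost of expanding a larger tree. The branching factor, cost constant, and value-gain curve are illustrative assumptions.

```python
# Toy depth-selection sketch: net benefit = expected value gain - search cost.
import numpy as np

BRANCHING = 3          # actions considered per state (assumed)
COST_PER_NODE = 0.002  # subjective cost of evaluating one node (assumed)

def expected_value_gain(depth, uncertainty=1.0):
    """Diminishing returns: deeper search resolves less residual uncertainty."""
    return uncertainty * (1 - np.exp(-0.7 * depth))

def nodes_expanded(depth):
    """Number of nodes in a full tree of the given depth."""
    return (BRANCHING ** (depth + 1) - 1) // (BRANCHING - 1)

def optimal_depth(max_depth=10, uncertainty=1.0):
    depths = np.arange(max_depth + 1)
    net = [expected_value_gain(d, uncertainty) - COST_PER_NODE * nodes_expanded(d)
           for d in depths]
    return int(depths[np.argmax(net)])

# Higher outcome uncertainty favors deeper search; higher cost favors shallower.
print("optimal planning depth (high uncertainty):", optimal_depth(uncertainty=1.0))
print("optimal planning depth (low uncertainty): ", optimal_depth(uncertainty=0.2))
```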