This content will become publicly available on November 4, 2025

Title: Wasserstein-Based Similarity Constrained Matrix Factorization for Drug-Drug Interaction Prediction
Award ID(s): 1741490, 1840265
PAR ID: 10566345
Author(s) / Creator(s):
Publisher / Repository: IEEE
Date Published:
ISBN: 979-8-3503-7375-2
Page Range / eLocation ID: 49 to 53
Format(s): Medium: X
Location: Cambridge, MA, USA
Sponsoring Org: National Science Foundation
More Like this
  1. Drug resistance is one of the fundamental challenges in modern medicine. Using combinations of drugs is an effective way to counter drug resistance, as it is harder to develop resistance to multiple drugs simultaneously. Finding the correct dosage for each drug in the combination, however, remains a challenging task. Testing all possible drug-drug combinations at various dosages on various cell lines in wet-lab experiments is infeasible: the space of combinations and dosages is vast, the drugs and cell lines are limited in availability, and each wet-lab test is costly and time-consuming. Efficient and accurate in silico prediction methods are therefore needed. Here we present a novel computational method, PartialFibers, to address this challenge. Unlike existing prediction methods, PartialFibers takes advantage of the distribution of the missing drug-drug interactions and effectively predicts the dosage of a drug in the combination. Our results on real datasets demonstrate that PartialFibers is more flexible, more scalable, and achieves higher accuracy in less time than state-of-the-art algorithms. (The general matrix-completion idea behind such methods is sketched after this list.)
  2. Evans, Conor L.; Chan, Kin Foong (Ed.)
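
The records above frame drug-drug interaction prediction as completing a partially observed interaction matrix. Below is a minimal sketch of that general idea: plain masked matrix factorization fit by gradient descent on synthetic data. It is an illustration only, not the PartialFibers algorithm or the Wasserstein-based similarity-constrained factorization of the titled paper; the rank, learning rate, regularization weight, and synthetic data are all illustrative assumptions.

# Minimal sketch of masked matrix factorization for drug-drug interaction
# prediction. Generic illustration only, NOT PartialFibers or the
# Wasserstein-constrained method from the records above; rank, learning
# rate, regularization, and data are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

n_drugs, rank = 30, 5
# Synthetic low-rank interaction scores; NaN marks an untested drug pair.
true_scores = rng.normal(size=(n_drugs, rank)) @ rng.normal(size=(rank, n_drugs))
Y = true_scores.copy()
mask = rng.random(Y.shape) < 0.5          # True where the pair was observed
Y[~mask] = np.nan

# Factorize Y ~ U @ V.T using only the observed entries.
U = 0.1 * rng.normal(size=(n_drugs, rank))
V = 0.1 * rng.normal(size=(n_drugs, rank))
lr, lam = 0.01, 0.1
for _ in range(500):
    pred = U @ V.T
    err = np.where(mask, pred - Y, 0.0)   # gradient comes only from observed pairs
    U -= lr * (err @ V + lam * U)         # gradient step with L2 regularization
    V -= lr * (err.T @ U + lam * V)

# Completed matrix gives predicted scores for the untested drug pairs.
completed = U @ V.T
rmse = np.sqrt(np.mean((completed[~mask] - true_scores[~mask]) ** 2))
print("RMSE on held-out entries:", rmse)

The published methods differ from this sketch mainly in the constraints added to the factors, e.g. regularizers that pull similar drugs (under a chosen similarity measure, such as a Wasserstein distance between drug feature distributions) toward similar latent rows, rather than the plain L2 penalty used here.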