Title: Decoding of Code-Multiplexed Coulter Sensor Signals via Deep Learning
Code-multiplexed Coulter sensors can easily be integrated into microfluidic devices and provide information on spatiotemporal manipulations of suspended particles for quantitative sample assessment. In this paper, we introduce a deep learning-based decoding algorithm to process the output waveform from a network of code-multiplexed Coulter sensors on a microfluidic device. Our deep learning-based algorithm both simplifies the design of coded Coulter sensors and increases the signal processing speed. As a proof of principle, we designed and fabricated a microfluidic platform with 10 code-multiplexed Coulter sensors and used a suspension of human ovarian cancer cells as a test sample to characterize the system. Our deep learning-based algorithm resulted in an 87% decoding accuracy at a sample processing speed of 800 particles/s.
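The record itself does not include code, but as a rough illustration of the kind of decoder described, here is a minimal 1D-CNN sketch in PyTorch. The window length, layer sizes, and the 10-way sensor-code output are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch (not the authors' implementation): a 1D CNN that maps a
# fixed-length window of the multiplexed Coulter waveform to one of 10
# sensor codes. Window length, channel counts, and layer sizes are assumptions.
import torch
import torch.nn as nn

class CoulterCodeDecoder(nn.Module):
    def __init__(self, n_sensors: int = 10, window: int = 1024):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(4),
        )
        self.classifier = nn.Linear(32 * (window // 16), n_sensors)

    def forward(self, x):  # x: (batch, 1, window) raw impedance samples
        f = self.features(x)
        return self.classifier(f.flatten(1))  # logits over the sensor codes

# Example: classify one 1024-sample waveform window
model = CoulterCodeDecoder()
logits = model(torch.randn(1, 1, 1024))
predicted_sensor = logits.argmax(dim=1)
```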
Award ID(s):
1752170
PAR ID:
10399311
Author(s) / Creator(s):
Date Published:
Journal Name:
Transducers 2019
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Beyond their conventional use of counting and sizing particles, Coulter sensors can be used to spatially track suspended particles, with multiple sensors distributed over a microfluidic chip. Code-multiplexing of Coulter sensors allows such integration to be implemented with simple hardware but requires advanced signal processing to extract multi-dimensional information from the output waveform. In this work, we couple deep learning-based signal analysis with microfluidic code-multiplexed Coulter sensor networks. Specifically, we train convolutional neural networks to analyze Coulter waveforms not only to recognize certain sensor waveform patterns but also to resolve interferences among them. Our technology predicts the size, speed, and location of each detected particle. We show that the algorithm yields a >90% pattern recognition accuracy for distinguishing non-correlated waveform patterns at a processing speed that can potentially enable real-time microfluidic assays. Furthermore, once trained, the algorithm can readily be applied for processing electrical data from other microfluidic devices integrated with the same Coulter sensor network. 
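As a companion illustration of the multi-output prediction named in this abstract (size, speed, and location per particle), a minimal multi-head sketch follows; the architecture and training losses are assumptions, not the paper's network.

```python
# Sketch (assumed architecture): shared 1D-CNN trunk with three heads,
# reflecting the outputs named in the abstract: which coded sensor the
# particle passed (location), its size, and its transit speed.
import torch
import torch.nn as nn

class MultiTaskCoulterNet(nn.Module):
    def __init__(self, n_sensors: int = 10):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Conv1d(1, 32, 7, padding=3), nn.ReLU(), nn.MaxPool1d(8),
            nn.Conv1d(32, 64, 7, padding=3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
        )
        self.location = nn.Linear(64, n_sensors)  # which coded sensor fired
        self.size = nn.Linear(64, 1)              # pulse amplitude -> particle size
        self.speed = nn.Linear(64, 1)             # pulse duration -> transit speed

    def forward(self, x):  # x: (batch, 1, window)
        h = self.trunk(x)
        return self.location(h), self.size(h), self.speed(h)

# Training would combine a cross-entropy loss on the location head with
# regression losses (e.g., MSE) on the size and speed heads.
```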
  2. Coulter counters electrically detect and size suspended particles from intermittent changes in impedance between electrodes. By combining impedance-based sensing with microfabrication, Coulter counters can be distributed across a lab-on-a-chip platform for code-multiplexed monitoring of microfluidic manipulations. In this paper, we augment a code-multiplexed Coulter sensor network with a deep learning-based decoding algorithm for multiplexed detection of cancer cells sorted into different microfluidic channels.
  3. Microfluidic devices integrated with Coulter sensors have been widely used to count and characterize suspended particles. The electrodes in these devices are mostly arranged in a coplanar fashion because of the simple fabrication process, but this geometry leads to non-uniform electric fields confined to the floor of the microfluidic channel. We have recently introduced a simple fabrication method that can effortlessly create parallel electrodes in microfluidic devices built with soft lithography. In this paper, we theoretically and experimentally analyze the developed parallel-electrode Coulter sensor and compare its sensitivity with that of a Coulter sensor built on conventional coplanar electrodes. Both our simulation results and experiments with cell suspensions show that the parallel-electrode Coulter sensor can provide as much as ~5× sensitivity improvement over conventional coplanar electrodes.
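For context on the sensitivity discussion, the pulse a Coulter sensor must resolve scales with the particle's displaced volume; a back-of-the-envelope estimate using the standard DeBlois-Bean small-particle approximation is sketched below. The values are illustrative, and the ~5× parallel-vs-coplanar result above comes from the paper's simulations and experiments, not from this formula.

```python
# Order-of-magnitude Coulter pulse estimate using the standard small-particle
# approximation deltaR ~ 4*rho*d^3 / (pi*D^4) for an insulating sphere of
# diameter d in an aperture of effective diameter D filled with electrolyte of
# resistivity rho. Values below are illustrative, not the paper's geometry.
import math

rho = 0.7      # electrolyte resistivity, ohm*m (roughly PBS)
d   = 15e-6    # cell diameter, m
D   = 30e-6    # effective aperture (channel) diameter, m

delta_R = 4 * rho * d**3 / (math.pi * D**4)
print(f"Approximate resistance pulse: {delta_R:.0f} ohm")
```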
  4. Microfluidic technologies have long enabled the manipulation of flow-driven cells en masse under a variety of force fields with the goal of characterizing them or discriminating the pathogenic ones. On the other hand, a microfluidic platform is typically designed to function under optimized conditions, which rarely account for specimen heterogeneity and internal/external perturbations. In this work, we demonstrate a proof-of-principle adaptive microfluidic system that consists of an integrated network of distributed electrical sensors for on-chip tracking of cells and closed-loop feedback control that modulates chip parameters based on the sensor data. In our system, cell flow speed is measured at multiple locations throughout the device, the data are interpreted in real time via deep learning-based algorithms, and a proportional-integral feedback controller updates a programmable pressure pump to maintain a desired cell flow speed. We validate the adaptive microfluidic system with both static and dynamic targets and observe fast convergence of the system under continuous external perturbations. With the ability to sustain optimal processing conditions in unsupervised settings, adaptive microfluidic systems would be less prone to artifacts and could eventually serve as reliable, standardized biomedical tests at the point of care.
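A minimal sketch of the closed-loop update described in this abstract, assuming a simple PI law; the gains, limits, sampling period, and the pump and sensor interfaces are hypothetical placeholders rather than the paper's values.

```python
# Minimal proportional-integral (PI) control loop of the kind described:
# measured cell flow speed is compared to a setpoint and the drive pressure
# is corrected each cycle. All numbers and interfaces here are assumptions.

class PIController:
    def __init__(self, kp: float, ki: float, dt: float):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def update(self, setpoint: float, measured: float) -> float:
        error = setpoint - measured
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral  # pressure correction

def control_loop(read_speed, set_pressure, target_speed_mm_s=20.0,
                 base_pressure_mbar=100.0, dt=0.1, steps=1000):
    """read_speed() and set_pressure() are hypothetical hooks into the
    on-chip sensor readout and the programmable pressure pump."""
    pi = PIController(kp=0.5, ki=0.1, dt=dt)
    pressure = base_pressure_mbar
    for _ in range(steps):
        speed = read_speed()                       # latest speed estimate from the sensors
        pressure += pi.update(target_speed_mm_s, speed)
        pressure = max(0.0, min(pressure, 500.0))  # clamp to a safe pump range
        set_pressure(pressure)
```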
  5. Accurately capturing dynamic scenes with wide-ranging motion and light intensity is crucial for many vision applications. However, acquiring high-speed high dynamic range (HDR) video is challenging because the camera's frame rate restricts its dynamic range. Existing methods sacrifice speed to acquire multi-exposure frames, yet misaligned motion in these frames can still pose complications for HDR fusion algorithms, resulting in artifacts. Instead of frame-based exposures, we sample the video using individual pixels at varying exposures and phase offsets. Implemented on a monochrome pixel-wise programmable image sensor, our sampling pattern captures fast motion at a high dynamic range. We then transform the pixel-wise outputs into an HDR video using end-to-end learned weights from deep neural networks, achieving high spatiotemporal resolution with minimal motion blurring. We demonstrate aliasing-free HDR video acquisition at 1000 FPS, resolving fast motion under low-light conditions and against bright backgrounds, both of which are challenging for conventional cameras. By combining the versatility of pixel-wise sampling patterns with the strength of deep neural networks at decoding complex scenes, our method greatly enhances the vision system's adaptability and performance in dynamic conditions. Index Terms: high-dynamic-range video, high-speed imaging, CMOS image sensors, programmable sensors, deep learning, convolutional neural networks.
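A small sketch of a tiled per-pixel exposure/phase pattern in the spirit of this abstract; the 2×2 tile, the exposure set, and the phase offsets are assumptions for illustration rather than the sensor's actual programming.

```python
# Sketch of a pixel-wise sampling pattern: each pixel in a repeating tile gets
# its own exposure length and phase offset (in sensor frame units). The tile
# layout and values below are illustrative assumptions, not the paper's design.
import numpy as np

def exposure_phase_maps(height: int, width: int):
    exposures = np.array([[1, 4],
                          [2, 8]])          # relative exposure lengths per tile
    phases    = np.array([[0, 1],
                          [2, 3]])          # start offsets per tile
    tile_y = -(-height // 2)                # ceiling division to cover the frame
    tile_x = -(-width // 2)
    exp_map   = np.tile(exposures, (tile_y, tile_x))[:height, :width]
    phase_map = np.tile(phases,    (tile_y, tile_x))[:height, :width]
    return exp_map, phase_map

exp_map, phase_map = exposure_phase_maps(480, 640)
# A learned network would then fuse the per-pixel samples into HDR frames.
```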