
Title: Intelligent nanoscope for rapid nanomaterial identification and classification
Machine learning–based image recognition and classification of particles and materials is a rapidly expanding field. However, nanomaterial identification and classification depend on image resolution, image field of view, and processing time. Optical microscopes are among the most widely used instruments in laboratories worldwide because they can nondestructively identify and classify critical micro-sized objects and processes. Identifying and classifying critical nano-sized objects and processes, however, lies beyond the capability of a conventional microscope because of the diffraction limit of its optics and its small field of view. To overcome these challenges, we developed an intelligent nanoscope that combines machine learning with microsphere array-based imaging to: (1) surpass the diffraction limit of the microscope objective with microsphere imaging to provide high-resolution images; (2) provide large field-of-view imaging without sacrificing resolution by utilizing a microsphere array; and (3) rapidly classify nanomaterials using a deep convolutional neural network. The intelligent nanoscope delivers more than 46 magnified images from a single image frame, allowing us to collect more than 1000 images within 2 seconds. Moreover, the intelligent nanoscope achieves 95% nanomaterial classification accuracy with a training set of 1000 images, 45% more accurate than without the microsphere array, and 92% bacteria classification accuracy with a training set of 50,000 images, 35% more accurate than without the microsphere array. This platform accomplishes rapid, accurate detection and classification of nanomaterials with minuscule size differences. The capabilities of this device hold the potential to detect and classify even smaller biological nanomaterials, such as viruses or extracellular vesicles.
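A minimal sketch (not the authors' code) of how the microsphere-array imaging step can multiply the dataset: given one camera frame and a grid of microsphere positions, one training patch is cropped per microsphere, so a single frame yields dozens of images for the classifier. The grid layout, spacing, and patch size below are illustrative assumptions.

```python
import numpy as np

def extract_microsphere_patches(frame, centers, patch=32):
    """Crop one square patch per microsphere center from a single frame.

    frame   : 2D numpy array (one camera image)
    centers : iterable of (row, col) microsphere centers, assumed known
    patch   : side length of each cropped patch (illustrative value)
    """
    half = patch // 2
    patches = []
    for r, c in centers:
        # skip spheres whose patch would fall outside the frame
        if r - half < 0 or c - half < 0 \
                or r + half > frame.shape[0] or c + half > frame.shape[1]:
            continue
        patches.append(frame[r - half:r + half, c - half:c + half])
    return np.stack(patches)

# toy usage: a hypothetical 7 x 7 microsphere grid yields 49 patches per frame
frame = np.random.rand(300, 300)
grid = [(40 * i + 20, 40 * j + 20) for i in range(7) for j in range(7)]
patches = extract_microsphere_patches(frame, grid)
print(patches.shape)  # (49, 32, 32)
```

Each cropped patch would then be fed to the classification network as an independent training or inference sample.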
Award ID(s): 2104295, 1807601
Journal Name: Lab on a Chip
Sponsoring Org: National Science Foundation
More Like this
  1. This paper experimentally examines different configurations of a multi-camera array microscope (MCAM) imaging technology. The MCAM is based on a densely packed array of “micro-cameras” that jointly image across a large field-of-view (FOV) at high resolution. Each micro-camera within the array images a unique area of a sample of interest, and all data acquired by the 54 micro-cameras are then digitally combined into composite frames whose total pixel counts significantly exceed those of standard microscope systems. We present results from three unique MCAM configurations for different use cases. First, we demonstrate a configuration that simultaneously images and estimates the 3D object depth across a 100 × 135 mm² FOV at approximately 20 µm resolution, which results in 0.15 gigapixels (GP) per snapshot. Second, we demonstrate an MCAM configuration that records video across a continuous 83 × 123 mm² FOV with twofold increased resolution (0.48 GP per frame). Finally, we report a third high-resolution configuration (2 µm resolution) that can rapidly produce 9.8 GP composites of large histopathology specimens.
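The digital combination step above can be sketched as tiling per-camera images into one composite array. This toy numpy version assumes non-overlapping tiles on a regular 6 × 9 grid (a hypothetical layout for 54 cameras); the real MCAM registers and blends overlapping fields of view.

```python
import numpy as np

def compose_tiles(tiles, rows, cols):
    """Assemble per-camera tiles into one composite frame.

    tiles : array of shape (rows*cols, H, W), one image per micro-camera
    Assumes non-overlapping tiles on a regular grid; a real system would
    register and blend overlapping fields of view before compositing.
    """
    n, H, W = tiles.shape
    assert n == rows * cols
    composite = np.zeros((rows * H, cols * W), dtype=tiles.dtype)
    for k in range(n):
        i, j = divmod(k, cols)  # row-major placement of tile k
        composite[i * H:(i + 1) * H, j * W:(j + 1) * W] = tiles[k]
    return composite

# 54 micro-cameras arranged as a hypothetical 6 x 9 grid
tiles = np.random.rand(54, 120, 160)
mosaic = compose_tiles(tiles, rows=6, cols=9)
print(mosaic.shape)  # (720, 1440)
```

The composite's pixel count is simply the sum of the per-camera pixel counts, which is how the gigapixel-scale frames quoted above arise.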

  2. Optical imaging with nanoscale resolution and a large field of view is highly desirable in many research areas. Unfortunately, it is challenging to achieve these two features simultaneously while using a conventional microscope. An objective lens with a low numerical aperture (NA) has a large field of view but poor resolution. In contrast, a high NA objective lens will have a higher resolution but reduced field of view. In an effort to close the gap between these trade-offs, we introduce an acoustofluidic scanning nanoscope (AS-nanoscope) that can simultaneously achieve high resolution with a large field of view. The AS-nanoscope relies on acoustofluidic-assisted scanning of multiple microsized particles. A scanned 2D image is then compiled by processing the microparticle images using an automated big-data image algorithm. The AS-nanoscope has the potential to be integrated into a conventional microscope or could serve as a stand-alone instrument for a wide range of applications where both high resolution and large field of view are required. 
  3. Traditional miniaturized fluorescence microscopes are critical tools for modern biology. Invariably, they struggle to simultaneously image with a high spatial resolution and a large field of view (FOV). Lensless microscopes offer a solution to this limitation. However, real-time visualization of samples is not possible with lensless imaging, as image reconstruction can take minutes to complete. This poses a challenge for usability, as real-time visualization is a crucial feature that assists users in identifying and locating the imaging target. The issue is particularly pronounced in lensless microscopes that operate at close imaging distances, where shift-varying deconvolution is required to account for the variation of the point spread function (PSF) across the FOV. Here, we present a lensless microscope that achieves real-time image reconstruction by eliminating the use of an iterative reconstruction algorithm. The neural network-based reconstruction method shown here achieves a more than 10,000-fold increase in reconstruction speed compared to iterative reconstruction. The increased reconstruction speed allows us to visualize the results of our lensless microscope at more than 25 frames per second (fps) while achieving better than 7 µm resolution over a FOV of 10 mm². This ability to reconstruct and visualize samples in real time enables a more user-friendly interaction with lensless microscopes: users can operate them much as they currently do conventional microscopes.
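The speed gap between iterative and one-shot reconstruction can be illustrated on a toy linear imaging model (an illustrative numpy sketch, not the paper's network): an iterative gradient solver needs many forward/adjoint operator applications, while a precomputed reconstructor (here simply the pseudoinverse, standing in for one trained-network forward pass) needs a single application.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((256, 64)) / 8.0  # toy forward (imaging) operator
x_true = rng.standard_normal(64)          # toy "scene"
y = A @ x_true                            # noiseless measurement

# Iterative reconstruction: gradient descent on ||A x - y||^2,
# thousands of forward/adjoint passes per image
step = 1.0 / np.linalg.norm(A, 2) ** 2
x_iter = np.zeros(64)
for _ in range(2000):
    x_iter -= step * (A.T @ (A @ x_iter - y))

# One-shot reconstruction: a single precomputed linear map
# (a stand-in for one neural-network forward pass)
R = np.linalg.pinv(A)
x_fast = R @ y

print(np.linalg.norm(x_iter - x_true), np.linalg.norm(x_fast - x_true))
```

Both recover the scene on this well-posed toy problem, but the one-shot path does per-image work proportional to a single operator application, which is the source of the real-time frame rates quoted above.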

  4. Diffraction-limited optical imaging through scattering media has the potential to transform many applications such as airborne and space-based imaging (through the atmosphere), bioimaging (through skin and human tissue), and fiber-based imaging (through fiber bundles). Existing wavefront shaping methods can image through scattering media and other obscurants by optically correcting wavefront aberrations using high-resolution spatial light modulators—but these methods generally require (i) guidestars, (ii) controlled illumination, (iii) point scanning, and/or (iv) static scenes and aberrations. We propose neural wavefront shaping (NeuWS), a scanning-free wavefront shaping technique that integrates maximum likelihood estimation, measurement modulation, and neural signal representations to reconstruct diffraction-limited images through strong static and dynamic scattering media without guidestars, sparse targets, controlled illumination, or specialized image sensors. We experimentally demonstrate guidestar-free, wide field-of-view, high-resolution, diffraction-limited imaging of extended, nonsparse, and static/dynamic scenes captured through static/dynamic aberrations.

  5. Optical projection tomography (OPT) is a three-dimensional (3D) fluorescence imaging technique in which projection images are acquired for varying orientations of a sample using a large depth of field. OPT is typically applied to millimeter-sized specimens, because rotating a microscopic specimen is challenging and not compatible with live cell imaging. In this Letter, we demonstrate fluorescence optical tomography of a microscopic specimen by laterally translating the tube lens of a wide-field optical microscope, which allows for high-resolution OPT without rotating the sample. The cost is a reduction of the field of view to roughly half along the direction of the tube lens translation. Using bovine pulmonary artery endothelial cells and 0.1 µm beads, we compare the 3D imaging performance of the proposed method with that of the conventional objective-focus scan method.
