Vol. 1, Issue 4, 046001 (2019)
Jingxi Li1,2,3,†, Deniz Mengu1,2,3, Yi Luo1,2,3, Yair Rivenson1,2,3, and Aydogan Ozcan1,2,3,*
Author Affiliations
• 1University of California at Los Angeles, Department of Electrical and Computer Engineering, Los Angeles, California, United States
• 2University of California at Los Angeles, Department of Bioengineering, Los Angeles, California, United States
• 3University of California at Los Angeles, California NanoSystems Institute, Los Angeles, California, United States

Abstract

Optical computing provides unique opportunities in terms of parallelization, scalability, power efficiency, and computational speed, and has attracted major interest for machine learning. Diffractive deep neural networks have been introduced earlier as an optical machine learning framework that uses task-specific diffractive surfaces designed by deep learning to all-optically perform inference, achieving promising performance for object classification and imaging. We demonstrate systematic improvements in diffractive optical neural networks, based on a differential measurement technique that mitigates the strict nonnegativity constraint of light intensity. In this differential detection scheme, each class is assigned to a separate pair of detectors behind a diffractive optical network, and class inference is made by maximizing the normalized signal difference between the photodetector pairs. Using this differential detection scheme, involving 10 photodetector pairs behind 5 diffractive layers with a total of 0.2 million neurons, we numerically achieved blind testing accuracies of 98.54%, 90.54%, and 48.51% for MNIST, Fashion-MNIST, and grayscale CIFAR-10 datasets, respectively. Moreover, by utilizing the inherent parallelization capability of optical systems, we reduced the cross-talk and optical signal coupling between the positive and negative detectors of each class by dividing the optical path into two jointly trained diffractive neural networks that work in parallel. We further made use of this parallelization approach and divided individual classes in a target dataset among multiple jointly trained diffractive neural networks.
Using this class-specific differential detection in jointly optimized diffractive neural networks that operate in parallel, our simulations achieved blind testing accuracies of 98.52%, 91.48%, and 50.82% for MNIST, Fashion-MNIST, and grayscale CIFAR-10 datasets, respectively, coming close to the performance of some of the earlier generations of all-electronic deep neural networks, e.g., LeNet, which achieves classification accuracies of 98.77%, 90.27%, and 55.21% on the same datasets, respectively. In addition to these jointly optimized diffractive neural networks, we also independently optimized multiple diffractive networks and utilized them in a way that is similar to ensemble methods practiced in machine learning; using 3 independently optimized differential diffractive neural networks that optically project their light onto a common output/detector plane, we numerically achieved blind testing accuracies of 98.59%, 91.06%, and 51.44% for MNIST, Fashion-MNIST, and grayscale CIFAR-10 datasets, respectively. Through these systematic advances in designing diffractive neural networks, the reported classification accuracies set the state of the art for all-optical neural network design. The presented framework may help bring low-power, optical neural network-based solutions to various machine learning applications and aid the design of new task-specific computational cameras.
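The ensemble scheme described above, in which several independently optimized differential networks project their light onto a common output/detector plane, can be sketched in a few lines of NumPy. This is a minimal illustration, assuming the networks superpose incoherently so that per-detector intensities simply add before the differential readout; the function name and the toy intensity values are hypothetical.

```python
import numpy as np

def ensemble_differential_inference(I_plus_list, I_minus_list):
    """Inference from an ensemble of differential networks sharing one detector plane.

    Light from each independently optimized network lands on the same
    positive/negative detector pairs, so the per-class intensities add
    (incoherent superposition is assumed here) before the normalized
    differential signal is computed and the winning class is selected.

    I_plus_list, I_minus_list: lists of nonnegative (M,) intensity arrays,
    one pair of arrays per network in the ensemble.
    """
    I_plus = np.sum(I_plus_list, axis=0)    # total positive-detector intensities
    I_minus = np.sum(I_minus_list, axis=0)  # total negative-detector intensities
    I_out = (I_plus - I_minus) / (I_plus + I_minus)
    return int(np.argmax(I_out))

# Toy example with 3 networks and 2 classes: class 1 wins overall.
pred = ensemble_differential_inference(
    [np.array([0.2, 0.7]), np.array([0.1, 0.6]), np.array([0.3, 0.5])],
    [np.array([0.5, 0.1]), np.array([0.4, 0.2]), np.array([0.2, 0.3])],
)
```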
$I_{m,\mathrm{out}} = \dfrac{I_{m,+} - I_{m,-}}{I_{m,+} + I_{m,-}},$ (1)
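The per-class differential readout of Eq. (1) can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation; `I_plus` and `I_minus` are hypothetical arrays holding the positive- and negative-detector intensities for the M classes, and the inferred class is the one that maximizes the normalized difference.

```python
import numpy as np

def differential_inference(I_plus, I_minus):
    """Normalized differential signals per class (Eq. 1) and the inferred class.

    I_plus, I_minus: nonnegative detector intensities, shape (M,).
    Returns the per-class signals (each in [-1, 1]) and the argmax class.
    """
    I_out = (I_plus - I_minus) / (I_plus + I_minus)
    return I_out, int(np.argmax(I_out))

# Toy example: class 2 has the largest normalized difference.
I_plus = np.array([0.3, 0.5, 0.9, 0.2])
I_minus = np.array([0.4, 0.5, 0.1, 0.6])
signals, predicted = differential_inference(I_plus, I_minus)
```

Because both detector intensities are nonnegative, the normalization confines each signal to [-1, 1], which is what relaxes the strict nonnegativity constraint of a single-detector readout.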


$I'_{m,\mathrm{out}} = I_{m,\mathrm{out}} / T,$ (2)


$I'_{m,\mathrm{out}} = \dfrac{I_m}{\max(I_m)} \times T,$ (3)


$\mathrm{Loss} = -\sum_{m=1}^{M} g_m \log(c_m),$ (4)
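Equations (2) and (4) can be combined into a short training-loss sketch. This is an illustrative NumPy version, assuming (as is standard for a cross-entropy loss of this form) that the class scores c_m are a softmax over the temperature-scaled signals and that g_m is a one-hot ground-truth vector; the function name and the value of T are placeholders, not taken from the paper.

```python
import numpy as np

def training_loss(I_out, target, T=0.1):
    """Cross-entropy loss of Eq. (4) on temperature-scaled differential signals.

    I_out: differential signals from Eq. (1), shape (M,).
    target: ground-truth class index.
    T: scaling constant of Eq. (2); the default here is illustrative only.
    """
    scaled = I_out / T                          # Eq. (2)
    c = np.exp(scaled) / np.exp(scaled).sum()   # softmax class scores (assumed)
    g = np.zeros_like(c)                        # one-hot ground truth g_m
    g[target] = 1.0
    return float(-np.sum(g * np.log(c)))        # Eq. (4)
```

Dividing by a small T sharpens the softmax, so confident, well-separated differential signals drive the loss toward zero while ambiguous ones are penalized.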


Jingxi Li, Deniz Mengu, Yi Luo, Yair Rivenson, and Aydogan Ozcan, "Class-specific differential detection in diffractive optical neural networks improves inference accuracy," Advanced Photonics 1(4), 046001 (2019).