Highly Accurate and Efficient 3D Implementations Empowered by Deep Neural Network for 2DLMs-Based Metamaterials
Naixing Feng, Huan Wang, Xuan Wang, Yuxian Zhang, Chao Qian, Zhixiang Huang, and Hongsheng Chen
Streamlining the on-demand design of metamaterials, both forward and inverse, is in high demand for unearthing complex light-matter interactions. Deep learning, as a popular data-driven method, has recently been found to largely alleviate the time-consuming and experience-dependent nature of widely used numerical simulations. In this work, we propose a convolution-based deep neural network (DNN) to implement the inverse design and spectral prediction of a broadband absorber. The DNN not only achieves highly accurate results from small data samples, but also converts the one-dimensional (1D) spectral sequence into a two-dimensional (2D) picture by employing the Markov transition field method, so as to enhance the variability between spectra. From the perspective of a single spectral sample, a spectrum carries too little information for the neural network because of the limited number of sampling points; from the perspective of multiple spectral samples, the gap between different spectra is very small, which can hinder the performance of the inverse-design framework. The Markov transition field method improves the model on both counts. The experimental results show that the final soft accuracies of the one-dimensional fully connected neural network model and the two-dimensional residual neural network model differ by nearly 1%: the one-dimensional residual neural network model reaches 97.6%, while the two-dimensional residual neural network model reaches 98.5%. The model utilizes a data-enhancement approach to improve accuracy, and it also provides a key reference for designing metamaterials based on two-dimensional layered materials (2DLMs) with on-demand properties before they are put into manufacturing.
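The 1D-to-2D conversion mentioned above can be sketched with the standard Markov transition field (MTF) transform: quantize the spectrum into quantile bins, estimate the first-order bin-to-bin transition matrix, then spread those probabilities over all index pairs. This is a minimal illustration under those textbook definitions, not the authors' exact implementation; the function name, bin count, and toy spectrum are assumptions.

```python
import numpy as np

def markov_transition_field(series, n_bins=8):
    """Convert a 1D sequence into an n-by-n Markov transition field image.

    Steps: (1) assign each point to one of n_bins quantile bins,
    (2) estimate the first-order Markov transition matrix between bins,
    (3) set MTF[i, j] to the transition probability bin(x_i) -> bin(x_j).
    """
    x = np.asarray(series, dtype=float)
    # Interior quantile edges define the bins; digitize maps to 0..n_bins-1.
    edges = np.quantile(x, np.linspace(0.0, 1.0, n_bins + 1)[1:-1])
    bins = np.digitize(x, edges)
    # Count transitions between adjacent points, then row-normalize.
    W = np.zeros((n_bins, n_bins))
    for a, b in zip(bins[:-1], bins[1:]):
        W[a, b] += 1
    row_sums = W.sum(axis=1, keepdims=True)
    W = np.divide(W, row_sums, out=np.zeros_like(W), where=row_sums > 0)
    # Pairwise lookup yields the 2D "image" representation of the spectrum.
    return W[bins[:, None], bins[None, :]]

# Example: a spectrum-like 1D sequence of 64 points becomes a 64x64 image,
# which a 2D (e.g. residual) CNN can then consume.
spectrum = np.sin(np.linspace(0, 4 * np.pi, 64)) + 0.1 * np.cos(np.linspace(0, 20, 64))
mtf_image = markov_transition_field(spectrum, n_bins=8)
```

Because each MTF entry is a transition probability, the resulting image lies in [0, 1] and can be fed to an image-classification backbone without further scaling.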