
Big Data at the Nanoscale

Published on Wednesday 22 January 2020

An international team of scientists, including physicists from the University of Luxembourg, has reported a comprehensive viewpoint on how machine learning approaches can be used in nanoscience to analyse and extract new insights from large data sets, to accelerate material discovery, and to guide experimental design. The researchers also discuss some of the main physical challenges behind the realisation of tailored memristive devices for machine learning.

The researchers have published a Mini Review in Nano Letters, the prestigious journal of the American Chemical Society committed to publishing key advances in fundamental nanoscience. The article was produced in cooperation with researchers at Boston University, the University of Pennsylvania, the US Naval Research Laboratory, and the Interuniversity Microelectronics Centre (Belgium), the world-leading R&D and innovation hub in nanoelectronics and digital technologies.

In nanoscience, high-throughput experiments enabled by the small size of nanoscale samples and by rapid, high-resolution imaging tools are becoming increasingly widespread. For example, in nanophotonics and catalysis, material properties have been varied systematically across a single wafer-sized substrate and characterised locally using high-resolution scanning probe and optical or electron micro-spectroscopy techniques. These and similar methods can generate data sets that are too vast and complex for researchers to parse without computational assistance; yet these data are rich in relationships that the researchers would like to understand. In this framework, machine learning enables researchers to analyse large data sets by training models that can classify observations into discrete groups, learn which features determine a metric of performance, or predict the outcome of new experiments. Furthermore, machine learning can assist researchers in designing experiments to optimise performance or test hypotheses more effectively. “From nano-optoelectronics, to catalysis, to the bio-nano interface, machine learning is reshaping how researchers collect, analyse, and interpret their data,” says Nicolò Maccaferri, Researcher at the Department of Physics and Materials Science (DPHYMS) of the University of Luxembourg.
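As a purely illustrative sketch of the classification task described above (this example is not from the published Review; the synthetic “spectra”, the two material labels, and all function names are hypothetical), a minimal nearest-centroid classifier can group simulated optical spectra by which class average they most resemble:

```python
import math
import random

def make_spectrum(peak, n_points=50, noise=0.05, rng=random):
    """Synthetic 'optical spectrum': a Gaussian resonance plus noise."""
    return [math.exp(-((x / n_points - peak) ** 2) / 0.01)
            + rng.uniform(-noise, noise) for x in range(n_points)]

def centroid(spectra):
    """Point-wise mean spectrum of a group of samples."""
    n = len(spectra)
    return [sum(s[i] for s in spectra) / n for i in range(len(spectra[0]))]

def classify(spectrum, centroids):
    """Assign a spectrum to the nearest class centroid (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(spectrum, centroids[label]))

rng = random.Random(0)
# Two hypothetical 'materials' whose spectra peak at different positions.
train_a = [make_spectrum(0.3, rng=rng) for _ in range(20)]
train_b = [make_spectrum(0.7, rng=rng) for _ in range(20)]
centroids = {"material_A": centroid(train_a), "material_B": centroid(train_b)}

test = make_spectrum(0.7, rng=rng)
print(classify(test, centroids))  # expected: material_B
```

Real nanoscience workflows would of course use measured spectra and richer models, but the structure is the same: train on labelled examples, then classify new observations.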

“In the upcoming years, data-driven science will be fundamental for the discovery and design of new materials, which can help us increase the efficiency of a plethora of processes, from chemistry to electronics,” explains Maccaferri. Within the digital strategy of the University of Luxembourg, machine learning approaches will help in this direction. “These methodologies can help experimentalists advance faster in designing experiments and in processing and interpreting their data. In our particular case, using machine learning we can analyse and process the large amount of information encoded in the optical spectra of the nanostructures we study in our laboratory, thus enabling quasi-error-free data readout. At the same time, we can use these data for the inverse design and optimisation of photonic nanostructures, which can be used to develop post-CMOS devices and systems beyond von Neumann architectures. In this paradigm shift, the wave nature of light and its inherent operations, such as interference and diffraction, can play a major role in enhancing the computational throughput of machine learning approaches,” concludes Maccaferri, who is also looking forward to collaborating actively with theoreticians and data scientists at the University to develop new methodologies for improving the speed at which electronic components work.
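The inverse-design idea mentioned above can be sketched in miniature (again, a hypothetical toy, not the group's actual method: the forward model, parameter range, and function names are all invented for illustration). Given a simulator that maps a design parameter to an optical response, inverse design searches for the parameter whose response best matches a target:

```python
import random

def simulated_peak(thickness_nm):
    """Toy forward model (hypothetical): resonance wavelength in nm as a
    simple function of one design parameter, a layer thickness."""
    return 400.0 + 2.5 * thickness_nm

def inverse_design(target_nm, iters=200, seed=0):
    """Random-search inverse design: sample candidate thicknesses and keep
    the one whose simulated resonance is closest to the target wavelength."""
    rng = random.Random(seed)
    best_t, best_err = None, float("inf")
    for _ in range(iters):
        t = rng.uniform(10.0, 200.0)  # candidate thickness in nm
        err = abs(simulated_peak(t) - target_nm)
        if err < best_err:
            best_t, best_err = t, err
    return best_t

t = inverse_design(target_nm=650.0)
print(simulated_peak(t))  # resonance close to the 650 nm target
```

In practice the forward model would be a full electromagnetic simulation and the search would use a trained surrogate or gradient-based optimiser rather than random sampling, but the inverse problem has this same shape.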

Publication: Machine Learning in Nanoscience: Big Data at Small Scales, Nano Letters, January 2020