Highlights of Data Science (GDS) Talks @ APS 2020 March Meeting
The American Physical Society (APS) March Meeting is one of the largest physics meetings in the world. In 2020, the meeting was canceled due to concerns over the rapid spread of COVID-19.
To help the community quickly catch up on the work that was to be presented at this meeting, the Paper Digest team processed all talk abstracts and generated one highlight sentence (typically the main topic) for each. Readers are encouraged to use these machine-generated highlights / summaries to quickly get the main idea of each talk. This article covers the talks related to Data Science (GDS).
If you do not want to miss any interesting academic paper, you are welcome to sign up for our free daily paper digest service to get updates on new papers published in your area every day. You are also welcome to follow us on Twitter and LinkedIn to stay updated with new conference digests.
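As an aside, the idea of distilling each abstract to a single highlight sentence can be illustrated with a toy extractive heuristic. The sketch below is purely illustrative and is not the actual Paper Digest pipeline; the function name `pick_highlight` and the pattern list are assumptions for the example.

```python
import re

def pick_highlight(abstract: str) -> str:
    """Toy extractive highlighter: return the first sentence that
    announces the talk's contribution (e.g. starts with 'We',
    'Here', or 'In this talk'), falling back to the first sentence.
    Not the actual Paper Digest method; for illustration only."""
    sentences = re.split(r"(?<=[.!?])\s+", abstract.strip())
    for s in sentences:
        if re.match(r"(In this (talk|work|study)|Here,?|We |I will)", s):
            return s
    return sentences[0]

abstract = (
    "Machine learning force fields are increasingly popular. "
    "In this talk, I will present a neural network potential for molten salts."
)
print(pick_highlight(abstract))
```

Running this prints the second sentence, since it is the first one that matches a contribution-announcing pattern.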
Paper Digest Team
team@paperdigest.org
TABLE: Data Science (GDS)
No. | Title | Authors | Highlight | Session |
---|---|---|---|---|
1 | Designing curricula for data science based on fundamental skills and competencies informed by expert interviews | Silvia, Devin; Hawkins, Nathaniel; O’Shea, Brian; Caballero, Marcos | In this talk, I will present the results of our research and highlight some of the design choices we have made for our introductory computational science course and how those choices connect to our research findings. | Session 1: Data Science in the Physics Curriculum |
2 | A university-wide approach to integrative data science education and career paths | Stone, Sarah | At the University of Washington’s eScience Institute, we have developed a university-wide approach to addressing this challenge by developing an integrative education program through formal data science options, now offered by 18 academic units at UW, and informal education programs. | Session 1: Data Science in the Physics Curriculum |
3 | A beginner’s guide to using data science for physicists | Rhone, Trevor David | This talk provides a beginner’s guide to data science for the inquisitive physicist. | Session 1: Data Science in the Physics Curriculum |
4 | Deep Learning Data Science Competencies to Promote Workplace Readiness | Shahmoradi, Amir | In this talk, I will describe our efforts at the University of Texas. | Session 1: Data Science in the Physics Curriculum |
5 | Data Science Tools in the Classroom | Soltanieh-Ha, Mohammad | In this talk, I will provide an overview of tools and techniques that can improve both the learning experience of the students and the instructor’s ability to manage the class and materials. | Session 1: Data Science in the Physics Curriculum |
6 | Information extraction, analysis and feedback for directing matter by design | Sumpter, Bobby | In this talk I will discuss how we are probing in-situ, chemical reactions and materials transformations, including hierarchical assembly, as a modality for direct feedback to an experiment in order to precisely impart directed energy (electrons, ions, photons, thermal) that manipulates a material at the nanoscale. | Session 2: Data Science: Big Data & ML |
7 | High throughput search for plasmonic semiconductors using DFT databases | Shapera, Ethan; Schleife, Andre | We describe, validate, and demonstrate an approach which rapidly screens existing online databases to identify high quality factor plasmonic semiconductors. | Session 2: Data Science: Big Data & ML |
8 | Combined High-Throughput and Machine Learning Approach for Prediction of Lattice Thermal Conductivity | Juneja, Rinkle; Yumnam, George; Satsangi, Swanti; Singh, Abhishek | For the machine learning prediction models, the descriptor set is usually tuned via conventional algorithms such as the least absolute shrinkage and selection operator (LASSO). | Session 2: Data Science: Big Data & ML |
9 | Machine Learning-Assisted Design and Discovery of Next Generation 2D Materials | Venturi, Victor; Parks, Holden; Ahmad, Zeeshan; Viswanathan, Venkat | In this work, we use a novel machine learning technique – Crystal Graph Convolutional Neural Networks (CGCNN) [1] – to train accurate models that can predict monolayer 2D material properties more efficiently than density functional theory simulations. | Session 2: Data Science: Big Data & ML |
10 | Revealing the Spectrum of Unknown Layered Materials with Super-Human Predictive Abilities | Cheon, Gowoon; Cubuk, Ekin; Antoniuk, Evan; Goldberger, Joshua; Reed, Evan | To achieve super-human performance, we employ semi-supervised learning techniques for the first time in materials discovery. | Session 2: Data Science: Big Data & ML |
11 | Probing the microscopic origin of magnetism in two-dimensional materials using machine learning | Rhone, Trevor David; Kaxiras, Efthimios | We study two-dimensional (2D) materials with intrinsic magnetic order and explore the microscopic origins of magnetism in these novel materials. | Session 2: Data Science: Big Data & ML |
12 | Topological data analysis for magnetic domain structure characterization | Kotsugi, Masato | We utilized persistent homology to extract the topological feature of the magnetic domain structure, and principal component analysis was used to construct the correlation between persistence diagram and a magnetic hysteresis loop. | Session 2: Data Science: Big Data & ML |
13 | Extracting Interpretable Physical Parameters from Spatiotemporal Systems using Unsupervised Learning | Lu, Peter; Kim, Samuel; Soljacic, Marin | We demonstrate an unsupervised machine learning technique for extracting interpretable physical parameters from noisy spatiotemporal data and for building a transferable predictive model of the system. | Session 2: Data Science: Big Data & ML |
14 | Prediction of Seismic Wave Arrivals Using a Convolutional Neural Network | Garcia, Jorge; Waszek, Lauren | We employ a Convolutional Neural Network (CNN) to predict the arrival time of the mantle shear-wave phases in a seismogram in an effort to accelerate and make consistent the task of data processing. | Session 2: Data Science: Big Data & ML |
15 | Using Reinforcement Learning to Optimize Crystal Structure Determination | Ratcliff, William; Meuse, Kate; Opsahl-Ong, Jessica; Rath, Joeseph; Kienzle, Paul; Yan, Telon; Cho, Ryan; Wilson, Abigail | We compare several approaches within this framework including epsilon-greedy, Q-learning, and actor-critic. | Session 2: Data Science: Big Data & ML |
16 | Active Learning for Quantum Experimental Control | Wu, Yadong; Zhai, Hui | Here we apply a semi-supervised machine learning method, active learning, to this parameter-tuning task to find the suitable parameters automatically. | Session 2: Data Science: Big Data & ML |
17 | International Radiological Information Exchange (IRIX) Standards for Emergency Radiation Monitoring Data Reporting | Mukhopadhyay, Sanjoy | The article will discuss applications of IRIX in IRMIS. | Session 2: Data Science: Big Data & ML |
18 | Deep Learning-enabled Computational Microscopy and Sensing | Ozcan, Aydogan | In fact, deep learning is mysteriously powerful and has been surprising optics researchers with what it can achieve in advancing optical microscopy and introducing new image reconstruction and transformation methods. | Session 3: Data Science: Deep Learning |
19 | Exploring Organic Ferroelectrics Using Data-driven Approaches | Ghosh, Ayana; Lubbers, Nicholas; Nakhmanson, Serge; Zhu, Jian-Xin | Here we propose to use data-driven approaches to judiciously shortlist candidates from a wide range of chemical space with ferroelectric functionalities. | Session 3: Data Science: Deep Learning |
20 | Deep Learning Model for Finding New Superconductors | Konno, Tomohiko; Kurokawa, Hodaka; Nabeshima, Fuyuki; Sakishita, Yuki; Ogawa, Ryo; Hosako, Iwao; Maeda, Atsutaka | We represented the periodic table in a way that allows a deep learning model to learn it. | Session 3: Data Science: Deep Learning |
21 | Deep Learning for Energetic Materials: Predicting Material Properties from Electronic Structure using Convolutional Neural Networks | Casey, Alex; Barnes, Brian; Bilionis, Ilias; Son, Steven | We develop a convolutional neural network capable of directly parsing the 3D electronic structure of a molecule described by spatial point data for charge density and electrostatic potential concatenated into a 4D tensor. | Session 3: Data Science: Deep Learning |
22 | Optimization of Molecular Characteristic using Continuous Representation of Molecules by Variational Autoencoder with Discriminator | Sato, Kyosuke; Tsuruta, Kenji | In the present study, we focus on the deep learning variational auto-encoder (VAE) model[1], where molecules represented by SMILES strings can be efficiently converted to multivariable continuous space. | Session 3: Data Science: Deep Learning |
23 | An Initial Design-based Deep Learning Procedure for the Optimization of High Dimensional ReaxFF Parameters | SENGUL, MERT; Song, Yao; He, Linglin; Hung, Ying; Dasgupta, Tirthankar; van Duin, Adri | Here, we propose a deep learning (DL)-based procedure to be used in ReaxFF parameter optimization. | Session 3: Data Science: Deep Learning |
24 | Feature Extraction Using Semi-Supervised Deep Learning. | El Khatib, Muammar; De Jong, Wibe | We will explore how the use of semi-supervised learning techniques can be a powerful tool for the extraction of features for atomistic simulations. | Session 3: Data Science: Deep Learning |
25 | Unsupervised feature extraction in simple physical models through mutual information maximization | Sarra, Leopoldo; Marquardt, Florian | By defining relevant features as low dimensional variables that preserve the largest mutual information with the original coordinates of the system, we set up an unsupervised learning technique to automatically extract those features. | Session 3: Data Science: Deep Learning |
26 | Integration of Neural Network-Based Symbolic Regression in Deep Learning for Scientific Discovery | Kim, Samuel; Lu, Peter; Gilbert, Michael; Mukherjee, Srijon; Jing, Li; Čeperić, Vladimir; Soljacic, Marin | Here we use a neural network for symbolic regression based on the EQL network and integrate it into other deep learning architectures such that the whole system can be trained end-to-end through backpropagation. | Session 3: Data Science: Deep Learning |
27 | Rapid machine learning-based solutions of partial differential equations on complex domains. | Dwivedi, Vikas; Srinivasan, Balaji | In this paper, we present physics informed extreme learning machine (PIELM), a new machine-learning algorithm, which solves this problem with a simpler neural network architecture and an extremely fast learning routine. | Session 3: Data Science: Deep Learning |
28 | Probabilistically-autoencoded horseshoe-disentangled multidomain item-response theory models | Chang, Joshua; Vattikuti, Shashaank; Chow, Carson | We propose skipping the initial factor analysis by using a sparsity-promoting horseshoe prior to perform factorization directly within the IRT model so that all training occurs in a single self-consistent step. | Session 3: Data Science: Deep Learning |
29 | Turbulence-generating networks | Garcia, Armando; Gudimetla, Rao; Munoz, Jorge | We propose a new method to simulate the effect of turbulence on light propagation that uses mathematical graphs (networks) that are agnostic of spatial dimension or angle information. | Session 3: Data Science: Deep Learning |
30 | SignalTrain: Modeling Time-dependent Nonlinear Signal Processing Effects Using Deep Neural Networks | Mitchell, William; Hawley, Scott | Unique contributions of this effort include the ability to emulate the individual settings or “knobs” you would see on a piece of analog equipment, and the production of commercially viable audio, i.e. 44.1 kHz sampling rate at 16-bit resolution. | Session 3: Data Science: Deep Learning |
31 | Addressing the Elephant in the Room: Uncertainties in Physical Predictions From Machine-Learned Force Fields | Chmiela, Stefan; Sauceda, Huziel; Müller, Klaus-Robert; Tkatchenko, Alexandre | Here, we present an analysis of the uncertainties in properties derived from learned-FFs, such as vibrational spectrum and thermodynamics. | Session 4: Data Science: Machine Learning |
32 | Understanding key challenges in digitizing and contextualizing experimental results | Kwon, Ha-Kyung; Gopal, Chirranjeevi; Storey, Brian; Caicedo, Santiago; Kirschner, Jared | In this talk, we present our findings from user research conducted in three different academic labs, on researchers’ behaviors and needs throughout the experimental process. | Session 4: Data Science: Machine Learning |
33 | Using Machine Learning to Reduce Low-Q Disorder in Quasiparticle Interference Maps | Witeck, Aidan; Liu, Yu; Hoffman, Jennifer | Here we present a novel algorithm that uses Fourier filtering and machine learning to reduce low-q noise. | Session 4: Data Science: Machine Learning |
34 | CdTe nanoparticles as temperature sensors via machine learning of optical properties | Colton, John; Erikson, James; Lewis, Charles; McClure, Carrie; Sanchez, Derek; Munro, Troy | We have investigated using CdTe nanoparticles as non-invasive temperature sensors. | Session 4: Data Science: Machine Learning |
35 | Machine Learning X-ray Spectra: Theoretical Training for Experimental Predictions | Marcella, Nicholas; Frenkel, Anatoly | We have found that the neural network (NN) is capable of modeling this relationship in X-ray absorption near edge (XANES) and extended fine structure (EXAFS), resulting in a powerful analytic tool. | Session 4: Data Science: Machine Learning |
36 | Machine learning on the electron-phonon spectral function and the superconductor gap function | Hsu, Ming-Chien; Li, Wan-Ju; Lee, Ting-Kuo; Huang, Shin-Ming | The mapping can be learned by using the machine learning technique. | Session 4: Data Science: Machine Learning |
37 | Characteristic space of XRD patterns in machine-learning | Uchimura, Keishu; Yano, Masao; Kimoto, Hiroyuki; Hongo, Kenta; Maezono, Ryo | In this study we adopted an unsupervised machine learning technique, auto-encoder, to analyze XRD patterns. | Session 4: Data Science: Machine Learning |
38 | Using machine learning to understand mutations | Villagran, Martha; Mitsakos, Nikolaos; Miller, John; Azevedo, Ricardo | We are investigating the capability of machine learning, with a focus on deep learning architectures, for detecting and predicting potential mutation locations in mtDNA. | Session 4: Data Science: Machine Learning |
39 | Machine Learning of Energetic Material Properties and Performance | Barnes, Brian; Rice, Betsy; Sifain, Andrew | We present advances in accurate, rapid prediction of detonation pressure, detonation velocity, heat of formation, density, and melting point of energetic molecules. | Session 4: Data Science: Machine Learning |
40 | Simulation of atmospheric turbulence with generative machine learning models | Rodriguez, Arturo; Cuellar, Carlos; Rodriguez, Luis; Garcia, Armando; Terrazas, Jose; Kotteda, VM Krushnarao; Gudimetla, Rao; Kumar, Vinod; Munoz, Jorge | Machine learning techniques provide a novel way to propagate the effects from inner- to outer-scale in atmospheric turbulence spectrum and to accelerate its characterization on long-distance laser propagation. | Session 4: Data Science: Machine Learning |
41 | Identification of informative acoustic features in the transition from non-violent to violent crowd behavior | Pedersen, Katrina; Butler, Brooks; Warnick, Sean; Gee, Kent; Transtrum, Mark | In this work, I conduct a feature-importance study to identify which acoustic metrics are most informative for correctly classifying peaceful and violent crowds. | Session 4: Data Science: Machine Learning |
42 | Robust Speaker Identification System Under Adverse Conditions | Prasad, Swati | It will facilitate easier and more secure communication between man and machine using speech. | Session 4: Data Science: Machine Learning |
43 | Hyperbolic non-metric multidimensional scaling reveals intrinsic geometric structure in high-dimensional data | Zhou, Yuansheng; Sharpee, Tatyana | We develop non-metric multidimensional scaling (MDS) in hyperbolic space to perform hyperbolic embedding of points. | Session 4: Data Science: Machine Learning |
44 | Data Augmentation and Pre-training for Template-Based Retrosynthetic Prediction | Fortunato, Mike; Coley, Connor; Barnes, Brian; Jensen, Klavs | In this work we discuss the augmentation of open-access reaction databases with synthetically generated molecular transformations to teach neural networks generalized template applicability. | Session 4: Data Science: Machine Learning |
45 | Neural network-assisted analysis of X-ray absorption spectra of metal oxide clusters | Liu, Yang; Marcella, Nicholas; Frenkel, Anatoly | In this work, we apply the neural network method to the analysis of grazing incidence XANES spectra of size-selective Cu oxide clusters on flat support, measured in operando condition. | Session 4: Data Science: Machine Learning |
46 | Generative and Reinforcement Learning assisted Material Design | Rajak, Pankaj | In recent years, machine learning (ML) models based on supervised learning have shown tremendous success in materials property prediction, such as band gap, elastic moduli, and thermoelectric properties, which has accelerated the discovery of new materials. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
47 | Unbiasing machine learning for molecular dynamics: emphasising out-of-equilibrium geometries using clustering | Cordeiro Fonseca, Grégory; Poltavskyi, Igor; Tkatchenko, Alexandre | We propose a method to train unbiased ML FFs, which leads to equally accurate predictions independent of the density of the training data. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
48 | Challenges in developing an extremely accurate many-body force field | Decolvenaere, Elizabeth; Kormos, Rian; Donchev, Alexander; Klepeis, John; Shaw, David | I will describe some of our recent work that leverages machine learning ideas to generate high-accuracy quantum chemical reference data for the fitting of two-body and many-body models. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
49 | Uncertainty quantification in molecular simulations with dropout neural network potentials | Wen, Mingjian; Tadmor, Ellad | In this paper, we propose a new class of Dropout Uncertainty Neural Network (DUNN) potentials, which provide rigorous uncertainty estimates that can be understood from both Bayesian and frequentist statistics perspectives. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
50 | Improving Fidelity and Transferability of Machine-Learned Reactive Interatomic Models Through Active Learning | Lindsey, Rebecca; Fried, Laurence; Goldman, Nir; Bastea, Sorin | In this work, we present the Chebyshev Interaction Model for Efficient Simulation (ChIMES), a ML force field targeting chemistry in condensed phase systems. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
51 | Uncertainty quantification of classical interatomic potentials in OpenKIM database | Kurniawan, Yonatan; Petrie, Cody; Williams, Kinamo; Transtrum, Mark | I compare Bayesian (Markov Chain Monte Carlo) and Frequentist (profile likelihood) methods to quantify uncertainty of IM parameters. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
52 | Molecular dynamics density and viscosity simulations of alkanes | Santak, Pavao; Conduit, Gareth | We develop a new method to systematically identify the range of shear rates at which the simulations are performed. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
53 | Study of the microstructure of amorphous silicon and its effect on Li transportation with neural network potential | Li, Wenwen; Ando, Yasunobu | In this talk, neural network (NN) potential is used to study the Li diffusion mechanism in amorphous silicon (a-Si). | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
54 | Relative entropy indicates an ideal concentration for structure-based coarse graining of binary mixtures | Rosenberger, David; van der Vegt, Nico | Many methodological approaches have been proposed to improve systematic or bottom-up coarse-graining techniques to enhance the representability and transferability of the derived interaction potentials. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
55 | Exploring, fitting, and characterizing the configuration space of materials with multiscale universal descriptors | Bernstein, Noam; Deringer, Volker; Csányi, Gábor | We present a universal set of multiscale Smooth Overlaps of Atomic Position (SOAP) parameters that can be used for a wide range of purposes. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
56 | Predictive Atomistic Simulations of Materials using SNAP Data-Driven Potentials | Thompson, Aidan; Wood, Mitchell; Cusentino, Mary Alice; Tranchida, Julien; Lubbers, Nicholas; Moore, Stan; Gayatri, Rahul | The relatively large computational cost of SNAP is offset by combining LAMMPS’ spatial parallel algorithms with Kokkos-based hierarchical multithreading, enabling the efficient use of Peta- to Exa-scale CPU and GPU platforms. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
57 | Accurate and Data-Efficient Machine Learning Force Fields for Periodic Systems | Gálvez-González, Luis; Sauceda, Huziel; Chmiela, Stefan; Posada-Amarillas, Alvaro; Paz-Borbón, Lauro Oliver; Müller, Klaus-Robert; Tkatchenko, Alexandre | In this work, we present an extension of the symmetrized gradient-domain machine learning (sGDML) framework [1][2] for periodic systems, which allows the construction of accurate molecular force fields with high data efficiency. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
58 | Phase diagrams of nuclear pasta phases in neutron star matter | Munoz, Jorge; Lopez, Jorge | We performed classical molecular dynamics simulations with modified Pandharipande potentials at temperatures from 0.2 to 4 MeV, densities from 0.04 to 0.08 nucleons/fm³, and proton fractions from 0.1 to 0.5. We built a dataset of configurations by selecting 9,600 uncorrelated instants from the simulations and calculated the Minkowski functionals (volume, surface, integral mean curvature, and Euler characteristic), from which the phase of the nuclear pasta at each instant can be determined. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
59 | Deep Learning for molecular simulation and spectra calculation | Zhang, Linfeng | I will discuss some mathematical perspectives of model representation and exploration of ab initio data for generating reliable deep learning-based models that represent the interatomic potential energy surface and electronic information of complex systems. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
60 | Recurrent Neural Networks Based Integrators for Molecular Dynamics Simulations | Kadupitiya, JCS; Fox, Geoffrey; Jadhao, Vikram | We introduce and develop recurrent neural networks (RNN) based Integrators (“surrogate”) for learning MD dynamics of physical systems generally simulated with Verlet solvers. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
61 | Computing RPA adsorption enthalpies by machine learning thermodynamic perturbation theory | Rocca, Dario; Chehaibou, Bilal; Badawi, Michael; Bucko, Tomas; Bazhirov, Timur | We propose a method that couples machine learning techniques with thermodynamic perturbation theory to estimate finite-temperature properties using correlated approximations [1]. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
62 | Neural Network Potentials for Twisted Few-Layer Materials | Kucukbenli, Emine; Kaxiras, Efthimios | In this talk we present our attempt at developing such a potential via neural networks, and how the challenges highlighted above translate into practical steps during training and testing. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
63 | Development of reliable neural network potential for metal–semiconductor interface reaction: case study for Ni silicidation | Jeong, Wonseok; Yoo, Dongsun; Lee, Kyuhyun; Han, Seungwu | We present a systematic way to build up the training set that can describe the interface reaction. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
64 | Linearized machine learning potential with high-order rotational polynomial invariants for multi-component systems | Seko, Atsuto; Tanaka, Isao | The present study proposes a formulation of linearized MLP extended to multi-component systems involving high-order rotational invariants. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
65 | Transfer learning of neural network potentials for reactive chemistry | Goodpaster, Jason | In this study, we are developing a method to train a neural network potential with high-level wavefunction theory on targeted systems of interest that is able to describe bond breaking. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
66 | Insights about accelerated dynamics calculations of the acid pKa beyond biasing the coordination number collective variable | Wexler, Carlos; Guo, Jiasen; Albesa, Alberto | We studied the deprotonation of acetic acid using the ReaxFF in simulation boxes of varying sizes and observed significant size dependence of ΔG when biasing with a single CV representing the coordination number of the acetate oxygen atoms. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
67 | Multitask machine learning of collective variables for enhanced sampling of reactive molecular dynamics | Sun, Lixin; Batzner, Simon; Vandermause, Jonathan; Xie, Yu; Kozinsky, Boris | In this work, we propose a multi-task machine learning algorithm to learn collective variables (CVs) from short MD trajectories and transition path sampling, which preserves the information on state labels and potential energies. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
68 | Exploring Cucurbituril-Fentanyl Binding (and Beyond) with Parallel Biasing Methods | Leonhard, Anne; Whitmer, Jonathan | Here, we discuss new techniques for computing the binding free energy of small organic molecules to cucurbituril (CB) macrocycles (specifically, cucurbit[7]uril with fentanyl) as both a test case of intermediate complexity, and one of great potential application in sensing and remediation platforms. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
69 | Neural Network Interatomic Potentials for Water | Torres, Alberto; Pedroza, Luana; Rocha, Alexandre | In this work we employ ANNs to represent the water potential surface with DFT- and CC-quality, and compare ANNs trained at different levels of theory to discuss the accuracy of different methods in describing macroscopic properties of water under different conditions including nuclear quantum effects. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
70 | Anharmonicity in a linear chain of Lennard-Jones atoms | De La Rocha, Adrian; Munoz, Jorge | Here we present a study of the anharmonicity that arises from atoms in a linear chain interacting via the Lennard-Jones potential, and a test run of a dynamic mean-field theory in which the atoms see only a harmonic potential whose stiffness depends on the configuration of the system, as determined using a machine learning classifier. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
71 | Hunting FOX: Using Fragments to Sniff Out Drug Leads for Antibiotic Discovery | Mansbach, Rachael; Leus, Inga; Mehla, Jitender; Lopez, Cesar; Walker, John; Rybenkov, Valentin; Hengartner, Nicolas; Zgurskaya, Helen; Gnanakaran, S | We approach drug repurposing by introducing an algorithm–which we term "Hunting FOX" for "Hunting Fragments Of X"–that combines a fragment-based representation with traditional machine learning to identify the most important submolecules correlating with an activity of interest. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
72 | The Self Learning Kinetic Monte Carlo (SLKMC) method augmented with data analytics for adatom-island diffusion on surfaces | Rahman, Talat | In this talk, I will present results for the diffusion kinetics of two dimensional adatoms islands in homoepitaxial and heteroepitaxial systems. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
73 | Accelerated Discovery of Dielectric Polymer Materials Using Graph Convolutional Neural Networks | Mishra, Ankit; Rajak, Pankaj; Cubuk, Ekin; Nomura, Ken-ichi; Kalia, Rajiv; Nakano, Aiichiro; Deshmukh, Ajinkya; Chen, Lihua; Sotzing, Greg; Cao, Yang; Ramprasad, Ramamurthy; Vashishta, Priya | Here, we propose a deep learning-based graph convolutional neural network (GNN) model that can identify polymer systems capable of exhibiting increased energy and power density. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
74 | Deep Learning embedding layers for better prediction of atomic forces in solids | Niv, Sivan; Gordon, Goren; Natan, Amir | We demonstrate this by the calculation of phonons in several solids and by the analysis of force derivatives in systems where we move single atoms and compare the DL predicted force derivatives to the DFT results. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
75 | A molecular dynamics study of water crystallization using deep neural network potentials of ab-initio quality | Piaggi, Pablo; Car, Roberto | We describe the complex interactions between water molecules using deep neural network potentials[1] and employ state of the art enhanced sampling methods[2] to convert reversibly liquid water into ice Ih. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
76 | Machine learning force field using decomposed atomic energies from ab initio calculations | Wang, Lin-Wang | In this talk, we will present our results using both neural network model and Gaussian process regression to represent such force fields. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
77 | Machine learning to derive quantum-informed and chemically-aware force fields to simulate interfaces and defects in hybrid halide perovskites | Larsen, Ross; Jankousky, Matthew; Vigil-Fowler, Derek; Holder, Aaron; Johnson, K. | We demonstrate a machine learning (ML) approach to predict quantum-derived atomic properties (e.g., charge, dipole moment, etc.) from descriptors of the local environment. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
78 | Active Learning of Coarse Grained Force Fields with Gaussian Process Regression | Duschatko, Blake; Vandermause, Jonathan; Molinari, Nicola; Kozinsky, Boris | We propose a novel machine learning method for automatically constructing coarse grained force fields by active learning. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
79 | External Potential Ensembles to Improve the Learning of Transferable Coarse-Grained Potentials | Shen, Kevin; Delaney, Kris; Shell, M. Scott; Fredrickson, Glenn | We demonstrate this approach by using external potential ensembles with the relative entropy optimization to parametrize highly coarse grained models of solvent mixtures from atomistic force fields. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
80 | Data-driven parameterization of coarse-grained models of soft materials using machine learning tools | Johnson, Lilian; Phelan, Frederick | Here, we combine a bottom-up coarse-grained model with a dissipative potential to obtain a chemically specific, thermodynamically consistent, and dynamically correct model. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
81 | JAX, M.D.: End-to-End Differentiable, Hardware-Accelerated Molecular Dynamics in Pure Python | Schoenholz, Sam; Cubuk, Ekin | In this presentation we explore the architecture of JAX MD and its capabilities through several vignettes. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
82 | A neural network interatomic potential for molten NaCl | Li, Qingjie; Kucukbenli, Emine; Lam, Stephen; Khaykovich, Boris; Kaxiras, Efthimios; Li, Ju | In this talk, we present the application of artificial neural networks (NN) to training accurate interatomic potentials that enable fast evaluation of salt properties on desired time- and length-scales. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
83 | Simulating Aluminum Corrosion Using DFT Trained Deep Neural Network Potentials | Saidi, Wissam; Dwaraknath, Shyam | Here we demonstrate the power of a deep neural network potential (DNP) to model the stability of various phases and terminations of Al2O3 on Al. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
84 | Tensor-Field Molecular Dynamics: A Deep Learning model for highly accurate, symmetry-preserving force-fields from small data sets | Batzner, Simon; Sun, Lixin; Smidt, Tess; Kozinsky, Boris | We present a framework to learn highly accurate Machine-Learning Force-Fields from small training sets. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
85 | Using Topological Constraints to Modify Polymer Materials | Kremer, Kurt | Using Topological Constraints to Modify Polymer Materials | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
86 | Simpler is Better: How Linear Prediction Tasks Improve Transfer Learning in Chemical Autoencoders | Iovanac, Nick; Savoie, Brett | In this talk we investigate these questions using an autoencoder latent space as a latent variable for transfer learning models trained on the QM9 dataset that have been supplemented with quantum chemistry calculations. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
87 | Neural Network Based Molecular Dynamics to Study Polymers | Kuenneth, Christopher; Ramprasad, Ramamurthy | Neural network based models for molecular dynamics, the subject of this study, are capable of learning from reference quantum mechanical data. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
88 | Applications of Automatic Differentiation to Materials Design | King, Ella; Goodrich, Carl; Schoenholz, Sam; Cubuk, Ekin; Brenner, Michael | We demonstrate the power of AD in materials design by building on a seminal paper by Torquato. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
89 | Trainable Molecular Dynamics Models | Goodrich, Carl; King, Ella; Schoenholz, Samuel; Cubuk, Ekin; Brenner, Michael | We will discuss the first steps towards Trainable Molecular Dynamics Models (TMDMs): how they work, their significant potential for scientific and technological discovery, and initial discoveries of non-trivial self-assembly pathways. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
90 | Hydrogen-Oxygen Combustion: Data-Driven Generation of Quantum-Accurate Interatomic Potentials | Avila, Allan; Bertels, Luke; Mezic, Igor; Head-Gordon, Martin | We demonstrate how the programmable potentials methodology can be utilized to develop quantum accurate molecular level potentials for several intermediate reactions involved in hydrogen-oxygen combustion. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
91 | Toward optimal descriptors for accurate machine learning of flexible molecules | Vassilev Galindo, Valentin; Poltavskyi, Igor; Tkatchenko, Alexandre | Our objective is to test how the ability to accurately reproduce the potential energy surface (PES) depends upon the choice of molecular descriptor. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
92 | Towards transferable parametrization of Density-Functional Tight-Binding with machine learning | Medrano Sandonas, Leonardo; Stoehr, Martin; Tkatchenko, Alexandre | Using the QM7-X database of small organic molecules, we demonstrate that the DFTB repulsive energy can be effectively learned by means of ML-approaches including neural networks and kernel ridge regression. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
93 | Active learning of fast Bayesian force fields with mapped Gaussian processes – application to stability of stanene | Xie, Yu; Vandermause, Jonathan; Sun, Lixin; Cepellotti, Andrea; Kozinsky, Boris | We present progress in implementing automated active learning workflows for training BFFs, aimed at large-scale simulations of rare event dynamics in complex materials. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
94 | Nuclear quantum delocalization enhances non-covalent intramolecular interactions: A machine learning and path integral molecular dynamics study | Sauceda, Huziel; Vassilev Galindo, Valentin; Chmiela, Stefan; Müller, Klaus-Robert; Tkatchenko, Alexandre | In this study, we present evidence that nuclear delocalization can enhance electronic and electrostatic interactions that promote localized dynamics. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
95 | Active learning identifies optimal π-conjugated peptide chemistries for optoelectronics | Shmilovich, Kirill; Ferguson, Andrew | In this work we perform active learning discovery within an embedded chemical space of pi-conjugated peptides using coarse-grained molecular dynamics simulation to discover molecules with emergent optoelectronic behavior. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
96 | A Self-consistent Artificial Neural Network Inter-atomic Potential for Li/C Systems | Shaidu, Yusuf; Lot, Ruggero; Pellegrini, Franco; Kucukbenli, Emine; de Gironcoli, Stefano | In this talk we first present a self-consistent approach to construct a neural network potential for carbon using the PANNA code[2]. | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
97 | Active Learning Driven Machine Learning Inter-Atomic Potentials Generation: A Case Study for Hafnium Dioxide | Sivaraman, Ganesh; Krishnamoorthy, Anand; Baur, Matthias; Holm, Christian; Stan, Marius; Csányi, Gábor; Benmore, Chris; Vazquez-Mayagoitia, Alvaro | We propose a novel active learning scheme to automate the configuration selection to fit the Gaussian Approximation Potential (GAP). | Session 5: Emerging Trends in Molecular Dynamics Simulations and Machine Learning |
98 | Classifying Snapshots of the Doped Hubbard Model with Machine Learning | Bohrdt, Annabelle; Chiu, Christie; Ji, Geoffrey; Xu, Muqing; Greif, Daniel; Greiner, Markus; Demler, Eugene; Grusdt, Fabian; Knap, Michael | We use machine learning techniques to analyse and classify such snapshots of ultracold atoms. | Session 6: Machine Learning for Quantum Matter |
99 | AI Assisted Discovery in Quantum Gas Microscope Images | Guardado-Sanchez, Elmer; Spar, Benjamin; Carrasquilla, Juan; Scalettar, Richard; Bakr, Waseem; Khatami, Ehsan | We try this unbiased approach on images taken in the non-Fermi liquid phase of the Hubbard model around optimal doping. | Session 6: Machine Learning for Quantum Matter |
100 | Unsupervised machine learning of topological phase transitions | Rodriguez Nieva, Joaquin; Scheurer, Mathias | In this talk, I will discuss an unsupervised machine-learning approach that we propose [see Nature Physics 15, 790-795 (2019)], which is capable of “learning” topological invariants from raw, unlabeled data. | Session 6: Machine Learning for Quantum Matter |
101 | Classification of optical quantum states using machine learning | Ahmed, Shahnawaz; Sánchez Muñoz, Carlos; Nori, Franco; Frisk Kockum, Anton | To benchmark our method, we compare with a naive classifier using maximum likelihood estimation. | Session 6: Machine Learning for Quantum Matter |
102 | Unsupervised learning of quantum phase transitions using nonlinear dimension reduction methods | Lidiak, Alexander; Gong, Zhexuan | This motivates us to investigate nonlinear dimension reduction methods such as diffusion maps and autoencoders. | Session 6: Machine Learning for Quantum Matter |
103 | Machine learning the Mattis glass transformation | Lozano-Gomez, Daniel; Pereira, Darren; Gingras, Michel J | In this context, we consider classical spin models in which we introduce a so-called Mattis gauge transformation. | Session 6: Machine Learning for Quantum Matter |
104 | Augmenting machine learning algorithms with the addition of a physics-based intelligence prior | Singh, Christopher; Redell, Matthew; Elhamod, Mohannad; Bu, Jie; Karpatne, Anuj; Lee, Wei-Cheng | We outline this simple mechanism to decrease the number of exposures, and enhance the predictive power with a number of examples relevant to the study of quantum phase transitions. | Session 6: Machine Learning for Quantum Matter |
105 | Adversarial machine learning for modeling the distribution of large-scale ultracold atom experiments | Casert, Corneel; Mills, Kyle; Vieijra, Tom; Ryckebusch, Jan; Tamblyn, Isaac | We present how artificial neural networks allow for the direct and targeted generation of large-scale microstates, while restricting the time-consuming simulations or measurements to a small number of particles. | Session 6: Machine Learning for Quantum Matter |
106 | Using Convolutional Neural Networks to analyze phase transitions and calculate critical exponents | Maskara, Nishad; Van Nieuwenburg, Evert; Endres, Manuel | In this work, we present an alternative framework for analyzing phase transitions by using neural networks to learn order parameters directly from data. | Session 6: Machine Learning for Quantum Matter |
107 | Unsupervised learning of topological indices | Balabanov, Oleksandr; Granath, Mats | I will present an unsupervised protocol for learning topological indices of quantum systems [1]. | Session 6: Machine Learning for Quantum Matter |
108 | Machine Learning based BCS superconductivity Predictor from Normal State Properties | Han, Fei; Andrejevic, Nina; Nguyen, Thanh; Nguyen, Quynh; Parjan, Shreya; Li, Mingda | In this study, we employed a few deep learning architectures to correlate the normal state properties to superconductivity. | Session 6: Machine Learning for Quantum Matter |
109 | Unlocking quantum critical phenomena with physics guided artificial intelligence | Singh, Christopher; Redell, Matthew; Elhamod, Mohannad; Bu, Jie; Lee, Wei-Cheng; Karpatne, Anuj | By analyzing the predictions for the total phase space, we can confidently identify the location of criticality from the evolution of the predicted wavefunctions. | Session 6: Machine Learning for Quantum Matter |
110 | Neural-Network Approach to Dissipative Quantum Many-Body Dynamics | Hartmann, Michael; Carleo, Giuseppe | Here we present an approach to the effective simulation of the dynamics of open quantum many-body systems based on machine-learning techniques. | Session 6: Machine Learning for Quantum Matter |
111 | Materials discovery through artificial intelligence | Aykol, Muratahan | In this talk, I will present new AI tools developed at TRI for end-to-end material discovery systems. | Session 6: Machine Learning for Quantum Matter |
112 | Working without data: overcoming gaps in deep learning and physics-based extrapolation | Tamblyn, Isaac | Other challenges include the lack of a standardized methodology for reporting and understanding model errors as well as the frequent requirement for large quantities of data. | Session 6: Machine Learning for Quantum Matter |
113 | Machine learning models of properties of hybrid 2D materials as potential super lubricants | Fronzi, Marco; Abu Ghazaleh, Mutaz; Isayev, Olexandr; Winkler, David; Shapter, Joe; Ford, Michael | We describe a time and resource-efficient machine learning approach to create a large dataset of structural properties of van der Waals layered structures. | Session 6: Machine Learning for Quantum Matter |
114 | Charge Density Prediction through 3D-CNN for Fast Convergence of Self-Consistent DFT calculation | Kurata, Iori; Shinagawa, Chikashi; Sawada, Ryohto | In this study, we propose a machine-learning algorithm to predict the charge densities of crystals using a three-dimensional convolutional neural network (3DCNN). | Session 6: Machine Learning for Quantum Matter |
115 | Data-driven studies of the magnetic anisotropy of two-dimensional magnetic materials | Xie, Yiqi; Rhone, Trevor David; Tritsaris, Georgios; Grånäs, Oscar; Kaxiras, Efthimios | Our data-driven study aims to uncover physical insights into the microscopic origins of magnetism in reduced dimensions and to demonstrate the success of a high-throughput computational approach for the targeted design of quantum materials with potential applications from sensing to data storage. | Session 6: Machine Learning for Quantum Matter |
116 | Robust cluster expansion of multicomponent systems using machine learning with structured sparsity | Leong, Zhidong; Tan, Teck Leong | We present group lasso as an efficient method for obtaining robust cluster expansions (CE), a popular computational technique for modeling the thermodynamic properties of multicomponent systems. | Session 6: Machine Learning for Quantum Matter |
117 | Generalizing an Energy Predictor based on Wavelet Scattering for 3D Atomic Systems | Sinz, Paul; Swift, Michael; Brumwell, Xavier; Kim, Kwang Jin; Qi, Yue; Hirn, Matthew | In this work, we test the generalizability of our LixSi energy predictor to properties that were not included in the training set, such as elastic constants and migration barriers. | Session 6: Machine Learning for Quantum Matter |
118 | Using Machine Learning Models to Predict Higher-Level Quantities from Energy Models | Malenfant-Thuot, Olivier; Cote, Michel | Machine learning methods are increasingly used as a substitute for Density Functional Theory calculations due to their low computational cost. | Session 6: Machine Learning for Quantum Matter |
119 | AI-guided engineering of nanoscale topological materials | Srinivasan, Srilok; Cherukara, Mathew; Eckstein, David; Avarca, Anthony; Sankaranarayanan, Subramanian; Darancet, Pierre | Inspired by recent progress in classifying topological phases in armchair, cove-edged and chevron graphene nanoribbons, we develop a high-throughput framework based on the computation of the Zak phase and the Z2 invariants using tight-binding and density functional theory to explore the topology of low-symmetry 1D and 2D periodic organic compounds. | Session 6: Machine Learning for Quantum Matter |
120 | Motif-based machine learning for crystalline materials | Banjade, Huta; Zhang, Shanshan; Hauri, Sandro; Vucetic, Slobodan; Yan, Qimin | In this talk, we propose a novel representation of crystalline solid-state materials (such as complex metal oxides) as graphs composed of structure motifs. | Session 6: Machine Learning for Quantum Matter |
121 | Machine learning powered kinetic energy functional finding in solid state physics | Ren, Hongbin; Dai, Xi; Wang, Lei | In this work, we use machine learning methods to build a kinetic energy functional for a 1D extended system; our solution combines dimensionality reduction with Gaussian process regression and uses a simple scaling trick to generalize the functional to 1D lattices with arbitrary lattice constant. | Session 6: Machine Learning for Quantum Matter |
122 | Frustrated magnets and fermions with Neural Network Quantum States | Choo, Kenny Jing; Mezzacapo, Antonio; Neupert, Titus; Carleo, Giuseppe | On small test molecules, we achieve energies below chemical accuracy and frequently improve upon coupled cluster methods. | Session 6: Machine Learning for Quantum Matter |
123 | Learning the Ground State Wavefunction of Periodic Systems Using Recurrent Neural Networks | Roth, Christopher; MacDonald, Allan | Here, we present an approach for simulating periodic quantum systems using long short term memory networks (LSTMs), whose recurrent structure is able to efficiently capture invariance to discrete translations in the bulk. | Session 6: Machine Learning for Quantum Matter |
124 | Calculating Renyi Entropies with Neural Autoregressive Quantum States | Wang, Zhaoyou; Davis, Emily | We therefore propose an improved “conditional sampling” method exploiting the autoregressive structure of the network ansatz, which outperforms direct sampling in both 1D and 2D Heisenberg models. | Session 6: Machine Learning for Quantum Matter |
125 | Probabilistic Simulation of Quantum Circuits with the Transformer | Carrasquilla, Juan; Luo, Di; Perez, Felipe; Clark, Bryan; Milsted, Ashley; Volkovs, Maksims; Aolita, Mario | In this work, we present an exact probabilistic formulation of quantum dynamics through positive operator-valued measurements (POVMs). | Session 6: Machine Learning for Quantum Matter |
126 | Variational optimization in the AI era | Clark, Bryan; Kochkov, Dmitrii; Luo, Di | We will describe these advancements and our effort to push forward, in the age of AI, the variational approach to the quantum many body problem. | Session 6: Machine Learning for Quantum Matter |
127 | Deep neural network solution of the electronic Schrödinger equation | Hermann, Jan; Schätzle, Zeno; Noe, Frank | We demonstrate that PauliNet outperforms comparable state-of-the-art trial wave functions on atoms, small molecules, and a strongly correlated model system. | Session 6: Machine Learning for Quantum Matter |
128 | Ab-Initio Solution of the Many-Electron Schrödinger Equation with Deep Neural Networks | Spencer, James; Pfau, David; Matthews, Alex; Foulkes, W Matthew | Calculating analytic solutions to the Schrödinger equation is impossible except in a small number of special cases. | Session 6: Machine Learning for Quantum Matter |
129 | Towards neural network quantum states with nonabelian symmetries | Vieijra, Tom; Casert, Corneel; Nys, Jannes; De Neve, Wesley; Haegeman, Jutho; Ryckebusch, Jan; Verstraete, Frank | We demonstrate that this problem can be overcome by sampling in the basis of irreducible representations instead of spins, for which the corresponding ansatz respects the nonabelian symmetries of the system. | Session 6: Machine Learning for Quantum Matter |
130 | Designing neural networks for stationary states in open quantum many-body systems | Yoshioka, Nobuyuki; Hamazaki, Ryusuke | We propose a new variational scheme based on the neural-network quantum states to simulate the stationary states of open quantum many-body systems [1]. | Session 6: Machine Learning for Quantum Matter |
131 | Deep Learning-Enhanced Variational Monte Carlo Method for Quantum Many-Body Physics | Yang, Li; Leng, Zhaoqi; Li, Li; Patel, Ankit; Hu, Wenjun; Pu, Han | We introduce an importance sampling gradient optimization (ISGO) algorithm (arXiv:1905.10730), which significantly improves the computational speed of training DNN in VMC. | Session 6: Machine Learning for Quantum Matter |
132 | Differentiable programming tensor networks and quantum circuits | Liu, JinGuo; Wang, Lei | This talk covers a brief survey of the state of the art differential programming frameworks, and their applications to condensed matter physics and quantum computing. | Session 6: Machine Learning for Quantum Matter |
133 | Machine learning effective models from a Boltzmann perspective | Rigo, Jonas; Mitchell, Andrew | We investigate the derivation of effective models for quantum impurity type problems using machine learning methods. | Session 6: Machine Learning for Quantum Matter |
134 | Automatic design of Hamiltonians | Pakrouski, Kiryl | We formulate an optimization problem of Hamiltonian design. | Session 6: Machine Learning for Quantum Matter |
135 | Direct and Reverse Structure-Electronic Property Relationship Prediction with Deep Learning and Bayesian Optimization | Pimachev, Artem; Neogi, Sanghamitra | We propose a reverse approach based on Bayesian optimization to predict the structure from a system’s measured properties of interest. | Session 6: Machine Learning for Quantum Matter |
136 | Machine Learning of Single-Atom Defects in 2D Transition Metal Dichalcogenides with Sub-Picometer Precision | Khan, Abid; Clark, Bryan; Lee, Chia-Hao; Luo, Di; Shi, Chuqiao; Kang, Sangmin; Zhu, Wenjuan; Huang, Pinshane | By employing deep learning techniques, we quickly identify and classify various defect species, including metal substitutions, chalcogen vacancies, and chalcogen substitutions. | Session 6: Machine Learning for Quantum Matter |
137 | Dictionary Learning in Fourier Transform Scanning Tunneling Spectroscopy | Wieteska, Jedrzej; Lau, Yenson; Hanaguri, Tetsuo; Wright, John; Eremin, Ilya; Pasupathy, Abhay | We have developed a new algorithm based on nonconvex optimization, applicable to any microscopy modality, that directly uncovers the fundamental motifs present in a real-space image. | Session 6: Machine Learning for Quantum Matter |
138 | Machine Learning Tool for Crystal Structure Predictions | Stanev, Valentin; Liang, Haotong; Kusne, Aaron; Takeuchi, Ichiro | In this talk I will present an alternative approach that utilizes machine learning for crystal structure predictions. | Session 6: Machine Learning for Quantum Matter |
139 | Transferable and interpretable machine learning model for four-dimensional scanning transmission electron microscopy data | Matty, Michael; Cao, Michael; Chen, Zhen; Li, Li; Muller, David | We benchmark against conventional approaches using quantitative metrics for resolution and contrast. | Session 6: Machine Learning for Quantum Matter |
140 | Tight-binding deep learning approach to band structures calculations | Sapper, Florian; Peano, Vittorio; Marquardt, Florian | In this talk we present a numerical method for band structure calculations that is based on deep neural networks (NNs). | Session 6: Machine Learning for Quantum Matter |
141 | Self-learning projective quantum Monte Carlo simulations guided by restricted Boltzmann machines | Pilati, Sebastiano; Inack, Estelle; Pieri, Pierbiagio | In this work, we present a novel method that uses unsupervised machine learning techniques to combine the two steps above. We present extensive benchmarks that demonstrate the efficiency of our self-learning method. | Session 7: Machine learning for quantum matter |
142 | Self-learning Hybrid Monte Carlo method for first-principles molecular simulations | Nagai, Yuki; Okumura, Masahiko; Kobayashi, Keita; Shiga, Motoyuki | We propose a novel approach called Self-Learning Hybrid Monte Carlo (SLHMC)[1] which is a general method to make use of machine learning potentials to accelerate the statistical sampling of first-principles density-functional-theory (DFT) simulations. | Session 7: Machine learning for quantum matter |
143 | On-the-fly machine learning algorithm for accelerating Monte Carlo sampling: Application to the stochastic analytical continuation | Yoon, Hongkee; Han, Myung Joon | We present a new Monte Carlo method whose sampling is assisted by modern machine learning (ML) technique. | Session 7: Machine learning for quantum matter |
144 | Automatic Differentiable Monte Carlo: Theory | Zhang, Shixin; Wan, Zhou-Quan; Yao, Hong | Here we propose a general theory framework with detach function techniques enabling infinite order automatic differentiation on Monte Carlo expectations with unnormalized probability distributions. | Session 7: Machine learning for quantum matter |
145 | Automatic Differentiable Monte Carlo: Applications | Wan, Zhouquan; Zhang, Shixin; Yao, Hong | By introducing automatic differentiable Monte Carlo (ADMC), we can leverage state-of-the-art machine learning frameworks and techniques to traditional Monte Carlo approaches in statistics and physics by simply implementing relevant Monte Carlo algorithms on computation graphs. | Session 7: Machine learning for quantum matter |
146 | Optimal Real-Space Renormalization-Group Transformations with Artificial Neural Networks | Chung, Jui-Hui; Kao, Ying-Jer | We introduce a general method for optimizing real-space renormalization-group transformations to study the critical properties of a classical system. The scheme is based on minimizing the Kullback-Leibler divergence between the distribution of the system and the normalizing factor of the transformation parametrized by a restricted Boltzmann machine. | Session 7: Machine learning for quantum matter |
147 | Machine-learning-accelerated predictions of optical properties of condensed systems based on many-body perturbation theory | Dong, Sijia; Govoni, Marco; Galli, Giulia | We present an approach to improve the efficiency of first principles calculations of absorption spectra of complex materials at finite temperature, based on the solution of the Bethe-Salpeter equation (BSE) [1]. | Session 7: Machine learning for quantum matter |
148 | Machine Learned Spectral Functions for the Quantum Impurity Problem | Sturm, Erica; Carbone, Matthew; Lu, Deyu; Weichselbaum, Andreas; Konik, Robert | This work leverages a feed-forward neural network (NN) to predict the spectral functions of the single impurity Anderson model (SIAM) as a function of five physical parameters including the Coulomb interaction U, hybridization constant Γ, impurity energy εd, magnetic field B, and temperature T. | Session 7: Machine learning for quantum matter |
149 | Finding New Mixing Strategies for Self Consistent Field Procedures Using Reinforcement Learning | Abarbanel, Daniel; Guo, Hong | We present a new method to discover mixing strategies by applying a reinforcement learning algorithm (RLA). | Session 7: Machine learning for quantum matter |
150 | Machine learning spin dynamics in the double-exchange systems | Zhang, Puhan; Saha, Preetha; Chern, Gia-Wei | Here we propose a machine learning (ML) technique that can solve the dynamics of the DE model in linear time complexity. | Session 7: Machine learning for quantum matter |
151 | Machine learning of high-throughput DFT electron densities | Hung, Linda; Schweigert, Daniel; Bhargava, Arjun; Gopal, Chirranjeevi | In this talk, we demonstrate how electron density datasets from these databases can be used to train machine learning models that complement and enhance the capabilities of DFT. | Session 7: Machine learning for quantum matter |
152 | Machine learning as a solution to the electronic structure problem | Gonzalez del Rio, Beatriz; Ramprasad, Ramamurthy | A promising development in recent years is the use of machine learning (ML) methodologies to train surrogate models with DFT data to predict quantum-accurate results for larger systems. | Session 7: Machine learning for quantum matter |
153 | Machine learning spectral indicators of topology | Andrejevic, Nina; Andrejevic, Jovana; Rycroft, Christopher; Li, Mingda | Here, we study the effectiveness of XAS as a predictor of topology using machine learning methods to disentangle key structural information from the complex spectral features. | Session 7: Machine learning for quantum matter |
154 | Nicholas Metropolis Award Talk: Enhancing Quantum Simulators with Neural Networks | Torlai, Giacomo | I will present results for a cold Rydberg-atom quantum simulator and quantum chemistry calculations on a superconducting quantum hardware. | Session 7: Machine learning for quantum matter |
155 | Topological codes revisited: Hamiltonian learning and topological phase transitions | Greplova, Eliska; Valenti, Agnes; Van Nieuwenburg, Evert; Boschung, Gregor; Schäfer, Frank; Loerch, Niels; Huber, Sebastian | We introduce a neural net based approach to this challenge. | Session 7: Machine learning for quantum matter |
156 | Real time evolution with neural network quantum states | Lopez Gutierrez, Irene; Mendl, Christian | In this work, we propose the use of standard machine learning optimization techniques, combined with a modified backpropagation for a neural network with complex parameters, to tackle the time evolution of an example system: the Ising model in 1D and 2D. | Session 7: Machine learning for quantum matter |
157 | Hunting for Hamiltonians with a General-Purpose Symmetry-to-Hamiltonian Approach | Chertkov, Eli; Villalonga, Benjamin; Clark, Bryan | In this talk, we will introduce the SHC method and discuss the topological Hamiltonians that we find. | Session 7: Machine learning for quantum matter |
158 | Studying inhomogeneous quantum many-body problems using neural networks | Blania, Alexander; Van Nieuwenburg, Evert; Marquardt, Florian | We show how convolutional neural networks can be employed to learn the mapping from arbitrary potential landscapes to observables in quantum many-body systems. | Session 7: Machine learning for quantum matter |
159 | Calculating Wannier functions via basis pursuit using a machine learned dictionary | Magnetta, Bradley; Ozolins, Vidvuds | In this work we provide a modern method for calculating Wannier functions via projection by incorporating basis pursuit into the quantum variational method to automatically generate a set of localized functions needed to enforce localization while obtaining the ground state. | Session 7: Machine learning for quantum matter |
160 | Classical Quantum Optimization with Neural Network Quantum States | Gomes, Joseph | Here, we demonstrate the utility of the variational representation of quantum states based on artificial neural networks for performing quantum optimization. | Session 7: Machine learning for quantum matter |
161 | Solving frustrated quantum many-particle models with convolutional neural networks | Liang, Xiao | In this paper, we design a brand new convolutional neural network (CNN) to solve such quantum many-particle problems. | Session 7: Machine learning for quantum matter |
162 | Quantum dynamics in driven spin systems with neural-network quantum states | Hofmann, Damian; Carleo, Giuseppe; Rubio, Angel; Sentef, Michael | In this talk, we study magnetic excitations in a driven two-dimensional Heisenberg antiferromagnet. | Session 7: Machine learning for quantum matter |
163 | Study of phi-4 theories with deep learning methods | Lai, Zhong Yuan; Meirinhos, Francisco; Li, Xiaopeng | In this talk I will present our recent work combining field theoretical and deep learning methods to systematically account for non-perturbative aspects. | Session 7: Machine learning for quantum matter |
164 | Unsupervised machine learning for accelerating discoveries from temperature dependent X-ray data | Venderley, Jordan; Matty, Michael; Kishore, Varsha; Pleiss, Geoff; Weinberger, Kilian; Kim, Eun-Ah | Here, we present a novel unsupervised machine learning approach for accelerating the analysis of temperature dependent single crystal X-ray diffraction data. | Session 7: Machine learning for quantum matter |
165 | Machine learning effective models for quantum systems | Mitchell, Andrew; Rigo, Jonas | Using information theoretic techniques, we propose a machine learning approach that optimizes an effective model based on an estimation of its partition function. | Session 7: Machine learning for quantum matter |
166 | Data science and video games | Stirling, Spencer | We use machine learning to touch each of these facets, hoping to improve engagement, profitability, and fun! | Session 8: New Ways of Seeing with Data Science |
167 | Modeling complex physical systems with big data and machine-learning | Hamann, Hendrik; Lu, Siyuan | In this talk, we present a general framework that advances this endeavor. | Session 8: New Ways of Seeing with Data Science |
168 | Machine learning for seeing and hearing more | Riley, Patrick | I’ll survey some exciting results from the Google Accelerated Science team in the areas of cellular imaging for biomedical research, extracting surprising results from human clinical imaging, and disease staging from auditory signals. | Session 8: New Ways of Seeing with Data Science |
169 | Machine Learning in Scanning probe microscopy: accelerating imaging, enhancing resolution and Bayesian methodologies for theory-experiment matching | Vasudevan, Rama; Kalinin, Sergei; Kelley, Kyle; Jesse, Stephen; Ziatdinov, Maxim; Borodinov, Nikolay | In this talk, I will discuss how SPM can be greatly enhanced via careful and tailored use of machine/statistical learning methodologies in every aspect, from data acquisition to real-time analytics to model comparison and selection. | Session 8: New Ways of Seeing with Data Science |
170 | Immunotherapy Modeling: Molecular Interaction and Recognition of MHC/peptide/TCR Complexes | Zhou, Ruhong | In this talk, I will discuss our recent collaborative work, which solves one mystery behind this low response rate using molecular modeling and machine learning techniques. | Session 8: New Ways of Seeing with Data Science |
171 | A nonlinear and statistical physics approach to machine learning electronic hardware | Lathrop, Daniel; Shaughnessy, Liam; Hunt, Brian; Komkov, Heidi; Restelli, Alessandro | We present research developing novel machine learning hardware that relies on a large network of nonlinear electronic nodes to instantiate a reservoir computer. | Session 9: Statistical Physics Meets Machine Learning |
172 | Reservoir Computer Optimization for Parity Checking | Barbosa, Wendson; Ribeill, Guilhem; Nguyen, Minh-Hai; Ohki, Thomas; Rowlands, Graham; Gauthier, Daniel | We shall discuss reservoir computer hyperparameter optimization and the exploration of different architectures for inputting data to the reservoir to improve parity classification performance, as well as paths toward high-speed hardware implementation. | Session 9: Statistical Physics Meets Machine Learning |
173 | Using Machine Learning to Infer Composition of Complex Chemical Mixtures | Javed, Unab; Ramaiyan, Kannan; Kreller, Cortney; Brosha, Eric; Mukundan, Rangachary; Morozov, Alexandre | We have developed a Bayesian algorithm which, given a set of readings from the array, identifies and quantifies all gases present in the system. | Session 9: Statistical Physics Meets Machine Learning |
174 | Deep generative spin-glass models with normalizing flows | Mohseni, Masoud; Hartnett, Gavin | We explore two alternative methods for training the normalizing flow based on minimizing reverse and forward Kullback-Leibler divergence. | Session 9: Statistical Physics Meets Machine Learning |
175 | A Continuous Formulation of Discrete Spin-Glass Systems | Hartnett, Gavin; Mohseni, Masoud | In this talk, we introduce our general formalism and theoretically establish the similarities and differences with the mean-field models and the Thouless-Anderson-Palmer equation. | Session 9: Statistical Physics Meets Machine Learning |
176 | Machine-learning the DFT of a classical statistical-mechanical system | Yatsyshin, Petr; Duncan, Andrew; Kalliadasis, Serafim | In this talk, we address the inverse problem of finding the free energy functional, given the particle data corresponding to the system in equilibrium. | Session 9: Statistical Physics Meets Machine Learning |
177 | Dynamical loss functions for Machine Learning | Ruiz Garcia, Miguel; Zhang, Ge; Schoenholz, Samuel; Liu, Andrea | We take a different approach by exploring new loss functions. | Session 9: Statistical Physics Meets Machine Learning |
178 | A mechanical model for supervised learning | Stern, Menachem; Arinze, Chukwunonso; Perez, Leron; Palmer, Stephanie; Murugan, Arvind | In this work, we apply the supervised learning framework to self-folding sheets, using a physically motivated learning rule. | Session 9: Statistical Physics Meets Machine Learning |
179 | Quantifying statistical mechanical learning in a many-body system with machine learning | Zhong, Weishun; Gold, Jacob; Marzen, Sarah; England, Jeremy; Yunger Halpern, Nicole | Our strategy relies on a parallel that we identify between representation learning and statistical mechanics in the presence of a drive. | Session 9: Statistical Physics Meets Machine Learning |
180 | Information-bottleneck renormalization group for self-supervised representation learning | Ngampruetikorn, Vudtiwat; Bialek, William; Schwab, David | Here we propose a self-supervised learning method that combines the concepts of the information bottleneck and the renormalization group. | Session 9: Statistical Physics Meets Machine Learning |
181 | On matching symmetries and information between training time series and machine dynamics | Engelbrecht, Jan; Yang, Owen; Mirollo, Renato | On matching symmetries and information between training time series and machine dynamics. | Session 9: Statistical Physics Meets Machine Learning |
182 | Deep Learning on the 2-Dimensional Ising Model to Extract the Crossover Region | Walker, Nicholas; Tam, Ka-Ming; Jarrell, Mark | The 2-dimensional square Ising model is investigated with a variational autoencoder in the non-vanishing field case for the purpose of extracting the crossover region between the ferromagnetic and paramagnetic phases. | Session 9: Statistical Physics Meets Machine Learning |
183 | Training and classification using Restricted Boltzmann Machine (RBM) on the D-Wave 2000Q | Dixit, Vivek; Kais, Sabre; Alam, Muhammad | Training and classification using Restricted Boltzmann Machine (RBM) on the D-Wave 2000Q | Session 9: Statistical Physics Meets Machine Learning |
184 | Statistical Physics Analysis of Training of Restricted Boltzmann Machines | Oh, Sangchul; Baggag, Abdelkader | We analyze the training process of the restricted Boltzmann machine in the context of statistical physics. | Session 9: Statistical Physics Meets Machine Learning |
185 | Mode-Assisted Unsupervised Learning of Restricted Boltzmann Machines | Manukian, Haik; Pei, Yan Ru; Bearden, Sean; Di Ventra, Massimiliano | In this work we show that properly combining standard gradient approximations with an off-gradient direction, constructed from samples of the RBM ground state (mode), improves their training dramatically over the standard methods. | Session 9: Statistical Physics Meets Machine Learning |