Publications
This work introduces a novel framework to evaluate the quality of AI explanations in Graph Neural Networks (GNNs), ensuring that the materials discovered by AI are selected based on reliable and physically meaningful data patterns.
Authors: Ding Zhang, Siddharth Betala, Chirag Agarwal
LeMat-GenBench provides the first standardized framework to rigorously evaluate and compare how well different AI models generate new, stable chemical structures.
Authors: Siddharth Betala, Samuel P. Gleason, Félix Therrien, Rocío Mercado, Alexandre Duval
This critical research demonstrates that while adsorption energy is a key metric, AI discovery must also consider reaction kinetics and stability to truly identify viable catalysts.
Authors: Shahana Chatterjee, Alexander Davis, Yoshua Bengio, Alexandre Duval, Félix Therrien
LeMat-Bulk is a massive, cleaned database that merges multiple quantum chemistry sources to provide a high-quality foundation for training large-scale AI models for materials.
Authors: Martin Siron, Inel Djafar, Ali Ramlaoui, Félix Therrien, Alexandre Duval
This paper introduces and analyzes batching algorithms for Graph Neural Networks (GNNs), demonstrating that optimized dynamic batching can achieve up to a 12.5x speedup in training time, significantly accelerating AI-driven materials discovery.
Authors: Daniel T. Speckhard, Tim Bechtel, Sebastian Kehl, Jonathan Godwin, Claudia Draxl
LeMat-Synth is a toolbox that uses AI to automatically extract and standardize chemical synthesis protocols from millions of scientific papers to build comprehensive discovery databases.
Authors: Magdalena Lederbauer, Siddharth Betala, Ayush Jain, Alexandre Duval, Samuel P. Gleason
This work reviews state-of-the-art atomistic workflows and demonstrates how advanced data management enables the interoperability of experimental and computational research.
Authors: Daniel T. Speckhard, Martin Kuban, Christoph T. Koch, Joseph F. Rudzinski, Claudia Draxl
LeMat-Traj provides a massive, standardized dataset of atomic trajectories to benchmark and improve the accuracy of machine learning models in predicting material dynamics.
Authors: Ali Ramlaoui, Martin Siron, Inel Djafar, Joseph Musielewicz, Alexandre Duval
This work introduces Catalyst GFlowNet, an AI framework that autonomously discovers high-performance catalysts for the hydrogen evolution reaction by navigating complex chemical spaces.
Authors: Lena Podina, Christina Humer, Alexandre Duval, Victor Schmidt, Yoshua Bengio
This work utilizes a neural-architecture search to optimize message-passing neural networks for predicting the physical properties of solids, achieving superior accuracy in band-gap and formation-energy regression.
Authors: Tim Bechtel, Daniel T. Speckhard, Jonathan Godwin, Claudia Draxl
This work introduces machine-learning models to extrapolate DFT calculations to the complete basis-set limit, enabling high-precision material property predictions while significantly reducing computational costs.
Authors: Daniel T. Speckhard, Christian Carbogno, Luca M. Ghiringhelli, Sven Lubeck, Matthias Scheffler, Claudia Draxl