bravo-abad.bsky.social
Professor of Physics (Profesor Titular) at UAM. PI and Director of the AI for Materials Lab. ai4materials.org
364 posts · 2,497 followers · 3,982 following

Delgado-Granados et al. use machine learning to refine two-electron RDMs from upper- and lower-bound methods. By predicting the optimal combination weight, they recover near-exact energies for diatomics, achieving large accuracy gains without the cost scaling of wave-function methods. pubs.acs.org/doi/full/10....
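The weighting idea in this post is simple enough to sketch: an upper-bound and a lower-bound energy bracket the exact value, and a learned weight blends them. A minimal sketch, assuming hypothetical numbers and a fixed weight standing in for the per-system ML prediction:

```python
# Sketch of blending upper- and lower-bound energies with a learned weight.
# The weight below is a stand-in for an ML regressor's output, not the
# paper's model; all numbers are hypothetical.

def blended_energy(e_lower: float, e_upper: float, w: float) -> float:
    """Convex combination E = w*E_lower + (1 - w)*E_upper, with 0 <= w <= 1."""
    if not 0.0 <= w <= 1.0:
        raise ValueError("weight must lie in [0, 1]")
    return w * e_lower + (1.0 - w) * e_upper

# Hypothetical bounds (hartree) bracketing the exact energy
e_lower, e_upper = -1.180, -1.160
w_predicted = 0.65  # stand-in for a predicted combination weight
print(blended_energy(e_lower, e_upper, w_predicted))
```

In the paper the weight is predicted per system; here it is a constant purely for illustration.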

Chen et al. introduce a GPU-accelerated tensor-network approach for simulating QSVMs at large qubit counts. It achieves near-quadratic scaling, handling hundreds of qubits in seconds, and shows that quantum-enhanced ML can be practically validated on classical HPC. iopscience.iop.org/article/10.1...

Schleinitz et al. show how targeted data sets and acquisition functions enhance regioselectivity predictions for C(sp3)–H oxidation and arene borylation, using fewer data points yet improving accuracy on complex molecules. pubs.acs.org/doi/full/10....

Conrad et al. show that linking predictors across multiple targets can boost accuracy by up to 15% on sparse materials data. AutoML-based approaches outperform both single-task models and neural nets, offering a powerful tool for data-limited materials design. iopscience.iop.org/article/10.1...

Fellinger et al. unveil GRADE and X-GRADE, floating-point fingerprints that boost protein–ligand binding predictions via ML models. Testing on PDBbind showed correlations up to 0.74, while 3D-QSAR tasks gained accuracy over standard methods. pubs.acs.org/doi/full/10....

Harris et al. introduce a resimulation-based self-supervised approach that boosts jet classification accuracy and reduces simulation uncertainties. Their contrastive pretraining on millions of events significantly outperforms fully supervised methods. journals.aps.org/prd/abstract...

Duarte et al. develop an outlier-resistant physics-informed neural network using Tsallis statistics, preserving accurate solutions even when over half the measurements are corrupted. journals.aps.org/pre/abstract...

Ruzmetov et al. use generative deep learning to expand protein conformational sampling, producing new stable conformations from minimal simulation data. Their internal coordinate approach unveils rare states and extends our grasp of protein flexibility. pubs.acs.org/doi/full/10....

Inada et al. develop machine learning-based elemental reactivity maps by combining curated positive and negative data sets. The maps guide discovery of new compounds, demonstrated by synthesizing Co–Al–Ge materials. pubs.acs.org/doi/full/10....

Cui et al. present a test-time adaptation framework that refines machine learning models on-the-fly, reducing force and energy errors by ~30% and enabling stable molecular dynamics simulations without extra data. www.nature.com/articles/s41...

Davidson et al. introduce a model that learns human-like goals by representing them as reward-producing programs and generating new, creative “games.” www.nature.com/articles/s42... & arxiv.org/abs/2405.13242

Bar-Lev et al. combine coding theory and deep neural networks for faster, more reliable DNA data retrieval, cutting error rates by 40% and boosting speed by 3,200×. www.nature.com/articles/s42...

Gauvin-Ndiaye et al. demonstrate how neural quantum states accurately capture Mott transitions in a disordered Hubbard model. Their hidden fermion determinantal approach handles volume-law entanglement beyond standard methods. journals.aps.org/prl/abstract...

Chen and Sivaraman present a two-step LLM approach that extracts and classifies protein purification methods from 64,909 PDB-linked articles, reducing trial-and-error and revealing key trends in buffers, tags, and expression strategies. advanced.onlinelibrary.wiley.com/doi/10.1002/...

Chaves et al. present a super learner model for protein–protein binding free energies. Their PBEE pipeline integrates Rosetta descriptors, achieving ~2 kcal/mol error and ~0.70 correlation vs experiment. This fast, accurate tool speeds up protein interaction engineering. pubs.acs.org/doi/full/10....

Zhang et al. link protein language models with a biofoundry-driven design–build–test cycle for automated protein evolution. Within 10 days and four rounds, they achieve a 2.4× improved tRNA synthetase variant, showcasing faster and more accurate protein engineering. www.nature.com/articles/s41...

Li et al. introduce TransPeakNet, a deep learning method that predicts solvent-aware 2D NMR (HSQC) peaks. By pretraining on 1D data and fine-tuning unsupervised on unlabeled spectra, it achieves state-of-the-art accuracy and automatic peak assignment. www.nature.com/articles/s42...

Merz et al. introduce a machine learning approach with quantum-enhanced features to predict proton affinities. By combining 186 descriptors and parameterized quantum circuits, they achieve near-experimental accuracy while cutting down on computational resources. pubs.acs.org/doi/full/10....

Nuñez-Andrade et al. propose “eOHE,” compressing chemical tokens into fewer real values. It slashes memory costs in deep learning while preserving model performance, helping large-scale molecule generation remain efficient and robust. pubs.rsc.org/en/content/a...

Chen et al. introduce “Delete,” a single framework uniting multiple lead optimization tasks with a 3D, protein-aware neural network. It designed new LTK inhibitors, including a potent and selective molecule validated in vitro and in vivo. www.nature.com/articles/s42... & arxiv.org/abs/2308.02172

Kramer et al. urge a long-term, blinded benchmarking of pose- and activity-prediction tools in drug discovery. They argue machine learning gains require new, truly unseen data and open sharing, ensuring methods evolve and deliver reliable, real-world results. pubs.acs.org/doi/10.1021/...

Kyro et al. introduce T-ALPHA, a transformer-driven model unifying protein and ligand features for binding affinity prediction. It outperforms prior methods, excels with predicted structures, and uses uncertainty-aware self-learning to improve target-specific rankings. pubs.acs.org/doi/full/10....

Wu et al. show how quantum neural networks can process data in parallel by placing multiple training samples in superposition. They confirm orthogonal states match individual training performance, while nonorthogonal states bring new interference-driven possibilities. journals.aps.org/prresearch/a...

Iwamoto et al. devise an unsupervised deep learning method that denoises nanoparticle signals, achieving detection of ~30 nm particles at over 100k events/s. They reveal rare EVs (0.002% of total) in serum, underscoring its promise for sensitive, large-scale analyses. www.nature.com/articles/s41...

Lowet et al. show how an opponent striatal circuit tracks full reward distributions. Their distributional reinforcement model encodes variance and tails, enhancing prediction beyond average outcomes. Dopamine disruptions impair only the variance signals. www.nature.com/articles/s41...

Yang et al. combine deep learning and transfer learning to predict infrared and Raman spectra of large proteins, opening new possibilities for efficient biomolecular spectroscopy. pubs.acs.org/doi/full/10....

Kraemer et al. use machine learning to model epidemics more efficiently. Their work shows faster forecasting, better handling of incomplete data, and more accurate infection estimates, pointing to improved global health decision making. www.nature.com/articles/s41...

Zhang et al. use active learning with Gaussian process regression to find strained polymers exceeding 1.0 W/mK in thermal conductivity. Their iterative machine learning method rapidly identifies 10 top polymers with fewer simulations. pubs.rsc.org/en/content/a...
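The iterative loop behind posts like this one can be sketched generically: fit a Gaussian process to the data so far, score unsampled candidates with an acquisition function, evaluate the best one, and repeat. A stdlib-only toy follows; the 1-D objective, candidate grid, kernel length scale, and upper-confidence-bound acquisition are all illustrative assumptions, not the paper's setup.

```python
# Generic active-learning sketch with Gaussian process regression.
# The "property" being maximized is a hypothetical 1-D function standing
# in for a strain -> thermal-conductivity simulation.
import math

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def rbf(a, b, length=0.25):
    return math.exp(-((a - b) ** 2) / (2 * length ** 2))

def gp_predict(X, y, xq, noise=1e-6):
    """Posterior mean and standard deviation of a zero-mean RBF-kernel GP."""
    K = [[rbf(xi, xj) + (noise if i == j else 0.0)
          for j, xj in enumerate(X)] for i, xi in enumerate(X)]
    ks = [rbf(xi, xq) for xi in X]
    mean = sum(k * a for k, a in zip(ks, solve(K, y)))
    var = max(1.0 - sum(k * v for k, v in zip(ks, solve(K, ks))), 0.0)
    return mean, math.sqrt(var)

def objective(x):
    # Hypothetical stand-in for running one expensive simulation
    return 1.0 - (x - 0.7) ** 2

grid = [i / 20 for i in range(21)]            # candidate "polymers"
X, y = [0.0, 1.0], [objective(0.0), objective(1.0)]
for _ in range(8):                            # active-learning iterations
    scores = {}
    for cand in grid:
        if cand in X:
            continue
        m, s = gp_predict(X, y, cand)
        scores[cand] = m + 2.0 * s            # upper confidence bound
    pick = max(scores, key=scores.get)        # most promising candidate
    X.append(pick)
    y.append(objective(pick))                 # "run the simulation"
print(max(y))  # best property value found
```

The point of the loop is that each "simulation" is spent where the surrogate is either promising or uncertain, which is why such searches need far fewer evaluations than exhaustive screening.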

Lapo et al. introduce an unsupervised approach (mrCOSTS) that extracts coherent patterns in multiscale data, revealing both fast and slow processes in systems as diverse as climate, neuroscience, and fluid flows. www.pnas.org/doi/abs/10.1...

Liu et al. use a combined DFT and machine learning approach to achieve near-experimental accuracy (R²≈0.77, MAE=0.065 eV) for predicting optical gaps in conjugated polymers, boosting confidence in rapid screening and design of new photoelectronic materials. pubs.rsc.org/en/Content/A...

Stienstra et al. built a graph-transformer model that predicts ion vibrational spectra with about 21% higher accuracy than typical quantum chemical methods, speeding up structure identification for diverse molecules. pubs.acs.org/doi/full/10....

Guo et al. present organic synaptic transistors paired with machine learning, achieving 800 potentiation states and 93% recognition of digit and ECG data. Their system highlights promising energy efficiency for artificial visual applications. pubs.acs.org/doi/10.1021/...

Veraldi et al. create a fully programmable photonic Ising machine that encodes spin couplings via “focal plane division.” A single SLM and parallel intensity readout can solve 32-spin max-cut problems, showcasing speed, scalability, and general QUBO programmability. journals.aps.org/prl/abstract...

Su et al. fuse physics and geometry into a deep learning model called LumiNet for protein–ligand binding free energies. It outperforms prior methods, adapts with few data, and offers interpretable atomic-level interaction insights—boosting drug design. pubs.rsc.org/en/Content/A...

Zeng et al. use a neural network to optimize approximate GKP codewords, requiring fewer large-amplitude states while enhancing error correction under photon loss and dephasing. Their method outperforms standard constructions at common squeezing levels. journals.aps.org/prl/abstract...

Zhao et al. blend DFT and physics-informed ML to map local electric fields on nickel nanoparticles. They show how embedding a first-order Taylor expansion in ML rapidly predicts field-driven adsorption with near-DFT accuracy for diverse catalysts. pubs.acs.org/doi/full/10....

Zaza et al. use machine learning and an energy-based shape scale to predict conditions for new copper nanocrystal morphologies—demonstrating a data-light approach that discovers shapes and speeds up colloidal synthesis development. pubs.acs.org/doi/10.1021/...

Wang et al. show that prompting LLMs to “speak as” certain identities can distort or flatten those identities. When substituted for real people in surveys or user tests, LLMs risk perpetuating stereotypes, misportraying minority standpoints, and erasing lived experiences. www.nature.com/articles/s42...

Hao et al. introduce scAGDE, a deep graph autoencoder for single-cell ATAC-seq. By learning topological embeddings of chromatin accessibility, it clusters cell types, recovers critical regulatory regions, and reveals nuanced epigenetic landscapes. www.nature.com/articles/s41...

Gosztolai et al. introduce MARBLE, a geometric learning method mapping local flow fields of neural populations into interpretable latent spaces. Their approach decodes brain activity across tasks and even across different subjects without requiring extra behavioral labels. www.nature.com/articles/s41...

Wang et al. introduce a machine learning-driven system that autonomously optimizes polymer film processing, navigating 933,120 conditions to achieve high conductivity. They show AI can streamline manufacturing of electronic polymers from lab to roll-to-roll scale. www.nature.com/articles/s41...

Qian et al. combine multi-view cell data with a multi-graph autoencoder to classify cell niches more accurately. Their approach scales to millions of cells, revealing how spatial neighborhoods drive cellular changes in health and disease. www.nature.com/articles/s41...

Mastracco et al. combine molecular dynamics, density functional theory, and interpretable ML to reveal how subtle peptide fluctuations add extra near-gap states. The interplay of side-chain packing, backbone tension, and electrostatics is key to controlling conductance. pubs.acs.org/doi/full/10....

Huang and Cole adapt standard language models for optoelectronics with 80% less computational cost, yet surpass larger general models on specialized tasks. They reach ~80% exact-match accuracy in domain questions and release all data and tools publicly. pubs.acs.org/doi/10.1021/...

Marchand et al. use deep learning to target “neosurfaces,” where a small molecule reshapes a protein’s binding region. The approach recovers correct partners ~70% of the time, with the ligand contributing up to 12% of the interface area and boosting predicted binding energies by ~28%. www.nature.com/articles/s41...

Model-constrained deep learning enables real-time Li-ion battery fault diagnosis under random conditions. Harnessing big EV data, Cao et al. detect early signs of thermal runaway, short circuits, leakage, and aging, improving battery safety and reliability. www.nature.com/articles/s41...

Antonov & Dayan extend hippocampal replay theory to uncertain environments, showing how offline reactivation helps solve the exploration–exploitation trade-off. Their framework predicts adaptive replay patterns in novel tasks, guiding efficient directed exploration. www.nature.com/articles/s41...

Chi et al. develop a neural-network-based end-to-end design for meta-optics, achieving full control of amplitude, phase, and polarization across multiple wavelengths. advanced.onlinelibrary.wiley.com/doi/10.1002/...

Chen et al. build a “SERS chemical space” with ML, linking structures and spectra to identify unknown linear organic molecules and predict their SERS spectra. A two-step pipeline (RF + PLS) attains 100% group classification and near one-carbon precision. pubs.acs.org/doi/full/10....
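The two-step identify-then-quantify structure in this post can be sketched generically. In the toy below, a nearest-centroid classifier and a 1-D least-squares fit stand in for the paper's random forest and PLS models, and every "spectrum" feature vector is hypothetical illustration data.

```python
# Sketch of a two-step pipeline: step 1 classifies a "spectrum" into a
# chemical group, step 2 regresses carbon count within that group.
# Models and data are simplified stand-ins, not the paper's RF + PLS.

def centroid(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def nearest_class(x, centroids):
    return min(centroids,
               key=lambda g: sum((a - b) ** 2 for a, b in zip(x, centroids[g])))

def fit_line(xs, ys):
    """Ordinary least squares y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((u - mx) * (v - my) for u, v in zip(xs, ys))
         / sum((u - mx) ** 2 for u in xs))
    return a, my - a * mx

# Hypothetical training data: (feature vector, functional group, carbon
# count); the third feature loosely tracks chain length.
train = [
    ([1.0, 0.10, 0.4], "alcohol", 2), ([1.1, 0.20, 0.6], "alcohol", 3),
    ([0.9, 0.15, 0.8], "alcohol", 4),
    ([0.1, 1.00, 0.4], "amine", 2), ([0.2, 1.10, 0.6], "amine", 3),
    ([0.15, 0.90, 0.8], "amine", 4),
]
groups = {g for _, g, _ in train}
# Step 1 model: one centroid per functional group
cents = {g: centroid([x for x, gg, _ in train if gg == g]) for g in groups}
# Step 2 model: per-group regression from the size feature to carbon count
lines = {g: fit_line([x[2] for x, gg, _ in train if gg == g],
                     [c for _, gg, c in train if gg == g]) for g in groups}

def predict(x):
    g = nearest_class(x, cents)           # step 1: classify the group
    a, b = lines[g]
    return g, a * x[2] + b                # step 2: estimate carbon count

print(predict([1.05, 0.12, 1.0]))
```

Splitting the task this way lets each second-stage regressor see only one chemical group, which is what makes the fine-grained carbon-count estimate tractable.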

Tawfik et al. extend DeepDFT to handle charged materials. By encoding net charge into a neural network, they accurately predict electron densities in charged solids and molecules, speeding up quantum-like simulations, from diamond vacancies to Li-ion cathodes. pubs.acs.org/doi/full/10....