Big-Data Analytics in Materials Science
Luca M. Ghiringhelli, Fritz Haber Institute
Hands-on workshop and Humboldt-Kolleg "Density-Functional Theory and Beyond: Basic Principles and Modern Insights"
Isfahan University of Technology, Isfahan, Iran, May 2 to 13, 2016
Data, data, data: big data
The big-data challenge, the four V's:
● Volume (amount of data)
● Variety (heterogeneity of form and meaning of data)
● Veracity (uncertainty of data quality)
● Velocity?
High-throughput screening: query and read out what was stored
Shouldn't we do more?
Analysis
● Identify (so far) hidden correlations
● Identify which materials should be studied next as most promising candidates
● Identify anomalies
We have a dream
From the periodic table of the elements to a chart of materials
Mendeleev's 1871 periodic table
Ga = 69.7, Ge = 72.6: the gaps in Mendeleev's table correctly anticipated gallium and germanium before their discovery.
We have a dream
From the periodic table of the elements to a chart of materials: organize materials according to their properties and functions, e.g.
● figure of merit of thermoelectrics (as a function of T)
● turnover frequency of catalytic materials (as a function of T and p)
● efficiency of photovoltaic systems
Big-data analysis workflow:
1. Training set: calculate properties and functions Pi for many materials i (density-functional theory)
2. Descriptor: find the appropriate descriptor di; build a table | i | di | Pi |
3. Learning: find the function PSL(d) for the table; do cross-validation (statistical learning)
4. Fast prediction: calculate properties and functions for new values of d (new materials)
(Orbital period)² = C (orbit's major axis)³
Learning → Discovery
Suppose we knew the trajectories of all planets in the solar system, either from accurate observations (experiment) or by numerically integrating the general-relativity equations (calculations at the highest level of theory).
Data (collected by Tycho Brahe)
Statistical learning (performed by Johannes Kepler)
Physical law (assessed by Isaac Newton)
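As a toy illustration of the "statistical learning" step, the sketch below (not from the slides; it uses standard textbook values for the six planets known to Kepler) recovers the exponent of Kepler's third law by a linear fit in log-log space:

```python
import numpy as np

# semi-major axis a [AU] and orbital period T [years] for the six
# classical planets (standard textbook values)
a = np.array([0.387, 0.723, 1.000, 1.524, 5.203, 9.537])
T = np.array([0.241, 0.615, 1.000, 1.881, 11.862, 29.457])

# least-squares fit of log T = slope * log a + intercept
slope, intercept = np.polyfit(np.log(a), np.log(T), 1)
print(f"fitted exponent: {slope:.3f}")   # ~1.500, i.e. T^2 = C * a^3
```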
Databases, platforms
"Just databases":
ICSD, Inorganic Crystal Structure DB: http://icsd.fiz-karlsruhe.de
COD, Crystallography Open DB: http://www.crystallography.net/
ESP, Electronic Structure Project: http://gurka.fysik.uu.se/ESP/
CCCBDB, Comp. Chemistry Comparison and Benchmark DB: http://cccbdb.nist.gov/
Databases + analytic tools:
Materials Project: http://www.materialsproject.org
AFLOW, Automatic Flow for Materials Discovery: http://aflowlib.org
AiiDA, Automated Interactive Infrastructure and DB for Atomistic Simulations: http://www.aiida.net
OQMD, Open Quantum Materials DB: http://oqmd.org/
NOMAD: http://nomad-repository.eu and http://nomad-coe.eu
Code-dependent raw data → conversion layer → code-independent representation
Outline
Descriptors and fingerprints
A (personal-taste compiled) zoo of machine-learning / data-mining techniques
Regularized regression
Linear and nonlinear dimensionality reduction
Feature selection
Some words on causal descriptor-property relationships
Descriptors
Can we predict an optimal material for a complex process (e.g. heterogeneous catalysis) by looking at a simple (set of) descriptor(s)?
A simple but insightful descriptor
(Genetic-like) fingerprint: 1D polymers "eugenics"
Data: 175 linear four-block periodic polymers, built from 7 block types: CH2, SiF2, SiCl2, GeF2, GeCl2, SnF2, SnCl2
Descriptor: 20 dimensions [number of building blocks of each type, of adjacent pairs, of adjacent triplets]
Pilania, Wang, …, and Ramprasad, Scientific Reports 3, 2810 (2013). DOI: 10.1038/srep02810
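A minimal sketch of how such a fingerprint can be counted for a periodic chain of labeled blocks; the exact reduction to 20 dimensions follows the cited paper, which this toy bookkeeping only approximates:

```python
from collections import Counter

# the 7 block types used in the cited polymer study
BLOCKS = ["CH2", "SiF2", "SiCl2", "GeF2", "GeCl2", "SnF2", "SnCl2"]

def fingerprint(polymer, n_max=3):
    """Count blocks, adjacent pairs, and adjacent triplets along a
    periodically repeated chain of building blocks."""
    assert all(b in BLOCKS for b in polymer)
    counts = Counter()
    n_units = len(polymer)
    for n in range(1, n_max + 1):
        for i in range(n_units):
            # wrap around: the repeat unit is periodic
            fragment = tuple(polymer[(i + j) % n_units] for j in range(n))
            counts[fragment] += 1
    return counts

# example: one four-block repeat unit
print(fingerprint(["CH2", "CH2", "SiF2", "GeCl2"]))
```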
(Genetic-like) fingerprint
Isayev, …, and Curtarolo, Chemistry of Materials 27, 735 (2015)
Machine learning / data mining: a classification
Supervised learning (d → P mapping): support vector machines, neural networks, decision trees, genetic programming (symbolic regression), kernel ridge regression, compressed sensing
Unsupervised learning (d → d′, find patterns / trends): principal-component analysis, nonlinear dimensionality reduction (sketch-map), clustering, local pattern discovery
Focus on "learning": the algorithm has to improve with data size ("learning by experience")
Machine learning / data mining: a classification (methods covered in this lecture)
Supervised learning (d → P mapping): kernel ridge regression, compressed sensing (+ symbolic regression)
Unsupervised learning (d → d′, find patterns / trends): principal-component analysis, nonlinear dimensionality reduction (sketch-map)
(Linear) ridge regression: mathematical formulation
Figure of merit to be optimized; the $\ell_2$-norm regularization prefers "lower complexity" in the solution:
$\min_{\mathbf c}\ \sum_i \big(P_i - \mathbf c\cdot\mathbf d_i\big)^2 + \lambda\,\|\mathbf c\|_2^2$
Explicit solver:
$\mathbf c = (\mathbf D^\top \mathbf D + \lambda\mathbb 1)^{-1}\,\mathbf D^\top \mathbf P$
Alternative view, via the Hilbert-space representer theorem, as a sum over data points with a linear kernel $k(\mathbf d,\mathbf d') = \mathbf d\cdot\mathbf d'$:
$P(\mathbf d) = \sum_i \alpha_i\, k(\mathbf d, \mathbf d_i), \qquad \boldsymbol\alpha = (\mathbf K + \lambda\mathbb 1)^{-1}\mathbf P, \qquad K_{ij} = k(\mathbf d_i, \mathbf d_j)$
Kernel ridge regression: mathematical formulation
Replacing the linear kernel by a nonlinear one makes the model nonlinear in $\mathbf d$ while remaining linear in the coefficients $\boldsymbol\alpha$. Common choices:
● Linear kernel: $k(\mathbf d,\mathbf d') = \mathbf d\cdot\mathbf d'$
● Gaussian (radial basis function) kernel: $k(\mathbf d,\mathbf d') = \exp\!\big(-\|\mathbf d-\mathbf d'\|_2^2/2\sigma^2\big)$
● Laplacian kernel: $k(\mathbf d,\mathbf d') = \exp\!\big(-\|\mathbf d-\mathbf d'\|_1/\sigma\big)$
● Polynomial kernel: $k(\mathbf d,\mathbf d') = (\mathbf d\cdot\mathbf d' + c)^p$
In all cases, the kernel introduces a similarity measure between data points.
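A minimal kernel-ridge-regression sketch on synthetic data, implementing the formulas above directly (α = (K + λ1)⁻¹P, prediction as a sum over data points); the Gaussian kernel width and the regularization strength are arbitrary choices:

```python
import numpy as np

def gaussian_kernel(X1, X2, sigma=1.0):
    """k(d, d') = exp(-||d - d'||^2 / (2 sigma^2)) for all pairs of rows."""
    sq_dist = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dist / (2.0 * sigma ** 2))

rng = np.random.default_rng(0)
X_train = rng.uniform(-3, 3, size=(50, 1))             # descriptors d_i
P_train = np.sin(X_train[:, 0]) + 0.1 * rng.standard_normal(50)

lam = 1e-3                                             # regularization lambda
K = gaussian_kernel(X_train, X_train)
alpha = np.linalg.solve(K + lam * np.eye(len(K)), P_train)  # (K + lam*1)^-1 P

X_test = np.linspace(-3, 3, 5)[:, None]
P_pred = gaussian_kernel(X_test, X_train) @ alpha      # sum over data points
print(P_pred)
```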
KRR success stories: Gaussian Approximation Potentials
A translationally, rotationally, and permutationally invariant, unique, smooth local-environment descriptor (spherically averaged spherical-harmonic expansion of Gaussian densities centered on the nuclei).
KRR success stories: Molecular properties
Pilania, Wang, …, and Ramprasad, Scientific Reports 3, 2810 (2013). DOI: 10.1038/srep02810
KRR success stories: 1D polymers "eugenics"
Data: 175 linear four-block periodic polymers, built from 7 block types: CH2, SiF2, SiCl2, GeF2, GeCl2, SnF2, SnCl2
Descriptor: 20 dimensions [number of building blocks of each type, of adjacent pairs, of adjacent triplets]
Regularized regression in practice: beware of overfitting
Regularized regression in practice: do validation
Showcase: classification of the crystal structures of octet binaries
The chemical space
Ansatz: atomic features
● Valence number Zv
● Energy of valence s orbital Es
● Energy of valence p orbital Ep
● Radius of valence s orbital rs
● Radius of valence p orbital rp
Primary (atomic) features
[Figure, example of Sn (tin): Kohn-Sham levels in eV (valence s, valence p = HOMO, LUMO) and radial probability densities in Å, from which the radius at the maximum, the average radius, and the turning point are extracted.]
(Linear) dimensionality reduction: principal-component analysis
An orthogonal transformation of coordinates, converting a set of (possibly) linearly correlated coordinates into a new set of linearly uncorrelated ones (the principal, or normal, components), such that the first component has the largest variance and each subsequent component has the largest variance subject to being orthogonal to all preceding components.
Pearson, K., "On Lines and Planes of Closest Fit to Systems of Points in Space", Philosophical Magazine 2, 559 (1901)
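A minimal PCA sketch following the definition above: center the data, diagonalize the covariance matrix, and project onto the directions of largest variance (the feature matrix here is a synthetic placeholder):

```python
import numpy as np

def pca(X, n_components=2):
    """Project X onto its principal components (directions of largest variance)."""
    Xc = X - X.mean(axis=0)                  # center each feature
    cov = np.cov(Xc, rowvar=False)           # feature covariance matrix
    eigval, eigvec = np.linalg.eigh(cov)     # eigenvalues in ascending order
    order = np.argsort(eigval)[::-1]         # re-sort by decreasing variance
    components = eigvec[:, order[:n_components]]
    return Xc @ components, eigval[order]

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 8))            # e.g. 100 materials, 8 features
projected, variances = pca(X)
print(projected.shape, variances[:2])
```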
(Linear) dimensionality reduction: principal components
Saad, …, Chelikowsky, and Andreoni, PRB 85, 104104 (2012)
[Figure: variance (arbitrary linear scale) vs. principal components 1-3.]
Ansatz: atomic features
● Valence number Zv
● Energy of valence s orbital Es
● Energy of valence p orbital Ep
● Radius of valence s orbital rs
● Radius of valence p orbital rp
PCA input: rs, rp, Es/Zv, Ep/Zv for the A and B atoms
What's on the axes?
Linear combination of (possibly all) the initial dimensions
(Nonlinear) dimensionality reduction
Proximity matching
Sketch-map algorithm
Minimization of the stress function (for a set of landmark points)
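A heavily simplified, sketch-map-flavored example: 2D coordinates for a set of landmark points are obtained by minimizing a stress function between sigmoid-transformed high-dimensional and low-dimensional distances. The sigmoid parameters, the random data, and the plain L-BFGS minimizer are assumptions; the actual sketch-map algorithm is more elaborate:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import pdist

def sigmoid(r, sigma=1.0, a=2.0, b=2.0):
    # switching function of the sketch-map type: 0 at r=0, 0.5 at r=sigma
    return 1.0 - (1.0 + (2.0 ** (a / b) - 1.0) * (r / sigma) ** a) ** (-b / a)

rng = np.random.default_rng(2)
X = rng.standard_normal((30, 10))      # 30 landmark points in 10 dimensions
F = sigmoid(pdist(X))                  # transformed high-dimensional distances

def stress(flat_coords):
    # mismatch between transformed distances in high and low dimension
    r = pdist(flat_coords.reshape(-1, 2))
    return ((F - sigmoid(r)) ** 2).sum()

result = minimize(stress, rng.standard_normal(60), method="L-BFGS-B")
coords = result.x.reshape(-1, 2)       # the 2D embedding of the landmarks
print(round(result.fun, 3), coords.shape)
```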
From clusters to defects in bulk
The high-dimensional representation is still an important choice
What's on the axes?
What about a dimensionality reduction, or call it feature selection, such that the (best) low-dimensional representation is selected among (many, many) given candidates?
It is time for: compressed sensing
Reference: L. M. Ghiringhelli, J. Vybiral, S. V. Levchenko, C. Draxl, and M. Scheffler, Phys. Rev. Lett. 114, 105503 (2015). Don't overlook the Supplementary Information!
We have a dream. Proof of concept: a descriptor for the classification "zincblende/wurtzite or rocksalt?"
82 octet AB binary compounds
[Figure: the compounds plotted in the plane of a two-dimensional descriptor (d1, d2), separating the rocksalt from the zincblende/wurtzite structures.]
References: J. A. van Vechten, Phys. Rev. 182, 891 (1969); J. C. Phillips, Rev. Mod. Phys. 42, 317 (1970); J. St. John and A. N. Bloch, Phys. Rev. Lett. 33, 1095 (1974); A. Zunger, Phys. Rev. B 22, 5839 (1980); D. G. Pettifor, Solid State Commun. 51, 31 (1984); Y. Saad, D. Gao, T. Ngo, S. Bobbitt, J. R. Chelikowsky, and W. Andreoni, Phys. Rev. B 85, 104104 (2012).
Ansatz: atomic features
● HOMO
● LUMO
● Ionization potential
● Electron affinity
● Radius of valence s orbital
● Radius of valence p orbital
● Radius of valence d orbital
● … ?
Property to learn: E(rocksalt) – E(zincblende)
Figure of merit to be optimized, now with a more complex regularization than in (linear) ridge regression, preferring "lower complexity" (sparsity) in the solution:
$\min_{\mathbf c}\ \|\mathbf P - \mathbf D\mathbf c\|_2^2 + \lambda\,\|\mathbf c\|_0$,
where $\|\mathbf c\|_0$ is the number of nonzero components of $\mathbf c$. This $\ell_0$ problem is NP-hard!
Mathematical formulation with sparsity via the $\ell_1$ norm instead:
$\min_{\mathbf c}\ \|\mathbf P - \mathbf D\mathbf c\|_2^2 + \lambda\,\|\mathbf c\|_1$
LASSO: a convex problem, equivalent to the NP-hard $\ell_0$ problem if the features (columns of D) are uncorrelated.
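A minimal LASSO sketch with scikit-learn on synthetic data: the ℓ1 penalty drives most coefficients exactly to zero, selecting a sparse descriptor out of the candidate feature matrix D (the data and the value of λ, called alpha in scikit-learn, are placeholders):

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(3)
D = rng.standard_normal((80, 50))       # 80 materials, 50 candidate features
# the "true" property depends on features 4 and 17 only, plus a little noise
P = 3.0 * D[:, 4] - 2.0 * D[:, 17] + 0.05 * rng.standard_normal(80)

lasso = Lasso(alpha=0.1).fit(D, P)      # alpha plays the role of lambda
selected = np.nonzero(lasso.coef_)[0]
print("selected features:", selected)   # should recover 4 and 17 (maybe more)
```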
LASSO, compressed (compressive) sensing in materials science
The task: find a descriptor AND an accurate evaluation of the difference in energy between the RS and ZB crystal structures for all (82) AB octet semiconductors, ΔE = ΔE(d). Possibly identify a 2D descriptor which gives a "nice" representation of the materials in a plane.
Systematic construction of the feature space
Starting from the primary features (KS levels 1 and 2, radii 1 and 2), combine them with simple operations: sums x + y, absolute differences |x − y|, ratios x / y, exponentials exp(x), and powers (x)^n.
In practice: a formalism borrowed from symbolic regression.
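A minimal sketch of such a feature-space construction: starting from a few primary features, the simple algebraic operations listed above generate a large pool of candidate descriptors (the feature names and value ranges are illustrative, not the ones used in the cited work):

```python
import numpy as np

def build_feature_space(primary):
    """primary: dict mapping a feature name to its values over all materials."""
    features = dict(primary)
    names = list(primary)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            features[f"({a}+{b})"] = primary[a] + primary[b]
            features[f"|{a}-{b}|"] = np.abs(primary[a] - primary[b])
            features[f"({a}/{b})"] = primary[a] / primary[b]
    for a in names:
        features[f"exp({a})"] = np.exp(primary[a])
        features[f"({a})^2"] = primary[a] ** 2
    return features

rng = np.random.default_rng(4)
primary = {name: rng.uniform(0.5, 2.0, size=82)   # 82 compounds, toy values
           for name in ["rs(A)", "rp(A)", "rs(B)", "rp(B)"]}
space = build_feature_space(primary)
print(len(space), "candidate features")  # grows fast; iterate for thousands
```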
Systematic construction of the feature space: EUREQA
Descriptor candidates (242 in total), e.g.:
a. the largest distance between an H atom and its nearest Si neighbor
b. the shortest distance between a Si atom and its sixth-nearest Si neighbor
c. the maximum bond-valence sum on a Si atom
d. the smallest value of the fifth-smallest relative bond length around a Si atom
e. the fourth-shortest distance between a Si atom and its eighth-nearest neighbor
f. the second-shortest distance between a Si atom and its fifth-nearest neighbor
g. the third-shortest distance between a Si atom and its sixth-nearest neighbor
h. the H-Si nearest-neighbor distance for the hydrogen atom with the fourth-smallest difference between the distances to the two Si atoms nearest to an H atom
T. Müller et al., PRB 89, 115202 (2014). Data: ~1000 amorphous structures of 216 Si atoms (saturated). Property: hole trap depth.
EUREQA: genetic-programming software; global optimization (genetic algorithm). M. Schmidt and H. Lipson, Science, Vol. 324, No. 5923 (2009).
Ansatz: atomic features
● HOMO H
● LUMO L
● Ionization potential IP
● Electron affinity EA
● Radius of valence s orbital rs
● Radius of valence p orbital rp
● Radius of valence d orbital rd
● Thousands of nonlinear functions of the above
Property to learn: E(rocksalt) – E(zincblende)
Finding the descriptor: 1D, 2D, 3D
"Extended" LASSO: the features are correlated, so the first 25-30 features selected by LASSO when scanning λ from large to small are retained, and all single features, all pairs, all triplets, … among them are then separately tested via linear regression (the NP-hard ℓ0 problem, but now with only 25-30 features).
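A minimal sketch of this two-step procedure on synthetic data: LASSO screening along a λ path, followed by a brute-force ℓ0 search over all pairs of the screened features (the thresholds and grid values are assumptions):

```python
import itertools
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(5)
D = rng.standard_normal((82, 500))               # 500 candidate features
P = 2.0 * D[:, 7] - 1.5 * D[:, 42] + 0.05 * rng.standard_normal(82)

# step 1: collect every feature selected anywhere along the lambda path
screened = set()
for lam in np.logspace(0, -3, 20):
    coef = Lasso(alpha=lam, max_iter=10000).fit(D, P).coef_
    screened.update(np.nonzero(coef)[0].tolist())
    if len(screened) >= 25:                      # stop at ~25 features
        break
screened = sorted(screened)

# step 2: brute-force the best 2D descriptor among the screened features
def rss(pair):
    Dp = D[:, list(pair)]
    model = LinearRegression().fit(Dp, P)
    return float(np.sum((P - model.predict(Dp)) ** 2))

best = min(itertools.combinations(screened, 2), key=rss)
print("best 2D descriptor:", best)               # should contain 7 and 42
```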
Two-dimensional descriptor
[Figure: the materials in the plane of the two-dimensional descriptor; ΔE scale: -0.2 eV, 0, 0.2 eV, 0.45 eV, 1.0 eV.]
A good model must be predictive within the data domain (interpolation): cross-validation
Performance of the descriptors: accuracy, validation
[Figure: error ε vs. model "complexity", showing the training error and the validation error curves.]
Leave-10%-out cross-validation. Errors are energies, in eV (including the maximum absolute error): convergence with the dimensionality of the descriptor.
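A minimal leave-10%-out cross-validation sketch: hold out a random 10% of the materials many times, train on the rest, and compare training and validation errors (the kernel-ridge model, its hyperparameters, and the synthetic data stand in for the actual descriptor/energy data):

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import ShuffleSplit

rng = np.random.default_rng(6)
X = rng.standard_normal((82, 2))        # a 2D descriptor for 82 compounds
y = X[:, 0] ** 2 - X[:, 1] + 0.05 * rng.standard_normal(82)

cv = ShuffleSplit(n_splits=50, test_size=0.10, random_state=0)
train_err, valid_err = [], []
for train, test in cv.split(X):
    model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=0.5)
    model.fit(X[train], y[train])
    train_err.append(np.mean(np.abs(y[train] - model.predict(X[train]))))
    valid_err.append(np.mean(np.abs(y[test] - model.predict(X[test]))))

print(f"train MAE {np.mean(train_err):.3f}, validation MAE {np.mean(valid_err):.3f}")
```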
Regularized regression in practice: do validation
A good model must be predictive within the data domain (interpolation): cross-validation.
A better model should be causal: stability analysis.
A few words on causality
There are four possibilities (types of causal relationship) behind P(d):
1. d → P: P "listens" to d
2. P → d: d "listens" to P
3. A → d and A → P: there is no direct connection between d and P, but both "listen" to a third "actuator" A
4. There is no direct connection between d and P, but they have a common effect (Berkson paradox) "... that listens to both and screams: 'I occurred'" [Judea Pearl]
[If the admission criteria to a certain graduate school call for either high grades as an undergraduate or special musical talents, then these two attributes will be found to be correlated (negatively) in the student population of that school, even if these attributes are uncorrelated in the population at large (selection bias). Indeed, students with low grades are likely to be exceptionally gifted in music, which explains their admission to graduate school.]
A few words on causality
We are not able to write down a scientific law that connects the descriptor directly with the total-energy difference between the RS and ZB structures. However, ZA and ZB determine these descriptors, and ZA and ZB determine the many-body Hamiltonians and the total-energy difference.
ML has diligently taken over "Kepler's work", but there is no Newton yet. Question: is the latter step always necessary?
Quantitative analysis: effect of noise
The same 2D descriptor is found:
A good model must be predictive within the data domain (interpolation): cross-validation.
A better model should be causal: stability analysis.
An ideal model should be predictive outside the data domain (extrapolation)
When both carbon diamond and BN are excluded from training:
| | E(LDA) | E(predicted) |
| C | -2.64 eV | -1.44 eV |
| BN | -1.71 eV | -1.37 eV |
Had we not known about diamond … we'd have predicted it!
If all C-containing binaries (C, SiC, GeC, and SnC) are excluded from training, i.e. no explicit information on C is given to the model:
| | E(LDA) | E(predicted) |
| C | -2.64 eV | -1.37 eV |
| SiC | -0.67 eV | -0.48 eV |
| GeC | -0.81 eV | -0.46 eV |
| SnC | -0.45 eV | -0.23 eV |
Had we not known about any carbon-containing binary … we'd have predicted carbon chemistry (from atomic features)
Summary
● Big data for materials science: infrastructures
● Descriptors and fingerprints
● (Selected) machine-learning / data-mining methods: kernel ridge regression
● Automatic descriptor search, dimensionality reduction: principal-component analysis, sketch-map
● Automatic descriptor search, feature selection: LASSO (compressed sensing) + symbolic regression
● Application to a model materials-science problem; application of compressed sensing to basis-set construction
● Some words on causal descriptor-property relationships; cross-validation and stability analysis