rspeare.blogspot.com
Tuesday, January 14, 2020
The wider we go on nets, the more they look like GP's...
As Rasmussen mentions in his book, it seems people are finally getting around to verifying this experimentally:
https://github.com/thegregyang/GP4A
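
For concreteness, here is a minimal NumPy sketch of the classic single-hidden-layer version of this correspondence (Neal's old result, which Rasmussen also discusses): a randomly initialized network f(x) = (1/sqrt(n)) * sum_j v_j * relu(w_j . x) has covariance given by the degree-1 arc-cosine kernel at any width, and its outputs become Gaussian as the width n grows. This is just a toy check of my own, not the GP4A experiments; the ReLU choice, the function names, and the widths below are all mine.

import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

def analytic_relu_kernel(x, y):
    # Degree-1 arc-cosine kernel (Cho & Saul): E_w[relu(w.x) relu(w.y)] for w ~ N(0, I).
    nx, ny = np.linalg.norm(x), np.linalg.norm(y)
    theta = np.arccos(np.clip(x @ y / (nx * ny), -1.0, 1.0))
    return nx * ny * (np.sin(theta) + (np.pi - theta) * np.cos(theta)) / (2 * np.pi)

def random_net_outputs(xs, width, n_nets):
    # f(x) = (1/sqrt(width)) * sum_j v_j * relu(w_j . x), with w_j ~ N(0, I), v_j ~ N(0, 1).
    d = xs.shape[1]
    W = rng.standard_normal((n_nets, width, d))
    v = rng.standard_normal((n_nets, width))
    hidden = relu(np.einsum('nkd,md->nkm', W, xs))      # shape (n_nets, width, n_points)
    return np.einsum('nk,nkm->nm', v, hidden) / np.sqrt(width)

xs = np.array([[1.0, 0.5], [-0.3, 1.2]])
print("analytic K(x1, x2):", analytic_relu_kernel(xs[0], xs[1]))
for width in (5, 50, 500):
    f = random_net_outputs(xs, width, n_nets=10000)
    emp_cov = np.cov(f.T, bias=True)[0, 1]
    z = f[:, 0]
    excess_kurtosis = np.mean(z**4) / np.mean(z**2) ** 2 - 3.0
    print(f"width={width:4d}  empirical cov={emp_cov:+.3f}  excess kurtosis={excess_kurtosis:+.3f}")

The empirical off-diagonal covariance should agree with the analytic kernel at every width (that part is exact in expectation); what the width buys you is Gaussianity, which shows up here as the excess kurtosis of the output shrinking roughly like 1/width.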