The document discusses hyperparameter optimization in machine learning models. It introduces various hyperparameters that can affect model performance, and notes that as models become more complex, the number of hyperparameters increases, making manual tuning difficult. It formulates hyperparameter optimization as a black-box optimization problem to minimize validation loss and discusses challenges like high function evaluation costs and lack of gradient information.
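To make the black-box formulation concrete, here is a minimal sketch (not code from the original discussion): `validation_loss`, `learning_rate`, and `num_layers` are hypothetical names standing in for the expensive train-and-evaluate step and its hyperparameters, and random search is used precisely because no gradient information is available.

```python
import numpy as np

# Hypothetical black-box objective: train a model with the given
# hyperparameters and return its validation loss. Expensive to evaluate,
# and no gradients are available with respect to the hyperparameters.
def validation_loss(learning_rate, num_layers):
    # ... train the model, evaluate on the validation set ...
    return np.random.rand()  # placeholder value for illustration only

rng = np.random.default_rng(0)
best_loss, best_config = float("inf"), None

# Simple random search: each trial is one costly function evaluation.
for _ in range(20):
    config = {
        "learning_rate": 10 ** rng.uniform(-4, -1),  # log-uniform sample
        "num_layers": int(rng.integers(1, 6)),
    }
    loss = validation_loss(**config)
    if loss < best_loss:
        best_loss, best_config = loss, config

print(best_config, best_loss)
```

Each trial here is a full training run, which is exactly the high evaluation cost noted above; that cost is what motivates more sample-efficient search strategies than naive random search.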
This document contains notes from a machine learning discussion. It includes:
1. An introduction to BakFoo Inc. CEO Yuta Kashino's background in astrophysics, Python, and realtime data platforms.
2. References to papers and researchers in Bayesian deep learning and probabilistic programming, including Dustin Tran and the Blei Lab, creators of the Edward library.
3. An overview of how Edward combines TensorFlow for deep learning with probabilistic programming to support Bayesian modeling, inference via variational inference (VI) and MCMC, and model criticism (see the sketch after this list).
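As a rough illustration of that modeling-inference workflow (an assumption of how it might look, not code from the discussion), the sketch below uses the Edward 1.x API for Bayesian linear regression: the model is written as TensorFlow random variables, a variational family approximates the posterior, and KLqp performs variational inference. The toy arrays X_train and y_train are made up for the example.

```python
import numpy as np
import tensorflow as tf
import edward as ed
from edward.models import Normal

# Assumed toy data: N observations with D features.
N, D = 50, 3
X_train = np.random.randn(N, D).astype(np.float32)
y_train = np.random.randn(N).astype(np.float32)

# Model: Bayesian linear regression expressed as TensorFlow random variables.
X = tf.placeholder(tf.float32, [N, D])
w = Normal(loc=tf.zeros(D), scale=tf.ones(D))
b = Normal(loc=tf.zeros(1), scale=tf.ones(1))
y = Normal(loc=ed.dot(X, w) + b, scale=tf.ones(N))

# Variational approximation to the posterior over w and b.
qw = Normal(loc=tf.Variable(tf.zeros(D)),
            scale=tf.nn.softplus(tf.Variable(tf.zeros(D))))
qb = Normal(loc=tf.Variable(tf.zeros(1)),
            scale=tf.nn.softplus(tf.Variable(tf.zeros(1))))

# Inference: stochastic variational inference minimizing KL(q || p).
inference = ed.KLqp({w: qw, b: qb}, data={X: X_train, y: y_train})
inference.run(n_samples=5, n_iter=250)
```

In Edward 1.x, MCMC-based inference can be obtained by swapping KLqp for a sampler such as HMC (with Empirical variational families), and utilities like ed.evaluate and ed.ppc cover the criticism step mentioned above.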