This document discusses methods for automated machine learning (AutoML) and hyperparameter optimization. It focuses on accelerating the Nelder-Mead method for hyperparameter optimization using predictive parallel evaluation: a Gaussian process models the objective function and performs predictive evaluations in parallel, reducing the number of actual function evaluations the Nelder-Mead method needs. The reported results show this approach reduces evaluations by 49-63% compared to baseline methods.
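The idea of pairing Nelder-Mead with a learned surrogate can be made concrete with a short sketch. This is a simplified variant, not the paper's method: instead of evaluating candidate points predictively in parallel, the wrapper below simply answers a query from the Gaussian-process surrogate whenever its predictive uncertainty is small, and falls back to a real evaluation otherwise. All names, the kernel, the thresholds, and the toy objective are illustrative assumptions.

```python
# A minimal sketch, not the paper's implementation: a Gaussian-process surrogate
# answers Nelder-Mead's queries when its prediction is confident, so fewer real
# evaluations of the expensive objective are needed.
import numpy as np
from scipy.optimize import minimize
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern


def expensive_objective(x):
    """Stand-in for an expensive hyperparameter evaluation (e.g. a training run)."""
    return float(np.sum((x - 0.3) ** 2) + 0.01 * np.sin(10.0 * np.sum(x)))


class SurrogateScreenedObjective:
    """Wraps the true objective; serves confident GP predictions for free,
    otherwise pays for a real evaluation and adds it to the GP's data."""

    def __init__(self, fn, std_threshold=1e-2, min_points=5):
        self.fn = fn
        self.X, self.y = [], []
        self.gp = GaussianProcessRegressor(kernel=Matern(nu=2.5),
                                           alpha=1e-6, normalize_y=True)
        self.std_threshold = std_threshold
        self.min_points = min_points
        self.real_evals = 0

    def __call__(self, x):
        x = np.asarray(x, dtype=float).ravel()
        if len(self.X) >= self.min_points:
            self.gp.fit(np.array(self.X), np.array(self.y))
            mu, std = self.gp.predict(x.reshape(1, -1), return_std=True)
            if std[0] < self.std_threshold:   # surrogate is confident: skip real eval
                return float(mu[0])
        value = self.fn(x)                    # otherwise evaluate for real
        self.real_evals += 1
        self.X.append(x)
        self.y.append(value)
        return value


obj = SurrogateScreenedObjective(expensive_objective)
result = minimize(obj, x0=np.zeros(3), method="Nelder-Mead",
                  options={"xatol": 1e-4, "fatol": 1e-4})
print("x* =", np.round(result.x, 3), "| real evaluations:", obj.real_evals)
```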
1) Canonical correlation analysis (CCA) is a statistical method that analyzes the correlation relationship between two sets of multidimensional variables.
2) CCA finds linear transformations of the two sets of variables so that their correlation is maximized. This can be formulated as a generalized eigenvalue problem.
3) The number of dimensions of the transformed variables is determined using Bartlett's test, which tests the eigenvalues against a chi-squared distribution.
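To make points 2) and 3) concrete, here is a minimal NumPy/SciPy sketch: CCA solved as a generalized symmetric eigenvalue problem, followed by Bartlett's chi-squared test to choose the number of canonical dimensions. The ridge term, the 5% significance level, and the synthetic data are illustrative choices, not part of the original description.

```python
# A minimal sketch of CCA via a generalized eigenvalue problem plus Bartlett's test.
import numpy as np
from scipy.linalg import eigh
from scipy.stats import chi2


def cca(X, Y):
    """Solve S_xy S_yy^{-1} S_yx a = rho^2 S_xx a and return the canonical
    correlations rho and the X-side transformation A (one column per dimension)."""
    n = X.shape[0]
    Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
    Sxx = Xc.T @ Xc / (n - 1) + 1e-8 * np.eye(X.shape[1])  # small ridge for stability
    Syy = Yc.T @ Yc / (n - 1) + 1e-8 * np.eye(Y.shape[1])
    Sxy = Xc.T @ Yc / (n - 1)
    M = Sxy @ np.linalg.solve(Syy, Sxy.T)                  # symmetric p x p matrix
    eigvals, A = eigh(M, Sxx)                              # generalized eigenproblem
    order = np.argsort(eigvals)[::-1]
    m = min(X.shape[1], Y.shape[1])                        # number of canonical pairs
    rho = np.sqrt(np.clip(eigvals[order[:m]], 0.0, 1.0 - 1e-12))
    return rho, A[:, order[:m]]


def bartlett_dims(rho, n, p, q, alpha=0.05):
    """Number of canonical dimensions judged significant by Bartlett's test:
    the statistic for the remaining correlations is compared against a
    chi-squared distribution with (p - k)(q - k) degrees of freedom."""
    for k in range(len(rho)):
        stat = -(n - 1 - (p + q + 1) / 2.0) * np.sum(np.log(1.0 - rho[k:] ** 2))
        df = (p - k) * (q - k)
        if chi2.sf(stat, df) > alpha:   # remaining correlations are not significant
            return k
    return len(rho)


# Toy data: two views sharing a 2-dimensional latent signal Z.
rng = np.random.default_rng(0)
Z = rng.normal(size=(500, 2))
X = np.hstack([Z + 0.3 * rng.normal(size=(500, 2)), rng.normal(size=(500, 2))])
Y = np.hstack([Z + 0.3 * rng.normal(size=(500, 2)), rng.normal(size=(500, 3))])
rho, A = cca(X, Y)
print("canonical correlations:", np.round(rho, 3))
print("dimensions kept by Bartlett's test:",
      bartlett_dims(rho, X.shape[0], X.shape[1], Y.shape[1]))
```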
Deep Reinforcement Learning from Scratch (NLP2018 lecture slides) / Introduction of Deep Reinforcement Learning, Preferred Networks
Introduction of Deep Reinforcement Learning, presented at a domestic NLP conference.
Lecture slides from the 24th Annual Meeting of the Association for Natural Language Processing (NLP2018).
http://www.anlp.jp/nlp2018/#tutorial
ERATO感謝祭 (ERATO Appreciation Festival) Season IV
[Reference] Satoshi Hara and Takanori Maehara, "Enumerate Lasso Solutions for Feature Selection," Proceedings of the 31st AAAI Conference on Artificial Intelligence (AAAI'17), pp. 1985-1991, 2017.
1. Fisher Linear Discriminant Analysis and Fisher Weight Maps
[0] "Fisher Linear Discriminant Analysis," C. M. Bishop, Pattern Recognition and Machine Learning, Vol. 1 (Japanese edition), Springer Japan, 2007.
[1] Y. Shinohara and N. Otsu, "Facial Expression Recognition Using Fisher Weight Maps," IEEE International Conference on Automatic Face and Gesture Recognition, 2004.
[2] T. Harada, H. Nakayama, and Y. Kuniyoshi, "Improving Local Descriptors by Embedding Global and Local Spatial Information," European Conference on Computer Vision, 2010.
2013/06/19, Sophia University, Takao Yamanaka
11. Facial Expression Recognition Using Fisher Weight Maps
Y. Shinohara and N. Otsu, IEEE International Conference on Automatic Face and Gesture Recognition, 2004.
12. Background and Objectives
- Facial expression recognition
  - Local-feature-based methods
  - Image-vector-based methods
- Examples of local-feature-based methods
  - Gabor wavelet features
  - Texture features
  - HLAC (Higher-order Local Auto-Correlation) features
- Examples of image-vector-based methods
  - Eigenfaces method (Principal Component Analysis: PCA)
  - Fisherfaces method (Fisher Linear Discriminant Analysis: LDA)
- Objective
  - Propose a method that combines the local-feature-based and image-vector-based approaches
- Method
  - Determine the weight given to the local features of each image region based on the Fisher criterion (the within-class vs. between-class variance ratio); a sketch of this idea follows this list
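A minimal sketch of the weight-map idea as I read it (not the authors' code): each image is a stack of per-region local features, the image descriptor is a weighted sum of those regional features, and one weight per region is obtained from the leading generalized eigenvector of region-wise between-class versus within-class scatter matrices. The shapes, the ridge term, and the synthetic data below are illustrative.

```python
# A minimal sketch, under my reading of the Fisher Weight Map idea: learn one
# weight per image region so that the weighted sum of local features maximizes
# the between-class / within-class variance ratio. Not the authors' code.
import numpy as np
from scipy.linalg import eigh


def fisher_weight_map(F, labels):
    """F: (n_images, n_regions, feat_dim) local features, labels: (n_images,).
    Returns one weight per region from the leading generalized eigenvector of
    the region-wise between-class vs. within-class scatter matrices."""
    n, R, d = F.shape
    overall_mean = F.mean(axis=0)                  # mean feature map, (R, d)
    Sb = np.zeros((R, R))
    Sw = np.zeros((R, R))
    for c in np.unique(labels):
        Fc = F[labels == c]
        Mc = Fc.mean(axis=0)                       # class-mean feature map, (R, d)
        diff = Mc - overall_mean
        Sb += len(Fc) * diff @ diff.T              # between-class scatter, (R, R)
        for Fi in Fc - Mc:
            Sw += Fi @ Fi.T                        # within-class scatter, (R, R)
    # Leading eigenvector of Sb w = lambda * Sw w gives the weight map.
    eigvals, W = eigh(Sb, Sw + 1e-6 * np.eye(R))   # ridge keeps Sw invertible
    return W[:, -1]                                # eigh sorts eigenvalues ascending


# Toy example: 2 classes, a 4x4 grid of regions (16 regions), 8-dim features.
rng = np.random.default_rng(0)
F = rng.normal(size=(60, 16, 8))
labels = np.repeat([0, 1], 30)
F[labels == 1, 5] += 1.0                           # class difference lives in region 5
w = fisher_weight_map(F, labels)
print("most discriminative region:", int(np.argmax(np.abs(w))))   # expect 5
```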
26. Improving Local Descriptors by Embedding Global and Local Spatial Information
T. Harada, H. Nakayama, and Y. Kuniyoshi, European Conference on Computer Vision, 2010.