Energy based models and Boltzmann machines - v2.0 (Soowan Lee)
This is version 2.0 of my previous slides. (http://www.slideshare.net/blaswan/energy-based-models-and-boltzmann-machines)
Removed the very simple recommendation example and added a feature extractor example referenced from Hinton's lecture.
4. 4
Expressed as a diagram: input x → (encoder) → code z → (decoder) → output x'
The encoder and decoder are trained to minimize the distance between x and x'.
The activation function is usually a non-linearity such as ReLU.
Training minimizes a loss function measuring the distance between x and x'.
Because the dimension p of the code z is smaller than the dimension d of x, the encoder can be seen as compressing x.
Without regularization such as the constraint p < d, the network would simply learn the identity function (f(x) = x), so some form of regularization is essential.
The code and the output are produced by deterministic mappings.
https://en.wikipedia.org/wiki/Autoencoder
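A minimal sketch of the autoencoder described above, assuming TensorFlow/Keras; the dimensions d (input) and p (code, with p < d) and the toy data are illustrative choices, not from the slides.

```python
import numpy as np
import tensorflow as tf

d, p = 784, 32                      # input dimension d, code dimension p < d

inputs = tf.keras.Input(shape=(d,))
code = tf.keras.layers.Dense(p, activation="relu")(inputs)      # encoder: x -> z
outputs = tf.keras.layers.Dense(d, activation="sigmoid")(code)  # decoder: z -> x'

autoencoder = tf.keras.Model(inputs, outputs)
# The loss measures the distance between x and the reconstruction x'.
autoencoder.compile(optimizer="adam", loss="mse")

# Train with the input as its own target (toy random data here).
x = np.random.rand(1000, d).astype("float32")
autoencoder.fit(x, x, epochs=5, batch_size=64)
```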
6. 6
The first paper on autoencoders (Baldi, P. and Hornik, K., 1989)
It shows that PCA can be implemented with backpropagation.
Baldi, P. and Hornik, K. (1989) Neural networks and principal components analysis: Learning from examples without local minima. Neural Networks
But then... what is PCA (Principal Component Analysis)?
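As a quick reference, a minimal sketch of PCA via the SVD, assuming only NumPy; the data matrix X and the number of components k are illustrative.

```python
import numpy as np

def pca(X, k):
    X_centered = X - X.mean(axis=0)            # center each feature
    # Rows of Vt are the principal directions (eigenvectors of the covariance).
    U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
    components = Vt[:k]                        # top-k principal components
    projected = X_centered @ components.T      # low-dimensional representation
    return components, projected

X = np.random.rand(100, 10)
components, Z = pca(X, k=2)
print(Z.shape)   # (100, 2)
```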
10. 10
Baldi, P. and Hornik, K. (1989). Neural networks and principal components analysis: Learning from examples without local minima. Neural Networks.
Hinton, G. E. and Salakhutdinov, R. R. (2006). Reducing the Dimensionality of Data with Neural Networks. Science.
Let's take a look at pretraining and deep autoencoders!
Reducing the Dimensionality of Data with Neural Networks, Hinton & Salakhutdinov, Science, 2006
11. 11
Reducing the Dimensionality of Data with Neural Networks, Hinton & Salakhutdinov, Science, 2006
[Lecture 15.2] Deep autoencoders by Geoffrey Hinton
The first successful deep autoencoder, built by Hinton and Salakhutdinov.
Deep autoencoders always looked like an attractive way to do non-linear dimension reduction:
They provide flexible mappings both ways.
The learning time is linear (or better) in the number of training cases.
The final encoding model is fairly fast and compact.
But! Before 2006 it was hard to optimize the weights with backpropagation:
If the initial weights are large, training falls into poor local minima.
If the initial weights are small, backpropagation runs into the vanishing gradient problem.
With new methods, they built the first successful deep autoencoder:
Pre-training layer by layer.
Weight initialization as in Echo-State Nets.
Reconstruction (not just dimension reduction) began to receive attention (see the sketch of layer-by-layer pre-training below).
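A sketch of greedy layer-by-layer pre-training using shallow autoencoders, assuming TensorFlow/Keras. Note that Hinton & Salakhutdinov (2006) pre-trained with RBMs; the autoencoder-based pre-training shown here is a common substitute, and the layer sizes and toy data are illustrative.

```python
import numpy as np
import tensorflow as tf

layer_sizes = [784, 256, 64, 30]
x = np.random.rand(1000, layer_sizes[0]).astype("float32")

encoders = []
current = x
for in_dim, out_dim in zip(layer_sizes[:-1], layer_sizes[1:]):
    # Train a one-hidden-layer autoencoder on the activations of the previous stage.
    inp = tf.keras.Input(shape=(in_dim,))
    hidden = tf.keras.layers.Dense(out_dim, activation="relu")(inp)
    recon = tf.keras.layers.Dense(in_dim, activation="linear")(hidden)
    ae = tf.keras.Model(inp, recon)
    ae.compile(optimizer="adam", loss="mse")
    ae.fit(current, current, epochs=3, batch_size=64, verbose=0)

    # Keep the trained encoder layer and feed its output to the next stage.
    encoder = tf.keras.Model(inp, hidden)
    encoders.append(encoder)
    current = encoder.predict(current, verbose=0)

# The pre-trained encoders can then be stacked (with mirrored decoders) into a
# deep autoencoder and fine-tuned end-to-end with backpropagation.
```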
19. 19
Denoising autoencoder (2008)
Extracting and Composing Robust Features with Denoising Autoencoders (P. Vincent, H. Larochelle, Y. Bengio and P.A. Manzagol, ICML'08, pages 1096-1103, ACM, 2008)
Sparse autoencoder (2008)
Fast Inference in Sparse Coding Algorithms with Applications to Object Recognition (K. Kavukcuoglu, M. Ranzato, and Y. LeCun, CBLL-TR-2008-12-01, NYU, 2008)
Sparse deep belief net model for visual area V2 (H. Lee, C. Ekanadham, and A.Y. Ng, NIPS 20, 2008)
Stacked Denoising autoencoder (2010)
Stacked Denoising Autoencoders: Learning Useful Representations in a Deep Network with a Local Denoising Criterion (P. Vincent, H. Larochelle, I. Lajoie, Y. Bengio and P.A. Manzagol, J. Mach. Learn. Res. 11, 3371-3408, 2010)
Variational autoencoder (2013, 2014)
Auto-encoding Variational Bayes (D. P. Kingma and M. Welling, arXiv preprint arXiv:1312.6114, 2013)
Stochastic Backpropagation and Approximate Inference in Deep Generative Models (D. J. Rezende, S. Mohamed, and D. Wierstra, arXiv preprint arXiv:1401.4082, 2014)
Autoencoder variants (timeline: 1989 → 2006 → 2008 → 2010 → 2013)
Besides these, many other variants exist, such as the Contractive Autoencoder and the Multimodal Autoencoder.
A generative model in the true sense.
Restores the input.
The encoding uses only part of the input.
Due to limited preparation time, only the denoising autoencoder is covered in detail here...
20. 20
Train with a corrupted input, but use the original input as the target output.
This prevents learning the identity function and produces a model that is robust to noise.
The paper's approach: add stochastic corruption to the input.
With probability v, randomly set input units to 0 (v ~ 0.5).
Other kinds of corruption besides this method would also work.
The paper's focus: reconstruct X from the corrupted input (tilde X).
Through this, the model learns the input distribution.
The loss function is computed by comparing x' with the input x before the noise was added.
For unsupervised pretraining, it shows performance similar to or better than the RBM.
Denoising autoencoder
/zukun/icml2012-tutorial-representationlearning
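A minimal sketch of the denoising autoencoder described above, assuming TensorFlow/Keras. With probability v (the corruption rate, e.g. v ~ 0.5) each input unit is set to 0, and the loss compares the reconstruction x' against the clean input x, not the corrupted one; the dimensions and toy data are illustrative.

```python
import numpy as np
import tensorflow as tf

d, p, v = 784, 128, 0.5

def corrupt(x, v):
    # Masking noise: zero out each entry independently with probability v.
    mask = np.random.rand(*x.shape) >= v
    return x * mask

inputs = tf.keras.Input(shape=(d,))
code = tf.keras.layers.Dense(p, activation="relu")(inputs)
outputs = tf.keras.layers.Dense(d, activation="sigmoid")(code)
dae = tf.keras.Model(inputs, outputs)
dae.compile(optimizer="adam", loss="mse")

x_clean = np.random.rand(1000, d).astype("float32")
x_noisy = corrupt(x_clean, v).astype("float32")
# Train to reconstruct the clean input from the corrupted one.
dae.fit(x_noisy, x_clean, epochs=5, batch_size=64)
```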