Vector optimization
선진환
Terminology
• Convex
• Regularization
• L1, L2
Standard form of an optimization problem
Q: What is the optimal (best) vector x* in this primal problem?
Step 1. Existence (Slater's Condition)
Step 2. Where? (KKT Conditions)
Step 3. How? (Backtracking Line Search, Gradient Descent, ...)
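As a concrete illustration of Step 3, here is a minimal sketch (Python/NumPy, not from the slides) of gradient descent with backtracking line search on a toy least-squares problem; alpha and beta are the usual backtracking parameters.

```python
import numpy as np

def gradient_descent(f, grad, x0, alpha=0.3, beta=0.8, tol=1e-8, max_iter=1000):
    """Minimize a differentiable convex f, choosing each step size t
    by backtracking line search (Armijo sufficient-decrease condition)."""
    x = x0.astype(float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        t = 1.0
        while f(x - t * g) > f(x) - alpha * t * g @ g:
            t *= beta                      # shrink until enough decrease
        x = x - t * g
    return x

# Toy problem: f(x) = ||Ax - b||^2, grad f(x) = 2 A^T (Ax - b).
rng = np.random.default_rng(0)
A, b = rng.normal(size=(20, 5)), rng.normal(size=20)
x_star = gradient_descent(lambda x: np.sum((A @ x - b) ** 2),
                          lambda x: 2 * A.T @ (A @ x - b),
                          np.zeros(5))
```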
Example
Deep Neural Network
Example
A preference system is not always clear-cut.
What is certain: four mackerel plus 2,000 won is better than both of the two alternatives!
The moment you must choose between jjamppong and jjajangmyeon...
Why add a regularization term?
- To shrink the absolute size of the parameters.
- Let's think about how to express this mathematically.
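For reference, the two standard penalties behind the L1/L2 terms in the glossary (textbook forms; the loss $L$ is generic):

$$\min_x \; L(x) + \lambda \lVert x \rVert_1 \quad \text{(L1)} \qquad\qquad \min_x \; L(x) + \lambda \lVert x \rVert_2^2 \quad \text{(L2)}$$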
Standard form of "Vector optimization"
A proper cone $K$ induces a partial order, so objective values can be (partially) compared.
Defining '(the) best': Optimal and Pareto optimal
The difference between the best and the best achievable
Vector Optimization
Remark.
• In vector optimization, rather than looking for an optimal (best) point,
it is more appropriate to look for Pareto (best-achievable) optimal points.
• A Pareto optimal value lies on the boundary of the set of achievable objective values.
• An optimal point is Pareto optimal.
• So: find all the Pareto optimal points.
• If an optimal point is among them, the optimal value follows immediately.
How to find Pareto optimal points – Scalarization
This method cannot find every Pareto optimal point.
(A small sketch of the weight sweep follows.)
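A minimal sketch of scalarization in Python/SciPy, with two hypothetical quadratic objectives: every strictly positive weight vector yields one Pareto optimal point, and sweeping the weights traces out part of the Pareto frontier.

```python
import numpy as np
from scipy.optimize import minimize

# Two competing convex objectives (hypothetical):
f1 = lambda x: (x[0] - 1) ** 2 + x[1] ** 2      # prefers x near (1, 0)
f2 = lambda x: x[0] ** 2 + (x[1] - 1) ** 2      # prefers x near (0, 1)

pareto_points = []
for theta in np.linspace(0.01, 0.99, 25):       # lambda = (theta, 1 - theta) > 0
    scalarized = lambda x, th=theta: th * f1(x) + (1 - th) * f2(x)
    x_star = minimize(scalarized, x0=np.zeros(2)).x
    pareto_points.append((x_star, f1(x_star), f2(x_star)))

# Each x_star is Pareto optimal because lambda > 0; sweeping lambda recovers
# every Pareto point only under convexity, per the Proposition below.
```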
• Lemma
Let $S$ be a convex set. For any minimal element $x$ of $S$, there exists a nonzero $\lambda \succeq_{K^*} 0$ (i.e., $\lambda$ lies in the dual cone $K^*$) such that $x$ minimizes $\lambda^T z$ over $z \in S$.
pf) easy
• Proposition
For a convex vector optimization problem: for every Pareto optimal point $x_{\mathrm{po}}$, there is a nonzero $\lambda \succeq_{K^*} 0$ such that $x_{\mathrm{po}}$ minimizes the scalarized problem.
• Remark!
• Optimal for the scalarized problem ⟶ Pareto optimal ⋯ $\lambda \succ_{K^*} 0$
• Pareto optimal ⟶ optimal for the scalarized problem ⋯ $\lambda \succeq_{K^*} 0$
Whether the condition is strict ($\succ$) or not ($\succeq$) hardly matters in practice!!
To summarize what we have done so far:
1. The preference system may be uncertain.
2. Vector optimization
3. Optimal? No! Pareto optimal!
4. How? Scalarization!
5. Why? By the Proposition!
Application
• Regularized least squares
• Smoothing regularization
• Reconstruction, smoothing and de-noising
Regularized least squares
Given a data matrix A and a label vector b, linear regression: the problem of finding a suitable parameter x.
Aim: find x that gives a good fit and that is not too large.
How do we find the Pareto optimal points of this problem??
Scalarization!!
Recall: how to find Pareto optimal points – Scalarization
This method cannot find every Pareto optimal point.
Regularized least squares
Local minimum: the point where the derivative is zero.
Different ratios of $\lambda_1$ to $\lambda_2$ give different solutions, via $\mu = \lambda_2/\lambda_1$:
- prioritize fitting ($\mu \ll 1$), or
- prioritize regularization ($\mu \gg 1$).
A sketch follows below.
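A minimal NumPy sketch of that closed form (synthetic data; here $\mu$ plays the role of the ratio $\lambda_2/\lambda_1$): setting the gradient $2A^T(Ax - b) + 2\mu x$ to zero gives $x^* = (A^T A + \mu I)^{-1} A^T b$.

```python
import numpy as np

def ridge_solution(A, b, mu):
    """Closed-form minimizer of ||Ax - b||^2 + mu * ||x||^2."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + mu * np.eye(n), A.T @ b)

rng = np.random.default_rng(0)
A, b = rng.normal(size=(30, 8)), rng.normal(size=30)
for mu in (1e-3, 1.0, 1e3):
    x = ridge_solution(A, b, mu)
    print(f"mu={mu:g}  fit={np.sum((A @ x - b) ** 2):.3f}  size={np.sum(x ** 2):.3f}")
# Small mu favors the fit; mu >> 1 favors regularization (a small ||x||).
```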
Application
• Regularized least squares
• Smoothing regularization
• Reconstruction, smoothing and de-noising
Smoothing regularization
• A variation on ordinary regularization.
• minimize $(L(A, x, b),\ \|x\|)$
• When $x$ is a variable indexed by time
• (e.g. $x_i$: the temperature at time $i/n$, after dividing $[0, 1]$ into $n$ parts),
• use $\|Dx\|$ in place of $\|x\|$.
• $Dx$: a function that measures the variability, or lack of smoothness, of $x$.
Smoothing regularization
$(\Delta x)_i$: an approximation of the curvature at the $i$-th time point (a second-difference sketch follows).
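A minimal NumPy sketch of the scalarized smoothing problem $\|Ax - b\|^2 + \delta \|\Delta x\|^2 + \mu \|x\|^2$, assuming the standard second difference for $\Delta$ and reusing the $(\delta, \mu)$ pairs from the plots below; the data is synthetic.

```python
import numpy as np

def second_difference(n):
    """Rows implement (Delta x)_i = x_{i-1} - 2*x_i + x_{i+1}."""
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    return D

def smoothed_solution(A, b, delta, mu):
    """Closed-form minimizer of ||Ax-b||^2 + delta*||Dx||^2 + mu*||x||^2."""
    n = A.shape[1]
    D = second_difference(n)
    return np.linalg.solve(A.T @ A + delta * D.T @ D + mu * np.eye(n), A.T @ b)

rng = np.random.default_rng(0)
A, b = rng.normal(size=(40, 10)), rng.normal(size=40)
x_rough = smoothed_solution(A, b, delta=0.0, mu=0.005)
x_smooth = smoothed_solution(A, b, delta=0.3, mu=0.05)
```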
𛿠= 0
𜇠= 0.005
𛿠= 0
𜇠= 0.05
𛿠= 0.3
𜇠= 0.05
𛿠가 ì»¤ì§ˆìˆ˜ë¡ smoothing 해지고
𜇠가 ì»¤ì§ˆìˆ˜ë¡ inputì˜ í¬ê¸°ê°€ 작아진다.
Application
• Regularized least squares
• Smoothing regularization
• Reconstruction, smoothing and de-noising
Reconstruction, smoothing and de-noising
• $x$: a signal represented by a vector in $\mathbb{R}^n$
• $x = (x_1, x_2, \dots, x_n)$, where $x_i = x(t_i)$
• Assume that $x$ usually doesn't vary rapidly,
• e.g. an audio signal or video.
• $x_{\mathrm{cor}} = x + v$, where $v$ is noise.
• Having sampled the corrupted signal ($x_{\mathrm{cor}}$), the original signal ($x$) mixed with noise ($v$), we want to infer what the original signal ($x$) was.
• That is, we want an estimate $\hat{x}$ of the original signal $x$, given the corrupted signal $x_{\mathrm{cor}}$.
Reconstruction, smoothing and de-noising
𜙠∶ ð‘Ÿð‘’ð‘”ð‘¢ð‘™ð‘Žð‘Ÿð‘–ð‘§ð‘Žð‘¡ð‘–ð‘œð‘› ð‘“ð‘¢ð‘›ð‘ð‘¡ð‘–ð‘œð‘› ð‘œð‘Ÿ ð‘ ð‘šð‘œð‘œð‘¡â„Žð‘–ð‘›ð‘” ð‘œð‘ð‘—ð‘’ð‘ð‘¡ð‘–ð‘£ð‘’
𜙠ð‘šð‘’ð‘Žð‘ ð‘¢ð‘Ÿð‘’ð‘  ð‘Ÿð‘œð‘¢ð‘”â„Žð‘›ð‘’ð‘ ð‘  ð‘œð‘Ÿ ð‘™ð‘Žð‘𑘠ð‘œð‘“ ð‘ ð‘šð‘œð‘œð‘¡â„Žð‘›ð‘’ð‘ ð‘  ð‘œð‘“ sð‘¥ + convex
Reconstruction, smoothing and de-noising
The case with rapid variation; see the sketch below.
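For signals with jumps, quadratic smoothing tends to blur the edges, while total-variation regularization (a common choice for exactly this case, though the slides don't name the penalty used) tends to preserve them. A minimal sketch with the cvxpy modeling library; the piecewise-constant signal, noise level, and weight 5.0 are invented for illustration.

```python
import numpy as np
import cvxpy as cp

# Hypothetical corrupted signal: piecewise constant (one jump) plus noise.
rng = np.random.default_rng(0)
n = 200
x_true = np.concatenate([np.zeros(n // 2), np.ones(n - n // 2)])
x_cor = x_true + 0.1 * rng.normal(size=n)

x_hat = cp.Variable(n)
fit = cp.sum_squares(x_hat - x_cor)
phi_quad = cp.sum_squares(cp.diff(x_hat))  # quadratic smoothing: blurs the jump
phi_tv = cp.norm1(cp.diff(x_hat))          # total variation: keeps the jump sharp

for phi in (phi_quad, phi_tv):
    cp.Problem(cp.Minimize(fit + 5.0 * phi)).solve()
    # x_hat.value now holds the reconstruction for this choice of phi.
```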
End.