Learning Machine Learning with Coursera Machine Learning: Week 5
Overview
Continuing from the previous posts, this installment summarizes Coursera Machine Learning: Week 5.
This post is part of a series; see the links below for the earlier installments.
Learning Machine Learning with Coursera Machine Learning: Week 1
Learning Machine Learning with Coursera Machine Learning: Week 2
Learning Machine Learning with Coursera Machine Learning: Week 3
Learning Machine Learning with Coursera Machine Learning: Week 4
Before You Read…
This post exists to consolidate the concepts I am learning while working through the Coursera Machine Learning course (and so that I can easily look the material up again later).
The posts follow the course's weekly structure, one post per Coursera week.
The core of each post is the Coursera slides annotated in Korean; the aim is for the slide images alone to convey the material as far as possible, without having to read the surrounding text.
The mathematics is kept at the level of Korean high-school math; whenever a mathematical concept comes up, it is explained on that basis wherever possible.
The table of contents mirrors the Coursera course, and section titles are kept in the original English (so that the original lectures and these notes are easy to cross-reference).
Cost Function and Backpropagation
Cost Function
Cost Function (1/2)
Cost Function (2/2)
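For reference, the regularized cost function presented on these slides generalizes the logistic-regression cost to a network with $K$ output units, $L$ layers, and $s_l$ units in layer $l$:

$$
J(\Theta) = -\frac{1}{m}\sum_{i=1}^{m}\sum_{k=1}^{K}\Big[ y_k^{(i)} \log\big(h_\Theta(x^{(i)})\big)_k + \big(1 - y_k^{(i)}\big)\log\Big(1 - \big(h_\Theta(x^{(i)})\big)_k\Big) \Big] + \frac{\lambda}{2m}\sum_{l=1}^{L-1}\sum_{i=1}^{s_l}\sum_{j=1}^{s_{l+1}} \big(\Theta_{ji}^{(l)}\big)^2
$$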
Backpropagation Algorithm
Another name for backpropagation is error backpropagation; in other words, it is an algorithm that propagates the error backwards through the network.
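This is not from the slides themselves, but a rough Octave sketch of one backpropagation step for a three-layer network may help; it assumes a single training example x with label vector y, and weight matrices Theta1 and Theta2 as on the slides:

g = @(z) 1 ./ (1 + exp(-z));          % sigmoid activation

% Forward pass
a1 = [1; x];                          % input activation with bias unit
z2 = Theta1 * a1;   a2 = [1; g(z2)];  % hidden-layer activation with bias
z3 = Theta2 * a2;   a3 = g(z3);       % output h(x)

% Backward pass: propagate the error from the output layer back
delta3 = a3 - y;                                  % output-layer error
delta2 = (Theta2' * delta3) .* (a2 .* (1 - a2));  % hidden-layer error
delta2 = delta2(2:end);                           % drop the bias-unit term

% Gradient accumulators (summed over examples and divided by m in practice)
Delta1 = delta2 * a1';
Delta2 = delta3 * a2';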
Backpropagation Algorithm (1/4)
Backpropagation Algorithm (2/4)
Backpropagation Algorithm (3/4)
Backpropagation Algorithm (4/4)
Backpropagation Intuition
Backpropagation Intuition (1/4)
Backpropagation Intuition (2/4)
Backpropagation Intuition (3/4)
Backpropagation Intuition (4/4)
Backpropagation in Practice
Implementation note: Unrolling Parameters
Implementation note: Unrolling Parameters (1/7)
Implementation note: Unrolling Parameters (2/7)
Implementation note: Unrolling Parameters (3/7)
Implementation note: Unrolling Parameters (4/7)
Let's try this directly in Octave to understand the concept concretely.
>> Theta1 = ones(10,11)
Theta1 =
1 1 1 1 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1 1 1 1
Implementation note: Unrolling Parameters (5/7)
>> Theta2 = 2*ones(10,11)
Theta2 =
2 2 2 2 2 2 2 2 2 2 2
2 2 2 2 2 2 2 2 2 2 2
2 2 2 2 2 2 2 2 2 2 2
2 2 2 2 2 2 2 2 2 2 2
2 2 2 2 2 2 2 2 2 2 2
2 2 2 2 2 2 2 2 2 2 2
2 2 2 2 2 2 2 2 2 2 2
2 2 2 2 2 2 2 2 2 2 2
2 2 2 2 2 2 2 2 2 2 2
2 2 2 2 2 2 2 2 2 2 2
Implementation note: Unrolling Parameters (6/7)
>> Theta3 = 3*ones(1,11)
Theta3 =
3 3 3 3 3 3 3 3 3 3 3
>> thetaVec = [Theta1(:); Theta2(:); Theta3(:)];
>> size(thetaVec )
ans =
231 1
Implementation note: Unrolling Parameters (7/7)
The matrix to be used as Theta1 can be recovered from thetaVec with reshape.
>> reshape(thetaVec(1:110), 10, 11)
ans =
1 1 1 1 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1 1 1 1
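Theta2 and Theta3 can be recovered from thetaVec in the same way; the index ranges follow from the sizes above (110 + 110 + 11 = 231 elements), and the _recovered names below are only illustrative:

>> Theta2_recovered = reshape(thetaVec(111:220), 10, 11);
>> Theta3_recovered = reshape(thetaVec(221:231), 1, 11);

This unroll/reshape round trip is what lets the weight matrices be passed to optimization routines such as fminunc as a single vector and then rebuilt inside the cost function.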
Gradient Checking
Gradient Checking (1/4)
Gradient Checking (2/4)
Gradient Checking (3/4)
Gradient Checking (4/4)
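These slides describe the two-sided numerical gradient check. A minimal Octave sketch, assuming a cost-function handle J and an unrolled parameter vector thetaVec (both names are assumptions for illustration):

EPSILON = 1e-4;                       % small perturbation used in the course
n = numel(thetaVec);
gradApprox = zeros(n, 1);
for i = 1:n
  thetaPlus     = thetaVec;  thetaPlus(i)  = thetaPlus(i)  + EPSILON;
  thetaMinus    = thetaVec;  thetaMinus(i) = thetaMinus(i) - EPSILON;
  gradApprox(i) = (J(thetaPlus) - J(thetaMinus)) / (2 * EPSILON);   % two-sided difference
end
% gradApprox should be numerically very close to the gradient obtained from
% backpropagation; turn the check off before real training, since it is slow.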
Random Initialization
Random Initialization (1/3)
Random Initialization (2/3)
Random Initialization (3/3)
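These slides cover symmetry breaking. A minimal Octave sketch of random initialization in a small interval around zero, reusing the matrix sizes from the unrolling example (the value of INIT_EPSILON is just illustrative):

INIT_EPSILON = 0.12;                  % illustrative small range around zero
Theta1 = rand(10, 11) * (2 * INIT_EPSILON) - INIT_EPSILON;
Theta2 = rand(10, 11) * (2 * INIT_EPSILON) - INIT_EPSILON;
Theta3 = rand(1, 11)  * (2 * INIT_EPSILON) - INIT_EPSILON;
% Initializing every weight to the same value (e.g. zeros) would make all hidden
% units in a layer compute the same function; random values break this symmetry.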
Putting it Together
Putting it Together (1/6)
Putting it Together (2/6)
Putting it Together (3/6)
Putting it Together (4/6)
Putting it Together (5/6)
Putting it Together (6/6)
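As a rough sketch of how the unrolled parameters, cost, and gradients fit together with an advanced optimizer: nnCostFunction below is a hypothetical function that returns the cost J and the unrolled gradient for a given parameter vector.

% Hypothetical cost function: [J, grad] = nnCostFunction(thetaVec)
options      = optimset('GradObj', 'on', 'MaxIter', 100);
initialTheta = [Theta1(:); Theta2(:); Theta3(:)];        % unrolled initial parameters
[optTheta, finalCost] = fminunc(@(t) nnCostFunction(t), initialTheta, options);
% Inside nnCostFunction, reshape recovers the weight matrices from the vector,
% forward propagation computes the cost, and backpropagation computes the gradient.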
