- Using the cost function and gradient descent, derive an algorithm for linear regression, i.e., the linear function that best fits the data.
Gradient descent algorithm

Repeat until convergence {
$$\theta_j \leftarrow \theta_j - \alpha\,\frac{\partial}{\partial\theta_j} J(\theta) \qquad (j = 0, 1, \ldots, n)$$
}
Linear regression model

$$h_\theta(x) = \theta_0 + \theta_1 x_1 + \cdots + \theta_n x_n$$
$$J(\theta) = \frac{1}{2m}\sum_{i=1}^{m}\left\{h_\theta(x^{(i)}) - y^{(i)}\right\}^2$$
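The model and cost function above can be sketched in NumPy. This is a minimal illustration, not code from the post; the function names are my own, and the convention is that `X` carries a leading column of ones so that $\theta_0$ is handled like any other parameter.

```python
import numpy as np

def hypothesis(theta, X):
    # h_theta(x) = theta_0 + theta_1*x_1 + ... + theta_n*x_n
    # X is (m, n+1) with a leading column of ones for the intercept
    return X @ theta

def cost(theta, X, y):
    # J(theta) = (1/2m) * sum_i (h_theta(x^(i)) - y^(i))^2
    m = len(y)
    errors = hypothesis(theta, X) - y
    return (errors @ errors) / (2 * m)
```

For data generated exactly by $y = 2 + 3x$, the cost at $\theta = (2, 3)$ is zero, which is a quick sanity check of the formula.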
$$\frac{\partial}{\partial\theta_j} J(\theta)
= \frac{\partial}{\partial\theta_j}\,\frac{1}{2m}\sum_{i=1}^{m}\left\{h_\theta(x^{(i)}) - y^{(i)}\right\}^2
= \frac{\partial}{\partial\theta_j}\,\frac{1}{2m}\sum_{i=1}^{m}\left\{\theta_0 + \theta_1 x_1^{(i)} + \cdots + \theta_n x_n^{(i)} - y^{(i)}\right\}^2$$

$$j = 0: \quad \frac{\partial}{\partial\theta_0} J(\theta) = \frac{1}{m}\sum_{i=1}^{m}\left\{h_\theta(x^{(i)}) - y^{(i)}\right\}$$
$$j = k: \quad \frac{\partial}{\partial\theta_k} J(\theta) = \frac{1}{m}\sum_{i=1}^{m}\left\{h_\theta(x^{(i)}) - y^{(i)}\right\} x_k^{(i)} \qquad (1 \le k \le n)$$
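One way to verify this derivation is to compare the analytic gradient against a finite-difference approximation of $J$. The sketch below (my own helper names, assuming `X` has a leading column of ones) does exactly that; note the $j = 0$ case falls out automatically because $x_0^{(i)} = 1$.

```python
import numpy as np

def analytic_gradient(theta, X, y):
    # dJ/dtheta_j = (1/m) * sum_i (h_theta(x^(i)) - y^(i)) * x_j^(i)
    # column 0 of X is all ones, so this also covers the j = 0 case
    m = len(y)
    return (X.T @ (X @ theta - y)) / m

def numeric_gradient(theta, X, y, eps=1e-6):
    # central finite differences of J(theta), for checking the derivation
    m = len(y)
    J = lambda t: ((X @ t - y) @ (X @ t - y)) / (2 * m)
    grad = np.zeros_like(theta)
    for j in range(len(theta)):
        e = np.zeros_like(theta)
        e[j] = eps
        grad[j] = (J(theta + e) - J(theta - e)) / (2 * eps)
    return grad
```

The two gradients should agree to within the finite-difference error (roughly `eps**2`).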
Gradient descent algorithm for linear regression

Repeat until convergence {
$$\theta_0 \leftarrow \theta_0 - \alpha\,\frac{1}{m}\sum_{i=1}^{m}\left\{h_\theta(x^{(i)}) - y^{(i)}\right\}$$
$$\theta_k \leftarrow \theta_k - \alpha\,\frac{1}{m}\sum_{i=1}^{m}\left\{h_\theta(x^{(i)}) - y^{(i)}\right\} x_k^{(i)} \qquad (1 \le k \le n)$$
}

(Update $\theta_0$ and $\theta_k$ simultaneously.)
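The full update loop can be sketched as follows. This is a minimal NumPy version under my own conventions (leading column of ones in `X`, fixed iteration count instead of a convergence test); computing the whole gradient from the current `theta` before updating gives the simultaneous update the algorithm requires.

```python
import numpy as np

def gradient_descent(X, y, alpha=0.1, num_iters=5000):
    # X: (m, n+1) with a leading column of ones; y: (m,)
    m, d = X.shape
    theta = np.zeros(d)
    for _ in range(num_iters):
        # simultaneous update: the entire gradient is computed from
        # the current theta before any parameter is changed
        grad = (X.T @ (X @ theta - y)) / m
        theta -= alpha * grad
    return theta
```

On data generated by $y = 2 + 3x$, the loop should recover $\theta \approx (2, 3)$, provided $\alpha$ is small enough for the iteration to converge.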
* Notation
m: number of training examples
n: number of features
(x^{(i)}, y^{(i)}): the i-th training example
α: learning rate