
Gradient descent for linear regression

- Using the cost function and gradient descent, derive the learning algorithm for linear regression, i.e., the linear function that fits the data.

 

Gradient descent algorithm

Repeat until convergence {

    $\theta_j := \theta_j - \alpha \dfrac{\partial}{\partial \theta_j} J(\theta) \quad (j = 0, 1, \dots, n)$

}

Linear regression model

$h_\theta(x) = \theta_0 + \theta_1 x_1 + \cdots + \theta_n x_n$

$J(\theta) = \dfrac{1}{2m} \sum_{i=1}^{m} \left\{ h_\theta(x^{(i)}) - y^{(i)} \right\}^2$
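The cost function above can be evaluated directly. A minimal NumPy sketch (the function and variable names are illustrative, not from the post), where `X` is the design matrix with a leading column of ones so that `theta[0]` acts as the intercept $\theta_0$:

```python
import numpy as np

def cost(theta, X, y):
    """J(theta) = 1/(2m) * sum_i (h_theta(x^(i)) - y^(i))^2."""
    m = len(y)
    residuals = X @ theta - y          # h_theta(x^(i)) - y^(i) for every example
    return (residuals @ residuals) / (2 * m)

# Toy data generated from y = 1 + 2x, so theta = [1, 2] gives zero cost.
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([1.0, 3.0, 5.0])
print(cost(np.array([1.0, 2.0]), X, y))  # 0.0
```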

 

$\dfrac{\partial}{\partial \theta_j} J(\theta)$

$= \dfrac{\partial}{\partial \theta_j} \dfrac{1}{2m} \sum_{i=1}^{m} \left\{ h_\theta(x^{(i)}) - y^{(i)} \right\}^2$

$= \dfrac{\partial}{\partial \theta_j} \dfrac{1}{2m} \sum_{i=1}^{m} \left\{ \theta_0 + \theta_1 x_1^{(i)} + \cdots + \theta_n x_n^{(i)} - y^{(i)} \right\}^2$

 

$j = 0: \quad \dfrac{\partial}{\partial \theta_0} J(\theta) = \dfrac{1}{m} \sum_{i=1}^{m} \left\{ h_\theta(x^{(i)}) - y^{(i)} \right\}$

$j = k: \quad \dfrac{\partial}{\partial \theta_k} J(\theta) = \dfrac{1}{m} \sum_{i=1}^{m} \left[ \left\{ h_\theta(x^{(i)}) - y^{(i)} \right\} x_k^{(i)} \right] \quad (\text{for } 1 \le k \le n)$
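These closed-form partial derivatives can be verified numerically: the analytic gradient should match central finite differences of $J(\theta)$. A minimal sketch (NumPy assumed; all names here are illustrative):

```python
import numpy as np

def analytic_grad(theta, X, y):
    """(1/m) * sum_i (h_theta(x^(i)) - y^(i)) * x_j^(i), for every j at once."""
    m = len(y)
    return X.T @ (X @ theta - y) / m

def numeric_grad(theta, X, y, eps=1e-6):
    """Central finite differences of J(theta), as an independent check."""
    m = len(y)
    J = lambda t: np.sum((X @ t - y) ** 2) / (2 * m)
    grad = np.zeros_like(theta)
    for j in range(len(theta)):
        step = np.zeros_like(theta)
        step[j] = eps
        grad[j] = (J(theta + step) - J(theta - step)) / (2 * eps)
    return grad

X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])  # leading ones column = x_0
y = np.array([1.0, 3.0, 5.0])
theta = np.array([0.5, 0.5])
print(np.max(np.abs(analytic_grad(theta, X, y) - numeric_grad(theta, X, y))))
```

Because $J$ is quadratic, the central difference is exact up to floating-point rounding, so the two gradients agree to many decimal places.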

 

Gradient descent algorithm

Repeat until convergence {

    $\theta_0 := \theta_0 - \alpha \dfrac{1}{m} \sum_{i=1}^{m} \left\{ h_\theta(x^{(i)}) - y^{(i)} \right\}$

    $\theta_k := \theta_k - \alpha \dfrac{1}{m} \sum_{i=1}^{m} \left[ \left\{ h_\theta(x^{(i)}) - y^{(i)} \right\} x_k^{(i)} \right]$

}

(Update $\theta_0$ and $\theta_k$ simultaneously)
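The complete update rule can be sketched in a few lines of NumPy (names and hyperparameter values are illustrative). Computing every component of the gradient from the same residual vector before touching `theta` is exactly the simultaneous update the algorithm requires:

```python
import numpy as np

def gradient_descent(X, y, alpha=0.1, num_iters=1000):
    """Batch gradient descent for linear regression.

    X: (m, n+1) design matrix with a leading column of ones,
    so theta[0] is updated by the j = 0 rule automatically.
    """
    m, d = X.shape
    theta = np.zeros(d)
    for _ in range(num_iters):
        residuals = X @ theta - y            # h_theta(x^(i)) - y^(i)
        gradient = (X.T @ residuals) / m     # (1/m) * sum_i residual * x_j^(i)
        theta = theta - alpha * gradient     # every theta_j updated at once
    return theta

# Toy data generated from y = 1 + 2x: theta should approach [1, 2].
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([1.0, 3.0, 5.0])
theta = gradient_descent(X, y)
print(theta)
```

On this toy problem the learning rate $\alpha = 0.1$ is small enough to converge; on real data, features on very different scales would make a single $\alpha$ hard to choose, which is where feature scaling helps.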

 

 

* Notation

m: The number of training examples

n: The number of features

$x^{(i)}$: the $i$-th training example

α: learning rate
