9-4. Logistic regression implementation with NumPy

iliosncelini 2019. 5. 13. 22:40

Attached notebook: 3_logistic_regression_with_numpy.ipynb (0.05 MB)

# x is 200*3 and theta is 3*1, so hypothesis_function(x, theta) is 200*1
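The notebook cell itself is not reproduced in this post, but a minimal sketch of a hypothesis_function consistent with those shapes would look like the following; the sigmoid form is assumed from the logistic regression setting:

import numpy as np

def hypothesis_function(x, theta):
    # x: (200, 3) design matrix, theta: (3, 1)
    # x.dot(theta) is (200, 1); the element-wise sigmoid keeps that shape,
    # so the returned hypothesis is also (200, 1)
    z = x.dot(theta)
    return 1.0 / (1.0 + np.exp(-z))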

 

# y is 200*1 and hypothesis_function(x, theta) is 200*1, so y is transposed.
# y.T.dot(np.log(hypothesis_function(x,theta))) is 1*1, i.e. a scalar
# (1-y).T.dot(np.log(1- hypothesis_function(x,theta))) is also 1*1, i.e. a scalar

# The two lines below compute the same thing; both are written out just to show different ways of expressing it.
# (-1.0 / m) * (y.T.dot(np.log(hypothesis_function(x,theta))) + (1-y).T.dot(np.log(1- hypothesis_function(x,theta))))
# (-1.0 / m) * (y * np.log(hypothesis_function(x,theta)) + (1-y) * np.log(1- hypothesis_function(x,theta))).sum()
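Wrapping either of those two expressions in a function gives a possible cost function; the name compute_cost is a placeholder here, not necessarily what the notebook uses, and it reuses hypothesis_function from the sketch above:

def compute_cost(x, y, theta):
    # y: (200, 1), hypothesis_function(x, theta): (200, 1)
    m = y.shape[0]
    h = hypothesis_function(x, theta)
    # y.T.dot(...) and (1 - y).T.dot(...) are each (1, 1), i.e. scalars
    cost = (-1.0 / m) * (y.T.dot(np.log(h)) + (1 - y).T.dot(np.log(1 - h)))
    return cost.item()  # 1x1 array -> plain Python scalar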

 

 

 

# partial_marginal is 200*1
# delta is 200*1
# grad_i is (1*200)(200*1), so a scalar
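Based on those dimension notes, a gradient descent routine that produces partial_marginal, delta, and grad_i with exactly those shapes could be sketched as below; the loop structure, the learning rate alpha, the 1/m scaling, and the iteration count are assumptions for illustration, not necessarily the notebook's exact code:

def minimize_gradient(x, y, theta, iterations=10000, alpha=0.01):
    # plain batch gradient descent; iterations and alpha are illustrative values
    m = y.shape[0]
    for _ in range(iterations):
        original_theta = np.copy(theta)
        for i in range(theta.shape[0]):
            # partial_marginal: the i-th column of x, reshaped to (200, 1)
            partial_marginal = x[:, i].reshape(m, 1)
            # delta: prediction error, (200, 1)
            delta = hypothesis_function(x, original_theta) - y
            # grad_i: (1, 200) dot (200, 1) -> 1x1, converted to a scalar
            grad_i = delta.T.dot(partial_marginal).item()
            theta[i] = theta[i] - alpha * grad_i / m
    return theta

# quick shape check with random data (illustrative only, not the notebook's dataset)
np.random.seed(0)
x = np.hstack([np.ones((200, 1)), np.random.randn(200, 2)])  # 200*3, bias column first
y = (np.random.rand(200, 1) > 0.5).astype(float)             # 200*1 labels
theta = minimize_gradient(x, y, np.zeros((3, 1)), iterations=1000, alpha=0.1)
print(compute_cost(x, y, theta))

Computing delta against original_theta, a copy taken before the inner loop, keeps every grad_i within one pass based on the same parameter vector, i.e. a simultaneous update of all theta[i].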