I'm trying to write a batch gradient descent algorithm myself, and I ran into a problem when differentiating the cost function. The function that computes the cost takes three arguments — how do I differentiate it with diff?
import numpy as np
from sympy import *
from pylab import *
# h(x) = theta0 + theta1 * x1 + theta2 * x2 + ...
def hypothesis(x_sample, theta):
    temp = [x * y for (x, y) in zip(x_sample, theta)]
    result = sum(temp)
    return result
# cost function (j(theta)) ...
def cost_func(x_set, y_set, theta):
    # apply the hypothesis to each sample x, not to the whole x_set
    result = sum([pow(hypothesis(x, theta) - y, 2) for (x, y) in zip(x_set, y_set)])
    result = result / 2
    return result
def batch_theta_update(x_set, y_set, theta, learning_rate):
    for theta_j in theta:
        # this is the part that doesn't work: rebinding the arguments
        # to symbols and then differentiating
        x_set, y_set, theta = symbols('x y z')
        gradient = diff(cost_func(x, y, z), theta_j)
        theta_j -= learning_rate * gradient
    return theta
In the batch_theta_update function, I need to compute the gradient — how should I write it?
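One way to make diff work here is to keep the parameters symbolic from the start: create one SymPy symbol per theta_j, build the cost as a symbolic expression over the concrete training data, differentiate with respect to each symbol, and only then substitute the current numeric values. Below is a minimal sketch of that idea; the function name batch_gradient_step and the symbol names theta0, theta1, ... are my own choices, and it assumes each row of x_set already includes the bias feature (a leading 1).

```python
from sympy import symbols, diff

def batch_gradient_step(x_set, y_set, theta, learning_rate):
    """One batch update, differentiating the cost symbolically with SymPy."""
    n = len(theta)
    # one symbol per parameter: theta0, theta1, ...
    theta_syms = symbols('theta0:%d' % n)
    # build J(theta) symbolically over the concrete training data
    cost = sum(
        (sum(t * xj for t, xj in zip(theta_syms, x)) - y) ** 2
        for x, y in zip(x_set, y_set)
    ) / 2
    # differentiate with respect to each theta_j, then plug in the
    # current numeric parameter values
    subs = dict(zip(theta_syms, theta))
    gradient = [float(diff(cost, s).subs(subs)) for s in theta_syms]
    # simultaneous update of every component
    return [t - learning_rate * g for t, g in zip(theta, gradient)]
```

Note that for plain linear regression the derivative has a closed form, sum((h(x) - y) * x_j), so the symbolic step can be replaced by that formula directly once the differentiation is understood.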