fminunc alternative in numpy

Anu*_*rma 20 python matlab numpy octave python-2.7

Is there an alternative to the fminunc function (from Octave/Matlab) in Python? I have a cost function for a binary classifier, and now I want to run gradient descent to obtain the optimal value of theta. The Octave/Matlab implementation would look like this:

%  Set options for fminunc
options = optimset('GradObj', 'on', 'MaxIter', 400);

%  Run fminunc to obtain the optimal theta
%  This function will return theta and the cost 
[theta, cost] = ...
    fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);

I have already converted my costFunction to Python using the numpy library, and I am looking for fminunc or any other gradient descent algorithm implementation in numpy.

Car*_*ter 26

More information about the relevant functions is here: http://docs.scipy.org/doc/scipy-0.10.0/reference/tutorial/optimize.html
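
For instance, here is a minimal sketch of how the Octave call in the question might map onto scipy.optimize.minimize. The cost_function below is a hypothetical stand-in (not your code): returning (cost, gradient) together with jac=True plays roughly the role of optimset('GradObj', 'on'), and options={'maxiter': 400} corresponds to MaxIter.

import numpy as np
from scipy.optimize import minimize
from scipy.special import expit  # numerically stable sigmoid

def cost_function(theta, X, y):
    # Hypothetical logistic-regression cost; returns (cost, gradient)
    # so that jac=True below mimics optimset('GradObj', 'on').
    m = y.size
    h = expit(X.dot(theta))
    h = np.clip(h, 1e-10, 1 - 1e-10)   # guard against log(0)
    cost = -(y.dot(np.log(h)) + (1 - y).dot(np.log(1 - h))) / m
    grad = X.T.dot(h - y) / m
    return cost, grad

X = np.array([[1.0, 2.0, 3.0], [1.0, 3.0, 4.0]])   # toy data
y = np.array([1.0, 0.0])
initial_theta = np.zeros(X.shape[1])

# Returns an OptimizeResult whose .x and .fun correspond to
# fminunc's [theta, cost] outputs.
res = minimize(cost_function, initial_theta, args=(X, y),
               method='BFGS', jac=True, options={'maxiter': 400})
theta, cost = res.x, res.fun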

Also, it looks like you are working through the Coursera Machine Learning course, but in Python. You might take a look at http://aimotion.blogspot.com/2011/11/machine-learning-with-python-logistic.html; this guy is doing the same thing.


cha*_*mmu 20

I was also trying to implement logistic regression as discussed in the Coursera ML course, but in Python. I found scipy helpful. After trying different algorithm implementations in the minimize function, I found Newton Conjugate Gradient the most helpful. Also, after examining its return values, it appears to be equivalent to fminunc in Octave. I have included my Python implementation below to find the optimal theta.

import numpy as np
import scipy.optimize as op

def Sigmoid(z):
    return 1 / (1 + np.exp(-z))

def Gradient(theta, x, y):
    m, n = x.shape
    theta = theta.reshape((n, 1))  # minimize passes theta as a flat array
    y = y.reshape((m, 1))
    sigmoid_x_theta = Sigmoid(x.dot(theta))
    grad = (x.T).dot(sigmoid_x_theta - y) / m
    return grad.flatten()  # minimize expects a flat gradient

def CostFunc(theta, x, y):
    m, n = x.shape
    theta = theta.reshape((n, 1))
    y = y.reshape((m, 1))
    term1 = np.log(Sigmoid(x.dot(theta)))
    term2 = np.log(1 - Sigmoid(x.dot(theta)))
    term = y * term1 + (1 - y) * term2
    J = -(np.sum(term) / m)
    return J

# initialize X and y
X = np.array([[1, 2, 3], [1, 3, 4]])
y = np.array([[1], [0]])

m, n = X.shape
initial_theta = np.zeros(n)
Result = op.minimize(fun=CostFunc,
                     x0=initial_theta,
                     args=(X, y),
                     method='TNC',
                     jac=Gradient)
optimal_theta = Result.x
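For what it's worth, a quick way to see the equivalence with fminunc's [theta, cost] outputs is to inspect the returned object after running the snippet above; Result.x, Result.fun, and Result.success are standard fields of scipy's OptimizeResult:

print(Result.x)        # optimal theta
print(Result.fun)      # cost J at the optimum
print(Result.success)  # True if the optimizer reports convergence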


Jan*_*Jan 8

It looks like you have to switch to scipy.

There you can easily get implementations of all the basic optimization algorithms.

http://docs.scipy.org/doc/scipy/reference/optimize.html
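
As a minimal sketch of that interface, here is the Rosenbrock benchmark function used throughout the scipy tutorial, minimized with two of those algorithms (the starting point is illustrative):

import numpy as np
from scipy.optimize import minimize

def rosen(x):
    # Rosenbrock function, a standard optimization benchmark
    return np.sum(100.0 * (x[1:] - x[:-1]**2)**2 + (1.0 - x[:-1])**2)

x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])

# Gradient-free simplex search
res_nm = minimize(rosen, x0, method='Nelder-Mead')
# Quasi-Newton; the gradient is approximated numerically when jac is omitted
res_bfgs = minimize(rosen, x0, method='BFGS')

print(res_nm.x)
print(res_bfgs.x)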