wha*_*ick 10
Tags: python, numpy, linear-regression
I am performing a least-squares regression as shown below (univariate). I would like to express the significance of the result in terms of R². Numpy returns the unscaled residual sum of squares, so what would be a sensible way of normalizing it?
field_clean, back_clean = rid_zeros(backscatter, field_data)
num_vals = len(field_clean)
x = field_clean[:, row:row+1]
y = 10 * log10(back_clean)
A = hstack([x, ones((num_vals, 1))])
soln = lstsq(A, y)
m, c = soln[0]
residues = soln[1]
print(residues)
Joe*_*ton 19
See http://en.wikipedia.org/wiki/Coefficient_of_determination
Your R² value is:
1 - residual / sum((y - y.mean())**2)
This is equivalent to:
1 - residual / (n * y.var())
As a complete example:
import numpy as np
# Make some data...
n = 10
x = np.arange(n)
y = 3 * x + 5 + np.random.random(n)
# Note that polyfit is an easier way to do this...
# It would just be "model, resid = np.polyfit(x,y,1,full=True)[:2]"
A = np.vstack((x, np.ones(n))).T
model, resid = np.linalg.lstsq(A, y)[:2]
r2 = 1 - resid / (y.size * y.var())
print(r2)
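As the comment in the snippet above notes, np.polyfit with full=True returns the same residual sum of squares, so R² can be computed in exactly the same way. A minimal sketch of that variant, reusing the same synthetic x and y:

import numpy as np

n = 10
x = np.arange(n)
y = 3 * x + 5 + np.random.random(n)

# full=True makes polyfit return the residual sum of squares alongside the coefficients
coeffs, resid = np.polyfit(x, y, 1, full=True)[:2]
r2 = 1 - resid / (y.size * y.var())
print(r2)

Note that resid comes back as a length-1 array here, so r2 is too; index it with [0] if you want a plain scalar.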