Fitting Gaussians to histogram data in Python: Trust Region vs. Levenberg-Marquardt

Myc*_*ofD 2 python gaussian curve-fitting python-2.7

My histogram clearly shows two peaks, but when I fit it with a double Gaussian curve, the fit shows only one peak. I have followed nearly every related answer on Stack Overflow but could not get the correct result. My teacher did this earlier in Fortran and obtained two peaks. In one attempt I used leastsq from scipy.optimize. Should I also provide my data? Here is my code.

import numpy as np
import matplotlib.pyplot as plt
from math import e                  # gauss() below uses e**(...)
from scipy.optimize import leastsq

# x is the raw data array (loaded elsewhere)
binss = int((max(x) - min(x))/0.05)   #0.05 is my bin width; hist needs an integer bin count
n, bins, patches = plt.hist(x, binss, color = 'grey') #gives the histogram

x_a = []
for item in range(len(bins)-1):
    b = (bins[item]+bins[item+1])/2
    x_a.append(b)

x_avg = np.array(x_a)
y_real = n

def gauss(x, A, mu, sigma):
    gaus = []
    for item in range(len(x)):
        gaus.append(A*e**(-(x[item]-mu)**2./(2.*sigma**2)))
    return np.array(gaus)
A1, A2, m1, m2, sd1, sd2 = [25, 30, 0.3, 0.6, -0.9, -0.9]

#Initial guesses for leastsq
p = [A1, A2, m1, m2, sd1, sd2]
y_init = gauss(x_avg, A1, m1, sd1) + gauss(x_avg, A2, m2, sd2)    #initially guessed y

def residual(p, x, y):
    A1, A2, m1, m2, sd1, sd2 = p
    y_fit = gauss(x, A1, m1, sd1) + gauss(x, A2, m2, sd2)
    err = y - y_fit
    return err

sf = leastsq(residual, p, args = (x_avg , y_real))

y_fitted1 = gauss(x_avg, sf[0][0], sf[0][2], sf[0][4])
y_fitted2 = gauss(x_avg, sf[0][1], sf[0][3], sf[0][5])

y_fitted = y_fitted1 + y_fitted2

plt.plot(x_avg, y_init, 'b', label='Starting Guess')
plt.plot(x_avg, y_fitted, color = 'red', label = 'Fitted Data')
plt.plot(x_avg, y_fitted1, color= 'black', label = 'Fitted1 Data')
plt.plot(x_avg, y_fitted2, color = 'green', label = 'Fitted2 Data')

Even the figure I get is not smooth; x_avg has only 54 points. Please help. I cannot even post the figure here.
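A smoother curve can be obtained by evaluating the fitted sum on a dense grid rather than only at the 54 bin centres. A rough sketch reusing sf, x_avg and gauss from the code above (the 500-point grid size is arbitrary):

import numpy as np
import matplotlib.pyplot as plt

x_dense = np.linspace(x_avg.min(), x_avg.max(), 500)   # dense grid over the histogram range

# evaluate each fitted Gaussian and their sum on the dense grid
y_smooth1 = gauss(x_dense, sf[0][0], sf[0][2], sf[0][4])
y_smooth2 = gauss(x_dense, sf[0][1], sf[0][3], sf[0][5])

plt.plot(x_dense, y_smooth1 + y_smooth2, color='red', label='Fitted Data (dense grid)')
plt.legend()
plt.show()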

When I plotted the same fit in MATLAB, I got the correct result. Reason: MATLAB uses a Trust Region algorithm rather than Levenberg-Marquardt, which does not support bound constraints.

The correct result is obtained only when the model is written as the sum of 3 separate Gaussians instead of 2.

How do I decide which algorithm to use, and when?
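Side note for readers on newer SciPy (0.17 or later): scipy.optimize.least_squares lets you choose the algorithm explicitly, so you can compare the Trust Region Reflective method, which accepts bounds, against plain Levenberg-Marquardt, which does not. A rough sketch using the residual, x_avg and y_real defined above; the starting values and bounds here are only illustrative:

import numpy as np
from scipy.optimize import least_squares

p0 = [25, 30, 0.3, 0.6, 0.1, 0.1]   # illustrative start values; sigmas kept positive

# Trust Region Reflective: bounds keep amplitudes non-negative and sigmas positive
res_trf = least_squares(residual, p0, args=(x_avg, y_real), method='trf',
                        bounds=([0, 0, -np.inf, -np.inf, 1e-6, 1e-6], [np.inf]*6))

# Levenberg-Marquardt: no bounds allowed (the same algorithm leastsq wraps)
res_lm = least_squares(residual, p0, args=(x_avg, y_real), method='lm')

print(res_trf.x)   # fitted [A1, A2, m1, m2, sd1, sd2]
print(res_lm.x)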

eme*_*eth 7

Your problem looks like a mixture of Gaussians, also known as a Gaussian mixture model. There are several implementations; the one in sklearn is worth considering.

import numpy as np
from sklearn import mixture
import matplotlib.pyplot as plt

comp0 = np.random.randn(1000) - 5 # samples of the 1st component
comp1 = np.random.randn(1000) + 5 # samples of the 2nd component

x = np.hstack((comp0, comp1)) # merge them
X = x.reshape(-1, 1)          # scikit-learn expects shape (n_samples, n_features)

gmm = mixture.GaussianMixture(n_components=2) # gmm for two components (mixture.GMM in old scikit-learn)
gmm.fit(X) # train it!

linspace = np.linspace(-10, 10, 1000)

fig, ax1 = plt.subplots()
ax2 = ax1.twinx()

ax1.hist(x, 100) # draw samples
# score_samples returns the log density; exponentiate to get the fitted pdf
ax2.plot(linspace, np.exp(gmm.score_samples(linspace.reshape(-1, 1))), 'r') # draw GMM
plt.show()

The output is: [figure: histogram of the samples with the fitted mixture density drawn in red]
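If the goal is the two peak positions and widths, they can be read directly off the fitted model. A short continuation of the sketch above, using the attribute names of scikit-learn's current GaussianMixture (the older mixture.GMM calls them means_, covars_ and weights_):

means = gmm.means_.ravel()                 # component means (roughly -5 and +5 for the toy data)
stds = np.sqrt(gmm.covariances_.ravel())   # 1-D data, so each covariance is a scalar variance
weights = gmm.weights_                     # mixing proportions

for m, s, w in zip(means, stds, weights):
    print("peak at %.2f, sigma %.2f, weight %.2f" % (m, s, w))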