Gartner

Published: 2024-09-12 23:41

Produced by Big Data Digest (大数据文摘)

Over the past several months, a machine learning tutorial titled "The Hundred-Page Machine Learning Book" has taken off. Its author, Andriy Burkov, holds a PhD in artificial intelligence and leads a machine learning team at Gartner. Excluding the table of contents, the book runs just 128 pages, yet it distills a wealth of practically useful machine learning material.

According to the author, newcomers to machine learning will find enough detail here to gain a real feel for the field, while experienced practitioners and researchers can use the book as a guide to further study.

What does the book cover?

The book is organized into two broad parts. After covering the fundamentals of machine learning, it devotes eight chapters to supervised learning and three more to unsupervised learning and other forms of learning.

The table of contents: [image in the original post]

The example code is open source

All of the code the book touches on has been open-sourced on GitHub.

That means you can read along and learn by running the open-source examples one at a time. The code, it must be said, is extremely beginner-friendly, with every step worked out in detail.

Take the Gaussian mixture model (Gaussian Mixture Model, GMM): Section 9.2.4 of the book gives a detailed walkthrough.
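For readers without the book at hand, here is a sketch of the expectation-maximization (EM) updates that the code below implements, written in the notation of the code: $b_i^{(j)}$ is the "responsibility" of component $j$ for training example $x_i$, $\phi_j$ are the mixture weights, and $N$ is the number of training examples.

$$b_i^{(j)} = \frac{f(x_i \mid \mu_j, \sigma_j^2)\,\phi_j}{\sum_{k=1}^{2} f(x_i \mid \mu_k, \sigma_k^2)\,\phi_k} \quad \text{(E-step)}$$

$$\mu_j = \frac{\sum_{i=1}^{N} b_i^{(j)} x_i}{\sum_{i=1}^{N} b_i^{(j)}}, \qquad \sigma_j^2 = \frac{\sum_{i=1}^{N} b_i^{(j)} (x_i - \mu_j)^2}{\sum_{i=1}^{N} b_i^{(j)}}, \qquad \phi_j = \frac{1}{N}\sum_{i=1}^{N} b_i^{(j)} \quad \text{(M-step)}$$

Alternating the two steps until the estimates stop changing yields the maximum-likelihood parameters of the mixture.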

GitHub hosts the corresponding, fully detailed code:

import math

import numpy as np
import scipy.stats
import matplotlib
import matplotlib.pyplot as plt

matplotlib.rcParams['mathtext.fontset'] = 'stix'
matplotlib.rcParams['font.family'] = 'STIXGeneral'
matplotlib.rcParams.update({'font.size': 18})

# True parameters of the two mixture components (sigma1/sigma2 are variances)
mu1, sigma1 = 3.0, 1.0
mu2, sigma2 = 8.0, 3.5

def sample_points():
    # Draw 50 points from each of the two true Gaussians
    s1 = np.random.normal(mu1, math.sqrt(sigma1), 50)
    s2 = np.random.normal(mu2, math.sqrt(sigma2), 50)
    return list(s1) + list(s2)

def compute_bi(mu1local, sigma1local, mu2local, sigma2local, phi1local, phi2local):
    # E-step: responsibility of the first component for each training example
    bis = []
    for xi in x:
        num = scipy.stats.norm.pdf(xi, mu1local, math.sqrt(sigma1local)) * phi1local
        den = num + scipy.stats.norm.pdf(xi, mu2local, math.sqrt(sigma2local)) * phi2local
        bis.append(num / den)
    return bis

# generate points used to plot the pdf curves
x_plot = np.linspace(-2, 12, 100)

# generate the training examples
x = sample_points()

colors = ['red', 'blue', 'orange', 'green']
lw = 2

# Initial guesses for the parameters to be estimated
mu1_estimate, mu2_estimate = 1.0, 2.0
sigma1_estimate, sigma2_estimate = 1.0, 2.0
phi1_estimate, phi2_estimate = 0.5, 0.5

count = 0
while True:
    # Plot the current estimated pdfs against the true ones
    plt.figure(count)
    axes = plt.gca()
    axes.set_xlim([-2, 12])
    axes.set_ylim([0, 0.8])
    plt.xlabel("$x$")
    plt.ylabel("pdf")
    plt.scatter(x, [0.005] * len(x), color='navy', s=30, marker=2,
                label="training examples")
    plt.plot(x_plot, [scipy.stats.norm.pdf(xp, mu1_estimate, math.sqrt(sigma1_estimate))
                      for xp in x_plot], color=colors[1], linewidth=lw,
             label="$f(x_i \\mid \\mu_1 ,\\sigma_1^2)$")
    plt.plot(x_plot, [scipy.stats.norm.pdf(xp, mu2_estimate, math.sqrt(sigma2_estimate))
                      for xp in x_plot], color=colors[3], linewidth=lw,
             label="$f(x_i \\mid \\mu_2 ,\\sigma_2^2)$")
    plt.plot(x_plot, [scipy.stats.norm.pdf(xp, mu1, math.sqrt(sigma1))
                      for xp in x_plot], color=colors[0], label="true pdf")
    plt.plot(x_plot, [scipy.stats.norm.pdf(xp, mu2, math.sqrt(sigma2))
                      for xp in x_plot], color=colors[0])
    plt.legend(loc='upper right')
    plt.tight_layout()

    # Save one frame per EM iteration (paths assume the repo's directory layout)
    fig1 = plt.gcf()
    fig1.subplots_adjust(top=0.98, bottom=0.1, right=0.98, left=0.08,
                         hspace=0, wspace=0)
    for ext in ['eps', 'pdf', 'png']:
        fig1.savefig('../../Illustrations/gaussian-mixture-model-' + str(count) + '.' + ext,
                     dpi=1000, bbox_inches='tight', pad_inches=0)
    # plt.show()

    # E-step: responsibilities of each component for each training example
    bis1 = compute_bi(mu1_estimate, sigma1_estimate, mu2_estimate,
                      sigma2_estimate, phi1_estimate, phi2_estimate)
    bis2 = compute_bi(mu2_estimate, sigma2_estimate, mu1_estimate,
                      sigma1_estimate, phi2_estimate, phi1_estimate)

    # M-step: responsibility-weighted updates of means, variances and weights
    mu1_estimate = sum(bis1[i] * x[i] for i in range(len(x))) / sum(bis1)
    mu2_estimate = sum(bis2[i] * x[i] for i in range(len(x))) / sum(bis2)
    sigma1_estimate = sum(bis1[i] * (x[i] - mu1_estimate) ** 2 for i in range(len(x))) / sum(bis1)
    sigma2_estimate = sum(bis2[i] * (x[i] - mu2_estimate) ** 2 for i in range(len(x))) / sum(bis2)
    phi1_estimate = sum(bis1) / float(len(x))
    phi2_estimate = 1.0 - phi1_estimate

    print(phi1_estimate)

    plt.close(count)  # close the current figure so figures don't pile up
    count += 1
    if count > 50:
        break
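The loop above spells EM out by hand, which is exactly what makes it instructive. As a quick cross-check (a minimal sketch of our own, not part of the book's repository), the same two-component mixture can be fitted with scikit-learn's GaussianMixture, which runs EM internally:

import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# The same synthetic data as above: two Gaussians with variances 1.0 and 3.5
s1 = rng.normal(3.0, np.sqrt(1.0), 50)
s2 = rng.normal(8.0, np.sqrt(3.5), 50)
X = np.concatenate([s1, s2]).reshape(-1, 1)  # scikit-learn expects 2-D input

gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
print(gmm.means_.ravel())        # should land near the true means 3.0 and 8.0
print(gmm.covariances_.ravel())  # estimated component variances
print(gmm.weights_)              # mixture weights, near 0.5 each

If the hand-written loop is correct, its final estimates and GaussianMixture's output should agree up to random sampling noise.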

Links

Link:

?id=start

Code:

https://github.com/aburkov/theMLbook

Of course, Big Data Digest has also packaged the materials and code: reply "100页" (100 pages) to the Big Data Digest WeChat official account to get them, and start learning machine learning right away!
