Post by don*_*ald

Silhouette analysis with Gaussian mixtures

I am doing silhouette analysis with GaussianMixture. I tried adapting the similar code from the scikit-learn website, but I get a strange error:

---> 82     centers = clusterer.cluster_centers_
     83     # Draw white circles at cluster centers
     84     ax2.scatter(centers[:, 0], centers[:, 1], marker='o',

AttributeError: 'GaussianMixture' object has no attribute 'cluster_centers_'

from sklearn.metrics import silhouette_samples, silhouette_score

import matplotlib.pyplot as plt
import matplotlib.cm as cm
import numpy as np

print(__doc__)

X = reduced_data.values
range_n_clusters = [2, 3, 4, 5, 6]

for n_clusters in range_n_clusters:
    # Create a subplot with 1 row and 2 columns
    fig, (ax1, ax2) = plt.subplots(1, 2)
    fig.set_size_inches(18, 7)

    # The 1st subplot is the silhouette plot
    # The silhouette coefficient can range from -1, 1 but in this example …
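The fix is a one-attribute substitution: cluster_centers_ belongs to KMeans, while GaussianMixture stores its fitted component means in means_. A minimal sketch of the change, assuming X is the 2-D array from the question (a random array stands in for reduced_data.values here):

import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.metrics import silhouette_score

# Stand-in for reduced_data.values; any (n_samples, 2) array works.
X = np.random.rand(500, 2)

n_clusters = 4
clusterer = GaussianMixture(n_components=n_clusters, random_state=10)
cluster_labels = clusterer.fit_predict(X)

# Silhouette analysis works the same on GaussianMixture labels as on KMeans labels.
silhouette_avg = silhouette_score(X, cluster_labels)
print("For n_clusters =", n_clusters, "the average silhouette score is", silhouette_avg)

# GaussianMixture exposes component means as `means_`, not `cluster_centers_`.
centers = clusterer.means_  # shape (n_components, n_features)
# ax2.scatter(centers[:, 0], centers[:, 1], marker='o', ...) then works unchanged.

With centers = clusterer.means_ in place of the failing line, the rest of the scikit-learn silhouette example needs no other change.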

python cluster-analysis machine-learning scikit-learn data-science

1 vote · 1 answer · 1264 views

sbt assembly merge issue [deduplicate: different file contents found in the following]

I have followed the other sbt-assembly merge questions on Stack Overflow and added merge strategies, but the problem is still not resolved. I added the dependency-tree plugin, but it does not show the dependencies of transitive libraries. I am already using sbt-assembly's latest merge strategy, yet this duplicate-content error persists.

build.sbt:

import sbtassembly.Log4j2MergeStrategy

name := ""
organization := "" // change to your org
version := "0.1"

scalaVersion := "2.11.8"
val sparkVersion = "2.1.1"

resolvers += "jitpack" at "https://jitpack.io"
resolvers += "bintray-spark-packages" at "https://dl.bintray.com/spark-packages/maven/"

resolvers += Resolver.url("artifactory", url("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)

resolvers += Resolver.url("bintray-sbt-plugins", url("http://dl.bintray.com/sbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)

resolvers += Resolver.typesafeRepo("releases")

//addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.6")



libraryDependencies ++= Seq(
  ("org.apache.spark" %% "spark-core" % "2.1.1" %"provided").
    exclude("commons-beanutils", "commons-beanutils-core").
    exclude("commons-collections", "commons-collections").
    exclude("commons-logging", "commons-logging").
    exclude("com.esotericsoftware.minlog", "minlog"),
  ("org.apache.spark" %% "spark-hive" % "2.1.1" %"provided").
    exclude("commons-beanutils", …
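For reference, a minimal sketch of an explicit merge strategy that typically clears "deduplicate: different file contents found" errors when building Spark fat JARs (sbt-assembly 0.14.x syntax; the path patterns here are assumptions and should be matched to the files your own error output lists):

// Hedged sketch for build.sbt, not the definitive fix: route the common
// offenders to an explicit strategy and fall back to the default otherwise.
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard // duplicate manifests/signatures
  case "reference.conf"              => MergeStrategy.concat  // Typesafe Config files must be concatenated
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}

Since this build already imports sbtassembly.Log4j2MergeStrategy, a dedicated case for the META-INF Log4j2Plugins.dat files presumably belongs above the blanket META-INF discard, using that plugin's strategy; after adjusting the cases, rebuild with sbt assembly.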

scala sbt sbt-assembly apache-spark sbt-plugin

1 vote · 1 answer · 2601 views