Warning: uncondensed distance matrix in Python

AKP*_*AKP 4 python machine-learning hierarchical-clustering scikit-learn

I'm trying to make the dendrogram associated with agglomerative hierarchical clustering, and I need the distance matrix. I started with:

import numpy as np 
import pandas as pd
from scipy import ndimage 
from scipy.cluster import hierarchy 
from scipy.spatial import distance_matrix 
from matplotlib import pyplot as plt 
from sklearn import manifold, datasets 
from sklearn.cluster import AgglomerativeClustering 
from sklearn.datasets.samples_generator import make_blobs 
%matplotlib inline
X1, y1 = make_blobs(n_samples=50, centers=[[4,4], [-2, -1], [1, 1], [10,4]], cluster_std=0.9)
plt.scatter(X1[:, 0], X1[:, 1], marker='o') 
agglom = AgglomerativeClustering(n_clusters = 4, linkage = 'average')
agglom.fit(X1,y1)
# Create a figure of size 6 inches by 4 inches.
plt.figure(figsize=(6,4))

# These two lines of code are used to scale the data points down,
# Or else the data points will be scattered very far apart.

# Create a minimum and maximum range of X1.
x_min, x_max = np.min(X1, axis=0), np.max(X1, axis=0)

# Get the average distance for X1.
X1 = (X1 - x_min) / (x_max - x_min)

# This loop displays all of the datapoints.
for i in range(X1.shape[0]):
    # Replace each data point with its label (e.g. 0),
    # color coded with a colormap (plt.cm.nipy_spectral)
    plt.text(X1[i, 0], X1[i, 1], str(y1[i]),
             color=plt.cm.nipy_spectral(agglom.labels_[i] / 10.),
             fontdict={'weight': 'bold', 'size': 9})

# Remove the x ticks, y ticks, x and y axis
plt.xticks([])
plt.yticks([])
#plt.axis('off')



# Display the plot of the original data before clustering
plt.scatter(X1[:, 0], X1[:, 1], marker='.')
# Display the plot
plt.show()
dist_matrix = distance_matrix(X1,X1) 
print(dist_matrix)

When I then run this, I get a warning:

Z = hierarchy.linkage(dist_matrix, 'complete')

/home/jupyterlab/conda/envs/python/lib/python3.6/site-packages/ipykernel_launcher.py:1: ClusterWarning: scipy.cluster: The symmetric non-negative hollow observation matrix looks suspiciously like an uncondensed distance matrix
  """Entry point for launching an IPython kernel.

First, what does this mean, and how do I fix it? Thanks

aha*_*gen 7

scipy.cluster.hierarchy.linkage expects a condensed distance matrix, not a square/uncondensed one. You have computed the square distance matrix and need to convert it to condensed form; I suggest scipy.spatial.distance.squareform. The following snippet reproduces your steps (plotting omitted for brevity) without the warning.

from sklearn.cluster import AgglomerativeClustering 
from sklearn.datasets import make_blobs
from scipy.spatial import distance_matrix
from scipy.cluster import hierarchy
from scipy.spatial.distance import squareform

X1, y1 = make_blobs(n_samples=50, centers=[[4,4],
                                           [-2, -1],
                                           [1, 1],
                                           [10,4]], cluster_std=0.9)

agglom = AgglomerativeClustering(n_clusters = 4, linkage = 'average')
agglom.fit(X1,y1)

dist_matrix = distance_matrix(X1, X1)            # square (50, 50) matrix
print(dist_matrix.shape)
condensed_dist_matrix = squareform(dist_matrix)  # condensed vector of length 50*49/2
print(condensed_dist_matrix.shape)
Z = hierarchy.linkage(condensed_dist_matrix, 'complete')
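To finish the dendrogram the question was aiming for, you can pass the resulting linkage matrix to scipy.cluster.hierarchy.dendrogram. A minimal sketch, assuming Z and the hierarchy import from the snippet above are still in scope:

from matplotlib import pyplot as plt

plt.figure(figsize=(6, 4))
# dendrogram() draws the hierarchical merge tree encoded in the linkage matrix Z
hierarchy.dendrogram(Z)
plt.xlabel('sample index')
plt.ylabel('distance')
plt.show()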

  • Since a distance matrix is usually computed between a set of points and itself (i.e. every pair of elements in the set), it is symmetric and its diagonal is all zeros. The squareform/condensed representation stores only the upper triangle, which saves memory, while the uncondensed form stores the full matrix. Does that help? (3 upvotes)
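As a small illustration of the comment above (not part of the original answer): squareform converts between the two representations, a square n x n symmetric matrix with a zero diagonal and a condensed vector of length n*(n-1)/2 holding only the upper triangle. scipy.spatial.distance.pdist produces the condensed form directly, so it can replace the distance_matrix + squareform pair:

import numpy as np
from scipy.spatial.distance import pdist, squareform

points = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 4.0]])

square = squareform(pdist(points))  # 3 x 3 symmetric matrix, zero diagonal
condensed = squareform(square)      # back to a length-3 vector (n*(n-1)/2 pairs)

print(square.shape)   # (3, 3)
print(condensed)      # [3. 4. 5.] -- pairwise Euclidean distances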