NumPy exception when using MLlib even though NumPy is installed

pel*_*tor 6 python numpy apache-spark pyspark apache-spark-mllib

Here is the code I am trying to execute:

from pyspark.mllib.recommendation import ALS
iterations=5
lambdaALS=0.1
seed=5L
rank=8
model=ALS.train(trainingRDD,rank,iterations, lambda_=lambdaALS, seed=seed)

When I run `model=ALS.train(trainingRDD, rank, iterations, lambda_=lambdaALS, seed=seed)`, as soon as a NumPy-dependent command executes, the Py4J library that Spark uses throws the following message:

Py4JJavaError: An error occurred while calling o587.trainALSModel.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 67.0 failed 4 times, most recent failure: Lost task 0.3 in stage 67.0 (TID 195, 192.168.161.55): org.apache.spark.api.python.PythonException: Traceback (most recent call last):
  File "/home/platform/spark/python/lib/pyspark.zip/pyspark/worker.py", line 98, in main
    command = pickleSer._read_with_length(infile)
  File "/home/platform/spark/python/lib/pyspark.zip/pyspark/serializers.py", line 164, in _read_with_length
    return self.loads(obj)
  File "/home/platform/spark/python/lib/pyspark.zip/pyspark/serializers.py", line 421, in loads
    return pickle.loads(obj)
  File "/home/platform/spark/python/lib/pyspark.zip/pyspark/mllib/__init__.py", line 27, in <module>
Exception: MLlib requires NumPy 1.4+

NumPy 1.10 is installed on the machine named in the error message. Moreover, I get version 1.9.2 when I run the following directly in a Jupyter notebook: `import numpy; numpy.version.version`

Apparently I am running a NumPy version older than 1.4 somewhere, but I have no idea where. How can I tell which machine needs its NumPy updated?
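One way to locate the offending machine is to have every executor report its hostname and NumPy version. This is only a sketch: it assumes a live `SparkContext` named `sc`, and the helper name `worker_numpy_info` is mine, not part of any API:

```python
import socket

def worker_numpy_info(_):
    # Executed on the worker: import NumPy there and report where we are.
    import numpy
    yield (socket.gethostname(), numpy.version.version)

# With a live SparkContext `sc` (assumption), run many small tasks so every
# worker is hit, then collect the distinct (host, version) pairs:
# pairs = (sc.parallelize(range(100), 100)
#            .mapPartitions(worker_numpy_info)
#            .distinct()
#            .collect())
# print(pairs)  # e.g. [('worker-1', '1.9.2'), ('worker-2', '1.3.0')]
```

Any host reporting a version below 1.4 (or tripping the buggy check discussed in the answer below) is the one to fix.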

Ran*_*anP 16

This is a bug in the MLlib init code:

import numpy
if numpy.version.version < '1.4':
    raise Exception("MLlib requires NumPy 1.4+")

Because the check compares version strings lexicographically, `'1.10' < '1.4'` evaluates to `True`, so NumPy 1.10 is wrongly rejected. As a workaround, you can use NumPy 1.9.2.
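The failure mode is reproducible in plain Python: string comparison proceeds character by character, so `'1'` sorts before `'4'` at the third position and 1.10 looks "older" than 1.4:

```python
# Lexicographic string comparison: at the third character '1' < '4',
# so '1.10' sorts before '1.4' even though 1.10 is the newer release.
print('1.10' < '1.4')   # True  -- NumPy 1.10 fails the check
print('1.9.2' < '1.4')  # False -- which is why NumPy 1.9.2 passes
```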

If you must use NumPy 1.10 and don't want to upgrade to Spark 1.5.1, update the code manually: https://github.com/apache/spark/blob/master/python/pyspark/mllib/__init__.py
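If you do patch the file yourself, the check should compare numeric components rather than strings. A minimal sketch of such a check (the helper `version_tuple` is mine, not Spark's actual fix):

```python
import numpy

def version_tuple(v):
    # Keep only the leading numeric components: '1.10.0.dev0' -> (1, 10, 0).
    parts = []
    for p in v.split('.'):
        if not p.isdigit():
            break
        parts.append(int(p))
    return tuple(parts)

# Tuples compare element-wise, so (1, 10) >= (1, 4) as expected.
if version_tuple(numpy.version.version) < (1, 4):
    raise Exception("MLlib requires NumPy 1.4+")
```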