Asked by Abh*_*k P

spark-nlp: DocumentAssembler initialization fails with "java.lang.NoClassDefFoundError: org/apache/spark/ml/util/MLWritable$class"

I am trying out the ContextAwareSpellChecker described in https://medium.com/spark-nlp/applying-context-aware-spell-checking-in-spark-nlp-3c29c46963bc

The first component in the pipeline is a DocumentAssembler:

from sparknlp.annotator import *
from sparknlp.base import *
import sparknlp


spark = sparknlp.start()
documentAssembler = DocumentAssembler()\
    .setInputCol("text")\
    .setOutputCol("document")
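For context, a minimal sketch of how the remaining stages of the spell-checking pipeline from the article could be wired up (the tokenizer choice and the pretrained model name "spellcheck_dl" are assumptions, not taken from the question):

from sparknlp.annotator import RecursiveTokenizer, ContextSpellCheckerModel
from pyspark.ml import Pipeline

# Tokenizer feeding the spell checker (assumed setup)
tokenizer = RecursiveTokenizer() \
    .setInputCols(["document"]) \
    .setOutputCol("token")

# Pretrained context-aware spell checker; "spellcheck_dl" is the usual English model name
spellChecker = ContextSpellCheckerModel.pretrained("spellcheck_dl") \
    .setInputCols(["token"]) \
    .setOutputCol("checked")

pipeline = Pipeline(stages=[documentAssembler, tokenizer, spellChecker])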

The above code fails when run, with the following traceback:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Users\pab\AppData\Local\Continuum\anaconda3.7\envs\MailChecker\lib\site-packages\pyspark\__init__.py", line 110, in wrapper
    return func(self, **kwargs)
  File "C:\Users\pab\AppData\Local\Continuum\anaconda3.7\envs\MailChecker\lib\site-packages\sparknlp\base.py", line 148, in __init__
    super(DocumentAssembler, self).__init__(classname="com.johnsnowlabs.nlp.DocumentAssembler")
  File "C:\Users\pab\AppData\Local\Continuum\anaconda3.7\envs\MailChecker\lib\site-packages\pyspark\__init__.py", line 110, in wrapper
    return func(self, **kwargs)
  File "C:\Users\pab\AppData\Local\Continuum\anaconda3.7\envs\MailChecker\lib\site-packages\sparknlp\internal.py", line 72, in __init__
    self._java_obj = self._new_java_obj(classname, self.uid)
  File "C:\Users\pab\AppData\Local\Continuum\anaconda3.7\envs\MailChecker\lib\site-packages\pyspark\ml\wrapper.py", line 69, in _new_java_obj
    return …
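For reference, the missing class name ends in `$class`, the synthetic trait-implementation class that Scala 2.11 emits; this error therefore usually points to a Scala/Spark version mismatch between the installed PySpark and the spark-nlp jar. A minimal sketch of starting the session with an explicitly pinned spark-nlp artifact instead of sparknlp.start() (the Maven coordinate and version below are illustrative assumptions):

from pyspark.sql import SparkSession

# Build the session manually and pin a spark-nlp build that matches the local
# PySpark's Scala version (coordinate and version are illustrative).
spark = SparkSession.builder \
    .appName("SparkNLP") \
    .master("local[*]") \
    .config("spark.jars.packages", "com.johnsnowlabs.nlp:spark-nlp_2.11:2.5.5") \
    .getOrCreate()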

python apache-spark pyspark johnsnowlabs-spark-nlp

5 votes · 1 answer · 1674 views