I get this error when I start an application that computes the average value per key. I use the combineByKey function with Java 8 lambda expressions. I read a file whose records contain three fields (key, time, float value). Both my workers and my master run Java 8.
16/05/06 15:48:23 INFO DAGScheduler: ShuffleMapStage 0 (mapToPair at ProcesarFichero.java:115) failed in 3.774 s
16/05/06 15:48:23 INFO DAGScheduler: Job 0 failed: saveAsTextFile at ProcesarFichero.java:153, took 3.950483 s
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 0.0 failed 4 times, most recent failure: Lost task 1.3 in stage 0.0 (TID 5, mcava-slave0): java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to …
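For context, here is a minimal sketch of the per-key-average pattern described above. It is only an illustration, not the original ProcesarFichero code; the comma separator, the field order, and all class and variable names are assumptions.

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import scala.Tuple2;

public class AverageByKey {
    public static void main(String[] args) {
        JavaSparkContext sc = new JavaSparkContext(new SparkConf().setAppName("AverageByKey"));

        // One record per line: key,time,value (separator and field order are assumptions)
        JavaRDD<String> lines = sc.textFile(args[0]);

        JavaPairRDD<String, Float> pairs = lines.mapToPair(line -> {
            String[] f = line.split(",");
            return new Tuple2<>(f[0], Float.parseFloat(f[2]));
        });

        // combineByKey accumulates a (sum, count) pair per key...
        JavaPairRDD<String, Tuple2<Float, Integer>> sumCount = pairs.combineByKey(
                v -> new Tuple2<>(v, 1),                                   // createCombiner
                (acc, v) -> new Tuple2<>(acc._1() + v, acc._2() + 1),      // mergeValue
                (a, b) -> new Tuple2<>(a._1() + b._1(), a._2() + b._2())); // mergeCombiners

        // ...and mapValues turns each (sum, count) pair into the average for that key.
        JavaPairRDD<String, Float> averages = sumCount.mapValues(t -> t._1() / t._2());

        averages.saveAsTextFile(args[1]);
        sc.stop();
    }
}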
I am not able to download a quality profile from my Sonar server, and it is also not clear to me how to update the profile locally if something changes on the server. The documentation says:

Synchronize rules, issues and exclusions. First configure the connection through the user settings (SonarLint section), then bind the project in the workspace settings. If the configuration changes on the server side, you can trigger an update of the local copy with the Update SonarLint binding to SonarQube/SonarCloud command on the command palette.
But I don't know which command I have to use to update the binding.
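For what it is worth, a hedged sketch of what that two-step configuration might look like in VS Code. The setting keys vary between SonarLint versions, so treat the names and structure below as assumptions rather than the documented schema.

// User settings (settings.json): declare the connection to the server (assumed key names)
{
    "sonarlint.connectedMode.connections.sonarqube": [
        { "connectionId": "mySonarQube", "serverUrl": "https://sonarqube.example.com", "token": "<user token>" }
    ]
}

// Workspace settings: bind the open project to a project key on that server (assumed key names)
{
    "sonarlint.connectedMode.project": {
        "connectionId": "mySonarQube",
        "projectKey": "<project key on the server>"
    }
}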
I don't know how to launch a Java application with spark-submit.
When I run the following command:
spark@mcava-master:/home/miren/NetBeansProjects$ /opt/spark/bin/spark-submit --class /home/miren/NetBeansProjects/SparkExample/src/main/java/com/mycompany/sparkexample/CountWords.java --master spark://192.168.1.105:7077 /home/miren/NetBeansProjects/SparkExample/target/SparkExample-1.0-SNAPSHOT.jar spark://192.168.1.105:7077
I get an exception:
java.lang.ClassNotFoundException: /home/miren/NetBeansProjects/SparkExample/src/main/java/com/mycompany/sparkexample/CountWords.java
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:174)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:689)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
How do I specify the classpath correctly?
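For comparison, a sketch of the same submission with --class given a fully qualified class name instead of a path to the .java source file. The package com.mycompany.sparkexample is inferred from the source path above, so it is an assumption; the rest is taken from the original command.

/opt/spark/bin/spark-submit \
  --class com.mycompany.sparkexample.CountWords \
  --master spark://192.168.1.105:7077 \
  /home/miren/NetBeansProjects/SparkExample/target/SparkExample-1.0-SNAPSHOT.jar \
  spark://192.168.1.105:7077

spark-submit resolves --class via Class.forName against the classes packaged in the application jar, which is why a filesystem path to a .java file cannot be found; anything after the jar (such as the trailing master URL here) is passed to the application's main method as an argument.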
16/04/26 16:58:46 DEBUG ProtobufRpcEngine: Call: complete took 3ms
Exception in thread "main" java.lang.NoClassDefFoundError: com/datastax/spark/connector/japi/CassandraJavaUtil
at com.baitic.mcava.lecturahdfssaveincassandra.TratamientoCSV.main(TratamientoCSV.java:123)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.datastax.spark.connector.japi.CassandraJavaUtil
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 10 more
16/04/26 16:58:46 INFO SparkContext: Invoking stop() from shutdown hook
16/04/26 16:58:46 INFO SparkUI: Stopped Spark web UI at http://10.128.0.5:4040
16/04/26 16:58:46 INFO SparkDeploySchedulerBackend: Shutting down all executors
16/04/26 16:58:46 INFO SparkDeploySchedulerBackend: Asking …
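The missing com.datastax.spark.connector.japi.CassandraJavaUtil class ships with the DataStax Spark Cassandra Connector, so the connector has to be on both the driver and executor classpath at submit time. A hedged sketch of one way to do that is below; the connector coordinates and version are assumptions that must be matched to the Spark and Scala versions in use, and the application jar path is a placeholder.

/opt/spark/bin/spark-submit \
  --class com.baitic.mcava.lecturahdfssaveincassandra.TratamientoCSV \
  --master <master-url> \
  --packages com.datastax.spark:spark-cassandra-connector_2.10:1.6.0 \
  /path/to/application.jar

Alternatively, --jars can point at a locally available connector jar, or the connector can be shaded into the application jar at build time.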