Posts by Ani*_*nil

How do I find Spark's installation directory?

I want to change spark-env.sh. How do I find the installation directory on Ubuntu?

I looked through the UI but couldn't find anything.

whereis spark 

Result: spark: (i.e., whereis found nothing)

Here is the output of the locate command, locate spark:

/home/sys6002/.netbeans/8.0.2/apache-tomcat-8.0.15.0_base/temp/spark-ba1ea995-b959-43f4-ab6c-7d9f1ee5fcc1/blockmgr-db3a931b-7f1a-423e-b5da-b75a958a1909/11
/home/sys6002/.netbeans/8.0.2/apache-tomcat-8.0.15.0_base/temp/spark-ba1ea995-b959-43f4-ab6c-7d9f1ee5fcc1/blockmgr-db3a931b-7f1a-423e-b5da-b75a958a1909/13
/home/sys6002/.netbeans/8.0.2/apache-tomcat-8.0.15.0_base/temp/spark-ba1ea995-b959-43f4-ab6c-7d9f1ee5fcc1/httpd-16b4313e-72dc-4665-b4ac-df491869386d/files
/home/sys6002/.netbeans/8.0.2/apache-tomcat-8.0.15.0_base/temp/spark-ba1ea995-b959-43f4-ab6c-7d9f1ee5fcc1/httpd-16b4313e-72dc-4665-b4ac-df491869386d/jars
/home/sys6002/Desktop/diff spark hadoop.png
/home/sys6002/Desktop/sparkmain
/home/sys6002/Downloads/learning-spark-master.zip
/home/sys6002/Downloads/mongo-spark-master
/home/sys6002/Downloads/spark-1.5.1
/home/sys6002/Downloads/spark-1.5.1-bin-hadoop2.6
/home/sys6002/Downloads/spark-1.5.1-bin-hadoop2.6 (2)
/home/sys6002/Downloads/spark-1.5.1-bin-hadoop2.6.tgz
/home/sys6002/Downloads/spark-1.5.1-bin-without-hadoop
/home/sys6002/Downloads/spark-cassandra-connector-master
/home/sys6002/Downloads/spark-core_2.9.3-0.8.0-incubati
/home/sys6002/anaconda3/pkgs/odo-0.3.2-np19py34_0/lib/python3.4/site-packages/odo/backends/tests/__pycache__/test_sparksql.cpython-34.pyc
/home/sys6002/spark-example/a.txt
/home/sys6002/spark-example/a.txt~
/home/sys6002/spark-example/pom.xml
/home/sys6002/spark-example/pom.xml~
/home/sys6002/spark-example/src
/home/sys6002/spark-example/src/main
/home/sys6002/spark-example/src/test
/home/sys6002/spark-example/src/main/java
/home/sys6002/spark-example/src/main/java/com
/home/sys6002/spark-example/src/main/java/com/geekcap
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld/App.java
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld/WordCount.java~
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld/sparkexample
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld/sparkexample/WordCount.java
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld/sparkexample/WordCount.java~

/home/sys6002/spark-example/src/test/java/com/geekcap/javaworld/AppTest.java
/usr/share/app-install/desktop/lightspark:lightspark.desktop
/usr/share/app-install/desktop/sparkleshare:sparkleshare-invite-opener.desktop
/usr/share/app-install/desktop/sparkleshare:sparkleshare.desktop
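Since whereis comes back empty and locate mostly turns up downloads, temp files, and unrelated packages, a programmatic check can be quicker. Below is a minimal sketch (the class name SparkHomeCheck is hypothetical; it assumes the usual convention that the install location is exposed via the SPARK_HOME environment variable or the spark.home system property) that prints where spark-env.sh should live:

public class SparkHomeCheck {
    public static void main(String[] args) {
        // A typical Spark setup exports SPARK_HOME; some launchers set the
        // spark.home system property instead (assumption: one of the two
        // conventions is in use here).
        String home = System.getenv("SPARK_HOME");
        if (home == null) {
            home = System.getProperty("spark.home");
        }
        if (home == null) {
            System.out.println("Neither SPARK_HOME nor spark.home is set.");
        } else {
            System.out.println("Spark home:   " + home);
            System.out.println("spark-env.sh: " + home + "/conf/spark-env.sh");
        }
    }
}

Judging from the locate output above, the unpacked /home/sys6002/Downloads/spark-1.5.1-bin-hadoop2.6 directory is the most likely candidate; its conf/ subdirectory is where spark-env.sh (or spark-env.sh.template) would be.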

java ubuntu apache-spark

15 votes · 3 answers · 40k views

Apache Spark: union operation is not executed

I know that Spark uses lazy evaluation.

But is this the expected behavior? With the program below, the output is 20.

But if the print statement

  System.out.println("/////////////////// After "+MainRDD.count());

is uncommented, the output becomes 40.

I don't do this in my actual application; I wrote this program just to demonstrate the behavior.

SparkConf sparkConf = new SparkConf().setMaster("local").setAppName("JavaSparkSQL");
JavaSparkContext sc = new JavaSparkContext(sparkConf);

JavaRDD<Integer> MainRDD;
ArrayList<Integer> list = new ArrayList<>();
JavaRDD<Integer> tmp;
for (int i = 0; i < 20; i++) {
    list.add(i);
}

MainRDD = sc.parallelize(list); // MainRDD.union(tmp);
System.out.println("//////////////////////First " + MainRDD.count());

list.clear();
for (int i = 20; i < 25; i++) {
    for (int j = 1; j < 5; j++) {
        list.add(i * j);
    }
    tmp = sc.parallelize(list);

    // System.out.println("/////////////////// Before " + MainRDD.count());
    MainRDD = MainRDD.union(tmp);
    // System.out.println("/////////////////// …
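For what it's worth, the shifting counts are consistent with the documented note on SparkContext.parallelize: the RDD keeps a reference to the local collection, and mutating that collection before the first action changes what the RDD sees. Here is a minimal sketch of that effect in isolation (the class name LazyParallelizeDemo is hypothetical, and the behavior comments reflect what one would observe with Spark 1.x in local mode):

import java.util.ArrayList;
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class LazyParallelizeDemo {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setMaster("local").setAppName("LazyParallelizeDemo");
        JavaSparkContext sc = new JavaSparkContext(conf);

        List<Integer> list = new ArrayList<>();
        list.add(1);
        list.add(2);

        // parallelize() does not snapshot the collection up front; the RDD
        // keeps referring to it until an action computes its partitions.
        JavaRDD<Integer> rdd = sc.parallelize(list);

        list.add(3); // mutate AFTER parallelize, BEFORE any action

        // Prints 3, not 2: the mutation is visible to the RDD.
        System.out.println("First count:  " + rdd.count());

        list.add(4); // mutate AFTER the first action

        // Still prints 3: the first action materialized the partitions,
        // freezing their contents.
        System.out.println("Second count: " + rdd.count());

        sc.stop();
    }
}

The workaround suggested in the parallelize() docs is to pass a copy, e.g. sc.parallelize(new ArrayList<>(list)), so that each tmp RDD in the loop above gets its own snapshot of the list.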

java apache-spark

5 votes · 1 answer · 863 views

java.lang.NoSuchMethodError: org.apache.spark.ui.SparkUI.addStaticHandler(Ljava/lang/String;Ljava/lang/String;

I am running the following Java + Spark SQL example:

https://github.com/apache/spark/blob/master/examples/src/main/java/org/apache/spark/examples/sql/JavaSparkSQL.java

But I get this exception, even though there are no compile-time errors.

How can I avoid this? I couldn't find any information about this exception. Please help.

SparkConf sparkConf = new SparkConf().setMaster("local").setAppName("JavaSparkSQL");
JavaSparkContext ctx = new JavaSparkContext(sparkConf);
SQLContext sqlContext = new SQLContext(ctx);

Exception trace:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.ui.SparkUI.addStaticHandler(Ljava/lang/String;Ljava/lang/String;)V
    at org.apache.spark.sql.execution.ui.SQLTab.<init>(SQLTab.scala:36)
    at org.apache.spark.sql.SQLContext$$anonfun$1.apply(SQLContext.scala:79)
    at org.apache.spark.sql.SQLContext$$anonfun$1.apply(SQLContext.scala:79)
    at scala.Option.foreach(Option.scala:257)
    at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:79)
    at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:69)
    at org.sun.JavaSparkSQL.main(JavaSparkSQL.java:47)
2015-11-06 18:35:22,834 INFO  org.apache.spark.SparkContext.logInfo:59 - Invoking stop() from shutdown hook

pom.xml dependencies:

 <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-sql_2.11</artifactId>
     <version>1.5.1</version>
 </dependency>
 <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-core_2.11</artifactId>
     <version>1.4.0</version>
 </dependency>
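A NoSuchMethodError that appears only at runtime, with a clean compile, usually points at version skew on the classpath. Here spark-sql_2.11 is at 1.5.1 while spark-core_2.11 is at 1.4.0, so the 1.5.1 SQLContext calls a SparkUI method that the 1.4.0 core does not provide; aligning both artifacts on the same version would be the first thing to try. As a diagnostic, the sketch below (the class name ClasspathCheck is hypothetical; it uses only plain JDK reflection) prints which jar the JVM actually loaded SparkUI from:

import java.net.URL;
import java.security.CodeSource;

public class ClasspathCheck {
    public static void main(String[] args) throws Exception {
        // Load the class named in the NoSuchMethodError and report which
        // jar the JVM actually resolved it from.
        Class<?> ui = Class.forName("org.apache.spark.ui.SparkUI");
        CodeSource src = ui.getProtectionDomain().getCodeSource();
        URL location = (src == null) ? null : src.getLocation();
        System.out.println("SparkUI loaded from: " + location);
    }
}

If the printed path is a spark-core 1.4.0 jar while spark-sql resolves to 1.5.1, that confirms the mismatch.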

java apache-spark

0 votes · 1 answer · 4,361 views

Tag statistics

apache-spark ×3

java ×3

ubuntu ×1