Getting Spark 1.4.1 to run in IntelliJ IDEA 14

Gis*_*gen 1 scala intellij-idea maven apache-spark

I'm having trouble getting Spark 1.4.1 to work with Scala 2.11.7 in IntelliJ IDEA 14.1.4. First of all: I installed the source-code release. Should I have installed the version pre-built for Hadoop 2.4+ instead? What I did so far: I created a Maven project from the tgz file and saved it. Do I need to do more than that? The first lines of the pom.xml file are:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <parent>
    <groupId>org.apache</groupId>
    <artifactId>apache</artifactId>
    <version>14</version>
  </parent>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-parent_2.10</artifactId>
  <version>1.4.1</version>
  <packaging>pom</packaging>
  <name>Spark Project Parent POM</name>
  <url>http://spark.apache.org/</url>
  <licenses>
    <license>
      <name>Apache 2.0 License</name>
      <url>http://www.apache.org/licenses/LICENSE-2.0.html</url>
      <distribution>repo</distribution>
    </license>
  </licenses>
  <scm>
    <connection>scm:git:git@github.com:apache/spark.git</connection>
    <developerConnection>scm:git:https://git-wip-us.apache.org/repos/asf/spark.git</developerConnection>
    <url>scm:git:git@github.com:apache/spark.git</url>
    <tag>HEAD</tag>
  </scm>
  <developers>
    <developer>
      <id>matei</id>
      <name>Matei Zaharia</name>
      <email>matei.zaharia@gmail.com</email>
      <url>http://www.cs.berkeley.edu/~matei</url>
      <organization>Apache Software Foundation</organization>
      <organizationUrl>http://spark.apache.org</organizationUrl>
    </developer>
  </developers>

Then I tried to run Spark with a simple example; my build.sbt is:

name := "hello"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies += "org.apache.spark" % "spark-parent_2.10" % "1.4.1"
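Side note on this build.sbt (independent of the error below): spark-parent_2.10 is Spark's parent POM (packaging pom, as the quoted pom.xml shows), not a library jar, and its _2.10 suffix does not match the scalaVersion of 2.11.7 declared above. A build.sbt that instead depends on the Spark core library for the declared Scala version would look roughly like this (%% makes sbt append the Scala binary version to the artifact name):

name := "hello"
version := "1.0"
scalaVersion := "2.11.7"
// %% resolves this to spark-core_2.11 for a 2.11.x scalaVersion
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1"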

But I get this error:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/08/27 11:14:03 INFO SparkContext: Running Spark version 1.4.1
15/08/27 11:14:06 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/08/27 11:14:07 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: A master URL must be set in your configuration
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:368)
    at Hello$.main(Hello.scala:12)
    at Hello.main(Hello.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)
15/08/27 11:14:07 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: A master URL must be set in your configuration
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:368)
    at Hello$.main(Hello.scala:12)
    at Hello.main(Hello.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)
15/08/27 11:14:07 INFO Utils: Shutdown hook called

My first thought is that I need to install a pre-built version of Spark. If I download it, do I have to delete the other one and go through the same steps again? Or is the mistake somewhere else? Any help is much appreciated :D

Avi*_*mka 9

I think your problem is related to how the SparkContext is initialized in your code.

You need to set the master URL for the SparkContext to connect to. For example:

import org.apache.spark.{SparkConf, SparkContext}

val sparkConf = new SparkConf().setAppName("My Spark Job").setMaster("local")
val sparkContext = new SparkContext(sparkConf)

Where:

  • "local" - runs the job locally, inside a single JVM, rather than connecting to a cluster.
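Putting it together, here is a minimal sketch of what a working Hello.scala could look like (the object name comes from your stack trace; the RDD job itself is just an illustrative placeholder):

import org.apache.spark.{SparkConf, SparkContext}

object Hello {
  def main(args: Array[String]): Unit = {
    // "local[*]" runs Spark inside this JVM, one worker thread per CPU core.
    val conf = new SparkConf().setAppName("hello").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Trivial placeholder job: sum the numbers 1 to 100 across partitions.
    val total = sc.parallelize(1 to 100).reduce(_ + _)
    println("Total: " + total)

    sc.stop()
  }
}

Hard-coding setMaster is convenient for running straight from the IDE; for cluster deployments the master is usually supplied externally (e.g. via spark-submit --master) instead.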