
Is it possible to use json4s 3.2.11 with Spark 1.3.0?

Spark depends on json4s 3.2.10, but that version has several bugs and I need to use 3.2.11. I added the json4s-native 3.2.11 dependency in build.sbt and everything compiles fine. But when I spark-submit my JAR, it gives me 3.2.10.

build.sbt

import sbt.Keys._

name := "sparkapp"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core"  % "1.3.0" % "provided"

libraryDependencies += "org.json4s" %% "json4s-native" % "3.2.11"

plugins.sbt

logLevel := Level.Warn

resolvers += Resolver.url("artifactory", url("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")

App1.scala

import org.apache.spark.rdd.RDD
import org.apache.spark.{Logging, SparkConf, SparkContext}
import org.apache.spark.SparkContext._

object App1 extends Logging {
  def main(args: Array[String]) = {
    val conf = new SparkConf().setAppName("App1")
    val sc = new SparkContext(conf)
    println(s"json4s version: ${org.json4s.BuildInfo.version.toString}")
  }
}
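The usual cause here is that spark-submit puts Spark's own classpath (which bundles json4s 3.2.10) ahead of the application JAR, so the 3.2.11 classes in the fat JAR are shadowed. A common workaround is to shade json4s inside the assembly so its classes live under a different package and cannot conflict. This is a sketch only: the `assemblyShadeRules` API was introduced in sbt-assembly 0.14.0, so the plugin version in plugins.sbt would need to be bumped from 0.13.0, and the `shaded.json4s` target package name is an arbitrary choice.

```scala
// build.sbt (sketch, assumes sbt-assembly 0.14.0+):
// rename org.json4s classes inside the fat JAR so the application's
// json4s 3.2.11 cannot clash with Spark's bundled 3.2.10.
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("org.json4s.**" -> "shaded.json4s.@1").inAll
)
```

An alternative that avoids shading is to ask Spark to prefer the application's classes, e.g. `spark-submit --conf spark.driver.userClassPathFirst=true` (and `spark.executor.userClassPathFirst=true` for executors), though this setting is marked experimental and can cause other classpath conflicts.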

scala sbt json4s sbt-assembly apache-spark

5
Recommendations
1
Solution
1330
Views

Tag statistics

apache-spark ×1

json4s ×1

sbt ×1

sbt-assembly ×1

scala ×1