Posts by use*_*805

Spark Shell dependency exception

My host system is Windows 10, I am running the Cloudera VM, and my Spark version is 1.6. I am trying to launch spark-shell with the following command:

spark-shell --packages org.apache.spark:spark-streaming-twitter_2.10:1.6.0

But it throws the following exception:

    :::: ERRORS
        Server access error at url https://repo1.maven.org/maven2/org/apache/spark/spark-streaming-twitter_2.10/1.6.0/spark-streaming-twitter_2.10-1.6.0.pom (javax.net.ssl.SSLException: Received fatal alert: protocol_version)
        Server access error at url https://repo1.maven.org/maven2/org/apache/spark/spark-streaming-twitter_2.10/1.6.0/spark-streaming-twitter_2.10-1.6.0.jar (javax.net.ssl.SSLException: Received fatal alert: protocol_version)

    :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS

    Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.spark#spark-streaming-twitter_2.10;1.6.0: not found]
        at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1067)
        at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:287)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:154)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
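The protocol_version alert is what Maven Central sends to clients that only offer TLS 1.0: repo1.maven.org requires TLS 1.2 or newer, while the Java 7 JDK that ships with older Cloudera quickstart VMs defaults to TLS 1.0 for client connections, so Ivy's download fails and the dependency is then reported as "not found". A common workaround (a sketch, assuming the VM's JVM is Java 7) is to force TLS 1.2 via the https.protocols system property; --driver-java-options reaches the right JVM because spark-shell runs in client mode:

    # Make the JVM that performs the Ivy/Maven download offer TLS 1.2
    spark-shell \
      --driver-java-options "-Dhttps.protocols=TLSv1.2" \
      --packages org.apache.spark:spark-streaming-twitter_2.10:1.6.0

Upgrading the VM to JDK 8, which negotiates TLS 1.2 by default, makes the flag unnecessary.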

scala apache-spark

Score: 1 · Answers: 1 · Views: 840

toDF does not work in the Spark Scala IDE but works fine in spark-shell

I am new to Spark, and I am trying to run the following code from both spark-shell and the Spark Scala Eclipse IDE.

When I run it from the shell, it works perfectly.

But in the IDE, it gives a compilation error. Please help.

    package sparkWCExample.spWCExample

    import org.apache.log4j.Level
    import org.apache.spark.sql.{ Dataset, SparkSession, DataFrame, Row }
    import org.apache.spark.sql.functions._
    import org.apache.spark.SparkContext
    import org.apache.spark.SparkConf
    import org.apache.spark.sql._

    object TwitterDatawithDataset {
      def main(args: Array[String]) {
        val conf = new SparkConf()
            .setAppName("Spark Scala WordCount Example")
            .setMaster("local[1]")
        val spark = SparkSession.builder()
            .config(conf)
            .appName("CsvExample")
            .master("local")
            .getOrCreate()
        val csvData = spark.sparkContext
            .textFile("C:\\Sankha\\Study\\data\\bank_data.csv", 3)

        val sqlContext = new org.apache.spark.sql.SQLContext(sc)
        import sqlContext.implicits._
        case class Bank(age: Int, job: String)
        val …
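The most common reason for this (an inference, since the post does not include the compiler message) is that the case class is declared inside main. toDF needs an implicit Encoder, which the compiler derives via a TypeTag, and it cannot generate a TypeTag for a class that is local to a method; spark-shell masks the problem because it compiles pasted definitions into wrapper objects. The snippet also references an undefined sc, which only exists as a prebuilt value in the shell. A minimal sketch of a version that compiles in an IDE project, with the case class moved to the top level and the split/parse logic a guess at the elided code:

    package sparkWCExample.spWCExample

    import org.apache.spark.sql.SparkSession

    // The case class must be top-level (outside main) so the compiler
    // can derive the TypeTag/Encoder that toDF requires.
    case class Bank(age: Int, job: String)

    object TwitterDatawithDataset {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("CsvExample")
          .master("local")
          .getOrCreate()
        import spark.implicits._ // brings toDF on RDDs into scope

        // Path and columns taken from the question; the parsing is illustrative.
        val bankDF = spark.sparkContext
          .textFile("C:\\Sankha\\Study\\data\\bank_data.csv", 3)
          .map(_.split(","))
          .map(cols => Bank(cols(0).trim.toInt, cols(1).trim))
          .toDF()

        bankDF.show()
        spark.stop()
      }
    }

With a SparkSession available there is also no need to construct a separate SQLContext; spark.implicits._ provides toDF directly.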

scala apache-spark

Score: -3 · Answers: 1 · Views: 331

Tag statistics

apache-spark ×2

scala ×2