Spark hangs with "no tasks have started"

nnc*_*nnc 5 apache-spark rdd apache-spark-sql

I am a beginner running Spark in standalone mode. The job hangs at dataframe.count():

    SparkConf conf = new SparkConf();
    conf.set("spark.driver.allowMultipleContexts", "true");
    conf.set("spark.executor.memory", "10g");
    conf.set("spark.driver.maxResultSize", "10g");
    conf.set("spark.driver.memory", "10g");
    // Initialize SparkContext
    DataFrame dt = // load data from Redshift
    JavaRDD<String> rdd = sc.textFile(url);
    JavaPairRDD<String, String> pairRdd = rdd.mapToPair(SparkFunctionsImpl
                .strToMap());

   //dt.count()
   //pairrdd => map => collectAsMap()

The Spark job hangs at count() and collectAsMap() and does not make any progress from there.

It looks like rdd.collectAsMap and dataframe.count() are executing in parallel, and Spark hangs with no tasks in progress.
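For reference, a common cause of this symptom is firing two actions from separate threads or separate contexts, so that neither gets the cluster's task slots. A minimal sketch of running the actions sequentially on a single JavaSparkContext, assuming a simple comma-separated input file; the inline split lambda is a hypothetical stand-in for SparkFunctionsImpl.strToMap(), which is not shown in the question:

```java
import java.util.Map;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class SequentialActions {
    public static void main(String[] args) {
        // One driver, one context: allowMultipleContexts is off by default
        // and is best left off unless you have a concrete reason for it.
        SparkConf conf = new SparkConf()
                .setAppName("sequential-actions")
                .set("spark.driver.maxResultSize", "10g")
                .set("spark.executor.memory", "10g");
        JavaSparkContext sc = new JavaSparkContext(conf);

        JavaRDD<String> rdd = sc.textFile(args[0]);

        // Run the actions one after another instead of from parallel threads,
        // so each can claim executors and finish before the next starts.
        long count = rdd.count();

        JavaPairRDD<String, String> pairRdd =
                rdd.mapToPair(line -> {  // hypothetical stand-in for strToMap()
                    String[] parts = line.split(",", 2);
                    return new scala.Tuple2<>(parts[0],
                            parts.length > 1 ? parts[1] : "");
                });
        Map<String, String> asMap = pairRdd.collectAsMap();

        System.out.println("count=" + count + ", map size=" + asMap.size());
        sc.stop();
    }
}
```

Note also that in client mode `conf.set("spark.driver.memory", ...)` has no effect, because the driver JVM is already running by the time the code executes; driver memory has to be set on the spark-submit command line or in spark-defaults.conf.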