Spark type mismatch: cannot convert from JavaRDD<Object> to JavaRDD<String>

bac*_*ack 4 java java-8 apache-spark

I have started rewriting my PySpark application in Java. I am using Java 8 and have just begun writing some basic Spark programs in Java, starting with the following word-count example.

SparkConf conf = new SparkConf().setMaster("local").setAppName("Work Count App");

// Create a Java version of the Spark Context from the configuration
JavaSparkContext sc = new JavaSparkContext(conf);

JavaRDD<String> lines = sc.textFile(filename);

JavaPairRDD<String, Integer> counts = lines.flatMap(line -> Arrays.asList(line.split(" ")))
                    .mapToPair(word -> new Tuple2(word, 1))
                    .reduceByKey((x, y) -> (Integer) x + (Integer) y)
                    .sortByKey();

I get a Type mismatch: cannot convert from JavaRDD<Object> to JavaRDD<String> error on lines.flatMap(line -> Arrays.asList(line.split(" "))). When I searched on Google, all the Java 8-based Spark examples I found use the same implementation shown above. What is wrong with my environment or my program?

Can someone help me?

aba*_*hel 7

Use the code below. The actual problem is that rdd.flatMap expects a function returning an Iterator<String> (this is the FlatMapFunction signature since Spark 2.0), while your code produces a List<String>. Calling iterator() on the list resolves the issue.

JavaPairRDD<String, Integer> counts = lines.flatMap(line -> Arrays.asList(line.split(" ")).iterator())
            .mapToPair(word -> new Tuple2<String, Integer>(word, 1))
            .reduceByKey((x, y) ->  x +  y)
            .sortByKey();

counts.foreach(data -> {
        System.out.println(data._1()+"-"+data._2());
    });
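To see the contract in isolation, here is a minimal plain-Java sketch (no Spark dependency; the class and method names are chosen for illustration) of the shape that Spark 2.x's flatMap lambda must have: it takes one input record and returns an Iterator over zero or more output records.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

public class FlatMapSignatureDemo {

    // Mirrors what the lambda passed to JavaRDD.flatMap must produce:
    // an Iterator<String>, not a List<String>. Returning the List itself
    // is what causes the JavaRDD<Object> type-mismatch in the question.
    static Iterator<String> splitWords(String line) {
        return Arrays.asList(line.split(" ")).iterator();
    }

    public static void main(String[] args) {
        // Drain the iterator to show the words it yields for one line.
        Iterator<String> it = splitWords("hello spark world");
        List<String> words = new ArrayList<>();
        it.forEachRemaining(words::add);
        System.out.println(words); // prints [hello, spark, world]
    }
}
```

The distinction matters because Spark 1.x's FlatMapFunction returned Iterable<T>, so a List worked there; in Spark 2.x the return type changed to Iterator<T>, which is why older examples found online no longer compile unchanged.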