Posted by Roc*_*kan

How do I convert a HashMap into a JavaPairRDD in Spark?

I am new to Apache Spark. I am trying to create a JavaPairRDD from a HashMap of type HashMap<String, HashMap<Integer, String>>. How can I convert it into a JavaPairRDD? I have pasted my code below:

HashMap<String, HashMap<Integer, String>> canlist =
    new HashMap<String, HashMap<Integer, String>>();

for (String key : entityKey) {
    HashMap<Integer, String> clkey = new HashMap<Integer, String>();
    int f = 0;
    for (String val : mentionKey) {
        // do something
        simiscore = (longerLength - costs[m.length()]) / (double) longerLength;

        if (simiscore > 0.6) {
            clkey.put(v1, val);
            System.out.print(
                " The mention " + val + " added to link entity " + key);
        }
        f++;
        System.out.println("Scan Completed");
    }
    canlist.put(key, clkey);
    JavaPairRDD<String, HashMap<Integer, String>> …
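Spark has no constructor that takes a `HashMap` directly, but `JavaSparkContext.parallelizePairs` accepts a `List<Tuple2<K, V>>`, so the map's entries can be flattened into tuples first. Below is a minimal sketch of that approach; the sample map contents and the class name `MapToPairRDD` are illustrative, not from the original code:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;

import scala.Tuple2;

public class MapToPairRDD {
    public static void main(String[] args) {
        // Example data standing in for the real "canlist" map
        HashMap<String, HashMap<Integer, String>> canlist = new HashMap<>();
        HashMap<Integer, String> inner = new HashMap<>();
        inner.put(0, "some mention");
        canlist.put("some entity", inner);

        // Flatten each map entry into a Tuple2 pair
        List<Tuple2<String, HashMap<Integer, String>>> pairs = new ArrayList<>();
        for (Map.Entry<String, HashMap<Integer, String>> e : canlist.entrySet()) {
            pairs.add(new Tuple2<>(e.getKey(), e.getValue()));
        }

        SparkConf conf = new SparkConf()
            .setAppName("MapToPairRDD")
            .setMaster("local[*]");
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            // One RDD element per map entry
            JavaPairRDD<String, HashMap<Integer, String>> rdd =
                sc.parallelizePairs(pairs);
            System.out.println(rdd.count());
        }
    }
}
```

Note that the conversion loop runs on the driver, so this only makes sense when the map fits in driver memory; otherwise the data should be loaded as an RDD from the start.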

java apache-spark

2 votes · 1 answer · 7782 views
