Post by use*_*607

ValueError: Cannot take the length of Shape with unknown rank

I am trying to move our input pipeline to the TensorFlow Dataset API. To do this, we have converted our images and labels into tfrecords. We then read the tfrecords back through the Dataset API and compare the original data with the data that was read, to check that they match. So far so good. Below is the code that reads the tfrecords into a dataset:

def _parse_function2(proto):
    # define your tfrecord again. Remember that you saved your image as a string.
    keys_to_features = {"im_path": tf.FixedLenSequenceFeature([], tf.string, allow_missing=True),
                        "im_shape": tf.FixedLenSequenceFeature([], tf.int64, allow_missing=True),
                        "score_shape": tf.FixedLenSequenceFeature([], tf.int64, allow_missing=True),
                        "geo_shape": tf.FixedLenSequenceFeature([], tf.int64, allow_missing=True),
                        "im_patches": tf.FixedLenSequenceFeature([], tf.string, allow_missing=True),
                        "score_patches": tf.FixedLenSequenceFeature([], tf.string, allow_missing=True),
                        "geo_patches": tf.FixedLenSequenceFeature([], tf.string, allow_missing=True),
                        }

    # Load one example
    parsed_features = tf.parse_single_example(serialized=proto, features=keys_to_features)

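    # Take the first element of each parsed string sequence (each was presumably stored as a single serialized tensor).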
    parsed_features['im_patches'] = parsed_features['im_patches'][0]
    parsed_features['score_patches'] = parsed_features['score_patches'][0]
    parsed_features['geo_patches'] = parsed_features['geo_patches'][0]

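    # Decode the raw bytes back to uint8 and restore the image shape stored in 'im_shape'.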
    parsed_features['im_patches'] = tf.decode_raw(parsed_features['im_patches'], tf.uint8)
    parsed_features['im_patches'] = tf.reshape(parsed_features['im_patches'], parsed_features['im_shape']) …
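
For context, a minimal sketch of how a parse function like this is typically wired into a tf.data pipeline, in the TF 1.x style used above. The file name "patches.tfrecords" and the session loop are illustrative assumptions, not part of the original question:

import tensorflow as tf

# Hypothetical TFRecord file, for illustration only.
filenames = ["patches.tfrecords"]

# Read serialized examples and map the parse function defined above.
dataset = tf.data.TFRecordDataset(filenames)
dataset = dataset.map(_parse_function2)

# TF 1.x one-shot iterator, matching the tf.parse_single_example API used above.
iterator = dataset.make_one_shot_iterator()
next_element = iterator.get_next()

with tf.Session() as sess:
    parsed = sess.run(next_element)  # dict of decoded tensors for one record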

python deep-learning keras tensorflow

Votes: 5 · Answers: 1 · Views: 7726

Apache Spark 2.1: java.lang.UnsupportedOperationException: No Encoder found for scala.collection.immutable.Set[String]

I am using Spark 2.1.1 with Scala 2.11.6 and I am getting the following error. I am not using any case classes.

java.lang.UnsupportedOperationException: No Encoder found for scala.collection.immutable.Set[String]
 field (class: "scala.collection.immutable.Set", name: "_2")
 field (class: "scala.Tuple2", name: "_2")
 root class: "scala.Tuple2"

The following code section is where the stack trace points:

val tweetArrayRDD = nameDF.select("namedEnts", "text", "storylines")
    .flatMap {
    case Row(namedEnts: Traversable[(String, String)], text: String, storylines: Traversable[String]) =>
      Option(namedEnts) match {
        case Some(x: Traversable[(String, String)]) =>
          //println("In flatMap:" + x + " ~~&~~ " + text + " ~~&~~ " + storylines)
          namedEnts.map((_, (text, storylines.toSet)))
        case _ => //println("In flatMap: blahhhh")
          Traversable()
      }
    case _ => //println("In flatMap: …

scala apache-spark apache-spark-encoders

Votes: 2 · Answers: 1 · Views: 5394