"Not enough space to cache RDD in memory" warning

Eda*_*ame 4 amazon-s3 amazon-web-services apache-spark rdd

I am running a Spark job and I keep getting a "Not enough space to cache rdd_128_17000 in memory" warning. However, the attached screenshot clearly shows only 90.8 G out of 719.3 G used. Why is that? Thanks!


15/10/16 02:19:41 WARN storage.MemoryStore: Not enough space to cache rdd_128_17000 in memory! (computed 21.4 GB so far)
15/10/16 02:19:41 INFO storage.MemoryStore: Memory use = 4.1 GB (blocks) + 21.2 GB (scratch space shared across 1 thread(s)) = 25.2 GB. Storage limit = 36.0 GB.
15/10/16 02:19:44 WARN storage.MemoryStore: Not enough space to cache rdd_129_17000 in memory! (computed 9.4 GB so far)
15/10/16 02:19:44 INFO storage.MemoryStore: Memory use = 4.1 GB (blocks) + 30.6 GB (scratch space shared across 1 thread(s)) = 34.6 GB. Storage limit = 36.0 GB.
15/10/16 02:25:37 INFO metrics.MetricsSaver: 1001 MetricsLockFreeSaver 339 comitted 11 matured S3WriteBytes values
15/10/16 02:29:00 INFO s3n.MultipartUploadOutputStream: uploadPart /mnt1/var/lib/hadoop/s3/959a772f-d03a-41fd-bc9d-6d5c5b9812a1-0000 134217728 bytes md5: qkQ8nlvC8COVftXkknPE3A== md5hex: aa443c9e5bc2f023957ed5e49273c4dc
15/10/16 02:38:15 INFO s3n.MultipartUploadOutputStream: uploadPart /mnt/var/lib/hadoop/s3/959a772f-d03a-41fd-bc9d-6d5c5b9812a1-0001 134217728 bytes md5: RgoGg/yJpqzjIvD5DqjCig== md5hex: 460a0683fc89a6ace322f0f90ea8c28a
15/10/16 02:42:20 INFO metrics.MetricsSaver: 2001 MetricsLockFreeSaver 339 comitted 10 matured S3WriteBytes values

(Attached screenshot: Spark UI memory usage showing 90.8 G of 719.3 G used)

Jem*_*ker 5

This is most likely caused by the configuration spark.storage.memoryFraction being set too low. Spark will only use this fraction of the allocated memory to cache RDDs.
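
For example, the fraction can be raised when building the SparkConf. This is a minimal Scala sketch assuming a Spark 1.x setup where spark.storage.memoryFraction still applies; the app name and the value 0.7 are only illustrative (the default is 0.6):

import org.apache.spark.{SparkConf, SparkContext}

// Raise the fraction of executor memory reserved for cached RDD blocks.
val conf = new SparkConf()
  .setAppName("cache-heavy-job")               // hypothetical app name
  .set("spark.storage.memoryFraction", "0.7")  // default is 0.6; higher values leave less heap for execution and other objects
val sc = new SparkContext(conf)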

Try:

  • Increasing the storage fraction (as in the configuration sketch above)
  • rdd.persist(StorageLevel.MEMORY_ONLY_SER) to reduce memory usage by serializing the RDD data
  • rdd.persist(StorageLevel.MEMORY_AND_DISK) to partially persist to disk when the memory limit is reached (both persist variants are sketched below)
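
A minimal sketch of the two persist variants above; the RDD here is just a placeholder standing in for rdd_128_17000 from the logs, and setMaster("local[*]") is only there to keep the example self-contained:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.storage.StorageLevel

val sc = new SparkContext(new SparkConf().setAppName("persist-example").setMaster("local[*]"))
val rdd = sc.parallelize(1 to 1000000)

// Serialized in-memory storage: smaller footprint, at the cost of extra CPU when accessed.
val serialized = rdd.persist(StorageLevel.MEMORY_ONLY_SER)

// Memory first, spilling partitions to disk once the storage limit is reached.
// An RDD's storage level cannot be changed once set, so a separate RDD is used here.
val spillable = rdd.map(identity).persist(StorageLevel.MEMORY_AND_DISK)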