Question by Dar*_*ger

Spark error: Not enough space to cache partition rdd_8_2 in memory! Free memory is 58905314 bytes

When I run a Spark job on my own data using the example code BinaryClassification.scala, it keeps failing with errors like "Not enough space to cache partition rdd_8_2 in memory! Free memory is 58905314 bytes."

I set the memory to 4G via `val conf = new SparkConf().setAppName(s"BinaryClassification with $params").set("spark.executor.memory", "4g")`, but it doesn't work. Does anyone have any ideas? Thanks :)
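For reference, a cleaned-up sketch of that configuration (Spark 1.x API; `params` is assumed to come from the example's option parser, hypothetical here). One caveat worth knowing: with a local master, the executor runs inside the driver JVM, so `spark.executor.memory` has no effect there; the driver heap is what limits the block-manager cache.

```scala
// Sketch only, assuming Spark is on the classpath and `params` holds the CLI options.
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName(s"BinaryClassification with $params")
  // Ignored in local mode: executors share the driver JVM.
  .set("spark.executor.memory", "4g")
val sc = new SparkContext(conf)
```

Because the driver JVM has already started by the time `SparkConf` is built, its heap cannot be grown programmatically; it has to be set at launch, e.g. `spark-submit --driver-memory 4g ...`.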

I'm running it locally on a MacBook Pro with 16GB of memory.

bin/spark-submit --class BinaryClassification ~/dev/scalaworkspace/BinaryClassification/BinaryClassification_fat.jar ~/data/trajectory.libsvm --algorithm LR

Spark assembly has been built with Hive, including Datanucleus jars on classpath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=128m; support was removed in 8.0
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
14/11/22 17:07:24 INFO SecurityManager: Changing view acls to: wangchao,
14/11/22 17:07:24 INFO SecurityManager: Changing modify acls to: wangchao,
14/11/22 17:07:24 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: …

scala out-of-memory apache-spark rdd

11 upvotes · 1 answer · 10k views
