Spark java.lang.OutOfMemoryError: Java heap space
My settings: spark.executor.memory=4g, -Dspark.akka.frameSize=512
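For reference, here is a minimal sketch of how these two settings can be applied programmatically. It assumes the SparkConf API (Spark 0.9+) and a placeholder app name; the question does not show how the settings were actually passed (they could equally come from system properties or spark-env.sh):

import org.apache.spark.{SparkConf, SparkContext}

// Sketch only: assumes a SparkConf-based setup.
val conf = new SparkConf()
  .setAppName("ImageBundleJob")        // hypothetical app name
  .set("spark.executor.memory", "4g")  // per-executor JVM heap, as in the question
  .set("spark.akka.frameSize", "512")  // max Akka RPC message size, in MB
val sc = new SparkContext(conf)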
The problem is:
First:
val imageBundleRDD = sc.newAPIHadoopFile(...)
Second:
val res = imageBundleRDD.map(data => {
  val desPoints = threeDReconstruction(data._2, bg)
  (data._1, desPoints)
})
Last:
res.saveAsNewAPIHadoopFile(...)
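Since the "..." in the calls above elide the real types and paths, here is a purely illustrative, self-contained sketch of the whole pipeline, reusing the sc from the configuration sketch above. The SequenceFile key/value types (Text keys, BytesWritable image bundles), the HDFS paths, the bg parameter, and the stubbed threeDReconstruction are all assumptions, not taken from the question:

import org.apache.hadoop.io.{BytesWritable, Text}
import org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat
import org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat
import org.apache.spark.SparkContext._  // pair-RDD functions (pre-1.0 import style)

// Stub standing in for the question's reconstruction step; its real signature is unknown.
def threeDReconstruction(img: Array[Byte], bg: Int): Array[Byte] = img

val bg = 0  // hypothetical parameter; not shown in the question

val imageBundleRDD = sc.newAPIHadoopFile[Text, BytesWritable,
  SequenceFileInputFormat[Text, BytesWritable]]("hdfs:///path/to/imageBundles")

val res = imageBundleRDD.map { case (name, bundle) =>
  // Copy data out of the Writables: Hadoop RecordReaders reuse these objects,
  // and getBytes returns a padded backing array, so trim to getLength.
  val raw = java.util.Arrays.copyOf(bundle.getBytes, bundle.getLength)
  val desPoints = threeDReconstruction(raw, bg)
  (new Text(name.toString), new BytesWritable(desPoints))
}

res.saveAsNewAPIHadoopFile[SequenceFileOutputFormat[Text, BytesWritable]]("hdfs:///path/to/desPoints")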
.....
14/01/15 21:42:27 INFO cluster.ClusterTaskSetManager: Starting task 1.0:24 as TID 33 on executor 9: Salve7.Hadoop (NODE_LOCAL)
14/01/15 21:42:27 INFO cluster.ClusterTaskSetManager: Serialized task 1.0:24 as 30618515 bytes in 210 ms
14/01/15 21:42:27 INFO cluster.ClusterTaskSetManager: Starting task 1.0:36 as TID 34 on executor 2: Salve11.Hadoop (NODE_LOCAL)
14/01/15 21:42:28 INFO cluster.ClusterTaskSetManager: Serialized task 1.0:36 as 30618515 bytes in 449 ms
14/01/15 21:42:28 INFO cluster.ClusterTaskSetManager: Starting task 1.0:32 as TID 35 on executor 7: Salve4.Hadoop (NODE_LOCAL)
Uncaught error from thread [spark-akka.actor.default-dispatcher-3] shutting down JVM since 'akka.jvm-exit-on-fatal-error' is enabled for ActorSystem[spark]
java.lang.OutOfMemoryError: Java heap space
PS