
scala – Increasing the memory available to the Spark shell

I am trying to install Apache Spark on a Raspberry Pi 1 Model B.

Once I start the shell and try the command:

val l = sc.parallelize(List()).collect

I get this exception:

scala> val l = sc.parallelize(List()).collect
15/03/22 19:52:44 INFO SparkContext: Starting job: collect at <console>:21
15/03/22 19:52:44 INFO DAGScheduler: Got job 0 (collect at <console>:21) with 1 output partitions (allowLocal=false)
15/03/22 19:52:44 INFO DAGScheduler: Final stage: Stage 0(collect at <console>:21)
15/03/22 19:52:44 INFO DAGScheduler: Parents of final stage: List()
15/03/22 19:52:44 INFO DAGScheduler: Missing parents: List()
15/03/22 19:52:44 INFO DAGScheduler: Submitting Stage 0 (ParallelCollectionRDD[0] at parallelize at <console>:21), which has no missing parents
#
# A Fatal error has been detected by the Java Runtime Environment:
#
#  SIGILL (0x4) at pc=0x9137c074, pid=3596, tid=2415826032
#
# JRE version: Java(TM) SE Runtime Environment (8.0-b132) (build 1.8.0-b132)
# Java VM: Java HotSpot(TM) Client VM (25.0-b70 mixed mode linux-arm )
# Problematic frame:
# C  [snappy-unknown-b62d2fa0-8fdd-4b4b-8c2c-2f24ddaeee74-libsnappyjava.so+0x1074]  _init+0x1a7
#
# Failed to write core dump. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
#
# An error report file with more information is saved as:
# /home/pi/spark-1.3.0-bin-hadoop2.4/bin/hs_err_pid3596.log
./spark-shell: line 55:  3596 Segmentation fault      "$FWDIR"/bin/spark-submit --class org.apache.spark.repl.Main "${SUBMISSION_OPTS[@]}" spark-shell "${APPLICATION_OPTS[@]}"

When I start the shell, I allow memory to spill to disk:

./spark-shell --conf StorageLevel=MEMORY_AND_DISK

But I still get the same exception.
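For reference, a storage level is normally chosen per RDD with persist() rather than passed as a --conf key; a minimal spark-shell sketch with illustrative values:

import org.apache.spark.storage.StorageLevel

// Cache the RDD in memory, spilling partitions that do not fit to disk
val rdd = sc.parallelize(1 to 100).persist(StorageLevel.MEMORY_AND_DISK)
rdd.count()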

When the spark shell starts, 267 MB of memory is available:

15/03/22 17:09:49 INFO MemoryStore: MemoryStore started with capacity 267.3 MB

Should this be enough memory to run Spark commands in the shell?

Is this the correct command to start the spark shell so that data that does not fit in memory spills to disk: ./spark-shell --conf StorageLevel=MEMORY_AND_DISK?

Update:

I tried:

./spark-shell --conf spark.driver.memory=256m

val l = sc.parallelize(List()).collect

But got the same result.

Solution

Try using the --driver-memory option to set the memory for the driver process. Example:

./spark-shell --driver-memory 2g

which gives the driver 2 GB of memory.
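If it helps, you can check from inside the shell which value actually took effect; a small sketch (the key only appears when it was set explicitly, so nothing is printed when the default is in use):

// Print spark.driver.memory if it was set explicitly on launch
sc.getConf.getOption("spark.driver.memory").foreach(println)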
