How to fix the error when running Apache Spark on YARN

I am a newcomer to the big data field.
I just want to run spark-shell --master yarn --deploy-mode client, and after a very long wait I get this error:

20/09/27 21:14:55 ERROR cluster.YarnClientSchedulerBackend: The YARN application has already ended! It might have been killed or the Application Master may have failed to start. Check the YARN application logs for more details.
20/09/27 21:14:55 ERROR spark.SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Application application_1601211854090_0001 failed 2 times due to Error launching appattempt_1601211854090_0001_000002. Got exception: java.net.ConnectException: Call From localhost/127.0.0.1 to localhost:46345 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
        at sun.reflect.GeneratedConstructorAccessor44.newInstance(Unknown Source)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:824)
        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:754)
        at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1544)
        at org.apache.hadoop.ipc.Client.call(Client.java:1486)
        at org.apache.hadoop.ipc.Client.call(Client.java:1385)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118)
        at com.sun.proxy.$Proxy83.startContainers(Unknown Source)
        at org.apache.hadoop.yarn.api.impl.pb.client.ContainerManagementProtocolPBClientImpl.startContainers(ContainerManagementProtocolPBClientImpl.java:128)
        at sun.reflect.GeneratedMethodAccessor14.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
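
A side note on the trace itself: "Connection refused" when calling localhost:46345 usually means the ResourceManager tried to launch the ApplicationMaster on a NodeManager that is not actually running, or that registered itself under a stale localhost address. A minimal sanity check for a single-node setup like this one, assuming the standard Hadoop daemons, is:

jps                # ResourceManager, NodeManager, NameNode and DataNode should all be listed
cat /etc/hosts     # the 127.0.0.1/localhost entries must be consistent with the hostnames YARN is configured with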

This happens whenever I try to run Spark on YARN. I can use spark-shell on its own on CentOS 7. I have already added quite a few things to yarn-site.xml:

<property>
    <name>yarn.nodemanager.pmem-check-enabled</name>
    <value>false</value>
</property>
<property>
    <name>yarn.nodemanager.vmem-check-enabled</name>
    <value>false</value>
</property>
<property>
    <name>yarn.nodemanager.vmem-pmem-ratio</name>
    <value>4</value>
</property>
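
Note that changes to yarn-site.xml only take effect after the YARN daemons are restarted. A minimal way to do that, assuming the stock Hadoop sbin scripts under the HADOOP_HOME used below, is:

/home/hadoop/hadoop-2.10.0/sbin/stop-yarn.sh     # stops the ResourceManager and NodeManagers
/home/hadoop/hadoop-2.10.0/sbin/start-yarn.sh    # starts them again with the new configuration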

And here is my spark-env.sh:

export SCALA_HOME=/home/scala-2.11.12
export JAVA_HOME=/home/java/jdk1.8.0_251
export HADOOP_HOME=/home/hadoop/hadoop-2.10.0
export SPARK_DIST_CLASSPATH=$(/home/hadoop/hadoop-2.10.0/bin/hadoop classpath)
export HADOOP_CONF_DIR=/home/hadoop/hadoop-2.10.0/etc/hadoop
export SPARK_MASTER_HOST=192.168.99.104
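
The SPARK_DIST_CLASSPATH line depends on the hadoop classpath command working, and HADOOP_CONF_DIR must point at the same configuration that was edited above. A quick sanity check, using the paths assumed in this file, is:

/home/hadoop/hadoop-2.10.0/bin/hadoop classpath                  # should print a long classpath rather than an error
ls /home/hadoop/hadoop-2.10.0/etc/hadoop/yarn-site.xml           # the yarn-site.xml edited above should live here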

I really don't know how to fix this.

Solution

Try to find your application on the YARN History Server and open its logs; they will likely contain the actual error message that points to the real problem.

In your case, search for application_1601211854090_0001.
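
If log aggregation is enabled (yarn.log-aggregation-enable set to true in yarn-site.xml), the container logs for that application can also be pulled straight from the command line; the stderr of the failed ApplicationMaster container usually shows the underlying exception:

yarn logs -applicationId application_1601211854090_0001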
