A bug when starting spark-shell --master yarn

Posted by Hi,Fairy.



The error output is as follows:

18/06/06 15:55:31 ERROR cluster.YarnClientSchedulerBackend: Yarn application has already exited with state FINISHED!
18/06/06 15:55:31 ERROR client.TransportClient: Failed to send RPC 8494797597505892687 to /192.168.1.21:44578: java.nio.channels.ClosedChannelException
java.nio.channels.ClosedChannelException
        at io.netty.channel.AbstractChannel$AbstractUnsafe.write(...)(Unknown Source)
18/06/06 15:55:31 ERROR cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: Sending RequestExecutors(0,0,Map(),Set()) to AM was unsuccessful
java.io.IOException: Failed to send RPC 8494797597505892687 to /192.168.1.21:44578: java.nio.channels.ClosedChannelException
        at org.apache.spark.network.client.TransportClient.lambda$sendRpc$2(TransportClient.java:237)
        at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:507)
        at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:481)
        at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:420)
        at io.netty.util.concurrent.DefaultPromise.tryFailure(DefaultPromise.java:122)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.safeSetFailure(AbstractChannel.java:852)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.write(AbstractChannel.java:738)
        at io.netty.channel.DefaultChannelPipeline$HeadContext.write(DefaultChannelPipeline.java:1251)
        at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:733)
        at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:725)
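
Before touching any configuration, you can usually confirm the cause in the application's container logs. This diagnostic step is my addition, not part of the original post; <application_id> is a placeholder for the id that spark-shell prints on startup, and it assumes YARN log aggregation is enabled:

    # Look for the NodeManager kill message, e.g.
    # "... is running beyond virtual memory limits ... Killing container."
    yarn logs -applicationId <application_id> | grep -i "memory limits"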

The root cause: YARN's NodeManager kills the ApplicationMaster container for exceeding its memory limits (with Java 8 the JVM reserves far more virtual memory than it actually uses, which trips the default vmem check), the application exits with state FINISHED, and the driver's RPC channel to the now-dead AM closes, hence the ClosedChannelException above. The fix is to add the following configuration to yarn-site.xml:

    <!-- Skip the physical-memory check when monitoring containers. -->
    <property>
        <name>yarn.nodemanager.pmem-check-enabled</name>
        <value>false</value>
    </property>

    <!-- Skip the virtual-memory check that was killing the ApplicationMaster. -->
    <property>
        <name>yarn.nodemanager.vmem-check-enabled</name>
        <value>false</value>
    </property>
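
Disabling the checks is the quick fix from the post; a gentler alternative (my suggestion, not from the original) is to leave the checks on and give the ApplicationMaster more headroom instead. A sketch for Spark 2.x in YARN client mode:

    # Request a larger AM container so it stays under YARN's limits;
    # property names are Spark 2.x YARN-client-mode settings,
    # memoryOverhead is in MiB.
    spark-shell --master yarn \
      --conf spark.yarn.am.memory=1g \
      --conf spark.yarn.am.memoryOverhead=512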

Distribute the updated yarn-site.xml to every node and restart YARN; after that, spark-shell --master yarn starts normally.
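
A minimal sketch of that rollout, assuming two worker nodes reachable as hadoop102 and hadoop103 (hypothetical hostnames) and HADOOP_HOME set identically on every machine:

    # Copy the edited yarn-site.xml to each node, then bounce YARN.
    for host in hadoop102 hadoop103; do
        scp "$HADOOP_HOME/etc/hadoop/yarn-site.xml" \
            "$host:$HADOOP_HOME/etc/hadoop/yarn-site.xml"
    done
    stop-yarn.sh && start-yarn.sh    # restart ResourceManager and NodeManagers
    spark-shell --master yarn        # should now come up without the RPC errors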

FIN
