Error "unknown queue: root.default" when spark-submitting to YARN
Posted: 2020-02-20 21:42:04

I am submitting a simple PySpark word-count job to a newly built YARN cluster via Airflow and the SparkSubmitOperator. The job hits YARN and I can see it in the ResourceManager UI, but it fails with the following error:
"Diagnostics: Application application_1582063076991_0002 submitted by user root to unknown queue: root.default"
User: root
Name: PySpark Wordcount
Application Type: SPARK
Application Tags:
YarnApplicationState: FAILED
Queue: root.default
FinalStatus Reported by AM: FAILED
Started: Fri Feb 21 08:01:25 +1100 2020
Elapsed: 0sec
Tracking URL: History
Diagnostics: Application application_1582063076991_0002 submitted by user root to unknown queue: root.default
The root.default queue certainly seems to exist:
Application Queues
Legend: Capacity / Used / Used (over capacity) / Max Capacity
.root 0.0% used
..Queue: default 0.0% used
'default' Queue Status
Queue State: RUNNING
Used Capacity: 0.0%
Configured Capacity: 100.0%
Configured Max Capacity: 100.0%
Absolute Used Capacity: 0.0%
Absolute Configured Capacity: 100.0%
Absolute Configured Max Capacity: 100.0%
Used Resources: <memory:0, vCores:0>
Num Schedulable Applications: 0
Num Non-Schedulable Applications: 0
Num Containers: 0
Max Applications: 10000
Max Applications Per User: 10000
Max Application Master Resources: <memory:3072, vCores:1>
Used Application Master Resources: <memory:0, vCores:0>
Max Application Master Resources Per User: <memory:3072, vCores:1>
Configured Minimum User Limit Percent: 100%
Configured User Limit Factor: 1.0
Accessible Node Labels: *
Preemption: disabled
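For reference, the submission from Airflow looks roughly like this (a minimal, hypothetical sketch: the import path, application path, and connection id are assumptions and not taken from the actual DAG; the queue setting mirrors the one in the error above):

```python
# Hypothetical sketch of the Airflow task described above -- paths, ids and the
# import path (which varies by Airflow version) are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="pyspark_wordcount",
    start_date=datetime(2020, 2, 20),
    schedule_interval=None,
) as dag:
    wordcount = SparkSubmitOperator(
        task_id="pyspark_wordcount",
        application="/path/to/wordcount.py",        # hypothetical script path
        name="PySpark Wordcount",
        conn_id="spark_default",                     # Spark connection pointing at YARN
        conf={"spark.yarn.queue": "root.default"},   # the queue name that YARN rejects
    )
```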
What am I missing here? Thanks.
【Answer 1】: Submit with the queue name default. The root shown in the ResourceManager is only used to group the queues in hierarchical form.
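With the operator from the question, that just means pointing spark.yarn.queue (Spark's standard setting for the YARN queue) at the leaf name — a sketch under the same hypothetical assumptions as above:

```python
# Corrected sketch: use the leaf queue name "default" rather than the
# hierarchical "root.default"; paths and connection id remain hypothetical.
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

wordcount = SparkSubmitOperator(
    task_id="pyspark_wordcount",
    application="/path/to/wordcount.py",     # hypothetical script path
    name="PySpark Wordcount",
    conn_id="spark_default",
    conf={"spark.yarn.queue": "default"},    # plain leaf name, as the CapacityScheduler expects
)
```

The command-line equivalent would be passing `--queue default` to spark-submit.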
【Comments】:
Worked, thanks! Curious, as root.default is the out-of-the-box…