Spark startup: WARN util.Utils: Your hostname, ... resolves to a loopback address: ...; using ... instead
Posted by Z.Q.Feng
Project scenario:
System: Ubuntu 20.04
Hadoop version: Hadoop 3.3.1
Spark version: Spark 3.2.0
Problem description:
When starting spark-shell, the following output appears:
hadoop@fzqs-Laptop:/usr/local$ spark-shell
2021-12-09 23:49:14,625 WARN util.Utils: Your hostname, fzqs-Laptop resolves to a loopback address: 127.0.1.1; using 10.132.13.98 instead (on interface wlo1)
2021-12-09 23:49:14,626 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
Setting default log level to "WARN".
......
Cause analysis:
This warning appears because Spark's local IP address has not been bound explicitly. In the message "using 10.132.13.98 instead (on interface wlo1)", the address 10.132.13.98 is the local IP that Spark picked by default. Make a note of this IP address from your own output (it will not be the same as mine!).
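The loopback mapping usually comes from /etc/hosts: on Ubuntu the machine's hostname is typically mapped to 127.0.1.1 there, which is why Spark falls back to an interface address. A minimal sketch for confirming this and finding the IP to write down (the interface name wlo1 is simply the one from the output above and will differ on your machine):
cat /etc/hosts      # on Ubuntu the hostname is often mapped to 127.0.1.1 here
hostname -I         # lists the non-loopback IP addresses of this machine
ip addr show wlo1   # or inspect the interface named in the warning directly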
Solution:
Edit the configuration file and set the SPARK_LOCAL_IP variable:
cd /usr/local/spark    # your own Spark directory
sudo vim conf/spark-env.sh
Add the following line:
export SPARK_LOCAL_IP=10.132.13.98    # be sure to use the IP from your own output
Press ESC, then type :wq to save and exit. Restart spark-shell:
/usr/local/spark/bin/spark-shell
The warning no longer appears; problem solved.
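As an alternative to editing spark-env.sh, the variable can also be set for a single session only, since the warning itself suggests setting SPARK_LOCAL_IP; a quick sketch, assuming a bash shell and the example IP from the output above:
SPARK_LOCAL_IP=10.132.13.98 /usr/local/spark/bin/spark-shell    # per-session override, use your own IP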