PySpark install error

Posted: 2018-01-20 18:03:29

Question:

I have followed the instructions in various blog posts, including this, this, this, and this, to install pyspark on my laptop. However, whenever I try to use pyspark from the terminal or a Jupyter notebook, I keep getting the error below.

I have installed all the necessary software, as shown at the bottom of this question.

I have added the following to my .bashrc:

function sjupyter_init()
{
    # Set anaconda3 as python
    export PATH=~/anaconda3/bin:$PATH

    # Spark path (based on your computer)
    SPARK_HOME=/opt/spark
    export PATH=$SPARK_HOME:$PATH

    export PYTHONPATH=$SPARK_HOME/python:/home/khurram/anaconda3/bin/python3
    export PYSPARK_DRIVER_PYTHON="jupyter"
    export PYSPARK_DRIVER_PYTHON_OPTS="notebook"
    export PYSPARK_PYTHON=python3
}

I run sjupyter_init from the terminal, followed by jupyter notebook, to launch a Jupyter notebook with pyspark.

In the notebook, the following executes without error:

import findspark
findspark.init('/opt/spark')
from pyspark.sql import SparkSession
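For context, findspark.init('/opt/spark') is roughly equivalent to the sketch below — a simplified, hypothetical re-implementation assuming the standard Spark directory layout. It sets SPARK_HOME and puts Spark's Python bindings (plus the bundled py4j zip) on sys.path so that `import pyspark` works:

```python
import glob
import os
import sys

def init_spark(spark_home):
    """Roughly what findspark.init() does for a standard Spark layout."""
    os.environ["SPARK_HOME"] = spark_home
    # Spark's Python bindings live under $SPARK_HOME/python
    sys.path.insert(0, os.path.join(spark_home, "python"))
    # py4j ships inside the Spark download as a zip under python/lib
    for py4j_zip in glob.glob(os.path.join(spark_home, "python", "lib", "py4j-*.zip")):
        sys.path.insert(0, py4j_zip)

init_spark("/opt/spark")
```

Note this only wires up the import path; it does not start a JVM or validate the installation, so errors like the one below surface later, when the SparkContext is actually created.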

But when I execute the following

spark = SparkSession.builder.appName("test").getOrCreate() 

it fails with this error message:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
18/01/20 17:10:06 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/opt/spark/python/pyspark/sql/session.py", line 173, in getOrCreate
    sc = SparkContext.getOrCreate(sparkConf)
  File "/opt/spark/python/pyspark/context.py", line 334, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "/opt/spark/python/pyspark/context.py", line 118, in __init__
    conf, jsc, profiler_cls)
  File "/opt/spark/python/pyspark/context.py", line 180, in _do_init
    self._jsc = jsc or self._initialize_context(self._conf._jconf)
  File "/opt/spark/python/pyspark/context.py", line 273, in _initialize_context
    return self._jvm.JavaSparkContext(jconf)
  File "/home/khurram/anaconda3/lib/python3.6/site-packages/py4j/java_gateway.py", line 1428, in __call__
    answer, self._gateway_client, None, self._fqn)
  File "/home/khurram/anaconda3/lib/python3.6/site-packages/py4j/protocol.py", line 320, in get_return_value
    format(target_id, ".", name), value)
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.ExceptionInInitializerError
        at org.apache.spark.SparkConf.validateSettings(SparkConf.scala:546)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:373)
        at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
        at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
        at py4j.Gateway.invoke(Gateway.java:236)
        at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
        at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
        at py4j.GatewayConnection.run(GatewayConnection.java:214)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.UnknownHostException: linux-0he7: linux-0he7: Name or service not known
        at java.net.InetAddress.getLocalHost(InetAddress.java:1505)
        at org.apache.spark.util.Utils$.findLocalInetAddress(Utils.scala:891)
        at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress$lzycompute(Utils.scala:884)
        at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress(Utils.scala:884)
        at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:941)
        at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:941)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.util.Utils$.localHostName(Utils.scala:941)
        at org.apache.spark.internal.config.package$.<init>(package.scala:204)
        at org.apache.spark.internal.config.package$.<clinit>(package.scala)
        ... 14 more
Caused by: java.net.UnknownHostException: linux-0he7: Name or service not known
        at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
        at java.net.InetAddress$2.lookupAllHostAddr(InetAddress.java:928)
        at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1323)
        at java.net.InetAddress.getLocalHost(InetAddress.java:1500)
        ... 23 more

My system details are:

OS:

OpenSuse Leap 42.2 64-bit

Java:

    khurram@linux-0he7:~> java -version
    openjdk version "1.8.0_151"

Scala:

    khurram@linux-0he7:~> scala -version
    Scala code runner version 2.12.4 -- Copyright 2002-2017, LAMP/EPFL and Lightbend, Inc.

Hadoop 3.0

khurram@linux-0he7:~> echo $HADOOP_HOME
/opt/hadoop

Py4J

khurram@linux-0he7:~> pip show py4j
Name: py4j
Version: 0.10.6
Summary: Enables Python programs to dynamically access arbitrary Java objects
Home-page: https://www.py4j.org/
Author: Barthelemy Dagenais
Author-email: barthelemy@infobart.com
License: BSD License
Location: /home/khurram/anaconda3/lib/python3.6/site-packages
Requires: 
khurram@linux-0he7:~> 

I have run chmod 777 on the hadoop and spark directories:

khurram@linux-0he7:~> ls -al /opt/
total 8
drwxr-xr-x 1 root    root   96 Jan 19 20:22 .
drwxr-xr-x 1 root    root  222 Jan 20 14:54 ..
lrwxrwxrwx 1 root    root   18 Jan 19 20:22 hadoop -> /opt/hadoop-3.0.0/
drwxrwxrwx 1 khurram users 126 Dec  8 19:42 hadoop-3.0.0
lrwxrwxrwx 1 root    root   30 Jan 19 19:40 spark -> /opt/spark-2.2.1-bin-hadoop2.7
drwxrwxrwx 1 khurram users 150 Jan 19 19:33 spark-2.2.1-bin-hadoop2.7
khurram@linux-0he7:~>

Contents of the hosts file:

khurram@linux-0he7:~> cat /etc/hosts

127.0.0.1       localhost

# special IPv6 addresses
::1             localhost ipv6-localhost ipv6-loopback

fe00::0         ipv6-localnet

ff00::0         ipv6-mcastprefix
ff02::1         ipv6-allnodes
ff02::2         ipv6-allrouters
ff02::3         ipv6-allhosts

Comments:

@user8371915 Added the contents of /etc/hosts to the question. Based on the question you linked, should I add linux-0he7 to the hosts file?

Exactly right: 127.0.0.1 linux-0he7

@user8371915 Your suggestion solved the problem. Could you post it as an answer so I can upvote you?

By the way, the spark-2.2.1-bin-hadoop2.7 download already includes Hadoop... you don't need to download Hadoop separately, or if you do, you need version 2.7.

On a more general level, setting PYSPARK_DRIVER_PYTHON="jupyter" is a really bad practice; for the proper way to set up Jupyter to work with pyspark, see ***.com/questions/47824131/…

Answer 1:

UnknownHostException

is thrown to indicate that the IP address of a host could not be determined.

It is thrown at the bottom of your stack trace:

Caused by: java.net.UnknownHostException: linux-0he7: Name or service not known

Judging from your shell prompt, your hostname is linux-0he7, so I assume you are running in local mode. This means your /etc/hosts does not include an entry for linux-0he7.

Adding

127.0.0.1    linux-0he7

to /etc/hosts should solve the problem.
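A quick way to verify the fix is to perform the same lookup Spark attempts (Utils.findLocalInetAddress resolves the local hostname) with a short standard-library snippet — a minimal diagnostic sketch:

```python
import socket

hostname = socket.gethostname()  # e.g. "linux-0he7" on the machine above
try:
    ip = socket.gethostbyname(hostname)
    print(f"{hostname} resolves to {ip}")
except socket.gaierror as err:
    # This is the same failure mode behind Spark's UnknownHostException
    print(f"{hostname} does not resolve: {err}")
```

If the except branch still fires after editing /etc/hosts, check the file for typos and make sure the entry matches the output of `hostname` exactly.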

You can also use spark.driver.bindAddress and spark.driver.host to bind the driver to a specific host IP.
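For example, these are ordinary Spark configuration keys and can be passed through the session builder — a sketch where 127.0.0.1 is a placeholder that only suits local mode:

```python
# Driver network settings as plain Spark conf keys (values are placeholders).
driver_conf = {
    "spark.driver.bindAddress": "127.0.0.1",  # interface the driver binds to
    "spark.driver.host": "127.0.0.1",         # address advertised to executors
}

# With pyspark available, they would be applied like this:
#
#   from pyspark.sql import SparkSession
#   builder = SparkSession.builder.appName("test")
#   for key, value in driver_conf.items():
#       builder = builder.config(key, value)
#   spark = builder.getOrCreate()
```

This sidesteps hostname resolution entirely, since the driver no longer needs InetAddress.getLocalHost() to succeed.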

Independent of the exception: Hadoop 3.0.0 is not yet supported. I recommend using 2.x for the time being.

Comments:

The /etc/hosts fix worked on my cloud instance. spark.driver.bindAddress did not work.
