Cannot run pyspark in MAC after SPARK installation

Posted: 2017-03-22 07:43:02

Question:

I recently installed Spark on my Mac using the following command:

brew install apache-spark

Now when I try to run pyspark, it shows me the following error:

pyspark

Python 3.6.0 |Anaconda custom (x86_64)| (default, Dec 23 2016, 13:19:00)
[GCC 4.2.1 Compatible Apple LLVM 6.0 (clang-600.0.57)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
Traceback (most recent call last):
  File "/usr/local/Cellar/apache-spark/2.1.0/libexec/python/pyspark/shell.py", line 30, in <module>
    import pyspark
  File "/usr/local/Cellar/apache-spark/2.1.0/libexec/python/pyspark/__init__.py", line 44, in <module>
    from pyspark.context import SparkContext
  File "/usr/local/Cellar/apache-spark/2.1.0/libexec/python/pyspark/context.py", line 36, in <module>
    from pyspark.java_gateway import launch_gateway
  File "/usr/local/Cellar/apache-spark/2.1.0/libexec/python/pyspark/java_gateway.py", line 31, in <module>
    from py4j.java_gateway import java_import, JavaGateway, GatewayClient
  File "<frozen importlib._bootstrap>", line 961, in _find_and_load
  File "<frozen importlib._bootstrap>", line 950, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 646, in _load_unlocked
  File "<frozen importlib._bootstrap>", line 616, in _load_backward_compatible
  File "/usr/local/Cellar/apache-spark/2.1.0/libexec/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 18, in <module>
  File "/Users/hellmaster/anaconda/lib/python3.6/pydoc.py", line 62, in <module>
    import pkgutil
  File "/Users/hellmaster/anaconda/lib/python3.6/pkgutil.py", line 22, in <module>
    ModuleInfo = namedtuple('ModuleInfo', 'module_finder name ispkg')
  File "/usr/local/Cellar/apache-spark/2.1.0/libexec/python/pyspark/serializers.py", line 393, in namedtuple
    cls = _old_namedtuple(*args, **kwargs)
TypeError: namedtuple() missing 3 required keyword-only arguments: 'verbose', 'rename', and 'module'

How can I fix this problem?


Answer 1:

This happens because Spark 2.1.0 is not compatible with Python 3.6.
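Until you can move to a fixed Spark release, one common workaround is to run pyspark against an older interpreter by setting the PYSPARK_PYTHON environment variable. A minimal sketch, assuming you use conda; the environment name py35 and the ~/anaconda path are just placeholders for your own setup:

# Create a Python 3.5 environment and point pyspark at its interpreter
conda create -n py35 python=3.5
export PYSPARK_PYTHON=~/anaconda/envs/py35/bin/python
pyspark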

See also this question.
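Alternatively, the incompatibility was fixed upstream (Spark 2.1.1 and later work with Python 3.6), so upgrading the Homebrew package should also resolve it. A sketch, assuming Homebrew already ships a newer version:

# Upgrade Spark and retry the shell
brew upgrade apache-spark
pyspark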

