Starting the pyspark shell on Windows: Failed to find Spark jars directory. You need to build Spark before running
Posted by 144823836yj
D:\Develop tools\spark-2.2.0-bin-hadoop2.7\bin>pyspark2.cmd
'tools\spark-2.2.0-bin-hadoop2.7\bin\..\jars""\' is not recognized as an internal or external command, operable program or batch file.
Failed to find Spark jars directory.
You need to build Spark before running this program.
Cause: the installation path contains a space ("Develop tools" in D:\Develop tools\spark-2.2.0-bin-hadoop2.7\bin). The launcher batch scripts do not quote this path robustly, so cmd.exe splits the command line at the space and treats "tools\spark-2.2.0-bin-hadoop2.7\bin\..\jars" as the start of a command. The usual workaround is to move Spark to a directory whose path contains no spaces (for example, directly under D:\).
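A minimal sketch of the parsing problem, written in Python purely for illustration (the directory name is the one from the error above): splitting an unquoted command line on whitespace, which is effectively what happens here, turns one path with a space into two tokens.

```python
# Illustration only: why an unquoted path containing a space breaks
# command-line parsing. The path is the one from the error transcript above.
path = r"D:\Develop tools\spark-2.2.0-bin-hadoop2.7\bin"

# Naive whitespace splitting yields two tokens instead of one path,
# so the shell tries to execute a fragment of the path as a command.
tokens = path.split()
print(tokens)  # ['D:\\Develop', 'tools\\spark-2.2.0-bin-hadoop2.7\\bin']
```

This is why a space-free install location (or fully quoted paths throughout the launcher scripts) avoids the error.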