Spark notes: fixing the Spark warning "Unable to load native-hadoop library for your platform"
Posted by 信方互联网硬汉
Solution 1:

Copy the native Hadoop library into the JRE's library directory, and build snappy from source so that libsnappy.so is available there as well (the leading # is the root shell prompt):

# cp $HADOOP_HOME/lib/native/libhadoop.so $JAVA_HOME/jre/lib/amd64
# build snappy from source: ./configure && make && make install
# cp libsnappy.so $JAVA_HOME/jre/lib/amd64

Once both files are in place, restarting the spark shell no longer produces the warning.
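The warning simply means the JVM could not find libhadoop.so on its library search path. Before and after copying the files, it can help to confirm where the library actually is. The helper below is a minimal diagnostic sketch (the function name and the directory list are my own, not part of the original fix); on a working Hadoop install you can also run the real `hadoop checknative -a` command to see which native libraries load.

```shell
# Search a list of directories for a native library and print the first hit.
# Diagnostic sketch only: the directories passed in are typical locations,
# not guaranteed paths on every system.
find_native_lib() {
  lib="$1"; shift
  for d in "$@"; do
    if [ -f "$d/$lib" ]; then
      echo "$d/$lib"
      return 0
    fi
  done
  echo "$lib not found" >&2
  return 1
}

# Usage, e.g.:
#   find_native_lib libhadoop.so "$JAVA_HOME/jre/lib/amd64" "$HADOOP_HOME/lib/native"
```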
Solution 2:

Instead of copying files, point the dynamic linker at Hadoop's native directory. Add the following line to Spark's conf/spark-env.sh:

export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native

Or set it system-wide in /etc/profile:

export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native
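Putting it together, a conf/spark-env.sh fragment might look like this. This is a sketch under assumptions: the /opt/hadoop install path is an example (use your own $HADOOP_HOME), and appending to any existing LD_LIBRARY_PATH is a common precaution rather than something the original post requires.

```shell
# conf/spark-env.sh -- example fragment
export HADOOP_HOME=/opt/hadoop    # example path, adjust to your install (assumption)
# Prepend Hadoop's native libraries so the JVM can find libhadoop.so / libsnappy.so
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native:$LD_LIBRARY_PATH
```

For a single job, spark-submit's --driver-library-path option can pass the same directory without editing spark-env.sh.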