Spark Local Mode Environment Setup
1. Prerequisites
To complete all the steps in this guide, you will need the following:
CentOS 7, JDK 1.8, Scala 2.11.12, Spark 2.4.5
2. Environment Setup
2.1 Download and Extract
Download the Spark binary package from the official Apache Spark download site; here I use spark-2.4.5-bin-hadoop2.7.tgz.
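If you prefer to fetch it from the command line, old releases are kept in the Apache archive; the URL below follows the standard archive layout and is an assumption, so double-check it against the download page:
[xiaokang@hadoop ~]$ wget https://archive.apache.org/dist/spark/spark-2.4.5/spark-2.4.5-bin-hadoop2.7.tgz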
# Extract
[xiaokang@hadoop ~]$ tar -zxvf spark-2.4.5-bin-hadoop2.7.tgz -C /opt/software/
# Rename (optional)
[xiaokang@hadoop ~]$ mv /opt/software/spark-2.4.5-bin-hadoop2.7/ /opt/software/spark-2.4.5
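As an optional sanity check (not part of the original steps), list the extracted directory; a normal Spark distribution contains bin, conf, jars, examples, and so on:
[xiaokang@hadoop ~]$ ls /opt/software/spark-2.4.5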
2.2 Configure Environment Variables
[xiaokang@hadoop ~]$ sudo vim /etc/profile.d/env.sh
Building on the existing configuration, update the environment variables as follows:
export SPARK_HOME=/opt/software/spark-2.4.5
export PATH=${JAVA_HOME}/bin:${HADOOP_HOME}/bin:${HADOOP_HOME}/sbin:${ZOOKEEPER_HOME}/bin:${HIVE_HOME}/bin:${ZEPPELIN_HOME}/bin:${HBASE_HOME}/bin:${SQOOP_HOME}/bin:${FLUME_HOME}/bin:${PYTHON_HOME}/bin:${SCALA_HOME}/bin:${MAVEN_HOME}/bin:${GRADLE_HOME}/bin:${KAFKA_HOME}/bin:${SPARK_HOME}/bin:$PATH
Run the source command so that the new environment variables take effect immediately:
[xiaokang@hadoop ~]$ source /etc/profile.d/env.sh
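To verify that the variables were picked up, a quick check (standard commands, not from the original post):
# Should print the Spark install directory configured above
[xiaokang@hadoop ~]$ echo $SPARK_HOME
# spark-submit is now on the PATH; this prints the Spark, Scala, and Java versions
[xiaokang@hadoop ~]$ spark-submit --version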
2.3 Modify the Configuration
Go to the ${SPARK_HOME}/conf/ directory, copy spark-env.sh.template to spark-env.sh, and add the following settings to the new file:
[xiaokang@hadoop conf]$ cp spark-env.sh.template spark-env.sh
export JAVA_HOME=/opt/moudle/jdk1.8.0_191
export SCALA_HOME=/opt/moudle/scala-2.11.12
# Options read when launching programs locally with
# ./bin/run-example or ./bin/spark-submit
# - HADOOP_CONF_DIR, to point Spark towards Hadoop configuration files
SPARK_LOCAL_IP=hadoop
# - SPARK_PUBLIC_DNS, to set the public dns name of the driver program
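SPARK_LOCAL_IP=hadoop assumes that the hostname hadoop resolves to this machine. A quick way to confirm (the IP shown is only an example):
# Expect a line such as: 192.168.1.100  hadoop
[xiaokang@hadoop ~]$ grep hadoop /etc/hosts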
2.4 Start and Test
[xiaokang@hadoop ~]$ spark-shell
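Once the shell comes up, it creates a SparkContext named sc for you (you can also pass the master explicitly, e.g. spark-shell --master local[2]). A minimal smoke test, not from the original post, is to run a tiny job:
scala> sc.parallelize(1 to 100).reduce(_ + _)
res0: Int = 5050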
2.5 WordCount Example
Prepare a small file containing the words to count. With the file in place, run the WordCount job directly in the shell:
scala> val result=sc.textFile("file:///home/xiaokang/wordcount-xiaokang.txt").flatMap(_.split("\\t")).map((_,1)).reduceByKey(_ + _).collect
result: Array[(String, Int)] = Array((Flink,617), (Spark,614), (MapReduce,631), (Hive,636), (xiaokang,647), (HBase,642), (微信公众号:小康新鲜事儿,647), (Hadoop,644))
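As a small extension of the same job (a sketch, not in the original post), you can sort the pairs by count in descending order and take the top three:
scala> sc.textFile("file:///home/xiaokang/wordcount-xiaokang.txt").flatMap(_.split("\\t")).map((_, 1)).reduceByKey(_ + _).sortBy(_._2, ascending = false).take(3)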
You can also inspect the job in the Spark Web UI, which listens on port 4040.