Compiling Hadoop 2.x from Source
一、Basic Environment Setup
1. Preparation
Required files and conditions:
hadoop-2.5.0-src.tar.gz
apache-maven-3.0.5-bin.tar.gz
jdk-7u67-linux-x64.tar.gz
protobuf-2.5.0.tar.gz
The build host must be able to reach the external network (Maven will download dependencies).
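If it helps, the archives can be staged in one directory before starting. The layout below is only a sketch; /opt/modules and the /home/liuwl/opt/files and /home/liuwl/opt/src directories are assumptions taken from the prompts and paths that appear later in this walkthrough.
[[email protected] ~]# mkdir -p /opt/modules /home/liuwl/opt/files /home/liuwl/opt/src
[[email protected] ~]# cd /home/liuwl/opt/files
[[email protected] files]# ls
apache-maven-3.0.5-bin.tar.gz  hadoop-2.5.0-src.tar.gz  jdk-7u67-linux-x64.tar.gz  protobuf-2.5.0.tar.gz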
2. Install jdk-7u67-linux-x64.tar.gz and apache-maven-3.0.5-bin.tar.gz
[[email protected] ~]$ vi /etc/profile
#JAVA_HOME
export JAVA_HOME=/opt/modules/jdk1.7.0_67
export PATH=$PATH:$JAVA_HOME/bin
#MAVEN_HOME
export MAVEN_HOME=/opt/modules/apache-maven-3.0.5
export PATH=$PATH:$MAVEN_HOME/bin
[[email protected] ~]$ source /etc/profile
[[email protected] ~]$ java -version
java version "1.7.0_67"
Java(TM) SE Runtime Environment (build 1.7.0_67-b01)
Java HotSpot(TM) 64-Bit Server VM (build 24.65-b04, mixed mode)
[[email protected] ~]$ echo $MAVEN_HOME
/opt/modules/apache-maven-3.0.5
[[email protected] ~]# mvn -v
Apache Maven 3.0.5 (r01de14724cdef164cd33c7c8c2fe155faf9602da; 2013-02-19 05:51:28-0800)
Maven home: /opt/modules/apache-maven-3.0.5
Java version: 1.7.0_67, vendor: Oracle Corporation
Java home: /opt/modules/jdk1.7.0_67/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-504.el6.x86_64", arch: "amd64", family: "unix"
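The session above starts after the two archives have already been unpacked. A minimal sketch of that step, assuming the tarballs sit in the files directory used later and are installed under /opt/modules:
[[email protected] ~]# cd /home/liuwl/opt/files
[[email protected] files]# tar -zxf jdk-7u67-linux-x64.tar.gz -C /opt/modules/
[[email protected] files]# tar -zxf apache-maven-3.0.5-bin.tar.gz -C /opt/modules/
[[email protected] files]# ls /opt/modules/
apache-maven-3.0.5  jdk1.7.0_67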
PS: It is best to include a pre-populated Maven local repository among the prepared files; otherwise the build will spend a very long time downloading dependencies.
[[email protected] hadoop-2.5.0-src]# ls /root/.m2/repository/
ant biz commons-chain commons-el commons-validator junit sslext antlr bouncycastle commons-cli commons-httpclient dom4j log4j tomcat aopalliance bsh commons-codec commons-io doxia logkit xerces asm cglib commons-collections commons-lang io net xml-apis avalon-framework classworlds commons-configuration commons-logging javax org xmlenc backport-util-concurrent com commons-daemon commons-net jdiff oro xpp3 bcel commons-beanutils commons-digester commons-pool jline regexp
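One way to reuse a repository prepared in advance is to unpack it into Maven's default location for the root user (the archive name repository.tar.gz here is hypothetical), or to point Maven at another directory via the localRepository setting in settings.xml:
[[email protected] ~]# tar -zxf repository.tar.gz -C /root/.m2/
# Alternatively, edit /opt/modules/apache-maven-3.0.5/conf/settings.xml and set:
#   <localRepository>/root/.m2/repository</localRepository>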
3. Install cmake, zlib-devel, openssl-devel, gcc, gcc-c++, and ncurses-devel with yum
[[email protected] ~]# yum -y install cmake
[[email protected] ~]# yum -y install zlib-devel
[[email protected] ~]# yum -y install openssl-devel
[[email protected] ~]# yum -y install gcc gcc-c++
[[email protected] hadoop-2.5.0-src]# yum -y install ncurses-devel
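Before moving on, a quick check that the toolchain is actually in place (exact versions will vary with the CentOS 6 repositories in use):
[[email protected] ~]# gcc --version | head -1
[[email protected] ~]# cmake --version
[[email protected] ~]# rpm -q zlib-devel openssl-devel ncurses-devel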
4. Install protobuf-2.5.0.tar.gz (after extracting, change into the protobuf top-level directory)
[[email protected] protobuf-2.5.0]# mkdir -p /opt/modules/protobuf
[[email protected] protobuf-2.5.0]# ./configure --prefix=/opt/modules/protobuf
...
[[email protected] protobuf-2.5.0]# make
...
[[email protected] protobuf-2.5.0]# make install
...
[[email protected]-hadoop protobuf-2.5.0]# vi /etc/profile
...
#PROTOBUF_HOME
export PROTOBUF_HOME=/opt/modules/protobuf
export PATH=$PATH:$PROTOBUF_HOME/bin
[[email protected] protobuf-2.5.0]# source /etc/profile
[[email protected] protobuf-2.5.0]# protoc --version
libprotoc 2.5.0
5. Extract the Hadoop source archive, change into its top-level directory, and build
[[email protected] protobuf-2.5.0]# cd ../../files/
[[email protected] files]# tar -zxf hadoop-2.5.0-src.tar.gz -C ../src/
[[email protected] files]# cd ../src/hadoop-2.5.0-src/
[[email protected] hadoop-2.5.0-src]# mvn package -DskipTests -Pdist,native
...
[INFO] Executed tasks
[INFO]
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-dist ---
[INFO] Building jar: /home/liuwl/opt/src/hadoop-2.5.0-src/hadoop-dist/target/hadoop-dist-2.5.0-javadoc.jar
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [8:22.179s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [5:14.366s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [1:50.627s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.795s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [1:11.384s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [1:55.962s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [10:21.736s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [4:01.790s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [35.829s]
[INFO] Apache Hadoop Common .............................. SUCCESS [12:51.374s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [29.567s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.220s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [1:04:44.352s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [1:40.397s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [1:24.100s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [12.020s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.239s]
[INFO] hadoop-yarn ....................................... SUCCESS [0.298s]
[INFO] hadoop-yarn-api ................................... SUCCESS [2:07.150s]
[INFO] hadoop-yarn-common ................................ SUCCESS [3:13.690s]
[INFO] hadoop-yarn-server ................................ SUCCESS [1.009s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [54.750s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [2:53.418s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [23.570s]
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [16.137s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [1:17.456s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [9.170s]
[INFO] hadoop-yarn-client ................................ SUCCESS [17.790s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.132s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [6.689s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [3.015s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.102s]
[INFO] hadoop-yarn-project ............................... SUCCESS [13.562s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.526s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [1:27.794s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [1:32.320s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [19.368s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [26.041s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [31.938s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [38.261s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [5.923s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [12.856s]
[INFO] hadoop-mapreduce .................................. SUCCESS [15.510s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [20.631s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [51.096s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [13.185s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [22.877s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [25.861s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [9.764s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [7.152s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [23.914s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [21.289s]
[INFO] Apache Hadoop Client .............................. SUCCESS [18.486s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.966s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [37.039s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [9.809s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.192s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [34.114s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2:21:11.103s
[INFO] Finished at: Wed Sep 14 11:49:38 PDT 2016
[INFO] Final Memory: 86M/239M
[INFO] ------------------------------------------------------------------------
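After BUILD SUCCESS, the compiled distribution sits under hadoop-dist/target inside the source tree. A quick sanity check is sketched below; checknative is a standard subcommand of the hadoop script in 2.x, and a .tar.gz package is only produced if -Dtar is added to the mvn command above.
[[email protected] hadoop-2.5.0-src]# ls hadoop-dist/target/
# the hadoop-2.5.0/ directory (and hadoop-2.5.0.tar.gz when built with -Dtar) should be present
[[email protected] hadoop-2.5.0-src]# hadoop-dist/target/hadoop-2.5.0/bin/hadoop checknative -a
# reports whether the native hadoop, zlib and openssl bindings (among others) were compiled in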