1. Preface:
1) For production, commercial distributions such as CDH or HDP are recommended, because compatibility between the bundled frameworks is better.
2) Unified download path for the packages: http://archive.cloudera.com/cdh5/cdh/5/
Pick packages that share the cdh5.7.0 suffix: hadoop-2.6.0-cdh5.7.0.tar.gz, hive-1.1.0-cdh5.7.0.tar.gz
3) User: hadoop, password: hadoop
Directory layout under /home/hadoop/:
software          installation packages
data              test data
source            source code
lib               jars for related development work
app               software installation directory
app/tmp           data directories for HDFS/Kafka/ZK
maven_repo        Maven local repository
shell             scripts used in class
MySQL: username root, password 123456
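A minimal sketch for creating this layout as the hadoop user (directory names taken from the list above):
[hadoop@hadoop001 ~]$ mkdir -p ~/software ~/data ~/source ~/lib ~/app ~/app/tmp ~/maven_repo ~/shell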
4) Current environment:
a) Virtual machine: VMware 10
b) Linux: CentOS 6.5
c) JDK: jdk-7u80-linux-x64.tar.gz
d) Maven: apache-maven-3.3.9-bin.zip
e) Findbugs: findbugs-1.3.9.zip
f) protoc: protobuf-2.5.0.tar.gz
[All of the above must be prepared in advance]
2. Compiling hadoop-2.6.0-cdh5.7.0 (an Apache build was used previously, so hadoop-2.6.0-cdh5.7.0 is recompiled and deployed here, which also serves as a refresher)
1) Download
Download hadoop-2.6.0-cdh5.7.0-src.tar.gz
Download URL: http://archive.cloudera.com/cdh5/cdh/5/hadoop-2.6.0-cdh5.7.0-src.tar.gz
Download jdk-7u80-linux-x64.tar.gz (painful experience has shown that hadoop-2.6.0-cdh5.7.0 should be compiled with JDK 1.7; compilation fails with 1.8)
Download URL: http://ghaffarian.net/downloads/Java/
2) Upload (rz)
Upload hadoop-2.6.0-cdh5.7.0-src.tar.gz to /home/hadoop/source
Upload jdk-7u80-linux-x64.tar.gz, apache-maven-3.3.9-bin.zip, findbugs-1.3.9.zip and protobuf-2.5.0.tar.gz to /home/hadoop/software
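Alternatively, if the server has direct Internet access, the source tarball can be fetched on the machine itself instead of being uploaded with rz, for example:
[hadoop@hadoop001 source]$ wget http://archive.cloudera.com/cdh5/cdh/5/hadoop-2.6.0-cdh5.7.0-src.tar.gz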
3) Extract
[root@hadoop001 ~]# mkdir -p /usr/java
[root@hadoop001 ~]# tar -zxvf /home/hadoop/software/jdk-7u80-linux-x64.tar.gz -C /usr/java
[root@hadoop001 ~]# tar -zxvf /home/hadoop/software/protobuf-2.5.0.tar.gz -C /usr/local
[hadoop@hadoop001 source]$ tar -zxvf hadoop-2.6.0-cdh5.7.0-src.tar.gz
[hadoop@hadoop001 software]$ unzip apache-maven-3.3.9-bin.zip -d /home/hadoop/app
[hadoop@hadoop001 software]$ unzip findbugs-1.3.9.zip -d /home/hadoop/app
(unzip handles one archive at a time, so the two zip files are extracted separately.)
After extraction, check that the owner and group of the extracted directories are correct; if not, fix them with chown -R user:group dir, as shown below.
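For example (assuming everything under /home/hadoop/app should belong to the hadoop user and group from the preface):
[root@hadoop001 ~]# chown -R hadoop:hadoop /home/hadoop/app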
4) Check the build requirements
[hadoop@hadoop001 ~]$ cd source/hadoop-2.6.0-cdh5.7.0
[hadoop@hadoop001 hadoop-2.6.0-cdh5.7.0]$ cat BUILDING.txt
Requirements (the Unix section is the one that applies to this CentOS build):
* Unix System
* JDK 1.7+
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code)
* Zlib devel (if compiling native code)
* openssl devel (if compiling native hadoop-pipes)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
5) Configure Maven (settings.xml)
[hadoop@hadoop001 ~]$ cd app/apache-maven-3.3.9/conf
[hadoop@hadoop001 conf]$ vi settings.xml
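The actual edit to settings.xml is not shown above; a minimal sketch, assuming the local repository should live in the /home/hadoop/maven_repo directory from the preface (adding a nearby mirror is optional and site-specific):
<!-- inside the top-level <settings> element -->
<localRepository>/home/hadoop/maven_repo</localRepository>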
6) Pre-build installation (compile protobuf)
[root@hadoop001 protobuf-2.5.0]# yum install -y gcc gcc-c++ make cmake
[root@hadoop001 protobuf-2.5.0]# ./configure --prefix=/usr/local/protobuf
[root@hadoop001 protobuf-2.5.0]# make && make install
7) Configure environment variables
[hadoop@hadoop001 ~]$ vi .bash_profile
# .bash_profile
# Get the aliases and functions
if [ -f ~/.bashrc ]; then
? ? ? ? . ~/.bashrc
fi
# User specific environment and startup programs
PATH=$PATH:$HOME/bin
export PATH
export JAVA_HOME=/usr/java/jdk1.7.0_80
export MVN_HOME=/home/hadoop/app/apache-maven-3.3.9
export FINDBUGS_HOME=/home/hadoop/app/findbugs-1.3.9
export PROTOC_HOME=/usr/local/protobuf
export PATH=$PROTOC_HOME/bin:$FINDBUGS_HOME/bin:$MVN_HOME/bin:$JAVA_HOME/bin:$PATH
[hadoop@hadoop001 ~]$ source .bash_profile
[hadoop@hadoop001 ~]$ which java
/usr/java/jdk1.7.0_80/bin/java
[hadoop@hadoop001 ~]$ which mvn
~/app/apache-maven-3.3.9/bin/mvn
[hadoop@hadoop001 ~]$ which findbugs
~/app/findbugs-1.3.9/bin/findbugs
[hadoop@hadoop001 ~]$ which protoc
/usr/local/protobuf/bin/protoc
8) Check software versions
[hadoop@hadoop001 ~]$ java -version
java version "1.7.0_80"
Java(TM) SE Runtime Environment (build 1.7.0_80-b15)
Java HotSpot(TM) 64-Bit Server VM (build 24.80-b11, mixed mode)
[hadoop@hadoop001 ~]$ mvn -version
Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-11T00:41:47+08:00)
Maven home: /home/hadoop/app/apache-maven-3.3.9
Java version: 1.7.0_80, vendor: Oracle Corporation
Java home: /usr/java/jdk1.7.0_80/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-431.el6.x86_64", arch: "amd64", family: "unix"
[hadoop@hadoop001 ~]$ findbugs -version
1.3.9
[hadoop@hadoop001 ~]$ protoc --version
libprotoc 2.5.0
9) Install remaining dependencies from the yum repositories
[root@hadoop001 ~]# yum install -y openssl openssl-devel svn ncurses-devel zlib-devel libtool
[root@hadoop001 ~]# yum install -y snappy snappy-devel bzip2 bzip2-devel lzo lzo-devel lzop autoconf automake
10) Start the build
[hadoop@hadoop001 ~]$ cd source/hadoop-2.6.0-cdh5.7.0
[hadoop@hadoop001 hadoop-2.6.0-cdh5.7.0]$ pwd
/home/hadoop/source/hadoop-2.6.0-cdh5.7.0
[hadoop@hadoop001 hadoop-2.6.0-cdh5.7.0]$ mvn clean package -Pdist,native -DskipTests -Dtar
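The first build downloads a large number of dependencies and can take quite a while. If it aborts with an out-of-memory error, a common fix is to raise Maven's heap before retrying (the values below are just an example):
[hadoop@hadoop001 hadoop-2.6.0-cdh5.7.0]$ export MAVEN_OPTS="-Xms512m -Xmx1024m"
[hadoop@hadoop001 hadoop-2.6.0-cdh5.7.0]$ mvn clean package -Pdist,native -DskipTests -Dtar
On success the binary tarball hadoop-2.6.0-cdh5.7.0.tar.gz is produced under hadoop-dist/target, which is what the next section deploys.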
3. HDFS pseudo-distributed deployment
1) Extract the compiled hadoop package
Move the compiled hadoop-2.6.0-cdh5.7.0.tar.gz to /home/hadoop/software
[hadoop@hadoop001 ~]$ cd source/hadoop-2.6.0-cdh5.7.0/hadoop-dist/target
[hadoop@hadoop001 target]$ cp hadoop-2.6.0-cdh5.7.0.tar.gz /home/hadoop/software
[hadoop@hadoop001 software]$ tar -zxvf hadoop-2.6.0-cdh5.7.0.tar.gz -C ~/app/
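Note: the tarball extracts to ~/app/hadoop-2.6.0-cdh5.7.0, while the paths used below reference ~/app/hadoop-2.6.0, so the directory is presumably renamed (or symlinked) first, for example:
[hadoop@hadoop001 ~]$ mv ~/app/hadoop-2.6.0-cdh5.7.0 ~/app/hadoop-2.6.0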
2) Configure environment variables
[hadoop@hadoop001 ~]$ vi .bash_profile
# .bash_profile
# Get the aliases and functions
if [ -f ~/.bashrc ]; then
? ? ? ? . ~/.bashrc
fi
# User specific environment and startup programs
PATH=$PATH:$HOME/bin
export PATH
export JAVA_HOME=/usr/java/jdk1.7.0_80
export MVN_HOME=/home/hadoop/app/apache-maven-3.3.9
export FINDBUGS_HOME=/home/hadoop/app/findbugs-1.3.9
export PROTOC_HOME=/usr/local/protobuf
export HADOOP_HOME=/home/hadoop/app/hadoop-2.6.0
export PATH=$HADOOP_HOME/bin:$PROTOC_HOME/bin:$FINDBUGS_HOME/bin:$MVN_HOME/bin:$JAVA_HOME/bin:$PATH
[hadoop@hadoop001 ~]$ source .bash_profile
[hadoop@hadoop001 ~]$ which hadoop
~/app/hadoop-2.6.0/bin/hadoop
3) Edit the configuration files
[hadoop@hadoop001 ~]$ cd /home/hadoop/app/hadoop-2.6.0/etc/hadoop
[hadoop@hadoop001 hadoop]$ vi hadoop-env.sh
export JAVA_HOME=/usr/java/jdk1.7.0_80
[hadoop@hadoop001 hadoop]$ vi core-site.xml
[hadoop@hadoop001 hadoop]$ vi hdfs-site.xml
[hadoop@hadoop001 hadoop]$ vi slaves
192.168.137.141
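The contents of core-site.xml and hdfs-site.xml are not shown above; a minimal pseudo-distributed sketch, assuming the NameNode runs on 192.168.137.141, data lives under the /home/hadoop/app/tmp directory from the preface, and the usual single-node choices of port 9000 and replication factor 1:
core-site.xml:
<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://192.168.137.141:9000</value>
    </property>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/home/hadoop/app/tmp</value>
    </property>
</configuration>
hdfs-site.xml:
<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>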
4) Configure passwordless SSH (trust)
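No commands are listed for this step; a typical single-node setup for the hadoop user looks like this (afterwards, ssh to the local address should no longer prompt for a password):
[hadoop@hadoop001 ~]$ ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
[hadoop@hadoop001 ~]$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
[hadoop@hadoop001 ~]$ chmod 700 ~/.ssh && chmod 600 ~/.ssh/authorized_keys
[hadoop@hadoop001 ~]$ ssh 192.168.137.141 date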
5) Format the NameNode
[hadoop@hadoop001 ~]$ hdfs namenode -format
6) Start HDFS
[hadoop@hadoop001 ~]$ cd app/hadoop-2.6.0/sbin
[hadoop@hadoop001 sbin]$ ./start-dfs.sh
[hadoop@hadoop001 sbin]$ jps
4066 DataNode
4376 Jps
4201 SecondaryNameNode
3976 NameNode
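As a quick sanity check (50070 is the default NameNode web UI port in Hadoop 2.x; the IP is the one configured in slaves above), try a simple HDFS operation and open the web UI:
[hadoop@hadoop001 ~]$ hdfs dfs -mkdir -p /user/hadoop
[hadoop@hadoop001 ~]$ hdfs dfs -ls /
Then browse to http://192.168.137.141:50070 and confirm that one live DataNode is reported.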