1. Download Hadoop
http://hadoop.apache.org/releases.html
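For example, the release tarball can be fetched from the Apache archive (a sketch only; the mirror and the x.y.z version are placeholders for the release you picked on the page above):
# wget https://archive.apache.org/dist/hadoop/common/hadoop-x.y.z/hadoop-x.y.z.tar.gz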
2. Extract the downloaded Hadoop archive
tar -zxvf /HADOOP-VERSION-FULL-PATH.tar.gz
3. Configure the Hadoop environment
# vim /etc/profile
## Add the HADOOP_HOME path
# export HADOOP_HOME=/HADOOP-VERSION-FULL-PATH
# export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
## save and quit vim with :wq
# source /etc/profile
## Create the Hadoop data directory
# sudo mkdir /var/hadoop
# sudo chmod 777 /var/hadoop
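To confirm the environment variables took effect, check that the hadoop command is now on the PATH:
# hadoop version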
4. Hadoop requires a Java runtime; configuring the Java environment is not covered here
5. Edit the Hadoop configuration files, mainly the following
- core-site.xml
# vim /HADOOP-VERSION-FULL-PATH/etc/hadoop/core-site.xml
Add the following inside the configuration element
<property>
<name>fs.defaultFS</name>
<value>hdfs://hadoop:9000</value>
</property>
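Note: the hostname hadoop in fs.defaultFS must resolve to this machine. If it does not, a minimal fix for a single-node setup (an assumption, adjust to your own hostname) is to map it to the loopback address:
# echo "127.0.0.1 hadoop" >> /etc/hosts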
- hdfs-site.xml
# vim /HADOOP-VERSION-FULL-PATH/etc/hadoop/hdfs-site.xml
Add the following inside the configuration element
<!-- a single-node Hadoop setup does not need replication -->
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
<property>
<name>dfs.namenode.name.dir</name>
<value>file:/var/hadoop/data/hdfs/nn</value>
</property>
<property>
<name>fs.checkpoint.dir</name>
<value>file:/var/hadoop/data/hdfs/snn</value>
</property>
<property>
<name>fs.checkpoint.edits.dir</name>
<value>file:/var/hadoop/data/hdfs/snn</value>
</property>
<property>
<name>dfs.datanode.data.dir</name>
<value>file:/var/hadoop/data/hdfs/dn</value>
</property>
- mapred-site.xml
# cp /HADOOP-VERSION-FULL-PATH/etc/hadoop/mapred-site.xml.template \
     /HADOOP-VERSION-FULL-PATH/etc/hadoop/mapred-site.xml
Add the following inside the configuration element
<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>
- yarn-site.xml
# vim /HADOOP-VERSION-FULL-PATH/etc/hadoop/yarn-site.xml
Add the following inside the configuration element
<property>
<name>yarn.nodemanager.aux-services</name>
<value>mapreduce_shuffle</value>
</property>
<property>
<name>yarn.nodemanager.aux-services.mapreduce_shuffle.class</name>
<value>org.apache.hadoop.mapred.ShuffleHandler</value>
</property>
- hadoop-env.sh
# vim /HADOOP-VERSION-FULL-PATH/etc/hadoop/hadoop-env.sh
Override with the correct JAVA_HOME
export JAVA_HOME=JAVA_HOME_PATH
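If you are unsure what to set JAVA_HOME to, one common way to locate it (assuming java is already on the PATH) is:
# readlink -f $(which java)
## strip the trailing /bin/java from the output to get the JAVA_HOME directory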
6. Format the HDFS filesystem
# /HADOOP-VERSION-FULL-PATH/bin/hdfs namenode -format
7. Start the HDFS services
# /HADOOP-VERSION-FULL-PATH/sbin/start-dfs.sh
8. Start the YARN services
# /HADOOP-VERSION-FULL-PATH/sbin/start-yarn.sh
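After both scripts finish, you can list the running Java daemons with jps (shipped with the JDK); on a single-node setup you would expect to see NameNode, DataNode, SecondaryNameNode, ResourceManager and NodeManager:
# jps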
9. Verify the running services through the web interfaces
http://127.0.0.1:50070   (NameNode web UI)
http://127.0.0.1:8088    (YARN ResourceManager web UI)
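As a further check, you can submit the bundled example MapReduce job (the exact jar file name depends on your Hadoop version, hence the wildcard; "pi 2 5" runs 2 map tasks with 5 samples each):
# /HADOOP-VERSION-FULL-PATH/bin/hadoop jar \
    /HADOOP-VERSION-FULL-PATH/share/hadoop/mapreduce/hadoop-mapreduce-examples-*.jar pi 2 5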
10. Configure passwordless SSH login
# cd ~/.ssh/                           ## if this directory does not exist, run ssh localhost once first
# ssh-keygen -t rsa                    ## press Enter at every prompt
# cat id_rsa.pub >> authorized_keys    ## authorize the key
# chmod 600 ./authorized_keys          ## fix the file permissions
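To verify, ssh localhost should now log in without prompting for a password:
# ssh localhost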