Note: I searched the web for plenty of docs on mounting HDFS with FUSE, and almost all of them turned out to be traps: either the versions were way too old or something else was broken, which really makes me wonder why... After n attempts, with plenty of tears along the way, I finally got the mount working, so I'm sharing the steps here in the hope of saving you some detours.
Environment setup
1. Install the dependency packages
yum -y install lzo-devel zlib-devel gcc gcc-c++ autoconf automake libtool openssl-devel fuse-devel cmake
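Since fuse-dfs depends on the FUSE kernel module, it may be worth checking that the module can be loaded before going any further (a quick sanity check I've added, not part of the original notes):
# load the FUSE kernel module and confirm it shows up
modprobe fuse
lsmod | grep fuse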
2. Install protobuf
wget https://github.com/google/protobuf/releases/download/v2.5.0/protobuf-2.5.0.tar.gz -O protobuf-2.5.0.tar.gz
tar -zxvf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure --prefix=/opt/protobuf-2.5.0
make && make install
vim /etc/profile
# add the following line
export PATH=/opt/protobuf-2.5.0/bin:$PATH
source /etc/profile
# verify the installation
protoc --version
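If the install went well, protoc should report the library version; for protobuf 2.5.0 the output should look like this:
libprotoc 2.5.0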
Mount HDFS with FUSE
Download the Hadoop source package; here I use hadoop-2.9.1-src.tar.gz.
tar -zxvf hadoop-2.9.1-src.tar.gz
cd hadoop-2.9.1-src
# build the native libraries; this takes a really long time...
mvn package -Pdist,native -DskipTests -Dtar
If Maven is not installed yet, set it up as follows:
wget http://mirrors.tuna.tsinghua.edu.cn/apache/maven/maven-3/3.3.9/binaries/apache-maven-3.3.9-bin.tar.gz
tar -zxvf apache-maven-3.3.9-bin.tar.gz
vim /etc/profile
# add the following two lines
M2_HOME=/root/soft/apache-maven-3.3.9 # the directory Maven was extracted into
export PATH=${M2_HOME}/bin:${PATH}
source /etc/profile
# verify
mvn -v
After a long wait, the build produces two important artifacts (you can verify them with the quick check after this list):
- /root/soft/hadoop-2.9.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/main/native/fuse-dfs
- /root/soft/hadoop-2.9.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/fuse-dfs/fuse_dfs_wrapper.sh
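A quick way to confirm both artifacts are in place (a minimal check, assuming the source tree sits under /root/soft as above and that the fuse_dfs binary lands inside the first directory, which is what the wrapper's PATH setting implies):
ls /root/soft/hadoop-2.9.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/main/native/fuse-dfs/fuse_dfs
ls /root/soft/hadoop-2.9.1-src/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/fuse-dfs/fuse_dfs_wrapper.sh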
Copy fuse_dfs_wrapper.sh out of the source tree and modify it as follows:
#!/usr/bin/env bash
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# a few variables added for this setup
HADOOP_HOME=/opt/hadoop-2.9.1
HADOOP_PREFIX=/root/soft/hadoop-2.9.1-src
HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
if [ "$HADOOP_PREFIX" = "" ]; then
echo "HADOOP_PREFIX is empty. Set it to the root directory of Hadoop source code"
exit 1
fi
export FUSEDFS_PATH="$HADOOP_PREFIX/hadoop-hdfs-project/hadoop-hdfs-native-client/target/main/native/fuse-dfs"
#export LIBHDFS_PATH="$HADOOP_PREFIX/hadoop-hdfs-project/hadoop-hdfs-native-client/target/usr/local/lib"
# replace the line above with the one below
export LIBHDFS_PATH="$HADOOP_PREFIX/hadoop-hdfs-project/hadoop-hdfs-native-client/target/native/target/usr/local/lib"
if [ "$OS_ARCH" = "" ]; then
export OS_ARCH=amd64
fi
if [ "$JAVA_HOME" = "" ]; then
export JAVA_HOME=/usr/local/java
fi
if [ "$LD_LIBRARY_PATH" = "" ]; then
export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/$OS_ARCH/server:/usr/local/lib
fi
while IFS= read -r -d '' file
do
export CLASSPATH=$CLASSPATH:$file
done < <(find "$HADOOP_PREFIX/hadoop-client" -name "*.jar" -print0)
while IFS= read -r -d '' file
do
export CLASSPATH=$CLASSPATH:$file
done < <(find "$HADOOP_PREFIX/hadoop-hdfs-project" -name "*.jar" -print0)
export CLASSPATH=$HADOOP_CONF_DIR:$CLASSPATH
export PATH=$FUSEDFS_PATH:$PATH
export LD_LIBRARY_PATH=$LIBHDFS_PATH:$JAVA_HOME/jre/lib/$OS_ARCH/server
fuse_dfs "$@"
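Depending on how you copied it, the script may have lost its execute bit, so it may need to be made executable first:
chmod +x fuse_dfs_wrapper.sh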
The last step: mount
mkdir /mnt/fuse_hdfs
./fuse_dfs_wrapper.sh hdfs://namenode:9000 /mnt/fuse_hdfs/
You should now be able to browse the data stored in HDFS.
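To double-check that the mount took effect (a simple verification, not part of the original steps):
# the mount point should show up in the mount table
mount | grep /mnt/fuse_hdfs
# and the HDFS root should be listable through it
ls /mnt/fuse_hdfs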
Unmount the mount point
umount /mnt/fuse_hdfs
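Since this is a FUSE mount, it can also be detached with the FUSE userspace tool, and a lazy unmount helps if the mount point is reported as busy:
fusermount -u /mnt/fuse_hdfs
# or, if umount complains that the target is busy
umount -l /mnt/fuse_hdfs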