The harder the storing, the easier the retrieving.
HDFS Client Operations --- Development Environment Setup
Step 1: Compile the Hadoop jar package that matches your Hadoop version and configure the Hadoop environment variables.
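If you want to confirm that the variables from Step 1 are actually visible to the JVM, the following small check can help. It is a minimal sketch, not part of the original setup; the class name HadoopEnvCheck is only illustrative. It prints HADOOP_HOME and the hadoop.home.dir system property, which the Hadoop client looks for on Windows.

public class HadoopEnvCheck {
    public static void main(String[] args) {
        // HADOOP_HOME should point at the unpacked Hadoop directory (containing bin\winutils.exe on Windows)
        String fromEnv = System.getenv("HADOOP_HOME");
        // hadoop.home.dir is the system-property alternative that Hadoop also checks
        String fromProp = System.getProperty("hadoop.home.dir");
        System.out.println("HADOOP_HOME     = " + fromEnv);
        System.out.println("hadoop.home.dir = " + fromProp);
        if (fromEnv == null && fromProp == null) {
            System.err.println("Neither variable is set; HDFS client calls on Windows will likely fail.");
        }
    }
}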
Step 2: Create a Maven project and add the POM dependencies.
<dependencies>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>RELEASE</version>
    </dependency>
    <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-core</artifactId>
        <version>2.8.2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>2.7.2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>2.7.2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-hdfs</artifactId>
        <version>2.7.2</version>
    </dependency>
</dependencies>
Step 3: Create the log configuration file.
Create log4j.properties under the resources directory with the following contents:
log4j.rootLogger=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d %p [%c] - %m%n
log4j.appender.logfile=org.apache.log4j.FileAppender
log4j.appender.logfile.File=target/spring.log
log4j.appender.logfile.layout=org.apache.log4j.PatternLayout
log4j.appender.logfile.layout.ConversionPattern=%d %p [%c] - %m%n
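To check that this configuration is picked up from the classpath, here is a minimal sketch, not part of the original article. It assumes the log4j 1.x classes that the Hadoop dependencies above pull in transitively; the LogCheck class name is only illustrative.

import org.apache.log4j.Logger;

public class LogCheck {
    // The Logger is configured from log4j.properties found on the classpath (the resources directory)
    private static final Logger LOG = Logger.getLogger(LogCheck.class);

    public static void main(String[] args) {
        LOG.info("log4j is configured; this line should appear on the console.");
    }
}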
Step 4: Create the HDFSClient class.
import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.junit.Test;

public class HDFSClient {

    @Test
    public void testMkdirs() throws IOException, InterruptedException, URISyntaxException {
        // 1. Build the client configuration
        Configuration configuration = new Configuration();
        // 2. Connect to the cluster (NameNode URI, configuration, user name)
        FileSystem fs = FileSystem.get(new URI("hdfs://hadoop102:9000"), configuration, "zhutiansama");
        // 3. Create the directory
        fs.mkdirs(new Path("/1108/daxian/banzhang"));
        // 4. Release the resources
        fs.close();
    }
}
When the client operates on HDFS, it does so under a user identity. By default, the HDFS client API reads a JVM parameter as its user identity: -DHADOOP_USER_NAME=zhutiansama, where zhutiansama is the user name. With this, the Hadoop development environment on Windows is complete, and you can operate Hadoop through the API.
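As a hedged illustration of the two common ways to supply that user identity, the sketch below sets HADOOP_USER_NAME programmatically, which is equivalent to the -D JVM argument above, and then passes the user name directly to FileSystem.get as the test class does. The host, port, and user name are the ones used in this article; the UserIdentityExample class name is only illustrative.

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class UserIdentityExample {
    public static void main(String[] args) throws Exception {
        // Option 1: set the identity before any FileSystem is created,
        // equivalent to passing -DHADOOP_USER_NAME=zhutiansama to the JVM
        System.setProperty("HADOOP_USER_NAME", "zhutiansama");
        FileSystem fs1 = FileSystem.get(new URI("hdfs://hadoop102:9000"), new Configuration());
        System.out.println("Connected to: " + fs1.getUri());
        fs1.close();

        // Option 2: pass the user name explicitly, as the HDFSClient test does
        FileSystem fs2 = FileSystem.get(new URI("hdfs://hadoop102:9000"), new Configuration(), "zhutiansama");
        System.out.println("Connected to: " + fs2.getUri());
        fs2.close();
    }
}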
Related resources
1. Companion GitHub repository for this article: https://github.com/zhutiansama/FocusBigData