hive server
hive 1
Start it with:
hive --service hiveserver
JDBC connection:
org.apache.hadoop.hive.jdbc.HiveDriver
jdbc:hive://172.22.1.100:10000/test
hive2
hive --service hiveserver2
JDBC connection:
org.apache.hive.jdbc.HiveDriver
jdbc:hive2://172.22.1.100:10000/test
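For reference, a minimal HiveServer2 JDBC client looks like the sketch below (the class name HiveJdbcTest is made up for illustration; the user/password pair is assumed to match an entry in the custom authentication file configured later in this post):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcTest {
    public static void main(String[] args) throws Exception {
        // register the HiveServer2 JDBC driver
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        // user/password are checked by hiveserver2's authenticator
        Connection conn = DriverManager.getConnection(
                "jdbc:hive2://172.22.1.100:10000/test", "user", "password");
        Statement stmt = conn.createStatement();
        ResultSet rs = stmt.executeQuery("show tables");
        while (rs.next()) {
            System.out.println(rs.getString(1));
        }
        rs.close();
        stmt.close();
        conn.close();
    }
}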
Enable remote write access (turn off HDFS permission checking)
hdfs-site.xml
<property>
<name>dfs.permissions</name>
<value>false</value>
</property>
Configure a custom authentication policy
Hive supports several authentication modes: NONE, NOSASL, KERBEROS, LDAP, PAM, CUSTOM, etc. The default is NONE, which performs no verification at all, so some other policy is needed; I chose CUSTOM (a user-defined policy).
If hive.server2.authentication is set to CUSTOM, then hive.server2.custom.authentication.class must be set to the class that performs the authentication check; this class must implement the org.apache.hive.service.auth.PasswdAuthenticationProvider interface.
First, implement the custom class:
package com.imis.project;
import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import javax.security.sasl.AuthenticationException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hive.service.auth.PasswdAuthenticationProvider;
public class CustomHiveServer2Auth implements PasswdAuthenticationProvider {
@Override
public void Authenticate(String username, String password)
throws AuthenticationException {
boolean ok = false;
String passMd5 = new MD5().md5(password);
HiveConf hiveConf = new HiveConf();
Configuration conf = new Configuration(hiveConf);
String filePath=null;
File file=null;
try {
filePath=conf.get("hive.server2.custom.authentication.file");
System.out.println("hive.server2.custom.authentication.file ["
+ filePath + "] ..");
file = new File(filePath);
} catch (Exception e) {
System.out.println("not found!");
}
BufferedReader reader = null;
try {
reader = new BufferedReader(new FileReader(file));
String tempString = null;
while ((tempString = reader.readLine()) != null) {
String[] datas = tempString.split(",", -1);
if(datas.length != 2) continue;
// username and password hash both match
if(datas[0].equals(username) && datas[1].equals(passMd5)) {
ok = true;
break;
}
}
} catch (Exception e) {
e.printStackTrace();
throw new AuthenticationException("read auth config file error, [" + filePath + "] ..", e);
} finally {
if (reader != null) {
try {
reader.close();
} catch (IOException e1) {}
}
}
if(ok) {
System.out.println("user [" + username + "] auth check ok .. ");
} else {
System.out.println("user [" + username + "] auth check fail .. ");
throw new AuthenticationException("user [" + username + "] auth check fail .. ");
}
}
// MD5 hashing helper
class MD5 {
private MessageDigest digest;
private char hexDigits[] = {'0','1','2','3','4','5','6','7',
'8','9','a','b','c','d','e','f'};
public MD5() {
try {
digest = MessageDigest.getInstance("MD5");
} catch (NoSuchAlgorithmException e) {
throw new RuntimeException(e);
}
}
public String md5(String str) {
byte[] btInput = str.getBytes();
digest.reset();
digest.update(btInput);
byte[] md = digest.digest();
// convert the digest bytes to a hex string
int j = md.length;
char strChar[] = new char[j * 2];
int k = 0;
for (int i = 0; i < j; i++) {
byte byte0 = md[i];
strChar[k++] = hexDigits[byte0 >>> 4 & 0xf];
strChar[k++] = hexDigits[byte0 & 0xf];
}
return new String(strChar);
}
}
}
Then package the project, preserving the package structure, into a jar named HiveServer2Auth.jar and place it under $HIVE_HOME/lib.
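One way to compile and package the class (a sketch; the classpath wildcards assume the usual jar locations for Hive and Hadoop 2, so adjust to your layout):

javac -cp "$HIVE_HOME/lib/*:$HADOOP_HOME/share/hadoop/common/*" com/imis/project/CustomHiveServer2Auth.java
jar cf HiveServer2Auth.jar com/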
hive-site.xml
<property>
<name>hive.server2.authentication</name>
<value>CUSTOM</value>
</property>
<property>
<name>hive.server2.custom.authentication.class</name>
<value>com.imis.project.CustomHiveServer2Auth</value>
</property>
<property>
<name>hive.server2.custom.authentication.file</name>
<value>/usr/local/apache-hive-0.13.1-bin/conf/hive.server2.users.conf</value>
</property>
Create the file hive.server2.users.conf under $HIVE_HOME/conf with content like:
[root@dev conf]# cat hive.server2.users.conf
user,5f4dcc3b5aa765d61d8327deb882cf99
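The second field is the MD5 hash of the plaintext password; the value above is simply the MD5 of the string "password". On Linux you can generate an entry with, for example:

echo -n password | md5sum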
Then one more important thing: configure Hive's logging.
In Hive's conf directory, run:
cp hive-log4j.properties.template hive-log4j.properties
and modify:
hive.log.dir=/home/hadoop/hive/conf # default storage location
hive.log.file=hive.log # default file name
Finally, create the hive.log file in that directory, and you will be able to see Hive's system log there.
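You can then follow it live, for example (path per the hive.log.dir value set above):

tail -f /home/hadoop/hive/conf/hive.log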
You can also configure job logging the same way if needed.
Start hiveserver2, and access control is now in place.
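To check authentication end to end, you can also connect with beeline (shipped with Hive), using a user from hive.server2.users.conf:

beeline -u jdbc:hive2://172.22.1.100:10000/test -n user -p password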
Problems invoking MapReduce remotely
Return code 1 indicates a user-authentication error.
Return code 2 indicates a problem with the parameters.
In this case, checking hive.log showed:
user=imis, access=EXECUTE, inode="/tmp/hadoop-yarn":hadoop:supergroup:rwx----)
Solution: this is a permissions problem, discovered by running
hadoop fs -ls /tmp
which shows two directories, hadoop-yarn and hive; under /tmp/hive there is a directory for each username from the custom authentication file, e.g. user.
That user cannot access the hadoop-yarn directory because it lacks permission, so first grant permissions on the user directories under /tmp/hive, then change the permissions on hadoop-yarn:
hdfs dfs -chmod -R 777 /tmp/hadoop-yarn
hdfs dfs -chmod -R 777 /tmp/hive
Note: hadoop-yarn and hive here are owned by user hadoop, group supergroup. Curiously, the root user does not have permission to change these permissions; only the hadoop user can grant 777.