Hive in Practice

Hive Commands and Usage

!clear;
exit;
show databases;

// View table structure;
desc tb_student;
desc formatted tb_student; // detailed information, formatted output;

// View Hive's detailed execution log; the log directory is configured in conf/hive-log4j.properties:
tail -f /tmp/bigdata/hive.log 
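The directory itself comes from properties in that file; a minimal sketch of the relevant entries (names follow the stock hive-log4j.properties template, and the values here are only assumptions matching the path above):

# assumed entries in conf/hive-log4j.properties
hive.log.dir=/tmp/${user.name}
hive.log.file=hive.log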

Run a Hive command directly from the shell:
hive -e "select * from tb_student"
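Besides -e, the CLI can also run a script file and take per-session settings; a small sketch (the script path /tmp/query.hql is only an illustrative name):

# run every statement in a script file
hive -f /tmp/query.hql
# print column headers for this session only
hive --hiveconf hive.cli.print.header=true -e "select * from tb_student"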

Beeline commands:

beeline
!connect jdbc:hive2://ldsver53:10000/default
# Format: !connect jdbc:hive2://hostname:[HiveServer2 port]/database

beeline -u jdbc:hive2://ldsver53:10000 -n bigdata

show databases;

!exit    # exit the Beeline shell
!help    # show command help

Usage: java org.apache.hive.cli.beeline.BeeLine
-u <database url> the JDBC URL to connect to
-n <username> the username to connect as
-p <password> the password to connect as
-d <driver class> the driver class to use
-i <init file> script file for initialization
-e <query> query that should be executed
-f <exec file> script file that should be executed
-w (or) --password-file <password file> the password file to read password from
--hiveconf property=value Use value for given property
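These options can be combined to run Beeline non-interactively; a sketch using the host and user from the examples above (the script path is hypothetical):

# connect, run one query, and exit
beeline -u jdbc:hive2://ldsver53:10000/default -n bigdata -e "show databases;"
# run a script file instead of an inline query
beeline -u jdbc:hive2://ldsver53:10000/default -n bigdata -f /tmp/query.hql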

Hive Deployment

Hive configuration

  1. Configure environment variables
export HIVE_HOME=/home/app/hadoop/hive
export HIVE_CONF_DIR=$HIVE_HOME/conf
export HIVE_AUX_JARS_PATH=$HIVE_HOME/lib/*
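A quick way to confirm the variables took effect, assuming the exports were appended to /etc/profile (adjust if ~/.bashrc is used instead):

source /etc/profile
echo $HIVE_HOME        # should print /home/app/hadoop/hive
hive --version         # should print the installed Hive release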

  2. Edit hive-site.xml

First, set up MySQL as the metastore database.

// 1. Create the metastore user in MySQL and grant privileges;

mysql> CREATE USER 'hive'@'metastorehost' IDENTIFIED BY 'mypassword';
mysql> REVOKE ALL PRIVILEGES, GRANT OPTION FROM 'hive'@'metastorehost';
mysql> GRANT ALL PRIVILEGES ON metastore.* TO 'hive'@'metastorehost';
mysql> FLUSH PRIVILEGES;
mysql> quit;
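A quick connectivity check before wiring Hive to it (a sketch; assumes the mysql client is available on the metastore host):

# log in as the new user and make sure the grant works
mysql -u hive -p -h metastorehost -e "show databases;"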

  <!-- JDBC driver class -->
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>

  <!-- JDBC connection URL -->
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://ldsver53:3306/hive?createDatabaseIfNotExist=true&amp;useSSL=false</value>
  </property>

  <!-- MySQL username -->
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>bigdata</value>
  </property>
  
  <!-- MySQL password -->
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hq.2020.DB</value>
  </property>
  <property>
    <name>hive.metastore.schema.verification</name>
    <value>false</value>
  </property> 

  <property>
    <name>datanucleus.autoCreateSchema</name>
    <value>true</value>
  </property>
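If clients are to go through the standalone metastore service started in the next step, hive.metastore.uris should also point at it; a sketch assuming the default Thrift port 9083 on ldsver53:

  <!-- optional: address of a standalone metastore service (host/port are assumptions) -->
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://ldsver53:9083</value>
  </property>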
  3. Initialize and test the metastore
// Initialize and verify the metastore schema
schematool -initSchema -dbType mysql -verbose

// If initialization fails, try the following: start the metastore service and check its output;
hive --service metastore & 

// Test the hive CLI
hive 
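A minimal smoke test once the CLI starts cleanly (the database and table names here are examples only):

# create a throwaway database and table, then list them
hive -e "create database if not exists test_db"
hive -e "use test_db; create table tb_student(id int, name string); show tables;"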

Hive-Related Errors

HiveMetaException: Failed to get schema version

org.apache.hadoop.hive.metastore.HiveMetaException: Failed to get schema version.
*** schemaTool failed ***

Error: Unable to instantiate SessionHiveMetaStoreClient

hive> show databases;
FAILED: SemanticException org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient

DefaultAuthorizationProvider.checkFsPermission(): the NameNode permission check fails:
denied: user=anonymous, access=EXECUTE, inode="/tmp":bigdata:supergroup:drwx-----

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
Query ID = bigdata_20200607211818_63bab830-07e5-4cd3-8963-8f95fc07b93b
Total jobs = 3
Launching Job 1 out of 3
Number of reduce tasks is set to 0 since there's no reduce operator
org.apache.hadoop.security.AccessControlException: Permission denied: user=anonymous, access=EXECUTE, inode="/tmp":bigdata:supergroup:drwx------
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:201)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:154)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3885)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3868)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkOwner(FSDirectory.java:3833)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkOwner(FSNamesystem.java:6793)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setPermissionInt(FSNamesystem.java:1957)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setPermission(FSNamesystem.java:1937)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.setPermission(NameNodeRpcServer.java:665)
    at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.setPermission(AuthorizationProviderProxyClientProtocol.java:179)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.setPermission(ClientNamenodeProtocolServerSideTranslatorPB.java:466)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2278)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2274)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2272)

    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
    at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
    at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
    at org.apache.hadoop.hdfs.DFSClient.setPermission(DFSClient.java:2524)
    at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1441)
    at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1437)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.setPermission(DistributedFileSystem.java:1437)
    at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:616)
    at org.apache.hadoop.mapreduce.JobResourceUploader.uploadFiles(JobResourceUploader.java:94)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:99)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:194)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1307)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1304)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1304)
    at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:578)
    at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:573)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:573)
    at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:564)
    at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:436)
    at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:142)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:214)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:99)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2054)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1750)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1503)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1287)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1282)
    at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:236)
    at org.apache.hive.service.cli.operation.SQLOperation.access$300(SQLOperation.java:89)
    at org.apache.hive.service.cli.operation.SQLOperation$3$1.run(SQLOperation.java:301)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
    at org.apache.hive.service.cli.operation.SQLOperation$3.run(SQLOperation.java:314)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=anonymous, access=EXECUTE, inode="/tmp":bigdata:supergroup:drwx------
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:201)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:154)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
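A common workaround for this permission error (a sketch, not necessarily the fix used here): connect with an explicit user instead of anonymous, or relax the HDFS permissions on /tmp:

# connect to HiveServer2 as the HDFS-side user rather than anonymous
beeline -u jdbc:hive2://ldsver53:10000/default -n bigdata
# or, as the HDFS superuser, open up the /tmp staging directory
hdfs dfs -chmod -R 777 /tmp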