Follow the public account: 分享電腦學習
Reply "百度云盤" to get the code for all the tutorial documents for free (updated from time to time)
Cloud drive directory guide:
the tools directory holds the installer packages
the res directory holds the code and resources for each lesson
the doc directory holds third-party documentation and tools
This follows on from the previous article, "New Visitor Count MR Statistics: Writing MR Output to MySQL" (《新增訪客數(shù)量MR統(tǒng)計之MR數(shù)據(jù)輸出到MySQL》).
Hive 1.2.1 can map directly onto a table that already exists in HBase.
If you want to create a table in Hive when the corresponding HBase table does not exist yet, and still have the mapping set up, use the recompiled build hive-1.2.1-hbase.
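For that second case, a minimal sketch (the table name and columns below are made up for illustration): leaving out the EXTERNAL keyword makes the HBaseStorageHandler create the backing HBase table when it does not already exist.
-- Hypothetical managed (non-EXTERNAL) Hive table backed by HBase;
-- with the hive-1.2.1-hbase build, creating it also creates the HBase table.
CREATE TABLE event_log_new (
  key string,
  pl string,
  ver string)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,info:pl,info:ver")
TBLPROPERTIES ("hbase.table.name" = "event_log_new");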
1. Create an external table in Hive, mapped to the HBase table
CREATE EXTERNAL TABLE event_log_20180728 (
  key string,
  pl string,
  ver string,
  s_time string,
  u_ud string,
  u_sd string,
  en string)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,info:pl,info:ver,info:s_time,info:u_ud,info:u_sd,info:en")
TBLPROPERTIES ("hbase.table.name" = "event_log_20180728");
Count how many new users there are:
select count(*) from event_log_20180728 where en = "e_l";
2. Extract the data, apply an initial filtering pass, and save the result into a temporary table
Create the temporary table:
CREATE TABLE stats_hourly_tmp01 (
  pl string,
  ver string,
  s_time string,
  u_ud string,
  u_sd string,
  en string,
  `date` string,
  hour int
);
Load the raw data into the temporary table:
INSERT OVERWRITE TABLE stats_hourly_tmp01
SELECT pl, ver, s_time, u_ud, u_sd, en,
  from_unixtime(cast(s_time/1000 as bigint), 'yyyy-MM-dd'),
  hour(from_unixtime(cast(s_time/1000 as bigint), 'yyyy-MM-dd HH:mm:ss'))
FROM event_log_20180728
WHERE en = "e_l" or en = "e_pv";
SELECT from_unixtime(cast(s_time/1000 as bigint), 'yyyy-MM-dd'),
  from_unixtime(cast(s_time/1000 as bigint), 'yyyy-MM-dd HH:mm:ss')
FROM event_log_20180728;
Check the result.
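To make the conversion concrete, a worked example with a made-up timestamp (the rendered clock time depends on the session time zone; the comment assumes UTC+8):
-- 1532745000000 ms / 1000 = 1532745000 seconds since the epoch,
-- which renders as 2018-07-28 10:30:00 in a UTC+8 time zone.
SELECT from_unixtime(cast(1532745000000 / 1000 as bigint), 'yyyy-MM-dd HH:mm:ss');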
3. Analysis of the individual KPIs
Create a temporary table to hold the results:
CREATE TABLE stats_hourly_tmp02 (
  pl string,
  ver string,
  `date` string,
  kpi string,
  hour int,
  value int
);
Count the users: there are as many users as distinct u_ud values. Here these are the new-install users, since the query filters on en = "e_l".
The platform dimension is (name, version).
INSERT OVERWRITE TABLE stats_hourly_tmp02
SELECT pl, ver, `date`, 'hourly_new_install_users' as kpi, hour, COUNT(distinct u_ud) as v
FROM stats_hourly_tmp01
WHERE en = "e_l"
GROUP BY pl, ver, `date`, hour;
Check the result:
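For example, with a quick inspection query (a sketch):
select * from stats_hourly_tmp02 where kpi = 'hourly_new_install_users' limit 10;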
Compute the session-length KPI.
Session length = time of the last record in a session - time of the first record = maxtime - mintime
Steps:
1. Compute each session's length: group by u_sd
2. Sum the session lengths within each hour bucket
The platform dimension is (name, version).
INSERT INTO TABLE stats_hourly_tmp02
SELECT pl, ver, `date`, 'hourly_session_length' as kpi, hour, sum(s_length)/1000 as v
FROM (
  SELECT pl, ver, `date`, hour, u_sd, (max(s_time) - min(s_time)) as s_length
  FROM stats_hourly_tmp01
  GROUP BY pl, ver, `date`, hour, u_sd
) tmp
GROUP BY pl, ver, `date`, hour;
Check the result:
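For example, listing the per-hour session lengths (a sketch):
select `date`, hour, value from stats_hourly_tmp02 where kpi = 'hourly_session_length' order by `date`, hour;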
Convert the tmp02 data into the same structure as the MySQL table.
This is a narrow-to-wide pivot => the converted result is saved into another temporary table.
CREATE TABLE stats_hourly_tmp03 (
  pl string, ver string, `date` string, kpi string,
  hour00 int, hour01 int, hour02 int, hour03 int,
  hour04 int, hour05 int, hour06 int, hour07 int,
  hour08 int, hour09 int, hour10 int, hour11 int,
  hour12 int, hour13 int, hour14 int, hour15 int,
  hour16 int, hour17 int, hour18 int, hour19 int,
  hour20 int, hour21 int, hour22 int, hour23 int
);
INSERT OVERWRITE TABLE stats_hourly_tmp03
SELECT pl, ver, `date`, kpi,
  max(case when hour = 0 then value else 0 end) as h0,
  max(case when hour = 1 then value else 0 end) as h1,
  max(case when hour = 2 then value else 0 end) as h2,
  max(case when hour = 3 then value else 0 end) as h3,
  max(case when hour = 4 then value else 0 end) as h4,
  max(case when hour = 5 then value else 0 end) as h5,
  max(case when hour = 6 then value else 0 end) as h6,
  max(case when hour = 7 then value else 0 end) as h7,
  max(case when hour = 8 then value else 0 end) as h8,
  max(case when hour = 9 then value else 0 end) as h9,
  max(case when hour = 10 then value else 0 end) as h10,
  max(case when hour = 11 then value else 0 end) as h11,
  max(case when hour = 12 then value else 0 end) as h12,
  max(case when hour = 13 then value else 0 end) as h13,
  max(case when hour = 14 then value else 0 end) as h14,
  max(case when hour = 15 then value else 0 end) as h15,
  max(case when hour = 16 then value else 0 end) as h16,
  max(case when hour = 17 then value else 0 end) as h17,
  max(case when hour = 18 then value else 0 end) as h18,
  max(case when hour = 19 then value else 0 end) as h19,
  max(case when hour = 20 then value else 0 end) as h20,
  max(case when hour = 21 then value else 0 end) as h21,
  max(case when hour = 22 then value else 0 end) as h22,
  max(case when hour = 23 then value else 0 end) as h23
FROM stats_hourly_tmp02
GROUP BY pl, ver, `date`, kpi;
select hour14, hour15, hour16 from stats_hourly_tmp03;
Result:
Convert the dimension attribute values into IDs, using UDFs for the conversion:
1. Copy all the custom Hive UDFs from the udf folder into the project
2. Package them with run maven install
3. Upload the resulting jar file to the /jar folder on HDFS
4. Create the custom functions in Hive with the following commands:
create function dateconverter as 'com.xlgl.wzy.hive.udf.DateDimensionConverterUDF' using jar 'hdfs://master:9000/jar/transformer-0.0.1.jar';
create function kpiconverter as 'com.xlgl.wzy.hive.udf.KpiDimensionConverterUDF' using jar 'hdfs://master:9000/jar/transformer-0.0.1.jar';
create function platformconverter as 'com.xlgl.wzy.hive.udf.PlatformDimensionConverterUDF' using jar 'hdfs://master:9000/jar/transformer-0.0.1.jar';
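As a quick sanity check that the functions registered, a sketch (the argument values below are made up; the signatures follow how the functions are used in the final INSERT further down):
-- List registered functions; the three converters should appear.
SHOW FUNCTIONS;
-- Sample invocations with made-up argument values.
SELECT dateconverter('2018-07-28', 'day'), kpiconverter('hourly_new_install_users'), platformconverter('website', '1.0');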
Create the final table in Hive, matching the MySQL table structure:
CREATE TABLE stats_hourly (
  platform_dimension_id int,
  date_dimension_id int,
  kpi_dimension_id int,
  hour00 int, hour01 int, hour02 int, hour03 int,
  hour04 int, hour05 int, hour06 int, hour07 int,
  hour08 int, hour09 int, hour10 int, hour11 int,
  hour12 int, hour13 int, hour14 int, hour15 int,
  hour16 int, hour17 int, hour18 int, hour19 int,
  hour20 int, hour21 int, hour22 int, hour23 int
);
INSERT OVERWRITE TABLE stats_hourly
SELECT
  platformconverter(pl, ver), dateconverter(`date`, 'day'), kpiconverter(kpi),
  hour00, hour01, hour02, hour03,
  hour04, hour05, hour06, hour07,
  hour08, hour09, hour10, hour11,
  hour12, hour13, hour14, hour15,
  hour16, hour17, hour18, hour19,
  hour20, hour21, hour22, hour23
FROM stats_hourly_tmp03;
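Before exporting, a quick row-count sanity check (a sketch):
select count(*) from stats_hourly;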
Export from Hive to MySQL with Sqoop. Hive's default field delimiter is '\001', which is why --input-fields-terminated-by is set below:
bin/sqoop export \
--connect jdbc:mysql://master:3306/test \
--username root \
--password 123456 \
--table stats_hourly \
--export-dir /user/hive/warehouse/log_lx.db/stats_hourly \
-m 1 \
--input-fields-terminated-by '\001'
Query MySQL to check the exported data.
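For example (a sketch, run in the mysql client; the test database comes from the --connect URL above):
SELECT platform_dimension_id, date_dimension_id, kpi_dimension_id, hour09, hour10, hour11
FROM test.stats_hourly
LIMIT 10;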