Hive Execution Plan Examples

Example execution plan for the query:

insert overwrite TABLE lpx SELECT t1.bar, t1.foo, t2.foo FROM pokes t1 JOIN invites t2 ON (t1.bar = t2.bar) ;

OK

ABSTRACT SYNTAX TREE:

(TOK_QUERY (TOK_FROM (TOK_JOIN (TOK_TABREF (TOK_TABNAME pokes) t1) (TOK_TABREF (TOK_TABNAME invites) t2) (= (. (TOK_TABLE_OR_COL t1) bar) (. (TOK_TABLE_OR_COL t2) bar)))) (TOK_INSERT (TOK_DESTINATION (TOK_TAB (TOK_TABNAME lpx))) (TOK_SELECT (TOK_SELEXPR (. (TOK_TABLE_OR_COL t1) bar)) (TOK_SELEXPR (. (TOK_TABLE_OR_COL t1) foo)) (TOK_SELEXPR (. (TOK_TABLE_OR_COL t2) foo)))))

STAGE DEPENDENCIES:

Stage-1 is a root stage              // root
Stage-0 depends on stages: Stage-1   // Stage-0 depends on Stage-1
Stage-2 depends on stages: Stage-0   // Stage-2 depends on Stage-0

STAGE PLANS:

Stage: Stage-1
  Map Reduce                          // this stage is a MapReduce job
    Alias -> Map Operator Tree:       // map operator tree, i.e. the map phase
      t1
        TableScan                     // scan the table for the FROM clause; the description carries row counts, sizes, etc.
          alias: t1                   // table alias
          Reduce Output Operator      // describes the map output, i.e. the reduce input: key, partitioning, sort order, etc.
            key expressions:          // key that t1 emits to the reduce phase
                  expr: bar
                  type: string
            sort order: +             // one sort column (the key bar); multiple sort columns show multiple +'s
            Map-reduce partition columns:   // partitioning info: for a join, Hive partitions on the column(s) in the ON clause, so rows with equal values land in the same reducer
                  expr: bar
                  type: string
            tag: 0                    // tag marking rows that came from t1
            value expressions:        // values that t1 emits to the reduce phase
                  expr: foo
                  type: int
                  expr: bar
                  type: string
      t2
        TableScan
          alias: t2
          Reduce Output Operator
            key expressions:
                  expr: bar
                  type: string
            sort order: +
            Map-reduce partition columns:
                  expr: bar
                  type: string
            tag: 1
            value expressions:
                  expr: foo
                  type: int
    Reduce Operator Tree:             // reduce operator tree, i.e. the reduce phase
      Join Operator
        condition map:
             Inner Join 0 to 1
        condition expressions:
          0 {VALUE._col0} {VALUE._col1}   // t1.bar, t1.foo from the query
          1 {VALUE._col0}                 // t2.foo from the query
        handleSkewJoin: false
        outputColumnNames: _col0, _col1, _col5
        Select Operator               // project columns; the description lists names, types, and output sizes
          expressions:
                expr: _col1
                type: string
                expr: _col0
                type: int
                expr: _col5
                type: int
          outputColumnNames: _col0, _col1, _col2   // temporary names assigned to the intermediate result columns
          File Output Operator        // write the result to a temporary file; the description covers compression and file formats
            compressed: false
            GlobalTableId: 1
            table:
                input format: org.apache.hadoop.mapred.TextInputFormat
                output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                name: default.lpx

Stage: Stage-0
  Move Operator                       // Stage-0 simply moves the result from the temporary directory into the directory backing table lpx
    tables:
      replace: true
      table:
          input format: org.apache.hadoop.mapred.TextInputFormat
          output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
          serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
          name: default.lpx

Stage: Stage-2
  Stats-Aggr Operator

========================================

Starting from the header:

STAGE DEPENDENCIES:

Stage-1 is a root stage

Stage-0 depends on stages: Stage-1

Stage-2 depends on stages: Stage-0

This shows the job structure of the plan: the work is split into three jobs. The first job consists of Stage-1; the second consists of Stage-0, which must wait for the result of Stage-1; the third consists of Stage-2, which must wait for the result of Stage-0.
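The scheduler simply runs each stage after every stage it depends on has finished. A minimal Python sketch of that dependency-driven ordering, using the stage names from the plan above (the `deps` map mirrors the STAGE DEPENDENCIES block; this is an illustration, not Hive's actual scheduler):

```python
# Dependency-driven stage ordering, as described by STAGE DEPENDENCIES.
deps = {
    "Stage-1": [],           # root stage
    "Stage-0": ["Stage-1"],  # depends on Stage-1
    "Stage-2": ["Stage-0"],  # depends on Stage-0
}

def execution_order(deps):
    """Return stages in an order where every stage follows its dependencies."""
    order, done = [], set()
    def visit(stage):
        for d in deps[stage]:
            if d not in done:
                visit(d)
        if stage not in done:
            done.add(stage)
            order.append(stage)
    for s in deps:
        visit(s)
    return order

print(execution_order(deps))  # ['Stage-1', 'Stage-0', 'Stage-2']
```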

Stage-1 and Stage-0 are explained below. The SQL can be split into two steps:

(1) SELECT t1.bar, t1.foo, t2.foo FROM pokes t1 JOIN invites t2 ON (t1.bar = t2.bar);

(2) INSERT OVERWRITE TABLE lpx;

Stage-1 corresponds to one complete MapReduce job, consisting of a Map Operator Tree (the map task) and a Reduce Operator Tree (the reduce task). The Map Operator Tree shows two parallel operations over t1 and t2, effectively SELECT t1.bar, t1.foo FROM t1; and SELECT t2.foo FROM t2;, and each map task produces input for the reduce phase via a Reduce Output Operator. The Reduce Operator Tree shows the map outputs being joined on the join condition and, through the predefined output format, written to HDFS in the storage format of default.lpx. Because table lpx was created without an explicit storage format, it defaults to text, read and written with TextInputFormat and TextOutputFormat:

table:

input format: org.apache.hadoop.mapred.TextInputFormat

output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe

name: default.lpx
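Conceptually, the shuffle join in Stage-1 works like this: each map side emits (join key, tagged row), the framework partitions and sorts by key, and each reducer combines rows tagged 0 with rows tagged 1 for the same key. A minimal Python sketch of that reduce-side inner join (the table rows are invented for illustration):

```python
from collections import defaultdict
from itertools import product

# Made-up rows for pokes t1(foo, bar) and invites t2(foo, bar).
t1 = [(1, "a"), (2, "b")]
t2 = [(10, "a"), (20, "c")]

# Map phase: emit rows grouped by join key with a side tag,
# mirroring the "tag: 0" / "tag: 1" lines in the plan.
shuffled = defaultdict(lambda: ([], []))
for foo, bar in t1:
    shuffled[bar][0].append((bar, foo))   # value expressions: bar, foo
for foo, bar in t2:
    shuffled[bar][1].append((foo,))       # value expressions: foo

# Reduce phase: for each key, cross rows from side 0 with side 1 (inner join).
result = []
for key, (side0, side1) in sorted(shuffled.items()):
    for (bar, foo1), (foo2,) in product(side0, side1):
        result.append((bar, foo1, foo2))  # t1.bar, t1.foo, t2.foo

print(result)  # [('a', 1, 10)]
```

Keys 'b' and 'c' produce nothing because one side of the join is empty, which is exactly why rows with equal join-key values must be partitioned to the same reducer.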

The input format value org.apache.hadoop.mapred.TextInputFormat appears because the temporary output files produced by the map phase are saved with TextOutputFormat, so the reduce side naturally reads them back through TextInputFormat. These details are handled by Hadoop's MapReduce machinery; Hive only needs to specify the formats.

The serde value is org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe. The values it holds are _col0, _col1, _col2, i.e. the t1.bar, t1.foo, t2.foo we asked for; concretely each output row is _col0 + lpx's field delimiter + _col1 + lpx's field delimiter + _col2. The output format line shows that output is handled by org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat.
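A row therefore serializes to the column values joined by the table's field delimiter (Hive's default is Ctrl-A, '\x01'). A tiny Python sketch of that LazySimpleSerDe-style text serialization (the delimiter and sample values are illustrative):

```python
# LazySimpleSerDe-style row handling: columns joined by a field delimiter
# on write, and lazily split on the same delimiter on read.
DELIM = "\x01"  # Hive's default field delimiter (Ctrl-A)

def serialize(cols):
    return DELIM.join(str(c) for c in cols)

def deserialize(line):
    return line.split(DELIM)

row = ["bar_value", 42, 99]   # _col0, _col1, _col2
line = serialize(row)
print(repr(line))             # 'bar_value\x0142\x0199'
print(deserialize(line))      # ['bar_value', '42', '99']
```

Note that everything comes back as a string: the text serde carries no type information, which is why the table's column types drive the parsing on read.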

Stage-0 corresponds to the second step above. The temporary files produced by Stage-1 (say, under a directory tmp) are processed by Stage-0 into table lpx. The Move Operator shows this is not a MapReduce job: it only invokes MoveTask, which checks that the input files match lpx's storage format before moving them into place.

What a Hive execution plan is for

Analyze how a job executes and optimize its flow to improve efficiency. For example, moving a filter condition from the reduce side up to the map side effectively reduces the amount of data shuffled between map and reduce.

Filter the data set early and avoid unnecessary reads. For example, Hive performs the join before WHERE filtering, so moving partition conditions into the ON clause effectively shrinks the input data set.
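The payoff of pushing a filter below the join can be seen by counting rows that reach the shuffle. A small Python sketch comparing filter-after-join with filter-before-join (row counts and the key distribution are invented for illustration):

```python
# Compare how many rows enter the shuffle when a filter runs late vs. early.
rows = [{"key": "deliver_id_bucket_id" if i % 10 == 0 else "other", "v": i}
        for i in range(1000)]

# Filter after the join: all 1000 rows are shuffled, then filtered in reduce.
shuffled_late = len(rows)

# Filter pushed to the map side: only matching rows are shuffled.
early = [r for r in rows if r["key"] == "deliver_id_bucket_id"]
shuffled_early = len(early)

print(shuffled_late, shuffled_early)  # 1000 100
```

With the invented 10% selectivity, the early filter cuts shuffle volume tenfold; with real partition predicates the saving is usually far larger, since whole partitions are never read at all.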

A problematic HQL query, analyzed through its execution plan:

select a.*, b.cust_uid

from ods_ad_bid_deliver_info b join mds_ad_algo_feed_monitor_data_table a

where a.dt<=20140101 and a.dt<=20140108 and key='deliver_id_bucket_id' and a.dt=b.dt and a.key_slice=b.deliver_id

==========================================================================

Execution plan:

ABSTRACT SYNTAX TREE:

(TOK_QUERY (TOK_FROM (TOK_JOIN (TOK_TABREF (TOK_TABNAME ods_ad_bid_deliver_info) b) (TOK_TABREF (TOK_TABNAME mds_ad_algo_feed_monitor_data_table) a))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_ALLCOLREF (TOK_TABNAME a))) (TOK_SELEXPR (. (TOK_TABLE_OR_COL b) cust_uid))) (TOK_WHERE (and (and (and (and (<= (. (TOK_TABLE_OR_COL a) dt) 20140101) (<= (. (TOK_TABLE_OR_COL a) dt) 20140108)) (= (TOK_TABLE_OR_COL key) 'deliver_id_bucket_id')) (= (. (TOK_TABLE_OR_COL a) dt) (. (TOK_TABLE_OR_COL b) dt))) (= (. (TOK_TABLE_OR_COL a) key_slice) (. (TOK_TABLE_OR_COL b) deliver_id))))))

STAGE DEPENDENCIES:

Stage-1 is a root stage
Stage-0 is a root stage

STAGE PLANS:
  Stage: Stage-1
    Map Reduce
      Alias -> Map Operator Tree:
        a
          TableScan
            alias: a
            Filter Operator
              predicate:
                  expr: (key = 'deliver_id_bucket_id')   // filter on the literal key value applied in the map phase
                  type: boolean
              Reduce Output Operator
                sort order:
                tag: 1
                value expressions:   // select * forces every column out to the reduce phase
                      expr: key
                      type: string
                      expr: key_slice
                      type: string
                      expr: billing_mode_slice
                      type: string
                      expr: bucket_id
                      type: string
                      expr: ctr
                      type: string
                      expr: ecpm
                      type: string
                      expr: auc
                      type: string
                      expr: pctr
                      type: string
                      expr: pctr_ctr
                      type: string
                      expr: total_pv
                      type: string
                      expr: total_click
                      type: string
                      expr: dt
                      type: string
        b
          TableScan
            alias: b
            Reduce Output Operator
              sort order:
              tag: 0
              value expressions:
                    expr: deliver_id
                    type: string
                    expr: cust_uid
                    type: string
                    expr: dt
                    type: string
      Reduce Operator Tree:
        Join Operator
          condition map:
               Inner Join 0 to 1
          condition expressions:
            0 {VALUE._col0} {VALUE._col6} {VALUE._col35}
            1 {VALUE._col0} {VALUE._col1} {VALUE._col2} {VALUE._col3} {VALUE._col4} {VALUE._col5} {VALUE._col6} {VALUE._col7} {VALUE._col8} {VALUE._col9} {VALUE._col10} {VALUE._col11}
          handleSkewJoin: false
          outputColumnNames: _col0, _col6, _col35, _col38, _col39, _col40, _col41, _col42, _col43, _col44, _col45, _col46, _col47, _col48, _col49
          Filter Operator
            predicate:
                expr: (((((_col49 <= 20140101) and (_col49 <= 20140108)) and (_col38 = 'deliver_id_bucket_id')) and (_col49 = _col35)) and (_col39 = _col0))
                type: boolean
            Select Operator
              expressions:
                    expr: _col38
                    type: string
                    expr: _col39
                    type: string
                    expr: _col40
                    type: string
                    expr: _col41
                    type: string
                    expr: _col42
                    type: string
                    expr: _col43
                    type: string
                    expr: _col44
                    type: string
                    expr: _col45
                    type: string
                    expr: _col46
                    type: string
                    expr: _col47
                    type: string
                    expr: _col48
                    type: string
                    expr: _col49
                    type: string
                    expr: _col6
                    type: string
              outputColumnNames: _col0, _col1, _col2, _col3, _col4, _col5, _col6, _col7, _col8, _col9, _col10, _col11, _col12
              File Output Operator
                compressed: false
                GlobalTableId: 0
                table:
                    input format: org.apache.hadoop.mapred.TextInputFormat
                    output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
  Stage: Stage-0
    Fetch Operator
      limit: -1

Optimized HQL:

select a.*, b.cust_uid

from ods_ad_bid_deliver_info b

join mds_ad_algo_feed_monitor_data_table a

on (a.dt <= 20140101 and a.dt <= 20140108 and a.dt = b.dt and a.key_slice = b.deliver_id and a.key = 'deliver_id_bucket_id')

=================================================================

Execution plan:

STAGE DEPENDENCIES:

Stage-1 is a root stage
Stage-0 is a root stage

STAGE PLANS:
  Stage: Stage-1
    Map Reduce
      Alias -> Map Operator Tree:
        a
          TableScan
            alias: a
            Filter Operator
              predicate:
                  expr: (key = 'deliver_id_bucket_id')
                  type: boolean
              Filter Operator
                predicate:
                    expr: (dt <= 20140101)   // partition filter now takes effect on the map side
                    type: boolean
                Filter Operator
                  predicate:
                      expr: (dt <= 20140108)   // partition filter now takes effect on the map side
                      type: boolean
                  Filter Operator
                    predicate:
                        expr: (key = 'deliver_id_bucket_id')
                        type: boolean
                    Reduce Output Operator
                      key expressions:
                            expr: dt
                            type: string
                            expr: key_slice
                            type: string
                      sort order: ++
                      Map-reduce partition columns:
                            expr: dt
                            type: string
                            expr: key_slice
                            type: string
                      tag: 1
                      value expressions:
                            expr: key
                            type: string
                            expr: key_slice
                            type: string
                            expr: billing_mode_slice
                            type: string
                            expr: bucket_id
                            type: string
                            expr: ctr
                            type: string
                            expr: ecpm
                            type: string
                            expr: auc
                            type: string
                            expr: pctr
                            type: string
                            expr: pctr_ctr
                            type: string
                            expr: total_pv
                            type: string
                            expr: total_click
                            type: string
                            expr: dt
                            type: string
        b
          TableScan
            alias: b
            Reduce Output Operator
              key expressions:
                    expr: dt
                    type: string
                    expr: deliver_id
                    type: string
              sort order: ++
              Map-reduce partition columns:
                    expr: dt
                    type: string
                    expr: deliver_id
                    type: string
              tag: 0
              value expressions:
                    expr: cust_uid
                    type: string
      Reduce Operator Tree:
        Join Operator
          condition map:
               Inner Join 0 to 1
          condition expressions:
            0 {VALUE._col6}
            1 {VALUE._col0} {VALUE._col1} {VALUE._col2} {VALUE._col3} {VALUE._col4} {VALUE._col5} {VALUE._col6} {VALUE._col7} {VALUE._col8} {VALUE._col9} {VALUE._col10} {VALUE._col11}
          handleSkewJoin: false
          outputColumnNames: _col6, _col38, _col39, _col40, _col41, _col42, _col43, _col44, _col45, _col46, _col47, _col48, _col49
          Select Operator
            expressions:
                  expr: _col38
                  type: string
                  expr: _col39
                  type: string
                  expr: _col40
                  type: string
                  expr: _col41
                  type: string
                  expr: _col42
                  type: string
                  expr: _col43
                  type: string
                  expr: _col44
                  type: string
                  expr: _col45
                  type: string
                  expr: _col46
                  type: string
                  expr: _col47
                  type: string
                  expr: _col48
                  type: string
                  expr: _col49
                  type: string
                  expr: _col6
                  type: string
            outputColumnNames: _col0, _col1, _col2, _col3, _col4, _col5, _col6, _col7, _col8, _col9, _col10, _col11, _col12
            File Output Operator
              compressed: false
              GlobalTableId: 0
              table:
                  input format: org.apache.hadoop.mapred.TextInputFormat
                  output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
  Stage: Stage-0
    Fetch Operator
      limit: -1

Another example:

select * from emp e
left join dept d on e.deptno = d.deptno
where d.dt = '2018-06-04';

Time taken: 44.401 seconds, Fetched: 17 row(s)

Execution plan:

STAGE DEPENDENCIES:
  Stage-4 is a root stage
  Stage-3 depends on stages: Stage-4
  Stage-0 depends on stages: Stage-3

STAGE PLANS:
  Stage: Stage-4
    Map Reduce Local Work   // runs locally
      Alias -> Map Local Tables:
        d
          Fetch Operator
            limit: -1
      Alias -> Map Local Operator Tree:
        d
          TableScan
            alias: d
            Statistics: Num rows: 1 Data size: 168 Basic stats: PARTIAL Column stats: PARTIAL
            HashTable Sink Operator   // compare ReduceSinkOperator, which serializes the map-side column set into reduce key/value and partition key; it can only appear in the map phase and marks the end of the map phase in Hive's generated MapReduce program
              keys:
                0 deptno (type: string)
                1 deptno (type: string)
  Stage: Stage-3
    Map Reduce
      Map Operator Tree:
          TableScan
            alias: e
            Statistics: Num rows: 1 Data size: 757 Basic stats: PARTIAL Column stats: PARTIAL
            Map Join Operator
              condition map:
                   Left Outer Join0 to 1
              keys:
                0 deptno (type: string)
                1 deptno (type: string)
              outputColumnNames: _col0, _col1, _col2, _col3, _col4, _col5, _col6, _col7, _col8, _col12, _col13, _col14, _col15
              Statistics: Num rows: 1 Data size: 832 Basic stats: COMPLETE Column stats: NONE
              Filter Operator
                predicate: (_col15 = '2018-06-04') (type: boolean)
                Statistics: Num rows: 1 Data size: 832 Basic stats: COMPLETE Column stats: NONE
                Select Operator
                  expressions: _col0 (type: string), _col1 (type: string), _col2 (type: string), _col3 (type: string), _col4 (type: string), _col5 (type: string), _col6 (type: string), _col7 (type: string), _col8 (type: string), _col12 (type: string), _col13 (type: string), _col14 (type: string), '2018-06-04' (type: string)
                  outputColumnNames: _col0, _col1, _col2, _col3, _col4, _col5, _col6, _col7, _col8, _col9, _col10, _col11, _col12
                  Statistics: Num rows: 1 Data size: 832 Basic stats: COMPLETE Column stats: NONE
                  File Output Operator
                    compressed: false
                    Statistics: Num rows: 1 Data size: 832 Basic stats: COMPLETE Column stats: NONE
                    table:
                        input format: org.apache.hadoop.mapred.TextInputFormat
                        output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                        serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
      Local Work:
        Map Reduce Local Work
  Stage: Stage-0
    Fetch Operator
      limit: -1
      Processor Tree:
        ListSink
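Stage-4 builds an in-memory hash table from the small table d locally; Stage-3 then streams emp through the map task and probes that table, so no reduce phase is needed. A minimal Python sketch of this map-side (broadcast) left outer join, with invented rows for emp and dept:

```python
from collections import defaultdict

# Small table d(deptno, dname, dt): loaded into a hash table up front,
# as the HashTable Sink Operator does in Stage-4.
dept = [("10", "SALES", "2018-06-04"), ("20", "HR", "2018-06-04")]
dept_by_key = defaultdict(list)
for row in dept:
    dept_by_key[row[0]].append(row)

# Big table e(empno, deptno): streamed through the hash table inside
# the map task (Map Join Operator in Stage-3). Left outer semantics
# keep unmatched emp rows, padded with None.
emp = [("7369", "10"), ("7499", "30")]
joined = []
for empno, deptno in emp:
    matches = dept_by_key.get(deptno) or [(None, None, None)]
    for d in matches:
        joined.append((empno, deptno) + d)

print(joined)
# [('7369', '10', '10', 'SALES', '2018-06-04'), ('7499', '30', None, None, None)]
```

Because only the small side must fit in memory, this avoids the shuffle entirely, which is why the map-join plan has no Reduce Operator Tree.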

select * from emp e
left join dept d on (e.deptno = d.deptno and d.dt = '2018-06-04');

Time taken: 23.804 seconds, Fetched: 17 row(s)

STAGE DEPENDENCIES:
  Stage-4 is a root stage
  Stage-3 depends on stages: Stage-4
  Stage-0 depends on stages: Stage-3

STAGE PLANS:
  Stage: Stage-4
    Map Reduce Local Work
      Alias -> Map Local Tables:
        d
          Fetch Operator
            limit: -1
      Alias -> Map Local Operator Tree:
        d
          TableScan
            alias: d
            filterExpr: (dt = '2018-06-04') (type: boolean)
            Statistics: Num rows: 1 Data size: 84 Basic stats: PARTIAL Column stats: PARTIAL
            HashTable Sink Operator
              keys:
                0 deptno (type: string)
                1 deptno (type: string)
  Stage: Stage-3
    Map Reduce
      Map Operator Tree:
          TableScan
            alias: e
            Statistics: Num rows: 1 Data size: 757 Basic stats: PARTIAL Column stats: PARTIAL
            Map Join Operator
              condition map:
                   Left Outer Join0 to 1
              keys:
                0 deptno (type: string)
                1 deptno (type: string)
              outputColumnNames: _col0, _col1, _col2, _col3, _col4, _col5, _col6, _col7, _col8, _col12, _col13, _col14, _col15
              Statistics: Num rows: 1 Data size: 832 Basic stats: COMPLETE Column stats: NONE
              Select Operator
                expressions: _col0 (type: string), _col1 (type: string), _col2 (type: string), _col3 (type: string), _col4 (type: string), _col5 (type: string), _col6 (type: string), _col7 (type: string), _col8 (type: string), _col12 (type: string), _col13 (type: string), _col14 (type: string), _col15 (type: string)
                outputColumnNames: _col0, _col1, _col2, _col3, _col4, _col5, _col6, _col7, _col8, _col9, _col10, _col11, _col12
                Statistics: Num rows: 1 Data size: 832 Basic stats: COMPLETE Column stats: NONE
                File Output Operator
                  compressed: false
                  Statistics: Num rows: 1 Data size: 832 Basic stats: COMPLETE Column stats: NONE
                  table:
                      input format: org.apache.hadoop.mapred.TextInputFormat
                      output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                      serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
      Local Work:
        Map Reduce Local Work
  Stage: Stage-0
    Fetch Operator
      limit: -1
      Processor Tree:
        ListSink
