Analyzing Spark Logs in Local Mode

This example analyzes the log output of a Monte Carlo computation of Pi, run in local mode. The code is as follows:
val conf = new SparkConf().setAppName("Spark Pi").setMaster("local[2]")
val spark = new SparkContext(conf)
val slices = 100
val n = 1000 * slices
val count = spark.parallelize(1 to n, slices).map { i =>
  // sample a random point in the square [-1, 1] x [-1, 1]
  val x = java.lang.Math.random() * 2 - 1
  val y = java.lang.Math.random() * 2 - 1
  // count the point if it falls inside the unit circle
  if (x * x + y * y < 1) 1 else 0
}.reduce(_ + _)
println("Pi is roughly " + 4.0 * count / n)
spark.stop()
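Why this estimates Pi: each sampled point (x, y) is uniform on the square [-1, 1] x [-1, 1], so the probability that it lands inside the unit circle is the ratio of the two areas:

```latex
P(x^2 + y^2 < 1) = \frac{\pi \cdot 1^2}{2 \times 2} = \frac{\pi}{4},
\qquad\text{hence}\qquad
\hat{\pi} = \frac{4 \cdot \mathrm{count}}{n}.
```

With n = 100,000 samples the standard error of the estimate is about 4\sqrt{(\pi/4)(1-\pi/4)/n} ≈ 0.005, so the result is only accurate to roughly two decimal places.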

Log analysis:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties //the default log4j profile is used for log output
17/04/10 18:43:09 INFO SparkContext: Running Spark version 1.3.1 //the SparkContext is running on Spark version 1.3.1
17/04/10 18:43:10 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable //YARN is not being used here, so this warning does not affect the run
17/04/10 18:43:10 INFO SecurityManager: Changing view acls to: Administrator
17/04/10 18:43:10 INFO SecurityManager: Changing modify acls to: Administrator
17/04/10 18:43:10 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(Administrator); users with modify permissions: Set(Administrator)
17/04/10 18:43:10 INFO Slf4jLogger: Slf4jLogger started
17/04/10 18:43:10 INFO Remoting: Starting remoting
17/04/10 18:43:10 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@jdlzy:57217] //Spark uses Akka for internal communication, picking a random port to listen on
17/04/10 18:43:10 INFO Utils: Successfully started service 'sparkDriver' on port 57217. //the temporary driver listening port was started successfully
17/04/10 18:43:11 INFO SparkEnv: Registering MapOutputTracker
17/04/10 18:43:11 INFO SparkEnv: Registering BlockManagerMaster
17/04/10 18:43:11 INFO DiskBlockManager: Created local directory at C:\Users\ADMINI~1.USE\AppData\Local\Temp\spark-7b2ae0d5-95ce-4727-9179-74cee0fa6dab\blockmgr-7c7505c5-c9cb-4e03-bead-67d3ef882930 //DiskBlockManager creates a local directory for managing blocks stored on disk
17/04/10 18:43:11 INFO MemoryStore: MemoryStore started with capacity 969.8 MB //969.8 MB of memory is available to this job
17/04/10 18:43:11 INFO HttpFileServer: HTTP File server directory is C:\Users\ADMINI~1.USE\AppData\Local\Temp\spark-f9e6ad82-19d9-439b-893b-f7f505b84b95\httpd-baf7acc4-154a-448b-b613-88b243249c03
17/04/10 18:43:11 INFO HttpServer: Starting HTTP Server
17/04/10 18:43:11 INFO Server: jetty-8.y.z-SNAPSHOT
17/04/10 18:43:11 INFO AbstractConnector: Started SocketConnector@0.0.0.0:57218
17/04/10 18:43:11 INFO Utils: Successfully started service 'HTTP file server' on port 57218.
17/04/10 18:43:11 INFO SparkEnv: Registering OutputCommitCoordinator
17/04/10 18:43:11 INFO Server: jetty-8.y.z-SNAPSHOT
17/04/10 18:43:11 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
17/04/10 18:43:11 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/04/10 18:43:11 INFO SparkUI: Started SparkUI at http://jdlzy:4040
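While the job is running, the web UI at http://jdlzy:4040 shows stages, tasks, and storage. If port 4040 is already occupied, Spark retries successive ports; the port can also be pinned explicitly. A minimal sketch of a spark-defaults.conf fragment (the value 4041 is just an illustrative choice):

```
# spark-defaults.conf -- pin the web UI port (4041 is an arbitrary example)
spark.ui.port  4041
```

The same property can also be set programmatically on the SparkConf before the SparkContext is created.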
17/04/10 18:43:11 INFO Executor: Starting executor ID <driver> on host localhost //the executor is started; in local mode it runs inside the driver process
17/04/10 18:43:11 INFO AkkaUtils: Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@jdlzy:57217/user/HeartbeatReceiver
17/04/10 18:43:12 INFO NettyBlockTransferService: Server created on 57238
17/04/10 18:43:12 INFO BlockManagerMaster: Trying to register BlockManager
17/04/10 18:43:12 INFO BlockManagerMasterActor: Registering block manager localhost:57238 with 969.8 MB RAM, BlockManagerId(<driver>, localhost, 57238)
17/04/10 18:43:12 INFO BlockManagerMaster: Registered BlockManager
17/04/10 18:43:12 INFO SparkContext: Starting job: reduce at MySparkPi.scala:25 //the job starts executing
17/04/10 18:43:12 INFO DAGScheduler: Got job 0 (reduce at MySparkPi.scala:25) with 100 output partitions (allowLocal=false) //the DAGScheduler received job 0 and will split it into stages and tasks
17/04/10 18:43:12 INFO DAGScheduler: Final stage: Stage 0(reduce at MySparkPi.scala:25) //the final stage is Stage 0
17/04/10 18:43:12 INFO DAGScheduler: Parents of final stage: List() //Stage 0 has no parent stages, so an empty List is returned
17/04/10 18:43:12 INFO DAGScheduler: Missing parents: List()
17/04/10 18:43:12 INFO DAGScheduler: Submitting Stage 0 (MapPartitionsRDD[1] at map at MySparkPi.scala:15), which has no missing parents //Stage 0 has no missing parents, so it is submitted directly
17/04/10 18:43:12 INFO MemoryStore: ensureFreeSpace(1832) called with curMem=0, maxMem=1016950947 //reports how much memory is currently in use
17/04/10 18:43:12 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 1832.0 B, free 969.8 MB)
17/04/10 18:43:12 INFO MemoryStore: ensureFreeSpace(1293) called with curMem=1832, maxMem=1016950947
17/04/10 18:43:12 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 1293.0 B, free 969.8 MB)
17/04/10 18:43:12 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on localhost:57238 (size: 1293.0 B, free: 969.8 MB)
17/04/10 18:43:12 INFO BlockManagerMaster: Updated info of block broadcast_0_piece0
17/04/10 18:43:12 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:839
17/04/10 18:43:12 INFO DAGScheduler: Submitting 100 missing tasks from Stage 0 (MapPartitionsRDD[1] at map at MySparkPi.scala:15) //100 tasks are submitted, one per partition, all belonging to Stage 0
17/04/10 18:43:12 INFO TaskSchedulerImpl: Adding task set 0.0 with 100 tasks
17/04/10 18:43:12 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, PROCESS_LOCAL, 1260 bytes) //a task is launched
17/04/10 18:43:12 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, localhost, PROCESS_LOCAL, 1260 bytes)
17/04/10 18:43:12 INFO Executor: Running task 1.0 in stage 0.0 (TID 1) //the executor starts running task 1.0 of Stage 0
17/04/10 18:43:12 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
17/04/10 18:43:12 INFO Executor: Finished task 1.0 in stage 0.0 (TID 1). 736 bytes result sent to driver //task 1.0 of Stage 0 finishes, and its 736-byte result is sent back to the driver
17/04/10 18:43:12 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 736 bytes result sent to driver
...(start/finish lines for the remaining tasks omitted)...
17/04/10 18:43:13 INFO TaskSetManager: Finished task 99.0 in stage 0.0 (TID 99) in 15 ms on localhost (100/100)
17/04/10 18:43:13 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
17/04/10 18:43:13 INFO DAGScheduler: Stage 0 (reduce at MySparkPi.scala:25) finished in 0.883 s //Stage 0 has finished
17/04/10 18:43:13 INFO DAGScheduler: Job 0 finished: reduce at MySparkPi.scala:25, took 1.218286 s //Job 0 has finished
Pi is roughly 3.12704
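The printed value can be checked by hand: with n = 1000 * 100 = 100,000 samples, an estimate of 3.12704 means count = 78,176 points fell inside the circle:

```latex
\hat{\pi} = \frac{4 \cdot \mathrm{count}}{n} = \frac{4 \times 78176}{100000} = 3.12704
```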
17/04/10 18:43:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
17/04/10 18:43:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
17/04/10 18:43:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
17/04/10 18:43:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
17/04/10 18:43:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
17/04/10 18:43:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
17/04/10 18:43:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
17/04/10 18:43:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
17/04/10 18:43:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
17/04/10 18:43:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
17/04/10 18:43:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
17/04/10 18:43:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
17/04/10 18:43:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
17/04/10 18:43:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
17/04/10 18:43:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
17/04/10 18:43:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
17/04/10 18:43:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
17/04/10 18:43:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
17/04/10 18:43:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
17/04/10 18:43:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
17/04/10 18:43:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
17/04/10 18:43:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
17/04/10 18:43:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
17/04/10 18:43:13 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
17/04/10 18:43:13 INFO SparkUI: Stopped Spark web UI at http://jdlzy:4040
17/04/10 18:43:13 INFO DAGScheduler: Stopping DAGScheduler
17/04/10 18:43:13 INFO MapOutputTrackerMasterActor: MapOutputTrackerActor stopped!
17/04/10 18:43:13 INFO MemoryStore: MemoryStore cleared
17/04/10 18:43:13 INFO BlockManager: BlockManager stopped
17/04/10 18:43:13 INFO BlockManagerMaster: BlockManagerMaster stopped
17/04/10 18:43:13 INFO SparkContext: Successfully stopped SparkContext
17/04/10 18:43:13 INFO OutputCommitCoordinator$OutputCommitCoordinatorActor: OutputCommitCoordinator stopped!

最后編輯于
?著作權(quán)歸作者所有,轉(zhuǎn)載或內(nèi)容合作請(qǐng)聯(lián)系作者
  • 序言:七十年代末俱恶,一起剝皮案震驚了整個(gè)濱河市雹嗦,隨后出現(xiàn)的幾起案子范舀,更是在濱河造成了極大的恐慌,老刑警劉巖俐银,帶你破解...
    沈念sama閱讀 212,718評(píng)論 6 492
  • 序言:濱河連續(xù)發(fā)生了三起死亡事件尿背,死亡現(xiàn)場(chǎng)離奇詭異,居然都是意外死亡捶惜,警方通過查閱死者的電腦和手機(jī)田藐,發(fā)現(xiàn)死者居然都...
    沈念sama閱讀 90,683評(píng)論 3 385
  • 文/潘曉璐 我一進(jìn)店門,熙熙樓的掌柜王于貴愁眉苦臉地迎上來吱七,“玉大人汽久,你說我怎么就攤上這事∮徊停” “怎么了景醇?”我有些...
    開封第一講書人閱讀 158,207評(píng)論 0 348
  • 文/不壞的土叔 我叫張陵,是天一觀的道長(zhǎng)吝岭。 經(jīng)常有香客問我三痰,道長(zhǎng),這世上最難降的妖魔是什么窜管? 我笑而不...
    開封第一講書人閱讀 56,755評(píng)論 1 284
  • 正文 為了忘掉前任散劫,我火速辦了婚禮,結(jié)果婚禮上幕帆,老公的妹妹穿的比我還像新娘获搏。我一直安慰自己,他們只是感情好失乾,可當(dāng)我...
    茶點(diǎn)故事閱讀 65,862評(píng)論 6 386
  • 文/花漫 我一把揭開白布常熙。 她就那樣靜靜地躺著,像睡著了一般碱茁。 火紅的嫁衣襯著肌膚如雪裸卫。 梳的紋絲不亂的頭發(fā)上,一...
    開封第一講書人閱讀 50,050評(píng)論 1 291
  • 那天早芭,我揣著相機(jī)與錄音彼城,去河邊找鬼。 笑死退个,一個(gè)胖子當(dāng)著我的面吹牛,可吹牛的內(nèi)容都是我干的调炬。 我是一名探鬼主播语盈,決...
    沈念sama閱讀 39,136評(píng)論 3 410
  • 文/蒼蘭香墨 我猛地睜開眼,長(zhǎng)吁一口氣:“原來是場(chǎng)噩夢(mèng)啊……” “哼缰泡!你這毒婦竟也來了刀荒?” 一聲冷哼從身側(cè)響起代嗤,我...
    開封第一講書人閱讀 37,882評(píng)論 0 268
  • 序言:老撾萬榮一對(duì)情侶失蹤,失蹤者是張志新(化名)和其女友劉穎缠借,沒想到半個(gè)月后干毅,有當(dāng)?shù)厝嗽跇淞掷锇l(fā)現(xiàn)了一具尸體,經(jīng)...
    沈念sama閱讀 44,330評(píng)論 1 303
  • 正文 獨(dú)居荒郊野嶺守林人離奇死亡泼返,尸身上長(zhǎng)有42處帶血的膿包…… 初始之章·張勛 以下內(nèi)容為張勛視角 年9月15日...
    茶點(diǎn)故事閱讀 36,651評(píng)論 2 327
  • 正文 我和宋清朗相戀三年硝逢,在試婚紗的時(shí)候發(fā)現(xiàn)自己被綠了。 大學(xué)時(shí)的朋友給我發(fā)了我未婚夫和他白月光在一起吃飯的照片绅喉。...
    茶點(diǎn)故事閱讀 38,789評(píng)論 1 341
  • 序言:一個(gè)原本活蹦亂跳的男人離奇死亡渠鸽,死狀恐怖,靈堂內(nèi)的尸體忽然破棺而出柴罐,到底是詐尸還是另有隱情徽缚,我是刑警寧澤,帶...
    沈念sama閱讀 34,477評(píng)論 4 333
  • 正文 年R本政府宣布革屠,位于F島的核電站凿试,受9級(jí)特大地震影響,放射性物質(zhì)發(fā)生泄漏似芝。R本人自食惡果不足惜那婉,卻給世界環(huán)境...
    茶點(diǎn)故事閱讀 40,135評(píng)論 3 317
  • 文/蒙蒙 一、第九天 我趴在偏房一處隱蔽的房頂上張望国觉。 院中可真熱鬧吧恃,春花似錦、人聲如沸麻诀。這莊子的主人今日做“春日...
    開封第一講書人閱讀 30,864評(píng)論 0 21
  • 文/蒼蘭香墨 我抬頭看了看天上的太陽蝇闭。三九已至呻率,卻和暖如春,著一層夾襖步出監(jiān)牢的瞬間呻引,已是汗流浹背礼仗。 一陣腳步聲響...
    開封第一講書人閱讀 32,099評(píng)論 1 267
  • 我被黑心中介騙來泰國(guó)打工, 沒想到剛下飛機(jī)就差點(diǎn)兒被人妖公主榨干…… 1. 我叫王不留逻悠,地道東北人元践。 一個(gè)月前我還...
    沈念sama閱讀 46,598評(píng)論 2 362
  • 正文 我出身青樓,卻偏偏與公主長(zhǎng)得像童谒,于是被迫代替她去往敵國(guó)和親单旁。 傳聞我的和親對(duì)象是個(gè)殘疾皇子,可洞房花燭夜當(dāng)晚...
    茶點(diǎn)故事閱讀 43,697評(píng)論 2 351

推薦閱讀更多精彩內(nèi)容

  • 每天逼自己寫篇文章饥伊,這樣才能把自己所掌握的知識(shí)給融會(huì)貫通象浑。人必須對(duì)自己狠點(diǎn)蔫饰,要不然都不知道自己的底線在哪里。 條理...
    老杜克Duke閱讀 333評(píng)論 0 0
  • PDI中文叫做權(quán)力距離指數(shù)愉豺,指人們對(duì)待比自己更高等級(jí)階層的態(tài)度篓吁,特別是指普羅大眾對(duì)權(quán)威的重視和尊重程度。我一直覺得...
    許多的yolyol閱讀 3,580評(píng)論 1 1
  • 大多數(shù)國(guó)人比較實(shí)誠(chéng)蚪拦,形而上的東西不好懂杖剪,摸胸口畫十字也摸不出神佛附體,菩提樹更是難找外盯,拿芭蕉當(dāng)飯吃更不是吃貨該干的...
    袖卷千重雪閱讀 488評(píng)論 0 1
  • 媽媽,這次考試考不好了箱熬,這幾天我已經(jīng)想過了类垦,主要是物理數(shù)學(xué)考不好,拉分很嚴(yán)重城须。雖然總分不會(huì)差蚤认,但是五科總分比起上個(gè)...
    紫凝11閱讀 629評(píng)論 0 0