Spark Integration with Kerberos

This post is part of the article series "Big Data Security in Practice": http://www.reibang.com/p/76627fd8399c


There are two ways to submit a job once Kerberos is enabled. Option 1: pass the principal and keytab to spark-submit, so Spark can log in and renew the ticket itself (recommended for long-running jobs):

bin/spark-submit --principal testyarn-nopwd@TT.COM --keytab /etc/hadoop/conf/testyarn-nopwd.keytab --class org.apache.spark.examples.SparkPi --master yarn --deploy-mode client ./examples/jars/spark-examples_2.11-2.2.0.jar 10

Option 2: obtain a ticket with kinit first, then submit as usual against the ticket cache:

kinit kduser
cd /var/opt/spark-2.2.0-bin-hadoop2.7
bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn --deploy-mode client ./examples/jars/spark-examples_2.11-2.2.0.jar 10
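Before submitting, it is worth confirming that the keytab and principal actually work. A minimal sanity check, reusing the keytab path and principal from the command above:

# Obtain a TGT from the keytab, then list the ticket cache to verify a valid ticket exists.
kinit -kt /etc/hadoop/conf/testyarn-nopwd.keytab testyarn-nopwd@TT.COM
klist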

Problem encountered:

18/03/21 11:55:20 ERROR client.TransportClient: Failed to send RPC 6258076510815284346 to /172.20.178.32:5800: java.nio.channels.ClosedChannelException
java.nio.channels.ClosedChannelException
        at io.netty.channel.AbstractChannel$AbstractUnsafe.write(...)(Unknown Source)
18/03/21 11:55:20 INFO storage.BlockManagerMaster: Removed 1 successfully in removeExecutor
18/03/21 11:55:20 INFO scheduler.DAGScheduler: Shuffle files lost for executor: 1 (epoch 0)
18/03/21 11:55:20 WARN cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: Attempted to get executor loss reason for executor id 1 at RPC address 172.20.178.32:5806, but got no response. Marking as slave lost.
java.io.IOException: Failed to send RPC 6258076510815284346 to /172.20.178.32:5800: java.nio.channels.ClosedChannelException
        at org.apache.spark.network.client.TransportClient.lambda$sendRpc$2(TransportClient.java:237)
        at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:507)
        at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:481)
        at io.netty.util.concurrent.DefaultPromise.access$000(DefaultPromise.java:34)
        at io.netty.util.concurrent.DefaultPromise$1.run(DefaultPromise.java:431)
        at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:399)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:446)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.nio.channels.ClosedChannelException
        at io.netty.channel.AbstractChannel$AbstractUnsafe.write(...)(Unknown Source)
18/03/21 11:55:20 ERROR cluster.YarnScheduler: Lost executor 1 on v-hadoop5-kbds.sz.kingdee.net: Slave lost
18/03/21 11:55:26 INFO cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(spark-client://YarnAM)
18/03/21 11:55:26 INFO cluster.YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> v-hadoop-kbds.sz.kingdee.net, PROXY_URI_BASES -> http://v-hadoop-kbds.sz.kingdee.net:8088/proxy/application_1521601147932_0014), /proxy/application_1521601147932_0014
18/03/21 11:55:26 INFO ui.JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
18/03/21 11:55:27 INFO yarn.Client: Deleted staging directory hdfs://cluster/user/kduser/.sparkStaging/application_1521601147932_0014
18/03/21 11:55:27 ERROR cluster.YarnClientSchedulerBackend: Yarn application has already exited with state FINISHED!
18/03/21 11:55:27 INFO server.AbstractConnector: Stopped Spark@1734f68{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
18/03/21 11:55:27 INFO ui.SparkUI: Stopped Spark web UI at http://172.20.178.28:4040
18/03/21 11:55:27 ERROR spark.SparkContext: Error initializing SparkContext.
java.lang.IllegalStateException: Spark context stopped while waiting for backend
        at org.apache.spark.scheduler.TaskSchedulerImpl.waitBackendReady(TaskSchedulerImpl.scala:673)
        at org.apache.spark.scheduler.TaskSchedulerImpl.postStartHook(TaskSchedulerImpl.scala:186)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:567)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2509)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:909)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:901)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:901)
        at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
        at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:755)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
18/03/21 11:55:27 INFO spark.SparkContext: SparkContext already stopped.
Exception in thread "main" java.lang.IllegalStateException: Spark context stopped while waiting for backend
        at org.apache.spark.scheduler.TaskSchedulerImpl.waitBackendReady(TaskSchedulerImpl.scala:673)
        at org.apache.spark.scheduler.TaskSchedulerImpl.postStartHook(TaskSchedulerImpl.scala:186)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:567)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2509)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:909)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:901)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:901)
        at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
        at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:755)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
18/03/21 11:55:27 INFO storage.DiskBlockManager: Shutdown hook called
18/03/21 11:55:27 INFO util.ShutdownHookManager: Shutdown hook called
18/03/21 11:55:27 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-38915f98-aa5d-41bd-883d-fcaf60c1e9f8
18/03/21 11:55:27 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-38915f98-aa5d-41bd-883d-fcaf60c1e9f8/userFiles-a3ceadef-3acf-4a10-9e2c-9f532aa0e97f

Solution: the executor containers were most likely being killed by YARN's container memory checks, which the driver only sees as the ClosedChannelException above. Modify yarn-site.xml to disable the physical and virtual memory checks on the NodeManagers:

<property>
    <name>yarn.nodemanager.pmem-check-enabled</name>
    <value>false</value>
</property>

<property>
    <name>yarn.nodemanager.vmem-check-enabled</name>
    <value>false</value>
</property>
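Restart the NodeManagers after changing yarn-site.xml so the new settings take effect. Alternatively, instead of disabling the checks, the per-executor memory overhead can be raised so containers stay within their limits. The sketch below is an assumption-based alternative, not taken from the original post: it assumes memory overrun was indeed the cause, and the 1024 MB value is purely illustrative. It uses the spark.yarn.executor.memoryOverhead property recognized by Spark 2.2:

# Alternative: keep the YARN memory checks on and give each executor more off-heap headroom (value in MB).
bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn --deploy-mode client --conf spark.yarn.executor.memoryOverhead=1024 ./examples/jars/spark-examples_2.11-2.2.0.jar 10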

Reference:
spark运维问题记录 (Spark operations troubleshooting notes) – CSDN blog
