Deploying and Submitting Flink in YARN Mode

I. Environment Preparation

Ubuntu

Hadoop 2.6.0 (downloaded from the official site)

Flink 1.12.2

JDK 8

II. Hadoop Fully Distributed YARN Configuration

  1. Permanently disable the firewall
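
    On Ubuntu this is typically done with ufw (a sketch, assuming ufw is the firewall in use on this machine):

    $ sudo ufw disable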

  2. Set the hostname

    vim /etc/hosts

    192.168.73.130 hadoop01
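
    The hostname itself can be set to match (a sketch, assuming a systemd-based Ubuntu where hostnamectl is available):

    $ sudo hostnamectl set-hostname hadoop01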

  3. Set the environment variables

    Append the following to /etc/profile:

    export JAVA_HOME=/usr/lib/jdk
    export HADOOP_HOME=/home/ad/hadoop-2.6.0
    export HADOOP_PREFIX=$HADOOP_HOME
    export FLINK_HOME=/usr/lib/flink
    export HADOOP_CLASSPATH=`${HADOOP_HOME}/bin/hadoop classpath`
    export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$FLINK_HOME/bin

    Make the environment variables take effect:

    $ source /etc/profile

    Verify:

    $ hadoop version

    Hadoop 2.6.0
    Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
    Compiled by jenkins on 2014-11-13T21:10Z
    Compiled with protoc 2.5.0
    From source with checksum 18e43357c8f927c0695f1e9522859d6a
    This command was run using /home/ad/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar

  4. Passwordless SSH login
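
    A minimal sketch for this single-node setup (default ssh-keygen key names assumed):

    # generate a passphrase-less key pair and authorize it for logins to hadoop01
    $ ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa
    $ ssh-copy-id hadoop01
    $ ssh hadoop01    # should now log in without a password prompt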

  5. Edit the configuration files

core-site.xml

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://hadoop01:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/opt/hadoop/tmp</value>
  </property>
</configuration>

hadoop-env.sh, mapred-env.sh, yarn-env.sh

Set the JAVA_HOME path in each of these scripts.
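
For example, in hadoop-env.sh (the same line goes into mapred-env.sh and yarn-env.sh):

export JAVA_HOME=/usr/lib/jdk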

hdfs-site.xml

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>

yarn-site.xml

<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.resourcemanager.hostname</name>
    <value>hadoop01</value>
  </property>
</configuration>

cp mapred-site.xml.template mapred-site.xml

<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>

slaves

hadoop01

  6. Format HDFS on hadoop01

hadoop namenode -format

  7. Start the Hadoop cluster

$ start-all.sh

This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
Starting namenodes on [hadoop01]
hadoop01: starting namenode, logging to /home/ad/hadoop-2.6.0/logs/hadoop-root-namenode-ad-virtual-machine.out
hadoop01: starting datanode, logging to /home/ad/hadoop-2.6.0/logs/hadoop-root-datanode-ad-virtual-machine.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /home/ad/hadoop-2.6.0/logs/hadoop-root-secondarynamenode-ad-virtual-machine.out
starting yarn daemons
starting resourcemanager, logging to /home/ad/hadoop-2.6.0/logs/yarn-root-resourcemanager-ad-virtual-machine.out
hadoop01: starting nodemanager, logging to /home/ad/hadoop-2.6.0/logs/yarn-root-nodemanager-ad-virtual-machine.out
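
Before moving on, it is worth confirming with jps that all daemons are up (a quick check; the exact list depends on the roles started above):

$ jps
# expect NameNode, DataNode, SecondaryNameNode, ResourceManager and NodeManager processes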

  8. Open the YARN ResourceManager web UI at hadoop01:8088
(screenshot: YARN ResourceManager web UI)

III. Verifying Hadoop YARN

Create HDFS data directories

(1) Create a directory to hold the input files for the MapReduce job:

hadoop fs -mkdir -p /data/wordcount

(2) Create a directory to hold the output files of the MapReduce job:

hadoop fs -mkdir /output

Check the two directories just created:

hadoop fs -ls /
drwxr-xr-x - root supergroup 0 2017-09-01 20:34 /data
drwxr-xr-x - root supergroup 0 2017-09-01 20:35 /output

(3) Create a word file and upload it to HDFS

The word file looks like this:

cat myword.txt
leaf yyh
yyh xpleaf
katy ling
yeyonghao leaf
xpleaf katy

Upload the file to HDFS:

hadoop fs -put myword.txt /data/wordcount

Check the uploaded file and its contents in HDFS:

hadoop fs -ls /data/wordcount
-rw-r--r-- 1 root supergroup 57 2017-09-01 20:40 /data/wordcount/myword.txt
hadoop fs -cat /data/wordcount/myword.txt
leaf yyh
yyh xpleaf
katy ling
yeyonghao leaf
xpleaf katy

(4) Run the wordcount example

Run the following command:

hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar wordcount /data/wordcount /output/wordcount

...
17/09/01 20:48:14 INFO mapreduce.Job: Job job_local1719603087_0001 completed successfully
17/09/01 20:48:14 INFO mapreduce.Job: Counters: 38
File System Counters
FILE: Number of bytes read=585940
FILE: Number of bytes written=1099502
FILE: Number of read operations=0
FILE: Number of large read operations=0
FILE: Number of write operations=0
HDFS: Number of bytes read=114
HDFS: Number of bytes written=48
HDFS: Number of read operations=15
HDFS: Number of large read operations=0
HDFS: Number of write operations=4
Map-Reduce Framework
Map input records=5
Map output records=10
Map output bytes=97
Map output materialized bytes=78
Input split bytes=112
Combine input records=10
Combine output records=6
Reduce input groups=6
Reduce shuffle bytes=78
Reduce input records=6
Reduce output records=6
Spilled Records=12
Shuffled Maps =1
Failed Shuffles=0
Merged Map outputs=1
GC time elapsed (ms)=92
CPU time spent (ms)=0
Physical memory (bytes) snapshot=0
Virtual memory (bytes) snapshot=0
Total committed heap usage (bytes)=241049600
Shuffle Errors
BAD_ID=0
CONNECTION=0
IO_ERROR=0
WRONG_LENGTH=0
WRONG_MAP=0
WRONG_REDUCE=0
File Input Format Counters
Bytes Read=57
File Output Format Counters
Bytes Written=48
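
The word counts themselves can then be read back from HDFS (a sketch; part-r-00000 is the usual reducer output file name):

hadoop fs -cat /output/wordcount/part-r-00000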

IV. Setting Up Flink on YARN

  1. Flink Session
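
    In session mode a long-running Flink cluster is first brought up on YARN, and jobs are then submitted into it. A minimal sketch (memory and slot values are illustrative, not from the original setup):

    # start a detached YARN session: 1 GB JobManager, 2 GB TaskManagers, 2 slots each
    $ ./bin/yarn-session.sh -jm 1024m -tm 2048m -s 2 -d

    # submit a job into the running session
    $ ./bin/flink run ./examples/batch/WordCount.jar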

  2. Flink Per-job

    In per-job mode each submission brings up its own YARN application, which is torn down when the job finishes. Submit the batch WordCount example with:

$ ./bin/flink run -m yarn-cluster ./examples/batch/WordCount.jar

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/ad/flink/flink-1.12.2/lib/log4j-slf4j-impl-2.12.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/ad/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Executing WordCount example with default input data set.
Use --input to specify file input.
Printing result to stdout. Use --output to specify output path.
2021-08-14 11:17:35,074 WARN  org.apache.flink.yarn.configuration.YarnLogConfigUtil [] - The configuration directory ('/home/ad/flink/flink-1.12.2/conf') already contains a LOG4J config file.If you want to use logback, then please delete or rename the log configuration file.
2021-08-14 11:17:35,122 INFO  org.apache.hadoop.yarn.client.RMProxy [] - Connecting to ResourceManager at hadoop01/192.168.73.130:8032
2021-08-14 11:17:35,238 INFO  org.apache.flink.yarn.YarnClusterDescriptor [] - No path for the flink jar passed. Using the location of class org.apache.flink.yarn.YarnClusterDescriptor to locate the jar
2021-08-14 11:17:35,339 WARN  org.apache.flink.yarn.YarnClusterDescriptor [] - Neither the HADOOP_CONF_DIR nor the YARN_CONF_DIR environment variable is set. The Flink YARN Client needs one of these to be set to properly load the Hadoop configuration for accessing YARN.
2021-08-14 11:17:35,372 INFO  org.apache.flink.yarn.YarnClusterDescriptor [] - The configured JobManager memory is 1600 MB. YARN will allocate 2048 MB to make up an integer multiple of its minimum allocation memory (1024 MB, configured via 'yarn.scheduler.minimum-allocation-mb'). The extra 448 MB may not be used by Flink.
2021-08-14 11:17:35,373 INFO  org.apache.flink.yarn.YarnClusterDescriptor [] - The configured TaskManager memory is 1728 MB. YARN will allocate 2048 MB to make up an integer multiple of its minimum allocation memory (1024 MB, configured via 'yarn.scheduler.minimum-allocation-mb'). The extra 320 MB may not be used by Flink.
2021-08-14 11:17:35,374 INFO  org.apache.flink.yarn.YarnClusterDescriptor [] - Cluster specification: ClusterSpecification{masterMemoryMB=1600, taskManagerMemoryMB=1728, slotsPerTaskManager=4}
2021-08-14 11:17:39,080 INFO  org.apache.flink.yarn.YarnClusterDescriptor [] - Submitting application master application_1628910991546_0001
2021-08-14 11:17:39,472 INFO  org.apache.hadoop.yarn.client.api.impl.YarnClientImpl [] - Submitted application application_1628910991546_0001
2021-08-14 11:17:39,472 INFO  org.apache.flink.yarn.YarnClusterDescriptor [] - Waiting for the cluster to be allocated
2021-08-14 11:17:39,474 INFO  org.apache.flink.yarn.YarnClusterDescriptor [] - Deploying cluster, current state ACCEPTED
2021-08-14 11:17:49,830 INFO  org.apache.flink.yarn.YarnClusterDescriptor [] - YARN application has been deployed successfully.
2021-08-14 11:17:49,833 INFO  org.apache.flink.yarn.YarnClusterDescriptor [] - Found Web Interface ad-virtual-machine:36059 of application 'application_1628910991546_0001'.
Job has been submitted with JobID addaa84fd2ee06164ba7d53a029a6342
Program execution finished
Job with JobID addaa84fd2ee06164ba7d53a029a6342 has finished.
Job Runtime: 12937 ms
Accumulator Results:

  • 23a767877a2b6289cf181a8732c5d46a (java.util.ArrayList) [170 elements]

(a,5) (action,1) (after,1) (against,1) (all,2) (and,12) (arms,1) (arrows,1) (awry,1) (ay,1) (bare,1) (be,4) (bear,3) (bodkin,1) (bourn,1) (but,1) (by,2) (calamity,1) (cast,1) (coil,1) (come,1) (conscience,1) (consummation,1) (contumely,1) (country,1) (cowards,1) (currents,1) (d,4) (death,2) (delay,1) (despis,1) (devoutly,1) (die,2) (does,1) (dread,1) (dream,1) (dreams,1) (end,2) (enterprises,1) (er,1) (fair,1) (fardels,1) (flesh,1) (fly,1) (for,2) (fortune,1) (from,1) (give,1) (great,1) (grunt,1) (have,2) (he,1) (heartache,1) (heir,1) (himself,1) (his,1) (hue,1) (ills,1) (in,3) (insolence,1) (is,3) (know,1) (law,1) (life,2) (long,1) (lose,1) (love,1) (make,2) (makes,2) (man,1) (may,1) (merit,1) (might,1) (mind,1) (moment,1) (more,1) (mortal,1) (must,1) (my,1) (name,1) (native,1) (natural,1) (no,2) (nobler,1) (not,2) (now,1) (nymph,1) (o,1) (of,15) (off,1) (office,1) (ophelia,1) (opposing,1) (oppressor,1) (or,2) (orisons,1) (others,1) (outrageous,1) (pale,1) (pangs,1) (patient,1) (pause,1) (perchance,1) (pith,1) (proud,1) (puzzles,1) (question,1) (quietus,1) (rather,1) (regard,1) (remember,1) (resolution,1) (respect,1) (returns,1) (rub,1) (s,5) (say,1) (scorns,1) (sea,1) (shocks,1) (shuffled,1) (sicklied,1) (sins,1) (sleep,5) (slings,1) (so,1) (soft,1) (something,1) (spurns,1) (suffer,1) (sweat,1) (take,1) (takes,1) (than,1) (that,7) (the,22) (their,1) (them,1) (there,2) (these,1) (this,2) (those,1) (thought,1) (thousand,1) (thus,2) (thy,1) (time,1) (tis,2) (to,15) (traveller,1) (troubles,1) (turn,1) (under,1) (undiscover,1) (unworthy,1) (us,3) (we,4) (weary,1) (what,1) (when,2) (whether,1) (whips,1) (who,2) (whose,1) (will,1) (wish,1) (with,3) (would,2) (wrong,1) (you,1)
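
The submitted job can also be checked from the YARN side (a sketch; the per-job application shows up as FINISHED once the run completes):

$ yarn application -list -appStates ALL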
