Flink 1.10 sql-client connecting to Hive: a record of the pitfalls

Note: I am running CDH 5.13.2, whose bundled Hive version is 1.1.0-cdh5.13.2, with Flink 1.10. The configuration steps were as follows.

First, edit flink-1.10/conf/sql-client-defaults.yaml and add the catalog settings for Hive. On CDH the hive-conf directory is /etc/hive/conf.cloudera.hive:


#==============================================================================
# Catalogs
#==============================================================================

# Define catalogs here.

#catalogs: [] # empty list
# A typical catalog definition looks like:
#  - name: myhive
#    type: hive
#    hive-conf-dir: /opt/hive_conf/
#    default-database: ...

catalogs:
  - name: myhive
    type: hive
    hive-conf-dir: /etc/hive/conf.cloudera.hive
    hive-version: 1.2.1
    property-version: 1
    default-database: default
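Before going further, a quick sanity check may help: the HiveCatalog reads the metastore address from hive-site.xml in the configured hive-conf-dir, so confirm the file is there (plain shell, using the path above):

ls /etc/hive/conf.cloudera.hive/hive-site.xml
grep -A1 'hive.metastore.uris' /etc/hive/conf.cloudera.hive/hive-site.xml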

The Flink distribution does not ship the Hive-related libraries, so I pulled them in myself:

wget https://repo1.maven.org/maven2/org/apache/flink/flink-connector-hive_2.11/1.10.0/flink-connector-hive_2.11-1.10.0.jar

wget https://repo1.maven.org/maven2/org/apache/flink/flink-hadoop-compatibility_2.11/1.10.0/flink-hadoop-compatibility_2.11-1.10.0.jar
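Both jars need to end up in flink-1.10/lib as well; assuming the wget commands were run from the Flink root directory, something like:

mv flink-connector-hive_2.11-1.10.0.jar flink-hadoop-compatibility_2.11-1.10.0.jar lib/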

Then download Hive 1.2.1 and extract it:

http://archive.apache.org/dist/hive/hive-1.2.1/

Copy the following jars into flink-1.10/lib (a sketch of the copy commands follows the listing):


-rw-r--r-- 1 root root 164368 Mar 27 15:54 antlr-runtime-3.4.jar

-rw-r--r-- 1 root root    448794 Mar 27 15:57 apache-log4j-extras-1.2.17.jar

-rw-r--r-- 1 root root    91473 Mar 27 16:11 commons-cli-2.0-mahout.jar

-rw-r--r-- 1 root root    298829 Mar 27 16:16 commons-configuration-1.6.jar

-rw-r--r-- 1 root root    62050 Mar 27 15:56 commons-logging-1.1.3.jar

-rw-r--r-- 1 root root    339666 Mar 27 15:54 datanucleus-api-jdo-3.2.6.jar

-rw-r--r-- 1 root root  1890075 Mar 27 15:54 datanucleus-core-3.2.10.jar

-rw-r--r-- 1 root root  1809447 Mar 27 15:54 datanucleus-rdbms-3.2.9.jar

-rw-r--r-- 1 root root    292290 Mar 27 15:53 hive-common-1.2.1.jar

-rw-r--r-- 1 root root  20599030 Mar 27 15:56 hive-exec-1.2.1.jar

-rw-r--r-- 1 root root  5505100 Mar 27 15:54 hive-metastore-1.2.1.jar

-rw-r--r-- 1 root root    108914 Mar 27 15:53 hive-shims-common-1.2.1.jar

-rw-r--r-- 1 root root    201124 Mar 27 15:57 jdo-api-3.0.1.jar

-rw-r--r-- 1 root root    313686 Mar 27 15:54 libfb303-0.9.2.jar

-rw-r--r-- 1 root root    481535 Mar 27 15:57 log4j-1.2.16.jar

-rw-r--r-- 1 root root      9931 Sep  2  2019 slf4j-log4j12-1.7.15.jar
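Most of these come straight from the lib directory of the extracted Hive 1.2.1 distribution. A rough sketch of the copy, run from the Flink root and assuming apache-hive-1.2.1-bin was extracted next to flink-1.10.0 (the commons-cli and commons-configuration jars are handled separately below, from the CDH parcels):

for j in hive-common-1.2.1 hive-exec-1.2.1 hive-metastore-1.2.1 hive-shims-common-1.2.1 \
         antlr-runtime-3.4 datanucleus-api-jdo-3.2.6 datanucleus-core-3.2.10 datanucleus-rdbms-3.2.9 \
         jdo-api-3.0.1 libfb303-0.9.2 log4j-1.2.16 apache-log4j-extras-1.2.17 commons-logging-1.1.3; do
    cp ../apache-hive-1.2.1-bin/lib/$j.jar lib/
done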

Start the client with bin/sql-client.sh embedded. The errors hit along the way, and how each one was resolved:


Exception in thread "main" java.lang.NoSuchMethodError: org.apache.commons.cli.Option.builder(Ljava/lang/String;)Lorg/apache/commons/cli/Option$Builder;

        at org.apache.flink.table.client.cli.CliOptionsParser.<clinit>(CliOptionsParser.java:43)

        at org.apache.flink.table.client.SqlClient.main(SqlClient.java:172)

Delete commons-cli-1.2.1.jar and bring in commons-cli-2.0 from the CDH jars directory instead:


[root@kylin03 flink-1.10.0]# cd lib

[root@kylin03 lib]# ls

antlr-runtime-3.4.jar          datanucleus-core-3.2.10.jar                flink-table_2.11-1.10.0.jar        hive-shims-common-1.2.1.jar          slf4j-log4j12-1.7.15.jar

apache-log4j-extras-1.2.17.jar  datanucleus-rdbms-3.2.9.jar                flink-table-blink_2.11-1.10.0.jar  jdo-api-3.0.1.jar

commons-cli-1.2.1.jar          flink-connector-hive_2.11-1.10.0.jar        hive-common-1.2.1.jar              libfb303-0.9.2.jar

commons-logging-1.1.3.jar      flink-dist_2.11-1.10.0.jar                  hive-exec-1.2.1.jar                log4j-1.2.16.jar

datanucleus-api-jdo-3.2.6.jar  flink-hadoop-compatibility_2.11-1.10.0.jar  hive-metastore-1.2.1.jar          mysql-connector-java-5.1.42-bin.jar

[root@kylin03 lib]# rm commons-cli-1.2.1.jar

rm: remove regular file `commons-cli-1.2.1.jar'? y

[root@kylin03 lib]# cp /opt/cloudera/parcels/CDH/jars/commons-cli-

commons-cli-1.2.jar        commons-cli-2.0-mahout.jar 

[root@kylin03 lib]# cp /opt/cloudera/parcels/CDH/jars/commons-cli-2.0-mahout.jar .


[root@kylin03 flink-1.10.0]# bin/sql-client.sh embedded

No default environment specified.

Searching for '/usr/local/flink-1.10.0/conf/sql-client-defaults.yaml'...found.

Reading default environment from: file:/usr/local/flink-1.10.0/conf/sql-client-defaults.yaml

No session environment specified.

Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.

        at org.apache.flink.table.client.SqlClient.main(SqlClient.java:190)

Caused by: org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.

        at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:753)

        at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:228)

        at org.apache.flink.table.client.SqlClient.start(SqlClient.java:98)

        at org.apache.flink.table.client.SqlClient.main(SqlClient.java:178)

Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration

        at org.apache.flink.table.catalog.hive.factories.HiveCatalogFactory.createCatalog(HiveCatalogFactory.java:84)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.createCatalog(ExecutionContext.java:371)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$4(ExecutionContext.java:547)

        at java.util.HashMap.forEach(HashMap.java:1288)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$5(ExecutionContext.java:546)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:240)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:545)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:494)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:159)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:118)

        at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:742)

        ... 3 more

Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration

        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)

        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)

        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)

        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

        ... 14 more

The Hadoop jars are missing; copy hadoop-common from the CDH directory:

[root@kylin03 flink-1.10.0]# cp /opt/cloudera/parcels/CDH/jars/hadoop-co

hadoop-common-2.6.0-cdh5.13.2.jar        hadoop-common-2.6.0-cdh5.13.2-tests.jar  hadoop-core-2.6.0-mr1-cdh5.13.2.jar     

[root@kylin03 flink-1.10.0]# cp /opt/cloudera/parcels/CDH/jars/hadoop-co

hadoop-common-2.6.0-cdh5.13.2.jar        hadoop-common-2.6.0-cdh5.13.2-tests.jar  hadoop-core-2.6.0-mr1-cdh5.13.2.jar     

[root@kylin03 flink-1.10.0]# cp /opt/cloudera/parcels/CDH/jars/hadoop-common-2.6.0-cdh5.13.2.jar lib/

[root@kylin03 flink-1.10.0]# bin/sql-client.sh embedded                                             

No default environment specified.

Searching for '/usr/local/flink-1.10.0/conf/sql-client-defaults.yaml'...found.

Reading default environment from: file:/usr/local/flink-1.10.0/conf/sql-client-defaults.yaml

No session environment specified.

Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.

        at org.apache.flink.table.client.SqlClient.main(SqlClient.java:190)

Caused by: org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.

        at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:753)

        at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:228)

        at org.apache.flink.table.client.SqlClient.start(SqlClient.java:98)

        at org.apache.flink.table.client.SqlClient.main(SqlClient.java:178)

Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/TaskAttemptContext

        at java.lang.Class.forName0(Native Method)

        at java.lang.Class.forName(Class.java:264)

        at org.apache.hadoop.hive.shims.ShimLoader.createShim(ShimLoader.java:146)

        at org.apache.hadoop.hive.shims.ShimLoader.loadShims(ShimLoader.java:141)

        at org.apache.hadoop.hive.shims.ShimLoader.getHadoopShims(ShimLoader.java:100)

        at org.apache.hadoop.hive.conf.HiveConf$ConfVars.<clinit>(HiveConf.java:368)

        at org.apache.hadoop.hive.conf.HiveConf.<clinit>(HiveConf.java:105)

        at org.apache.flink.table.catalog.hive.HiveCatalog.createHiveConf(HiveCatalog.java:162)

        at org.apache.flink.table.catalog.hive.HiveCatalog.<init>(HiveCatalog.java:140)

        at org.apache.flink.table.catalog.hive.factories.HiveCatalogFactory.createCatalog(HiveCatalogFactory.java:84)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.createCatalog(ExecutionContext.java:371)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$4(ExecutionContext.java:547)

        at java.util.HashMap.forEach(HashMap.java:1288)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$5(ExecutionContext.java:546)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:240)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:545)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:494)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:159)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:118)

        at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:742)

        ... 3 more

Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapreduce.TaskAttemptContext

        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)

        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)

        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)

        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

        ... 23 more

The MapReduce jar is missing; copy it from the CDH directory:

[root@kylin03 flink-1.10.0]# cp /opt/cloudera/parcels/CDH/jars/hadoop-mapreduce-client-core-2.6.0-cdh5.13.2.jar lib/

[root@kylin03 flink-1.10.0]# bin/sql-client.sh embedded

No default environment specified.

Searching for '/usr/local/flink-1.10.0/conf/sql-client-defaults.yaml'...found.

Reading default environment from: file:/usr/local/flink-1.10.0/conf/sql-client-defaults.yaml

No session environment specified.

Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.

        at org.apache.flink.table.client.SqlClient.main(SqlClient.java:190)

Caused by: org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.

        at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:753)

        at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:228)

        at org.apache.flink.table.client.SqlClient.start(SqlClient.java:98)

        at org.apache.flink.table.client.SqlClient.main(SqlClient.java:178)

Caused by: java.lang.NoClassDefFoundError: org/apache/commons/configuration/Configuration

        at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<init>(DefaultMetricsSystem.java:38)

        at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<clinit>(DefaultMetricsSystem.java:36)

        at org.apache.hadoop.security.UserGroupInformation$UgiMetrics.create(UserGroupInformation.java:139)

        at org.apache.hadoop.security.UserGroupInformation.<clinit>(UserGroupInformation.java:259)

        at org.apache.hadoop.conf.Configuration$Resource.getRestrictParserDefault(Configuration.java:245)

        at org.apache.hadoop.conf.Configuration$Resource.<init>(Configuration.java:213)

        at org.apache.hadoop.conf.Configuration$Resource.<init>(Configuration.java:205)

        at org.apache.hadoop.conf.Configuration.addResource(Configuration.java:863)

        at org.apache.flink.api.java.hadoop.mapred.utils.HadoopUtils.getHadoopConfiguration(HadoopUtils.java:102)

        at org.apache.flink.table.catalog.hive.HiveCatalog.createHiveConf(HiveCatalog.java:171)

        at org.apache.flink.table.catalog.hive.HiveCatalog.<init>(HiveCatalog.java:140)

        at org.apache.flink.table.catalog.hive.factories.HiveCatalogFactory.createCatalog(HiveCatalogFactory.java:84)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.createCatalog(ExecutionContext.java:371)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$4(ExecutionContext.java:547)

        at java.util.HashMap.forEach(HashMap.java:1288)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$5(ExecutionContext.java:546)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:240)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:545)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:494)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:159)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:118)

        at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:742)

        ... 3 more

Caused by: java.lang.ClassNotFoundException: org.apache.commons.configuration.Configuration

        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)

        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)

        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)

        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

        ... 25 more

The commons-configuration classes are missing as well; copy that jar from the CDH directory too:

[root@kylin03 flink-1.10.0]# cp /opt/cloudera/parcels/CDH/jars/commons-

commons-beanutils-1.8.3.jar      commons-compress-1.4.1.jar        commons-exec-1.1.jar              commons-lang-2.6.jar              commons-math3-3.2.jar

commons-beanutils-1.9.2.jar      commons-compress-1.4.jar          commons-fileupload-1.3.2.jar      commons-lang3-3.1.jar            commons-math3-3.4.1.jar

commons-beanutils-core-1.8.0.jar  commons-compress-1.9.jar          commons-httpclient-3.0.1.jar      commons-lang3-3.3.2.jar          commons-net-1.4.1.jar

commons-cli-1.2.jar              commons-configuration-1.6.jar    commons-httpclient-3.1.jar        commons-lang3-3.4.jar            commons-net-3.1.jar

commons-cli-2.0-mahout.jar        commons-configuration-1.7.jar    commons-io-1.4.jar                commons-logging-1.1.1.jar        commons-pool-1.5.4.jar

commons-codec-1.4.jar            commons-daemon-1.0.13.jar        commons-io-2.1.jar                commons-logging-1.1.3.jar        commons-pool2-2.4.2.jar

commons-codec-1.6.jar            commons-daemon.jar                commons-io-2.3.jar                commons-logging-1.1.jar          commons-vfs2-2.0.jar

commons-codec-1.8.jar            commons-dbcp-1.4.jar              commons-io-2.4.jar                commons-logging-1.2.jar         

commons-codec-1.9.jar            commons-digester-1.8.1.jar        commons-jexl-2.1.1.jar            commons-math-2.1.jar             

commons-collections-3.2.2.jar    commons-digester-1.8.jar          commons-lang-2.4.jar              commons-math-2.2.jar             

commons-compiler-2.7.6.jar        commons-el-1.0.jar                commons-lang-2.5.jar              commons-math3-3.1.1.jar         

[root@kylin03 flink-1.10.0]# cp /opt/cloudera/parcels/CDH/jars/commons-configuration-1.6.jar lib/

[root@kylin03 flink-1.10.0]# bin/sql-client.sh embedded                                         

No default environment specified.

Searching for '/usr/local/flink-1.10.0/conf/sql-client-defaults.yaml'...found.

Reading default environment from: file:/usr/local/flink-1.10.0/conf/sql-client-defaults.yaml

No session environment specified.

Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.

        at org.apache.flink.table.client.SqlClient.main(SqlClient.java:190)

Caused by: org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.

        at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:753)

        at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:228)

        at org.apache.flink.table.client.SqlClient.start(SqlClient.java:98)

        at org.apache.flink.table.client.SqlClient.main(SqlClient.java:178)

Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/util/PlatformName

        at org.apache.hadoop.security.UserGroupInformation.getOSLoginModuleName(UserGroupInformation.java:442)

        at org.apache.hadoop.security.UserGroupInformation.<clinit>(UserGroupInformation.java:487)

        at org.apache.hadoop.conf.Configuration$Resource.getRestrictParserDefault(Configuration.java:245)

        at org.apache.hadoop.conf.Configuration$Resource.<init>(Configuration.java:213)

        at org.apache.hadoop.conf.Configuration$Resource.<init>(Configuration.java:205)

        at org.apache.hadoop.conf.Configuration.addResource(Configuration.java:863)

        at org.apache.flink.api.java.hadoop.mapred.utils.HadoopUtils.getHadoopConfiguration(HadoopUtils.java:102)

        at org.apache.flink.table.catalog.hive.HiveCatalog.createHiveConf(HiveCatalog.java:171)

        at org.apache.flink.table.catalog.hive.HiveCatalog.<init>(HiveCatalog.java:140)

        at org.apache.flink.table.catalog.hive.factories.HiveCatalogFactory.createCatalog(HiveCatalogFactory.java:84)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.createCatalog(ExecutionContext.java:371)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$4(ExecutionContext.java:547)

        at java.util.HashMap.forEach(HashMap.java:1288)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$5(ExecutionContext.java:546)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:240)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:545)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:494)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:159)

        at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:118)

        at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:742)

        ... 3 more

Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.util.PlatformName

        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)

        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)

        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)

        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

        ... 23 more

The missing org.apache.hadoop.util.PlatformName class is provided by hadoop-auth, so copy that jar from the CDH directory as well (hadoop-common gets re-copied along the way):

[root@kylin03 flink-1.10.0]# cp /opt/cloudera/parcels/CDH/jars/hadoop-co

hadoop-common-2.6.0-cdh5.13.2.jar        hadoop-common-2.6.0-cdh5.13.2-tests.jar  hadoop-core-2.6.0-mr1-cdh5.13.2.jar     

[root@kylin03 flink-1.10.0]# cp /opt/cloudera/parcels/CDH/jars/hadoop-co

hadoop-common-2.6.0-cdh5.13.2.jar        hadoop-common-2.6.0-cdh5.13.2-tests.jar  hadoop-core-2.6.0-mr1-cdh5.13.2.jar     

[root@kylin03 flink-1.10.0]# cp /opt/cloudera/parcels/CDH/jars/hadoop-common-2.6.0-cdh5.13.2.jar  lib/

cp: overwrite `lib/hadoop-common-2.6.0-cdh5.13.2.jar'? y

[root@kylin03 flink-1.10.0]# cp /opt/cloudera/parcels/CDH/jars/hadoop-auth-2.6.0-cdh5.13.2.jar lib/

With that, the client finally starts up normally:


[root@kylin03 flink-1.10.0]# bin/sql-client.sh embedded                                             

No default environment specified.

Searching for '/usr/local/flink-1.10.0/conf/sql-client-defaults.yaml'...found.

Reading default environment from: file:/usr/local/flink-1.10.0/conf/sql-client-defaults.yaml

No session environment specified.

                                  ... (Flink squirrel logo and "Flink SQL Client" BETA ASCII-art splash screen) ...

        Welcome! Enter 'HELP;' to list all available commands. 'QUIT;' to exit.

Flink SQL> show databases;

default_database

Flink SQL> show tables;

[INFO] Result was empty.

Flink SQL> use catalog myhive;

Flink SQL> show databases;

default

dim

temp

Flink SQL> show tables;

my_first_table

test

Flink SQL> quit;

[INFO] Exiting Flink SQL CLI Client...

Shutting down the session...

done.

[root@kylin03 flink-1.10.0]#
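One follow-up note: instead of copying the Hadoop jars into lib/ one by one, a commonly recommended alternative for Flink 1.10 is to put the cluster's Hadoop classpath on Flink's classpath via the HADOOP_CLASSPATH environment variable. I did not test this here; just a sketch of what it would look like:

export HADOOP_CLASSPATH=$(hadoop classpath)
bin/sql-client.sh embedded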

Thanks to:

https://blog.csdn.net/u014310499/article/details/103905804

https://blog.csdn.net/h335146502/article/details/100689010
