Flink SQL in Production: A Record of Pitfalls

Published: 2024-01-09 09:52:06 · Author: 粒子先生

Background: the Flink SQL job ran fine when tested locally, but in production it threw all kinds of bizarre errors that caused no end of headaches. This post records how they were resolved.

The problems came down to two causes:

1. A JDK version issue

2. Jar conflicts among the Flink SQL dependencies

Problem 1

2020-09-27 06:06:33,125 INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager [] - Registering TaskManager with ResourceID c6131ee551f94eb9c3db0568f40b4ad2 (akka.tcp://flink@10.42.4.11:6122/user/rpc/taskmanager_0) at ResourceManager
2020-09-27 06:06:46,727 INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager [] - Registering TaskManager with ResourceID 6e7bbcf953908a8bdd42327b40d325c7 (akka.tcp://flink@10.42.1.117:6122/user/rpc/taskmanager_0) at ResourceManager
2020-09-27 06:07:47,447 WARN org.apache.flink.runtime.webmonitor.handlers.JarRunHandler [] - Configuring the job submission via query parameters is deprecated. Please migrate to submitting a JSON request instead.
2020-09-27 06:07:47,524 INFO org.apache.flink.client.ClientUtils [] - Starting program (detached: true)
2020-09-27 06:07:47,548 INFO com.jc.dw.sql.exec.ExecuteProcessHelper [] - ------------program params-------------------------
2020-09-27 06:07:47,548 INFO com.jc.dw.sql.exec.ExecuteProcessHelper [] - -metadataUrl
2020-09-27 06:07:47,548 INFO com.jc.dw.sql.exec.ExecuteProcessHelper [] - http://dev-env.jcinfo.com//metadata/api/pipeline/0e67d7c9ee02445a9c709f83b1a2ca82
2020-09-27 06:07:47,548 INFO com.jc.dw.sql.exec.ExecuteProcessHelper [] - -------------------------------------------
2020-09-27 06:07:47,569 INFO com.jc.dw.metadata.MetadataInfoImpl [] - http get metadata. url:http://dev-env.jcinfo.com//metadata/api/pipeline/0e67d7c9ee02445a9c709f83b1a2ca82
2020-09-27 06:07:48,107 INFO com.jc.dw.metadata.MetadataInfoImpl [] - PipelineMetadata before sorting:{"attributeProcess":[],"destModelName":"ccc","destSchema":"{\"type\":\"record\",\"name\":\"ccc\",\"doc\":\"\",\"fields\":[{\"name\":\"ccc\",\"type\":[\"null\",\"string\"],\"doc\":\"\",\"default\":null}]}","dictInputs":[],"inputs":[{"database":"default","layer":10,"layerName":"ODS","name":"ceshi19","password":"password","tableName":"ods_ceshi19","type":"hive","url":"172.31.6.20:10000","username":"username"}],"output":{"database":"default","layer":20,"layerName":"DWD","password":"password","tableName":"dwd_ccc","type":"hive","url":"172.31.6.20:10000","username":"username"},"pipelineId":"0e67d7c9ee02445a9c709f83b1a2ca82","pipelineName":"sql测试01","pipelineType":"2","sqls":["insert into user_behavior_sink3 SELECT keyword FROM sql_test_07"],"srcInput":{"$ref":"$.inputs[0]"},"srcModelName":"ceshi19","srcSchema":"{\"type\":\"record\",\"name\":\"测试\",\"doc\":\"\",\"fields\":[{\"name\":\"c\",\"type\":[\"null\",\"string\"],\"doc\":\"\",\"default\":null}]}"}
2020-09-27 06:07:48,108 INFO com.jc.dw.metadata.MetadataInfoImpl [] - PipelineMetadata:{"attributeProcess":[],"destModelName":"ccc","destSchema":"{\"type\":\"record\",\"name\":\"ccc\",\"doc\":\"\",\"fields\":[{\"name\":\"ccc\",\"type\":[\"null\",\"string\"],\"doc\":\"\",\"default\":null}]}","dictInputs":[],"inputs":[{"database":"default","layer":10,"layerName":"ODS","name":"ceshi19","password":"password","tableName":"ods_ceshi19","type":"hive","url":"172.31.6.20:10000","username":"username"}],"output":{"database":"default","layer":20,"layerName":"DWD","password":"password","tableName":"dwd_ccc","type":"hive","url":"172.31.6.20:10000","username":"username"},"pipelineId":"0e67d7c9ee02445a9c709f83b1a2ca82","pipelineName":"sql测试01","pipelineType":"2","sqls":["insert into user_behavior_sink3 SELECT keyword FROM sql_test_07"],"srcInput":{"$ref":"$.inputs[0]"},"srcModelName":"ceshi19","srcSchema":"{\"type\":\"record\",\"name\":\"测试\",\"doc\":\"\",\"fields\":[{\"name\":\"c\",\"type\":[\"null\",\"string\"],\"doc\":\"\",\"default\":null}]}"}
2020-09-27 06:07:49,096 INFO org.apache.hadoop.hive.conf.HiveConf [] - Found configuration file jar:file:/tmp/jars/flink-web-35e52647-4cdb-484b-a37d-bf3949e2acea/flink-web-upload/2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar!/hive-site.xml
2020-09-27 06:07:49,653 INFO com.jc.dw.sql.catalog.HiveCatalogManager [] - getHiveCataLog. name:myhive, defaultDatabase:default
2020-09-27 06:07:49,671 INFO org.apache.flink.table.catalog.hive.HiveCatalog [] - Created HiveCatalog 'myhive'
2020-09-27 06:07:49,820 WARN org.apache.hadoop.util.NativeCodeLoader [] - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2020-09-27 06:07:49,916 ERROR org.apache.hadoop.hive.metastore.utils.MetaStoreUtils [] - Got exception: java.lang.ClassCastException class [Ljava.lang.Object; cannot be cast to class [Ljava.net.URI; ([Ljava.lang.Object; and [Ljava.net.URI; are in module java.base of loader 'bootstrap')
java.lang.ClassCastException: class [Ljava.lang.Object; cannot be cast to class [Ljava.net.URI; ([Ljava.lang.Object; and [Ljava.net.URI; are in module java.base of loader 'bootstrap')
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.resolveUris(HiveMetaStoreClient.java:262) ~[2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar:?]
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:182) ~[2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar:?]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source) ~[?:?]
at jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source) ~[?:?]
at java.lang.reflect.Constructor.newInstance(Unknown Source) ~[?:?]
at org.apache.hadoop.hive.metastore.utils.JavaUtils.newInstance(JavaUtils.java:84) ~[2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar:?]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:95) ~[2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar:?]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:148) ~[2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar:?]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104) ~[2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
at java.lang.reflect.Method.invoke(Unknown Source) ~[?:?]
at org.apache.flink.table.catalog.hive.client.HiveShimV310.getHiveMetastoreClient(HiveShimV310.java:103) ~[2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar:?]
at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.createMetastoreClient(HiveMetastoreClientWrapper.java:240) ~[2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar:?]
at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.<init>(HiveMetastoreClientWrapper.java:71) ~[2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar:?]
at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientFactory.create(HiveMetastoreClientFactory.java:35) ~[2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar:?]
at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:223) ~[2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar:?]
at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191) ~[flink-table-blink_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337) ~[flink-table-blink_2.12-1.11.1.jar:1.11.1]
at com.jc.dw.sql.catalog.HiveCatalogManager.registerAndUseHiveCatalog(HiveCatalogManager.java:53) ~[2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar:?]
at com.jc.dw.sql.exec.ExecuteProcessHelper.sqlExecution(ExecuteProcessHelper.java:156) ~[2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar:?]
at com.jc.dw.sql.Main.main(Main.java:24) ~[2664d9ad-4ebb-452f-9bb9-98f8e92e0d81_core-1.10-SNAPSHOT.jar:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
at java.lang.reflect.Method.invoke(Unknown Source) ~[?:?]
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:288) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:198) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:149) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.client.deployment.application.DetachedApplicationRunner.tryExecuteJobs(DetachedApplicationRunner.java:78) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.client.deployment.application.DetachedApplicationRunner.run(DetachedApplicationRunner.java:67) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.runtime.webmonitor.handlers.JarRunHandler.lambda$handleRequest$0(JarRunHandler.java:100) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
at java.util.concurrent.CompletableFuture$AsyncSupply.run(Unknown Source) [?:?]
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source) [?:?]
at java.util.concurrent.FutureTask.run(Unknown Source) [?:?]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(Unknown Source) [?:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) [?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) [?:?]
at java.lang.Thread.run(Unknown Source) [?:?]
2020-09-27 06:07:49,922 ERROR org.apache.hadoop.hive.metastore.utils.MetaStoreUtils [] - Converting exception to MetaException
2020-09-27 06:07:49,924 WARN org.apache.flink.client.deployment.application.DetachedApplicationRunner [] - Could not execute application:
org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: Failed to create Hive Metastore client
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:302) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:198) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:149) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.client.deployment.application.DetachedApplicationRunner.tryExecuteJobs(DetachedApplicationRunner.java:78) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.client.deployment.application.DetachedApplicationRunner.run(DetachedApplicationRunner.java:67) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.runtime.webmonitor.handlers.JarRunHandler.lambda$handleRequest$0(JarRunHandler.java:100) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
at java.util.concurrent.CompletableFuture$AsyncSupply.run(Unknown Source) [?:?]
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source) [?:?]
at java.util.concurrent.FutureTask.run(Unknown Source) [?:?]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(Unknown Source) [?:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) [?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) [?:?]
at java.lang.Thread.run(Unknown Source) [?:?]
Caused by: org.apache.flink.table.catalog.exceptions.CatalogException: Failed to create Hive Metastore client
at org.apache.flink.table.catalog.hive.client.HiveShimV310.getHiveMetastoreClient(HiveShimV310.java:105) ~[?:?]
at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.createMetastoreClient(HiveMetastoreClientWrapper.java:240) ~[?:?]
at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.<init>(HiveMetastoreClientWrapper.java:71) ~[?:?]
at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientFactory.create(HiveMetastoreClientFactory.java:35) ~[?:?]
at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:223) ~[?:?]
at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191) ~[flink-table-blink_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337) ~[flink-table-blink_2.12-1.11.1.jar:1.11.1]
at com.jc.dw.sql.catalog.HiveCatalogManager.registerAndUseHiveCatalog(HiveCatalogManager.java:53) ~[?:?]
at com.jc.dw.sql.exec.ExecuteProcessHelper.sqlExecution(ExecuteProcessHelper.java:156) ~[?:?]
at com.jc.dw.sql.Main.main(Main.java:24) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
at java.lang.reflect.Method.invoke(Unknown Source) ~[?:?]
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:288) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
... 12 more
Caused by: java.lang.reflect.InvocationTargetException
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
at java.lang.reflect.Method.invoke(Unknown Source) ~[?:?]
at org.apache.flink.table.catalog.hive.client.HiveShimV310.getHiveMetastoreClient(HiveShimV310.java:103) ~[?:?]
at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.createMetastoreClient(HiveMetastoreClientWrapper.java:240) ~[?:?]
at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.<init>(HiveMetastoreClientWrapper.java:71) ~[?:?]
at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientFactory.create(HiveMetastoreClientFactory.java:35) ~[?:?]
at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:223) ~[?:?]
at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191) ~[flink-table-blink_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337) ~[flink-table-blink_2.12-1.11.1.jar:1.11.1]
at com.jc.dw.sql.catalog.HiveCatalogManager.registerAndUseHiveCatalog(HiveCatalogManager.java:53) ~[?:?]
at com.jc.dw.sql.exec.ExecuteProcessHelper.sqlExecution(ExecuteProcessHelper.java:156) ~[?:?]
at com.jc.dw.sql.Main.main(Main.java:24) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
at java.lang.reflect.Method.invoke(Unknown Source) ~[?:?]
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:288) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
... 12 more
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
at org.apache.hadoop.hive.metastore.utils.JavaUtils.newInstance(JavaUtils.java:86) ~[?:?]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:95) ~[?:?]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:148) ~[?:?]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
at java.lang.reflect.Method.invoke(Unknown Source) ~[?:?]
at org.apache.flink.table.catalog.hive.client.HiveShimV310.getHiveMetastoreClient(HiveShimV310.java:103) ~[?:?]
at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.createMetastoreClient(HiveMetastoreClientWrapper.java:240) ~[?:?]
at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.<init>(HiveMetastoreClientWrapper.java:71) ~[?:?]
at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientFactory.create(HiveMetastoreClientFactory.java:35) ~[?:?]
at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:223) ~[?:?]
at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191) ~[flink-table-blink_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337) ~[flink-table-blink_2.12-1.11.1.jar:1.11.1]
at com.jc.dw.sql.catalog.HiveCatalogManager.registerAndUseHiveCatalog(HiveCatalogManager.java:53) ~[?:?]
at com.jc.dw.sql.exec.ExecuteProcessHelper.sqlExecution(ExecuteProcessHelper.java:156) ~[?:?]
at com.jc.dw.sql.Main.main(Main.java:24) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
at java.lang.reflect.Method.invoke(Unknown Source) ~[?:?]
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:288) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
... 12 more
Caused by: java.lang.reflect.InvocationTargetException
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source) ~[?:?]
at jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source) ~[?:?]
at java.lang.reflect.Constructor.newInstance(Unknown Source) ~[?:?]
at org.apache.hadoop.hive.metastore.utils.JavaUtils.newInstance(JavaUtils.java:84) ~[?:?]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:95) ~[?:?]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:148) ~[?:?]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
at java.lang.reflect.Method.invoke(Unknown Source) ~[?:?]
at org.apache.flink.table.catalog.hive.client.HiveShimV310.getHiveMetastoreClient(HiveShimV310.java:103) ~[?:?]
at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.createMetastoreClient(HiveMetastoreClientWrapper.java:240) ~[?:?]
at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.<init>(HiveMetastoreClientWrapper.java:71) ~[?:?]
at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientFactory.create(HiveMetastoreClientFactory.java:35) ~[?:?]
at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:223) ~[?:?]
at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191) ~[flink-table-blink_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337) ~[flink-table-blink_2.12-1.11.1.jar:1.11.1]
at com.jc.dw.sql.catalog.HiveCatalogManager.registerAndUseHiveCatalog(HiveCatalogManager.java:53) ~[?:?]
at com.jc.dw.sql.exec.ExecuteProcessHelper.sqlExecution(ExecuteProcessHelper.java:156) ~[?:?]
at com.jc.dw.sql.Main.main(Main.java:24) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
at java.lang.reflect.Method.invoke(Unknown Source) ~[?:?]
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:288) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
... 12 more
Caused by: org.apache.hadoop.hive.metastore.api.MetaException: Got exception: java.lang.ClassCastException class [Ljava.lang.Object; cannot be cast to class [Ljava.net.URI; ([Ljava.lang.Object; and [Ljava.net.URI; are in module java.base of loader 'bootstrap')
at org.apache.hadoop.hive.metastore.utils.MetaStoreUtils.logAndThrowMetaException(MetaStoreUtils.java:168) ~[?:?]
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.resolveUris(HiveMetaStoreClient.java:267) ~[?:?]
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:182) ~[?:?]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source) ~[?:?]
at jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source) ~[?:?]
at java.lang.reflect.Constructor.newInstance(Unknown Source) ~[?:?]
at org.apache.hadoop.hive.metastore.utils.JavaUtils.newInstance(JavaUtils.java:84) ~[?:?]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:95) ~[?:?]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:148) ~[?:?]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
at java.lang.reflect.Method.invoke(Unknown Source) ~[?:?]
at org.apache.flink.table.catalog.hive.client.HiveShimV310.getHiveMetastoreClient(HiveShimV310.java:103) ~[?:?]
at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.createMetastoreClient(HiveMetastoreClientWrapper.java:240) ~[?:?]
at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.<init>(HiveMetastoreClientWrapper.java:71) ~[?:?]
at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientFactory.create(HiveMetastoreClientFactory.java:35) ~[?:?]
at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:223) ~[?:?]
at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191) ~[flink-table-blink_2.12-1.11.1.jar:1.11.1]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337) ~[flink-table-blink_2.12-1.11.1.jar:1.11.1]
at com.jc.dw.sql.catalog.HiveCatalogManager.registerAndUseHiveCatalog(HiveCatalogManager.java:53) ~[?:?]
at com.jc.dw.sql.exec.ExecuteProcessHelper.sqlExecution(ExecuteProcessHelper.java:156) ~[?:?]
at com.jc.dw.sql.Main.main(Main.java:24) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
at java.lang.reflect.Method.invoke(Unknown Source) ~[?:?]
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:288) ~[flink-dist_2.12-1.11.1.jar:1.11.1]
... 12 more
2020-09-27 06:07:49,931 ERROR org.apache.flink.runtime.webmonitor.handlers.JarRunHandler [] - Exception occurred in REST handler: Could not execute application.

Root cause: this is a Hive bug, still unfixed in the 3.x releases at the time:

https://issues.apache.org/jira/browse/HIVE-22190

Solution: switch the JDK to version 1.8.
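
For context, the crash comes from a JDK behavior change that is easy to reproduce outside of Hive. Since JDK 9 (JDK-6260652), Arrays.asList(...).toArray() returns a plain Object[] instead of a clone of the backing array, so the unchecked cast that HiveMetaStoreClient.resolveUris performs (visible in the stack trace above) fails. A minimal sketch, independent of any Hive code:

import java.net.URI;
import java.util.Arrays;
import java.util.List;

public class ToArrayDemo {
    public static void main(String[] args) {
        URI[] uris = { URI.create("thrift://172.31.5.20:9083") };
        List<URI> list = Arrays.asList(uris);
        // JDK 8: Arrays$ArrayList.toArray() clones the backing URI[], so the
        // cast succeeds. JDK 9+: toArray() returns Object[], and the cast
        // throws the same ClassCastException seen in resolveUris above.
        URI[] resolved = (URI[]) list.toArray();
        System.out.println(Arrays.toString(resolved));
    }
}

Run this on JDK 8 and it prints the URI; run it on JDK 9 or later and it throws ClassCastException, which is why downgrading the runtime to 1.8 lets the job start.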

Problem 2

2020-09-27 09:16:44,194 ERROR org.apache.flink.runtime.webmonitor.handlers.JarRunHandler [] - Exception occurred in REST handler: Could not execute application.
2020-09-27 09:30:05,853 WARN org.apache.flink.runtime.webmonitor.handlers.JarRunHandler [] - Configuring the job submission via query parameters is deprecated. Please migrate to submitting a JSON request instead.
2020-09-27 09:30:06,024 INFO org.apache.flink.client.ClientUtils [] - Starting program (detached: true)
2020-09-27 09:30:06,028 INFO com.jc.dw.sql.exec.ExecuteProcessHelper [] - ------------program params-------------------------
2020-09-27 09:30:06,029 INFO com.jc.dw.sql.exec.ExecuteProcessHelper [] - -metadataUrl
2020-09-27 09:30:06,029 INFO com.jc.dw.sql.exec.ExecuteProcessHelper [] - http://dev-env.jcinfo.com/metadata/api/pipeline/0e67d7c9ee02445a9c709f83b1a2ca82
2020-09-27 09:30:06,029 INFO com.jc.dw.sql.exec.ExecuteProcessHelper [] - -database
2020-09-27 09:30:06,029 INFO com.jc.dw.sql.exec.ExecuteProcessHelper [] - dw
2020-09-27 09:30:06,029 INFO com.jc.dw.sql.exec.ExecuteProcessHelper [] - -------------------------------------------
2020-09-27 09:30:06,045 INFO com.jc.dw.metadata.MetadataInfoImpl [] - http get metadata. url:http://dev-env.jcinfo.com/metadata/api/pipeline/0e67d7c9ee02445a9c709f83b1a2ca82
2020-09-27 09:30:06,412 INFO com.jc.dw.metadata.MetadataInfoImpl [] - PipelineMetadata before sorting:{"attributeProcess":[],"destModelName":"ccc","destSchema":"{\"type\":\"record\",\"name\":\"ccc\",\"doc\":\"\",\"fields\":[{\"name\":\"ccc\",\"type\":[\"null\",\"string\"],\"doc\":\"\",\"default\":null}]}","dictInputs":[],"inputs":[{"database":"default","layer":10,"layerName":"ODS","name":"ceshi19","password":"password","tableName":"ods_ceshi19","type":"hive","url":"172.31.6.20:10000","username":"username"}],"output":{"database":"default","layer":20,"layerName":"DWD","password":"password","tableName":"dwd_ccc","type":"hive","url":"172.31.6.20:10000","username":"username"},"pipelineId":"0e67d7c9ee02445a9c709f83b1a2ca82","pipelineName":"sql测试01","pipelineType":"2","sqls":["insert into dwd_baidu_news_test01 SELECT * from dwd_baidu_news LIMIT 10"],"srcInput":{"$ref":"$.inputs[0]"},"srcModelName":"ceshi19","srcSchema":"{\"type\":\"record\",\"name\":\"测试\",\"doc\":\"\",\"fields\":[{\"name\":\"c\",\"type\":[\"null\",\"string\"],\"doc\":\"\",\"default\":null}]}"}
2020-09-27 09:30:06,413 INFO com.jc.dw.metadata.MetadataInfoImpl [] - PipelineMetadata:{"attributeProcess":[],"destModelName":"ccc","destSchema":"{\"type\":\"record\",\"name\":\"ccc\",\"doc\":\"\",\"fields\":[{\"name\":\"ccc\",\"type\":[\"null\",\"string\"],\"doc\":\"\",\"default\":null}]}","dictInputs":[],"inputs":[{"database":"default","layer":10,"layerName":"ODS","name":"ceshi19","password":"password","tableName":"ods_ceshi19","type":"hive","url":"172.31.6.20:10000","username":"username"}],"output":{"database":"default","layer":20,"layerName":"DWD","password":"password","tableName":"dwd_ccc","type":"hive","url":"172.31.6.20:10000","username":"username"},"pipelineId":"0e67d7c9ee02445a9c709f83b1a2ca82","pipelineName":"sql测试01","pipelineType":"2","sqls":["insert into dwd_baidu_news_test01 SELECT * from dwd_baidu_news LIMIT 10"],"srcInput":{"$ref":"$.inputs[0]"},"srcModelName":"ceshi19","srcSchema":"{\"type\":\"record\",\"name\":\"测试\",\"doc\":\"\",\"fields\":[{\"name\":\"c\",\"type\":[\"null\",\"string\"],\"doc\":\"\",\"default\":null}]}"}
2020-09-27 09:30:06,446 INFO org.apache.hadoop.hive.conf.HiveConf [] - Found configuration file jar:file:/tmp/jars/flink-web-2ba55738-0699-4597-bd86-a61c528b22f8/flink-web-upload/daed4a12-a55f-415e-a8b4-09cbe433194b_core-1.11.1-SNAPSHOT.jar!/hive-site.xml
2020-09-27 09:30:06,863 INFO com.jc.dw.sql.catalog.HiveCatalogManager [] - getHiveCataLog. name:myhive, defaultDatabase:dw
2020-09-27 09:30:06,889 INFO org.apache.flink.table.catalog.hive.HiveCatalog [] - Created HiveCatalog 'myhive'
2020-09-27 09:30:06,991 WARN org.apache.hadoop.util.NativeCodeLoader [] - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2020-09-27 09:30:07,059 INFO org.apache.hadoop.hive.metastore.HiveMetaStoreClient [] - Trying to connect to metastore with URI thrift://172.31.5.20:9083
2020-09-27 09:30:07,078 INFO org.apache.hadoop.hive.metastore.HiveMetaStoreClient [] - Opened a connection to metastore, current connections: 1
2020-09-27 09:30:07,103 INFO org.apache.hadoop.hive.metastore.HiveMetaStoreClient [] - Connected to metastore.
2020-09-27 09:30:07,104 INFO org.apache.hadoop.hive.metastore.RetryingMetaStoreClient [] - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.metastore.HiveMetaStoreClient ugi=flink (auth:SIMPLE) retries=1 delay=1 lifetime=0
2020-09-27 09:30:07,233 INFO org.apache.flink.table.catalog.hive.HiveCatalog [] - Connected to Hive metastore
2020-09-27 09:30:07,252 INFO org.apache.flink.table.catalog.CatalogManager [] - Set the current default catalog as [myhive] and the current default database as [dw].
2020-09-27 09:30:08,001 WARN org.apache.flink.client.deployment.application.DetachedApplicationRunner [] - Could not execute application:
org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: Unable to instantiate java compiler
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:302) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:198) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:149) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.client.deployment.application.DetachedApplicationRunner.tryExecuteJobs(DetachedApplicationRunner.java:78) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.client.deployment.application.DetachedApplicationRunner.run(DetachedApplicationRunner.java:67) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.runtime.webmonitor.handlers.JarRunHandler.lambda$handleRequest$0(JarRunHandler.java:100) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1604) [?:1.8.0_265]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_265]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_265]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) [?:1.8.0_265]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) [?:1.8.0_265]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_265]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_265]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_265]
Caused by: java.lang.IllegalStateException: Unable to instantiate java compiler
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.compile(JaninoRelMetadataProvider.java:433) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.load3(JaninoRelMetadataProvider.java:374) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.lambda$static$0(JaninoRelMetadataProvider.java:109) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.calcite.shaded.com.google.common.cache.CacheLoader$FunctionToCacheLoader.load(CacheLoader.java:149) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3542) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2323) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2286) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2201) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache.get(LocalCache.java:3953) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3957) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4875) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.create(JaninoRelMetadataProvider.java:474) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.revise(JaninoRelMetadataProvider.java:487) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.rel.metadata.RelMetadataQueryBase.revise(RelMetadataQueryBase.java:95) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.rel.metadata.RelMetadataQuery.getPulledUpPredicates(RelMetadataQuery.java:780) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.rel.rules.ReduceExpressionsRule$ProjectReduceExpressionsRule.onMatch(ReduceExpressionsRule.java:300) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.plan.AbstractRelOptPlanner.fireRule(AbstractRelOptPlanner.java:328) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.plan.hep.HepPlanner.applyRule(HepPlanner.java:562) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.plan.hep.HepPlanner.applyRules(HepPlanner.java:427) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.plan.hep.HepPlanner.executeInstruction(HepPlanner.java:264) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.plan.hep.HepInstruction$RuleInstance.execute(HepInstruction.java:127) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.plan.hep.HepPlanner.executeProgram(HepPlanner.java:223) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.plan.hep.HepPlanner.findBestExp(HepPlanner.java:210) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.table.planner.plan.optimize.program.FlinkHepProgram.optimize(FlinkHepProgram.scala:69) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.table.planner.plan.optimize.program.FlinkHepRuleSetProgram.optimize(FlinkHepRuleSetProgram.scala:87) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.table.planner.plan.optimize.program.FlinkChainedProgram$$anonfun$optimize$1.apply(FlinkChainedProgram.scala:62) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.table.planner.plan.optimize.program.FlinkChainedProgram$$anonfun$optimize$1.apply(FlinkChainedProgram.scala:58) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at scala.collection.Iterator$class.foreach(Iterator.scala:891) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at scala.collection.AbstractIterable.foreach(Iterable.scala:54) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at scala.collection.AbstractTraversable.foldLeft(Traversable.scala:104) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.table.planner.plan.optimize.program.FlinkChainedProgram.optimize(FlinkChainedProgram.scala:57) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.table.planner.plan.optimize.StreamCommonSubGraphBasedOptimizer.optimizeTree(StreamCommonSubGraphBasedOptimizer.scala:164) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.table.planner.plan.optimize.StreamCommonSubGraphBasedOptimizer.doOptimize(StreamCommonSubGraphBasedOptimizer.scala:80) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.table.planner.plan.optimize.CommonSubGraphBasedOptimizer.optimize(CommonSubGraphBasedOptimizer.scala:77) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.table.planner.delegation.PlannerBase.optimize(PlannerBase.scala:279) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:164) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1264) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:700) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeOperation(TableEnvironmentImpl.java:787) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:690) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at com.jc.dw.sql.exec.ExecuteProcessHelper.sqlExecution(ExecuteProcessHelper.java:172) ~[?:?]
at com.jc.dw.sql.Main.main(Main.java:28) ~[?:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_265]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_265]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_265]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_265]
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:288) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
... 13 more
Caused by: java.lang.ClassCastException: org.codehaus.janino.CompilerFactory cannot be cast to org.codehaus.commons.compiler.ICompilerFactory
at org.codehaus.commons.compiler.CompilerFactoryFactory.getCompilerFactory(CompilerFactoryFactory.java:129) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.codehaus.commons.compiler.CompilerFactoryFactory.getDefaultCompilerFactory(CompilerFactoryFactory.java:79) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.compile(JaninoRelMetadataProvider.java:431) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.load3(JaninoRelMetadataProvider.java:374) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.lambda$static$0(JaninoRelMetadataProvider.java:109) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.calcite.shaded.com.google.common.cache.CacheLoader$FunctionToCacheLoader.load(CacheLoader.java:149) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3542) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2323) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2286) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2201) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache.get(LocalCache.java:3953) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3957) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.calcite.shaded.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4875) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.create(JaninoRelMetadataProvider.java:474) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.revise(JaninoRelMetadataProvider.java:487) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.rel.metadata.RelMetadataQueryBase.revise(RelMetadataQueryBase.java:95) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.rel.metadata.RelMetadataQuery.getPulledUpPredicates(RelMetadataQuery.java:780) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.rel.rules.ReduceExpressionsRule$ProjectReduceExpressionsRule.onMatch(ReduceExpressionsRule.java:300) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.plan.AbstractRelOptPlanner.fireRule(AbstractRelOptPlanner.java:328) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.plan.hep.HepPlanner.applyRule(HepPlanner.java:562) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.plan.hep.HepPlanner.applyRules(HepPlanner.java:427) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.plan.hep.HepPlanner.executeInstruction(HepPlanner.java:264) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.plan.hep.HepInstruction$RuleInstance.execute(HepInstruction.java:127) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.plan.hep.HepPlanner.executeProgram(HepPlanner.java:223) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.calcite.plan.hep.HepPlanner.findBestExp(HepPlanner.java:210) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.table.planner.plan.optimize.program.FlinkHepProgram.optimize(FlinkHepProgram.scala:69) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.table.planner.plan.optimize.program.FlinkHepRuleSetProgram.optimize(FlinkHepRuleSetProgram.scala:87) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.table.planner.plan.optimize.program.FlinkChainedProgram$$anonfun$optimize$1.apply(FlinkChainedProgram.scala:62) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.table.planner.plan.optimize.program.FlinkChainedProgram$$anonfun$optimize$1.apply(FlinkChainedProgram.scala:58) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at scala.collection.Iterator$class.foreach(Iterator.scala:891) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at scala.collection.AbstractIterable.foreach(Iterable.scala:54) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at scala.collection.AbstractTraversable.foldLeft(Traversable.scala:104) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.table.planner.plan.optimize.program.FlinkChainedProgram.optimize(FlinkChainedProgram.scala:57) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.table.planner.plan.optimize.StreamCommonSubGraphBasedOptimizer.optimizeTree(StreamCommonSubGraphBasedOptimizer.scala:164) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.table.planner.plan.optimize.StreamCommonSubGraphBasedOptimizer.doOptimize(StreamCommonSubGraphBasedOptimizer.scala:80) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.table.planner.plan.optimize.CommonSubGraphBasedOptimizer.optimize(CommonSubGraphBasedOptimizer.scala:77) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.table.planner.delegation.PlannerBase.optimize(PlannerBase.scala:279) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:164) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1264) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:700) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeOperation(TableEnvironmentImpl.java:787) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:690) ~[flink-table-blink_2.11-1.11.2.jar:1.11.2]
at com.jc.dw.sql.exec.ExecuteProcessHelper.sqlExecution(ExecuteProcessHelper.java:172) ~[?:?]
at com.jc.dw.sql.Main.main(Main.java:28) ~[?:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_265]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_265]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_265]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_265]
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:288) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
... 13 more
2020-09-27 09:30:08,005 ERROR org.apache.flink.runtime.webmonitor.handlers.JarRunHandler [] - Exception occurred in REST handler: Could not execute application.

Root cause: a jar conflict. hive-exec bundles its own copies of janino and commons-compiler, while the Flink planner loads the same classes from flink-table-blink (note the [flink-table-blink_2.11-1.11.2.jar] markers in the stack trace). With two copies on the classpath, org.codehaus.janino.CompilerFactory and org.codehaus.commons.compiler.ICompilerFactory end up coming from different jars and the cast fails. The fix is to exclude the duplicates from hive-exec, as in the POM below.
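
Two checks help confirm this kind of duplicate-class conflict. On the build side, mvn dependency:tree -Dincludes=org.codehaus.janino shows which dependency drags in the extra copy. At runtime, a throwaway snippet like the following (a hypothetical diagnostic, not part of the job itself) lists every jar that contains the contested interface; more than one URL printed means a conflict:

import java.net.URL;
import java.util.Enumeration;

public class FindDuplicateClasses {
    public static void main(String[] args) throws Exception {
        // Ask the classloader for every copy of ICompilerFactory on the
        // classpath; each jar bundling the class contributes one URL.
        Enumeration<URL> copies = FindDuplicateClasses.class.getClassLoader()
                .getResources("org/codehaus/commons/compiler/ICompilerFactory.class");
        while (copies.hasMoreElements()) {
            System.out.println(copies.nextElement());
        }
    }
}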

POM example:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>xxxx-warehouse</artifactId>
        <groupId>com.xx.dw</groupId>
        <version>1.0-SNAPSHOT</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>
 
    <artifactId>core</artifactId>
    <repositories>
        <repository>
            <id>apache.snapshots</id>
            <name>Apache Development Snapshot Repository</name>
            <url>https://repository.apache.org/content/repositories/snapshots/</url>
            <releases>
                <enabled>false</enabled>
            </releases>
            <snapshots>
                <enabled>true</enabled>
            </snapshots>
        </repository>
    </repositories>
    <properties>
        <scala.bin.version>2.11</scala.bin.version>
        <flink.version>1.11.0</flink.version>
        <flink-shaded-hadoop.version>2.6.5-10.0</flink-shaded-hadoop.version>
        <hive.version>3.1.2</hive.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>com.jc.dw</groupId>
            <artifactId>metadata</artifactId>
            <version>1.0-SNAPSHOT</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.6.5</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-java_${scala.bin.version}</artifactId>
            <version>${flink.version}</version>
            <!--            <scope>provided</scope>-->
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients_${scala.bin.version}</artifactId>
            <version>${flink.version}</version>
            <!--            <scope>provided</scope>-->
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-common</artifactId>
            <version>${flink.version}</version>
            <!--            <scope>provided</scope>-->
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-java-bridge_${scala.bin.version}</artifactId>
            <version>${flink.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-planner-blink_${scala.bin.version}</artifactId>
            <version>${flink.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-hive_${scala.bin.version}</artifactId>
            <version>${flink.version}</version>
            <!--            <scope>provided</scope>-->
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-sql-connector-kafka-0.11_${scala.bin.version}</artifactId>
            <version>${flink.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-json</artifactId>
            <version>${flink.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-shaded-hadoop-2-uber</artifactId>
            <version>${flink-shaded-hadoop.version}</version>
            <!--            <scope>provided</scope>-->
        </dependency>
        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-exec</artifactId>
            <version>${hive.version}</version>
            <exclusions>
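                <!-- janino and commons-compiler must come only from the Flink
                     planner (flink-table-blink); a second copy via hive-exec
                     triggers the ICompilerFactory ClassCastException above -->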
                <exclusion>
                    <groupId>org.codehaus.janino</groupId>
                    <artifactId>janino</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.codehaus.janino</groupId>
                    <artifactId>commons-compiler</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.pentaho</groupId>
                    <artifactId>pentaho-aggdesigner-algorithm</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.hive</groupId>
                    <artifactId>hive-vector-code-gen</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.hive</groupId>
                    <artifactId>hive-llap-tez</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.hive</groupId>
                    <artifactId>hive-shims</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>commons-codec</groupId>
                    <artifactId>commons-codec</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>commons-httpclient</groupId>
                    <artifactId>commons-httpclient</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.logging.log4j</groupId>
                    <artifactId>log4j-slf4j-impl</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.antlr</groupId>
                    <artifactId>antlr-runtime</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.antlr</groupId>
                    <artifactId>ST4</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.ant</groupId>
                    <artifactId>ant</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.commons</groupId>
                    <artifactId>commons-compress</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.ivy</groupId>
                    <artifactId>ivy</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.zookeeper</groupId>
                    <artifactId>zookeeper</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.curator</groupId>
                    <artifactId>apache-curator</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.curator</groupId>
                    <artifactId>curator-framework</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.codehaus.groovy</groupId>
                    <artifactId>groovy-all</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.calcite</groupId>
                    <artifactId>calcite-core</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.calcite</groupId>
                    <artifactId>calcite-druid</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.calcite.avatica</groupId>
                    <artifactId>avatica</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.calcite</groupId>
                    <artifactId>calcite-avatica</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>com.google.code.gson</groupId>
                    <artifactId>gson</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>stax</groupId>
                    <artifactId>stax-api</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>com.google.guava</groupId>
                    <artifactId>guava</artifactId>
                </exclusion>
            </exclusions>
            <!--            <scope>provided</scope>-->
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.12</version>
            <scope>test</scope>
        </dependency>
    </dependencies>
 
    <build>
        <resources>
            <resource>
                <directory>src/main/scala</directory>
            </resource>
            <resource>
                <directory>src/main/java</directory>
            </resource>
            <resource>
                <directory>src/main/resources</directory>
            </resource>
        </resources>
        <testResources>
            <testResource>
                <directory>src/test/java</directory>
            </testResource>
        </testResources>
        <plugins>
            <plugin>
                <groupId>org.scala-tools</groupId>
                <artifactId>maven-scala-plugin</artifactId>
                <version>2.15.2</version>
                <executions>
                    <execution>
                        <id>scala-compile-first</id>
                        <phase>process-resources</phase>
                        <goals>
                            <goal>add-source</goal>
                            <goal>compile</goal>
                        </goals>
                    </execution>
                    <execution>
                        <id>scala-test-compile</id>
                        <phase>process-test-resources</phase>
                        <goals>
                            <goal>testCompile</goal>
                        </goals>
                    </execution>
                </executions>
                <configuration>
                    <scalaVersion>2.11.12</scalaVersion>
                    <args>
                        <arg>-target:jvm-1.8</arg>
                    </args>
                </configuration>
            </plugin>
            <plugin>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.1</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                    <archive>
                        <manifest>
                            <!-- Specify the main-method entry class here -->
                            <mainClass>com.xx.dw.sql.Main</mainClass>
                        </manifest>
                    </archive>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>

Open issues (still under investigation)

Caused by: java.lang.IllegalArgumentException: Job client must be a CoordinationRequestGateway. This is a bug.
ERROR org.apache.flink.runtime.webmonitor.handlers.JarRunHandler [] - Exception occurred in REST handler: Could not execute application.
Exception occurred in REST handler: No jobs included in application.

Caused by: java.lang.IllegalStateException: BUG: vertex bc764cd8ddf7a0cff126f51c16239658_720 tries to allocate a slot when its previous slot request is still pending