Ambari 2.7.5 with Flink 1.13.6 and Flink CDC (flink-sql-connector-hive): A Complete Guide to Exception Handling
Flink Installation
For the basic installation steps, see:
https://blog.csdn.net/qq_36048223/article/details/116114765
Exception 1: parent directory /opt/flink/conf doesn't exist
For whatever reason the archive never got extracted, so just extract it manually into the target directory:
```bash
tar -zxvf flink-1.13.6-bin-scala_2.11.tgz -C /opt/flink
cd /opt/flink
mv flink-1.13.6/* /opt/flink
```
Exception 2: Sum of configured JVM Metaspace (256.000mb (268435456 bytes)) and JVM Overhead (192.000mb (201326592 bytes)) exceed configured Total Process Memory (256.000mb (268435456 bytes)).
For the fix, see:
https://blog.csdn.net/NDF923/article/details/123730372
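The root cause is that the configured total process memory is too small to even cover the fixed JVM metaspace and overhead slices. A minimal sketch of the relevant flink-conf.yaml keys (the sizes below are illustrative, not tuned recommendations):

```yaml
# flink-conf.yaml: raise the total so it covers metaspace (256m)
# and overhead (192m) and still leaves room for real task memory
taskmanager.memory.process.size: 1728m
jobmanager.memory.process.size: 1600m
# alternatively, shrink the fixed slices instead of raising the total:
# taskmanager.memory.jvm-metaspace.size: 128m
```

If Ambari manages flink-conf.yaml, make the change through the Ambari config screens so it isn't overwritten on the next restart.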
Integrating flink-cdc for Hive
https://juejin.cn/post/7176084265161982008
https://nightlies.apache.org/flink/flink-docs-release-1.13/zh/docs/connectors/table/hive/overview/
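Condensed from the two links above: the Hive integration boils down to registering a HiveCatalog, and flink-cdc supplies changelog source tables. Below is a minimal Flink SQL sketch; the hive-conf-dir, host name, credentials, and table names are all placeholder assumptions:

```sql
-- register the Hive catalog (needs flink-sql-connector-hive on the classpath)
CREATE CATALOG myhive WITH (
  'type' = 'hive',
  'default-database' = 'default',
  'hive-conf-dir' = '/etc/hive/conf'
);

-- a MySQL CDC source (needs flink-sql-connector-mysql-cdc on the classpath)
CREATE TABLE orders_cdc (
  id INT,
  order_status STRING,
  PRIMARY KEY (id) NOT ENFORCED
) WITH (
  'connector' = 'mysql-cdc',
  'hostname' = 'mysql-host',
  'port' = '3306',
  'username' = 'flink',
  'password' = '******',
  'database-name' = 'mydb',
  'table-name' = 'orders'
);

-- read the change stream; tables under myhive are queryable the same way
SELECT id, order_status FROM orders_cdc;
```

Keep in mind that the Hive sink in Flink 1.13 is append-only, so landing a CDC changelog in Hive needs extra handling (for example periodic batch merges); the sketch stops at reading the stream.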
Exception 3: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
See my other article for the details: https://www.jianshu.com/p/e6a76d8422d4
- 1. Delete com.google.common.base.Preconditions.class from flink-sql-connector-hive-3.1.2_2.11-1.13.6.jar (a command sketch follows the code below).
- 2. Modify the guava-28.0 source, adding the following to Preconditions.java:
```java
public static void checkArgument(String errorMessageTemplate, @Nullable Object p1) {
    throw new IllegalArgumentException(lenientFormat(errorMessageTemplate, p1));
}

// the original snippet is truncated here; the varargs overload below is an
// assumed completion following the same pattern as the method above
public static void checkArgument(
        @Nullable String errorMessageTemplate, @Nullable Object... errorMessageArgs) {
    throw new IllegalArgumentException(lenientFormat(errorMessageTemplate, errorMessageArgs));
}
```
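For step 1, a command sketch (the /opt/flink/lib location is an assumption; use wherever the connector jar actually sits on your nodes):

```bash
# assumed location of the connector jar
cd /opt/flink/lib
# back up the jar, then strip the conflicting Preconditions class out of it
cp flink-sql-connector-hive-3.1.2_2.11-1.13.6.jar flink-sql-connector-hive-3.1.2_2.11-1.13.6.jar.bak
zip -d flink-sql-connector-hive-3.1.2_2.11-1.13.6.jar 'com/google/common/base/Preconditions*.class'
```

After step 2, recompile the patched guava-28.0 and put the rebuilt Preconditions classes back on the classpath (for example by adding them into the connector jar), then restart the Flink services from Ambari.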