Spark error: java.lang.NoSuchMethodError: org.apache.hadoop.security.authentication.util.KerberosUtil.hasKerberosKeyTab
Latest recommended article published 2023-05-09 08:42:31
蜗牛^_^
Use case: consume data from a specified Kafka topic with Spark Structured Streaming and write the analysis results to HBase.
Error: java.lang.NoSuchMethodError: org.apache.hadoop.security.authentication.util.KerberosUtil.hasKerberosKeyTab(Ljavax/security/auth/Subject;)Z
Cause: a NoSuchMethodError usually means the project has mixed multiple versions of the same jar on the classpath, so the project dependencies need to be checked.
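One way to confirm which Hadoop artifacts (and versions) actually end up on the classpath is Maven's dependency tree. A diagnostic sketch, run from the project root; adjust the `-Dincludes` filter to the artifacts you suspect:

```shell
# List every resolved org.apache.hadoop artifact, including transitive
# ones, to spot two different versions of the same jar.
mvn dependency:tree -Dincludes=org.apache.hadoop

# With -Dverbose, Maven also shows versions that were considered but
# omitted by its "nearest wins" mediation - often the real culprit.
mvn dependency:tree -Dverbose -Dincludes=org.apache.hadoop:hadoop-auth
```

If the verbose output shows hadoop-auth (which contains KerberosUtil) resolving to a different release than the rest of the Hadoop jars, that mismatch explains the missing method.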
1) The original pom.xml:
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
        <!--<scope>provided</scope>-->
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql-kafka-0-10_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
        <!--<scope>provided</scope>-->
    </dependency>
    <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-client</artifactId>
        <version>${hbase.version}</version>
        <exclusions>
            <exclusion>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-log4j12</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.11</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>com.google.code.gson</groupId>
        <artifactId>gson</artifactId>
        <version>2.8.2</version>
    </dependency>
</dependencies>
2) hbase-client already pulls in hadoop-common as a transitive dependency, so there is no need to declare hadoop-common explicitly; the explicit declaration can drag in a conflicting version.
3) The corrected pom.xml:
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
        <!--<scope>provided</scope>-->
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql-kafka-0-10_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
        <!--<scope>provided</scope>-->
    </dependency>
    <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-client</artifactId>
        <version>${hbase.version}</version>
        <exclusions>
            <exclusion>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-log4j12</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <!--<dependency>-->
        <!--<groupId>org.apache.hadoop</groupId>-->
        <!--<artifactId>hadoop-common</artifactId>-->
        <!--<version>${hadoop.version}</version>-->
    <!--</dependency>-->
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.11</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>com.google.code.gson</groupId>
        <artifactId>gson</artifactId>
        <version>2.8.2</version>
    </dependency>
</dependencies>
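If the code does need hadoop-common classes directly and the dependency cannot simply be removed, an alternative is to pin all Hadoop artifacts to one consistent version through dependencyManagement instead. A hedged sketch, assuming `${hadoop.version}` is set to the same release that hbase-client was built against:

```xml
<!-- Hypothetical alternative: pin hadoop-common and hadoop-auth (which
     contains KerberosUtil) to one version, so Maven's "nearest wins"
     mediation cannot mix releases with different method signatures. -->
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-auth</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
    </dependencies>
</dependencyManagement>
```

Either way, the goal is the same: exactly one version of each Hadoop jar on the classpath.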