Flink: java.lang.NoSuchMethodError: AvroSchemaConverter
Posted: 2021-12-24 17:37:20

I am trying to connect to Kafka. When I run a simple JAR file, I get the following error:
java.lang.NoSuchMethodError: org.apache.flink.formats.avro.typeutils.AvroSchemaConverter.convertToSchema(Lorg/apache/flink/table/types/logical/LogicalType;)Lorg/apache/avro/Schema
Yes, I know that a NoSuchMethodError usually means there is a version conflict between dependencies; however, there is no Flink instance running on the YARN cluster. I have also tried tweaking the pom.xml, but with no luck:
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<flink.version>1.13.2</flink.version>
<java.version>1.8</java.version>
<scala.binary.version>2.12</scala.binary.version>
<confluent.version>5.2.1</confluent.version>
<maven.compiler.source>${java.version}</maven.compiler.source>
<maven.compiler.target>${java.version}</maven.compiler.target>
</properties>
<dependencies>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-sql-connector-kafka_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-planner-blink_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-avro-confluent-registry</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-sql-avro</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.avro</groupId>
<artifactId>avro</artifactId>
<version>1.8.2</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-statebackend-rocksdb_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-jdbc_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<!-- Flink Dependency -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-hive_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-api-java-bridge_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-planner_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>1.7.5</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>1.7.5</version>
</dependency>
<dependency>
<groupId>commons-logging</groupId>
<artifactId>commons-logging</artifactId>
<version>1.2</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>2.5</version>
<configuration>
<archive>
<manifest>
<mainClass>${main.class}</mainClass>
</manifest>
</archive>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>${java.version}</source>
<target>${java.version}</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>3.2.4</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<filters>
<filter>
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
<transformers>
<transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<mainClass>com.azercell.bigdata.flink_data_calculator.flinkBalance</mainClass>
</transformer>
<transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
<resource>reference.conf</resource>
</transformer>
</transformers>
<relocations>
<relocation>
<pattern>org.codehaus.plexus.util</pattern>
<shadedPattern>org.shaded.plexus.util</shadedPattern>
<excludes>
<exclude>org.codehaus.plexus.util.xml.Xpp3Dom</exclude>
<exclude>org.codehaus.plexus.util.xml.pull.*</exclude>
</excludes>
</relocation>
</relocations>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
The versions are shown above. I am able to connect to Hive; the problem only appears when I try to connect to Kafka.
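As an aside, one way to surface this kind of clash at build time is Maven's enforcer plugin with its dependencyConvergence rule, which fails the build whenever two dependencies resolve to different versions of the same artifact. A minimal sketch (the plugin version below is only an example):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <version>3.0.0</version>
  <executions>
    <execution>
      <id>enforce-dependency-convergence</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <!-- Fails the build if, for example, two different Avro versions end up in the dependency tree -->
          <dependencyConvergence/>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>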
Answer 1: If I remember correctly, Flink 1.13.2 has switched to Apache Avro 1.10.0, so that is most likely the problem you are facing, since you are trying to use the Avro 1.8.2 library.
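If that is the cause, a minimal sketch of one way to act on it, instead of declaring avro 1.8.2 directly as in the question's pom, is to pin a single Avro version for the whole build via dependencyManagement (1.10.0 here is taken from this answer's recollection and should be verified against the flink-avro 1.13.2 pom):
<dependencyManagement>
  <dependencies>
    <!-- Every direct and transitive reference to org.apache.avro:avro resolves to this version -->
    <dependency>
      <groupId>org.apache.avro</groupId>
      <artifactId>avro</artifactId>
      <version>1.10.0</version>
    </dependency>
  </dependencies>
</dependencyManagement>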
Comment: That's right. I will post a solution later, but you are correct about the root cause of the problem.

Answer 2: The problem was with the dependencies. The trick is to remove either the Avro 1.11.0 pulled in on the Flink side or the Avro 1.7.7 pulled in on the Hive side. I ended up removing Avro 1.11.0 and replacing it with a different Avro version:
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-sql-avro-confluent-registry</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-avro</artifactId>
<version>1.13.2</version>
<exclusions>
<exclusion>
<groupId>org.apache.avro</groupId>
<artifactId>*</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.avro</groupId>
<artifactId>avro</artifactId>
<version>1.8.2-cdh6.1.1</version>
</dependency>
Another approach that also works is to use hive-exec (with the core classifier) and hive-metastore as the dependencies.
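A rough sketch of what that could look like; the hive.version property is a placeholder and must match the Hive version on the cluster, and, as far as I know, the core classifier is the thin hive-exec jar without the bundled third-party classes, so it is less likely to drag its own Avro onto the classpath:
<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-exec</artifactId>
  <version>${hive.version}</version>
  <classifier>core</classifier>
  <exclusions>
    <!-- Keep Hive's transitive Avro off the classpath as well -->
    <exclusion>
      <groupId>org.apache.avro</groupId>
      <artifactId>*</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-metastore</artifactId>
  <version>${hive.version}</version>
  <exclusions>
    <exclusion>
      <groupId>org.apache.avro</groupId>
      <artifactId>*</artifactId>
    </exclusion>
  </exclusions>
</dependency>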
Comment: Your answer could be improved with additional supporting information. Please edit to add further details, such as citations or documentation, so that others can confirm that your answer is correct. You can find more information on how to write good answers in the help center.