Packaging Spark in IDEA: main class not found

Posted by alpha-cat



1. Package the project with IDEA's Maven package goal, then run it on Linux:

spark-submit --class com.Spark_HDFS --master local ./SXC-1.0-SNAPSHOT.jar

This fails with a "cannot find main class" error. At that point you need a more reliable way to package the Scala code; the <build> section of the pom.xml below contains the plugins I use to compile and package Scala 2.12.10.

With that packaging in place, the jar runs correctly.

Run spark-shell on the server to see which Scala version it uses, and compile in IDEA with the same Scala version whenever possible.
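The name passed to --class must match an object with a main method that actually ends up inside the jar. As a rough sketch, the entry point behind the command above might look like the following; only the fully qualified name com.Spark_HDFS is taken from the spark-submit command, the job body and the HDFS path are made up for illustration:

package com

import org.apache.spark.sql.SparkSession

// The object name and package must match the --class argument exactly: com.Spark_HDFS
object Spark_HDFS {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("Spark_HDFS")
      .getOrCreate()

    // Placeholder job: count the lines of a text file on HDFS.
    // The path is illustrative, not from the original project.
    val lines = spark.read.textFile("hdfs:///tmp/input.txt")
    println(s"line count = ${lines.count()}")

    spark.stop()
  }
}

If Maven never compiles this Scala file (the default lifecycle only compiles Java sources), the class is simply missing from the jar and spark-submit reports exactly the error above. The full pom.xml: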

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>org.example</groupId>
    <artifactId>SXC</artifactId>
    <version>1.0-SNAPSHOT</version>


<build>
    <plugins>
        <!-- Compile the Scala sources. Without this plugin the default Maven
             lifecycle only compiles Java, so the Scala main class never makes
             it into the jar. -->
        <plugin>
            <groupId>net.alchim31.maven</groupId>
            <artifactId>scala-maven-plugin</artifactId>
            <version>3.2.2</version>
            <executions>
                <execution>
                    <id>compile-scala</id>
                    <phase>compile</phase>
                    <goals>
                        <goal>add-source</goal>
                        <goal>compile</goal>
                    </goals>
                </execution>
                <execution>
                    <id>test-compile-scala</id>
                    <phase>test-compile</phase>
                    <goals>
                        <goal>add-source</goal>
                        <goal>testCompile</goal>
                    </goals>
                </execution>
            </executions>
            <configuration>
                <scalaVersion>2.12.10</scalaVersion>
            </configuration>
        </plugin>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.2</version>
            <configuration>
                <source>1.8</source>
                <target>1.8</target>
            </configuration>
        </plugin>
        <!-- Build an additional jar-with-dependencies ("fat") jar during the package phase. -->
        <plugin>
            <artifactId>maven-assembly-plugin</artifactId>
            <configuration>
                <descriptorRefs>
                    <descriptorRef>jar-with-dependencies</descriptorRef>
                </descriptorRefs>
                <archive>
                    <manifest>
                        <mainClass></mainClass>
                    </manifest>
                </archive>
            </configuration>
            <executions>
                <execution>
                    <id>make-assembly</id>
                    <phase>package</phase>
                    <goals>
                        <goal>single</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
        <!-- Optional: exec-maven-plugin for running the application locally from Maven. -->
        <plugin>
            <groupId>org.codehaus.mojo</groupId>
            <artifactId>exec-maven-plugin</artifactId>
            <version>1.6.0</version>
            <executions>
                <execution>
                    <goals>
                        <goal>exec</goal>
                    </goals>
                </execution>
            </executions>
            <configuration>
                <executable>java</executable>
                <includeProjectDependencies>true</includeProjectDependencies>
                <includePluginDependencies>false</includePluginDependencies>
                <classpathScope>compile</classpathScope>
                <mainClass>cn.spark.study.App</mainClass>
            </configuration>
        </plugin>
    </plugins>
</build>


<properties>
    <scala.version>2.12.10</scala.version>
</properties>

<repositories>
    <repository>
        <id>scala-tools.org</id>
        <name>Scala-Tools Maven2 Repository</name>
        <url>http://scala-tools.org/repo-releases</url>
    </repository>
</repositories>

<dependencies>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>3.2.1</version>
        <exclusions>
            <exclusion>
                <groupId>com.fasterxml.jackson.module</groupId>
                <artifactId>*</artifactId>
            </exclusion>
            <exclusion>
                <groupId>com.fasterxml.jackson.core</groupId>
                <artifactId>*</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>3.2.1</version>
        <exclusions>
            <exclusion>
                <groupId>com.fasterxml.jackson.module</groupId>
                <artifactId>*</artifactId>
            </exclusion>
            <exclusion>
                <groupId>com.fasterxml.jackson.core</groupId>
                <artifactId>*</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-mapreduce-client-core</artifactId>
        <version>3.2.1</version>
        <exclusions>
            <exclusion>
                <groupId>com.fasterxml.jackson.module</groupId>
                <artifactId>*</artifactId>
            </exclusion>
            <exclusion>
                <groupId>com.fasterxml.jackson.core</groupId>
                <artifactId>*</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <dependency>
        <groupId>commons-logging</groupId>
        <artifactId>commons-logging</artifactId>
        <version>1.2</version>
    </dependency>
    <!-- Spark dependencies -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.12</artifactId>
        <version>3.0.0-preview2</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.12</artifactId>
        <version>3.0.0-preview2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.12</artifactId>
        <version>3.0.0-preview2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.12</artifactId>
        <version>3.0.0-preview2</version>
        <scope>runtime</scope>
    </dependency>
</dependencies>

<reporting>
    <plugins>
        <plugin>
            <groupId>org.scala-tools</groupId>
            <artifactId>maven-scala-plugin</artifactId>
            <configuration>
                <scalaVersion>${scala.version}</scalaVersion>
            </configuration>
        </plugin>
    </plugins>
</reporting>
</project>
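With the assembly plugin bound to the package phase, mvn clean package also produces a fat jar, which with Maven's default naming for the jar-with-dependencies descriptor should appear as target/SXC-1.0-SNAPSHOT-jar-with-dependencies.jar. The plain SXC-1.0-SNAPSHOT.jar now contains the compiled Scala classes too, so the original spark-submit command works against it; to ship the dependencies along as well, submit the fat jar instead:

spark-submit --class com.Spark_HDFS --master local ./SXC-1.0-SNAPSHOT-jar-with-dependencies.jar

The empty <mainClass> in the assembly manifest is harmless for spark-submit, since the --class flag already names the entry point.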

 
