java.lang.NoClassDefFoundError: org/apache/spark/SparkConf
```java
package com.oreilly.learningsparkexamples.java;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class JavaSparkTest {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setMaster("local").setAppName("My App");
        JavaSparkContext sc = new JavaSparkContext(conf);
        // JavaRDD<String> rdd = sc.textFile("D:\\工作\\20171227\\newdesc.xml");
        // JavaRDD<String> lineee = rdd.filter(line -> line.contains("desc"));
        // System.out.println(lineee.count());
    }
}
```
I cloned the repo directly with git, and all the dependencies downloaded successfully, but it keeps failing with an error saying the jar cannot be found:

```
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/SparkConf
	at com.oreilly.learningsparkexamples.java.JavaSparkTest.main(JavaSparkTest.java:12)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.SparkConf
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 6 more
```
The IDE I am using is IntelliJ IDEA.
I'm running into the same problem.
Same problem here.
```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>${spark.version}</version>
    <!--<scope>provided</scope>-->
</dependency>
```
Comment out the `scope` element, or change `provided` to `compile`, because this jar is needed when you package and submit the application at the end.
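To make the suggestion above concrete, a minimal sketch of the dependency with the scope set explicitly to `compile` (assuming `spark.version` is defined in the pom's `<properties>` section, as in the snippet above):

```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>${spark.version}</version>
    <!-- compile (the default scope) puts the jar on the runtime
         classpath, so running from the IDE can find SparkConf -->
    <scope>compile</scope>
</dependency>
```

With `provided`, Maven assumes the runtime environment supplies the jar, so it is absent when you run the main class directly from the IDE, which produces exactly this `NoClassDefFoundError`.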
```
java.lang.ClassNotFoundException: org.apache.spark.SparkConf
	at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:436)
```
Still having this issue. Has it been solved?
Below is my `build.sbt`:

```scala
name := "ProjectA"

version := "0.1"

//scalaVersion := "2.13.1"
scalaVersion := "2.11.8"

// https://mvnrepository.com/artifact/org.apache.spark/spark-core
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.1"
```
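One thing worth checking in a build like the one above: the `%%` operator appends the Scala binary version to the artifact name, so the declared `scalaVersion` must be one that Spark publishes artifacts for. Spark 2.4.x is published for Scala 2.11 and 2.12, not 2.13, so the commented-out `scalaVersion := "2.13.1"` would make resolution fail. A minimal sketch of a consistent pairing:

```scala
// Spark 2.4.1 ships artifacts for Scala 2.11 and 2.12 only,
// so scalaVersion must stay within that range.
scalaVersion := "2.11.8"

// "%%" expands this to the artifact spark-core_2.11,
// matching the scalaVersion declared above.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.1"
```

If the versions do line up and the error persists, it usually means the IDE is not using the sbt-resolved classpath, so reimporting the project can help.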