
Spark and Spring Boot – NoClassDefFoundError javax/servlet/Servlet

Issue –

Exception in thread "main" java.lang.NoClassDefFoundError: javax/servlet/Servlet
    at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:223)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:484)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2704)
    at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:953)
    at scala.Option.getOrElse(Option.scala:201)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:947)
Caused by: java.lang.ClassNotFoundException: javax.servlet.Servlet
    at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:641)
    at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188)
    at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:520)
    ... 7 more

Environment –

  • Spring Boot 3 or higher
  • Java 17 or higher
  • Spark 3 or higher

Root cause –

Spring Boot 3 migrated from the javax.* namespace to jakarta.* (Jakarta EE 9+), and this change is not backward compatible. Spark 3, however, still relies on the old javax.servlet API for its web UI. As the stack trace shows, creating the SparkSession triggers SparkUI.create, which tries to load javax.servlet.Servlet; in a Spring Boot 3 application that class is no longer on the classpath, so the NoClassDefFoundError is thrown.
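
For context, here is a minimal sketch of the kind of code that triggers the failure (the class name and application name are illustrative):

import org.apache.spark.sql.SparkSession;

public class SparkDemo {

    public static void main(String[] args) {
        // getOrCreate() starts the SparkContext and its web UI; SparkUI.create
        // is the frame in the stack trace above that fails to load
        // javax.servlet.Servlet when run inside a Spring Boot 3 application.
        SparkSession spark = SparkSession.builder()
                .appName("spring-boot-spark-demo")
                .master("local[*]")
                .getOrCreate();

        spark.range(5).show();
        spark.stop();
    }
}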

Solution –

To solve this issue, pin the jakarta-servlet and jersey versions explicitly in the <properties> section of your pom.xml:

<properties>
  <jakarta-servlet.version>4.0.3</jakarta-servlet.version>
  <jersey.version>2.36</jersey.version>
</properties>
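
These two property names are version-override hooks defined in Spring Boot 3's dependency-management BOM. Pinning them selects jakarta.servlet-api 4.0.3 and Jersey 2.36, which are, to the best of our knowledge, the last release lines that still ship the javax.* packages Spark expects. After changing them, you can verify what actually ends up on the classpath; a quick check, assuming a Maven build:

mvn dependency:tree -Dincludes=jakarta.servlet,org.glassfish.jersey.core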

Also, make sure your Spark dependencies are version 3 or higher. The following are examples of the basic Spark dependencies.

Maven –

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.12</artifactId>
  <version>3.2.0</version>
</dependency>

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.12</artifactId>
  <version>3.2.0</version>
</dependency>

Gradle –

dependencies {
  implementation 'org.apache.spark:spark-core_2.12:3.2.0'
  implementation 'org.apache.spark:spark-sql_2.12:3.2.0'
}
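
The pom.xml <properties> overrides shown above also have a Gradle counterpart. A sketch, assuming the Spring Boot and io.spring.dependency-management plugins are applied:

// Equivalent of the Maven <properties> overrides; these keys are the
// version properties defined in Spring Boot's dependency-management BOM.
ext['jakarta-servlet.version'] = '4.0.3'
ext['jersey.version'] = '2.36'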

Do you have another solution?

The solution provided above is based on a scenario that one of our developers/contributors faced. If you have run into the same issue and found a different root cause, please share your solution in the comment section below. We will add it to this article.
