SparkContext initialization error: "Servlet class org.glassfish.jersey.servlet.ServletContainer is not a javax.servlet.Servlet"
I am trying to run a word-count program in Java on Apache Spark installed locally, and I am getting the error below. I have tried solutions from multiple blogs, but I still get the same error.

Here is my **controller code**:
package com.spark.sparkapache;

import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;

public class ApacheSparkApplication {
    public static void main(String[] args) {
        String ROOT_PATH = "D:\\Users\\ram\\Desktop\\java_fold";
        SparkConf config = new SparkConf().setMaster("local").setAppName("spark-testing");
        SparkContext sc = new SparkContext(config);
        // JavaRDD file = sc.textFile(ROOT_PATH + "spark.txt", 1).toJavaRDD();
        // file.saveAsTextFile(ROOT_PATH);
    }
}

And here is my **console error/exception**:
3:24:56 AM: Executing ':ApacheSparkApplication.main()'...

> Task :compileJava UP-TO-DATE
> Task :processResources UP-TO-DATE
> Task :classes UP-TO-DATE
> Task :ApacheSparkApplication.main()
Picked up JAVA_TOOL_OPTIONS: --add-exports=java.base/sun.nio.ch=ALL-UNNAMED
Using Spark's default log4j profile: org/apache/spark/log4j2-defaults.properties
24/03/03 03:24:59 INFO SparkContext: Running Spark version 3.5.1
24/03/03 03:24:59 INFO SparkContext: OS info Windows Server 2016, 10.0, amd64
24/03/03 03:24:59 INFO SparkContext: Java version 17.0.10
24/03/03 03:24:59 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
24/03/03 03:24:59 INFO ResourceUtils: ==============================================================
24/03/03 03:24:59 INFO ResourceUtils: No custom resources configured for spark.driver.
24/03/03 03:24:59 INFO ResourceUtils: ==============================================================
24/03/03 03:24:59 INFO SparkContext: Submitted application: spark-testing
24/03/03 03:24:59 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
24/03/03 03:24:59 INFO ResourceProfile: Limiting resource is cpu
24/03/03 03:24:59 INFO ResourceProfileManager: Added ResourceProfile id: 0
24/03/03 03:24:59 INFO SecurityManager: Changing view acls to: ram
24/03/03 03:24:59 INFO SecurityManager: Changing modify acls to: ram
24/03/03 03:24:59 INFO SecurityManager: Changing view acls groups to:
24/03/03 03:24:59 INFO SecurityManager: Changing modify acls groups to:
24/03/03 03:24:59 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: ram; groups with view permissions: EMPTY; users with modify permissions: ram; groups with modify permissions: EMPTY
24/03/03 03:25:00 INFO Utils: Successfully started service 'sparkDriver' on port 59975.
24/03/03 03:25:00 INFO SparkEnv: Registering MapOutputTracker
24/03/03 03:25:00 INFO SparkEnv: Registering BlockManagerMaster
24/03/03 03:25:00 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
24/03/03 03:25:00 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
24/03/03 03:25:00 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
24/03/03 03:25:00 INFO DiskBlockManager: Created local directory at D:\Users\ram\AppData\Local\Temp\1\blockmgr-8089bdbf-7df8-4ea0-b4e3-40e58a59dfc4
24/03/03 03:25:00 INFO MemoryStore: MemoryStore started with capacity 4.5 GiB
24/03/03 03:25:00 INFO SparkEnv: Registering OutputCommitCoordinator
24/03/03 03:25:00 INFO JettyUtils: Start Jetty 0.0.0.0:4040 for SparkUI
24/03/03 03:25:00 INFO Utils: Successfully started service 'SparkUI' on port 4040.
24/03/03 03:25:00 INFO Executor: Starting executor ID driver on host WSAMZN-LP0E5V0T.cis.neustar.com
24/03/03 03:25:00 INFO Executor: OS info Windows Server 2016, 10.0, amd64
24/03/03 03:25:00 INFO Executor: Java version 17.0.10
24/03/03 03:25:00 INFO Executor: Starting executor with user classpath (userClassPathFirst = false): ''
24/03/03 03:25:00 INFO Executor: Created or updated repl class loader org.apache.spark.util.MutableURLClassLoader@5f69e2b for default.
24/03/03 03:25:00 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 59990.
24/03/03 03:25:00 INFO NettyBlockTransferService: Server created on WSAMZN-LP0E5V0T.cis.neustar.com:59990
24/03/03 03:25:00 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
24/03/03 03:25:00 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, WSAMZN-LP0E5V0T.cis.neustar.com, 59990, None)
24/03/03 03:25:00 INFO BlockManagerMasterEndpoint: Registering block manager WSAMZN-LP0E5V0T.cis.neustar.com:59990 with 4.5 GiB RAM, BlockManagerId(driver, WSAMZN-LP0E5V0T.cis.neustar.com, 59990, None)
24/03/03 03:25:00 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, WSAMZN-LP0E5V0T.cis.neustar.com, 59990, None)
24/03/03 03:25:00 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, WSAMZN-LP0E5V0T.cis.neustar.com, 59990, None)
24/03/03 03:25:01 ERROR SparkContext: Error initializing SparkContext.
javax.servlet.UnavailableException: Servlet class org.glassfish.jersey.servlet.ServletContainer is not a javax.servlet.Servlet
    at org.sparkproject.jetty.servlet.ServletHolder.checkServletType(ServletHolder.java:514)
    at org.sparkproject.jetty.servlet.ServletHolder.doStart(ServletHolder.java:386)
    at org.sparkproject.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:73)
    at org.sparkproject.jetty.servlet.ServletHandler.lambda$initialize$0(ServletHandler.java:749)
    at java.base/java.util.stream.SortedOps$SizedRefSortingSink.end(SortedOps.java:357)
    at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:510)
    at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
    at java.base/java.util.stream.StreamSpliterators$WrappingSpliterator.forEachRemaining(StreamSpliterators.java:310)
    at java.base/java.util.stream.Streams$ConcatSpliterator.forEachRemaining(Streams.java:735)
    at java.base/java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:762)
    at org.sparkproject.jetty.servlet.ServletHandler.initialize(ServletHandler.java:774)
    at org.sparkproject.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:379)
    at org.sparkproject.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:916)
    at org.sparkproject.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:288)
    at org.sparkproject.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:73)
    at org.apache.spark.ui.ServerInfo.addHandler(JettyUtils.scala:498)
    at org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$2(SparkUI.scala:79)
    at org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$2$adapted(SparkUI.scala:79)
    at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:575)
    at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:573)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:933)
    at org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$1(SparkUI.scala:79)
    at org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$1$adapted(SparkUI.scala:77)
    at scala.Option.foreach(Option.scala:437)
    at org.apache.spark.ui.SparkUI.attachAllHandlers(SparkUI.scala:77)
    at org.apache.spark.SparkContext.$anonfun$new$30(SparkContext.scala:674)
    at org.apache.spark.SparkContext.$anonfun$new$30$adapted(SparkContext.scala:674)
    at scala.Option.foreach(Option.scala:437)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:674)
    at com.spark.sparkapache.ApacheSparkApplication.main(ApacheSparkApplication.java:22)
24/03/03 03:25:01 INFO SparkContext: SparkContext is stopping with exitCode 0.
24/03/03 03:25:01 INFO SparkUI: Stopped Spark web UI at http://WSAMZN-LP0E5V0T.cis.neustar.com:4040
24/03/03 03:25:01 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
24/03/03 03:25:01 INFO MemoryStore: MemoryStore cleared
24/03/03 03:25:01 INFO BlockManager: BlockManager stopped
24/03/03 03:25:01 INFO BlockManagerMaster: BlockManagerMaster stopped
24/03/03 03:25:01 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
24/03/03 03:25:01 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" javax.servlet.UnavailableException: Servlet class org.glassfish.jersey.servlet.ServletContainer is not a javax.servlet.Servlet
    at org.sparkproject.jetty.servlet.ServletHolder.checkServletType(ServletHolder.java:514)
    at org.sparkproject.jetty.servlet.ServletHolder.doStart(ServletHolder.java:386)
    at org.sparkproject.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:73)
    at org.sparkproject.jetty.servlet.ServletHandler.lambda$initialize$0(ServletHandler.java:749)
    at java.base/java.util.stream.SortedOps$SizedRefSortingSink.end(SortedOps.java:357)
    at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:510)
    at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
    at java.base/java.util.stream.StreamSpliterators$WrappingSpliterator.forEachRemaining(StreamSpliterators.java:310)
    at java.base/java.util.stream.Streams$ConcatSpliterator.forEachRemaining(Streams.java:735)
    at java.base/java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:762)
    at org.sparkproject.jetty.servlet.ServletHandler.initialize(ServletHandler.java:774)
    at org.sparkproject.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:379)
    at org.sparkproject.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:916)
    at org.sparkproject.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:288)
    at org.sparkproject.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:73)
    at org.apache.spark.ui.ServerInfo.addHandler(JettyUtils.scala:498)
    at org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$2(SparkUI.scala:79)
    at org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$2$adapted(SparkUI.scala:79)
    at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:575)
    at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:573)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:933)
    at org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$1(SparkUI.scala:79)
    at org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$1$adapted(SparkUI.scala:77)
    at scala.Option.foreach(Option.scala:437)
    at org.apache.spark.ui.SparkUI.attachAllHandlers(SparkUI.scala:77)
    at org.apache.spark.SparkContext.$anonfun$new$30(SparkContext.scala:674)
    at org.apache.spark.SparkContext.$anonfun$new$30$adapted(SparkContext.scala:674)
    at scala.Option.foreach(Option.scala:437)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:674)
    at com.spark.sparkapache.ApacheSparkApplication.main(ApacheSparkApplication.java:22)
24/03/03 03:25:01 INFO ShutdownHookManager: Shutdown hook called
24/03/03 03:25:01 INFO ShutdownHookManager: Deleting directory D:\Users\ram\AppData\Local\Temp\1\spark-28587e82-73a2-4e4d-bddf-d9b77f68b4de
> Task :ApacheSparkApplication.main() FAILED

Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
For more on this, please refer to https://docs.gradle.org/8.5/userguide/c ... e_warnings in the Gradle documentation.
3 actionable tasks: 1 executed, 2 up-to-date

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':ApacheSparkApplication.main()'.
> Process 'command 'D:\Users\ram\.jdks\azul-17.0.10\bin\java.exe'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
> Get more help at https://help.gradle.org.

BUILD FAILED in 4s
3:25:01 AM: Execution finished ':ApacheSparkApplication.main()'.

Here is my build.gradle file. Originally I had only the first four dependencies; after going through various suggestions from blogs/Stack Overflow/ChatGPT, I added the last four.
plugins {
    id 'java'
    id 'org.springframework.boot' version '3.2.3'
    id 'io.spring.dependency-management' version '1.1.4'
}

group = 'com.spark'
version = '0.0.1-SNAPSHOT'

java {
    sourceCompatibility = '17'
}

repositories {
    mavenCentral()
}

dependencies {
    implementation 'org.springframework.boot:spring-boot-starter-web'
    testImplementation 'org.springframework.boot:spring-boot-starter-test'
    implementation group: 'org.apache.spark', name: 'spark-core_2.13', version: '3.5.1'
    implementation group: 'javax.servlet', name: 'javax.servlet-api', version: '3.1.0'
    implementation group: 'jakarta.servlet', name: 'jakarta.servlet-api', version: '6.0.0'
    implementation group: 'org.glassfish.jersey.core', name: 'jersey-server', version: '3.1.5'
    implementation group: 'org.glassfish.jersey.inject', name: 'jersey-hk2', version: '3.1.5'
}

configurations {
    configureEach {
        exclude group: 'org.springframework.boot', module: 'spring-boot-starter-logging'
    }
}

tasks.named('test') {
    useJUnitPlatform()
}

ext {
    jakarta_servlet_version = '6.0.0'
    jersey_version = '3.15'
}
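A note on the likely cause: Spark 3.5's web UI runs on a shaded Jetty that expects a Jersey 2.x `ServletContainer` implementing `javax.servlet.Servlet`, and Spark already pulls in a compatible Jersey 2.x transitively. The Jersey 3.1.5 and `jakarta.servlet` artifacts added above resolve the same class name `org.glassfish.jersey.servlet.ServletContainer` to the Jakarta-based Jersey 3 version, which is exactly what the exception complains about. A minimal dependency block, assuming those manually added servlet/Jersey artifacts are the conflict and can simply be dropped (this is a sketch of one possible direction, not a confirmed fix):

```groovy
dependencies {
    implementation 'org.springframework.boot:spring-boot-starter-web'
    testImplementation 'org.springframework.boot:spring-boot-starter-test'

    // Spark brings its own (javax.servlet-based) Jersey 2.x for the UI;
    // no explicit servlet-api or Jersey dependencies should be needed.
    implementation group: 'org.apache.spark', name: 'spark-core_2.13', version: '3.5.1'

    // Removed, as they shadow Spark's Jersey 2.x with Jakarta-based Jersey 3:
    // implementation group: 'jakarta.servlet', name: 'jakarta.servlet-api', version: '6.0.0'
    // implementation group: 'org.glassfish.jersey.core', name: 'jersey-server', version: '3.1.5'
    // implementation group: 'org.glassfish.jersey.inject', name: 'jersey-hk2', version: '3.1.5'
}
```

As a quick diagnostic, setting `config.set("spark.ui.enabled", "false")` in the driver should also make the error disappear, since the failing servlet is only registered for the Spark web UI; note that Spring Boot 3 itself is Jakarta-based, so mixing it with Spark on one classpath may surface further conflicts.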
Source: https://stackoverflow.com/questions/780 ... servlet-se
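For context, the commented-out lines in the controller code are meant to implement the word-count step. The record-level logic Spark would apply there (split each line into words, then count occurrences) can be sketched in plain Java, independent of Spark; the class name `WordCountLogic` is hypothetical:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class WordCountLogic {
    // Equivalent of flatMap(split) + mapToPair(word -> (word, 1)) + reduceByKey(sum),
    // applied to an in-memory List<String> instead of a JavaRDD<String>.
    public static Map<String, Integer> countWords(List<String> lines) {
        Map<String, Integer> counts = new HashMap<>();
        for (String line : lines) {
            for (String word : line.trim().split("\\s+")) {
                if (!word.isEmpty()) {
                    counts.merge(word, 1, Integer::sum); // reduce step: sum the 1s per word
                }
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        System.out.println(countWords(List.of("to be or not to be")));
    }
}
```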