java.lang.IllegalAccessException: symbolic reference class is not accessible: class sun.util.calendar.ZoneInfo, from interface org.apache.spark.sql.catalyst.util.SparkDateTimeUtils (unnamed module @7bbc8656)
at java.base/java.lang.invoke.MemberName.makeAccessException(MemberName.java:955) ~[?:?]
at java.base/java.lang.invoke.MethodHandles$Lookup.checkSymbolicClass(MethodHandles.java:3686) ~[?:?]
at java.base/java.lang.invoke.MethodHandles$Lookup.resolveOrFail(MethodHandles.java:3646) ~[?:?]
at java.base/java.lang.invoke.MethodHandles$Lookup.findVirtual(MethodHandles.java:2680) ~[?:?]
at org.apache.spark.sql.catalyst.util.SparkDateTimeUtils.org$apache$spark$sql$catalyst$util$SparkDateTimeUtils$$getOffsetsByWallHandle(SparkDateTimeUtils.scala:206) ~[spark-sql-api_2.13-4.0.0-preview1.jar:4.0.0-preview1]
at org.apache.spark.sql.catalyst.util.SparkDateTimeUtils.org$apache$spark$sql$catalyst$util$SparkDateTimeUtils$$getOffsetsByWallHandle$(SparkDateTimeUtils.scala:201) ~[spark-sql-api_2.13-4.0.0-preview1.jar:4.0.0-preview1]
at org.apache.spark.sql.catalyst.util.DateTimeUtils$.org$apache$spark$sql$catalyst$util$SparkDateTimeUtils$$getOffsetsByWallHandle$lzycompute(DateTimeUtils.scala:41) ~[spark-catalyst_2.13-4.0.0-preview1.jar:4.0.0-preview1]
at org.apache.spark.sql.catalyst.util.DateTimeUtils$.org$apache$spark$sql$catalyst$util$SparkDateTimeUtils$$getOffsetsByWallHandle(DateTimeUtils.scala:41) ~[spark-catalyst_2.13-4.0.0-preview1.jar:4.0.0-preview1]
at org.apache.spark.sql.catalyst.util.SparkDateTimeUtils.toJavaDate(SparkDateTimeUtils.scala:228) ~[spark-sql-api_2.13-4.0.0-preview1.jar:4.0.0-preview1]
at org.apache.spark.sql.catalyst.util.SparkDateTimeUtils.toJavaDate$(SparkDateTimeUtils.scala:223) ~[spark-sql-api_2.13-4.0.0-preview1.jar:4.0.0-preview1]
at org.apache.spark.sql.catalyst.util.DateTimeUtils$.toJavaDate(DateTimeUtils.scala:41) ~[spark-catalyst_2.13-4.0.0-preview1.jar:4.0.0-preview1]
at org.apache.spark.sql.catalyst.util.DateTimeUtils.toJavaDate(DateTimeUtils.scala) ~[spark-catalyst_2.13-4.0.0-preview1.jar:4.0.0-preview1]
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificSafeProjection.createExternalRow_0_2$(Unknown Source) ~[?:?]
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificSafeProjection.apply(Unknown Source) ~[?:?]
at scala.collection.Iterator$$anon$9.next(Iterator.scala:584) ~[scala-library-2.13.14.jar:?]
at scala.collection.Iterator$$anon$9.next(Iterator.scala:584) ~[scala-library-2.13.14.jar:?]
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.savePartition(JdbcUtils.scala:806) ~[spark-sql_2.13-4.0.0-preview1.jar:4.0.0-preview1]
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.$anonfun$saveTable$1(JdbcUtils.scala:978) ~[spark-sql_2.13-4.0.0-preview1.jar:4.0.0-preview1]
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.$anonfun$saveTable$1$adapted(JdbcUtils.scala:977) ~[spark-sql_2.13-4.0.0-preview1.jar:4.0.0-preview1]
at org.apache.spark.rdd.RDD.$anonfun$foreachPartition$2(RDD.scala:1042) ~[spark-core_2.13-4.0.0-preview1.jar:4.0.0-preview1]
at org.apache.spark.rdd.RDD.$anonfun$foreachPartition$2$adapted(RDD.scala:1042) ~[spark-core_2.13-4.0.0-preview1.jar:4.0.0-preview1]
at org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2501) ~[spark-core_2.13-4.0.0-preview1.jar:4.0.0-preview1]
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:93) ~[spark-core_2.13-4.0.0-preview1.jar:4.0.0-preview1]
at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:171) ~[spark-core_2.13-4.0.0-preview1.jar:4.0.0-preview1]
at org.apache.spark.scheduler.Task.run(Task.scala:146) ~[spark-core_2.13-4.0.0-preview1.jar:4.0.0-preview1]
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$5(Executor.scala:640) ~[spark-core_2.13-4.0.0-preview1.jar:4.0.0-preview1]
at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64) ~[spark-common-utils_2.13-4.0.0-preview1.jar:4.0.0-preview1]
at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61) ~[spark-common-utils_2.13-4.0.0-preview1.jar:4.0.0-preview1]
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:99) ~[spark-core_2.13-4.0.0-preview1.jar:4.0.0-preview1]
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:643) [spark-core_2.13-4.0.0-preview1.jar:4.0.0-preview1]
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
at java.base/java.lang.Thread.run(Thread.java:840) [?:?]
While writing a Spark (v4.0.0-preview1) DataFrame to a database table (SQL Server) through the JDBC driver, the error above is thrown. Can anyone suggest a solution to this problem?
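For context on the failure itself: the `MethodHandles$Lookup.findVirtual` frame in the trace shows Spark's `SparkDateTimeUtils` building a method handle against the JDK-internal class `sun.util.calendar.ZoneInfo`, which modern JDKs do not open to the unnamed module by default. A hedged sketch of the kind of workaround this usually calls for, opening that package to unnamed modules via extra JVM options (whether these exact flags suffice for your setup is an assumption, and `my_jdbc_job.py` is a hypothetical job name):

```shell
# Open the JDK-internal sun.util.calendar package to classpath code,
# for both the driver and executor JVMs (sketch; verify against your deployment).
spark-submit \
  --conf "spark.driver.extraJavaOptions=--add-opens=java.base/sun.util.calendar=ALL-UNNAMED" \
  --conf "spark.executor.extraJavaOptions=--add-opens=java.base/sun.util.calendar=ALL-UNNAMED" \
  my_jdbc_job.py
```

Spark's own launcher scripts normally inject a set of `--add-opens` flags for exactly this reason; launching Spark in a way that bypasses them (for example, embedding it inside another JVM) is a common way to end up with this `IllegalAccessException`.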