
Why doesn't Apache Spark work with Java 10? java.lang.IllegalArgumentException

Is there a reason why Spark 2.3 does not work with Java 10 (as of July 2018)?

Here is the output when I run the SparkPi example with spark-submit:

$ ./bin/spark-submit ./examples/src/main/python/pi.py 
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
2018-07-13 14:31:30 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2018-07-13 14:31:31 INFO  SparkContext:54 - Running Spark version 2.3.1
2018-07-13 14:31:31 INFO  SparkContext:54 - Submitted application: PythonPi
2018-07-13 14:31:31 INFO  Utils:54 - Successfully started service 'sparkDriver' on port 58681.
2018-07-13 14:31:31 INFO  SparkEnv:54 - Registering MapOutputTracker
2018-07-13 14:31:31 INFO  SparkEnv:54 - Registering BlockManagerMaster
2018-07-13 14:31:31 INFO  BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2018-07-13 14:31:31 INFO  BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2018-07-13 14:31:31 INFO  DiskBlockManager:54 - Created local directory at /private/var/folders/mp/9hp4l4md4dqgmgyv7g58gbq0ks62rk/T/blockmgr-d24fab4c-c858-4cd8-9b6a-97b02aa630a5
2018-07-13 14:31:31 INFO  MemoryStore:54 - MemoryStore started with capacity 434.4 MB
2018-07-13 14:31:31 INFO  SparkEnv:54 - Registering OutputCommitCoordinator
...
2018-07-13 14:31:32 INFO  StateStoreCoordinatorRef:54 - Registered StateStoreCoordinator endpoint
Traceback (most recent call last):
  File "~/Documents/spark-2.3.1-bin-hadoop2.7/./examples/src/main/python/pi.py", line 44, in <module>
    count = spark.sparkContext.parallelize(range(1, n + 1), partitions).map(f).reduce(add)
  File "~/Documents/spark-2.3.1-bin-hadoop2.7/python/lib/pyspark.Zip/pyspark/rdd.py", line 862, in reduce
  File "~/Documents/spark-2.3.1-bin-hadoop2.7/python/lib/pyspark.Zip/pyspark/rdd.py", line 834, in collect
  File "~/Documents/spark-2.3.1-bin-hadoop2.7/python/lib/py4j-0.10.7-src.Zip/py4j/Java_gateway.py", line 1257, in __call__
  File "~/Documents/spark-2.3.1-bin-hadoop2.7/python/lib/pyspark.Zip/pyspark/sql/utils.py", line 63, in deco
  File "~/Documents/spark-2.3.1-bin-hadoop2.7/python/lib/py4j-0.10.7-src.Zip/py4j/protocol.py", line 328, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling z:org.Apache.spark.api.python.PythonRDD.collectAndServe.
: Java.lang.IllegalArgumentException
    at org.Apache.xbean.asm5.ClassReader.<init>(Unknown Source)
    at org.Apache.xbean.asm5.ClassReader.<init>(Unknown Source)
    at org.Apache.xbean.asm5.ClassReader.<init>(Unknown Source)
    at org.Apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:46)
    at org.Apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:449)
    at org.Apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:432)
    at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
    at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:103)
    at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:103)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
    at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:103)
    at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
    at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:432)
    at org.apache.xbean.asm5.ClassReader.a(Unknown Source)
    at org.apache.xbean.asm5.ClassReader.b(Unknown Source)
    at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
    at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
    at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:262)
    at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:261)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:261)
    at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:159)
    at org.apache.spark.SparkContext.clean(SparkContext.scala:2299)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2073)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2099)
    at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:939)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
    at org.apache.spark.rdd.RDD.collect(RDD.scala:938)
    at org.apache.spark.api.python.PythonRDD$.collectAndServe(PythonRDD.scala:162)
    at org.apache.spark.api.python.PythonRDD.collectAndServe(PythonRDD.scala)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:564)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:282)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:238)
    at java.base/java.lang.Thread.run(Thread.java:844)

2018-07-13 14:31:33 INFO  SparkContext:54 - Invoking stop() from shutdown hook
...

I solved the problem by switching to Java 8 instead of Java 10, as described here.
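
To make that concrete, this is roughly the switch on macOS (a sketch: the /usr/libexec/java_home lookup is macOS-specific and assumes a JDK 8 is installed; adjust the path on other platforms):

$ export JAVA_HOME="$(/usr/libexec/java_home -v 1.8)"
$ ./bin/spark-submit ./examples/src/main/python/pi.py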

answered by mehdi (score 11)

The main technical reason is that Spark relies heavily on direct access to native memory via sun.misc.Unsafe, which was made private in Java 9.
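
As a rough illustration, here is a minimal sketch of the reflective sun.misc.Unsafe access pattern involved; it is not Spark's actual code (Spark does this inside org.apache.spark.unsafe.Platform), and on Java 9+ these internals survive only via the jdk.unsupported module. The illegal-reflective-access warnings at the top of the log above come from exactly this kind of reach into JDK internals.

import java.lang.reflect.Field;
import sun.misc.Unsafe;

public class UnsafeSketch {
    public static void main(String[] args) throws Exception {
        // The JDK keeps the singleton in a private static field named
        // "theUnsafe"; reflection is the usual way in. Even on Java 8,
        // compiling against sun.misc.Unsafe draws a compiler warning.
        Field f = Unsafe.class.getDeclaredField("theUnsafe");
        f.setAccessible(true);
        Unsafe unsafe = (Unsafe) f.get(null);

        // Allocate, use, and free off-heap memory outside the GC's view.
        long addr = unsafe.allocateMemory(1024);
        unsafe.putLong(addr, 42L);
        System.out.println(unsafe.getLong(addr)); // prints 42
        unsafe.freeMemory(addr);
    }
}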

answered by user10077548 (score 12)

Committer here. It's actually a fair amount of work to support Java 9+: https://issues.apache.org/jira/browse/SPARK-24417

It's also almost done, and should be ready for Spark 3.0, which should run on Java 8 through 11 and beyond.

The goal (well, mine anyway) is to make it work without opening up module access.

The key issues include:

  • sun.misc.Unsafe usage has to be removed or worked around
  • Changes to the structure of boot classloaders
  • Scala support for Java 9+
  • A bunch of dependency updates needed to work with Java 9+
  • JAXB is no longer automatically available (see the sketch after this list)
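
For that last point, a small self-contained sketch (the class and field names here are made up for illustration): on Java 8 this runs out of the box, on Java 9/10 it needs --add-modules java.xml.bind, and on Java 11+ javax.xml.bind must come from an external jaxb-api dependency.

import java.io.StringWriter;
import javax.xml.bind.JAXBContext;
import javax.xml.bind.annotation.XmlRootElement;

public class JaxbCheck {
    // A trivial type to marshal; the name is arbitrary.
    @XmlRootElement
    public static class Payload {
        public String name = "spark";
    }

    public static void main(String[] args) throws Exception {
        // On Java 8 the java.xml.bind classes ship with the JDK; on
        // Java 9/10 the module exists but is not resolved by default;
        // on Java 11 it was removed entirely.
        StringWriter out = new StringWriter();
        JAXBContext.newInstance(Payload.class).createMarshaller()
                   .marshal(new Payload(), out);
        System.out.println(out); // prints the marshalled XML
    }
}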
answered by Sean Owen (score 4)

Spark is not usable on JDK 9 and above because it relies on memory APIs that were changed in JDK 9.

And that is the reason for this error.

See the issue:

https://issues.apache.org/jira/browse/SPARK-24421

answered by KayV (score 1)