
Apache Spark: error when running spark-shell with YARN

I downloaded spark-2.1.0-bin-hadoop2.7.tgz from http://spark.apache.org/downloads.html. Hadoop HDFS and YARN were started with $ start-dfs.sh and $ start-yarn.sh. However, when I run $ spark-shell --master yarn --deploy-mode client, I get the error below.

    $ spark-shell --master yarn --deploy-mode client
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/04/08 23:04:54 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/04/08 23:04:54 WARN util.Utils: Your hostname, Pandora resolves to a loopback address: 127.0.1.1; using 192.168.1.11 instead (on interface wlp3s0)
17/04/08 23:04:54 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
17/04/08 23:04:56 WARN yarn.Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
17/04/08 23:05:15 ERROR cluster.YarnClientSchedulerBackend: Yarn application has already exited with state FINISHED!
17/04/08 23:05:15 ERROR spark.SparkContext: Error initializing SparkContext.
java.lang.IllegalStateException: Spark context stopped while waiting for backend
    at org.apache.spark.scheduler.TaskSchedulerImpl.waitBackendReady(TaskSchedulerImpl.scala:614)
    at org.apache.spark.scheduler.TaskSchedulerImpl.postStartHook(TaskSchedulerImpl.scala:169)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:567)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
    at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
    at $line3.$read$$iw$$iw.<init>(<console>:15)
    at $line3.$read$$iw.<init>(<console>:42)
    at $line3.$read.<init>(<console>:44)
    at $line3.$read$.<init>(<console>:48)
    at $line3.$read$.<clinit>(<console>)
    at $line3.$eval$.$print$lzycompute(<console>:7)
    at $line3.$eval$.$print(<console>:6)
    at $line3.$eval.$print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
    at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
    at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
    at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
    at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
    at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
    at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
    at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
    at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
    at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
    at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
    at org.apache.spark.repl.Main$.doMain(Main.scala:68)
    at org.apache.spark.repl.Main$.main(Main.scala:51)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/04/08 23:05:15 ERROR client.TransportClient: Failed to send RPC 7918328175210939600 to /192.168.1.11:56186: java.nio.channels.ClosedChannelException
java.nio.channels.ClosedChannelException
    at io.netty.channel.AbstractChannel$AbstractUnsafe.write(...)(Unknown Source)
17/04/08 23:05:15 ERROR cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: Sending RequestExecutors(0,0,Map()) to AM was unsuccessful
java.io.IOException: Failed to send RPC 7918328175210939600 to /192.168.1.11:56186: java.nio.channels.ClosedChannelException
    at org.apache.spark.network.client.TransportClient$3.operationComplete(TransportClient.java:249)
    at org.apache.spark.network.client.TransportClient$3.operationComplete(TransportClient.java:233)
    at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:514)
    at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:488)
    at io.netty.util.concurrent.DefaultPromise.access$000(DefaultPromise.java:34)
    at io.netty.util.concurrent.DefaultPromise$1.run(DefaultPromise.java:438)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:408)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:455)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.nio.channels.ClosedChannelException
    at io.netty.channel.AbstractChannel$AbstractUnsafe.write(...)(Unknown Source)
17/04/08 23:05:15 ERROR util.Utils: Uncaught exception in thread Yarn application state monitor
org.apache.spark.SparkException: Exception thrown in awaitResult
    at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:167)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:83)
    at org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend.requestTotalExecutors(CoarseGrainedSchedulerBackend.scala:512)
    at org.apache.spark.scheduler.cluster.YarnSchedulerBackend.stop(YarnSchedulerBackend.scala:93)
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.stop(YarnClientSchedulerBackend.scala:151)
    at org.apache.spark.scheduler.TaskSchedulerImpl.stop(TaskSchedulerImpl.scala:467)
    at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1588)
    at org.apache.spark.SparkContext$$anonfun$stop$8.apply$mcV$sp(SparkContext.scala:1826)
    at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1283)
    at org.apache.spark.SparkContext.stop(SparkContext.scala:1825)
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend$MonitorThread.run(YarnClientSchedulerBackend.scala:108)
Caused by: java.io.IOException: Failed to send RPC 7918328175210939600 to /192.168.1.11:56186: java.nio.channels.ClosedChannelException
    at org.apache.spark.network.client.TransportClient$3.operationComplete(TransportClient.java:249)
    at org.apache.spark.network.client.TransportClient$3.operationComplete(TransportClient.java:233)
    at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:514)
    at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:488)
    at io.netty.util.concurrent.DefaultPromise.access$000(DefaultPromise.java:34)
    at io.netty.util.concurrent.DefaultPromise$1.run(DefaultPromise.java:438)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:408)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:455)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.nio.channels.ClosedChannelException
    at io.netty.channel.AbstractChannel$AbstractUnsafe.write(...)(Unknown Source)
java.lang.IllegalStateException: Spark context stopped while waiting for backend
  at org.apache.spark.scheduler.TaskSchedulerImpl.waitBackendReady(TaskSchedulerImpl.scala:614)
  at org.apache.spark.scheduler.TaskSchedulerImpl.postStartHook(TaskSchedulerImpl.scala:169)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:567)
  at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
  at scala.Option.getOrElse(Option.scala:121)
  at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
  at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
  ... 47 elided
<console>:14: error: not found: value spark
       import spark.implicits._
              ^
<console>:14: error: not found: value spark
       import spark.sql
              ^
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0
      /_/

Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_121)
Type in expressions to have them evaluated.
Type :help for more information.

YARN detects that Spark did run, but because of the error Spark finishes with an undefined status.


4 votes | Dobob

I ran into the same problem you did. Checking the NodeManager log shows the following warning:

    2017-10-26 19:43:21,787 WARN org.apache.hadoop.yarn.server.nodemanager.containermanager.monitor.ContainersMonitorImpl: Container [pid=3820,containerID=container_1509016963775_0001_02_000001] is running beyond virtual memory limits. Current usage: 339.0 MB of 1 GB physical memory used; 2.2 GB of 2.1 GB virtual memory used. Killing container.
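
If you want to check for the same message, here is a minimal sketch; the log directory and file naming are assumptions (a default $HADOOP_HOME/logs layout) and vary by install:

    $ grep "running beyond virtual memory limits" $HADOOP_HOME/logs/yarn-*-nodemanager-*.log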

So I configured a larger virtual memory allowance (yarn.nodemanager.vmem-pmem-ratio in yarn-site.xml; the default value is 2.1). After that, it worked fine.
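
For reference, a minimal yarn-site.xml sketch of that change; the value 4 below is only an illustrative choice, not a value from the original answer:

    <configuration>
      <!-- Raise the ratio of virtual to physical memory that the NodeManager
           allows per container. The default is 2.1; 4 is an illustrative value. -->
      <property>
        <name>yarn.nodemanager.vmem-pmem-ratio</name>
        <value>4</value>
      </property>
    </configuration>

After editing yarn-site.xml, restart YARN (for example with stop-yarn.sh and start-yarn.sh) so the NodeManagers pick up the new ratio.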

1 vote | xin