
Error initializing SparkContext: a master URL must be set in your configuration

I used this code.

My error is:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/02/03 20:39:24 INFO SparkContext: Running Spark version 2.1.0
17/02/03 20:39:25 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/02/03 20:39:25 WARN SparkConf: Detected deprecated memory fraction settings: [spark.storage.memoryFraction]. As of Spark 1.6, execution and storage memory management are unified. All memory fractions used in the old model are now deprecated and no longer read. If you wish to use the old memory management, you may explicitly enable `spark.memory.useLegacyMode` (not recommended).
17/02/03 20:39:25 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: A master URL must be set in your configuration
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:379)
    at PCA$.main(PCA.scala:26)
    at PCA.main(PCA.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
17/02/03 20:39:25 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: A master URL must be set in your configuration
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:379)
    at PCA$.main(PCA.scala:26)
    at PCA.main(PCA.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)

Process finished with exit code 1
— fakherzad (9)

If you are running Spark standalone:

val conf = new SparkConf().setMaster("spark://master:7077") // the missing setting; standalone master URLs take the form spark://HOST:PORT (7077 by default)

Or pass the master while submitting the job:

spark-submit --master spark://master:7077

If you are running Spark locally:

val conf = new SparkConf().setMaster("local[2]") // the missing setting

Or pass it while submitting the job:

spark-submit --master local

If you are running Spark on YARN:

spark-submit --master yarn
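The three modes above correspond to the master URL forms Spark accepts. As an illustrative sketch only, the patterns below approximate those forms (local, local[N], spark://HOST:PORT, yarn); this helper is my own, not Spark's actual validation logic:

```scala
// Sketch: approximates the master URL forms mentioned above.
// Not Spark's real parser; for illustration only.
object MasterUrl {
  private val Local      = """local(?:\[(\d+|\*)\])?""".r // local, local[2], local[*]
  private val Standalone = """spark://[^:/]+:\d+""".r     // spark://host:7077

  def looksValid(master: String): Boolean = master match {
    case Local(_)     => true  // run in-process with N worker threads
    case Standalone() => true  // connect to a standalone cluster master
    case "yarn"       => true  // cluster manager is YARN
    case _            => false
  }
}
```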
(8)

The error message is quite clear: you have to provide the address of the Spark master node, either via the SparkContext or via spark-submit:

val conf = 
  new SparkConf()
    .setAppName("ClusterScore")
    .setMaster("spark://172.1.1.1:7077") // <--- This is what's missing
    .set("spark.storage.memoryFraction", "1")

val sc = new SparkContext(conf)
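Note that a master hard-coded with setMaster takes precedence over the --master flag, so an alternative is to leave setMaster out of the code and supply the master at submit time. A sketch (the jar path and class name below are placeholders, not from the question):

```
# Sketch: jar path and class name are hypothetical.
spark-submit \
  --class ClusterScore \
  --master spark://172.1.1.1:7077 \
  clusterscore.jar
```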
— Yuval Itzchakov (5)
val conf = new SparkConf()
  .setAppName("Your Application Name")
  .setMaster("local")
val sc = new SparkContext(conf)

It will work...
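Since the stack trace shows the job was launched from IntelliJ (com.intellij.rt.execution.application.AppMain), a further option, based on the general Spark behavior that SparkConf picks up spark.* JVM system properties, is to set the master in the run configuration's VM options instead of in code:

```
# IntelliJ Run Configuration → VM options (assumes launching from the IDE)
-Dspark.master=local[2]
```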

— Shyam Gupta (3)