Mat*_*gan · 2 · scala · apache-spark
I have:
val sparkBuilder: SparkSession.Builder = SparkSession
  .builder
  .appName("CreateModelDataPreparation")
  .config("spark.master", "local")

implicit val spark: SparkSession = sparkBuilder.getOrCreate()
However, when I run my program, I still get:
org.apache.spark.SparkException: A master URL must be set in your configuration
at org.apache.spark.SparkContext.<init>(SparkContext.scala:379)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
The SparkSession is created inside the main method, as suggested in other posts. That does not seem to fix the problem.
This is not a duplicate of the suggested question, because I have already tried both of the following:
def main(argv: Array[String]): Unit = {
  import DeweyConfigs.implicits.da3wConfig

  val commandlineArgs: DeweyReaderArgs = processCommandLineArgs(argv)

  val sparkBuilder: SparkSession.Builder = SparkSession
    .builder
    .appName("CreateModelDataPreparation")
    .master("local")

  implicit val spark: SparkSession = sparkBuilder.config("spark.master", "local").getOrCreate()
  import spark.implicits._
  ...
and
def main(argv: Array[String]): Unit = {
  import DeweyConfigs.implicits.da3wConfig

  val commandlineArgs: DeweyReaderArgs = processCommandLineArgs(argv)

  val sparkBuilder: SparkSession.Builder = SparkSession
    .builder
    .appName("CreateModelDataPreparation")
    .config("master", "local")

  implicit val spark: SparkSession = sparkBuilder.config("spark.master", "local").getOrCreate()
  import spark.implicits._
  ...
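For reference, here is a minimal, self-contained sketch of a session that sets the master directly with `.master(...)` on the builder before `getOrCreate` is called (the app name is kept from the question; `"local[*]"` is an assumption meaning "run in-process on all local cores" rather than the question's `"local"`). The exception in the trace is raised from the `SparkContext` constructor, so whatever builder actually reaches `getOrCreate` at runtime must carry the master setting:

```scala
import org.apache.spark.sql.SparkSession

object MasterSketch {
  def main(args: Array[String]): Unit = {
    // .master("local[*]") is equivalent to .config("spark.master", "local[*]");
    // note "master" alone (without the "spark." prefix) is not a recognized key.
    val spark: SparkSession = SparkSession
      .builder
      .appName("CreateModelDataPreparation")
      .master("local[*]")
      .getOrCreate()

    try {
      // Confirm which master the running context actually received.
      println(spark.sparkContext.master)
    } finally {
      spark.stop()
    }
  }
}
```

If the jar is launched through `spark-submit`, the master can also be supplied externally with the `--master` flag, in which case the builder call is unnecessary.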
Viewed: 3825 times