hat*_*ouj 4 scala apache-spark spark-streaming apache-spark-standalone
Hello, I would like to add the option "--deploy-mode cluster" in my Scala code:
val sparkConf = new SparkConf().setMaster("spark://192.168.60.80:7077")
without using the shell (the `spark-submit` command).
I want to use "spark.submit.deployMode" from Scala.
Using SparkConf:
//set up the spark configuration and create contexts
val sparkConf = new SparkConf()
  .setAppName("SparkApp")
  .setMaster("spark://192.168.60.80:7077")
  .set("spark.submit.deployMode", "cluster")  // set() belongs on SparkConf, not SparkContext
val sc = new SparkContext(sparkConf)
Using SparkSession:
val spark = SparkSession
.builder()
.appName("SparkApp")
  .master("spark://192.168.60.80:7077")
.config("spark.submit.deployMode","cluster")
.enableHiveSupport()
.getOrCreate()
You can use
val sparkConf = new SparkConf().setMaster("spark://192.168.60.80:7077").set("spark.submit.deployMode", "cluster")
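Note that `spark.submit.deployMode` in SparkConf is read by `spark-submit`; once the driver JVM is already running, the property alone cannot move the driver onto the cluster. To launch a cluster-mode application entirely from Scala, Spark provides the `org.apache.spark.launcher.SparkLauncher` API. A minimal sketch, where the Spark home, jar path, and main class are placeholder assumptions you would replace with your own:

```scala
import org.apache.spark.launcher.SparkLauncher

object ClusterSubmit {
  def main(args: Array[String]): Unit = {
    // All paths and class names below are placeholders for illustration.
    val handle = new SparkLauncher()
      .setSparkHome("/opt/spark")                // assumption: local Spark installation
      .setAppResource("/path/to/spark-app.jar")  // assumption: your assembled application jar
      .setMainClass("com.example.SparkApp")      // assumption: your driver class
      .setMaster("spark://192.168.60.80:7077")
      .setDeployMode("cluster")                  // equivalent of --deploy-mode cluster
      .startApplication()                        // returns a SparkAppHandle for monitoring

    // Block until the launched application reaches a terminal state.
    while (!handle.getState.isFinal) Thread.sleep(1000)
  }
}
```

`startApplication()` submits in a child process and hands back a `SparkAppHandle`, so the launching program can poll or listen for state changes instead of shelling out to `spark-submit` itself. This sketch needs a reachable standalone master and the `spark-launcher` dependency on the classpath to actually run.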
Views: 7621