
Dependency problems running PySpark on Kubernetes with spark-on-k8s-operator

I've spent several days trying to figure out a dependency problem with (Py)Spark running on Kubernetes. I'm using the spark-on-k8s-operator and Spark's Google Cloud connector.

When I submit my Spark job without dependencies via sparkctl create sparkjob.yaml ... using the .yaml file below, it works like a charm.

apiVersion: "sparkoperator.k8s.io/v1beta2"
kind: SparkApplication
metadata:
  name: spark-job
  namespace: my-namespace
spec:
  type: Python
  pythonVersion: "3"
  hadoopConf:
    "fs.gs.impl": "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem"
    "fs.AbstractFileSystem.gs.impl": "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS"
    "fs.gs.project.id": "our-project-id"
    "fs.gs.system.bucket": "gcs-bucket-name"
    "google.cloud.auth.service.account.enable": "true"
    "google.cloud.auth.service.account.json.keyfile": "/mnt/secrets/keyfile.json"
  mode: cluster
  image: "image-registry/spark-base-image"
  imagePullPolicy: Always
  mainApplicationFile: ./sparkjob.py
  deps:
    jars:
      - https://repo1.maven.org/maven2/org/apache/spark/spark-sql-kafka-0-10_2.11/2.4.5/spark-sql-kafka-0-10_2.11-2.4.5.jar
  sparkVersion: "2.4.5"
  restartPolicy:
    type: OnFailure
    onFailureRetries: 3
    onFailureRetryInterval: 10
    onSubmissionFailureRetries: 5
    onSubmissionFailureRetryInterval: 20
  driver:
    cores: 1
    coreLimit: "1200m"
    memory: …
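The sparkjob.py itself isn't shown in the question. For context, a minimal sketch that would be consistent with the configuration above (the GCS connector set up in hadoopConf and the Kafka source jar listed under deps) might look like the following; the Kafka broker address and topic are hypothetical placeholders, and the gs:// paths reuse the gcs-bucket-name bucket from hadoopConf.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("spark-job").getOrCreate()

# Read a stream from Kafka; this source is provided by the
# spark-sql-kafka-0-10 jar listed under deps.jars.
df = (spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "kafka:9092")  # hypothetical broker
      .option("subscribe", "events")                     # hypothetical topic
      .load())

# Write the raw Kafka payload to GCS through the connector
# configured in hadoopConf (fs.gs.impl + service account keyfile).
query = (df.selectExpr("CAST(value AS STRING) AS value")
         .writeStream
         .format("parquet")
         .option("path", "gs://gcs-bucket-name/output/")
         .option("checkpointLocation", "gs://gcs-bucket-name/checkpoints/")
         .start())

query.awaitTermination()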

dependency-management docker apache-spark kubernetes pyspark
