Google Cloud Storage requires storage.objects.create permission when reading with PySpark

Tags: google-cloud-storage, google-cloud-platform, apache-spark-sql, pyspark, airflow

I'm trying to read a PySpark DataFrame from Google Cloud Storage, but I keep getting an error saying the service account lacks storage.objects.create permission. The account does not have WRITER permissions, but it is only reading Parquet files:

spark_session.read.parquet(input_path)

18/12/25 13:12:00 INFO com.google.cloud.hadoop.repackaged.gcs.com.google.cloud.hadoop.gcsio.GoogleCloudStorageImpl: Repairing batch of 1 missing directories.
18/12/25 13:12:01 ERROR com.google.cloud.hadoop.repackaged.gcs.com.google.cloud.hadoop.gcsio.GoogleCloudStorageImpl: Failed to repair some missing directories.
com.google.cloud.hadoop.repackaged.gcs.com.google.api.client.googleapis.json.GoogleJsonResponseException: 403 Forbidden
{
  "code" : 403,
  "errors" : [ {
    "domain" : "global",
    "message" : "***.gserviceaccount.com does not have storage.objects.create access to ***.",
    "reason" : "forbidden"
  } ],
  "message" : "***.gserviceaccount.com does not have storage.objects.create access to ***."
}

Answer:

We found the problem. It was caused by the implicit automatic directory repair feature in the GCS connector: on read, the connector detects "missing" directory placeholder objects and tries to create them, which requires storage.objects.create. We disabled this behavior by setting fs.gs.implicit.dir.repair.enable to false.
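
For reference, a minimal sketch of how that flag can be applied when building the session. The spark.hadoop. prefix is the standard way to pass Hadoop configuration through the Spark builder; the app name and bucket path below are placeholders, not values from the original question:

from pyspark.sql import SparkSession

# Build a session with the GCS connector's implicit directory
# repair disabled, so a read-only job never attempts object creation.
spark = (
    SparkSession.builder
    .appName("gcs-read-only")  # placeholder app name
    .config("spark.hadoop.fs.gs.implicit.dir.repair.enable", "false")
    .getOrCreate()
)

# Placeholder path; substitute your own bucket and prefix.
input_path = "gs://my-bucket/path/to/parquet"
df = spark.read.parquet(input_path)

If the session is created elsewhere (for example by an Airflow operator or a shared bootstrap), the same flag can commonly be set on an existing session via spark.sparkContext._jsc.hadoopConfiguration().set("fs.gs.implicit.dir.repair.enable", "false"), or passed to spark-submit as --conf spark.hadoop.fs.gs.implicit.dir.repair.enable=false.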