Set SparkBatch Job Properties

Open HsinghaoFu opened this issue 3 years ago • 1 comments

Hi,

With the gcloud command I can set properties like:

gcloud dataproc batches submit job_name \
    --properties ^~^spark.jars.packages=org.apache.spark:spark-avro_2.12:3.2.1~spark.executor.instances=4

But in this client library there is no field on the SparkBatch class to set these properties.

I use Cloud Composer (Airflow on GCP) to create the Dataproc Serverless job, and the example code on this page using the DataprocCreateBatchOperator appears to use this library.

How can I set the properties?

Thank you!
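[Editor's note: a sketch of one possible approach. In the Dataproc Batch resource, the properties that gcloud sets via `--properties` live under `runtime_config.properties`, not under `spark_batch`, and the Airflow DataprocCreateBatchOperator accepts a plain dict mirroring the Batch message for its `batch` argument. The main class and jar URI below are hypothetical placeholders.]

```python
# Sketch: batch config dict for DataprocCreateBatchOperator, assuming the
# operator accepts a dict shaped like the dataproc_v1 Batch message.
# Spark properties go under runtime_config.properties, not spark_batch.
batch_config = {
    "spark_batch": {
        "main_class": "org.example.Main",             # hypothetical entry point
        "jar_file_uris": ["gs://my-bucket/app.jar"],  # hypothetical jar
    },
    "runtime_config": {
        "properties": {
            # Same properties the gcloud command above passes via --properties
            "spark.jars.packages": "org.apache.spark:spark-avro_2.12:3.2.1",
            "spark.executor.instances": "4",
        }
    },
}

# The dict would then be passed to the operator, e.g.:
# DataprocCreateBatchOperator(task_id="my_batch", batch=batch_config, ...)
print(batch_config["runtime_config"]["properties"]["spark.executor.instances"])
```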

HsinghaoFu avatar Jul 28 '22 15:07 HsinghaoFu

I'm going to transfer this issue to the google-cloud-python repository as we are preparing to move the code for google-cloud-dataproc to that repository in the next 1-2 weeks.

parthea avatar Apr 17 '23 20:04 parthea