Set SparkBatch Job Properties
Hi,
With the gcloud command I can set properties like this:
```
gcloud dataproc batches submit job_name \
    --properties ^~^spark.jars.packages=org.apache.spark:spark-avro_2.12:3.2.1~spark.executor.instances=4
```
But in this client library there is no method on the `SparkBatch` class to set them.
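For context, the closest thing I can find in the generated types is the `properties` map on `RuntimeConfig`, which hangs off the `Batch` message rather than `SparkBatch`. A minimal sketch of what I mean (the project, region, jar URI, main class, and batch id below are all placeholders):

```python
from google.cloud import dataproc_v1

# Sketch only: the main class and jar URI are placeholders.
batch = dataproc_v1.Batch(
    spark_batch=dataproc_v1.SparkBatch(
        main_class="org.example.MyJob",
        jar_file_uris=["gs://my-bucket/my-job.jar"],
    ),
    # RuntimeConfig.properties looks like the equivalent of the
    # gcloud --properties flag.
    runtime_config=dataproc_v1.RuntimeConfig(
        properties={
            "spark.jars.packages": "org.apache.spark:spark-avro_2.12:3.2.1",
            "spark.executor.instances": "4",
        }
    ),
)

# Batches go through a regional endpoint.
client = dataproc_v1.BatchControllerClient(
    client_options={"api_endpoint": "us-central1-dataproc.googleapis.com:443"}
)
operation = client.create_batch(
    parent="projects/my-project/locations/us-central1",  # placeholder project
    batch=batch,
    batch_id="avro-batch-001",  # placeholder batch id
)
print(operation.result())
```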
I use Cloud Composer (Airflow on GCP) to create the Dataproc Serverless job, and the example code on this page using `DataprocCreateBatchOperator` appears to use this library.
How can I set the properties?
Thank you!
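Edit: in case it helps anyone else, the operator seems to accept the same shape when the batch is passed as a dict; the keys mirror the `Batch` proto, so the properties sit under `runtime_config`. A sketch with placeholder names:

```python
from airflow.providers.google.cloud.operators.dataproc import (
    DataprocCreateBatchOperator,
)

# Sketch only: project, region, batch id, jar URI, and main class
# are all placeholders.
create_batch = DataprocCreateBatchOperator(
    task_id="create_batch",
    project_id="my-project",
    region="us-central1",
    batch_id="avro-batch-001",
    batch={
        "spark_batch": {
            "main_class": "org.example.MyJob",
            "jar_file_uris": ["gs://my-bucket/my-job.jar"],
        },
        # Mirrors the gcloud --properties flag.
        "runtime_config": {
            "properties": {
                "spark.jars.packages": "org.apache.spark:spark-avro_2.12:3.2.1",
                "spark.executor.instances": "4",
            }
        },
    },
)
```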
I'm going to transfer this issue to the google-cloud-python repository as we are preparing to move the code for google-cloud-dataproc to that repository in the next 1-2 weeks.