[SPARK-51334][CONNECT] Add java/scala version in analyze spark_version response
What changes were proposed in this pull request?
- Piggyback on the Spark version analyze response and include other environment properties such as the Java/Scala version (see the sketch below).
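A minimal sketch of what the server side could collect, assuming hypothetical `javaVersion`/`scalaVersion` fields are added to the SparkVersion analyze response; the helper object and names below are illustrative, not the actual patch:

```scala
import scala.util.Properties

// Illustrative helper: runtime properties the server could attach
// alongside the Spark version in the analyze SparkVersion response.
object ServerRuntimeInfo {
  // JVM version the server is running on, e.g. "17.0.10"
  def javaVersion: String = System.getProperty("java.version")

  // Scala library version the server was built with, e.g. "2.13.12"
  def scalaVersion: String = Properties.versionNumberString
}
```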
Why are the changes needed?
- The client currently has no way to know which Java/Scala version the server runs, which can be problematic for UDFs when the client's build environment differs from the server's (see the example after this item).
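For contrast, a hedged example of what a Spark Connect Scala client can query today: only the server's Spark version is exposed, via the same SparkVersion analyze RPC this change would extend (the connection string is illustrative):

```scala
import org.apache.spark.sql.SparkSession

// Spark Connect Scala client: spark.version goes through the SparkVersion
// analyze request and returns the server's version string, but reveals
// nothing about the server's Java or Scala runtime.
val spark = SparkSession.builder()
  .remote("sc://localhost:15002") // illustrative endpoint
  .getOrCreate()

println(spark.version) // e.g. "4.0.0"
```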
Does this PR introduce any user-facing change?
- No. The proto gains additional response fields, which does not change behavior for existing users.
How was this patch tested?
Existing tests.
Was this patch authored or co-authored using generative AI tooling?
No
I think we could add some tests here to verify the change in the response?
@the-sakthi do you know where I can write them? I'm not seeing a relevant test suite that already covers the analyze plan handler.
@dongjoon-hyun IMHO it is OK to return these values; language-agnostic clients can simply ask the server how it runs. I don't think this creates any inconsistency.
@dongjoon-hyun please take another pass, thanks!
To @grundprinzip and @garlandz-db, could you propose new messages instead of touching the existing SparkVersion message? This kind of piggybacking is not a good design choice, because someone might want to add a Python version later, and then a Photon (or Comet) version after that.
SparkVersion is not a gateway for looking up the server installation, is it?
We're closing this PR because it hasn't been updated in a while. This isn't a judgement on the merit of the PR in any way. It's just a way of keeping the PR queue manageable. If you'd like to revive this PR, please reopen it and ask a committer to remove the Stale tag!