[Feature Request] expose Delta version programmatically
Feature request
Overview
Add the Delta Lake library version to the Python API.
I am unable to find the installed Delta library version programmatically when using the Python package.
Let me know if this feature already exists; I didn't find it.
Willingness to contribute
The Delta Lake Community encourages new feature contributions. Would you or another member of your organization be willing to contribute an implementation of this feature?
- [ ] Yes. I can contribute this feature independently.
- [x] Yes. I would be willing to contribute this feature with guidance from the Delta Lake community.
- [ ] No. I cannot contribute this feature at this time.
There isn't one. But why do you need to know this version? Don't you already know what version you have installed with Spark?
I am using the Delta Lake package that comes with Databricks on Azure. Where can I find the exact version?
I found the Delta Lake version in the Databricks runtime docs.
Ex: https://docs.databricks.com/release-notes/runtime/11.1.html
FWIW, there is a public `SparkSession.version` field (exposed in PySpark too).
Hi @gokulyc could you share a little more about your use case? Specifically, what is the pattern you're trying to implement that requires you to programmatically get the Delta version? Thanks!
@nkarpov I observed that Python libraries like sklearn, pandas, and numpy (most libraries) expose the library version as a `__version__` attribute. The value usually lives in a version.py file.
For example, optimize() was introduced in Delta Lake 2.0, so one could use this value to gate logic instead of relying on exception handling. It would improve the developer experience.
Please add this minor feature.
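A minimal sketch of the kind of gating this would enable, assuming a hypothetical `__version__` string such as `"2.0.0"` (the attribute does not exist in the Delta Python package yet):

```python
# Sketch: gate a feature on a (hypothetical) delta.__version__ string.
# optimize() landed in Delta Lake 2.0, so compare against (2, 0).

def supports_optimize(version_string: str) -> bool:
    """Return True if the given Delta version string is at least 2.0."""
    major, minor = (int(part) for part in version_string.split(".")[:2])
    return (major, minor) >= (2, 0)

# Callers could then branch cleanly instead of catching exceptions:
# if supports_optimize(delta.__version__):
#     delta_table.optimize().executeCompaction()
```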
The version is already present in setup.py; it just needs to be added to `__version__.py` or `__init__.py`:
https://github.com/delta-io/delta/blob/edaeb86304211513c8028d056a7d90e98ec2839c/setup.py#L17
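Until a `__version__` attribute ships, one possible workaround is to read the installed distribution's metadata with the standard library; this sketch assumes the package is installed under its PyPI distribution name, `delta-spark`:

```python
# Sketch: look up an installed package's version via stdlib metadata
# (Python 3.8+). Returns None when the distribution is not installed.
from importlib.metadata import version, PackageNotFoundError

def installed_version(dist_name: str):
    """Return the installed version string for dist_name, or None if absent."""
    try:
        return version(dist_name)
    except PackageNotFoundError:
        return None

# e.g. installed_version("delta-spark") -> a version string, or None
```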
Thanks @gokulyc, are you open to submitting a PR for this?