modelstore
🏬 modelstore is a Python library that allows you to version, export, and save a machine learning model to your filesystem or a cloud storage provider.
I originally put this issue into the discussion area, but thought it might be a better fit as an issue. ------------------ I am migrating an existing database that is storing...
## Problem I was trying to run a service in a Docker container, with a modelstore that I created on my own machine attached to it as a mounted volume. This...
Adding this issue here for visibility (I received it via email 📥 ): --- Currently, our models are deployed by being baked into a docker image, partly because of legacy,...
There is, to my knowledge, no straightforward way of retrieving additional data sent on model upload, other than downloading the entire artifact and knowing the exact name of the...
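The limitation above can be illustrated with a stdlib-only sketch. Assuming (hypothetically) that the downloaded artifact is a `tar.gz` archive, extracting one piece of extra data requires knowing its exact member name up front; the archive layout and the `extra/metadata.json` name here are illustrative, not modelstore's actual format:

```python
import io
import tarfile

def read_member(archive_bytes: bytes, member_name: str) -> bytes:
    """Pull a single named file out of a downloaded tar.gz artifact.

    The caller must already know `member_name` exactly -- there is no
    lookup by pattern, which is the limitation described in the issue.
    """
    with tarfile.open(fileobj=io.BytesIO(archive_bytes), mode="r:gz") as tar:
        member = tar.getmember(member_name)  # raises KeyError on a wrong name
        return tar.extractfile(member).read()

# Build a tiny archive in memory to demonstrate (names are hypothetical):
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w:gz") as tar:
    payload = b'{"notes": "sent on model upload"}'
    info = tarfile.TarInfo(name="extra/metadata.json")
    info.size = len(payload)
    tar.addfile(info, io.BytesIO(payload))

data = read_member(buf.getvalue(), "extra/metadata.json")
print(data.decode())
```

A convenience API that lists members or fetches a named extra file without downloading the whole artifact would remove the need for this kind of manual extraction.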
When trying to load an `xgb.core.Booster` object, I get a `ValueError: could not find matching manager` error. So it appears that this lower-level xgboost object (which doesn't implement the sklearn...
Anonymous access of GCP bucket fails with `ValueError: Anonymous credentials cannot be refreshed.`
Affects modelstore 0.0.74. To reproduce: ```shell # create a new environment (Python 3.8) python -m venv env source env/bin/activate # install modelstore and GCP CLI pip install modelstore google-cloud-storage python...
> Use of pkg_resources is deprecated in favor of [importlib.resources](https://docs.python.org/3.11/library/importlib.resources.html#module-importlib.resources), [importlib.metadata](https://docs.python.org/3.11/library/importlib.metadata.html#module-importlib.metadata) and their backports ([importlib_resources](https://pypi.org/project/importlib_resources), [importlib_metadata](https://pypi.org/project/importlib_metadata)). Some useful APIs are also provided by [packaging](https://pypi.org/project/packaging) (e.g. requirements and version parsing). Users...
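The migration suggested in that deprecation notice can be sketched with the standard library alone. This is a minimal sketch, not modelstore's actual code; it shows `importlib.metadata` replacing a typical `pkg_resources.get_distribution(...).version` call:

```python
# importlib.metadata is in the stdlib since Python 3.8; older interpreters
# can use the importlib_metadata backport with the same API.
from importlib.metadata import PackageNotFoundError, version
from typing import Optional

def installed_version(package: str) -> Optional[str]:
    """Return the installed version of `package`, or None if it is absent.

    Roughly equivalent to pkg_resources.get_distribution(package).version,
    but without importing the deprecated pkg_resources module.
    """
    try:
        return version(package)
    except PackageNotFoundError:
        return None

print(installed_version("definitely-not-a-real-package"))  # None
```

For requirement and version parsing, the `packaging` library mentioned in the notice (`packaging.version.Version`, `packaging.requirements.Requirement`) covers the remaining `pkg_resources` use cases.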
See https://github.com/VowpalWabbit/vowpal_wabbit
Based on what I've seen of the implementation, this should be possible using a `root_prefix` + `isoformat_date` fragment, leveraging the search-by-prefix capability of S3 and Azure Blob storage.
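The idea can be sketched in plain Python. The key layout below (`<root_prefix>/<domain>/YYYY/MM/DD/...`) is a hypothetical example of a date fragment embedded in storage keys, not modelstore's confirmed scheme; with such a layout, a prefix-only listing (S3 `ListObjectsV2` with `Prefix=...`, Azure `list_blobs` with `name_starts_with=...`) returns just one day's uploads:

```python
from datetime import date

def date_prefix(root_prefix: str, domain: str, created: date) -> str:
    """Build a storage-key prefix selecting artifacts uploaded on one day.

    Assumes a hypothetical key layout that embeds an ISO-style date
    fragment, e.g. "models/sklearn/2023/05/17/<model-id>/...".
    """
    return f"{root_prefix}/{domain}/{created.strftime('%Y/%m/%d')}/"

# Simulate a bucket listing filtered by prefix:
keys = [
    "models/sklearn/2023/05/17/abc123/artifacts.tar.gz",
    "models/sklearn/2023/05/18/def456/artifacts.tar.gz",
]
prefix = date_prefix("models", "sklearn", date(2023, 5, 17))
matched = [k for k in keys if k.startswith(prefix)]
print(matched)
```

Because both S3 and Azure Blob perform this filtering server-side, the client never has to enumerate the whole store to find artifacts from a given date.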
I am trying to run the following code using v0.0.80 ```python import tensorflow as tf from modelstore import ModelStore model_store = ModelStore.from_azure( container_name="xyz", root_prefix="xyz", ) def tf_model(): model = tf.keras.models.Sequential(...