MetaSpore
Offline feature refactoring
[DONE] optimizer_abstraction
Add the following PyTorchEstimator parameters:
- loss and metric
  - loss_function
  - metric_class (and builtin ModelMetric classes)
- dataset and minibatch transformers
  - training_dataset_transformer & validation_dataset_transformer
  - training_minibatch_transformer & validation_minibatch_transformer
  - training_minibatch_preprocessor & validation_minibatch_preprocessor & minibatch_preprocessor
- coordinator and worker hooks
  - coordinator_start_hook & coordinator_stop_hook
  - start_workers_hook & stop_workers_hook
  - worker_start_hook & worker_stop_hook
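A minimal sketch of how these parameters might be passed to PyTorchEstimator. The parameter names come from the list above, but the hook and transformer signatures are assumptions for illustration, not confirmed MetaSpore API:

```python
# Hypothetical hook: runs on each worker before training starts.
# The actual callback signature in MetaSpore may differ.
def worker_start_hook(worker):
    print("worker started")

# Hypothetical transformer: receives a training minibatch and
# returns a (possibly modified) minibatch.
def training_minibatch_transformer(minibatch):
    return minibatch

estimator_params = dict(
    loss_function=None,      # a callable loss, e.g. torch.nn.BCELoss()
    metric_class=None,       # one of the builtin ModelMetric classes
    training_minibatch_transformer=training_minibatch_transformer,
    worker_start_hook=worker_start_hook,
)
# estimator = metaspore.PyTorchEstimator(module=net, **estimator_params)
```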
[DONE] offline_feature_refactoring
- [DONE] move arrow and feature related files to common/
- [DONE] disable _GLIBCXX_USE_CXX11_ABI and use libarrow.so shared library
- [DONE] replace pandas_udf with mapInPandas
- [DONE] remove IndexBatch and CombineSchema
- [DONE] implement SparseFeatureExtractor and use it in Python
- [DONE] implement SparseFeatureGroup
- [DONE] SparsePull debug
- [DONE] move arrow and feature related tests
- [Canceled] upgrade algorithms to cope with breaking incompatible changes (to be finished later)
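A hedged sketch of the pandas_udf to mapInPandas migration noted above. mapInPandas hands each partition to a function as an iterator of pandas DataFrames, so per-partition state (such as a feature extractor initialized once) can live outside the loop. The `extract_features` function and the `user_id` column are illustrative, not MetaSpore's actual code:

```python
from typing import Iterator
import pandas as pd

def extract_features(batches: Iterator[pd.DataFrame]) -> Iterator[pd.DataFrame]:
    # Expensive per-partition setup would go here, before iterating.
    for batch in batches:
        batch = batch.copy()
        # Illustrative transform: derive a hashed feature column.
        batch["user_id_hash"] = batch["user_id"].map(hash)
        yield batch

# With Spark, this function would be applied as:
#   df.mapInPandas(extract_features, schema="user_id string, user_id_hash long")
# Here we exercise it directly with a plain pandas input:
out = pd.concat(extract_features(iter([pd.DataFrame({"user_id": ["a", "b"]})])))
```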