
A fast library for AutoML and tuning. Join our Discord: https://discord.gg/Cppx2vSPVP.

Results: 225 FLAML issues, sorted by most recently updated.

Open #807 Isolate the ensemble part and expose it to users

Hi FLAML developers, We’re researchers at Cornell University focused on quality assurance for machine learning workflows ([our group](https://www.cs.cornell.edu/~saikatd/)). We’d love to introduce you to [**NBTest**](https://github.com/seal-research/NBTest), a new tool we’ve built...

### Is your feature request related to a problem? Please describe.
Need to support Python 3.12.

### Describe the solution you'd like
_No response_

### Additional context
_No response_

documentation
enhancement
dependencies

> - Stop duplicating X_first in both train/val sets
> - Implement dynamic class balancing
> - Preserve original dataset size

## Why are these changes needed?
In `generic_task.py`, ...
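The fix described above can be sketched outside FLAML. Everything here is a hedged illustration: `split_keep_all_classes` and its arguments are hypothetical names, not FLAML's `generic_task.py` internals.

```python
# Hedged sketch (hypothetical helper, not FLAML's actual code): guarantee
# every class appears in the training split by *moving* rows out of the
# validation split instead of copying them into both, so the total dataset
# size is preserved and no row is duplicated.
import numpy as np

def split_keep_all_classes(y, val_frac=0.25, seed=0):
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    n_val = int(len(y) * val_frac)
    val, train = list(idx[:n_val]), list(idx[n_val:])
    seen = {y[i] for i in train}
    moved = []
    for j in val:
        if y[j] not in seen:  # class missing from train: move, don't copy
            moved.append(j)
            seen.add(y[j])
    train += moved
    val = [j for j in val if j not in moved]
    return train, val
```

With this shape, the two splits stay disjoint and their sizes sum to the original dataset size, which is the invariant the proposed change asks for.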

### Describe the bug
After I fit an AutoML ensemble for a regression (`"ensemble": True` in the AutoML settings), I get the best model of the ensemble this way: `best_model...

bug
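For context on the report above: with `"ensemble": True`, FLAML wraps the finalist learners in a scikit-learn stacking ensemble, so the fitted object is a stack rather than a single estimator. A minimal sklearn-only sketch (the data and member models are illustrative, not taken from the issue):

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.linear_model import Ridge

X, y = make_regression(n_samples=200, n_features=5, random_state=0)
stack = StackingRegressor(
    estimators=[("rf", RandomForestRegressor(n_estimators=10, random_state=0)),
                ("ridge", Ridge())],
    final_estimator=Ridge(),
).fit(X, y)

# The fitted members live in stack.estimators_ and the meta-learner in
# stack.final_estimator_; there is no single "best model" attribute, which
# is why code written for a single estimator can break on an ensemble.
members = [type(m).__name__ for m in stack.estimators_]
```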

### Is your feature request related to a problem? Please describe.
The `AutoML` instance argument `metric` documentation is currently:

> metric - A string of the metric name or a...

enhancement

### Describe the bug
After I fit an AutoML ensemble for a regression (`"ensemble": True` in the AutoML settings), trying to get the feature importances as follows, I get an empty...

enhancement
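One hedged workaround for the empty-importances symptom above, shown with plain scikit-learn: a stacking ensemble exposes no `feature_importances_` of its own, so averaging the members' importances is an assumption that only holds when every member provides that attribute.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import (ExtraTreesRegressor, RandomForestRegressor,
                              StackingRegressor)
from sklearn.linear_model import Ridge

X, y = make_regression(n_samples=200, n_features=5, random_state=0)
stack = StackingRegressor(
    estimators=[("rf", RandomForestRegressor(n_estimators=10, random_state=0)),
                ("et", ExtraTreesRegressor(n_estimators=10, random_state=0))],
    final_estimator=Ridge(),
).fit(X, y)

# Average the per-feature importances across members; each tree ensemble's
# importances are normalized to sum to 1, so the average does too.
importances = np.mean(
    [m.feature_importances_ for m in stack.estimators_], axis=0
)
```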

### Describe the bug
I have a data set where I have tried to optimise the hyperparameters with FLAML, and it seems that the model keeps getting worse the longer...

bug

### Describe the bug
Hi: I've created two customized LightGBM estimators for AutoML:

```python
class MyMonotonicLightGBMGBDTClassifier(BaseEstimator):
    def __init__(self, task='binary:logistic', n_jobs=num_cores, **params):
        super().__init__(task, **params)
        self.estimator_class = LGBMClassifier
        # ...
```

bug

### Describe the bug
Hi @thinkall, I think that I may have found an issue in FLAML, although it's possible it was a deliberate choice by the developers. Basically if...

bug