Hello,
I'm experiencing difficulties similar to #140: I'm trying to fit an auto-sklearn classifier on a dataset for which I get reasonable results with SGD logistic regression and XGBoost, but all auto-sklearn returns is an ensemble containing only the DummyClassifier.
Some relevant details:
- auto-sklearn version: 0.4.0. From the release notes I was under the impression that #140 should be resolved.
- Tried subsampling.
- Tried increasing the time limit (per run and per task) to a couple of hours; SGD logistic regression takes less than a minute.
- Tried dense/sparse representations.
- The problem is solved by using AUC loss (as in #140), but this isn't very helpful: I'm trying to optimize a custom loss function, for which the problem persists (see the sketch at the end of this issue).
- Log loss also produces only the DummyClassifier.
Any ideas? Maybe it's possible to exclude the DummyClassifier?
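For reference, here's a minimal sketch of the kind of setup I'm running. The dataset and the log-loss scorer are just placeholders (my real data and custom loss are different), and exactly where the metric is passed may vary between releases:

```python
import sklearn.datasets
import sklearn.metrics
import autosklearn.classification
import autosklearn.metrics

# Placeholder data -- my real dataset is larger and sparse.
X, y = sklearn.datasets.load_breast_cancer(return_X_y=True)

# Placeholder metric built with autosklearn.metrics.make_scorer; my actual
# custom loss is different, but the behaviour is the same with it.
log_loss_scorer = autosklearn.metrics.make_scorer(
    name="log_loss",
    score_func=sklearn.metrics.log_loss,
    optimum=0,
    greater_is_better=False,
    needs_proba=True,
)

automl = autosklearn.classification.AutoSklearnClassifier(
    time_left_for_this_task=7200,  # a couple of hours overall
    per_run_time_limit=3600,       # generous per-run limit
)
# In 0.4.0 the metric goes to fit(); newer releases take it in the constructor.
automl.fit(X, y, metric=log_loss_scorer)
print(automl.show_models())
```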