Release v1.2.0
Breaking changes
- If you implemented your own `Tuner`, the old pattern of reporting results with `Oracle.update_trial()` in `Tuner.run_trial()` is deprecated. Please return the metrics from `Tuner.run_trial()` instead.
- If you implemented your own `Oracle` and overrode `Oracle.end_trial()`, you need to change the signature of the function from `Oracle.end_trial(trial.trial_id, trial.status)` to `Oracle.end_trial(trial)`.
- The default value of the `step` argument in `keras_tuner.HyperParameters.Int()` changed from `1` to `None`. No change in default behavior.
- The default value of the `sampling` argument in `keras_tuner.HyperParameters.Int()` changed from `None` to `"linear"`. No change in default behavior.
- The default value of the `sampling` argument in `keras_tuner.HyperParameters.Float()` changed from `None` to `"linear"`. No change in default behavior.
- If you explicitly rely on protobuf values, the new protobuf bug fix may affect you.
- Changed the mechanism for drawing a random sample of a hyperparameter. Every sample now starts from a random value between 0 and 1, which is then converted to a value in the hyperparameter's range.
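The last item, starting every draw from a uniform value in [0, 1) and converting it, can be sketched in plain Python. This is an illustrative sketch of the idea, not KerasTuner's actual implementation; the `sample_float` function name is hypothetical.

```python
import math
import random


def sample_float(min_value, max_value, sampling="linear", u=None):
    """Convert a uniform draw u in [0, 1) into a value in [min_value, max_value].

    Hypothetical sketch of the new mechanism: every hyperparameter sample
    starts from the same kind of uniform draw, and the `sampling` mode only
    changes how that draw is mapped onto the range.
    """
    if u is None:
        u = random.random()
    if sampling == "linear":
        # Straight interpolation across the range.
        return min_value + u * (max_value - min_value)
    if sampling == "log":
        # Interpolate in log space, so each order of magnitude is equally likely.
        log_min, log_max = math.log(min_value), math.log(max_value)
        return math.exp(log_min + u * (log_max - log_min))
    raise ValueError(f"Unknown sampling mode: {sampling!r}")


# With u pinned to 0.5, linear sampling lands on the midpoint of the range,
# while log sampling lands on its geometric mean.
print(sample_float(0.0, 10.0, sampling="linear", u=0.5))  # 5.0
print(sample_float(1e-4, 1e-2, sampling="log", u=0.5))    # ≈ 1e-3
```

Sharing one uniform draw per sample keeps the sampling mode a pure post-processing step, which is what makes a single default of `"linear"` workable for both `Int` and `Float`.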
New features
- Added a new tuner, `keras_tuner.GridSearch`, which can exhaust all possible hyperparameter combinations.
- Better fault tolerance during the search: added two new arguments to the `Tuner` and `Oracle` initializers, `max_retries_per_trial` and `max_consecutive_failed_trials`.
- You can now mark a `Trial` as failed with `raise keras_tuner.FailedTrialError("error message.")` in `HyperModel.build()`, `HyperModel.fit()`, or your model-building function.
- Better error messages for invalid configs of `Int` and `Float` hyperparameters.
- Added a decorator, `@keras_tuner.synchronized`, for methods of `Oracle` and its subclasses, which synchronizes concurrent calls to ensure thread safety during parallel tuning.
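The synchronization idea behind `@keras_tuner.synchronized` can be sketched with a plain `threading.Lock`: concurrent calls from parallel workers are serialized so the oracle's state is never mutated by two threads at once. This is a minimal stand-alone illustration of the concept, not KerasTuner's implementation; `ToyOracle` and its fields are made up.

```python
import functools
import threading

# A single shared lock; KerasTuner's real decorator manages locking internally.
_lock = threading.Lock()


def synchronized(method):
    """Run the decorated method under a shared lock (hypothetical sketch)."""
    @functools.wraps(method)
    def wrapper(*args, **kwargs):
        with _lock:
            return method(*args, **kwargs)
    return wrapper


class ToyOracle:
    """Stand-in for an Oracle subclass whose state is updated concurrently."""

    def __init__(self):
        self.completed_trials = 0

    @synchronized
    def end_trial(self, trial):
        # Read-modify-write: without the lock, concurrent calls could
        # interleave here and lose increments.
        current = self.completed_trials
        self.completed_trials = current + 1


oracle = ToyOracle()
threads = [
    threading.Thread(target=oracle.end_trial, args=(None,)) for _ in range(100)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(oracle.completed_trials)  # 100
```

Decorating the method rather than sprinkling lock acquisitions through each subclass keeps thread safety declarative: any `Oracle` method that touches shared state just opts in with the decorator.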
Bug fixes
- Protobuf was not converting `Boolean` hyperparameters correctly. This is now fixed.
- Hyperband was not loading the weights correctly for half-trained models. This is now fixed.
- A `KeyError` could occur when using `hp.conditional_scope()` or the `parent` argument of a hyperparameter. This is now fixed.
- `num_initial_points` in `BayesianOptimization` should default to `3 * dimension`, but it defaulted to 2. This is now fixed.
- An error was thrown when using a concrete Keras optimizer object to override the `HyperModel` compile argument. This is now fixed.
- Workers might crash due to `Oracle` reloading when running in parallel. This is now fixed.
New Contributors
- @Firas-RHIMI made their first contribution in #711
- @HanxiaoLyu made their first contribution in #746
- @leleogere made their first contribution in #794
- @LuNoX made their first contribution in #815
Full Changelog: 1.1.3...1.2.0