This repository was archived by the owner on Jan 15, 2024. It is now read-only.

Conversation

eric-haibin-lin
Member

Description

Add support for a custom optimizer, and fix the checkpoint interval check.
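The actual diff is not shown in this conversation, so the following is only an illustrative sketch of the two changes the description names: accepting a custom optimizer by name via a registry, and guarding checkpointing with an interval check. All names here (`register_optimizer`, `get_optimizer`, `should_checkpoint`) are hypothetical, not the PR's real API.

```python
# Hypothetical sketch of "custom optimizer support" via a name registry,
# plus a checkpoint-interval guard. Not the actual code from this PR.
_OPTIMIZERS = {}

def register_optimizer(name):
    """Decorator registering a custom optimizer class under `name`."""
    def wrap(cls):
        _OPTIMIZERS[name.lower()] = cls
        return cls
    return wrap

def get_optimizer(name, **kwargs):
    """Instantiate an optimizer by name; fail clearly if it is unknown."""
    try:
        return _OPTIMIZERS[name.lower()](**kwargs)
    except KeyError:
        raise ValueError('Unknown optimizer %r; registered: %s'
                         % (name, sorted(_OPTIMIZERS)))

@register_optimizer('sgd')
class SGD:
    def __init__(self, learning_rate=0.01):
        self.learning_rate = learning_rate

def should_checkpoint(step, ckpt_interval):
    """Checkpoint every `ckpt_interval` steps, but never at step 0."""
    return step > 0 and step % ckpt_interval == 0
```

With this pattern, a training script can resolve `--optimizer my_opt` from the registry instead of hard-coding a fixed list, and the `step > 0` guard avoids writing a spurious checkpoint before any training has happened.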

Checklist

Essentials

  • PR's title starts with a category (e.g. [BUGFIX], [MODEL], [TUTORIAL], [FEATURE], [DOC], etc)
  • Changes are complete (i.e. I finished coding on this PR)
  • All changes have test coverage
  • Code is well-documented

Changes

  • Feature1, tests, (and when applicable, API doc)
  • Feature2, tests, (and when applicable, API doc)

Comments

  • If this change is a backward incompatible change, why must this change be made.
  • Interesting edge cases to note here

@eric-haibin-lin eric-haibin-lin requested a review from a team as a code owner January 18, 2020 06:31
@codecov

codecov bot commented Jan 18, 2020

Codecov Report

Merging #1121 into master will not change coverage.
The diff coverage is n/a.

Impacted file tree graph

@@           Coverage Diff           @@
##           master    #1121   +/-   ##
=======================================
  Coverage   88.34%   88.34%           
=======================================
  Files          66       66           
  Lines        6290     6290           
=======================================
  Hits         5557     5557           
  Misses        733      733

@mli
Member

mli commented Jan 18, 2020

Job PR-1121/1 is complete.
Docs are uploaded to http://gluon-nlp-staging.s3-accelerate.dualstack.amazonaws.com/PR-1121/1/index.html

Contributor

@leezu leezu left a comment


LGTM

@leezu leezu merged commit 65a7671 into dmlc:master Jan 19, 2020
@eric-haibin-lin eric-haibin-lin deleted the pretrain-fix branch January 20, 2020 05:06
3 participants