This repository was archived by the owner on Jan 15, 2024. It is now read-only.

Conversation

eric-haibin-lin
Member

Description

reopen #851

Checklist

Essentials

  • PR's title starts with a category (e.g. [BUGFIX], [MODEL], [TUTORIAL], [FEATURE], [DOC], etc)
  • Changes are complete (i.e. I finished coding on this PR)
  • All changes have test coverage
  • Code is well-documented

Changes

  • Feature1, tests, (and when applicable, API doc)
  • Feature2, tests, (and when applicable, API doc)

Comments

  • If this is a backward-incompatible change, explain why it must be made.
  • Interesting edge cases to note here

cc @dmlc/gluon-nlp-team

@eric-haibin-lin eric-haibin-lin requested a review from a team as a code owner September 30, 2019 21:06
@codecov

codecov bot commented Sep 30, 2019

Codecov Report

Merging #950 into master will increase coverage by 9.86%.
The diff coverage is 86.34%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master     #950      +/-   ##
==========================================
+ Coverage   78.41%   88.27%   +9.86%     
==========================================
  Files          67       67              
  Lines        6277     6287      +10     
==========================================
+ Hits         4922     5550     +628     
+ Misses       1355      737     -618
Impacted Files Coverage Δ
src/gluonnlp/utils/parameter.py 87.09% <100%> (+3.16%) ⬆️
src/gluonnlp/optimizer/__init__.py 100% <100%> (ø) ⬆️
src/gluonnlp/metric/__init__.py 100% <100%> (ø) ⬆️
src/gluonnlp/metric/masked_accuracy.py 100% <100%> (+39.53%) ⬆️
src/gluonnlp/utils/version.py 100% <100%> (ø) ⬆️
src/gluonnlp/utils/files.py 42.62% <18.18%> (-6.4%) ⬇️
src/gluonnlp/model/train/language_model.py 88.51% <25%> (+48.62%) ⬆️
src/gluonnlp/optimizer/bert_adam.py 87.32% <81.57%> (-5.86%) ⬇️
src/gluonnlp/metric/length_normalized_loss.py 89.28% <89.28%> (ø)
src/gluonnlp/data/utils.py 86.39% <95.34%> (+12.34%) ⬆️
... and 22 more

@mli
Member

mli commented Sep 30, 2019

Job PR-950/1 is complete.
Docs are uploaded to http://gluon-nlp-staging.s3-accelerate.dualstack.amazonaws.com/PR-950/1/index.html

@szha szha added the release focus Progress focus for release label Oct 8, 2019
@mli
Member

mli commented Nov 6, 2019

Job PR-950/2 is complete.
Docs are uploaded to http://gluon-nlp-staging.s3-accelerate.dualstack.amazonaws.com/PR-950/2/index.html

@mli
Member

mli commented Dec 16, 2019

Job PR-950/3 is complete.
Docs are uploaded to http://gluon-nlp-staging.s3-accelerate.dualstack.amazonaws.com/PR-950/3/index.html

@mli
Member

mli commented Jan 15, 2020

Job PR-950/4 is complete.
Docs are uploaded to http://gluon-nlp-staging.s3-accelerate.dualstack.amazonaws.com/PR-950/4/index.html

@leezu
Contributor

leezu commented Jan 15, 2020

@eric-haibin-lin

[2020-01-15T17:07:50.259Z] tests/unittest/test_models.py::test_bert_models[False] FAILED            [ 60%]

[2020-01-15T17:08:37.013Z] tests/unittest/test_models.py::test_bert_models[True] FAILED             [ 61%]

@eric-haibin-lin
Member Author

I saw that, and I am not able to reproduce it locally.

@leezu leezu self-assigned this Jan 22, 2020
@eric-haibin-lin
Member Author

@leezu were you able to reproduce the error?

@mli
Member

mli commented Jan 27, 2020

Job PR-950/5 is complete.
Docs are uploaded to http://gluon-nlp-staging.s3-accelerate.dualstack.amazonaws.com/PR-950/5/index.html

@mli
Member

mli commented Jan 28, 2020

Job PR-950/6 is complete.
Docs are uploaded to http://gluon-nlp-staging.s3-accelerate.dualstack.amazonaws.com/PR-950/6/index.html

Lin added 2 commits January 28, 2020 15:45
@mli
Member

mli commented Jan 28, 2020

Job PR-950/7 is complete.
Docs are uploaded to http://gluon-nlp-staging.s3-accelerate.dualstack.amazonaws.com/PR-950/7/index.html

@mli
Member

mli commented Jan 29, 2020

Job PR-950/8 is complete.
Docs are uploaded to http://gluon-nlp-staging.s3-accelerate.dualstack.amazonaws.com/PR-950/8/index.html

            'Cannot override predefined model settings.'
    predefined_args = bert_hparams[model_name].copy()
    if not hparam_allow_override:
        mutable_args = ['use_residual', 'dropout', 'word_embed']
Contributor


Not changed in this PR, but why is embed_size not part of this?

Member Author


I copied some of this from the transformer get-model function. Actually, I'm not sure whether we want to have a whitelist of mutable args. It makes the function harder to maintain, since we already have the hparam_allow_override flag.

Contributor

@leezu leezu Jan 29, 2020


Do you want to remove the whitelist then? In this PR or a separate one?

Member Author


I'd prefer a separate one
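The whitelist pattern under discussion can be sketched as a standalone function. The names `bert_hparams`, `predefined_args`, `mutable_args`, `hparam_allow_override`, and the assertion message come from the excerpt above; the function name `get_model_hparams` and the sample hyperparameter dict are hypothetical stand-ins for illustration, not GluonNLP's actual API.

```python
# Hypothetical sketch of the discussed pattern: predefined hyperparameters may
# only be overridden via kwargs when the key is whitelisted in mutable_args,
# unless hparam_allow_override disables the check entirely.
bert_hparams = {
    'bert_12_768_12': {'num_layers': 12, 'units': 768,
                       'dropout': 0.1, 'use_residual': True},
}

def get_model_hparams(model_name, hparam_allow_override=False, **kwargs):
    predefined_args = bert_hparams[model_name].copy()
    if not hparam_allow_override:
        mutable_args = frozenset(['use_residual', 'dropout', 'word_embed'])
        # Reject any kwarg that would overwrite a predefined, non-whitelisted value.
        assert all((k not in kwargs or k in mutable_args) for k in predefined_args), \
            'Cannot override predefined model settings.'
    predefined_args.update(kwargs)
    return predefined_args
```

With the flag set, any key can be overridden (e.g. `get_model_hparams('bert_12_768_12', hparam_allow_override=True, num_layers=6)`), which is what makes the whitelist look redundant once the flag exists.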

@leezu leezu merged commit 5a776bf into dmlc:master Jan 29, 2020
@eric-haibin-lin eric-haibin-lin deleted the allow branch February 2, 2020 06:21