Add TAPEX #16473
Conversation
The documentation is not available anymore as the PR was closed or merged.
Thanks a lot for adding this super cool model!
LGTM, just left a few nits.
Thanks for the PR!
If the expected architecture should be BART by default, then this model should be added in the relevant auto mappings so that it works with AutoConfig and AutoModelForXxx. This is just a default value that can be changed in the config if there are checkpoints that rely on a different architecture.
However, these functions are used in the script, so I can't remove these imports.
@NielsRogge Thanks for your huge effort! I personally think these two warnings are correct since these two imports are only used in
@sgugger I've addressed all comments.
The model is still missing in the auto configuration and auto model API to provide the BART default, which means that it won't work with AutoModelForSeq2Seq, for instance. This should be added before merging.
All models on the hub do work with the Auto API, can you elaborate? TAPEX is also added to
…estion answering and table-based fact verification.
…kground. - Remove unused code lines in tabfact script. - Disable the default `pad_to_max_length` option which is memory-consuming.
* Fix the do_lower_case behaviour of TapexTokenizer. * Add unit tests for target scenarios and cased/uncased scenarios for both source and target.
…enizer function. * Fix typos in tapex example README.
…zer to control whether do_lower_case * Guarantee the hyper-parameter can be run without out-of-memory on 16GB card and report the new reproduced number on wikisql
* Provide evaluation command.
Co-authored-by: Suraj Patil <[email protected]>
Co-authored-by: Sylvain Gugger <[email protected]>
What does this PR do?
Remember TAPAS, the table QA model by Google AI? Microsoft has now released TAPEX, a seq2seq model that outperforms TAPAS and is actually much simpler: table QA is just treated as a seq2seq problem.
As the weights can be directly loaded into a BART model, this PR only implements TapexTokenizer, which can be used to prepare tables and corresponding texts for the model. This PR also adds 3 scripts that showcase how to fine-tune TAPEX on 3 important benchmarks: WikiSQL and WTQ for table question answering and TabFact for table fact verification.
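For context, the key idea behind TAPEX is that the table itself becomes part of the input sequence: the tokenizer flattens the table into a single string before encoding it, so a plain seq2seq model like BART can consume it. Here is a minimal sketch of that flattening step (this is not the actual TapexTokenizer code; the function name and exact separators are illustrative, based on the "col : ... row 1 : ..." linearization described in the TAPEX paper):

```python
def linearize_table(table: dict) -> str:
    """Flatten a column-oriented table into a single string.

    `table` maps each column header to a list of cell values,
    e.g. {"city": ["Paris", "Berlin"], "population": ["2.1M", "3.6M"]}.
    """
    headers = list(table.keys())
    n_rows = len(next(iter(table.values())))
    # Header row first, then each data row prefixed with its 1-based index.
    parts = ["col : " + " | ".join(headers)]
    for i in range(n_rows):
        cells = [str(table[h][i]) for h in headers]
        parts.append(f"row {i + 1} : " + " | ".join(cells))
    return " ".join(parts)
```

The flattened string is then concatenated with the question and fed to the model like any other text, which is why no new model architecture is needed, only the tokenizer.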
Kudos to @SivilTaram (the original author) for improving my initial TapexTokenizer implementation, as well as adding the 3 fine-tuning scripts.