Support loading a fastText model from a custom file
Add two simple parameters to the `FastText` class, making it possible to load a fastText model from a custom file.
Related to issue #61
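The change described above amounts to roughly the following pattern. This is a minimal sketch with stand-in classes, not the actual diff: the real torchnlp signatures may differ, and the URL template and parameter names are illustrative.

```python
# Sketch of the change: FastText gains optional name/url parameters
# that are forwarded to the base loader, so a custom vector file can
# be used instead of the published ones.
# _PretrainedWordVectors below is a stand-in, not torchnlp's real class.

class _PretrainedWordVectors:
    """Stand-in for torchnlp's base word-vector loader."""

    def __init__(self, name, url=None, **kwargs):
        self.name = name
        self.url = url


class FastText(_PretrainedWordVectors):
    # Illustrative URL template for the published fastText vectors.
    url_base = 'https://dl.fbaipublicfiles.com/fasttext/vectors-wiki/wiki.{}.vec'

    def __init__(self, language='en', name=None, url=None, **kwargs):
        # Fall back to the published vectors only when no custom
        # file name was supplied.
        if name is None:
            name = 'wiki.{}.vec'.format(language)
            url = self.url_base.format(language)
        super().__init__(name, url=url, **kwargs)


# Default behaviour is unchanged; a custom file bypasses the download URL.
default = FastText(language='en')
custom = FastText(name='my_vectors.vec')
```

With this shape, existing callers of `FastText` keep working, while a caller with a local file simply passes its name.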
Codecov Report
Merging #102 into master will increase coverage by 0.00%. The diff coverage is 100.00%.

```diff
@@           Coverage Diff           @@
##           master     #102   +/-   ##
=======================================
  Coverage   94.41%   94.42%
=======================================
  Files          64       64
  Lines        1611     1613     +2
=======================================
+ Hits         1521     1523     +2
  Misses         90       90
```

| Impacted Files | Coverage Δ | |
|---|---|---|
| `torchnlp/metrics/__init__.py` | 100.00% <ø> (ø) | |
| `torchnlp/word_to_vector/fast_text.py` | 100.00% <100.00%> (ø) | |

Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data. Powered by Codecov. Last update cde86ba...56a59d8. Read the comment docs.
Hi! Thanks for your contribution.
It looks like the parameters introduced are directly passed into `PretrainedWordVectors`. Should we just use `PretrainedWordVectors` instead of `FastText` to solve the issue?
Yes, this is a good idea. It is a better fit for the problem; I hadn't fully understood the related class before.
Would you like to change `PretrainedWordVectors` to be public? Or would you like me to?
How do I load `crawl-300d-2M-subword/crawl-300d-2M-subword.bin` in pytorch or pytorch-nlp? Is this feature completed or still in progress?
If I understand correctly, you can do this by passing the name of your file to the `name` parameter.
However, the suggestion from PetrochukM is more appropriate; if you have time, you could follow his suggestion and fix the issue. Thanks!
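The suggested alternative could look roughly like this. Again a stand-in sketch, assuming the base loader is made public under a name like `PretrainedWordVectors`; also note that, as far as I can tell, this loader parses the text `.vec` format, so the binary `.bin` subword model would need converting first (fastText distributes a `.vec` file alongside it).

```python
# Sketch of the suggestion: load a custom vector file through the base
# loader directly instead of adding parameters to FastText. The class
# below is a stand-in for torchnlp's loader, assuming it is made public;
# the cache default is illustrative.

class PretrainedWordVectors:
    """Stand-in for the (now public) base word-vector loader."""

    def __init__(self, name, url=None, cache='.word_vectors_cache'):
        self.name = name    # local file name (or file to download)
        self.url = url      # only needed when the file must be fetched
        self.cache = cache  # directory the loader looks in


# A local file then needs no FastText subclass at all
# (using the text .vec variant of the model asked about above):
vectors = PretrainedWordVectors(name='crawl-300d-2M-subword.vec')
```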