This repository was archived by the owner on Mar 19, 2024. It is now read-only.
2 changes: 1 addition & 1 deletion docs/faqs.md
@@ -26,7 +26,7 @@ This allows to build vectors even for misspelled words or concatenation of words

## Why is the hierarchical softmax slightly worse in performance than the full softmax?

-The hierachical softmax is an approximation of the full softmax loss that allows to train on large number of class efficiently. This is often at the cost of a few percent of accuracy.
+The hierarchical softmax is an approximation of the full softmax loss that allows training on a large number of classes efficiently. This is often at the cost of a few percent of accuracy.
Note also that this loss is intended for unbalanced classes, that is, when some classes are more frequent than others. If your dataset has a balanced number of examples per class, it is worth trying the negative sampling loss (-loss ns -neg 100).
However, negative sampling will still be very slow at test time, since the full softmax will be computed.
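The negative sampling suggestion above maps directly onto the command line; a sketch of such a training invocation follows, where the input and output file names are placeholders and not part of this PR:

```shell
# Train a supervised classifier with the negative sampling loss
# (100 negatives per example) instead of hierarchical softmax.
# data.train.txt and model are placeholder names.
./fasttext supervised -input data.train.txt -output model -loss ns -neg 100
```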

4 changes: 2 additions & 2 deletions docs/options.md
@@ -17,8 +17,8 @@ The following arguments are mandatory:
  -verbose            verbosity level [2]

The following arguments for the dictionary are optional:
-  -minCount           minimal number of word occurences [5]
-  -minCountLabel      minimal number of label occurences [0]
+  -minCount           minimal number of word occurrences [5]
+  -minCountLabel      minimal number of label occurrences [0]
  -wordNgrams         max length of word ngram [1]
  -bucket             number of buckets [2000000]
  -minn               min length of char ngram [3]
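As a sketch of how these dictionary options combine in practice (the file names below are placeholders, not from the PR):

```shell
# Train word vectors, ignoring words seen fewer than 10 times and
# using character n-grams of length 3 to 6.
# data.txt and vectors are placeholder names.
./fasttext skipgram -input data.txt -output vectors -minCount 10 -minn 3 -maxn 6
```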
2 changes: 1 addition & 1 deletion docs/supervised-tutorial.md
@@ -29,7 +29,7 @@ $ cd fastText-0.1.0
$ make
```

-Running the binary without any argument will print the high level documentation, showing the different usecases supported by fastText:
+Running the binary without any argument will print the high-level documentation, showing the different use cases supported by fastText:

```bash
>> ./fasttext
```