How to train a Japanese model with Sentence Transformers to obtain distributed representations of sentences
BERT is a model that can be applied powerfully to natural language processing tasks.
However, it does not do a good job of capturing sentence-level features.
Some claim that sentence features appear in the [CLS] token, but [this paper](https://arxiv.org/abs/1908.10084) argues that it does not contain much useful information for sentence-level tasks.
Sentence BERT is a model that extends BERT so that it can obtain a feature vector per sentence.
The following are the steps to create Sentence BERT in Japanese.
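To make the goal concrete, here is a minimal sketch of what the finished model lets you do, using the sentence-transformers library. The model path below is a placeholder, not a real checkpoint; substitute the Japanese Sentence BERT you train by following the steps in this post.

```python
# Minimal sketch: encoding Japanese sentences into fixed-size vectors
# with the sentence-transformers library.
from sentence_transformers import SentenceTransformer

# Placeholder path: replace with the Japanese Sentence BERT model
# you train in the steps below.
model = SentenceTransformer("path/to/japanese-sentence-bert")

sentences = ["今日は良い天気です。", "明日は雨が降るでしょう。"]

# encode() returns one embedding per sentence.
embeddings = model.encode(sentences)
print(embeddings.shape)  # e.g. (2, 768) for a BERT-base encoder
```

Each sentence maps to a single dense vector, so downstream tasks such as semantic similarity or clustering can compare sentences directly, for example with cosine similarity.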