**Let's start with Word2Vec**
word2vec is a model that learns semantic vectors for words from unlabeled text.
Word vectors enable applications such as word-similarity calculation and clustering. BERT, an extension of this technology, is also used in Google's search service.
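The word-similarity application mentioned above usually boils down to computing the cosine similarity between two word vectors. Here is a minimal sketch; the vectors below are made-up toy values for illustration, not real word2vec output (real embeddings typically have 100–300 dimensions learned from a large corpus):

```python
import math

# Toy 3-dimensional "word vectors", invented for illustration only.
vectors = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.8, 0.9, 0.1],
    "apple": [0.1, 0.2, 0.9],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: closer to 1.0 means more similar."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

print(cosine_similarity(vectors["king"], vectors["queen"]))  # high: similar words
print(cosine_similarity(vectors["king"], vectors["apple"]))  # low: unrelated words
```

With vectors learned by an actual word2vec model, the same calculation is what ranks "most similar words" for a query word.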
But are you struggling to learn word2vec? The concept is hard to grasp because it is unfamiliar.
There are already resources that explain word2vec, but diagrams alone rarely get you to the level of understanding it well enough to use it. And when the explanation is a working program, even with comments, it is hard to see what each line does and how it works.
Use Google Colaboratory!
So we recommend Google Colaboratory, which lets you execute programs line by line.
The benefits of using Google Colaboratory include:

- It is free if you have a Gmail account.
- You can execute code line by line.
- Most importantly, it can be applied to machine learning beyond word2vec.
Many people actually use Google Colaboratory to learn and experiment with machine learning.
The best way to understand how the algorithm works is to actually get your hands on it. I played around with word2vec myself, and it gave me a much better understanding of how it works.
To help you learn, we have made the Google Colaboratory program and its explanations available.
Please see the link below.
https://subcul-science.booth.pm/items/1562211