
Thoughts about embedding

  • maxwellapex
  • Sep 8
  • 1 min read

When working with LLMs and NLP on documents, we usually apply an embedding technique to improve prediction quality. Embedding is the process of vectorizing words into a space that the machine can understand. For example, "felicitas" in Latin, "happy" in English, and "bonheur" in French should sit close together in that space, and this is how the model understands a word's meaning. Theoretically, a sufficiently powerful LLM could skip the embedding step, because it could learn such representations on its own. In reality, no such LLM exists, so we use embeddings to simplify the problem and shrink the search space, which leads to better performance. In short, mastering embeddings helps a lot in any NLP workflow.
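The idea of "close together in the space" can be made concrete with cosine similarity. The sketch below uses hand-picked 3-dimensional toy vectors standing in for real embeddings (actual models produce hundreds of dimensions, and these values are illustrative assumptions, not the output of any real model):

```python
import math

# Toy stand-ins for real embedding vectors. The numbers are made up
# for illustration; a real model would assign them during training.
embeddings = {
    "happy":     [0.90, 0.80, 0.10],   # English
    "felicitas": [0.88, 0.82, 0.12],   # Latin
    "bonheur":   [0.85, 0.79, 0.15],   # French
    "sad":       [-0.70, -0.60, 0.20],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: near 1.0 means they
    point in almost the same direction, negative means opposite."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Words with similar meanings land close together...
print(cosine_similarity(embeddings["happy"], embeddings["felicitas"]))
# ...while unrelated words point in different directions.
print(cosine_similarity(embeddings["happy"], embeddings["sad"]))
```

With these toy values, "happy" and "felicitas" score close to 1.0 while "happy" and "sad" come out negative, which is the geometric sense in which the model "understands" that the Latin and English words mean the same thing.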

