Local model

  • maxwellapex
  • Sep 1
The Llama series from Facebook is a well-known open-source language model. So, what can a local model do? Although it is not the best model in terms of raw performance, it can be useful for enterprises because of its transparency. Also, for those who can't run a large LLM on their device, a smaller or distilled model fits better and is more CPU/GPU friendly. Cost is another advantage. For example, one of my projects summarizes content into a .json file, and using a local model is totally free because I do not need to worry about API costs. One may argue that commercial models can easily beat open-source ones, but the gap is closing, and I believe we will soon have a good local model for daily use.
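The summarize-to-.json workflow could be sketched roughly as below. This is a minimal illustration, not the project's actual code: `generate` stands in for whatever local-model call you use (for example, a wrapper around a Llama model served by llama.cpp or Ollama), and the field names in the output record are made up for the example.

```python
import json


def summarize_to_json(text, generate, path):
    """Summarize `text` with a local model and save the result as a .json file.

    `generate` is any callable mapping a prompt string to a completion
    string -- e.g. a thin wrapper around a locally hosted Llama model,
    so there is no per-request API cost.
    """
    prompt = f"Summarize the following in one sentence:\n\n{text}"
    summary = generate(prompt)
    # Hypothetical record layout; adjust fields to your own schema.
    record = {"source_length": len(text), "summary": summary}
    with open(path, "w", encoding="utf-8") as f:
        json.dump(record, f, ensure_ascii=False, indent=2)
    return record
```

Because the model is swapped in as a plain callable, the same script works unchanged whether the backend is a 7B model on a laptop CPU or a larger one on a GPU.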

