Floating Point and Computers

  • maxwellapex
  • Oct 9
  • 1 min read

Recently we have been working on improving the performance of machine learning, and we found that floating-point precision matters a great deal. Lower-precision formats like FP16 or FP8 are better suited to machine learning training, while higher-precision formats are better for scientific simulation, such as weather prediction. To be more specific, although both kinds of floating-point formats were designed for "computation", they have followed different development paths. Also, according to NVIDIA's documentation, some are already using FP4 for training, and I think this is the trend.
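The precision gap between these formats is easy to see directly. Here is a minimal Python sketch (assuming NumPy is available) that compares machine epsilon across FP16, FP32, and FP64, and shows how an FP16 accumulator stalls once values grow large:

```python
import numpy as np

# Machine epsilon: the gap between 1.0 and the next representable
# value -- a standard measure of a format's precision.
for dtype in (np.float16, np.float32, np.float64):
    info = np.finfo(dtype)
    print(f"{np.dtype(dtype).name}: eps={info.eps}, max={info.max}")

# FP16 has only 10 fraction bits, so consecutive representable values
# near 2048 are 2 apart. Adding 1 repeatedly therefore stalls:
# 2048 + 1 rounds back to 2048, and the counter never advances.
count = np.float16(0)
for _ in range(3000):
    count = np.float16(count + np.float16(1))
print(count)  # stalls at 2048.0, not 3000.0
```

This stalling effect is why mixed-precision training schemes typically keep a higher-precision master copy of the weights: small gradient updates added to large weight values would otherwise vanish in FP16, exactly as in the loop above.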

Finally, I am glad that we learned a new lesson about speeding up machine learning.

