AI decision assistant
- maxwellapex
- Aug 23
- 1 min read

Wouldn’t it be great if AI could help you make decisions, especially tough ones? While I personally believe this is where things are heading, there are a few things to consider.
1. AI and LLMs are just models. So far (2025), there is no AI that can really “think”. Yes, we have chain-of-thought (CoT) and “thinking” modes, but they are closer to a step-by-step generation process displayed to humans than to genuine reasoning. Relying too heavily on that can cause problems.
2. AI tends to please you. You can ask it to be objective, but in practice it often performs objectivity in whatever way it guesses you will like. For example, it may compliment your decision and sprinkle in a little criticism to appear balanced, which may not be what you actually need.
3. AI won’t be in charge. As the disclaimers keep reminding us, these are just models, and any content they generate still has to be verified and justified by a human before acting on it.
Simply put, AI can already serve as a basic decision helper, but there is plenty of room for improvement. Optimistically, I believe a professional AI decision assistant will become far more mature and reliable well before AGI. Until then, we just need to learn how to make the best use of these tools and wait.
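
As one concrete way to “make the best use” of today’s models, here is a minimal Python sketch of a prompting pattern that pushes back against the pleasing problem from point 2: instead of telling the model which option you prefer, it asks for the strongest case against every option. The function name, wording, and the example decision are my own illustration, not tied to any specific product or API.

```python
def build_decision_prompt(question: str, options: list[str]) -> str:
    """Build a prompt that hides the user's preference and asks the model
    to argue AGAINST each option, to reduce the 'pleasing' bias."""
    lines = [
        "You are helping me make a decision. Do not try to guess which option I prefer.",
        f"Decision: {question}",
        "For each option below, list the strongest arguments AGAINST it,",
        "then state the single factor that should drive the final choice.",
        "Options:",
    ]
    lines += [f"- {opt}" for opt in options]
    return "\n".join(lines)


# Example usage (hypothetical decision; send the prompt to whichever LLM you use):
prompt = build_decision_prompt(
    "Should I accept the job offer in another city?",
    ["Accept the offer and relocate", "Stay in my current role"],
)
print(prompt)
```

Keeping your own lean out of the prompt and asking only for counterarguments makes it harder for the model to simply mirror you back.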


