
People have been throwing around the term “AI” for many years now. It was typically used by non-technical people to describe anything that was technically impressive. Back then, it was pretty easy to explain the difference between AI and ML (Machine Learning). AI, we technical people had decided, was reserved for the future. Some utopia where we have actual artificial intelligence. But something basic like deciphering handwriting or anomaly detection was just calculus at high speeds - which we call machine learning.
Even in the early days of LLMs, we technical people still tried to correct others when they used AI to describe what seemed like simple next token prediction. Things have changed now. The latest LLM models on the market are unquestionably artificial intelligence. Strangely, I still see a lot of people trying to hold onto the whole “It’s just next token prediction. It’s not actual AI.” If you disagree with me, then tell me: what would actual AI look like? Is being smarter and 100x faster than all of human knowledge put together not good enough to call it intelligent?
I don’t care how it does it. I don’t care if it’s next token prediction or if the words are getting streamed down by angels from the heavens above. It’s incredibly impressive. It’s intelligent. It’s artificial. It’s AI.
2/12/2026