Most of what we call AI can learn and update to an extent, in order to match a wider variety of inputs or to improve accuracy on existing inputs. This is a hardcoded solution. If this counts as AI, then every bit of software ever written also counts as AI, which makes the term even more meaningless and marketing-buzzwordy than it already is.
I suppose the question is, if an AI has learned and you export that final learned state to use in a now-hardcoded classifier, is that classifier still AI (or part of the overall AI) or is it simply the output? I can imagine arguments on both sides. If you accept that as AI, then sure, this fits the bill!
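To make the distinction concrete, here's a minimal sketch (using scikit-learn and made-up data, not anything from the article): the "learning" happens once, and the exported state is just numbers baked into a plain function.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # "Learning" phase, done offline: fit a classifier on some toy data.
    X = np.random.rand(200, 3)
    y = (X[:, 0] + X[:, 1] > 1.0).astype(int)
    model = LogisticRegression().fit(X, y)

    # Export the final learned state: it's just numbers.
    w, b = model.coef_[0], model.intercept_[0]

    # The now-hardcoded classifier: no learning left in it, only arithmetic.
    def classify(x):
        return int(np.dot(w, x) + b > 0)

    print(classify([0.9, 0.8, 0.1]))  # uses the frozen weights, never updates them

Whether you call that last function "AI" or "the output of AI" is exactly the question above.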
> Most of what we call AI can learn and update to an extent
Most of what we call AI are hardcoded solutions once in production. There may be ongoing offline improvements being made, but once the improvements are established the production AI is replaced with a point-in-time snapshot of the AI undergoing offline training. Self-learning in production causes all kinds of problems, but most significantly it's a security issue since it gives an attacker the ability to manipulate behavior by curating examples.
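Roughly, the pattern looks like this (file name and model are hypothetical stand-ins): train offline, write a point-in-time snapshot, and have production load it read-only so live traffic can never change its behavior.

    import pickle
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Offline: training runs on curated data, then writes a point-in-time snapshot.
    X = np.random.rand(200, 3)
    y = (X[:, 0] + X[:, 1] > 1.0).astype(int)
    with open("model_2024_06_01.pkl", "wb") as f:
        pickle.dump(LogisticRegression().fit(X, y), f)

    # Production: load the snapshot once at startup and only ever call predict().
    # fit() is never called on live traffic, so curated/malicious inputs
    # can't steer the deployed model's behavior.
    with open("model_2024_06_01.pkl", "rb") as f:
        serving_model = pickle.load(f)

    print(serving_model.predict([[0.9, 0.8, 0.1]]))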
It's comical how little people understand about machine learning: no one calls an ODE solver artificial intelligence, but gradient descent on an interesting equation is somehow now A.I.
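For reference, here's what that "A.I." amounts to: a toy gradient descent minimizing f(x) = (x - 3)^2, which is just an iterate-until-converged numerical loop.

    # Toy gradient descent on f(x) = (x - 3)^2: the same kind of
    # iterate-until-converged loop any numerical solver runs.
    def grad(x):
        return 2 * (x - 3)   # derivative of (x - 3)^2

    x, lr = 0.0, 0.1         # starting point and step size
    for _ in range(100):
        x -= lr * grad(x)    # step downhill along the gradient

    print(x)                 # ~3.0, the minimizer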
Why does this system have to be hardcoded? Certainly you could automate the glass fabrication technique with a computer and robotics – I'm just imagining that was beyond the scope of this study.