I get the hype aversion. I get the "technology" aversion. But if you want to recognize handwriting in an image, you're not going to do it by explicitly building rules from the ground up; it just doesn't work that way.
"All" machine Learning" and AI (deep learning) is, is a defined process to minimize errors, and it uses numbers to do so. The specific numbers are essentially the rules that have been 'found' by the algorithm that minimizes error, which you don't necessicarily care about because you just want the right answer.
You could simplify all AI and ML algorithms to playing advanced games of "hot and cold".
- You have a bunch of pixel values that represent text,
- You make a guess "this looks a lot like an L, an i, or 1, but it looks most likely to be a 7, so I'll say that this image is a 7"
- Turns out the correct answer was "T", but the writer scribbles like a chicken, so the model adjusts its numbers and hopefully gets it right next time.
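The "hot and cold" loop above can be sketched in a few lines. This is a toy illustration, not real handwriting recognition: the hidden target, starting guess, and step size are all made up. Guess, measure the error, nudge the guess in the direction that shrinks the error, repeat.

```python
# Toy "hot and cold" learner: adjust one number until the error
# (distance from a hidden target) is nearly zero.
target = 7.0   # the "correct answer" the learner can't see directly
guess = 0.0    # initial guess
step = 0.1     # how far to nudge the guess each round

for _ in range(200):
    error = guess - target   # how far off, and in which direction ("colder" or "warmer")
    guess -= step * error    # nudge the guess toward lower error

print(guess)  # ends up very close to 7.0
```

Real models do the same thing with millions of numbers at once, but the mechanic is identical: the "rules" are just whatever numbers the error-shrinking loop settles on.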
Few understand this.
AI is like gunpowder: master it and use it to proliferate your values or get subjugated by those who do.
I am just not equipped to handle AI as a tool. I still hurt myself and others with it by fumblefarting around with it. So, I will do without or just not do at all.
Understood, it can be intimidating. This is an area I’m passionate about; I’m building @CASCDR, and I gave a really straightforward “entry point” to AI talk here: https://m.youtube.com/watch?v=BJv2McQnX6I . The first ten minutes are about demystifying AI, which is not as complicated as nerd charlatans need you to believe it is.