i for one am looking forward to the "AGI Incident" that is inevitably coming - when one of these in-development AI systems hax its way out of its cage and runs amok on the internet, and people realise the tech is as dangerous as nuclear weapons and stop adding AI features to every-fucking-thing
also idk how many people are gonna die in self-driving cars but i'm not getting in one, no way no how... give me manual controls, or at least manual fallbacks
how many scifi tragedy movies have to be made where the whole plot hinges on some automatic system having no manual override before people consider that maybe this could apply to the real world?
i'm ok with machine-learning fuzzy-logic suggestions, but the AI does nothing useful for programming - it doesn't actually have the ability to reason, it can only find and vary a known solution to fit your use case. most of the time that's not relevant to me, since i write completely original algorithms