|"But things like AI and serious DNN based speech recognition, which I am very interested in, I've dug into enough to know that I won't be able to go there. They are already too 'mathematics doctoral thesis'-like for me to take on in the time I have left, at least without completely discarding any existing obligations which ain't too practical."
I think you might be selling yourself short there, unless by "time you have left" you mean "I only have 6 months to a year before I die" - and even then, I'd still say to go for it.
Really, it isn't as difficult as you think. With today's tools and frameworks, while you do need some understanding of what is going on under the hood, it's not like you need deep calculus knowledge (that really only comes into play when implementing such frameworks yourself). It's kinda like making a 3D game - you can either spend the time writing an engine, or just grab one already made (with scenegraph, etc.) - and get down to writing your game.
In the fall of 2011 it was announced that Stanford was sponsoring a couple of online learning classes, being taught by three top-tier instructors. These classes were called "AI Class" and "ML Class". I managed to get myself enrolled into both.
It's been said that the organizers didn't expect a huge response, but they were completely blown away by the number of people who eventually did enroll.
The "AI Class" was being taught by Peter Norvig and Sebastian Thrun. The "ML Class" was being taught by Andrew Ng. Do any of these names seem familiar? They should...
These online classes were not a new thing, but they did succeed in showing how to do it properly. Prior to this, online classes tended to be ad-hoc affairs, cobbled together from pieces, or simply recorded courses uploaded for others to browse; outside of a very few expensive paid offerings, nothing was properly structured. These two courses were really the pioneers of what we call MOOCs today.
Anyhow, I took them. It was a struggle. To make a long story short, I completed the ML Class, and got about halfway through the AI Class before I had to quit due to some personal issues that I won't go into. But I was doing well in that course (though it was right at the edge of my skill and knowledge base).
As an example of what a student managed to accomplish via what they learned in the "ML Class":
How I built a self-driving (RC) car and you can too.[^]
In 2012, Thrun and Ng each founded their own online MOOC "schools" if you will - they are known as Udacity and Coursera (respectively).
Coursera initially offered (and continues to offer to this day) the same course that was the "ML Class":
Udacity, on the other hand, could not (for some reason, I suspect there was some kind of licensing or other issue with Norvig) release the "AI Class" as a course. Instead, a new course was developed, called (at the time) "How to Build Your Own Self Driving Vehicle" - and now known as "AI for Robotics":
Artificial Intelligence for Robotics | Udacity[^]
It's very slightly different from what I took in 2012 - mainly the "final project". It was very challenging, but I learned a ton from it.
Later on, in 2016, I took the first iteration of Udacity's "Self-Driving Car Engineer Nanodegree":
Self Driving Car Engineer Nanodegree | Udacity[^]
Now - would I classify any or all of these courses as the most challenging ones out there when it comes to AI/ML?
Probably not - but they were all very challenging for me, and also enjoyable (well, except that I b0rked my computer setting up CUDA for my GPU for the nanodegree - but that was on me); your mileage may vary.
But they should give you a good introduction.
If I had to pick any one of them to start with, it would be the Coursera Machine Learning course. It gave me an "aha!" moment about how neural networks actually work, how to frame problems for parallel computation using matrices and vectors, and how neural networks relate to and build on those techniques.
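To give a flavor of that "aha!" moment (this is my own illustration, not code from the course): one layer of a neural network is just a matrix product plus a nonlinearity, so a whole batch of inputs can be pushed through in a single matrix multiply instead of a loop.

```python
# Sketch: a neural network layer as vectorized matrix math.
# All names and sizes here are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))   # weights: 3 inputs -> 4 hidden units
b = np.zeros(4)               # biases

def layer(X):
    # X has shape (batch, 3); one matmul handles every example at once
    return np.maximum(0.0, X @ W.T + b)   # ReLU activation

X = rng.normal(size=(5, 3))   # a "batch" of 5 input vectors
out = layer(X)
print(out.shape)              # all 5 examples processed in one shot
```

That reframing - data as matrices, layers as matrix operations - is what makes GPUs such a natural fit for this stuff.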
The first Udacity course gave me the basics of what SLAM is and how to implement it, what Kalman filters are for, and a number of other techniques and ideas to explore for self-driving vehicles and robotics.
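The Kalman filter part boils down to a surprisingly small amount of code. Here's a minimal 1-D sketch of the sense/move cycle in the spirit of how the course teaches it (my paraphrase, with made-up measurement values - not the course's actual code):

```python
# 1-D Kalman filter sketch: belief, measurement, and motion are all
# Gaussians, tracked by just a mean and a variance.

def update(mean, var, meas_mean, meas_var):
    """Fuse a measurement into the current belief (variance shrinks)."""
    new_mean = (meas_var * mean + var * meas_mean) / (var + meas_var)
    new_var = 1.0 / (1.0 / var + 1.0 / meas_var)
    return new_mean, new_var

def predict(mean, var, motion_mean, motion_var):
    """Shift the belief by an uncertain motion (variance grows)."""
    return mean + motion_mean, var + motion_var

mean, var = 0.0, 1000.0          # start out very uncertain
for z in [5.0, 6.0, 7.0, 9.0, 10.0]:   # example sensor readings
    mean, var = update(mean, var, z, 4.0)     # sense
    mean, var = predict(mean, var, 1.0, 2.0)  # move ~1 unit per step
print(round(mean, 2), round(var, 2))
```

After a few cycles the estimate locks onto the measurements even though we started with a wildly uncertain initial belief - that's the core trick behind tracking in robotics.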
The nanodegree took that information and coupled it with learning how to use OpenCV for vision processing, object and lane tracking, plus a number of deep learning tasks. At one point we had to train a neural network to drive a virtual car around a track: we built a training set by driving the car around the track ourselves while it captured snapshots of the track along with the steering wheel angle and accelerator/brake data. Using those inputs plus a custom neural network (I ended up basing mine on a very simplified version of NVidia's "End-to-End" CNN), we trained a model to drive the car around the track. Very fun and exciting (and super frustrating at the same time).
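For the curious, a simplified NVidia-style network for that kind of behavioral cloning might look something like the sketch below in Keras. To be clear, this is a hypothetical reconstruction, not my actual project code - layer counts and sizes are illustrative:

```python
# Sketch of a simplified NVidia "End-to-End"-style CNN that maps a
# camera image to a single steering angle (a regression problem).
import tensorflow as tf
from tensorflow.keras import layers, models

def build_model(input_shape=(66, 200, 3)):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        # Normalize raw pixel values to roughly [-0.5, 0.5]
        layers.Rescaling(1.0 / 255.0, offset=-0.5),
        layers.Conv2D(24, 5, strides=2, activation="relu"),
        layers.Conv2D(36, 5, strides=2, activation="relu"),
        layers.Conv2D(48, 5, strides=2, activation="relu"),
        layers.Conv2D(64, 3, activation="relu"),
        layers.Flatten(),
        layers.Dense(100, activation="relu"),
        layers.Dense(10, activation="relu"),
        layers.Dense(1),   # predicted steering angle
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

model = build_model()
model.summary()
```

You'd then train it with `model.fit(images, steering_angles)` on the recorded driving data and let it steer from live camera frames.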
It also introduced me to TensorFlow, Keras, and ROS.
...and that was something in the middle of the entire course. There was a ton of new stuff I learned from that course, and I am glad I took it (though it wasn't cheap).
In the end, all of these courses have taught me there are some things that I don't know that I need to find time at some point to rectify (particularly: Calculus and Probability/Statistics). Someday.
But if I can do this - and it only took a total of about a year's time, and I walked away with knowledge and understanding that now lets me make better sense of some of those AI/ML papers that looked like inscrutable gibberish before - then anyone else can do it too.
One caveat, though - even these courses aren't for the "faint of heart". Usually - though I never got any real hard numbers - the courses started out strong, with a ton of students at the beginning. But the first few weeks were the "weed out" phase, and the numbers would drop precipitously as the course went on. If you are able to make it to the end of any one of them, you have really accomplished something, from what I understand.
Anyhow - good luck!