We humans don't actually know how we learn. We don't know how learning works. We simply work and work and work until we know whatever we set out to know; we never figure out how we learned it, we're just happy that we know it and leave it at that.
Therefore teaching someone (or something) else how to learn is close to impossible, because we don't understand it ourselves (yet?).
And if we do learn how to learn, why would we need an AI to do it for us?
Take any NP-hard problem in its simplest form. Whatever half-assed heuristic you use, chances are the solution comes out optimal (assuming the heuristic is even half decent). The more complex the problem, the more the heuristics fall apart. For the small instances we can simply brute force it. The same thing happens here: compared to something like face recognition, recognizing simple shapes is much easier, so even the not-so-great techniques (comparatively) work well enough. They fall apart when we apply the same methods to the complex stuff.
A great example would be a greedy algorithm. It works for some problems but not for others. Take a simple enough problem and you get the optimal solution. Push the algorithm to its limits and you don't even get a good solution (see the sketch below). You don't have to understand how the best algorithms for a task work to come up with a greedy algorithm.
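To make that concrete, here's a minimal sketch using coin change as the stand-in problem (the denominations and amounts are just illustrative assumptions): greedy happens to be optimal for US-style coins, but falls over as soon as the denominations get slightly less friendly.

```python
def greedy_change(amount, coins):
    """Greedy: always take the largest coin that still fits."""
    used = []
    for c in sorted(coins, reverse=True):
        while amount >= c:
            amount -= c
            used.append(c)
    return used if amount == 0 else None

def optimal_change(amount, coins):
    """Dynamic programming: fewest coins overall, for comparison."""
    best = [0] + [None] * amount
    for a in range(1, amount + 1):
        options = [best[a - c] for c in coins if c <= a and best[a - c] is not None]
        best[a] = min(options) + 1 if options else None
    if best[amount] is None:
        return None
    # Reconstruct one optimal combination by walking the table back.
    used, a = [], amount
    while a > 0:
        for c in coins:
            if c <= a and best[a - c] == best[a] - 1:
                used.append(c)
                a -= c
                break
    return used

# "Simple enough" instance: US-style coins, greedy happens to be optimal.
print(greedy_change(63, [1, 5, 10, 25]))   # [25, 25, 10, 1, 1, 1] -> 6 coins
print(optimal_change(63, [1, 5, 10, 25]))  # also 6 coins

# Push it: denominations {1, 3, 4}, amount 6 -- greedy is no longer optimal.
print(greedy_change(6, [1, 3, 4]))   # [4, 1, 1] -> 3 coins
print(optimal_change(6, [1, 3, 4]))  # [3, 3]    -> 2 coins
```

The greedy version is the one anyone can come up with without understanding the problem; the DP version is what you need once the easy instances stop being representative.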