I think there are several things going on. YES, we have a Dunning-Kruger problem where junior developers who think they are senior developers just don't know how to recognize actual expertise, AND the other developers don't know enough to know they're full of 💩. But we also have the issue that it's impossible to recreate the environment of the late 20th century, where it actually _was_ possible to know a significant fraction of what there was to know about a given technology. That gave us time to learn and absorb HTML, CSS, and JavaScript, while also learning how to build fully normalized tables and how to do left, right, and full joins on them and return that data. When version control systems and build pipelines became a thing, we were able to layer that knowledge on top of what we already knew, maybe poking our heads into AWS, Selenium, Java, etc.
Nowadays, a developer goes through a 12-week boot camp and thinks he's ready to work, or through a 4-year degree program that teaches all sorts of theoretical stuff. That stuff might well be very interesting and useful, in the way that my art history courses have occasionally allowed me to solve real-world problems, BUT it doesn't bear much resemblance to the real day-to-day work of a programmer.
As an industry, we completely suck at cramming the reams and reams of information people now need to do this job into their heads fast enough that they can actually make a living at it. So we wind up with a bunch of completely green programmers being asked to work alongside us old fogies, and often they and we are treated as interchangeable cogs. The result is these crappy, disorganized codebases that are the WORST environments in which to train such people, and they don't know there's anything different. To them, everything's fine.
And trying to swim up that stream is exhausting, so many of us just give up on teaching others, because people who think they already know everything are very hard to teach. And we're retiring.