I've been developing for a long time, too, and I think there are pluses and minuses to both ways when done well. But I think we all need to recognize that a majority of workplaces, just like developers, are average or below (that's just math). And average isn't really that great.
I do believe that if you embraced TDD, you'd find many of your complaints are mitigated. For example, if you know you're going to be interrupted a lot, you can write that failing test before the meeting, and when you get back it tells you exactly what to do to pick up right where you left off.
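To make that concrete, here's a minimal sketch of the kind of "bookmark" test I mean (hypothetical Invoice example, pytest style, not from any real codebase)--the class exists but the discount rule doesn't yet, so the test fails until you write it:

```python
# Hypothetical sketch (pytest style): the discount logic isn't implemented,
# so this test fails on the assertion -- a bookmark for exactly what to do
# after the meeting.

class Invoice:
    def __init__(self, units, unit_price):
        self.units = units
        self.unit_price = unit_price

    def total(self):
        return self.units * self.unit_price  # TODO: bulk discount not implemented yet

def test_bulk_discount_applies_over_100_units():
    invoice = Invoice(units=150, unit_price=10)
    assert invoice.total() == 150 * 10 * 0.9  # expect 10% off; fails until implemented
```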
And it really helps a lot with the illegible code problem. Testable code is by definition broken down into small, bite-sized chunks. You don't wind up with this giant ball of spaghetti that takes so long to unravel that by the time you figure out how something works you forgot why you wanted to know. Not to mention the tests themselves document the intention of your code.
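As a made-up illustration of what I mean by bite-sized, testable chunks (hypothetical names, pytest style): the function does one small thing, and the test names read like a description of its intended behavior.

```python
# Hypothetical example: a small, single-purpose function plus tests whose
# names double as documentation of the intent.

def normalize_username(raw):
    """Lowercase, strip surrounding whitespace, collapse internal runs of spaces."""
    return " ".join(raw.strip().lower().split())

def test_strips_surrounding_whitespace():
    assert normalize_username("  alice  ") == "alice"

def test_lowercases_and_collapses_internal_spaces():
    assert normalize_username("Bob   Smith") == "bob smith"
```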
I don't get your criticism of TDD that you're wasting time writing tests when you discover in the end stages that you made bad assumptions at the beginning. Getting things on the screen in a real system can be a very long process--I've been known to code for a week or two before the code was ready to run. If you've coded for a couple of weeks and then you find out you made a really dumb mistake on day 2, that work is at best wasted. Worse, if it gets to the point that someone in management sees it (especially after you spent two weeks writing it), chances are you and everyone else on the project will be stuck working around your bad assumptions for the next two years.
Whereas I find tests show me pretty quickly when I've made a stupid mistake. And yeah, it might be a pain to rip out the tests _and_ the code after a day of coding and start over. But I'd rather do that than run the code for the first time and realize it's all a house of cards with no time left to fix it. Not to mention I can quickly and easily simulate edge cases that I might never be able to reproduce, or even find data for, in the real system.
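For instance (again a hypothetical sketch, pytest style): a calendar edge case you might wait years to hit in production data takes a minute to fabricate in a test.

```python
# Hypothetical sketch: edge cases that rarely show up in real data are
# trivial to fabricate in tests.
from datetime import date

def next_billing_date(current):
    """Bill on the same day next month, clamping to the last valid day."""
    year = current.year + (1 if current.month == 12 else 0)
    month = 1 if current.month == 12 else current.month + 1
    day = current.day
    while True:
        try:
            return date(year, month, day)
        except ValueError:
            day -= 1  # e.g. Jan 31 -> Feb 28/29

def test_jan_31_clamps_to_end_of_february():
    assert next_billing_date(date(2025, 1, 31)) == date(2025, 2, 28)

def test_jan_31_in_a_leap_year_hits_feb_29():
    assert next_billing_date(date(2024, 1, 31)) == date(2024, 2, 29)

def test_december_rolls_over_the_year():
    assert next_billing_date(date(2024, 12, 15)) == date(2025, 1, 15)
```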
It's tempting to imagine a rosy, idyllic past next to which today compares unfavorably, but I think it's important to remember:
- We were young and full of energy, and now we're old and crabby. So of course today looks bleak by comparison.
- It was just about possible back then to know everything there was to know about software development. Now the sheer scale of what developers need to know is much larger, and many developers need "guard rails" because of that.
- People didn't just up and decide they needed new processes for no reason. These were put in place to address actual problems faced by real projects. You can argue whether they were the right changes, but a big truism back then was "if it ain't broke, don't fix it"--the fact that things changed anyway suggests something really was broken.