Other programmers, though, were slow to learn the lessons of KDE 4.0. At the time, Linux’s popularity was still new, and many programmers were not used to criticisms from end users. Not long before, programmers and users had been close to synonymous. Consequently, many responded to criticism with sarcasm and suggested that the complainers should code the changes they demanded for themselves.
This is the problem in a nutshell: a basic lack of professionalism and a failure to follow design best practices. Innovation is all well and good, but working behind closed doors without communicating with the user community and asking what they want or need, then presenting the innovative new DE as if casting pearls before swine, isn't going to work. Any professional programmer knows this: you work with steering committees composed of representative end users during the design process, then with alpha/beta testers during the implementation phase, and finally with QA testers prior to rollout. If your product doesn't satisfy users' needs, it won't be well received by them. Change for its own sake is not the way to do it; the change has to serve a real purpose, and you absolutely have to involve the users in the design process to ensure that your proposed changes do. The only exception would be building something to management's specifications and orders (and if management doesn't involve the users, they're poor managers, and that's not a good place to work).
The article laments that the desktop UI is pretty much the same as that of Windows 95 or the earlier Mac OS, but people have been used to those interfaces since the summer of 1995, and in the case of Macs even longer. The thing is, these UIs were very well designed indeed, and that's why they've stood the test of time. Changing everything around just to be "innovative" rather than to make the computer easier to use, forcing everyone to relearn how to use their machines, takes time away from getting work done and is no good. That's what happened with Unity and Windows 8, which were designed for touchscreen monitors at a time when almost nobody owned one, with mouse usability almost an afterthought. They disrupted people's workflows and made them learn a new UI all over again, one that was harder to use with a standard mouse and keyboard. People revolted: on Ubuntu by switching DEs or even distros, and on Windows 8 by installing a third-party app that gave them their Start menu back and made the OS usable. In both cases, later versions reverted the UIs to what people wanted, due to popular demand.
DE devs should ask themselves some serious questions when thinking about designing a new DE: Is it easier to use than the current DE, with current hardware? Does it still let people do what they need or want to do? Does it help or hinder people in getting work done on their computers? Is it helpful, or does it just get in the way? Is there an actual "business" need for these changes? Then they should create a proof of concept of some sort, even if it's just mocked-up screen captures, show it to the user community, ask for feedback (an RFC), and listen to what people say. They should do all of this before writing a single line of code.