«In sum, I'm convinced that the Unicode designers blew it, way back when, by insisting on maximizing generative typography except when muscled by an economically important country.
Either of the two extremes would probably have converged on an overall solution more quickly. But it's far too late to change now.»
«The problem with Unicode's puristic approach [ahem, partially puristic approach] is that it was ahead of the technology. The technology to do things like spontaneously ligate f i on display is quite new; if you wanted to see a ligature even five years ago, you had no choice but to use a separate codepoint. And even now that operating systems have started getting smarter about combining codepoints together, the results look less than satisfactory.»
...and so on, with further discussion of how support for composed characters is still flaky after all these years. The Unicode Consortium people seem to have assumed that if they merely specified the semantics, little gnomes would magically implement it, and implement it well.
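To make the composed-character issue above concrete, here is a minimal sketch using Python's standard `unicodedata` module. The precomposed codepoint U+00E9 ("é") and the two-codepoint sequence U+0065 U+0301 ("e" plus combining acute) mean the same thing, yet compare unequal until normalized; similarly, the dedicated ligature codepoint U+FB01 ("ﬁ") only maps back to plain "fi" under compatibility normalization. (This is just an illustration of the codepoint model being discussed, not anything from the original post.)

```python
import unicodedata

# Two encodings of the same visible character "é":
precomposed = "\u00e9"    # single precomposed codepoint
decomposed = "e\u0301"    # base letter + combining acute accent

# They are semantically the same character, but bytewise different,
# so a naive comparison fails:
print(precomposed == decomposed)  # False

# Normalizing both to NFC (canonical composition) makes them compare equal:
print(unicodedata.normalize("NFC", decomposed) == precomposed)  # True

# The separate-codepoint ligature era the quote mentions: "fi" once needed
# its own codepoint, U+FB01. NFKC (compatibility) normalization folds it
# back into the plain two-letter sequence:
print(unicodedata.normalize("NFKC", "\ufb01"))  # fi
```

The point the author is making is visible here: the *semantics* (these spellings are equivalent) were specified early, but actually rendering and comparing them correctly was left to implementations.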
But, for me at least, a big lesson of 1990s open source is: if you want it done right, you have to be ready to do all the work yourself. You may not have to actually do it, but you at least have to be prepared to do so. "Doing all the work" in this case would have meant publishing free fonts and DLLs and the like, and providing Mozilla very early on with the patches to render all these combined characters and context-sensitive characters and whatnot. But ya know, 20/20 hindsight, etc etc.