When I was starting out in computing there was much elegant beauty to behold: feats of engineering that appeared carefully crafted and of which their creators could be proud, so I was easily drawn into learning more. For instance, the sound and graphics chips in the Commodore 64: they were easy to program yet able to deliver far more than one would have expected, even more than their designers had planned. The later Amiga had lovely hardware too, again remarkable for its time. Or some of the business systems I worked with while transitioning local clients to the new IBM PC: the software on a Z80-based predecessor from Wang was genuinely good, clear, functional and well-behaved, making it a breeze for me to learn my way around it and extract the data ready for transforming into the new system. Throughout the 1980s it was a surprise for me to find bugs in serious business software.
Coincident with the rise of Microsoft Windows I have seen software quality go down. Now it is entirely normal to run into bugs. It is also normal for software to frustrate me: it insists on doing something I don't want, or I can't get it to do something it clearly ought to be able to. From within the sausage factory, the software stack is complex enough that we largely sign off on changes when they seem to work; there is no sense that the design is clean and clear enough that we would be surprised if it failed QA after all. There may be intermittent failures, but nobody is quite sure why. Software development starts to feel less like art and more like wrestling. Modern hardware is little better: routers crash and need power-cycling, patches to expensive firewalls break traffic in odd ways, enterprise storage is flaky, and so on.
It isn't all bad. Technologies ranging from OpenBSD to Haskell, while building on the shoulders of giants, are laudable projects gaining in currency, the latter thanks in part to Microsoft Research. Still, I wonder what changed.
(Haskell is a nice example: in my code I can use phantom types to pass around the integer IDs of objects, so that compile-time checking ensures I never mix up the ID of one kind of object with that of another, while the type-checking overhead is compiled out of the native executable. In the mainstream, meanwhile, we are still accepting strings from web forms without using strict typing to ensure that they can never be confused with a string that has been defensively and safely escaped. It seems to me that many common modern bugs should have been professionally engineered away years ago.)
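To make that parenthetical concrete, here is a minimal sketch of both tricks. The names (Id, User, Invoice, Raw, Escaped, lookupUser) are invented for illustration rather than taken from any real codebase:

    -- Phantom-typed IDs: at runtime an Id is just an Int, but the type
    -- parameter records what kind of object it identifies, so mixing up
    -- a user ID and an invoice ID is a compile-time error.
    newtype Id a = Id Int deriving (Eq, Show)

    data User
    data Invoice

    lookupUser :: Id User -> String
    lookupUser (Id n) = "user #" ++ show n

    userId :: Id User
    userId = Id 42

    invoiceId :: Id Invoice
    invoiceId = Id 42

    -- lookupUser invoiceId    -- rejected by the compiler

    -- The same trick keeps raw form input apart from escaped output:
    -- only an Escaped value can ever reach the page.
    newtype Raw     = Raw String
    newtype Escaped = Escaped String

    escape :: Raw -> Escaped
    escape (Raw s) = Escaped (concatMap esc s)
      where
        esc '<' = "&lt;"
        esc '>' = "&gt;"
        esc '&' = "&amp;"
        esc c   = [c]

    render :: Escaped -> String
    render (Escaped s) = s

    main :: IO ()
    main = do
      putStrLn (lookupUser userId)
      putStrLn (render (escape (Raw "<b>hello</b>")))

GHC erases the newtype wrappers at compile time, so the extra safety costs nothing at run time; the commented-out call to lookupUser with an invoice ID simply refuses to compile.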
It feels like most software developers now care about little more than getting things working well enough that their employers can sell them comfortably. The developers who want to use their mathematical gifts to create systems that are both solid and flexible are largely outnumbered or out-competed by the many who just want to deliver something that appears to work sufficiently well that everybody gets paid, and management either do not appreciate or do not care about the difference.
Maybe the change is partly in the user community: perceptions or expectations have shifted somehow. Users no longer realize how good things could be, so they tolerate substandard software enough to keep on paying, and providers have learned that they can get away with it. I, on the other hand, find it positively outrageous that technology is of such low quality that my smartphone can take a couple of seconds to offer any visual indication that I successfully clicked an on-screen button, then treat my subsequent attempt to click it as a click on the button that has popped up in its place. This really is basic stuff.
I don't have any answers; I think it's just how the world works. I would consider retreating to high-assurance systems, but look at the Department of Defense's move away from Ada, or Ericsson's away from Erlang: I think the only refuges are in the past. I could be frustrated that modern computer systems are typically no longer built anywhere near as well as they could be, but instead I find myself grateful to be able to recall a time when it was normal for them to be both useful and reliable.