When I was starting out in computing there was much elegant beauty to behold: feats of engineering that appeared carefully crafted and of which their creators could be proud, so I was easily drawn into learning more. For instance, the sound and graphics chips in the Commodore 64 were easy to program yet able to deliver far more than one would have expected, even more than their designers had planned. The later Amiga had lovely hardware too, again for its time. Or take some of the business systems I worked with when transitioning local clients to the new IBM PC: the software on a Z80-based predecessor from Wang was actually really good, being clear, functional and well-behaved, making it a breeze for me to learn my way around it and extract the data ready for transforming into the new system. Throughout the 1980s it was a surprise for me to find bugs in serious business software.
Coincident with the rise of Microsoft Windows I have watched software quality decline. Now it is entirely normal to run into bugs. It is also normal for software to frustrate me: it insists on doing something I don't want, or I can't get it to do something it clearly ought to be able to do. From within the sausage factory, the software stack is now complex enough that we largely sign off on changes simply when they seem to work; there is no sense that the design is clean and clear enough that we would be surprised if it failed QA. There may be intermittent failures, but nobody is quite sure why. Software development starts to feel less like art and more like wrestling. Modern hardware is little better: routers crash and need power-cycling, patches to expensive firewalls break traffic in odd ways, enterprise storage is flaky, and so on.
It isn't all bad. Technologies ranging from OpenBSD to Haskell, while building on the shoulders of giants, are laudable projects gaining in currency, the latter thanks in part to Microsoft Research. Still, I wonder what changed.
(Haskell is a nice example: in my code I can use phantom types to pass around the integer IDs of objects such that compile-time checking ensures I never mix up the ID of one kind of object with that of another, while the type-checking overhead is compiled out of the native executable entirely. Meanwhile, in the mainstream, we are still accepting strings from web forms without using strict typing to ensure they can never be confused with strings that have been defensively escaped. It seems to me that many common modern bugs should have been professionally engineered away years ago.)
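To make that parenthetical concrete, here is a minimal sketch of both disciplines in Haskell. The names (Id, Escaped, escapeHtml and so on) are my illustrative inventions, not any particular library's API:

    {-# LANGUAGE EmptyDataDecls #-}

    -- Phantom-typed IDs: the type parameter 'a' exists only at compile
    -- time. Because Id is a newtype, the wrapper is erased during
    -- compilation, so the check costs nothing in the executable.
    newtype Id a = Id Int deriving (Eq, Show)

    data User     -- tag types with no values at all;
    data Invoice  -- they exist purely to label IDs

    lookupUser :: Id User -> String
    lookupUser (Id n) = "user #" ++ show n

    -- Escaped vs. raw text: an Escaped value can only be produced by
    -- escapeHtml, so a raw string from a web form can never be passed
    -- where safely escaped output is required.
    newtype Escaped = Escaped String

    escapeHtml :: String -> Escaped
    escapeHtml = Escaped . concatMap esc
      where
        esc '<' = "&lt;"
        esc '>' = "&gt;"
        esc '&' = "&amp;"
        esc c   = [c]

    renderPara :: Escaped -> String
    renderPara (Escaped s) = "<p>" ++ s ++ "</p>"

    main :: IO ()
    main = do
      let uid = Id 1 :: Id User
          inv = Id 7 :: Id Invoice
      putStrLn (lookupUser uid)
      -- putStrLn (lookupUser inv)  -- rejected at compile time: Id Invoice is not Id User
      putStrLn (renderPara (escapeHtml "<b>raw</b> input"))
      -- putStrLn (renderPara "<b>raw</b> input")  -- rejected: a plain String is not Escaped

In a real codebase the Escaped constructor would not be exported from its module, so the type system, rather than programmer discipline, enforces the boundary; and since both wrappers are newtypes, the guarantee costs nothing at runtime.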
It feels like most software developers now care about little more than getting things working well enough that their employers can sell them comfortably. The developers who want to use their mathematical gifts to create systems that are both solid and flexible are largely outnumbered, or otherwise out-competed, by the many who just want to deliver something that appears to work well enough that everybody gets paid; management either do not appreciate or do not care about the difference.
Maybe the change is partly in the user community: perceptions or expectations have shifted somehow, so that users no longer realize how good things could be. They tolerate substandard software enough to keep on paying, and the providers have learned that they can get away with this. Meanwhile, I find it positively outrageous that technology is of such low quality that my smartphone can take a couple of seconds to offer any visual indication that I successfully tapped an on-screen button, then take my subsequent attempt to tap it as a tap on the button that has popped up in its place. This really is basic stuff.
I don't have any answers; I think it's just how the world works. I would consider retreating to high-assurance systems, but look at the Department of Defense's move away from Ada, or Ericsson's away from Erlang: I think the only refuges are in the past. I could be frustrated to know that modern computer systems are typically no longer built anywhere near as well as they could be, but instead I find myself grateful to be able to recall a time when it was normal for them to be both useful and reliable.
no subject
Date: 2017-06-15 06:13 am (UTC)
My dishwasher doesn't acknowledge that I've pressed the start button until after I'm out of the kitchen and halfway up the stairs, typically. This is a confounded nuisance, because if it *doesn't* start when I'm halfway up the stairs I have to come back down to find out why not. What is this delay of tens of seconds before it starts doing the thing, and how difficult would it be to have a beep or a light or something to demonstrate that it's thinking about it?
no subject
Date: 2017-06-15 07:47 am (UTC)
No dishwasher here, but our combi washer-dryer also has a good think before starting the drying. At least it has an immediate beep and an LCD display (which sometimes goes weirdly wrong, and which I am still training the children to look at before pressing), and I am indeed halfway up the stairs (still within earshot) when the first mechanical noise of drying happens, I think maybe the clunk of it locking the door.
no subject
Date: 2017-06-15 09:30 am (UTC)
This bloody printer at work does that too. You send something to print and you walk to the other end of the office to pick it up, and the printer has done nothing, and you don't know whether to wait for the printer to wake up or go back to your desk to see whether Windows has produced a confirmation box or an error since you got up and left.
And they've changed the set-up at work such that when I sit down and shake the mouse it doesn't turn the monitor back on straight away, so I have to stop and see whether it has turned itself off more thoroughly, which it hasn't. Grr.
no subject
Date: 2017-06-21 10:12 pm (UTC)
One big contributor to this quality problem is that we're forced to use these big libraries/frameworks, and it's hard to always be aware of the assumptions built into them. Even good developers can end up with bugs because of this.