Very good. May I suggest (historically) that this creeping disconnection is actually an inevitable cost of “efficiency” — efficiency as in, the gradual historical supremacy of the time-and-motion study and the assembly line in our industrial culture. At one point, small workshops are making (say) carriages from scratch, with some specialisation of trades on site, but most workers gathered around the unit being built, one at a time. Everyone present can see the whole enchilada. At a later point, a high-level engineer/designer has decided how the building process can be divvied up into small, discrete, relatively deskilled steps. Deskilling means that cheaper labour can be hired (and easily replaced). Streamlining the process means we can produce more widgets per day. Sounds good — but now we have some guy or girl sitting at a machine doing one meaningless action, over and over again, never seeing the whole picture or the finished product. That labour is said (in Marxist jargon, which is sometimes useful) to be alienated. That is, the worker is unable to determine the meaning and purpose of his or her own actions; the worker is used like a tool or machine part, for someone else’s purpose.
Fast-forward from the assembly line of cars or refrigerators to the assembly line of code (no pun intended). I was fortunate enough to work, during my coding career (now retired), in a small, tight group of programmers in academia — big science. We wrote our projects from the ground up, and we were in close contact with all parts of the project (from the hardware designers to the end users) at all times. We had a very personal (and holistic, if you will forgive the term) relationship with the project. I never had the sense of writing something whose actual purpose or meaning was obscure to me; I knew exactly what I was writing, where it would fit in, and what would happen if it failed. I was lucky. Millions of coders, as this article points out, are given little abstracted chunks of a large project to produce, with no overview and no sense of personal connection. That this results in bad code should be no surprise. The coding environment in which I worked would be described as “inefficient” by an industrial efficiency expert. But it resulted in good code, in the sense that it worked well and lasted many years. Some lines of my code are still chugging away 20 years later. Which, in our line of work, is not bad.
I’d also suggest that “efficiency” and “robustness” are usually incompatible. There’s an “efficiency” to the factory monocrop farm, but it comes at the price of fragility, lower overall biomass productivity than a diverse agriculture, and diminishing returns. There’s an “efficiency” to the consolidation of school districts and the funnelling of students into ever-larger mega-schools, and to the shutting down of regional hospitals and the funnelling of larger areas into mega-hospitals. But those mega-facilities then tend to suffer from disconnection, as evidenced by information lossiness (leading to, for example, misdiagnosis or misprescription), because (imnsho) they are so large and complex that it’s impossible for individual people to grasp the whole picture, and they must be managed piecemeal. This disconnects worker from meaning, action from consequence, etc. Efficiency, in other words, though it may offer evident and measurable benefits, does not come without costs (the costs may just be a little harder to measure). Producing code on the “one subroutine per cubicle, here’s yer spec sheet, get coding” model is (perhaps) efficient. But, as the author cogently points out… not robust. And what is needed in critical-path code is, above all, robustness.