In the first year of my physics education at TU Delft, I had a mandatory course labelled ‘ethics of engineering and technology’. As part of the accompanying seminar, a small group of us did a case study based on the Space Shuttle Challenger disaster.
If you don’t know what went wrong: in a nutshell, some critical parts – O-rings – were more likely to fail at low temperatures (they had failed before), and the conditions of this particular launch were especially harsh for them. The engineers, who knew about the issue, wanted to postpone or abort the launch, but senior NASA executives pressured them out of digging in their heels.
In the seminar, some students played the engineers (from NASA and Morton Thiokol) and others the executives, and of course the end result was a stalemate followed by an executive decision … and a disastrous launch. Nobel Prize-winning physicist Richard Feynman later demonstrated what went wrong technically (by submerging a piece of O-ring in ice-cold water before the committee), as well as non-technically, with a classic Feynman quote: “reality must take precedence over public relations, for nature cannot be fooled”.

So what does this have to do with software and IT?

Everything. Every year, human lives and IT become more entangled. The direct consequence is that technical problems in IT can have profound effects on human lives and reputations. With that comes increasing responsibility for technicians and executives alike, be they developers, engineers, infrastructure technicians, database administrators, security consultants, project managers or higher-level management (CxO).

The examples of this going wrong are numerous. Most often the result is huge budget overruns, but increasingly it is more disturbing data leaks (or worse: data tampering) and the accompanying privacy concerns.
The most striking cases include the malfunctioning US healthcare insurance marketplace, which left millions of people involuntarily uninsured; closer to home, the Dutch tax agency, which spent 200 million euros on a failed project; and the Dutch government as a whole, which spends 4–5 billion euros every year on failed IT projects. And I haven’t even mentioned the National Electronic Health Record (Dutch: ‘elektronisch patiëntendossier’), which is still in an unclear state and is IT’s own Chernobyl waiting to happen.

And these examples are only the published – public – cases: most enterprises with failing IT projects won’t tell you about them, because why would they, when it’s a loss of face at best (yet another data leak)? From personal experience, I’ve seen solutions so badly designed that people could easily and/or accidentally access the personal records of colleagues (HR systems), or find out that a case was being made for their dismissal (before they were told).

So how is this possible?

Before I answer that question, let me start with two disclaimers. First, in this post I want to focus on technical failure: even though a project can be a project-management success (delivered on time and under budget), it can still be a technical time bomb waiting to go off (like the Challenger). Second, a lot has been written about failing IT projects already, and this is by no means the definitive story. It is, however, a developer’s point of view.

With that out of the way, let’s answer the question: from my perspective, this is the result of a lack of professional attitude and transparency on all fronts – by the creative people (developers/engineers), by executives, and by their higher-level management.

A good developer or engineer has a constructive mindset: whenever possible, they will not unthinkingly accept what they are told, but instead guide their customer (a manager, functional designer or some third party) through the functional requirements, explain the impact and consequences of different approaches, and suggest alternatives when those are much friendlier to the technical back-end (either directly or in terms of future maintenance).
However, it can happen that the functional wishes are simply not feasible in the framework you have been using, and there is no real alternative (except starting from scratch). Depending on where you work, going ahead and ‘doing it anyway’ may harm human lives, and ultimately the company or government you work for. In cases like this, the professional engineer will dig in their heels. Not because they are lazy and don’t want to do the work. Not because they are assholes or act out of spite. But because it’s the professional thing to do.

A good executive expects their people to display exactly this behavior: yes, they expect a ‘can do’ mentality and for their people to ‘make it work’, but by choosing the right solution and by saying so when an idea is really bad, not by unthinkingly saying ‘yes sir/ma’am’ to everything. The good executive will either make sure the requirement goes away, or communicate a clear message to their higher-ups when such functional requests lead to delays, instead of pressuring their people to ‘just do it’ and ‘make sacrifices’.

Companies which recruit developers and engineers primarily on price – and see them as ‘production workers’ – can’t expect to get the kind of critical professionals described above. I’m not sure whether companies with such hiring practices have good executives who merely lack awareness of what this leads to, or whether they lack good executives altogether. Either way, the end result is that they recruit the ones just looking for quick dough, who consequently – out of fear – always say yes without being clear about the risks (if they even see them). In fact, they might even make it work on short notice, but your company ends up with a technically weakened product and some serious risks. A word of warning: this doesn’t mean the opposite is true; highly paid technicians are not necessarily good professionals.

Finally, good higher-level management should cultivate a culture of transparency and open communication. When an ambitious goal is not reached for a good reason, all too often higher-level management (under pressure themselves from investors and/or public opinion) turns a blind eye or feigns ignorance, and indiscriminately punishes the departments and project managers for not reaching it – even when those in question reported early on that this might be the outcome. Such behavior instills a fear of transparency and cultivates a culture in which everyone, down the whole chain, will just make it work, regardless of future risks.
