Wednesday, November 09, 2005

Accountability in Engineering

I hate capitalist droids.

The capitalist world has thoroughly crushed the engineer. Engineers are made to believe they're only technologists. The MBAdroids of the world will manage them, the accountants will make sure the money adds up, and some liberal arts student will deal with the customers, because engineers couldn't possibly consider money issues or deal with people.

They're desperately trying to forget the purpose of an engineer.

First, let me define what I think an engineer is, purely from my own perspective:
An engineer applies scientific principles towards practical, ethical, and safe solutions to real-life problems, and is accountable for the decisions made by this process.
There are several aspects to this definition:
  • Scientific principles. Following "best practices" by rote isn't enough. An engineer must not just know that it works, but must understand why it works, because otherwise they cannot possibly foresee how it will fail.
  • Practical. An engineer must consider reality. Unimplementable solutions and theories are irrelevant. This also implies an engineer must be aware of monetary concerns, and tradeoffs between cost and the other aspects here.
  • Ethical. An engineer is responsible for making ethical decisions, and this trumps other concerns such as monetary constraints. This is essential if we are to trust engineers to build critical systems.
  • Safe. An engineer is responsible for protecting people, both their lives and their well-being, trumping monetary concerns. [an interesting followup: which is more important? ethics or safety? would you lie and cheat to save lives? or must there always be a way to reconcile the two?]
  • Real-life problems. Engineers must do something useful for people.
  • Accountability. Here's the crux of it - we must trust engineers to make technology safe for us to use, and they must be held responsible for their choices. I'll delve deeper into this for a bit.
But the problem is, business droids hate this! They'd rather have a technologist. They'd rather define it as: "An engineer applies technology to solve a problem". They don't like the ethical and safe parts, because they have their own formulas to describe these things. Like I always say: to an MBAdroid, everything is uniquely described by its dollar value. They'd rather use their little risk formulas and try to balance the cost of lawsuits, no matter what the moral cost. They'd rather not have any accountability at all, and instead plonk it straight in the laps of end users.

Of course, engineering only exists because the law doesn't allow certain industries to do this. If you build a building, you can't just put up a sign saying "we waive all responsibility if the building falls and kills you". If that building falls, the ones who designed and built it will be held accountable. Period. Engineering specifically makes sure that there are people who are qualified to say it won't fall down, and an engineer has to put his or her personal reputation and livelihood on the line with every decision they make. This is also why engineers carry gargantuan professional insurance. If a doctor botches an operation, it's one life. If an engineer designs a plane that crashes, hundreds die.

But, as capitalism has grown in power, the droids have taken over. Engineering is devalued because no company would dare admit that accountability comes back to it unless the law requires it. "We explicitly disclaim that this product or service will be suitable for any purpose anytime anywhere anyhow." It's the mantra of the lawyers, covering the business people. Nowhere is this more true than in software.

But accountability does not imply absolute responsibility. This is the key point that people miss, and something I finally understood when talking to Tina about criminal liability. In summary, the law recognizes three main ways to categorize liability, translated here into the current context (and sketched in toy code after the list):
  • Absolute: If something goes wrong, you're liable. Period.
  • Strict: If something goes wrong, you're liable, unless you can PROVE (yes, the onus is on the defence) that you were either missing some relevant knowledge (not negligently, of course), or that you knew it was a possibility but tried to prevent it.
  • Criminal: You're liable only if something goes wrong, and you intended it to go wrong, or were negligent in allowing it to go wrong.
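To make the distinction concrete, here's a toy sketch in Python. This is a caricature of the law, not a statement of it, and the predicate names are mine, not legal terms of art:

```python
from enum import Enum

class Standard(Enum):
    ABSOLUTE = 1
    STRICT = 2
    CRIMINAL = 3

def liable(standard, harm_occurred, intended=False, negligent=False,
           took_reasonable_precautions=False, excusably_uninformed=False):
    """Caricature of the three liability standards described above."""
    if not harm_occurred:
        return False                     # no harm, no liability anywhere
    if standard is Standard.ABSOLUTE:
        return True                      # something went wrong: liable, period
    if standard is Standard.CRIMINAL:
        return intended or negligent     # must have meant it, or been negligent
    # Strict: liable by default; the onus is on the DEFENCE to prove a
    # valid excuse (reasonable precautions, or non-negligent ignorance).
    return not (took_reasonable_precautions or
                (excusably_uninformed and not negligent))
```

Note how only the strict standard rewards the defendant for precautions taken beforehand - which is exactly the incentive I want engineering accountability to create.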

For the most part, companies these days will barely admit even to criminal liability, and in software in particular, they vehemently dispute even that! They just want you to give them money, and they rent you some bits. If those bits don't work or fail, well, too bad for you. Of course, what software companies are really trying to avoid is the opposite: absolute liability. Any liability, they'd argue, is absolute liability. If there was a bug, they'd be sued out of existence, and software has lots of bugs, so that won't work.

But what about strict liability? If something goes wrong, then it's on your head, unless you can show that you tried to prevent it or were misinformed. The idea is that this provides the ideal basis for a "chain of accountability". Yes, when the big business boss signs the release that ships the product that kills a bunch of people, he's personally responsible for those deaths. But he says, "But no! My trained and licensed engineer gave me this report which said my product was safe to release!" This is fine - the boss genuinely believed it was safe based on the engineer's evaluation, and engineers are licensed specifically to be able to say those things! Then eyes fall on the engineer to be liable for the deaths. "But no," cries the engineer, "I did all this math, and these drawings, and measured it twice, and left a safety factor of six in there just in case!" In which case, the engineer genuinely believed it was safe, and assuming there was no negligence, they determine it was a bona-fide accident and move on.

Let me describe three scenarios where this works well: two from fields where accountability is already in place, then a third to show how it would apply to software.

1. The World Trade Center. It's a building. Structural engineers by law had to sign off on it. But planes hit it, it fell, and thousands of people died. Are they liable?
Well, the head of the project isn't. He had engineers, and followed their advice (I assume). There's to this day little evidence that the engineers were incompetent, so management's off the hook. The engineers: are they responsible for not accounting for this possibility? Well, no. Because (and I admit, this is just rumour) apparently the designs for the towers DID take into account the possibility of a plane hitting them. However, planes were smaller at the time. Even so, it took quite the calculated act of malice, planes larger than any that existed at the time, and even then some damned tragic luck, to bring those towers down. That they stood as long as they did is a testament to the genius of the architects and engineers involved.

2. The Columbia disaster. NASA has some of the world's brightest engineers, yet a shuttle broke apart on re-entry, due to wing damage that had occurred earlier during liftoff. In this case, first we look to the mission commander. Was the shuttle safe to send up? Perhaps. Perhaps not. Under criminal liability, it would be fine to send it up as long as there wasn't any direct evidence of an immediate problem. Under strict liability, as long as the engineers involved were confident in sending it up, the commander is off the hook. But here, we must look to the specific manager who, having been presented with knowledge that the shuttle had been damaged during launch, pressed ahead with re-entry anyway. If he had acted on the advice of his head engineer, he'd be off the hook. If he had acted without advice, but thought it was safe, he'd be strictly liable, but not criminally. However, this pathetic management droid acted contrary to the (apparently very demanding) insistence of his engineers that the safety of the mission was at stake (presumably ignoring the advice for political and budgetary concerns, i.e. the number of successful missions on his record). In this case, he'd be criminally liable, and even by the slackest of standards should be in jail.

3. Now, let's apply this to software: specifically, software security. The latest web server release from a company has a security flaw. This gets exploited by a worm, takes down 100 web sites, and costs about $50M in lost business. Naturally, in real life, the company wouldn't be liable for a cent, and that would be the end of it. But what if it were? Criminal liability wouldn't be much of a standard, because it would require them to purposely put the flaw there, or find it and ignore it. Still, even this would ALREADY be better than the status quo, punishing companies that purposely quash vulnerability postings rather than fixing the fault [*cough* Oracle *cough*]. What about strict liability? If there's a security flaw, you're liable. BUT, you can defend yourself. You have a software engineer who specializes in security. Did he sign off on the code? Yeah? He signed off on the design too? Good! The big boss is off the hook, and the ball is passed to the security engineer. The security engineer shows the results of his audit, demonstrates that he approved the design, and added precautions in places where he expected there could be problems. The company is off the hook, and all it took was one engineer to sign off. If there was no such engineer, or that engineer didn't do his job well, then sucks to be them, they owe a bill. Absolute liability would be too much, I admit: no matter how hard you try, if it breaks, the company is liable. That's obviously too high a standard, but it's the scenario software people would invoke to scare you.
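To make the ball-passing explicit, here's the same scenario as another toy sketch (purely illustrative; the roles and the all-or-nothing defence flag are simplifications of mine):

```python
# Walk the chain of accountability under strict liability: each party is
# off the hook if they can show a good-faith defence (e.g. they relied on
# a licensed engineer's signed report, or on their own documented audit).

def find_liable_party(chain):
    """chain: list of (party, has_valid_defence), from the boss on down."""
    for party, has_valid_defence in chain:
        if not has_valid_defence:
            return party                 # the ball stops here
    return None                          # bona-fide accident; nobody liable

# The web-server flaw above, in the case where the audit was never done:
chain = [
    ("big boss",          True),    # signed the release on the engineer's report
    ("security engineer", False),   # signed off without actually auditing
]
print(find_liable_party(chain))     # -> security engineer
```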

[
Update:
I actually need to revise this third scenario. There's an earlier step: the one who deployed the web server! He's not just 'using' the software, he's performing the creative act of making a web server, using the software as a tool. Therefore, if a server spreads a worm, the ADMINISTRATOR of that server may be somewhat responsible!

In my ideal world, pretty much any manipulation of software beyond basic operation would transfer liability to the one performing that act. This would be a key point for open source, where hobbyist authors obviously can't hold liability on their own, and would transfer the onus to whoever compiled it from source. A risk to the poor Linux enthusiast, yes, and one of the potential downsides of this scheme.

On the plus side, it would be an effective business model for open-source companies: sure, the source is free, but our distribution is certified for tasks x, y, and z, meaning your ass is covered. Of course, a licensed engineer could always certify it for free, assuming they're competent. Even if they're not, there's always their professional insurance.
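As a toy model of this transfer rule (roles and act names all invented for illustration): liability follows the most recent "creative act" performed on the software, while basic operation transfers nothing.

```python
CREATIVE_ACTS = {"wrote", "compiled", "certified", "deployed", "modified"}

def onus_holder(history):
    """history: chronological list of (party, act) pairs."""
    holder = None
    for party, act in history:
        if act in CREATIVE_ACTS:
            holder = party               # each creative act transfers the onus
    return holder

history = [
    ("hobbyist author", "wrote"),
    ("distro company",  "compiled"),     # the certified-distribution model
    ("administrator",   "deployed"),     # running the public web server
    ("end user",        "browsed"),      # basic operation: no transfer
]
print(onus_holder(history))              # -> administrator
```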
]


If we had this sort of accountability in every engineering field that doesn't already have it, those fields would be much better off, and I don't think it'd be that harsh a burden - all a manager would have to do to save his butt, every time, would be to listen to an engineer licensed in the correct field. That's why we have licensing for engineers in the first place. And more to the point, the business droids would HAVE to listen, because otherwise it'd be their own necks sticking out. Engineering would again be glorious and respected, and the public would have much better technology at a slightly higher price.

Of course, the business droids don't want that, because they don't want accountability, or to abide by ethics, safety, and the law except where they directly impact the corporate bottom line. They prefer engineers to be technologists, because then they can ignore the engineers' annoying science and logic without taking on any risk by doing so.

Thus the engineer remains neglected. I hate capitalist droids.

[
Update 2:

My school barely emphasized our professional responsibilities at all. I think this is a fundamental failing in our curriculum.

Eamon, on the other hand, says his profs at the University of Victoria mention professional obligations regularly, in many courses. These are things that up-and-coming engineers should be thinking about! Why the hell does Waterloo Computer Engineering think their students don't need to consider these issues?
]

2 comments:

Anonymous said...

The problem is that if engineers could do both business and engineering, business guys would be obsoleted out of existence! This is very scary for the business guy, who probably lives a cushy lifestyle that his job supports. So boxing in engineers is simple self-preservation.

Toyota, for example, hires engineers to fill its management positions, and it's doing a lot better than the US manufacturers, but part of the reason for this is that US auto manufacturers tend to be inefficient, once again, to preserve jobs and political profiles. And I heard that one of the first things you learn in MBA school is where you can disrupt the flow of work the fastest (so it's not just that you're not doing anything; you're actively impeding the flow of work).

To apply this to software, we have to fundamentally change the way software is viewed. This might be difficult.

(Remind me to talk to you about business sometime, because I'm learning a lot practically here.)

Michael Jarrett said...

Interesting point, crypto, but I certainly don't expect (or want) engineers to obsolete the business people. Business is a very vague term, including elements of finance, marketing, and management (and others, I'm sure). You still need the business people to give the engineers things to do and to raise the funds to do them.

In fact, one could say accountability makes business people MORE important while giving them less responsibility. Could an engineer truly be expected to make a rational analysis if they knew rejecting a proposal on safety concerns would cause the board to liquidate the company? Probably not, so we leave the engineer to make those decisions, and leave the other issues solely in the hands of the various droids.

The key aspect of accountability is that it prevents the droids from overstepping their bounds, and also prevents them from acting unethically or negligently.

Sadly, you're right: the droids will resist, as they always have. Hence why I think the change must come through legislation before any of this will work.