Risk and responsibility in software errors

  • Software systems can become complex, exhibiting unforeseen emergent behaviors… and errors…
  • Software, like any technology, always involves risk!

Some arguments about risks…

  • What do you think about…
  1. “Product X is already accepted. Product Y has a lower risk, so Y should be accepted.” (also known as BAT — best available technology)
  2. “Effect A of product X is not natural, so it shouldn’t be accepted.”
  3. “No scientifically demonstrable risk was shown with product X, so it should be accepted.”
  4. “Soon we will know about the risks of product X, so let’s wait and see.”
  5. “Benefits of product X are greater than its risks”.
  • Is it possible to measure risk? Is it possible to measure harm?
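
One common engineering answer to the measurement question (not taken from these notes, and itself contestable) is to approximate risk as probability of failure times severity of harm. The sketch below illustrates that approximation; the hazard names and numbers are hypothetical.

```python
# A minimal sketch of one common (and contested) way to quantify risk:
# expected harm = probability of failure x severity of harm.
# All hazard names, probabilities, and severity scores below are
# hypothetical illustrations, not real measurements.

def expected_harm(probability: float, severity: float) -> float:
    """Estimate risk as probability times severity (a simplification)."""
    return probability * severity

# Hypothetical hazards with made-up probabilities and severities (0-10).
hazards = {
    "data loss":      {"probability": 0.01,  "severity": 8.0},
    "service outage": {"probability": 0.10,  "severity": 3.0},
    "privacy breach": {"probability": 0.001, "severity": 10.0},
}

# Rank hazards by estimated risk, highest first.
for name, h in sorted(hazards.items(),
                      key=lambda kv: -expected_harm(**kv[1])):
    print(f"{name}: {expected_harm(**h):.3f}")
```

Note that argument 5 above (“benefits are greater than risks”) presupposes exactly this kind of quantification — that harms and benefits can be put on one commensurable scale, which is precisely what is in question.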

Risk contracts and the virtue of courage

  • Who is affected by the risk? Are they OK with that? Did we consult them? Is their consent truly free?

Example: cars are risky. But, somehow, as a society, we agreed that it is better to have them than not to have them.

  • However, if no one is willing to lose anything, how can we agree on any risk?
    • Sometimes, the virtues of courage and love for one another may need to guide this assessment.

Responsibility attribution in big teams

  • Suppose some code fails in production. Who is responsible?
    • The one who conceived the product?
    • The one who programmed it?
    • The authors of a library that was used and had bugs?
    • The one who tested the code?
    • The one who distributed the code?
  • Sometimes, finding someone to blame is very hard!
  • A well-debated problem (see Moral Responsibility and the Problem of Many Hands)

The problem of many hands (PMH) occurs when a collective is morally responsible for some result, while none of the individuals making up the collective is morally responsible (to the same degree) for that result.

  • How can genuine virtues of justice and wisdom guide us here?