Do no evil?
David Webster April 2018
Software has eaten the world
Defaults influence behaviour
Security really matters
It's on us
We're biased
Understand and communicate
"That we're not just building tools, but that we need to take full
responsibility for the outcomes of how people use those tools as well."
- Mark Zuckerberg


Editor's Notes

  • #2: "Do no evil", or "Don't be evil", was once a motto in Google's Code of Conduct. As statements go, I find it hard to disagree with, but I do think that as an industry, as people who design and build software, we need to strive to aim higher than just not being evil.
  • #3: It's been seven years since the Wall Street Journal proclaimed that "software is eating the world", and that declaration still holds: software has a profound impact on our lives. Yet software engineering has no widely recognised code of practice, nor the ethical checks and balances used in other industries that push technological boundaries as we do. Think of medicine, scientific research, or even structural engineering; software is the wild west in comparison. This means that as individuals we need a greater awareness of our personal responsibilities. It's hard to open a paper at the moment without reading about large software companies accused of acting irresponsibly, and in some cases they have. Thankfully Alfresco isn't mentioned, but that doesn't mean we can be complacent. We are a company that hundreds of organisations around the world trust to manage their content, run their business processes and look after the content they deem most important: the records they have a financial or legal obligation to keep safe. With that in mind, let's look at a couple of examples of where the software industry has failed its users.
  • #4: Has anyone heard the story of Bobbi Duncan? She's a lesson, if we ever needed one, that we need to pay attention to default settings and behaviours. We know that software defaults influence users' behaviour, especially when users don't understand them. Because of the default settings on Facebook group memberships, when Bobbi joined a gay group, Facebook posted that membership to her timeline, publicly outing her to all of her friends and family. That drove her to a suicide attempt.
  • #5: Another awful example is lax security having a real impact on users. Is anyone familiar with Ashley Madison? It's an online dating site designed to enable and encourage married people to have affairs. The site's premise is a whole different ethical question but, that aside, its account security was so poorly designed that the entire user database was hacked and posted online. They didn't even validate email addresses, so you could create an account for someone else and make it look like they were a user. That sounds awful, and it is: those mistakes killed people, as multiple users took their own lives after their Ashley Madison membership was disclosed publicly.
  • #6: The last example I'd like to look at is personal liability. Whatever we think of senior management, we're all legally culpable for our own actions. You've probably heard that the car manufacturer VW used software to cheat emissions tests. One of the interesting things to come out of that case was that the very first person sent to prison was not a company exec; it was one of the engineers on the team who looked after that component.
  • #7: Now, looking to the future, a big issue we need to be aware of is that our subconscious bias, and the bias that exists in the data we use for decisions, ends up reflected in the software we write. Google Translate provides good examples: when translating from a gender-neutral language to a gendered one, the translation algorithms assume that, for example, engineers are male and nurses are female. Given the historic dataset, you can see how that happens, but we need to be much more conscious of bias like that and counteract it.
  • #8: So, what can we do? There are organisations like the IEEE with codes of practice for software engineers. But I think it comes down to this: we need to understand our users and design software for their specific needs. Understanding them isn't enough, though; they need to understand our software too. If they don't understand the implications of an action they perform using our software, then we've failed them. This ties into the design-thinking approach we're using more and more, but I think it's important that we understand how fundamental this is, and that we all have a personal responsibility to the end users of our software.
  • #9: To quote Mark Zuckerberg, he's realised: "That we're not just building tools, but that we need to take full responsibility for the outcomes of how people use those tools as well."