Technology ethics: Ethics for people who work in tech

Posted on: April 19, 2023

By Marc Steen

Let’s talk about the work of computer and data scientists, software developers and engineers. They help to create the algorithms and AI applications that are inside the many products and services that shape our societies and our daily lives. Let’s also include the work of people who work in design, business, marketing, and management, and in the procurement and implementation of these systems in the working processes of all sorts of organizations such as government agencies.

Increasingly, these people feel that ethics is important in their projects; in collecting data, in building applications, algorithms and AI systems, and in the deployment of these. On the other hand, they do not find it particularly easy to integrate ethics in their projects.

Some think of ethics as a barrier that they need to get past: a barrier to innovation. Now, I understand ethics very differently. I understand ethics as a steering wheel. Imagine that your project is a vehicle. Ethics is then the steering wheel. Ethics can help you to bring your project safely from A to B; to keep it on the road, and to avoid crashes and wrong turns.

Ethical Reflection and Deliberation

Over the years, and based on experiences in diverse research and innovation projects, I developed a three-step method that can help you to integrate ethics into your projects.

In step one, you identify potential issues; things that may go wrong in your project and that may have harmful impacts in the world. In step two, you host dialogues about these issues; you can do that within your project team, but also with stakeholders outside your organisation. In step three, you make decisions, based on the outcomes of these dialogues.

Ideally, you can organize this as an iterative process of ethical reflection and deliberation. You can make decisions, experiment, evaluate, and course correct. This is how you put the steering wheel into action.

Importantly, any analysis or discussion will need to start with clarifying what we are talking about. What is the system we are working on? In the early stages of innovation, this may only be a sketch, a paragraph or a storyboard. In later stages, there will be some prototype and specifications. Please bear in mind that these three steps don’t immediately yield definitive answers and instead help you to ask more precise questions.

You can turn to various ethical perspectives to conduct such reflection and deliberation. People have developed different perspectives over the course of centuries, if not millennia. Let’s have a brief look at four perspectives, mainly Western: consequentialism, duty ethics, relational ethics, and virtue ethics.

Consequentialism

In this perspective, you assess the potential positive and negative outcomes of your project. If the system you work on is implemented, what will be its effects in the world? This perspective often appeals to people with backgrounds in technology or economics. Assessing the pluses and minuses of a project seems straightforward. There are, however, two questions that complicate matters. First, where do you draw the boundaries of the system you analyse? What types of pluses and minuses do you include, and which do you exclude? Infamously, economists speak of ‘externalities’ when they choose not to take into account things such as harms to labourers or gig workers, or damage to the environment on another continent. The second question is about the distribution of pluses and minuses. This is a question of distributive justice. Which people receive the pluses, the benefits? And which people receive the minuses, the costs and risks? In the example of a self-driving car, many benefits go to the driver, but all sorts of harms may go to other people. You will need to use your sense of justice to deal with such questions.

Duty Ethics

This perspective deals with identifying the various duties and rights that are relevant to the project at hand. Key principles are respect for human dignity and human autonomy. A good example is that of cameras on a city’s public squares. The local government has a duty to keep people safe and therefore places cameras in public areas. At the same time, citizens have a right to have their privacy respected. The goal is then to find a balance between these duties and rights. This can become complex when different stakeholders’ duties and rights come into conflict. Interestingly, creativity and innovation can help to combine different interests. In this example with the cameras, the conflict between (the city’s duty to promote) safety and (the citizens’ right to) privacy can be resolved by using privacy-enhancing technologies in the software and data minimization measures in the ways in which data are handled. Moreover, duty ethics has some overlap with legal concerns. If the system you work on will be deployed in the EU, you may need to look at laws and regulations such as the GDPR and the proposed AI Act.
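To make the idea of data minimization a bit more concrete, here is a minimal sketch in Python. Everything in it is hypothetical (the `minimize_record` helper, the field names, the salt); real systems would also need key management, retention limits, and access controls. The sketch keeps only the fields a given use case needs and replaces the direct identifier with a salted hash, a simple form of pseudonymization:

```python
import hashlib

def minimize_record(record, needed_fields, salt):
    """Keep only the fields the use case needs; pseudonymize the identifier.

    Illustrative only: a salted SHA-256 of a low-entropy ID is a weak
    pseudonym, so real deployments would use keyed hashing with proper
    key management.
    """
    # Drop every field that the stated purpose does not require.
    minimized = {k: v for k, v in record.items() if k in needed_fields}
    # Replace the direct identifier with a salted hash (pseudonymization).
    if "person_id" in record:
        digest = hashlib.sha256((salt + str(record["person_id"])).encode()).hexdigest()
        minimized["person_id"] = digest[:16]
    return minimized

# Example: a camera event that carries more detail than a safety use case needs.
event = {
    "person_id": "NL-123456",
    "timestamp": "2023-04-19T10:30:00",
    "location": "Main Square",
    "face_image": b"...",        # not needed for counting incidents
    "home_address": "Elm St 1",  # not needed at all
}
print(minimize_record(event, {"timestamp", "location"}, salt="demo-salt"))
```

The design choice here mirrors the duty-ethics balance in the camera example: the city can still fulfil its safety duty (it knows when and where incidents occur, and can link repeat events via the pseudonym) without holding data that would violate citizens’ privacy rights.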

Relational Ethics

This perspective focuses on the ways in which technologies modify people’s abilities to interact with others and with nature. Relational ethics differs from consequentialism and duty ethics in that it foregrounds people’s personal relationships and subjective experiences. Significantly, it also looks at the role and distribution of power, and questions inequality and injustice. Imagine a police officer who uses an algorithm that assesses the likelihood that a person will behave violently. The algorithm puts a red flag before a person’s name. It could also be a tax inspector; in that case, the red flag indicates potential fraud. Now the police officer or the tax inspector will engage differently with a person who has a red flag before their name. Moreover, the person at the receiving end of that assessment typically cannot get rid of that red flag easily, even if, after investigation, the prediction proves to be false (a ‘false positive’). Typically, the algorithm gives power to the organization that deploys it, at the expense of the ‘data subjects’, the people who are assessed by it.

Virtue Ethics

This perspective goes back to Aristotle and ancient Athens and is used in professional ethics and in the development and deployment of technology. Virtue ethics can be helpful in two ways. First, it can help to understand how technologies can help, or hinder, people to cultivate relevant virtues and to live well together. If a person uses social media very often, and rather mindlessly, then that person’s ability to exercise self-control will diminish over time. Unfortunately, the technology is often intentionally designed to maximize return visits with brightly coloured notifications, and to maximize time in-app with suggestions for more enticing content. It will require some work to cultivate self-control in such situations. Second, virtue ethics can help the people who work on the development and deployment of technology to identify the virtues they need in their projects, and to cultivate these. If you work on an algorithm and you want it to produce fair outcomes, then you may need to cultivate virtues like courage, to speak up about a problematic issue that you foresee, or self-control, not to include a ‘nice and shiny’ feature because it may lead to misuse.

Integrate Ethics in Your Projects

Many people like a checklist every now and then; a checklist can help remind them to actually do what they find important. So, here’s a checklist for integrating ethics in your project:

We have three steps to organize ethical reflection and deliberation:

  1. Identify potential issues: What are your project’s impacts in society and in people’s daily lives?
  2. Host dialogues: Discuss these issues with your team or with stakeholders
  3. Make decisions: Critically, monitor, evaluate, and course correct

And we have four perspectives to look at these issues, and talk about them:

  1. Consequentialism: Look at the potential positive and negative outcomes of your project
  2. Duty ethics: Look at different stakeholders’ duties and rights that your project involves
  3. Relational ethics: Look at how this technology can affect relationships between people
  4. Virtue ethics: How this technology can help, or hinder, people to cultivate relevant virtues

Learn more about these three steps, the four ethical perspectives and the practical methods to integrate ethics in your project, like Human-Centred Design, Value Sensitive Design, or Responsible Innovation, by reading Ethics for People Who Work in Tech.