To err is human - that's why AI doesn't know any better
Who is responsible for the ethics of algorithms? Is it the developer who programs the algorithm and embeds a prejudice in the code? Is it the algorithms themselves that make these decisions? In the field of Natural Language Processing, for example, we marvel at how models such as GPT-3 arrive at independent results and produce plausible-sounding articles.
Julia Gundlach, project manager of "Ethics of Algorithms," a project of the Bertelsmann Stiftung, comments: "The responsibility for the ethical design of algorithmic systems is similar to that of an orchestra: Every musician is responsible for how the music sounds, and if something sounds wrong, you can't blame the instruments. Even in the world of algorithms, it's always people who decide - that's also true in machine learning." Timo Daum, author of "The Artificial Intelligence of Capital," takes an even more nuanced view: "Are the programmers really the ones who have an overview of the mechanisms and goals of the algorithms they work on, and who at the same time have the power and influence to recognize and change discriminatory code or biased data sets? Probably not. But the idea that algorithmic bias is unconsciously written into the code by male programming nerds also overestimates the scope of those whose job it is to translate the business logic of their clients into executable code according to strict rules."
Handbooks and guides for ethical practice
Only all of us together can put a stop to this - and responsibility is shared widely. Julia Gundlach of "Ethics of Algorithms" at the Bertelsmann Stiftung calls for responsibility for the ethical design of algorithms to become a central part of corporate culture and of collaboration between organizations. "How this responsibility is specifically distributed within an authority or company must be discussed and clearly defined at an early stage," says Gundlach. To this end, the Bertelsmann Stiftung, together with iRights.Lab and around 500 participants, developed the so-called Algo.Rules for the ethical design of algorithmic systems. In the process, eleven different role profiles involved in the design of algorithmic systems - and thus sharing responsibility for them - were identified in public administration alone. "This makes the need for a clear assignment of responsibility particularly clear," she explains. Based on the Algo.Rules, a handout for digital administration and an implementation guide for executives and developers (both PDFs in German) were also created to help put the ethical principles into practice. "Ideally, companies use such guiding questions as a basis for developing their own suitable principles and concrete operationalization steps, since corporate needs and prerequisites differ in each case. This requires the commitment of everyone involved so that these changes are not only defined but also implemented," says Julia Gundlach.