“As automated decision-making systems take center stage for distributing rights and services within Europe, institutions across the region increasingly recognize their role in public life, both in terms of opportunities and challenges.”

Here are some ideas on how to approach this.
How will public administration work in ten years? Will citizens interact with public administration purely digitally, with decisions made automatically?
With this question, the first event of Algorithm Watch Switzerland was launched. Happy birthday!
Because we humans inevitably build implicit normative assumptions into the systems we create, automated decision making (ADM) can never be truly neutral. So how do we get to the best possible approximation of neutrality? The key difference between human-made decisions and ADM lies in the nature of their non-neutrality. Human decisions vary arbitrarily from case to case, but their biases may cancel out to a neutral average. Automated decisions, on the other hand, are consistent, which risks turning implicit normative assumptions into systematic ones. That said, an ADM system could still produce more neutral decisions overall than humans do.
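The contrast above can be sketched in a toy simulation (the scores, the noise level, and the fixed penalty are invented purely for illustration): human decisions scatter around the true value and average out, while a consistently applied ADM bias shifts every single case.

```python
import random

random.seed(0)

# Toy illustration: 10,000 identical applicants whose "true" merit is 50.
TRUE_MERIT = 50.0
N = 10_000

# Human decisions: arbitrary case by case (mean-zero noise),
# but neutral as a whole on average.
human_scores = [TRUE_MERIT + random.gauss(0, 5) for _ in range(N)]

# ADM decisions: perfectly consistent, but an implicit normative
# assumption (here a fixed 3-point penalty) hits every case alike.
ADM_BIAS = -3.0
adm_scores = [TRUE_MERIT + ADM_BIAS for _ in range(N)]

human_avg = sum(human_scores) / N
adm_avg = sum(adm_scores) / N

print(f"human average: {human_avg:.2f}")  # close to 50: noise averages out
print(f"ADM average:   {adm_avg:.2f}")    # shifted to 47: bias is systematic
```

Individual human decisions here are far less predictable than the ADM ones, yet the population-level outcome is the more neutral of the two, which is exactly the trade-off the paragraph describes.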
Data is a big driver here: there is a lot of pressure to make use of data, especially within public administration, and only software can really scale with it. Swiss mobility is one example: non-personal mobility data is hardly regulated, yet it holds great potential for organising our mobility far more efficiently. After all, most vehicles just stand around most of the time.
But what are the next steps? We’re thinking out loud here.
So, ADM has the potential to make decisions more neutral and more objective. But this potential won’t realise itself; it needs to be brought to life consciously and conscientiously. And we can’t leave this up to technology, or to our infatuation with it; the risk of perpetuating biases is too big. So let’s make use of our unique ability for ethical thinking here.
Where software makes decisions, we need to closely examine the actual solutions embedded in that software. Until now, “business” has defined digitalisation and, in doing so, has claimed and occupied public-service topics. Now, for ADM, Ethics must join Technology and Business. There should be no “because we can”. I’m talking about actual implementation here, as ADM is built by humans working in IT companies. So first, ethics must become an integral part of those roles and processes: perhaps of product, epic, or business owners in Scrum or SAFe, or of roles yet to be created. Second, pilot tests and prototypes must be used for trials only, and any decisions from such test-stage software must remain non-binding for anyone affected. Results from such trials are to be examined closely for neutrality and transparency before ADM software goes into production.
And third, ethics courses and certifications for IT professionals should be established. (On a side note: we only stand to gain from having ethics as a school subject.)
Society must start to broadly own this topic, too. Neutrality and transparency must be widely known as rights we are entitled to, much like how accessibility and data protection came into public awareness.
And not only is it important to get ethics into eGovernment and Civic Tech, but also to bring public-service topics back from the market into the realm of public administration. The market lacks basic incentives for neutrality, while we users simply want to use what works well, for example a maps app from big tech to organise our mobility. Maybe public-private partnerships will spearhead ethically sound ADM?
Finally, let’s keep working on transparency in ADM. It’s the basis, and the fallback, for imperfect neutrality. With transparency, everyone affected at least has a chance to know how a decision came about: why we are suggested a specific route, a specific job ad, a specific product, or a specific official form. You don’t know why, even today? My point exactly.