
by Jos Baeten

Recently, I attended a lecture by Cathy O’Neil, author of the book “Weapons of Math Destruction”. She clearly demonstrated the destructive power of proprietary predictive algorithms that learn from possibly biased data sets.

I think we need to be able to appeal against decisions made by such algorithms, the software implementing these algorithms should be open source, and the underlying data sets should be open for inspection by an authority. Beyond this, each individual should be able to control his or her own data and should have the right to be informed and the right to inspect and correct it.

I shudder to think of a world where we are constantly monitored, guided, even ruled by an internet of interacting AIs, without recourse to human intervention.

More generally, all of us as researchers concerned with the digital domain have a moral obligation to speak out when we feel things are not going right or when threats emerge. Of course, we should always speak from our expertise and not get caught up in hype. Time and again, public opinion goes overboard, and people claim, for instance, that a quantum computer can solve all problems, or that an ordinary computer can learn to solve any problem. In such cases, too, we should speak out and temper expectations.

Jos Baeten
General Director, CWI
ERCIM President
