Stichting VU

04/16/2024 | Press release | Distributed by Public on 04/17/2024 05:41

Christine Moser: Organisational Scientist

Organisational scientist Christine Moser investigates how new technologies such as artificial intelligence change people's lives, and the opportunities and risks they bring.

What's your background, and how did you get to where you are today?

"My career is quite unusual, because I was trained as a musician and initially worked at a conservatorium for several years. But I wanted more of a challenge, and started studying Culture, Organization and Management at Vrije Universiteit Amsterdam. Doing research really appealed to me, so a few years later I applied for a PhD programme and I've now been working here for 16 years. I currently work as an associate professor of organisation theory."

What's your research about?

"It's about the interaction between people and new technologies. People apply technology in their lives and at work. It makes things easier, but it goes much further than that. By using new technologies, we change as people. Just think about your smartphone. You probably picked it up this morning to read your messages, and you use it throughout the entire day. Without that smartphone, your day would have been very different."

Artificial intelligence (AI) is up and coming. How do people and organisations deal with it?

"The idea of using AI in your life or at work is attractive, because it helps to do things more efficiently. But people often overlook the risks. It's tempting for organisations to unleash self-learning algorithms on large amounts of data in order to gain better insights into what customers want or how business processes can be run more efficiently. But those algorithms are unsuitable for tackling moral, ethical and social issues. The Dutch childcare benefits scandal, which I research, is a case in point.

"Organisations may use self-learning algorithms, but they shouldn't rely on them blindly - yet they do that all the time. It's easy for organisations to express things in numbers, because 'to measure is to know'. Algorithms seem like a good solution, because they can handle numbers well. But not everything can be expressed in numbers.

"What's more, algorithms don't care which country they're used in; they're culture-agnostic. It's easy for organisations to roll out the same algorithm everywhere. But for people, the context matters.

"A third explanation for blind trust is that the results of algorithms, such as scores or percentages, are powerful and convincing. This sounds odd, because we often think that we're in charge of technology. But when we're confronted with the outcome of such advanced calculation models, it's very difficult not to go along with their calculative logic."

What lesson would you like to share with your 18-year-old self?

"Things are rarely black and white, and you shouldn't judge people too quickly. There are many different ways in which people can approach life, depending on their situation and background."

Is this way of thinking also reflected in your research?

"Of course! For example, in my research into the Dutch childcare benefits scandal. It's terrible what happened and how much suffering it has caused. As an 18-year-old, I would probably have quickly condemned everyone involved in the matter. Now I think that people often act with good intentions, which may nevertheless lead to harmful outcomes. That's why I'm investigating exactly what happened and why, despite those good intentions, nothing was done about it for years."
